Organizational transformation for a business with a significant digital presence is like navigating a major city’s subway system. You get on one train, only to get off at the next stop and board another. And so it continues. But, unlike a hapless visitor clinging to the hope that the next leg will be the final one, you understand that your trip is never really going to end.
Organizational transformation is a journey that never stops. It’s up to you to keep steering in the correct direction. Accordingly, staying on the right course with your organizational transformation mandates your continued evolution as a leader as well.
A successful transformation is certainly a testimony to your leadership skills. Not everyone can devise and implement sweeping change, or motivate others to the level it requires; the steep failure rate attests to that. And, as an accomplished leader, you’re well aware that meaningful change can’t possibly take place without equal conviction and energy from everyone on your team. Acknowledged inclusion is everything.
But the challenge is by no means over. As you and your organization move forward, you must continue to hone and, if need be, change elements of your leadership. Your organization isn’t what it was yesterday, and you shouldn’t stay the same, either.
Organizations of all types have squandered hundreds of millions of dollars on poorly planned and executed attempts at transformation. Although the reasons for those pervasive missteps vary, one undeniable mistake is financial: an organization’s failure to view investment in technology as an ongoing factor in its overall direction.
Below are eight lessons every leader must embrace if they want to avoid the same misfortunes:
1. Follow the money
Leaders must focus on the business and understand how it creates value. They must understand the business strategy: for each product market, is the firm in an exploratory or an exploitative posture? This assumes that the organization has a strategy that is well articulated and supported by appropriate structures, processes, and information.
2. Know that managing technology is as important as the technology itself
The next generation of leaders will understand—or should, if they want to succeed—that they must invest in managing technology as well as in the technology itself. If there is any remaining doubt today, there will be none in the future: technology, per se, is an equalizer. Only in the management of it can firms eke out an advantage.
3. Understand what technology does
Unless they appreciate that technology often plays a critical role in establishing or maintaining a strategic position, future leaders may well spend inappropriately. But that appreciation must evolve into an understanding of how the various types of technology—those that enable transactions, decisions, or relationships, for example—contribute to an organization’s strategic actions. For most of technology’s first half century, it was thought about only tactically.
4. See through walls
Tomorrow’s leaders will be far more comfortable with deriving value through partnerships and other types of engagements. They must understand the role technology plays in enabling these partnerships and learn to manage the technology that stretches across internal organizational boundaries.
5. Manage business and technology as one
The moments of dissension and the finger-pointing at failures will disappear as executives come to see that technology failure is often due to weak or nonexistent business strategy or failure to create a business-driven technology strategy. Alignment will increasingly be seen as only the first step; it will occur to all that the design and management of business cannot be done apart from the design and management of technology.
6. Scrap the org chart
We are already seeing the blending of corporate roles, and it will be commonplace in the future. Leaders will have to be comfortable in both the business and technology realms. This reshaping of professional identities is already underway.
7. Get underneath the hood
Leaders of the next generation must be able to look beneath an organization’s overarching posture and discern the business processes that advance strategy. Moreover, they must see technology as part and parcel of these processes; the two are inseparable. This is going to require loosening the functional straitjackets in which many organizations have long operated.
8. Get comfortable with speed
Is it too much of a cliché to say that everything will move faster and faster, that interconnectivity makes the whole world our playing field, and that we must give up command and control so that our people on the edges of the organization can react to events in real time? And that this is an entirely new way of thinking about management and leadership? And that even if we pay lip service to it today, it will be very hard to accomplish?
Above all, what really matters moving forward is that you understand and recognize the scope of the change in which you are a valuable player. Armed with that powerful mindset, you’ve positioned yourself to be just as aware of how different your organization will look in the future, and of how you can adapt to help that journey continue toward growth and further success.
Facial recognition software has become increasingly popular in the past several years. It is used everywhere from airports and venues to shopping centers, and even by law enforcement. While the technology has some potential benefits, such as helping to prevent and solve crimes, there are many concerns about privacy, safety, and the legislation governing its use.
Facial recognition technology uses a database of photos, such as mugshots and driver’s license photos, to identify people in security photos and videos. It uses biometrics to map facial features and help verify identity through key features of the face, chief among them facial geometry: measurements such as the distance between a person’s eyes and the distance from forehead to chin.
These measurements are combined into what is called a “facial signature,” a mathematical representation that is compared against a database of known faces. The market for this technology is growing rapidly. According to a research report, “Facial Recognition Market by Component,” the U.S. facial recognition industry is expected to grow from $3.2 billion in 2019 to $7.0 billion by 2024, with the most significant uses being surveillance and marketing. This, however, raises concerns for many people.
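To make the comparison step concrete, here is a minimal sketch in Python of how matching a facial signature against a database could work, assuming faces have already been reduced to fixed-length vectors of geometric measurements. The vectors, names, and threshold below are illustrative assumptions, not any vendor’s actual pipeline.

```python
import numpy as np

# Hypothetical "facial signatures": fixed-length vectors of geometric
# measurements (eye spacing, forehead-to-chin distance, and so on).
# The values and names are purely illustrative.
known_faces = {
    "person_a": np.array([0.62, 0.31, 0.77, 0.15]),
    "person_b": np.array([0.20, 0.85, 0.43, 0.66]),
}

def match_signature(probe, database, threshold=0.1):
    """Return the closest known identity, or None if nothing in the
    database falls within the distance threshold."""
    best_name, best_dist = None, float("inf")
    for name, signature in database.items():
        dist = np.linalg.norm(probe - signature)  # Euclidean distance
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None

# A signature extracted from a new security photo.
probe = np.array([0.60, 0.33, 0.75, 0.16])
print(match_signature(probe, known_faces))  # -> person_a
```

Real systems use learned embeddings with hundreds of dimensions rather than four hand-picked measurements, and the threshold is exactly where questions of accuracy enter: set it loosely and false matches rise; set it tightly and true matches are missed.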
The main reason for concern among citizens is the lack of federal regulation surrounding the use of facial recognition technology. Many worry about how accurate the technology is and whether bias is built into these systems. One issue, for example, is that the technology has been shown in multiple studies to be less accurate at identifying people of color, especially Black women.
Another major concern is the use of facial recognition for law enforcement purposes. Today, many police departments in the U.S., including those in New York City, Chicago, Detroit and Orlando, have begun utilizing the technology. According to a May 2018 report, the FBI has access to 412 million facial images for searches.
Not only could misidentification lead to wrongful convictions; the technology could also be abused by law enforcement for constant surveillance of the public, to society’s lasting harm. The Chinese government is already using facial recognition to cite jaywalkers and punish other petty offenses, a practice that fuels debate over where protecting the public ends and basic civil rights and privacy begin.
Accuracy and accountability are necessary when it comes to the use of technology, especially in the justice system. These concerns have not gone unnoticed by politicians, and many states and cities have started to create legislation around these issues. Oregon and New Hampshire have banned the use of facial recognition in police body cameras. California cities such as San Francisco and Oakland, and some cities in Massachusetts, have outlawed certain uses of facial recognition technology by city officials, including law enforcement.
The Utah Department of Public Safety has also put some limits on the use of facial recognition in active criminal cases. Law enforcement in Utah claims that facial recognition software helps keep dangerous criminals off the streets, but advocates say the system has no checks and balances. Recent pushes in Portland, Oregon, suggest that it will soon follow suit.
The latest legislative push to put limitations on facial recognition technology is a California bill, AB 1215, also referred to as the Body Camera Accountability Act. The bill would temporarily stop California law enforcement from adding face and other biometric surveillance technology to officer-worn body cameras for use against the public.
According to the ACLU of Southern California, “AB 1215 is a common-sense bill that rightly concludes that keeping our communities safe doesn’t have to come at the expense of our fundamental freedoms. We should all be able to safely live our lives without being watched and targeted by the government.”
Governor Gavin Newsom must decide whether to sign it into law by October 13. If he does, it will go into effect in January. Law enforcement isn’t the only use of the technology raising concern. U.S. Customs and Border Protection, in partnership with Delta, has added facial scanning to the Atlanta airport’s Concourse E, its Detroit hub, boarding gates in Minneapolis and Salt Lake City, and, this month, Los Angeles International Airport.
The use of this technology raises concerns about how much people are being watched and whether hackers could access the data, causing more harm than good. “Facial recognition really doesn’t have a place in society,” said Evan Greer, deputy director of Fight for the Future. “It’s deeply invasive, and from our perspective, the potential harm to society and human liberties far outweigh the potential benefits.”
Given the vast number of concerns and privacy issues surrounding facial recognition software and its use, cities around the U.S. will face more dilemmas as they attempt to tackle these issues. AI and facial recognition technology are only growing; they can be powerful and helpful tools when used correctly, but they can also do harm through privacy and security failures. Lawmakers will have to strike a balance, determining when and how facial recognition technology may be utilized and monitoring its use, or in some cases abuse.
The neural representations of a perceived image and the memory of it are almost the same. New work shows how and why they are different.

Memory and perception seem like entirely distinct experiences, and neuroscientists used to be confident that the brain produced them differently, too. But in the 1990s neuroimaging studies revealed that parts of the brain that were thought to be active only during sensory perception are also active during the recall of memories.
“It started to raise the question of whether a memory representation is actually different from a perceptual representation at all,” said Sam Ling, an associate professor of neuroscience and director of the Visual Neuroscience Lab at Boston University. Could our memory of a beautiful forest glade, for example, be just a re-creation of the neural activity that previously enabled us to see it?
“The argument has swung from being this debate over whether there’s even any involvement of sensory cortices to saying ‘Oh, wait a minute, is there any difference?’” said Christopher Baker, an investigator at the National Institute of Mental Health who runs the learning and plasticity unit. “The pendulum has swung from one side to the other, but it’s swung too far.”
Even if there is a very strong neurological similarity between memories and experiences, we know that they can’t be exactly the same. “People don’t get confused between them,” said Serra Favila, a postdoctoral scientist at Columbia University and the lead author of a recent Nature Communications study. Her team’s work has identified at least one of the ways in which memories and perceptions of images are assembled differently at the neurological level.
Blurry Spots
When we look at the world, visual information about it streams through the photoreceptors of the retina and into the visual cortex, where it is processed sequentially in different groups of neurons. Each group adds new levels of complexity to the image: Simple dots of light turn into lines and edges, then contours, then shapes, then complete scenes that embody what we’re seeing.
In the new study, the researchers focused on a feature of vision processing that’s very important in the early groups of neurons: where things are located in space. The pixels and contours making up an image need to be in the correct places or else the brain will create a shuffled, unrecognizable distortion of what we’re seeing.
The researchers trained participants to memorize the positions of four different patterns on a backdrop that resembled a dartboard. Each pattern was placed in a very specific location on the board and associated with a color at the center of the board. Each participant was tested to make sure that they had memorized this information correctly — that if they saw a green dot, for example, they knew the star shape was at the far left position.
Then, as the participants perceived and remembered the locations of the patterns, the researchers recorded their brain activity. The brain scans allowed the researchers to map out how neurons recorded where something was as well as how they later remembered it. Each neuron attends to one space, or “receptive field,” in the expanse of your vision, such as the lower left corner.
A neuron is “only going to fire when you put something in that little spot,” Favila said. Neurons that are tuned to a certain spot in space tend to cluster together, making their activity easy to detect in brain scans. Previous studies of visual perception established that neurons in the early, lower levels of processing have small receptive fields, and neurons in later, higher levels have larger ones.
This makes sense because the higher-tier neurons are compiling signals from many lower-tier neurons, drawing in information across a wider patch of the visual field. But the bigger receptive field also means lower spatial precision, producing an effect like putting a large blob of ink over North America on a map to indicate New Jersey. In effect, visual processing during perception is a matter of small crisp dots evolving into larger, blurrier but more meaningful blobs.
But when Favila and her colleagues looked at how perceptions and memories were represented in the various areas of the visual cortex, they discovered major differences. As participants recalled the images, the receptive fields in the highest level of visual processing were the same size they had been during perception — but the receptive fields stayed that size down through all the other levels painting the mental image. The remembered image was a large, blurry blob at every stage.
This suggests that when the memory of the image was stored, only the highest-level representation of it was kept. When the memory was experienced again, all the areas of the visual cortex were activated — but their activity was based on the less precise version as an input. So depending on whether information is coming from the retina or from wherever memories are stored, the brain handles and processes it very differently.
Some of the precision of the original perception gets lost on its way into memory, and “you can’t magically get it back,” Favila said. A “really beautiful” aspect of this study was that the researchers could read out the information about a memory directly from the brain rather than rely on the human subject to report what they were seeing, said Adam Steel, a postdoctoral researcher at Dartmouth College. “The empirical work that they did, I think, is really outstanding.”
A Feature or a Bug?
But why are memories recalled in this “blurrier” way? To find out, the researchers created a model of the visual cortex that had different levels of neurons with receptive fields of increasing size. They then simulated an evoked memory by sending a signal through the levels in reverse order. As in the brain scans, the spatial blurriness seen in the level with the largest receptive field persisted through all the rest. That suggests that the remembered image forms in this way due to the hierarchical nature of the visual system, Favila said.
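As an illustration, here is a minimal sketch of that kind of hierarchical model, under assumptions of my own rather than the study’s actual code: each level pools over a larger receptive field, modeled as a Gaussian blur of growing width, and recall re-expands the image from the top level alone, so the top level’s blur persists all the way down.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Receptive-field size grows up the hierarchy; these sigmas are arbitrary.
SIGMAS = [1, 2, 4, 8]

def perceive(image):
    """Feedforward pass: each successive level sees a blurrier map,
    like small crisp dots evolving into larger blobs."""
    levels, current = [], image
    for sigma in SIGMAS:
        current = gaussian_filter(current, sigma=sigma)
        levels.append(current)
    return levels

def recall(levels):
    """Reverse pass: only the top-level representation was stored, so
    every lower level inherits its coarse blur; the lost spatial
    precision cannot be recovered."""
    return [levels[-1]] * len(SIGMAS)

# A single crisp point of light, like one pattern on the dartboard.
image = np.zeros((64, 64))
image[32, 16] = 1.0

seen = perceive(image)
remembered = recall(seen)

# Peak activity is sharp at the lowest level during perception but
# equally diffuse at every level during recall.
print(round(seen[0].max(), 3), round(remembered[0].max(), 3))
```

The only point of the sketch is the asymmetry: the feedforward pass moves through every spatial resolution, while the stored memory re-enters the hierarchy already at the coarsest one.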
One theory about why the visual system is arranged hierarchically is that it helps with object recognition. If receptive fields were tiny, the brain would need to integrate more information to make sense of what was in view; that could make it hard to recognize something big like the Eiffel Tower, Favila said. The “blurrier” memory image might be the “consequence of having a system that’s been optimized for things like object recognition.”
Many people chase achievement, assuming it will lead to well-being. They should reverse that order of operations.
Without going too far out on a limb, I believe almost everyone would like two things from their jobs and careers: success and happiness. They want to do relatively well financially, receive fair recognition for their accomplishments, enjoy their work as much as one can, and become happier as a person as a result.
These are reasonable goals, but they can be a lot to ask—so many people, especially ambitious, hard-working people, simplify them in a logical way: They first seek success and then assume that success will lead to happiness. But this reasoning is flawed. Chasing success has costs that can end up lowering happiness, as many a desiccated, lonely workaholic can tell you.
This is not to say that you have to choose between success and happiness. You can obtain both. But you have to reverse the order of operations: Instead of trying first to get success and hoping it leads to happiness, start by working on your happiness, which will enhance your success.
Success and happiness are generally positively correlated, as many workforce studies have shown. For example, companies in Fortune magazine’s “100 Best Companies to Work For” list saw an average 14 percent stock-price increase every year from 1998 to 2005, compared with 6 percent for the overall market.
And as Gallup data have shown, among business units with employee-engagement levels (that is, employees who reported feeling heard, respected, and intellectually stimulated, and who had a best friend at work) in the 99th percentile, 73 percent perform above the company average, and 78 percent perform above the industry average.
From this correlation, many assume causation—from success to happiness. During my years as an executive, I found that people strongly believe that pay increases—especially big ones—will have a large and long-lasting effect on their job satisfaction. The data tell us a different story, however: Large wage increases have only a small and transitory effect on well-being.
Researchers in 2017 tracked the pay and job satisfaction (measured on a 0–10 scale) of nearly 35,000 German workers over several years. The study found that the anticipation of a 100 percent pay bump increases job satisfaction by about a quarter of a point in the year before the raise. The raise itself adds about another fifth of a point. By the fourth year, the increase has fallen to less than a fifth of a point in total.
In other words, say your job satisfaction is a six out of 10—not bad, but could be better. If your boss doubles your pay, it will get you to about 6.5, and then it will fall back to about 6.2. Maybe this isn’t the best strategy to help you love your job. And that doesn’t even take into account the cost that increased job success can have on overall life satisfaction. In 2016, psychologists measured career success by asking 990 college-educated full-time professionals to compare their career achievements to others’.
They found that people generally enjoyed the money and status that relative success produced. However, success did not lead to total contentment: It indirectly chipped away at life satisfaction, likely via time constraints, stress, and impoverished social relationships.
Much stronger and more positive results emerge, however, when researchers reverse the order, looking not at success’s effects on happiness, but happiness’s effect on success. Scholars in 2005 surveyed hundreds of studies—including experiments to establish causality—and concluded that happiness leads to success in many realms of life, including marriage, friendship, health, income, and work performance.
One explanation might be that happiness makes us more attractive, so we are rewarded by others. Alternatively, happiness might make us more productive. Novel experimental research suggests both are true. For example, scholars in 2021 studied Chinese livestream web broadcasters, for whom voluntary viewer tips are the primary source of income.
They found that when the broadcasters showed more positive emotion, their tips immediately increased, suggesting that people who appear happy are rewarded in the market. Another experiment had British test subjects engage in a time-limited arithmetic task and math test. The researchers found that subjects who were shown a clip of a comedy movie beforehand were about 12 percent more productive on the task and test than those who weren’t, and that the funnier they found the clip, the more productive they were.
Whether you are an employee or employer, it is a better investment to increase happiness at work and in life, rather than simply trying to increase measures of success.
The first thing to remember is that happiness requires balance. No matter how much you enjoy your work, overwork will become an obstruction to well-being. Researchers in 2020 studying 414 Iranian bank employees found that workaholic behavior (such as perfectionism and work addiction) strongly predicted workplace incivility (such as hostility, privacy invasion, exclusionary behavior, and gossiping).
Workaholic behaviors also degraded the quality of family life (as measured in disagreement with statements such as “My involvement in work provides me with a sense of success; this, in turn, helps me to be a better person in my family”).
You should guard against workaholism in yourself and help your friends and family who suffer from it. But just as important, employers should not encourage overwork—which will likely require effort and attention on their part, as research shows that executives generally underestimate employees’ struggles with well-being.
Once work quantity is under control, happiness at work requires a sense of meaning and purpose. I have written in this column that the two key aspects of meaningful work are earned success and service to others. Earned success implies a sense of accomplishment and recognition for a job well done, while service to others requires knowledge of the real people who benefit from your work.
Lots of research shows the importance of these work aspects. For example, Gallup has revealed that people who serve their communities and receive recognition for it self-report significantly less stress and worry in their lives than those who do not (either because they don’t serve their communities or do not receive recognition).
Meanwhile, the most meaningful jobs tend to be those that are the most service-oriented. According to 2016 research by the Pew Research Center, proportionally, more workers in nonprofit and government sectors—i.e., work that is generally service-oriented—said their jobs give them a sense of identity than did private-sector workers.
It’s harder to find the link to service in some professions than others, but it can usually be done. Years ago, I was working with a team of academic researchers creating policies for improved bank regulation. One scholar who was particularly passionate about the project told me he always remembered that his work mattered, because poor people need access to reasonably priced credit, and that requires less bureaucratic red tape.
Even if you struggle to see who benefits, because the people you touch with your work are very far away or your work touches them indirectly, try looking a little closer—maybe even in the next cubicle. You can always enjoy the effects of service by helping your colleagues, and there is clear evidence that supporting co-workers can help ease negative emotions at work.
Ultimately, although success and happiness are linked, the alchemy is mostly one-way—and not in the way that most people think. Working on your success to get happier is inefficient at best, and may blow up in your face and lead you to unhappiness. But working on your happiness gives you the best chance at getting both.
Even if all of this makes sense to you, you may still find yourself falling into old habits of seeking happiness via worldly success at work. Don’t feel too bad—I do it too, even as a specialist in this field. Whenever I notice my hours creeping up to workaholic levels and my dreams of happiness revolving around some accomplishment, I like to reread a short story published in 1922 by Franz Kafka called “A Hunger Artist.”
It features a man who starves himself in a cage for a living as a traveling carnival act. He is obsessed with his work and, as a perfectionist, seeks what he calls “flawless fasting.” The hunger artist is proud of his success, although he is always gloomy, and, Kafka writes, “if a good-natured man who felt sorry for him ever wanted to explain to him that his sadness probably came from his fasting … the hunger artist responded with an outburst of rage.”
Over time, the hunger artist’s act falls out of public favor. In desperation to resuscitate his flagging career, he tries fasting longer than he ever has before. Instead, he is utterly ignored, and sits alone in his cage. In the end, the hunger artist starves himself to death. In a twist of absurdism (we might even call it Kafkaesque), the protagonist admits just before expiring that the only reason he had engaged in his art was because he could not find any food to his liking.

I’m not that bad, of course, but I have a bit of a hunger artist in me, and you might too. Here’s my advice: You won’t find happiness by forgoing happiness. Don’t starve yourself. Your odds of success will increase if you eat.
I am not myself lately. Then again, was I ever? I’m not the self I was a year ago, or the one I will be in five minutes. My sense of reality is ephemeral, and my circumstances are constantly rewriting the narrative. My brain wants to make sense of all that, though, so it keeps trying to find order and actualization. But what it keeps writing, as Emory University psychology professor Gregory Berns puts it, is its own “historical fiction.”
In his apt and timely new book, “The Self Delusion: The New Neuroscience of How We Invent — and Reinvent — Our Identities,” Berns, author of “How Dogs Love Us,” explores the neuroscience of self-perception and the clever, confounding ways we attempt to tell the stories of our lives. Along the way, Berns explains the newest science of how memory, perception and influence play upon our pliable minds, and offers insights into better understanding who we are, and who we can be.
Salon spoke to Berns recently about how our brains prime us to create stories (and superstitions), how COVID drove us into a “collective existential crisis,” and the secret to shifting the tales we tell ourselves.
I was a newcomer to the concept of computational neuroscience. For those who have not yet read the book, what is this discipline and why is it significant in our understanding of the brain?
Computational neuroscience has been around, in some form or another, probably for fifty years. It used to go by different names. It first started out as AI, artificial intelligence, back in the sixties, then went through various iterations. By the time I was in training in the nineties, it was equivalent to what then was called neural networks, which now underlie everything in AI.
AI has evolved from the fifties and sixties style of AI, where people were hopeful that computers could be trained to do things that humans do.
It evolved into this area where neural nets were discovered. Originally, these neural nets were based on what we knew about the brain, but then they went off on their own. As we have them today, they underlie what we now know as AI. That’s everything from image recognition to self-driving cars.
Computational neuroscience is an umbrella term that covers all of these things, but with a little more emphasis on the neuroscience side, so understanding how the human brain does computations. Then all the AI people take that and put their twist on it and make computer algorithms.
We think of computers as operating like brains, but also our brains work like computers.
That’s right. A computer’s obviously man-made, but it’s an analogy. The brain is a type of computer. In particular, I think, along with a lot of neuroscientists, that it’s fundamentally a prediction computer or a prediction engine. That’s what brains evolved to do: make internal models of how the world works so the owner of that brain can survive and outwit competitors. Or, if they’re prey, to avoid predators, always just staying one step ahead of things.
The better prediction that the brain does, the better the person or the animal will do. There’s been a strong evolutionary pressure to make brains very good at anticipating things that might happen in the future.
We now live in a world where anticipating things has bitten us in the butt, because we also live in this state of heightened anxiety. You open the book by talking about the self as, in its simplest terms, our past self, our perceived current self and our future self.
What you just described is the beauty of what the brain does. I don’t think this is necessarily specific to humans; I think all animals do this to varying degrees. It’s just that humans overlay it with a narrative so that we have a way of putting meaning on things, if you will.
The anticipation bit is not just about what’s going to happen. It’s the consideration of the world of things that might happen. And not only that. We also have the capacity to look back in time and imagine things that might have been, the what-if scenarios. These are all various forms of predictions that the brain has evolved to do, to help humans in particular flourish in this world.
It feels like in the past few years, there has been a deeper interest in the study of the self and self-perception. Our anticipation is different from the experience, which is different from the memory. Why do we need to understand that truth and those subtleties of our perception?
I think COVID has put everyone into a collective existential crisis. The last few years have brought somewhat excessive navel-gazing about why we’re here. Each of us comes to that individually, and we each have our own idiosyncratic ways of dealing with it, but the whole conundrum is the curse and the benefit of the human condition.
However it is that we ended up the way we are, we have the skill of conceptualizing ourselves in the past, present and future. It’s not clear that any other animal can do that, not in any substantial way. I look at my dogs, and they’re clearly conscious and sentient, but I am not sure that they have a conception like we do, that they existed yesterday and they’re going to exist tomorrow.
Most other animals don’t have the need for this. If you know that you existed in the past and you know that you’re going to exist tomorrow, how do you make sense of that? If you think about it, that is a pretty awesome understanding. It requires time shifting, it requires a huge cognitive apparatus to do that.
As I maintain in the book, you also have to contend with the fact that we physically change.
Not so much day-to-day, but certainly over the years. If you look back at your childhood pictures, for all intents and purposes, those are different people. They’re not the same person you are today. Physically, there may be some resemblance, but the molecules have been rearranged so many times in your body and your brain that it’s just not the same person.
So, we have this realization that somehow that person was us at a different time, but they’re not us now, and we’re going to be different in a year or ten years. We have to construct some mechanism to link all these together. The way we do that is through narrative and storytelling. We have to, just for our own psychological health, construct something that links all these versions of ourselves together. Otherwise, the alternative is completely existential, that there is nothing unifying past, present and future, and the universe is random. Psychologically, we can’t handle that.
As you point out, while there are distinct cultural and individual differences, there are also some universal ground rules to the ways we construct these narratives. One we can all relate to is, for the most part, that episodic aspect of it. I don’t know any other way to really make sense of my life, but then COVID put us in a very nonlinear narrative.
The analogy I used was being on a train, where you think of a train ride as a journey between stations, or stops, where not much happens in between. The way you encode it, then, is by the stops, or, as you say, episodically.
I think there’s a couple of reasons for that. Probably the most important is that our brains do not appear designed or evolved for continuous recording, or at least recalling things in kind of a continuous fashion. Our brains are not video recorders in the sense that a camera is. It seems as if the memories themselves are laid down in an episodic fashion and those episodes are defined by when things happen.
Most of the day, nothing happens. I don’t think it’s been calculated, but we go through the day, probably 90% of the day is pretty static, and then the other 10% is just stuff happening. That’s going to vary from day to day. When stuff happens, when something in the world changes or something changes in you, those are the things that we encode in memory and those are the things that get stored.
When you recall a memory, you can’t call up the exact recording of what happened. You have these sparse instances that you can call up. But you still have to fill in the gaps somehow, because they’re not just still images, they’re highlights. It’s the highlight reel of the day, or of your life.
The brain has to fill in those gaps. The thing I’ve become fascinated about is, how do you fill in those gaps? The best answer I have is that they’re built on what psychologists call schemas. Or if we want to be mathematical about it, I call them basis functions. These are the templates that get laid down early in life as children. These are the stories that we hear when we’re young, because those are the stories where the child doesn’t have many of their own experiences. Not much has happened. Those are the templates for understanding the world, when the parents tell their kids stories.
These are fairy tales and fables and simple stories, good versus evil. These are going to be culturally different depending on where you grew up, but there are some common themes. Importantly, those are the templates that stay with us throughout our lives and help us interpret these episodic events as they happen to us. They provide a ready framework for slotting things into as they come.
One of the things that is universal also is that fine line between superstition and myth-making and straight-up delusion. It’s a way of creating pattern more than anything else, looking for explanations and answers in things that might otherwise seem random. Is that another aspect of just our brains needing to create order?
It is. And that’s part of the prediction engine that’s baked into all animals’ brains. The brain is a prediction engine. It’s that way because there was, at some point in time, a survival advantage to that, and there still is.
If you think about the alternative, let’s say that life is just a series of random events that are completely unconnected to each other. If that were the case, then there really wouldn’t be any survival advantage to having a predictive brain, because if things were random, then there’s nothing to predict.
The fact that we can predict things is also a reflection of the world that we’ve evolved in, that there is some amount of order there, certainly not 100%, but there’s enough order that brains can extract it. That drives things, even when there is no predictability or causality. It’s not like you can turn off the prediction engine; it’s always going.
That’s where superstitions come from. It’s like if two events happen in close proximity to each other, then the brain’s naturally going to equate them in some causal way, even if they’re not. That’s how superstitions arise. Then you can consider superstitions the building blocks of storytelling or fables.
It doesn’t take much to spin a superstition up into something quite elaborate. Whether you call it a delusion or a conspiracy theory, it doesn’t take much.
That leads us into groupthink and the double-edged sword of living in a social environment, because we need each other, we take our cues from each other. We are impacted in our morality by each other. Looking at this country today, do we seem more polarized, or are we actually more polarized? And what is that in our brains that we can learn from?
We’re definitely polarized. I think the question then is why. It comes back to this: we humans have to ascribe meaning to events, because that is the nature of being human. We tell stories. In this country, we’ve basically got a series of events that have happened. Whether it’s COVID, climate change, or politics, you can pick any one of those things.
Some things happen, and we can agree on specific events that happened probably because they’ve been recorded in various forms from the media. But the interpretation of them is vastly different. The thing that’s fascinating about all of this is, how can two people have completely diametrically opposite views of what happened? How does that come about? The answer is because they have different basis functions to interpret the events. They have different schemas.
You take January 6th. Perfect example. You’ve got a sizable portion of the country that looked at those events and interpreted them in one narrative framework, one schema. Then you’ve got a whole bunch of other people who interpreted them entirely differently; they’re operating on different schemas, different narrative basis functions.
The book is called “The Self Delusion,” but you later more deeply describe it as the “self historical fiction.” What does that mean when you say that our concept of the self is historical fiction?
It means that it’s an interpretation of our past. Self-identity comes from the story that you tell about your life, which is the historical part. But it is a story. I hope to convince the reader there isn’t just one story for anything. That story is one that you choose, and you have the capability of telling it in different ways.
In that sense, it is fiction. The story you tell yourself is a sort of fiction. It’s almost a delusion. The story you tell about yourself to other people is probably a slightly different version, so that’s a different fiction. This goes on and on.
I hope to convey in the book that the stories you choose to tell, we have control over that to some degree. Actually, the best way to shift your storytelling, if that’s what you want to do, is by controlling what you consume. Because as we were just talking about, a lot of this is influenced by what other people say. Our brains are very good at mixing up things that happen to us versus things that happen to other people. The provenance of our memories gets all muddled.
That’s right, we are. And so, if you want to be a good curator, then you need to be careful about the types of things that you consume from other people, because that will heavily influence your own thought processes.
How has working in this field affected how you go about your day-to-day life, perceiving what you’re doing at any given moment? Do you have a different kind of selectivity as you experience a particular event, or recall something from your past?
For me, I don’t feel beholden to my past self, if that makes any sense. Some people have an ethos that they have a life purpose and then they have to carry on a legacy. And for some people, it can be very heavy. It might be passed down from generations.
I like to think that I’ve shed some of that; I’d be lying if I said I’ve done it completely. I think COVID in particular has made us all aware how short life really is. I’ve resolved to do what I want to do in however many years I may have left on this earth. I kind of allude to that in the epilogue that I’m a very different person now than even when I started writing the book.