“We’re all working, but not for work’s sake,” he says. “Ultimately, we’re working for something else.” Norton came to this realization after a string of tragedies. His 21-year-old brother-in-law, Gavin, died in his sleep. His son Gavin, named after his uncle, died at just 10 weeks of age from whooping cough. His wife had a stroke, and his 11-year-old son was hit by a car.
“I was like, ‘Does God hate me?’” he says. “So, I’ve tried to spend my time creating time. How can this job free up space? How can this job support me having more time with my family or more time to travel or more time for the things that really matter, as opposed to endlessly working toward something?”
Norton calls his philosophy Gavin’s Law, which is, “Live to start. Start to live.”
Rethinking Time Management
Norton was a mentee of Stephen Covey, the time management icon and author of The 7 Habits of Highly Effective People. “Stephen said to ‘begin with the end in mind,’” says Norton. “But he never said begin with means in mind. I think, in many ways, we’ve turned goals and habits into ends unto themselves, when in reality, all these things we’re doing are meant for us to live in a different way.”
Traditional time management tools are designed to wring every drop of blood, sweat, and tears from workers, explains Norton. “[They were] never designed for freedom,” he says. “The question is, ‘Who manages my time under time management?’ Traditionally, it is about control. Your employer controls your time. They create your schedule. They tell you what to do, when and where. And, if you want to get crazy, they determine that you only have two weeks out of the year for vacation and when you get to retire.”
Norton suggests embracing “anti-time management.” Instead of being the opposite or the reverse, it’s a different level of thinking. “You control your time,” says Norton. “You decide what you want to do, when and where. You decide if you want to create space or not.”
To practice anti-time management, start by identifying “final causes.” “It’s a term from Aristotle,” says Norton. “The idea is [that] an acorn becomes an oak tree. But in real life, a lot of us are planting seeds thinking they’re going to be an oak, when they never will. Why not just plant an oak tree from the start?”
A “final cause” is that for the sake of which something is done. It’s not the goal; it’s the success that comes from achieving the goal. “Once you realize the final cause, you can change the decision tree around who you want to be and what you really want to do and set up from the dream instead of working endlessly toward it,” says Norton.
For example, you may want to create a table. You might make a design, hire a contractor, and build a table. “That’s great if the goal is to have an heirloom table, but what if the purpose was just to have dinner?” says Norton. “What does success look like after success? Once you understand that, you might realize you could have gotten UberEATS or gone to a food truck. In that process, steps disappear, and you get your time back because the steps weren’t necessary at all.”
“Tip” Time Instead of Managing It
Norton calls the process “time tipping”—rescuing your dream from the end of a timeline and putting it front and center. To tip time and get to the real goal of your goal, you need to get clear on the four Ps of purpose: personal, professional, people, and play.
Your personal purpose relates to the priorities that are just for you, such as health or spirituality. Professional purpose relates to your career, such as promotions or recognition. Your people purpose relates to those around you who are important in your life, such as your family. And your play purpose relates to activities that make you feel energized. Make a list of your Ps, and whenever you have a choice, ask yourself if it fits your purpose priorities.
“They become your North Star,” says Norton. “Does the action bring you closer to or fulfill one of these four goals or not?”
Stop managing time and start prioritizing attention, says Norton. “Pay attention to what you really want, then fast forward the model, going from purpose to priorities to projects,” he says. “The way you’re paid should be in alignment with what you want to be doing. Change the way you work. Change how you’re paid. Change your life. The work you choose determines your lifestyle.”
Children looking over Aberfan, Wales, in the wake of the disaster there in 1966. When it finally happened, shortly after nine o’clock in the morning on October 21, 1966—when the teetering pile of mining waste known as a coal tip collapsed after days of heavy rain and an avalanche of black industrial sludge swept down the Welsh mountainside into the village of Aberfan, when rocks and mining equipment from the colliery slammed into people’s homes and the schools were buried and 116 young children were asphyxiated by this slurry dark as the river Styx—the anguished public response was that someone should have seen this disaster coming, ought to have predicted it.
Or at least, some claimed they had. Shortly after the tragedy at Aberfan, several women and men recalled having eerily specific premonitions of the event. A piano teacher named Kathleen Middleton awoke in North London, only hours before the tip fell, with a feeling of sheer dread, “choking and gasping and with the sense of the walls caving in.” A woman in Plymouth had a vision the evening before the disaster in which a small, frightened boy watched an “avalanche of coal” slide towards him but was rescued; she later recognized the child’s face on a television news segment about Aberfan.
One of the children who died had first dreamt of “something black” smothering her school. Paul Davies, an 8-year-old victim, drew a picture the night before the catastrophe that showed many people digging in a hillside. Above the scene, he had written two words: The End. Premonitions this dramatic and alarming are likely rare. But most of us have experienced odd coincidences that make us feel, even for an instant, that we have glimpsed the future. A phrase or scene that triggers a jarring sensation of déjà vu.
Thinking of someone right before they text or call. Inexplicably dreaming about a long-lost acquaintance or relative only to wake and find that they have fallen ill or died. It’s mostly accepted that these are not really forms of precognition or time travel but instead fluky accidents or momentary brain glitches, explainable by science. And so we don’t give them a second thought or take them that seriously. But what if we did?
The Premonitions Bureau, an adroit debut from The New Yorker staff writer Sam Knight, draws us into a world not that far gone in which psychic phenomena were yet untamed by science and uncanny sensations still whispered of the supernatural, of cosmic secrets. Knight’s book registers the spectral shockwaves that rippled out from Aberfan through the human instrument of John Barker, a British psychiatrist who began cataloguing and investigating the country’s premonitions and portents in the wake of the accident.
Barker spent his career seeking out the hidden joints between paranormal experience and modern medicine, asking scientific questions about the occult that we have now agreed no longer to ask. In Knight’s skillful hands, the life of this forgotten clinician becomes a meditation on time and a window through which we can perceive the long human history of fate and foresight. It’s also a tale about how we decide what is worthy of science and what it feels like to be left behind. It is a story about a scientific revolution that never happened.
Forty-two years old when the country learned of Aberfan, John Barker was a Cambridge-educated psychiatrist of terrific ambition and rather middling achievement. In his thirties, he had been an unusually young hospital superintendent at a facility in Dorset; a nervous breakdown led to his demotion and reassignment, by the mid-’60s, to Shelton Hospital, where he cared for about 200 of the facility’s thousand patients. Shelton was a Victorian-era asylum in western England, not far from Wales, and a hellish world unto itself.
Local doctors called it the “dumping ground,” this 15-acre gothic facility of red-brick buildings hidden behind red-brick walls, where women and men suffering from mental illness were deposited for the rest of their lives. One-third of Shelton’s population had never received a single visitor. Like other mental health facilities in midcentury Britain, it was a place of absolutely crushing neglect. “Nurses smoked constantly,” Knight writes, “in part to block out Shelton’s all-pervading smell: of a house, locked up for years, in which stray animals had occasionally come to piss.” Every week or two, another suicide. “The primary means of discharge was death.”
As a clinician, Barker was tough and demanding. He was also complicated (like all of us) and hard to caricature. Barker had arrived at Shelton as calls for psychiatric reform were growing louder, and he supported efforts to make conditions “as pleasant as possible” for the hospital’s permanent residents, including removing locks from most of the wards and arranging jazz concerts. But he also favored aversion shock therapies and once performed a lobotomy—which, to his credit, he later regretted.
At any rate, Barker’s true passion lay elsewhere. As a young medical student, he collected ghost stories from nurses and staff at the London hospital where he was training: sudden and unaccountable cold presences late at night, spectral ward sisters who shouldn’t have been there and who vanished when you looked twice. A “modern doctor” committed to rational methods, he was nonetheless drawn to all things paranormal, an interest that led him to join Britain’s Society for Psychical Research, whose members had been studying unexplained occult phenomena since 1882.
Barker had a crystal ball on his desk and spent his weekends at Shelton rambling around haunted houses with his son. He was a man caught between worlds who would eventually fall through the cracks. The day following the disaster, Barker showed up in Aberfan to interview residents for an ongoing project about people who frightened themselves to death. But he realized quickly that his questioning was insensitive—and as he learned more about the uncanny portents and premonitions that were already swirling around the tragedy, he sensed a much greater opportunity.
Barker contacted Peter Fairley, a journalist and science editor at the Evening Standard, with his hunch that some people may have foreseen the disaster through a kind of second sight. Days later, the paper broadcast Barker’s paranormal appeal to its 600,000 subscribers: “Did anyone have a genuine premonition before the coal tip fell on Aberfan? That is what a senior British psychiatrist would like to know.”
A gifted scientific popularizer, Fairley shared with Barker a knack for publicity as well as tremendous ambition. Within weeks, the two men had dramatically expanded the project. From January 1967, readers were told to send general auguries or prophecies to a newly established “Premonitions Bureau” within the newsroom. “We’re asking anyone,” Fairley told a BBC radio interviewer, “who has a dream or a vision or an intensely strong feeling of discomfort” which involves potential danger to themselves or others “to ring us.”
With Fairley’s brilliant assistant Jennifer Preston doing most of the work, the team categorized the predictions and tracked their accuracy. Their hope was to prove that precognition was real and convince Parliament to use this psychic power for good by developing a national early warning system for disasters. “Nobody will be scoffed at,” Fairley insisted. “Let us simply get at the truth.”
Seventy-six people wrote to Barker claiming premonitory visions of the Aberfan disaster. Throughout 1967, another 469 psychic warnings were submitted to the Bureau. Many of these submissions came from women and men who claimed to be seers, who experienced precognition throughout their lives as a sort of sixth sense. Kathleen Middleton, the piano teacher who awoke choking before the coal tip collapse, became a regular Bureau contact who had been sensitive to occult forces since she was a girl.
(During the Blitz, a vision of disaster convinced her to stay home one night instead of going out with friends; the dance hall was bombed.) Another frequent contributor was Alan Hencher, a telephone operator who wrote that he was “able to foretell certain events” but with “no idea how or why.” The premonitions gathered by Barker ran the gamut of believability. Some were instantly disqualified. Others were spookily prescient. In early November 1967, both Hencher and Middleton warned of a train derailment; one occurred days later, near London, killing 49 people.
Hencher suffered a severe headache on the evening of the disaster and suggested the time of the accident nearly to the minute, before the news had been reported. Most of the premonitions appear to have been vague enough to be right if you wanted them to be, if you were willing to cock your head to one side and squint. A woman reported a dream about a fire; on the day she mailed her letter, a department store in Brussels burned. One day in May 1967, Middleton warned about an impending maritime disaster; an oil tanker ran aground.
Visions of airliner crashes inevitably, if one waited long enough, came true somewhere in the world. Barker was determined to believe in them. “Somehow,” he told an interviewer, seers like Hencher and Middleton “can gate-crash the time barrier … see the unleashed wheels of disaster before the rest of us.… They are absolutely genuine. Quite honestly, it staggers me.”
For those of us unable to gate-crash time itself, one wonders what it would be like to have this kind of premonitory sense, to perceive the future so viscerally and so involuntarily. It was like knowing the answer for a test, some explained, with cryptic keywords floating in space in their imaginations. ABERFAN. TRAIN. Others had physiological symptoms. Odd smells, like earth or rotting matter, that nobody else could perceive, or a spasm of tremors and pain at the precise moment when disaster struck far away.
People who sensed premonitions explained to Barker that it was an awful burden, that they grappled with, as one put it, “the torment of knowing” and “the problem of deciding whether we should tell what we have received” in the face of potential ridicule or error. Prone to a certain grandeur, Barker believed that the stakes of the project, which he called “essential material and perhaps the largest study on precognition in existence,” were high. Practically speaking, he thought it would help avert disaster.
(If the Premonitions Bureau had been up and running earlier, he boldly claimed, Aberfan could have been avoided and many children’s lives saved.) More daringly, Barker thought that proving the existence of precognition would overturn the basic human understanding of linear time. He wondered if some people were capable of registering “some sort of telepathic ‘shock wave’ induced by a disaster” before it occurred. It might be akin to the psychic bonds felt between twins, but able to vanquish time as well as space.
Inspired by Foreknowledge, a book by retired shipping agent and amateur psychic researcher Herbert Saltmarsh, Barker thought that our conscious minds could likely only experience time moving forward, and in three distinct categories: past, present, and future. To our unconscious, however, time might be less stable and more permeable. If scientists would “accept the evidence for precognition from the cases” gathered by the Bureau, he said, they would be “driven to the conclusion that the future does exist here and now—at the present moment.” Barker sensed a career-defining discovery just around the corner.
But it was not to be. John Barker died on August 20, 1968 after a sudden brain aneurysm. He was 44 years old. The Bureau, which Jennifer Preston dutifully continued through the 1970s, and which ultimately included more than 3,000 premonitions, represented the last, unfinished chapter of his brief life. He never wrote his book on precognition and fell into obscurity. The morning before he died, Kathleen Middleton woke up choking.
Knight narrates Barker’s story with considerable generosity and evident care. Rather than condescend or deride him as a crank, Knight thinks with Barker: about the strangeness of time and our human ways of moving through it, about how we make meaning from chaos and resist the truly random, about prediction and cognition and our hunger for prophecy. Yet the many disappointments in Barker’s career were not incidental to his significance, and emphasizing them does not diminish him.
In fact, his life can also be framed as a tale told much too rarely in the history of science, about how scientific inquiry relies as much upon failure as success in order to function, on exclusion as much as expansion. Around the time Barker was appointed to his role at Shelton, the American historian and philosopher of science Thomas Kuhn published a book called The Structure of Scientific Revolutions, a landmark work that now structures practically everyone’s thinking without them realizing it. What Kuhn proposed was that scientific research always occurs within a paradigm:
a set of rules and assumptions that reflect not only what we think we know about how the universe works, but also the questions we are permitted to ask about it. At any given moment, “normal science” beavers away within the borders of the current paradigm, working on “legitimate problems” and solving puzzles. For a long while, Kuhn explained, phenomena “that will not fit the box are … not seen at all,” and “fundamental novelties” are suppressed. Eventually, however, there are too many anomalies for which the reigning paradigm cannot account. When a critical mass is reached, the model breaks and a new one is adopted that can better explain things. This is a scientific revolution.
For Barker, precognition constituted what Kuhn would have called a legitimate problem within normal science: It ought to be studied using experimental methods and would, he thought, one day be explained by them. But he admitted the risk that modern psychiatry might not ever be able to accommodate the occult, that his work on premonitions could break the paradigm altogether. Hunches and visions that came true might demand a new way of explaining time and energy. “Existing scientific theories must be transformed or disregarded if they cannot explain all the facts,” he lectured his many critics. “Although unpalatable to many, this attitude is clearly essential to all scientific progress.”
He seems to have seen himself as a contemporary Galileo, insisting upon empirical truth in the face of “frivolous and irresponsible” gatekeepers. “What is now unfamiliar,” he argued in the BMJ, usually tends to be “not accepted, even despite overwhelming supportive evidence. Thus for generations the earth was traditionally regarded as flat, and those who opposed this notion were bitterly attacked.” Barker wanted the ruling scientific paradigm to make room for the paranormal—or give way.
It wasn’t so implausible, in midcentury Britain, that it just might. A craze for spiritualism and the paranormal had swept the country between the two world wars, and a rash of new technologies that seemed magical (the telegraph, radio, television, etc.) left many Britons, not unreasonably, wondering if “supernatural” phenomena like prophecies or telepathy might turn out to be explainable after all. In Barker’s Britain, one quarter of the population had reported believing in some form of the occult. Even Sigmund Freud, nervously protecting the reputation of psychoanalysis, refused to dismiss paranormal activities “in advance” as being “unscientific, or unworthy, or harmful.”
In physics, too, Knight points out, “the old order of time was collapsing” by midcentury, thanks to developments in relativity as well as quantum mechanics. For experts, time had become less predictable and mechanisms of causation less clear, both subatomically and cosmically. Barker had been formed, in other words, by “a society in which one set of certainties had yet to be eclipsed by another.”
But instead of rearranging itself around Barker’s research into precognition, the paradigm shifted away from him and snapped more firmly into place. The walls sprang up, and the questions that interested Barker became seen as illegitimate and unscientific. The Bureau he built with Fairley was not all that successful. Only about 3 percent of submissions ever came true, and in February 1968 a deadly fire at Shelton Hospital itself went unpredicted, to the unabashed glee of critics and satirists.
Barker’s supervisors grew skeptical and then embarrassed. As time went on, and the boundaries of the scientific paradigm in which we still live grew less permeable, occult phenomena were explained not by bending time, but with recourse to cognitive science and neurology. Premonitions became understood not in terms of extrasensory perception but simply misperception: the work of cognitive error or misfiring neurons rather than the supernatural.
The popular understanding of scientific revolutions still revolves around big ruptures and great scientists, the paradigm-defining concepts (like heliocentrism, gravity, or relativity) that transform how human beings think they understand the universe: We shift the frame to move forward. Yet there is just as much to be learned from the times when revolutions don’t occur, when scientific inquiry is defined not by asking thrilling new questions, but by the determination that some old questions will no longer be asked.
What’s so brilliant about Knight’s account, in the end, is the way it portrays a creative workaday researcher rather than a modern-day Newton or Einstein, a man aspiring to do normal science while the rules shifted around him; the way it conveys the rarely captured feeling of a paradigm closing in around you and your ideas, until it all fades to black.
Are you exhausted from rushing through life doing the same monotonous things over and over again? Perhaps those things that were once meaningful now seem vacuous, and the passion has burned out. Do you feel that pleasures are short-lived and ultimately disappointing, that your life is a series of fragments punctuated with occasional ecstasies that flare up and then, like a firework, fade into darkness and despair? Perhaps you are lonely or pine for past loves. Or you feel empty and lost in the world, or nauseous and sleep-deprived.
Maybe you are still looking for a reason to live, or you have too many confused reasons, or you have forgotten what your reasons are. Congratulations – you’re having an existential crisis. Sometimes, the questions ‘Why am I here?’ and ‘What’s it all for?’ haunt you gently like a soft wingbeat with barely a whisper, but sometimes they can feel as if they are asphyxiating your entire being.
Whatever form your existential crisis takes, the problem, as the Danish philosopher Søren Kierkegaard (1813-55) saw it, was that living without passion amounts to not existing at all. And that’s bad for all of us because, without passion, rampant waves of negativity poison the world. Kierkegaard thought that one of the roots of this problem of a world without passion is that too many people – his contemporaries but, by extension, we too – are alienated from a society that overemphasises objectivity and ‘results’ (profits, productivity, outcomes, efficiency) at the expense of personal, passionate, subjective human experiences.
In his journal, Kierkegaard wrote: ‘What I really need is to be clear about what I am to do, not what I must know … the thing is to find a truth which is truth for me, to find the idea for which I am willing to live and die.’ Finding this truth, this passion, was what Kierkegaard thought could unite an existence, overcome melancholia, and help you to become more fulfilled. Kierkegaard had some ideas about how to harness the anguish of what we have come to think of as an existential crisis. Reading Kierkegaard won’t necessarily solve all problems, but it can help you understand some of the sources of your malaise and to see new possibilities for your life.
Sometimes, Kierkegaard is called the first existential philosopher because of his emphasis on the individual and subjective experience. Existential philosophers stress freedom of choice and responsibility for the consequences of your choices, and certainly one of the quintessential existentialist philosophers, Jean-Paul Sartre (1905-80), found this vein of thinking in Kierkegaard’s writing. For existentialists, it’s up to you to decide the kind of person you want to be and how to live your life meaningfully.
But these choices can bring despair because of the pressure that comes when you realise you’re free and responsible and have no one else to blame, no excuses for your behaviour. Anxiety, or despair, Kierkegaard wrote, is the ‘dizziness of freedom’. Despair is a kind of vertigo we get when overwhelmed with possibilities and choices. Kierkegaard described it as a similar feeling to standing on the edge of an abyss. You might be afraid of falling, but anxious when you realise that jumping is a possibility.
We are forced to make choices all the time, whether we like it or not. Consider toothpaste: there are so many types and it’s difficult to choose the one that’s best for your teeth. Whitening or stain-removal? Cavity protection, anti-plaque or enamel repair? What’s the difference? Why isn’t there one that does everything? It’s hard to know what the outcome of choosing one over the other will be. While choosing the wrong toothpaste probably won’t devastate your life, when you face more profound choices –
such as what to study at college, whom to marry, whether to end a relationship, which career to pursue, whether to try to save someone who is drowning, if you should turn off a loved one’s life-support system – the closer you come to the edge of the abyss, the dizzier you will feel about your possibilities and responsibilities. Sometimes you live in ignorant bliss about your options but, once you become aware of them, wooziness is inevitable. As Kierkegaard wrote in The Concept of Anxiety (1844):
He whose eye happens to look down into the yawning abyss becomes dizzy. But what is the reason for this? It is just as much in his own eye as in the abyss, for suppose he had not looked down.
Sometimes, the dizziness of your freedom is so overwhelming that you might feel compelled to step back, to shrink from making a choice. Making no choice, or letting someone else choose for you, can feel easier. The greater the stakes, the deeper the abyss, and the further you have to fall if you misstep. But your personal growth depends on your ability to handle big choices yourself and not to shirk them. For Kierkegaard, bravely facing up to our choices and learning to channel our anxiety in constructive ways is vital: ‘Whoever has learned to be anxious in the right way has learned the ultimate.’
During his lifetime, Kierkegaard made authorities nervous because he was an iconoclast who encouraged people to think for themselves. He challenged readers to break free from the brainwashing of churches and community groups that preached what to do and what to believe, particularly the Lutheran Church of Denmark, with which he was at loggerheads for much of his later life. Kierkegaard might also have been deeply suspicious of today’s social media and advertising that tells us where to spend our money and time in the elusive pursuit of happiness. In a criticism that seems to have anticipated online trolls, he proposed that ‘the crowd’ or the public is ‘untruth’ because it enables people to be anonymous, irresponsible and cowardly, and because it creates an impersonal atmosphere.
Kierkegaard was a Christian, ‘albeit a maverick Christian’, as the philosopher Gary Cox put it, because Kierkegaard emboldened people to develop a personal relationship with God instead of unreflectively assuming what the clergy sermonised. For Kierkegaard, living the truth is infinitely more important than objectively knowing it. At Kierkegaard’s funeral, the archdeacon who gave the eulogy told the huge crowd not to misunderstand or accept what Kierkegaard had written because he went too far and didn’t know it.
But you don’t need to be religious to glean practical wisdom from Kierkegaard’s work. He inspired many atheist philosophers. Sartre, as I’ve mentioned, deeply admired Kierkegaard. He called him an ‘anti-philosopher’ because Kierkegaard sought ‘a first beginning’ by pushing back against boring and abstract philosophies, such as G W F Hegel’s and Immanuel Kant’s, which were very popular during Kierkegaard’s time.
Kierkegaard wrote in unconventional ways. He was witty and came up with quirky pseudonyms such as ‘Hilarius Bookbinder’. Kierkegaard wrote pseudonymously not because he wanted to hide his authorship – pretty much everyone knew which books he’d authored – but to distance himself from his work; to challenge us to question the ideas he presents; to take responsibility for interpreting the text’s meaning; to inspire us to come to our own conclusions; and to create our own subjective truths. The strategy is called ‘indirect communication’. The effect of Kierkegaard’s work is that, instead of dictating and moralizing, he provokes – because you can’t tell if he’s being serious or not – and invites readers to dance with ideas.
Kierkegaard uses indirect communication in one of his most famous works, Either/Or (1843), a fictional collection of letters and essays written by different characters and presenting different points of view: aesthetic, ethical, and religious. These three views, or phases, provide a possible framework for how to endure and overcome an existential crisis. The phases are not rigid steps, but rather offer a scaffolding of possible experiences on an existential journey to reinvigorate our passion for life.
Think it through
Enjoy the aesthetic elements of your life
Kierkegaard suggested that the first mode of living is the aesthetic sphere. Aesthetic living is fun and impulsive, focused on sensual satisfaction, like a child who is discovering the world with awe and wonder. The aesthetic sphere is a beautiful phase of life, passionate and sparkling with possibilities. Consider the thrill of falling in love, the delight of seeing your all-time favourite musician live in concert, the elation of sharing a delicious bottle of wine or meal with a good friend, or the exhilaration of skinny-dipping on a whim. These experiences can be intoxicating, extraordinarily interesting, and make you feel like your life is transformed if you submit to them.
Don Giovanni – the protagonist of Mozart’s opera Don Giovanni (1787), a legendary seducer who is also sometimes known as Don Juan – is, Kierkegaard suggested, the ultimate archetype of the aesthetic mode because he lives for immediate sexual gratification and sensuality. Don Giovanni is a player. He is handsome, seductive and exciting. Women find him irresistible: he has slept with more than 2,000 women whose names he records in his not-so-little black book. Don Giovanni seeks pleasure above all else, and dances through his hedonistic life.
How can you live aesthetically? Make your life as interesting and enjoyable as possible. Fall in love a lot. Rotate crops – meaning that, if you’re bored with your life, don’t be afraid to leave behind what doesn’t serve you and start planting seeds for fresh projects and new relationships that energize you. Be impulsive. Live for and in the moment. Cultivate arbitrariness for the sheer pleasure of it: go to the theatre but watch only the middle of the performance; pick up a book and read a random passage. Enjoy experiences in disruptive ways, different than what others are spoon-feeding you. Practise the art of remembering the joys of your past. Practise the art of forgetting unpleasantness by focusing on the silver linings of your misfortunes. Burn the candle of your life at both ends.
Make existential commitments to live ethically
However, an aesthete’s actions can be self-sabotaging, because, as Kierkegaard pseudonymously writes:
As when one skims a stone over the surface of the water, it skips lightly for a time, but as soon as it stops skipping, instantly sinks down into the depths, that is how Don Giovanni dances over the abyss, jubilant in his brief respite.
Don Giovanni gets his comeuppance in the end when a ghost in the form of a statue of the Commendatore, the father of one of his conquests and a man whom Don Giovanni has killed in a fight, drags him down to hell. You might not be dragged to hell by a ghost, but living purely in the aesthetic mode – though it might offer temporary respite – puts you on the fast track to a further existential crisis.
Why is this? The answer is that the aesthetic lifestyle demands a high price. Aesthetic living can be a source of existential despair when you become overly dependent on its distractions to fill the voids in your life. The aesthetic mode is dangerous when you live in a state of immediacy and instant gratification, constantly overindulging in such pleasures as social media scrolling, shopping, television, busyness, alcohol, drugs, serial romancing or casual sex. At a certain point, these activities cease to offer the enjoyment they promise, and the world turns grey.
Wallowing in such distractions only entrenches your alienation more deeply and pushes you more squarely into dungeons of unhappiness. As soon as you’ve satisfied one pleasure, you’re chasing the dragon of newness for the next high. Sometimes you’re so excited about taking risks on new possibilities, so in love with starting new projects and relationships, that you’re constantly flitting from one to the next, never finishing anything. Constantly on the move, you are like an ocean wave, surging powerfully, cyclically, with raw primal energy.
But waves froth and fizzle away indefinitely. If you’re constantly and busily churning through life, your existence amounts to a sum of moments without any real cohesion. Excitement fades and leaves in its wake disappointment and loneliness. The aesthete in Either/Or is envious of insects that die after copulation because they are able to indulge in the pinnacle of sexual ecstasy and then escape life’s greatest anticlimax – the ‘petite mort’ becomes a real one. An aesthetic life will inevitably leave you morbidly tired.
Kierkegaard’s aesthete is plagued with such soul-crushing tedium and torturous despair that he is numb. Because he isn’t truly engaged in life, he lives as if he were dead. Living void of passion makes him feel both chained by his anxieties and also cast adrift, like a spider plunging and flailing around, unable to grasp hold of anything:
What is to come? What does the future hold? I don’t know, I have no idea. When from a fixed point a spider plunges down as is its nature, it sees always before it an empty space in which it cannot find a footing however much it flounders. That is how it is with me: always an empty space before me, what drives me on is a result that lies behind me. This life is back-to-front and terrible, unendurable.
So if living aesthetically can only be a short-term solution to an existential crisis, how can you go beyond that and live ethically? Stop skimming over life like that stone. Slow down and do what you can to carve out pockets of time for reflection. Cultivate the space to become less robotic. And stop using aesthetic activities as a distraction from facing up to your existential despair.
‘Despair!’ Kierkegaard’s pseudonym writes. Despair is the entry price for transitioning from the aesthetic to the ethical sphere. Learning to love despair is an adventure in moving to a higher mode of self-development. Don’t hide from your existential crisis because choosing despair means choosing yourself. To cosy up to your despair is to choose against being beholden to your animalistic, aesthetic impulses, and towards becoming a definite and solidly grounded individual. Choosing yourself means making meaningful commitments, such as dedicating yourself to a vocation. It means setting goals and sticking to them. Dodging commitment means you’re simply hovering over life, not truly living, and as empty of substance as those waves.
To choose despair also means to choose humanity. In the ethical mode, you recognize that you live in a world with other people, that they matter, and that every choice you make must reflect a responsibility towards them. You act with honesty, open-heartedness, understanding and generosity. You focus more on what you can give to others and less on what you’re getting out of them. To cultivate your humanity, go people-watching for an hour and consider the beauty in each individual. Appreciate every person you meet in their particularity – their tasks, challenges and triumphs. Join a club and build a community of friends. Act more charitably. Help people. Commit to making the world better for others.
Choosing this kind of despair also prepares you for marriage in a way that a life of seeking sensual gratification is unlikely to. Getting married – ideally to your first love, in Kierkegaard’s analysis – reflects an ethical decision because marriage is a serious, definitive and life-changing choice. Marriage calls for a more sophisticated awareness of your existence than a life driven purely by sexual instincts. Sure, you can always get divorced, but Kierkegaard’s ethicist suggests getting married helps people take love more seriously than an aesthete would, by focusing on creating a relationship that’s stable and constant. In the ethical sphere, you actively rejuvenate the love with your partner, instead of skipping to the next relationship for thrills and a confidence boost as soon as your first one gets tough.
Face your existential abyss bravely because, Kierkegaard suggested: ‘Anxiety is the organ through which the subject appropriates sorrow and assimilates it,’ and ‘indeed I would say that it is only when the individual has the tragic that he becomes happy.’ The key to the ethical sphere is to use your despair to galvanise you to overcome your sorry dark states, refresh your enthusiasm for living, and arouse your appetite for something more meaningful in your life.
You develop yourself by being patient with existence, seeing the beauty in stability, and recognising that you are your own source of happiness and creativity. You don’t need to seek excitement constantly from new external stimuli as the aesthete does. You don’t need a dance floor to dance, to enjoy life; your dance floor is inside of you, wherever you are. You nurture the ethical attitude by living intentionally (not accidentally, like the aesthete), and living each day as if it were your Judgment Day.
Leap to faith
The ethical mode can help stabilise you, but it might not be enough to resolve your existential crisis. Living ethically might even be another source of existential calamity because fulfilling your social duties can be onerous. Kierkegaard’s ethicist says of the duty of marriage: ‘Its uniformity, its total uneventfulness, its incessant vacuity, which is death and worse than death.’ Marriage doesn’t make love stay. People change and break promises, making any commitment insecure. Given how many other people are unjust and immoral, being ethical might also throw you deeper into despair. And sinking too heavily into reflection can thwart your enjoyment of life. Philosophers tend to be guilty of overthinking, and Kierkegaard’s aesthete quips: ‘What seems so difficult to philosophy and the philosophers is to stop.’
The only way truly to conquer an existential crisis is with a leap. A leap is what Kierkegaard calls an ‘inward deepening’, which recognises that the world is uncertain, but you can make a bold choice about the kind of life you want to lead. A leap is beyond the realm of feelings (aesthetic sphere) and commitments (ethical sphere). A leap is an act of will to transform your life. It’s the decision to design an existence to which you can enthusiastically devote yourself and that will uplift and sustain your being.
Kierkegaard’s leap was guided by the commandment to ‘love thy neighbour’. In Works of Love (1847), written under Kierkegaard’s real name, he proposes that universal love, or agapē, is the secret to happiness because it overcomes the fleetingness and insecurity of aesthetic and ethical relationships. Love is Ariadne’s thread of life because, as long as you love, as long as you commit yourself to being a loving person, you’ll be safe from being hurt and alone. Kierkegaard thought that this sort of unwavering faith reflects a supremely developed human being.
Perhaps you live in the aesthetic or ethical modes of life, and you’re perfectly happy and see no need to leap. Or perhaps you inhabit these realms and find comfort in your melancholy. But the rub with existential despair is that, once you have caught a glimpse of it, intentionally or not, it’s extraordinarily difficult to unsee it. If that’s you, Kierkegaard’s ideas might be a way to help you find your footing. But the only thing that will alleviate an existential crisis is to find the truth that is true for you, the subjective truth, the propulsion to leap that lies in the innermost depths of your heart. If you’re not sure what your subjective truth is, Kierkegaard suggested: ‘Ask yourself and keep on asking until you find the answer.’
Ultimately, though, a passionately lived life isn’t about an either/or choice. You can’t be all frivolous or all serious all the time. A fulfilling life is about enjoyment and ethical commitment and leaping. Your life needs some of the sort of energy, pleasures and possibilities that Don Giovanni’s life exhibits (though not necessarily indulging these in the ways he does), otherwise the world would be very dull. And the world is boring without him. You also need something of the ethical: you need to acknowledge how your choices affect other people and to take responsibility for your actions, otherwise you’ll end up alone and sad.
You also need a leap to find that thing that you can devote yourself to that unites the splinters of your life, even if, for you, that isn’t a leap into religious faith. The point is to see these different dimensions of life, the ruts you might be falling into, the potential sources of ennui and malaise that stem from the way you live your life. But, ultimately, it’s up to you to choose how you juggle these spheres and how you spark your own fire to bind the fragments of your life together into a coherent synthesis. That’s the point. It’s for you to shape your life.
My work these days involves spending a lot of time with early-stage companies, where we’re racing against the clock to turn bold new ideas into usable products and see if they work.
It’s a land where you’re knee-deep in ambiguity, and surrounded by a sea of unanswered questions. It’s an environment where short-circuiting feedback loops pays off big time, and where fast action is highly valued.
But with so much to do and so little time, teams often get into hard scoping discussions. There’s no way to know for sure in advance what a product needs to offer in order to be validated. I’ve noticed two different types of people emerge from those discussions:
The ones who want to be right
And the ones who want to learn
The ones who want to be right defend their ideas based on their experience, their seniority, on their unmeasurable powers of divination of customer behavior. They come up with dozens of possible failure cases, just to justify their more complex solution. They get married to their ideas and never let go, irrespective of what’s learned.
They say “trust me, I know what I’m doing”, “no, that won’t work” and “let’s just do it my way this time”. They breed self-doubt and disempowerment.
Then there are the ones who want to learn. They’ve realized that when you’re first building something, chances are you’ll be wrong about at least a couple of things — and they try to identify those things early on. They try to keep projects simple, so they can be tested fast, even if they have obvious holes. They maximize their opportunity for learning by focusing on the problem at hand, and not on who came up with the solution or how it matches the initial big idea.
They can still have a bold vision, and they still listen to their gut, but they’re open to being wrong and eager to find out what will work for their audience.
They say “this is what worked for me before, would you be up for trying it?” and “which option would let us learn faster?”. They breed progress and are fun to hang around.
These days I just try to surround myself with people who are open to being wrong (even if they’re right most of the time), and above all interested in learning the truth, whatever it may be. I interview candidates looking for that heart-warming balance of experience and humility, and only invest in friendships with people who are willing to review previously held ideas. And I constantly try to re-examine which of my beliefs are facts and which are simply my own assumptions.
What about you? Would you rather be right, or would you rather learn the truth?