Most of us strive and struggle to reach our goals. What we often forget is that accomplishing them is only half the battle. You also have to feel like you’ve accomplished them. And these two steps can be further apart than you might imagine.
One author recently went viral when she coined the term “productivity dysphoria” to describe the yawning gap between how much she gets done and how satisfied she feels with her output.
In short, our actual achievements and our sense of accomplishment often don’t line up. And according to one finance professor, that’s as true for our finances as it is for relationships, productivity, and success.
Are you good with money? Your bank balance can’t tell you.
You’d think it would be relatively easy to gauge how well you’re doing financially. Bank balances and credit card statements offer clear, objective measurements, after all. But when it comes to money, we start from wildly different places and have wildly different goals. Your bank account is more a reflection of luck and circumstance than of your level of financial literacy.
Plus, as Bomikazi Zeka of Australia’s University of Canberra pointed out on The Conversation recently, these are exceptionally trying times to achieve financial stability. “Every day, you’re making complex financial decisions (some of which carry huge ramifications) and there are more financial products and services available than ever before. Navigating this minefield can be overwhelming,” she writes.
So how do you know you’re actually doing OK when it comes to managing your money and planning for your financial future? Zeka offers a list of telltale signs, including:
1. You track your cashflow.
Our incomes and expenses may vary enormously, but whatever these numbers look like, nearly everyone should aim to be cash flow positive.
“By tracking your cashflow on a regular basis, you’re ensuring your expenses don’t exceed your income. In other words, you make sure you’re earning more than you spend,” writes Zeka. She adds that you’ll know you’re on the right track when “you have a surplus or a buffer.”
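Zeka’s first sign boils down to simple arithmetic: income minus expenses, checked every month. A minimal sketch of that check, in Python (the category names and figures below are purely hypothetical, not a recommended budget):

```python
# Minimal monthly cash-flow check: surplus = income - total expenses.
# All numbers and category names here are illustrative placeholders.

def monthly_cash_flow(income, expenses):
    """Return the month's surplus (positive) or shortfall (negative)."""
    return income - sum(expenses.values())

expenses = {
    "rent": 1500,
    "groceries": 450,
    "transport": 200,
    "utilities": 150,
}

surplus = monthly_cash_flow(3000, expenses)
print(f"Surplus: {surplus}")  # a positive result means you're cash flow positive
```

The point of tracking, per Zeka, is simply that this number stays positive month after month, leaving a buffer.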
2. You distinguish between good debt and bad.
Some people have a horror of any debt at all. Others are prone to using credit to finance their lifestyle with little regard for their financial futures. Neither approach is a sign you’ve got your financial house in order. Those who are actually good with money understand debt can be good or bad and know how to tell the difference.
“Knowing how to make debt work for you is a skill and a sign of good financial knowledge,” says Zeka. She explains: “Good debt is debt used to improve your long-term financial position or net worth, such as a home loan. Bad debt tends to be consumption-driven and doesn’t have lasting value. Examples include payday loans or retail accounts.”
3. You diversify.
From Silicon Valley Bank to FTX, recent collapses and scandals have underlined the dangers of putting all your financial eggs in one basket. Those with a good grasp on their finances understand these risks and refuse to rely on just one basket.
“One of the key concepts of financial literacy is understanding the importance of diversification,” says Zeka. “By having your money spread across various places (such as a savings account, property, the share market, superannuation, and so on), you’ve reduced the concentration of risk. This helps protect your wealth in tough economic times.”
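The risk-reduction Zeka describes can be made concrete with a textbook simplification: if holdings are uncorrelated, a portfolio’s volatility is the square root of the sum of squared weighted volatilities, so splitting money across several holdings lowers overall risk. A toy sketch under that assumption (the volatility figures and equal weights are illustrative, not advice):

```python
import math

def portfolio_volatility(weights, vols):
    """Volatility of a portfolio of *uncorrelated* assets:
    sqrt(sum of (weight_i * vol_i)^2).
    Correlated assets would require a full covariance matrix;
    this is the simplest possible illustration."""
    return math.sqrt(sum((w * v) ** 2 for w, v in zip(weights, vols)))

# Assume four asset classes, each with 20% annual volatility (hypothetical).
vols = [0.20, 0.20, 0.20, 0.20]

concentrated = portfolio_volatility([1.0, 0.0, 0.0, 0.0], vols)  # one basket
diversified = portfolio_volatility([0.25, 0.25, 0.25, 0.25], vols)  # spread out

print(concentrated)  # about 0.20
print(diversified)   # about 0.10, half the volatility for the same total stake
```

Real asset classes are partially correlated, so the benefit is smaller in practice, but the direction of the effect is the same: spreading reduces concentration of risk.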
4. You know your own weaknesses.
No matter how sensible and informed you are about financial matters, you still have a weakness. Our attitudes toward money have deep roots in our pasts, and emotions creep into everyone’s financial decision making. If you understand your personal patterns and blind spots, you’re in a much better position to guard against them.
“Perhaps you buy unnecessary stuff when you feel sad. Or maybe you panic when faced with tough financial choices and make quick decisions just to make the problem go away. Neglecting to reflect on patterns of behavior can lead to serious and possibly irreversible financial mistakes,” cautions Zeka.
5. You have financial goals.
One person may want to retire by 45. Another may just be hoping to pay off their student loans by then. We don’t all need to have anything like the same goals, but if you want to be in the driver’s seat when it comes to your finances, you need to have some kind of goal.
“Financially literate people plan for their finances. This involves setting goals for either earnings, savings, investments, and debt management or putting measures in place to protect wealth (via, for example, insurance to protect your wealth against loss),” Zeka says. But goals alone aren’t enough. To be truly financially savvy, you need “to have a system and habits in place to achieve them,” she adds.
You can read more about Zeka’s signs of true financial literacy in her full article. The details are useful, but the overarching message is as simple as it is powerful: Whether or not you’re good with money isn’t measured by how much of it you have but by your realism, thoughtfulness, self-knowledge, and foresight.
If you know how much money comes in and out every month and have goals and reasonable plans to meet them, you’re probably much better with money than you sometimes feel.
Your inbox slog overwhelms you. A colleague fails to meet his part of the team’s deadline, and you blow a fuse. A coworker talks over you in a meeting, and you seethe with anger. The game-changing new account you thought you had goes south, and you slam your fist. And when you’re late for work and a morning commuter cuts you off in traffic, you give him the finger.
I could go on and on, but you get the idea. We’re all destined for work stress, but there’s a point when it can feel like a powder keg, and our unbridled reactions sabotage our performance. ‘Tis the season to be jolly. But think of all the times you brooded for countless hours over one negative aspect of a situation when, in retrospect, it wasn’t as bad as you thought. In fact, your brain may have overlooked many positive elements.
Your boss squints over her glasses at you in a meeting, and you sizzle inside. The next day she gives you a rave performance review. And what about all those times you wigged out over a presentation and couldn’t get that one frowning face in the front row off your mind—convinced you were a disaster? Only to find out that you were a huge success. All that worry for nothing, the exact opposite of what your brain predicted.
Unrealistic deadlines. Job demands. Boss breathing down your neck. When the work doldrums come—and they surely will—we get mired in negativity and fixate on the problem, obscuring potential solutions. Neuroscientists call this hard-wired tendency to overestimate negativity and underestimate positivity the negativity bias. Giving negativity greater credibility undermines work engagement and productivity and can even make us want to throw in the towel.
This is especially true when negativity lingers after our hard work goes unrecognized or we’re overlooked for a promotion. When we live in that amped-up state for too long—alarm bells ringing at full blast—it drains our clarity, well-being and resilience for personal growth and success.
“We’re not always powerful enough to fend off unwelcome work stressors,” says Eva Condron-Wells, senior vice president of user content at ComfortZones Digital. “But the good news is you are powerful enough to choose how you respond to them. The key is what you do with the stressors to stay cool under pressure,” she told me.
“It’s possible to stop your reactive brain from siphoning your energy by learning not to let every little hiccup throw you into a tizzy—whether it’s a printer paper jam, a traffic jam, or grape jam smeared across your office desk.” Condron-Wells offers five ways you can flip your perspective instead of your lid and manage work stress instead of letting it manage you.
1. Accept work struggles as one side of your career. “We see and experience different sides of physical things, but our thoughts are not tangible, so this visualization can help. Imagine the work stressor as a coin or a container. At a minimum a coin has two sides. A container has multiple sides. When we get stuck on just one side, it distorts our perspective and creates discouragement so large it can eclipse our confidence and cripple the motivation to persist, grow and succeed.
Work pressures, frustrations and letdowns are one side of our careers but only one side. You can’t have a front without a back, a right without a left, an up without a down. If we practice this mental exercise, challenges will still show up, but we will be able to get more out of them, which in turn keeps us calm, fosters our learning and helps grow our careers. This is not a passive acceptance of bad things but an active acceptance that work and stressors are a package deal.”
2. Focus on the solution. When faced with a work challenge, Condron-Wells recommends that you notice your perspective and how much time you spend focused on the problem. Then decide to re-frame work struggles in a new light, so you don’t get stuck in a swirl of negative responses that exacerbate stress and derail performance.
She says it’s important to celebrate the highs of your career (awards, achievements, and learning and growth) without taking them any more seriously than the lows, and, conversely, not to take the lows any more seriously than the highs. They exist together, collectively making up your career.
3. Look for learning. “If you look at negative experiences and emotions such as fear, failure or frustration in a different way—as signs that you’re on a learning curve—it helps you focus on the path out of the negativity and toward better performance,” she explains. “After a project or problem, ask yourself and your team, ‘What worked well? What did not work well? What will we repeat?
What could we do differently in the future?’ These questions invite negative and positive experiences to be shared and help everyone see the future with a more informed perspective. No one wants to struggle, but if we do, this helps turn it into wisdom for the future. We will also never be perfect, ever. So knowing how to turn problems, challenges or mistakes into insights and learning turns something painful into something powerful.”
4. Balance the scales. Condron-Wells suggests you picture a scale with two plates. Then, imagine placing the negative event on one side of the scale. Next, picture yourself counterbalancing it. But because that negative thing is pretty heavy, you need three positive things. “Over time you’ll start to see the good with the bad, not to minimize the bad, but to see that more than one truth is happening at the same time,” she says.
“Good (or great) things that you may have done may be: identified a risk, solved a problem, stayed committed, helped others, saved time, money, energy—even lives. How you process a lot of this is private. Few people know how we are managing work challenges, but when we share our approaches, we open up dialogue that helps everyone.”
5. Turn your frown upside down. “I hesitated to share this at first, until I read the groundbreaking research showing that smiling not only reflects how we feel but also helps lift our moods by tricking the mind into perceiving the world in a positive light. That made me feel better about a course of action I used with my kids to help them flip their perspectives and think through their challenges. I could not be responsible for their happiness, and I wanted to grow their ability to foster it themselves and move to a more constructive space.
When negativity hits home, you can do what I did and come up with a chart. On the left, write what upsets you. Then write three things you are grateful for. At work you can list the work stressors and three positive ways to overcome them. Or you can name the challenges and three lessons you learned. Stay positive, though. No, your boss is not a jerk. Include constructive strategies like, ‘I need to be more concise’ or ‘The leader needs materials 24 hours in advance.’”
Kids who learn and think differently aren’t the only ones who can feel lonely or “apart” from other kids. Most people feel that way at some point.
But research shows that kids who learn and think differently are more likely than their peers to struggle with loneliness. And they often have a harder time dealing with those feelings when they have them. Learn more about loneliness and kids who learn and think differently.
Why kids who are different might feel lonely
Kids who learn and think differently might feel lonely for many reasons. For starters, they’re more likely to be bullied or left out. They can have a hard time making friends or connecting with people. And struggling in school and socially can make kids feel bad about themselves.
They may feel like nobody understands them or their challenges. And they might even withdraw. Kids with certain challenges are most likely to feel left out and isolated.
The difference between being lonely and being alone
Some people like spending time alone. That goes for kids and adults. As long as they have the ability to make friends and connect with other people when they want to, being alone is a preference, not a problem.
Being unhappy when alone doesn’t necessarily mean someone is lonely, though. Having a hard time entertaining yourself and feeling bored aren’t the same thing as feeling socially isolated.
Also, loneliness isn’t always about being alone. Some kids feel isolated even when they’re with others. They feel like nobody around them shares or understands their challenges. There’s nobody to connect with.
How loneliness can impact kids
When kids go through the occasional lonely spell, it usually doesn’t have a lasting impact. Feeling lonely all the time is different, though. It can affect kids in lots of ways. And it can lead to other difficulties.
Kids who feel lonely might be:
More likely to have low self-esteem. They might feel like others are rejecting them. Kids might lose confidence in themselves and eventually believe they have nothing valuable to offer.
Less likely to take positive risks. Trying new things can build confidence and lead to new interests and skills. But kids who are already feeling rejected and vulnerable may not want to take this leap. They may be afraid to call attention to themselves and risk failing.
More likely to be sad, disconnected, and worried. Kids deal with loneliness in different ways. They may keep their sadness inside and pull away from others. Or they may become angry and act out. The combination of negative emotions and isolation can lead to depression and anxiety.
More likely to engage in risky behaviors. Teens may drink, smoke or vape, use drugs, vandalize property, or do other risky things if they think it will help them feel accepted.
Keep an eye on signs of depression, too. Don’t hesitate to reach out to your health care provider if you have concerns. And if your child has ADHD, read about the connection between ADHD and depression.
If your child is struggling to make friends, there are ways to help. First, try to figure out why. Some kids need help with social skills. This is common for kids who are immature or have ADHD, autism or non-verbal learning disorder. Other kids are anxious. They may feel overwhelmed in new social situations or big groups.
Kids who are depressed often want to stay in their rooms. They may interpret things negatively and doubt others want to see them. Finally, some kids may have a hard time fitting in because they have different interests.
If you think your child is lonely, ask them. Start by describing a time when you have felt lonely. If they don’t want to talk, try again in a few days. Don’t push them.
If your child says they are lonely, try to be a good listener. Show that you’re listening by reflecting back what they’re saying: “It sounds like you’re having a hard time.” You can also say supportive things like: “That sounds tough. Would you tell me more about that?”
Once you know more, you can try to help. For kids who need practice with social skills, you can break things down into small steps. Then you can role play them with your child. For kids who have a hard time putting themselves out there, acknowledge how they feel. Then remind them that they’ll probably have a good time once they’ve made the effort. Give them lots of support and praise for doing something tough.
Some kids tend to misunderstand interactions. You can give a reality check: “What makes you think he’s mad? Are there other explanations?” For kids who interpret things negatively a lot, pointing it out each time can help break the pattern.
Finally, help kids find a group or activity that is interesting to them. Many kids find success online, where there are lots of virtual groups for kids with specific interests. Getting excited about something will help them feel more confident, too.
The drought has dragged on for a long time. Last year, orchard farmers in California’s Central Valley destroyed trees that were dying due to lack of water. This year, many more farmers did the same.
Lack of summer rain forced Nebraska farmer Kevin Fulton to go underground to find water for his crops. It’s not a perfect solution: the Ogallala Aquifer, which Fulton tapped, is running dry, and as a result pumping restrictions apply in some areas, though not where Fulton is located.
As drought extends its deadly fingers from California to the eastern side of the Mississippi River — a vast stretch of the continent that produces most of America’s food, including three-quarters of its beef cattle and 70% of its vegetables, fruits and nuts — farmers and ranchers are facing a double whammy. They have to go farther to find water, and higher fuel costs are forcing them to pay more to pump whatever isn’t coming from the sky. That predicament is still better than what’s happened to the land that’s not irrigated, Fulton says.
“The pastures are burning up,” Fulton, a 28-year veteran of farming the land he inherited, told Forbes. “Some aren’t going to produce anything and the yields have been drastically reduced. This wears on you mentally. You’re working hard to keep up with the irrigation. It’s depressing. These kinds of things sometimes push farmers over the edge.”
Only heroic efforts by farmers and ranchers have kept supermarket shelves supplied in the parched U.S. Even so, drought is expensive for consumers and limits their choices. Inflation caused by higher production costs will persist as long as hot, dry weather dominates vast swathes of the country. Many of the folks who make their living from agriculture realize that conditions in summer, the most dangerous season, will likely persist into the future.
“The hazards in recent years have been relentless,” Kristy Dahl, the Union of Concerned Scientists’ principal climate expert, told Forbes. “We need to be better prepared for ‘danger season,’ otherwise we’ll increasingly be caught off-guard every year. Climate change is getting worse.”
Despite the drought — in some places, the driest conditions going back more than 1,000 years — American producers have managed to bring in a projected harvest that’s nowhere near as bad as it could be. Soybean production will actually increase 2% from 2021, according to the U.S. Department of Agriculture, and wheat is up 8% over last year as global demand soared in the wake of Russia’s unprovoked war on Ukraine, a major wheat-producing country.
The concern is the corn harvest, which the USDA predicts will be down 5% from 2021, with less of the supply classified as good or excellent compared with last year. Still, the USDA forecasts record-high corn yields in California, Iowa, Washington and Wisconsin.
American food producers polled by the American Farm Bureau Federation last year complained of dangerously dry conditions. The farmers and ranchers surveyed in August 2022 say circumstances are pretty much the same or worse.
Nearly three-quarters of farmers saw a reduction in harvest yields due to drought, while 37% said they were tilling over fields that won’t produce anything because of a lack of water, up from 24% last year. One-third of orchard farmers nationwide, and 50% in California, said they were ripping up trees, an increase from 17% in 2021. In one case, the Farm Bureau said, a California producer dropped all the fruit on five acres of young Cabernet grapes to help the vines survive without water. That ensured that the farmer would have no revenue from those vines.
Similar measures plague the livestock industry, according to the Farm Bureau. Two-thirds of ranchers reported selling off animals or birds, with average herd sizes expected to be down 36%. The biggest herd declines are in Texas (down 50%), New Mexico (43%) and Oregon (41%), a good example of the wide geographic distribution of the distress.
Fulton says 2022 was the worst year of drought in the past decade, and the second-worst in his three decades of farming, after 2012. Some of Fulton’s neighbors are now reducing the sizes of their herds and taking the livestock to the auction barn. Fulton says he’s considering doing the same. Drought kills off the grass that cattle need to graze on, so farmers have to buy expensive hay to feed them instead.
“You can’t feed yourself out of a drought. It doesn’t work from a profit standpoint,” Fulton says. “We’re going to run out of grass.” In mid-August, rains finally hit farms in dried-out Southwestern states, including Texas, Arizona and New Mexico. Then the rains arrived on the Great Plains. Farmers like Fulton welcomed the brief break. But it was too little, too late.
On Fulton’s farm, the effects of a changing climate are pervasive. There are more grasshoppers, which love dry weather and eat the crops. Fulton’s bees have also been less active. Honey production is half of what it normally is, he said. There’s also the looming threat of heat-driven poison: If some plants don’t get enough water, they can produce high levels of nitrates, which makes them toxic for the livestock that eat them.
According to the USDA’s August production report, the rains in mid-August helped to replenish topsoil moisture and “revived drought-ravaged rangeland and pastures. However, hot, dry weather persisted.” From the Pacific Coast to the northern Plains, temperatures averaged at least 5°F above normal. Readings even averaged 10°F above normal in some locations across the interior Northwest and Northern California.
Drought-resistant seeds and drip-irrigation to conserve water are promising solutions, if they can be put to use at scale. A lot of money has gone towards funding startups and research, but there haven’t been many mainstream successes.
This year’s industrially grown commodity crops so far still seem strong overall. But the cracks driven by climate change are starting to show. Darvin Bentlage, a 66-year-old, fourth-generation cattle and grain farmer located north of Joplin, Missouri, says the extreme weather that he and his neighbors face has taken them on a roller-coaster ride. Earlier this year, it rained so much that he had to delay planting. Then the drought came.
“That was a rough start,” Bentlage told Forbes. “In my 50 years of farming it never went from being so wet to so dry – it’s the fastest I’ve seen.” He added: “Pray for rain.” Despite decreasing access to water and extreme weather projections on the horizon, Fulton says he’s optimistic for the future.
“Like most farmers, when we have a bad year, we say it will be better next year. We live to farm another year,” he says. “Sometimes it seems like it can’t get any worse.”
We’ve become convinced that if we can eat more healthily, we will be morally better people. But where does this idea come from? Near the end of the hellish first year of the coronavirus pandemic, I was possessed by the desire to eliminate sugar – all refined sugar – from my diet. In retrospect, it probably wasn’t the best time to add a new challenge to my life. My wife and I had been struggling to remote-school three young kids with no childcare. My elderly parents lived out of state and seemed to need a surprising number of reminders that pandemic restrictions were not lifted for Diwali parties or new Bollywood movie releases.
Like many people in those early days, we were looking around for masks and trying to make sense of shifting government guidelines about when to wear them. In addition, as a doctor, I was seeing patients in clinic at a time dominated by medical uncertainty, when personal protective equipment was scarce, and my hospital, facing staff shortages, was providing training videos and “how-to” tip sheets to specialists like me who hadn’t practised in an emergency room for years, in case we were needed as backup.
It would have been enough to focus on avoiding the virus and managing all this without putting more on my plate. But cutting processed sugar seemed like an opportunity to reassert some measure of order to the daily scrum, or at least to the body that entered the fray each day.
My former physique was behind me and the stress of clinical practice during the pandemic was taking its toll. Maybe it was all the pandemic death in the air, but I started feeling like I was what the narrator in Arundhati Roy’s novel The God of Small Things calls “Not old. Not young. But a viable die-able age.” Maybe doing away with sugar could slow things down? More tantalisingly, maybe it could even take me back to a fresher time, the days in college when I had actually gone sugar-free for a while.
My friends offered condolences on what they called my soon-to-be joyless lifestyle. But I was set, compelled by literature about the deleterious, even toxin-like effects of added sugar. I had my doubts about being able to pull something like this off again, though, so I decided – as doctors often do – to tackle the problem by studying it.
That year, in what was arguably an act of masochism, I began the coursework required to sit for a medical-board exam on dietetics, metabolism and appetite. By earning another qualification, I thought, I would credential my way to realising my goal. After shifts at work, during breaks or once the kids were asleep, I would attend virtual lectures and pore over board-review books in a quest to understand the body’s metabolism.
I immersed myself in the physiology of exercise, the thermodynamics of nutrition, and the neuroendocrine regulation of appetite. But this knowledge didn’t break my pandemic eating habits. Cupcakes and ice cream and cookies didn’t call to me any less. And big food corporations were winning the bet that Lay’s potato chips first made back in the 1960s with its “Betcha can’t eat just one” ad campaign. So, I found myself reaching for Double Stuf Oreos while flipping through my medical textbooks and scarfing chocolate bars even as I correctly answered my practice-exam questions.
My body refused to be disciplined by my intellectual mastery of its operations. I passed the board examination, but my appetite for sugar didn’t change. I was left with more questions than I had when I started. Was sugar really a problem? Or had I internalised hangups about desire from the culture at large? Why did my soul feel so inexplicably sick – so unsatisfied – with the outcome of my first effort to quit that I tried it all again? And what does my “success” – I’ve been sugar-free for a year now – even mean?
I turned to Plato – a man occupied by appetite – for some answers. In his body map of the soul, the stomach was the dwelling place of desire. Reason, of course, resided in the head, while courage rested in the chest. In this tripartite architecture, it was up to reason – with the help of courage – to subjugate appetite and elevate the individual. The thinking went that if we could just rule our stomachs, we might be able to hold our heads up high and our chests out wide. For the Greeks, the right moral posture was key to the good life, or eudaimonia.
Early medical science in the west borrowed heavily from Plato, beginning with Aristotle, who practiced and taught medicine throughout his life. Aristotle agreed that eudaimonia could be realized by moderating the visceral and sensual appetites. He saw the heart as the vessel of intelligence, and arguably the most virtuous of organs. In his hypothesis, the heart occupied – physically and figuratively – a central place in the body, controlling other organs. The brain and lungs played supporting roles, merely cooling and cushioning the heart. The heart was, for Aristotle, where reason flowed.
Five hundred years later, the Greek anatomist and surgeon Galen challenged the centrality of the heart but still adhered closely to Plato’s triadic notion of the soul. Galen’s treatises, foundational to the development of modern medicine, are suffused with Platonic assumptions, and he painstakingly tried to stitch the divided parts of the soul – the rational, the spirited and the appetitive – on to specific organs in the human body.
In a striking display of topographical certitude, Galen writes in On the Doctrines of Hippocrates and Plato: “I do claim to have proofs that the forms of the soul are more than one, that they are located in three different places … and further, that one of these parts [rational] is situated in the brain, one [spirited] in the heart, and one [appetitive] in the liver. These facts can be demonstrated scientifically.”
The Harvard classicist Mark Schiefsky writes that, in Galenic physiology, equilibrium is understood “as a balance of strength between the three parts; the best state is when reason is in charge, the spirited part is strong and obedient, and the appetitive part is weak”.
Should we be sceptical of this aspiration to tame appetite? Sigmund Freud doubted whether desire could ever be so readily controlled. In tossing Plato’s map aside, Freud erased the “soul” and instead sketched a three-part atlas of the “self” and its ratio of desires and repressions – endlessly fractured, negotiating between order (superego), consciousness (ego) and appetite (id). For Freud, appetites could not be overcome but only better managed. Perfect harmony and permanent equilibrium were nowhere in sight. Rather, in Freud’s idea of the self, anxiety for order loomed above the ego, with desire buried beneath it. Appetite was the subterranean tether that consciousness could never escape, but only sublimate.
There was something talismanic about my focus on sugar. So often, liberty is conceived of as the ability to say yes to things. To make affirmative choices: to open this door or that window. But there is also a flipside to that freedom: the power to say no. To refuse. Increasingly during the pandemic, I felt like I was powerless in the face of my cravings. If there was a knock at the door of appetite, a tap on the window of impulse, I had to answer it. And this felt shameful. Why couldn’t I say no? And why was realizing this so painful?
I don’t pretend to anything approaching total understanding of my motivations. But there were a few loosely detected currents worth illuminating here. For one thing, not being able to say no to sugar sometimes felt like a form of bondage to the demands of the body, the very body that I was eager to assert power over, particularly during a global health crisis that was damaging bodies everywhere.
If I couldn’t control this plague, could I not at the very least control myself? I wonder now if this insistence on regulating appetite was my sublimated response to the coronavirus’s immense death toll – a way of denying mortality in the midst of its excess. In this respect, perhaps there was not as much separating me from other kinds of pandemic deniers as I would like to believe. Were we all just coping with the inexorability of our decay – laid painfully bare by Covid-19 – in different ways?
Maybe. But there was something beyond the exigencies of the pandemic on my mind as well. The inability to resist sugar cravings – to break the habit – seemed like a victory of the past over the present. It felt like the triumph of the mere memory of pleasure over real satisfaction in the moment. Saying no to that memory – the neurological underpinning of craving – became important, because it felt like the only way to say yes to imagination. “I am free only to the extent that I can disengage myself,” the philosopher Simone Weil wrote.
Detachment from an indulgence, however small, felt like a way to stop being beholden to an old storehouse of desires (and aversions and beliefs). Developing the ability to refuse to reach for the cookie was also a way to break free from the impulse to reach for patterns of the past, from the compulsion of replicating yesterday at the expense of tomorrow. It’s the trick of habit to convince us that we are reaching forward, even as we are stepping back. Or, as the British scholar of asceticism Gavin Flood elegantly summarizes: “The less we are able to refuse, the more automated we become.”
If Freud dismantled the soul, modern medicine mechanized what he left of the self. But where Freud’s psychoanalytic theory allowed for a pinch of poetry, materialist models hold comparatively dry sway today. A look at the biomedical literature on appetite reveals a tortuous mix of neural circuits and endocrine pathways. What’s clear is that if there was a moral aspect of appetite for ancient philosophers and physicians, it’s not readily discernible in the language of contemporary scientific literature.
There are upsides to this development. In the modern era, medicine’s tradition-bound framing of appetite as a moral problem has been demoralizing for patients, who often felt – and still feel – objectified, policed and discriminated against by institutions that sermonize about it. The stigmatisation of appetite remains pervasive in the culture, in and out of medicine. The loss of at least an explicit moral charge in the scientific literature is a welcome shift.
In the century or so since Freud’s conjectures, appetite has been atomised by medicine into a problem of eating, or more specifically, of fighting the body’s tendency toward “disordered” eating. In the pursuit of better and longer lives, maladies of appetite – of eating too much, too little, or not the right kinds of food – have been studied and treated with varying degrees of success. The empirical study of digestion and appetite in the laboratory moved hunger from the moral arena into a biochemical one. Still, in both experimental physiology and clinical medicine, the ancient impulse to locate the appetite persisted: was it in the body or in the mind? Lines were drawn – and defended – between diseases of the stomach and diseases of the psyche.
What was at stake in the difference? Pinning down the appetite – claiming it belonged to the gut or the brain – was arguably the first in a series of steps leading to its regulation. Understood this way, medicine’s mission to uncover the mechanisms of appetite, despite the erasure of the soul from scientific databases, cannot escape Plato’s legacy. Whether we’re trying to improve or curtail appetite, we seem unable to resist the desire to control it.
It would have been different – I wouldn’t have felt the need to go all-or-nothing with sugar – if I could have simply walked away after a few bites. But increasingly during the pandemic, I wouldn’t stop even after I was full. What started off as pleasure would morph into painful excess. Sure, there’s pleasure in abundance, in overdoing a thing. But I found myself barrelling past that threshold.
While studying for the board exam in my first, failed attempt at going sugar-free, I was also using various apps and devices to keep track of my body. I had long used a smart watch to log my steps and workouts. I was also using a calorie-tracking app, studiously punching in numbers for every meal and scheming how much I could eat and still remain under the calorie limit. But all that logging and calculating felt joyless and anxiety-ridden. Sometimes, at a meal, in the middle of tallying up numbers like an accountant, I’d explain to impatient friends and family that “I’m just entering my data”. It was a lot of data.
I grew weary of all the inputting, and so I switched to an app with more of a behavioural focus. This app still had me tracking calories, but also came with recipes, a personal coach and “psychology-based” courses, as part of what the company calls your “journey”. The courses were a welcome shift from the myopic focus of calorie counting, and chatting with a coach added an opportunity to get some clarity about my goals.
The coach would share chipper motivational advice and provide tips to overcome obstacles. I diligently went through the app’s courses, answered its behavioural questions and followed its nudges. There were a few weeks where I was able to go sugar-free, but after a couple of months, the coaching advice seemed more and more generic, and the courses too simplistic when I was already spending so much time studying for my upcoming exam. I lost interest and reverted to simply recording calories.
I eventually passed that exam without much to show for it in terms of changes to my nutritional habits. I needed something different, a way to hold myself accountable and mean it. I stumbled upon another app that described itself as being “on a mission to disrupt diet culture and make our relationship with food, nutrition – and ourselves – healthier for good”. It promised live coaching calls with a certified nutritionist, shared recipes, and even offered to tailor my coaching with a vegetarian dietician. It did not ask you to track calories or enter food items from a database. All it wanted was for you to send pictures … of your food. It felt radically different from tapping numbers into a screen: someone else would see this.
The app’s slogan was “100% accountability” and “0% judgment”. But, to be clear, it was the judgment that I came for. The simple fact that my nutritionist wouldn’t just know but also actually see what I was eating was the killer feature. I answered a questionnaire about my dietary habits and goals. I made it clear that I wanted to go sugar-free, and repeated as much to my nutritionist during a preliminary call.
She didn’t exactly endorse this goal, but rather acknowledged it as something that was important to me and gently marked it as a topic we would come back to, adding that she hoped I would get to the point where a more balanced approach would suffice. I told her we’d see. I made a promise to take a photo of every meal, good or bad. She kindly reminded me there are no “good” and “bad” foods, and we were on our way.
It’s been a year since I downloaded the app. Every day since then, I have taken a photo of every morsel of food I’ve eaten, whether it’s a handful of pistachios, a salad or a veggie burger. In every one of those pics, every day, I have been sugar-free. I’ve eaten more vegetables and greens and fruits than I’ve probably ever eaten in my life. My plates look balanced (I make sure of it). I take care to snap pictures that look nice for my nutritionist. Though she never judges me negatively, I look forward to the raising-hands emoji and approving words she sends if she sees a salad with asparagus and garlic balsamic drizzle and avocado up front.
Like an influencer on Instagram, I’ll take another shot if the lighting isn’t quite right, or if the framing is off. It’s been satisfying to upload a cache of sugar-free images, all beautifully arranged on the app’s user interface. Even more satisfying has been avoiding feeling like the guy who said he’d go sugar-free only to end up sending in pictures of donuts and cookies. Compared to calorie logs and food diaries, the prospect of someone else seeing photos of what I’m eating has made the potential pain of falling short feel more proximate than the pleasure of eating sweets. So I just stopped eating sugar. And it’s still working. Was this all it took?
Perhaps the persistent effort to control appetite, replicated across many cultures and times, reveals just how vigorously it resists that very control. The seemingly endless proliferation of constraints on appetite – from the disciplinary to the pharmacological – underscores its untamable quality. And yet the training of appetite – both as physiological fact and, more abstractly, as desire – can function as an ascetic practice. In this paradigm, as religion scholars such as Flood argue, the negation of desire amplifies the subjectivity of the individual.
Depriving the body paradoxically accentuates the conscious subject, because hunger unsatiated allows the pangs of the self to be felt more acutely, and renders being more vivid. In other words, appetite unfulfilled creates the conditions for expanding self-awareness. This is seen in the Bhagavad Gita in the figure of the ascetic, one who has renounced the pull of appetite and “attains extinction in the absolute” – in seeming contradiction, gaining infinity through loss.
If philosophy is after theoretical victories, science aims more concretely to hack, or at least short-circuit, a physiological truth. Take, for example, gastric bypass surgery, an operation that cuts the stomach into two parts (leaving one functional thumb-size pouch alongside a larger remnant) and radically reconstructs separate intestinal systems for each segment to restrict the amount of food that can be eaten. By shrinking the stomach to fool the mind into feeling satisfied with less, this surgery builds on growing recognition that the long-embraced brain-gut divide is far more porous than previously thought.
Recipients of the surgery generally do well in the short term, with reduced appetite, marked weight loss, better control of diabetes and improved health markers. But the percentage of patients who “fail” in the long-term after bariatric surgery (ie achieve less than half of excess weight loss) is reportedly as high as 35%. During that first post-op year, studies suggest, an influx of appetite-reducing intestinal hormones decreases patients’ urge to eat. Crucially, however, there are questions about the duration of those salutary hormonal changes and their effectiveness in controlling appetite as post-surgical days add up.
For a significant proportion of patients, even surgically shrinking the stomach – the historical seat of hunger – doesn’t offer complete freedom from unchecked appetite. This fact is not entirely surprising, given what is now known about the multiple neuroendocrine nodes that govern appetite, but it poses a conundrum for medical science: can appetite, as Freud asked in his own way, ever be fully controlled? And if not, is it a wonder that patients turn back to more personal strategies to pursue the work that prescriptions and sutures leave undone?
I can’t say I fully understand why teaming up with a nutritionist on an app worked so well, so fast. Would sharing pics of my food with friends and family in a group chat or a Facebook page have been as effective? Probably not. The issue seemed to be one of epistemology. My friends and family wouldn’t have been as suitable an audience, since they don’t just know me as I am, but also as I was. That knowledge of what’s bygone necessarily shapes the stories we can tell and believe about one another.
But with my nutritionist reviewing pictures of my meals from god knows what timezone, the app created an epistemological gap into which both of us could step. It was within this gap that my future self – the self I aspired to be, still unrealised and therefore unknown – could intercede in the present with slightly less inertia from the past. The app provided an illusion that daily life could not, offering a space for the dormant commitments of the future to come to fruition in the present. A space for imagination to overcome memory.
As my sugar-free streak extended, I began to wonder about the future of this illusion. Was it a rare example of tech living up to its glitteringly naive promise of liberation? Or was this an instance of the digital panopticon yet again determining our ability to imagine ourselves, revealing just how far-reaching its gaze is? And, more practically, I began thinking about how long I needed to keep eating this way. The cravings that had knocked so loudly at my door at the start of the pandemic now softly shuffled from leg to leg right outside it. I could still hear their shoes creaking at the threshold, but they couldn’t force their way in anymore. Things seemed quiet, maybe a little too quiet.
Whereas the Greeks sought to regulate appetite in pursuit of the good life, perhaps what is sought after today is a facsimile of it: a corporatised eudaimonia-lite, where the goal isn’t virtue but efficiency; not equanimity, but productivity. In this view, it’s not a better way to live we’re seeking, just a less painful way to work and die – all while “looking good”. A more charitable and poetic possibility is that the constraint of appetite continues to appeal because it provides the same sense of structure to selfhood that metre does to a poem: a limit against which to construct narrative unity of the psyche.
As fascinating as it is to think about this question, even more essential ones – about the links between appetite, scarcity and loss – loom in the writings of Toni Morrison, a writer who provides a necessary counterbalance to the obsession with appetite restriction in societies glutted with luxury. In particular, I’m thinking of Beloved, which tells the story of human beings struggling for survival and wholeness in the face of slavery’s horrors. In portraying this struggle, Morrison uses the language of food and appetite to unfurl narratives saturated with the metaphysics of hunger: the difficulty of sating the self; the confusion between hunger, history and hurt.
I was struck by this unexpected resonance while rereading the book in the middle of my bid to quit sugar. Morrison’s characters think about what it would mean to satisfy what the narrator calls their “original hunger” – and whether doing so is even possible. They imagine getting to a place “beyond appetite”, but are also compelled by history to contemplate the price of doing so.
In my reading of the book, the denial of hunger risks becoming a costly exercise in self-abnegation – a severing of self from history, of self from self – whose consequences Plato doesn’t seem to fully consider, but which Morrison is deeply wary of. I think Morrison is, like Freud, sceptical of the metaphysicians who would have us render hunger subordinate. But where Freud is an anti-idealist, Morrison appears willing to reach for hunger, perilous though it may be. Straddling both the risk of self-destruction posed by contact with the original hunger, and the anguish of self-denial created by leaving it unrecognised, Morrison casts her faith in the human ability to embrace the beautiful, blood-hued predicament of incarnation.
About 10 months into my sugar-free life, a scent from the pantry hit me like it hadn’t for a while. My wife had just baked chocolate-chip cookies for our kids as a treat. By then, I was unfazed by sweets around the house. They might as well have been made of stone. But, at the end of a long day, I found myself unexpectedly at the pantry door. Minutes passed. After a while, I opened the plastic container and inhaled. My mouth began to water. I could almost taste the cookies.
I remembered the delightful way the chocolate melted at the back of the tongue. I remembered the satisfaction of soaking a warm cookie in milk. A part of my brain was humming, eager to replicate the memory of sugar, butter and dough on the cortex. Another part was already dreading the pain of not being able to stop. I picked up the cookie and, having built nearly a year’s worth of muscle memory, simultaneously opened the app on my phone. I centred the cookie in the glowing frame and was about to press send when, looking at the screen, it hit me: what would my nutritionist think?
As of this writing, my streak remains unbroken, despite a few close calls. In many ways the story seems to be going the way I intended: I am eating well balanced, sugar-free meals and haven’t counted a calorie in more than a year. The cravings that were troubling me aren’t gone, but the future version of me – the unsweetened aspirant – grows closer with each picture I snap. I feel the spiritual and physical acuity that comes with ascetic practice.
But I also feel some qualms about neglecting Morrison’s original hunger, with all its attendant risks and possibilities. I think about how I have sacrificed memory at the altar of imagination, recognising the chance that imagination ends up being overrated and memory proves to be the last storehouse of joy. But then I remind myself that visions like Morrison’s may be too large, too untimely for us to inhabit. They come from a place we haven’t arrived at. At least not yet.