Train Your Brain to Remember Anything You Learn With This Simple, 20-Minute Habit

Not too long ago, a colleague and I were lamenting the process of growing older and the inevitable increasing difficulty of remembering things we want to remember. That becomes particularly annoying when you attend a conference or a learning seminar and find yourself forgetting the entire session just days later.

But then my colleague told me about the Ebbinghaus Forgetting Curve, a 100-year-old formula developed by German psychologist Hermann Ebbinghaus, who pioneered the experimental study of memory. The psychologist’s work has resurfaced and has been making its way around college campuses as a tool to help students remember lecture material. For example, the University of Waterloo explains the curve and how to use it on the Campus Wellness website.

I teach at Indiana University and a student mentioned it to me in class as a study aid he uses. Intrigued, I tried it out too–more on that in a moment. The Forgetting Curve describes how we retain or lose information that we take in, using a one-hour lecture as the basis of the model. The curve is at its highest point (the most information retained) right after the one-hour lecture. One day after the lecture, if you’ve done nothing with the material, you’ll have lost between 50 and 80 percent of it from your memory.

By day seven, that erodes to about 10 percent retained, and by day 30, the information is virtually gone (only 2-3 percent retained). After this, without any intervention, you’ll likely need to relearn the material from scratch. Sounds about right from my experience. But here comes the amazing part–how easily you can train your brain to reverse the curve.
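The numbers above trace a curve you can model. A power-law decay is one common way to fit forgetting data; in this sketch the exponent is an illustrative value tuned to roughly match the article's figures, not something Ebbinghaus published:

```python
def retention(days_elapsed, k=1.1):
    """Power-law forgetting curve, R = (1 + t) ** -k.

    The exponent k is an illustrative value chosen so the curve roughly
    matches the article's figures: ~47% retained after one day, ~10%
    after a week, ~2% after a month.
    """
    return (1 + days_elapsed) ** -k

for day in (0, 1, 7, 30):
    print(f"day {day:>2}: {retention(day):.0%} retained")
```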


With just 20 minutes of work, you’ll retain almost all of what you learned.

This is possible through the practice of what’s called spaced intervals, where you revisit and reprocess the same material, but in a very specific pattern. Doing so means it takes you less and less time to retrieve the information from your long-term memory when you need it. Here’s where the 20 minutes and very specifically spaced intervals come in.

Ebbinghaus’s formula calls for you to spend 10 minutes reviewing the material within 24 hours of having received it (that will raise the curve back up to almost 100 percent retained again). Seven days later, spend five minutes to “reactivate” the same material and raise the curve up again. By day 30, your brain needs only two to four minutes to completely “reactivate” the same material, again raising the curve back up.
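The schedule is simple enough to generate automatically. Here is a minimal sketch of a review-date planner; the intervals and durations come straight from the formula described above:

```python
from datetime import date, timedelta

# Spaced-review schedule: 10 minutes within 24 hours,
# 5 minutes at day 7, and 2-4 minutes at day 30.
REVIEWS = [(1, "10 min"), (7, "5 min"), (30, "2-4 min")]

def review_plan(lecture_day):
    """Return (date, duration) pairs for each spaced review."""
    return [(lecture_day + timedelta(days=offset), duration)
            for offset, duration in REVIEWS]

for when, how_long in review_plan(date(2024, 3, 1)):
    print(f"{when}: review for {how_long}")
```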

Thus, a total of 20 minutes invested in review at specific intervals and, voila, a month later you have fantastic retention of that interesting seminar. After that, monthly brush-ups of just a few minutes will help you keep the material fresh.


Here’s what happened when I tried it.

I put the specific formula to the test. I keynoted at a conference and was also able to take in two other one-hour keynotes at the conference. For one of the keynotes, I took no notes, and sure enough, just shy of a month later I can barely remember any of it.

For the second keynote, I took copious notes and followed the spaced interval formula. A month later, by golly, I remember virtually all of the material. And in case you’re wondering, both talks were equally interesting to me–the difference was the reversal of Ebbinghaus’s Forgetting Curve.

So the bottom line here is if you want to remember what you learned from an interesting seminar or session, don’t take a “cram for the exam” approach when you want to use the info. That might have worked in college (although the University of Waterloo specifically advises against cramming, encouraging students to follow the aforementioned approach). Instead, invest the 20 minutes (in spaced-out intervals), so that a month later it’s all still there in the old noggin. Now that approach is really using your head.

Science has shown that reading can enhance your cognitive function, develop your language skills, and increase your attention span. Plus, not only does the act of reading train your brain for success, but you’ll also learn new things! The founder of Microsoft, Bill Gates, said, “Reading is still the main way that I both learn new things and test my understanding.”

By: Scott Mautz

Source: Pocket


Critics:

Dr. John N. Morris is the director of social and health policy research at the Harvard-affiliated Institute for Aging Research. He believes there are three main guidelines you should follow when training your mind:

  1. Do Something Challenging: Whatever you do to train your brain, it should be challenging and take you beyond your comfort zone.
  2. Choose Complex Activities: Good brain training exercises should require you to practice complex thought processes, such as creative thinking and problem-solving.
  3. Practice Consistently: You know the saying: practice makes perfect! Dr. Morris says, “You can’t improve memory if you don’t work at it. The more time you devote to engaging your brain, the more it benefits.”
  4. If you’re looking for reading material, check out our guides covering 40 must-read books and the best books for entrepreneurs.
  5. Practice self-awareness. Whenever you feel low, check in with yourself and try to identify the negative thought-loop at play. Perhaps you’re thinking something like, “who cares,” “I’ll never get this right,” “this won’t work,” or “what’s the point?”
  6. Science has shown that mindfulness meditation helps engage new neural pathways in the brain. These pathways can improve self-observational skills and mental flexibility – two attributes that are crucial for success. What’s more, another study found that “brief, daily meditation enhances attention, memory, mood, and emotional regulation in non-experienced meditators.”
  7. Brain Age Concentration Training is a brain training and mental fitness system for the Nintendo 3DS system.
  8. Queendom has thousands of personality tests and surveys. It also has an extensive collection of “brain tools”—including logic, verbal, spatial, and math puzzles; trivia quizzes; and aptitude tests
  9. Claiming to have the world’s largest collection of brain teasers, Braingle’s free website provides more than 15,000 puzzles, games, and other brain teasers as well as an online community of enthusiasts.

 

3 Simple Habits That Can Protect Your Brain From Cognitive Decline

You might think that the impact of aging on the brain is something you can’t do much about. After all, isn’t it an inevitability? To an extent, yes: we can’t rewind the clock and change our levels of higher education or intelligence (both factors that delay the onset of symptoms of aging).

But adopting specific lifestyle behaviors–whether you’re in your thirties or late forties–can have a tangible effect on how well you age. Even in your fifties and beyond, activities like learning a new language or musical instrument, taking part in aerobic exercise, and developing meaningful social relationships can do wonders for your brain. There’s no question that when we compromise on looking after ourselves, our aging minds pick up the tab.

The Aging Process and Cognitive Decline

Over time, there is a build-up of toxins such as tau proteins and beta-amyloid plaques in the brain that correlate to the aging process and associated cognitive decline. Although this is a natural part of growing older, many factors can exacerbate it. Stress, neurotoxins such as alcohol and lack of (quality and quantity) sleep can speed up the process.

Neuroplasticity–the function that allows the brain to change and develop in our lifetime–has three mechanisms: synaptic connection, myelination, and neurogenesis. The key to resilient aging is improving neurogenesis, the birth of new neurons. Neurogenesis happens far more in babies and children than adults.

A 2018 study by researchers at Columbia University shows that in adults, this type of neuroplastic activity occurs in the hippocampus, the part of the brain that lays down memories. This makes sense as we respond to and store new experiences every day, and cement them during sleep. The more we can experience new things, activities, people, places, and emotions, the more likely we are to encourage neurogenesis.

With all this in mind, we can come up with a three-point plan to encourage “resilient aging” by activating neurogenesis in the brain:

1. Get your heart rate up

Aerobic exercise such as running or brisk walking has a potentially massive impact on neurogenesis. A 2016 rat study found that endurance exercise was most effective in increasing neurogenesis. It wins out over HIIT sessions and resistance training, although doing a variety of exercise also has its benefits.

Aim to do aerobic exercise for 150 minutes per week, and choose the gym, the park, or a natural landscape over busy roads: aerobic exercise boosts production of brain-derived neurotrophic factor (BDNF), a growth factor that encourages neurogenesis, while exercising in polluted areas decreases that production.

If exercising alone isn’t your thing, consider taking up a team sport or one with a social element like table tennis. Social interaction can also increase neurogenesis, and in many instances, doing so lets you practice your hand-eye coordination, which research suggests leads to structural changes in the brain that may relate to a range of cognitive benefits. This combination of coordination and socializing has been shown to increase brain thickness in the parts of the cortex related to social/emotional welfare, which is crucial as we age.

2. Change your eating patterns

Evidence shows that calorie restriction, intermittent fasting, and time-restricted eating encourage neurogenesis in humans. In rodent studies, intermittent fasting has been found to improve cognitive function and brain structure, and reduce symptoms of metabolic disorders such as diabetes.

Reducing refined sugar will help reduce oxidative damage to brain cells, too, and we know that increased oxidative damage has been linked with a higher risk of developing Alzheimer’s disease. Twenty-four hour water-only fasts have also been proven to increase longevity and encourage neurogenesis.

Try any of the following, after checking with your doctor:

  • A 24-hour water-only fast once a month
  • Reducing your calorie intake by 50%-60% on two non-consecutive days of the week, for two to three months or on an ongoing basis
  • Reducing calories by 20% every day for two weeks, three to four times a year
  • Eating only between 8 a.m. and 8 p.m., or 12 p.m. and 8 p.m., as a general rule

3. Prioritize sleep

Sleep helps promote the brain’s neural “cleaning” glymphatic system, which flushes out the build-up of age-related toxins in the brain (the tau proteins and beta-amyloid plaques mentioned above). When people are sleep-deprived, we see evidence of memory deficits, and if you miss a whole night of sleep, research shows that your IQ takes a measurable hit. Aim for seven to nine hours, and nap if it suits you. Our need for sleep decreases as we age.

Of course, there are individual exceptions, but having consistent sleep times and making sure you’re getting sufficient quality and length of sleep supports brain resilience over time. So how do you know if you’re getting enough? If you naturally wake up at the same time on weekends that you have to during the week, you probably are.

If you need to lie in or take long naps, you’re probably not. Try practicing mindfulness or yoga nidra, a guided breath-based meditation that has been shown in studies to improve sleep quality, before bed at night. There are plenty of recordings online if you want to experience it.

Pick any of the above that work for you and build it up until it becomes a habit, then move onto the next one and so on. You might find that by the end of the year, you’ll feel even healthier, more energized, and motivated than you do now, even as you turn another year older.

By: Fast Company / Tara Swart

Dr. Tara Swart is a neuroscientist, leadership coach, author, and medical doctor. Follow her on Twitter at @TaraSwart.

Source: Open-Your-Mind-Change


Critics:

Cognitive deficit is an inclusive term to describe any characteristic that acts as a barrier to the cognition process.

The term may describe deficits in overall intellectual performance or impairments specific to particular cognitive domains.

Mild cognitive impairment (MCI) is a neurocognitive disorder which involves cognitive impairments beyond those expected based on an individual’s age and education, but which are not significant enough to interfere with instrumental activities of daily living. MCI may occur as a transitional stage between normal aging and dementia, especially Alzheimer’s disease. It includes both memory and non-memory impairments. Mild cognitive impairment has been reclassified as mild neurocognitive disorder in DSM-5 and in ICD-11.

The cause of the disorder remains unclear, as well as its prevention and treatment. MCI can present with a variety of symptoms, but is divided generally into two types.

Amnestic MCI (aMCI) is mild cognitive impairment with memory loss as the predominant symptom; aMCI is frequently seen as a prodromal stage of Alzheimer’s disease. Studies suggest that these individuals tend to progress to probable Alzheimer’s disease at a rate of approximately 10% to 15% per year. It is possible that being diagnosed with cognitive decline may serve as an indicator of aMCI.

Nonamnestic MCI (naMCI) is mild cognitive impairment in which impairments in domains other than memory (for example, language, visuospatial, executive) are more prominent. It may be further divided as nonamnestic single- or multiple-domain MCI, and these individuals are believed to be more likely to convert to other dementias (for example, dementia with Lewy bodies).


The Link Between Bioelectricity and Consciousness

Life seems to be tied to bioelectricity at every level. The late electrophysiologist and surgeon Robert Becker spent decades researching the role of the body’s electric fields in development, wound healing, and limb regrowth. His 1985 book, The Body Electric: Electromagnetism and the Foundation of Life, was a fascinating deep dive into how the body is electric through and through—despite our inability to see or sense these fields with our unaided senses. But Becker’s work was far from complete.

One scientist who has taken up Becker’s line of inquiry is Michael Levin. He got hooked on the subject after he read The Body Electric. Levin has been working on “cracking the bioelectric code,” as a 2013 paper of his put it, ever since. “Evolution,” Levin has said, “really did discover how good the biophysics of electricity is for computing and processing information in non-neural tissues,” the many thousands of cell types that make up the body, our word for trillions of cells cooperating. “It’s really hard to define what’s special about neurons,” he told me. “Almost all cells do the things neurons do, just more slowly.”

How do disarranged cells and organs intuit what to do?

His team at Tufts University develops new molecular-genetic and conceptual tools to probe large-scale information processing in regeneration, embryo development, and cancer suppression—all mediated by bioelectric fields in varying degrees. This work involves examining, for example, how frogs, which normally don’t regenerate whole limbs (like salamanders do) can regrow limbs, repair their brains and spinal cords, or normalize tumors with the help of “electroceuticals” (a pun based on “pharmaceuticals”).

These are therapies that target the bioelectric circuits of tumors instead of, or together with, chemical-based therapies. Bioelectric fields are, in other words, more powerful than we have suspected and perform many surprising roles in the human body and all animal bodies.

Nature seems to have figured out that electric fields, similar to the role they play in human-created machines, can power a wide array of processes essential to life. Perhaps even consciousness itself. A veritable army of neuroscientists and electrophysiologists around the world are developing steadily deeper insights into the degree that electric and magnetic fields—“brainwaves” or “neural oscillations”—seem to reveal key aspects of consciousness.

The prevailing view for some time now has been that the brain’s bioelectric fields, which are electrical and magnetic fields produced at various physical scales, are an interesting side effect—or epiphenomenon—of the brain’s activity, but not necessarily relevant to the functioning of consciousness itself.

A number of thinkers are suggesting now, instead, that these fields may in fact be the main game in town when it comes to explaining consciousness. In a 2013 paper, philosopher Mostyn Jones reviewed various field theories of consciousness, still a minority school of thought in the field but growing.

If that approach is right, it is likely that the body’s bioelectric fields are also, more generally, associated in some manner with some kind of consciousness at various levels. Levin provided some support for this notion when I asked him about the potential for consciousness, in at least some rudimentary form, in the body’s electric fields.

“There are very few fundamental differences between neural networks and other tissues of bioelectrically communicating cells,” he said in an email. “If you think that consciousness in the brain is somehow a consequence of the brain’s electrical activity, then there’s no principled reason to assume that non-neural electric networks won’t underlie some primitive, basal (ancient) form of nonverbal consciousness.”

This way of thinking opens up exciting possibilities. It recognizes that there is perhaps some intelligence (and, to some thinkers, maybe even consciousness) in all of the body’s bioelectric fields, which are efficient sources of information transfer and even a kind of computation. In his work, Levin pieces together how these fields can contain information that guides growth and regeneration.

He sometimes describes these guiding forces as “morphogenetic fields.” It may sound like a mystical notion, but it’s quite physical and real, backed up by hard data. This information, Levin said, can be stored in multicellular electric fields “in a way that is likely very similar to how behavioral memories—of seeing a specific shape for example—are stored in a neuronal network.”

Take the case of a frog. “To become frogs, tadpoles have to rearrange their faces during metamorphosis,” Levin said. “It used to be thought that these movements were hardcoded, but our ‘Picasso’ tadpoles—which have all the organs in the wrong places—showed otherwise.” The apparent know-how that these bioelectric fields demonstrate, in terms of growing normal frogs in very un-normal circumstances, is uncanny. “Amazingly, they still largely became normal frogs!”

How do disarranged cells and organs intuit what to do? Levin, and the renowned philosopher and cognitive scientist Daniel Dennett, recently tackled this question in a rather provocatively titled article, “Cognition All the Way Down.” Something like thinking, they argue, isn’t just something we do in our heads that requires brains.

It’s a process even individual cells themselves, and not requiring any kind of brain, also take part in. To the biologists who see this as a cavalier form of anthropomorphization, Levin and Dennett say, “Chill out.” It’s useful to anthropomorphize many different kinds of life, to see in their parts and processes a variety of teleological experience. “Ever since the cybernetics advances of the 1940s and ’50s, engineers have had a robust, practical science of mechanisms with purpose and goal-directedness—without mysticism,” they write. “We suggest that biologists catch up.”

With respect to purposes and teleology (goal-directed behavior), they make their key point clear: “We think that this commendable scientific caution has gone too far, putting biologists into a straitjacket.”

A promising route for better understanding may be found, they write, in “thinking of parts of organisms as agents, detecting opportunities and trying to accomplish missions.” This is “risky, but the payoff in insight can be large.” For Levin, at least, bioelectric fields are key mechanisms for this kind of collective decision-making. These fields connect cells and tissues together, allowing, along with synaptic connections, for rapid information exchange, not only with immediate neighbors but distant ones as well.

These communication channels are involved in the emergence of cancer, which means that, according to Levin, they can potentially be useful in curing some forms of cancer. “You can [use bioelectric fields to] induce full-on metastatic melanoma—a kind of skin cancer—in perfectly normal animals with no carcinogens or nasty chemicals that break DNA,” he said. You can also use these same fields “to normalize existing tumors or prevent them from forming.” He’s currently moving this work to human clinical models.

The importance of bioelectric fields is all about connection, information, and computation. These ingredients equal cognition for Levin and Dennett, which is, for them, a continuum of complexity that has developed over a billion years of biological evolution. It’s not an all or nothing kind of thing but a spectrum—one that plays a role in development, evolution, cancer, and in the workings of consciousness itself.

By: Tam Hunt

Tam Hunt is a philosopher, a practicing lawyer, and writer. He is the author of two books on the philosophy of consciousness: Eco, Ego, Eros: Essays in Philosophy, Spirituality, and Science and Mind, World, God: Science and Spirit in the 21st Century.

Source: The Link Between Bioelectricity and Consciousness


Our bodies rely on an ultrafast nervous system to send impulses very quickly and it all starts with a special cell called the neuron. In this episode, Patrick will explain how these cells tell your body what to do.

Big Ethical Questions about the Future of AI

Artificial intelligence is already changing the way we live our daily lives and interact with machines. From optimizing supply chains to chatting with Amazon Alexa, artificial intelligence already has a profound impact on our society and economy. Over the coming years, that impact will only grow as the capabilities and applications of AI continue to expand.

AI promises to make our lives easier and more connected than ever. However, there are serious ethical considerations to any technology that affects society so profoundly. This is especially true in the case of designing and creating intelligence that humans will interact with and trust. Experts have warned about the serious ethical dangers involved in developing AI too quickly or without proper forethought. These are the top issues keeping AI researchers up at night.

Bias: Is AI fair?

Bias is a well-established facet of AI (or of human intelligence, for that matter). AI takes on the biases of the dataset it learns from. This means that if researchers train an AI on data that are skewed for race, gender, education, wealth, or any other point of bias, the AI will learn that bias. For instance, an artificial intelligence application used to predict future criminals in the United States showed higher risk scores and recommended harsher actions for black defendants than for white ones, based on the racial bias in America’s criminal incarceration data.
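One concrete way to surface this kind of bias is to compare a model's outcomes across groups. Here is a toy sketch of such a check; the predictions and group labels are invented for illustration:

```python
from collections import defaultdict

def positive_rate_by_group(predictions):
    """Fraction of positive ('high risk') predictions per group."""
    totals, positives = defaultdict(int), defaultdict(int)
    for group, predicted_high_risk in predictions:
        totals[group] += 1
        positives[group] += predicted_high_risk
    return {g: positives[g] / totals[g] for g in totals}

# Invented data: (group label, did the model flag as high risk?)
preds = [("A", 1), ("A", 1), ("A", 0), ("A", 1),
         ("B", 0), ("B", 1), ("B", 0), ("B", 0)]
rates = positive_rate_by_group(preds)
print(rates)  # a large gap between groups signals possible bias
```

Fairness auditing in practice involves many competing metrics (equalized odds, calibration, and so on); this demographic-parity-style comparison is only the simplest starting point.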

Of course, the challenge with AI training is there’s no such thing as a perfect dataset. There will always be under- and overrepresentation in any sample. These are not problems that can be addressed quickly. Mitigating bias in training data and providing equal treatment from AI is a major key to developing ethical artificial intelligence.

Liability: Who is responsible for AI?

Last month when an Uber autonomous vehicle killed a pedestrian, it raised many ethical questions. Chief among them is “Who is responsible, and who’s to blame when something goes wrong?” One could blame the developer who wrote the code, the sensor hardware manufacturer, Uber itself, the Uber supervisor sitting in the car, or the pedestrian for crossing outside a crosswalk.

Any AI system will have errors, long-term changes, and unforeseen consequences. Since AI is so complex, determining liability isn’t trivial. This is especially true when AI has serious implications for human lives, like piloting vehicles, determining prison sentences, or automating university admissions. These decisions will affect real people for the rest of their lives. On one hand, AI may be able to handle these situations more safely and efficiently than humans. On the other hand, it’s unrealistic to expect AI will never make a mistake. Should we write that off as the cost of switching to AI systems, or should we prosecute AI developers when their models inevitably make mistakes?

Security: How do we protect access to AI from bad actors?

As AI becomes more powerful across our society, it will also become more dangerous as a weapon. It’s possible to imagine a scary scenario where a bad actor takes over the AI model that controls a city’s water supply, power grid, or traffic signals. Scarier still is the militarization of AI, where robots learn to fight and drones can fly themselves into combat.

Cybersecurity will become more important than ever. Controlling access to the power of AI is a huge challenge and a difficult tightrope to walk. We shouldn’t centralize the benefits of AI, but we also don’t want the dangers of AI to spread. This becomes especially challenging in the coming years as AI becomes more intelligent and faster than our brains by an order of magnitude.

Human Interaction: Will we stop talking to one another?

An interesting ethical dilemma of AI is the decline in human interaction. Now more than any time in history it’s possible to entertain yourself at home, alone. Online shopping means you don’t ever have to go out if you don’t want to.

While most of us still have a social life, the amount of in-person interactions we have has diminished. Now, we’re content to maintain relationships via text messages and Facebook posts. In the future, AI could be a better friend to you than your closest friends. It could learn what you like and tell you what you want to hear. Many have worried that this digitization (and perhaps eventual replacement) of human relationships is sacrificing an essential, social part of our humanity.

Employment: Is AI getting rid of jobs?

This is a concern that repeatedly appears in the press. It’s true that AI will be able to do some of today’s jobs better than humans. Inevitably, those people will lose their jobs, and it will take a major societal initiative to retrain those employees for new work. However, it’s likely that AI will replace jobs that were boring, menial, or unfulfilling. Individuals will be able to spend their time on more creative pursuits, and higher-level tasks. While jobs will go away, AI will also create new markets, industries, and jobs for future generations.

Wealth Inequality: Who benefits from AI?

The companies spending the most on AI development today are the ones with a lot of money to spend. A major ethical concern is that AI will only serve to centralize wealth further. If an employer can lay off workers and replace them with unpaid AI, it can generate the same amount of profit without the need to pay employees.

Machines will create wealth more than ever in the economy of the future. Governments and corporations should start thinking now about how we redistribute that wealth so that everyone can participate in the AI-powered economy.

Power & Control: Who decides how to deploy AI?

Along with the centralization of wealth comes the centralization of power and control. The companies that control AI will have tremendous influence over how our society thinks and acts each day. Regulating the development and operation of AI applications will be critical for governments and consumers. Just as we’ve recently seen Facebook get in trouble for the influence its technology and advertising has had on society, we might also see AI regulations that codify equal opportunity for everyone and consumer data privacy.

Robot Rights: Can AI suffer?

A more conceptual ethical concern is whether AI can or should have rights. As a piece of computer code, it’s tempting to think that artificially intelligent systems can’t have feelings. You can get angry with Siri or Alexa without hurting their feelings. However, it’s clear that consciousness and intelligence operate on a system of reward and aversion. As artificially intelligent machines become smarter than us, we’ll want them to be our partners, not our enemies. Codifying humane treatment of machines could play a big role in that.

Ethics in AI in the coming years

Artificial intelligence is one of the most promising technological innovations in human history. It could help us solve a myriad of technical, economic, and societal problems. However, it will also come with serious drawbacks and ethical challenges. It’s important that experts and consumers alike be mindful of these questions, as they’ll determine the success and fairness of AI over the coming years.

By: Steve Kilpatrick
Co-Founder & Director
Artificial Intelligence & Machine Learning


Improve Your Cognitive Health with This Brain-Training App

Your brain can be a great indicator of your overall health. These days, with so many of us confined to self-isolation and offices going fully remote, there’s no shame in feeling a little brain drain. But don’t let the doldrums get you down and harm your health.

CogniFit Premium Brain Training is designed to detect risk factors for alterations in cognitive functioning using neuropsychological assessments. Whether you’ve been feeling a bit slower lately or you just aren’t as motivated as you used to be, CogniFit can help you identify why. Millions of users already use CogniFit to identify possible cognitive alterations and deficiencies so they can create a personalized brain training regimen for their needs. Through validated tasks to evaluate 23 cognitive skills, CogniFit helps measure, train, and properly monitor mental fitness and its relation to neurological pathologies.

The intuitive app lets you personalize your training by choosing your preferred programs and the age group you’re in, so it can provide better insights. CogniFit’s exercises and brain games help stimulate cognitive functions and improve brain plasticity while providing real-time monitoring of how your skills evolve and comparing the results to age-group norms.


By: Entrepreneur Store Entrepreneur Leadership Network VIP


Apparently, alcohol, not getting enough sleep, no physical activity, smoking, high-fat or high-sodium foods, or being lonely could add to early cognitive decline. So if you want to be a member of the brainiacs’ club in your 70s or just remember your grandson’s birthday, you’d better start exercising in your 30s and put down those cheeseballs rolled in bacon (at least every now and then!).

In any event, there are methods to actively strengthen your cognitive abilities. Life-long learning, reading books (The Medical Futurist believes especially science fiction stimulates the brain) or playing mind-games all help. According to a study at the Fisher Center for Alzheimer’s Research Foundation, mental stimulation like reading can help protect memory and thinking skills, especially with age. The authors of the study suggest that reading every day can slow down a late-life cognitive decline, keeping the brain healthier and higher functioning for longer.

As part of our From Chance To Choice campaign, which through the HOW TO series aims to show methods and tools for taking more control over our own health in the long term, we would like to suggest some technological solutions, apps and games, to help keep your brain as fit as possible from early on.

Gameplay focuses and controls our attention, taps into our innate strengths, thrills us utterly and compels us to greater resilience in the attainment of more powerful and useful skills. That’s why gamified apps are perfect for improving and maintaining cognitive abilities.

Recently, several start-ups have started to experiment with bringing challenging offline games to digital brain-training apps. These usually stick to the same format: collections of mini-games you can play on any device with the purpose of improving comprehension, focus, and self-confidence; nicely drawn graphs to show how you’re developing over time; and optional subscriptions for extra games and features. So, here are our four favorites!

Source: https://medicalfuturist.com

