Train Your Brain to Remember Anything You Learn With This Simple, 20-Minute Habit

Not too long ago, a colleague and I were lamenting the process of growing older and the inevitable increasing difficulty of remembering things we want to remember. That becomes particularly annoying when you attend a conference or a learning seminar and find yourself forgetting the entire session just days later.

But then my colleague told me about the Ebbinghaus Forgetting Curve, a 100-year-old formula developed by German psychologist Hermann Ebbinghaus, who pioneered the experimental study of memory. The psychologist’s work has resurfaced and has been making its way around college campuses as a tool to help students remember lecture material. For example, the University of Waterloo explains the curve and how to use it on the Campus Wellness website.

I teach at Indiana University and a student mentioned it to me in class as a study aid he uses. Intrigued, I tried it out too–more on that in a moment. The Forgetting Curve describes how we retain or lose information that we take in, using a one-hour lecture as the basis of the model. The curve is at its highest point (the most information retained) right after the one-hour lecture. One day after the lecture, if you’ve done nothing with the material, you’ll have lost between 50 and 80 percent of it from your memory.

By day seven, that erodes to about 10 percent retained, and by day 30, the information is virtually gone (only 2-3 percent retained). After this, without any intervention, you’ll likely need to relearn the material from scratch. Sounds about right from my experience. But here comes the amazing part–how easily you can train your brain to reverse the curve.
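To make those percentages concrete, the unreviewed curve can be sketched in a few lines of code. The hyperbolic form R = 1/(1 + t) is used here only because it roughly reproduces the figures above (50 percent retained after one day, around 10 percent at day 7, about 3 percent at day 30); it is an illustrative stand-in, not Ebbinghaus’s actual formula.

```python
def retention(days):
    """Approximate fraction of a lecture retained `days` later, with no review.

    R = 1 / (1 + t) is a stand-in chosen to roughly match the
    article's figures; Ebbinghaus's own curve fit was more complex.
    """
    return 1 / (1 + days)

for day in (0, 1, 7, 30):
    print(f"day {day:2d}: {retention(day):.0%} retained")
```

Running this prints 100 percent on day 0, 50 percent on day 1, and the familiar slide toward near-zero by day 30.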

With just 20 minutes of work, you’ll retain almost all of what you learned.

This is possible through the practice of what’s called spaced intervals, where you revisit and reprocess the same material, but in a very specific pattern. Doing so means it takes you less and less time to retrieve the information from your long-term memory when you need it. Here’s where the 20 minutes and very specifically spaced intervals come in.

Ebbinghaus’s formula calls for you to spend 10 minutes reviewing the material within 24 hours of having received it (that will raise the curve back up to almost 100 percent retained again). Seven days later, spend five minutes to “reactivate” the same material and raise the curve up again. By day 30, your brain needs only two to four minutes to completely “reactivate” the same material, again raising the curve back up.

Thus, a total of 20 minutes invested in review at specific intervals and, voila, a month later you have fantastic retention of that interesting seminar. After that, monthly brush-ups of just a few minutes will help you keep the material fresh.
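If you want to put those reviews on a calendar, the schedule is trivial to generate. A minimal sketch (the date offsets and durations simply restate the 24-hour, day-7, and day-30 reviews described above):

```python
from datetime import date, timedelta

def review_schedule(lecture_day):
    """Return (date, task) pairs for Ebbinghaus-style spaced reviews:
    10 minutes within 24 hours, 5 minutes at day 7, and 2-4 minutes
    at day 30 -- about 20 minutes in total."""
    return [
        (lecture_day + timedelta(days=1), "10 min review"),
        (lecture_day + timedelta(days=7), "5 min reactivation"),
        (lecture_day + timedelta(days=30), "2-4 min reactivation"),
    ]

for when, task in review_schedule(date(2024, 3, 1)):
    print(when.isoformat(), "-", task)
```

Feed it the date of any lecture or seminar and it emits the three review dates to drop into your calendar.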

Here’s what happened when I tried it.

I put the specific formula to the test. I keynoted at a conference and was also able to take in two other one-hour keynotes there. For one of the keynotes, I took no notes, and sure enough, just shy of a month later I can barely remember any of it.

For the second keynote, I took copious notes and followed the spaced interval formula. A month later, by golly, I remember virtually all of the material. And in case you’re wondering, both talks were equally interesting to me–the difference was the reversal of Ebbinghaus’s Forgetting Curve.

So the bottom line here is if you want to remember what you learned from an interesting seminar or session, don’t take a “cram for the exam” approach when you want to use the info. That might have worked in college (although the University of Waterloo specifically advises against cramming, encouraging students to follow the aforementioned approach). Instead, invest the 20 minutes (in spaced-out intervals), so that a month later it’s all still there in the old noggin. Now that approach is really using your head.

Science has proven that reading can enhance your cognitive function, develop your language skills, and increase your attention span. Plus, not only does the act of reading train your brain for success, but you’ll also learn new things! The founder of Microsoft, Bill Gates, said, “Reading is still the main way that I both learn new things and test my understanding.”

By: Scott Mautz

Source: Pocket



Dr. John N. Morris is the director of social and health policy research at the Harvard-affiliated Institute for Aging Research. He believes there are three main guidelines you should follow when training your mind:

  1. Do Something Challenging: Whatever you do to train your brain, it should be challenging and take you beyond your comfort zone.
  2. Choose Complex Activities: Good brain training exercises should require you to practice complex thought processes, such as creative thinking and problem-solving.
  3. Practice Consistently: You know the saying: practice makes perfect! Dr. Morris says, “You can’t improve memory if you don’t work at it. The more time you devote to engaging your brain, the more it benefits.”
Beyond those three guidelines, a couple of complementary habits help:

Practice self-awareness. Whenever you feel low, check in with yourself and try to identify the negative thought-loop at play. Perhaps you’re thinking something like, “who cares,” “I’ll never get this right,” “this won’t work,” or “what’s the point?”

Science has shown that mindfulness meditation helps engage new neural pathways in the brain. These pathways can improve self-observational skills and mental flexibility – two attributes that are crucial for success. What’s more, another study found that “brief, daily meditation enhances attention, memory, mood, and emotional regulation in non-experienced meditators.”

If you prefer ready-made tools, a few popular brain-training resources are:

  • Brain Age Concentration Training, a brain training and mental fitness system for the Nintendo 3DS.
  • Queendom, which offers thousands of personality tests and surveys, plus an extensive collection of “brain tools,” including logic, verbal, spatial, and math puzzles; trivia quizzes; and aptitude tests.
  • Braingle, whose free website claims the world’s largest collection of brain teasers, providing more than 15,000 puzzles, games, and other teasers as well as an online community of enthusiasts.


Hey, There’s a Second Brain In Your Gut

Scientists have known for years that there’s a “second brain” of autonomous neurons in your long, winding human digestive tract—but that’s about where their knowledge of the so-called abdominal brain ends.

Now, research published in 2020 shows that scientists have catalogued 12 different kinds of neurons in the enteric nervous system (ENS) of mice. This “fundamental knowledge” unlocks a huge number of paths to new experiments and findings.

The gut brain greatly affects how your body works. Your digestive system has a daily job to do as part of your metabolism, but it’s also subject to fluctuations in functionality and is closely tied to your emotions.

More: Getting the Inside Dope on Ketamine’s Mysterious Ability to Rapidly Relieve Depression

Digestive symptoms and anxiety can be comorbid, and your gut is heavily affected by stress. So scientists believe having a better understanding of what happens in your ENS could lead to better medicines and treatments for a variety of conditions, as well as improved knowledge of the connection between the ENS and central nervous system.

The research appears in Nature Neuroscience. In a related commentary, scientist Julia Ganz explains what the researchers found and why it’s so important:

“Using single-cell RNA-sequencing to profile the developing and juvenile ENS, the authors discovered a conceptually new model of neuronal diversification in the ENS and establish a new molecular taxonomy of enteric neurons based on a plethora of molecular markers.”

Neuronal diversification happens in, well, all the organisms that have neurons. Similar to stem cells, neurons develop first as more generic “blanks” and then into functional specialties. The human brain has types like sensory and motor neurons, each of which has subtypes. There are so many subtypes, in fact, that scientists aren’t sure how to even fully catalog them yet.

More: Here’s How Long Alcohol-Induced Brain Damage Persists After Drinking

Neurons of the same superficial type are different in the brain versus the brain stem—let alone in the digestive tract. So researchers had to start at the very beginning and trace how these neurons develop. They tracked RNA, which determines how DNA is expressed in the cells made by your body, to follow how neurons formed both before and after birth. Some specialties emerge in utero, and some split and form afterward.

To find this new information, the scientists developed a finer way to separate and identify cells. Ganz explains:

“Using extensive co-staining with established markers, they were able to relate the twelve neuron classes to previously discovered molecular characteristics of functional enteric neuron types, thus classifying the ENCs into excitatory and inhibitory motor neurons, interneurons, and intrinsic primary afferent neurons.”

With a sharper protocol and new information, the researchers were able to confirm and expand on the existing body of ENS neuron knowledge. And now they can work on finding out what each of the 12 ENS neuron types is responsible for, they say.

By isolating different kinds and “switching” them on or off using genetic information, scientists can try to identify what’s missing from the function of the mouse ENS. And studying these genes could lead to new treatments that use stem cells or RNA to control the expression of harmful genes.

The Mind-Gut Connection is something that people have intuitively known for a long time, but science has, I would say, only in the last few years gotten a grasp and acceptance of this concept. It essentially means that your brain has intimate connections with the gut and with another entity in our gut, the second brain: about 100 million nerve cells that are sandwiched in between the layers of the gut.

And they can do a lot of things on their own in terms of regulating our digestive processes. But there’s a very intimate conversation between that little brain, the second brain in the gut, and our main brain. They use the same neurotransmitters. They’re connected by nerve pathways. And so we really have an integrated system from our brain to the little brain in the gut, and it goes in both directions.

The little brain, or the second brain, in the gut – you’re not able to see it because, as I said, it’s spread out through the entire length of the gut, from your esophagus to the end of your large intestine: several layers of interconnected nerve cells. And you can show this in animal experiments: if you completely disconnect this little brain in the gut from your main brain, the little brain can completely take care of all the digestive processes – the contractions, the peristaltic reflex, the regulation of blood flow in the intestine.

And it has many sensors, so it knows exactly what’s going on inside the gut and in the wall of the gut – any distention, any chemicals. All of this is picked up by these sensory nerves and fed into the enteric nervous system, the second brain. And then the second brain generates these stereotypic responses. So when you vomit, when you have diarrhea, when you have normal digestion, all of this is encoded in programs in your second brain.

What the second brain can’t do is generate any conscious perceptions or gut feelings. The only thing that allows us to perceive all the stuff that goes on inside of us is really the big brain, and the specific areas and circuits within the brain that process information that comes up from the gut. Still, most of that information is not consciously perceived. About 95 percent of this massive amount of information coming from the gut is processed and integrated with other inputs that the brain gets from the outside, from smell and visual stimuli.

And only a very small portion is then actually made conscious. So when you feel good after a meal, or when you ate the wrong thing and you’re nauseated, those are the few occasions where we actually become aware of our gut feelings, even though a lot of other stuff is going on in this brain-gut axis all the time.

When we talk about the connection between depression and the gut, there are some very intriguing observations – both clinical and, more recently, scientific – that make it highly plausible that there is an intricate connection between serotonin in the gut, serotonin in our food, depression, and gut function.

By: Caroline Delbert

Caroline Delbert is a writer, book editor, researcher, and avid reader. She’s also an enthusiast of just about everything.

Source: Pocket



The enteric nervous system (ENS) or intrinsic nervous system is one of the main divisions of the autonomic nervous system (ANS) and consists of a mesh-like system of neurons that governs the function of the gastrointestinal tract. It is capable of acting independently of the sympathetic and parasympathetic nervous systems, although it may be influenced by them. The ENS is also called the second brain. It is derived from neural crest cells.

The enteric nervous system is capable of operating independently of the brain and spinal cord, but does rely on innervation from the autonomic nervous system via the vagus nerve and prevertebral ganglia in healthy subjects. However, studies have shown that the system is operable with a severed vagus nerve.

The neurons of the enteric nervous system control the motor functions of the system, in addition to the secretion of gastrointestinal enzymes. These neurons communicate through many neurotransmitters similar to the CNS, including acetylcholine, dopamine, and serotonin. The large amounts of serotonin and dopamine in the gut are key areas of research for neurogastroenterologists.


3 Simple Habits That Can Protect Your Brain From Cognitive Decline

You might think that the impact of aging on the brain is something you can’t do much about. After all, isn’t it an inevitability? To an extent, yes: we can’t rewind the clock and change our levels of higher education or intelligence (both factors that delay the onset of symptoms of aging).

But adopting specific lifestyle behaviors–whether you’re in your thirties or late forties–can have a tangible effect on how well you age. Even in your fifties and beyond, activities like learning a new language or musical instrument, taking part in aerobic exercise, and developing meaningful social relationships can do wonders for your brain. There’s no question that when we compromise on looking after ourselves, our aging minds pick up the tab.

The Aging Process and Cognitive Decline

Over time, there is a build-up of toxins such as tau proteins and beta-amyloid plaques in the brain that correlates with the aging process and associated cognitive decline. Although this is a natural part of growing older, many factors can exacerbate it. Stress, neurotoxins such as alcohol, and a lack of sleep (in both quality and quantity) can speed up the process.

Neuroplasticity–the function that allows the brain to change and develop in our lifetime–has three mechanisms: synaptic connection, myelination, and neurogenesis. The key to resilient aging is improving neurogenesis, the birth of new neurons. Neurogenesis happens far more in babies and children than adults.

A 2018 study by researchers at Columbia University shows that in adults, this type of neuroplastic activity occurs in the hippocampus, the part of the brain that lays down memories. This makes sense as we respond to and store new experiences every day, and cement them during sleep. The more we can experience new things, activities, people, places, and emotions, the more likely we are to encourage neurogenesis.

With all this in mind, we can come up with a three-point plan to encourage “resilient aging” by activating neurogenesis in the brain:

1. Get your heart rate up

Aerobic exercise such as running or brisk walking has a potentially massive impact on neurogenesis. A 2016 rat study found that endurance exercise was most effective in increasing neurogenesis. It wins out over HIIT sessions and resistance training, although doing a variety of exercise also has its benefits.

Aim to do aerobic exercise for 150 minutes per week, and choose the gym, the park, or a natural landscape over busy roads: aerobic exercise boosts production of brain-derived neurotrophic factor (BDNF), a growth factor that encourages neurogenesis, but exercising in polluted areas decreases that production.

If exercising alone isn’t your thing, consider taking up a team sport or one with a social element like table tennis. Exposure to social interaction can also increase neurogenesis, and in many instances doing so lets you practice your hand-eye coordination, which research suggests leads to structural changes in the brain that may relate to a range of cognitive benefits. This combination of coordination and socializing has been shown to increase brain thickness in the parts of the cortex related to social/emotional welfare, which is crucial as we age.

2. Change your eating patterns

Evidence shows that calorie restriction, intermittent fasting, and time-restricted eating encourage neurogenesis in humans. In rodent studies, intermittent fasting has been found to improve cognitive function and brain structure, and reduce symptoms of metabolic disorders such as diabetes.

Reducing refined sugar will help reduce oxidative damage to brain cells, too, and we know that increased oxidative damage has been linked with a higher risk of developing Alzheimer’s disease. Twenty-four-hour water-only fasts have also been shown to increase longevity and encourage neurogenesis.

Try any of the following, after checking with your doctor:

  • 24-hour water-only fast once a month
  • Reducing your calorie intake by 50%-60% on two non-consecutive days of the week, for two to three months or on an ongoing basis
  • Reducing calories by 20% every day for two weeks. You can do this three to four times a year
  • Eating only between 8 a.m. and 8 p.m., or 12 p.m. and 8 p.m., as a general rule

3. Prioritize sleep

Sleep helps promote the brain’s neural “cleaning” glymphatic system, which flushes out the build-up of age-related toxins in the brain (the tau proteins and beta-amyloid plaques mentioned above). When people are sleep-deprived, we see evidence of memory deficits, and if you miss a whole night of sleep, research shows that it impairs IQ. Aim for seven to nine hours, and nap if it suits you. Our need for sleep decreases as we age.

Of course, there are individual exceptions, but having consistent sleep times and making sure you’re getting sufficient quality and length of sleep supports brain resilience over time. So how do you know if you’re getting enough? If you naturally wake up at the same time on weekends that you have to during the week, you probably are.

If you need to lie in or take long naps, you’re probably not. Try practicing mindfulness, or yoga nidra – a guided breath-based meditation that has been shown in studies to improve sleep quality – before bed at night. There are plenty of recordings online if you want to experience it.

Pick any of the above that work for you and build it up until it becomes a habit, then move onto the next one and so on. You might find that by the end of the year, you’ll feel even healthier, more energized, and motivated than you do now, even as you turn another year older.

By: Fast Company / Tara Swart

Dr. Tara Swart is a neuroscientist, leadership coach, author, and medical doctor. Follow her on Twitter at @TaraSwart.

Source: Open-Your-Mind-Change



Cognitive deficit is an inclusive term to describe any characteristic that acts as a barrier to the cognition process.


Mild cognitive impairment (MCI) is a neurocognitive disorder which involves cognitive impairments beyond those expected based on an individual’s age and education, but which are not significant enough to interfere with instrumental activities of daily living. MCI may occur as a transitional stage between normal aging and dementia, especially Alzheimer’s disease. It includes both memory and non-memory impairments. Mild cognitive impairment has been relisted as mild neurocognitive disorder in DSM-5 and in ICD-11.

The cause of the disorder remains unclear, as well as its prevention and treatment. MCI can present with a variety of symptoms, but is divided generally into two types.

Amnestic MCI (aMCI) is mild cognitive impairment with memory loss as the predominant symptom; aMCI is frequently seen as a prodromal stage of Alzheimer’s disease. Studies suggest that these individuals tend to progress to probable Alzheimer’s disease at a rate of approximately 10 to 15 percent per year. It is possible that being diagnosed with cognitive decline may serve as an indicator of aMCI.

Nonamnestic MCI (naMCI) is mild cognitive impairment in which impairments in domains other than memory (for example, language, visuospatial, executive) are more prominent. It may be further divided as nonamnestic single- or multiple-domain MCI, and these individuals are believed to be more likely to convert to other dementias (for example, dementia with Lewy bodies).


How Does The Brain Interpret Computer Languages

In the US, a 2016 Gallup poll found that the majority of schools want to start teaching code, with 66 percent of K-12 school principals thinking that computer science learning should be incorporated into other subjects. Most countries in Europe have added coding classes and computer science to their school curricula, with France and Spain introducing theirs in 2015. This new generation of coders is expected to boost the worldwide developer population from 23.9 million in 2019 to 28.7 million in 2024.

Despite all this effort, there’s still some confusion on how to teach coding. Is it more like a language, or more like math? Some new research may have settled this question by watching the brain’s activity while subjects read Python code.

Two schools on schooling

Right now, there are two schools of thought. The prevailing one is that coding is a type of language, with its own grammar rules and syntax that must be followed. After all, they’re called coding languages for a reason, right? This idea even has its own snazzy acronym: Coding as Another Language, or CAL. Others think that it’s a bit like learning the logic found in math; formulas and algorithms to create output from input. There’s even a free online course to teach you both coding and math at the same time.

Which approach is more effective? The debate has been around since coding was first taught in schools, but it looks like the language argument is now winning. Laws in Texas, Oklahoma, and Georgia allow high school students to take computer science to fulfill their foreign language credits (the 2013 Texas law says this applies if the student has already taken a foreign language class and appears unlikely to advance).

The debate holds a special interest for neuroscientists; since computer programming has only been around for a few decades, the brain has not evolved any special region to handle it. It must be repurposing a region of the brain normally used for something else.

So late last year, neuroscientists at MIT tried to see which parts of the brain people use when dealing with computer programming. “The ability to interpret computer code is a remarkable cognitive skill that bears parallels to diverse cognitive domains, including general executive functions, math, logic, and language,” they wrote.

Since coding can be learned as an adult, they figured it must rely on some pre-existing cognitive system in our brains. Two brain systems seemed like likely candidates: either the brain’s language system, or the system that tackles complex cognitive tasks such as solving math problems or a crossword. The latter is known as the “multiple demand network.”

Coding on the brain

In their experiment, researchers asked participants already proficient at coding to lie in an fMRI machine to measure their brain activity. They were then shown a coding problem and asked to predict its output. The two coding languages used in the study are known for their “readability”—Python and ScratchJr. The latter was specifically developed for children and is symbol-based, so that children who have not yet learned to read can still use it.

The main task involved giving participants a person’s height and weight and asking them to calculate that person’s BMI. This problem was either presented as Python-style code or as a normal sentence. The same method was used for ScratchJr, but participants were asked to track the position of a kitten as it walked and jumped.
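A hypothetical stimulus in the style the study describes (a short, readable Python function whose printed output the participant must predict) might look like the following; the actual experimental materials are not reproduced in the article:

```python
# Invented example of a "predict the output" stimulus, not taken
# from the study's materials.
def bmi(height_m, weight_kg):
    # Body mass index: weight in kilograms over height in meters squared.
    return weight_kg / height_m ** 2

print(round(bmi(1.75, 70), 1))  # participant predicts: 22.9
```

Reading the function, tracking the arithmetic, and predicting “22.9” is exactly the kind of comprehension task the fMRI participants performed.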

Control tasks involved memorizing a sequence of squares on a grid (to activate participants’ multiple demand system) and reading one normal and one nonsense sentence (to activate their language system). Their results showed that the language part of the brain responded weakly when reading code (the paper’s authors think this might be because there was no speaking/listening involved). Instead, these tasks were mostly handled by the multiple demand network.

The multiple demand network is spread across the frontal and parietal (top) lobes of our brain, and it’s responsible for intense mental tasks—the parts of our lives that make us think hard. The network can be roughly split between the left part (responsible for logic) and the right (more suited to abstract thinking). The MIT researchers found that reading Python code appears to activate both the left and right sides of the multiple demand network, and ScratchJr activated the right side slightly more than the left.

“We found that the language system does not respond consistently during code comprehension in spite of numerous similarities between code and natural languages,” they write. Interestingly, code-solving activated parts of the multiple-demand network that are not activated when solving math problems. So the brain doesn’t tackle it as language or logic—it appears to be its own thing.

The distinct process involved in interpreting computer code was backed up by an experiment done by Japanese neuroscientists last year. This work showed snippets of code to novice, experienced, and expert programmers while they lay in an fMRI. The participants were asked to categorize them into one of four types of algorithms. As expected, the programmers with higher skills were better at categorizing the snippets. But the researchers also found that activity in brain regions associated with natural language processing, episodic memory retrieval, and attention control also strengthened with the skill level of the programmer.

So while coding may not be as similar to languages as we had thought, it looks like both benefit from starting young.

By: Fintan Burke

Fintan is a freelance science journalist based in Hamburg, Germany. He has also written for The Irish Times and Horizon Magazine, and covers European science policy, biology, health, and bioethics.









Beauty Is In The Brain: AI Reads Brain Data, Generates Personally Attractive Images

Researchers have succeeded in making an AI understand our subjective notions of what makes faces attractive. The system demonstrated this knowledge by creating new portraits on its own that were tailored to be found personally attractive by individuals. The results can be utilised, for example, in modelling preferences and decision-making, as well as in potentially identifying unconscious attitudes.

Researchers at the University of Helsinki and University of Copenhagen investigated whether a computer would be able to identify the facial features we consider attractive and, based on this, create new images matching our criteria. The researchers used artificial intelligence to interpret brain signals and combined the resulting brain-computer interface with a generative model of artificial faces. This enabled the computer to create facial images that appealed to individual preferences.

“In our previous studies, we designed models that could identify and control simple portrait features, such as hair color and emotion. However, people largely agree on who is blond and who smiles. Attractiveness is a more challenging subject of study, as it is associated with cultural and psychological factors that likely play unconscious roles in our individual preferences. Indeed, we often find it very hard to explain what it is exactly that makes something, or someone, beautiful: Beauty is in the eye of the beholder,” says Senior Researcher and Docent Michiel Spapé from the Department of Psychology and Logopedics, University of Helsinki.

The study, which combines computer science and psychology, was published in February in the journal IEEE Transactions on Affective Computing.

Preferences exposed by the brain

Initially, the researchers gave a generative adversarial neural network (GAN) the task of creating hundreds of artificial portraits. The images were shown, one at a time, to 30 volunteers who were asked to pay attention to faces they found attractive while their brain responses were recorded via electroencephalography (EEG).

“It worked a bit like the dating app Tinder: the participants ‘swiped right’ when coming across an attractive face. Here, however, they did not have to do anything but look at the images. We measured their immediate brain response to the images,” Spapé explains.

The researchers analysed the EEG data with machine learning techniques, connecting individual EEG data through a brain-computer interface to a generative neural network.

“A brain-computer interface such as this is able to interpret users’ opinions on the attractiveness of a range of images. By interpreting their views, the AI model interpreting brain responses and the generative neural network modelling the face images can together produce an entirely new face image by combining what a particular person finds attractive,” says Academy Research Fellow and Associate Professor Tuukka Ruotsalo, who heads the project.

To test the validity of their modelling, the researchers generated new portraits for each participant, predicting they would find them personally attractive. Testing them in a double-blind procedure against matched controls, they found that the new images matched the preferences of the subjects with an accuracy of over 80%.
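The decoding step can be caricatured with a toy nearest-centroid classifier: average the EEG feature vectors recorded while a person attended to attractive faces, average those for the remaining faces, and label a new response by whichever average it sits closer to. Everything below (the two-dimensional features, the made-up trial data, the classifier itself) is an invented sketch; the actual study used far richer EEG features and machine learning models.

```python
def centroid(vectors):
    """Element-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def squared_distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def classify(features, attractive_c, other_c):
    """Nearest-centroid rule: label an EEG feature vector by whichever
    class centroid it lies closer to (squared Euclidean distance)."""
    if squared_distance(features, attractive_c) <= squared_distance(features, other_c):
        return "attractive"
    return "other"

# Invented two-dimensional "EEG features" recorded while viewing faces:
attractive_trials = [[0.9, 0.1], [1.1, 0.0], [1.0, 0.2]]
other_trials = [[0.1, 0.9], [0.0, 1.1], [0.2, 1.0]]

attractive_c = centroid(attractive_trials)
other_c = centroid(other_trials)
print(classify([0.95, 0.15], attractive_c, other_c))  # → attractive
```

In the real pipeline, per-person labels like these would then steer the generative network toward faces the individual finds attractive.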

“The study demonstrates that we are capable of generating images that match personal preference by connecting an artificial neural network to brain responses. Succeeding in assessing attractiveness is especially significant, as this is such a poignant, psychological property of the stimuli. Computer vision has thus far been very successful at categorising images based on objective patterns. By bringing in brain responses to the mix, we show it is possible to detect and generate images based on psychological properties, like personal taste,” Spapé explains.

Potential for exposing unconscious attitudes

Ultimately, the study may benefit society by advancing the capacity for computers to learn and increasingly understand subjective preferences, through interaction between AI solutions and brain-computer interfaces.

“If this is possible in something that is as personal and subjective as attractiveness, we may also be able to look into other cognitive functions such as perception and decision-making. Potentially, we might gear the device towards identifying stereotypes or implicit bias and better understand individual differences,” says Spapé.

By: University of Helsinki

Source: Beauty is in the brain: AI reads brain data, generates personally attractive images — ScienceDaily



Anjan Chatterjee uses tools from evolutionary psychology and cognitive neuroscience to study one of nature’s most captivating concepts: beauty. Learn more about the science behind why certain configurations of line, color and form excite us in this fascinating, deep look inside your brain.

Journal Reference:

  1. Michiel Spape, Keith Davis, Lauri Kangassalo, Niklas Ravaja, Zania Sovijarvi-Spape, Tuukka Ruotsalo. Brain-computer interface for generating personally attractive images. IEEE Transactions on Affective Computing, 2021. DOI: 10.1109/TAFFC.2021.3059043
