Train Your Brain to Remember Anything You Learn With This Simple, 20-Minute Habit

Not too long ago, a colleague and I were lamenting the process of growing older and the inevitable, increasing difficulty of remembering things we want to remember. That becomes particularly annoying when you attend a conference or a learning seminar and find yourself forgetting the entire session just days later.

But then my colleague told me about the Ebbinghaus Forgetting Curve, a 100-year-old formula developed by German psychologist Hermann Ebbinghaus, who pioneered the experimental study of memory. The psychologist’s work has resurfaced and has been making its way around college campuses as a tool to help students remember lecture material. For example, the University of Waterloo explains the curve and how to use it on its Campus Wellness website.

I teach at Indiana University and a student mentioned it to me in class as a study aid he uses. Intrigued, I tried it out too–more on that in a moment. The Forgetting Curve describes how we retain or lose information that we take in, using a one-hour lecture as the basis of the model. The curve is at its highest point (the most information retained) right after the one-hour lecture. One day after the lecture, if you’ve done nothing with the material, you’ll have lost between 50 and 80 percent of it from your memory.

By day seven, that erodes to about 10 percent retained, and by day 30, the information is virtually gone (only 2-3 percent retained). After this, without any intervention, you’ll likely need to relearn the material from scratch. Sounds about right from my experience. But here comes the amazing part–how easily you can train your brain to reverse the curve.
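The decay described above can be sketched numerically. Here is a minimal Python sketch using the article’s approximate figures; the day-1 value of 35 percent retained is an assumed midpoint of the quoted “50 to 80 percent” loss, and straight-line interpolation between the data points is an illustrative simplification, not Ebbinghaus’s actual formula:

```python
# Approximate retention without any review, as quoted in the article:
# ~100% right after the lecture, ~20-50% retained on day 1 (35% assumed
# here as a midpoint), ~10% by day 7, and 2-3% by day 30.
RETENTION = [(0, 1.00), (1, 0.35), (7, 0.10), (30, 0.025)]

def retained(days: float) -> float:
    """Estimate the fraction retained by linear interpolation."""
    for (t0, r0), (t1, r1) in zip(RETENTION, RETENTION[1:]):
        if t0 <= days <= t1:
            return r0 + (r1 - r0) * (days - t0) / (t1 - t0)
    return RETENTION[-1][1]  # beyond day 30, essentially flat

print(f"Three days after the lecture: about {retained(3):.0%} retained")
```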


With just 20 minutes of work, you’ll retain almost all of what you learned.

This is possible through the practice of what’s called spaced intervals, where you revisit and reprocess the same material, but in a very specific pattern. Doing so means it takes you less and less time to retrieve the information from your long-term memory when you need it. Here’s where the 20 minutes and very specifically spaced intervals come in.

Ebbinghaus’s formula calls for you to spend 10 minutes reviewing the material within 24 hours of having received it (that will raise the curve back up to almost 100 percent retained again). Seven days later, spend five minutes to “reactivate” the same material and raise the curve up again. By day 30, your brain needs only two to four minutes to completely “reactivate” the same material, again raising the curve back up.
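As a quick illustration of the schedule just described (a sketch, not anything from Ebbinghaus’s own work), the three review dates can be computed from the lecture date; three minutes is assumed here as a midpoint of the “two to four” range:

```python
from datetime import date, timedelta

# The article's plan: 10 minutes within 24 hours, 5 minutes on day 7,
# and 2-4 minutes on day 30 (3 minutes assumed here as a midpoint).
REVIEW_PLAN = [(1, 10), (7, 5), (30, 3)]  # (days after lecture, minutes)

def review_schedule(lecture_day: date) -> list[tuple[date, int]]:
    """Return (review date, minutes) pairs for the spaced intervals."""
    return [(lecture_day + timedelta(days=d), m) for d, m in REVIEW_PLAN]

for when, minutes in review_schedule(date(2024, 9, 2)):
    print(f"{when}: review for {minutes} minutes")
```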

Thus, for a total of 20 minutes invested in review at specific intervals, voila: a month later you have fantastic retention of that interesting seminar. After that, monthly brush-ups of just a few minutes will help you keep the material fresh.


Here’s what happened when I tried it.

I put the specific formula to the test. I keynoted at a conference and was also able to take in two other one-hour keynotes there. For one of them, I took no notes, and sure enough, just shy of a month later I can barely remember any of it.

For the second keynote, I took copious notes and followed the spaced-interval formula. A month later, by golly, I remember virtually all of the material. And in case you’re wondering, both talks were equally interesting to me–the difference was the reversal of Ebbinghaus’s Forgetting Curve.

So the bottom line here is: if you want to remember what you learned from an interesting seminar or session, don’t take a “cram for the exam” approach when you want to use the info. That might have worked in college (although the University of Waterloo specifically advises against cramming, encouraging students to follow the aforementioned approach). Instead, invest the 20 minutes, in spaced-out intervals, so that a month later it’s all still there in the old noggin. Now that approach is really using your head.

Science has proven that reading can enhance your cognitive function, develop your language skills, and increase your attention span. Plus, not only does the act of reading train your brain for success, but you’ll also learn new things! The founder of Microsoft, Bill Gates, said, “Reading is still the main way that I both learn new things and test my understanding.”

By: Scott Mautz

Source: Pocket


Critics:

Dr. John N. Morris is the director of social and health policy research at the Harvard-affiliated Institute for Aging Research. He believes there are three main guidelines you should follow when training your mind:

  1. Do Something Challenging: Whatever you do to train your brain, it should be challenging and take you beyond your comfort zone.
  2. Choose Complex Activities: Good brain training exercises should require you to practice complex thought processes, such as creative thinking and problem-solving.
  3. Practice Consistently: You know the saying: practice makes perfect! Dr. Morris says, “You can’t improve memory if you don’t work at it. The more time you devote to engaging your brain, the more it benefits.”
  4. Practice self-awareness. Whenever you feel low, check in with yourself and try to identify the negative thought loop at play. Perhaps you’re thinking something like, “who cares,” “I’ll never get this right,” “this won’t work,” or “what’s the point?”
  5. Science has shown that mindfulness meditation helps engage new neural pathways in the brain. These pathways can improve self-observational skills and mental flexibility – two attributes that are crucial for success. What’s more, another study found that “brief, daily meditation enhances attention, memory, mood, and emotional regulation in non-experienced meditators.”
  6. Brain Age Concentration Training is a brain-training and mental fitness system for the Nintendo 3DS.
  7. Queendom has thousands of personality tests and surveys. It also has an extensive collection of “brain tools”—including logic, verbal, spatial, and math puzzles; trivia quizzes; and aptitude tests.
  8. Claiming to have the world’s largest collection of brain teasers, Braingle’s free website provides more than 15,000 puzzles, games, and other brain teasers as well as an online community of enthusiasts.

 

3 Simple Habits That Can Protect Your Brain From Cognitive Decline

You might think that the impact of aging on the brain is something you can’t do much about. After all, isn’t it inevitable? To an extent, yes: we can’t rewind the clock and change our levels of higher education or intelligence (both factors that delay the onset of symptoms of aging).

But adopting specific lifestyle behaviors–whether you’re in your thirties or late forties–can have a tangible effect on how well you age. Even in your fifties and beyond, activities like learning a new language or musical instrument, taking part in aerobic exercise, and developing meaningful social relationships can do wonders for your brain. There’s no question that when we compromise on looking after ourselves, our aging minds pick up the tab.

The Aging Process and Cognitive Decline

Over time, there is a build-up of toxins such as tau proteins and beta-amyloid plaques in the brain that correlates with the aging process and its associated cognitive decline. Although this is a natural part of growing older, many factors can exacerbate it. Stress, neurotoxins such as alcohol, and a lack of sleep (in both quality and quantity) can speed up the process.

Neuroplasticity–the function that allows the brain to change and develop in our lifetime–has three mechanisms: synaptic connection, myelination, and neurogenesis. The key to resilient aging is improving neurogenesis, the birth of new neurons. Neurogenesis happens far more in babies and children than in adults.

A 2018 study by researchers at Columbia University shows that in adults, this type of neuroplastic activity occurs in the hippocampus, the part of the brain that lays down memories. This makes sense as we respond to and store new experiences every day, and cement them during sleep. The more we can experience new things, activities, people, places, and emotions, the more likely we are to encourage neurogenesis.

With all this in mind, we can come up with a three-point plan to encourage “resilient aging” by activating neurogenesis in the brain:

1. Get your heart rate up

Aerobic exercise such as running or brisk walking has a potentially massive impact on neurogenesis. A 2016 rat study found that endurance exercise was most effective in increasing neurogenesis. It wins out over HIIT sessions and resistance training, although doing a variety of exercise also has its benefits.

Aim for 150 minutes of aerobic exercise per week, and choose the gym, the park, or a natural landscape over busy roads: aerobic exercise boosts production of brain-derived neurotrophic factor (BDNF), a growth factor that encourages neurogenesis, but exercising in polluted areas decreases that production.

If exercising alone isn’t your thing, consider taking up a team sport, or one with a social element like table tennis. Social interaction can also increase neurogenesis, and in many instances such sports let you practice your hand-eye coordination, which research suggests leads to structural changes in the brain that may relate to a range of cognitive benefits. This combination of coordination and socializing has been shown to increase brain thickness in the parts of the cortex related to social/emotional welfare, which is crucial as we age.

2. Change your eating patterns

Evidence shows that calorie restriction, intermittent fasting, and time-restricted eating encourage neurogenesis in humans. In rodent studies, intermittent fasting has been found to improve cognitive function and brain structure, and reduce symptoms of metabolic disorders such as diabetes.

Reducing refined sugar will help reduce oxidative damage to brain cells, too, and we know that increased oxidative damage has been linked with a higher risk of developing Alzheimer’s disease. Twenty-four-hour water-only fasts have also been shown to increase longevity and encourage neurogenesis.

Try any of the following, after checking with your doctor:

  • A 24-hour water-only fast once a month
  • Reducing your calorie intake by 50%-60% on two non-consecutive days of the week, for two to three months or on an ongoing basis
  • Reducing calories by 20% every day for two weeks, three to four times a year
  • Eating only between 8 a.m. and 8 p.m., or 12 p.m. and 8 p.m., as a general rule

3. Prioritize sleep

Sleep helps promote the brain’s neural “cleaning” glymphatic system, which flushes out the build-up of age-related toxins in the brain (the tau proteins and beta-amyloid plaques mentioned above). When people are sleep-deprived, we see evidence of memory deficits, and research shows that missing a whole night of sleep impacts IQ. Aim for seven to nine hours, and nap if it suits you. Our need for sleep decreases as we age.

Of course, there are individual exceptions, but having consistent sleep times and making sure you’re getting sufficient quality and length of sleep supports brain resilience over time. So how do you know if you’re getting enough? If you naturally wake up at the same time on weekends that you have to during the week, you probably are.

If you need to lie in or take long naps, you’re probably not. Try practicing mindfulness before bed, or yoga nidra, a guided, breath-based meditation that has been shown in studies to improve sleep quality. There are plenty of recordings online if you want to experience it.

Pick any of the above that work for you and build it up until it becomes a habit, then move on to the next one, and so on. You might find that by the end of the year, you feel even healthier, more energized, and more motivated than you do now, even as you turn another year older.

By: Fast Company / Tara Swart

Dr. Tara Swart is a neuroscientist, leadership coach, author, and medical doctor. Follow her on Twitter at @TaraSwart.

Source: Open-Your-Mind-Change


Critics:

Cognitive deficit is an inclusive term to describe any characteristic that acts as a barrier to the cognition process.


Mild cognitive impairment (MCI) is a neurocognitive disorder which involves cognitive impairments beyond those expected based on an individual’s age and education, but which are not significant enough to interfere with instrumental activities of daily living. MCI may occur as a transitional stage between normal aging and dementia, especially Alzheimer’s disease. It includes both memory and non-memory impairments. Mild cognitive impairment has been relisted as mild neurocognitive disorder in DSM-5 and in ICD-11.

The cause of the disorder remains unclear, as do its prevention and treatment. MCI can present with a variety of symptoms, but it is generally divided into two types.

Amnestic MCI (aMCI) is mild cognitive impairment with memory loss as the predominant symptom; aMCI is frequently seen as a prodromal stage of Alzheimer’s disease. Studies suggest that these individuals tend to progress to probable Alzheimer’s disease at a rate of approximately 10% to 15% per year. It is possible that being diagnosed with cognitive decline may serve as an indicator of aMCI.

Nonamnestic MCI (naMCI) is mild cognitive impairment in which impairments in domains other than memory (for example, language, visuospatial, executive) are more prominent. It may be further divided as nonamnestic single- or multiple-domain MCI, and these individuals are believed to be more likely to convert to other dementias (for example, dementia with Lewy bodies).


The Link Between Bioelectricity and Consciousness

Life seems to be tied to bioelectricity at every level. The late electrophysiologist and surgeon Robert Becker spent decades researching the role of the body’s electric fields in development, wound healing, and limb regrowth. His 1985 book, The Body Electric: Electromagnetism and the Foundation of Life, was a fascinating deep dive into how the body is electric through and through—despite our inability to see or sense these fields with our unaided senses. But Becker’s work was far from complete.

One scientist who has taken up Becker’s line of inquiry is Michael Levin. He got hooked on the subject after he read The Body Electric. Levin has been working on “cracking the bioelectric code,” as a 2013 paper of his put it, ever since. “Evolution,” Levin has said, “really did discover how good the biophysics of electricity is for computing and processing information in non-neural tissues,” the many thousands of cell types that make up the body, our word for trillions of cells cooperating. “It’s really hard to define what’s special about neurons,” he told me. “Almost all cells do the things neurons do, just more slowly.”


His team at Tufts University develops new molecular-genetic and conceptual tools to probe large-scale information processing in regeneration, embryo development, and cancer suppression—all mediated to varying degrees by bioelectric fields. This work involves examining, for example, how frogs, which normally don’t regenerate whole limbs (as salamanders do), can regrow limbs, repair their brains and spinal cords, or normalize tumors with the help of “electroceuticals” (a pun on “pharmaceuticals”).

These are therapies that target the bioelectric circuits of tumors instead of, or together with, chemical-based therapies. Bioelectric fields are, in other words, more powerful than we have suspected and perform many surprising roles in the human body and all animal bodies.

Nature seems to have figured out that electric fields, similar to the role they play in human-created machines, can power a wide array of processes essential to life. Perhaps even consciousness itself. A veritable army of neuroscientists and electrophysiologists around the world are developing steadily deeper insights into the degree that electric and magnetic fields—“brainwaves” or “neural oscillations”—seem to reveal key aspects of consciousness.

The prevailing view for some time now has been that the brain’s bioelectric fields, which are electrical and magnetic fields produced at various physical scales, are an interesting side effect—or epiphenomenon—of the brain’s activity, but not necessarily relevant to the functioning of consciousness itself.

A number of thinkers are suggesting now, instead, that these fields may in fact be the main game in town when it comes to explaining consciousness. In a 2013 paper, philosopher Mostyn Jones reviewed various field theories of consciousness, still a minority school of thought in the field but growing.

If that approach is right, it is likely that the body’s bioelectric fields are also, more generally, associated in some manner with some kind of consciousness at various levels. Levin provided some support for this notion when I asked him about the potential for consciousness, in at least some rudimentary form, in the body’s electric fields.

“There are very few fundamental differences between neural networks and other tissues of bioelectrically communicating cells,” he said in an email. “If you think that consciousness in the brain is somehow a consequence of the brain’s electrical activity, then there’s no principled reason to assume that non-neural electric networks won’t underlie some primitive, basal (ancient) form of nonverbal consciousness.”

This way of thinking opens up exciting possibilities. It recognizes that there is perhaps some intelligence (and, to some thinkers, maybe even consciousness) in all of the body’s bioelectric fields, which are efficient sources of information transfer and even a kind of computation. In his work, Levin pieces together how these fields can contain information that guides growth and regeneration.

He sometimes describes these guiding forces as “morphogenetic fields.” It may sound like a mystical notion, but it’s quite physical and real, backed up by hard data. This information, Levin said, can be stored in multicellular electric fields “in a way that is likely very similar to how behavioral memories—of seeing a specific shape for example—are stored in a neuronal network.”

Take the case of a frog. “To become frogs, tadpoles have to rearrange their faces during metamorphosis,” Levin said. “It used to be thought that these movements were hardcoded, but our ‘Picasso’ tadpoles—which have all the organs in the wrong places—showed otherwise.” The apparent know-how that these bioelectric fields demonstrate, in terms of growing normal frogs in very un-normal circumstances, is uncanny. “Amazingly, they still largely became normal frogs!”

How do disarranged cells and organs intuit what to do? Levin and the renowned philosopher and cognitive scientist Daniel Dennett recently tackled this question in a rather provocatively titled article, “Cognition All the Way Down.” Something like thinking, they argue, isn’t just something we do in our heads and that requires brains.

It’s a process that even individual cells, with no brain of any kind, take part in. To the biologists who see this as a cavalier form of anthropomorphization, Levin and Dennett say, “Chill out.” It’s useful to anthropomorphize many different kinds of life, to see in their parts and processes a variety of teleological experience. “Ever since the cybernetics advances of the 1940s and ’50s, engineers have had a robust, practical science of mechanisms with purpose and goal-directedness—without mysticism,” they write. “We suggest that biologists catch up.”

With respect to purposes and teleology (goal-directed behavior), they make their key point clear: “We think that this commendable scientific caution has gone too far, putting biologists into a straitjacket.”

A promising route for better understanding may be found, they write, in “thinking of parts of organisms as agents, detecting opportunities and trying to accomplish missions.” This is “risky, but the payoff in insight can be large.” For Levin, at least, bioelectric fields are key mechanisms for this kind of collective decision-making. These fields connect cells and tissues together, allowing, along with synaptic connections, for rapid information exchange, not only with immediate neighbors but distant ones as well.

These communication channels are involved in the emergence of cancer, which means that, according to Levin, they can potentially be useful in curing some forms of cancer. “You can [use bioelectric fields to] induce full-on metastatic melanoma—a kind of skin cancer—in perfectly normal animals with no carcinogens or nasty chemicals that break DNA,” he said. You can also use these same fields “to normalize existing tumors or prevent them from forming.” He’s currently moving this work to human clinical models.

The importance of bioelectric fields is all about connection, information, and computation. These ingredients equal cognition for Levin and Dennett, which is, for them, a continuum of complexity that has developed over a billion years of biological evolution. It’s not an all or nothing kind of thing but a spectrum—one that plays a role in development, evolution, cancer, and in the workings of consciousness itself.

By: Tam Hunt

Tam Hunt is a philosopher, a practicing lawyer, and writer. He is the author of two books on the philosophy of consciousness: Eco, Ego, Eros: Essays in Philosophy, Spirituality, and Science and Mind, World, God: Science and Spirit in the 21st Century.

Source: The Link Between Bioelectricity and Consciousness


Our bodies rely on an ultrafast nervous system to send impulses very quickly, and it all starts with a special cell called the neuron. In this episode of Seeker’s Human series, Patrick explains how these cells tell your body what to do.

Big Ethical Questions about the Future of AI

Artificial intelligence is already changing the way we live our daily lives and interact with machines. From optimizing supply chains to chatting with Amazon Alexa, artificial intelligence already has a profound impact on our society and economy. Over the coming years, that impact will only grow as the capabilities and applications of AI continue to expand.

AI promises to make our lives easier and more connected than ever. However, there are serious ethical considerations to any technology that affects society so profoundly. This is especially true in the case of designing and creating intelligence that humans will interact with and trust. Experts have warned about the serious ethical dangers involved in developing AI too quickly or without proper forethought. These are the top issues keeping AI researchers up at night.

Bias: Is AI fair?

Bias is a well-established facet of AI (or of human intelligence, for that matter). AI takes on the biases of the dataset it learns from. This means that if researchers train an AI on data that are skewed for race, gender, education, wealth, or any other point of bias, the AI will learn that bias. For instance, an artificial intelligence application used to predict future criminals in the United States showed higher risk scores and recommended harsher actions for black people than for white people, based on the racial bias in America’s criminal incarceration data.
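The mechanism is easy to demonstrate. In this hypothetical Python sketch (the groups and counts are invented for illustration, not drawn from any real criminal-justice dataset), a model that simply learns label frequencies from a skewed history reproduces that skew in its predictions:

```python
from collections import Counter

# Hypothetical, deliberately skewed training history: group A was
# labeled "high risk" far more often than group B in the past.
history = ([("A", "high")] * 70 + [("A", "low")] * 30 +
           [("B", "high")] * 30 + [("B", "low")] * 70)

def learned_risk_rate(group: str) -> float:
    """Fraction of 'high risk' labels the model learns for a group."""
    counts = Counter(label for g, label in history if g == group)
    return counts["high"] / (counts["high"] + counts["low"])

print(learned_risk_rate("A"))  # the historical skew survives training
print(learned_risk_rate("B"))
```

No matter how sophisticated the model on top, if the labels themselves encode a bias, the predictions will too.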

Of course, the challenge with AI training is there’s no such thing as a perfect dataset. There will always be under- and overrepresentation in any sample. These are not problems that can be addressed quickly. Mitigating bias in training data and providing equal treatment from AI is a major key to developing ethical artificial intelligence.

Liability: Who is responsible for AI?

Last month when an Uber autonomous vehicle killed a pedestrian, it raised many ethical questions. Chief among them is “Who is responsible, and who’s to blame when something goes wrong?” One could blame the developer who wrote the code, the sensor hardware manufacturer, Uber itself, the Uber supervisor sitting in the car, or the pedestrian for crossing outside a crosswalk.

Developing AI will involve errors, long-term changes, and unforeseen consequences. Since AI is so complex, determining liability isn’t trivial. This is especially true when AI has serious implications for human lives, like piloting vehicles, determining prison sentences, or automating university admissions. These decisions will affect real people for the rest of their lives. On one hand, AI may be able to handle these situations more safely and efficiently than humans. On the other hand, it’s unrealistic to expect AI will never make a mistake. Should we write that off as the cost of switching to AI systems, or should we prosecute AI developers when their models inevitably make mistakes?

Security: How do we protect access to AI from bad actors?

As AI becomes more powerful across our society, it will also become more dangerous as a weapon. It’s possible to imagine a scary scenario in which a bad actor takes over the AI model that controls a city’s water supply, power grid, or traffic signals. Scarier still is the militarization of AI, where robots learn to fight and drones can fly themselves into combat.

Cybersecurity will become more important than ever. Controlling access to the power of AI is a huge challenge and a difficult tightrope to walk. We shouldn’t centralize the benefits of AI, but we also don’t want the dangers of AI to spread. This becomes especially challenging in the coming years as AI becomes more intelligent and faster than our brains by an order of magnitude.

Human Interaction: Will we stop talking to one another?

An interesting ethical dilemma of AI is the decline in human interaction. Now more than any time in history it’s possible to entertain yourself at home, alone. Online shopping means you don’t ever have to go out if you don’t want to.

While most of us still have a social life, the amount of in-person interactions we have has diminished. Now, we’re content to maintain relationships via text messages and Facebook posts. In the future, AI could be a better friend to you than your closest friends. It could learn what you like and tell you what you want to hear. Many have worried that this digitization (and perhaps eventual replacement) of human relationships is sacrificing an essential, social part of our humanity.

Employment: Is AI getting rid of jobs?

This is a concern that repeatedly appears in the press. It’s true that AI will be able to do some of today’s jobs better than humans. Inevitably, some people will lose their jobs, and it will take a major societal initiative to retrain those employees for new work. However, it’s likely that AI will replace jobs that are boring, menial, or unfulfilling. Individuals will be able to spend their time on more creative pursuits and higher-level tasks. While jobs will go away, AI will also create new markets, industries, and jobs for future generations.

Wealth Inequality: Who benefits from AI?

The companies spending the most on AI development today are those that already have a lot of money to spend. A major ethical concern is that AI will only serve to centralize wealth further. If an employer can lay off workers and replace them with AI, it can generate the same profit without paying employees.

Machines will create more wealth than ever in the economy of the future. Governments and corporations should start thinking now about how to redistribute that wealth so that everyone can participate in the AI-powered economy.

Power & Control: Who decides how to deploy AI?

Along with the centralization of wealth comes the centralization of power and control. The companies that control AI will have tremendous influence over how our society thinks and acts each day. Regulating the development and operation of AI applications will be critical for governments and consumers. Just as we’ve recently seen Facebook get in trouble for the influence its technology and advertising has had on society, we might also see AI regulations that codify equal opportunity for everyone and consumer data privacy.

Robot Rights: Can AI suffer?

A more conceptual ethical concern is whether AI can or should have rights. As a piece of computer code, it’s tempting to think that artificially intelligent systems can’t have feelings. You can get angry with Siri or Alexa without hurting their feelings. However, it’s clear that consciousness and intelligence operate on a system of reward and aversion. As artificially intelligent machines become smarter than us, we’ll want them to be our partners, not our enemies. Codifying humane treatment of machines could play a big role in that.

Ethics in AI in the coming years

Artificial intelligence is one of the most promising technological innovations in human history. It could help us solve a myriad of technical, economic, and societal problems. However, it will also come with serious drawbacks and ethical challenges. It’s important that experts and consumers alike be mindful of these questions, as they’ll determine the success and fairness of AI over the coming years.

By: Steve Kilpatrick
Co-Founder & Director
Artificial Intelligence & Machine Learning


Improve Your Cognitive Health with This Brain-Training App

Your brain can be a great indicator of your overall health. These days, with so many of us confined to self-isolation and offices going fully remote, there’s no shame in feeling a little brain drain. But don’t let the doldrums get you down and harm your health.

CogniFit Premium Brain Training is designed to detect risk factors for alterations in cognitive functioning using neuropsychological assessments. Whether you’ve been feeling a bit slower lately or you just aren’t as motivated as you used to be, CogniFit can help you identify why. Millions of users already use CogniFit to identify possible cognitive alterations and deficiencies so they can create a personalized brain-training regimen for their needs. Through validated tasks that evaluate 23 cognitive skills, CogniFit helps measure, train, and properly monitor mental fitness and its relation to neurological pathologies.

The intuitive app lets you personalize your training by choosing your preferred programs and your age group, so it can provide better insights. CogniFit’s exercises and brain games help stimulate cognitive functions and improve brain plasticity while providing real-time monitoring of how your skills evolve and comparing the results to age-group norms.


By: Entrepreneur Store Entrepreneur Leadership Network VIP


Apparently, alcohol, insufficient sleep, physical inactivity, smoking, high-fat or high-sodium foods, and loneliness can all contribute to early cognitive decline. So if you want to be a member of the brainiacs’ club in your 70s, or just remember your grandson’s birthday, you’d better start exercising in your 30s and put down those cheeseballs rolled in bacon (at least every now and then!).

In any event, there are methods to actively strengthen your cognitive abilities. Lifelong learning, reading books (The Medical Futurist believes science fiction in particular stimulates the brain), and playing mind games all help. According to a study at the Fisher Center for Alzheimer’s Research Foundation, mental stimulation like reading can help protect memory and thinking skills, especially with age. The study’s authors suggest that reading every day can slow late-life cognitive decline, keeping the brain healthier and higher-functioning for longer.

As part of our From Chance To Choice campaign, whose HOW TO series aims to show methods and tools for taking more long-term control over our own health, we would like to suggest some technological solutions, apps and games, to help keep your brain as fit as possible from early on.

Gameplay focuses and controls our attention, taps into our innate strengths, thrills us utterly and compels us to greater resilience in the attainment of more powerful and useful skills. That’s why gamified apps are perfect for improving and maintaining cognitive abilities.

Recently, several start-ups have started to experiment with bringing challenging offline games to digital brain-training apps. These usually stick to the same format: collections of mini-games you can play on any device, with the purpose of improving comprehension, focus, and self-confidence; nicely drawn graphs to show how you’re developing over time; and optional subscriptions for extra games and features. So, here are our four favorites!

Source: https://medicalfuturist.com


7 Riddles That Will Test Your Brain Power – Bright Side

These 7 puzzles will trick your brain. Take this fun test to check the sharpness and productivity of your brain. Try to answer these questions as quickly as possible and see the results! Our brain is a mysterious thing. We know more about stars than about the things inside our heads! But what we do know about the brain is that it gets less sharp and productive with age.

You have a maximum of 20 seconds for each task, but try to answer the questions as fast as possible.

TIMESTAMPS
What is the mistake two photos have in common? 0:45
How many holes does the T-shirt have? 1:53
How would you name this tree? 2:40
Can you solve this riddle in 5 seconds? 3:21
Do you see a hidden baby? 4:26
Which line is longer? 5:12
Can you spot Mike Wazowski? 6:30

SUMMARY
If it took you more than 20 seconds to answer each question, or you didn’t manage all the tasks, it means that you have the brain of a mature person.

It’s hard for you to make your mind see beyond the obvious, and you can’t handle change easily. If it took you less than 20 seconds, your brain is quite young, and you can approach tasks from different angles. If you answered each question correctly in less than 5 seconds, your brain is very young and flexible! You can notice the tiniest details right away and adapt to new situations easily! What is your result? Tell us in the comment section below!


Your kind donations would help us fulfill our future research and endeavors – thank you

New Font Sans Forgetica Designed to Boost Your Memory – Jackson Ryan


Researchers at the Royal Melbourne Institute of Technology (RMIT) in Australia have developed an entirely new font designed “using the principles of cognitive psychology” to help you better remember your study notes. The font is a sans serif typeface with two unusual features: it slants slightly left, a rarely used design principle in typography, and it’s full of holes. Those holes have a purpose, though. They make Sans Forgetica harder to read, tricking your brain into using “deeper cognitive processing” and promoting better memory retention…

Read more: https://www.cnet.com/news/new-font-sans-forgetica-is-designed-to-boost-your-memory/

 

 

 


Why Mathematicians Can’t Find the Hay in a Haystack – Vladyslav Danilin


The first time I heard a mathematician use the phrase, I was sure he’d misspoken. We were on the phone, talking about the search for shapes with certain properties, and he said, “It’s like looking for hay in a haystack.” “Don’t you mean a needle?” I almost interjected. Then he said it again. In mathematics, it turns out, conventional modes of thought sometimes get turned on their head. The mathematician I was speaking with, Dave Jensen of the University of Kentucky, really did mean “hay in a haystack.” By it, he was expressing a strange fact about mathematical research: Sometimes the most common things are the hardest to find…

Read more: https://www.quantamagazine.org/why-mathematicians-cant-find-the-hay-in-a-haystack-20180917/

 

 


 

 

The World’s Most Valuable Resource Is No Longer Oil, But Data – The Economist


A new commodity spawns a lucrative, fast-growing industry, prompting antitrust regulators to step in to restrain those who control its flow. A century ago, the resource in question was oil. Now similar concerns are being raised by the giants that deal in data, the oil of the digital era.

These titans—Alphabet (Google’s parent company), Amazon, Apple, Facebook and Microsoft—look unstoppable. They are the five most valuable listed firms in the world. Their profits are surging: they collectively racked up over $25bn in net profit in the first quarter of 2017. Amazon captures half of all dollars spent online in America. Google and Facebook accounted for almost all the revenue growth in digital advertising in America last year.

Such dominance has prompted calls for the tech giants to be broken up, as Standard Oil was in the early 20th century. This newspaper has argued against such drastic action in the past. Size alone is not a crime. The giants’ success has benefited consumers. Few want to live without Google’s search engine, Amazon’s one-day delivery or Facebook’s newsfeed.

Nor do these firms raise the alarm when standard antitrust tests are applied. Far from gouging consumers, many of their services are free (users pay, in effect, by handing over yet more data). Take account of offline rivals, and their market shares look less worrying. And the emergence of upstarts like Snapchat suggests that new entrants can still make waves.

But there is cause for concern. Internet companies’ control of data gives them enormous power. Old ways of thinking about competition, devised in the era of oil, look outdated in what has come to be called the “data economy” (see Briefing). A new approach is needed.

Quantity has a quality all its own

What has changed? Smartphones and the internet have made data abundant, ubiquitous and far more valuable. Whether you are going for a run, watching TV or even just sitting in traffic, virtually every activity creates a digital trace—more raw material for the data distilleries. As devices from watches to cars connect to the internet, the volume is increasing: some estimate that a self-driving car will generate 100 gigabytes per second. Meanwhile, artificial-intelligence (AI) techniques such as machine learning extract more value from data. Algorithms can predict when a customer is ready to buy, a jet engine needs servicing or a person is at risk of a disease. Industrial giants such as GE and Siemens now sell themselves as data firms.

This abundance of data changes the nature of competition. Technology giants have always benefited from network effects: the more users Facebook signs up, the more attractive signing up becomes for others. With data there are extra network effects. By collecting more data, a firm has more scope to improve its products, which attracts more users, generating even more data, and so on.

The more data Tesla gathers from its self-driving cars, the better it can make them at driving themselves—part of the reason the firm, which sold only 25,000 cars in the first quarter, is now worth more than GM, which sold 2.3m. Vast pools of data can thus act as protective moats.

Access to data also protects companies from rivals in another way. The case for being sanguine about competition in the tech industry rests on the potential for incumbents to be blindsided by a startup in a garage or an unexpected technological shift. But both are less likely in the data age. The giants’ surveillance systems span the entire economy: Google can see what people search for, Facebook what they share, Amazon what they buy. They own app stores and operating systems, and rent out computing power to startups. They have a “God’s eye view” of activities in their own markets and beyond. They can see when a new product or service gains traction, allowing them to copy it or simply buy the upstart before it becomes too great a threat.

Many think Facebook’s $22bn purchase in 2014 of WhatsApp, a messaging app with fewer than 60 employees, falls into this category of “shoot-out acquisitions” that eliminate potential rivals. By providing barriers to entry and early-warning systems, data can stifle competition.

Who ya gonna call, trustbusters?

The nature of data makes the antitrust remedies of the past less useful. Breaking up a firm like Google into five Googlets would not stop network effects from reasserting themselves: in time, one of them would become dominant again. A radical rethink is required—and as the outlines of a new approach start to become apparent, two ideas stand out.

The first is that antitrust authorities need to move from the industrial era into the 21st century. When considering a merger, for example, they have traditionally used size to determine when to intervene. They now need to take into account the extent of firms’ data assets when assessing the impact of deals.

The purchase price could also be a signal that an incumbent is buying a nascent threat. On these measures, Facebook’s willingness to pay so much for WhatsApp, which had no revenue to speak of, would have raised red flags. Trustbusters must also become more data-savvy in their analysis of market dynamics, for example by using simulations to hunt for algorithms colluding over prices or to determine how best to promote competition.

The second principle is to loosen the grip that providers of online services have over data and give more control to those who supply them. More transparency would help: companies could be forced to reveal to consumers what information they hold and how much money they make from it.

Governments could encourage the emergence of new services by opening up more of their own data vaults or managing crucial parts of the data economy as public infrastructure, as India does with its digital-identity system, Aadhaar. They could also mandate the sharing of certain kinds of data, with users’ consent—an approach Europe is taking in financial services by requiring banks to make customers’ data accessible to third parties.

Rebooting antitrust for the information age will not be easy. It will entail new risks: more data sharing, for instance, could threaten privacy. But if governments don’t want a data economy dominated by a few giants, they will need to act soon.


7 Personal Growth Questions Every Teacher Must Ask Themselves – Lee Watanabe-Crockett


Every teacher knows that consistently asking personal growth questions is part of the game in education. These questions come in all shapes and sizes and are meant to challenge educators to meet and exceed professional goals. Teachers devote themselves to this for their own good, for their colleagues, and most of all for their learners. You have enough to do already, so why make PD complicated?

Personal development goes hand in hand with professional development. It enhances it by ensuring we look deep within ourselves to discover the true motivations for why we do what we do, and what’s most important to us as teachers. Ultimately, these realizations drive us to excel for the benefit of our learners, and for the future of education.

By no means are we suggesting that the 7 personal growth questions below are the be-all and end-all of what you can reflect on during your journey. What they will do is provide you with a baseline for developing your craft in your own way.

7 Personal Growth Questions for All Teachers

These personal growth questions are ones that are simple enough to ask yourself every day, while also complex enough to ponder deeply and critically whenever you have time. And no matter how busy you are, there is always time.

1. What is most important to me as a teacher?

This is the key to determining your professional development direction right here. What matters to you most about being a teacher? What kind of teacher do you want to be, and why? What are the biggest reasons you have for your choice?

Don’t fall into the trap of making this one about policy and educational doctrine. This is an introspective and emotional inquiry—perhaps even spiritual for many of you. Consider it carefully and, above all else, listen to your heart.

2. What takes me out of my comfort zone?

Progress happens in the face of overcoming challenges. But how do we constructively challenge ourselves if we can’t step away from feeling safe in our vocations? Do something that you’ve never done before—in your practice, in a relationship with a colleague, or what have you.

Think “what if …” and then act on it. If it makes you uncomfortable to consider or even scares you a little, you might be on to something.

3. How can I make sure I am learning every day?

Modeling lifelong learning is something every teacher must do for their learners. It comes through curiosity and a willingness to explore the unknown. Our learners benefit from our passion as educators when we display the same love for learning we want them to have when they leave us. How can you best do this every day?

4. What is the most amazing thing about me and how can I use it in my teaching?

Stop being modest—you’re awesome and you know it. So it’s time to let your learners know it too. Think about what you can do that no one else can. Recall a time when someone pointed out something remarkable about you that you’ve always taken for granted. “Wow, you really know how to _______.”

Are you good with humour? Are you highly creative with design and visuals? Are you able to use wisdom and compassion to turn any negative experience into a positive one? Are you an entertaining storyteller? What’s your special talent? And for crying out loud, why aren’t you making it part of your teaching?

5. What is the most important thing my learners need from me?

There is a simple and highly effective way to figure this one out: ask them. It also happens to be the only way. You don’t have to be afraid of the answers you get, either, especially when you come from a place of heartfelt concern for your kids. So ask them what they need; they’ll surprise and delight you, and they might even make you cry. Isn’t meaningful connection amazing?

6. How can I connect and communicate better with parents and colleagues?

Nothing changes you like perspective. As teachers, whether young or experienced, we often get things wrong. As parents, we also get things wrong. These moments present prime opportunities for teachers and parents to support each other and consistently bridge the communication gap.

In the end, nothing beats how parents and teachers can unite to solve problems and tackle issues together. The same is true for teachers who come together in the same way. What are the most proactive ways you can improve rapport with parents and colleagues to sustain a culture of support?

7. What am I going to start doing today to become a better teacher than I was yesterday?

You’ll find there is never a bad time to ponder this question. This doesn’t mean you’re not a fantastic teacher already; quite the opposite, in fact. It’s the idea that you are constantly looking for ways to improve that makes you as incredible as you are. Everyone who is part of your life benefits from this.

Ask it as a personal reflection at the end of your day. Ask it at the beginning of your morning as a meditation. Ask it as you write in your daily journal. Ask it multiple times a day, even. Just make sure you ask it.
