How Will Pumped Hydro Energy Storage Power Our Future?

Pumped storage hydropower has proven to be an ideal solution to the growing list of challenges faced by grid operators.

As the transition to a clean energy future rapidly unfolds, this flexible technology will become even more important for a reliable, affordable and low carbon grid, write IHA analysts Nicholas Troja and Samuel Law.

“Anything that can go wrong will go wrong”. That old adage, Murphy’s law, must seem appropriate for many power grid operators in 2020.

This year has tested the safe running and reliability of grids around the world like few others. Often termed ‘the biggest machine ever built,’ a power system involves the coordination of complex and instantaneous interactions, and managing one is a formidable task at the best of times.

With the impacts of the Covid-19 pandemic on top of extreme weather events, greater penetrations of variable renewables and increasingly aged thermal assets, the task has only become more demanding in many markets.

These challenges have brought into sharp focus the growing need for energy storage, such as that offered by pumped storage hydropower.

Recent events highlight the need for pumped storage

Covid-19 continues to have an extraordinary impact on electricity markets. During the height of worldwide lockdowns, with large sections of the economy shut down or greatly impaired, electricity demand declined by up to 30 per cent in some countries across Europe and in India.

As Fatih Birol, Executive Director of the International Energy Agency (IEA) stated, the demand drop “fast forwarded some power systems 10 years into the future” regarding integrating higher percentages of variable renewable energy (VRE) which receive priority dispatch to the grid. Managing periods of such low demand can create “significant operational risks” for grid operators. In some markets, this has led to curtailing, or shutting down, wind and solar facilities to stabilise the grid.

During such periods, pumped storage hydropower, with its ability to both store and generate large quantities of energy over long periods, was the first port of call for those grid operators lucky enough to have such stations on hand. Britain’s four pumped storage stations were hailed by the Financial Times newspaper as the “first line of defence in the battle to keep Britain’s lights on”. By pumping water back up to their upper reservoirs, these stations can increase system demand, making pumped storage a more cost-effective way of managing the grid than paying operators to curtail variable supply.

In August, the U.S. state of California experienced rolling blackouts for the first time since 2001 due to a combination of record heatwaves driving up demand, faltering gas-fired stations and a lack of dispatchable generation. As Stephen Berberich, President of the California Independent System Operator (CAISO) said, “we thought there would be adequate power to supply the demand…we were wrong” and the costs to the Californian economy will be significant.

These managed blackouts provide yet another wake-up call for policymakers on the need to appropriately plan for a zero-emissions future. With limited balancing resources such as pumped storage, California’s grid did not have the flexibility to shift sufficient generating capacity to the evenings when the sun had set yet the demand remained high.

Given California’s aim of reaching 100 per cent clean electricity by 2045, mainly from wind and solar power, which currently account for 20 per cent of generation, significant investment in flexible, low carbon balancing resources will be required.

In response, California is betting big on batteries for short-duration storage, from sub-second response up to four hours, to manage intraday variations in net load. However, with those high levels of VRE on the grid, long-duration storage, which can discharge for 10 hours or more at rated power, will be needed to accommodate the seasonal patterns of VREs. It will do so by shifting generation over days, weeks and months of supply and demand imbalance. This is a story that rings true for many countries across the world with ambitious climate targets.

Achieving California’s clean energy target is made even harder by the government’s decision to classify conventional hydropower stations greater than 30 MW as a non-renewable resource under its Renewables Portfolio Standard. This arbitrary classification is at odds with international consensus and penalises the state’s oldest source of affordable, flexible and low-carbon electricity.

Figure 1: Illustration of a closed-loop (off-river) pumped storage station and how it can be used to support VRE.

Capabilities of pumped storage

With a total installed capacity of nearly 160 GW, pumped storage currently accounts for over 94 per cent of both storage capacity and stored energy in grid scale applications globally. This has earned pumped storage its name as the world’s “water battery”. It is a mature and reliable technology capable of storing energy over daily, weekly, monthly or even seasonal cycles, depending on project scale and configuration.

Pumped storage operates by storing electricity in the form of gravitational potential energy through pumping water from a lower to an upper reservoir (see figure 1). The result of this simple solution is a very high round-trip efficiency of 80 per cent, which compares favourably to other storage technologies.
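The storage capacity described above follows directly from the physics of lifting water: gravitational potential energy (density × gravity × head × volume), discounted by round-trip efficiency. A minimal sketch in Python; the reservoir volume and head below are illustrative values, not figures for any real station:

```python
# Back-of-the-envelope energy stored by a pumped storage reservoir.
# Illustrative numbers only; real stations vary widely.

RHO = 1000.0      # density of water, kg/m^3
G = 9.81          # gravitational acceleration, m/s^2

def stored_energy_mwh(volume_m3: float, head_m: float, round_trip_eff: float = 0.80) -> float:
    """Recoverable electrical energy (MWh) from pumping `volume_m3` of water
    through a head of `head_m`, at the given round-trip efficiency."""
    joules = RHO * G * head_m * volume_m3 * round_trip_eff
    return joules / 3.6e9  # 1 MWh = 3.6e9 J

# e.g. a hypothetical 10 million m^3 upper reservoir with a 500 m head:
energy = stored_energy_mwh(10e6, 500)
print(f"{energy:,.0f} MWh")  # ~10,900 MWh
```

Even at these modest illustrative dimensions, a single reservoir cycle holds thousands of megawatt-hours, which is why pumped storage dominates grid-scale stored energy.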

Pumped storage tends to have high energy-to-power ratios and is well suited to provide long discharge durations at very low energy storage costs. Across different timescales, pumped storage can serve multiple functions (see figure 2). For example, at shorter discharge durations, it is suitable for ancillary services such as frequency balancing and back-up reserve.

With four to eight hours of discharge, it can provide daily shifting for day-night energy arbitrage. For longer durations over 10 hours, it can accommodate multi-day supply profile changes, reduce energy curtailment, replace peak generation capacity and provide transmission benefits.
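The energy-to-power ratio mentioned above is simply energy storage capacity divided by rated power, which gives the discharge duration at full output. A quick sketch, using hypothetical stations to illustrate the roles described in the last two paragraphs:

```python
# Discharge duration at rated power = energy capacity / rated power.
def discharge_hours(energy_mwh: float, power_mw: float) -> float:
    return energy_mwh / power_mw

# Hypothetical 1,000 MW stations (figures are illustrative, not real plants):
stations = [
    ("ancillary services / back-up reserve", 2_000, 1_000),   # 2 h
    ("daily shifting / arbitrage",           8_000, 1_000),   # 8 h
    ("multi-day shifting",                  30_000, 1_000),   # 30 h
]
for role, energy_mwh, power_mw in stations:
    print(f"{role}: {discharge_hours(energy_mwh, power_mw):.0f} h at rated power")
```

The same rated power can thus serve very different system roles depending on how much energy sits behind it, which is what Figure 2 visualises across real stations.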

Figure 2: The plot above visualises (logarithmic scale used) the estimated discharge durations relative to installed capacity and energy storage capacity for some 250 pumped storage stations currently in operation, based on information from IHA’s Pumped Storage Tracking Tool. The vast majority of pumped storage stations have a discharge duration longer than 6 hours, and some are capable of seasonal storage.

The majority of today’s pumped storage stations were built some forty years ago, yet they are still providing vital services to power systems today. With occasional refurbishment, these long-term assets can last for many decades to come.

Although pumped storage is a mature technology, the resurgence of interest in it has brought forth numerous new R&D initiatives. One prominent example is the European Commission’s four-year XFLEX HYDRO project, which aims to develop new technological solutions to enhance hydropower’s flexibility. The latest innovations, such as variable speed turbines and smart digital operating systems, will be tested on a range of pumped storage demonstration sites.

While often thought of as geographically constrained, recent studies have identified vast technical potential for pumped storage development worldwide. Research by the Australian National University highlighted over 600,000 potential sites for low-impact off-river pumped storage development, including locations in California. There is also growing interest in retrofitting pumped storage at disused mines, underground caverns, non-powered dams and reservoir hydropower stations.                              

Seeking a path toward a clean, affordable and secure transition

California is a pioneer in the energy transition, though many opponents of wind and solar have unfortunately used the blackouts as an example of why their rapid roll-out threatens a secure, reliable grid. As noted earlier, the blackouts were not due to too much VRE capacity being on the grid, but to a lack of integrated planning to support an evolving electricity mix with sufficient dispatchable generation and storage.

The IEA recently described dispatchable pumped storage, along with conventional hydropower, as the often overlooked workhorse of flexibility. However, its development, like that of many energy storage technologies, is currently being hampered by the lack of appropriate regulatory frameworks and market signals to reward its contribution to the grid. Outside China, year-on-year installed capacity growth has been anaemic at just 1.5 per cent since 2014 (see figure 3).

Figure 3: Global pumped storage installed capacity by region. Note that 2019 recorded the lowest growth in pumped storage capacity for over a decade, with only 304 MW added. Source: IHA’s database.
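To see what 1.5 per cent year-on-year growth means in practice, a compound growth sketch is enough. The starting capacity below is an assumed round number for illustration, not the actual ex-China figure:

```python
# Compound annual growth: capacity after n years at a fixed growth rate.
def project_capacity(start_gw: float, annual_growth: float, years: int) -> float:
    return start_gw * (1 + annual_growth) ** years

# Illustrative only: an assumed 100 GW base growing at 1.5%/yr for a decade
print(f"{project_capacity(100, 0.015, 10):.1f} GW")  # ~116.1 GW
```

A decade of such growth adds barely a sixth to the installed base, which underlines why the article calls the trend anaemic relative to the flexibility needs of high-VRE grids.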

Given the technology’s long lead times, investment decisions are needed urgently to ensure that pumped storage, in conjunction with other low-carbon flexibility options, is available to grid operators without needing to rely on carbon-intensive gas-fired generation as a backup. This is especially important as VRE penetration reaches increasingly high levels not yet experienced on a regular basis.

IHA is continuing to work across the hydropower sector and is seeking to learn lessons from other sectors to support the development and deployment of pumped storage. Together with national authorities and multilateral development banks, we are developing a new global initiative to shape and enhance the role of the technology in future power systems.

Further information

Join our Hydropower Pro online community or sign-up to our email newsletter via our website homepage for latest developments.

To learn more about IHA and our work on pumped storage, please visit: www.hydropower.org/pumped-storage

To contact the authors please email nicholas.troja@hydropower.org and samuel.law@hydropower.org

Nick Troja is a Senior Hydropower Sector Analyst. His work focuses on building and sharing knowledge on global hydropower, including identifying trends in project financing, policies and market dynamics.

Before joining IHA, Nick worked for the UK’s steel industry focusing on the EU Emissions Trading System and the impact of other EU level climate change and energy policies on the sector. Prior to this he worked for the UK’s department of energy and climate change, covering a wide range of policy areas and as an adviser to the shadow minister for emissions trading and climate change in Canberra. He holds a bachelor’s degree in international business and master’s degree in public policy.  

Samuel Law is Hydropower Sector Analyst. His work focuses on building and sharing knowledge on sustainable hydropower development, working on topics such as clean energy systems, green financing mechanisms and regional hydropower development.

Samuel holds a master’s degree in environmental technology from Imperial College London and has a technical background in environmental engineering. Prior to joining IHA, he completed an internship with the United Nations in Bangkok. At the UN, he conducted research on Sustainable Development Goals, integrated resource management and collaborative governance, as well as supported project implementation and organised international conferences. He also has experience as a business intelligence analyst in London, where he conducted research on market dynamics and investment trends across industries.


Australian Renewable Energy Agency

Like the hydroelectric power stations that have powered Tasmania for a century, a new generation of pumped hydro plants will play an important role in Australia’s future energy mix. With the Australian Energy Market Operator forecasting that 15 GW of large-scale storage will be needed by the early 2040s, pumped hydro is expected to operate alongside large-scale batteries and other energy storage technologies. Learn more about pumped hydro here – https://arena.gov.au/blog/how-could-p



Advances in Solar Power | Exploration into Technology

Solar power is in a constant state of innovation in 2019, with new advances in solar panel technology announced constantly. In the past year alone, there have been milestones in solar efficiency, solar energy storage, wearable solar tech, and solar design tech. Read on to get the complete update on all the breakthroughs you should know about in the world of new solar panel technology. The cost of solar is dropping across the nation. See prices in your area and get free solar quotes on the EnergySage Marketplace.

Solar technology: what’s new in 2019?

There are two main types of solar technology: photovoltaics (PV) and concentrated solar power (CSP). Solar PV technology captures sunlight to generate electric power, and CSP harnesses the sun’s heat and uses it to generate thermal energy that powers heaters or turbines. With these two forms of solar energy comes a wide range of opportunities for technical innovation. Here are some of the latest emerging and developing solar panel technologies for 2019:

Solar skin design

One major barrier for the solar industry is the fact that a high percentage of homeowners consider solar panels to be an unsightly home addition. Luckily, one new venture has a solution. Sistine Solar, a Boston-based design firm, is making major strides with aesthetic enhancements that allow solar panels to have a customized look. The MIT startup has created a “solar skin” product that makes it possible for solar panels to match the appearance of a roof without interfering with panel efficiency or production.

Solar powered roads

Last summer paved the way for tests of an exciting new PV technology – solar powered roads. The sidewalks along Route 66, America’s historic highway, were chosen as the testing location for solar-powered pavement tech. These roadways are heralded for their ability to generate clean energy, but they also include LED bulbs that can light roads at night and have the thermal heating capacity to melt snow during winter weather. The next stop following sidewalk tests is to install these roadways on designated segments of Route 66.

Wearable solar

Though wearable solar devices are nothing new (solar-powered watches and other gadgets have been on the market for several years), the past few years saw an innovation in solar textiles: tiny solar panels can now be stitched into the fabric of clothing. The wearable solar products of the past, like solar-powered watches, have typically been made with hard plastic material. This new textile concept makes it possible for solar to expand into home products like window curtains and dynamic consumer clean tech like heated car seats. This emerging solar technology is credited to textile designer Marianne Fairbanks and chemist Trisha Andrew.

Solar batteries: innovation in solar storage

The concepts of off-grid solar and solar plus storage have gained popularity in U.S. markets, and solar manufacturers have taken notice. The industry-famous Tesla Powerwall, a rechargeable lithium-ion battery product launched in 2015, continues to lead the pack with regard to market share and brand recognition for solar batteries. Tesla offers two storage products, the Powerwall 2.0 for residential use and the Powerpack for commercial use. Solar storage is still a fairly expensive product in 2019, but a surge in demand from solar shoppers is expected to bring significantly more efficient and affordable batteries to market in 2019.

Solar tracking mounts

As solar starts to reach mainstream status, more and more homeowners are considering solar – even those who have roofs that are less than ideal for panels. Because of this expansion, ground mounted solar is becoming a viable clean energy option, thanks in part to tracking mount technology. Trackers allow solar panels to maximize electricity production by following the sun as it moves across the sky. PV tracking systems tilt and shift the angle of a solar array as the day goes by to best match the location of the sun.

Though this panel add-on has been available for some time, solar manufacturers are truly embracing the technology. GTM Research recently released a report showing a major upward trend in the popularity of tracking systems. GTM projects a 254 percent year-over-year increase for the PV tracking market this year. The report stated that by 2021, almost half of all ground mount arrays will include solar tracking capability.

Advances in solar panel efficiency

The past few years in the solar industry have been a race to the top in terms of solar cell efficiency, and recent times have been no different. A number of achievements by various panel manufacturers have brought us to higher and higher maximum efficiencies each year. The solar cell types used in mainstream markets could also see major improvements in cost per watt – a metric that compares the relative affordability of solar panels. Thanks to Swiss and American researchers, perovskite solar cells (as compared to the silicon cells that are used predominantly today) have seen some major breakthroughs in the past two years.

The result will be a solar panel that can generate 20+ percent efficiency while still being one of the lowest cost options on the market. Of course, the work doesn’t stop there, as MIT researchers reminded us in May when they announced new technology that could double the efficiency of solar cells overall. The MIT lab team revealed a new tech concept that captures and utilizes the waste heat that is usually emitted by solar panels. This typically wasted thermal energy has long been a drawback for solar technology and an opportunity for improvement, which means this innovation could help the cost of solar to plummet even further.

Solar thermal fuel (STF)

There is little debate when it comes to solar power’s ultimate drawback as an energy source: storage. While the past decade has seen incredible growth of the PV industry, the path forward for solar involves an affordable storage solution that will make solar a truly sustainable energy source 24 hours a day. Though solar batteries (mentioned above) are a storage option, they are still not economically viable for the mainstream. Luckily, MIT Professor Jeffrey Grossman and his team of researchers have spent much of the past few years developing alternative storage solutions for solar; the most promising appears to be solar thermal fuels (STFs).

The technology and process behind STFs is comparable to a typical battery. The STF can harness sunlight energy, store it as a charge and then release it when prompted. The issue with storing solar as heat, according to the team’s findings, is that heat will always dissipate over time, which is why it is crucial that solar storage tech can charge energy rather than capture heat. For Grossman’s team, the latest STF prototype is simply an improvement of a prior design that allowed solar power to be stored as a liquid substance. Recent years saw the invention of a solid state STF application that could be implemented in windows, windshields, car tops, and other surfaces exposed to sunlight.

Solar water purifiers

Stanford University researchers collaborated with the Department of Energy this year to develop a new solar device that can purify water when exposed to sunlight.  The minuscule tablet (roughly half the size of a postage stamp) is not the first solar device to filter water, but it has made major strides in efficiency compared to past inventions. Prior purifier designs needed to harness UV rays and required hours of sun exposure to fully purify water. By contrast, Stanford’s new product can access visible light and only requires a few minutes to produce reliable drinking water. As the technology behind solar purifiers continues to improve, expect these chiclet-sized devices to come to market with hikers and campers in mind as an ideal consumer audience.

What new solar technology means for homeowners

For those considering solar panel systems, this long list of solar panel technology innovations from recent years is nothing but good news. Efficiency upgrades, storage improvements and equipment capabilities all contribute to more efficient power output for solar panels and lower costs for systems. Many of the products mentioned in this article, such as tracking mounts and solar batteries, are available in the EnergySage Solar Marketplace – all you have to do is indicate your preference for particular equipment options when you register your property. To get an instant estimate for your home’s potential solar costs and savings, try our free Solar Calculator.

By: Luke Richardson


This Is How We’d All Die Instantly If The Sun Suddenly Went Supernova

As far as raw explosive power goes, no other cataclysm in the Universe is both as common and as destructive as a core-collapse supernova. In one brief event lasting only seconds, a runaway reaction causes a star to give off as much energy as our Sun will emit over its entire 10-12 billion year lifetime. While many supernovae have been observed both historically and since the invention of the telescope, humanity has never witnessed one up close.

Recently, the nearby red supergiant star, Betelgeuse, has started exhibiting interesting signs of dimming, leading some to suspect that it might be on the verge of going supernova. While our Sun isn’t massive enough to experience that same fate, it’s a fun and macabre thought experiment to imagine what would happen if it did. Yes, we’d all die in short order, but not from either the blast wave or from radiation. Instead, the neutrinos would get us first. Here’s how.

An animation sequence of the 17th century supernova in the constellation of Cassiopeia. This explosion, despite occurring in the Milky Way and about 60-70 years after 1604, could not be seen with the naked eye due to the intervening dust. Surrounding material plus continued emission of EM radiation both play a role in the remnant's continued illumination. A supernova is the typical fate for a star greater than about 10 solar masses, although there are some exceptions.

NASA, ESA, and the Hubble Heritage (STScI/AURA)-ESA/Hubble Collaboration. Acknowledgement: Robert A. Fesen (Dartmouth College, USA) and James Long (ESA/Hubble)

A supernova — specifically, a core-collapse supernova — can only occur when a star many times more massive than our Sun runs out of nuclear fuel to burn in its core. All stars start off doing what our Sun does: fusing the most common element in the Universe, hydrogen, into helium through a series of chain reactions. During this part of a star’s life, it’s the radiation pressure from these nuclear fusion reactions that prevent the star’s interior from collapsing due to the enormous force of gravitation.

So what happens, then, when the star burns through all the hydrogen in its core? The radiation pressure drops and gravity starts to win in this titanic struggle, causing the core to contract. As it contracts, it heats up, and if the temperature can pass a certain critical threshold, the star will start fusing the next-lightest element in line, helium, to produce carbon.

This cutaway showcases the various regions of the surface and interior of the Sun, including the core, which is where nuclear fusion occurs. As time goes on, the helium-containing region in the core expands and the maximum temperature increases, causing the Sun's energy output to increase. When our Sun runs out of hydrogen fuel in the core, it will contract and heat up to a sufficient degree that helium fusion can begin.

Wikimedia Commons user Kelvinsong

This will occur in our own Sun some 5-to-7 billion years in the future, causing it to swell into a red giant. Our parent star will expand so much that Mercury, Venus, and possibly even Earth will be engulfed, but let’s instead imagine that we come up some clever plan to migrate our planet to a safe orbit, while mitigating the increased luminosity to prevent our planet from getting fried. This helium burning will last for hundreds of millions of years before our Sun runs out of helium and the core contracts and heats up once again.

For our Sun, that’s the end of the line, as we don’t have enough mass to move to the next stage and begin carbon fusion. In a star far more massive than our Sun, however, hydrogen-burning only takes millions of years to complete, and the helium-burning phase lasts merely hundreds of thousands of years. After that, the core’s contraction will enable carbon fusion to proceed, and things will move very quickly after that.

As it nears the end of its evolution, heavy elements produced by nuclear fusion inside the star are concentrated toward the center of the star. When the star explodes, the vast majority of the outer layers absorb neutrons rapidly, climbing the periodic table, and also get expelled back into the Universe where they participate in the next generation of star and planet formation.

NASA / CXC / S. Lee

Carbon fusion can produce elements such as oxygen, neon, and magnesium, but only takes hundreds of years to complete. When carbon becomes scarce in the core, it again contracts and heats up, leading to neon fusion (which lasts about a year), followed by oxygen fusion (lasting for a few months), and then silicon fusion (which lasts less than a day). In that final phase of silicon-burning, core temperatures can reach ~3 billion K, some 200 times the hottest temperatures currently found at the center of the Sun.

And then the critical moment occurs: the core runs out of silicon. Again, the pressure drops, but this time there’s nowhere to go. The elements that are produced from silicon fusion — elements like cobalt, nickel and iron — are more stable than the heavier elements that they’d conceivably fuse into. Instead, nothing there is capable of resisting gravitational collapse, and the core implodes.

Artist's illustration (left) of the interior of a massive star in the final stages, pre-supernova, of silicon-burning. (Silicon-burning is where iron, nickel, and cobalt form in the core.) A Chandra image (right) of the Cassiopeia A supernova remnant today shows elements like iron (in blue), sulphur (green), and magnesium (red). We do not know whether all core-collapse supernovae follow the same pathway or not.

NASA/CXC/M.Weiss; X-ray: NASA/CXC/GSFC/U.Hwang & J.Laming

This is where the core-collapse supernova happens. A runaway fusion reaction occurs, producing what’s basically one giant atomic nucleus made of neutrons in the star’s core, while the outer layers have a tremendous amount of energy injected into them. The fusion reaction itself lasts for only around 10 seconds, liberating about 10⁴⁴ joules of energy, or the mass-equivalent (via Einstein’s E = mc²) of about 10²⁷ kg: as much as you’d release by transforming two Saturns into pure energy.
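The "two Saturns" figure drops straight out of Einstein's mass-energy equivalence. A quick check in Python, using standard values for the speed of light and Saturn's mass:

```python
# Mass-equivalent of a core-collapse supernova's energy, via E = mc^2.
C = 2.998e8            # speed of light, m/s
E_SN = 1e44            # energy released over the ~10 s core collapse, J
M_SATURN = 5.68e26     # Saturn's mass, kg

mass_kg = E_SN / C**2          # m = E / c^2
saturns = mass_kg / M_SATURN
print(f"{mass_kg:.2e} kg  (~{saturns:.1f} Saturn masses)")
```

The result is roughly 1.1 × 10²⁷ kg, just about two Saturn masses, matching the comparison in the text.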

That energy goes into a mix of radiation (photons), the kinetic energy of the material in the now-exploding stellar material, and neutrinos. All three of these are more than capable of ending any life that’s managed to survive on an orbiting planet up to that point, but the big question of how we’d all die if the Sun went supernova depends on the answer to one question: who gets there first?

The anatomy of a very massive star throughout its life, culminating in a Type II supernova when the core runs out of nuclear fuel. The final stage of fusion is typically silicon-burning, producing iron and iron-like elements in the core for only a brief while before a supernova ensues. Many of the supernova remnants will lead to the formation of neutron stars, which can produce the greatest abundances of the heaviest elements of all by colliding and merging.

Nicole Rager Fuller/NSF

When the runaway fusion reaction occurs, the only delay in the light getting out comes from the fact that it’s produced in the core of this star, and the core is surrounded by the star’s outer layers. It takes a finite amount of time for that signal to propagate to the outermost surface of the star — the photosphere — where it’s then free to travel in a straight line at the speed of light.

As soon as it gets out, the radiation will scorch everything in its path, blowing the atmosphere (and any remaining ocean) clean off of the star-facing side of an Earth-like planet immediately, while the night side would last for seconds-to-minutes longer. The blast wave of the matter would follow soon afterwards, engulfing the remnants of our scorched world and quite possibly, dependent on the specifics of the explosion, destroying the planet entirely.


But any living creature would surely die even before the light or the blast wave from the supernova arrived; they’d never see their demise coming. Instead, the neutrinos — which interact with matter so rarely that an entire star, to them, functions like a pane of glass does to visible light — simply speed away omnidirectionally, from the moment of their creation, at speeds indistinguishable from the speed of light.

Moreover, neutrinos carry an enormous fraction of a supernova’s energy away: approximately 99% of it. At any given moment, with our paltry Sun emitting just ~4 × 10²⁶ joules of energy each second, approximately 70 trillion (7 × 10¹³) neutrinos are passing through your hand. The probability that any one of them will interact is tiny, but occasionally it happens, depositing the energy that neutrino carries into your body. Only a few neutrinos actually do this over the course of a typical day with our current Sun, but if it went supernova, the story would change dramatically.

A neutrino event, identifiable by the rings of Cherenkov radiation that show up along the photomultiplier tubes lining the detector walls, showcases the successful methodology of neutrino astronomy and leveraging the use of Cherenkov radiation. This image shows multiple events, and is part of the suite of experiments paving our way to a greater understanding of neutrinos. The neutrinos detected in 1987 marked the dawn of both neutrino astronomy as well as multi-messenger astronomy.

Super Kamiokande collaboration

When a supernova occurs, the neutrino flux increases by approximately a factor of 10 quadrillion (10¹⁶), while the energy-per-neutrino goes up by around a factor of 10, increasing the probability of a neutrino interacting with your body tremendously. When you work through the math, you’ll find that even with their extraordinarily low probability of interaction, any living creature — from a single-celled organism to a complex human being — would be boiled from the inside out from neutrino interactions alone.
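The scaling in the paragraph above can be sketched numerically. This is a rough order-of-magnitude estimate, assuming "a few" interactions per day in a human body under the quiescent Sun and the roughly quadratic energy dependence of the neutrino cross-section at these energies:

```python
# Order-of-magnitude scaling of neutrino interactions in a body during a
# supernova, following the figures in the text.
baseline_rate = 3.0        # assumed: ~a few interactions/day from the quiet Sun
flux_boost = 1e16          # supernova neutrino flux vs. the quiescent Sun
energy_boost = 10          # energy per neutrino goes up ~10x
xsec_boost = energy_boost**2  # cross-section scales roughly as E^2, so ~100x

sn_rate = baseline_rate * flux_boost * xsec_boost
print(f"~{sn_rate:.0e} interactions per day-equivalent rate")
```

An eighteen-order-of-magnitude jump in interaction rate is why even particles famous for passing through entire stars untouched would, in this scenario, deposit a lethal dose almost instantly.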

This is the scariest outcome imaginable, because you’d never see it coming. In 1987, we observed a supernova from 168,000 light-years away with both light and neutrinos. The neutrinos arrived at three different detectors across the world, spanning about 10 seconds from the earliest to the latest. The light from the supernova, however, didn’t begin arriving until hours later. By the time the first visual signatures arrived, everything on Earth would have already been vaporized for hours.

A supernova explosion enriches the surrounding interstellar medium with heavy elements. The outer rings are caused by previous ejecta, long before the final explosion. This explosion also emitted a huge variety of neutrinos, some of which made it all the way to Earth.

ESO / L. Calçada

Perhaps the scariest part of neutrinos is how there’s no good way to shield yourself from them. Even if you tried to block their path to you with lead, or a planet, or even a neutron star, more than 50% of the neutrinos would still get through. According to some estimates, not only would all life on an Earth-like planet be destroyed by neutrinos, but any life anywhere in a comparable solar system would meet that same fate, even out at the distance of Pluto, before the first light from the supernova ever arrived.
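To put the futility of shielding in numbers, here is a rough sketch. The figure of roughly one light-year of lead to stop about half of Sun-like neutrinos is a commonly quoted order-of-magnitude estimate and an assumption here (supernova neutrinos, being ~10x more energetic, interact somewhat more readily, but the conclusion is unchanged):

```python
import math

# How much of the neutrino flux would an Earth-sized shield actually stop?
# Assumption: roughly one light-year of lead is needed to absorb ~half of
# Sun-like neutrinos (a commonly quoted order-of-magnitude figure).
LIGHT_YEAR = 9.461e15                     # metres
half_value_thickness = 1.0 * LIGHT_YEAR   # lead needed for ~50% absorption
mfp = half_value_thickness / math.log(2)  # equivalent mean free path

earth_diameter = 1.274e7                  # metres, treating rock as lead-like
absorbed = 1 - math.exp(-earth_diameter / mfp)
print(f"fraction stopped by hiding behind the whole Earth: ~{absorbed:.1e}")
```

Hiding behind the entire planet removes only about one part in a billion of the flux, which is why even a neutron star makes a poor neutrino shield.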


The only early-warning system we could ever install to know something was coming is a sufficiently sensitive neutrino detector, which could detect the unique, surefire signatures of the neutrinos generated by each of carbon, neon, oxygen, and silicon burning. We would know when each of these transitions happened, giving any living beings a few hours, during the silicon-burning phase, to say their final goodbyes before the supernova occurred.

There are many natural neutrino signatures produced by stars and other processes in the Universe. Every set of neutrinos produced by a different fusion process inside a star will have a different spectral energy signature, enabling astronomers to determine whether their parent star is fusing carbon, oxygen, neon, and silicon in its interior, or not.

IceCube collaboration / NSF / University of Wisconsin

It’s horrifying to think that an event as fascinating and destructive as a supernova, despite all the spectacular effects it produces, would kill anything nearby before a single perceptible signal arrived, but that’s absolutely the case with neutrinos. Produced in the core of a supernova and carrying away 99% of its energy, neutrinos would deliver a lethal dose to all life on an Earth-like planet, with every location on the planet receiving it within 1/20th of a second of every other. No amount of shielding, even being on the opposite side of the planet from the supernova, would help at all.

Whenever any star goes supernova, neutrinos are the first signal that can be detected from it, but by the time they arrive, it’s already too late. Even as rarely as they interact, they’d sterilize their entire solar system before the light or matter from the blast ever arrived. At the moment of a supernova’s ignition, the fate of death is sealed by the stealthiest killer of all: the elusive neutrino.

Follow me on Twitter. Check out my website or some of my other work here.

Ethan Siegel

I am a Ph.D. astrophysicist, author, and science communicator, who professes physics and astronomy at various colleges. I have won numerous awards for science writing since 2008 for my blog, Starts With A Bang, including the award for best science blog by the Institute of Physics. My two books, Treknology: The Science of Star Trek from Tricorders to Warp Drive and Beyond the Galaxy: How humanity looked beyond our Milky Way and discovered the entire Universe, are available for purchase at Amazon. Follow me on Twitter @startswithabang.

Source: This Is How We’d All Die Instantly If The Sun Suddenly Went Supernova



This Is Why We Don’t Shoot Earth’s Garbage Into The Sun

Imagine our planet as it was for the first 4.55 billion years of its existence. Fires, volcanoes, earthquakes, tsunamis, asteroid strikes, hurricanes and many other natural disasters were ubiquitous, as was biological activity throughout our entire measured history. Most of the environmental changes that occurred were gradual and isolated; only in a few instances — often correlated with mass extinctions — were the changes global, immediate, and catastrophic.

But with the arrival of human beings, Earth’s natural environment has another element to contend with: the changes wrought upon it by our species. For tens of thousands of years, the largest wars were merely regional skirmishes; the largest problems with waste led only to isolated disease outbreaks. But our numbers and technological capabilities have grown, and with them, a waste management problem. You might think a great solution would be to send our worst garbage into the Sun, but we’ll never make it happen. Here’s why.

The very first launch of the Falcon Heavy, on February 6, 2018, was a tremendous success. The rocket reached low-Earth-orbit, deployed its payload successfully, and the main boosters returned to Cape Kennedy, where they landed successfully. The promise of a reusable heavy-lift vehicle is now a reality, and could lower launch costs to ~$1000/pound. Still, even with all these advances, we won't be launching our garbage into the Sun anytime soon.

Jim Watson/AFP/Getty Images


At present, there are a little more than 7 billion humans on the planet, and the previous century saw us at last become a spacefaring civilization, where we’ve broken the gravitational bonds that have kept us shackled to Earth. We’ve extracted valuable and rare minerals and elements, synthesized new chemical compounds, developed nuclear technologies, and produced new technologies that far exceed even the wildest dreams of our distant ancestors.

Although these new technologies have transformed our world and improved our quality of life, there are negative side-effects that have come along for the ride. We now have the capacity to cause widespread damage and destruction to our environment in a variety of ways, from deforestation to atmospheric pollution to ocean acidification and more. With time and care, the Earth will begin self-regulating as soon as we stop exacerbating these problems. But other problems just aren’t going to get better on their own on any reasonable timescale.

Nuclear weapon test Mike (yield 10.4 Mt) on Enewetak Atoll. The test was part of Operation Ivy. Mike was the first hydrogen bomb ever tested. A release of this much energy corresponds to approximately 500 grams of matter being converted into pure energy: an astonishingly large explosion for such a tiny amount of mass. Nuclear reactions involving fission or fusion (or both, as in the case of Ivy Mike) can produce tremendously dangerous, long-term radioactive waste.

National Nuclear Security Administration / Nevada Site Office

Some of what we’ve produced here on Earth isn’t merely a problem to be reckoned with over the short-term, but poses a danger that will not significantly lessen with time. Our most dangerous, long-term pollutants include nuclear by-products and waste, hazardous chemicals and biohazards, and plastics that off-gas and don’t biodegrade; these could wreak havoc on a significant fraction of the living beings on Earth if they got into the environment in the wrong way.

You might think that the “worst of the worst” of these offenders should be packed onto a rocket, launched into space, and sent on a collision course with the Sun, where at last they won’t plague Earth anymore. (Yes, that was similar to the plot of Superman IV.) From a physics point of view, it’s possible to do so.

But should we do it? That’s another story entirely, and it begins with considering how gravitation works on Earth and in our Solar System.

The Mercury-bound MESSENGER spacecraft captured several stunning images of Earth during a gravity assist swingby of its home planet on Aug. 2, 2005. Several hundred images, taken with the wide-angle camera in MESSENGER's Mercury Dual Imaging System (MDIS), were sequenced into a movie documenting the view from MESSENGER as it departed Earth. Earth rotates roughly once every 24 hours on its axis and moves through space in an elliptical orbit around our Sun.

NASA / Messenger mission

Human beings evolved on Earth, grew to prominence on this world, and developed extraordinary technologies that our corner of the cosmos had never seen before. We have long dreamed of exploring the Universe beyond our home, but only in the past few decades have we managed to escape the gravitational bonds of Earth. Our massive planet curves spacetime, and its gravitational pull, which depends only on our distance from Earth’s center, causes all objects on or near it — including humans — to constantly accelerate “downwards.”

There’s a certain amount of energy keeping any massive object bound to Earth: gravitational potential energy. However, if we move fast enough (i.e., impart enough kinetic energy) to an object, it can cross two important thresholds.

  1. The threshold of a stable orbital speed to never collide with Earth: about 7.9 km/s (17,700 mph).
  2. The threshold of escaping from Earth’s gravity entirely: 11.2 km/s (25,000 mph).
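Both thresholds follow directly from Newtonian gravity: the circular-orbit speed is sqrt(GM/r), and the escape speed is sqrt(2) times that. A minimal check, using standard values for Earth's gravitational parameter and mean radius:

```python
import math

MU_EARTH = 3.986e14   # Earth's gravitational parameter G*M, in m^3/s^2
R_EARTH  = 6.371e6    # Earth's mean radius, in m

v_orbit  = math.sqrt(MU_EARTH / R_EARTH)      # circular-orbit speed at the surface
v_escape = math.sqrt(2 * MU_EARTH / R_EARTH)  # escape speed = sqrt(2) * v_orbit

print(f"stable orbit: {v_orbit/1000:.1f} km/s, escape: {v_escape/1000:.1f} km/s")
# -> stable orbit: 7.9 km/s, escape: 11.2 km/s
```

The factor of sqrt(2) between the two speeds is why escaping Earth entirely costs only about 40% more speed than merely orbiting it.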

It takes a speed of 7.9 km/s to achieve "C" (stable orbit), while it takes a speed of 11.2 km/s for "E" to escape Earth's gravity. Speeds less than "C" will fall back to Earth; speeds between "C" and "E" will remain bound to Earth in a stable orbit.

Brian Brondel under a c.c.a.-s.a.-3.0 license

For comparison, a human at the equator of our planet, where Earth’s rotation is maximized, is moving only at about 0.47 km/s (1,000 mph), leading to the conclusion that we’re in no danger of escaping unless there’s some tremendous intervention that changes the situation.

Luckily, we’ve developed just such an intervention: rocketry. To get a rocket into Earth’s orbit, we require at least the amount of energy it would take to accelerate that rocket to the necessary threshold speed we mentioned earlier. Humanity has been doing this since the 1950s, and once we escape from Earth, there is much more going on at larger scales.

Earth isn’t stationary, but orbits the Sun at approximately 30 km/s (67,000 mph), meaning that even if you escape from Earth, you’ll still find yourself not only gravitationally bound to the Sun, but in a stable elliptical orbit around it.

The Dove satellites, launched from the ISS, are designed for Earth imaging and have numbered approximately 300 in total. There are ~130 Dove satellites, created by Planet, that are still in Earth's orbit, but that number will drop to zero by the 2030s due to orbital decay. If these satellites were boosted to escape from Earth's gravity, they would still orbit the Sun unless they were boosted by much greater amounts.

NASA

This is a key point: you might think that here on Earth, we’re bound by Earth’s gravity and that’s the dominant factor as far as gravitation is concerned. Quite to the contrary, the gravitational pull of the Sun far exceeds the gravitational pull of Earth! The only reason we don’t notice it is because you, me, and the entire planet Earth are in free-fall with respect to the Sun, and so we’re all accelerated by it at the same relative rate.

If we were in space and managed to escape from Earth’s gravity, we’d still find ourselves moving at approximately 30 km/s with respect to the Sun, and at an approximate distance of 150 million km (93 million miles) from our parent star. If we wanted to escape from the Solar System, we’d have to gain about another 12 km/s of speed to reach escape velocity, something that a few of our spacecraft (Pioneer 10 and 11, Voyager 1 and 2, and New Horizons) have already achieved.
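The same formulas, applied at Earth's distance from the Sun, reproduce both numbers quoted above and foreshadow the central problem with aiming for the Sun: cancelling our 30 km/s orbital speed costs far more than adding the ~12 km/s needed to escape. A quick sketch using standard solar values:

```python
import math

MU_SUN = 1.327e20     # Sun's gravitational parameter G*M, in m^3/s^2
AU     = 1.496e11     # Earth-Sun distance, in m

v_orbit  = math.sqrt(MU_SUN / AU)   # Earth's orbital speed around the Sun
v_escape = math.sqrt(2) * v_orbit   # escape speed from the Sun at 1 AU

dv_escape = v_escape - v_orbit      # extra speed needed to leave the Solar System
dv_to_sun = v_orbit                 # speed to cancel to drop straight into the Sun
print(f"orbit: {v_orbit/1000:.0f} km/s, escape: {v_escape/1000:.0f} km/s, "
      f"delta-v to escape: {dv_escape/1000:.0f} km/s, to fall in: {dv_to_sun/1000:.0f} km/s")
```

In other words, a payload starting from Earth's orbit needs roughly two and a half times as much velocity change to hit the Sun as to leave the Solar System entirely.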

The escape speed from the Sun at Earth's distance is 42 km/s, and we already move at 30 km/s just by orbiting the Sun. Once Voyager 2 flew by Jupiter, which gravitationally 'slingshotted' it, it was destined to leave the Solar System.

Wikimedia Commons user Cmglee

But if we wanted to go in the opposite direction, and launch a spacecraft payload into the Sun, we’d have a big challenge at hand: we’d have to lose enough kinetic energy that a stable elliptical orbit around our Sun would transition to an orbit that came close enough to the Sun to collide with it. There are only two ways to accomplish this:

  1. Bring enough fuel with you so that you can decelerate your payload sufficiently (i.e., have it lose as much of its relative speed with respect to the Sun as possible), and then watch your payload gravitationally free-fall into the Sun.
  2. Configure enough fly-bys with the innermost planets of our Solar System — Earth, Venus and/or Mercury — so that the orbiting payload gets de-boosted (as opposed to the positive boosts that spacecraft like Pioneer, Voyager, and New Horizons received from gravitationally interacting with the outer planets) and eventually comes close enough to the Sun that it gets devoured.

The idea of a gravitational slingshot, or gravity assist, is to have a spacecraft approach a planet orbiting the Sun that it is not bound to. Depending on the orientation of the spacecraft's relative trajectory, it will either receive a speed boost or a de-boost with respect to the Sun, compensated for by the energy lost or gained (respectively) by the planet orbiting the Sun.

Wikimedia Commons user Zeimusu

The first option, in reality, requires so much fuel that it’s practically impossible with current (chemical rocket) technology. If you loaded up a rocket with a massive payload, like you might expect for all the hazardous waste you want to fire into the Sun, you’d have to load it up with a lot of rocket fuel, in orbit, to decelerate it sufficiently so that it’d fall into the Sun. To launch both that payload and the additional fuel requires a rocket that’s larger, more powerful and more massive than any we’ve ever built on Earth by a large margin.

Instead, we can use the gravity assist technique to either add or remove kinetic energy from a payload. If you approach a large mass (like a planet) from behind, fly in front of it, and get gravitationally slingshotted behind the planet, the spacecraft loses energy while the planet gains energy. If you go the opposite way, though, approaching the planet from ahead, flying behind it and getting gravitationally slingshotted back in front again, your spacecraft gains energy while removing it from the orbiting planet.

The Messenger mission took seven years and a total of six gravity assists and five deep-space maneuvers to reach its final destination: in orbit around the planet Mercury. The Parker Solar Probe will need to do even more to reach its final destination: the corona of the Sun. When it comes to reaching the inner Solar System, spacecraft are required to lose a lot of energy to make it possible: a difficult task.

NASA/JPL

Two decades ago, we used this gravitational slingshot method to successfully send an orbiter to rendezvous with and continuously image the planet Mercury: the Messenger mission. It enabled us to construct the first all-planet mosaic of our Solar System’s innermost world. More recently, we’ve used the same technique to launch the Parker Solar Probe into a highly elliptical orbit that will take it to within just a few solar radii of the Sun.

A carefully calculated set of future trajectories is all that’s required to reach the Sun, so long as you orient your payload with the correct initial velocity. It’s difficult to do, but not impossible, and the Parker Solar Probe is perhaps the poster child for how we would, from Earth, successfully launch a rocket payload into the Sun.

Keeping all this in mind, you might conclude that it’s technologically feasible to launch our garbage — including hazardous waste like poisonous chemicals, biohazards, and even radioactive waste — into the Sun. But it’s something we’ll almost certainly never do.

Why not? There are currently three barriers to the idea:

  1. The possibility of a launch failure. If your payload is radioactive or hazardous and you have an explosion on launch or during a fly-by with Earth, all of that waste will be uncontrollably distributed across Earth.
  2. Energetically, it costs less to shoot your payload out of the Solar System (from a positive gravity assist with planets like Jupiter) than it does to shoot your payload into the Sun.
  3. And finally, even if we chose to do it, the cost to send our garbage into the Sun is prohibitively expensive at present.

This time-series photograph of the uncrewed Antares rocket launch in 2014 shows a catastrophic explosion-on-launch, which is an unavoidable possibility for any and all rockets. Even if we could achieve a much improved success rate, the risk of contaminating our planet with hazardous waste is prohibitive for launching our garbage into the Sun (or out of the Solar System) at present.

NASA/Joel Kowsky

The most successful and reliable space launch system of all time is the Soyuz rocket, which has a 97% success rate after more than 1,000 launches. Yet a 2% or 3% failure rate, when you apply that to a rocket loaded up with all the dangerous waste you want launched off of your planet, leads to the catastrophic possibility of having that waste spread into the oceans, atmosphere, into populated areas, drinking water, etc. This scenario doesn’t end well for humanity; the risk is too high.

Considering that the United States alone is storing about 60,000 tons of high-level nuclear waste, it would take approximately 8,600 Soyuz rockets to remove this waste from the Earth. Even if we could reduce the launch failure rate to an unprecedented 0.1%, it would cost approximately a trillion dollars and, with an estimated 9 launch failures to look forward to, would lead to over 60,000 pounds of hazardous waste being randomly redistributed across the Earth.
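The arithmetic behind those figures is easy to reproduce. The ~7 tonne payload per launch is an assumption chosen to match the article's ~8,600-launch figure (a Soyuz-class rocket's actual capacity to an Earth-escape trajectory is considerably lower, which only makes the problem worse):

```python
waste_tons   = 60000    # US high-level nuclear waste stockpile (from the text)
per_launch   = 7        # tonnes per rocket; assumed, to match the 8,600-launch figure
failure_rate = 0.001    # the optimistic 0.1% failure rate from the text

launches = waste_tons / per_launch          # number of launches required
failures = launches * failure_rate          # expected catastrophic failures
spilled  = failures * per_launch * 2000     # pounds of waste dispersed (2,000 lb/ton)
print(f"{launches:.0f} launches, ~{failures:.1f} expected failures, "
      f"~{spilled:,.0f} lb of waste dispersed")
```

Note that ~9 failures at ~7 tonnes each disperses on the order of 120,000 lb of waste, comfortably above the "over 60,000 pounds" quoted above.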

Unless we’re willing to pay an unprecedented cost and accept the near-certainty of catastrophic environmental pollution, we have to leave the idea of shooting our garbage into the Sun to the realm of science fiction and future hopeful technologies like space elevators. It’s undeniable that we’ve made quite the mess on planet Earth. Now, it’s up to us to figure out our own way out of it.



Source: This Is Why We Don’t Shoot Earth’s Garbage Into The Sun


 


Why The Track Forecast For Hurricane Dorian Has Been So Challenging

Here is something that you can take to the bank. We will not see the name “Dorian” used in the Atlantic basin for any future hurricane. The names of particularly destructive or impactful storms are retired. According to the National Hurricane Center, Dorian is now tied with the 1935 Labor Day hurricane for the strongest Atlantic hurricane landfall on record. In a 3 pm advisory on September 1st, the National Hurricane Center warned of gusts to 220 mph and storm surges of 18 to 23 feet for parts of the Abacos.

I have been in the field of meteorology for over 25 years and do not recall seeing warnings about 220 mph gusts for a hurricane. Hurricane watches have also been issued for Andros Island and from north of Deerfield Beach to the Volusia/Brevard County Line in Florida. At the time of writing, the official forecast from the National Hurricane Center is for a northward curve and no direct Florida landfall. This is dramatically different from forecasts only a few days ago.

There is still uncertainty with the forecast so coastal Florida, Georgia, and the Carolinas should remain on high alert. Why has the track forecast been so challenging with Hurricane Dorian?

Historically, hurricane track forecasts have outpaced intensity forecasts. I discuss the reasons why in a previous Forbes article at this link. With Hurricane Dorian, uncertainty about the forecast track and timing of the storm forced officials to move the Florida State – Boise State football game from Jacksonville, slated for a 7 pm kickoff on Saturday, to noon in Tallahassee. I am certain that many businesses and people are questioning the move given the timing of when impacts are now expected. Unfortunately, officials and emergency managers often must make decisions on the best information available at the moment.

Some people may be tempted to use uncertainty with this forecast to spew vitriol or skepticism at meteorologists and our models. However, challenges with Hurricane Dorian’s track forecast do not define the legacy of weather forecasts. It would be silly to say that the NFL’s best field goal kicker is terrible based on a few misses.

So what’s going on? I asked a panel of tropical meteorology experts.


Speed of motion of Hurricane Dorian has been a significant challenge. Professor John Knox, a recent recipient of the American Meteorological Society’s Edward Lorenz Teaching Award, offers an important lesson. The University of Georgia atmospheric sciences professor pointed out:

Before you bash the meteorologists for being stupid: one reason the forecasted track has changed is because the forecasts of the forward speed of Dorian have slowed it down more and more. If it had chugged along as originally forecast, it likely would have hit east-central Florida and then maybe gone into the Gulf, before the high pressure above us in the Southeast would break down. But, because it’s moving more slowly, the high-pressure break down is opening the gate, so to speak, for Dorian to go more northward and eastward. So, the change in forecast is tied tightly to the arrival timing.

Professor John Knox, University of Georgia

Dr. Phillippe Papin is an atmospheric scientist and postdoctoral associate at the U.S. Naval Research Laboratory. Papin also points to the high pressure as a factor. He wrote:

the ridge to the north of Dorian has been steering Dorian off to the west the last few days….But there is a weak trough that is swinging into the eastern US that is going to erode the strength to the ridge enough so that a gap forms to the north of Dorian and it begins to move further to the north.

Dr. Phillippe Papin, U.S. Naval Research Laboratory

The timing of when that weakness develops, and how far Dorian makes it west in the meantime, has been the source of uncertainty in the model guidance for the last 2-3 days, according to Papin. At the time of writing, there is still some spread in the model solutions.

Dr. Michael Ventrice is a tropical weather expert with IBM and The Weather Company. He has been concerned about the storm environment and how well the models are capturing the rapidly evolving situation. He told me:

I believe the uncertainty is derived from how the models are resolving Dorian, locally. The recent intensification of the storm today is not being resolved by the models properly at the time of the 12z initialization. The interaction with the Bahamas, how that interaction might alter the mesoscale structure of the Hurricane, if that interaction induces a wobble, are all valid questions at this point in time

Michael Ventrice — IBM/The Weather Company

A hurricane of this size and intensity can certainly modify its environment and be modified by that environment. Sam Lillo, a doctoral candidate at the University of Oklahoma, tweeted an interesting point on the afternoon of September 1st about how worrisome the rapid intensification and track uncertainty of Hurricane Dorian have been:

The track uncertainty in NWP at under 3-day lead-time is very uncomfortable, especially considering proximity to land. This would be uncomfortable for any hurricane. But then make it a category 5.

Sam Lillo, doctoral candidate in meteorology at the University of Oklahoma

Our best models have oscillated (and in some cases continue to do so) within the past 24-36 hours on just how close Dorian will get to Florida before curving northward. Lillo offers some further insight into what Dr. Ventrice was alluding to about the environment:

As Dorian strengthened faster than expected, diabatic outflow developed an upper level anticyclone to the southwest, adding southerly and westerly components to the steering flow. The westerly component in particular slowed the forward motion of the hurricane, and now its track across the Bahamas coincides with a trough that sweeps across the Mid Atlantic and Northeast on Monday. This trough cuts into the ridge to the north of Dorian, with multiple steering currents now trying to tug the hurricane in all different directions. The future track is highly sensitive to each of these currents, with large feedback on every mile the hurricane jogs to the left or right over the next 24 to 48 hours.

Sam Lillo, doctoral candidate in meteorology at the University of Oklahoma

Lillo offers a nice meteorological explanation. In a nutshell, he is saying that the rapid intensification perturbed the near-storm environment and now there may be other steering influences besides the ridge of high pressure that the models are struggling to resolve.

In a previous Forbes piece last week, I mentioned that forecasts in the 5+ day window and beyond can have errors of 200 miles and that the information should be used as “guidance” not “Gospel.” Because there is still uncertainty with the models and Dorian is such a strong storm, residents from coastal Florida to the Carolinas must pay attention and be prepared to act. I have complete confidence in my colleagues at the National Hurricane Center, and they should always be your definitive source with storms like this. They still maintain an eventual curve northward before the storm reaches the Florida coast. However, the issuance of hurricane watches in Florida also indicates that they know the margin of error is razor thin.

Follow me on Twitter. Check out my website.

Dr. J. Marshall Shepherd, a leading international expert in weather and climate, was the 2013 President of American Meteorological Society (AMS) and is Director of the University of Georgia’s (UGA) Atmospheric Sciences Program. Dr. Shepherd is the Georgia Athletic Association Distinguished Professor and hosts The Weather Channel’s Weather Geeks Podcast, which can be found at all podcast outlets. Prior to UGA, Dr. Shepherd spent 12 years as a Research Meteorologist at NASA-Goddard Space Flight Center and was Deputy Project Scientist for the Global Precipitation Measurement (GPM) mission. In 2004, he was honored at the White House with a prestigious PECASE award. He also has received major honors from the American Meteorological Society, American Association of Geographers, and the Captain Planet Foundation. Shepherd is frequently sought as an expert on weather and climate by major media outlets, the White House, and Congress. He has over 80 peer-reviewed scholarly publications and numerous editorials. Dr. Shepherd received his B.S., M.S. and PhD in physical meteorology from Florida State University.

Source: Why The Track Forecast For Hurricane Dorian Has Been So Challenging
