This Is How We’d All Die Instantly If The Sun Suddenly Went Supernova

As far as raw explosive power goes, no other cataclysm in the Universe is both as common and as destructive as a core-collapse supernova. In one brief event lasting only seconds, a runaway reaction causes a star to give off as much energy as our Sun will emit over its entire 10-12 billion year lifetime. While many supernovae have been observed both historically and since the invention of the telescope, humanity has never witnessed one up close.

Recently, the nearby red supergiant star Betelgeuse has started exhibiting interesting signs of dimming, leading some to suspect that it might be on the verge of going supernova. While our Sun isn’t massive enough to experience that same fate, it’s a fun and macabre thought experiment to imagine what would happen if it did. Yes, we’d all die in short order, but not from either the blast wave or the radiation. Instead, the neutrinos would get us first. Here’s how.

An animation sequence of the 17th century supernova in the constellation of Cassiopeia. This explosion, despite occurring in the Milky Way and about 60-70 years after 1604, could not be seen with the naked eye due to the intervening dust. Surrounding material plus continued emission of EM radiation both play a role in the remnant's continued illumination. A supernova is the typical fate for a star greater than about 10 solar masses, although there are some exceptions.

NASA, ESA, and the Hubble Heritage (STScI/AURA)-ESA/Hubble Collaboration. Acknowledgement: Robert A. Fesen (Dartmouth College, USA) and James Long (ESA/Hubble)

A supernova — specifically, a core-collapse supernova — can only occur when a star many times more massive than our Sun runs out of nuclear fuel to burn in its core. All stars start off doing what our Sun does: fusing the most common element in the Universe, hydrogen, into helium through a series of chain reactions. During this part of a star’s life, it’s the radiation pressure from these nuclear fusion reactions that prevents the star’s interior from collapsing under the enormous force of gravitation.

So what happens, then, when the star burns through all the hydrogen in its core? The radiation pressure drops and gravity starts to win in this titanic struggle, causing the core to contract. As it contracts, it heats up, and if the temperature can pass a certain critical threshold, the star will start fusing the next-lightest element in line, helium, to produce carbon.

This cutaway showcases the various regions of the surface and interior of the Sun, including the core, which is where nuclear fusion occurs. As time goes on, the helium-containing region in the core expands and the maximum temperature increases, causing the Sun's energy output to increase. When our Sun runs out of hydrogen fuel in the core, it will contract and heat up to a sufficient degree that helium fusion can begin.

Wikimedia Commons user Kelvinsong

This will occur in our own Sun some 5-to-7 billion years in the future, causing it to swell into a red giant. Our parent star will expand so much that Mercury, Venus, and possibly even Earth will be engulfed, but let’s instead imagine that we come up with some clever plan to migrate our planet to a safe orbit, while mitigating the increased luminosity to prevent our planet from getting fried. This helium burning will last for hundreds of millions of years before our Sun runs out of helium and the core contracts and heats up once again.

For our Sun, that’s the end of the line, as we don’t have enough mass to move to the next stage and begin carbon fusion. In a star far more massive than our Sun, however, hydrogen-burning only takes millions of years to complete, and the helium-burning phase lasts merely hundreds of thousands of years. After that, the core’s contraction will enable carbon fusion to proceed, and things will move very quickly after that.

As it nears the end of its evolution, heavy elements produced by nuclear fusion inside the star are concentrated toward the center of the star. When the star explodes, the vast majority of the outer layers absorb neutrons rapidly, climbing the periodic table, and also get expelled back into the Universe where they participate in the next generation of star and planet formation.

NASA / CXC / S. Lee

Carbon fusion can produce elements such as oxygen, neon, and magnesium, but only takes hundreds of years to complete. When carbon becomes scarce in the core, it again contracts and heats up, leading to neon fusion (which lasts about a year), followed by oxygen fusion (lasting for a few months), and then silicon fusion (which lasts less than a day). In that final phase of silicon-burning, core temperatures can reach ~3 billion K, some 200 times the hottest temperatures currently found at the center of the Sun.

And then the critical moment occurs: the core runs out of silicon. Again, the pressure drops, but this time there’s nowhere to go. The elements that are produced from silicon fusion — elements like cobalt, nickel and iron — are more stable than the heavier elements that they’d conceivably fuse into. Instead, nothing there is capable of resisting gravitational collapse, and the core implodes.

Artist's illustration (left) of the interior of a massive star in the final stages, pre-supernova, of silicon-burning. (Silicon-burning is where iron, nickel, and cobalt form in the core.) A Chandra image (right) of the Cassiopeia A supernova remnant today shows elements like iron (blue), sulphur (green), and magnesium (red). We do not know whether all core-collapse supernovae follow the same pathway or not.

NASA/CXC/M.Weiss; X-ray: NASA/CXC/GSFC/U.Hwang & J.Laming

This is where the core-collapse supernova happens. A runaway fusion reaction occurs, producing what’s basically one giant atomic nucleus made of neutrons in the star’s core, while the outer layers have a tremendous amount of energy injected into them. The fusion reaction itself lasts for only around 10 seconds, liberating about 10⁴⁴ Joules of energy, or the mass-equivalent (via Einstein’s E = mc²) of about 10²⁷ kg: as much as you’d release by transforming two Saturns into pure energy.

That energy goes into a mix of radiation (photons), the kinetic energy of the material in the now-exploding stellar material, and neutrinos. All three of these are more than capable of ending any life that’s managed to survive on an orbiting planet up to that point, but the big question of how we’d all die if the Sun went supernova depends on the answer to one question: who gets there first?
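To put that last figure in perspective, here is a minimal back-of-the-envelope check in Python; the ~10⁴⁴ J value is the one quoted above, and Saturn's mass is the standard catalog value rather than anything from the article:

```python
# Rough check: how much mass does ~1e44 J correspond to, via E = m * c^2?
c = 2.998e8          # speed of light, m/s
E_supernova = 1e44   # approximate energy released in the explosion, J (quoted above)

m_equivalent = E_supernova / c**2   # kg of mass converted to energy
m_saturn = 5.68e26                  # mass of Saturn, kg (standard value)

print(f"Mass-equivalent: {m_equivalent:.2e} kg")             # ~1.1e27 kg
print(f"...or about {m_equivalent / m_saturn:.1f} Saturns")  # ~2 Saturns
```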

The anatomy of a very massive star throughout its life, culminating in a Type II Supernova when the core runs out of nuclear fuel. The final stage of fusion is typically silicon-burning, producing iron and iron-like elements in the core for only a brief while before a supernova ensues. Many of the supernova remnants will lead to the formation of neutron stars, which can produce the greatest abundances of the heaviest elements of all by colliding and merging.

Nicole Rager Fuller/NSF

When the runaway fusion reaction occurs, the only delay in the light getting out comes from the fact that it’s produced in the core of this star, and the core is surrounded by the star’s outer layers. It takes a finite amount of time for that signal to propagate to the outermost surface of the star — the photosphere — where it’s then free to travel in a straight line at the speed of light.

As soon as it gets out, the radiation will scorch everything in its path, blowing the atmosphere (and any remaining ocean) clean off of the star-facing side of an Earth-like planet immediately, while the night side would last for seconds-to-minutes longer. The blast wave of ejected matter would follow soon afterwards, engulfing the remnants of our scorched world and quite possibly, depending on the specifics of the explosion, destroying the planet entirely.

But any living creature would surely die even before the light or the blast wave from the supernova arrived; they’d never see their demise coming. Instead, the neutrinos — which interact with matter so rarely that an entire star, to them, functions like a pane of glass does to visible light — simply speed away omnidirectionally, from the moment of their creation, at speeds indistinguishable from the speed of light.

Moreover, neutrinos carry away an enormous fraction of a supernova’s energy: approximately 99% of it. Even now, with our paltry Sun emitting just ~4 × 10²⁶ joules of energy each second, approximately 70 trillion (7 × 10¹³) neutrinos pass through your hand every second. The probability that any one of them will interact is tiny, but occasionally one does, depositing the energy it carries into your body. Only a few neutrinos actually do this over the course of a typical day with our current Sun, but if it went supernova, the story would change dramatically.
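Where does a figure like that come from? Here is a rough sketch, using the solar energy output quoted above plus the fact that each completion of the proton-proton chain (fusing four protons into helium-4) releases about 26.7 MeV and two neutrinos; treat it as an order-of-magnitude estimate, not a precise accounting:

```python
import math

# Estimate the solar neutrino flux arriving at Earth.
L_sun = 4e26                        # solar luminosity, W (figure used above)
E_per_chain = 26.7e6 * 1.602e-19    # energy released per p-p chain completion, J
neutrinos_per_chain = 2             # two neutrinos per helium-4 produced
AU = 1.496e11                       # Earth-Sun distance, m

chains_per_second = L_sun / E_per_chain
neutrinos_per_second = neutrinos_per_chain * chains_per_second
flux = neutrinos_per_second / (4 * math.pi * AU**2)   # neutrinos per m^2 per s

print(f"{flux:.1e} per m^2 per second")          # ~7e14
print(f"{flux / 1e4:.1e} per cm^2 per second")   # ~7e10: tens of billions per cm^2
```

Multiply that per-square-centimeter flux by the cross-sectional area of a hand or a whole body and you land in the trillions-and-up-per-second regime quoted above; the exact number depends on the area you assume.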

A neutrino event, identifiable by the rings of Cherenkov radiation that show up along the photomultiplier tubes lining the detector walls, showcases the methodology of neutrino astronomy, which leverages that Cherenkov light. This image shows multiple events, and is part of the suite of experiments paving our way to a greater understanding of neutrinos. The neutrinos detected in 1987 marked the dawn of both neutrino astronomy and multi-messenger astronomy.

Super Kamiokande collaboration

When a supernova occurs, the neutrino flux increases by approximately a factor of 10 quadrillion (10¹⁶), while the energy per neutrino goes up by around a factor of 10, tremendously increasing the probability of a neutrino interacting with your body. When you work through the math, you’ll find that even with their extraordinarily low probability of interaction, any living creature — from a single-celled organism to a complex human being — would be boiled from the inside out by neutrino interactions alone.
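Here is a minimal order-of-magnitude sketch of why that's unsurvivable, built only from the scaling factors quoted above plus a few loudly flagged assumptions: the baseline interaction rate, the energy deposited per event, the body mass, and a rough energy-squared scaling for the interaction cross-section are all illustrative inputs, not values from the article:

```python
# Rough neutrino dose to a human body if the Sun went supernova.
baseline_per_day = 3.0                      # "a few" interactions per day today (assumed)
baseline_rate = baseline_per_day / 86400.0  # interactions per second, quiet Sun

flux_boost = 1e16               # neutrino flux increase quoted above
energy_boost = 10.0             # energy-per-neutrino increase quoted above
xsec_boost = energy_boost ** 2  # weak cross-sections grow roughly as energy squared

sn_rate = baseline_rate * flux_boost * xsec_boost   # interactions per second

MeV = 1.602e-13                 # joules
E_deposited = 10 * MeV          # ~10 MeV deposited per interaction (assumed)
body_mass = 70.0                # kg (assumed)
burst_duration = 10.0           # the ~10-second burst described above, s

dose_rate = sn_rate * E_deposited / body_mass       # gray (J/kg) per second

print(f"{sn_rate:.1e} interactions per second")                       # ~3e13
print(f"{dose_rate:.1f} Gy/s, ~{dose_rate * burst_duration:.0f} Gy over the burst")
# A whole-body dose of roughly 5 Gy is usually fatal; this blows past it in seconds.
```

Even with these deliberately rough inputs, the dose crosses a fatal threshold within the first seconds of the burst's arrival.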

This is the scariest outcome imaginable, because you’d never see it coming. In 1987, we observed a supernova from 168,000 light-years away in both light and neutrinos. The neutrinos arrived at three different detectors across the world, spanning about 10 seconds from the earliest to the latest. The light from the supernova, however, didn’t begin arriving until hours later, since it first had to propagate out through the star’s outer layers. Had that supernova been our own Sun, the neutrinos would have killed everything on Earth hours before the first visual signatures ever arrived.

A supernova explosion enriches the surrounding interstellar medium with heavy elements. The outer rings are caused by previous ejecta, expelled long before the final explosion. This explosion also emitted a huge variety of neutrinos, some of which made it all the way to Earth.

ESO / L. Calçada

Perhaps the scariest part of neutrinos is how there’s no good way to shield yourself from them. Even if you tried to block their path to you with lead, or a planet, or even a neutron star, more than 50% of the neutrinos would still get through. According to some estimates, not only would all life on an Earth-like planet be destroyed by neutrinos, but any life anywhere in a comparable solar system would meet that same fate, even out at the distance of Pluto, before the first light from the supernova ever arrived.


The only early-warning system we could ever install to know that something was coming is a sufficiently sensitive neutrino detector, one that could pick out the unique, surefire signatures of the neutrinos generated by each of carbon, neon, oxygen, and silicon burning. We would know when each of these transitions happened, giving any living beings a few hours, during the silicon-burning phase, to say their final goodbyes before the supernova occurred.

There are many natural neutrino signatures produced by stars and other processes in the Universe. Every set of neutrinos produced by a different fusion process inside a star will have a different spectral energy signature, enabling astronomers to determine whether their parent star is fusing carbon, oxygen, neon, and silicon in its interior, or not.

IceCube collaboration / NSF / University of Wisconsin

It’s horrifying to think that an event as fascinating and destructive as a supernova, despite all the spectacular effects it produces, would kill anything nearby before a single perceptible signal arrived, but that’s absolutely the case with neutrinos. Produced in the core of a supernova and carrying away 99% of its energy, they would deliver a lethal dose to all life on an Earth-like planet, with every location on the globe receiving it within about 1/20th of a second of every other. No amount of shielding, not even being on the opposite side of the planet from the supernova, would help at all.
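That “1/20th of a second” is just the time it takes something moving at essentially the speed of light to cross Earth's diameter, as a one-line estimate confirms:

```python
# Time for neutrinos (moving at ~c) to traverse the full diameter of the Earth.
earth_diameter = 1.2742e7   # m
c = 2.998e8                 # m/s
print(f"{earth_diameter / c * 1000:.0f} ms")   # ~43 ms, i.e. a few hundredths of a second
```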

Whenever any star goes supernova, neutrinos are the first signal that can be detected from it, but by the time they arrive, it’s already too late. Even with how rarely they interact, they’d sterilize their entire solar system before the light or matter from the blast ever arrived. At the moment of a supernova’s ignition, the fate of death is sealed by the stealthiest killer of all: the elusive neutrino.

Follow me on Twitter. Check out my website or some of my other work here.

Ethan Siegel

I am a Ph.D. astrophysicist, author, and science communicator, who professes physics and astronomy at various colleges. I have won numerous awards for science writing since 2008 for my blog, Starts With A Bang, including the award for best science blog by the Institute of Physics. My two books, Treknology: The Science of Star Trek from Tricorders to Warp Drive and Beyond the Galaxy: How humanity looked beyond our Milky Way and discovered the entire Universe, are available for purchase at Amazon. Follow me on Twitter @startswithabang.

Source: This Is How We’d All Die Instantly If The Sun Suddenly Went Supernova


In 2020 Climate Science Needs To Hit The Reset Button, Part One

In a remarkable essay last week titled “We’re Getting a Clearer Picture of the Climate Future — and It’s Not as Bad as It Once Looked,” David Wallace-Wells of New York Magazine wrote, “the climate news might be better than you thought. It’s certainly better than I’ve thought.” The essay was remarkable because Wallace-Wells, a self-described “alarmist,” is also the author of The Uninhabitable Earth, which describes an apocalyptic vision of the future, dominated by “elements of climate chaos.”

According to Wallace-Wells, his new-found optimism was the result of learning that much discussion of climate change is based on extreme but implausible scenarios of the future where the world burns massive amounts of coal. The implausibility of such scenarios is underscored by more recent assessments of global energy system trajectories of the International Energy Agency and United Nations, which suggest that carbon dioxide emissions from the burning of fossil fuels will be relatively flat over the next several decades, even before aggressive climate policies are implemented.

Scenarios of the future have long sat at the center of discussions of climate science, impacts and adaptation and mitigation policies. Scenario planning has a long history and can be traced to the RAND Corporation during World War 2 and, later (ironically enough) Shell, a fossil fuel company. Scenarios are not intended to be forecasts of the future, but rather to serve as an alternative to forecasting. Scenarios provide a description of possible futures contingent upon various factors, only some of which might be under the control of decision makers.

The climate community got off track by forgetting the distinction between using scenarios as an exploratory tool for developing and evaluating policy options, and using scenarios as forecasts of where the world is headed. The scenario (or more precisely, the set of scenarios) that the climate community settled on as a baseline future for projecting future climate impacts and evaluating policy options biases how we think about climate impacts and policy responses. The point is not that climate analysts should have chosen a more realistic future as a baseline expectation, but rather, they should never have chosen a particular subset of futures for such a baseline.

The desire to predict the future is perfectly understandable. In climate science, scenarios were transformed from alternative visions of possible futures to a subset of predicted futures through the invention of a concept called “business as usual.”

The Intergovernmental Panel on Climate Change explains that “business as usual” is “synonymous” with concepts such as “baseline scenario” or “reference scenario” or “no-policy scenario.” The IPCC used the concept of “business as usual” (and its equivalents) in the 1990s, then explicitly rejected it in the 2000s. It has returned with a vengeance in the 2010s. A reset is needed for the 2020s.

According to the IPCC, a “baseline” scenario refers to “the state against which change is measured” and for climate impacts and policy, is “based on the assumption that no mitigation policies or measures will be implemented beyond those that are already in force and/or are legislated or planned to be adopted.” The use of such a baseline is far more important for research on climate impacts and policy than it is for most research on the physical science of climate, as the latter need not necessarily be tied to socio-economic scenarios.

The IPCC warns, quite appropriately, “Baseline scenarios are not intended to be predictions of the future, but rather counterfactual constructions that can serve to highlight the level of emissions that would occur without further policy effort. Typically, baseline scenarios are then compared to mitigation scenarios that are constructed to meet different goals for greenhouse gas (GHG) emissions, atmosphereic (sic) concentrations, or temperature change.” Cost-benefit and effectiveness analyses in particular lend themselves to using a fixed baseline against which to evaluate an alternative, creating an incentive for the misuse of scenarios as predictions.

The IPCC warns against treating scenarios as predictions because they reach far into the future (for instance, to 2100 and even beyond), and “the idea of business-as-usual in century-long socioeconomic projections is hard to fathom.” Humility in socio-economic prediction is also warranted because our collective track record in anticipating the future, especially when it comes to energy, is really quite poor.

It may seem confusing that the IPCC recommends using baseline scenarios as a reference point for evaluating counterfactual futures while, in parallel, warning against using those reference scenarios as forecasts. The way for analysts to reconcile these two perspectives is to consider in their research a very wide range of counterfactual futures as baselines.

The instant an analyst decides that one particular scenario or a subset of scenarios is more likely than others, and then designates that subset of possible futures as a baseline or “business as usual,” then that analyst has started crossing the bridge to predicting the future. When a single scenario is chosen as a baseline, that bridge has been crossed.

There is of course generally nothing wrong with predicting the future as a basis for decision making. Indeed, a decision is a form of prediction about the future. However, in some contexts we may wish to rely more on decision making that is robust to ignorance and uncertainties (and thus less on forecasts), that might lead to desired outcomes across all scenarios of the future. For instance, if you build a house high on a bluff above a floodplain, you need not worry about flood predictions. In other settings, we may wish to optimize decisions based on a specific forecast of the future, such as evacuation before an advancing storm.

Climate science – and by that I mean, broadly, research on physical science, impacts and economics, as well as policy-related research into adaptation and mitigation – went off track when large parts of the community and leading assessment bodies like the IPCC decided to anoint a subset of futures (and one in particular) as the baseline against which impacts and policy would be evaluated.

This is best illustrated by a detailed example.

The U.S. National Climate Assessment (NCA) is a periodic report on climate science and policy required by law. The most recent report was published in two parts, in 2017 and 2018. Those reports were centered on anointing a specific scenario of the future as “business as usual” (despite the NCA itself warning against doing exactly that). That scenario has a technical name: Representative Concentration Pathway (RCP) 8.5.

In his climate epiphany, David Wallace-Wells warned, “anyone, including me, who has built their understanding on what level of warming is likely this century on that RCP8.5 scenario should probably revise that understanding in a less alarmist direction.” The climate science community, broadly conceived, is among those needing to revise their understandings.

To illustrate how the USNCA came to be centered on RCP8.5, let’s take a quick deep dive into how the report was created. Its use of scenarios was grounded in research done by the U.S. Environmental Protection Agency (EPA), specifically a project called Climate Change Impacts and Risk Analysis. That project is described in two reports.

The first report, in 2015, explained that its methodology was based on two scenarios, a “business as usual” or “reference” scenario that projected where the world was heading in the absence of climate policies and a “mitigation” scenario representing a future with emissions reductions. In that report EPA created its own scenarios (with its BAU scenario equated to an equivalent RCP8.6 scenario). The report explained that the benefits of mitigation policy were defined by the difference between the BAU scenario and the mitigation scenario.

In its subsequent report, in 2017, EPA replaced its own scenarios with several of the RCP scenarios used by the IPCC. In that report it dropped the phrase “business as usual” and adopted RCP8.5 as the “baseline” scenario filling that role. It adopted another scenario, RCP4.5, as representing a world with mitigation policy. The USNCA relied heavily on the results of this research, along with other work using RCP8.5 as a “baseline.”

The USNCA defined the difference in impacts between the two RCP scenarios as representing the benefits to the United States of mitigation policy: “Comparing outcomes under RCP8.5 with those of RCP4.5 (and RCP2.6 in some cases) not only captures a range of uncertainties and plausible futures but also provides information about the potential benefits of mitigation.” But such a comparison was warned against by the creators of the RCP scenarios: “RCP8.5 cannot be used as a no-climate-policy reference scenario for the other RCPs.” Yet, there it was at the center of the most authoritative climate science report in the United States.

Reports are written by committees, and elsewhere the US NCA warned that RCP8.5 “is not intended to serve as an upper limit on possible emissions nor as a BAU or reference scenario for the other three scenarios.” But that warning was not heeded at all. RCP8.5 is used as a reference scenario throughout the report and is mentioned more than 470 times, representing about 56% of all references to RCP scenarios.

It was the USNCA’s misuse of RCP8.5 that appeared in a page-one New York Times story that warned, “A major scientific report issued by 13 federal agencies on Friday presents the starkest warnings to date of the consequences of climate change for the United States, predicting that if significant steps are not taken to rein in global warming, the damage will knock as much as 10 percent off the size of the American economy by century’s end.”

It is not just the USNCA that has centered its work on RCP8.5 as a reference scenario to evaluate climate impacts and policy. The 2019 IPCC report on oceans and ice also adopted RCP8.5 as a reference scenario to compare with RCP2.6 as a mitigation scenario: “Under unmitigated emissions (RCP8.5), coastal societies, especially poorer, rural and small islands societies, will struggle to maintain their livelihoods and settlements during the 21st century.” That report referenced RCP8.5 more than 580 times, representing more than 56% of all scenario references in the report.

Across the IPCC 5th assessment report, published in 2013 and 2014, RCP8.5 comprised 34% of scenario references; dependence on RCP8.5 has therefore increased across the IPCC’s reports. And as an indication of where research may be heading, among the abstracts of talks given at the 2019 meeting of the American Geophysical Union earlier this month that mentioned RCP scenarios, 58% mentioned RCP8.5, with RCP4.5 coming in second at 32%. If these abstracts indicate the substance of future scientific publications, then get ready for an avalanche of RCP8.5 studies.

The climate science community, despite often warning itself to the contrary, has gotten off track when it comes to the use of scenarios in impact and policy research. There can be little doubt that major assessments and a significant portion of the underlying literature have slipped into misusing scenarios as predictions of the future.

Why this has happened will no doubt be the subject of future research, but for the immediate future, the most important need will be for the climate science community to hit the reset button and get back on track. Climate change is too important to do otherwise.

Part two will discuss what this reset might look like.

Follow me on Twitter @RogerPielkeJr

I have been on the faculty of the University of Colorado since 2001, where I teach and write on a diverse range of policy and governance issues related to science, innovation, and sports. I have degrees in mathematics, public policy and political science. My books include The Honest Broker: Making Sense of Science in Policy and Politics published by Cambridge University Press (2007), The Climate Fix: What Scientists and Politicians Won’t Tell You About Global Warming (2010, Basic Books) and The Edge: The War Against Cheating and Corruption in the Cutthroat World of Elite Sports (Roaring Forties Press, 2016). My most recent book is The Rightful Place of Science: Disasters and Climate Change (2nd edition, 2018, Consortium for Science, Policy & Outcomes).

Source: In 2020 Climate Science Needs To Hit The Reset Button, Part One


This Is Why We Don’t Shoot Earth’s Garbage Into The Sun

Imagine our planet as it was for the first 4.55 billion years of its existence. Fires, volcanoes, earthquakes, tsunamis, asteroid strikes, hurricanes and many other natural disasters were ubiquitous, as was biological activity throughout our entire measured history. Most of the environmental changes that occurred were gradual and isolated; only in a few instances — often correlated with mass extinctions — were the changes global, immediate, and catastrophic.

But with the arrival of human beings, Earth’s natural environment has another element to contend with: the changes wrought upon it by our species. For tens of thousands of years, the largest wars were merely regional skirmishes; the largest problems with waste led only to isolated disease outbreaks. But our numbers and technological capabilities have grown, and with them, a waste management problem. You might think a great solution would be to send our worst garbage into the Sun, but we’ll never make it happen. Here’s why.

The very first launch of the Falcon Heavy, on February 6, 2018, was a tremendous success. The rocket reached low-Earth-orbit, deployed its payload successfully, and the main boosters returned to Cape Kennedy, where they landed successfully. The promise of a reusable heavy-lift vehicle is now a reality, and could lower launch costs to ~$1000/pound. Still, even with all these advances, we won't be launching our garbage into the Sun anytime soon.

Jim Watson/AFP/Getty Images


At present, there are a little more than 7 billion humans on the planet, and the previous century saw us at last become a spacefaring civilization, where we’ve broken the gravitational bonds that have kept us shackled to Earth. We’ve extracted valuable and rare minerals and elements, synthesized new chemical compounds, developed nuclear technologies, and produced new technologies that far exceed even the wildest dreams of our distant ancestors.

Although these new technologies have transformed our world and improved our quality of life, there are negative side-effects that have come along for the ride. We now have the capacity to cause widespread damage and destruction to our environment in a variety of ways, from deforestation to atmospheric pollution to ocean acidification and more. With time and care, the Earth will begin self-regulating as soon as we stop exacerbating these problems. But other problems just aren’t going to get better on their own on any reasonable timescale.

Nuclear weapon test Mike (yield 10.4 Mt) on Enewetak Atoll. The test was part of Operation Ivy. Mike was the first hydrogen bomb ever tested. A release of this much energy corresponds to approximately 500 grams of matter being converted into pure energy: an astonishingly large explosion for such a tiny amount of mass. Nuclear reactions involving fission or fusion (or both, as in the case of Ivy Mike) can produce tremendously dangerous, long-term radioactive waste.

National Nuclear Security Administration / Nevada Site Office

Some of what we’ve produced here on Earth isn’t merely a problem to be reckoned with over the short-term, but poses a danger that will not significantly lessen with time. Our most dangerous, long-term pollutants include nuclear by-products and waste, hazardous chemicals and biohazards, and plastics that off-gas and don’t biodegrade; any of these could wreak havoc on a significant fraction of the living beings on Earth if they got into the environment in the wrong way.

You might think that the “worst of the worst” of these offenders should be packed onto a rocket, launched into space, and sent on a collision course with the Sun, where at last they won’t plague Earth anymore. (Yes, that was similar to the plot of Superman IV.) From a physics point of view, it’s possible to do so.

But should we do it? That’s another story entirely, and it begins with considering how gravitation works on Earth and in our Solar System.

The Mercury-bound MESSENGER spacecraft captured several stunning images of Earth during a gravity assist swingby of its home planet on Aug. 2, 2005. Several hundred images, taken with the wide-angle camera in MESSENGER's Mercury Dual Imaging System (MDIS), were sequenced into a movie documenting the view from MESSENGER as it departed Earth. Earth rotates roughly once every 24 hours on its axis and moves through space in an elliptical orbit around our Sun.

NASA / Messenger mission

Human beings evolved on Earth, grew to prominence on this world, and developed extraordinary technologies that our corner of the cosmos had never seen before. We have long dreamed of exploring the Universe beyond our home, but only in the past few decades have we managed to escape the gravitational bonds of Earth. The gravitational pull exerted by our massive planet depends only on our distance from Earth’s center; that mass curves spacetime and causes all objects on or near the planet, including humans, to constantly accelerate “downwards.”

There’s a certain amount of energy keeping any massive object bound to Earth: gravitational potential energy. However, if we impart enough kinetic energy to an object (i.e., move it fast enough), it can cross two important thresholds, both of which are estimated in the short sketch below.

  1. The threshold for a stable orbit, so that the object never falls back to Earth: about 7.9 km/s (17,700 mph).
  2. The threshold for escaping Earth’s gravity entirely: about 11.2 km/s (25,000 mph).
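Both of those thresholds follow directly from Newtonian gravity and Earth's mass and radius; here is the short sketch promised above, using standard values:

```python
import math

GM_earth = 3.986e14   # Earth's gravitational parameter, m^3/s^2
R_earth = 6.371e6     # Earth's mean radius, m

v_circular = math.sqrt(GM_earth / R_earth)      # speed for a low circular orbit
v_escape = math.sqrt(2 * GM_earth / R_earth)    # escape speed from the surface

print(f"circular orbit: {v_circular / 1000:.1f} km/s")   # ~7.9 km/s
print(f"escape:         {v_escape / 1000:.1f} km/s")     # ~11.2 km/s
```

The factor between the two is exactly the square root of 2, which is why the escape threshold is only about 40% higher than the orbital one.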

It takes a speed of 7.9 km/s to achieve "C" (stable orbit), while it takes a speed of 11.2 km/s for "E" to escape Earth's gravity. Speeds less than "C" will fall back to Earth; speeds between "C" and "E" will remain bound to Earth in a stable orbit.

Brian Brondel under a c.c.a.-s.a.-3.0 license

For comparison, a human at the equator of our planet, where Earth’s rotation is maximized, is moving only at about 0.47 km/s (1,000 mph), leading to the conclusion that we’re in no danger of escaping unless there’s some tremendous intervention that changes the situation.

Luckily, we’ve developed just such an intervention: rocketry. To get a rocket into Earth’s orbit, we require at least the amount of energy it would take to accelerate that rocket to the threshold speed mentioned above. Humanity has been doing this since the 1950s, and once we escaped from Earth, we found there was so much more going on at larger scales.

Earth isn’t stationary, but orbits the Sun at approximately 30 km/s (67,000 mph), meaning that even if you escape from Earth, you’ll still find yourself not only gravitationally bound to the Sun, but in a stable elliptical orbit around it.
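Both of the speeds mentioned in this stretch, the ~0.47 km/s equatorial rotation and the ~30 km/s orbital motion, drop straight out of the length of the day and the year; a quick sketch, treating Earth's orbit as circular:

```python
import math

R_equator = 6.378e6     # Earth's equatorial radius, m
sidereal_day = 86164.0  # one rotation of the Earth, s
AU = 1.496e11           # Earth-Sun distance, m
year = 3.156e7          # one orbit of the Sun, s

v_rotation = 2 * math.pi * R_equator / sidereal_day   # speed of the ground at the equator
v_orbit = 2 * math.pi * AU / year                     # Earth's orbital speed (circular approx.)

print(f"equatorial rotation: {v_rotation / 1000:.2f} km/s")   # ~0.47 km/s
print(f"orbital speed:       {v_orbit / 1000:.1f} km/s")      # ~30 km/s
```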

The Dove satellites, launched from the ISS, are designed for Earth imaging and have numbered approximately 300 in total. There are ~130 Dove satellites, created by Planet, that are still in Earth's orbit, but that number will drop to zero by the 2030s due to orbital decay. If these satellites were boosted to escape from Earth's gravity, they would still orbit the Sun unless they were boosted by much greater amounts.

NASA

This is a key point: you might think that, here on Earth, Earth’s gravity is the dominant factor as far as gravitation is concerned. Quite to the contrary, the Sun’s gravitational well at our location is far deeper than Earth’s own: it takes far more energy to escape the Sun from Earth’s distance than to escape Earth itself. The only reason we don’t notice the Sun’s pull is that you, me, and the entire planet Earth are in free-fall with respect to the Sun, and so we’re all accelerated by it at the same relative rate.

If we were in space and managed to escape from Earth’s gravity, we’d still find ourselves moving at approximately 30 km/s with respect to the Sun, and at an approximate distance of 150 million km (93 million miles) from our parent star. If we wanted to escape from the Solar System, we’d have to gain about another 12 km/s of speed to reach escape velocity, something that a few of our spacecraft (Pioneer 10 and 11, Voyager 1 and 2, and New Horizons) have already achieved.
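That "about another 12 km/s" is simply the gap between the Sun's escape speed at Earth's distance and Earth's own orbital speed; a rough check, using the Sun's standard gravitational parameter:

```python
import math

GM_sun = 1.327e20   # Sun's gravitational parameter, m^3/s^2
AU = 1.496e11       # Earth-Sun distance, m

v_orbit = math.sqrt(GM_sun / AU)    # circular orbital speed at 1 AU
v_escape = math.sqrt(2) * v_orbit   # escape speed from the Sun at 1 AU

print(f"orbital speed at 1 AU: {v_orbit / 1000:.1f} km/s")               # ~29.8
print(f"solar escape at 1 AU:  {v_escape / 1000:.1f} km/s")              # ~42.1
print(f"extra speed needed:    {(v_escape - v_orbit) / 1000:.1f} km/s")  # ~12.3
```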

The escape speed from the Sun at Earth's distance is 42 km/s, and we already move at 30 km/s just by orbiting the Sun. Once Voyager 2 flew by Jupiter, which gravitationally 'slingshotted' it, it was destined to leave the Solar System.

Wikimedia Commons user Cmglee

But if we wanted to go in the opposite direction, and launch a spacecraft payload into the Sun, we’d have a big challenge at hand: we’d have to lose enough kinetic energy that a stable elliptical orbit around our Sun would transition to an orbit that came close enough to the Sun to collide with it. There are only two ways to accomplish this:

  1. Bring enough fuel with you so that you can decelerate your payload sufficiently (i.e., have it lose as much of its relative speed with respect to the Sun as possible), and then watch your payload gravitationally free-fall into the Sun.
  2. Configure enough fly-bys with the innermost planets of our Solar System — Earth, Venus and/or Mercury — so that the orbiting payload gets de-boosted (as opposed to the positive boosts that spacecraft like Pioneer, Voyager, and New Horizons received from gravitationally interacting with the outer planets) and eventually comes close enough to the Sun that it gets devoured.

The idea of a gravitational slingshot, or gravity assist, is to have a spacecraft approach a planet orbiting the Sun that it is not bound to. Depending on the orientation of the spacecraft's relative trajectory, it will either receive a speed boost or a de-boost with respect to the Sun, compensated for by the energy lost or gained (respectively) by the planet orbiting the Sun.

Wikimedia Commons user Zeimusu

The first option, in reality, requires so much fuel that it’s practically impossible with current (chemical rocket) technology. If you loaded up a rocket with a massive payload, like you might expect for all the hazardous waste you want to fire into the Sun, you’d have to load it up with a lot of rocket fuel, in orbit, to decelerate it sufficiently so that it’d fall into the Sun. To launch both that payload and the additional fuel requires a rocket that’s larger, more powerful and more massive than any we’ve ever built on Earth by a large margin.

Instead, we can use the gravity assist technique to either add or remove kinetic energy from a payload. If a spacecraft approaches a large mass (like a planet) from behind, flies in front of it, and gets gravitationally slingshotted behind the planet, the spacecraft loses energy while the planet gains energy. If you go the opposite way, approaching the planet from ahead, flying behind it, and getting gravitationally slingshotted back out in front again, your spacecraft gains energy while removing it from the orbiting planet.
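The bookkeeping is easiest to see in an idealized one-dimensional version of a flyby, where the encounter behaves like an elastic bounce off a much heavier moving object: in the planet's rest frame the spacecraft's speed is unchanged and only its direction reverses. This is a toy model rather than a real trajectory, and the numbers below are purely illustrative:

```python
def flyby_1d(v_in, v_planet):
    """Idealized 1-D gravity assist: in the planet's rest frame the speed is
    preserved and the direction flips, so v_out - v_planet = -(v_in - v_planet)."""
    return 2 * v_planet - v_in

# Velocities in km/s along one axis, measured in the Sun's frame.
v_jupiter = 13.1   # Jupiter's approximate orbital speed

# Head-on approach, slung around behind the planet: the spacecraft gains speed,
# like a ball bounced off the front of an oncoming train.
print(flyby_1d(-10.0, v_jupiter))   # -10.0 -> +36.2 km/s: a boost

# Overtaking the planet and swinging around in front of it: the spacecraft slows,
# which is the kind of de-boost needed to drop toward the inner Solar System.
print(flyby_1d(20.0, v_jupiter))    # +20.0 -> +6.2 km/s: a de-boost
```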

The Messenger mission took seven years and a total of six gravity assists and five deep-space maneuvers to reach its final destination: in orbit around the planet Mercury. The Parker Solar Probe will need to do even more to reach its final destination: the corona of the Sun. When it comes to reaching for the inner Solar System, spacecraft are required to lose a lot of energy to make it possible: a difficult task.

NASA/JPL

Over a decade ago, we used this gravitational slingshot method to successfully send an orbiter to rendezvous with, and continuously image, the planet Mercury: the Messenger mission. It enabled us to construct the first all-planet mosaic of our Solar System’s innermost world. More recently, we’ve used the same technique to launch the Parker Solar Probe into a highly elliptical orbit that will take it to within just a few solar radii of the Sun.

A carefully calculated set of future trajectories is all that’s required to reach the Sun, so long as you orient your payload with the correct initial velocity. It’s difficult to do, but not impossible, and the Parker Solar Probe is perhaps the poster child for how we would, from Earth, successfully launch a rocket payload into the Sun.

Keeping all this in mind, you might conclude that it’s technologically feasible to launch our garbage into the Sun — including hazardous waste like poisonous chemicals, biohazards, and even radioactive waste — but it’s something we’ll almost certainly never do.

Why not? There are currently three barriers to the idea:

  1. The possibility of a launch failure. If your payload is radioactive or hazardous and you have an explosion on launch or during a fly-by with Earth, all of that waste will be uncontrollably distributed across Earth.
  2. Energetically, it costs less to shoot your payload out of the Solar System (via a positive gravity assist from planets like Jupiter) than it does to shoot your payload into the Sun, as the rough comparison below shows.
  3. And finally, even if we chose to do it, the cost of sending our garbage into the Sun is prohibitively expensive at present.
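A rough vis-viva comparison makes the second point concrete. Starting from Earth's orbit and ignoring Earth's own gravity for simplicity, dropping a payload's perihelion all the way down to the solar surface takes more than twice the velocity change needed to escape the Solar System outright; this is a simplified two-body sketch, not a mission design:

```python
import math

GM_sun = 1.327e20   # Sun's gravitational parameter, m^3/s^2
AU = 1.496e11       # Earth-Sun distance, m
R_sun = 6.96e8      # solar radius, m

v_earth = math.sqrt(GM_sun / AU)   # ~29.8 km/s: circular orbital speed at 1 AU

# Option A: escape the Solar System from 1 AU (prograde burn).
v_escape = math.sqrt(2 * GM_sun / AU)
dv_escape = v_escape - v_earth

# Option B: fall into the Sun via a transfer orbit with aphelion at 1 AU and
# perihelion at the solar surface (retrograde burn). Vis-viva gives the speed
# the payload must be left with at aphelion.
a_transfer = (AU + R_sun) / 2
v_aphelion = math.sqrt(GM_sun * (2 / AU - 1 / a_transfer))
dv_sundive = v_earth - v_aphelion

print(f"delta-v to escape the Solar System: {dv_escape / 1000:.1f} km/s")   # ~12.3
print(f"delta-v to graze the Sun:           {dv_sundive / 1000:.1f} km/s")  # ~26.9
```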

This time-series photograph of the uncrewed Antares rocket launch in 2014 shows a catastrophic explosion-on-launch, which is an unavoidable possibility for any and all rockets. Even if we could achieve a much improved success rate, the risk of contaminating our planet with hazardous waste is prohibitive for launching our garbage into the Sun (or out of the Solar System) at present.

NASA/Joel Kowsky

The most successful and reliable space launch system of all time is the Soyuz rocket, which has a 97% success rate after more than 1,000 launches. Yet a 2% or 3% failure rate, when applied to a rocket loaded up with all the dangerous waste you want launched off of your planet, leads to the catastrophic possibility of having that waste spread into the oceans, the atmosphere, populated areas, drinking water, etc. This scenario doesn’t end well for humanity; the risk is too high.

Considering that the United States alone is storing about 60,000 tons of high-level nuclear waste, it would take approximately 8,600 Soyuz rockets to remove this waste from the Earth. Even if we could reduce the launch failure rate to an unprecedented 0.1%, it would cost approximately a trillion dollars and, with an estimated 9 launch failures to look forward to, would lead to over 60,000 pounds of hazardous waste being randomly redistributed across the Earth.
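Here is the arithmetic behind those figures; the payload per launch and the price per launch are assumptions chosen purely for illustration (roughly Soyuz-class numbers), not values from the article:

```python
waste_tonnes = 60_000     # U.S. high-level nuclear waste stockpile, from the text
payload_tonnes = 7        # assumed payload per launch (roughly Soyuz-class)
cost_per_launch = 100e6   # assumed price per launch, USD (illustrative)
failure_rate = 0.001      # the hypothetical 0.1% failure rate from the text

launches = waste_tonnes / payload_tonnes        # ~8,600 launches
expected_failures = launches * failure_rate     # ~9 failures
waste_lost_lb = expected_failures * payload_tonnes * 2204.6   # pounds scattered

print(f"{launches:,.0f} launches, ~{expected_failures:.0f} expected failures")
print(f"~${launches * cost_per_launch / 1e9:,.0f} billion total launch cost")
print(f"~{waste_lost_lb:,.0f} lb of waste lost in failed launches")
```

With these round numbers the bill lands in the high hundreds of billions of dollars, and the expected failures scatter on the order of a hundred thousand pounds of waste, in the same ballpark as the figures quoted above.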

Unless we’re willing to pay an unprecedented cost and accept the near-certainty of catastrophic environmental pollution, we have to leave the idea of shooting our garbage into the Sun to the realm of science fiction and future hopeful technologies like space elevators. It’s undeniable that we’ve made quite the mess on planet Earth. Now, it’s up to us to figure out our own way out of it.

Follow me on Twitter. Check out my website or some of my other work here.

I am a Ph.D. astrophysicist, author, and science communicator, who professes physics and astronomy at various colleges. I have won numerous awards for science writing since 2008 for my blog, Starts With A Bang, including the award for best science blog by the Institute of Physics. My two books, Treknology: The Science of Star Trek from Tricorders to Warp Drive and Beyond the Galaxy: How humanity looked beyond our Milky Way and discovered the entire Universe, are available for purchase at Amazon. Follow me on Twitter @startswithabang.

Source: This Is Why We Don’t Shoot Earth’s Garbage Into The Sun


NASA Says Earth Is Greener Today Than 20 Years Ago Thanks To China, India

Greening of China and India

NASA has some good news: the world is a greener place today than it was 20 years ago. What prompted the change? Well, it appears China and India can take the majority of the credit.

In contrast to the perception of China and India’s willingness to overexploit land, water and resources for economic gain, the countries are responsible for the largest greening of the planet in the past two decades. The two most populous countries have implemented ambitious tree planting programs and scaled up their implementation and technology around agriculture.

India continues to break world records in tree planting, with 800,000 Indians planting 50 million trees in just 24 hours.

The recent finding by NASA, published in the journal Nature Sustainability, compared satellite data from the mid-1990s to today using high-resolution imagery. Initially, the researchers were unsure what caused the significant uptick in greening around the planet. It was unclear whether a warming planet, increased carbon dioxide (CO2), or a wetter climate could have caused more plants to grow.

After further investigation of the satellite imagery, the researchers found that greening was disproportionately located in China and India. If the greening was primarily a response from climate change and a warming planet, the increased vegetation shouldn’t be limited to country borders. In addition, higher latitude regions should become greener faster than lower latitudes as permafrost melts and areas like northern Russia become more habitable.

The greening of the planet.

Nature Sustainability

The map above shows the relative greening (increase in vegetation) and browning (decrease in vegetation) around the globe. As you can see, both China and India show significant greening.

The United States sits at number 7 in the total change in vegetation percent by decade. Of course, the chart below can hide where each country started. For example, a country that largely kept its forests and vegetation intact would have little room to increase percent vegetation, whereas a country that heavily relied on deforestation would have more room to grow.

Comparing the greening of various countries around the globe.

NASA.gov

NASA used the Moderate Resolution Imaging Spectroradiometer (MODIS) to get a detailed picture of Earth’s global vegetation through time. The instrument has provided imagery at up to 500-meter resolution for the past two decades.

Both China and India went through phases of large-scale deforestation in the 1970s and 80s, clearing old-growth forests for urban development, farming and agriculture. However, it is clear that when presented with a problem, humans are incredibly adept at finding a solution. When the focus shifted in the 90s to reducing air and soil pollution and combating climate change, the two countries made tremendous shifts in their overall land use.

It is encouraging to see swift and rapid change in governance and land use when presented with a dilemma. It is something that will continue to be a necessary skill in the decades to come.

Follow me on Twitter or LinkedIn. Check out my website.

I am a geologist passionate about sharing Earth’s intricacies with you. I received my PhD from Duke University where I studied the geology and climate of the Amazon. I am the founder of Science Trends, a leading source of science news and analysis on everything from climate change to cancer research. Let’s connect @trevornace

 

Source: NASA Says Earth Is Greener Today Than 20 Years Ago Thanks To China, India
