
A Group of Big Businesses is Backing a Carbon Tax. Could It Be a Solution to Climate Change?

The long list of big companies backing a carbon tax as a solution to climate change grew this week with financial giant J.P. Morgan Chase & Co. endorsing a legislative plan billed as a centrist approach to reducing emissions.

The announcement comes as the Climate Leadership Council (CLC), the organization behind the proposal, redoubles its efforts to promote the plan, first released in 2017, ahead of an expected introduction in Congress, as the conversation around climate solutions heats up in Washington.

The CLC announced new backers—including former Energy Secretary Ernest Moniz and former UN climate chief Christiana Figueres—and released internal poll numbers showing bipartisan voter support for the plan. Supporters now include a broad coalition of companies, from oil giants like ExxonMobil to tech behemoths like Microsoft, major environmental groups like Conservation International, and a range of economists and political leaders.

“The markets can and will do much to address climate change,” David Solomon, CEO of Goldman Sachs, a founding member of the CLC, told TIME in an emailed statement. “But given the magnitude and urgency of this challenge, governments must put a price on the cost of carbon.”

The thinking behind the plan is straightforward. Economists have long argued that a carbon tax, which makes companies pay for what they pollute and gives them an incentive to stem carbon emissions, is the most efficient way to reduce such emissions. But carbon tax proposals have met opposition in the past from across the political spectrum, including from some Democrats, in large part because they increase energy costs. The CLC proposal would give the money collected by the tax back to taxpayers in the form of a quarterly dividend, an effort to make it more politically palatable.
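The fee-and-dividend arithmetic is simple enough to sketch. The following is an illustrative calculation only; the $40-per-ton rate, the emissions total, and the population figure are round placeholder numbers, not details of the CLC plan.

```python
# Illustrative fee-and-dividend arithmetic. All numbers here are round
# placeholders for illustration, not figures from the CLC proposal.

def quarterly_dividend(price_per_ton, annual_emissions_tons, population):
    """Per-person quarterly payout if all carbon-fee revenue is rebated."""
    annual_revenue = price_per_ton * annual_emissions_tons
    return annual_revenue / population / 4  # four equal quarterly checks

# e.g. a $40/ton fee on ~5 billion tons of CO2, rebated to ~330 million people
payout = quarterly_dividend(40, 5e9, 330e6)  # roughly $150 per person per quarter
```

The design choice being illustrated is revenue neutrality: the government keeps none of the money, so a higher carbon price means a larger dividend check, not a larger budget.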

On Feb. 13, the CLC provided additional details about the plan, including introducing a new mechanism that would rapidly increase the price on carbon if targets are not met. Backers say the plan will cut U.S. emissions in half by 2035. “We think it has a compelling economic logic,” says Janet Yellen, the former chair of the Federal Reserve and a backer of the plan, in an interview.

But despite the growing coalition, actually passing the plan remains an uphill battle. While more and more Republicans have stopped denying the science of climate change, many continue to insist that they would never support anything resembling a carbon tax. Meanwhile, many leading Democrats, including presidential candidate Senator Bernie Sanders of Vermont, have downplayed the role a carbon tax might play in future climate legislation, arguing that the time has passed for such market-driven approaches, that they are too little, too late, and that a corporate-backed plan shouldn’t be trusted.

Still, big corporations increasingly see a carbon tax—especially a proposal like the CLC plan—as the simplest solution to a thorny problem. With clear science, activists in the streets and voters experiencing extreme weather events in their own backyards, business leaders see new climate rules as all but an inevitability, if not at the U.S. federal level then in states or other countries where they have operations.

The CLC proposal offers a business-friendly approach: it would nix many existing climate regulations, impose a “border carbon adjustment” creating a fee on imports from countries without a carbon price, and pay the revenue collected by the carbon tax back to taxpayers as a dividend. “If we do one without the other,” says Shailesh Jejurikar, CEO of Procter & Gamble’s Fabric & Home Care division, “it doesn’t work.”

Still, even as more than a dozen Fortune 500 firms support the legislation, many other businesses and influential business groups either continue to oppose a carbon tax or haven’t taken a position at all. That’s particularly true of the fossil fuel industry’s trade groups, like the American Petroleum Institute, which officially has no position. Even though major oil companies like ExxonMobil and Shell have joined the CLC initiative, independent oil companies, oil refiners and other related companies remain largely opposed.

One of the biggest challenges to this measure—or any carbon tax for that matter—is the growing interest in other approaches to climate legislation. Republicans this week pushed legislation to plant trees and expand tax incentives for capturing carbon, measures that wouldn’t match the scale of the challenge but allow Republicans to offer a different message on the issue.

Earlier this month, Representative David McKinley, a Republican from West Virginia, and Kurt Schrader, an Oregon Democrat, called for legislation that would lead to an 80% reduction in emissions from the power sector by 2050 using a combination of regulation and funding for innovation and infrastructure. And more than 30 Democratic senators introduced a bill to require the Environmental Protection Agency to come up with a plan for the U.S. to eliminate its carbon footprint by 2050. “This is the quickest way we can jumpstart government-wide climate action,” Senator Tom Carper of Delaware, who introduced the legislation, said on the Senate floor.


None of these measures are likely to become law anytime soon, and any legislative approach to addressing climate change will involve intense debate on Capitol Hill.

Even some backers of the carefully crafted CLC plan acknowledge it’s not likely to pass in its current form. “Inevitably, Congress will have some of its own ideas in terms of the implementation,” Moniz, who endorsed the CLC proposal this week, tells TIME. “I would welcome seeing that negotiation start in earnest.” Indeed, even having a discussion in Congress indicates a new climate for climate in Washington.

By Justin Worland February 13, 2020

Source: A Group of Big Businesses is Backing a Carbon Tax. Could It Be a Solution to Climate Change?

A revenue-neutral carbon tax would automatically encourage consumers and producers to shift toward energy sources that emit less carbon. Carbon taxes are economically efficient because they make people pay for the costs they create. And a revenue-neutral carbon tax would keep the government from using new revenue to subsidize other programs. For more information, visit the PolicyEd page here: https://www.policyed.org/intellection….

Additional resources:

- “Why We Support a Revenue-Neutral Carbon Tax” by George P. Shultz and Gary S. Becker: https://hvr.co/2uMzTTl
- “A Conservative Answer to Climate Change” by George P. Shultz and James A. Baker III, on how a carbon tax would free up private firms to find the most efficient ways to cut emissions: https://on.wsj.com/2loUAhM
- “There Is One Climate Solution That’s Best for the Environment – and for Business” by George P. Shultz and Lawrence H. Summers: https://wapo.st/2JRoLJv
- George P. Shultz, James A. Baker III, and Henry Paulson discuss “Is There Deal Space for Carbon Pricing in 2017?”: https://hvr.co/2NHPF90
- George Shultz joins The World Today to explain why he supports a carbon tax: https://ab.co/2ObRBYN
- John Cochrane discusses the Shultz–Baker op-ed “A Conservative Answer to Climate Change”: https://bit.ly/2LMPdpF
- “Let the Carbon-Dividends Debate Begin” by George P. Shultz and Ted Halstead: https://bit.ly/2O95UNH

Visit https://www.policyed.org/ to learn more.


With Second Warmest November, 2019 is Likely to Be Second Warmest Year Ever Recorded


Greta Thunberg may have been named TIME’s Person of the Year for drawing global attention to climate change, but the climate continues to speak for itself. Last month was the second-hottest November in recorded history, and 2019 is likely to be the second-warmest year ever.

Scientists at the National Oceanic and Atmospheric Administration announced Monday that last month was 1.66 degrees Fahrenheit above the 20th century average, making it the second hottest November since record-keeping began 140 years ago.

And there’s more bad news: 2019 through November has been the second-hottest year on record, and the season (September through November) has been the second-hottest in recorded history. Both the season and the year to date were 1.69 degrees Fahrenheit above average, coming in just behind 2016 and 2015, respectively, and the average sea surface temperature was the second-warmest for the year to date.

Scientists say that record temperatures are yet another sign that the climate is changing, but they’re even more troubling when you look at other recent records. For instance, the five hottest Novembers have all taken place since 2013. In some regions, this was the hottest November in history; Africa, South America and the Hawaiian Islands all experienced their hottest Novembers on record.

Ahira Sanchez-Lugo, a NOAA climatologist, said that there is an 85% chance that 2019 will be the second-warmest year on record. This year was warm, in part because there was an El Niño climate phenomenon, which causes temperatures to rise. However, Sanchez-Lugo says that climate change makes this effect even more extreme.

She explained that while rising temperatures due to climate change are like riding an escalator — slowly but steadily increasing — an El Niño is “as if you’re jumping on the escalator.”

Sanchez-Lugo says that these reports are like a health assessment for the Earth, and that there are some warning signs. “We’re seeing that the Earth has a temperature, but not only that, we see that there are symptoms,” says Sanchez-Lugo.

High temperatures can also cause a domino effect on the environment. For instance, sea ice coverage reached near-record lows in the Arctic and Antarctic this November. Without sea ice covering its surface, the ocean absorbs solar radiation and becomes warmer, and some research suggests that receding sea ice can also lead to higher snowfall, says Sanchez-Lugo.

Many record temperatures were set in 2019. This November follows the second-hottest October on record, and the month before that tied the warmest September on record. And during July — the hottest month ever recorded globally — regions from the United States to Europe were plagued by oppressive heatwaves.

By Tara Law

Source: With Second Warmest November, 2019 is Likely to Be Second Warmest Year Ever Recorded

Scientists are warning that a likely El Niño event coupled with climate change could make 2019 the hottest year on record. Samantha Stevenson, a climate scientist and co-author of a study on the impact of El Niño, joined CBSN to discuss the effects of warming temperatures.

Mystery Sounds From Storms Could Help Predict Tornadoes

Mysterious rumbles that herald tornadoes could one day be used to predict when and where they will strike, according to researchers.

Storms emit sounds before tornadoes form, but the signals, at less than 20 Hz, are below the limit of human hearing. What causes these rumbles has also been a conundrum.

Now researchers said they have narrowed down the reasons for the sounds – an important factor in harnessing the knowledge to improve warnings.

“The three possibilities are core oscillations [in the tornado], pressure relaxation, and latent heat effects,” said Dr Brian Elbing, of Oklahoma State University, who is part of the team behind the research. “They are all possibilities because what we have seen is that the signal occurs before the tornado touches the ground, continues after it touches the ground, and then disappears some time after the tornado leaves the ground.”

The latest work was presented at the annual meeting of the American Physical Society’s Division of Fluid Dynamics in Seattle.

The low-frequency sound produced by tornadoes has been known about for several decades, but Elbing said a big problem has been a lack of understanding of what causes the sounds, and difficulties in unpicking them from a tornado and other aspects of the weather.

The subject has seen renewed interest in recent years, with Elbing saying it could prove particularly useful for hilly areas such as Dixie Alley, which stretches from Texas to North Carolina. “Infrasound doesn’t need line of sight like radar, so there is hope that this could significantly improve warnings in Dixie Alley where most deaths [from tornadoes] occur,” he said.

The team’s setup involves a microphone capable of picking up low-frequency sounds sealed inside a dome which has four openings at right angles to each other, each of which is attached to a hose. Three of these domes are arranged in an equilateral triangle, 60 metres away from each other.

The team say the setup allows them to filter out sounds from normal wind and work out which direction the twister is travelling, while the signal itself offers an idea of the tornado’s size: a frequency of 1 Hz indicates a very large tornado, while a frequency of 10 Hz indicates a small one.
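The direction-finding step described above can be sketched as a plane-wave fit to arrival-time differences across the three sensors. This is a hypothetical illustration of the general triangulation technique, not the researchers’ actual processing code; the coordinates, the nominal 343 m/s sound speed, and the `bearing_from_delays` helper are all assumptions.

```python
import math

# Hypothetical sketch of plane-wave bearing estimation from a three-microphone
# infrasound array. The coordinates, nominal sound speed, and the helper name
# bearing_from_delays are illustrative assumptions, not the researchers' code.

SPEED_OF_SOUND = 343.0  # m/s, nominal value for air

def bearing_from_delays(positions, arrival_times):
    """Fit a plane wave (r_i - r_0) . s = t_i - t_0 for the slowness vector s,
    then return the direction the wave arrived FROM, in degrees."""
    (x0, y0), (x1, y1), (x2, y2) = positions
    a11, a12, b1 = x1 - x0, y1 - y0, arrival_times[1] - arrival_times[0]
    a21, a22, b2 = x2 - x0, y2 - y0, arrival_times[2] - arrival_times[0]
    det = a11 * a22 - a12 * a21        # non-zero for a non-collinear array
    sx = (b1 * a22 - b2 * a12) / det   # Cramer's rule on the 2x2 system
    sy = (a11 * b2 - a21 * b1) / det
    # s points along the propagation direction; the source lies the other way.
    return math.degrees(math.atan2(-sy, -sx)) % 360

# Equilateral triangle with 60 m sides, matching the layout described above
mics = [(0.0, 0.0), (60.0, 0.0), (30.0, 30.0 * math.sqrt(3))]

# A wave from due east travels westward, so each mic hears it x/c early:
east_wave = [0.0, -60.0 / SPEED_OF_SOUND, -30.0 / SPEED_OF_SOUND]
bearing = bearing_from_delays(mics, east_wave)  # 0 degrees: due east
```

A real deployment would, of course, have to contend with noisy time picks and wind filtering before any such fit; this only shows why three non-collinear sensors are the minimum needed to recover a direction.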

In their latest work, Elbing and colleagues reported a case in Oklahoma in which they were able to pick up audio clues eight minutes before the twister formed, with a clear signal detected four minutes before it hit the ground. That, they say, is important as the tornado was not picked up by radar.

“There is evidence that the amount of lead time before the tornado is dependent on how large the tornado is,” said Elbing, adding that low-frequency sounds have been detected up to two hours before a tornado forms. “This tornado we detected was very small, there was no warning issued for this tornado … which is why even a four-minute warning is a big deal.”

While the Oklahoma tornado was only 12 miles from the setup, Elbing said once the sound signal was better understood, the technique could be used over even greater distances.

“If we know the acoustic signature of a tornado, it is realistic to expect to detect a tornado from over 100 miles,” he said.

Dr Harold Brooks, a tornado expert at the US National Oceanic and Atmospheric Administration who was not involved in the work, said many questions needed to be answered before the approach could be harnessed, including whether all tornadoes make such sounds, whether such sounds can be made from other storms, and how accurate the approach is.

“No system will be perfect so there will be errors of missed events and false alarms,” said Brooks. It is also unclear, he added, how many microphone arrays would be needed to offer good coverage: because the approach is based on sound waves rather than light waves, each system can examine a far smaller area in a given time than radar can.

“At this point it is a really intriguing thing, but there is a lot more work that needs to be done in terms of a relatively large scale experiment to actually test it,” he said.

By: @NicolaKSDavis

Source: Mystery sounds from storms could help predict tornadoes

Are your kids wondering: “Why are tornadoes so hard to predict?” This question came from Hai Ming, a 2nd grader from the US.

This Is How We’d All Die Instantly If The Sun Suddenly Went Supernova

As far as raw explosive power goes, no other cataclysm in the Universe is both as common and as destructive as a core-collapse supernova. In one brief event lasting only seconds, a runaway reaction causes a star to give off as much energy as our Sun will emit over its entire 10-12 billion year lifetime. While many supernovae have been observed both historically and since the invention of the telescope, humanity has never witnessed one up close.

Recently, the nearby red supergiant star, Betelgeuse, has started exhibiting interesting signs of dimming, leading some to suspect that it might be on the verge of going supernova. While our Sun isn’t massive enough to experience that same fate, it’s a fun and macabre thought experiment to imagine what would happen if it did. Yes, we’d all die in short order, but not from either the blast wave or from radiation. Instead, the neutrinos would get us first. Here’s how.

An animation sequence of the 17th century supernova in the constellation of Cassiopeia. This explosion, despite occurring in the Milky Way and about 60-70 years after 1604, could not be seen with the naked eye due to the intervening dust. Surrounding material plus continued emission of EM radiation both play a role in the remnant's continued illumination. A supernova is the typical fate for a star greater than about 10 solar masses, although there are some exceptions.

NASA, ESA, and the Hubble Heritage (STScI/AURA)-ESA/Hubble Collaboration. Acknowledgement: Robert A. Fesen (Dartmouth College, USA) and James Long (ESA/Hubble)

A supernova — specifically, a core-collapse supernova — can only occur when a star many times more massive than our Sun runs out of nuclear fuel to burn in its core. All stars start off doing what our Sun does: fusing the most common element in the Universe, hydrogen, into helium through a series of chain reactions. During this part of a star’s life, it’s the radiation pressure from these nuclear fusion reactions that prevents the star’s interior from collapsing under the enormous force of gravitation.

So what happens, then, when the star burns through all the hydrogen in its core? The radiation pressure drops and gravity starts to win in this titanic struggle, causing the core to contract. As it contracts, it heats up, and if the temperature can pass a certain critical threshold, the star will start fusing the next-lightest element in line, helium, to produce carbon.

This cutaway showcases the various regions of the surface and interior of the Sun, including the core, which is where nuclear fusion occurs. As time goes on, the helium-containing region in the core expands and the maximum temperature increases, causing the Sun's energy output to increase. When our Sun runs out of hydrogen fuel in the core, it will contract and heat up to a sufficient degree that helium fusion can begin.

Wikimedia Commons user Kelvinsong

This will occur in our own Sun some 5-to-7 billion years in the future, causing it to swell into a red giant. Our parent star will expand so much that Mercury, Venus, and possibly even Earth will be engulfed, but let’s instead imagine that we come up with some clever plan to migrate our planet to a safe orbit, while mitigating the increased luminosity to prevent our planet from getting fried. This helium burning will last for hundreds of millions of years before our Sun runs out of helium and the core contracts and heats up once again.

For our Sun, that’s the end of the line, as we don’t have enough mass to move to the next stage and begin carbon fusion. In a star far more massive than our Sun, however, hydrogen-burning only takes millions of years to complete, and the helium-burning phase lasts merely hundreds of thousands of years. After that, the core’s contraction will enable carbon fusion to proceed, and things will move very quickly after that.

As it nears the end of its evolution, heavy elements produced by nuclear fusion inside the star are concentrated toward the center of the star. When the star explodes, the vast majority of the outer layers absorb neutrons rapidly, climbing the periodic table, and also get expelled back into the Universe where they participate in the next generation of star and planet formation.

NASA / CXC / S. Lee

Carbon fusion can produce elements such as oxygen, neon, and magnesium, but only takes hundreds of years to complete. When carbon becomes scarce in the core, it again contracts and heats up, leading to neon fusion (which lasts about a year), followed by oxygen fusion (lasting for a few months), and then silicon fusion (which lasts less than a day). In that final phase of silicon-burning, core temperatures can reach ~3 billion K, some 200 times the hottest temperatures currently found at the center of the Sun.

And then the critical moment occurs: the core runs out of silicon. Again, the pressure drops, but this time there’s nowhere to go. The elements produced by silicon fusion — elements like cobalt, nickel and iron — are more stable than the heavier elements they’d conceivably fuse into. With nothing left capable of resisting gravitational collapse, the core implodes.

Artist's illustration (left) of the interior of a massive star in the final stages, pre-supernova,... [+] of silicon-burning. (Silicon-burning is where iron, nickel, and cobalt form in the core.) A Chandra image (right) of the Cassiopeia A supernova remnant today shows elements like Iron (in blue), sulphur (green), and magnesium (red). We do not know whether all core-collapse supernovae follow the same pathway or not.

NASA/CXC/M.Weiss; X-ray: NASA/CXC/GSFC/U.Hwang & J.Laming

This is where the core-collapse supernova happens. A runaway fusion reaction occurs, producing what’s basically one giant atomic nucleus made of neutrons in the star’s core, while the outer layers have a tremendous amount of energy injected into them. The fusion reaction itself lasts for only around 10 seconds, liberating about 10⁴⁴ joules of energy, or the mass-equivalent (via Einstein’s E = mc²) of about 10²⁷ kg: as much as you’d release by transforming two Saturns into pure energy.
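The "two Saturns" figure is easy to verify with E = mc². A quick sanity check, with rounded constants:

```python
# Sanity-checking the figures above: convert ~1e44 J into a rest-mass
# equivalent via E = mc^2 and compare with the mass of Saturn.

C = 2.998e8           # speed of light, m/s
E_SUPERNOVA = 1e44    # energy liberated in the ~10-second runaway reaction, J
M_SATURN = 5.68e26    # mass of Saturn, kg

mass_equivalent = E_SUPERNOVA / C**2   # ~1.1e27 kg
saturns = mass_equivalent / M_SATURN   # ~2, matching "two Saturns"
```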

That energy goes into a mix of radiation (photons), the kinetic energy of the material in the now-exploding stellar material, and neutrinos. All three of these are more than capable of ending any life that’s managed to survive on an orbiting planet up to that point, but the big question of how we’d all die if the Sun went supernova depends on the answer to one question: who gets there first?

The anatomy of a very massive star throughout its life, culminating in a Type II supernova when the core runs out of nuclear fuel. The final stage of fusion is typically silicon-burning, producing iron and iron-like elements in the core for only a brief while before a supernova ensues. Many of the supernova remnants will lead to the formation of neutron stars, which can produce the greatest abundances of the heaviest elements of all by colliding and merging.

Nicole Rager Fuller/NSF

When the runaway fusion reaction occurs, the only delay in the light getting out comes from the fact that it’s produced in the core of this star, and the core is surrounded by the star’s outer layers. It takes a finite amount of time for that signal to propagate to the outermost surface of the star — the photosphere — where it’s then free to travel in a straight line at the speed of light.

As soon as it gets out, the radiation will scorch everything in its path, blowing the atmosphere (and any remaining ocean) clean off of the star-facing side of an Earth-like planet immediately, while the night side would last for seconds-to-minutes longer. The blast wave of the matter would follow soon afterwards, engulfing the remnants of our scorched world and quite possibly, dependent on the specifics of the explosion, destroying the planet entirely.


But any living creature would surely die even before the light or the blast wave from the supernova arrived; they’d never see their demise coming. Instead, the neutrinos — which interact with matter so rarely that an entire star, to them, functions like a pane of glass does to visible light — simply speed away omnidirectionally, from the moment of their creation, at speeds indistinguishable from the speed of light.

Moreover, neutrinos carry an enormous fraction of a supernova’s energy away: approximately 99% of it. Even with our paltry Sun emitting just ~4 × 10²⁶ joules of energy each second, approximately 70 trillion (7 × 10¹³) neutrinos pass through your hand every second. The probability that any one of them will interact is tiny, but when one occasionally does, it deposits the energy it carries into your body. Only a few neutrinos actually do this over the course of a typical day with our current Sun, but if it went supernova, the story would change dramatically.

A neutrino event, identifiable by the rings of Cherenkov radiation that show up along the photomultiplier tubes lining the detector walls, showcases the successful methodology of neutrino astronomy. This image shows multiple events, and is part of the suite of experiments paving our way to a greater understanding of neutrinos. The neutrinos detected in 1987 marked the dawn of both neutrino astronomy and multi-messenger astronomy.

Super Kamiokande collaboration

When a supernova occurs, the neutrino flux increases by approximately a factor of 10 quadrillion (10¹⁶), while the energy per neutrino goes up by around a factor of 10, increasing the probability of a neutrino interacting with your body tremendously. When you work through the math, you’ll find that even with their extraordinarily low probability of interaction, any living creature — from a single-celled organism to a complex human being — would be boiled from the inside out by neutrino interactions alone.
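A back-of-envelope version of that scaling: the ~7 × 10¹³ neutrinos per second and the 10¹⁶-fold flux jump come from the text, while treating the 10× higher energy per neutrino as roughly a 10× interaction probability per neutrino is an assumption for illustration (neutrino cross sections grow roughly with energy in this regime).

```python
# Back-of-envelope scaling for the paragraph above. The flux numbers are from
# the text; the 10x per-neutrino interaction boost is an illustrative
# assumption based on cross sections growing roughly with energy.

quiet_flux = 7e13           # neutrinos/s through a hand, quiescent Sun
flux_boost = 1e16           # supernova flux vs. quiescent Sun
per_neutrino_boost = 10     # assumed, from the 10x higher energy per neutrino

supernova_flux = quiet_flux * flux_boost                   # ~7e29 per second
interaction_rate_boost = flux_boost * per_neutrino_boost   # ~1e17
```

With only a handful of interactions per day normally, a ~10¹⁷-fold jump in interaction rate is what turns a harmless background into a lethal dose.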

This is the scariest outcome imaginable, because you’d never see it coming. In 1987, we observed a supernova from 168,000 light-years away with both light and neutrinos. The neutrinos arrived at three different detectors across the world, spanning about 10 seconds from the earliest to the latest. The light from the supernova, however, didn’t begin arriving until hours later. By the time the first visual signatures arrived, everything on Earth would have already been vaporized for hours.

A supernova explosion enriches the surrounding interstellar medium with heavy elements. The outer rings are caused by previous ejecta, long before the final explosion. This explosion also emitted a huge variety of neutrinos, some of which made it all the way to Earth.

ESO / L. Calçada

Perhaps the scariest part of neutrinos is how there’s no good way to shield yourself from them. Even if you tried to block their path to you with lead, or a planet, or even a neutron star, more than 50% of the neutrinos would still get through. According to some estimates, not only would all life on an Earth-like planet be destroyed by neutrinos, but any life anywhere in a comparable solar system would meet that same fate, even out at the distance of Pluto, before the first light from the supernova ever arrived.


The only early detection system we’d ever be able to install to know something was coming is a sufficiently sensitive neutrino detector, which could detect the unique, surefire signatures of neutrinos generated from each of carbon, neon, oxygen, and silicon burning. We would know when each of these transitions happened, giving life a few hours to say their final goodbyes during the silicon-burning phase before the supernova occurred.

There are many natural neutrino signatures produced by stars and other processes in the Universe. Every set of neutrinos produced by a different fusion process inside a star will have a different spectral energy signature, enabling astronomers to determine whether their parent star is fusing carbon, oxygen, neon, and silicon in its interior, or not.

IceCube collaboration / NSF / University of Wisconsin

It’s horrifying to think that an event as fascinating and destructive as a supernova, despite all the spectacular effects it produces, would kill anything nearby before a single perceptible signal arrived, but that’s absolutely the case with neutrinos. Produced in the core of a supernova and carrying away 99% of its energy, they would deliver a lethal dose to all life on an Earth-like planet, with every location on the planet receiving it within about 1/20th of a second of every other. No amount of shielding, even being on the opposite side of the planet from the supernova, would help at all.

Whenever any star goes supernova, neutrinos are the first signal that can be detected from them, but by the time they arrive, it’s already too late. Even with how rarely they interact, they’d sterilize their entire solar system before the light or matter from the blast ever arrived. At the moment of a supernova’s ignition, the fate of death is sealed by the stealthiest killer of all: the elusive neutrino.


Ethan Siegel

I am a Ph.D. astrophysicist, author, and science communicator, who teaches physics and astronomy at various colleges. I have won numerous awards for science writing since 2008 for my blog, Starts With A Bang, including the award for best science blog by the Institute of Physics. My two books, Treknology: The Science of Star Trek from Tricorders to Warp Drive and Beyond the Galaxy: How Humanity Looked Beyond Our Milky Way and Discovered the Entire Universe, are available at Amazon. Follow me on Twitter @startswithabang.

Source: This Is How We’d All Die Instantly If The Sun Suddenly Went Supernova

Our Sun would never undergo a supernova explosion. But what if it did? Video clips from NASA’s Goddard Space Flight Center and ESA/Hubble. Images by ESA/NASA and pixabay.com. Music: “Olympus” by Ross Budgen, licensed under the CC BY 4.0 International License.


In 2020 Climate Science Needs To Hit The Reset Button, Part One

In a remarkable essay last week titled “We’re Getting a Clearer Picture of the Climate Future — and It’s Not as Bad as It Once Looked,” David Wallace-Wells of New York Magazine wrote, “the climate news might be better than you thought. It’s certainly better than I’ve thought.” The essay was remarkable because Wallace-Wells, a self-described “alarmist,” is also the author of The Uninhabitable Earth, which describes an apocalyptic vision of the future, dominated by “elements of climate chaos.”

According to Wallace-Wells, his new-found optimism was the result of learning that much discussion of climate change is based on extreme but implausible scenarios of the future where the world burns massive amounts of coal. The implausibility of such scenarios is underscored by more recent assessments of global energy system trajectories of the International Energy Agency and United Nations, which suggest that carbon dioxide emissions from the burning of fossil fuels will be relatively flat over the next several decades, even before aggressive climate policies are implemented.

Scenarios of the future have long sat at the center of discussions of climate science, impacts and adaptation and mitigation policies. Scenario planning has a long history and can be traced to the RAND Corporation during World War 2 and, later (ironically enough) Shell, a fossil fuel company. Scenarios are not intended to be forecasts of the future, but rather to serve as an alternative to forecasting. Scenarios provide a description of possible futures contingent upon various factors, only some of which might be under the control of decision makers.

The climate community got off track by forgetting the distinction between using scenarios as an exploratory tool for developing and evaluating policy options, and using scenarios as forecasts of where the world is headed. The scenario (or more precisely, the set of scenarios) that the climate community settled on as a baseline future for projecting future climate impacts and evaluating policy options biases how we think about climate impacts and policy responses. The point is not that climate analysts should have chosen a more realistic future as a baseline expectation, but rather, they should never have chosen a particular subset of futures for such a baseline.

The desire to predict the future is perfectly understandable. In climate science, scenarios were transformed from alternative visions of possible futures to a subset of predicted futures through the invention of a concept called “business as usual.”

The Intergovernmental Panel on Climate Change explains that “business as usual” is “synonymous” with concepts such as “baseline scenario” or “reference scenario” or “no-policy scenario.” The IPCC made use of the concept of “business as usual” (and its equivalents) in the 1990s, and then explicitly rejected it in the 2000s. It has returned with a vengeance in the 2010s. A reset is needed for the 2020s.

According to the IPCC, a “baseline” scenario refers to “the state against which change is measured” and for climate impacts and policy, is “based on the assumption that no mitigation policies or measures will be implemented beyond those that are already in force and/or are legislated or planned to be adopted.” The use of such a baseline is far more important for research on climate impacts and policy than it is for most research on the physical science of climate, as the latter need not necessarily be tied to socio-economic scenarios.

The IPCC warns, quite appropriately, “Baseline scenarios are not intended to be predictions of the future, but rather counterfactual constructions that can serve to highlight the level of emissions that would occur without further policy effort. Typically, baseline scenarios are then compared to mitigation scenarios that are constructed to meet different goals for greenhouse gas (GHG) emissions, atmospheric concentrations, or temperature change.” Cost-benefit and effectiveness analyses in particular lend themselves to using a fixed baseline against which to evaluate an alternative, creating an incentive for the misuse of scenarios as predictions.

The IPCC warns against treating scenarios as predictions because they reach far into the future – for instance to 2100 and even beyond, and “the idea of business-as-usual in century-long socioeconomic projections is hard to fathom.” Humility in socio-economic prediction is also warranted because our collective track record in anticipating the future, especially when it comes to energy, is really quite poor.

It may seem confusing that the IPCC both recommends the use of baseline scenarios as reference points for evaluating counterfactual futures and, in parallel, warns against using reference scenarios as forecasts. The way for analysts to reconcile these two perspectives is to consider a very wide range of counterfactual futures as baselines in their research.

The instant an analyst decides that one particular scenario or a subset of scenarios is more likely than others, and then designates that subset of possible futures as a baseline or “business as usual,” then that analyst has started crossing the bridge to predicting the future. When a single scenario is chosen as a baseline, that bridge has been crossed.

There is of course generally nothing wrong with predicting the future as a basis for decision making. Indeed, a decision is a form of prediction about the future. However, in some contexts we may wish to rely more on decision making that is robust to ignorance and uncertainties (and thus less on forecasts), that might lead to desired outcomes across all scenarios of the future. For instance, if you build a house high on a bluff above a floodplain, you need not worry about flood predictions. In other settings, we may wish to optimize decisions based on a specific forecast of the future, such as evacuation before an advancing storm.

Climate science – and by that I mean, broadly, research on physical science, impacts and economics, as well as policy-related research into adaptation and mitigation – went off track when large parts of the community and leading assessment bodies like the IPCC decided to anoint a subset of futures (and one in particular) as the baseline against which impacts and policy would be evaluated.

This is best illustrated by a detailed example.

The U.S. National Climate Assessment (NCA) is a periodic report on climate science and policy required by law. The most recent report was published in two parts in 2017 and 2018. Those reports were centered on anointing a specific scenario of the future as “business as usual” (despite the NCA warning against doing exactly that). That scenario has a technical name: Representative Concentration Pathway (RCP) 8.5.

In his climate epiphany, David Wallace-Wells warned, “anyone, including me, who has built their understanding on what level of warming is likely this century on that RCP8.5 scenario should probably revise that understanding in a less alarmist direction.” The climate science community, broadly conceived, is among those needing to revise their understandings.

To illustrate how the USNCA came to be centered on RCP8.5, let’s take a quick deep dive into how the report was created. Its use of scenarios was grounded in research done by the U.S. Environmental Protection Agency (EPA), specifically a project called Climate Change Impacts and Risk Analysis. That project is described in two reports.

The first report, in 2015, explained that its methodology was based on two scenarios, a “business as usual” or “reference” scenario that projected where the world was heading in the absence of climate policies and a “mitigation” scenario representing a future with emissions reductions. In that report EPA created its own scenarios (with its BAU scenario equated to an equivalent RCP8.6 scenario). The report explained that the benefits of mitigation policy were defined by the difference between the BAU scenario and the mitigation scenario.

In its subsequent report in 2017, EPA decided to replace its own scenarios with several of the RCP scenarios used by the IPCC. In that report it dropped the phrase “business as usual” and adopted RCP8.5 as its “baseline” scenario. It adopted another scenario, RCP4.5, as representing a world with mitigation policy. The USNCA relied heavily on the results of this research, along with other work using RCP8.5 as a “baseline.”

The USNCA defined the difference in impacts between the two RCP scenarios as representing the benefits to the United States of mitigation policy: “Comparing outcomes under RCP8.5 with those of RCP4.5 (and RCP2.6 in some cases) not only captures a range of uncertainties and plausible futures but also provides information about the potential benefits of mitigation.” But such a comparison was warned against by the creators of the RCP scenarios: “RCP8.5 cannot be used as a no-climate-policy reference scenario for the other RCPs.” Yet, there it was at the center of the most authoritative climate science report in the United States.

Reports are written by committees, and elsewhere the US NCA warned that RCP8.5 “is not intended to serve as an upper limit on possible emissions nor as a BAU or reference scenario for the other three scenarios.” But that warning was not heeded at all. RCP8.5 is used as a reference scenario throughout the report and is mentioned more than 470 times, representing about 56% of all references to RCP scenarios.

It was the USNCA misuse of RCP8.5 that appeared on a page one New York Times story that warned, “A major scientific report issued by 13 federal agencies on Friday presents the starkest warnings to date of the consequences of climate change for the United States, predicting that if significant steps are not taken to rein in global warming, the damage will knock as much as 10 percent off the size of the American economy by century’s end.”

It is not just the USNCA that has centered its work on RCP8.5 as a reference scenario for evaluating climate impacts and policy. The 2019 IPCC report on oceans and ice also adopted RCP8.5 as a reference scenario to compare with RCP2.6 as a mitigation scenario: “Under unmitigated emissions (RCP8.5), coastal societies, especially poorer, rural and small islands societies, will struggle to maintain their livelihoods and settlements during the 21st century.” That report referenced RCP8.5 more than 580 times, representing more than 56% of all scenario references in the report.

Across the IPCC 5th assessment report, published in 2013 and 2014, RCP8.5 comprised 34% of scenario references, so dependence on RCP8.5 has increased across the IPCC’s reports. And as an indication of where research may be heading, among the abstracts of talks given at the 2019 meeting of the American Geophysical Union earlier this month that mentioned RCP scenarios, 58% mentioned RCP8.5, with RCP4.5 coming in second at 32%. If these abstracts indicate the substance of future scientific publications, then get ready for an avalanche of RCP8.5 studies.
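The share figures above are simple arithmetic over mention counts. A minimal sketch of that tally, with hypothetical raw counts chosen only to reproduce the percentages cited (the real numbers would come from counting mentions in the abstracts themselves):

```python
def scenario_shares(mentions):
    """Return each scenario's share of all scenario mentions, as whole percentages."""
    total = sum(mentions.values())
    return {name: round(100 * count / total) for name, count in mentions.items()}

# Hypothetical counts for AGU 2019 abstracts that mentioned RCP scenarios,
# chosen to reproduce the 58% / 32% shares quoted in the text.
agu_mentions = {"RCP8.5": 58, "RCP4.5": 32, "other RCPs": 10}
print(scenario_shares(agu_mentions))  # → {'RCP8.5': 58, 'RCP4.5': 32, 'other RCPs': 10}
```

The same arithmetic applied to the assessment reports (470-plus mentions out of all RCP references in the USNCA, 580-plus in the oceans report) yields the roughly 56% shares cited above.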

The climate science community, despite often warning itself to the contrary, has gotten off track when it comes to the use of scenarios in impact and policy research. There can be little doubt that major assessments and a significant portion of the underlying literature have slipped into misusing scenarios as predictions of the future.

Why this has happened will no doubt be the subject of future research, but for the immediate future, the most important need will be for the climate science community to hit the reset button and get back on track. Climate change is too important to do otherwise.

Part two will discuss what this reset might look like.

Follow me on Twitter @RogerPielkeJr

I have been on the faculty of the University of Colorado since 2001, where I teach and write on a diverse range of policy and governance issues related to science, innovation, and sports. I have degrees in mathematics, public policy and political science. My books include The Honest Broker: Making Sense of Science in Policy and Politics published by Cambridge University Press (2007), The Climate Fix: What Scientists and Politicians Won’t Tell You About Global Warming (2010, Basic Books) and The Edge: The War Against Cheating and Corruption in the Cutthroat World of Elite Sports (Roaring Forties Press, 2016). My most recent book is The Rightful Place of Science: Disasters and Climate Change (2nd edition, 2018, Consortium for Science, Policy & Outcomes).

Source: In 2020 Climate Science Needs To Hit The Reset Button, Part One

Stella Wiedemeyer is a current year 11 student who has channelled her mounting frustration surrounding the lack of action from the powers that be in relation to the climate crisis into organising School Strike 4 Climate actions in Melbourne. Through her grassroots engagement, she was selected to join federal political candidates at panel discussions, including by Oxfam, and is delighted to bring a youthful perspective to an at times demoralising issue. She is currently working to inspire environmental awareness through her personal actions, school community and new-found platform within the youth climate justice movement. “I’m looking forward to challenging people to consider their position in our climate and recognise what obligations and privileges we have to create long-lasting, systemic change.” This talk was given at a TEDx event using the TED conference format but independently organized by a local community. Learn more at https://www.ted.com/tedx

Climate Change Models Were Right About Global Warming 30 years Ago

Emissions from a coal fired power station in the Latrobe Valley, Victoria, Australia. New research shows even the earliest climate models were broadly correct in predicting the relationship between greenhouse gas emissions and warming. Ashley Cooper/Construction Photography/Avalon/Getty

Even 30 years ago, climate change models were doing a reasonably good job at predicting future global warming, a study has found. Previously, climate change deniers had used model inconsistencies to raise doubts about the relationship between greenhouse gas emissions and global warming. Scientists say their research, published in the journal Geophysical Research Letters, “should help resolve public confusion around the performance of past climate modeling efforts.”

The team, from the University of California, Berkeley, the Massachusetts Institute of Technology and NASA, found that 14 of 17 climate models produced between 1970 and 2007 were broadly correct in their predictions.

According to the World Meteorological Organization, global temperatures have risen by 1.1 degrees Celsius since the start of the industrial revolution—a trend driven by human activity and specifically, by greenhouse gases emissions. The same report revealed that the global average temperature between 2015 and 2019 was 0.2 degrees Celsius higher than between 2011 and 2015.

Predicting what will happen in the future is tricky because there are many unknowns to factor in—and several directions we as a global society might choose to take. To be accurate, models not only rely on solid physics, but on precise forecasting when it comes to levels of future emissions.

That is where James Hansen’s 1988 models for NASA went wrong. The forecasts were inaccurate because his assumptions about future emissions did not account for the Montreal Protocol, which came into effect a year later, so his projections of future warming overshot.

The Montreal Protocol banned the use of chlorofluorocarbons, or CFCs, which were potent greenhouse gases that were depleting the ozone layer.

“If you account for these and look at the relationship in his model between temperature and radiative forcing, which is CO2 and other greenhouse gases, he gets it pretty much dead on,” said Zeke Hausfather, the study’s lead author. “So the physics of his model was right. The relationship between how much CO2 there is in the atmosphere and how much warming you get was right. He just got the future emissions wrong.”

He added: “Physics we can understand, it is a deterministic system; future emissions depend on human systems, which are not necessarily deterministic.”

This is why many climate models often offer low emission and high emission scenarios.

For the study, Hausfather and colleagues took two things into consideration when calculating the accuracy of the older models: how well they predicted future temperatures, and how well they captured the link between temperature and changes in levels of greenhouse gases.
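The second metric is the key move: instead of comparing raw temperature forecasts, compare warming per unit of radiative forcing, which separates a model's physics from its (often wrong) emissions assumptions. A minimal sketch of the idea, with made-up illustrative numbers rather than figures from the paper:

```python
def implied_response(warming_c, forcing_wm2):
    """Warming per unit of radiative forcing (°C per W/m²) over a projection period."""
    return warming_c / forcing_wm2

# Hypothetical case: a model projected 1.0 °C of warming under 2.0 W/m² of
# forcing, but emissions came in lower, producing 0.45 °C under 0.9 W/m².
model = implied_response(warming_c=1.0, forcing_wm2=2.0)
observed = implied_response(warming_c=0.45, forcing_wm2=0.9)

# Both ratios are 0.5 °C per W/m²: the raw temperature forecast was too high,
# but the physics (warming per unit forcing) was right — the situation the
# text describes for Hansen's 1988 model.
```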

The researchers say that there were some that projected too little warming and others that projected too much warming. However, most were generally correct when it came to predicting global warming, particularly when differences in emission projections were accounted for.

“We find no evidence that the climate models evaluated in this paper have systematically overestimated or underestimated warming over their projection period,” the team wrote.

“The projection skill of the 1970s models is particularly impressive given the limited observational evidence of warming at the time, as the world was thought to have been cooling for the past few decades.”

Hausfather added: “The real message is that the warming we have experienced is pretty much exactly what climate models predicted it would be as much as 30 years ago. This really gives us more confidence that today’s models are getting things largely right as well.”


Source: Climate Change Models Were Right About Global Warming 30 years Ago—Including That of NASA Scientist James Hansen

Global warming turns 120 next year… sort of. Next year will be the 120th anniversary of the first time we figured out that human activity could be causing climate change. Since then, the science has gotten firmer and the politics have gotten murkier, but the outlook for the future remains uncertain. This is the history of manmade global warming in three minutes. (Corrects number of hottest years in history since 1998.) (Video by: Alan Jeffries, Christian Capestany, Eric Roston) –Subscribe to Bloomberg on YouTube: http://www.youtube.com/Bloomberg Bloomberg Television offers extensive coverage and analysis of international business news and stories of global importance. It is available in more than 310 million households worldwide and reaches the most affluent and influential viewers in terms of household income, asset value and education levels. With production hubs in London, New York and Hong Kong, the network provides 24-hour continuous coverage of the people, companies and ideas that move the markets.

Deadly Volcanic Explosion Rocks New Zealand: Here’s Everything You Need To Know

A powerful volcanic eruption has rocked New Zealand’s White Island, a small uninhabited volcano sticking up out of the sea in the Bay of Plenty, 50 kilometres offshore of the country’s North Island.

Although details remain a little sparse, the New Zealand police force suspect that fewer than 50 people were present on the island when the eruption took them by surprise. At the time of writing, several people have been injured, some reportedly with serious burns, and some have been evacuated to the mainland. At least one person is critically injured.

A number of people on the island are currently unaccounted for. Police Deputy Commissioner John Tims told a press conference that at least one person has died, and that they are unlikely to be the sole casualty.

Plenty of news reports will focus on those injured by the eruption, and understandably so. Here, you’ll find some scientific information that should provide some background as to why this took place. As ever, I’ll update this as more information comes in, when I can.

So, what happened?

On Monday December 9th at 14:11 local time (late at night on Sunday, Eastern Time), an eruption took place on White Island, also known by its Māori name, Whakaari. It has been described by GeoNet – an official scientific initiative that’s a collaboration between the New Zealand government’s Earthquake Commission and the New Zealand-based geoscience institute GNS Science – as an impulsive, short-lived event that affected the crater floor. The activity, they say, appears to have diminished since the eruption.

The event generated an ash plume that rose 3,700 metres or so above the vent. As seen by webcam images, ash blanketed the crater floor, and ashfall seems to be more or less confined to the island.


So far, reports suggest that 20 people or so have been injured out of a possible 100 on the island at the time. Tourists can get boats or helicopters to the island, a near constantly restless volcano, to peer at its hyperactivity. It’s a small island, just 2,400 metres across at its longest, with its 321-metre-high summit simply the high point of the crater rim, which is open and exposed.


What kind of eruption was this?

It’s too early to tell, but the short time span of the main event, the fact that there have been injuries and the temporary ash plume that fell mostly back onto the island suggest that this was a type of volcanic explosion. It’s difficult to say what kind of blast it was at this stage: it could either be one that unleashed fresh volcanic debris or one that didn’t, and one that involved external water or one that didn’t.

Being so close to the sea, external water may have infiltrated White Island’s magma supply. When water and magma mix in an appropriate manner and with the right ratio, the water can vaporize explosively, setting off a rapid chain reaction of violent depressurization events – i.e., an explosion. If this just releases steam and no new magmatic products, it is technically not an eruption, but a hydrothermal blast. If it does unleash novel volcanic debris, it is referred to as a phreatomagmatic eruption.

External water doesn’t necessarily have to be involved. Perhaps the blast was more like the one that recently took place at Italy’s Stromboli volcano, where a gloopy, gas-rich lump of magma high up the volcano’s throat (known as its conduit) managed to rush up to the surface via its own natural buoyancy, where the gas rapidly expanded and flung lava and ash into the sky. The Strombolian eruption style, named after the eponymous Sicilian volcanic isle, can be observed at volcanoes all over the world, but each eruption at each individual volcano can vary in intensity.

In any of these cases, people standing inadvertently too close to a powerful enough explosion can be harmed by all kinds of things, from the shockwave of the explosion itself causing damage to their internal organs to heavy, hot and sometimes molten debris being flung out by the blast.


The ash can also cause health hazards; it is toxic and glassy, so breathing it in can damage respiratory systems. The risk for harm is far greater for those with pre-existing breathing conditions than those who are otherwise healthy.

What kind of volcano is White Island?

It is the summit of a submarine volcano that is 16 by 18 kilometres across, one that erupted enough volcanic debris long ago to rise from the waves and prevent itself being reclaimed by them. According to the Smithsonian Institution’s Global Volcanism Program, it is made of two overlapping stratovolcanoes (steep, mountain-shaped volcanoes) that have a somewhat gloopy magmatic consistency – and, as a result, are prone to trapping gas and causing explosive eruptions.

It is highly active, and its eruptions can sometimes produce topographic changes. In the 19th and 20th centuries, new vents have opened up, putting holes in the crater floor. Some of the crater wall catastrophically failed and collapsed in 1914, creating a debris avalanche that smothered buildings and workers at a sulphur mine.

What are its past eruptions like?

Māori legends have spoken of the eruptive fury of the volcano for some time now. Since 1826, observers have recorded a mixture of phreatomagmatic and strombolian eruption styles.

Most of the island’s eruptions rank as a 2 on the Volcanic Explosivity Index or VEI, which takes into account the amount of fresh volcanic debris ejected, the height of the ash plume, and some additional details. A 2 is classified as “explosive”, producing a 1-5-kilometre-high ash plume, and unleashes no more than 400 Olympic-sized swimming pools’ worth of fresh volcanic debris.

Sometimes, there have been eruptions ranking as a 3 on the VEI, which are described as “catastrophic”, producing a plume of ash up to 15 kilometres high and unleashing up to 4,000 Olympic-sized swimming pools’ worth of fresh volcanic debris. Each eruption, though, has its own characteristics and behaviours that the VEI doesn’t describe; the index is just a proxy for the explosivity and volume of debris involved in the eruption, with higher numbers being far less common around the world than lower ones.
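The swimming-pool comparisons above are a unit conversion on the VEI's ejecta-volume bands. A rough sketch of just that piece of the index, using only the figures quoted in the text (the real VEI also weighs plume height and other details, and the pool volume is a nominal assumption):

```python
OLYMPIC_POOL_M3 = 2_500  # nominal 50 m x 25 m x 2 m pool, an assumed figure

# (VEI, maximum ejecta in Olympic-pool units), per the bands quoted in the text
VEI_BANDS = [(2, 400), (3, 4_000)]

def rough_vei(ejecta_m3):
    """Classify an eruption's VEI from ejecta volume alone (illustrative only)."""
    pools = ejecta_m3 / OLYMPIC_POOL_M3
    for vei, max_pools in VEI_BANDS:
        if pools <= max_pools:
            return vei
    return ">3"

# 2,500,000 m³ of ejecta is 1,000 pools: too much for a VEI 2, within VEI 3.
print(rough_vei(2_500_000))  # → 3
```

This also makes the logarithmic character of the index visible: each step up the scale multiplies the debris volume by ten.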

In any event, this latest event certainly sounds like a 2 or 3 on the VEI. Even a 1 can injure or kill people; it depends where they were in relation to the eruption or blast when it took place.

Why wasn’t this eruption forecast?

For volcanoes that are well monitored, like White Island, scientists can look at an array of data – seismic signals indicating magma cracking through rock as it rises, gas emissions at the surface that suggest magma is just below, the deformation of the ground as magma moves about, and so on – to forecast what may happen next in various timeframes. Although it is difficult to say with any certainty or precision when an eruption may happen and what kind of eruption it will be, for some volcanoes, volcanologists can get advance notice that something may be about to happen. It could be an eruption, but it could also just be magma moving about and then going quiet.

Each volcano is idiosyncratic, though. They may all play by the same rules, but each player is different: some volcanoes are more hyperactive, others take a long time to erupt and spend most of their lifetime doing very little. There are many grey areas; Italy’s Stromboli volcano often coughs up some lava several times per day, and that is expected; every few years, though, it can engage in an explosive convulsion that sends debris shooting all over the place, just like it did earlier this year. Those rarer blasts are harder to predict, and often don’t give off any warning signs.

Similarly, the explosion at White Island was fairly spontaneous; a violent sneeze if you will, one that wouldn’t have given New Zealand’s scientific instrumentation any warning signs until the moment it took place, or perhaps immediately before. The volcano had been rumbling a little more than its background level over the past few weeks, so the authorities had raised the alert level a little, but there was no way they could have foreseen this sort of paroxysm.

There was little anyone could have done, I suspect – this was just bad luck on the part of those tourists.

What happens next?

There is a chance that this explosion may have helped unleash magma trapped beneath the surface, leading to a more prolonged eruption or perhaps a few more similarly sized explosions. It’s also possible that this was an isolated explosion, and the volcano was “clearing its throat” and nothing more. Only time will tell, but I’m sure for some time now tourists won’t be allowed near the island.

Is this related to the Ring of Fire or any other volcanic eruptions taking place right now?

Well, it is related to the Ring of Fire in that this volcano sits on it. This term describes a conveniently shaped network of major tectonic boundaries that are continuously shifting around in very complex ways. Thanks to these behaviours, this network is responsible for 75 percent of the world’s volcanic activity, or thereabouts (and a staggering 90 percent of the planet’s earthquakes).

The underlying causes may be similar, but any eruptions that occur here happen independently of each other. There’s pretty much no evidence that volcanic eruptions can trigger other volcanic eruptions, (although there’s an ongoing healthy debate as to whether earthquakes, in some circumstances, can initiate volcanic eruptions nearby). What you are seeing here is just White Island doing its own thing.

On average, 40 volcanoes around the world are erupting at any one time. Sometimes, White Island is among them. This is par for the course, and not a sign of some sort of impending volcanological apocalypse.

What can I do to help?

As always, don’t spread information whose veracity you are unsure of. Only use trusted, well-cited journalistic sources (hello!) or go to the official government or scientific networks, like GeoNet.

The spread of misinformation, accidental or intentional, will distract scientists and aid workers from their life-saving work, and it will sow erroneously founded seeds of panic in the minds of those affected by the disaster. If you see anyone doing this, politely but firmly tell them to stop; if they don’t, report them.

Follow me on Twitter.

Robin George Andrews is a doctor of experimental volcanology-turned-science journalist. He tends to write about the most extravagant of scientific tales, from eruptions

Source: Deadly Volcanic Explosion Rocks New Zealand: Here’s Everything You Need To Know

This short video gives a break-down of New Zealand’s volcanic history, Subscribe to our YouTube Channel and stay up to date about What’s On at Auckland Museum. https://www.youtube.com/aucklandmuseum Visit our Facebook at https://www.facebook.com/AucklandMuseum/ or our website http://www.aucklandmuseum.com/ Housed in one of New Zealand’s finest heritage buildings, Auckland Museum is the cultural and spiritual touchstone for the many races that inhabit this beautiful land, and the first stop for anyone wishing to gain an insight into New Zealand and its peoples. Priceless Māori treasures, amazing natural history, Māori cultural performances three times daily.

The Century’s Strongest Super-Typhoon Hagibis Is About To Hit Japan—1,600 Flights Canceled

The streets of Tokyo outside my window are currently getting a little quieter, but there is absolutely no sense of panic in Japan’s capital. Typhoons are commonplace in Japan, and the infrastructure has been built to withstand regular storms each year.

There are two major sporting events in Japan this weekend. The first is the Rugby World Cup, which has now canceled two games: England versus France and Scotland versus Japan. The other is the Japanese Grand Prix, which has moved qualifying to Sunday, with the race going ahead almost immediately afterwards.

24-Hour Travel Disruption

The biggest impact will likely be on flights. The eye of the storm alone is 55 miles wide, and satellite imagery shows the storm is currently larger than the entire nation of Japan. Hagibis will be one of the strongest typhoons to directly hit the island nation in decades.


All Nippon Airways have now canceled all domestic flights departing from Tokyo on Saturday. The capital looks set to receive a direct hit from the storm but no one in the capital seems to be too concerned at this point. Although the Meteorological Agency has classified the storm as “violent”—the highest strength categorisation—rail operators have so far only warned that there may be cancellations.

With a storm this size, or any major storm, safety is paramount; however, Japanese authorities seem confident in their planning and preparations. Japan Airlines have followed ANA’s example and canceled 90% of domestic flights, yet both airlines are optimistic about early morning departures on Saturday, which remain scheduled until 8am. Additionally, both airlines are hopeful that some international flights will resume by late Saturday evening.

Tokyo airports have been worst affected by the disruption, with the two major Japanese carriers, ANA and JAL, canceling 558 and 540 flights respectively. Flight cancellations are being seen around the globe to and from Tokyo, with British Airways scrapping flights from London, and flights to North America also being affected. Almost every major airline around the world has been impacted by one of the largest storms to ever hit Japan directly, but the feeling on the ground here is that disruption shouldn’t last beyond a 24-hour window.

What Makes Typhoon Hagibis Different?

The Size: Typhoon Hagibis has a diameter covering an immense 1,400km. Until the very last moment, nowhere across vast areas of Japan will be beyond the reach of this expansive storm.

The Time Of The Month: This weekend is a full moon, meaning that sea levels are higher than average. With storm surge and waves predicted to reach up to 13m in some areas, coastal flooding could be devastating.

Force: With wind gusts predicted to be over 240km/h, and a direct hit to Tokyo looking increasingly likely over the next few hours, Typhoon Hagibis could be one of the strongest storms to hit Japan in decades.

In terms of pressure, Hagibis could also be among the strongest on record. With a current pressure of 900 hPa, it is already lower than Hurricane Dorian, which devastated the Bahamas earlier this year at 910 hPa. The strongest tropical cyclone ever recorded was Typhoon Tip, which reached 870 hPa in 1979 before making landfall on Japan. All Japanese airlines suggest checking their websites before travelling tomorrow.

I spend 360 days a year on the road traveling for work discovering new experiences at every turn, trying out the best and the worst airlines around the world. I set the Guinness World Record for being the youngest person to travel to all 196 countries in the world by the age of 25, and you could perhaps say I caught the travel bug over that 6-year journey. I now take over 100 flights every year and I am still discovering many new places, both good and bad, whilst writing about my experiences along the way. In addition to rediscovering known destinations, I visit some of the World’s least frequented regions such as Yemen to highlight untold stories. Join me on an adventure from economy to first-class flights, the best and worst airports, and from Afghanistan to Zimbabwe.

Source: The Century’s Strongest Super-Typhoon Hagibis Is About To Hit Japan—1,600 Flights Canceled

Japan is bracing for what is expected to be the most powerful storm in decades. Typhoon Hagibis is advancing north towards Japan’s main island of Honshu, with damaging winds and torrential rain.

Scientists Weighed All The Mass In The Milky Way Galaxy. It’s Mind-Boggling

Something weird is happening in our galaxy: It’s spinning fast enough that stars ought to be flying off, but there’s something holding them together.

The substance that acts as a gravitational glue is dark matter. Yet it’s incredibly mysterious: Because it doesn’t emit light, no one has ever directly seen it. And no one knows what it’s made of, though there are plenty of wild hypotheses.

For our galaxy — and most others — to remain stable, physicists believe there’s much, much more dark matter in the universe than regular matter. But how much?

Recently astronomers using the Hubble Space Telescope and the European Space Agency’s Gaia star map attempted to calculate the mass of the entire Milky Way galaxy.

It’s not an easy thing to do. For one, it’s difficult to measure the mass of something we’re inside of. The Milky Way galaxy measures some 258,000 light-years across. (Recall that one light-year equals 5.88 trillion miles. Yes, the galaxy is enormous.) And an abundance of stars and gas obscures our view of the galactic center. The team of astronomers essentially measured the speed of some objects moving in our galaxy and deduced the mass from there (the more massive the galaxy, the faster the objects should move).

Their answer: The galaxy weighs around 1.5 trillion solar masses. This number helps put in perspective how very small we are.
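To get a feel for these numbers, the conversion from solar masses to kilograms is simple arithmetic. This is a rough sanity-check sketch, not part of the study: it assumes the commonly used value of about 1.989 × 10^30 kg per solar mass (a figure not quoted in the article) alongside the article’s own numbers.

```python
# Back-of-the-envelope conversion of the article's figures.
# Assumption: one solar mass ~= 1.989e30 kg (standard approximate value,
# not quoted in the article itself).

SOLAR_MASS_KG = 1.989e30        # approximate mass of the sun in kilograms
GALAXY_SOLAR_MASSES = 1.5e12    # Hubble/Gaia estimate for the Milky Way
MILES_PER_LIGHT_YEAR = 5.88e12  # conversion quoted in the article
GALAXY_DIAMETER_LY = 258_000    # diameter used by the study

galaxy_mass_kg = GALAXY_SOLAR_MASSES * SOLAR_MASS_KG
galaxy_diameter_miles = GALAXY_DIAMETER_LY * MILES_PER_LIGHT_YEAR

print(f"Milky Way mass:     {galaxy_mass_kg:.2e} kg")
print(f"Milky Way diameter: {galaxy_diameter_miles:.2e} miles")
```

Under those assumptions the galaxy works out to roughly 3 × 10^42 kilograms, and its diameter to about 1.5 × 10^18 miles.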

Take, for instance, where stars in the Milky Way fit in.

If you’re lucky enough to get a completely dark, clear sky for stargazing, it’s possible to behold as many as 9,000 stars above you. That’s how many are visible to the naked eye. But another 100 billion stars (or more) are out there just in our own Milky Way galaxy — yet they’re just 4 percent of all the stuff, or matter, in the galaxy.

Another 12 percent of the mass in the galaxy is gas (planets, you, me, asteroids, all of that is negligible mass in the grand accounting of the galaxy). The remaining 84 percent of the matter in the galaxy is dark matter, Laura Watkins, a research fellow at the European Southern Observatory and a collaborator on the project, explains.
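Applying those proportions to the 1.5-trillion-solar-mass total gives a rough mass budget for each component. A minimal sketch, using only the percentages and total quoted above (and bearing in mind the clarification at the end of the piece that these proportions are approximate for our galaxy):

```python
# Split the galaxy's estimated mass by the article's stated proportions:
# 4% stars, 12% gas, 84% dark matter.

TOTAL_SOLAR_MASSES = 1.5e12

FRACTIONS = {"stars": 0.04, "gas": 0.12, "dark matter": 0.84}

budget = {name: frac * TOTAL_SOLAR_MASSES for name, frac in FRACTIONS.items()}

for name, mass in budget.items():
    print(f"{name:>11}: {mass:.2e} solar masses")
```

The star budget that falls out, about 6 × 10^10 solar masses, is consistent with the article’s 100 billion stars if a typical star is somewhat less massive than the sun; the dark matter share comes to roughly 1.26 × 10^12 solar masses.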

The enormity of the galaxy, and the enormity of the mystery of what it’s made of, is really hard to think through. So, here, using the recent ESA-Hubble findings, we’ve tried to visualize the scale of the galaxy and the scale of the dark matter mystery at the heart of it.

As a visual metaphor, we’ve constructed a tower of mass. You’ll see that all the stars in the galaxy represent just a searchlight at the top of the building. The vast majority of the floors, well, no one knows what goes on in there.

The mass of the Milky Way, visualized

To visualize the mass of 1.5 trillion suns, let’s start small. This is the Earth. It has a mass of 5.972 × 10^24 kilograms.

This is the Earth compared to the sun. The sun is 333,000 times more massive than Earth.

Now let’s try to imagine the mass of the 100 billion (or more) stars in the Milky Way galaxy.

That’s enormous.

Another 12 percent** of the mass in the galaxy is just gas floating between stars (mostly hydrogen and helium).

Here’s what the gas looks like using this same visual scale.

What about black holes? “It’s a bit harder to put an exact number of how much they contribute to the total mass, as we don’t know how many there are, but it will be a very, very very small fraction,” Watkins explains. “The supermassive black hole at the center of the Milky Way is around 6 million solar masses,” which is really tiny on the scale of the entire mass of the galaxy.

And it’s tiny on the scale of the most abundant, mysterious matter in the galaxy: the dark stuff. Again: 84 percent of the galaxy is made up of dark matter.

Dark matter doesn’t seem to interact with normal matter at all, and it’s invisible. But our galaxy, and universe, would fall apart without it.

Scientists hypothesized its existence when they realized that galaxies spin too quickly to hold themselves together with the mass of stars alone. Think of a carnival ride that spins people around. If it spun fast enough, those riders would be ripped off the ride.

Accounting for “dark matter,” and the gravity it generates, made their models of galaxies stable again. There’s some other evidence for dark matter, too: It seems to produce the same gravitational lensing effect (meaning that it warps the fabric of spacetime) as regular matter.

Now let’s try to visualize the mass of dark matter, compared to the mass of stars and gas.

And remember: This is just our galaxy. There are some hundreds of billions of galaxies in the universe.

Also remember that dark matter isn’t even the biggest mystery in the universe, in terms of scale. Some 27 percent of the universe is dark matter, and a mere 5 percent is the matter and energy you and I see and interact with.

The remaining 68 percent of all the matter and energy in the universe is dark energy (which is accelerating the expansion of the universe). While dark matter keeps individual galaxies together, dark energy propels all the galaxies in the universe apart from one another.

What you can see in the night sky might seem enormous: the thousands of stars, and solar systems, to potentially explore. But it’s just a teeny-tiny slice of what’s really out there.

**(Clarification: Ari Maller, a physics professor at New York City College of Technology, wrote in, pointing out that the proportions in our graphic —4 percent of the matter in the galaxy being stars, 12 percent gas, and 84 percent dark matter — are a bit off. They do, he says, represent the overall proportions of each in the universe. But, he writes “we don’t live in an average place,” clarifying that instead ”the gas in the Milky Way is only about 10 percent of its mass.”)


Source: Scientists weighed all the mass in the Milky Way galaxy. It’s mind-boggling.


It’s a Moment of Reckoning For How We Use the Planet to Halt Climate Change, Warns U.N. Report  


Aerial view of the Transamazonica Road (BR-230) near Medicilandia, Para State, Brazil on March 13, 2019. According to the NGO Imazon, deforestation in Amazonia increased 54% in January 2019 (the first month of Brazilian President Jair Bolsonaro’s term) compared with the same month of 2018. MAURO PIMENTEL—AFP/Getty Images

The human relationship with the land we live on has evolved over the hundreds of thousands of years humans have roamed the planet, but no period has seen as dramatic a change as the last century, when humans used land in new ways to extract wealth and build a modern economy.

Now, a landmark new U.N. report warns, humans face a moment of reckoning over how we use the planet’s land: practices like deforestation threaten to undermine the role nature has played in soaking up carbon dioxide emissions for more than a century. At the same time, climate change could threaten our ability to use the land, jeopardizing food security and putting vulnerable communities at risk of extreme weather.

“As we’ve continued to pour more and more carbon dioxide in the atmosphere, the Earth’s system has responded and it’s continued to absorb more and more,” says Louis Verchot, a lead study author and scientist at the International Center for Tropical Agriculture. But “this additional gift from nature is limited. It’s not going to continue forever.”

Today, emissions from land use — think of practices like agriculture and logging — cause nearly a quarter of human-induced greenhouse gas emissions, according to the report, authored by scientists on the Intergovernmental Panel on Climate Change (IPCC), the U.N. climate science body.

Still, land elsewhere on the planet has balanced the effects of those emissions. In recent years, forests, wetlands and other land systems have soaked up 11.2 gigatonnes more carbon dioxide than they have emitted on an annual basis. That’s a greater quantity of carbon dioxide than is released by the world’s coal-fired power plants in a given year. But a slew of human practices including deforestation, soil degradation and the destruction of land-based ecosystems threaten to halt that trend, potentially driving land to release more carbon dioxide than it absorbs.

Climate advocates billed the report as a wakeup call. Much of the attention around addressing climate change has focused on shifting the global energy system, but to keep warming at bay will require nature-based solutions that consider how humans use land, climate scientists say.

The report — at more than 1,300 pages in length — lays out a number of opportunities to use land to reverse the trend. And many of the solutions are already at hand, if governments have the wherewithal to implement them. “We don’t have to wait for some sort of new technological innovation,” says study author Pamela McElwee, an associate professor of human ecology at Rutgers University. “But what some of these solutions do require is attention, financial support, enabling environments.”

Significantly reducing deforestation while increasing the rates of restoring forests ranks among the most urgent solutions in order to retain any hope of keeping temperatures from rising to catastrophic levels by the end of the century. Reducing deforestation alone can stop annual emissions equivalent to twice those of India, scientists found.

The report also highlights how emissions from agriculture contribute significantly to climate change, and the opportunity to address it by rethinking diets. As global demand for food has grown, food producers have converted forests into agricultural land, leading to a release of carbon dioxide stored in trees. At the same time, more than a quarter of food goes to waste, according to the report.

With those trends in mind, scientists say a shift away from eating meat toward plant-based diets could yield big dividends in the fight against climate change. Reduced meat consumption means lower emissions from livestock and the fertilizer needed to sustain them but also provides an opportunity to reforest land that farmers would have otherwise used for grazing. Rethinking the human diet across the globe could drive emissions reductions of up to 8 gigatonnes annually, according to the report, greater than an entire year of emissions in the U.S.

But, while these changes are technically feasible, there are a number of barriers to adoption. To achieve the greatest emissions reductions by shifting diets would require most of the world to go vegan, for instance, requiring a fight against entrenched agricultural interests and cultural preferences.

And despite years of research underscoring the threat of deforestation, the practice has worsened in some of the most critical areas. In recent years, deforestation has accelerated in the Amazon rain forest in both Brazil and Colombia, with a recent report from Brazil’s National Institute for Space Research showing that the practice had increased 40% in the previous two months compared with the same period the year prior.

The new IPCC report comes less than a year after the body’s 2018 report on the dire effects of 1.5°C of warming, which warned that climate change will bring catastrophic impacts even at that level of warming. In its wake, students walked out of school across the globe, some governments committed to reducing their emissions and activists in the U.S. rallied for a Green New Deal, all citing the report’s impact.

Much like last year’s, the new IPCC report highlights a number of shocking risks. The surface temperature on land has already warmed more than 1.5°C since the beginning of the industrial era, and continued warming is set to cause a slew of extreme weather events while threatening food security and other essentials required for human life. Whether this report can inspire a similar wave of action remains to be seen.

By Justin Worland

Source: It’s a Moment of Reckoning For How We Use the Planet to Halt Climate Change, Warns U.N. Report  
