The argument could be made that data centers are a major driving force in the energy transition to renewables.
Against the backdrop of a global energy crisis, it’s perhaps a good time to discuss the real energy cost of data centers, why they’ve become one of the world’s fastest-growing industries, and how an argument could be made that they are, in fact, a major driving force in the energy transition to renewables.
Firstly, let’s consider their growth. Between 2015 and 2021, internet users increased by 60% and internet traffic by 440%, according to IEA data. But despite data center workloads rising by 260% over the same period, global energy use attributed to data centers has remained relatively flat, increasing by just 10%.
Decoupling service demand from energy
This decoupling of energy usage from service demand is hugely significant. It’s being driven by rapid innovation of digital technology in data centers, allowing the industry to offset huge increases in demand with improved infrastructure efficiency.
In short, we are building more and more digital capacity, modernizing legacy infrastructure and meeting the growth, but in real terms we are not using much more energy than we were in 2015.
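To put the decoupling in concrete terms, here is a rough back-of-the-envelope calculation – using the IEA figures quoted above purely as illustrative inputs – of how far energy per unit of workload must have fallen for a 260% workload increase to coexist with only a 10% increase in energy use:

```python
# Back-of-the-envelope check of the decoupling claim, using the IEA
# figures quoted above as illustrative inputs (2015 -> 2021).
workload_growth = 2.60   # +260% workload
energy_growth = 0.10     # +10% energy use

workload_factor = 1 + workload_growth   # 3.6x the 2015 workload
energy_factor = 1 + energy_growth       # 1.1x the 2015 energy use

# Energy consumed per unit of workload, relative to 2015.
energy_intensity = energy_factor / workload_factor
print(f"Energy per unit of workload vs. 2015: {energy_intensity:.2f}x")
print(f"Implied efficiency improvement: {1 - energy_intensity:.0%}")
# -> roughly 0.31x, i.e. an implied efficiency gain of about 69%
```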
The real challenge here is how we limit the environmental impact of global internet use. We should be honest: there is an impact, there always will be, and we need to continue using technology and commercial innovation to mitigate it.
Looking at the lifecycle of a data center, more than 85% of the carbon impact today comes from operations. Construction also has an impact – a significant one, and certainly worth mitigating – but net-zero construction will not address the larger part of the challenge: the carbon impact of running the facility for 20 years or more.
Driving the energy transition
Enter the concept of energy mix. A data center’s carbon performance is broadly a function of the energy mix in the location where it operates. There are exceptions, where operators take responsibility for generating power on-site using renewables or gas, but broadly speaking data centers are powered by local grids.
The data center industry is a major buyer of renewable energy through power purchase agreements (PPAs), and this has a significant impact on the energy mix. In 2021, Amazon and Microsoft were the two largest corporate buyers of renewable energy through PPAs; to a degree, the data center industry is helping to drive decarbonization by underwriting a significant proportion of grid-scale, carbon-free energy for industry.
This can catalyze a whole ecosystem at a national level, and also demonstrates to the broader industrial base that critical loads can reliably move to renewables.
Growing adoption of renewable energy – and greater infrastructure redundancy at the IT level – are also leading to new design best practices in data centers. Battery Energy Storage Systems (BESS) are replacing diesel gensets as short-term backup power, for example.
With energy markets increasingly interconnected, data centers that adopt BESS can earn new revenues by helping to stabilize grid frequency, while also riding through outages in island mode without the carbon dioxide emissions of diesel generators.
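As a purely illustrative sketch – not any vendor’s actual control scheme, and with invented thresholds and power ratings – the frequency-response service a BESS can offer the grid boils down to a simple droop-style rule: discharge when grid frequency sags, charge when it rises:

```python
# Minimal sketch of droop-style frequency response for a battery energy
# storage system (BESS). Thresholds and ratings are illustrative
# assumptions, not a real operator's settings.

NOMINAL_HZ = 50.0        # European grid nominal frequency
DEADBAND_HZ = 0.02       # ignore tiny deviations
MAX_POWER_MW = 10.0      # rated charge/discharge power of the BESS
DROOP_MW_PER_HZ = 50.0   # response slope outside the deadband

def bess_setpoint_mw(grid_frequency_hz: float) -> float:
    """Return the BESS power setpoint in MW.

    Positive = discharge (support an under-frequency grid),
    negative = charge (absorb energy from an over-frequency grid).
    """
    deviation = NOMINAL_HZ - grid_frequency_hz
    if abs(deviation) <= DEADBAND_HZ:
        return 0.0
    setpoint = DROOP_MW_PER_HZ * deviation
    # Clamp to the battery's rated power.
    return max(-MAX_POWER_MW, min(MAX_POWER_MW, setpoint))

if __name__ == "__main__":
    for f in (49.85, 49.99, 50.00, 50.01, 50.15):
        print(f"{f:.2f} Hz -> {bess_setpoint_mw(f):+.1f} MW")
```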
Beyond carbon reduction
The quest for efficiency is as important as ever, and the discipline of energy management has a massive effect on the operational impact of a data center. Innovation in technology for managing environmental conditions inside a data center is having a significant impact on energy consumption.
We know that 20% to 40% of a data center’s energy use can go towards cooling and ventilation, and this mechanical load is a prime area for optimization through technology. White space cooling is a good example. Our technology for this – a white space cooling optimization solution – uses an advanced machine-learning model to analyze the effect of cooling on specific areas of a data center, creating an influence map to limit energy use to only what’s necessary.
AI engines like this can make a significant difference to a data center’s energy costs; it’s one of the reasons the new Greenergy data center in Estonia is the most energy efficient in the Baltic region.
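To make the idea of an influence map more tangible, here is a toy sketch – synthetic data and a plain linear regression, not Siemens’ production model – of how one could learn which cooling units actually influence which zones, and therefore where cooling effort is being wasted:

```python
# Toy sketch of an "influence map" for white space cooling: fit a linear
# model from cooling-unit output to zone temperatures, then inspect the
# coefficients to see which units affect which zones. Conceptual
# illustration with synthetic data only.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n_samples, n_units, n_zones = 500, 4, 3

# Synthetic "ground truth": how strongly each cooling unit cools each zone.
true_influence = np.array([
    [0.8, 0.1, 0.0],
    [0.2, 0.7, 0.1],
    [0.0, 0.2, 0.9],
    [0.0, 0.0, 0.1],   # unit 3 barely influences anything
])

cooling_output = rng.uniform(0, 1, size=(n_samples, n_units))
baseline_temp = 30.0
zone_temps = (baseline_temp
              - cooling_output @ true_influence * 5.0
              + rng.normal(0, 0.2, size=(n_samples, n_zones)))

model = LinearRegression().fit(cooling_output, zone_temps)
influence_map = -model.coef_.T / 5.0   # recover per-unit cooling effect

print("Estimated influence of each cooling unit on each zone:")
print(np.round(influence_map, 2))
# A unit with near-zero influence everywhere (like unit 3 here) is a
# candidate for turning down, saving energy without raising temperatures.
```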
Building elasticity into data center design is also a key factor in reducing the sector’s energy consumption and cost. Through intelligent design, instrumentation, control and automation, a data center can enable capacity when it’s needed and disable it when it’s not, rather than constantly running circuits and networks with no work to do.
In addition to preventing the over-provisioning of infrastructure, this is crucial at times of the year when additional capacity is needed to cover short peaks in demand. Black Friday and the Christmas period are a perfect example: with better mechanical, electrical and automation design we can dynamically scale data center infrastructure resources up or down, enabling facilities to reliably run closer to their capacity for short periods of time.
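As a simplified sketch of that elasticity idea – with illustrative thresholds and block sizes, not a production controller – capacity blocks can be enabled and disabled as utilization crosses upper and lower bounds, with hysteresis so capacity isn’t flapped on and off:

```python
# Minimal sketch of elastic capacity control with hysteresis: enable
# capacity blocks when utilization runs high, disable them when it runs
# low. Thresholds and unit counts are illustrative assumptions.

UNIT_CAPACITY = 100.0      # workload one capacity block can serve
SCALE_UP_AT = 0.80         # utilization above which we add a block
SCALE_DOWN_AT = 0.40       # utilization below which we remove a block
MIN_UNITS, MAX_UNITS = 2, 10

def adjust_capacity(active_units: int, demand: float) -> int:
    """Return the number of active capacity blocks for the current demand."""
    utilization = demand / (active_units * UNIT_CAPACITY)
    if utilization > SCALE_UP_AT and active_units < MAX_UNITS:
        return active_units + 1
    if utilization < SCALE_DOWN_AT and active_units > MIN_UNITS:
        return active_units - 1
    return active_units

if __name__ == "__main__":
    units = 4
    # A demand spike (e.g. a Black Friday peak) followed by a quiet period.
    for demand in (250, 320, 390, 520, 610, 610, 480, 300, 180, 150):
        units = adjust_capacity(units, demand)
        print(f"demand={demand:4.0f}  active blocks={units}")
```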
The future?
Digitalization and the rise of internet access have delivered significant benefits to humanity, but we should be humble enough to acknowledge there is a sustainability impact. Our mission now is to continually and steadily drive that impact toward zero.
The data center industry has led by example by willingly investing in decarbonization, and has demonstrated that an ecosystem of technology companies can work together to measure, manage and fundamentally change an industry’s approach to energy through innovation. We must continue this pace of investment and innovation if we are to deliver the lowest possible impact of our digital lives on the environment.
Ciaran is VP and Global Head of Siemens’ Datacenter Solutions business, tasked with the continued development of Siemens as a technology provider to the….
In the U.S., you could have blinked and missed a momentous announcement that impacts our ability to safeguard and improve our planet. I noticed that news coverage of the COP15 on biodiversity in Montreal was significantly less than that of the COP21 agreement on climate held in Paris in 2015. Nor did it receive the fanfare or celebrity endorsement. But, it is an important step forward in ensuring our planet continues to be a diverse ecosystem.
Biodiversity is paramount for the functioning of our planet, and the business case suggests that focusing on biodiversity is as important as reining in our carbon emissions: “Degrading ecosystems could trigger a downward spiral of US$2.7 trillion in global Gross Domestic Product by 2030.” Failures in biodiversity cost the global economy more than $5 trillion a year in the form of lost natural services.
The World Economic Forum in collaboration with PwC found that “$44 trillion of economic value generation—more than half of the world’s total GDP—is moderately or highly dependent on nature and its services and is therefore exposed to nature loss.”
There are 23 environmental targets stipulated under the COP15 agenda. Of these, the most recognized is the “30 by 30” conservation target, so named because it mandates that by 2030, governments will ensure that 30% of the planet is under protection.
This will be accomplished through the creation of protected areas and other area-based conservation measures. Currently, approximately 17% of land and roughly 8% of the oceans are protected. Additionally, the deal would roughly double overall financing for biodiversity protection to $200 billion a year from all sources.
But, taken all together, I think businesses can do more. Conservation is a powerful component of maintaining our biodiversity. As of 2019, one study found nearly 30,000 species at risk of extinction, while another put the figure at nearly 1 million. A healthy planet, one with a fully functioning diverse ecosystem, requires that we also determine how to recover those lost species.
It’s important for companies to plan for and align on the need to support biodiversity. Through new technologies, we can improve species resiliency and grow species capacity. Some companies, like mine, are even working on bringing back species that are needed for healthy ecosystems. These steps will allow us to create more hospitable spaces for species.
In the next five years, I think we will see scientists actively recovering species, protecting species from diseases, and creating species that are more resistant to extreme weather.
More and more companies are joining efforts to support the environment. But other businesses also have the opportunity to get involved in supporting biodiversity.
1. Identify your full environmental impact.
First, start by understanding your organization’s impact on the environment as a whole, not just your carbon emissions. Emissions are one aspect of climate change, but they paint an incomplete picture. I suggest business leaders also undertake a materiality assessment to understand the material impacts and dependencies of their organizations. The Science Based Targets Network offers a guide that companies can use.
2. Evaluate business risks and opportunities related to nature and biodiversity.
This evaluation can lead to actions that help businesses understand the impact of biodiversity loss on their financial and business outcomes. One tool organizations can use is the framework from the Taskforce on Nature-related Financial Disclosures (TNFD), which helps evaluate and manage nature-related risks.
3. Set goals and raise awareness around your goals.
Measure and set targets for land use, freshwater use and ecosystem integrity. Again, there are useful tools available for companies attempting to measure and set biodiversity targets. And then, commit publicly to those goals to reduce waste and prioritize the protection of the planet as a core aspect of business. This could include signing corporate pledges.
4. Consider nature-based solutions.
New companies and legacy companies alike can consider “nature-based solutions,” built on the idea that companies can restore nature and mitigate and adapt to climate change while also supporting the livelihoods and interests of local people. In doing so, businesses can transform themselves to restore and regenerate landscapes.
Stopping climate change and protecting plants and animals are, and should be recognized as, linked goals. Reducing carbon emissions is vital. Recovering species is too.
This is all possible, but requires technology investment and ideological alignment. Protecting biodiversity is more than just conserving what we have; it’s planning a pathway to a better tomorrow.
A new study in Nature Sustainability incorporates the damages that climate change does to healthy ecosystems into standard climate-economics models. The key finding in the study by Bernardo Bastien-Olvera and Frances Moore from the University of California at Davis:
The models have been underestimating the cost of climate damages to society by a factor of more than five. Their study concludes that the most cost-effective emissions pathway results in just 1.5 degrees Celsius (2.7 degrees Fahrenheit) additional global warming by 2100, consistent with the “aspirational” objective of the 2015 Paris Climate Agreement.
Models that combine climate science and economics, called “integrated assessment models” (IAMs), are critical tools in developing and implementing climate policies and regulations.
In 2010, an Obama administration governmental interagency working group used IAMs to establish the social cost of carbon – the first federal estimates of climate damage costs caused by carbon pollution. That number guides federal agencies required to consider the costs and benefits of proposed regulations.
Economic models of climate have long been criticized by those convinced they underestimate the costs of climate damages, in some cases to a degree that climate scientists consider absurd. Given the importance of the social cost of carbon to federal rulemaking, some critics have complained that the Trump EPA used what they see as creative accounting to slash the government’s estimate of the number. In one of his inauguration day Executive Orders, President Biden established a new Interagency Working Group to re-evaluate the social cost of all greenhouse gases.
Perhaps the most prominent IAM is the Dynamic Integrated Climate-Economy (DICE) model, whose creator, William Nordhaus, was awarded the 2018 Nobel Prize in Economic Sciences.
Judging by DICE, the economically optimal carbon emissions pathway – that is, the pathway considered most cost-effective – would lead to warming of more than 3°C (5.4°F) above pre-industrial temperatures by 2100 (under a 3% discount rate). The IPCC has reported that this level of warming would likely bring severe consequences, including substantial species extinctions and very high risks of food supply instabilities.
In their Nature Sustainability study, the UC Davis researchers find that when natural capital is incorporated into the models, the emissions pathway that yields the best outcome for the global economy is more consistent with the dangerous risks posed by continued global warming described in the published climate science literature.
Accounting for climate change’s degradation of natural capital
Natural capital includes elements of nature that produce value to people either directly or indirectly. “DICE models economic production as a function of generic capital and labor,” Moore explained via email. “If instead you think natural capital plays some distinct role in economic production, and that climate change will disproportionately affect natural capital, then the economic implications are much larger than if you just roll everything together and allow damage to affect output.”
Bastien-Olvera offered an analogy to explain the incorporation of natural capital into the models: “The standard approach looks at how climate change is damaging ‘the fruit of the tree’ (market goods); we are looking at how climate change is damaging the ‘tree’ itself (natural capital).” In an adaptation of DICE they call “GreenDICE,” the authors incorporated climate impacts on natural capital via three pathways:
The first pathway accounts for the direct influence of natural capital on market goods. Some industries like timber, agriculture, and fisheries are heavily dependent on natural capital, but all goods produced in the economy rely on these natural resources to some degree.
According to GreenDICE, this pathway alone more than doubles the model’s central estimate of the social cost of carbon in 2020 from $28 per ton in the standard DICE model to $72 per ton, and the new economically optimal pathway would have society limit global warming to 2.2°C (4°F) above pre-industrial temperatures by 2100.
The second pathway incorporates ecosystem services that don’t directly feed into market goods. Examples are the flood protection provided by a healthy mangrove forest, or the recreational benefits provided by natural places.
In the study, this second pathway nearly doubles the social cost of carbon once again, to $133 per ton in 2020, and it lowers the most cost-effective pathway to 1.8°C (3.2°F) by 2100. Finally, the third pathway includes non-use values, which incorporate the value people place on species or natural places, regardless of any good they produce. The most difficult to quantify, this pathway could be measured, for instance, by asking people how much they would be willing to pay to save one of these species from extinction.
In GreenDICE, non-use values increase the social cost of carbon to $160 per ton of carbon dioxide in 2020 (rising to about $300 in 2050 and $670 per ton in 2100) and limit global warming to about 1.5°C (2.7°F) by 2100 in the new economically optimal emissions pathway. (Note for economics wonks: the model runs used a 1.5% pure rate of time preference.)
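To illustrate the accounting point only – this is a deliberately toy calculation with invented numbers, not GreenDICE or any published model – counting damages to natural capital and non-market ecosystem services on top of market output mechanically raises the discounted value of climate damages, which is the qualitative driver of the higher social cost of carbon:

```python
# Toy illustration (NOT GreenDICE): discounted climate damages when
# damages hit market output only vs. when natural-capital and non-market
# ecosystem losses are also counted. All parameter values are invented.

DISCOUNT_RATE = 0.015          # akin to a 1.5% pure rate of time preference
YEARS = range(2020, 2101)

def temperature(year: int) -> float:
    """Assumed warming path: linear rise from 1.2C in 2020 to 3.0C in 2100."""
    return 1.2 + (3.0 - 1.2) * (year - 2020) / 80

def market_damages(t: float) -> float:
    """Market-output damages, $trillion/yr (toy quadratic damage function)."""
    return 0.3 * t ** 2

def natural_capital_damages(t: float) -> float:
    """Extra damages to natural capital and non-market services (toy)."""
    return 0.4 * t ** 2

def present_value(damage_fn) -> float:
    return sum(damage_fn(temperature(y)) / (1 + DISCOUNT_RATE) ** (y - 2020)
               for y in YEARS)

pv_market = present_value(market_damages)
pv_full = present_value(lambda t: market_damages(t) + natural_capital_damages(t))

print(f"PV, market damages only:   ${pv_market:7.1f} trillion")
print(f"PV, incl. natural capital: ${pv_full:7.1f} trillion")
print(f"Ratio (full / market-only): {pv_full / pv_market:.2f}x")
# Broader accounting multiplies estimated damages, which is the
# qualitative mechanism behind the higher social cost of carbon.
```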
Climate economics findings increasingly reinforce Paris targets
It may come as no surprise that destabilizing Earth’s climate would be a costly proposition, but key IAMs have suggested otherwise. Based on the new Nature Sustainability study, the models have been missing the substantial value of natural capital associated with healthy ecosystems that are being degraded by climate change.
Columbia University economist Noah Kaufman, not involved in the study, noted via email that as long as federal agencies use the social cost of carbon in IAMs for rulemaking cost-benefit analyses, efforts like GreenDICE are important to improving those estimates. According to Kaufman, many papers (including one he authored a decade ago) have tried to improve IAMs by following a similar recipe: “start with DICE => find an important problem => improve the methodology => produce a (usually much higher) social cost of carbon.”
For example, several other papers published in recent years, including one authored by Moore, have suggested that, because they neglect ways that climate change will slow economic growth, IAMs may also be significantly underestimating climate damage costs. Poorer countries – often located in already-hot climates near the equator, with economies relying most heavily on natural capital, and lacking resources to adapt to climate change – are the most vulnerable to its damages, despite their being the least responsible for the carbon pollution causing the climate crisis.
Another recent study in Nature Climate Change updated the climate science and economics assumptions in DICE and similarly concluded that the most cost-effective emissions pathway would limit global warming to less than 2°C (3.6°F) by 2100, without even including the value of natural capital. Asked about that paper, Bastien-Olvera noted, “In my view, the fact that these two studies get to similar policy conclusions using two very different approaches definitely indicates the urgency of cutting emissions.”
Wesleyan University economist Gary Yohe, also not involved in the study, agreed that the new Nature Sustainability study “supports growing calls for aggressive near-term mitigation.” Yohe said the paper “provides added support to the notion that climate risks to natural capital are important considerations, especially in calibrating the climate risk impacts of all sorts of regulations like CAFE standards.”
But Yohe said he believes that considering the risks to unique and threatened systems at higher temperatures makes a more persuasive case for climate policy than just attempting to assess their economic impacts. In a recent Nature Climate Change paper, Kaufman and colleagues similarly suggested that policymakers should select a net-zero emissions target informed by the best available science and economics, and then use models to set a carbon price that would achieve those goals.
Their study estimated that to reach net-zero carbon pollution by 2050, the U.S. should set a carbon price of about $50 per ton in 2025, rising to $100 per ton by 2030. However climate damages are evaluated – whether through a more complete economic accounting of adverse impacts or via risk-based assessments of physical threats to ecological and human systems – recent economics and climate science research consistently supports more aggressive cuts to carbon emissions, in line with the Paris climate targets.
You might be using an app to read this very article. And if you’re reading it on an iPhone, then you got that app through the App Store, the Apple-owned and -operated gateway for apps on its phones. But a lot of people want that to change.
Apple is facing growing scrutiny for the tight control it has over so much of the mobile-first, app-centric world it created. The iPhone, which was released in 2007, and the App Store, which came along a year later, helped make Apple one of the most valuable companies on the planet, as well as one of the most powerful. Now, lawmakers, regulators, developers, and consumers are questioning the extent and effects of that power — including if and how it should be reined in.
Efforts in the United States and abroad could significantly loosen Apple’s grip over one of its most important lines of business and fundamentally change how iPhone and iPad users get and pay for their apps. It could make many more apps available. It could make them less safe. And it could make them cheaper.
The iPhone maker isn’t the only company under the antitrust microscope. Once lauded as shining beacons of innovation and ingenuity that would guide the world into the 21st century, the Big Tech companies – Apple among them – now stand accused of amassing too much power over parts of the economy that have become as essential as steel, oil, and the telephone were in centuries past.
These companies have a great deal of control over what we can do on our phones, the items we buy online and how they get to our homes, our personal data, the internet ecosystem, even our online identities. Some believe the best way to deal with Big Tech now is the way we dealt with steel, oil, and telephone monopolies decades ago: by using antitrust laws to place restrictions on them or even break them up.
And if our existing laws can’t do it, legislators want to introduce new laws that target the digital marketplace. In her book Monopolies Suck, antitrust expert Sally Hubbard described Apple as a “warm and fuzzy monopolist” when compared to Facebook, Google, and Amazon, the other three companies in the so-called Big Four that have been accused of being too big.
It doesn’t quite have the negative public perception that its three peers have, and the effects of its exclusive control over mobile apps on its consumers aren’t as obvious. For many people, Facebook, Google, and Amazon are unavoidable realities of life on the internet these days, while Apple makes products they choose to buy.
But more than half of the smartphones in the United States are iPhones, and as those phones become integrated into more facets of our daily lives, Apple’s exclusive control over what we can do with them and which apps we can use becomes more problematic. It’s also an outlier; the rival mobile operating system Android allows apps to be installed from virtually any source, though individual app stores may have their own restrictions.
Apple makes the phones. But should Apple set the rules over everything we can do with them? And what are iPhone users missing out on when one company controls so much of their experience on them?
Apple’s vertical integration model was fine until it wasn’t
Many of the problems Apple faces now come from a principle of its business model: Maintain as much control as possible over as many aspects of its products as possible. This is unusual for a computer manufacturer. You can buy a computer with a Microsoft operating system from a variety of manufacturers, and nearly 1,300 brands sell devices with Google’s Android operating system.
But Apple’s operating systems — macOS, iOS, iPadOS, and watchOS — run only on Apple’s devices. Apple has said it does this to ensure that its products are easy to use, private, and secure. It’s a selling point for the company and a reason some customers are willing to pay a premium for Apple devices.
While most people are still getting familiar with the growing role of distributed ledgers in the wider economy, developers are already delving deeper into more advanced functionality. Right now, software engineering teams are working on technology that will become integral to how the global digital economy functions in the years to come.
One critical innovation in Web3 that will facilitate wide-scale adoption is the subnet. While it is possible to explore these concepts in depth, a high-level overview is more useful here. When the complicated terminology is stripped away, the core concepts are actually very relatable.
What Is The Crypto Scalability Problem?
In crypto “lingo,” blockchains are divided into Layer One (L1) and Layer Two (L2). Again, this sounds more complicated than it is. L2 networks address specific problems that an L1 blockchain cannot cope with on its own; they are built “on top” of earlier blockchains.
A prime example is Bitcoin. Bitcoin, the first cryptocurrency, was a wonderful innovation for its time, but it quickly ran into serious scalability problems, with high fees and network congestion. So it needed an L2 solution, known as the Lightning Network. In the same way, Ethereum ran into issues as its ecosystem grew, and turned to L2 solutions such as Plasma.
Unfortunately, these L2 solutions are not doing what they are supposed to do. Ethereum is still the primary ecosystem on which dApps are built and NFTs are traded (as ERC-721 tokens). Still, it has very high fees, which is why developers and market newcomers are looking towards alternatives such as Avalanche.
L2 solutions are not resolving the problem of scalability. High fees and slow speeds are powerful barriers to cryptocurrency going mainstream. If cryptocurrency is to achieve global adoption, it needs to be able to cope with far more people on the network, and L2 solutions have not yet proven up to the mark. Subnets are a much more versatile and effective technology.
Exploring Subnets Within Web3
Subnets are a game-changer for crypto scalability. A subnet is simply a sub-network within a larger network. Each blockchain is itself a network – a set of nodes/servers that communicate with each other through shared protocols. A subnet takes attributes from the parent chain/larger network but serves a specific use case.
Subnets are closely related to the concept of sharding. They are very reliable, efficient, and better at solving scalability than L2 blockchains. The major difference between subnets and sharding is that subnets can be created at will by customers and developers.
While sharding is fixed in a chain’s architecture, subnets can be launched at will – effectively applying the sharding model on demand – so you can create as many as you need and see which ones scale best. In other words, you can spin up any number of subnets that take the best attributes from the initial blockchain network, and these subnets can be put to a variety of different uses.
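As a purely conceptual sketch of that relationship – this is not any blockchain’s real API – a subnet can be thought of as drawing its validators from a parent network and running its own chains for a specific use case:

```python
# Purely conceptual sketch of the subnet idea described above -- not any
# blockchain's real API. A subnet draws its validators from the parent
# network and runs its own chains for a specific use case.
from dataclasses import dataclass, field

@dataclass
class Network:
    name: str
    validators: set[str]

@dataclass
class Subnet:
    name: str
    parent: Network
    validators: set[str] = field(default_factory=set)
    chains: list[str] = field(default_factory=list)

    def add_validator(self, node_id: str) -> None:
        # Inherit the parent's security assumptions: only nodes that already
        # validate the parent network may validate this subnet.
        if node_id not in self.parent.validators:
            raise ValueError(f"{node_id} must validate {self.parent.name} first")
        self.validators.add(node_id)

primary = Network("primary", validators={"node-A", "node-B", "node-C"})
game_subnet = Subnet("game-subnet", parent=primary)
game_subnet.add_validator("node-A")
game_subnet.add_validator("node-B")
game_subnet.chains.append("game-chain")
print(game_subnet)
```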
Subnets Are Already Taking Over
Avalanche is a prominent blockchain that has recently launched subnets, allowing many newer Web3 projects to build their own ecosystems. Ayoken Labs, a digital collectibles marketplace that connects creators to global audiences, has launched its token on the Avalanche C-Chain. With a vision to onboard 10 million new crypto users and digital collectible owners, Ayoken Labs aims to catalyze the mainstream adoption of crypto in emerging markets. It is onboarding creatives to the metaverse, and it selected Avalanche to assist with this venture.
Avalanche’s primary network comprises three built-in chains: the P-Chain for staking, the X-Chain for sending and receiving transfers, and the C-Chain for smart contracts (broadly speaking). These distinct chains serve specific purposes, but all are validated by the primary network, carrying its benefits with them.
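For developers, the practical upshot is that the C-Chain speaks standard Ethereum JSON-RPC, so familiar EVM tooling works against it. A minimal sketch using web3.py (v6+ assumed; the public endpoint URL below is the commonly published one and should be verified before use):

```python
# Minimal sketch: reading from Avalanche's EVM-compatible C-Chain over
# JSON-RPC with web3.py (v6+ assumed). The endpoint URL is an assumption;
# check current public endpoints before relying on it.
from web3 import Web3

AVALANCHE_C_CHAIN_RPC = "https://api.avax.network/ext/bc/C/rpc"  # assumed public endpoint

w3 = Web3(Web3.HTTPProvider(AVALANCHE_C_CHAIN_RPC))

if not w3.is_connected():
    raise RuntimeError("Could not reach the C-Chain RPC endpoint")

# The C-Chain speaks standard Ethereum JSON-RPC, so the usual calls work.
print("chain id:     ", w3.eth.chain_id)       # 43114 on Avalanche mainnet
print("latest block: ", w3.eth.block_number)
print("gas price:    ", w3.eth.gas_price, "wei")
```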
Ankr is another web3 company that aims to be a major player in the subnet/sidechain space. Ankr is a major Web3 infrastructure provider that launched the first Binance Smart Chain Application Sidechain (BAS) testnet, along with Celer and NodeReal.
The BAS testnet is a framework for creating sidechains dedicated to applications in the BNB Chain ecosystem. Ankr is also the main infrastructure partner for Binance, Fantom, and Polygon and has helped these major networks to scale. Ankr also launched the first game on the Binance Application Sidechain (BAS).
It is currently the leading RPC provider and offers a cost-effective way to build, deploy, and scale in Web3. Its low latency and high resilience are visible in many tracking tools.
As a major infrastructure provider, Ankr is also looking to launch subnets so that Web3 projects can grow from a stable, fast, and efficient foundation. This enables projects to test and grow without being “locked-in” to a previous blockchain.
Crypto Scalability Issues: A Thing Of The Past
Subnet functionality is set to become a core building block of the future of Web3, because it addresses perhaps the most pressing limitation of previous blockchains: scalability. Development teams can tweak and test in secure environments and can create as many subnets as they wish.
These innovations will ultimately help to grow the wider ecosystem and accelerate the replacement of legacy systems that are already obsolete.
By Victor Fabusola, Crypto Writer & Blockchain Journalist. Lover of mental models and conscious hip-hop.