Weather tracking is difficult work, and has historically relied on satellites that are large and cost millions of dollars to launch into space. And with the threat of climate change making tropical storms, tornadoes and other weather events more violent around the world, people are increasingly reliant on early warnings and real-time monitoring.
However, NASA is looking to change that by deploying a new breed of weather satellite that takes advantage of recent advances in miniaturization. This class of satellite is known as RainCube (Radar in a CubeSat), which uses experimental technology to see storms by detecting rain and snow with very small and sophisticated instruments.
Stand outside and take a deep breath. Do you know what you’re breathing? For most people, the answer is simple – air. And air, which is essential to life as we know it, is composed of roughly twenty-one percent oxygen gas (O₂) and seventy-eight percent nitrogen gas (N₂). However, within the remaining one percent and change are several other trace gases, as well as a few other ingredients that are not always healthy.
In the 1950s, famed physicist Enrico Fermi posed the question that encapsulated one of the toughest questions in the Search for Extra-Terrestrial Intelligence (SETI): “Where the heck is everybody?” What he meant was, given the age of the Universe (13.8 billion years), the sheer number of galaxies (between 1 and 2 trillion), and the overall number of planets, why has humanity still not found evidence of extra-terrestrial intelligence?
This question, which has come to be known as the “Fermi Paradox”, is something scientists continue to ponder. In a new study, a team from the University of Rochester considered that perhaps Climate Change is the reason. Using a mathematical model based on the Anthropocene, they considered how civilizations and planet systems co-evolve and whether or not intelligent species are capable of living sustainably with their environment.
Today, Climate Change is one of the most pressing issues facing humanity. Thanks to changes that have taken place in the past few centuries – i.e. the industrial revolution, population growth, the growth of urban centers and reliance on fossil fuels – humans have had a significant impact on the planet. In fact, many geologists refer to the current era as the “Anthropocene” because humanity has become the single greatest factor affecting planetary evolution.
In the future, populations are expected to grow even further, reaching about 10 billion by mid-century and over 11 billion by 2100. In that time, the number of people who live within urban centers will also increase dramatically, from 54% to 66% by mid-century. As such, the question of how billions of people can live sustainably has become an increasingly important one.
“Astrobiology is the study of life and its possibilities in a planetary context. That includes ‘exo-civilizations’ or what we usually call aliens. If we’re not the universe’s first civilization, that means there are likely to be rules for how the fate of a young civilization like our own progresses.”
Using the Anthropocene as an example, one can see how civilization-planet systems co-evolve, and how a civilization can endanger itself through growth and expansion – in what is known as a “progress trap”. Basically, as civilizations grow, they consume more of the planet’s resources, which causes changes in the planet’s conditions. In this sense, the fate of a civilization comes down to how it uses its planet’s resources.
In order to illustrate this process, Frank and his collaborators developed a mathematical model that considers civilizations and planets as a whole. As Prof. Frank explained:
“The point is to recognize that driving climate change may be something generic. The laws of physics demand that any young population, building an energy-intensive civilization like ours, is going to have feedback on its planet. Seeing climate change in this cosmic context may give us better insight into what’s happening to us now and how to deal with it.”
The model was also based on case studies of extinct civilizations, which included the famous example of what became of the inhabitants of Rapa Nui (aka. Easter Island). According to archaeological studies, the people of the South Pacific began colonizing this island between 400 and 700 CE and its population peaked at 10,000 sometime between 1200 and 1500 CE.
By the 18th century, however, the inhabitants had depleted their resources and the population declined to just 2000. This example raises the important concept known as “carrying capacity”, which is the maximum number of individuals of a given species that an environment can support. As Frank explained, Climate Change is essentially how the Earth responds to the expansion of our civilization:
“If you go through really strong climate change, then your carrying capacity may drop, because, for example, large-scale agriculture might be strongly disrupted. Imagine if climate change caused rain to stop falling in the Midwest. We wouldn’t be able to grow food, and our population would diminish.”
Using their mathematical model, the team identified four potential scenarios that might occur on a planet. These include the Die-Off scenario, the Sustainability scenario, the Collapse Without Resource Change scenario, and the Collapse With Resource Change scenario. In the Die-Off scenario, the population and the planet’s state (for example, average temperatures) rise very quickly.
This would eventually lead to a population peak and then a rapid decline as changing planetary conditions make it harder for the majority of the population to survive. Eventually, a steady population level would be achieved, but it would only be a fraction of what the peak population was. This scenario occurs when civilizations are unwilling or unable to change from high-impact resources (i.e. oil, coal, clear-cutting) to sustainable ones (renewable energy).
In the Sustainability scenario, the population and planetary conditions both rise, but eventually level off together at steady values, thus avoiding any catastrophic effects. This scenario occurs when civilizations recognize that environmental changes threaten their existence and successfully make the transition from high-impact resources to sustainable ones.
The final two scenarios – Collapse Without Resource Change and Collapse With Resource Change – differ in one key respect. In the former, the population and temperature both rise rapidly until the population reaches a peak and begins to drop rapidly – though it is not clear if the species itself survives. In the latter, the population and temperature rise rapidly, but the population recognizes the danger and makes the transition. Unfortunately, the change comes too late and the population collapses anyway.
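As a rough illustration of how such a civilization-planet model behaves, the toy simulation below couples a logistically growing population to a planetary state that degrades the carrying capacity. Every parameter name and value here is invented for illustration – these are not the actual equations or numbers from Frank and his colleagues’ study. Switching from a high-impact to a low-impact resource early enough produces a Sustainability-like outcome, while never switching produces the peak-and-decline shape of the Die-Off scenario.

```python
# Toy sketch of civilization-planet co-evolution (illustrative only,
# not the equations from the Frank et al. study).
# n  = population (arbitrary units)
# e  = planetary-state anomaly (e.g. temperature), forced by resource use
# The carrying capacity shrinks as e rises; the civilization may switch
# from a high-impact to a low-impact resource at time t_switch.

def simulate(t_switch, steps=5000, dt=0.1):
    n, e = 0.01, 0.0
    growth, k0, damage = 0.1, 1.0, 1.0
    high, low, relax = 0.08, 0.02, 0.05
    history = []
    for i in range(steps):
        t = i * dt
        impact = high if t < t_switch else low   # resource choice
        k = max(k0 - damage * e, 1e-6)           # degraded carrying capacity
        n += dt * growth * n * (1 - n / k)       # logistic growth toward k
        e += dt * (impact * n - relax * e)       # forcing vs. planetary recovery
        history.append((t, n, e))
    return history

early = simulate(t_switch=50)    # early transition: settles at a high level
late = simulate(t_switch=1e9)    # never switches: peaks, then declines
```

Running both cases shows the qualitative difference the study describes: the non-switching population peaks and then falls to a fraction of its maximum, while the early-switching one stabilizes at a substantially higher steady value.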
At present, scientists cannot say with any confidence which of these fates will be the one humanity faces. Perhaps we will make the transition before it is too late, perhaps not. But in the meantime, Frank and his colleagues hope to use more detailed models to predict how planets will respond to civilizations and the different ways they consume energy and resources in order to grow.
From this, scientists may be able to refine their predictions of what awaits us in this century and the next. It is during this time that crucial changes will be taking place, which include the aforementioned population growth, and the steady rise in temperatures. For instance, based on two scenarios that measured CO2 increases by the year 2100, NASA indicated that global temperatures could rise by either 2.5 °C (4.5 °F) or 4.4 °C (8 °F).
In the former scenario, where CO2 levels reached 550 ppm by 2100, the changes would be sustainable. But in the latter scenario, where CO2 levels reached 800 ppm, the changes would cause widespread disruption to systems that billions of humans depend upon for their livelihood and survival. Worse than that, life would become untenable in certain areas of the world, leading to massive displacement and humanitarian crises.
In addition to offering a possible resolution for the Fermi Paradox, this study offers some helpful advice for human beings. By thinking of civilizations and planets as a whole – be they Earth or exoplanets – researchers will be able to better predict what changes will be necessary for human civilization to survive. As Frank warned, it is absolutely essential that humanity mobilize now to ensure that the worst-case scenario does not occur here on Earth:
“If you change the earth’s climate enough, you might not be able to change it back. Even if you backed off and started to use solar or other less impactful resources, it could be too late, because the planet has already been changing. These models show we can’t just think about a population evolving on its own. We have to think about our planets and civilizations co-evolving.”
And be sure to enjoy this video that addresses Prof. Frank and his team’s research, courtesy of the University of Rochester:
One of the most visible signs of Climate Change is the way in which glaciers and ice sheets have been disappearing all over the world. This trend is not restricted to the Arctic ice cap or the Antarctic Basin, of course. On every part of the planet, scientists have been monitoring glaciers that have shrunk in the past few decades to determine their rate of loss.
These activities are overseen by NASA’s Earth Observatory, which relies on instruments like the Landsat satellites to monitor seasonal ice losses from orbit. As these satellites demonstrated with a series of recently released images, the Puncak Jaya ice fields on the South Pacific island of New Guinea have been receding over the past three decades, and are at risk of disappearing in just a decade.
The Papua province of New Guinea has a very rugged landscape, dominated by the mountains of the Sudirman Range. The tallest peaks in this range are Puncak Jaya and Ngga Pulu, which stand 4,884 meters (16,020 feet) and 4,862 meters (15,950 feet) above sea level, respectively. Despite being located in the tropics, the elevation of these peaks allows them to sustain small fields of “permanent” ice.
Given the geography, these ice fields are incredibly rare. In fact, within the tropics, the closest glacial ice is found 11,200 km (6,900 mi) away on Mount Kenya in Africa. Otherwise, one has to venture north for about 4,500 km (2,800 mi) to Mount Tate in central Japan, where glacial ice is more common since it is much farther away from the equator.
Sadly, these rare glaciers are becoming more threatened with every passing year. Like all tropical glaciers in the world today, the glaciers on the slopes around Puncak Jaya have been shrinking at such a rate that scientists estimate they could be gone within a decade. This was illustrated by a pair of Landsat images that show how the ice fields have shrunk over the past thirty years.
The first of these images (shown above) was acquired on November 3rd, 1988, by the Thematic Mapper instrument aboard the Landsat 5 satellite. The second image (shown below) was acquired on December 5th, 2017, by the Operational Land Imager (OLI) on the Landsat 8 satellite. These false-color images are a combination of shortwave infrared, infrared, near-infrared, and red light.
The extent of the ice fields is shown in light blue, whereas rocky areas are represented in brown, vegetation in green, and clouds in white. The gray circular area near the center of the 2017 image is the Grasberg mine, the largest gold and second-largest copper mine in the world. This mine expanded considerably between the 1980s and 2000s as a result of a boom in copper prices.
As the images show, in 1988, there were five masses of ice resting on the mountain slopes – the Meren, Southwall, Carstensz, East Northwall Firn and West Northwall Firn glaciers. However, by 2017, only the Carstensz and a small portion of the East Northwall Firn glaciers remained. As Christopher Shuman, a research professor at the University of Maryland Baltimore County and NASA’s Goddard Space Flight Center, explained:
“The ice area losses since the 1980s here are quite striking, visible in the contrast of the blue ice with the reddish bedrock. Even though the area still gets snowfalls, it is clearly not sustaining these glacial remnants.”
Similarly, in 2009, images taken by Landsat 5 of these same glaciers (see below) indicated that the Meren and Southwall glaciers had disappeared. Meanwhile, the Carstensz, East Northwall Firn and West Northwall Firn glaciers had retreated dramatically. Based on the rate of loss, scientists estimated at the time that all of Puncak Jaya’s glaciers would be gone within 20 years.
As these latest images demonstrate, their estimates were right on the money. At their current rate, what remains of the Carstensz and East Northwall Firn glaciers will be gone by the late 2020s. The primary cause of the ice loss is rising air temperatures, which leads to rapid sublimation. However, changes in humidity levels, precipitation patterns and cloudiness can also have an impact.
Humidity is also important, since it affects how readily glaciers can lose mass directly to the atmosphere. Where the air is more moist, ice makes the transition to water more easily, and that water can be returned to the glacier in the form of precipitation. Where the air is predominantly dry, ice passes directly from a solid to a gaseous form (aka. sublimation).
Temperature and precipitation are also closely linked to ice loss. Where temperatures are low enough, precipitation takes the form of snow, which can sustain glaciers and cause them to grow. Rainfall, on the other hand, will cause ice sheets to melt and recede. And of course, clouds affect how much sunlight reaches the glacier’s surface, which results in warming and sublimation.
For many tropical glaciers, scientists are still working out the relative importance of these factors and attempting to determine to what extent anthropogenic factors play a role. In the meantime, tracking how these changes are leading to ice loss in the tropical regions provides scientists with a means of comparison when studying ice loss in other parts of the world.
As Andrew Klein, a geography professor at Texas A & M University who has studied the region, explained:
“Glacier recession continues in the tropics—these happen to be the last glaciers in the eastern tropics. Fortunately, the impact will be limited given their small size and the fact that they do not represent a significant water resource.”
Satellites continue to play an important role in the monitoring process, giving scientists the ability to map glacier ice loss, track seasonal changes, and draw comparisons between different parts of the planet. They also allow scientists to monitor remote and inaccessible areas of the planet to see how they too are being affected. Last, but not least, they allow scientists to estimate the timing of a glacier’s disappearance.
According to modern theories of geological evolution, the last major ice age (known as the Pliocene-Quaternary glaciation) began about 2.58 million years ago during the late Pliocene Epoch. Since then, the world has experienced several glacial and interglacial periods, and has been in an inter-glacial period (where the ice sheets have been retreating) ever since the last glacial period ended about 10,000 years ago.
According to new research, this trend experienced a bit of a hiccup during the late Paleolithic era. It was at this time – roughly 12,800 years ago, according to a new study from the University of Kansas – that a comet struck our planet and triggered massive wildfires. This impact also triggered a short glacial period that temporarily reversed the previous period of warming, which had a drastic effect on wildlife and human development.
The study in question, “Extraordinary Biomass-Burning Episode and Impact Winter Triggered by the Younger Dryas Cosmic Impact ~12,800 Years Ago”, was so large that it was divided into two parts: Part I, Ice Cores and Glaciers; and Part II, Lake, Marine, and Terrestrial Sediments. Both were recently published in The Journal of Geology, part of the University of Chicago Press’ series of scientific publications.
For the sake of their study, the team combined data from ice core, forest, pollen and other geochemical and isotopic markers obtained from more than 170 different sites across the world. Based on this data, the team concluded that roughly 12,800 years ago, a global disaster was triggered when a stream of fragments from a comet measuring about 100 km (62 mi) in diameter exploded in Earth’s atmosphere and rained down on the surface.
As KU Emeritus Professor of Physics & Astronomy Adrian Melott explained in a KU press release:
“The hypothesis is that a large comet fragmented and the chunks impacted the Earth, causing this disaster. A number of different chemical signatures — carbon dioxide, nitrate, ammonia and others — all seem to indicate that an astonishing 10 percent of the Earth’s land surface, or about 10 million square kilometers, was consumed by fires.”
According to their research, these massive wildfires also caused a massive feedback in Earth’s climate. As fires rushed across much of the planet’s landscape, the smoke and dust clogged the sky and blocked out sunlight. This triggered rapid cooling in the atmosphere, causing plants to die, food sources to dwindle, and ocean levels to drop. Last, but not least, the ice sheets which had been previously retreating began to advance again.
This quasi-ice age, according to the study, lasted about another thousand years. When the climate began to warm again, life began to recover, but was faced with a number of drastic changes. For example, fewer large animals survived, which affected the hunter-gatherer culture of humans all across North America. This was reflected in the different types of spear points that have been dated to this period.
What’s more, pollen samples obtained from this period indicate that pine forests were likely burned off and were replaced by poplar forests, a species that colonizes cleared areas. The authors also suggest that this impact could have been responsible for the so-called Younger Dryas cool episode. This period occurred roughly 12,000 years ago, when gradual climatic warming was temporarily reversed.
Intrinsic to this period was an increase of biomass burning and the extinctions of larger species during the late Pleistocene period (ca. 2,588,000 to 11,700 years ago). These sudden changes are believed to be what led to severe shifts in human populations, causing a decline during the 1000-year cold period, and leading to the adoption of agriculture and animal husbandry once the climate began to warm again.
In short, this new theory could help explain a number of changes that made humanity what it is today. As Melott indicated:
“Computations suggest that the impact would have depleted the ozone layer, causing increases in skin cancer and other negative health effects. The impact hypothesis is still a hypothesis, but this study provides a massive amount of evidence, which we argue can only be all explained by a major cosmic impact.”
These studies not only provide insight into the timeline of Earth’s geological evolution, they also shed light on the history of the Solar System. According to this study, remnants of the comet that struck Earth still persist within our Solar System today. Last, but not least, the climate shifts that these impacts created had a profound effect on the evolution of life here on Earth.
For almost two decades, NASA’s Earth Observatory has provided a constant stream of information about the Earth’s climate, water cycle, and meteorological patterns. This information has allowed scientists to track weather systems, map urban development and agriculture, and monitor for changes in the atmosphere. This has been especially important given the impact of Anthropogenic Climate Change.
Consider the animation recently released by the Earth Observatory, which shows how the city of Cape Town, South Africa has been steadily depleting its supply of fresh water over the past few years. Based on multiple sources of data, this illustration and the images it is based on show how urbanization, over-consumption, and changes in weather patterns around Cape Town are leading to a water crisis.
The images that make up this animation are partly based on satellite data of Cape Town’s six major reservoirs, acquired between January 3rd, 2014, and January 14th, 2018. Of these six reservoirs, the largest is the Theewaterskloof Dam, which has a capacity of 480 billion liters (126.8 billion gallons) and accounts for about 41% of the water storage capacity available to Cape Town.
All told, these dams collectively store up to 898,000 megaliters (230 billion gallons) of water for Cape Town’s four million people. But according to data provided by NASA’s Earth Observatory, Landsat data from the U.S. Geological Survey, and water level data from South Africa’s Department of Water and Sanitation, these reservoirs have been seriously depleted thanks to an ongoing drought in the region.
As you can see from the images (and from the animation above), the reservoirs have been slowly shrinking over the past few years. The extent of the reservoirs is shown in blue while dry areas are represented in grey to show how much their water levels have changed. While the decrease is certainly concerning, what is especially surprising is how rapidly it has taken place.
In 2014, Theewaterskloof was near full capacity, and during the previous year, the weather station at Cape Town airport indicated that the region experienced more rainfall than it had seen in decades. Over 682 millimeters (27 inches) of rain was reported in total that year, whereas 515 mm (20.3 in) is considered to be a normal annual rainfall for the region.
However, the region began to experience a drought in 2015 as rainfall faltered to just 325 mm (12.8 in). The next year was even worse with 221 mm (8.7 in); and in 2017, the station recorded just 157 mm (6.2 in) of rain. As of January 29th, 2018, the six reservoirs were at just 26% of their total capacity and Theewaterskloof Dam was in the worst shape, with just 13% of its capacity.
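A quick back-of-the-envelope script makes the severity of those rainfall numbers plain. The year labels are assumptions based on the figures quoted above (682 mm for the wet year preceding 2014), and 515 mm is taken as a normal year:

```python
# Rainfall at the Cape Town airport station, in mm/year, as quoted above.
# Year labels are assumed from context; 515 mm is treated as normal.
normal = 515
observed = {2013: 682, 2015: 325, 2016: 221, 2017: 157}

for year in (2015, 2016, 2017):
    shortfall = normal - observed[year]
    print(year, f"{shortfall} mm below normal "
                f"({100 * observed[year] / normal:.0f}% of normal)")

# Cumulative shortfall over the three drought years
drought_total = sum(normal - observed[y] for y in (2015, 2016, 2017))
print("three-year shortfall:", drought_total, "mm")
```

By 2017 the station was receiving less than a third of a normal year’s rain, and the three drought years together fell short of normal by more than a full year and a half of average rainfall.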
Naturally, this is rather dire news for Cape Town’s 4 million residents, and has led to some rather stark predictions. According to a recent statement made by the mayor of Cape Town, if current consumption patterns continue then the city’s disaster plan will have to be enacted. Known as Day Zero, this plan will go into effect when the city’s reservoirs reach 13.5% of capacity, and will result in water being turned off for all but hospitals and communal taps.
At this point, most people in the city will be left without tap water for drinking, bathing, or other uses and will be forced to procure water from some 200 collection points throughout the city. At present, Day Zero is expected to happen on April 12th, depending on weather patterns and consumption in the coming months.
Ordinarily, the rainy season lasts from May to September, and the implementation of Day Zero will depend on the level of rainfall. By the end of January, farmers will also stop drawing from the system for irrigation, meaning that water supplies prior to the rainy season could be stretched a little longer.
This is not the first time that Cape Town has been faced with the prospect of a Day Zero. Back in May of 2017, the city was declared a disaster area as the annual rainfall proved to be less than hoped for. This led to the province instituting the Disaster Management Act, which gives the provincial government the power to re-prioritize funding and enact conservation measures to preserve water in preparation for the dry season.
By the following September, Cape Town authorities released a series of guidelines for water usage that banned the use of all drinking water for non-essential purposes and urged people to use less than 87 liters (23 gallons) of water per person, per day. At the same time, authorities indicated that they were pursuing efforts to increase the supply of water by recycling, establishing new desalination facilities, and drilling for new sources of groundwater.
But with the drought going into its fourth year, there is once again fear that the water crisis is not going to end anytime soon. According to an analysis performed by Piotr Wolski, a hydrologist at the Climate Systems Analysis Group at the University of Cape Town, this sort of rainfall pattern is something that happens only once every 1,000 years or so. This conclusion was based on rainfall records dating back to 1923.
However, population growth and a lack of new infrastructure in the region have made the current water crisis what it is. Between 1995 and 2018, the population of Cape Town grew by roughly 80% while the capacity of the region’s dams grew by just 15%. However, the current predicament has accelerated plans to increase the water supply by creating new infrastructure and diverting water from the Berg River to the Voëlvlei Dam (now scheduled for completion by 2019).
For people living in many other parts of the world this story is a very familiar one. This includes California, which has been experiencing annual droughts since 2012; and southern India, which was hit by the worst drought in decades in 2016. All over the planet, growing populations and over-consumption are combining with shifting weather patterns and environmental impact to create a growing water crisis.
But as the saying goes, “necessity is the mother of invention”. And there’s nothing like an impending crisis to make people take stock of a problem and look for solutions!
Beneath the Antarctic ice sheet, there lies a continent that is covered by rivers and lakes, the largest of which is the size of Lake Erie. Over the course of a regular year, the ice sheet melts and refreezes, causing the lakes and rivers to periodically fill and drain rapidly from the melt water. This process makes it easier for Antarctica’s frozen surface to slide around, and to rise and fall in some places by as much as 6 meters (20 feet).
According to a new study led by researchers from NASA’s Jet Propulsion Laboratory, there may be a mantle plume beneath the area known as Marie Byrd Land. The presence of this geothermal heat source could explain some of the melting that takes place beneath the sheet and why it is unstable today. It could also help explain how the sheet collapsed rapidly in the past during previous periods of climate change.
The motion of Antarctica’s ice sheet over time has always been a source of interest to Earth scientists. By measuring the rate at which the ice sheet rises and falls, scientists are able to estimate where and how much water is melting at the base. It is because of these measurements that scientists first began to speculate about the presence of heat sources beneath Antarctica’s frozen surface.
The proposal that a mantle plume exists under Marie Byrd Land was first made 30 years ago by Wesley E. LeMasurier, a scientist from the University of Colorado Denver. According to the research he conducted, this constituted a possible explanation for regional volcanic activity and a topographic dome feature. But it was only more recently that seismic imaging surveys offered supporting evidence for this mantle plume.
However, direct measurements of the region beneath Marie Byrd Land are not currently possible. Hence, Hélène Seroussi and Erik Ivins of the JPL relied on the Ice Sheet System Model (ISSM) to confirm the existence of the plume. This model is essentially a numerical depiction of the physics of the ice sheet, which was developed by scientists at the JPL and the University of California, Irvine.
To ensure that the model was realistic, Seroussi and her team drew on observations of changes in the altitude of the ice sheet made over the course of many years. These were conducted by NASA’s Ice, Cloud and land Elevation Satellite (ICESat) and the airborne Operation IceBridge campaign. These missions have been measuring the Antarctic ice sheet for years, which has led to the creation of very accurate three-dimensional elevation maps.
Seroussi also enhanced the ISSM to include natural sources of heating and heat transport – freezing, melting, liquid water, friction, and other processes. This combined data placed powerful constraints on the allowable melt rates in Antarctica, and allowed the team to run dozens of simulations and test a wide range of possible locations for the mantle plume.
What they found was that the heat flux caused by the mantle plume would not exceed 150 milliwatts per square meter. By comparison, regions where there is no volcanic activity typically experience a heat flux of between 40 and 60 milliwatts, whereas geothermal hotspots – like the one under Yellowstone National Park – experience an average of about 200 milliwatts per square meter.
In simulations where the heat flux exceeded 150 milliwatts per square meter, the melt rate was too high compared to the space-based data. The one exception was an area inland of the Ross Sea that is known to experience intense flows of water. This region required a heat flow of at least 150 to 180 milliwatts per square meter to align with its observed melt rates.
In this region, seismic imaging has also shown that heating might reach the ice sheet through a rift in the Earth’s mantle. This too is consistent with a mantle plume, which are thought to be narrow streams of hot magma rising through the Earth’s mantle and spreading out under the crust. This viscous magma then balloons under the crust and causes it to bulge upward.
Where ice lies over top of the plume, this process transfers heat into the ice sheet, triggering significant melting and runoff. In the end, Seroussi and her colleagues provide compelling evidence – based on a combination of surface and seismic data – for a mantle plume beneath the ice sheet of West Antarctica. They also estimate that this mantle plume formed roughly 50 to 110 million years ago, long before the West Antarctic ice sheet came into existence.
Roughly 11,000 years ago, when the last ice age ended, the ice sheet experienced a period of rapid, sustained ice loss. As global weather patterns and rising sea levels began to change, warm water was pushed closer to the ice sheet. Seroussi and Ivins’ study suggests that the mantle plume could be facilitating this kind of rapid loss today, much as it did during the last onset of an inter-glacial period.
Understanding the sources of ice sheet loss under West Antarctica is important for estimating the rate at which ice may be lost there, which is essential to predicting the effects of climate change. Given that Earth is once again going through global temperature changes – this time, due to human activity – accurate climate models are essential to telling us how rapidly polar ice will melt and sea levels will rise.
It also informs our understanding of how our planet’s history and climate shifts are linked, and what effect these had on its geological evolution.
One of the most worrisome aspects of Climate Change is the role played by positive feedback mechanisms. In addition to global temperatures rising because of increased carbon dioxide and greenhouse gas emissions, there is the added push created by deforestation, ocean acidification, and (most notably) the disappearance of the Arctic Polar Ice Cap.
However, according to a new study by a team of researchers from the School of Earth and Space Exploration at Arizona State University, it might be possible to refreeze parts of the Arctic ice sheet. Through a geoengineering technique that would rely on wind-powered pumps, they believe one of the largest positive feedback mechanisms on the planet can be neutralized.
Their study, titled “Arctic Ice Management”, appeared recently in Earth’s Future, an online journal published by the American Geophysical Union. As they indicate, the current rate at which Arctic ice is disappearing is quite disconcerting. Moreover, humanity is not likely to be able to combat rising global temperatures in the coming decades without the presence of the polar ice cap.
Of particular concern is the rate at which polar ice has been disappearing, which has been quite pronounced in recent decades. The rate of loss has been estimated at between 3.5% and 4.1% per decade, with an overall decrease of at least 15% since 1979 (when satellite measurements began). To make things worse, the rate at which ice is being lost is accelerating.
From a baseline of about 3% per decade between 1978 and 1999, the rate of loss since the 2000s has climbed considerably – to the point that the extent of sea ice in 2016 was the second lowest ever recorded. As they state in their Introduction (and with the support of numerous sources), the problem is only likely to get worse between now and the mid-21st century:
“Global average temperatures have been observed to rise linearly with cumulative CO2 emissions and are predicted to continue to do so, resulting in temperature increases of perhaps 3°C or more by the end of the century. The Arctic region will continue to warm more rapidly than the global mean. Year-round reductions in Arctic sea ice are projected in virtually all scenarios, and a nearly ice-free (<10⁶ km² sea-ice extent for five consecutive years) Arctic Ocean is considered “likely” by 2050 in a business-as-usual scenario.”
One of the reasons the Arctic is warming faster than the rest of the planet has to do with strong ice-albedo feedback. Basically, fresh snow reflects up to 90% of sunlight, and bare sea ice has an albedo of up to 0.7, whereas open water (which has an albedo of close to 0.06) absorbs most sunlight. Ergo, as more ice melts, more sunlight is absorbed, driving temperatures in the Arctic up further.
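The strength of this feedback can be illustrated with the albedo values quoted above. The sketch below is illustrative only – the ice-cover fractions are example inputs, not measurements:

```python
# Illustrative sketch of the ice-albedo feedback, using the albedo values
# quoted in the article (0.7 for bare sea ice, 0.06 for open water).
ALBEDO_SEA_ICE = 0.7      # fraction of sunlight reflected by bare sea ice
ALBEDO_OPEN_WATER = 0.06  # fraction of sunlight reflected by open ocean

def absorbed_fraction(ice_cover):
    """Fraction of incoming sunlight absorbed by a mixed ice/water surface,
    given fractional ice cover between 0 and 1."""
    reflected = ice_cover * ALBEDO_SEA_ICE + (1 - ice_cover) * ALBEDO_OPEN_WATER
    return 1 - reflected

# Example: as ice cover drops from 60% to 15%, absorption jumps sharply.
for cover in (0.6, 0.15):
    print(f"ice cover {cover:.0%}: absorbs {absorbed_fraction(cover):.0%} of sunlight")
# ice cover 60%: absorbs 56% of sunlight
# ice cover 15%: absorbs 84% of sunlight
```

The extra absorbed energy warms the water, which melts more ice, which lowers the albedo further – the loop that makes the feedback "positive".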
Arctic sea-ice extent (area covered at least 15% by sea ice) in September 2007 (white area). The red curve denotes the 1981–2010 average. Credit: National Snow and Ice Data Center

To address this concern, the research team – led by Steven J. Desch, a professor from the School of Earth and Space Exploration – considered how the melting is connected to seasonal fluctuations. Essentially, the Arctic sea ice is getting thinner over time because new ice (aka. “first-year ice”), which is created with every passing winter, is typically just 1 meter (3.28 ft) thick.
Ice that survives the summer in the Arctic is capable of growing and becoming “multiyear ice”, with a typical thickness of 2 to 4 meters (6.56 to 13.12 ft). But thanks to the current trend, where summers are getting progressively warmer, “first-year ice” has been succumbing to summer melts and fracturing before it can grow. Whereas multiyear ice comprised 50 to 60% of all ice in the Arctic Ocean in the 1980s, by 2010, it made up just 15%.
With this in mind, Desch and his colleagues considered a possible solution that would ensure that “first-year ice” would have a better chance of surviving the summer. By deploying wind-powered pumps across the ice, they estimate that seawater could be brought to the surface over the course of an Arctic winter, when it would have the best chance of freezing.
Based on wind speeds in the Arctic, they calculate that a wind turbine with 6-meter diameter blades would generate sufficient electricity for a single pump to raise water to a height of 7 meters, at a rate of 27 metric tons (29.76 US tons) per hour. The net effect of this would be thicker sheets of ice in the entire affected area, which would have a better chance of surviving the summer.
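A quick back-of-the-envelope check shows these pumping figures are modest. The calculation below uses only the numbers quoted above (27 metric tons per hour, 7 meters of lift); the 50% efficiency is an assumption for illustration, not a figure from the study:

```python
# Ideal hydraulic power needed to lift 27 t of water per hour to 7 m.
# P = (mass flow rate) * g * (lift height)
G = 9.81                   # gravitational acceleration, m/s^2
mass_flow = 27_000 / 3600  # kg/s (27 metric tons per hour)
lift_height = 7.0          # m

hydraulic_power = mass_flow * G * lift_height  # watts
print(f"Ideal hydraulic power: {hydraulic_power:.0f} W")  # ~515 W

# Assuming a 50% overall pump/drivetrain efficiency (an assumption here),
# the turbine would need to supply roughly 1 kW:
required_power = hydraulic_power / 0.5
print(f"Turbine output at assumed 50% efficiency: {required_power:.0f} W")
```

Around a kilowatt is a plausible output for a small 6-meter rotor in steady Arctic winds, which is why the scheme is at least energetically feasible on paper.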
Over time, the negative feedback created by more ice would cause less sunlight to be absorbed by the Arctic ocean, thus leading to more cooling and more ice accumulation. This, they claim, could be done on a relatively modest budget of $500 billion per year for the entire Arctic, or $50 billion per year for 10% of the Arctic.
While this may sound like a huge figure, they are quick to point out that the cost of covering the entire Arctic with ice-creating pumps – which could save trillions in GDP and countless lives – is equivalent to just 0.64% of the current world gross domestic product (GDP) of $78 trillion. For a country like the United States, it represents just 13% of the current federal budget ($3.8 trillion).
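The percentages check out against the dollar figures quoted above:

```python
# Verifying the cost comparisons using the figures given in the article.
arctic_cost = 500e9   # $500 billion per year, entire Arctic
world_gdp = 78e12     # $78 trillion world GDP
us_budget = 3.8e12    # $3.8 trillion US federal budget

print(f"Share of world GDP: {arctic_cost / world_gdp:.2%}")        # 0.64%
print(f"Share of US federal budget: {arctic_cost / us_budget:.0%}")  # 13%
```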
And while there are several aspects of this proposal that still need to be worked out (which Desch and his team fully acknowledge), the concept does appear to be theoretically sound. Not only does it take into account the way seasonal change and Climate Change are linked in the Arctic, it acknowledges how humanity is not likely to be able to address Climate Change without resorting to geoengineering techniques.
And since Arctic ice is one of the most important things when it comes to regulating global temperatures, it makes perfect sense to start here.
Located along the east coast of the Antarctic Peninsula is the Larsen Ice Shelf. Named after the Norwegian Captain who explored the ice front back in 1893, this ice shelf has been monitored for decades due to its close connection with rising global temperatures. Essentially, since the 1990s, the shelf has been breaking apart, causing collapses of considerable intensity.
According to the British Antarctic Survey (BAS), the section of the ice sheet known as the Larsen C Ice Shelf could be experiencing a collapse of its own soon enough. Based on video footage and satellite evidence of the sizeable rift (which is 457 m or 1,500 ft across) in the shelf, it is believed that an iceberg that is roughly 5,000 km² (1,930.5 mi²) in size could be breaking off and calving into the ocean in the near future.
An ice shelf is essentially a floating extension of a land-based glacier. In this case, the Larsen Ice Shelf is the seaborne section of the larger Larsen Glacier, which flows southeast past Mount Larsen and enters the Ross Sea just south of Victoria Land. These shelves often act as buttresses, holding back glaciers that flow down to the coast, thus preventing them from entering the ocean and contributing to rising sea levels.
In the past twenty-two years, the Larsen A and B ice shelves (which were situated further north along the Antarctic Peninsula) both collapsed into the sea. This resulted in the dramatic acceleration of glaciers behind them, as larger volumes of ice were able to flow down the coast and drop into the ocean. While Larsen C appeared to still be stable, in November of 2016, NASA noted the presence of a large crack in its surface.
This crack was about 110 kilometers (68 mi) long and was more than 91 m (299 ft) wide, reaching a depth of about 500 m (1,600 ft). By December, the rift had extended another 21 km (13 mi), which raised concerns about calving. In February of 2017, satellite observations of the shelf noted that the crack appeared to have grown further, which confirmed what researchers from the MIDAS project had previously reported.
This UK-based Antarctic research project – which is based at Swansea University and Aberystwyth University in Wales and supported by the BAS and various international partners – is dedicated to monitoring the Larsen C ice shelf in Antarctica. Through a combination of field work, satellite observations, and computer simulations, they have catalogued how recent warming trends have caused seasonal melts of the ice shelf and affected its structure.
And in recent years, they have been monitoring the large crack, which has been fast-moving, and noted the appearance of several elongations. It was during the current Antarctic field season that members of the project filmed what the crack looked like from the air. In previous surveys, the glaciology research team has conducted research on the ice shelf using seismic techniques to survey the seafloor beneath it.
However, this past season, they did not set up on the ice shelf itself for fear of a calving event. Instead, they made a series of trips to and from the UK’s Rothera Research Station aboard Twin Otter aircraft. During an outing to retrieve some of their science equipment, the crew noted how the crack looked from above and started filming. As you can see from the footage, the rift is very wide and extremely long.
What’s more, the team estimates that if an iceberg from this shelf breaks off and falls into the ocean, it will likely be over three times the size of cities like London or New York City. And while this sort of thing is common with glaciers, the collapse of a large section of Larsen C could speed the flow of the Larsen Glacier towards the Antarctic Ocean.
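The "three times the size of London" comparison holds up against the 5,000 km² estimate quoted earlier. Note that the Greater London area used below is an assumption for illustration, not a figure from the article:

```python
# Rough check of the size comparison: the estimated iceberg area versus
# the area of Greater London (~1,572 km², an assumed reference figure).
iceberg_area = 5000  # km², from satellite estimates of the rift
london_area = 1572   # km², assumed area of Greater London

ratio = iceberg_area / london_area
print(f"The iceberg would be about {ratio:.1f}x the area of London")  # ~3.2x
```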
As Dr Paul Holland, an ice and ocean modeller at the British Antarctic Survey, said in a recent press release:
“Iceberg calving is a normal part of the glacier life cycle, and there is every chance that Larsen C will remain stable and this ice will regrow. However, it is also possible that this iceberg calving will leave Larsen C in an unstable configuration. If that happens, further iceberg calving could cause a retreat of Larsen C. We won’t be able to tell whether Larsen C is unstable until the iceberg has calved and we are able to understand the behavior of the remaining ice. The stability of ice shelves is important because they resist the flow of the grounded ice inland. After the collapse of Larsen B, its tributary glaciers accelerated, contributing to sea-level rise.”
One of the greatest concerns about climate change is the feedback mechanisms it creates. In addition to increased warming trends caused by rising levels of CO₂ in the atmosphere, the melting of glaciers and the breakup of ice shelves can have a pronounced effect on sea levels. In the end, the depletion of glaciers in Antarctica could have dramatic consequences for the rest of the planet.
The reality of Climate Change has become painfully apparent in recent years, thanks to extended droughts in places like California, diminishing water tables around the world, rising tides, and coastal storms of increasing intensity and frequency. But perhaps the most measurable trend is the way that average global temperatures have kept rising year after year.
And this has certainly been the case for the year of 2016. According to independent analyses provided by NASA’s Goddard Institute for Space Studies (GISS) and the National Oceanic and Atmospheric Administration (NOAA), 2016 was the warmest year since modern record keeping began in 1880. This represents a continuation of a most alarming trend, where 16 of the 17 warmest years on record have occurred since 2001.
Based in New York, GISS conducts space and Earth sciences research, in support of the Goddard Space Flight Center’s (GSFC) Sciences and Exploration Directorate. Since its establishment in 1961, the Institute has conducted valuable research on Earth’s structure and atmosphere, the Earth-Sun relationship, and the structure and atmospheres of other planets in the Solar System.
Their early studies of Earth and other solar planets using data collected by satellites, space probes, and landers eventually led to GISS becoming a leading authority on atmospheric modeling. Similarly, the NOAA’s efforts to monitor atmospheric conditions and weather in the US since the 1970s have led to it becoming a major scientific authority on Climate Change.
Together, the two organizations looked over global temperature data for the year of 2016 and came to the same conclusion. Based on their assessments, GISS determined that globally-averaged surface temperatures in 2016 were 0.99 °C (1.78 °F) warmer than the mid-20th century mean. As GISS Director Gavin Schmidt put it, these findings should silence any doubts about the ongoing nature of Global Warming:
“2016 is remarkably the third record year in a row in this series. We don’t expect record years every year, but the ongoing long-term warming trend is clear.”
The NOAA’s findings were similar, with an average temperature of 14.83 °C (58.69 °F) being reported for 2016. This surpassed last year’s record by about 0.04 °C (0.07 °F), and represents a change of around 0.94 °C (1.69 °F) above the 20th century average. The year began with a boost, thanks to El Niño; and for eight consecutive months (January to August) the world experienced record temperatures.
This represents a consistent trend since 2001, a period in which 16 of the 17 warmest years on record (dating back to 1880) have occurred. In addition, on five separate occasions during this period, the annual global temperature was record-breaking – in 2005, 2010, 2014, 2015, and 2016, respectively.
With regards to the long-term trend, average global temperatures have increased by about 1.1° Celsius (2° Fahrenheit) since 1880. This too represents a change, since the rate of increase was placed at 0.8° Celsius (1.4° Fahrenheit) back in 2014. Two-thirds of this warming has occurred since 1975, which coincides with a period of rapid population growth, industrialization, and increased consumption of fossil fuels.
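The paired Celsius/Fahrenheit figures throughout these reports follow from the fact that temperature *anomalies* convert by a factor of 9/5, with no +32 offset (the offset cancels when subtracting a baseline):

```python
# Converting temperature anomalies (differences) from Celsius to Fahrenheit.
# Unlike absolute temperatures, anomalies use only the 9/5 scale factor.
def anomaly_c_to_f(delta_c):
    """Convert a temperature difference in °C to °F."""
    return delta_c * 9 / 5

print(anomaly_c_to_f(0.99))  # 2016 vs mid-20th-century mean -> 1.782 (~1.78 °F)
print(anomaly_c_to_f(1.1))   # warming since 1880 -> 1.98 (~2 °F)
```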
And while there is always a degree of uncertainty when it comes to atmospheric and temperature modelling, owing to the fact that the locations of measuring stations and measurement practices change over time, NASA indicated that they were over 95% certain of these results. As such, there is little reason to doubt them, especially since they are consistent with what is at this point a very well-documented trend.
To see an animated graph of average global temperature increases since 1880, click here. To see the full data set and learn about the methods employed by GISS, click here.
And be sure to check out this NASA video that shows these changes on a global map: