More stories

  • Improving US air quality, equitably

    Decarbonization of national economies will be key to achieving global net-zero emissions by 2050, a major stepping stone to the Paris Agreement’s long-term goal of keeping global warming well below 2 degrees Celsius (and ideally 1.5 C), and thereby averting the worst consequences of climate change. Toward that end, the United States has pledged to reduce its greenhouse gas emissions by 50-52 percent from 2005 levels by 2030, backed by its implementation of the 2022 Inflation Reduction Act. This strategy is consistent with a 50-percent reduction in carbon dioxide (CO2) by the end of the decade.

    If U.S. federal carbon policy is successful, the nation’s overall air quality will also improve. Cutting CO2 emissions reduces atmospheric concentrations of air pollutants that lead to the formation of fine particulate matter (PM2.5), which causes more than 200,000 premature deaths in the United States each year. But an average nationwide improvement in air quality will not be felt equally; air pollution exposure disproportionately harms people of color and lower-income populations.
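Disparities like these are typically quantified as population-weighted average exposures computed separately for each demographic group. The sketch below illustrates that metric only; the grid-cell concentrations and population counts are invented, not from the study:

```python
# Population-weighted PM2.5 exposure by demographic group.
# All concentrations (ug/m3) and population counts are illustrative.

def weighted_exposure(concs, pops):
    """Population-weighted mean concentration for one group."""
    return sum(c * p for c, p in zip(concs, pops)) / sum(pops)

# Hypothetical PM2.5 concentrations in three grid cells
pm25 = [12.0, 8.0, 6.0]

# Hypothetical population counts per group in the same cells
populations = {
    "group_a": [900, 100, 100],  # concentrated near the dirtiest cell
    "group_b": [100, 400, 600],
}

for group, pops in populations.items():
    print(group, round(weighted_exposure(pm25, pops), 2))
# group_a's average exposure is higher even though both groups
# breathe air drawn from the same three cells
```

Note that a uniform percentage cut in every cell lowers both groups' averages equally but leaves the relative gap untouched, which is why an average nationwide improvement does not by itself narrow disparities.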

    How effective are current federal decarbonization policies in reducing U.S. racial and economic disparities in PM2.5 exposure, and what changes will be needed to improve their performance? To answer that question, researchers at MIT and Stanford University recently evaluated a range of policies which, like current U.S. federal carbon policies, reduce economy-wide CO2 emissions by 40-60 percent from 2005 levels by 2030. Their findings appear in an open-access article in the journal Nature Communications.

    First, they show that a carbon-pricing policy, while effective in reducing PM2.5 exposure for all racial/ethnic groups, does not significantly mitigate relative disparities in exposure. On average, the white population experiences far lower exposure than the Black, Hispanic, and Asian populations. The policy does little to reduce exposure disparities because the CO2 reductions it achieves come primarily from the coal-fired electricity sector, while other sectors, such as industry and heavy-duty diesel transportation, contribute far more PM2.5-related emissions.

    The researchers then use an optimization approach to examine thousands of different reduction options, asking whether any possible combination of carbon dioxide reductions in the range of 40-60 percent can mitigate disparities. They find that no policy scenario aligned with current U.S. carbon dioxide emissions targets is likely to significantly reduce current PM2.5 exposure disparities.
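The study's optimization is far richer than this, but the core idea, enumerating sector-level CO2 cuts that hit an economy-wide 40-60 percent target and checking the resulting exposure gap, can be sketched as follows. Every number here (sector CO2 shares, per-sector exposure benefits, baseline exposures) is invented for illustration:

```python
from itertools import product

# Hypothetical economy: each sector's share of total CO2 emissions, and
# the reduction in each group's PM2.5 exposure (ug/m3) per full
# decarbonization of that sector. All numbers are illustrative.
sectors = ["electricity", "industry", "transport"]
co2_share = {"electricity": 0.4, "industry": 0.35, "transport": 0.25}
exposure_benefit = {          # (group_a, group_b) benefit per sector
    "electricity": (1.0, 1.2),
    "industry": (2.0, 0.8),
    "transport": (2.5, 0.9),
}
baseline = {"group_a": 11.0, "group_b": 7.3}  # baseline exposures (ug/m3)

best = None
# Try sector-level cuts in 10% steps; keep scenarios whose economy-wide
# CO2 reduction lands in the 40-60% window, and track the one that
# minimizes the remaining exposure gap between the two groups.
for cuts in product([i / 10 for i in range(11)], repeat=len(sectors)):
    total_cut = sum(c * co2_share[s] for c, s in zip(cuts, sectors))
    if not 0.4 <= total_cut <= 0.6:
        continue
    exp_a = baseline["group_a"] - sum(
        c * exposure_benefit[s][0] for c, s in zip(cuts, sectors))
    exp_b = baseline["group_b"] - sum(
        c * exposure_benefit[s][1] for c, s in zip(cuts, sectors))
    gap = exp_a - exp_b
    if best is None or gap < best[0]:
        best = (gap, cuts)

print("smallest remaining gap:", round(best[0], 2), "with cuts", best[1])
```

Even the scenario that best targets the high-disparity sectors leaves a sizable gap here, echoing the study's qualitative finding that no 40-60 percent reduction scenario eliminates the disparity.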

    “Policies that address only about 50 percent of CO2 emissions leave many polluting sources in place, and those that prioritize reductions for minorities tend to benefit the entire population,” says Noelle Selin, supervising author of the study and a professor at MIT’s Institute for Data, Systems, and Society and Department of Earth, Atmospheric and Planetary Sciences. “This means that a large range of policies that reduce CO2 can improve air quality overall, but can’t address long-standing inequities in air pollution exposure.”

    So if climate policy alone cannot adequately achieve equitable air quality results, what viable options remain? The researchers suggest that more ambitious carbon policies could narrow racial and economic PM2.5 exposure disparities in the long term, but not within the next decade. To make a near-term difference, they recommend interventions designed to reduce PM2.5 emissions resulting from non-CO2 sources, ideally at the economic sector or community level.

    “Achieving improved PM2.5 exposure for populations that are disproportionately exposed across the United States will require thinking that goes beyond current CO2 policy strategies, most likely involving large-scale structural changes,” says Selin. “This could involve changes in local and regional transportation and housing planning, together with accelerated efforts towards decarbonization.”

  • New clean air and water labs to bring together researchers, policymakers to find climate solutions

    MIT’s Abdul Latif Jameel Poverty Action Lab (J-PAL) is launching the Clean Air and Water Labs, with support from Community Jameel, to generate evidence-based solutions aimed at increasing access to clean air and water.

    Led by J-PAL’s Africa, Middle East and North Africa (MENA), and South Asia regional offices, the labs will partner with government agencies to bring together researchers and policymakers in areas where impactful clean air and water solutions are most urgently needed.

    Together, the labs aim to improve clean air and water access by informing the scaling of evidence-based policies and decisions of city, state, and national governments that serve nearly 260 million people combined.

    The Clean Air and Water Labs expand the work of J-PAL’s King Climate Action Initiative, building on the foundational support of King Philanthropies, which significantly expanded J-PAL’s work at the nexus of climate change and poverty alleviation worldwide. 

    Air pollution, water scarcity and the need for evidence 

    Africa, MENA, and South Asia are on the front lines of global air and water crises. 

    “There is no time to waste investing in solutions that do not achieve their desired effects,” says Iqbal Dhaliwal, global executive director of J-PAL. “By co-generating rigorous real-world evidence with researchers, policymakers can have the information they need to dedicate resources to scaling up solutions that have been shown to be effective.”

    In India, about 75 percent of households did not have drinking water on premises in 2018. In MENA, nearly 90 percent of children live in areas facing high or extreme water stress. Across Africa, almost 400 million people lack access to safe drinking water. 

    Simultaneously, air pollution is one of the greatest threats to human health globally. In India, extraordinary levels of air pollution are shortening the average life expectancy by five years. In Africa, rising indoor and ambient air pollution contributed to 1.1 million premature deaths in 2019. 

    There is increasing urgency to find high-impact and cost-effective solutions to the worsening threats to human health and resources caused by climate change. However, data and evidence on potential solutions are limited.

    Fostering collaboration to generate policy-relevant evidence 

    The Clean Air and Water Labs will foster deep collaboration between government stakeholders, J-PAL regional offices, and researchers in the J-PAL network. 

    Through the labs, J-PAL will work with policymakers to:

    co-diagnose the most pressing air and water challenges and opportunities for policy innovation;
    expand policymakers’ access to and use of high-quality air and water data;
    co-design potential solutions informed by existing evidence;
    co-generate evidence on promising solutions through rigorous evaluation, leveraging existing and new data sources; and
    support scaling of air and water policies and programs that are found to be effective through evaluation. 

    A research and scaling fund for each lab will prioritize resources for co-generated pilot studies, randomized evaluations, and scaling projects.

    The labs will also collaborate with C40 Cities, a global network of mayors of the world’s leading cities that are united in action to confront the climate crisis, to share policy-relevant evidence and identify potential new connections and research opportunities within India and across Africa.

    This model aims to strengthen the use of evidence in decision-making to ensure solutions are highly effective and to guide research to answer policymakers’ most urgent questions. J-PAL Africa, MENA, and South Asia’s strong on-the-ground presence will further bridge research and policy work by anchoring activities within local contexts. 

    “Communities across the world continue to face challenges in accessing clean air and water, a threat to human safety that has only been exacerbated by the climate crisis, along with rising temperatures and other hazards,” says George Richards, director of Community Jameel. “Through our collaboration with J-PAL and C40 in creating climate policy labs embedded in city, state, and national governments in Africa and South Asia, we are committed to innovative and science-based approaches that can help hundreds of millions of people enjoy healthier lives.”

    J-PAL Africa, MENA, and South Asia will formally launch Clean Air and Water Labs with government partners over the coming months. J-PAL is housed in the MIT Department of Economics, within the School of Humanities, Arts, and Social Sciences.

  • Q&A: Are far-reaching fires the new normal?

    Where there’s smoke, there is fire. But with climate change, larger and longer-burning wildfires are sending smoke farther from their source, often to places that are unaccustomed to the exposure. That has been the case this week, as smoke continues to drift south from massive wildfires in Canada, prompting warnings of hazardous air quality and poor visibility in states across New England, the mid-Atlantic, and the Midwest.

    With wildfire season just getting underway, many may be wondering: Are the air-polluting effects of wildfires a new normal?

    MIT News spoke with Professor Colette Heald of the Department of Civil and Environmental Engineering and the Department of Earth, Atmospheric and Planetary Sciences, and Professor Noelle Selin of the Institute for Data, Systems, and Society and the Department of Earth, Atmospheric and Planetary Sciences. Heald specializes in atmospheric chemistry and has studied the climate and health effects associated with recent wildfires, while Selin works with atmospheric models to track air pollutants around the world, which she uses to inform policy decisions on mitigating pollution and climate change. The researchers shared some of their insights on the immediate impacts of Canada’s current wildfires and what downwind regions may expect in the coming months, as the wildfire season stretches into summer.

    Q: What role has climate change and human activity played in the wildfires we’ve seen so far this year?

    Heald: Unusually warm and dry conditions have dramatically increased fire susceptibility in Canada this year. Human-induced climate change makes such dry and warm conditions more likely. Smoke from fires in Alberta and Nova Scotia in May, and Quebec in early June, has led to some of the worst air quality conditions measured locally in Canada. This same smoke has been transported into the United States and degraded air quality here as well. Local officials have determined that ignitions have been associated with lightning strikes, but human activity has also played a role in igniting some of the fires in Alberta.

    Q: What can we expect for the coming months in terms of the pattern of wildfires and their associated air pollution across the United States?

    Heald: The Government of Canada is projecting higher-than-normal fire activity throughout the 2023 fire season. Fire susceptibility will continue to respond to changing weather conditions, and whether the U.S. is impacted will depend on the winds and how air is transported across those regions. So far, the fire season in the United States has been below average, but fire risk is expected to increase modestly through the summer, so we may see local smoke influences as well.

    Q: How has air pollution from wildfires affected human health in the U.S. this year so far?

    Selin: The pollutant of most concern in wildfire smoke is fine particulate matter (PM2.5): fine particles in the atmosphere that can be inhaled deep into the lungs, causing health damage. Exposure to PM2.5 causes respiratory and cardiovascular harm, including heart attacks and premature deaths. It can also cause symptoms like coughing and difficulty breathing. In New England this week, people have been breathing much higher concentrations of PM2.5 than usual. People who are particularly vulnerable to these effects, such as older people and those with underlying health conditions, are likely experiencing more severe impacts. But PM2.5 affects everyone. While the number and impact of wildfires vary from year to year, the air pollution associated with them generally leads to tens of thousands of premature deaths in the U.S. annually. There is also some evidence that PM2.5 from fires could be particularly damaging to health.

    While we in New England usually have relatively low levels of pollution, it’s also important to note that some cities around the globe experience very high PM2.5 on a regular basis, not only from wildfires but from other sources such as power plants and industry. So, while we’re feeling the effects over the past few days, we should remember the broader importance of reducing PM2.5 levels overall for human health everywhere.

    Q: While firefighters battle fires directly this wildfire season, what can we do to reduce the effects of associated air pollution? And what can we do in the long-term, to prevent or reduce wildfire impacts?

    Selin: In the short term, protecting yourself from the impacts of PM2.5 is important. Limiting time outdoors, avoiding outdoor exercise, and wearing a high-quality mask are some strategies that can minimize exposure, and air filters can help reduce the concentrations of particles in indoor air. Taking measures to avoid exposure is particularly important for vulnerable groups. It’s also important to note that these strategies aren’t equally possible for everyone (for example, people who work outside), which underscores the importance of developing new strategies to address the underlying causes of increasing wildfires.

    Over the long term, mitigating climate change is important: because warm and dry conditions lead to wildfires, warming increases fire risk. Preventing fires ignited by human activity can also help. Damages can be further mitigated in the longer term by exploring land management strategies that could help manage fire intensity.

  • J-WAFS announces 2023 seed grant recipients

    Today, the Abdul Latif Jameel Water and Food Systems Lab (J-WAFS) announced its ninth round of seed grants to support innovative research projects at MIT. The grants are designed to fund research efforts that tackle challenges related to water and food for human use, with the ultimate goal of creating meaningful impact as the world population continues to grow and the planet undergoes significant climate and environmental changes.

    Ten new projects led by 15 researchers from seven different departments will be supported this year. The projects address a range of challenges by employing advanced materials, technology innovations, and new approaches to resource management. The new projects aim to remove harmful chemicals from water sources, develop monitoring and other systems to help manage various aquaculture industries, optimize water purification materials, and more.

    “The seed grant program is J-WAFS’ flagship grant initiative,” says J-WAFS executive director Renee J. Robins. “The funding is intended to spur groundbreaking MIT research addressing complex issues that are challenging our water and food systems. The 10 projects selected this year show great promise, and we look forward to the progress and accomplishments these talented researchers will make,” she adds.

    The 2023 J-WAFS seed grant researchers and their projects are:

    Sara Beery, an assistant professor in the Department of Electrical Engineering and Computer Science (EECS), is building the first completely automated system to estimate the size of salmon populations in the Pacific Northwest (PNW).

    Salmon are a keystone species in the PNW, having fed human populations for at least the last 7,500 years. However, overfishing, habitat loss, and climate change threaten salmon populations across the region with extinction. Accurate salmon counts during the seasonal migration to their natal rivers to spawn are essential for fisheries regulation and management, but are limited by human capacity.

    Fish population monitoring is a widespread challenge in the United States and worldwide. Beery and her team are working to build a system that will provide a detailed picture of the state of salmon populations at unprecedented spatial and temporal resolution by combining sonar sensors with computer vision and machine learning (CVML) techniques. The sonar will capture individual fish as they swim upstream, and CVML algorithms will be trained to interpret the sonar video, detecting, tracking, and counting fish automatically while adapting to changing river conditions and fish densities.

    Another aquaculture project is being led by Michael Triantafyllou, the Henry L. and Grace Doherty Professor in Ocean Science and Engineering in the Department of Mechanical Engineering, and Robert Vincent, the assistant director at MIT’s Sea Grant Program. They are working with Otto Cordero, an associate professor in the Department of Civil and Environmental Engineering, to control harmful bacterial blooms in aquaculture algae feed production.
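As a toy illustration of the detect-track-count idea (not the team's actual pipeline), the sketch below links per-frame sonar detections into tracks with a nearest-neighbor rule and counts the tracks that move upstream; the positions and threshold are made up:

```python
# Toy version of the detect-track-count step: sonar "detections" are
# per-frame positions; a nearest-neighbor tracker links them into
# tracks, and each track that travels upstream counts as one fish.
# Thresholds and data below are invented for illustration.

MAX_JUMP = 5.0  # max plausible movement (arbitrary units) between frames

def count_upstream(frames):
    """frames: list of lists of x-positions, one list per video frame."""
    tracks = []  # each track is a list of positions over time
    for detections in frames:
        unmatched = list(detections)
        for track in tracks:
            if not unmatched:
                break
            # link each existing track to its nearest new detection
            nearest = min(unmatched, key=lambda x: abs(x - track[-1]))
            if abs(nearest - track[-1]) <= MAX_JUMP:
                track.append(nearest)
                unmatched.remove(nearest)
        tracks.extend([x] for x in unmatched)  # leftovers start new tracks
    # count tracks that moved upstream (position increasing)
    return sum(1 for t in tracks if len(t) > 1 and t[-1] > t[0])

# Two fish moving upstream, one drifting downstream
frames = [[0.0, 50.0], [2.0, 48.0, 10.0], [4.0, 46.0, 13.0], [6.0, 44.0, 16.0]]
print(count_upstream(frames))  # -> 2
```

A real system replaces the hand-written detections with a learned detector over sonar video and uses far more robust data association, but the counting logic follows the same detect, track, count structure.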

    Aquaculture in the United States represents a $1.5 billion industry annually and helps support 1.7 million jobs, yet many American hatcheries are not able to keep up with demand. One barrier to aquaculture production is the high degree of variability in survival rates, most likely caused by a poorly controlled microbiome that leads to bacterial infections and sub-optimal feed efficiency. Triantafyllou, Vincent, and Cordero plan to monitor the microbiome composition of a shellfish hatchery in order to identify possible causative agents of mortality, as well as beneficial microbes. They hope to pair microbe data with detailed phenotypic information about the animal population to generate rapid diagnostic tests and explore the potential for microbiome therapies to protect larvae and prevent future outbreaks. The researchers plan to transfer their findings and technology to the local and regional aquaculture community to ensure healthy aquaculture production that will support the expansion of the U.S. aquaculture industry.

    David Des Marais is the Cecil and Ida Green Career Development Professor in the Department of Civil and Environmental Engineering. His 2023 J-WAFS project seeks to understand plant growth responses to elevated carbon dioxide (CO2) in the atmosphere, in the hopes of identifying breeding strategies that maximize crop yield under future CO2 scenarios.

    Today’s crop plants experience higher atmospheric CO2 than those of 20 or 30 years ago. Crops such as wheat, oat, barley, and rice typically increase their growth rate and biomass when grown at experimentally elevated atmospheric CO2, the so-called “CO2 fertilization effect.” However, not all plant species respond to rising atmospheric CO2 with increased growth, and for the ones that do, increased growth doesn’t necessarily correspond to increased crop yield. Using specially built plant growth chambers that can control the concentration of CO2, Des Marais will explore how CO2 availability impacts the development of tillers (branches) in the grass species Brachypodium. He will study how gene expression controls tiller development, and whether this is affected by the growing environment. The tillering response, the number of branches a plant produces, sets a limit on how much grain the plant can yield; optimizing this response to elevated CO2 could therefore greatly increase yield. Des Marais will also examine the complete genome sequences of Brachypodium, wheat, oat, and barley to help identify genes relevant to branch growth.

    Darcy McRose, an assistant professor in the Department of Civil and Environmental Engineering, is researching whether a combination of plant metabolites and soil bacteria can be used to make mineral-associated phosphorus more bioavailable.

    The nutrient phosphorus is essential for agricultural plant growth, but when added as a fertilizer, phosphorus sticks to the surface of soil minerals, decreasing its bioavailability, limiting plant growth, and leaving behind residual phosphorus.
    Heavily fertilized agricultural soils often harbor large reservoirs of this mineral-associated “legacy” phosphorus. Redox transformations are one chemical process that can liberate mineral-associated phosphorus. However, this needs to be carefully controlled, as overly mobile phosphorus can lead to runoff and pollution of natural waters. Ideally, phosphorus would be made bioavailable when plants need it and immobile when they don’t. Many plants make small metabolites called coumarins that might be able to solubilize mineral-adsorbed phosphorus and that can be activated and inactivated under different conditions. McRose will use laboratory experiments to determine whether a combination of plant metabolites and soil bacteria can serve as a highly efficient and tunable system for phosphorus solubilization. She also aims to develop an imaging platform to investigate exchanges of phosphorus between plants and soil microbes.

    Many of the 2023 seed grants will support innovative technologies to monitor, quantify, and remediate various kinds of pollutants found in water. Two of the new projects address the problem of per- and polyfluoroalkyl substances (PFAS), human-made chemicals that have recently emerged as a global health threat. Known as “forever chemicals,” PFAS are used in many manufacturing processes. These chemicals are known to cause significant health issues, including cancer, and they have become pervasive in soil, dust, air, groundwater, and drinking water. Unfortunately, the physical and chemical properties of PFAS make them difficult to detect and remove.

    Aristide Gumyusenge, the Merton C. Assistant Professor of Materials Science and Engineering, is using metal-organic frameworks (MOFs) for low-cost sensing and capture of PFAS. Most MOFs are synthesized as particles, whose defects, such as intergranular boundaries, limit their sensing accuracy.
    Thin-film-based electronic devices could enable the use of MOFs for many applications, especially chemical sensing. Gumyusenge’s project aims to design test kits based on two-dimensional conductive MOF films for detecting PFAS in drinking water. In early demonstrations, Gumyusenge and his team showed that these MOF films can sense PFAS at low concentrations. They will continue to iterate, using a computation-guided approach to tune the sensitivity and selectivity of the kits, with the goal of deploying them in real-world scenarios.

    Carlos Portela, the Brit (1961) and Alex (1949) d’Arbeloff Career Development Professor in the Department of Mechanical Engineering, and Ariel Furst, the Cook Career Development Professor in the Department of Chemical Engineering, are building novel architected materials to act as filters for the removal of PFAS from water. Portela and Furst will design and fabricate nanoscale materials that use activated carbon and porous polymers to create a physical adsorption system. They will engineer the materials to have tunable porosities and morphologies that maximize interactions between contaminated water and functionalized surfaces, while providing a mechanically robust system.

    Rohit Karnik, the Tata Professor and interim co-department head of the Department of Mechanical Engineering, is working on another monitoring technology, this one based on microbead sensors, to rapidly measure and monitor trace contaminants in water.

    Water pollution from both biological and chemical contaminants contributes to an estimated 1.36 million deaths annually. Chemical contaminants include pesticides and herbicides, heavy metals like lead, and compounds used in manufacturing. These emerging contaminants can be found throughout the environment, including in water supplies.
    The Environmental Protection Agency (EPA) in the United States sets recommended water quality standards, but states are responsible for developing their own monitoring criteria and systems, which must be approved by the EPA every three years. However, the availability of data on regulated chemicals and on candidate pollutants is limited by current testing methods, which are either insensitive or expensive and laboratory-based, requiring trained scientists and technicians. Karnik’s project proposes a simple, self-contained, portable system for monitoring trace and emerging pollutants in water, making it suitable for field studies. The concept is based on multiplexed microbead sensors that use thermal or gravitational actuation to generate a signal. His proposed sandwich assay, a testing format well suited to environmental sensing, will enable both single-use and continuous monitoring. The hope is that the bead-based assays will increase the ease and reach of detecting and quantifying trace contaminants in water for both personal and industrial-scale applications.

    Alexander Radosevich, a professor in the Department of Chemistry, and Timothy Swager, the John D. MacArthur Professor of Chemistry, are teaming up to create rapid, cost-effective, and reliable techniques for on-site arsenic detection in water.

    Arsenic contamination of groundwater affects as many as 500 million people worldwide. Arsenic poisoning can lead to a range of severe health problems, from cancer to cardiovascular and neurological impacts. Both the EPA and the World Health Organization have established 10 parts per billion as a practical threshold for arsenic in drinking water, but measuring arsenic at such low levels is challenging, especially in resource-limited settings where sensitive laboratory equipment may not be readily available.
    Radosevich and Swager plan to develop reaction-based chemical sensors that bind and extract electrons from aqueous arsenic, exploiting its inherent reactivity to selectively detect and quantify it. This work will establish the chemical basis for a new method of detecting trace arsenic in drinking water.

    Rajeev Ram, a professor in the Department of Electrical Engineering and Computer Science, will advance a robust technology for monitoring nitrogen-containing pollutants, which threaten over 15,000 bodies of water in the United States alone.

    Nitrogen in the form of nitrate, nitrite, ammonia, and urea can run off from agricultural fertilizer and lead to harmful algal blooms that jeopardize human health. Unfortunately, monitoring these contaminants in the environment is challenging, as sensors are difficult to maintain and expensive to deploy. Ram and his students will work to establish limits of detection for nitrate, nitrite, ammonia, and urea in environmental, industrial, and agricultural samples using swept-source Raman spectroscopy, a method that detects the presence of a chemical by illuminating a sample with a tunable, single-mode laser and that does not require costly high-power lasers or a spectrometer. Ram will then develop and demonstrate a portable system capable of achieving chemical specificity in complex natural environments.
    Data generated by such a system should help regulate polluters and guide remediation.

    Kripa Varanasi, a professor in the Department of Mechanical Engineering, and Angela Belcher, the James Mason Crafts Professor and head of the Department of Biological Engineering, will join forces to develop an affordable water disinfection technology that selectively identifies, adsorbs, and kills “superbugs” in domestic and industrial wastewater.

    Recent research predicts that antibiotic-resistant bacteria (superbugs) will result in $100 trillion in health care expenses and 10 million deaths annually by 2050. The prevalence of superbugs in our water systems has increased due to corroded pipes, contamination, and climate change. Current drinking water disinfection technologies are designed to kill all types of bacteria before human consumption. However, certain domestic and industrial applications need to protect the good bacteria required for ecological processes that contribute to soil and plant health. Varanasi and Belcher will combine materials, biological, process, and systems engineering principles to design a sponge-based water disinfection technology that can identify and destroy harmful bacteria while leaving the good bacteria unharmed. By modifying the sponge surface with specialized nanomaterials, their approach will be able to kill superbugs faster and more efficiently. The sponge filters can be deployed under very low pressure, making them an affordable technology, especially for resource-constrained communities.

    In addition to the 10 seed grant projects, J-WAFS will also fund a research initiative led by Greg Sixt, the research manager for climate and food systems at J-WAFS and director of the J-WAFS-led Food and Climate Systems Transformation (FACT) Alliance. His project focuses on the Lake Victoria Basin (LVB) of East Africa.
    The second-largest freshwater lake in the world, Lake Victoria straddles three countries (Uganda, Tanzania, and Kenya) and has a catchment area that encompasses two more (Rwanda and Burundi). Sixt will collaborate with Michael Hauser of the University of Natural Resources and Life Sciences, Vienna, and Paul Kariuki of the Lake Victoria Basin Commission.

    The group will study how to adapt food systems to climate change in the Lake Victoria Basin. The basin faces a range of climate threats that could significantly impact livelihoods and food systems across the expansive region. For example, extreme weather events like droughts and floods are negatively affecting agricultural production and freshwater resources. Across the LVB, current approaches to land and water management are unsustainable and threaten future food and water security. The Lake Victoria Basin Commission (LVBC), a specialized institution of the East African Community, wants to play a more vital role in coordinating transboundary land and water management to support transitions toward more resilient, sustainable, and equitable food systems. The primary goal of this research will be to support the LVBC’s transboundary land and water management efforts, specifically as they relate to sustainability and climate change adaptation in food systems. The research team will work with key stakeholders in Kenya, Uganda, and Tanzania to identify specific capacity needs to facilitate land and water management transitions. The two-year project will produce actionable recommendations for the LVBC.

  • Study: Shutting down nuclear power could increase air pollution

    Nearly 20 percent of today’s electricity in the United States comes from nuclear power. The U.S. has the largest nuclear fleet in the world, with 92 reactors scattered around the country. Many of these power plants have run for more than half a century and are approaching the end of their expected lifetimes.

    Policymakers are debating whether to retire the aging reactors or reinforce their structures to continue producing nuclear energy, which many consider a low-carbon alternative to climate-warming coal, oil, and natural gas.

    Now, MIT researchers say there’s another factor to consider in weighing the future of nuclear power: air quality. In addition to being a low carbon-emitting source, nuclear power is relatively clean in terms of the air pollution it generates. Without nuclear power, how would the pattern of air pollution shift, and who would feel its effects?

    The MIT team took on these questions in a new study appearing today in Nature Energy. They lay out a scenario in which every nuclear power plant in the country has shut down, and consider how other sources such as coal, natural gas, and renewable energy would fill the resulting energy needs throughout an entire year.

    Their analysis reveals that indeed, air pollution would increase, as coal, gas, and oil sources ramp up to compensate for nuclear power’s absence. This in itself may not be surprising, but the team has put numbers to the prediction, estimating that the increase in air pollution would have serious health effects, resulting in an additional 5,200 pollution-related deaths over a single year.

    If, however, more renewable energy sources become available to supply the energy grid, as they are expected to by the year 2030, air pollution would be curtailed, though not entirely. The team found that even under this renewables-heavy scenario, there is still a slight increase in air pollution in some parts of the country, resulting in a total of 260 pollution-related deaths over one year.

    When they looked at the populations directly affected by the increased pollution, they found that Black or African American communities — a disproportionate number of whom live near fossil-fuel plants — experienced the greatest exposure.

    “This adds one more layer to the environmental health and social impacts equation when you’re thinking about nuclear shutdowns, where the conversation often focuses on local risks due to accidents and mining or long-term climate impacts,” says lead author Lyssa Freese, a graduate student in MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS).

    “In the debate over keeping nuclear power plants open, air quality has not been a focus of that discussion,” adds study author Noelle Selin, a professor in MIT’s Institute for Data, Systems, and Society (IDSS) and EAPS. “What we found was that air pollution from fossil fuel plants is so damaging, that anything that increases it, such as a nuclear shutdown, is going to have substantial impacts, and for some people more than others.”

    The study’s MIT-affiliated co-authors also include Principal Research Scientist Sebastian Eastham and Guillaume Chossière SM ’17, PhD ’20, along with Alan Jenn of the University of California at Davis.

    Future phase-outs

    When nuclear power plants have closed in the past, fossil fuel use increased in response. In 1985, the closure of reactors in the Tennessee Valley prompted a spike in coal use, while the 2012 shutdown of a plant in California led to an increase in natural gas. In Germany, where nuclear power has almost completely been phased out, coal-fired power increased initially to fill the gap.

    Noting these trends, the MIT team wondered how the U.S. energy grid would respond if nuclear power were completely phased out.

    “We wanted to think about what future changes were expected in the energy grid,” Freese says. “We knew that coal use was declining, and there was a lot of work already looking at the impact of what that would have on air quality. But no one had looked at air quality and nuclear power, which we also noticed was on the decline.”

    In the new study, the team used an energy grid dispatch model developed by Jenn to assess how the U.S. energy system would respond to a shutdown of nuclear power. The model simulates the production of every power plant in the country and runs continuously to estimate, hour by hour, the energy demands in 64 regions across the country.

    Much like the way the actual energy market operates, the model chooses to turn a plant’s production up or down based on cost: Plants producing the cheapest energy at any given time are given priority to supply the grid over more costly energy sources.
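
    The merit-order logic described above can be sketched in a few lines. This is an illustrative simplification, not the researchers' actual dispatch model: the plant names, marginal costs, and capacities below are invented.

```python
# Minimal merit-order dispatch sketch: plants are dispatched cheapest-first
# until hourly demand is met. All plants, costs, and capacities are invented.

def dispatch(plants, demand_mw):
    """Return the MW supplied by each plant for one hour, cheapest first."""
    supplied = {}
    remaining = demand_mw
    for name, cost, capacity in sorted(plants, key=lambda p: p[1]):
        take = min(capacity, remaining)
        supplied[name] = take
        remaining -= take
        if remaining <= 0:
            break
    return supplied

# Hypothetical fleet: (name, marginal cost in $/MWh, capacity in MW)
fleet = [
    ("nuclear", 10, 1000),
    ("coal", 25, 800),
    ("gas", 30, 800),
]

print(dispatch(fleet, 1500))      # nuclear runs fully; coal covers the rest
print(dispatch(fleet[1:], 1500))  # remove nuclear: coal and gas ramp up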

    The team fed the model available data on each plant’s changing emissions and energy costs throughout an entire year. They then ran the model under different scenarios, including: an energy grid with no nuclear power, a baseline grid similar to today’s that includes nuclear power, and a grid with no nuclear power that also incorporates the additional renewable sources that are expected to be added by 2030.

    They combined each simulation with an atmospheric chemistry model to simulate how each plant’s various emissions travel around the country and to overlay these tracks onto maps of population density. For populations in the path of pollution, they calculated the risk of premature death based on their degree of exposure.
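
    Health impact assessments of this kind commonly estimate attributable deaths with a log-linear concentration-response function. The sketch below uses that standard form with a hypothetical risk coefficient, baseline mortality, and exposure change; these are not the study's values.

```python
import math

def attributable_deaths(baseline_deaths, beta, delta_pm25):
    """Premature deaths attributable to a PM2.5 increase, using the common
    log-linear form: attributable fraction = 1 - exp(-beta * delta_pm25)."""
    attributable_fraction = 1.0 - math.exp(-beta * delta_pm25)
    return baseline_deaths * attributable_fraction

# Hypothetical inputs: 100,000 baseline deaths in the exposed population,
# beta = 0.006 per ug/m3 (illustrative), and a 1.5 ug/m3 PM2.5 increase.
extra = attributable_deaths(100_000, 0.006, 1.5)
print(round(extra))  # on the order of several hundred additional deaths
```

    Summing such estimates over every population grid cell in the pollution's path yields a national death toll like the 5,200 figure reported in the study.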

    System response

    Their analysis showed a clear pattern: Without nuclear power, air pollution worsened in general, mainly affecting regions on the East Coast, where nuclear power plants are concentrated. Without those plants, the team observed an uptick in production from coal and gas plants, resulting in 5,200 pollution-related deaths across the country, compared to the baseline scenario.

    They also calculated that more people are likely to die prematurely due to climate impacts from the increase in carbon dioxide emissions, as the grid compensates for nuclear power’s absence. The climate-related effects from this additional influx of carbon dioxide could lead to 160,000 additional deaths over the next century.

    “We need to be thoughtful about how we’re retiring nuclear power plants if we are trying to think about them as part of an energy system,” Freese says. “Shutting down something that doesn’t have direct emissions itself can still lead to increases in emissions, because the grid system will respond.”

    “This might mean that we need to deploy even more renewables, in order to fill the hole left by nuclear, which is essentially a zero-emissions energy source,” Selin adds. “Otherwise we will have a reduction in air quality that we weren’t necessarily counting on.”

    This study was supported, in part, by the U.S. Environmental Protection Agency. More

  • in

    Helping the cause of environmental resilience

    Haruko Wainwright, the Norman C. Rasmussen Career Development Professor in Nuclear Science and Engineering (NSE) and assistant professor in civil and environmental engineering at MIT, grew up in rural Japan, where many nuclear facilities are located. She remembers worrying about the facilities as a child. Wainwright was only 6 at the time of the Chernobyl accident in 1986, but still recollects it vividly.

    Those early memories have contributed to Wainwright’s determination to research how technologies can strengthen environmental resilience — the capability of mitigating the consequences of accidents and recovering from contamination.

    Wainwright believes that environmental monitoring can help improve resilience. She co-leads the U.S. Department of Energy (DOE)’s Advanced Long-term Environmental Monitoring Systems (ALTEMIS) project, which integrates technologies such as in situ sensors, geophysics, remote sensing, simulations, and artificial intelligence to establish new paradigms for monitoring. The project focuses on soil and groundwater contamination at more than 100 U.S. sites that were used for nuclear weapons production.

    As part of this research, which was featured last year in the journal Environmental Science & Technology, Wainwright is working on a machine learning framework for improving environmental monitoring strategies. She hopes the ALTEMIS project will enable the rapid detection of anomalies while ensuring the stability of residual contamination and waste disposal facilities.

    Childhood in rural Japan

    Even as a child, Wainwright was interested in physics, history, and a variety of other subjects.

    But growing up in a rural area was not ideal for someone interested in STEM. There were no engineers or scientists in the community and no science museums, either. “It was not so cool to be interested in science, and I never talked about my interest with anyone,” Wainwright recalls.

    Television and books were the only door to the world of science. “I did not study English until middle school and I had never been on a plane until college. I sometimes find it miraculous that I am now working in the U.S. and teaching at MIT,” she says.

    As she grew a little older, Wainwright heard a lot of discussions about nuclear facilities in the region and many stories about Hiroshima and Nagasaki.

    At the same time, giants like Marie Curie inspired her to pursue science. Nuclear physics was particularly fascinating. “At some point during high school, I started wondering ‘what are radiations, what is radioactivity, what is light,’” she recalls. Reading Richard Feynman’s books and trying to understand quantum mechanics made her want to study physics in college.

    Pursuing research in the United States

    Wainwright pursued an undergraduate degree in engineering physics at Kyoto University. After two research internships in the United States, Wainwright was impressed by the dynamic and fast-paced research environment in the country.

    And compared to Japan, there were “more women in science and engineering,” Wainwright says. She enrolled at the University of California at Berkeley in 2005, where she completed her doctorate in nuclear engineering with minors in statistics and civil and environmental engineering.

    Before moving to MIT NSE in 2022, Wainwright was a staff scientist in the Earth and Environmental Area at Lawrence Berkeley National Laboratory (LBNL). She worked on a variety of topics, including radioactive contamination, climate science, CO2 sequestration, precision agriculture, and watershed science. Her time at LBNL helped Wainwright build a solid foundation about a variety of environmental sensors and monitoring and simulation methods across different earth science disciplines.   

    Empowering communities through monitoring

    One of the most compelling takeaways from Wainwright’s early research: People trust actual measurements and data as facts, even though they are skeptical about models and predictions. “I talked with many people living in Fukushima prefecture. Many of them have dosimeters and measure radiation levels on their own. They might not trust the government, but they trust their own data and are then convinced that it is safe to live there and to eat local food,” Wainwright says.

    She has been impressed that area citizens have gained significant knowledge about radiation and radioactivity through these efforts. “But they are often frustrated that people living far away, in cities like Tokyo, still avoid agricultural products from Fukushima,” Wainwright says.

    Wainwright thinks that data derived from environmental monitoring — through proper visualization and communication — can address misconceptions and fake news that often hurt people near contaminated sites.

    Wainwright is now interested in how these technologies — tested with real data at contaminated sites — can be proactively used for existing and future nuclear facilities “before contamination happens,” as she explored for Nuclear News. “I don’t think it is a good idea to simply dismiss someone’s concern as irrational. Showing credible data has been much more effective to provide assurance. Or a proper monitoring network would enable us to minimize contamination or support emergency responses when accidents happen,” she says.

    Educating communities and students

    Part of empowering communities involves improving their ability to process science-based information. “Potentially hazardous facilities always end up in rural regions; minorities’ concerns are often ignored. The problem is that these regions don’t produce so many scientists or policymakers; they don’t have a voice,” Wainwright says. “I am determined to dedicate my time to improve STEM education in rural regions and to increase the voice in these regions.”

    In a project funded by the DOE, she collaborates with a team of researchers at the University of Alaska — the Alaska Center for Energy and Power and the Teaching Through Technology program — aiming to improve STEM education for rural and Indigenous communities. “Alaska is an important place for energy transition and environmental justice,” Wainwright says. Micro-nuclear reactors can potentially improve the lives of rural communities, which bear the brunt of high fuel and transportation costs. However, there is distrust of nuclear technologies, stemming from past nuclear weapons testing. At the same time, Alaska has vast metal mining resources for renewable energy and batteries, and there are concerns about environmental contamination from mining and other sources. The team’s vision is much broader, she points out. “The focus is on broader environmental monitoring technologies and relevant STEM education, addressing general water and air qualities,” Wainwright says.

    The issues also weave into the courses Wainwright teaches at MIT. “I think it is important for engineering students to be aware of environmental justice related to energy waste and mining as well as past contamination events and their recovery,” she says. “It is not OK just to send waste to, or develop mines in, rural regions, which could be a special place for some people. We need to make sure that these developments will not harm the environment and health of local communities.” Wainwright also hopes that this knowledge will ultimately encourage students to think creatively about engineering designs that minimize waste or recycle material.

    The last question of the final quiz in one of her recent courses was: Assume that you store high-level radioactive waste in your “backyard.” What technical strategies would make you and your family feel safe? “All students thought about this question seriously and many suggested excellent points, including those addressing environmental monitoring,” Wainwright says. “That made me hopeful about the future.” More

  • in

    Improving health outcomes by targeting climate and air pollution simultaneously

    Climate policies are typically designed to reduce greenhouse gas emissions that result from human activities and drive climate change. The largest source of these emissions is the combustion of fossil fuels, which increases atmospheric concentrations of ozone, fine particulate matter (PM2.5) and other air pollutants that pose public health risks. While climate policies may result in lower concentrations of health-damaging air pollutants as a “co-benefit” of reducing greenhouse gas emissions-intensive activities, they are most effective at improving health outcomes when deployed in tandem with geographically targeted air-quality regulations.

    Yet the computer models typically used to assess the likely air quality/health impacts of proposed climate/air-quality policy combinations come with drawbacks for decision-makers. Atmospheric chemistry/climate models can produce high-resolution results, but they are expensive and time-consuming to run. Integrated assessment models are far cheaper and faster to run, but they produce results only at global and regional scales, rendering them insufficiently precise for accurate assessments of air quality/health impacts at the subnational level.

    To overcome these drawbacks, a team of researchers at MIT and the University of California at Davis has developed a climate/air-quality policy assessment tool that is both computationally efficient and location-specific. Described in a new study in the journal ACS Environmental Au, the tool could enable users to obtain rapid estimates of combined policy impacts on air quality/health at more than 1,500 locations around the globe — estimates precise enough to reveal the equity implications of proposed policy combinations within a particular region.

    “The modeling approach described in this study may ultimately allow decision-makers to assess the efficacy of multiple combinations of climate and air-quality policies in reducing the health impacts of air pollution, and to design more effective policies,” says Sebastian Eastham, the study’s lead author and a principal research scientist at the MIT Joint Program on the Science and Policy of Global Change. “It may also be used to determine if a given policy combination would result in equitable health outcomes across a geographical area of interest.”

    To demonstrate the efficiency and accuracy of their policy assessment tool, the researchers showed that outcomes projected by the tool within seconds were consistent with region-specific results from detailed chemistry/climate models that took days or even months to run. While continuing to refine and develop their approaches, they are now working to embed the new tool into integrated assessment models for direct use by policymakers.

    “As decision-makers implement climate policies in the context of other sustainability challenges like air pollution, efficient modeling tools are important for assessment — and new computational techniques allow us to build faster and more accurate tools to provide credible, relevant information to a broader range of users,” says Noelle Selin, a professor at MIT’s Institute for Data, Systems and Society and Department of Earth, Atmospheric and Planetary Sciences, and supervising author of the study. “We are looking forward to further developing such approaches, and to working with stakeholders to ensure that they provide timely, targeted and useful assessments.”

    The study was funded, in part, by the U.S. Environmental Protection Agency and the Biogen Foundation. More

  • in

    Study: Carbon-neutral pavements are possible by 2050, but rapid policy and industry action are needed

    Almost 2.8 million lane-miles, or about 4.6 million lane-kilometers, of roads in the United States are paved.

    Roads and streets form the backbone of our built environment. They take us to work or school, take goods to their destinations, and much more.

    However, a new study by MIT Concrete Sustainability Hub (CSHub) researchers shows that the annual greenhouse gas (GHG) emissions of all construction materials used in the U.S. pavement network are 11.9 to 13.3 megatons. This is equivalent to the emissions of a gasoline-powered passenger vehicle driving about 30 billion miles in a year.
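
    The vehicle-miles equivalence can be checked with back-of-envelope arithmetic, assuming a figure of roughly 400 grams of CO2 per mile for an average gasoline passenger vehicle — an assumption on our part; the study's exact conversion factor may differ.

```python
# Back-of-envelope check of the vehicle-miles equivalence.
# Assumes ~400 g CO2 per mile for an average gasoline passenger car
# (a rough, commonly cited figure; the study's factor may differ).
emissions_tonnes = 12.6e6          # midpoint of 11.9-13.3 megatons
grams = emissions_tonnes * 1e6     # tonnes -> grams
miles = grams / 400                # grams / (g CO2 per mile)
print(f"{miles / 1e9:.1f} billion miles")  # roughly 30 billion miles
```

    The result lands close to the 30 billion miles quoted above, so the study's equivalence is consistent with a standard per-mile emissions factor.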

    As roads are built, repaved, and expanded, new approaches and thoughtful material choices are necessary to dampen their carbon footprint. 

    The CSHub researchers found that, by 2050, mixtures for pavements can be made carbon-neutral if industry and governmental actors help to apply a range of solutions — like carbon capture — to reduce, avoid, and neutralize embodied impacts. (A neutralization solution is any compensation mechanism in a product’s value chain that permanently removes the global warming impact remaining after emissions have been avoided and reduced.) Furthermore, nearly half of pavement-related greenhouse gas (GHG) savings can be achieved in the short term at negative or nearly net-zero cost.

    The research team, led by Hessam AzariJafari, MIT CSHub’s deputy director, closed gaps in our understanding of the impacts of pavement decisions by developing a dynamic model quantifying the embodied impact of future pavement materials demand for the U.S. road network. 

    The team first split the U.S. road network into 10-mile (about 16 kilometer) segments, forecasting the condition and performance of each. They then developed a pavement management system model to create benchmarks helping to understand the current level of emissions and the efficacy of different decarbonization strategies. 

    This model considered factors such as annual traffic volume and surface conditions, budget constraints, regional variation in pavement treatment choices, and pavement deterioration. The researchers also used a life-cycle assessment to calculate annual state-level emissions from acquiring pavement construction materials, considering future energy supply and materials procurement.

    The team considered three scenarios for the U.S. pavement network: A business-as-usual scenario in which technology remains static, a projected improvement scenario aligned with stated industry and national goals, and an ambitious improvement scenario that intensifies or accelerates projected strategies to achieve carbon neutrality. 

    If no steps are taken to decarbonize pavement mixtures, the team projected that GHG emissions of construction materials used in the U.S. pavement network would increase by 19.5 percent by 2050. Under the projected improvement scenario, they estimated a 38 percent reduction in embodied impacts for concrete and a 14 percent reduction for asphalt by 2050.

    The keys to making the pavement network carbon neutral by 2050 lie in multiple places. Fully renewable energy sources should be used for pavement materials production, transportation, and other processes. The federal government must contribute to the development of these low-carbon energy sources and carbon capture technologies, as it would be nearly impossible to achieve carbon neutrality for pavements without them. 

    Additionally, increasing pavements’ recycled content and improving their design and production efficiency can lower GHG emissions to an extent. Still, neutralization is needed to achieve carbon neutrality.

    Making the right pavement construction and repair choices would also contribute to the carbon neutrality of the network. For instance, concrete pavements can offer GHG savings across the whole life cycle as they are stiffer and stay smoother for longer, meaning they require less maintenance and have a lesser impact on the fuel efficiency of vehicles. 

    Concrete pavements have other use-phase benefits, including a cooling effect through an intrinsically high albedo, meaning they reflect more sunlight than regular pavements. Therefore, they can help combat extreme heat and beneficially affect the earth’s energy balance through negative radiative forcing, making albedo a potential neutralization mechanism.

    At the same time, a mix of fixes, including using concrete and asphalt in different contexts and proportions, could produce significant GHG savings for the pavement network; decision-makers must consider scenarios on a case-by-case basis to identify optimal solutions. 

    In addition, it may appear as though the GHG emissions of materials used in local roads are dwarfed by the emissions of interstate highway materials. However, the study found that the two road types have a similar impact. In fact, all road types contribute heavily to the total GHG emissions of pavement materials in general. Therefore, stakeholders at the federal, state, and local levels must be involved if our roads are to become carbon neutral. 

    The path to pavement network carbon-neutrality is, therefore, somewhat of a winding road. It demands regionally specific policies and widespread investment to help implement decarbonization solutions, just as renewable energy initiatives have been supported. Providing subsidies and covering the costs of premiums, too, are vital to avoid shifts in the market that would derail environmental savings.

    When planning for these shifts, we must recall that pavements have impacts not just in their production, but across their entire life cycle. As pavements are used, maintained, and eventually decommissioned, they have significant impacts on the surrounding environment.

    If we are to meet climate goals such as the Paris Agreement, which demands that we reach carbon-neutrality by 2050 to avoid the worst impacts of climate change, we — as well as industry and governmental stakeholders — must come together to take a hard look at the roads we use every day and work to reduce their life cycle emissions. 

    The study was published in the International Journal of Life Cycle Assessment. In addition to AzariJafari, the authors include Fengdi Guo of the MIT Department of Civil and Environmental Engineering; Jeremy Gregory, executive director of the MIT Climate and Sustainability Consortium; and Randolph Kirchain, director of the MIT CSHub. More