More stories

  • Exploring the cellular neighborhood

    Cells rely on complex molecular machines composed of protein assemblies to perform essential functions such as energy production, gene expression, and protein synthesis. To better understand how these machines work, scientists capture snapshots of them by isolating proteins from cells and using various methods to determine their structures. However, isolating proteins from cells also removes them from the context of their native environment, including protein interaction partners and cellular location.

    Recently, cryogenic electron tomography (cryo-ET) has emerged as a way to observe proteins in their native environment by imaging frozen cells at different angles to obtain three-dimensional structural information. This approach is exciting because it allows researchers to directly observe how and where proteins associate with each other, revealing the cellular neighborhood of those interactions within the cell.

    With the technology available to image proteins in their native environment, MIT graduate student Barrett Powell wondered if he could take it one step further: What if molecular machines could be observed in action? In a paper published March 8 in Nature Methods, Powell describes the method he developed, called tomoDRGN, for modeling structural differences of proteins in cryo-ET data that arise from protein motions or proteins binding to different interaction partners. These variations are known as structural heterogeneity. 

    Although Powell had joined the lab of MIT associate professor of biology Joey Davis as an experimental scientist, he recognized the potential impact of computational approaches in understanding structural heterogeneity within a cell. Previously, the Davis Lab developed a related methodology named cryoDRGN to understand structural heterogeneity in purified samples. As Powell and Davis saw cryo-ET rising in prominence in the field, Powell took on the challenge of re-imagining this framework to work in cells.

    When solving structures with purified samples, each particle is imaged only once. By contrast, cryo-ET data is collected by imaging each particle more than 40 times from different angles. That meant tomoDRGN needed to be able to merge the information from more than 40 images, which was where the project hit a roadblock: the amount of data led to an information overload.

    To address this, Powell successfully rebuilt the cryoDRGN model to prioritize only the highest-quality data. When imaging the same particle multiple times, radiation damage occurs. The images acquired earlier, therefore, tend to be of higher quality because the particles are less damaged.

    “By excluding some of the lower-quality data, the results were actually better than using all of the data — and the computational performance was substantially faster,” Powell says.
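
    As a minimal sketch of that data-pruning idea in Python: keep only the earliest, least-damaged tilt images of each particle. The array shapes, the `n_keep` cutoff, and the function name are hypothetical illustrations, not tomoDRGN's actual interface.

```python
import numpy as np

def keep_early_tilts(tilt_images, acquisition_order, n_keep=8):
    """Keep only the first n_keep tilt images of a particle.

    Earlier exposures have accumulated less radiation damage, so they
    tend to carry higher-quality signal. n_keep is an illustrative cutoff.

    tilt_images: (n_tilts, H, W) stack of images of one particle
    acquisition_order: (n_tilts,) collection order of each tilt
    """
    earliest = np.argsort(acquisition_order)[:n_keep]  # earliest exposures first
    return tilt_images[earliest]

# Toy example: a particle imaged 41 times; retain the 8 earliest views.
rng = np.random.default_rng(0)
images = rng.normal(size=(41, 64, 64))
order = rng.permutation(41)
print(keep_early_tilts(images, order).shape)  # (8, 64, 64)
```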

    Just as Powell was beginning work on testing his model, he had a stroke of luck: The authors of a groundbreaking new study that visualized, for the first time, ribosomes inside cells at near-atomic resolution shared their raw data on the Electron Microscopy Public Image Archive (EMPIAR). This dataset was an exemplary test case for Powell, through which he demonstrated that tomoDRGN could uncover structural heterogeneity within cryo-ET data.

    According to Powell, one exciting result is what tomoDRGN found surrounding a subset of ribosomes in the EMPIAR dataset. Some of the ribosomal particles were associated with a bacterial cell membrane and engaged in a process called cotranslational translocation. This occurs when a protein is being simultaneously synthesized and transported across a membrane. Researchers can use this result to make new hypotheses about how the ribosome functions with other protein machinery integral to transporting proteins outside of the cell, now guided by a structure of the complex in its native environment. 

    After seeing that tomoDRGN could resolve structural heterogeneity from a structurally diverse dataset, Powell was curious: How small of a population could tomoDRGN identify? For that test, he chose a protein named apoferritin, which is a commonly used benchmark for cryo-ET and is often treated as structurally homogeneous. Ferritin is a protein used for iron storage and is referred to as apoferritin when it lacks iron.

    Surprisingly, in addition to the expected particles, tomoDRGN revealed a minor population of ferritin particles — with iron bound — making up just 2 percent of the dataset, which had not previously been reported. This result further demonstrated tomoDRGN’s ability to identify structural states that occur so infrequently that they would be averaged out of a 3D reconstruction.

    Powell and other members of the Davis Lab are excited to see how tomoDRGN can be applied to further ribosomal studies and to other systems. Davis works on understanding how cells assemble, regulate, and degrade molecular machines, so the next steps include exploring ribosome biogenesis within cells in greater detail using this new tool.

    “What are the possible states that we may be losing during purification?” Davis asks. “Perhaps more excitingly, we can look at how they localize within the cell and what partners and protein complexes they may be interacting with.”

  • Learning the language of molecules to predict their properties

    Discovering new materials and drugs typically involves a manual, trial-and-error process that can take decades and cost millions of dollars. To streamline this process, scientists often use machine learning to predict molecular properties and narrow down the molecules they need to synthesize and test in the lab.

    Researchers from MIT and the MIT-IBM Watson AI Lab have developed a new, unified framework that can simultaneously predict molecular properties and generate new molecules much more efficiently than popular deep-learning approaches.

    To teach a machine-learning model to predict a molecule’s biological or mechanical properties, researchers must show it millions of labeled molecular structures — a process known as training. Due to the expense of discovering molecules and the challenges of hand-labeling millions of structures, large training datasets are often hard to come by, which limits the effectiveness of machine-learning approaches.

    By contrast, the system created by the MIT researchers can effectively predict molecular properties using only a small amount of data. Their system has an underlying understanding of the rules that dictate how building blocks combine to produce valid molecules. These rules capture the similarities between molecular structures, which helps the system generate new molecules and predict their properties in a data-efficient manner.

    This method outperformed other machine-learning approaches on both small and large datasets, and was able to accurately predict molecular properties and generate viable molecules when given a dataset with fewer than 100 samples.

    “Our goal with this project is to use some data-driven methods to speed up the discovery of new molecules, so you can train a model to do the prediction without all of these cost-heavy experiments,” says lead author Minghao Guo, an electrical engineering and computer science (EECS) graduate student.

    Guo’s co-authors include MIT-IBM Watson AI Lab research staff members Veronika Thost, Payel Das, and Jie Chen; recent MIT graduates Samuel Song ’23 and Adithya Balachandran ’23; and senior author Wojciech Matusik, a professor of electrical engineering and computer science and a member of the MIT-IBM Watson AI Lab, who leads the Computational Design and Fabrication Group within the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL). The research will be presented at the International Conference on Machine Learning.

    Learning the language of molecules

    To achieve the best results with machine-learning models, scientists need training datasets with millions of molecules that have similar properties to those they hope to discover. In reality, these domain-specific datasets are usually very small. So, researchers use models that have been pretrained on large datasets of general molecules, which they apply to a much smaller, targeted dataset. However, because these models haven’t acquired much domain-specific knowledge, they tend to perform poorly.

    The MIT team took a different approach. They created a machine-learning system that automatically learns the “language” of molecules — what is known as a molecular grammar — using only a small, domain-specific dataset. It uses this grammar to construct viable molecules and predict their properties.

    In language theory, one generates words, sentences, or paragraphs based on a set of grammar rules. You can think of a molecular grammar the same way. It is a set of production rules that dictate how to generate molecules or polymers by combining atoms and substructures.

    Just like a language grammar, which can generate a plethora of sentences using the same rules, one molecular grammar can represent a vast number of molecules. Molecules with similar structures use the same grammar production rules, and the system learns to understand these similarities.
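
    As a concrete, deliberately toy illustration of production rules, the sketch below expands a start symbol into SMILES-like strings. The rules and fragments here are invented for illustration; the learned grammars in the paper operate on molecular graphs and are far richer.

```python
import random

# Toy production rules: a nonterminal expands into fragments and/or
# further nonterminals. Real learned molecular grammars operate on
# graphs, not strings, and contain many more rules.
RULES = {
    "MOL":   [["CHAIN"], ["CHAIN", "RING"]],
    "CHAIN": [["C"], ["C", "CHAIN"], ["C", "O", "CHAIN"]],
    "RING":  [["c1ccccc1"]],  # a benzene ring as a terminal fragment
}

def expand(symbol):
    """Recursively expand a symbol using randomly chosen production rules."""
    if symbol not in RULES:          # terminal fragment: emit as-is
        return symbol
    production = random.choice(RULES[symbol])
    return "".join(expand(s) for s in production)

random.seed(3)
for _ in range(3):
    print(expand("MOL"))   # e.g. 'CC', 'COCc1ccccc1', ...
```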

    Since structurally similar molecules often have similar properties, the system uses its underlying knowledge of molecular similarity to predict properties of new molecules more efficiently. 

    “Once we have this grammar as a representation for all the different molecules, we can use it to boost the process of property prediction,” Guo says.

    The system learns the production rules for a molecular grammar using reinforcement learning — a trial-and-error process where the model is rewarded for behavior that gets it closer to achieving a goal.

    But because there could be billions of ways to combine atoms and substructures, the process to learn grammar production rules would be too computationally expensive for anything but the tiniest dataset.

    The researchers decoupled the molecular grammar into two parts. The first part, called a metagrammar, is a general, widely applicable grammar they design manually and give the system at the outset. Then it only needs to learn a much smaller, molecule-specific grammar from the domain dataset. This hierarchical approach speeds up the learning process.
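
    To make the decoupling concrete: the fixed metagrammar supplies general candidate rules, and only a small molecule-specific subset has to be learned from data. The sketch below substitutes greedy selection and a toy coverage reward for the paper's actual reinforcement-learning procedure; every name in it is hypothetical.

```python
# Illustrative only: greedy rule selection against a stand-in reward.
# The paper uses reinforcement learning; the reward here is a placeholder.
METAGRAMMAR = ["add_carbon", "add_oxygen", "fuse_ring", "branch", "add_nitrogen"]

def reward(rule_subset, dataset):
    """Placeholder reward: how many dataset molecules the subset 'covers'.
    A real reward would measure chemical validity and reconstruction."""
    return sum(all(r in rule_subset for r in needed) for needed in dataset)

# Each toy 'molecule' is described by the rules needed to build it.
dataset = [{"add_carbon"}, {"add_carbon", "fuse_ring"}, {"add_carbon", "add_oxygen"}]

selected = set()
for _ in range(2):  # learn a much smaller, domain-specific grammar
    best = max(METAGRAMMAR, key=lambda r: reward(selected | {r}, dataset))
    selected.add(best)
print(selected)  # a 2-rule grammar covering most of the toy dataset
```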

    Big results, small datasets

    In experiments, the researchers’ new system simultaneously generated viable molecules and polymers, and predicted their properties more accurately than several popular machine-learning approaches, even when the domain-specific datasets had only a few hundred samples. Some other methods also required a costly pretraining step that the new system avoids.

    The technique was especially effective at predicting physical properties of polymers, such as the glass transition temperature, the point at which a material softens from a rigid, glassy state to a rubbery one. Obtaining this information manually is often extremely costly because the experiments require high temperatures and pressures.

    To push their approach further, the researchers cut one training set down by more than half — to just 94 samples. Their model still achieved results that were on par with methods trained using the entire dataset.

    “This grammar-based representation is very powerful. And because the grammar itself is a very general representation, it can be deployed to different kinds of graph-form data. We are trying to identify other applications beyond chemistry or material science,” Guo says.

    In the future, they also want to extend their current molecular grammar to include the 3D geometry of molecules and polymers, which is key to understanding the interactions between polymer chains. They are also developing an interface that would show a user the learned grammar production rules and solicit feedback to correct rules that may be wrong, boosting the accuracy of the system.

    This work is funded, in part, by the MIT-IBM Watson AI Lab and its member company, Evonik.

  • J-WAFS announces 2023 seed grant recipients

    Today, the Abdul Latif Jameel Water and Food Systems Lab (J-WAFS) announced its ninth round of seed grants to support innovative research projects at MIT. The grants are designed to fund research efforts that tackle challenges related to water and food for human use, with the ultimate goal of creating meaningful impact as the world population continues to grow and the planet undergoes significant climate and environmental changes.

    Ten new projects led by 15 researchers from seven different departments will be supported this year. The projects address a range of challenges by employing advanced materials, technology innovations, and new approaches to resource management. The new projects aim to remove harmful chemicals from water sources, develop monitoring and other systems to help manage various aquaculture industries, optimize water purification materials, and more.

    “The seed grant program is J-WAFS’ flagship grant initiative,” says J-WAFS executive director Renee J. Robins. “The funding is intended to spur groundbreaking MIT research addressing complex issues that are challenging our water and food systems. The 10 projects selected this year show great promise, and we look forward to the progress and accomplishments these talented researchers will make,” she adds.

    The 2023 J-WAFS seed grant researchers and their projects are:

    Sara Beery, an assistant professor in the Department of Electrical Engineering and Computer Science (EECS), is building the first completely automated system to estimate the size of salmon populations in the Pacific Northwest (PNW).

    Salmon are a keystone species in the PNW, having fed human populations for at least the last 7,500 years. However, overfishing, habitat loss, and climate change threaten salmon populations across the region with extinction. Accurate salmon counts during the seasonal migration to their natal rivers to spawn are essential for fisheries regulation and management, but they are limited by human capacity. Fish population monitoring is a widespread challenge in the United States and worldwide. Beery and her team are working to build a system that will provide a detailed picture of the state of salmon populations at unprecedented spatial and temporal resolution by combining sonar sensors with computer vision and machine learning (CVML) techniques. The sonar will capture individual fish as they swim upstream, and CVML will be used to train accurate algorithms that detect, track, and count fish in the sonar video automatically while adapting to changing river conditions and fish densities.

    Another aquaculture project is being led by Michael Triantafyllou, the Henry L. and Grace Doherty Professor in Ocean Science and Engineering in the Department of Mechanical Engineering, and Robert Vincent, the assistant director at MIT’s Sea Grant Program. They are working with Otto Cordero, an associate professor in the Department of Civil and Environmental Engineering, to control harmful bacteria blooms in aquaculture algae feed production.

    Aquaculture in the United States is a $1.5 billion industry annually and helps support 1.7 million jobs, yet many American hatcheries are not able to keep up with demand. One barrier to aquaculture production is the high degree of variability in survival rates, most likely caused by a poorly controlled microbiome that leads to bacterial infections and sub-optimal feed efficiency. Triantafyllou, Vincent, and Cordero plan to monitor the microbiome composition of a shellfish hatchery in order to identify possible causative agents of mortality, as well as beneficial microbes. They hope to pair microbe data with detailed phenotypic information about the animal population to generate rapid diagnostic tests and explore the potential for microbiome therapies to protect larvae and prevent future outbreaks. The researchers plan to transfer their findings and technology to the local and regional aquaculture community to ensure healthy aquaculture production that will support the expansion of the U.S. aquaculture industry.

    David Des Marais is the Cecil and Ida Green Career Development Professor in the Department of Civil and Environmental Engineering. His 2023 J-WAFS project seeks to understand plant growth responses to elevated carbon dioxide (CO2) in the atmosphere, in the hopes of identifying breeding strategies that maximize crop yield under future CO2 scenarios.

    Today’s crop plants experience higher atmospheric CO2 than those of 20 or 30 years ago. Crops such as wheat, oat, barley, and rice typically increase their growth rate and biomass when grown at experimentally elevated atmospheric CO2, a phenomenon known as the “CO2 fertilization effect.” However, not all plant species respond to rising atmospheric CO2 with increased growth, and for those that do, increased growth doesn’t necessarily correspond to increased crop yield. Using specially built plant growth chambers that can control the concentration of CO2, Des Marais will explore how CO2 availability impacts the development of tillers (branches) in the grass species Brachypodium. He will study how gene expression controls tiller development, and whether this is affected by the growing environment. The tillering response refers to how many branches a plant produces, which sets a limit on how much grain it can yield. Therefore, optimizing the tillering response to elevated CO2 could greatly increase yield. Des Marais will also look at the complete genome sequences of Brachypodium, wheat, oat, and barley to help identify genes relevant for branch growth.

    Darcy McRose, an assistant professor in the Department of Civil and Environmental Engineering, is researching whether a combination of plant metabolites and soil bacteria can be used to make mineral-associated phosphorus more bioavailable.

    The nutrient phosphorus is essential for agricultural plant growth, but when added as a fertilizer, it sticks to the surface of soil minerals, decreasing its bioavailability, limiting plant growth, and leaving behind residual phosphorus. Heavily fertilized agricultural soils often harbor large reservoirs of this mineral-associated “legacy” phosphorus. Redox transformations are one chemical process that can liberate mineral-associated phosphorus. However, this needs to be carefully controlled, as overly mobile phosphorus can lead to runoff and pollution of natural waters. Ideally, phosphorus would be made bioavailable when plants need it and immobile when they don’t. Many plants make small metabolites called coumarins that might be able to solubilize mineral-adsorbed phosphorus and be activated and inactivated under different conditions. McRose will use laboratory experiments to determine whether a combination of plant metabolites and soil bacteria can serve as a highly efficient and tunable system for phosphorus solubilization. She also aims to develop an imaging platform to investigate exchanges of phosphorus between plants and soil microbes.

    Many of the 2023 seed grants will support innovative technologies to monitor, quantify, and remediate various kinds of pollutants found in water. Two of the new projects address the problem of per- and polyfluoroalkyl substances (PFAS), human-made chemicals that have recently emerged as a global health threat. Known as “forever chemicals,” PFAS are used in many manufacturing processes. These chemicals are known to cause significant health issues, including cancer, and they have become pervasive in soil, dust, air, groundwater, and drinking water. Unfortunately, the physical and chemical properties of PFAS render them difficult to detect and remove.

    Aristide Gumyusenge, the Merton C. Flemings Career Development Assistant Professor of Materials Science and Engineering, is using metal-organic frameworks for low-cost sensing and capture of PFAS. Most metal-organic frameworks (MOFs) are synthesized as particles, which complicates high-accuracy sensing due to defects such as intergranular boundaries. Thin-film-based electronic devices could enable the use of MOFs for many applications, especially chemical sensing. Gumyusenge’s project aims to design test kits based on two-dimensional conductive MOF films for detecting PFAS in drinking water. In early demonstrations, Gumyusenge and his team showed that these MOF films can sense PFAS at low concentrations. They will continue to iterate using a computation-guided approach to tune the sensitivity and selectivity of the kits, with the goal of deploying them in real-world scenarios.

    Carlos Portela, the Brit (1961) and Alex (1949) d’Arbeloff Career Development Professor in the Department of Mechanical Engineering, and Ariel Furst, the Cook Career Development Professor in the Department of Chemical Engineering, are building novel architected materials to act as filters for the removal of PFAS from water. Portela and Furst will design and fabricate nanoscale materials that use activated carbon and porous polymers to create a physical adsorption system. They will engineer the materials to have tunable porosities and morphologies that maximize interactions between contaminated water and functionalized surfaces, while providing a mechanically robust system.

    Rohit Karnik is a Tata Professor and interim co-department head of the Department of Mechanical Engineering. He is working on another technology, this one based on microbead sensors, to rapidly measure and monitor trace contaminants in water.

    Water pollution from both biological and chemical contaminants contributes to an estimated 1.36 million deaths annually. Chemical contaminants include pesticides and herbicides, heavy metals like lead, and compounds used in manufacturing. These emerging contaminants can be found throughout the environment, including in water supplies. The Environmental Protection Agency (EPA) in the United States sets recommended water quality standards, but states are responsible for developing their own monitoring criteria and systems, which must be approved by the EPA every three years. However, the availability of data on regulated chemicals and on candidate pollutants is limited by current testing methods, which are either insensitive or expensive and laboratory-based, requiring trained scientists and technicians. Karnik’s project proposes a simple, self-contained, portable system for monitoring trace and emerging pollutants in water, making it suitable for field studies. The concept is based on multiplexed microbead-based sensors that use thermal or gravitational actuation to generate a signal. His proposed sandwich assay, a testing format that is appealing for environmental sensing, will enable both single-use and continuous monitoring. The hope is that the bead-based assays will increase the ease and reach of detecting and quantifying trace contaminants in water for both personal and industrial-scale applications.

    Alexander Radosevich, a professor in the Department of Chemistry, and Timothy Swager, the John D. MacArthur Professor of Chemistry, are teaming up to create rapid, cost-effective, and reliable techniques for on-site arsenic detection in water.

    Arsenic contamination of groundwater is a problem that affects as many as 500 million people worldwide. Arsenic poisoning can lead to a range of severe health problems, from cancer to cardiovascular and neurological impacts. Both the EPA and the World Health Organization have established 10 parts per billion as a practical threshold for arsenic in drinking water, but measuring arsenic at such low levels is challenging, especially in resource-limited environments where sensitive laboratory equipment may not be readily available. Radosevich and Swager plan to develop reaction-based chemical sensors that bind and extract electrons from aqueous arsenic. In this way, they will exploit the inherent reactivity of aqueous arsenic to selectively detect and quantify it. This work will establish the chemical basis for a new method of detecting trace arsenic in drinking water.

    Rajeev Ram is a professor in the Department of Electrical Engineering and Computer Science. His J-WAFS research will advance a robust technology for monitoring nitrogen-containing pollutants, which threaten over 15,000 bodies of water in the United States alone.

    Nitrogen in the form of nitrate, nitrite, ammonia, and urea can run off from agricultural fertilizer and lead to harmful algal blooms that jeopardize human health. Unfortunately, monitoring these contaminants in the environment is challenging, as sensors are difficult to maintain and expensive to deploy. Ram and his students will work to establish limits of detection for nitrate, nitrite, ammonia, and urea in environmental, industrial, and agricultural samples using swept-source Raman spectroscopy, a method of detecting the presence of a chemical by illuminating a sample with a tunable, single-mode laser. This method does not require costly, high-power lasers or a spectrometer. Ram will then develop and demonstrate a portable system capable of achieving chemical specificity in complex, natural environments. Data generated by such a system should help regulate polluters and guide remediation.

    Kripa Varanasi, a professor in the Department of Mechanical Engineering, and Angela Belcher, the James Mason Crafts Professor and head of the Department of Biological Engineering, will join forces to develop an affordable water disinfection technology that selectively identifies, adsorbs, and kills “superbugs” in domestic and industrial wastewater.

    Recent research predicts that antibiotic-resistant bacteria (superbugs) will result in $100 trillion in health care expenses and 10 million deaths annually by 2050. The prevalence of superbugs in our water systems has increased due to corroded pipes, contamination, and climate change. Current drinking water disinfection technologies are designed to kill all types of bacteria before human consumption. However, for certain domestic and industrial applications there is a need to protect the good bacteria required for ecological processes that contribute to soil and plant health. Varanasi and Belcher will combine material, biological, process, and system engineering principles to design a sponge-based water disinfection technology that can identify and destroy harmful bacteria while leaving the good bacteria unharmed. By modifying the sponge surface with specialized nanomaterials, their approach will be able to kill superbugs faster and more efficiently. The sponge filters can be deployed under very low pressure, making them an affordable technology, especially for resource-constrained communities.

    In addition to the 10 seed grant projects, J-WAFS will also fund a research initiative led by Greg Sixt. Sixt is the research manager for climate and food systems at J-WAFS and the director of the J-WAFS-led Food and Climate Systems Transformation (FACT) Alliance. His project focuses on the Lake Victoria Basin (LVB) of East Africa. The second-largest freshwater lake in the world, Lake Victoria straddles three countries (Uganda, Tanzania, and Kenya) and has a catchment area that encompasses two more (Rwanda and Burundi). Sixt will collaborate with Michael Hauser of the University of Natural Resources and Life Sciences, Vienna, and Paul Kariuki of the Lake Victoria Basin Commission.

    The group will study how to adapt food systems to climate change in the Lake Victoria Basin. The basin is facing a range of climate threats that could significantly impact livelihoods and food systems in the expansive region. For example, extreme weather events like droughts and floods are negatively affecting agricultural production and freshwater resources. Across the LVB, current approaches to land and water management are unsustainable and threaten future food and water security. The Lake Victoria Basin Commission (LVBC), a specialized institution of the East African Community, wants to play a more vital role in coordinating transboundary land and water management to support transitions toward more resilient, sustainable, and equitable food systems. The primary goal of this research will be to support the LVBC’s transboundary land and water management efforts, specifically as they relate to sustainability and climate change adaptation in food systems. The research team will work with key stakeholders in Kenya, Uganda, and Tanzania to identify specific capacity needs to facilitate land and water management transitions. The two-year project will produce actionable recommendations to the LVBC.

  • Study: Shutting down nuclear power could increase air pollution

    Nearly 20 percent of today’s electricity in the United States comes from nuclear power. The U.S. has the largest nuclear fleet in the world, with 92 reactors scattered around the country. Many of these power plants have run for more than half a century and are approaching the end of their expected lifetimes.

    Policymakers are debating whether to retire the aging reactors or reinforce their structures to continue producing nuclear energy, which many consider a low-carbon alternative to climate-warming coal, oil, and natural gas.

    Now, MIT researchers say there’s another factor to consider in weighing the future of nuclear power: air quality. In addition to being a low carbon-emitting source, nuclear power is relatively clean in terms of the air pollution it generates. Without nuclear power, how would the pattern of air pollution shift, and who would feel its effects?

    The MIT team took on these questions in a new study appearing today in Nature Energy. They lay out a scenario in which every nuclear power plant in the country has shut down, and consider how other sources such as coal, natural gas, and renewable energy would fill the resulting energy needs throughout an entire year.

    Their analysis reveals that indeed, air pollution would increase, as coal, gas, and oil sources ramp up to compensate for nuclear power’s absence. This in itself may not be surprising, but the team has put numbers to the prediction, estimating that the increase in air pollution would have serious health effects, resulting in an additional 5,200 pollution-related deaths over a single year.

    If, however, more renewable energy sources become available to supply the energy grid, as they are expected to by the year 2030, air pollution would be curtailed, though not entirely. The team found that even under this heartier renewable scenario, there is still a slight increase in air pollution in some parts of the country, resulting in a total of 260 pollution-related deaths over one year.

    When they looked at the populations directly affected by the increased pollution, they found that Black or African American communities — a disproportionate number of whom live near fossil-fuel plants — experienced the greatest exposure.

    “This adds one more layer to the environmental health and social impacts equation when you’re thinking about nuclear shutdowns, where the conversation often focuses on local risks due to accidents and mining or long-term climate impacts,” says lead author Lyssa Freese, a graduate student in MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS).

    “In the debate over keeping nuclear power plants open, air quality has not been a focus of that discussion,” adds study author Noelle Selin, a professor in MIT’s Institute for Data, Systems, and Society (IDSS) and EAPS. “What we found was that air pollution from fossil fuel plants is so damaging that anything that increases it, such as a nuclear shutdown, is going to have substantial impacts, and for some people more than others.”

    The study’s MIT-affiliated co-authors also include Principal Research Scientist Sebastian Eastham and Guillaume Chossière SM ’17, PhD ’20, along with Alan Jenn of the University of California at Davis.

    Future phase-outs

    When nuclear power plants have closed in the past, fossil fuel use increased in response. In 1985, the closure of reactors in the Tennessee Valley prompted a spike in coal use, while the 2012 shutdown of a plant in California led to an increase in natural gas. In Germany, where nuclear power has almost completely been phased out, coal-fired power initially increased to fill the gap.

    Noting these trends, the MIT team wondered how the U.S. energy grid would respond if nuclear power were completely phased out.

    “We wanted to think about what future changes were expected in the energy grid,” Freese says. “We knew that coal use was declining, and there was a lot of work already looking at the impact of what that would have on air quality. But no one had looked at air quality and nuclear power, which we also noticed was on the decline.”

    In the new study, the team used an energy grid dispatch model developed by Jenn to assess how the U.S. energy system would respond to a shutdown of nuclear power. The model simulates the production of every power plant in the country and runs continuously to estimate, hour by hour, the energy demands in 64 regions across the country.

    Much like the way the actual energy market operates, the model chooses to turn a plant’s production up or down based on cost: Plants producing the cheapest energy at any given time are given priority to supply the grid over more costly energy sources.
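
    The core dispatch logic is easy to sketch. The plants, costs, and capacities below are made up, and real grid models, including the one used here, also handle transmission limits, ramp rates, and many more regions; this shows only the merit-order idea:

```python
def dispatch(plants, demand_mw):
    """Greedy merit-order dispatch: fill demand from the cheapest plants first.

    plants: list of (name, capacity_mw, cost_per_mwh) tuples; a simplified
    stand-in for a real grid model, ignoring transmission and ramp limits.
    """
    supplied = []
    remaining = demand_mw
    for name, capacity, cost in sorted(plants, key=lambda p: p[2]):
        if remaining <= 0:
            break
        used = min(capacity, remaining)   # run the plant up to its capacity
        supplied.append((name, used, cost))
        remaining -= used
    return supplied

# Hypothetical fleet: removing the nuclear plant forces coal/gas to ramp up.
fleet = [("nuclear", 1000, 12), ("gas", 800, 35), ("coal", 900, 28)]
print(dispatch(fleet, 1500))                                     # nuclear runs first
print(dispatch([p for p in fleet if p[0] != "nuclear"], 1500))   # coal+gas fill the gap
```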

    The team fed the model available data on each plant’s changing emissions and energy costs throughout an entire year. They then ran the model under different scenarios: an energy grid with no nuclear power; a baseline grid similar to today’s that includes nuclear power; and a grid with no nuclear power that also incorporates the additional renewable sources expected to be added by 2030.

    They combined each simulation with an atmospheric chemistry model to simulate how each plant’s various emissions travel around the country, and overlaid these tracks onto maps of population density. For populations in the path of pollution, they calculated the risk of premature death based on their degree of exposure.
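
    The last step, from exposure to premature deaths, is typically done with a concentration-response function. The log-linear form below is standard in the air quality literature, though the study's exact parameters are not given here; the county numbers are invented:

```python
import math

def excess_deaths(population, baseline_mortality, delta_pm25, beta=0.005827):
    """Excess premature deaths from a PM2.5 increase, log-linear form.

    beta corresponds to a relative risk of ~1.06 per 10 ug/m3, a value in
    the range reported by cohort studies; this study's own parameters and
    functional form may differ.
    """
    relative_risk = math.exp(beta * delta_pm25)
    attributable_fraction = 1 - 1 / relative_risk
    return population * baseline_mortality * attributable_fraction

# Hypothetical county: 1M people, 0.8% baseline annual mortality,
# +1 ug/m3 of PM2.5 from increased coal and gas generation.
print(round(excess_deaths(1_000_000, 0.008, 1.0)))  # ~46 deaths/year
```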

    System response

    [Video: Courtesy of the researchers, edited by MIT News]

    Their analysis showed a clear pattern: Without nuclear power, air pollution worsened in general, mainly affecting regions on the East Coast, where nuclear power plants are mostly concentrated. Without those plants, the team observed an uptick in production from coal and gas plants, resulting in 5,200 pollution-related deaths across the country, compared to the baseline scenario.

    They also calculated that more people are likely to die prematurely due to climate impacts from the increase in carbon dioxide emissions, as the grid compensates for nuclear power’s absence. The climate-related effects from this additional influx of carbon dioxide could lead to 160,000 additional deaths over the next century.

    “We need to be thoughtful about how we’re retiring nuclear power plants if we are trying to think about them as part of an energy system,” Freese says. “Shutting down something that doesn’t have direct emissions itself can still lead to increases in emissions, because the grid system will respond.”

    “This might mean that we need to deploy even more renewables, in order to fill the hole left by nuclear, which is essentially a zero-emissions energy source,” Selin adds. “Otherwise we will have a reduction in air quality that we weren’t necessarily counting on.”

    This study was supported, in part, by the U.S. Environmental Protection Agency.

  • Methane research takes on new urgency at MIT

    One of the most notable climate change provisions in the 2022 Inflation Reduction Act is the first U.S. federal tax on a greenhouse gas (GHG). That the fee targets methane (CH4), rather than carbon dioxide (CO2), emissions is indicative of the urgency the scientific community has placed on reducing this short-lived but powerful gas. Methane persists in the atmosphere for about 12 years — compared to more than 1,000 years for CO2 — yet upon release it immediately causes about 120 times more warming. The gas is responsible for at least a quarter of today’s gross warming.
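
    A back-of-the-envelope calculation shows how those two numbers trade off. This toy model treats CO2's forcing as constant (CO2 actually decays slowly), so the ratios are rough illustrations of the trend, not official warming-potential values:

```python
import math

LIFETIME = 12.0   # years methane persists in the atmosphere
POTENCY = 120.0   # warming relative to CO2 at the moment of release

def relative_warming(horizon_years):
    """Methane's cumulative warming vs. CO2 over a time horizon.
    Toy model: methane decays exponentially; CO2 forcing held constant at 1."""
    ch4 = POTENCY * LIFETIME * (1 - math.exp(-horizon_years / LIFETIME))
    co2 = horizon_years
    return ch4 / co2

for years in (20, 100):
    print(years, round(relative_warming(years)))  # roughly 58x at 20y, 14x at 100y
```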

    “Methane has a disproportionate effect on near-term warming,” says Desiree Plata, the director of MIT Methane Network. “CH4 does more damage than CO2 no matter how long you run the clock. By removing methane, we could potentially avoid critical climate tipping points.” 

    Because GHGs have a runaway effect on climate, reductions made now will have a far greater impact than the same reductions made in the future. Cutting methane emissions will slow the thawing of permafrost, which could otherwise lead to massive methane releases, as well as reduce increasing emissions from wetlands.  

    “The goal of MIT Methane Network is to reduce methane emissions by 45 percent by 2030, which would save up to 0.5 degree C of warming by 2100,” says Plata, an associate professor of civil and environmental engineering at MIT and director of the Plata Lab. “When you consider that governments are trying for a 1.5-degree reduction of all GHGs by 2100, this is a big deal.” 

    At normal concentrations, methane, like CO2, poses no health risks. Yet methane assists in the creation of high levels of ozone. In the lower atmosphere, ozone is a key component of air pollution, which leads to “higher rates of asthma and increased emergency room visits,” says Plata.

    Methane-related projects at the Plata Lab include a filter made of zeolite — the same clay-like material used in cat litter — designed to convert methane into CO2 at dairy farms and coal mines. At first glance, the technology would appear to be a bit of a hard sell, since it converts one GHG into another. Yet the zeolite filter’s low carbon and dollar costs, combined with the disproportionate warming impact of methane, make it a potential game-changer.

    The sense of urgency about methane has been amplified by recent studies that show humans are generating far more methane emissions than previously estimated, and that the rates are rising rapidly. Exactly how much methane is in the air is uncertain. Current methods for measuring atmospheric methane, such as ground, drone, and satellite sensors, “are not readily abundant and do not always agree with each other,” says Plata.  

    The Plata Lab is collaborating with Tim Swager in the MIT Department of Chemistry to develop low-cost methane sensors. “We are developing chemiresistive sensors that cost about a dollar that you could place near energy infrastructure to back-calculate where leaks are coming from,” says Plata.

    The researchers are working on improving the accuracy of the sensors using machine learning techniques and are planning to integrate internet-of-things technology to transmit alerts. Plata and Swager are not alone in focusing on data collection: the Inflation Reduction Act adds significant funding for methane sensor research. 

    Other research at the Plata Lab includes the development of nanomaterials and heterogeneous catalysis techniques for environmental applications. The lab also explores mitigation solutions for industrial waste, particularly those related to the energy transition. Plata is the co-founder of a lithium-ion battery recycling startup called Nth Cycle.

    On a more fundamental level, the Plata Lab is exploring how to develop products with environmental and social sustainability in mind. “Our overarching mission is to change the way that we invent materials and processes so that environmental objectives are incorporated along with traditional performance and cost metrics,” says Plata. “It is important to do that rigorous assessment early in the design process.”

    [Video: MIT amps up methane research]

    The MIT Methane Network brings together 26 researchers from MIT along with representatives of other institutions “that are dedicated to the idea that we can reduce methane levels in our lifetime,” says Plata. The organization supports research such as Plata’s zeolite and sensor projects, as well as designing pipeline-fixing robots, developing methane-based fuels for clean hydrogen, and researching the capture and conversion of methane into liquid chemical precursors for pharmaceuticals and plastics. Other members are researching policies to encourage more sustainable agriculture and land use, as well as methane-related social justice initiatives. 

    “Methane is an especially difficult problem because it comes from all over the place,” says Plata. A recent Global Carbon Project study estimated that half of methane emissions are caused by humans, led by waste and agriculture (28 percent), including cow and sheep belching, rice paddies, and landfills.

    Fossil fuels represent 18 percent of the total budget. Of this, about 63 percent is derived from oil and gas production and pipelines, 33 percent from coal mining activities, and 5 percent from industry and transportation. Human-caused biomass burning, primarily from slash-and-burn agriculture, emits about 4 percent of the global total.  

    The other half of the methane budget includes natural methane emissions from wetlands (20 percent) and other natural sources (30 percent). The latter includes permafrost melting and natural biomass burning, such as forest fires started by lightning.  

    With increases in global warming and population, the line between anthropogenic and natural causes is getting fuzzier. “Human activities are accelerating natural emissions,” says Plata. “Climate change increases the release of methane from wetlands and permafrost and leads to larger forest and peat fires.”  

    The calculations can get complicated. For example, wetlands provide benefits, including CO2 capture, biological diversity, and resiliency to sea level rise, that more than compensate for their methane releases. Meanwhile, draining swamps for development increases emissions.

    Over 100 nations have signed onto the U.N.’s Global Methane Pledge to cut anthropogenic emissions by at least 30 percent within the next 10 years. The U.N. report estimates that this goal can be achieved using proven technologies and that about 60 percent of these reductions can be accomplished at low cost.

    Much of the savings would come from greater efficiencies in fossil fuel extraction, processing, and delivery. The methane fees in the Inflation Reduction Act are primarily focused on encouraging fossil fuel companies to accelerate ongoing efforts to cap old wells, flare off excess emissions, and tighten pipeline connections.  

    Fossil fuel companies have already made far greater pledges to reduce methane than they have for CO2, which is central to their business. This is due partly to the potential savings and partly to preparation for methane regulations expected from the Environmental Protection Agency in late 2022. The regulations build upon existing EPA oversight of drilling operations, and will likely be exempt from the U.S. Supreme Court’s ruling that limits the federal government’s ability to regulate GHGs.

    Zeolite filter targets methane in dairy and coal 

    The “low-hanging fruit” of gas stream mitigation addresses most of the 20 percent of total methane emissions in which the gas is released in sufficiently high concentrations for flaring. Plata’s zeolite filter aims to address the thornier challenge of reducing the 80 percent of non-flammable dilute emissions. 

    Plata found inspiration in decades-old catalysis research for turning methane into methanol. One strategy has been to use an abundant, low-cost aluminosilicate clay called zeolite.  

    “The methanol creation process is challenging because you need to separate a liquid, and it has very low efficiency,” says Plata. “Yet zeolite can be very efficient at converting methane into CO2, and it is much easier because it does not require liquid separation. Converting methane to CO2 sounds like a bad thing, but there is a major anti-warming benefit. And because methane is much more dilute than CO2, the relative CO2 contribution is minuscule.”  

    Using zeolite to create methanol requires highly concentrated methane, high temperatures and pressures, and industrial processing conditions. Yet Plata’s process, which dopes the zeolite with copper, operates in the presence of oxygen at much lower temperatures under typical pressures. “We let the methane proceed the way it wants from a thermodynamic perspective from methane to methanol down to CO2,” says Plata. 
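
    Written out, that pathway is stepwise oxidation, and the net result is the standard complete-oxidation stoichiometry (textbook chemistry, not specific to this work):

    $$\mathrm{CH_4} \;\xrightarrow{+\,\frac{1}{2}\mathrm{O_2}}\; \mathrm{CH_3OH} \;\xrightarrow{+\,\frac{3}{2}\mathrm{O_2}}\; \mathrm{CO_2} + 2\,\mathrm{H_2O} \qquad \text{(net: } \mathrm{CH_4} + 2\,\mathrm{O_2} \rightarrow \mathrm{CO_2} + 2\,\mathrm{H_2O}\text{)}$$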

    Researchers around the world are working on other dilute methane removal technologies. Projects include spraying iron salt aerosols into sea air where they react with natural chlorine or bromine radicals, thereby capturing methane. Most of these geoengineering solutions, however, are difficult to measure and would require massive scale to make a difference.  

    Plata is focusing her zeolite filters on environments where concentrations are high, but not so high as to be flammable. “We are trying to scale zeolite into filters that you could snap onto the side of a cross-ventilation fan in a dairy barn or in a ventilation air shaft in a coal mine,” says Plata. “For every packet of air we bring in, we take a lot of methane out, so we get more bang for our buck.”  

    The major challenge is creating a filter that can handle high flow rates without getting clogged or falling apart. Dairy barn air handlers can push air at up to 5,000 cubic feet per minute and coal mine handlers can approach 500,000 CFM. 

    Plata is exploring engineering options including fluidized bed reactors with floating catalyst particles. Another filter solution, based in part on catalytic converters, features “higher-order geometric structures where you have a porous material with a long path length where the gas can interact with the catalyst,” says Plata. “This avoids the challenge with fluidized beds of containing catalyst particles in the reactor. Instead, they are fixed within a structured material.”  

    Competing technologies for removing methane from mine shafts “operate at temperatures of 1,000 to 1,200 degrees C, requiring a lot of energy and risking explosion,” says Plata. “Our technology avoids safety concerns by operating at 300 to 400 degrees C. It reduces energy use and provides more tractable deployment costs.” 

    Potentially, energy and dollar costs could be further reduced in coal mines by capturing the heat generated by the conversion process. “In coal mines, you have enrichments above a half-percent methane, but below the 4 percent flammability threshold,” says Plata. “The excess heat from the process could be used to generate electricity using off-the-shelf converters.” 

    Plata’s dairy barn research is funded by the Gerstner Family Foundation, and the coal mining project by the U.S. Department of Energy. “The DOE would like us to spin out the technology for scale-up within three years,” says Plata. “We cannot guarantee we will hit that goal, but we are trying to develop this as quickly as possible. Our society needs to start reducing methane emissions now.”

  • MIT welcomes eight MLK Visiting Professors and Scholars for 2022-23

    From space traffic to virus evolution, community journalism to hip-hop, this year’s cohort in the Martin Luther King Jr. (MLK) Visiting Professors and Scholars Program will power an unprecedented range of intellectual pursuits during their time on the MIT campus. 

    “MIT is so fortunate to have this group of remarkable individuals join us,” says Institute Community and Equity Officer John Dozier. “They bring a range and depth of knowledge to share with our students and faculty, and we look forward to working with them to build a stronger sense of community across the Institute.”

    Since its inception in 1990, the MLK Scholars Program has hosted more than 135 visiting professors, practitioners, and intellectuals who enhance and enrich the MIT community through their engagement with students and faculty. The program, which honors the life and legacy of MLK by increasing the presence and recognizing the contributions of underrepresented scholars, is supported by the Office of the Provost with oversight from the Institute Community and Equity Office. 

    In spring 2022, MIT President Rafael Reif committed MIT to adding two new positions in the MLK Visiting Scholars Program, including an expert in Native American studies. Those additional positions will be filled in the coming year.

    The 2022-23 MLK Scholars:

    Daniel Auguste is an assistant professor in the Department of Sociology at Florida Atlantic University and is hosted by Roberto Fernandez in MIT Sloan School of Management. Auguste’s research interests include social inequalities in entrepreneurship development. During his visit, Auguste will study the impact of education debt burden and wealth inequality on business ownership and success, and how these consequences differ by race and ethnicity.

    Tawanna Dillahunt is an associate professor in the School of Information at the University of Michigan, where she also holds an appointment with the electrical engineering and computer science department. Catherine D’Ignazio in the Department of Urban Studies and Planning and Fotini Christia in the Institute for Data, Systems, and Society are her faculty hosts. Dillahunt’s scholarship focuses on equitable and inclusive computing. She identifies technological opportunities and implements tools to address and alleviate employment challenges faced by marginalized people. Dillahunt’s visiting appointment begins in September 2023.

    Javit Drake ’94 is a principal scientist in modeling and simulation and measurement sciences at Procter & Gamble. His faculty host is Fikile Brushett in the Department of Chemical Engineering. An industry researcher with electrochemical energy expertise, Drake is a Course 10 (chemical engineering) alumnus, repeat lecturer, and research affiliate in the department. During his visit, he will continue to work with the Brushett Research Group to deepen his research and understanding of battery technologies while he innovates from those discoveries.

    Eunice Ferreira is an associate professor in the Department of Theater at Skidmore College and is hosted by Claire Conceison in Music and Theater Arts. This fall, Ferreira will teach “Black Theater Matters,” a course where students will explore performance and the cultural production of Black intellectuals and artists on Broadway and in local communities. Her upcoming book projects include “Applied Theatre and Racial Justice: Radical Imaginings for Just Communities” (forthcoming from Routledge) and “Crioulo Performance: Remapping Creole and Mixed Race Theatre” (forthcoming from Vanderbilt University Press). 

    Wasalu Jaco, widely known as Lupe Fiasco, is a rapper, record producer, and entrepreneur. He will be co-hosted by Nick Montfort of Comparative Media Studies/Writing and Mary Fuller of Literature. Jaco’s interests lie in the nexus of rap, computing, and activism. As a former visiting artist in MIT’s Center for Art, Science and Technology (CAST), he will leverage existing collaborations and participate in digital media and art research projects that use computing to explore novel questions related to hip-hop and rap. In addition to his engagement in cross-departmental projects, Jaco will teach a spring course on rap in the media and social contexts.

    Moriba Jah is an associate professor in the Aerospace Engineering and Engineering Mechanics Department at the University of Texas at Austin. He is hosted by Danielle Wood in Media Arts and Sciences and the Department of Aeronautics and Astronautics, and Richard Linares in the Department of Aeronautics and Astronautics. Jah’s research interests include space sustainability and space traffic management; as a visiting scholar, he will develop and strengthen a joint MIT/UT-Austin research program to increase resources and visibility of space sustainability. Jah will also help host the AeroAstro Rising Stars symposium, which highlights graduate students, postdocs, and early-career faculty from backgrounds underrepresented in aerospace engineering.

    Louis Massiah SM ’82 is a documentary filmmaker and the founder and director of Scribe Video Center, a nonprofit community media organization that uses media as a tool for social change. His work focuses on empowering Black, Indigenous, and People of Color (BIPOC) filmmakers to tell the stories of and by BIPOC communities. Massiah is hosted by Vivek Bald in Comparative Media Studies/Writing. Massiah’s first project will be the launch of a National Community Media Journalism Consortium, a platform to share local news on a broader scale across communities.

    Brian Nord, a scientist at Fermi National Accelerator Laboratory, will join the Laboratory for Nuclear Science, hosted by Jesse Thaler in the Department of Physics. Nord’s research interests include the connection between ethics, justice, and scientific discovery. His efforts will be aimed at introducing new insights into how we model physical systems, design scientific experiments, and approach the ethics of artificial intelligence. As a lead organizer of the Strike for Black Lives in 2020, Nord will engage with justice-oriented members of the MIT physics community to strategize actions for advocacy and activism.

    Brandon Ogbunu, an assistant professor in the Department of Ecology and Evolutionary Biology at Yale University, will be hosted by Matthew Shoulders in the Department of Chemistry. Ogbunu’s research focus is on implementing chemistry and materials science perspectives into his work on virus evolution. In addition to serving as a guest lecturer in graduate courses, he will be collaborating with the Office of Engineering Outreach Programs on their K-12 outreach and recruitment efforts.

    For more information about these scholars and the program, visit mlkscholars.mit.edu.

  • Is it topological? A new materials database has the answer

    What will it take to make our electronics smarter, faster, and more resilient? One idea is to build them from materials that are topological.

    Topology stems from a branch of mathematics that studies shapes that can be manipulated or deformed without losing certain core properties. A donut is a common example: If it were made of rubber, a donut could be twisted and squeezed into a completely new shape, such as a coffee mug, while retaining a key trait — namely, its center hole, which takes the form of the cup’s handle. The hole, in this case, is a topological trait, robust against certain deformations.

    In recent years, scientists have applied concepts of topology to the discovery of materials with similarly robust electronic properties. In 2007, researchers predicted the first electronic topological insulators — materials in which electrons behave in ways that are “topologically protected,” or persistent in the face of certain disruptions.

    Since then, scientists have searched for more topological materials with the aim of building better, more robust electronic devices. Until recently, only a handful of such materials had been identified, and topological materials were therefore assumed to be a rarity.

    Now researchers at MIT and elsewhere have discovered that, in fact, topological materials are everywhere, if you know how to look for them.

    In a paper published today in Science, the team, led by Nicolas Regnault of Princeton University and the École Normale Supérieure Paris, reports harnessing the power of multiple supercomputers to map the electronic structure of more than 96,000 natural and synthetic crystalline materials. They applied sophisticated filters to determine whether and what kind of topological traits exist in each structure.

    Overall, they found that 90 percent of all known crystalline structures contain at least one topological property, and more than 50 percent of all naturally occurring materials exhibit some sort of topological behavior.

    “We found there’s a ubiquity — topology is everywhere,” says Benjamin Wieder, the study’s co-lead author and a postdoc in MIT’s Department of Physics.

    The team has compiled the newly identified materials into a new, freely accessible Topological Materials Database resembling a periodic table of topology. With this new library, scientists can quickly search materials of interest for any topological properties they might hold, and harness them to build ultra-low-power transistors, new magnetic memory storage, and other devices with robust electronic properties.

    The paper’s co-authors include co-lead author Maia Vergniory of the Donostia International Physics Center, Luis Elcoro of the University of the Basque Country, Stuart Parkin and Claudia Felser of the Max Planck Institute, and Andrei Bernevig of Princeton University.

    Beyond intuition

    The new study was motivated by a desire to speed up the traditional search for topological materials.

    “The way the original materials were found was through chemical intuition,” Wieder says. “That approach had a lot of early successes. But as we theoretically predicted more kinds of topological phases, it seemed intuition wasn’t getting us very far.”

    Wieder and his colleagues instead utilized an efficient and systematic method to root out signs of topology, or robust electronic behavior, in all known crystalline structures, also known as inorganic solid-state materials.

    For their study, the researchers looked to the Inorganic Crystal Structure Database, or ICSD, a repository into which researchers enter the atomic and chemical structures of crystalline materials that they have studied. The database includes materials found in nature, as well as those that have been synthesized and manipulated in the lab. The ICSD is currently the largest materials database in the world, containing over 193,000 crystals whose structures have been mapped and characterized.

    The team downloaded the entire ICSD, and after performing some data cleaning to weed out structures with corrupted files or incomplete data, the researchers were left with just over 96,000 processable structures. For each of these structures, they performed a set of calculations based on fundamental knowledge of the relation between chemical constituents, to produce a map of the material’s electronic structure, also known as the electron band structure.

    The team was able to efficiently carry out the complicated calculations for each structure using multiple supercomputers, which they then employed to perform a second set of operations, this time to screen for various known topological phases, or persistent electrical behavior in each crystal material.
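
    Schematically, the screen is an embarrassingly parallel filter over structures. In this Python sketch, `compute_band_structure` and `classify_topology` are toy stand-ins for the first-principles and symmetry-indicator (topological quantum chemistry) calculations that actually ran on the supercomputers:

```python
from concurrent.futures import ProcessPoolExecutor

def compute_band_structure(structure):
    # Stand-in for the expensive first-principles calculation.
    return {"id": structure["id"], "bands": "..."}

def classify_topology(bands):
    # Stand-in for the symmetry-based topological filter; toy rule only.
    return "topological" if bands["id"] % 10 else "trivial"

def is_processable(structure):
    # Data cleaning: drop entries with corrupted files or incomplete data.
    return structure.get("atoms") is not None

def screen(structure):
    return structure["id"], classify_topology(compute_band_structure(structure))

if __name__ == "__main__":
    icsd = [{"id": i, "atoms": "..."} for i in range(1000)]   # toy ICSD slice
    structures = [s for s in icsd if is_processable(s)]
    with ProcessPoolExecutor() as pool:                       # embarrassingly parallel
        results = list(pool.map(screen, structures))
    topological = [r for r in results if r[1] != "trivial"]
    print(len(topological), "of", len(structures))            # 900 of 1000 (toy data)
```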

    “We’re looking for signatures in the electronic structure in which certain robust phenomena should occur in this material,” explains Wieder, whose previous work involved refining and expanding the screening technique, known as topological quantum chemistry.

    From their high-throughput analysis, the team quickly discovered a surprisingly large number of materials that are naturally topological, without any experimental manipulation, as well as materials that can be manipulated, for instance with light or chemical doping, to exhibit some sort of robust electronic behavior. They also discovered a handful of materials that contained more than one topological state when exposed to certain conditions.

    “Topological phases of matter in 3D solid-state materials have been proposed as venues for observing and manipulating exotic effects, including the interconversion of electrical current and electron spin, the tabletop simulation of exotic theories from high-energy physics, and even, under the right conditions, the storage and manipulation of quantum information,” Wieder notes. 

    For experimentalists who are studying such effects, Wieder says the team’s new database now reveals a menagerie of new materials to explore.

    This research was funded, in part, by the U.S. Department of Energy, the National Science Foundation, and the Office of Naval Research.

  • Seven from MIT elected to American Academy of Arts and Sciences for 2022

    Seven MIT faculty members are among more than 250 leaders from academia, the arts, industry, public policy, and research elected to the American Academy of Arts and Sciences, the academy announced Thursday.

    One of the nation’s most prestigious honorary societies, the academy is also a leading center for independent policy research. Members contribute to academy publications, as well as studies of science and technology policy, energy and global security, social policy and American institutions, the humanities and culture, and education.

    Those elected from MIT this year are:

    Alberto Abadie, professor of economics and associate director of the Institute for Data, Systems, and Society
    Regina Barzilay, the School of Engineering Distinguished Professor for AI and Health
    Roman Bezrukavnikov, professor of mathematics
    Michale S. Fee, the Glen V. and Phyllis F. Dorflinger Professor and head of the Department of Brain and Cognitive Sciences
    Dina Katabi, the Thuan and Nicole Pham Professor
    Ronald T. Raines, the Roger and Georges Firmenich Professor of Natural Products Chemistry
    Rebecca R. Saxe, the John W. Jarve Professor of Brain and Cognitive Sciences

    “We are celebrating a depth of achievements in a breadth of areas,” says David Oxtoby, president of the American Academy. “These individuals excel in ways that excite us and inspire us at a time when recognizing excellence, commending expertise, and working toward the common good is absolutely essential to realizing a better future.”

    Since its founding in 1780, the academy has elected leading thinkers from each generation, including George Washington and Benjamin Franklin in the 18th century, Maria Mitchell and Daniel Webster in the 19th century, and Toni Morrison and Albert Einstein in the 20th century. The current membership includes more than 250 Nobel and Pulitzer Prize winners.