More stories

    A breakthrough on “loss and damage,” but also disappointment, at UN climate conference

    As the 2022 United Nations climate change conference, known as COP27, stretched into its final hours on Saturday, Nov. 19, it was uncertain what kind of agreement might emerge from two weeks of intensive international negotiations.

    In the end, COP27 produced mixed results: on the one hand, a historic agreement for wealthy countries to compensate low-income countries for “loss and damage,” but on the other, limited progress on new plans for reducing the greenhouse gas emissions that are warming the planet.

    “We need to drastically reduce emissions now — and this is an issue this COP did not address,” said U.N. Secretary-General António Guterres in a statement at the conclusion of COP27. “A fund for loss and damage is essential — but it’s not an answer if the climate crisis washes a small island state off the map — or turns an entire African country to desert.”

    Throughout the two weeks of the conference, a delegation of MIT students, faculty, and staff was at the Sharm El-Sheikh International Convention Center to observe the negotiations, conduct and share research, participate in panel discussions, and forge new connections with researchers, policymakers, and advocates from around the world.

    Loss and damage

    A key issue coming into COP27 (COP stands for “conference of the parties” to the U.N. Framework Convention on Climate Change, held for the 27th time) was loss and damage: a term used by the U.N. to refer to harms caused by climate change — whether through acute catastrophes like extreme weather events or through slower-moving impacts like sea level rise — to which communities and countries are unable to adapt.

    Ultimately, a deal on loss and damage proved to be COP27’s most prominent accomplishment. Negotiators reached an eleventh-hour agreement to “establish new funding arrangements for assisting developing countries that are particularly vulnerable to the adverse effects of climate change.” 

    “Providing financial assistance to developing countries so they can better respond to climate-related loss and damage is not only a moral issue, but also a pragmatic one,” said Michael Mehling, deputy director of the MIT Center for Energy and Environmental Policy Research, who attended COP27 and participated in side events. “Future emissions growth will be squarely centered in the developing world, and offering support through different channels is key to building the trust needed for more robust global cooperation on mitigation.”

    Youssef Shaker, a graduate student in the MIT Technology and Policy Program and a research assistant with the MIT Energy Initiative, attended the second week of the conference, where he followed the negotiations over loss and damage closely. 

    “While the creation of a fund is certainly an achievement,” Shaker said, “significant questions remain to be answered, such as the size of the funding available as well as which countries receive access to it.” A loss-and-damage fund that is not adequately funded, Shaker noted, “would not be an impactful outcome.” 

    The agreement on loss and damage created a new committee, made up of 24 country representatives, to “operationalize” the new funding arrangements, including identifying funding sources. The committee is tasked with delivering a set of recommendations at COP28, which will take place next year in Dubai.

    Advising the U.N. on net zero

    Though the decisions reached at COP27 did not include major new commitments on reducing emissions from the combustion of fossil fuels, the transition to a clean global energy system was nevertheless a key topic of conversation throughout the conference.

    The Council of Engineers for the Energy Transition (CEET), an independent, international body of engineers and energy systems experts formed to provide advice to the U.N. on achieving net-zero emissions globally by 2050, convened for the first time at COP27. Jessika Trancik, a professor in the MIT Institute for Data, Systems, and Society and a member of CEET, spoke on a U.N.-sponsored panel on solutions for the transition to clean energy.

    Trancik noted that the energy transition will look different in different regions of the world. “As engineers, we need to understand those local contexts and design solutions around those local contexts — that’s absolutely essential to support a rapid and equitable energy transition.”

    At the same time, Trancik noted that there is now a set of “low-cost, ready-to-scale tools” available to every region — tools that resulted from a globally competitive process of innovation, stimulated by public policies in different countries, that dramatically drove down the costs of technologies like solar energy and lithium-ion batteries. The key, Trancik said, is for regional transition strategies to “tap into global processes of innovation.”

    Reinventing climate adaptation

    Elfatih Eltahir, the H. M. King Bhumibol Professor of Hydrology and Climate, traveled to COP27 to present plans for the Jameel Observatory Climate Resilience Early Warning System (CREWSnet), one of the five projects selected in April 2022 as a flagship in MIT’s Climate Grand Challenges initiative. CREWSnet focuses on climate adaptation, the term for adapting to climate impacts that are unavoidable.

    The aim of CREWSnet, Eltahir told the audience during a panel discussion, is “nothing short of reinventing the process of climate change adaptation,” so that it is proactive rather than reactive; community-led; data-driven and evidence-based; and so that it integrates different climate risks, from heat waves to sea level rise, rather than treating them individually.

    “However, it’s easy to talk about these changes,” said Eltahir. “The real challenge, which we are now just launching and engaging in, is to demonstrate that on the ground.” Eltahir said that early demonstrations will happen in a couple of key locations, including southwest Bangladesh, where multiple climate risks — rising sea levels, increasing soil salinity, and intensifying heat waves and cyclones — are combining to threaten the area’s agricultural production.

    Building on COP26

    Some members of MIT’s delegation attended COP27 to advance efforts that had been formally announced at last year’s U.N. climate conference, COP26, in Glasgow, Scotland.

    At an official U.N. side event co-organized by MIT on Nov. 11, Greg Sixt, the director of the Food and Climate Systems Transformation (FACT) Alliance led by the Abdul Latif Jameel Water and Food Systems Lab, provided an update on the alliance’s work since its launch at COP26.

    Food systems are a major source of greenhouse gas emissions — and are increasingly vulnerable to climate impacts. The FACT Alliance works to better connect researchers to farmers, food businesses, policymakers, and other food systems stakeholders to make food systems (which include food production, consumption, and waste) more sustainable and resilient. 

    Sixt told the audience that the FACT Alliance now counts over 20 research and stakeholder institutions around the world among its members, but also collaborates with other institutions in an “open network model” to advance work in key areas — such as a new research project exploring how climate scenarios could affect global food supply chains.

    Marcela Angel, research program director for the Environmental Solutions Initiative (ESI), helped convene a meeting at COP27 of the Afro-InterAmerican Forum on Climate Change, which also launched at COP26. The forum works with Afro-descendant leaders across the Americas to address significant environmental issues, including climate risks and biodiversity loss. 

    At the event — convened with the Colombian government and the nonprofit Conservation International — ESI brought together leaders from six countries in the Americas and presented recent work that estimates that there are over 178 million individuals who identify as Afro-descendant living in the Americas, in lands of global environmental importance. 

    “There is a significant overlap between biodiversity hot spots, protected areas, and areas of high Afro-descendant presence,” said Angel. “But the role and climate contributions of these communities is understudied, and often made invisible.”    

    Limiting methane emissions

    Methane is a short-lived but potent greenhouse gas: When released into the atmosphere, it immediately traps about 120 times more heat than carbon dioxide does. More than 150 countries have now signed the Global Methane Pledge, launched at COP26, which aims to reduce methane emissions by at least 30 percent by 2030 compared to 2020 levels.

    Sergey Paltsev, the deputy director of the Joint Program on the Science and Policy of Global Change and a senior research scientist at the MIT Energy Initiative, gave the keynote address at a Nov. 17 event on methane, where he noted the importance of methane reductions from the oil and gas sector to meeting the 2030 goal.

    “The oil and gas sector is where methane emissions reductions could be achieved the fastest,” said Paltsev. “We also need to employ an integrated approach to address methane emissions in all sectors and all regions of the world because methane emissions reductions provide a near-term pathway to avoiding dangerous tipping points in the global climate system.”

    “Keep fighting relentlessly”

    Arina Khotimsky, a senior majoring in materials science and engineering and a co-president of the MIT Energy and Climate Club, attended the first week of COP27. She reflected on the experience in a social media post after returning home. 

    “COP will always have its haters. Is there greenwashing? Of course! Is everyone who should have a say in this process in the room? Not even close,” wrote Khotimsky. “So what does it take for COP to matter? It takes everyone who attended to not only put ‘climate’ on front-page news for two weeks, but to return home and keep fighting relentlessly against climate change. I know that I will.”

    Computing for the health of the planet

    The health of the planet is one of the most important challenges facing humankind today. From climate change to unsafe levels of air and water pollution to coastal and agricultural land erosion, a number of serious challenges threaten human and ecosystem health.

    Ensuring the health and safety of our planet necessitates approaches that connect scientific, engineering, social, economic, and political aspects. New computational methods can play a critical role by providing data-driven models and solutions for cleaner air, usable water, resilient food, efficient transportation systems, better-preserved biodiversity, and sustainable sources of energy.

    The MIT Schwarzman College of Computing is committed to hiring multiple new faculty in computing for climate and the environment, as part of MIT’s plan to recruit 20 climate-focused faculty under its climate action plan. This year the college undertook searches with several departments in the schools of Engineering and Science for shared faculty in computing for the health of the planet, one of the six strategic areas of inquiry identified in an MIT-wide planning process to help focus shared hiring efforts. The college also undertook searches for core computing faculty in the Department of Electrical Engineering and Computer Science (EECS).

    The searches are part of an ongoing effort by the MIT Schwarzman College of Computing to hire 50 new faculty — 25 shared with other academic departments and 25 in computer science and artificial intelligence and decision-making. The goal is to build capacity at MIT to more deeply infuse computing into other disciplines across departments.

    Four interdisciplinary scholars were hired in these searches. They will join the MIT faculty in the coming year to engage in research and teaching that will advance physical understanding of low-carbon energy solutions, Earth-climate modeling, biodiversity monitoring and conservation, and agricultural management through high-performance computing, transformational numerical methods, and machine-learning techniques.

    “By coordinating hiring efforts with multiple departments and schools, we were able to attract a cohort of exceptional scholars in this area to MIT. Each of them is developing and using advanced computational methods and tools to help find solutions for a range of climate and environmental issues,” says Daniel Huttenlocher, dean of the MIT Schwarzman College of Computing and the Henry Warren Ellis Professor of Electrical Engineering and Computer Science. “They will also help strengthen cross-departmental ties in computing across an important, critical area for MIT and the world.”

    “These strategic hires in the area of computing for climate and the environment are an incredible opportunity for the college to deepen its academic offerings and create new opportunity for collaboration across MIT,” says Anantha P. Chandrakasan, dean of the MIT School of Engineering and the Vannevar Bush Professor of Electrical Engineering and Computer Science. “The college plays a pivotal role in MIT’s overarching effort to hire climate-focused faculty — introducing the critical role of computing to address the health of the planet through innovative research and curriculum.”

    The four new faculty members are:

    Sara Beery will join MIT as an assistant professor in the Faculty of Artificial Intelligence and Decision-Making in EECS in September 2023. Beery received her PhD in computing and mathematical sciences at Caltech in 2022, where she was advised by Pietro Perona. Her research focuses on building computer vision methods that enable global-scale environmental and biodiversity monitoring across data modalities, tackling real-world challenges including strong spatiotemporal correlations, imperfect data quality, fine-grained categories, and long-tailed distributions. She partners with nongovernmental organizations and government agencies to deploy her methods in the wild worldwide and works toward increasing the diversity and accessibility of academic research in artificial intelligence through interdisciplinary capacity building and education.

    Priya Donti will join MIT as an assistant professor in the faculties of Electrical Engineering and Artificial Intelligence and Decision-Making in EECS in academic year 2023-24. Donti recently finished her PhD in the Computer Science Department and the Department of Engineering and Public Policy at Carnegie Mellon University, co-advised by Zico Kolter and Inês Azevedo. Her work focuses on machine learning for forecasting, optimization, and control in high-renewables power grids. Specifically, her research explores methods to incorporate the physics and hard constraints associated with electric power systems into deep learning models. Donti is also co-founder and chair of Climate Change AI, a nonprofit initiative to catalyze impactful work at the intersection of climate change and machine learning that is currently running through the Cornell Tech Runway Startup Postdoc Program.

    Ericmoore Jossou will join MIT as an assistant professor in a shared position between the Department of Nuclear Science and Engineering and the faculty of electrical engineering in EECS in July 2023. He is currently an assistant scientist at the Brookhaven National Laboratory, a U.S. Department of Energy-affiliated lab that conducts research in nuclear and high energy physics, energy science and technology, environmental and bioscience, nanoscience, and national security. His research at MIT will focus on understanding the processing-structure-properties correlation of materials for nuclear energy applications through advanced experiments, multiscale simulations, and data science. Jossou obtained his PhD in mechanical engineering in 2019 from the University of Saskatchewan.

    Sherrie Wang will join MIT as an assistant professor in a shared position between the Department of Mechanical Engineering and the Institute for Data, Systems, and Society in academic year 2023-24. Wang is currently a Ciriacy-Wantrup Postdoctoral Fellow at the University of California at Berkeley, hosted by Solomon Hsiang and the Global Policy Lab. She develops machine learning for Earth observation data. Her primary application areas are improving agricultural management and forecasting climate phenomena. She obtained her PhD in computational and mathematical engineering from Stanford University in 2021, where she was advised by David Lobell.

    Taking a magnifying glass to data center operations

    When the MIT Lincoln Laboratory Supercomputing Center (LLSC) unveiled its TX-GAIA supercomputer in 2019, it provided the MIT community a powerful new resource for applying artificial intelligence to their research. Anyone at MIT can submit a job to the system, which churns through trillions of operations per second to train models for diverse applications, such as spotting tumors in medical images, discovering new drugs, or modeling climate effects. But with this great power comes the great responsibility of managing and operating it in a sustainable manner — and the team is looking for ways to improve.

    “We have these powerful computational tools that let researchers build intricate models to solve problems, but they can essentially be used as black boxes. What gets lost in there is whether we are actually using the hardware as effectively as we can,” says Siddharth Samsi, a research scientist in the LLSC. 

    To gain insight into this challenge, the LLSC has been collecting detailed data on TX-GAIA usage over the past year. More than a million user jobs later, the team has released the dataset to the computing community as open source.

    Their goal is to empower computer scientists and data center operators to better understand avenues for data center optimization — an important task as processing needs continue to grow. They also see potential for leveraging AI in the data center itself, by using the data to develop models for predicting failure points, optimizing job scheduling, and improving energy efficiency. While cloud providers are actively working on optimizing their data centers, they do not often make their data or models available for the broader high-performance computing (HPC) community to leverage. The release of this dataset and associated code seeks to fill this space.

    “Data centers are changing. We have an explosion of hardware platforms, the types of workloads are evolving, and the types of people who are using data centers is changing,” says Vijay Gadepally, a senior researcher at the LLSC. “Until now, there hasn’t been a great way to analyze the impact to data centers. We see this research and dataset as a big step toward coming up with a principled approach to understanding how these variables interact with each other and then applying AI for insights and improvements.”

    Papers describing the dataset and potential applications have been accepted to a number of venues, including the IEEE International Symposium on High-Performance Computer Architecture, the IEEE International Parallel and Distributed Processing Symposium, the Annual Conference of the North American Chapter of the Association for Computational Linguistics, the IEEE High-Performance and Embedded Computing Conference, and the International Conference for High Performance Computing, Networking, Storage and Analysis.

    Workload classification

    TX-GAIA, which ranks among the world’s TOP500 supercomputers, combines traditional computing hardware (central processing units, or CPUs) with nearly 900 graphics processing unit (GPU) accelerators. These NVIDIA GPUs are specialized for deep learning, the class of AI that has given rise to speech recognition and computer vision.

    The dataset covers CPU, GPU, and memory usage by job; scheduling logs; and physical monitoring data. Compared to similar datasets, such as those from Google and Microsoft, the LLSC dataset offers “labeled data, a variety of known AI workloads, and more detailed time series data compared with prior datasets. To our knowledge, it’s one of the most comprehensive and fine-grained datasets available,” Gadepally says. 

    Notably, the team collected time-series data at an unprecedented level of detail: 100-millisecond intervals on every GPU and 10-second intervals on every CPU, as the machines processed more than 3,000 known deep-learning jobs. One of the first goals is to use this labeled dataset to characterize the workloads that different types of deep-learning jobs place on the system. This process would extract features that reveal differences in how the hardware processes natural language models versus image classification or materials design models, for example.   
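
    The exact schema of the released dataset is not described here, so the following is only a minimal sketch of the kind of workload characterization described above: collapsing fine-grained GPU telemetry into per-job summary features. The column names and sample values are illustrative assumptions, not the dataset's actual format.

    ```python
    import pandas as pd

    # Hypothetical per-job GPU telemetry sampled at 100-millisecond intervals.
    # The columns ("job_id", "timestamp", "gpu_util", "mem_util") are illustrative,
    # not the actual schema of the LLSC dataset.
    telemetry = pd.DataFrame({
        "job_id":    [1, 1, 1, 2, 2, 2],
        "timestamp": [0.0, 0.1, 0.2, 0.0, 0.1, 0.2],
        "gpu_util":  [85.0, 92.0, 88.0, 30.0, 10.0, 55.0],
        "mem_util":  [60.0, 61.0, 59.0, 20.0, 22.0, 25.0],
    })

    # Collapse each job's time series into summary features that might separate
    # workload types (e.g., steady image-classification training vs. bursty jobs).
    features = telemetry.groupby("job_id").agg(
        gpu_util_mean=("gpu_util", "mean"),
        gpu_util_std=("gpu_util", "std"),
        mem_util_mean=("mem_util", "mean"),
        duration_s=("timestamp", "max"),
    )
    print(features)
    ```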

    The team has now launched the MIT Datacenter Challenge to mobilize this research. The challenge invites researchers to use AI techniques to identify with 95 percent accuracy the type of job that was run, using their labeled time-series data as ground truth.
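
    A toy sketch of the task the challenge poses, assuming per-job features and labels have already been extracted; the feature matrix and labels below are synthetic stand-ins, and real entries would come from the released time-series data.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Synthetic stand-in for per-job summary features (e.g., mean/std GPU
    # utilization) and workload labels (0 = vision, 1 = language, 2 = materials).
    X = rng.normal(size=(600, 8))
    y = rng.integers(0, 3, size=600)

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0
    )
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
    print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
    ```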

    Such insights could enable data centers to better match a user’s job request with the hardware best suited for it, potentially conserving energy and improving system performance. Classifying workloads could also allow operators to quickly notice discrepancies resulting from hardware failures, inefficient data access patterns, or unauthorized usage.

    Too many choices

    Today, the LLSC offers tools that let users submit their job and select the processors they want to use, “but it’s a lot of guesswork on the part of users,” Samsi says. “Somebody might want to use the latest GPU, but maybe their computation doesn’t actually need it and they could get just as impressive results on CPUs, or lower-powered machines.”

    Professor Devesh Tiwari at Northeastern University is working with the LLSC team to develop techniques that can help users match their workloads to appropriate hardware. Tiwari explains that the emergence of different types of AI accelerators, GPUs, and CPUs has left users suffering from too many choices. Without the right tools to take advantage of this heterogeneity, they are missing out on the benefits: better performance, lower costs, and greater productivity.

    “We are fixing this very capability gap — making users more productive and helping users do science better and faster without worrying about managing heterogeneous hardware,” says Tiwari. “My PhD student, Baolin Li, is building new capabilities and tools to help HPC users leverage heterogeneity near-optimally without user intervention, using techniques grounded in Bayesian optimization and other learning-based optimization methods. But, this is just the beginning. We are looking into ways to introduce heterogeneity in our data centers in a principled approach to help our users achieve the maximum advantage of heterogeneity autonomously and cost-effectively.”
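
    The sketch below is not the tool Tiwari and Li are building; it is only a toy illustration of the matching idea, in which a short probe job is timed on hypothetical hardware options and the fastest option is chosen. A real system would use a learned strategy such as Bayesian optimization instead of exhaustive sampling, and would weigh energy and cost as well as runtime.

    ```python
    import random
    import statistics

    # Hypothetical hardware options and a stand-in benchmark. In practice the
    # "benchmark" would run a short slice of the user's job on each candidate.
    CANDIDATES = ["cpu-only", "older-gpu", "latest-gpu"]

    def benchmark(hardware: str) -> float:
        """Return a noisy runtime measurement (seconds) for a short probe job."""
        base = {"cpu-only": 120.0, "older-gpu": 45.0, "latest-gpu": 40.0}[hardware]
        return random.gauss(base, 5.0)

    def pick_hardware(samples_per_option: int = 5) -> str:
        mean_runtime = {
            hw: statistics.mean(benchmark(hw) for _ in range(samples_per_option))
            for hw in CANDIDATES
        }
        # Choose the fastest option; a cost-aware policy could weight runtime
        # by energy use or dollar cost per hour instead.
        return min(mean_runtime, key=mean_runtime.get)

    print("selected:", pick_hardware())
    ```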

    Workload classification is the first of many problems to be posed through the Datacenter Challenge. Others include developing AI techniques to predict job failures, conserve energy, or create job scheduling approaches that improve data center cooling efficiencies.

    Energy conservation 

    To mobilize research into greener computing, the team is also planning to release an environmental dataset of TX-GAIA operations, containing rack temperature, power consumption, and other relevant data.

    According to the researchers, huge opportunities exist to improve the power efficiency of HPC systems being used for AI processing. As one example, recent work in the LLSC determined that simple hardware tuning, such as limiting the amount of power an individual GPU can draw, could reduce the energy cost of training an AI model by 20 percent, with only modest increases in computing time. “This reduction translates to approximately an entire week’s worth of household energy for a mere three-hour time increase,” Gadepally says.
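
    The article does not say which knob was tuned beyond capping the power an individual GPU can draw. On NVIDIA hardware one common way to apply such a cap is the nvidia-smi power-limit setting; the sketch below assumes an NVIDIA GPU, administrative privileges, and an illustrative 250-watt cap rather than a value recommended by the study.

    ```python
    import subprocess

    def set_gpu_power_cap(gpu_index: int, watts: int) -> None:
        """Cap the board power draw of one GPU using nvidia-smi.

        Requires the NVIDIA driver and administrative privileges; the wattage
        passed in is illustrative, not a recommendation from the LLSC study.
        """
        subprocess.run(
            ["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)],
            check=True,
        )

    if __name__ == "__main__":
        set_gpu_power_cap(gpu_index=0, watts=250)
    ```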

    They have also been developing techniques to predict model accuracy, so that users can quickly terminate experiments that are unlikely to yield meaningful results, saving energy. The Datacenter Challenge will share relevant data to enable researchers to explore other opportunities to conserve energy.
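
    The LLSC approach predicts a model's eventual accuracy from early training behavior; as a simpler stand-in, the snippet below shows a generic plateau-based early-stopping rule that captures the same energy-saving intent of ending runs that have stopped improving. The patience and threshold values are arbitrary.

    ```python
    def should_terminate(val_accuracies, patience=3, min_delta=0.001):
        """Generic early-stopping rule: stop when validation accuracy has not
        improved by at least min_delta over the last `patience` epochs."""
        if len(val_accuracies) <= patience:
            return False
        best_before = max(val_accuracies[:-patience])
        recent_best = max(val_accuracies[-patience:])
        return recent_best < best_before + min_delta

    history = [0.61, 0.68, 0.71, 0.7105, 0.7102, 0.7104]
    print(should_terminate(history))  # True: the last 3 epochs gained < 0.001
    ```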

    The team expects that lessons learned from this research can be applied to the thousands of data centers operated by the U.S. Department of Defense. The U.S. Air Force is a sponsor of this work, which is being conducted under the USAF-MIT AI Accelerator.

    Other collaborators include researchers at MIT Computer Science and Artificial Intelligence Laboratory (CSAIL). Professor Charles Leiserson’s Supertech Research Group is investigating performance-enhancing techniques for parallel computing, and research scientist Neil Thompson is designing studies on ways to nudge data center users toward climate-friendly behavior.

    Samsi presented this work at the inaugural AI for Datacenter Optimization (ADOPT’22) workshop last spring as part of the IEEE International Parallel and Distributed Processing Symposium. The workshop officially introduced their Datacenter Challenge to the HPC community.

    “We hope this research will allow us and others who run supercomputing centers to be more responsive to user needs while also reducing the energy consumption at the center level,” Samsi says.

    MIT announces five flagship projects in first-ever Climate Grand Challenges competition

    MIT today announced the five flagship projects selected in its first-ever Climate Grand Challenges competition. These multiyear projects will define a dynamic research agenda focused on unraveling some of the toughest unsolved climate problems and bringing high-impact, science-based solutions to the world on an accelerated basis.

    Representing the most promising concepts to emerge from the two-year competition, the five flagship projects will receive additional funding and resources from MIT and others to develop their ideas and swiftly transform them into practical solutions at scale.

    “Climate Grand Challenges represents a whole-of-MIT drive to develop game-changing advances to confront the escalating climate crisis, in time to make a difference,” says MIT President L. Rafael Reif. “We are inspired by the creativity and boldness of the flagship ideas and by their potential to make a significant contribution to the global climate response. But given the planet-wide scale of the challenge, success depends on partnership. We are eager to work with visionary leaders in every sector to accelerate this impact-oriented research, implement serious solutions at scale, and inspire others to join us in confronting this urgent challenge for humankind.”

    Brief descriptions of the five Climate Grand Challenges flagship projects are provided below.

    Bringing Computation to the Climate Challenge

    This project leverages advances in artificial intelligence, machine learning, and data sciences to improve the accuracy of climate models and make them more useful to a variety of stakeholders — from communities to industry. The team is developing a digital twin of the Earth that harnesses more data than ever before to reduce and quantify uncertainties in climate projections.

    Research leads: Raffaele Ferrari, the Cecil and Ida Green Professor of Oceanography in the Department of Earth, Atmospheric and Planetary Sciences, and director of the Program in Atmospheres, Oceans, and Climate; and Noelle Eckley Selin, director of the Technology and Policy Program and professor with a joint appointment in the Institute for Data, Systems, and Society and the Department of Earth, Atmospheric and Planetary Sciences

    Center for Electrification and Decarbonization of Industry

    This project seeks to reinvent and electrify the processes and materials behind hard-to-decarbonize industries like steel, cement, ammonia, and ethylene production. A new innovation hub will perform targeted fundamental research and engineering with urgency, pushing the technological envelope on electricity-driven chemical transformations.

    Research leads: Yet-Ming Chiang, the Kyocera Professor of Materials Science and Engineering, and Bilge Yıldız, the Breene M. Kerr Professor in the Department of Nuclear Science and Engineering and professor in the Department of Materials Science and Engineering

    Preparing for a new world of weather and climate extremes

    This project addresses key gaps in knowledge about intensifying extreme events such as floods, hurricanes, and heat waves, and quantifies their long-term risk in a changing climate. The team is developing a scalable climate-change adaptation toolkit to help vulnerable communities and low-carbon energy providers prepare for these extreme weather events.

    Research leads: Kerry Emanuel, the Cecil and Ida Green Professor of Atmospheric Science in the Department of Earth, Atmospheric and Planetary Sciences and co-director of the MIT Lorenz Center; Miho Mazereeuw, associate professor of architecture and urbanism in the Department of Architecture and director of the Urban Risk Lab; and Paul O’Gorman, professor in the Program in Atmospheres, Oceans, and Climate in the Department of Earth, Atmospheric and Planetary Sciences

    The Climate Resilience Early Warning System

    The CREWSnet project seeks to reinvent climate change adaptation with a novel forecasting system that empowers underserved communities to interpret local climate risk, proactively plan for their futures incorporating resilience strategies, and minimize losses. CREWSnet will initially be demonstrated in southwestern Bangladesh, serving as a model for similarly threatened regions around the world.

    Research leads: John Aldridge, assistant leader of the Humanitarian Assistance and Disaster Relief Systems Group at MIT Lincoln Laboratory, and Elfatih Eltahir, the H.M. King Bhumibol Professor of Hydrology and Climate in the Department of Civil and Environmental Engineering

    Revolutionizing agriculture with low-emissions, resilient crops

    This project works to revolutionize the agricultural sector with climate-resilient crops and fertilizers that have the ability to dramatically reduce greenhouse gas emissions from food production.

    Research lead: Christopher Voigt, the Daniel I.C. Wang Professor in the Department of Biological Engineering

    “As one of the world’s leading institutions of research and innovation, it is incumbent upon MIT to draw on our depth of knowledge, ingenuity, and ambition to tackle the hard climate problems now confronting the world,” says Richard Lester, MIT associate provost for international activities. “Together with collaborators across industry, finance, community, and government, the Climate Grand Challenges teams are looking to develop and implement high-impact, path-breaking climate solutions rapidly and at a grand scale.”

    The initial call for ideas in 2020 yielded nearly 100 letters of interest from almost 400 faculty members and senior researchers, representing 90 percent of MIT departments. After an extensive evaluation, 27 finalist teams received a total of $2.7 million to develop comprehensive research and innovation plans. The projects address four broad research themes.

    To select the winning projects, research plans were reviewed by panels of international experts representing relevant scientific and technical domains as well as experts in processes and policies for innovation and scalability.

    “In response to climate change, the world really needs to do two things quickly: deploy the solutions we already have much more widely, and develop new solutions that are urgently needed to tackle this intensifying threat,” says Maria Zuber, MIT vice president for research. “These five flagship projects exemplify MIT’s strong determination to bring its knowledge and expertise to bear in generating new ideas and solutions that will help solve the climate problem.”

    “The Climate Grand Challenges flagship projects set a new standard for inclusive climate solutions that can be adapted and implemented across the globe,” says MIT Chancellor Melissa Nobles. “This competition propels the entire MIT research community — faculty, students, postdocs, and staff — to act with urgency around a worsening climate crisis, and I look forward to seeing the difference these projects can make.”

    “MIT’s efforts on climate research amid the climate crisis was a primary reason that I chose to attend MIT, and remains a reason that I view the Institute favorably. MIT has a clear opportunity to be a thought leader in the climate space in our own MIT way, which is why CGC fits in so well,” says senior Megan Xu, who served on the Climate Grand Challenges student committee and is studying ways to make the food system more sustainable.

    The Climate Grand Challenges competition is a key initiative of “Fast Forward: MIT’s Climate Action Plan for the Decade,” which the Institute published in May 2021. Fast Forward outlines MIT’s comprehensive plan for helping the world address the climate crisis. It consists of five broad areas of action: sparking innovation, educating future generations, informing and leveraging government action, reducing MIT’s own climate impact, and uniting and coordinating all of MIT’s climate efforts.

    Using artificial intelligence to find anomalies hiding in massive datasets

    Identifying a malfunction in the nation’s power grid can be like trying to find a needle in an enormous haystack. Hundreds of thousands of interrelated sensors spread across the U.S. capture data on electric current, voltage, and other critical information in real time, often taking multiple recordings per second.

    Researchers at the MIT-IBM Watson AI Lab have devised a computationally efficient method that can automatically pinpoint anomalies in those data streams in real time. They demonstrated that their artificial intelligence method, which learns to model the interconnectedness of the power grid, is much better at detecting these glitches than some other popular techniques.

    Because the machine-learning model they developed does not require annotated data on power grid anomalies for training, it would be easier to apply in real-world situations where high-quality, labeled datasets are often hard to come by. The model is also flexible and can be applied to other situations where a vast number of interconnected sensors collect and report data, like traffic monitoring systems. It could, for example, identify traffic bottlenecks or reveal how traffic jams cascade.

    “In the case of a power grid, people have tried to capture the data using statistics and then define detection rules with domain knowledge to say that, for example, if the voltage surges by a certain percentage, then the grid operator should be alerted. Such rule-based systems, even empowered by statistical data analysis, require a lot of labor and expertise. We show that we can automate this process and also learn patterns from the data using advanced machine-learning techniques,” says senior author Jie Chen, a research staff member and manager of the MIT-IBM Watson AI Lab.

    The co-author is Enyan Dai, an MIT-IBM Watson AI Lab intern and graduate student at the Pennsylvania State University. This research will be presented at the International Conference on Learning Representations.

    Probing probabilities

    The researchers began by defining an anomaly as an event that has a low probability of occurring, like a sudden spike in voltage. They treat the power grid data as a probability distribution, so if they can estimate the probability densities, they can identify the low-density values in the dataset. Those data points which are least likely to occur correspond to anomalies.
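
    A minimal illustration of that idea, using a simple kernel density estimate in place of the learned model described below: fit a density to the observations and flag the lowest-density readings as candidate anomalies. The readings are synthetic.

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(1)

    # Synthetic "voltage" readings: mostly normal operation plus a few spikes.
    normal = rng.normal(loc=120.0, scale=1.0, size=500)
    spikes = np.array([135.0, 139.0, 150.0])
    readings = np.concatenate([normal, spikes])

    # Estimate the probability density of the readings, then flag the
    # lowest-density observations as candidate anomalies.
    density = gaussian_kde(readings)(readings)
    threshold = np.quantile(density, 0.01)  # bottom 1 percent of density values
    anomalies = readings[density < threshold]
    print("flagged readings:", np.sort(anomalies))
    ```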

    Estimating those probabilities is no easy task, especially since each sample captures multiple time series, and each time series is a set of multidimensional data points recorded over time. Plus, the sensors that capture all that data are conditional on one another, meaning they are connected in a certain configuration and one sensor can sometimes impact others.

    To learn the complex conditional probability distribution of the data, the researchers used a special type of deep-learning model called a normalizing flow, which is particularly effective at estimating the probability density of a sample.
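
    A full normalizing flow stacks many learned transformations, but the change-of-variables idea it relies on can be shown with a single fixed affine layer: map the data toward a standard normal base distribution and correct the log-density by the log-derivative of the map. This one-layer sketch only illustrates the mechanism; it is not the model used in the paper.

    ```python
    import numpy as np
    from scipy.stats import norm

    # One affine "flow" layer: z = (x - shift) / scale maps data toward a
    # standard normal base distribution. By the change-of-variables formula,
    #   log p(x) = log p_base(z) + log |dz/dx| = log p_base(z) - log(scale).
    shift, scale = 5.0, 2.0

    def log_density(x: np.ndarray) -> np.ndarray:
        z = (x - shift) / scale
        return norm.logpdf(z) - np.log(scale)

    x = np.array([5.0, 7.0, 20.0])
    print(log_density(x))  # the outlier at 20.0 gets a much lower log-density
    ```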

    They augmented that normalizing flow model using a type of graph, known as a Bayesian network, which can learn the complex, causal relationship structure between different sensors. This graph structure enables the researchers to see patterns in the data and estimate anomalies more accurately, Chen explains.

    “The sensors are interacting with each other, and they have causal relationships and depend on each other. So, we have to be able to inject this dependency information into the way that we compute the probabilities,” he says.

    This Bayesian network factorizes, or breaks down, the joint probability of the multiple time series data into less complex, conditional probabilities that are much easier to parameterize, learn, and evaluate. This allows the researchers to estimate the likelihood of observing certain sensor readings, and to identify those readings that have a low probability of occurring, meaning they are anomalies.
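
    A toy version of that factorization, with three sensors arranged in an assumed chain (sensor A feeds B, and B feeds C) and hand-picked linear-Gaussian conditionals; in the paper, both the graph structure and the conditional densities are learned from data.

    ```python
    from scipy.stats import norm

    # Toy Bayesian-network factorization over three sensors:
    #   p(a, b, c) = p(a) * p(b | a) * p(c | b)
    # with invented linear-Gaussian conditionals, for illustration only.
    def joint_log_prob(a: float, b: float, c: float) -> float:
        log_p_a = norm.logpdf(a, loc=0.0, scale=1.0)
        log_p_b_given_a = norm.logpdf(b, loc=0.8 * a, scale=0.5)
        log_p_c_given_b = norm.logpdf(c, loc=1.2 * b, scale=0.5)
        return log_p_a + log_p_b_given_a + log_p_c_given_b

    print(joint_log_prob(0.1, 0.1, 0.2))   # consistent readings: higher log-probability
    print(joint_log_prob(0.1, 3.0, -2.0))  # inconsistent readings: much lower
    ```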

    Their method is especially powerful because this complex graph structure does not need to be defined in advance — the model can learn the graph on its own, in an unsupervised manner.

    A powerful technique

    They tested this framework by seeing how well it could identify anomalies in power grid data, traffic data, and water system data. The datasets they used for testing contained anomalies that had been identified by humans, so the researchers were able to compare the anomalies their model identified with real glitches in each system.

    Their model outperformed all the baselines by detecting a higher percentage of true anomalies in each dataset.

    “For the baselines, a lot of them don’t incorporate graph structure. That perfectly corroborates our hypothesis. Figuring out the dependency relationships between the different nodes in the graph is definitely helping us,” Chen says.

    Their methodology is also flexible. Armed with a large, unlabeled dataset, they can tune the model to make effective anomaly predictions in other situations, like traffic patterns.

    Once the model is deployed, it would continue to learn from a steady stream of new sensor data, adapting to possible drift of the data distribution and maintaining accuracy over time, says Chen.

    Though this particular project is close to its end, he looks forward to applying the lessons he learned to other areas of deep-learning research, particularly on graphs.

    Chen and his colleagues could use this approach to develop models that map other complex, conditional relationships. They also want to explore how they can efficiently learn these models when the graphs become enormous, perhaps with millions or billions of interconnected nodes. And rather than finding anomalies, they could also use this approach to improve the accuracy of forecasts based on datasets or streamline other classification techniques.

    This work was funded by the MIT-IBM Watson AI Lab and the U.S. Department of Energy.

    Q&A: Can the world change course on climate?

    In this ongoing series on climate issues, MIT faculty, students, and alumni in the humanistic fields share perspectives that are significant for solving climate change and mitigating its myriad social and ecological impacts. Nazli Choucri is a professor of political science and an expert on climate issues, who also focuses on international relations and cyberpolitics. She is the architect and director of the Global System for Sustainable Development, an evolving knowledge networking system centered on sustainability problems and solution strategies. The author and/or editor of 12 books, she is also the founding editor of the MIT Press book series “Global Environmental Accord: Strategies for Sustainability and Institutional Innovation.”

    Q: The impacts of climate change — including storms, floods, wildfires, and droughts — have the potential to destabilize nations, yet they are not constrained by borders. What international developments most concern you in terms of addressing climate change and its myriad ecological and social impacts?

    A: Climate change is a global issue. By definition, and a long history of practice, countries focus on their own priorities and challenges. Over time, we have seen the gradual development of norms reflecting shared interests, and the institutional arrangements to support and pursue the global good. What concerns me most is that general responses to the climate crisis are being framed in broad terms; the overall pace of change remains perilously slow; and uncertainty remains about operational action and implementation of stated intent.

    We have just seen the completion of the 26th meeting of states devoted to climate change, the United Nations Climate Change Conference (COP26). In some ways this is positive. Yet, past commitments remain unfulfilled, creating added stress in an already stressful political situation.

    Industrial countries are uneven in their recognition of, and responses to, climate change. This may signal uncertainty about whether climate matters are sufficiently compelling to call for immediate action. Alternatively, the push for changing course may seem too costly at a time when other imperatives — such as employment, economic growth, or protecting borders — inevitably dominate discourse and decisions. Whatever the cause, the result has been an unwillingness to take strong action. Unfortunately, climate change remains within the domain of “low politics,” although there are signs the issue is making a slow but steady shift to “high politics” — those issues deemed vital to the existence of the state. This means that short-term priorities, such as those noted above, continue to shape national politics and international positions and, by extension, to obscure the existential threat revealed by scientific evidence.

    As for developing countries, these are overwhelmed by internal challenges, and managing the difficulties of daily life always takes priority over other challenges, however compelling. Long-term thinking is a luxury, but daily bread is a necessity. Non-state actors — including registered nongovernmental organizations, climate organizations, sustainability support groups, activists of various sorts, and in some cases much of civil society — have been left with a large share of the responsibility for educating and convincing diverse constituencies of the consequences of inaction on climate change. But many of these institutions carry their own burdens and struggle to manage current pressures.

    The international community, through its formal and informal institutions, continues to articulate the perils of climate change and to search for a powerful consensus that can prove effective both in form and in function. The general contours are agreed upon — more or less. But leadership of, for, and by the global collective is elusive and difficult to shape.

    Most concerning of all is the clear reluctance to address head-on the challenge of planning for changes that we know will occur. The reality that we are all being affected — in different ways and to different degrees — has yet to be sufficiently appreciated by everyone, everywhere. Yet, in many parts of the world, major shifts in climate will create pressures on human settlements, spur forced migrations, or generate social dislocations. Some small island states, for example, may not survive a sea-level surge. Everywhere there is a need to cut emissions, and this means adaptation and/or major changes in economic activity and in lifestyle.

    The discourse and debate at COP26 reflect all of such persistent features in the international system. So far, the largest achievements center on the common consensus that more must be done to prevent the rise in temperature from creating a global catastrophe. This is not enough, however. Differences remain, and countries have yet to specify what cuts in emissions they are willing to make.

    Echoes of who is responsible for what remains strong. The thorny matter of the unfulfilled pledge of $100 billion once promised by rich countries to help countries to reduce their emissions remained unresolved.

    At the same time, however, some important agreements were reached. The United States and China announced they would make greater efforts to cut methane, a powerful greenhouse gas. More than 100 countries agreed to end deforestation. India joined the countries committed to attain zero emissions by 2070. And on matters of finance, countries agreed to a two-year plan to determine how to meet the needs of the most-vulnerable countries.

    Q: In what ways do you think the tools and insights from political science can advance efforts to address climate change and its impacts?

    A: I prefer to take a multidisciplinary view of the issues at hand, rather than focus on the tools of political science alone. Disciplinary perspectives can create siloed views and positions that undermine any overall drive toward consensus. The scientific evidence is pointing to, even anticipating, pervasive changes that transcend known and established parameters of social order all across the globe.

    That said, political science provides important insight, even guidance, for addressing the impacts of climate change in some notable ways. One is understanding the extent to which our formal institutions enable discussion, debate, and decisions about the directions we can take collectively to adapt, adjust, or even depart from the established practices of managing social order.

    If we consider politics as the allocation of values in terms of who gets what, when, and how, then it becomes clear that the current allocation requires a change in course. Coordination and cooperation across the jurisdictions of sovereign states is foundational for any response to climate change impacts.

    We have already recognized, and to some extent, developed targets for reducing carbon emissions — a central impact from traditional forms of energy use — and are making notable efforts to shift toward alternatives. This move is an easy one compared to all the work that needs to be done to address climate change. But, in taking this step we have learned quite a bit that might help in creating a necessary consensus for cross-jurisdiction coordination and response.

    Respecting individuals and protecting life is increasingly recognized as a global value — at least in principle. As we work to change course, new norms will be developed, and political science provides important perspectives on how to establish such norms. We will be faced with demands for institutional design, and these will need to embody our guiding values. For example, having learned to recognize the burdens of inequity, we can establish the value of equity as foundational for our social order both now and as we recognize and address the impacts of climate change.

    Q: You teach a class on “Sustainability Development: Theory and Practice.” Broadly speaking, what are the goals of this class? What lessons do you hope students will carry with them into the future?

    A: The goal of 17.181, my class on sustainability, is to frame as clearly as possible the concept of sustainable development (sustainability) with attention to conceptual, empirical, institutional, and policy issues.

    The course centers on human activities. Individuals are embedded in complex interactive systems: the social system, the natural environment, and the constructed cyber domain — each with distinct temporal, spatial, and dynamic features. Sustainability issues intersect with, but cannot be folded into, the impacts of climate change. Sustainability places human beings in social systems at the core of what must be done to respect the imperatives of a highly complex natural environment.

    We consider sustainability an evolving knowledge domain with attendant policy implications. It is driven by events on the ground, not by revolution in academic or theoretical concerns per se. Overall, sustainable development refers to the process of meeting the needs of current and future generations, without undermining the resilience of the life-supporting properties, the integrity of social systems, or the supports of the human-constructed cyberspace.

    More specifically, we differentiate among four fundamental dimensions and their necessary conditions:

    (a) ecological systems — exhibiting balance and resilience;
    (b) economic production and consumption — with equity and efficiency;
    (c) governance and politics — with participation and responsiveness; and
    (d) institutional performance — demonstrating adaptation and incorporating feedback.

    The core proposition is this: If all conditions hold, then the system is (or can be) sustainable. Then, we must examine the critical drivers — people, resources, technology, and their interactions — followed by a review and assessment of evolving policy responses. Then we ask: What are new opportunities?

    I would like students to carry forward these ideas and issues: what has been deemed “normal” in modern Western societies and in developing societies seeking to emulate the Western model is damaging humans in many ways — all well-known. Yet only recently have alternatives begun to be considered to the traditional economic growth model based on industrialization and high levels of energy use. To make changes, we must first understand the underlying incentives, realities, and choices that shape a whole set of dysfunctional behaviors and outcomes. We then need to delve deep into the driving sources and consequences, and to consider the many ways in which our known “normal” can be adjusted — in theory and in practice.

    Q: In confronting an issue as formidable as global climate change, what gives you hope?

    A: I see a few hopeful signs; among them:

    The scientific evidence is clear and compelling. We are no longer discussing whether there is climate change, or if we will face major challenges of unprecedented proportions, or even how to bring about an international consensus on the salience of such threats.

    Climate change has been recognized as a global phenomenon. Imperatives for cooperation are necessary. No one can go it alone. Major efforts have and are being made in world politics to forge action agendas with specific targets.

    The issue appears to be on the verge of becoming one of “high politics” in the United States.

    Younger generations are more sensitive to the reality that we are altering the life-supporting properties of our planet. They are generally more educated, skilled, and open to addressing such challenges than their elders.

    However disappointing the results of COP26 might seem, the global community is moving in the right direction.

    None of the above points, individually or jointly, translates into an effective response to the known impacts of climate change — let alone the unknown. But, this is what gives me hope.

    Interview prepared by MIT SHASS Communications
    Editorial, design, and series director: Emily Hiestand
    Senior writer: Kathryn O’Neill

    The reasons behind lithium-ion batteries’ rapid cost decline

    Lithium-ion batteries, those marvels of lightweight power that have made possible today’s age of handheld electronics and electric vehicles, have plunged in cost since their introduction three decades ago at a rate similar to the drop in solar panel prices, as documented by a study published last March. But what brought about such an astonishing cost decline, of about 97 percent?

    Some of the researchers behind that earlier study have now analyzed what accounted for the extraordinary savings. They found that by far the biggest factor was work on research and development, particularly in chemistry and materials science. This outweighed the gains achieved through economies of scale, though that turned out to be the second-largest category of reductions.

    The new findings are being published today in the journal Energy & Environmental Science, in a paper by MIT postdoc Micah Ziegler, recent graduate student Juhyun Song PhD ’19, and Jessika Trancik, a professor in MIT’s Institute for Data, Systems, and Society.

    The findings could be useful for policymakers and planners to help guide spending priorities in order to continue the pathway toward ever-lower costs for this and other crucial energy storage technologies, according to Trancik. Their work suggests that there is still considerable room for further improvement in electrochemical battery technologies, she says.

    The analysis required digging through a variety of sources, since much of the relevant information consists of closely held proprietary business data. “The data collection effort was extensive,” Ziegler says. “We looked at academic articles, industry and government reports, press releases, and specification sheets. We even looked at some legal filings that came out. We had to piece together data from many different sources to get a sense of what was happening.” He says they collected “about 15,000 qualitative and quantitative data points, across 1,000 individual records from approximately 280 references.”

    Data from the earliest times are hardest to access and can have the greatest uncertainties, Trancik says, but by comparing different data sources from the same period they have attempted to account for these uncertainties.

    Overall, she says, “we estimate that the majority of the cost decline, more than 50 percent, came from research-and-development-related activities.” That included both private sector and government-funded research and development, and “the vast majority” of that cost decline within that R&D category came from chemistry and materials research.

    That was an interesting finding, she says, because “there were so many variables that people were working on through very different kinds of efforts,” including the design of the battery cells themselves, their manufacturing systems, supply chains, and so on. “The cost improvement emerged from a diverse set of efforts and many people, and not from the work of only a few individuals.”

    The findings about the importance of investment in R&D were especially significant, Ziegler says, because much of this investment happened after lithium-ion battery technology was commercialized, a stage at which some analysts thought the research contribution would become less significant. Over roughly a 20-year period starting five years after the batteries’ introduction in the early 1990s, he says, “most of the cost reduction still came from R&D. The R&D contribution didn’t end when commercialization began. In fact, it was still the biggest contributor to cost reduction.”

    The study took advantage of an analytical approach that Trancik and her team initially developed to analyze the similarly precipitous drop in costs of silicon solar panels over the last few decades. They also applied the approach to understand the rising costs of nuclear energy. “This is really getting at the fundamental mechanisms of technological change,” she says. “And we can also develop these models looking forward in time, which allows us to uncover the levers that people could use to improve the technology in the future.”

    One advantage of the methodology Trancik and her colleagues have developed, she says, is that it helps to sort out the relative importance of different factors when many variables are changing all at once, which typically happens as a technology improves. “It’s not simply adding up the cost effects of these variables,” she says, “because many of these variables affect many different cost components. There’s this kind of intricate web of dependencies.” But the team’s methodology makes it possible to “look at how that overall cost change can be attributed to those variables, by essentially mapping out that network of dependencies,” she says.
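
    The study's cost-equation framework is far more detailed, but the basic move of attributing an overall cost change to individual variables through a shared cost model can be sketched with a toy example. The functional form and the numbers below are invented for illustration; they are not the paper's model or data.

    ```python
    # Toy cost model: attribute an overall cost decline to individual variables
    # by re-evaluating the model with one variable at a time moved from its
    # early value to its later value. Purely illustrative numbers and units.
    def cell_cost(materials_price, energy_density, plant_scale):
        # Materials spending falls as energy density rises; overhead falls with
        # plant scale.
        return 1000.0 * materials_price / energy_density + 5000.0 / plant_scale

    early = dict(materials_price=1.0, energy_density=90.0, plant_scale=10.0)
    late = dict(materials_price=0.8, energy_density=250.0, plant_scale=200.0)

    total_change = cell_cost(**late) - cell_cost(**early)
    for var in early:
        moved = dict(early, **{var: late[var]})
        contribution = cell_cost(**moved) - cell_cost(**early)
        print(f"{var:>15}: {contribution:9.1f} (one variable at a time)")
    print(f"{'total change':>15}: {total_change:9.1f}")
    ```

    Because the variables interact through the shared model, the one-at-a-time contributions do not sum exactly to the total change; that interaction is why the paper maps out the full network of dependencies rather than simply adding up separate effects.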

    This can help provide guidance on public spending, private investments, and other incentives. “What are all the things that different decision makers could do?” she asks. “What decisions do they have agency over so that they could improve the technology, which is important in the case of low-carbon technologies, where we’re looking for solutions to climate change and we have limited time and limited resources? The new approach allows us to potentially be a bit more intentional about where we make those investments of time and money.”

    “This paper collects data available in a systematic way to determine changes in the cost components of lithium-ion batteries between 1990-1995 and 2010-2015,” says Laura Diaz Anadon, a professor of climate change policy at Cambridge University, who was not connected to this research. “This period was an important one in the history of the technology, and understanding the evolution of cost components lays the groundwork for future work on mechanisms and could help inform research efforts in other types of batteries.”

    The research was supported by the Alfred P. Sloan Foundation, the Environmental Defense Fund, and the MIT Technology and Policy Program.

  • in

    Design’s new frontier

    In the 1960s, the advent of computer-aided design (CAD) sparked a revolution in design. For his 1963 PhD thesis at MIT, Ivan Sutherland developed Sketchpad, a game-changing software program that enabled users to draw, move, and resize shapes on a computer. Over the course of the next few decades, CAD software reshaped how everything from consumer products to buildings and airplanes was designed.

    “CAD was part of the first wave in computing in design. The ability of researchers and practitioners to represent and model designs using computers was a major breakthrough and still is one of the biggest outcomes of design research, in my opinion,” says Maria Yang, Gail E. Kendall Professor and director of MIT’s Ideation Lab.

    Innovations in 3D printing during the 1980s and 1990s expanded CAD’s capabilities beyond traditional injection molding and casting methods, providing designers with even more flexibility. Designers could sketch, ideate, and develop prototypes or models faster and more efficiently. Meanwhile, with the push of a button, software like that developed by Professor Emeritus David Gossard of MIT’s CAD Lab could solve equations simultaneously to produce a new geometry on the fly.

    In recent years, mechanical engineers have expanded the computing tools they use to ideate, design, and prototype. More sophisticated algorithms and the explosion of machine learning and artificial intelligence technologies have sparked a second revolution in design engineering.

    Researchers and faculty at MIT’s Department of Mechanical Engineering are utilizing these technologies to re-imagine how the products, systems, and infrastructures we use are designed. These researchers are at the forefront of the new frontier in design.

    Computational design

    Faez Ahmed wants to reinvent the wheel, or at least the bicycle wheel. He and his team at MIT’s Design Computation & Digital Engineering Lab (DeCoDE) use an artificial intelligence-driven design method that can generate entirely novel and improved designs for a range of products — including the traditional bicycle. They create advanced computational methods to blend human-driven design with simulation-based design.

    “The focus of our DeCoDE lab is computational design. We are looking at how we can create machine learning and AI algorithms to help us discover new designs that are optimized based on specific performance parameters,” says Ahmed, an assistant professor of mechanical engineering at MIT.

    For their work using AI-driven design for bicycles, Ahmed and his collaborator Professor Daniel Frey wanted to make it easier to design customizable bicycles, and by extension, encourage more people to use bicycles over transportation methods that emit greenhouse gases.

    To start, the group gathered a dataset of 4,500 bicycle designs. Using this massive dataset, they tested the limits of what machine learning could do. First, they developed algorithms to group bicycles that looked similar together and explore the design space. They then created machine learning models that could successfully predict what components are key in identifying a bicycle style, such as a road bike versus a mountain bike.
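
    A minimal sketch of those two preliminary steps, grouping similar designs and then predicting style from components, might look like the following. The synthetic features, the made-up labeling rule, and the choice of k-means plus a random forest are assumptions for illustration, not the DeCoDE lab's actual dataset or pipeline.

    ```python
    # Sketch of the two preliminary steps: group similar designs, then predict
    # style from components. Features, labels, and models are illustrative.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 500
    # Hypothetical parametric features for each design.
    X = np.column_stack([
        rng.normal(700, 40, n),    # wheel diameter [mm]
        rng.normal(73, 2, n),      # head-tube angle [deg]
        rng.normal(420, 30, n),    # chainstay length [mm]
        rng.integers(0, 2, n),     # drop bars (1) vs. flat bars (0)
    ])
    # A made-up rule linking components to style, just to give the labels structure.
    y = np.where(X[:, 3] == 1, "road", np.where(X[:, 2] > 430, "mountain", "gravel"))

    # Step 1: explore the design space by grouping similar-looking designs.
    clusters = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(X)

    # Step 2: predict style, then inspect which components drive the prediction.
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
    print("held-out accuracy:", clf.score(X_te, y_te))
    print("feature importances:", clf.feature_importances_)
    ```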

    Once the algorithms were good enough at identifying bicycle designs and parts, the team proposed novel machine learning tools that could use this data to create a unique and creative design for a bicycle based on certain performance parameters and rider dimensions.

    Ahmed used a generative adversarial network — or GAN — as the basis of this model. GAN models utilize neural networks that can create new designs based on vast amounts of data. However, using GAN models alone would result in homogeneous designs that lack novelty and can’t be assessed in terms of performance. To address these issues in design problems, Ahmed developed a new method he calls “PaDGAN,” short for performance-augmented diverse GAN.

    “When we apply this type of model, what we see is that we can get large improvements in the diversity, quality, as well as novelty of the designs,” Ahmed explains.
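
    The general idea of augmenting a generator's objective with quality and diversity terms can be sketched as below. The published PaDGAN formulation uses a determinantal-point-process-based loss; the crude pairwise-distance diversity term, the stand-in performance predictor, and the network sizes here are simplifying assumptions for illustration only.

    ```python
    # Sketch of augmenting a generator's loss with quality and diversity terms.
    # Not the published PaDGAN loss; the pairwise-distance diversity term and
    # the stand-in performance predictor are simplifications for illustration.
    import torch
    import torch.nn as nn

    latent_dim, design_dim = 16, 8
    generator = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(),
                              nn.Linear(64, design_dim))
    discriminator = nn.Sequential(nn.Linear(design_dim, 64), nn.ReLU(),
                                  nn.Linear(64, 1))
    # Surrogate model scoring design performance, assumed trained beforehand.
    performance = nn.Sequential(nn.Linear(design_dim, 64), nn.ReLU(),
                                nn.Linear(64, 1))

    def generator_loss(z, w_perf=0.1, w_div=0.1):
        designs = generator(z)
        adv = -discriminator(designs).mean()    # standard adversarial term
        perf = -performance(designs).mean()     # reward predicted performance
        div = -torch.pdist(designs).mean()      # reward spread among samples
        return adv + w_perf * perf + w_div * div

    loss = generator_loss(torch.randn(32, latent_dim))
    loss.backward()   # gradients reach the generator through all three terms
    ```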

    Using this approach, Ahmed’s team developed an open-source computational design tool for bicycles freely available on their lab website. They hope to further develop a set of generalizable tools that can be used across industries and products.

    Longer term, Ahmed has his sights set on loftier goals. He hopes the computational design tools he develops could lead to “design democratization,” putting more power in the hands of the end user.

    “With these algorithms, you can have more individualization where the algorithm assists a customer in understanding their needs and helps them create a product that satisfies their exact requirements,” he adds.

    Using algorithms to democratize the design process is a goal shared by Stefanie Mueller, an associate professor in electrical engineering and computer science and mechanical engineering.

    Personal fabrication

    Platforms like Instagram give users the freedom to instantly edit their photographs or videos using filters. In one click, users can alter the palette, tone, and brightness of their content by applying filters that range from bold colors to sepia-toned or black-and-white. Mueller, X-Window Consortium Career Development Professor, wants to bring this concept of the Instagram filter to the physical world.

    “We want to explore how digital capabilities can be applied to tangible objects. Our goal is to bring reprogrammable appearance to the physical world,” explains Mueller, director of the HCI Engineering Group based out of MIT’s Computer Science and Artificial Intelligence Laboratory.

    Mueller’s team utilizes a combination of smart materials, optics, and computation to advance personal fabrication technologies that would allow end users to alter the design and appearance of the products they own. They tested this concept in a project they dubbed “Photo-Chromeleon.”

    First, a mix of photochromic cyan, magenta, and yellow dyes is airbrushed onto an object — in this instance, a 3D sculpture of a chameleon. Using software they developed, the team sketches the exact color pattern they want to achieve on the object itself. An ultraviolet light shines on the object to activate the dyes.

    To actually create the physical pattern on the object, Mueller has developed an optimization algorithm to use alongside a normal office projector outfitted with red, green, and blue LED lights. These lights shine on specific pixels on the object for a given period of time to physically change the makeup of the photochromic pigments.

    “This fancy algorithm tells us exactly how long we have to shine the red, green, and blue light on every single pixel of an object to get the exact pattern we’ve programmed in our software,” says Mueller.
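
    A back-of-the-envelope version of that per-pixel exposure calculation, assuming each dye simply bleaches exponentially under its complementary light, is sketched below. The first-order decay model and the rate constants are assumptions for illustration, not the actual optimization used in the project.

    ```python
    # Back-of-the-envelope per-pixel exposure calculation, assuming each dye
    # bleaches exponentially under its complementary light. The decay model
    # and the rate constants are assumptions for illustration.
    import numpy as np

    # Assumed fraction of remaining saturation removed per second of exposure.
    rate = {"cyan": 0.030, "magenta": 0.025, "yellow": 0.040}

    def exposure_time(current, target, k):
        """Seconds of light needed to bleach a dye from `current` to `target`
        saturation, assuming saturation(t) = current * exp(-k * t)."""
        if target >= current:
            return 0.0   # bleaching only reduces saturation; UV resets it
        return float(np.log(current / target) / k)

    current = {"cyan": 1.0, "magenta": 1.0, "yellow": 1.0}   # after the UV reset
    target = {"cyan": 0.2, "magenta": 0.7, "yellow": 0.1}    # one pixel's goal

    for dye, k in rate.items():
        t = exposure_time(current[dye], target[dye], k)
        print(f"{dye:>8}: shine its complementary LED for about {t:.0f} s")
    ```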

    Giving this freedom to the end user enables limitless possibilities. Mueller’s team has applied this technology to iPhone cases, shoes, and even cars. In the case of shoes, Mueller envisions a shoebox embedded with UV and LED light projectors. Users could put their shoes in the box overnight and the next day have a pair of shoes in a completely new pattern.

    Mueller wants to expand her personal fabrication methods to the clothes we wear. Rather than utilize the light projection technique developed in the Photo-Chromeleon project, her team is exploring the possibility of weaving LEDs directly into clothing fibers, allowing people to change their shirt’s appearance as they wear it. These personal fabrication technologies could completely alter consumer habits.

    “It’s very interesting for me to think about how these computational techniques will change product design on a high level,” adds Mueller. “In the future, a consumer could buy a blank iPhone case and update the design on a weekly or daily basis.”

    Computational fluid dynamics and participatory design

    Another team of mechanical engineers, including Sili Deng, the Brit (1961) & Alex (1949) d’Arbeloff Career Development Professor, is developing a different kind of design tool that could have a large impact on individuals in low- and middle-income countries across the world.

    As Deng walked down the hallway of Building 1 on MIT’s campus, a monitor playing a video caught her eye. The video featured work done by mechanical engineers and MIT D-Lab on developing cleaner burning briquettes for cookstoves in Uganda. Deng immediately knew she wanted to get involved.

    “As a combustion scientist, I’ve always wanted to work on such a tangible real-world problem, but the field of combustion tends to focus more heavily on the academic side of things,” explains Deng.

    After reaching out to colleagues in MIT D-Lab, Deng joined a collaborative effort to develop a new cookstove design tool for the 3 billion people across the world who burn solid fuels to cook and heat their homes. These stoves often emit soot and carbon monoxide, contributing not only to millions of deaths each year but also to the world’s greenhouse gas emissions problem.

    The team is taking a three-pronged approach to developing this solution, using a combination of participatory design, physical modeling, and experimental validation to create a tool that will lead to the production of high-performing, low-cost energy products.

    Deng and her team in the Deng Energy and Nanotechnology Group use physics-based modeling for the combustion and emission process in cookstoves.

    “My team is focused on computational fluid dynamics. We use computational and numerical studies to understand the flow field where the fuel is burned and releases heat,” says Deng.

    These flow mechanics are crucial to understanding how to minimize heat loss and make cookstoves more efficient, as well as learning how dangerous pollutants are formed and released in the process.

    Using computational methods, Deng’s team performs three-dimensional simulations of the complex chemistry and transport coupling at play in the combustion and emission processes. They then use these simulations to build a combustion model for how fuel is burned and a pollution model that predicts carbon monoxide emissions.
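
    The team's tool relies on full three-dimensional CFD, but the coupling between a combustion model and a CO prediction can be caricatured in zero dimensions, as in the sketch below. The assumed temperature history, the global reaction-rate parameters, and the single well-mixed zone are illustrative assumptions only, not the team's models.

    ```python
    # Zero-dimensional caricature of the coupled combustion and pollution
    # models: an assumed temperature history stands in for the combustion
    # side, and a global Arrhenius rate for CO oxidation stands in for the
    # pollution side. All constants are illustrative assumptions.
    import numpy as np

    def co_emission(T_flame=1400.0, T_exit=900.0, tau=0.15,
                    co0=0.02, o2=0.10, dt=1e-4, t_end=0.6):
        A, Ea, R = 1.3e8, 1.256e5, 8.314     # assumed global-rate parameters
        co = co0
        for step in range(int(t_end / dt)):
            t = step * dt
            # Gas temperature relaxes from the flame zone toward the stove exit.
            T = T_exit + (T_flame - T_exit) * np.exp(-t / tau)
            # CO + 1/2 O2 -> CO2, with Arrhenius temperature dependence.
            dco_dt = A * np.exp(-Ea / (R * T)) * co * np.sqrt(o2)
            co = max(co - dco_dt * dt, 0.0)
        return co

    print(f"predicted CO mole fraction at the exit: {co_emission():.5f}")
    ```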

    Deng’s models are used by a group led by Daniel Sweeney in MIT D-Lab to carry out experimental validation in stove prototypes. Finally, Professor Maria Yang uses participatory design methods to integrate user feedback, ensuring the design tool can actually be used by people across the world.

    The end goal for this collaborative team is to not only provide local manufacturers with a prototype they could produce themselves, but to also provide them with a tool that can tweak the design based on local needs and available materials.

    Deng sees wide-ranging applications for the computational fluid dynamics her team is developing.

    “We see an opportunity to use physics-based modeling, augmented with a machine learning approach, to come up with chemical models for practical fuels that help us better understand combustion. Therefore, we can design new methods to minimize carbon emissions,” she adds.

    While Deng is utilizing simulations and machine learning at the molecular level to improve designs, others are taking a more macro approach.

    Designing intelligent systems

    When it comes to intelligent design, Navid Azizan thinks big. He hopes to help create future intelligent systems that are capable of making decisions autonomously by using the enormous amounts of data emerging from the physical world. From smart robots and autonomous vehicles to smart power grids and smart cities, Azizan focuses on the analysis, design, and control of intelligent systems.

    Achieving such massive feats takes a truly interdisciplinary approach that draws upon various fields such as machine learning, dynamical systems, control, optimization, statistics, and network science, among others.

    “Developing intelligent systems is a multifaceted problem, and it really requires a confluence of disciplines,” says Azizan, assistant professor of mechanical engineering with a dual appointment in MIT’s Institute for Data, Systems, and Society (IDSS). “To create such systems, we need to go beyond standard approaches to machine learning, such as those commonly used in computer vision, and devise algorithms that can enable safe, efficient, real-time decision-making for physical systems.”

    For robot control to work in the complex dynamic environments that arise in the real world, real-time adaptation is key. If, for example, an autonomous vehicle is going to drive in icy conditions or a drone is operating in windy conditions, they need to be able to adapt to their new environment quickly.

    To address this challenge, Azizan and his collaborators at MIT and Stanford University have developed a new algorithm that combines adaptive control, a powerful methodology from control theory, with meta learning, a new machine learning paradigm.

    “This ‘control-oriented’ learning approach outperforms the existing ‘regression-oriented’ methods, which are mostly focused on just fitting the data, by a wide margin,” says Azizan.
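
    A stripped-down illustration of that control-oriented idea is sketched below: a fixed feature basis stands in for what would be meta-learned offline, a few linear coefficients are adapted online, and the controller always acts on the current model. The scalar dynamics, the feature map, and the gains are hypothetical, not the published algorithm.

    ```python
    # Stripped-down sketch of control-oriented adaptation: a fixed feature
    # basis stands in for a meta-learned model, a few coefficients are adapted
    # online with recursive least squares, and the controller acts on the
    # current model. Dynamics, features, and gains are hypothetical.
    import numpy as np

    def features(x, u):
        # Stand-in for a meta-learned basis (in practice, a neural network).
        return np.array([x, u, np.sin(x), 1.0])

    theta = np.array([0.8, 0.4, 0.0, 0.0])   # rough prior from offline training
    P = np.eye(4) * 10.0                     # recursive-least-squares covariance
    x, x_ref = 0.0, 1.0
    true_wind = 0.8                          # unknown disturbance to adapt to
    rng = np.random.default_rng(0)

    for step in range(300):
        # Certainty-equivalent control: choose u so the current model predicts x_ref.
        u = (x_ref - theta[0] * x - theta[2] * np.sin(x) - theta[3]) / max(theta[1], 0.1)
        u = float(np.clip(u, -5.0, 5.0))
        # "True" dynamics, unknown to the controller.
        x_next = 0.9 * x + 0.5 * u + true_wind * np.sin(x) + 0.01 * rng.standard_normal()

        # Online adaptation of the model coefficients from the observed transition.
        phi = features(x, u)
        K = P @ phi / (1.0 + phi @ P @ phi)
        theta = theta + K * (x_next - theta @ phi)
        P = P - np.outer(K, phi @ P)
        x = x_next

    print("final tracking error:", abs(x - x_ref))
    ```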

    Another critical aspect of deploying machine learning algorithms in physical systems that Azizan and his team hope to address is safety. Deep neural networks are a crucial part of autonomous systems. They are used for interpreting complex visual inputs and making data-driven predictions of future behavior in real time. However, Azizan urges caution.

    “These deep neural networks are only as good as their training data, and their predictions can often be untrustworthy in scenarios not covered by their training data,” he says. Making decisions based on such untrustworthy predictions could lead to fatal accidents in autonomous vehicles or other safety-critical systems.

    To avoid these potentially catastrophic events, Azizan proposes that it is imperative to equip neural networks with a measure of their uncertainty. When the uncertainty is high, they can then be switched to a “safe policy.”

    In pursuit of this goal, Azizan and his collaborators have developed a new algorithm known as SCOD, short for Sketching Curvature for Out-of-Distribution Detection. This framework could be embedded within any deep neural network to equip it with a measure of its uncertainty.

    “This algorithm is model-agnostic and can be applied to neural networks used in various kinds of autonomous systems, whether it’s drones, vehicles, or robots,” says Azizan.
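
    The runtime pattern this enables can be sketched as follows. The ensemble-disagreement uncertainty used here is a simple stand-in chosen for brevity, since SCOD itself derives its estimate from curvature information of a single trained network; the observation size, networks, threshold, and fallback action are all hypothetical.

    ```python
    # Sketch of the runtime pattern: trust the network when its uncertainty is
    # low, otherwise fall back to a conservative action. The ensemble
    # disagreement used here is a simple stand-in; SCOD itself estimates
    # uncertainty from curvature information of a single trained network.
    import torch
    import torch.nn as nn

    ensemble = [nn.Sequential(nn.Linear(4, 32), nn.ReLU(), nn.Linear(32, 2))
                for _ in range(5)]   # pretend these were trained on driving data

    def safe_policy(obs):
        return torch.tensor([0.0, -1.0])   # e.g., hold heading and brake gently

    def act(obs, threshold=0.3):
        with torch.no_grad():
            preds = torch.stack([net(obs) for net in ensemble])
        uncertainty = preds.std(dim=0).mean().item()   # disagreement across members
        if uncertainty > threshold:
            return safe_policy(obs), uncertainty       # looks out-of-distribution
        return preds.mean(dim=0), uncertainty          # in-distribution: trust it

    action, u = act(torch.randn(4))
    print("uncertainty:", round(u, 3), "action:", action)
    ```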

    Azizan hopes to continue working on algorithms for even larger-scale systems. He and his team are designing efficient algorithms to better control supply and demand in smart energy grids. According to Azizan, even if we create the most efficient solar panels and batteries, we can never achieve a sustainable grid powered by renewable resources without the right control mechanisms.

    Mechanical engineers like Ahmed, Mueller, Deng, and Azizan serve as the key to realizing the next revolution of computing in design.

    “MechE is in a unique position at the intersection of the computational and physical worlds,” Azizan says. “Mechanical engineers build a bridge between theoretical, algorithmic tools and real, physical world applications.”

    Sophisticated computational tools, coupled with the ground truth mechanical engineers have in the physical world, could unlock limitless possibilities for design engineering, well beyond what could have been imagined in those early days of CAD.