More stories

  • Co-creating climate futures with real-time data and spatial storytelling

    Virtual story worlds and game engines aren’t just for video games anymore. They are now tools for scientists and storytellers to digitally twin existing physical spaces and then turn them into vessels to dream up speculative climate stories and build collective designs of the future. That’s the theory and practice behind the MIT WORLDING initiative.

    Twice this year, WORLDING matched world-class climate story teams working in XR (extended reality) with relevant labs and researchers across MIT. One global group returned for a virtual gathering online in partnership with Unity for Humanity, while another met for one weekend in person, hosted at the MIT Media Lab.

    “We are witnessing the birth of an emergent field that fuses climate science, urban planning, real-time 3D engines, nonfiction storytelling, and speculative fiction, and it is all fueled by the urgency of the climate crises,” says Katerina Cizek, lead designer of the WORLDING initiative at the Co-Creation Studio of MIT Open Documentary Lab. “Interdisciplinary teams are forming and blossoming around the planet to collectively imagine and tell stories of healthy, livable worlds in virtual 3D spaces and then finding direct ways to translate that back to earth, literally.”

    At this year’s virtual version of WORLDING, five multidisciplinary teams were selected from an open call. In a week-long series of research and development gatherings, the teams met with MIT scientists, staff, fellows, students, and graduates, as well as other leading figures in the field. Guests included curators from film festivals such as Sundance and Venice, climate policy specialists, award-winning media creators, software engineers, and renowned Earth and atmosphere scientists. The teams also heard from MIT scholars in diverse domains, from geomorphology to urban planning as an act of democracy, as well as from climate researchers at the MIT Media Lab.

    Mapping climate data

    “We are measuring the Earth’s environment in increasingly data-driven ways. Hundreds of terabytes of data are taken every day about our planet in order to study the Earth as a holistic system, so we can address key questions about global climate change,” explains Rachel Connolly, an MIT Media Lab research scientist focused on the “Future Worlds” research theme, in a talk to the group. “Why is this important for your work and storytelling in general? Having the capacity to understand and leverage this data is critical for those who wish to design for and successfully operate in the dynamic Earth environment.”

    Making sense of billions of data points was a key theme during this year’s sessions. In another talk, Taylor Perron, an MIT professor of Earth, atmospheric and planetary sciences, shared how his team uses computational modeling combined with many other scientific processes to better understand how geology, climate, and life intertwine to shape the surfaces of Earth and other planets. His work resonated with one WORLDING team in particular, which aims to digitally reconstruct the pre-Hispanic Lake Texcoco — where present-day Mexico City is situated — as a way to contrast and examine the region’s current water crisis.

    Democratizing the future

    While WORLDING approaches rely on rigorous science and the interrogation of large datasets, they are also founded on democratizing community-led approaches.

    MIT Department of Urban Studies and Planning graduate Lafayette Cruise MCP ’19 met with the teams to discuss how he moved his own practice as a trained urban planner to include a futurist component involving participatory methods. “I felt we were asking the same limited questions in regards to the future we were wanting to produce. We’re very limited, very constrained, as to whose values and comforts are being centered. There are so many possibilities for how the future could be.”

    Scaling to reach billions

    This work scales from the very local to massive global populations. Climate policymakers are concerned with reaching billions of people in the line of fire. “We have a goal to reach 1 billion people with climate resilience solutions,” says Nidhi Upadhyaya, deputy director at Atlantic Council’s Adrienne Arsht-Rockefeller Foundation Resilience Center. To get that reach, Upadhyaya is turning to games. “There are 3.3 billion-plus people playing video games across the world. Half of these players are women. This industry is worth $300 billion. Africa is currently among the fastest-growing gaming markets in the world, and 55 percent of the global players are in the Asia Pacific region.” She reminded the group that this conversation is about policy and how formats of mass communication can be used for policymaking, bringing about change, changing behavior, and creating empathy within audiences.

    Socially engaged game development is also connected to education at Unity Technologies, a game engine company. “We brought together our education and social impact work because we really see it as a critical flywheel for our business,” said Jessica Lindl, vice president and global head of social impact/education at Unity Technologies, in the opening talk of WORLDING. “We upskill about 900,000 students, in university and high school programs around the world, and about 800,000 adults who are actively learning and reskilling and upskilling in Unity. Ultimately resulting in our mission of the ‘world is a better place with more creators in it,’ millions of creators who reach billions of consumers — telling the world stories, and fostering a more inclusive, sustainable, and equitable world.”

    Access to these technologies is key, especially the hardware. “Accessibility has been missing in XR,” explains Reginé Gilbert, who studies and teaches accessibility and disability in user experience design at New York University. “XR is being used in artificial intelligence, assistive technology, business, retail, communications, education, empathy, entertainment, recreation, events, gaming, health, rehabilitation, meetings, navigation, therapy, training, video programming, virtual assistance, wayfinding, and so many other uses. This is a fun fact for folks: 97.8 percent of the world hasn’t tried VR [virtual reality] yet, actually.”

    Meanwhile, new hardware is on its way. The WORLDING group got early insights into the highly anticipated Apple Vision Pro headset, which promises to integrate many forms of XR and personal computing in one device. “They’re really pushing this kind of pass-through or mixed reality,” said Dan Miller, a Unity engineer on the PolySpatial team collaborating with Apple. He described the experience of the device: “You are viewing the real world. You’re pulling up windows, you’re interacting with content. It’s a kind of spatial computing device where you have multiple apps open, whether it’s your email client next to your messaging client with a 3D game in the middle. You’re interacting with all these things in the same space and at different times.”

    “WORLDING combines our passion for social-impact storytelling and incredible innovative storytelling,” said Paisley Smith of the Unity for Humanity Program at Unity Technologies. She added, “This is an opportunity for creators to incubate their game-changing projects and connect with experts across climate, story, and technology.”

    Meeting at MIT

    In a new in-person iteration of WORLDING this year, organizers collaborated closely with Connolly at the MIT Media Lab to co-design an in-person weekend conference Oct. 25 – Nov. 7 with 45 scholars and professionals who visualize climate data at NASA, the National Oceanic and Atmospheric Administration, planetariums, and museums across the United States.

    A participant said of the event, “An incredible workshop that had a profound effect on my understanding of climate data storytelling and how to combine different components together for a more [holistic] solution.”

    “With this gathering under our new Future Worlds banner,” says Dava Newman, director of the MIT Media Lab and the Apollo Program Professor of Astronautics, “the Media Lab seeks to affect human behavior and help societies everywhere to improve life here on Earth and in worlds beyond, so that all — the sentient, natural, and cosmic — worlds may flourish.”

    “WORLDING’s virtual-only component has been our biggest strength because it has enabled a true, international cohort to gather, build, and create together. But this year, an in-person version showed broader opportunities that spatial interactivity generates — informal Q&As, physical worksheets, and larger-scale ideation, all leading to deeper trust-building,” says WORLDING producer Srushti Kamat SM ’23.

    The future and potential of WORLDING lie in the ongoing dialogue between the virtual and physical, both in the work itself and in the format of the workshops.

  • Technique could efficiently solve partial differential equations for numerous applications

    In fields such as physics and engineering, partial differential equations (PDEs) are used to model complex physical processes to generate insight into how some of the most complicated physical and natural systems in the world function.

    To solve these difficult equations, researchers use high-fidelity numerical solvers, which can be very time-consuming and computationally expensive to run. The current simplified alternative, data-driven surrogate models, compute the goal property of a solution to PDEs rather than the whole solution. These models are trained on a set of data generated by the high-fidelity solver and learn to predict the output of the PDEs for new inputs. This approach is data-intensive and expensive because complex physical systems require a large number of simulations to generate enough training data.

    In a new paper, “Physics-enhanced deep surrogates for partial differential equations,” published in December in Nature Machine Intelligence, a new method is proposed for developing data-driven surrogate models for complex physical systems in such fields as mechanics, optics, thermal transport, fluid dynamics, physical chemistry, and climate models.

    The paper was authored by MIT’s professor of applied mathematics Steven G. Johnson along with Payel Das and Youssef Mroueh of the MIT-IBM Watson AI Lab and IBM Research; Chris Rackauckas of Julia Lab; and Raphaël Pestourie, a former MIT postdoc who is now at Georgia Tech. The authors call their method “physics-enhanced deep surrogate” (PEDS), which combines a low-fidelity, explainable physics simulator with a neural network generator. The neural network generator is trained end-to-end to match the output of the high-fidelity numerical solver.
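
    To make the idea concrete, here is a minimal, hypothetical sketch of a PEDS-style surrogate written in PyTorch (the published implementation is in Julia; the toy solver, network sizes, and random data below are illustrative placeholders, not the authors’ code). A neural “generator” produces inputs for a cheap, differentiable low-fidelity solver, and the composition is trained end-to-end to match high-fidelity outputs.

```python
# Minimal PEDS-style sketch (illustrative only): neural generator + differentiable
# low-fidelity solver, trained end-to-end against high-fidelity data.
import torch
import torch.nn as nn

def low_fidelity_solver(coarse_inputs: torch.Tensor) -> torch.Tensor:
    # Stand-in for an explainable coarse physics model (hypothetical toy map);
    # in practice this would be a differentiable coarse-grid or effective-medium solver.
    return torch.tanh(coarse_inputs).mean(dim=-1, keepdim=True)

class PEDSSurrogate(nn.Module):
    def __init__(self, n_params: int, n_coarse: int):
        super().__init__()
        # Generator network: maps design parameters to the coarse solver's inputs.
        self.generator = nn.Sequential(
            nn.Linear(n_params, 64), nn.ReLU(),
            nn.Linear(64, n_coarse),
        )

    def forward(self, params: torch.Tensor) -> torch.Tensor:
        coarse_inputs = self.generator(params)     # learned low-fidelity representation
        return low_fidelity_solver(coarse_inputs)  # physics model produces the output

# Train against (placeholder) high-fidelity solver outputs via automatic differentiation.
model = PEDSSurrogate(n_params=10, n_coarse=32)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
params = torch.randn(1000, 10)    # ~1,000 training designs, as in the article
targets = torch.randn(1000, 1)    # placeholder for expensive high-fidelity results
for epoch in range(100):
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(params), targets)
    loss.backward()               # gradients flow through solver and generator alike
    optimizer.step()
```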

    “My aspiration is to replace the inefficient process of trial and error with systematic, computer-aided simulation and optimization,” says Pestourie. “Recent breakthroughs in AI like the large language model of ChatGPT rely on hundreds of billions of parameters and require vast amounts of resources to train and evaluate. In contrast, PEDS is affordable to all because it is incredibly efficient in computing resources and has a very low barrier in terms of infrastructure needed to use it.”

    In the article, they show that PEDS surrogates can be up to three times more accurate than an ensemble of feedforward neural networks with limited data (approximately 1,000 training points), and reduce the training data needed by at least a factor of 100 to achieve a target error of 5 percent. Developed using the MIT-designed Julia programming language, this scientific machine-learning method is thus efficient in both computing and data.

    The authors also report that PEDS provides a general, data-driven strategy to bridge the gap between a vast array of simplified physical models and the corresponding brute-force numerical solvers used to model complex systems. The technique offers accuracy, speed, data efficiency, and physical insight into the process.

    Says Pestourie, “Since the 2000s, as computing capabilities improved, the trend of scientific models has been to increase the number of parameters to fit the data better, sometimes at the cost of a lower predictive accuracy. PEDS does the opposite by choosing its parameters smartly. It leverages the technology of automatic differentiation to train a neural network that makes a model with few parameters accurate.”

    “The main challenge that prevents surrogate models from being used more widely in engineering is the curse of dimensionality — the fact that the needed data to train a model increases exponentially with the number of model variables,” says Pestourie. “PEDS reduces this curse by incorporating information from the data and from the field knowledge in the form of a low-fidelity model solver.”

    The researchers say that PEDS has the potential to revive a whole body of the pre-2000 literature dedicated to minimal models — intuitive models that PEDS could make more accurate while also being predictive for surrogate model applications.

    “The application of the PEDS framework is beyond what we showed in this study,” says Das. “Complex physical systems governed by PDEs are ubiquitous, from climate modeling to seismic modeling and beyond. Our physics-inspired fast and explainable surrogate models will be of great use in those applications, and play a complementary role to other emerging techniques, like foundation models.”

    The research was supported by the MIT-IBM Watson AI Lab and the U.S. Army Research Office through the Institute for Soldier Nanotechnologies.

  • Search algorithm reveals nearly 200 new kinds of CRISPR systems

    Microbial sequence databases contain a wealth of information about enzymes and other molecules that could be adapted for biotechnology. But these databases have grown so large in recent years that they’ve become difficult to search efficiently for enzymes of interest.

    Now, scientists at the McGovern Institute for Brain Research at MIT, the Broad Institute of MIT and Harvard, and the National Center for Biotechnology Information (NCBI) at the National Institutes of Health have developed a new search algorithm that has identified 188 new kinds of rare CRISPR systems in bacterial genomes, encompassing thousands of individual systems. The work appears today in Science.

    The algorithm, which comes from the lab of pioneering CRISPR researcher Professor Feng Zhang, uses big-data clustering approaches to rapidly search massive amounts of genomic data. The team used their algorithm, called Fast Locality-Sensitive Hashing-based clustering (FLSHclust), to mine three major public databases that contain data from a wide range of unusual bacteria, including ones found in coal mines, breweries, Antarctic lakes, and dog saliva. The scientists found a surprising number and diversity of CRISPR systems, including ones that could make edits to DNA in human cells, others that can target RNA, and many with a variety of other functions.

    The new systems could potentially be harnessed to edit mammalian cells with fewer off-target effects than current Cas9 systems. They could also one day be used as diagnostics or serve as molecular records of activity inside cells.

    The researchers say their search highlights an unprecedented level of diversity and flexibility of CRISPR and that there are likely many more rare systems yet to be discovered as databases continue to grow.

    “Biodiversity is such a treasure trove, and as we continue to sequence more genomes and metagenomic samples, there is a growing need for better tools, like FLSHclust, to search that sequence space to find the molecular gems,” says Zhang, a co-senior author on the study and the James and Patricia Poitras Professor of Neuroscience at MIT with joint appointments in the departments of Brain and Cognitive Sciences and Biological Engineering. Zhang is also an investigator at the McGovern Institute for Brain Research at MIT, a core institute member at the Broad, and an investigator at the Howard Hughes Medical Institute. Eugene Koonin, a distinguished investigator at the NCBI, is co-senior author on the study as well.

    Searching for CRISPR

    CRISPR, which stands for clustered regularly interspaced short palindromic repeats, is a bacterial defense system that has been engineered into many tools for genome editing and diagnostics.

    To mine databases of protein and nucleic acid sequences for novel CRISPR systems, the researchers developed an algorithm based on an approach borrowed from the big data community. This technique, called locality-sensitive hashing, clusters together objects that are similar but not exactly identical. Using this approach allowed the team to probe billions of protein and DNA sequences — from the NCBI, its Whole Genome Shotgun database, and the Joint Genome Institute — in weeks, whereas previous methods that look for identical objects would have taken months. They designed their algorithm to look for genes associated with CRISPR.
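
    As a rough illustration of that idea (a hypothetical sketch, not the FLSHclust implementation; the sequences, k-mer size, and hashing scheme below are made up for the example), the code below uses MinHash-style locality-sensitive hashing to bucket sequences that share most of their k-mers, so near-identical candidates can be grouped without comparing every pair.

```python
# Toy locality-sensitive hashing for sequences (illustrative only, not FLSHclust):
# similar-but-not-identical sequences land in the same bucket via MinHash banding.
import random
from collections import defaultdict

random.seed(0)

def kmers(seq, k=5):
    """Represent a protein/DNA sequence by its set of k-mers."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def minhash_signature(kmer_set, hash_seeds):
    """One MinHash value per seed; similar sets share many of these minima.
    (Python's built-in hash is stable within one run, which is all we need here.)"""
    return tuple(min(hash((seed, kmer)) for kmer in kmer_set) for seed in hash_seeds)

def lsh_buckets(sequences, num_hashes=12, band_size=3, k=5):
    """Band the signatures so near-duplicate sequences collide in some bucket."""
    seeds = [random.random() for _ in range(num_hashes)]
    buckets = defaultdict(list)
    for name, seq in sequences.items():
        sig = minhash_signature(kmers(seq, k), seeds)
        for b in range(0, num_hashes, band_size):
            buckets[(b, sig[b:b + band_size])].append(name)
    return {key: members for key, members in buckets.items() if len(members) > 1}

seqs = {
    "cas_like_1": "MKRILVLGAGGTIGSAVARELAKEG",
    "cas_like_2": "MKRILVLGAGGTIGSAVTRELAKEG",   # one substitution: likely shares a bucket
    "unrelated":  "MSTNPKPQRKTKRNTNRRPQDVKFP",
}
print(lsh_buckets(seqs))
```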

    “This new algorithm allows us to parse through data in a time frame that’s short enough that we can actually recover results and make biological hypotheses,” says Soumya Kannan PhD ’23, who is a co-first author on the study. Kannan was a graduate student in Zhang’s lab when the study began and is currently a postdoc and Junior Fellow at Harvard University. Han Altae-Tran PhD ’23, a graduate student in Zhang’s lab during the study and currently a postdoc at the University of Washington, was the study’s other co-first author.

    “This is a testament to what you can do when you improve on the methods for exploration and use as much data as possible,” says Altae-Tran. “It’s really exciting to be able to improve the scale at which we search.”

    New systems

    In their analysis, Altae-Tran, Kannan, and their colleagues noticed that the thousands of CRISPR systems they found fell into a few existing and many new categories. They studied several of the new systems in greater detail in the lab.

    They found several new variants of known Type I CRISPR systems, which use a guide RNA that is 32 base pairs long rather than the 20-nucleotide guide of Cas9. Because of their longer guide RNAs, these Type I systems could potentially be used to develop more precise gene-editing technology that is less prone to off-target editing. Zhang’s team showed that two of these systems could make short edits in the DNA of human cells. And because these Type I systems are similar in size to CRISPR-Cas9, they could likely be delivered to cells in animals or humans using the same gene-delivery technologies being used today for CRISPR.

    One of the Type I systems also showed “collateral activity” — broad degradation of nucleic acids after the CRISPR protein binds its target. Scientists have used similar systems to make infectious disease diagnostics such as SHERLOCK, a tool capable of rapidly sensing a single molecule of DNA or RNA. Zhang’s team thinks the new systems could be adapted for diagnostic technologies as well.

    The researchers also uncovered new mechanisms of action for some Type IV CRISPR systems, and a Type VII system that precisely targets RNA, which could potentially be used in RNA editing. Other systems could potentially be used as recording tools — a molecular document of when a gene was expressed — or as sensors of specific activity in a living cell.

    Mining data

    The scientists say their algorithm could aid in the search for other biochemical systems. “This search algorithm could be used by anyone who wants to work with these large databases for studying how proteins evolve or discovering new genes,” Altae-Tran says.

    The researchers add that their findings illustrate not only how diverse CRISPR systems are, but also that most are rare and only found in unusual bacteria. “Some of these microbial systems were exclusively found in water from coal mines,” Kannan says. “If someone hadn’t been interested in that, we may never have seen those systems. Broadening our sampling diversity is really important to continue expanding the diversity of what we can discover.”

    This work was supported by the Howard Hughes Medical Institute; the K. Lisa Yang and Hock E. Tan Molecular Therapeutics Center at MIT; Broad Institute Programmable Therapeutics Gift Donors; The Pershing Square Foundation, William Ackman and Neri Oxman; James and Patricia Poitras; BT Charitable Foundation; Asness Family Foundation; Kenneth C. Griffin; the Phillips family; David Cheng; and Robert Metcalfe.

  • Rewarding excellence in open data

    The second annual MIT Prize for Open Data, which included a $2,500 cash prize, was recently awarded to 10 individual and group research projects. Presented jointly by the School of Science and the MIT Libraries, the prize highlights the value of open data — research data that is openly accessible and reusable — at the Institute. The prize winners and 12 honorable mention recipients were honored at the Open Data @ MIT event held Oct. 24 at Hayden Library. 

    Conceived by Chris Bourg, director of MIT Libraries, and Rebecca Saxe, associate dean of the School of Science and the John W. Jarve (1978) Professor of Brain and Cognitive Sciences, the prize program was launched in 2022. It recognizes MIT-affiliated researchers who use or share open data, create infrastructure for open data sharing, or theorize about open data. Nominations were solicited from across the Institute, with a focus on trainees: undergraduate and graduate students, postdocs, and research staff. 

    “The prize is explicitly aimed at early-career researchers,” says Bourg. “Supporting and encouraging the next generation of researchers will help ensure that the future of scholarship is characterized by a norm of open sharing.”

    The 2023 awards were presented at a celebratory event held during International Open Access Week. Winners gave five-minute presentations on their projects and the role that open data plays in their research. The program also included remarks from Bourg and Anne White, School of Engineering Distinguished Professor of Engineering, vice provost, and associate vice president for research administration. White reflected on the ways in which MIT has demonstrated its values with the open sharing of research and scholarship and acknowledged the efforts of the honorees and advocates gathered at the event: “Thank you for the active role you’re all playing in building a culture of openness in research,” she said. “It benefits us all.” 

    Winners were chosen from more than 80 nominees, representing all five MIT schools, the MIT Schwarzman College of Computing, and several research centers across the Institute. A committee composed of faculty, staff, and graduate students made the selections:

    Hammaad Adam, graduate student in the Institute for Data, Systems, and Society, accepted on behalf of the team behind Organ Retrieval and Collection of Health Information for Donation (ORCHID), the first ever multi-center dataset dedicated to the organ procurement process. ORCHID provides the first opportunity to quantitatively analyze organ procurement organization decisions and identify operational inefficiencies.
    Adam Atanas, postdoc in the Department of Brain and Cognitive Sciences (BCS), and Jungsoo Kim, graduate student in BCS, created WormWideWeb.org. The site, allowing researchers to easily browse and download C. elegans whole-brain datasets, will be useful to C. elegans neuroscientists and theoretical/computational neuroscientists. 
    Paul Berube, research scientist in the Department of Civil and Environmental Engineering, and Steven Biller, assistant professor of biological sciences at Wellesley College, won for “Unlocking Marine Microbiomes with Open Data.” Open data of genomes and metagenomes for marine ecosystems, with a focus on cyanobacteria, leverage the power of contemporaneous data from GEOTRACES and other long-standing ocean time-series programs to provide underlying information to answer questions about marine ecosystem function. 
    Jack Cavanagh, Sarah Kopper, and Diana Horvath of the Abdul Latif Jameel Poverty Action Lab (J-PAL) were recognized for J-PAL’s Data Publication Infrastructure, which includes a trusted repository of open-access datasets, a dedicated team of data curators, and coding tools and training materials to help other teams publish data in an efficient and ethical manner. 
    Jerome Patrick Cruz, graduate student in the Department of Political Science, won for OpenAudit, leveraging advances in natural language processing and machine learning to make data in public audit reports more usable for academics and policy researchers, as well as governance practitioners, watchdogs, and reformers. This work was done in collaboration with colleagues at Ateneo de Manila University in the Philippines. 
    Undergraduate student Daniel Kurlander created a tool for planetary scientists to rapidly access and filter images of the comet 67P/Churyumov-Gerasimenko. The web-based tool enables searches by location and other properties, does not require a time-intensive download of a massive dataset, allows analysis of the data independent of the speed of one’s computer, and does not require installation of a complex set of programs. 
    Halie Olson, postdoc in BCS, was recognized for sharing data from a functional magnetic resonance imaging (fMRI) study on language processing. The study used video clips from “Sesame Street” in which researchers manipulated the comprehensibility of the speech stream, allowing them to isolate a “language response” in the brain.
    Thomas González Roberts, graduate student in the Department of Aeronautics and Astronautics, won for the International Telecommunication Union Compliance Assessment Monitor. This tool combats the heritage of secrecy in outer space operations by creating human- and machine-readable datasets that succinctly describe the international agreements that govern satellite operations. 
    Melissa Kline Struhl, research scientist in BCS, was recognized for Children Helping Science, a free, open-source platform for remote studies with babies and children that makes it possible for researchers at more than 100 institutions to conduct reproducible studies. 
    JS Tan, graduate student in the Department of Urban Studies and Planning, developed the Collective Action in Tech Archive in collaboration with Nataliya Nedzhvetskaya of the University of California at Berkeley. It is an open database of all publicly recorded collective actions taken by workers in the global tech industry. 
    A complete list of winning projects and honorable mentions, including links to the research data, is available on the MIT Libraries website.

  • Forging climate connections across the Institute

    Climate change is the ultimate cross-cutting issue: Not limited to any one discipline, it ranges across science, technology, policy, culture, human behavior, and well beyond. The response to it likewise requires an all-of-MIT effort.

    Now, to strengthen such an effort, a new grant program spearheaded by the Climate Nucleus, the faculty committee charged with the oversight and implementation of Fast Forward: MIT’s Climate Action Plan for the Decade, aims to build up MIT’s climate leadership capacity while also supporting innovative scholarship on diverse climate-related topics and forging new connections across the Institute.

    Called the Fast Forward Faculty Fund (F^4 for short), the program has named its first cohort of six faculty members after issuing its inaugural call for proposals in April 2023. The cohort will come together throughout the year for climate leadership development programming and networking. The program provides financial support for graduate students who will work with the faculty members on the projects — the students will also participate in leadership-building activities — as well as $50,000 in flexible, discretionary funding to be used to support related activities. 

    “Climate change is a crisis that truly touches every single person on the planet,” says Noelle Selin, co-chair of the nucleus and interim director of the Institute for Data, Systems, and Society. “It’s therefore essential that we build capacity for every member of the MIT community to make sense of the problem and help address it. Through the Fast Forward Faculty Fund, our aim is to have a cohort of climate ambassadors who can embed climate everywhere at the Institute.”

    F^4 supports both faculty who would like to begin doing climate-related work and faculty members who are interested in deepening their work on climate. The program has the core goal of developing cohorts of F^4 faculty and graduate students who, in addition to conducting their own research, will become climate leaders at MIT, proactively looking for ways to forge new climate connections across schools, departments, and disciplines.

    One of the projects, “Climate Crisis and Real Estate: Science-based Mitigation and Adaptation Strategies,” led by Professor Siqi Zheng of the MIT Center for Real Estate in collaboration with colleagues from the MIT Sloan School of Management, focuses on the roughly 40 percent of carbon dioxide emissions that come from the buildings and real estate sector. Zheng notes that this sector has been slow to respond to climate change, but says that is starting to change, thanks in part to the rising awareness of climate risks and new local regulations aimed at reducing emissions from buildings.

    Using a data-driven approach, the project seeks to understand the efficient and equitable market incentives, technology solutions, and public policies that are most effective at transforming the real estate industry. Johnattan Ontiveros, a graduate student in the Technology and Policy Program, is working with Zheng on the project.

    “We were thrilled at the incredible response we received from the MIT faculty to our call for proposals, which speaks volumes about the depth and breadth of interest in climate at MIT,” says Anne White, nucleus co-chair and vice provost and associate vice president for research. “This program makes good on key commitments of the Fast Forward plan, supporting cutting-edge new work by faculty and graduate students while helping to deepen the bench of climate leaders at MIT.”

    During the 2023-24 academic year, the F^4 faculty and graduate student cohorts will come together to discuss their projects, explore opportunities for collaboration, participate in climate leadership development, and think proactively about how to deepen interdisciplinary connections among MIT community members interested in climate change.

    The six inaugural F^4 awardees are:

    Professor Tristan Brown, History Section: Humanistic Approaches to the Climate Crisis  

    With this project, Brown aims to create a new community of practice around narrative-centric approaches to environmental and climate issues. Part of a broader humanities initiative at MIT, it brings together a global working group of interdisciplinary scholars, including Serguei Saavedra (Department of Civil and Environmental Engineering) and Or Porath (Tel Aviv University; Religion), collectively focused on examining the historical and present links between sacred places and biodiversity for the purposes of helping governments and nongovernmental organizations formulate better sustainability goals. Boyd Ruamcharoen, a PhD student in the History, Anthropology, and Science, Technology, and Society (HASTS) program, will work with Brown on this project.

    Professor Kerri Cahoy, departments of Aeronautics and Astronautics (AeroAstro) and Earth, Atmospheric, and Planetary Sciences: Onboard Autonomous AI-driven Satellite Sensor Fusion for Coastal Region Monitoring

    The motivation for this project is the need for much better data collection from satellites, where technology can be “20 years behind,” says Cahoy. As part of this project, Cahoy will pursue research in the area of autonomous artificial intelligence-enabled rapid sensor fusion (which combines data from different sensors, such as radar and cameras) onboard satellites to improve understanding of the impacts of climate change, specifically sea-level rise and hurricanes and flooding in coastal regions. Graduate students Madeline Anderson, a PhD student in electrical engineering and computer science (EECS), and Mary Dahl, a PhD student in AeroAstro, will work with Cahoy on this project.

    Professor Priya Donti, Department of Electrical Engineering and Computer Science: Robust Reinforcement Learning for High-Renewables Power Grids 

    With renewables like wind and solar making up a growing share of electricity generation on power grids, Donti’s project focuses on improving control methods for these distributed sources of electricity. The research will aim to create a realistic representation of the characteristics of power grid operations, and eventually inform scalable operational improvements in power systems. It will “give power systems operators faith that, OK, this conceptually is good, but it also actually works on this grid,” says Donti. PhD candidate Ana Rivera from EECS is the F^4 graduate student on the project.

    Professor Jason Jackson, Department of Urban Studies and Planning (DUSP): Political Economy of the Climate Crisis: Institutions, Power and Global Governance

    This project takes a political economy approach to the climate crisis, offering a distinct lens to examine, first, the political governance challenge of mobilizing climate action and designing new institutional mechanisms to address the global and intergenerational distributional aspects of climate change; second, the economic challenge of devising new institutional approaches to equitably finance climate action; and third, the cultural challenge — and opportunity — of empowering an adaptive socio-cultural ecology through traditional knowledge and local-level social networks to achieve environmental resilience. Graduate students Chen Chu and Mrinalini Penumaka, both PhD students in DUSP, are working with Jackson on the project.

    Professor Haruko Wainwright, departments of Nuclear Science and Engineering (NSE) and Civil and Environmental Engineering: Low-cost Environmental Monitoring Network Technologies in Rural Communities for Addressing Climate Justice 

    This project will establish a community-based climate and environmental monitoring network, in addition to a data visualization and analysis infrastructure, in rural marginalized communities to better understand and address climate justice issues. The project team plans to work with rural communities in Alaska to install low-cost air and water quality, weather, and soil sensors. Graduate students Kay Whiteaker, an MS candidate in NSE, and Amandeep Singh, an MS candidate in System Design and Management at Sloan, are working with Wainwright on the project, as is David McGee, professor in earth, atmospheric, and planetary sciences.

    Professor Siqi Zheng, MIT Center for Real Estate and DUSP: Climate Crisis and Real Estate: Science-based Mitigation and Adaptation Strategies 

    See the text above for the details on this project.

  • Improving US air quality, equitably

    Decarbonization of national economies will be key to achieving global net-zero emissions by 2050, a major stepping stone to the Paris Agreement’s long-term goal of keeping global warming well below 2 degrees Celsius (and ideally 1.5 C), and thereby averting the worst consequences of climate change. Toward that end, the United States has pledged to reduce its greenhouse gas emissions by 50-52 percent from 2005 levels by 2030, backed by its implementation of the 2022 Inflation Reduction Act. This strategy is consistent with a 50-percent reduction in carbon dioxide (CO2) by the end of the decade.

    If U.S. federal carbon policy is successful, the nation’s overall air quality will also improve. Cutting CO2 emissions reduces atmospheric concentrations of air pollutants that lead to the formation of fine particulate matter (PM2.5), which causes more than 200,000 premature deaths in the United States each year. But an average nationwide improvement in air quality will not be felt equally; air pollution exposure disproportionately harms people of color and lower-income populations.

    How effective are current federal decarbonization policies in reducing U.S. racial and economic disparities in PM2.5 exposure, and what changes will be needed to improve their performance? To answer that question, researchers at MIT and Stanford University recently evaluated a range of policies which, like current U.S. federal carbon policies, reduce economy-wide CO2 emissions by 40-60 percent from 2005 levels by 2030. Their findings appear in an open-access article in the journal Nature Communications.

    First, they show that a carbon-pricing policy, while effective in reducing PM2.5 exposure for all racial/ethnic groups, does not significantly mitigate relative disparities in exposure. On average, the white population undergoes far less exposure than Black, Hispanic, and Asian populations. This policy does little to reduce exposure disparities because the CO2 emissions reductions that it achieves primarily occur in the coal-fired electricity sector. Other sectors, such as industry and heavy-duty diesel transportation, contribute far more PM2.5-related emissions.

    The researchers then examine thousands of different reduction options through an optimization approach to identify whether any possible combination of carbon dioxide reductions in the range of 40-60 percent can mitigate disparities. They find that no policy scenario aligned with current U.S. carbon dioxide emissions targets is likely to significantly reduce current PM2.5 exposure disparities.

    “Policies that address only about 50 percent of CO2 emissions leave many polluting sources in place, and those that prioritize reductions for minorities tend to benefit the entire population,” says Noelle Selin, supervising author of the study and a professor at MIT’s Institute for Data, Systems and Society and Department of Earth, Atmospheric and Planetary Sciences. “This means that a large range of policies that reduce CO2 can improve air quality overall, but can’t address long-standing inequities in air pollution exposure.”

    So if climate policy alone cannot adequately achieve equitable air quality results, what viable options remain? The researchers suggest that more ambitious carbon policies could narrow racial and economic PM2.5 exposure disparities in the long term, but not within the next decade. To make a near-term difference, they recommend interventions designed to reduce PM2.5 emissions resulting from non-CO2 sources, ideally at the economic sector or community level.

    “Achieving improved PM2.5 exposure for populations that are disproportionately exposed across the United States will require thinking that goes beyond current CO2 policy strategies, most likely involving large-scale structural changes,” says Selin. “This could involve changes in local and regional transportation and housing planning, together with accelerated efforts towards decarbonization.”

  • From physics to generative AI: An AI model for advanced pattern generation

    Generative AI, which is currently riding a crest of popular discourse, promises a world where the simple transforms into the complex — where a simple distribution evolves into intricate patterns of images, sounds, or text, rendering the artificial startlingly real. 

    The realms of imagination no longer remain as mere abstractions, as researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have brought an innovative AI model to life. Their new technology integrates two seemingly unrelated physical laws that underpin the best-performing generative models to date: diffusion, which typically illustrates the random motion of elements, like heat permeating a room or a gas expanding into space, and Poisson Flow, which draws on the principles governing the activity of electric charges.

    This harmonious blend has resulted in superior performance in generating new images, outpacing existing state-of-the-art models. Since its inception, the “Poisson Flow Generative Model ++” (PFGM++) has found potential applications in various fields, from antibody and RNA sequence generation to audio production and graph generation.

    The model can generate complex patterns, like creating realistic images or mimicking real-world processes. PFGM++ builds off of PFGM, the team’s work from the prior year. PFGM takes inspiration from the mathematics behind the “Poisson” equation and applies it to the data the model tries to learn from. To do this, the team used a clever trick: They added an extra dimension to their model’s “space,” kind of like going from a 2D sketch to a 3D model. This extra dimension gives more room for maneuvering, places the data in a larger context, and helps one approach the data from all directions when generating new samples.

    “PFGM++ is an example of the kinds of AI advances that can be driven through interdisciplinary collaborations between physicists and computer scientists,” says Jesse Thaler, theoretical particle physicist in MIT’s Laboratory for Nuclear Science’s Center for Theoretical Physics and director of the National Science Foundation’s AI Institute for Artificial Intelligence and Fundamental Interactions (NSF AI IAIFI), who was not involved in the work. “In recent years, AI-based generative models have yielded numerous eye-popping results, from photorealistic images to lucid streams of text. Remarkably, some of the most powerful generative models are grounded in time-tested concepts from physics, such as symmetries and thermodynamics. PFGM++ takes a century-old idea from fundamental physics — that there might be extra dimensions of space-time — and turns it into a powerful and robust tool to generate synthetic but realistic datasets. I’m thrilled to see the myriad of ways ‘physics intelligence’ is transforming the field of artificial intelligence.”

    The underlying mechanism of PFGM isn’t as complex as it might sound. The researchers compared the data points to tiny electric charges placed on a flat plane in a dimensionally expanded world. These charges produce an “electric field,” with the charges looking to move upwards along the field lines into an extra dimension and consequently forming a uniform distribution on a vast imaginary hemisphere. The generation process is like rewinding a videotape: starting with a uniformly distributed set of charges on the hemisphere and tracking their journey back to the flat plane along the electric lines, they align to match the original data distribution. This intriguing process allows the neural model to learn the electric field, and generate new data that mirrors the original. 
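
    That picture can be made concrete with a toy calculation. The sketch below (an illustrative approximation under simplifying assumptions, not the released PFGM/PFGM++ code: the 2D data, the far-away starting points, and the crude Euler integration are all stand-ins) treats a small dataset as charges on the z = 0 plane, evaluates the resulting field in one extra dimension, and integrates samples back down to the plane. In PFGM++, the single extra dimension is replaced by D of them, which changes the exponent in the field kernel from N + 1 to N + D.

```python
# Toy PFGM-style generation (illustrative only): follow the empirical "electric field"
# of data charges on the z = 0 plane backward from far away down to the plane.
import numpy as np

rng = np.random.default_rng(0)
N = 2                                    # data dimension for this toy example
data = rng.normal(size=(500, N))         # stand-in for a training set

def poisson_field(x, z):
    """Empirical (N+1)-dimensional Poisson field at the augmented point (x, z)."""
    diffs_x = x - data                                    # x-components of (point - charge)
    dist = np.sqrt((diffs_x ** 2).sum(axis=1) + z ** 2)   # charges all sit at z = 0
    w = dist ** -(N + 1)                                  # inverse-power field kernel
    Ex = (w[:, None] * diffs_x).sum(axis=0)
    Ez = (w * z).sum()                                    # positive whenever z > 0
    return Ex, Ez

def generate_sample(z_start=40.0, z_end=1e-3, steps=400):
    x = rng.normal(size=N) * z_start     # crude stand-in for the far-away prior
    z, dz = z_start, (z_end - z_start) / steps
    for _ in range(steps):               # backward ODE in z: dx/dz = Ex / Ez
        Ex, Ez = poisson_field(x, z)
        x = x + (Ex / Ez) * dz           # dz < 0, so this walks back toward the plane
        z = z + dz
    return x                             # ends up near the data distribution

samples = np.stack([generate_sample() for _ in range(10)])
print(samples.shape)                     # (10, 2)
```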

    The PFGM++ model extends the electric field in PFGM to an intricate, higher-dimensional framework. When you keep expanding these dimensions, something unexpected happens — the model starts resembling another important class of models, the diffusion models. This work is all about finding the right balance. The PFGM and diffusion models sit at opposite ends of a spectrum: one is robust but complex to handle, the other simpler but less sturdy. The PFGM++ model offers a sweet spot, striking a balance between robustness and ease of use. This innovation paves the way for more efficient image and pattern generation, marking a significant step forward in technology. Along with adjustable dimensions, the researchers proposed a new training method that enables more efficient learning of the electric field. 

    To bring this theory to life, the team resolved a pair of differential equations detailing these charges’ motion within the electric field. They evaluated the performance using the Frechet Inception Distance (FID) score, a widely accepted metric that assesses the quality of images generated by the model in comparison to the real ones. PFGM++ further showcases a higher resistance to errors and robustness toward the step size in the differential equations.

    Looking ahead, they aim to refine certain aspects of the model, particularly in systematic ways to identify the “sweet spot” value of D tailored for specific data, architectures, and tasks by analyzing the behavior of estimation errors of neural networks. They also plan to apply PFGM++ to modern large-scale text-to-image and text-to-video generation.

    “Diffusion models have become a critical driving force behind the revolution in generative AI,” says Yang Song, research scientist at OpenAI. “PFGM++ presents a powerful generalization of diffusion models, allowing users to generate higher-quality images by improving the robustness of image generation against perturbations and learning errors. Furthermore, PFGM++ uncovers a surprising connection between electrostatics and diffusion models, providing new theoretical insights into diffusion model research.”

    “Poisson Flow Generative Models do not only rely on an elegant physics-inspired formulation based on electrostatics, but they also offer state-of-the-art generative modeling performance in practice,” says NVIDIA Senior Research Scientist Karsten Kreis, who was not involved in the work. “They even outperform the popular diffusion models, which currently dominate the literature. This makes them a very powerful generative modeling tool, and I envision their application in diverse areas, ranging from digital content creation to generative drug discovery. More generally, I believe that the exploration of further physics-inspired generative modeling frameworks holds great promise for the future and that Poisson Flow Generative Models are only the beginning.”

    Authors on a paper about this work include three MIT graduate students: Yilun Xu of the Department of Electrical Engineering and Computer Science (EECS) and CSAIL, Ziming Liu of the Department of Physics and the NSF AI IAIFI, and Shangyuan Tong of EECS and CSAIL, as well as Google Senior Research Scientist Yonglong Tian PhD ’23. MIT professors Max Tegmark and Tommi Jaakkola advised the research.

    The team was supported by the MIT-DSTA Singapore collaboration, the MIT-IBM Grand Challenge project, National Science Foundation grants, The Casey and Family Foundation, the Foundational Questions Institute, the Rothberg Family Fund for Cognitive Science, and the ML for Pharmaceutical Discovery and Synthesis Consortium. Their work was presented at the International Conference on Machine Learning this summer.

  • 3 Questions: A new PhD program from the Center for Computational Science and Engineering

    This fall, the Center for Computational Science and Engineering (CCSE), an academic unit in the MIT Schwarzman College of Computing, is introducing a new standalone PhD degree program that will enable students to pursue research in cross-cutting methodological aspects of computational science and engineering. The launch follows approval of the center’s degree program proposal at the May 2023 Institute faculty meeting.

    Doctoral-level graduate study in computational science and engineering (CSE) at MIT has, for the past decade, been offered through an interdisciplinary program in which CSE students are admitted to one of eight participating academic departments in the School of Engineering or School of Science. While this model adds a strong disciplinary component to students’ education, the rapid growth of the CSE field and the establishment of the MIT Schwarzman College of Computing have prompted an exciting expansion of MIT’s graduate-level offerings in computation.

    The new degree, offered by the college, will run alongside MIT’s existing interdisciplinary offerings in CSE, complementing these doctoral training programs and preparing students to contribute to the leading edge of the field. Here, CCSE co-directors Youssef Marzouk and Nicolas Hadjiconstantinou discuss the standalone program and how they expect it to elevate the visibility and impact of CSE research and education at MIT.

    Q: What is computational science and engineering?

    Marzouk: Computational science and engineering focuses on the development and analysis of state-of-the-art methods for computation and their innovative application to problems of science and engineering interest. It has intellectual foundations in applied mathematics, statistics, and computer science, and touches the full range of science and engineering disciplines. Yet, it synthesizes these foundations into a discipline of its own — one that links the digital and physical worlds. It’s an exciting and evolving multidisciplinary field.

    Hadjiconstantinou: Examples of CSE research happening at MIT include modeling and simulation techniques, the underlying computational mathematics, and data-driven modeling of physical systems. Computational statistics and scientific machine learning have become prominent threads within CSE, joining high-performance computing, mathematically-oriented programming languages, and their broader links to algorithms and software. Application domains include energy, environment and climate, materials, health, transportation, autonomy, and aerospace, among others. Some of our researchers focus on general and widely applicable methodology, while others choose to focus on methods and algorithms motivated by a specific domain of application.

    Q: What was the motivation behind creating a standalone PhD program?

    Marzouk: The new degree focuses on a particular class of students whose background and interests are primarily in CSE methodology, in a manner that cuts across the disciplinary research structure represented by our current “with-departments” degree program. There is a strong research demand for such methodologically-focused students among CCSE faculty and MIT faculty in general. Our objective is to create a targeted, coherent degree program in this field that, alongside our other thriving CSE offerings, will create the leading environment for top CSE students worldwide.

    Hadjiconstantinou: One of CCSE’s most important functions is to recruit exceptional students who are trained in and want to work in computational science and engineering. Experience with our CSE master’s program suggests that students with a strong background and interests in the discipline prefer to apply to a pure CSE program for their graduate studies. The standalone degree aims to bring these students to MIT and make them available to faculty across the Institute.

    Q: How will this impact computing education and research at MIT? 

    Hadjiconstantinou: We believe that offering a standalone PhD program in CSE alongside the existing “with-departments” programs will significantly strengthen MIT’s graduate programs in computing. In particular, it will strengthen the methodological core of CSE research and education at MIT, while continuing to support the disciplinary-flavored CSE work taking place in our participating departments, which include Aeronautics and Astronautics; Chemical Engineering; Civil and Environmental Engineering; Materials Science and Engineering; Mechanical Engineering; Nuclear Science and Engineering; Earth, Atmospheric and Planetary Sciences; and Mathematics. Together, these programs will create a stronger CSE student cohort and facilitate deeper exchanges between the college and other units at MIT.

    Marzouk: In a broader sense, the new program is designed to help realize one of the key opportunities presented by the college, which is to create a richer variety of graduate degrees in computation and to involve as many faculty and units in these educational endeavors as possible. The standalone CSE PhD will join other distinguished doctoral programs of the college — such as the Department of Electrical Engineering and Computer Science PhD; the Operations Research Center PhD; and the Interdisciplinary Doctoral Program in Statistics and the Social and Engineering Systems PhD within the Institute for Data, Systems, and Society — and grow in a way that is informed by them. The confluence of these academic programs, and natural synergies among them, will make MIT quite unique.