More stories

  • Center to advance predictive simulation research established at MIT Schwarzman College of Computing

    Understanding the degradation of materials in extreme environments is a scientific problem with major technological applications, ranging from spaceflight to industrial and nuclear safety. Yet it presents an intrinsic challenge: Researchers cannot easily reproduce these environments in the laboratory or observe essential degradation processes in real time. Computational modeling and simulation have consequently become indispensable tools for predicting the behavior of complex materials across a range of demanding conditions.
    At MIT, a new research effort aims to advance the state of the art in predictive simulation and to shape new interdisciplinary graduate education programs at the intersection of computational science and computer science.
    Strengthening engagement with the sciences
    The Center for Exascale Simulation of Materials in Extreme Environments (CESMIX) — based at the Center for Computational Science and Engineering (CCSE) within the MIT Stephen A. Schwarzman College of Computing — will bring together researchers in numerical algorithms and scientific computing, quantum chemistry, materials science, and computer science to connect quantum and molecular simulations of materials with advanced programming languages, compiler technologies, and software performance engineering tools, underpinned by rigorous approaches to statistical inference and uncertainty quantification.
    “One of the goals of CESMIX is to build a substantive link between computer science and computational science and engineering, something that historically has been hard to do, but is sorely needed,” says Daniel Huttenlocher, dean of the MIT Schwarzman College of Computing. “The center will also provide opportunities for faculty, researchers, and students across MIT to interact intellectually and create a new synthesis of different disciplines, which is central to the mission of the college.”
    Leading the project as principal investigator is Youssef Marzouk, professor of aeronautics and astronautics and co-director of CCSE, which was renamed from the Center for Computational Engineering in January to reflect its strengthening engagement with the sciences at MIT. Marzouk, who is also a member of the Statistics and Data Science Center, notes that “CESMIX is trying to do two things simultaneously. On the one hand, we want to solve an incredibly challenging multiscale simulation problem, harnessing quantum mechanical models of complex materials to achieve unprecedented accuracy at the engineering scale. On the other hand, we want to create tools that make development and holistic performance engineering of the associated software stack as easy as possible, to achieve top performance on the coming generation of exascale computational hardware.”
    The project involves an interdisciplinary cohort of eight faculty members, serving as PI and co-PIs, and research scientists spanning multiple labs and departments at MIT. The full list of participants includes:
    Youssef Marzouk, PI, professor of aeronautics and astronautics and co-director of CCSE;
    Saman Amarasinghe, co-PI, professor of computer science and engineering;
    Alan Edelman, co-PI, professor of applied mathematics;
    Nicolas Hadjiconstantinou, co-PI, professor of mechanical engineering and co-director of CCSE;
    Asegun Henry, co-PI, associate professor of mechanical engineering;
    Heather Kulik, co-PI, associate professor of chemical engineering;
    Charles Leiserson, co-PI, the Edwin Sibley Webster Professor of Electrical Engineering;
    Jaime Peraire, co-PI, the H.N. Slater Professor of Aeronautics and Astronautics;
    Cuong Nguyen, principal research scientist in aeronautics and astronautics;
    Tao B. Schardl, research scientist in the Computer Science and Artificial Intelligence Laboratory; and
    Mehdi Pishahang, research scientist in mechanical engineering.
    MIT was among nine universities selected as part of the Predictive Science Academic Alliance Program (PSAAP) III to form a new center to support science-based modeling and simulation and exascale computing technologies. This is the third round of PSAAP centers awarded by the U.S. Department of Energy’s National Nuclear Security Administration (DOE/NNSA) since the program launched in 2008, and the first time the Institute has been selected. MIT is one of just two institutions nationwide chosen to establish a Single-Discipline Center in this round and will receive up to $9.5 million in funding through a cooperative agreement over five years.
    Advancing predictive simulation
    CESMIX will focus on exascale simulation of materials in hypersonic flow environments. It will also drive the development of new predictive simulation paradigms and computer science tools for the exascale. Researchers will specifically aim to predict the degradation of complex (disordered and multi-component) materials under extreme loading inaccessible to direct experimental observation — an application representing a technology domain of intense current interest, and one that exemplifies an important class of scientific problems involving material interfaces in extreme environments.
    “A big challenge here is in being able to predict what reactions will occur and what new molecules will form under these conditions. While quantum mechanical modeling will enable us to predict these events, we also need to be able to address the times and length scales of these processes,” says Kulik, who is also a faculty member of CCSE. “Our efforts will be focused on developing the needed software and machine learning tools that tell us when more affordable physical models can address the length scale challenge and when we need quantum mechanics to address the accuracy challenge.”
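    A minimal sketch of the kind of decision logic Kulik describes, in which an inexpensive machine-learned surrogate is trusted only when its own uncertainty estimate is small and an expensive quantum-mechanical calculation is invoked otherwise, might look like the following. The ensemble surrogate, the stand-in quantum routine, and the uncertainty threshold are all hypothetical placeholders for illustration; this is not CESMIX code.
    ```python
    # Hypothetical sketch: choose between a cheap surrogate potential and an
    # expensive quantum-mechanical (QM) calculation based on the surrogate's own
    # uncertainty estimate. Models, values, and threshold are placeholders.
    import numpy as np

    rng = np.random.default_rng(0)

    def surrogate_ensemble_energy(configuration, n_members=5):
        """Cheap ML potential evaluated as a small ensemble; the ensemble spread
        serves as a rough uncertainty estimate."""
        base = float(np.sum(configuration ** 2))        # stand-in for a real potential
        members = base + 0.05 * rng.standard_normal(n_members)
        return members.mean(), members.std()

    def quantum_energy(configuration):
        """Stand-in for an expensive first-principles calculation."""
        return float(np.sum(configuration ** 2)) + 0.01

    UNCERTAINTY_THRESHOLD = 0.04  # switch to QM when the surrogate is too uncertain

    def evaluate(configuration):
        energy, spread = surrogate_ensemble_energy(configuration)
        if spread > UNCERTAINTY_THRESHOLD:
            # Surrogate is unreliable here: fall back to the accurate model (and,
            # in an active-learning loop, add this configuration to the training set).
            return quantum_energy(configuration), "quantum"
        return energy, "surrogate"

    for _ in range(3):
        config = rng.standard_normal(6)                 # toy atomic configuration
        value, model_used = evaluate(config)
        print(f"energy={value:.3f} via {model_used}")
    ```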
    CESMIX researchers plan to disseminate their results via multiple open-source software projects, engaging their developer and user communities. The project will also support the work of postdocs, graduate students, and research scientists at MIT, with the overarching goal of creating new paradigms of practice for the next generation of computational scientists.

  • Study identifies reasons for soaring nuclear plant cost overruns in the U.S.

    A new analysis by MIT researchers details many of the underlying issues that have caused cost overruns on new nuclear power plants in the U.S., which have soared ever higher over the last five decades. The new findings may help the designers of new plants build in resilience to the factors that tend to cause these overruns, thus helping to bring down the costs of such plants.
    Many analysts believe nuclear power will play an essential part in reducing global emissions of greenhouse gases, and finding ways to curb these rising costs could be an important step toward encouraging the construction of new plants, the researchers say. The findings are being published today in the journal Joule, in a paper by MIT professors Jessika Trancik and Jacopo Buongiorno, along with former students Philip Eash-Gates SM ’19, Magdalena Klemun PhD ’20, Goksin Kavlak PhD ’18, and Research Scientist James McNerney.
    Among the surprising findings in the study, which covered 50 years of U.S. nuclear power plant construction data, was that, contrary to expectations, building subsequent plants based on an existing design actually costs more, not less, than building the initial plant.
    The authors also found that while changes in safety regulations could account for some of the excess costs, that was only one of numerous factors contributing to the overages.
    “It’s a known fact that costs have been rising in the U.S. and in a number of other locations, but what was not known is why and what to do about it,” says Trancik, who is an associate professor of energy studies in MIT’s Institute for Data, Systems, and Society. The main lesson to be learned, she says, is that “we need to be rethinking our approach to engineering design.”
    Part of that rethinking, she says, is to pay close attention to the details of what has caused past plant construction costs to spiral out of control, and to design plants in a way that minimizes the likelihood of such factors arising. This requires new methods and theories of technological innovation and change, which the team has been advancing over the past two decades.
    For example, many of the excess costs were associated with delays caused by the need to make last-minute design changes based on particular conditions at the construction site or other local circumstances, so if more components of the plant, or even the entire plant, could be built offsite under controlled factory conditions, such extra costs could be substantially cut.
    Specific design changes to the containment buildings surrounding the reactor could also help to reduce costs significantly, Trancik says. For example, substituting some new kinds of concrete in the massive structures could reduce the overall amount of the material needed, and thus slash the onsite construction time as well as the material costs.
    Many of the reasons behind the cost increases, Trancik says, “suggest that there’s a lack of resilience, in the process of constructing these plants, to variable construction conditions.” Those variations can come from safety regulations that are changing over time, but there are other reasons as well. “All of this points to the fact that there is a path forward to increasing resilience that involves understanding the mechanisms behind why costs increased in the first place.”
    Suppose, for example, that overall construction costs are very sensitive to upfront design costs: “If you’re having to go back and redo the design because of something about a particular site or a changing safety regulation, then if you build into your design that you have all of these different possibilities based on these things that could happen,” that can protect against the need for such last-minute redesign work.
    “These are soft cost contributions,” Trancik says, which have not tended to be prioritized in the typical design process. “They’re not hardware costs; they are changes to the environment in which the construction is happening. … If you build that into your engineering models and your engineering design process, then you may be able to avoid the cost increases in the future.”
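    As a toy illustration of what it means to build such contingencies into an engineering cost model, the sketch below compares the expected cost of a baseline design that must be reworked when site conditions or regulations change with a design that spends more up front to absorb those changes. The scenario probabilities and dollar figures are invented and are not taken from the study.
    ```python
    # Toy illustration (not the study's model): compare the expected cost of a
    # baseline plant design, which incurs redesign costs when conditions change,
    # with a "robust" design that spends more up front to absorb those changes.
    # All probabilities and dollar figures are invented.

    scenarios = {
        # name: (probability, redesign cost incurred by the baseline design, in $M)
        "no change":             (0.50,   0.0),
        "site-specific change":  (0.30, 400.0),
        "new safety regulation": (0.20, 900.0),
    }

    baseline_upfront = 1000.0   # $M for the initial design and construction plan
    robust_upfront = 1150.0     # $M, with extra upfront design work for contingencies

    baseline_expected = baseline_upfront + sum(
        probability * redesign_cost for probability, redesign_cost in scenarios.values()
    )
    robust_expected = robust_upfront  # assumed to avoid last-minute redesign entirely

    print(f"Expected cost, baseline design: ${baseline_expected:,.0f}M")
    print(f"Expected cost, robust design:   ${robust_expected:,.0f}M")
    ```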
    One approach, which would involve designing nuclear plants that could be built in factories and trucked to the site, has been advocated by many nuclear engineers for years. For example, rather than today’s huge nuclear plants, modular and smaller reactors could be completely self-contained and delivered to their final site with the nuclear fuel already installed. Numerous such plants could be ganged together to provide output comparable to that of larger plants, or they could be distributed more widely to reduce the need for long-distance transmission of the power. Alternatively, a larger plant could be designed to be assembled on site from an array of smaller factory-built subassemblies.
    “This relationship between the hardware design and the soft costs really needs to be brought into the engineering design process,” she says, “but it’s not going to happen without a concerted effort, and without being informed by modeling that accounts for these potential ballooning soft costs.”
    Trancik says that while some of the steps to control costs involve increased use of automated processes, these need to be considered in a societal context. “Many of these involve human jobs and it is important, especially in this time, where there’s such a need to create high-quality sustained jobs for people, this should also factor into the engineering design process. So it’s not that you need to look only at costs.” But the kind of analysis the team used, she says, can still be useful. “You can also look at the benefit of a technology in terms of jobs, and this approach to mechanistic modeling can allow you to do that.”
    The methodology the team used to analyze the causes of cost overruns could potentially also be applied to other large, capital-intensive construction projects, Trancik says, where similar kinds of cost overruns often occur.
    “One way to think about it is that you’re bringing more of the entire construction process into manufacturing plants, which can be much more standardized.” That kind of increased standardization is part of what has led, for example, to a 95 percent cost reduction in solar panels and in lithium-ion batteries over the last few decades, she says. “We can think of it as making these larger projects more similar to those manufacturing processes.”
    Buongiorno adds that “only by reducing the cost of new plants can we expect nuclear energy to play a pivotal role in the upcoming energy transformation.”
    The work was supported by the David and Lucile Packard Foundation and the MIT Energy Initiative.

  • Inequality across networks

    Eaman Jahani has thought about society and equality for a long time. His interest may have been shaped in part by his childhood in Iran, where he witnessed a constant struggle for social equity and progress. “I was reading political news constantly,” he remembers. 
    But all societies have struggles, and Jahani thinks if he’d grown up elsewhere, he’d be drawn to the same questions. “When I came to the United States, I started reading social theories of the struggle between labor and capital,” he says. “The U.S. also deals with challenges of how institutions reward and punish people differently.”
    Jahani came to the United States for college, getting first a bachelor’s and then a master’s degree in computer science. For a time, he was a data analyst for Google, and he’s now a PhD student in the Social and Engineering Systems program (SES) within the MIT Institute for Data, Systems, and Society (IDSS). The SES program provides training in both statistics and the social sciences, and student research examines the societal aspects of challenges across domains like energy, transportation, and health care, using the advanced tools of computing and data science.
    To augment his statistical training, Jahani chose to add the Interdisciplinary Doctoral Program in Statistics to his SES program. After completing additional coursework and integrating statistics into his advising and dissertation, Jahani will graduate with a PhD in social and engineering systems and statistics.
    “I wanted to do research with a social component, and sociology is getting more computational,” says Jahani. “IDSS was perfect. I wanted to apply statistical analysis to my research, but at the same time get training in social science methodology.”
    Information brokers
    Working with Alex “Sandy” Pentland, an MIT Media Lab and IDSS professor, and Dean Eckles, a Sloan School of Management professor and IDSS affiliate, Jahani explores how the structure of social networks can perpetuate inequality, especially the unequal distribution of resources. “Networks play an important role in access to opportunities, like employment information,” he says. “I study how networks reinforce and even widen existing inequalities.”
    Networks can be mapped out as nodes and connections, but as they get larger and more complex, even observing their structure requires statistical tools. One of the first components of Jahani’s dissertation was a study that used phone communication records, surveys, and income data to construct a network of about 33,000 individuals.
    “What we saw is that network diversity — the level of access to unique communities — only seemed to have positive effects for higher-income individuals. So maybe network effects can exacerbate existing inequalities.” Jahani then looked for more concrete causal evidence linking less-positive outcomes to network structure directly. In a randomized experiment, he found that where he seeded particular information in the network determined who would get access to it.
    To test this relationship between network structure and access to information, Jahani constructed a model that generates randomly structured networks. A key variable for him is brokerage, viewed in network terms as “variations in links across different groups.” In other words: Connections exist between high-income people and between low-income people, but what connections exist between these two groups? Jahani found that when a small number of low-income “brokers” provided most of the access to high-income people, outcomes for low-income people weren’t nearly as good as when the same number of connections were distributed more evenly across the network.
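    The brokerage comparison can be made concrete with a small simulation. The sketch below, which uses the networkx library with arbitrary group sizes, link counts, and a simple two-hop reachability rule rather than Jahani's actual model, contrasts how many low-income nodes can reach information originating in the high-income group when cross-group links run through a couple of brokers versus being spread across many.
    ```python
    # Illustrative simulation, not Jahani's model: measure how many "low-income"
    # nodes sit within two hops of any "high-income" node when cross-group links
    # are funneled through a few brokers versus spread across many.
    import random
    import networkx as nx

    HIGH = [f"h{i}" for i in range(20)]   # high-income group (information source)
    LOW = [f"l{i}" for i in range(20)]    # low-income group
    LOW_SET = set(LOW)

    def build_network(n_brokers: int, seed: int = 1) -> nx.Graph:
        rand = random.Random(seed)
        g = nx.Graph()
        g.add_nodes_from(HIGH + LOW)
        # within-group ties: each node links to a few random same-group peers
        for group in (HIGH, LOW):
            for node in group:
                for peer in rand.sample([n for n in group if n != node], 3):
                    g.add_edge(node, peer)
        # ten cross-group ties, concentrated in the first n_brokers low-income nodes
        brokers = LOW[:n_brokers]
        for i in range(10):
            g.add_edge(rand.choice(HIGH), brokers[i % len(brokers)])
        return g

    def low_income_reach(g: nx.Graph, hops: int = 2) -> int:
        """Count low-income nodes within `hops` steps of any high-income node."""
        reached = set()
        for source in HIGH:
            lengths = nx.single_source_shortest_path_length(g, source, cutoff=hops)
            reached |= LOW_SET.intersection(lengths)
        return len(reached)

    print("2 brokers: ", low_income_reach(build_network(n_brokers=2)))
    print("10 brokers:", low_income_reach(build_network(n_brokers=10)))
    ```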
    Looking for a gold mine
    Based on the findings of his observations and models, Jahani’s next step is to test network effects in a controlled way. The method: an online game, developed with the help of an Undergraduate Research Opportunities Program student, that recruits players who can receive real money based on their actions.
    “The basic goal of the game is to find a gold mine on a map,” says Jahani. “Finding the gold mine determines how much reward money you receive at the end.”
    Here’s the catch: Players are split into two groups, with one group getting a higher chance of receiving the gold mine’s location. Over several rounds of play, researchers can observe individual behavior in these two “networks.” Sharing the location of the gold mine will reduce your take for that round, but could encourage others in the network to share the knowledge with you in subsequent rounds of play.
    “We provide this mechanism for reciprocity,” explains Jahani, “and we simulate a ‘high status’ and a ‘low status’ group. Since there is a limited amount of ‘gold,’ it’s what we call a rivalrous resource. We’re going to make the network have more or less brokerage so we can see what sort of network structures lead to higher inter-group differences.”
    The experiment is ongoing, but preliminary results are confirming some of Jahani’s earlier observations. “We’re finding that if the chances of receiving information about the gold mine’s location are fairly low for a particular group, they are less likely to cooperate with each other.” There is plenty of incentive for “high-status” people to share with each other, since they expect their network to reciprocate. But if the people in your network are unlikely (or unable) to help you in the future, you can only help them at your own expense.
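    A back-of-the-envelope calculation suggests why that pattern would be expected. In the sketch below, whose payoffs and probabilities are invented rather than taken from the experiment, sharing the mine's location is worthwhile only when peers are likely enough to receive the location later and reciprocate.
    ```python
    # Back-of-the-envelope sketch with invented payoffs (not the experiment's
    # actual parameters): sharing the mine's location costs you part of this
    # round's reward, but pays off if peers are likely to receive the location
    # later and reciprocate.

    ROUND_REWARD = 10.0    # reward for reaching the gold mine in a round
    SHARING_COST = 3.0     # what sharing the location costs you this round
    FUTURE_ROUNDS = 5

    def expected_gain_from_sharing(p_peer_informed: float,
                                   p_reciprocate: float = 0.8) -> float:
        """Expected net benefit of sharing now, assuming peers who later receive
        the location reciprocate with probability p_reciprocate."""
        expected_future_help = (FUTURE_ROUNDS * p_peer_informed
                                * p_reciprocate * ROUND_REWARD)
        return expected_future_help - SHARING_COST

    # "High-status" group: peers often receive the location, so sharing pays off.
    print("high-status group:", expected_gain_from_sharing(p_peer_informed=0.4))
    # "Low-status" group: peers rarely receive it, so sharing is a net loss.
    print("low-status group: ", expected_gain_from_sharing(p_peer_informed=0.02))
    ```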
    Home on the range
    Jahani is also working with Eckles on a collaboration with Facebook studying other network effects. Specifically, they are looking at longer-range connections — when you are connected to someone but have no mutual connections, or even an indirect connection with their connections. These connections tend to be weak, but there has been some evidence of stronger long-range connections on social media.
    “We’re exploring that finding, considering factors like anonymous accounts, multiple accounts, and ways in which close ties might seem artificially to have high range,” says Jahani. “It can be a pitfall of big data to find interesting patterns that are merely an artifact of a platform. But the range of a tie is usually a good predictor of economic outcomes, so it’s valuable to identify the prevalence and characteristics of strong long-range ties.”
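    The range of a tie is often operationalized as the length of the shortest alternative path between its endpoints once the tie itself is removed. A minimal sketch of that computation, using networkx on a toy graph rather than the Facebook data, is below.
    ```python
    # Minimal sketch: the "range" of a tie as the shortest path length between its
    # endpoints after the tie itself is removed. Toy graph for illustration only.
    import networkx as nx

    g = nx.Graph()
    g.add_edges_from([("a", "b"), ("b", "c"), ("c", "d"), ("a", "d"), ("a", "e")])

    def tie_range(graph: nx.Graph, u, v) -> float:
        """Length of the shortest u-v path that does not use the edge (u, v)."""
        graph.remove_edge(u, v)
        try:
            return nx.shortest_path_length(graph, u, v)
        except nx.NetworkXNoPath:
            return float("inf")   # no alternative path at all
        finally:
            graph.add_edge(u, v)  # always restore the original graph

    print(tie_range(g, "a", "b"))  # 3: the alternative path a-d-c-b
    print(tie_range(g, "a", "e"))  # inf: "e" has no other route to "a"
    ```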
    Jahani’s dissertation keeps him busy, as have other efforts in data analysis, from co-organizing a two-week summer institute for computational social science to winning first place in the Fragile Families Challenge by building a model that uses data to predict various outcomes for American families. “And I ski and hike,” he adds. “My summer hiking in Italy is so important, we even went this year, buying tickets as soon as negative coronavirus tests came back.”
    Jahani has begun his own search for future employment and plans to build on his work at IDSS. “Institutional norms can also affect unequal distribution, and I’m interested in looking into that in the future.”
    Hopefully his network will help him out.

  • Every vote counts for this math student

    Record voter turnout is predicted in the U.S. elections this year, but will voters arrive at the polls, or the early-voting ballot box, with informed opinions? And are more-informed voters more likely to vote? That’s a problem that math doctoral candidate Ashwin Narayan decided to work on this semester.
    Narayan had moved home to New Jersey following MIT’s shutdown in the spring, and over the summer he started to look for work in progressive data science. “Because of all the Covid-related upheaval at MIT and in the world, I felt I would struggle with focusing on my thesis,” he recalls. Shifting the completion of his PhD to September 2021, he signed on at the nonpartisan national voter education organization BallotReady to work on its CivicEngine platform.
    With just days before the election, Narayan described his virtual workplace with a single word that few others working from home can use: exciting.
    “The adrenaline is pumping, the caffeine is flowing, the nerves are wracked, and the tension is high,” he says. “I’ve been interested in politics for quite a while, but it was definitely a passive interest, mostly just reading a lot of news. But this election, I really felt that I wanted to do something more, to be more active and work towards some immediate impact.”
    Founded five years ago out of the University of Chicago, BallotReady works with customers, from state parties to companies like Snapchat and the Miami Heat, to help provide unbiased information about candidates and ballot initiatives in order to encourage and educate voters. In an internal analysis that BallotReady commissioned a few years ago from a team of researchers in the MIT Department of Economics, the researchers found that BallotReady users were 20 percentage points more likely to vote than nonusers, based on turnout in Kentucky’s 2015 general election. The authors of that study, MIT postdoc Cory Smith, Enrico Cantoni, and Donghee Jo, are working on more recent data to figure out how the site’s tools affect turnout, says Narayan.
    Narayan’s role as an electoral fellow is to manage the interface between the campaigns and organizations and one of the country’s largest elections databases. Specifically, his focus is on the “Make a Plan to Vote” tool, which informs voters on how to vote by mail, drop box, or in person, whether by early vote or on Election Day.
    “Questions about mail-in ballots and early voting have been highest on people’s minds,” says Narayan. “The database we have compiled now has data about mail-in voting regulations — how to get a ballot and when to return it by,  locations for ballot drop boxes, and early voting polling locations for nearly every address in the country.”
    Users can also access information about candidates and ballot questions using a customizable, mobile-friendly voter guide. To avoid bias, information is linked to a source, information is aggregated instead of interpreted, and candidates are listed in alphabetical order. The site also collects endorsements, a candidate’s experience, and stances on issues, based on what they’ve said in debates as covered in the news, or from the candidates’ websites. While BallotReady doesn’t monetize its voter-facing site, the CivicEngine platform that Narayan is working on does sell products to drive turnout for its customers.
    “For me personally, I find the chance to work with such comprehensive data about elections in the U.S. a fascinating opportunity to shed light on how to make voting easier. We look at the presence of drop-box locations, the restrictiveness of mail-in policies, the number of candidates on a ballot, how many races are uncontested, and so on and so forth, and, based on post-election statistics on turnout, can draw connections between the number of voters and the policies.”
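    In its simplest form, the kind of analysis Narayan describes could be an ordinary least-squares fit of turnout against voting-access features. The sketch below is purely illustrative; the feature values and turnout numbers are made-up placeholders, not BallotReady or election data.
    ```python
    # Illustrative sketch only: relate turnout to voting-access features with a
    # least-squares fit. All numbers are made-up placeholders, not real data.
    import numpy as np

    # columns: drop boxes per 10k voters, mail-in restrictiveness (0-1),
    # number of uncontested races on the ballot
    features = np.array([
        [1.2, 0.8, 5],
        [3.5, 0.2, 2],
        [0.4, 0.9, 7],
        [2.8, 0.3, 3],
        [1.9, 0.5, 4],
    ])
    turnout = np.array([0.52, 0.66, 0.47, 0.63, 0.58])  # fraction of eligible voters

    # add an intercept column and solve the least-squares problem
    X = np.hstack([np.ones((features.shape[0], 1)), features])
    coefficients, *_ = np.linalg.lstsq(X, turnout, rcond=None)

    names = ["intercept", "drop boxes", "restrictiveness", "uncontested races"]
    for name, value in zip(names, coefficients):
        print(f"{name}: {value:+.3f}")
    ```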
    “The mission of BallotReady appealed to me; informing every voter about every aspect of their ballot is such a fundamental tenet of democracy that it should be entirely nonpartisan,” he adds. “It was important to me to work not just with a group of data scientists, but with a group that has deep knowledge of political campaigns, policy, and organizing, who can motivate the key questions to ask with their deep domain knowledge.”
    His work with BallotReady will extend through December. Because BallotReady launched in 2016 on a regional basis, and their 2018 expansion nationally was not in an election year, Narayan will be helping the company to debrief what information they collected, and to prepare for future elections. “We are hoping to analyze where our users come from, how popular our various tools are, and go through feedback from customers to figure out exactly what they liked about their data and hope for in future elections,” he says.
    While he may have taken a semester off officially, his work is aligned with his studies, which focus on how policy and society interact with data science. He has taken law school courses addressing regulation around data and privacy, recently contributed an article on the impact of AI on existing health-care privacy protections for MIT Science Policy Review, and works with his advisor, Professor Bonnie Berger, to develop statistically motivated algorithms to analyze large biological datasets.
    “I do think I was well-prepared for the work I’m doing now because of my research,” he says. “I’ve spent the past four years working with biologists to figure out the right questions to ask to better understand massive, noisy datasets, and the political world is extremely analogous: It’s not only that the data are noisy and hard to compile, but also it’s crucial to work with the experts to figure out the right questions.”

  • An interdisciplinary approach to sustainable PPE

    “Crisis moments can be the best time for collective trust building,” says Jarrod Goentzel, principal research scientist and lecturer for the Center for Transportation and Logistics (CTL) and director of the MIT Humanitarian Supply Chain Lab. “People’s minds are open in unique ways during crisis, so it’s a good time to shape our mindset for moving forward.”
    Goentzel is referring to the double crisis that struck the United States earlier this year: the Covid-19 pandemic and the resulting personal protective equipment (PPE) shortage. In response, MIT was agile — collecting, fundraising, and facilitating the purchase of PPE donations (i.e., gloves, face masks, face coverings, gowns, face shields, sanitizing wipes) for front-line workers at MIT and beyond. Now, as campus repopulates with a percentage of students, staff, faculty, and researchers, Goentzel is part of an interdisciplinary research team convened by the MIT Office of Sustainability (MITOS) to “shape mindsets” and identify sustainable procurement and sourcing strategies for PPE going forward. 
    The team was brought together as part of the newest campus-as-a-test bed research project through the Campus Sustainability Incubator Fund, administered by MITOS, which seeks to enable MIT community members to use the campus itself for research in sustainable operations, management, and design. By testing ideas on campus, the project uniquely connects researchers and operational staff, allowing for immediate feedback and application of findings at MIT.
    “It has been immensely valuable to have the incredible response, support, and partnership of MIT’s research community during this time of crisis,” says Christina Lo, director of strategic sourcing and contracts in the Office of the Vice President for Finance (VPF). “The work of the cross-functional PPE donation team led by [director of the Institute for Medical Engineering and Science and Edward J. Poitras Professor in Medical Engineering and Science] Elazer Edelman was the impetus that helped kick off a timely decision to centrally source, procure, and provide PPE and other essential supplies to our entire campus community,” Lo explains.
    That partnership quickly connected Lo and VPF with the Sustainability Incubator Fund team, who began offering data-driven approaches for strategically securing and distributing products needed by the MIT community. “By sharing knowledge, information and data, we have established a collaborative framework that we hope will continue beyond this current crisis. By bringing together dedicated individuals and experts from across our administrative and research units, we are building community to better serve our community,” she adds.
    This operational/research partnership has also allowed the team to work across different scales and time frames. “The beginning of this idea was ‘What are the challenges MIT is going to face due to Covid-19?’ There is the reopening of campus in the near term, but in the long term we need to look at the sustainability dimensions more broadly,” says MIT Sloan School of Management visiting Associate Professor Valerie Karplus, also an associate professor at Carnegie Mellon University since September. She, along with Goentzel and CTL Research Scientist and Director of MIT Sustainable Supply Chains Alexis Bateman; graduate research assistant Molly McGuigan; Institute for Data, Systems, and Society (IDSS) PhD candidate Mandy Wu; master of applied science in supply chain management students Song Gao and Kelly Sorel; and MITOS faculty fellow and Concrete Sustainability Hub Executive Director Jeremy Gregory round out the research team.
    The challenge the team faces is common: In times of crisis, cost and speed of procurement take precedence over the environmental and human health impacts of essential items like PPE. With their forthcoming strategy suggestions, the team hopes to change that. “The real opportunity coming out of this is that by doing all this pre-work, when we go into another emergency, the sustainability impact of a product can be considered a priority without affecting performance,” explains McGuigan.
    McGuigan, like the rest of the team, is uniquely skilled at addressing issues related to PPE — her research has focused on supply chains, and as an Army service member she worked on PPE sourcing in Liberia during the Ebola outbreak. Goentzel also supported PPE procurement during that outbreak and, along with Bateman, was most recently focused on supply chains impacted by the Covid-19 pandemic. Karplus, meanwhile, has been active in MIT working groups for both PPE donations and policy.
    “One of the strengths of MITOS is the ability to work with partners across campus in sourcing sustainability solutions. Because of this, we can see opportunities for collaboration that might be missed. This was a great opportunity to connect three distinct research groups that consider PPE supply chain to distribution to disposal and pair them with operational partners to develop a baseline understanding and future sustainability solution for MIT. Ultimately, we will share their findings to inform similar PPE procurement to disposal at other campuses,” says Director of Sustainability Julie Newman.
    While the team works to identify and craft a strategy for future sustainable PPE procurement policies, they continue to offer real-time feedback and insight to the operational side of MIT. Working with procurement, facilities, and maintenance, the team has applied insights to the newly established MIT Covid-19 Store, a centralized database that allows departments, labs, and centers (DLCs) to order and receive, from within MIT, the PPE they need to maintain their operations safely. As DLCs request supplies, the team is carefully tracking data, forecasting, and working to generate suggested amounts to help purchasers — many of whom had never purchased PPE before — make decisions. “Everyone in procurement has been working at lightning speed to get everything as fast as possible. But we’re able to do analysis for them and to feed that back into the Covid-19 Store to say ‘Do we have enough?’ ‘What projects should we be focused on?’” explains McGuigan.
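    A very simple version of such a suggested-order calculation could be a moving-average forecast with a small safety buffer, as in the sketch below. The usage history, item names, and buffer factor are hypothetical and do not reflect the team's actual model.
    ```python
    # Hypothetical sketch (not the team's actual model): suggest next week's PPE
    # order for a lab from recent weekly usage, with a small safety buffer.
    from statistics import mean, pstdev

    weekly_usage = {  # made-up weekly counts of items drawn from the Covid-19 Store
        "face masks": [120, 140, 135, 150],
        "gloves (boxes)": [30, 28, 35, 33],
        "sanitizing wipes": [10, 14, 12, 16],
    }

    SAFETY_FACTOR = 1.0  # buffer, in standard deviations of recent weekly demand

    def suggested_order(history):
        recent = history[-4:]
        forecast = mean(recent)                    # simple 4-week moving average
        buffer = SAFETY_FACTOR * pstdev(recent)    # guard against week-to-week swings
        return round(forecast + buffer)

    for item, history in weekly_usage.items():
        print(f"{item}: order about {suggested_order(history)} next week")
    ```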
    This process of helping DLCs identify how much PPE they need, the team hopes, will impact behavior as well. “Some of this process is about building trust. It’s designed so that everyone is going through this similar process for purchasing PPE, so those buying can believe others are acting in a way that is fair and equitable by design,” explains Goentzel, noting that this behavioral approach allows better allocation of scarce resources while avoiding over-ordering and waste.
    Gregory is careful to add that one caveat of providing future strategies is that the challenges of behavior go beyond sheer volume: “We can go through a bunch of research and identify the most sustainable options in this space, but there is a whole other challenge around actually getting people to make those selections,” he says. As the team continues their work, the hope is that findings from this research and the tests in practice will be used to inform decisions on campus as well as far beyond, offering insight to government and institutions at all levels.
    The research in the project is ongoing with a final report delivery date in 2021. Students are encouraged to apply to work with the team and further the research via UROP. For campus inquiries about how to participate in this work and/or general questions about this research project, please contact Alexis Bateman.

  • Emery Brown wins Swartz Prize for Theoretical and Computational Neuroscience

    The Society for Neuroscience (SfN) announced today that it has awarded the Swartz Prize for Theoretical and Computational Neuroscience to Emery N. Brown, the Edward Hood Taplin Professor of Medical Engineering and Computational Neuroscience at MIT.
    Brown, a member of The Picower Institute for Learning and Memory and the Institute for Medical Engineering and Science, as well as the Warren M. Zapol Professor at Harvard Medical School, is a neuroscientist, a statistician, and a practicing anesthesiologist at Massachusetts General Hospital. His research has produced principled and efficient new methods for decoding patterns of neural and brain network activity and has advanced neuroscientific understanding of how anesthetics affect the brain, which can improve patient care.
    “Dr. Brown’s seminal scientific contributions to neural signal processing and the theory of anesthetic mechanisms, together with his service as an educator and a physician, make him highly deserving of the 2020 Swartz Prize,” SfN President Barry Everitt said in a press release announcing the award. “Dr. Brown has demonstrated an unusually broad knowledge of neuroscience, a deep understanding of theoretical and computational tools, and an uncanny ability to find explanatory simplicity lurking beneath complicated observational phenomena.”
    In its announcement, the world’s largest neuroscience organization elaborated on the breadth and depth of Brown’s influence in many lines of research.
    “Brown’s insights and approaches have been critical to the development of some of the first models estimating functional connectivity among a group of simultaneously recorded neurons,” SfN’s announcement stated. “He has contributed statistical methods to analyze recordings of circadian rhythms and signal processing methods to analyze neuronal spike trains, local field potentials and EEG recordings.”
    With regard to anesthesiology, the statement continued: “Brown has proposed that the altered arousal states produced by the principal classes of anesthetics can be characterized by analyzing the locations of their molecular targets, along with the anatomy and physiology of the circuits that connect these locations. Overall, his systems neuroscience paradigm, supported by mechanistic modeling and cutting-edge statistical evaluation of evidence, is transforming anesthesiology from an empirical, clinical practice into a principled neuroscience-based discipline.”
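    As a simplified illustration of the kind of spike-train signal processing mentioned above, the sketch below simulates a Poisson spike train with a slowly varying rate and recovers that rate by smoothing the spike counts. It is a generic example, not Brown's state-space point-process methods.
    ```python
    # Generic sketch, not Brown's state-space point-process methods: simulate a
    # Poisson spike train with a slowly varying rate and recover the rate by
    # smoothing the spikes with a Gaussian window.
    import numpy as np

    rng = np.random.default_rng(42)
    dt = 0.001                                          # 1 ms bins
    t = np.arange(0, 5, dt)                             # 5 seconds of "recording"
    true_rate = 10 + 8 * np.sin(2 * np.pi * 0.5 * t)    # firing rate in Hz

    # simulate spikes: in each small bin, P(spike) is approximately rate * dt
    spikes = rng.random(t.size) < true_rate * dt

    # estimate the rate by convolving the spike train with a normalized Gaussian
    window_sd = 0.100                                   # 100 ms smoothing window
    kernel_t = np.arange(-3 * window_sd, 3 * window_sd + dt, dt)
    kernel = np.exp(-0.5 * (kernel_t / window_sd) ** 2)
    kernel /= kernel.sum() * dt                         # so the estimate is in Hz
    estimated_rate = np.convolve(spikes, kernel, mode="same")

    print(f"mean true rate: {true_rate.mean():.1f} Hz, "
          f"mean estimated rate: {estimated_rate.mean():.1f} Hz")
    ```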
    Brown says the recognition made him thankful for the chances his research, teaching, and medical practice have given him to work with colleagues and students.
    “Receiving the Swartz Prize is a great honor,” he says. “The prize recognizes my group’s work to characterize more accurately the properties of neural systems by developing and applying statistical methods and signal processing algorithms that capture their dynamical features. It further recognizes our efforts to uncover the neurophysiological mechanisms of how anesthetics work, and to translate those insights into new practices for managing patients receiving anesthesia care.
    “Finally,” he adds, “receipt of the Swartz Prize makes me eternally grateful for the outstanding colleagues, graduate students, postdocs, undergraduates, research assistants, and staff with whom I have had the good fortune to work.”
    The prize, which includes $30,000, was awarded during SfN’s Awards Announcement Week, Oct. 26-29.

  • Silencing gene expression to cure complex diseases

    Many people think of new medicines as bullets, and in the pharmaceutical industry, frequently used terms like “targets” and “hits” reinforce that idea. Immuneering co-founder and CEO Ben Zeskind ’03, PhD ’06 prefers a different analogy.
    His company, which specializes in bioinformatics and computational biology, sees many effective drugs as working more like noise-canceling headphones.
    Rather than focusing on the DNA and proteins involved in a disease, Immuneering focuses on disease-associated gene signaling and expression data. The company is trying to cancel out those signals like a pair of headphones blocks out unwanted background noise.
    The approach is guided by Immuneering’s decade-plus of experience helping large pharmaceutical companies understand the biological mechanisms behind some of their most successful medicines.
    “We started noticing some common patterns in terms of how these very successful drugs were working, and eventually we realized we could use these insights to create a platform that would let us identify new medicine,” Zeskind says. “[The idea is] to not just make existing medicines work better but also to create entirely new medicines that work better than anything that has come before.”
    In keeping with that idea, Immuneering is currently developing a bold pipeline of drugs aimed at some of the most deadly forms of cancer, in addition to other complex diseases that have proven difficult to treat, like Alzheimer’s. The company’s lead drug candidate, which targets a protein signaling pathway associated with many human cancers, will begin clinical trials within the year.
    It’s the first of what Immuneering hopes will be a number of clinical trials enabled by what the company calls its “disease-canceling technology,” which analyzes the gene expression data of diseases and uses computational models to identify small-molecule compounds likely to bind to disease pathways and silence them.
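    One generic way to think about scoring compounds against a disease expression signature, in the spirit of the noise-canceling analogy, is to rank them by how strongly their induced expression changes anti-correlate with the disease signature. The sketch below illustrates that idea with made-up gene values; it is not Immuneering's disease-canceling technology.
    ```python
    # Generic illustration only, not Immuneering's disease-canceling technology:
    # score candidate compounds by how strongly the expression changes they induce
    # anti-correlate with a disease signature. Gene values are made up.
    import numpy as np

    # disease signature: log-fold change of each gene in diseased vs. healthy tissue
    disease_signature = np.array([2.1, -1.5, 0.9, -0.4, 1.8])

    # expression changes each hypothetical compound induces in a cell model
    compound_signatures = {
        "compound_1": np.array([-1.9, 1.4, -0.8, 0.3, -1.6]),  # roughly reverses it
        "compound_2": np.array([1.8, -1.2, 1.0, -0.5, 1.5]),   # mimics the disease
        "compound_3": np.array([0.1, 0.2, -0.1, 0.0, 0.1]),    # little effect
    }

    def reversal_score(disease, compound):
        """Negative Pearson correlation: higher means the compound's expression
        changes run opposite to the disease signature."""
        return -float(np.corrcoef(disease, compound)[0, 1])

    ranked = sorted(compound_signatures,
                    key=lambda name: reversal_score(disease_signature,
                                                    compound_signatures[name]),
                    reverse=True)
    for name in ranked:
        score = reversal_score(disease_signature, compound_signatures[name])
        print(f"{name}: reversal score = {score:+.2f}")
    ```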
    “Our most advanced candidates go after the RAS-RAF-MEK [protein] pathway,” Zeskind explains. “This is a pathway that’s activated in about half of all human cancers. This pathway is incredibly important in a number of the most serious cancers: pancreatic, colorectal, melanoma, lung cancer — a lot of the cancers that have proven tougher to go after. We believe this is one of the largest unsolved problems in human cancer.”
    A good foundation
    As an undergraduate, Zeskind participated in the MIT $100K Entrepreneurship Competition (the $50K back then) and helped organize some of the MIT Enterprise Forum’s events around entrepreneurship.
    “MIT has a unique culture around entrepreneurship,” Zeskind says. “There aren’t many organizations that encourage it and celebrate it the way MIT does. Also, the philosophy of the biological engineering department, of taking problems in biology and analyzing them quantitatively and systematically using principles of engineering, that philosophy really drives our company today.”
    Although his PhD didn’t focus on bioinformatics, Zeskind’s coursework did involve some computational analysis and offered a primer on oncology. One course in particular, taught by Doug Lauffenburger, the Ford Professor of Biological Engineering, Chemical Engineering, and Biology, resonated with him. The class tasked students with uncovering some of the mechanisms of the interleukin-2 (IL-2) protein, a molecule found in the immune system that’s known to severely limit tumor growth in a small percentage of people with certain cancers.
    After Zeskind earned his MBA at Harvard Business School in 2008, he returned to MIT’s campus to talk to Lauffenburger about his idea for a company that would decipher the reasons for IL-2’s success in certain patients. Lauffenburger would go on to join Immuneering’s advisory board.
    Of course, due to the financial crisis of 2007-08, that proved to be difficult timing for launching a startup. Without easy access to capital, Zeskind approached pharmaceutical companies to show them some of the insights his team had gained on IL-2. The companies weren’t interested in IL-2, but they were intrigued by Immuneering’s process for uncovering the way it worked.
    “At first we thought, ‘We just spent a year figuring out IL-2 and now we have to start from scratch,’” Zeskind recalls. “But then we realized it would be easier the second time around, and that was a real turning point because we realized the company wasn’t about that specific medicine, it was about using data to figure out mechanism.”
    In one of the company’s first projects, Immuneering uncovered some of the mechanisms behind an early cancer immunotherapy developed by Bristol-Myers Squibb. In another, they studied the workings of Teva Pharmaceuticals’ drug for multiple sclerosis.
    As Immuneering continued working on successful drugs, they began to notice some counterintuitive patterns.
    “A lot of the conventional wisdom is to focus on DNA,” Zeskind says. “But what we saw over and over across many different projects was that transcriptomics, or which genes are turned on when — something you measure through RNA levels — was the thing that was most frequently informative about how a drug was working. That ran counter to conventional wisdom.”
    In 2018, as Immuneering continued helping companies appreciate that idea in drugs that were already working, it decided to start developing medicines designed from the start to go after disease signals.
    Today the company has drug pipelines focused around oncology, immune-oncology, and neuroscience. Zeskind says its disease-canceling technology allows Immuneering to launch new drug programs about twice as fast and with about half the capital as other drug development programs.
    “As long as we have a good gene-expression signature from human patient data for a particular disease, we’ll find targets and biological insights that let us go after them in new ways,” he says. “It’s a systematic, quantitative, efficient way to get those biological insights compared to a more traditional process, which involves a lot of trial and error.”
    An inspired path
    Even as Immuneering advances its drug pipelines, its bioinformatics services business continues to grow. Zeskind attributes that success to the company’s employees, about half of whom are MIT alumni — the continuation of a trend that began in the early days of the company, when Immuneering was mostly made up of recent MIT PhD graduates and postdocs.
    “We were sort of the Navy SEALs of bioinformatics, if you will,” Zeskind says. “We’d come in with a small but incredibly well-trained team that knew how to make the most of the data they had available.”
    In fact, it’s not lost on Zeskind that his analogy of drugs as noise-canceling headphones has a distinctively MIT spin: He was inspired by longtime MIT professor and Bose Corporation founder Amar Bose.
    And Zeskind’s attraction to MIT came long before he ever set foot on campus: growing up, he and his sister Julie ’01, SM ’02 were encouraged by their father, Dale Zeskind ’76, SM ’76, to attend MIT.
    Unfortunately, Dale passed away recently after a battle with cancer. But his influence, which included helping to spark a passion for entrepreneurship in his son, is still being felt. Other members of Immuneering’s small team have also lost parents to cancer, adding a personal touch to the work they do every day.
    “Especially in the early days, people were taking more risk [joining us over] a large pharma company, but they were having a bigger impact,” Zeskind says. “It’s all about the work: looking at these successful drugs and figuring out why they’re better and seeing if we can improve them.”
    Indeed, even as Immuneering’s business model has evolved over the last 12 years, the company has never wavered in its larger mission.
    “There’s been a ton of great progress in medicine, but when someone gets a cancer diagnosis, it’s still, more likely than not, very bad news,” Zeskind says. “It’s a real unsolved problem. So by taking a counterintuitive approach and using data, we’re really focused on bringing forward medicines that can have the kind of durable responses that inspired us all those years ago with IL-2. We’re really excited about the impact the medicines we’re developing are going to have.”

  • 3 Questions: Adam Berinsky on how to assess election polls

    As we approach Election Day 2020, all eyes are on polls — but how accurate are they?
    A specialist in political behavior and public opinion, Adam Berinsky is the Mitsui Professor of Political Science at MIT and director of the MIT Political Experiments Research Lab. He is the author of “In Time of War: Understanding American Public Opinion from World War II to Iraq” (University of Chicago Press, 2009) and “Silent Voices: Public Opinion and Political Participation in America” (Princeton University Press, 2004). Here, he speaks on trusting political polls as well as the spread of misinformation in advance of Election Day.
    Q: Pollsters predicted a Clinton win in 2016. Since the polls were wrong then, can people trust what they are seeing in the polls today?
    A: First of all, the problems in 2016 were much less widespread than most people realize. The national-level polls then actually did a pretty good job of predicting the popular vote. They said Hillary Clinton would win by between 3 percent and 4 percent, and she did win the popular vote by 2 percent.
    The real problem was that some of the state polls were off — especially in key Midwestern states — and we don’t actually have a national election for president in the United States. Thanks to the electoral college system, we have a series of state elections, and the winner that emerges from those elections becomes president. In 2016, some states were under-polled for a couple of reasons, and that led to misjudgments about who would win the electoral college and thereby the presidency.
    The 2016 election was also unusual in that it revealed a new pattern in American voting. In contrast to other recent presidential contests, the education level of voters proved a significant factor in determining the outcome of the election; low-education voters split from the highly educated to support Donald Trump more than they had backed previous Republican candidates. Since highly educated voters are typically over-represented in polls (they have proved more likely to answer surveys), this split led to an undercount of Trump’s supporters that year.
    This education gap was a surprise in 2016, but today it’s being factored into how the polling results are reported. And, while it’s conceivable that a new cleavage could emerge this year, I think that’s unlikely. There’s a reason why pollsters use past trends to predict elections: Most of the time, it works.
    Q: What is the best way for people to find trustworthy polling data and to interpret the findings that are reported?
    A: The most important advice I can give is to look at all the polls, not just one or two. (I recommend poll aggregation sites Real Clear Politics, Pollster, and FiveThirtyEight.) Also, watch how particular polls trend over time. This is important not only because each poll represents a snapshot in time, but also because the accuracy of any poll relies largely on the skill of the pollster, who draws conclusions from the raw data derived by polling a sampling of voters. (Sampling is used because contacting everyone in the country is impractical; the reported margin of error reflects this built-in element of estimation.)
    In a well-constructed scientific poll, the pollster takes several steps to get accurate results, including making adjustments for disparities between those polled and the known population (giving extra weight to the young people who responded, for example, if the pool of respondents skews older than the actual population — as often happens). Pollsters also employ modeling to make their best estimate of who among those polled is likely to vote (a calculation based on such factors as how interested the respondent is in the election and whether they voted in previous elections).
    So, how can you get the most from the polling information you see reported? First, separate out the poll itself from the predictions that people like Nate Silver at 538.com or the staff at The Economist make from the polling data. An analyst might look at three polls, see Biden leading, and predict a Democratic win. But if you drill down and see Biden was only leading by 2 or 3 percentage points — a figure within the margin of error — then it’s clear the race is really too close to call.
    Second, look at both the national polls and the state polls. National polls, which tend to be conducted more frequently, can give us a good picture about changes in the overall favorability of a particular candidate. (For example, in 2016, national polling showed steadily declining support for Hillary Clinton from mid-October through the end of the election.) But also pay attention to state polls. Fewer polls are conducted at the state level, but given that the election could come down to the results of a few key states, it’s important to see what is going on there.
    Q: Your research has shown that misinformation often spreads faster than truth. Given the level of disinformation circulating in this year’s campaign — particularly related to the security of the election process itself — can you recommend any countermeasures?
    A: When I think of all the conflict and misinformation going on today, I try to remember that we have been here before as a nation. Politics in the early part of the 19th century was also rife with rumors and misinformation. In the 1830s, for example, presidential contests were marred by conspiracy theories and traded accusations of immorality, corruption, and misuse of power. That the United States survived this history suggests that there is nothing new here.
    However, that doesn’t mean that we should simply shrug our shoulders at what’s going on today. We need to think about the role that politicians can play in making things worse, and we need to take responsibility for whom we put in power. That’s because it’s the job of our leaders to stand up and challenge unsubstantiated rumors and outright falsehoods. When one leader becomes a demagogue, others need to take action.
    When a candidate says the election is rigged — before it’s even taken place — other leaders need to stand up and rein in that misinformation. That includes not just members of the opposing party, but politicians from across the political spectrum. If you have a contested election, and the loser accepts the verdict with the knowledge that he or she could be the victor next time, that’s democracy at work. But it’s incredibly dangerous when a candidate asserts that, whatever happens, if he doesn’t win that means the system is corrupt.
    Spreading rumors and misinformation is not just a Republican problem, however. Today we are seeing conspiracy theories on both sides being amplified by politicians and then picked up by the mainstream media. Our leaders can choose to whip up conspiracy theories or tamp them down.
    Arguably, the political parties could do more to create saner messaging. They have the power to say, “Even if we can win with this guy, maybe we shouldn’t.” Broadcast television and social media also have roles to play, because they spread information through the masses. But politicians have tremendous power to lead and to shape that information, and voters need to remember this when they head to the polls — in this and every election.
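    Two of the quantitative ideas in the polling discussion above, reweighting respondents to match the population and computing a margin of error for a sample proportion, can be illustrated with a toy calculation. All of the numbers below are made up.
    ```python
    # Toy illustration of two ideas mentioned above: reweighting respondents to
    # match the population, and the margin of error of a sample proportion.
    # All numbers are made up.
    import math

    sample = {  # age group: (respondents, supporters of the candidate)
        "18-34": (150, 90),
        "35-64": (450, 225),
        "65+":   (400, 180),
    }
    population_share = {"18-34": 0.30, "35-64": 0.50, "65+": 0.20}

    n = sum(count for count, _ in sample.values())
    raw_support = sum(supporters for _, supporters in sample.values()) / n

    # reweight each age group so it counts in proportion to its population share
    weighted_support = sum(
        population_share[group] * (supporters / count)
        for group, (count, supporters) in sample.items()
    )

    # 95 percent margin of error for a simple random sample of size n
    margin = 1.96 * math.sqrt(weighted_support * (1 - weighted_support) / n)

    print(f"raw support:      {raw_support:.1%}")
    print(f"weighted support: {weighted_support:.1%} ± {margin:.1%}")
    ```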