More stories

  • Day of AI curriculum meets the moment

    MIT Responsible AI for Social Empowerment and Education (RAISE) recently celebrated the second annual Day of AI with two flagship local events. The Edward M. Kennedy Institute for the U.S. Senate in Boston hosted a human rights and data policy-focused event that was streamed worldwide. Dearborn STEM Academy in Roxbury, Massachusetts, hosted a student workshop in collaboration with Amazon Future Engineer. With over 8,000 registrations across all 50 U.S. states and 108 countries in 2023, participation in Day of AI has more than doubled since its inaugural year.

    Day of AI is a free curriculum of lessons and hands-on activities created by researchers at MIT RAISE to teach kids of all ages and backgrounds the basics of artificial intelligence and how to use it responsibly. This year, resources were available for educators to run at any time and in any increments they chose. The curriculum included five new modules to address timely topics like ChatGPT in School, Teachable Machines, AI and Social Media, Data Science and Me, and more. A collaboration with the International Society for Technology in Education also introduced modules for early elementary students. Educators across the world shared photos, videos, and stories of their students’ engagement, expressing excitement and even relief over the accessible lessons.

    Professor Cynthia Breazeal, director of RAISE, dean for digital learning at MIT, and head of the MIT Media Lab’s Personal Robots research group, said, “It’s been a year of extraordinary advancements in AI, and with that comes necessary conversations and concerns about who and what this technology is for. With our Day of AI events, we want to celebrate the teachers and students who are putting in the work to make sure that AI is for everyone.”

    Reflecting community values and protecting digital citizens

    On May 18, 2023, MIT RAISE hosted a global Day of AI celebration featuring a flagship local event focused on human rights and data policy at the Edward M. Kennedy Institute for the U.S. Senate. Students from the Warren Prescott Middle School and New Mission High School heard from speakers from the City of Boston, Liberty Mutual, and MIT, who discussed the many benefits and challenges of artificial intelligence education. Video: MIT Open Learning

    MIT President Sally Kornbluth welcomed students from Warren Prescott Middle School and New Mission High School to the Day of AI program at the Edward M. Kennedy Institute. Kornbluth reflected on the exciting potential of AI, along with the ethical considerations society needs to address.

    “AI has the potential to do all kinds of fantastic things, including driving a car, helping us with the climate crisis, improving health care, and designing apps that we can’t even imagine yet. But what we have to make sure it doesn’t do is cause harm to individuals, to communities, to us — society as a whole,” she said.

    This theme resonated with each of the event speakers, whose jobs spanned the sectors of education, government, and business. Yo Deshpande, technologist for the public realm, and Michael Lawrence Evans, program director of new urban mechanics from the Boston Mayor’s Office, shared how Boston thinks about using AI to improve city life in ways that are “equitable, accessible, and delightful.” Deshpande said, “We have the opportunity to explore not only how AI works, but how using AI can line up with our values, the way we want to be in the world, and the way we want to be in our community.”

    Adam L’Italien, chief innovation officer at Liberty Mutual Insurance (one of Day of AI’s founding sponsors), compared our present moment with AI technologies to the early days of personal computers and internet connection. “Exposure to emerging technologies can accelerate progress in the world and in your own lives,” L’Italien said, while recognizing that the AI development process needs to be inclusive and mitigate biases.

    Human policies for artificial intelligence

    So how does society address these human rights concerns about AI? Marc Aidinoff ’21, former White House Office of Science and Technology Policy chief of staff, led a discussion on how government policy can influence the parameters of how technology is developed and used, like the Blueprint for an AI Bill of Rights. Aidinoff said, “The work of building the world you want to see is far harder than building the technical AI system … How do you work with other people and create a collective vision for what we want to do?” Warren Prescott Middle School students described how AI could be used to solve problems that humans couldn’t. But they also shared concerns about AI’s potential effects on data privacy, learning, social media addiction, job displacement, and propaganda.

    In a mock U.S. Senate trial activity designed by Daniella DiPaola, PhD student at the MIT Media Lab, the middle schoolers investigated what rights might be undermined by AI in schools, hospitals, law enforcement, and corporations. Meanwhile, New Mission High School students workshopped the ideas behind bill S.2314, the Social Media Addiction Reduction Technology (SMART) Act, in an activity designed by Raechel Walker, graduate research assistant in the Personal Robots Group, and Matt Taylor, research assistant at the Media Lab. They discussed what level of control could or should be introduced at the parental, educational, and governmental levels to reduce the risks of internet addiction.

    “Alexa, how do I program AI?”

    The 2023 Day of AI celebration featured a flagship local event at the Dearborn STEM Academy in Roxbury in collaboration with Amazon Future Engineer. Students participated in a hands-on activity using MIT App Inventor as part of Day of AI’s Alexa lesson. Video: MIT Open Learning

    At Dearborn STEM Academy, Amazon Future Engineer helped students work through the Intro to Voice AI curriculum module in real time. Students used MIT App Inventor to code basic commands for Alexa. In an interview with WCVB, Principal Darlene Marcano said, “It’s important that we expose our students to as many different experiences as possible. The students that are participating are on track to be future computer scientists and engineers.”

    Breazeal told Dearborn students, “We want you to have an informed voice about how you want AI to be used in society. We want you to feel empowered that you can shape the world. You can make things with AI to help make a better world and a better community.”

    Rohit Prasad ’08, senior vice president and head scientist for Alexa at Amazon, and Victor Reinoso ’97, global director of philanthropic education initiatives at Amazon, also joined the event. “Amazon and MIT share a commitment to helping students discover a world of possibilities through STEM and AI education,” said Reinoso. “There’s a lot of current excitement around the technological revolution with generative AI and large language models, so we’re excited to help students explore careers of the future and navigate the pathways available to them.” To highlight their continued investment in the local community and the school program, Amazon donated a $25,000 Innovation and Early College Pathways Program Grant to the Boston Public School system.

    Day of AI down under

    Not only was the Day of AI program widely adopted across the globe, but Australian educators were also inspired to adapt their own regionally specific curriculum. An estimated 161,000 AI professionals will be needed in Australia by 2030, according to the National Artificial Intelligence Center in the Commonwealth Scientific and Industrial Research Organization (CSIRO), an Australian government agency and Day of AI Australia project partner. CSIRO worked with the University of New South Wales to develop supplementary educational resources on AI ethics and machine learning. Day of AI Australia reached 85,000 students at 400-plus secondary schools this year, sparking curiosity in the next generation of AI experts.

    The interest in AI is accelerating as fast as the technology is being developed. Day of AI offers a unique opportunity for K-12 students to shape our world’s digital future and their own.

    “I hope that some of you will decide to be part of this bigger effort to help us figure out the best possible answers to questions that are raised by AI,” Kornbluth told students at the Edward M. Kennedy Institute. “We’re counting on you, the next generation, to learn how AI works and help make sure it’s for everyone.”

  • Bringing the social and ethical responsibilities of computing to the forefront

    There has been a remarkable surge in the use of algorithms and artificial intelligence to address a wide range of problems and challenges. While their adoption, particularly with the rise of AI, is reshaping nearly every industry sector, discipline, and area of research, such innovations often carry unintended consequences that call for new norms, new expectations, and new rules and laws.

    To facilitate deeper understanding, the Social and Ethical Responsibilities of Computing (SERC), a cross-cutting initiative in the MIT Schwarzman College of Computing, recently brought together social scientists and humanists with computer scientists, engineers, and other computing faculty for an exploration of the ways in which the broad applicability of algorithms and AI has presented both opportunities and challenges in many aspects of society.

    “The very nature of our reality is changing. AI has the ability to do things that until recently were solely the realm of human intelligence — things that can challenge our understanding of what it means to be human,” remarked Daniel Huttenlocher, dean of the MIT Schwarzman College of Computing, in his opening address at the inaugural SERC Symposium. “This poses philosophical, conceptual, and practical questions on a scale not experienced since the start of the Enlightenment. In the face of such profound change, we need new conceptual maps for navigating the change.”

    The symposium offered a glimpse into the vision and activities of SERC in both research and education. “We believe our responsibility with SERC is to educate and equip our students and enable our faculty to contribute to responsible technology development and deployment,” said Georgia Perakis, the William F. Pounds Professor of Management in the MIT Sloan School of Management, co-associate dean of SERC, and the lead organizer of the symposium. “We’re drawing from the many strengths and diversity of disciplines across MIT and beyond and bringing them together to gain multiple viewpoints.”

    Through a succession of panels and sessions, the symposium delved into a variety of topics related to the societal and ethical dimensions of computing. In addition, 37 undergraduate and graduate students from a range of majors, including urban studies and planning, political science, mathematics, biology, electrical engineering and computer science, and brain and cognitive sciences, participated in a poster session to exhibit their research in this space, covering such topics as quantum ethics, AI collusion in storage markets, computing waste, and empowering users on social platforms for better content credibility.

    Showcasing a diversity of work

    In three sessions devoted to themes of beneficent and fair computing, equitable and personalized health, and algorithms and humans, the SERC Symposium showcased work by 12 faculty members across these domains.

    One such project from a multidisciplinary team of archaeologists, architects, digital artists, and computational social scientists aimed to preserve endangered heritage sites in Afghanistan with digital twins. The project team produced highly detailed interrogable 3D models of the heritage sites, in addition to extended reality and virtual reality experiences, as learning resources for audiences that cannot access these sites.

    In a project for the United Network for Organ Sharing, researchers showed how they used applied analytics to optimize various facets of an organ allocation system in the United States that is currently undergoing a major overhaul in order to make it more efficient, equitable, and inclusive for different racial, age, and gender groups, among others.

    Another talk discussed an area that has not yet received adequate public attention: the broader implications for equity that biased sensor data holds for the next generation of models in computing and health care.

    A talk on bias in algorithms considered both human bias and algorithmic bias, and the potential for improving results by taking into account differences in the nature of the two kinds of bias.

    Other highlighted research included the interaction between online platforms and human psychology; a study on whether decision-makers make systematic prediction mistakes based on the available information; and an illustration of how advanced analytics and computation can be leveraged to inform supply chain management, operations, and regulatory work in the food and pharmaceutical industries.

    Improving the algorithms of tomorrow

    “Algorithms are, without question, impacting every aspect of our lives,” said Asu Ozdaglar, deputy dean of academics for the MIT Schwarzman College of Computing and head of the Department of Electrical Engineering and Computer Science, in kicking off a panel she moderated on the implications of data and algorithms.

    “Whether it’s in the context of social media, online commerce, automated tasks, and now a much wider range of creative interactions with the advent of generative AI tools and large language models, there’s little doubt that much more is to come,” Ozdaglar said. “While the promise is evident to all of us, there’s a lot to be concerned about as well. This is very much a time for imaginative thinking and careful deliberation to improve the algorithms of tomorrow.”

    Turning to the panel, Ozdaglar asked experts from computing, social science, and data science for insights on how to understand what is to come and shape it to enrich outcomes for the majority of humanity.

    Sarah Williams, associate professor of technology and urban planning at MIT, emphasized the critical importance of comprehending the process of how datasets are assembled, as data are the foundation for all models. She also stressed the need for research to address the potential implications of biases in algorithms, which often find their way in through their creators and the data used in their development. “It’s up to us to think about our own ethical solutions to these problems,” she said. “Just as it’s important to progress with the technology, we need to start the field of looking at these questions: What biases are in the algorithms? What biases are in the data, or in that data’s journey?”

    Shifting focus to generative models and whether the development and use of these technologies should be regulated, the panelists — who also included MIT’s Srini Devadas, professor of electrical engineering and computer science, John Horton, professor of information technology, and Simon Johnson, professor of entrepreneurship — all concurred that regulating open-source algorithms, which are publicly accessible, would be difficult, given that regulators are still catching up and struggling to even set guardrails for technology that is now 20 years old.

    Returning to the question of how to effectively regulate the use of these technologies, Johnson proposed a progressive corporate tax system as a potential solution. He recommends basing companies’ tax payments on their profits, especially for large corporations whose massive earnings go largely untaxed due to offshore banking. Johnson said this approach could serve as a regulatory mechanism, imposing disincentives that discourage companies from trying to “own the entire world.”

    The role of ethics in computing education

    As computing continues to advance with no signs of slowing down, it is critical to educate students to be intentional about the social impact of the technologies they will be developing and deploying into the world. But can one actually be taught such things? If so, how?

    Caspar Hare, professor of philosophy at MIT and co-associate dean of SERC, posed this looming question to faculty on a panel he moderated on the role of ethics in computing education. All experienced in teaching ethics and thinking about the social implications of computing, each panelist shared their perspective and approach.

    A strong advocate for the importance of learning from history, Eden Medina, associate professor of science, technology, and society at MIT, said that “often the way we frame computing is that everything is new. One of the things that I do in my teaching is look at how people have confronted these issues in the past and try to draw from them as a way to think about possible ways forward.” Medina regularly uses case studies in her classes. As an example of how decisions around technology and data can grow out of very specific contexts, she pointed to a paper by Yale University science historian Joanna Radin on the Pima Indian Diabetes Dataset, which raised ethical issues about the history of that particular data collection that many don’t consider.

    Milo Phillips-Brown, associate professor of philosophy at Oxford University, talked about the Ethical Computing Protocol that he co-created while he was a SERC postdoc at MIT. The protocol, a four-step approach to building technology responsibly, is designed to train computer science students to think in a better and more accurate way about the social implications of technology by breaking the process down into more manageable steps. “The basic approach that we take very much draws on the fields of value-sensitive design, responsible research and innovation, participatory design as guiding insights, and then is also fundamentally interdisciplinary,” he said.

    Fields such as biomedicine and law have an ethics ecosystem that distributes the function of ethical reasoning in these areas. Oversight and regulation are provided to guide front-line stakeholders and decision-makers when issues arise, as are training programs and access to interdisciplinary expertise that they can draw from. “In this space, we have none of that,” said John Basl, associate professor of philosophy at Northeastern University. “For current generations of computer scientists and other decision-makers, we’re actually making them do the ethical reasoning on their own.” Basl commented further that teaching core ethical reasoning skills across the curriculum, not just in philosophy classes, is essential, and that the goal shouldn’t be for every computer scientist to be a professional ethicist, but for them to know enough of the landscape to be able to ask the right questions and seek out the relevant expertise and resources that exist.

    After the final session, interdisciplinary groups of faculty, students, and researchers engaged in animated discussions related to the issues covered throughout the day during a reception that marked the conclusion of the symposium.

  • Illuminating the money trail

    You may not know this, but the U.S. imposes a 12.5 percent tariff on imported flashlights. However, for a product category the federal government describes as “portable electric lamps designed to function by their own source of energy, other than flashlights,” the import tariff is just 3.5 percent.

    At a glance, this seems inexplicable. Why is one kind of self-powered portable light taxed more heavily than another? According to MIT political science professor In Song Kim, a policy discrepancy like this often stems from the difference in firms’ political power, as well as the extent to which firms are empowered by global production networks. This is a subject Kim has spent years examining in detail, producing original scholarly results while opening up a wealth of big data about politics to the public.

    “We all understand companies as being important economic agents,” Kim says. “But companies are political agents, too. They are very important political actors.”

    In particular, Kim’s work has illuminated the effects of lobbying upon U.S. trade policy. International trade is often presented as an unalloyed good, opening up markets and fueling growth. Beyond that, trade issues are usually described at the industry level; we hear about what the agriculture lobby or auto industry wants. But in reality, different firms want different things, even within the same industry.

    As Kim’s work shows, most firms lobby for policies pertaining to specific components of their products, and trade policy consists heavily of carve-outs for companies, not industry-wide standards. Firms making non-flashlight portable lights, it would seem, are good at lobbying, but the benefits clearly do not carry over to all portable light makers, as long as products are not perfect substitutes for each other. Meanwhile, as Kim’s research also shows, lobbying helps firms grow faster in size, even as lobbying-influenced policies may slow down the economy as a whole.

    “All our existing theories suggest that trade policy is a public good, in the sense that the benefits of open trade, the gains from trade, will be enjoyed by the public and will benefit the country as a whole,” Kim says. “But what I’ve learned is that trade policies are very, very granular. It’s become obvious to me that trade is no longer a public good. It’s actually a private good for individual companies.”

    Kim’s work includes over a dozen published journal articles over the last several years, several other forthcoming research papers, and a book he is currently writing. At the same time, Kim has created a public database, LobbyView, which tracks money in U.S. politics extending back to 1999. LobbyView, as an important collection of political information, has research, educational, and public-interest applications, enabling others, in academia or outside it, to further delve into the topic.

    “I want to contribute to the scholarly community, and I also want to create a public [resource] for our MIT community [and beyond], so we can all study politics through it,” Kim says.

    Keeping the public good in sight

    Kim grew up in South Korea, in a setting where politics was central to daily life. Kim’s grandfather, Kim Jae-soon, was the Speaker of the National Assembly in South Korea from 1988 through 1990 and an important figure in the country’s government.

    “I’ve always been fascinated by politics,” says Kim, who remembers prominent political figures dropping by the family home when he was young. One of the principal lessons Kim learned about politics from his grandfather, however, was not about proximity to power, but the importance of public service. The enduring lesson of his family’s engagement with politics, Kim says, is that “I truly believe in contributing to the public good.”

    Kim found his own way of contributing to the public good not as a politician but as a scholar of politics. He received his BA in political science from Yonsei University in Seoul but decided he wanted to pursue graduate studies in the U.S. He earned an MA in law and diplomacy from the Fletcher School at Tufts University, then an MA in political science at George Washington University. By this time, Kim had become focused on the quantitative analysis of trade policy; for his PhD work, he attended Princeton University and was awarded his doctorate in 2014, joining the MIT faculty that year.

    Among the key pieces of research Kim has published, one paper, “Political Cleavages within Industry: Firm-level Lobbying for Trade Liberalization,” published in the American Political Science Review and growing out of his dissertation research, helped show how remarkably specialized many trade policies are. As of 2017, the U.S. had almost 17,000 types of products it made tariff decisions about. Many of these are the component parts of a product; about two-thirds of international trade consists of manufactured components that get shipped around during the production process, rather than raw goods or finished products. That paper won the 2018 Michael Wallerstein Award for the best published article in political economy in the previous year.

    Another 2017 paper Kim co-authored, “The Charmed Life of Superstar Exporters,” from the Journal of Politics, provides more empirical evidence of the differences among firms within an industry. The “superstar” firms that are the largest exporters tend to lobby the most about trade politics; a firm’s characteristics reveal more about its preferences for open trade than the possibility that its industry as a whole will gain a comparative advantage internationally.

    Kim often uses large-scale data and computational methods to study international trade and trade politics. Still another paper he has co-authored, “Measuring Trade Profile with Granular Product-level Trade Data,” published in the American Journal of Political Science in 2020, traces trade relationships in highly specific terms. Looking at over 2 billion observations of international trade data, Kim developed an algorithm to group countries based on which products they import and export. The methodology helps researchers to learn about the highly different developmental paths that countries follow, and about the deepening international competition between countries such as the U.S. and China.
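
    The grouping idea can be illustrated with a minimal sketch, assuming hypothetical countries, product categories, and trade shares (none of these, nor the cosine-similarity grouping itself, are taken from Kim’s study): represent each country as a vector of trade shares across product categories, then place countries with sufficiently similar vectors in the same group.

    ```python
    # Illustrative sketch: group countries by the similarity of their
    # product-level trade profiles (hypothetical data, not from the study).
    import numpy as np

    countries = ["A", "B", "C", "D"]
    products = ["textiles", "electronics", "machinery", "raw_materials"]

    # Hypothetical export value shares per country (rows sum to 1).
    profiles = np.array([
        [0.70, 0.10, 0.05, 0.15],   # country A: mostly textiles
        [0.65, 0.15, 0.05, 0.15],   # country B: similar to A
        [0.05, 0.55, 0.35, 0.05],   # country C: electronics/machinery
        [0.10, 0.50, 0.30, 0.10],   # country D: similar to C
    ])

    # Cosine similarity between every pair of country profiles.
    unit = profiles / np.linalg.norm(profiles, axis=1, keepdims=True)
    similarity = unit @ unit.T

    # Greedy grouping: a country joins an existing group if it is
    # sufficiently similar to that group's first member; otherwise it
    # starts a new group.
    threshold = 0.9
    groups = []
    for i, name in enumerate(countries):
        for group in groups:
            if similarity[i, group[0]] >= threshold:
                group.append(i)
                break
        else:
            groups.append([i])

    for g in groups:
        print([countries[i] for i in g])
    # With this toy data the output is ['A', 'B'] and ['C', 'D'].
    ```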

    At other times, Kim has analyzed who is influencing trade policy. His paper “Mapping Political Communities,” from the journal Political Analysis in 2021, looks at the U.S. Congress and uses mandatory reports filed by lobbyists to build a picture of which interest groups are most closely connected to which politicians.

    Kim has published all his papers while balancing both his scholarly research and the public launch of LobbyView, which occurred in 2018. He was awarded tenure by MIT in the spring of 2022. Currently he is an associate professor in the Department of Political Science and a faculty affiliate of the Institute for Data, Systems, and Society.

    By the book

    Kim has continued to explore firm-level lobbying dynamics, although his recent research runs in a few directions. In a 2021 working paper, Kim and co-author Federico Huneeus of the Central Bank of Chile built a model estimating that eliminating lobbying in the U.S. could increase productivity by as much as 6 percent.

    “Political rents [favorable policies] given to particular companies might introduce inefficiencies or a misallocation of resources in the economy,” Kim says. “You could allocate those resources to more productive although politically inactive firms, but now they’re given to less productive and yet politically active big companies, increasing market concentration and monopolies.”

    Kim is on sabbatical during the 2022-23 academic year, working on a book about the importance of firms’ political activities in trade policymaking. The book will have an expansive timeframe, dating back to ancient times, which underscores the salience of trade policy across eras. At the same time, the book will analyze the distinctive features of modern trade politics with deepening global production networks.

    “I’m trying to allow people to learn about the history of trade politics, to show how the politics have changed over time,” Kim says. “In doing that, I’m also highlighting the importance of firm-to-firm trade and the emergence of new trade coalitions among firms in different countries and industries that are linked through the global production chain.”

    While continuing his own scholarly research, Kim still leads LobbyView, which he views both as a big data resource for any scholars interested in money in politics and an excellent teaching resource for his MIT classes, as students can tap into it for projects and papers. LobbyView contains so much data, in fact, that part of the challenge is finding ways to mine it effectively.

    “It really offers me an opportunity to work with MIT students,” Kim says of LobbyView. “What I think I can contribute is to bring those technologies to our understanding of politics. Having this unique data set can really allow students here to use technology to learn about politics, and I believe that fits the MIT identity.”

  • Report: CHIPS Act just the first step in addressing threats to US leadership in advanced computing

    When Liu He, a Chinese economist, politician, and “chip czar,” was tapped to lead the charge in a chipmaking arms race with the United States, his message lingered in the air, leaving behind a dewy glaze of tension: “For our country, technology is not just for growth… it is a matter of survival.”

    Once upon a time, the United States’ early technological prowess positioned the nation to outpace foreign rivals and cultivate a competitive advantage for domestic businesses. Yet, 30 years later, America’s lead in advanced computing is continuing to wane. What happened?

    A new report from an MIT researcher and two colleagues sheds light on the decline in U.S. leadership. The scientists looked at high-level measures to examine the shrinkage: overall capabilities, supercomputers, applied algorithms, and semiconductor manufacturing. Through their analysis, they found that not only has China closed the computing gap with the U.S., but nearly 80 percent of American leaders in the field believe that their Chinese competitors are improving capabilities faster — which, the team says, suggests a “broad threat to U.S. competitiveness.”

    To delve deeply into the fray, the scientists conducted the Advanced Computing Users Survey, sampling 120 top-tier organizations, including universities, national labs, federal agencies, and industry. The team estimates that this group comprises between one-third and one-half of all the most significant computing users in the United States.

    “Advanced computing is crucial to scientific improvement, economic growth and the competitiveness of U.S. companies,” says Neil Thompson, director of the FutureTech Research Project at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), who helped lead the study.

    Thompson, who is also a principal investigator at MIT’s Initiative on the Digital Economy, wrote the paper with Chad Evans, executive vice president and secretary and treasurer to the board at the Council on Competitiveness, and Daniel Armbrust, who is the co-founder, initial CEO, and member of the board of directors at Silicon Catalyst and former president of SEMATECH, the semiconductor consortium that developed industry roadmaps.

    The semiconductor, supercomputer, and algorithm bonanza

    Supercomputers — the room-sized, “giant calculators” of the hardware world — are an industry no longer dominated by the United States. Through 2015, about half of the most powerful computers were sitting firmly in the U.S., and China was growing slowly from a very low base. But in the past six years, China has swiftly caught up, reaching near parity with America.

    This disappearing lead matters. Eighty-four percent of U.S. survey respondents said they’re computationally constrained in running essential programs. “This result was telling, given who our respondents are: the vanguard of American research enterprises and academic institutions with privileged access to advanced national supercomputing resources,” says Thompson. 

    With regard to advanced algorithms, the U.S. has historically fronted the charge, with two-thirds of all significant improvements dominated by U.S.-born inventors. But in recent decades, U.S. dominance in algorithms has relied on bringing in foreign talent to work in the U.S., which the researchers say is now in jeopardy. China has outpaced the U.S. and many other countries in churning out PhDs in STEM fields since 2007, with one report projecting that by 2025 China will be home to nearly twice as many STEM PhDs as the U.S. China’s rise in algorithms can also be seen in the Gordon Bell Prize, an award for outstanding work in harnessing the power of supercomputers in varied applications. U.S. winners historically dominated the prize, but China has equaled or surpassed Americans’ performance in the past five years.

    While the researchers note the CHIPS and Science Act of 2022 is a critical step in re-establishing the foundation of success for advanced computing, they propose recommendations to the U.S. Office of Science and Technology Policy. 

    First, they suggest democratizing access to U.S. supercomputing by building more mid-tier systems that push boundaries for many users, as well as building tools so users scaling up computations can have less up-front resource investment. They also recommend increasing the pool of innovators by funding the training of many more electrical engineers and computer scientists, with longer-term U.S. residency incentives and scholarships. Finally, in addition to this new framework, the scientists urge taking advantage of what already exists, by providing the private sector access to experimentation with high-performance computing through supercomputing sites in academia and national labs.

    All that and a bag of chips

    Computing improvements depend on continuous advances in transistor density and performance, but creating robust, new chips necessitates a harmonious blend of design and manufacturing.

    Over the past five decades, the U.S. designed most of the world’s noteworthy chips, and until recently China was not known for chip design. But this changed in the past six years, when China created the HiSilicon Kirin 9000, propelling itself to the international frontier. This success was mainly obtained through partnerships with leading global chip designers that began in the 2000s. China now has 14 companies among the world’s top 50 fabless designers; a decade ago, there was only one.

    Competitive semiconductor manufacturing has been more mixed; U.S.-led policies and internal execution issues have slowed China’s rise, but as of July 2022, the Semiconductor Manufacturing International Corporation (SMIC) has shown evidence of 7-nanometer logic, which was not expected until much later. However, with extreme ultraviolet export restrictions in place, progress below 7 nm would require expensive domestic technology development. Currently, China is at parity or better in only two of 12 segments of the semiconductor supply chain. Still, with government policy and investments, the team expects a whopping increase to seven segments in 10 years. So, for the moment, the U.S. retains leadership in hardware manufacturing, but with fewer dimensions of advantage.

    The authors recommend that the White House Office of Science and Technology Policy work with key national agencies, such as the U.S. Department of Defense, U.S. Department of Energy, and the National Science Foundation, to define initiatives to build the hardware and software systems needed for important computing paradigms and workloads critical for economic and security goals. “It is crucial that American enterprises can get the benefit of faster computers,” says Thompson. “With Moore’s Law slowing down, the best way to do this is to create a portfolio of specialized chips (or “accelerators”) that are customized to our needs.”

    The scientists further believe that to lead the next generation of computing, four areas must be addressed. First, issuing grand challenges to the CHIPS Act National Semiconductor Technology Center would motivate researchers and startups to invest in research and development and to seek startup capital for new technologies in areas such as spintronics, neuromorphics, optical and quantum computing, and optical interconnect fabrics. Second, supporting allies in passing similar acts would increase overall investment in these technologies and make supply chains more aligned and secure. Third, establishing test beds for researchers to test algorithms on new computing architectures and hardware would provide an essential platform for innovation and discovery. Finally, planning for post-exascale systems that achieve higher levels of performance through next-generation advances would ensure that current commercial technologies don’t limit future computing systems.

    “The advanced computing landscape is in rapid flux — technologically, economically, and politically, with both new opportunities for innovation and rising global rivalries,” says Daniel Reed, Presidential Professor and professor of computer science and electrical and computer engineering at the University of Utah. “The transformational insights from both deep learning and computational modeling depend on both continued semiconductor advances and their instantiation in leading edge, large-scale computing systems — hyperscale clouds and high-performance computing systems. Although the U.S. has historically led the world in both advanced semiconductors and high-performance computing, other nations have recognized that these capabilities are integral to 21st century economic competitiveness and national security, and they are investing heavily.”

    The research was funded, in part, through Thompson’s grant from Good Ventures, which supports his FutureTech Research Group. The paper is being published by the Georgetown Public Policy Review.

  • Improving health outcomes by targeting climate and air pollution simultaneously

    Climate policies are typically designed to reduce greenhouse gas emissions that result from human activities and drive climate change. The largest source of these emissions is the combustion of fossil fuels, which increases atmospheric concentrations of ozone, fine particulate matter (PM2.5) and other air pollutants that pose public health risks. While climate policies may result in lower concentrations of health-damaging air pollutants as a “co-benefit” of reducing greenhouse gas emissions-intensive activities, they are most effective at improving health outcomes when deployed in tandem with geographically targeted air-quality regulations.

    Yet the computer models typically used to assess the likely air quality and health impacts of proposed climate/air-quality policy combinations come with drawbacks for decision-makers. Atmospheric chemistry/climate models can produce high-resolution results, but they are expensive and time-consuming to run. Integrated assessment models are far cheaper and faster to run, but they produce results only at global and regional scales, rendering them insufficiently precise for accurate assessments of air quality and health impacts at the subnational level.

    To overcome these drawbacks, a team of researchers at MIT and the University of California at Davis has developed a climate/air-quality policy assessment tool that is both computationally efficient and location-specific. Described in a new study in the journal ACS Environmental Au, the tool could enable users to obtain rapid estimates of combined policy impacts on air quality/health at more than 1,500 locations around the globe — estimates precise enough to reveal the equity implications of proposed policy combinations within a particular region.

    “The modeling approach described in this study may ultimately allow decision-makers to assess the efficacy of multiple combinations of climate and air-quality policies in reducing the health impacts of air pollution, and to design more effective policies,” says Sebastian Eastham, the study’s lead author and a principal research scientist at the MIT Joint Program on the Science and Policy of Global Change. “It may also be used to determine if a given policy combination would result in equitable health outcomes across a geographical area of interest.”

    To demonstrate the efficiency and accuracy of their policy assessment tool, the researchers showed that outcomes projected by the tool within seconds were consistent with region-specific results from detailed chemistry/climate models that took days or even months to run. While continuing to refine and develop their approaches, they are now working to embed the new tool into integrated assessment models for direct use by policymakers.
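
    The general logic of such a reduced-form tool can be sketched roughly as follows; all numbers, location names, and the linear-sensitivity and concentration-response forms below are illustrative assumptions, not the published model: a precomputed, location-specific sensitivity converts an emissions change into a PM2.5 concentration change, and a concentration-response function converts that into an estimated change in health outcomes.

    ```python
    # Illustrative reduced-form calculation (hypothetical coefficients and
    # locations, not the structure or data of the published tool).

    # For each (hypothetical) location: sensitivity of annual-mean PM2.5
    # (ug/m3) to a 1 percent cut in local fossil-fuel emissions, plus
    # baseline mortality used by the concentration-response step.
    locations = {
        "Location X": {"dpm25_per_pct": -0.020, "baseline_deaths": 40_000},
        "Location Y": {"dpm25_per_pct": -0.008, "baseline_deaths": 9_000},
    }

    # Assumed log-linear concentration-response slope: ~0.6 percent change
    # in mortality per 1 ug/m3 change in PM2.5 (an assumed order of magnitude).
    BETA = 0.006

    def avoided_deaths(location: dict, emission_cut_pct: float) -> float:
        """Estimate avoided premature deaths per year for a given emissions cut."""
        dpm25 = location["dpm25_per_pct"] * emission_cut_pct   # ug/m3 change
        return -dpm25 * BETA * location["baseline_deaths"]     # deaths avoided

    for name, loc in locations.items():
        print(name, round(avoided_deaths(loc, emission_cut_pct=30), 1))
    ```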

    “As decision-makers implement climate policies in the context of other sustainability challenges like air pollution, efficient modeling tools are important for assessment — and new computational techniques allow us to build faster and more accurate tools to provide credible, relevant information to a broader range of users,” says Noelle Selin, a professor at MIT’s Institute for Data, Systems and Society and Department of Earth, Atmospheric and Planetary Sciences, and supervising author of the study. “We are looking forward to further developing such approaches, and to working with stakeholders to ensure that they provide timely, targeted and useful assessments.”

    The study was funded, in part, by the U.S. Environmental Protection Agency and the Biogen Foundation.

  • Study: Carbon-neutral pavements are possible by 2050, but rapid policy and industry action are needed

    Almost 2.8 million lane-miles, or about 4.6 million lane-kilometers, of roads in the United States are paved.

    Roads and streets form the backbone of our built environment. They take us to work or school, take goods to their destinations, and much more.

    However, a new study by MIT Concrete Sustainability Hub (CSHub) researchers shows that the annual greenhouse gas (GHG) emissions of all construction materials used in the U.S. pavement network are 11.9 to 13.3 megatons. This is equivalent to the emissions of a gasoline-powered passenger vehicle driving about 30 billion miles in a year.
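
    As a rough back-of-the-envelope check of that equivalence (the per-mile emission factor is an assumed typical value for a gasoline passenger car, roughly 400 grams of CO2 per mile, not a figure from the study):

    ```python
    # Back-of-the-envelope check of the vehicle-miles equivalence.
    # Assumption: a typical gasoline passenger car emits ~400 g CO2 per mile.
    pavement_emissions_mt = (11.9 + 13.3) / 2   # megatons CO2e per year (midpoint)
    grams_per_mile = 400                        # assumed emission factor

    grams_total = pavement_emissions_mt * 1e12  # 1 megaton = 1e12 grams
    miles = grams_total / grams_per_mile
    print(f"{miles / 1e9:.0f} billion miles")   # roughly 30 billion miles
    ```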

    As roads are built, repaved, and expanded, new approaches and thoughtful material choices are necessary to dampen their carbon footprint. 

    The CSHub researchers found that, by 2050, mixtures for pavements can be made carbon-neutral if industry and governmental actors help to apply a range of solutions — like carbon capture — to reduce, avoid, and neutralize embodied impacts. (A neutralization solution is any compensation mechanism in the value chain of a product that permanently removes the global warming impact of the processes after avoiding and reducing the emissions.) Furthermore, nearly half of pavement-related greenhouse gas (GHG) savings can be achieved in the short term with a negative or nearly net-zero cost.

    The research team, led by Hessam AzariJafari, MIT CSHub’s deputy director, closed gaps in our understanding of the impacts of pavement decisions by developing a dynamic model quantifying the embodied impact of future pavement materials demand for the U.S. road network.

    The team first split the U.S. road network into 10-mile (about 16 kilometer) segments, forecasting the condition and performance of each. They then developed a pavement management system model to create benchmarks helping to understand the current level of emissions and the efficacy of different decarbonization strategies. 

    This model considered factors such as annual traffic volume and surface conditions, budget constraints, regional variation in pavement treatment choices, and pavement deterioration. The researchers also used a life-cycle assessment to calculate annual state-level emissions from acquiring pavement construction materials, considering future energy supply and materials procurement.
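
    The materials side of such an accounting can be illustrated with a simplified sketch (the material quantities and emission factors below are hypothetical, not the study’s data): for each segment and year, multiply the quantity of each construction material by its cradle-to-gate emission factor and sum.

    ```python
    # Simplified life-cycle accounting for pavement materials
    # (hypothetical quantities and emission factors, for illustration only).

    # Cradle-to-gate emission factors, tons CO2e per ton of material (assumed).
    EMISSION_FACTORS = {"asphalt_mix": 0.06, "concrete": 0.12, "aggregate": 0.005}

    # Material demand for one 10-mile segment's resurfacing in a given year (tons).
    segment_demand = {"asphalt_mix": 8_000, "aggregate": 2_500}

    def embodied_emissions(demand: dict, factors: dict) -> float:
        """Sum of material quantity x emission factor, in tons CO2e."""
        return sum(qty * factors[mat] for mat, qty in demand.items())

    print(embodied_emissions(segment_demand, EMISSION_FACTORS), "tons CO2e")
    # With these assumed numbers: 8000*0.06 + 2500*0.005 = 492.5 tons CO2e
    ```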

    The team considered three scenarios for the U.S. pavement network: A business-as-usual scenario in which technology remains static, a projected improvement scenario aligned with stated industry and national goals, and an ambitious improvement scenario that intensifies or accelerates projected strategies to achieve carbon neutrality. 

    If no steps are taken to decarbonize pavement mixtures, the team projected that GHG emissions of construction materials used in the U.S. pavement network would increase by 19.5 percent by 2050. Under the projected scenario, there was an estimated 38 percent embodied impact reduction for concrete and 14 percent embodied impact reduction for asphalt by 2050.

    The keys to making the pavement network carbon neutral by 2050 lie in multiple places. Fully renewable energy sources should be used for pavement materials production, transportation, and other processes. The federal government must contribute to the development of these low-carbon energy sources and carbon capture technologies, as it would be nearly impossible to achieve carbon neutrality for pavements without them. 

    Additionally, increasing pavements’ recycled content and improving their design and production efficiency can lower GHG emissions to an extent. Still, neutralization is needed to achieve carbon neutrality.

    Making the right pavement construction and repair choices would also contribute to the carbon neutrality of the network. For instance, concrete pavements can offer GHG savings across the whole life cycle as they are stiffer and stay smoother for longer, meaning they require less maintenance and have a lesser impact on the fuel efficiency of vehicles. 

    Concrete pavements have other use-phase benefits, including a cooling effect through an intrinsically high albedo, meaning they reflect more sunlight than regular pavements. Therefore, they can help combat extreme heat and benefit the earth’s energy balance through negative radiative forcing, making albedo a potential neutralization mechanism.

    At the same time, a mix of fixes, including using concrete and asphalt in different contexts and proportions, could produce significant GHG savings for the pavement network; decision-makers must consider scenarios on a case-by-case basis to identify optimal solutions. 

    In addition, it may appear as though the GHG emissions of materials used in local roads are dwarfed by the emissions of interstate highway materials. However, the study found that the two road types have a similar impact. In fact, all road types contribute heavily to the total GHG emissions of pavement materials in general. Therefore, stakeholders at the federal, state, and local levels must be involved if our roads are to become carbon neutral. 

    The path to pavement network carbon-neutrality is, therefore, somewhat of a winding road. It demands regionally specific policies and widespread investment to help implement decarbonization solutions, just as renewable energy initiatives have been supported. Providing subsidies and covering the costs of premiums, too, are vital to avoid shifts in the market that would derail environmental savings.

    When planning for these shifts, we must recall that pavements have impacts not just in their production, but across their entire life cycle. As pavements are used, maintained, and eventually decommissioned, they have significant impacts on the surrounding environment.

    If we are to meet climate goals such as the Paris Agreement, which demands that we reach carbon-neutrality by 2050 to avoid the worst impacts of climate change, we — as well as industry and governmental stakeholders — must come together to take a hard look at the roads we use every day and work to reduce their life cycle emissions. 

    The study was published in the International Journal of Life Cycle Assessment. In addition to AzariJafari, the authors include Fengdi Guo of the MIT Department of Civil and Environmental Engineering; Jeremy Gregory, executive director of the MIT Climate and Sustainability Consortium; and Randolph Kirchain, director of the MIT CSHub.

  • A breakthrough on “loss and damage,” but also disappointment, at UN climate conference

    As the 2022 United Nations climate change conference, known as COP27, stretched into its final hours on Saturday, Nov. 19, it was uncertain what kind of agreement might emerge from two weeks of intensive international negotiations.

    In the end, COP27 produced mixed results: on the one hand, a historic agreement for wealthy countries to compensate low-income countries for “loss and damage,” but on the other, limited progress on new plans for reducing the greenhouse gas emissions that are warming the planet.

    “We need to drastically reduce emissions now — and this is an issue this COP did not address,” said U.N. Secretary-General António Guterres in a statement at the conclusion of COP27. “A fund for loss and damage is essential — but it’s not an answer if the climate crisis washes a small island state off the map — or turns an entire African country to desert.”

    Throughout the two weeks of the conference, a delegation of MIT students, faculty, and staff was at the Sharm El-Sheikh International Convention Center to observe the negotiations, conduct and share research, participate in panel discussions, and forge new connections with researchers, policymakers, and advocates from around the world.

    Loss and damage

    A key issue coming in to COP27 (COP stands for “conference of the parties” to the U.N. Framework Convention on Climate Change, held for the 27th time) was loss and damage: a term used by the U.N. to refer to harms caused by climate change — either through acute catastrophes like extreme weather events or slower-moving impacts like sea level rise — to which communities and countries are unable to adapt. 

    Ultimately, a deal on loss and damage proved to be COP27’s most prominent accomplishment. Negotiators reached an eleventh-hour agreement to “establish new funding arrangements for assisting developing countries that are particularly vulnerable to the adverse effects of climate change.” 

    “Providing financial assistance to developing countries so they can better respond to climate-related loss and damage is not only a moral issue, but also a pragmatic one,” said Michael Mehling, deputy director of the MIT Center for Energy and Environmental Policy Research, who attended COP27 and participated in side events. “Future emissions growth will be squarely centered in the developing world, and offering support through different channels is key to building the trust needed for more robust global cooperation on mitigation.”

    Youssef Shaker, a graduate student in the MIT Technology and Policy Program and a research assistant with the MIT Energy Initiative, attended the second week of the conference, where he followed the negotiations over loss and damage closely. 

    “While the creation of a fund is certainly an achievement,” Shaker said, “significant questions remain to be answered, such as the size of the funding available as well as which countries receive access to it.” A loss-and-damage fund that is not adequately funded, Shaker noted, “would not be an impactful outcome.” 

    The agreement on loss and damage created a new committee, made up of 24 country representatives, to “operationalize” the new funding arrangements, including identifying funding sources. The committee is tasked with delivering a set of recommendations at COP28, which will take place next year in Dubai.

    Advising the U.N. on net zero

    Though the decisions reached at COP27 did not include major new commitments on reducing emissions from the combustion of fossil fuels, the transition to a clean global energy system was nevertheless a key topic of conversation throughout the conference.

    The Council of Engineers for the Energy Transition (CEET), an independent, international body of engineers and energy systems experts formed to provide advice to the U.N. on achieving net-zero emissions globally by 2050, convened for the first time at COP27. Jessika Trancik, a professor in the MIT Institute for Data, Systems, and Society and a member of CEET, spoke on a U.N.-sponsored panel on solutions for the transition to clean energy.

    Trancik noted that the energy transition will look different in different regions of the world. “As engineers, we need to understand those local contexts and design solutions around those local contexts — that’s absolutely essential to support a rapid and equitable energy transition.”

    At the same time, Trancik noted that there is now a set of “low-cost, ready-to-scale tools” available to every region — tools that resulted from a globally competitive process of innovation, stimulated by public policies in different countries, that dramatically drove down the costs of technologies like solar energy and lithium-ion batteries. The key, Trancik said, is for regional transition strategies to “tap into global processes of innovation.”

    Reinventing climate adaptation

    Elfatih Eltahir, the H. M. King Bhumibol Professor of Hydrology and Climate, traveled to COP27 to present plans for the Jameel Observatory Climate Resilience Early Warning System (CREWSnet), one of the five projects selected in April 2022 as a flagship in MIT’s Climate Grand Challenges initiative. CREWSnet focuses on climate adaptation, the term for adapting to climate impacts that are unavoidable.

    The aim of CREWSnet, Eltahir told the audience during a panel discussion, is “nothing short of reinventing the process of climate change adaptation,” so that it is proactive rather than reactive; community-led; data-driven and evidence-based; and so that it integrates different climate risks, from heat waves to sea level rise, rather than treating them individually.

    “However, it’s easy to talk about these changes,” said Eltahir. “The real challenge, which we are now just launching and engaging in, is to demonstrate that on the ground.” Eltahir said that early demonstrations will happen in a couple of key locations, including southwest Bangladesh, where multiple climate risks — rising sea levels, increasing soil salinity, and intensifying heat waves and cyclones — are combining to threaten the area’s agricultural production.

    Building on COP26

    Some members of MIT’s delegation attended COP27 to advance efforts that had been formally announced at last year’s U.N. climate conference, COP26, in Glasgow, Scotland.

    At an official U.N. side event co-organized by MIT on Nov. 11, Greg Sixt, the director of the Food and Climate Systems Transformation (FACT) Alliance led by the Abdul Latif Jameel Water and Food Systems Lab, provided an update on the alliance’s work since its launch at COP26.

    Food systems are a major source of greenhouse gas emissions — and are increasingly vulnerable to climate impacts. The FACT Alliance works to better connect researchers to farmers, food businesses, policymakers, and other food systems stakeholders to make food systems (which include food production, consumption, and waste) more sustainable and resilient. 

    Sixt told the audience that the FACT Alliance now counts over 20 research and stakeholder institutions around the world among its members, but also collaborates with other institutions in an “open network model” to advance work in key areas — such as a new research project exploring how climate scenarios could affect global food supply chains.

    Marcela Angel, research program director for the Environmental Solutions Initiative (ESI), helped convene a meeting at COP27 of the Afro-InterAmerican Forum on Climate Change, which also launched at COP26. The forum works with Afro-descendant leaders across the Americas to address significant environmental issues, including climate risks and biodiversity loss. 

    At the event — convened with the Colombian government and the nonprofit Conservation International — ESI brought together leaders from six countries in the Americas and presented recent work that estimates that there are over 178 million individuals who identify as Afro-descendant living in the Americas, in lands of global environmental importance. 

    “There is a significant overlap between biodiversity hot spots, protected areas, and areas of high Afro-descendant presence,” said Angel. “But the role and climate contributions of these communities are understudied, and often made invisible.”

    Limiting methane emissions

    Methane is a short-lived but potent greenhouse gas: When released into the atmosphere, it immediately traps about 120 times more heat than carbon dioxide does. More than 150 countries have now signed the Global Methane Pledge, launched at COP26, which aims to reduce methane emissions by at least 30 percent by 2030 compared to 2020 levels.

    Sergey Paltsev, the deputy director of the Joint Program on the Science and Policy of Global Change and a senior research scientist at the MIT Energy Initiative, gave the keynote address at a Nov. 17 event on methane, where he noted the importance of methane reductions from the oil and gas sector to meeting the 2030 goal.

    “The oil and gas sector is where methane emissions reductions could be achieved the fastest,” said Paltsev. “We also need to employ an integrated approach to address methane emissions in all sectors and all regions of the world because methane emissions reductions provide a near-term pathway to avoiding dangerous tipping points in the global climate system.”

    “Keep fighting relentlessly”

    Arina Khotimsky, a senior majoring in materials science and engineering and a co-president of the MIT Energy and Climate Club, attended the first week of COP27. She reflected on the experience in a social media post after returning home. 

    “COP will always have its haters. Is there greenwashing? Of course! Is everyone who should have a say in this process in the room? Not even close,” wrote Khotimsky. “So what does it take for COP to matter? It takes everyone who attended to not only put ‘climate’ on front-page news for two weeks, but to return home and keep fighting relentlessly against climate change. I know that I will.”

  • in

    MIT Policy Hackathon produces new solutions for technology policy challenges

    Almost three years ago, the Covid-19 pandemic changed the world. Many are still looking to uncover a “new normal.”

    “Instead of going back to normal, [there’s a new generation that] wants to build back something different, something better,” says Jorge Sandoval, a second-year graduate student in MIT’s Technology and Policy Program (TPP) at the Institute for Data, Systems and Society (IDSS). “How do we communicate this mindset to others, that the world cannot be the same as before?”

    This was the inspiration behind “A New (Re)generation,” this year’s theme for the IDSS-student-run MIT Policy Hackathon, which Sandoval helped to organize as the event chair. The Policy Hackathon is a weekend-long, interdisciplinary competition that brings together participants from around the globe to explore potential solutions to some of society’s greatest challenges. 

    Unlike other competitions of its kind, Sandoval says MIT’s event emphasizes a humanistic approach. “The idea of our hackathon is to promote applications of technology that are humanistic or human-centered,” he says. “We take the opportunity to examine aspects of technology in the spaces where they tend to interact with society and people, an opportunity most technical competitions don’t offer because their primary focus is on the technology.”

    The competition started with 50 teams spread across four challenge categories. This year’s categories included Internet and Cybersecurity, Environmental Justice, Logistics, and Housing and City Planning. While some people come into the challenge with friends, Sandoval says most teams form organically during an online networking meeting hosted by MIT.

    “We encourage people to pair up with others outside of their country and to form teams of different diverse backgrounds and ages,” Sandoval says. “We try to give people who are often not invited to the decision-making table the opportunity to be a policymaker, bringing in those with backgrounds in not only law, policy, or politics, but also medicine, and people who have careers in engineering or experience working in nonprofits.”

    Once an in-person event, the Policy Hackathon has gone through its own regeneration process these past three years, according to Sandoval. After the event went entirely online at the height of the pandemic, organizers hosted the first hybrid version last year, and that format served as the model again this year.

    “The hybrid version of the event gives us the opportunity to allow people to connect in a way that is lost if it is only online, while also keeping the wide range of accessibility, allowing people to join from anywhere in the world, regardless of nationality or income, to provide their input,” Sandoval says.

    For Swetha Tadisina, an undergraduate computer science major at Lafayette College and participant in the internet and cybersecurity category, the hackathon was a unique opportunity to meet and work with people much more advanced in their careers. “I was surprised how such a diverse team that had never met before was able to work so efficiently and creatively,” Tadisina says.

    Erika Spangler, a public high school teacher from Massachusetts and member of the environmental justice category’s winning team, says that while each member of “Team Slime Mold” came to the table with a different set of skills, they managed to be in sync from the start — even working across the nine-and-a-half-hour time difference the four-person team faced when working with policy advocate Shruti Nandy from Calcutta, India.

    “We divided the project into data, policy, and research and trusted each other’s expertise,” Spangler says. “Despite having separate areas of focus, we made sure to have regular check-ins to problem-solve and cross-pollinate ideas.”

    During the 48-hour period, her team proposed the creation of an algorithm to identify high-quality brownfields that could be cleaned up and used as sites for building renewable energy. Their corresponding policy sought to mandate additional requirements for renewable energy businesses seeking tax credits from the Inflation Reduction Act.
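    The memo itself is not reproduced here, but the kind of site-selection algorithm the team describes can be illustrated with a minimal, hypothetical weighted-scoring sketch. The attributes, weights, and thresholds below are illustrative assumptions, not the team’s actual criteria; the sketch simply ranks candidate brownfields by combining land area, grid proximity, estimated cleanup cost, and an environmental-justice burden index.

        # Hypothetical sketch of a brownfield-scoring approach (illustrative only,
        # not the team's actual algorithm): rank candidate sites by a weighted
        # combination of attributes a renewable-energy siting analysis might consider.
        from dataclasses import dataclass

        @dataclass
        class Site:
            name: str
            acres: float                   # usable land area
            miles_to_grid: float           # distance to nearest transmission interconnect
            cleanup_cost_per_acre: float   # estimated remediation cost (USD)
            ej_burden_index: float         # 0-1 environmental-justice burden score (higher = more burdened)

        def score(site: Site,
                  w_area: float = 0.3,
                  w_grid: float = 0.3,
                  w_cost: float = 0.2,
                  w_ej: float = 0.2) -> float:
            """Higher scores suggest stronger candidates for renewable-energy reuse."""
            area_term = min(site.acres / 100.0, 1.0)                        # saturates at 100 acres
            grid_term = 1.0 / (1.0 + site.miles_to_grid)                    # closer to the grid is better
            cost_term = 1.0 / (1.0 + site.cleanup_cost_per_acre / 50_000)   # cheaper cleanup is better
            ej_term = site.ej_burden_index                                  # prioritize burdened communities
            return w_area * area_term + w_grid * grid_term + w_cost * cost_term + w_ej * ej_term

        sites = [
            Site("Former rail yard", acres=40, miles_to_grid=1.5,
                 cleanup_cost_per_acre=30_000, ej_burden_index=0.8),
            Site("Closed landfill", acres=120, miles_to_grid=6.0,
                 cleanup_cost_per_acre=80_000, ej_burden_index=0.5),
        ]
        for s in sorted(sites, key=score, reverse=True):
            print(f"{s.name}: {score(s):.2f}")

    In practice, a team might calibrate the weights against community and stakeholder priorities or replace the hand-tuned formula with data-driven screening, but the ranking structure would stay the same.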

    “Their policy memo had the most in-depth technical assessment, including deep dives in a few key cities to show the impact of their proposed approach for site selection at a very granular level,” says Amanda Levin, director of policy analysis for the Natural Resources Defense Council (NRDC). Levin acted as both a judge and challenge provider for the environmental justice category.

    “They also presented their policy recommendations in the memo in a well-thought-out way, clearly noting the relevant actor,” she adds. “This clarity around what can be done, and who would be responsible for those actions, is highly valuable for those in policy.”

    Levin says the NRDC, one of the largest environmental nonprofits in the United States, provided five “challenge questions,” making it clear that teams did not need to address all of them. She notes that this gave teams significant leeway, bringing a wide variety of recommendations to the table. 

    “As a challenge partner, the work put together by all the teams is already being used to help inform discussions about the implementation of the Inflation Reduction Act,” Levin says. “Being able to tap into the collective intelligence of the hackathon helped uncover new perspectives and policy solutions that can help make an impact in addressing the important policy challenges we face today.”

    While having partners with experience in data science and policy definitely helped, fellow Team Slime Mold member Sara Sheffels, a PhD candidate in MIT’s biomaterials program, says she was surprised how much her experiences outside of science and policy were relevant to the challenge: “My experience organizing MIT’s Graduate Student Union shaped my ideas about more meaningful community involvement in renewables projects on brownfields. It is not meaningful to merely educate people about the importance of renewables or ask them to sign off on a pre-planned project without addressing their other needs.”

    “I wanted to test my limits, gain exposure, and expand my world,” Tadisina adds. “The exposure, friendships, and experiences you gain in such a short period of time are incredible.”

    For Willy R. Vasquez, an electrical and computer engineering PhD student at the University of Texas, the hackathon is not to be missed. “If you’re interested in the intersection of tech, society, and policy, then this is a must-do experience.”