News

In the news section, our experts take center stage in discussions on technology and policy. Discover articles featuring insights from our experts or citing our research, which together shape key conversations in the evolving landscape of emerging technology and policy.

Dewey Murdick and Miriam Vogel shared their expert analysis in an op-ed published by Fortune. In their piece, they highlight the urgent need for the United States to strengthen its AI literacy and incident reporting systems to maintain global leadership amid rapidly advancing international competition, especially from China’s booming AI sector.


America’s lead in advanced computing is almost gone

Georgetown Public Policy Review
| February 28, 2023

The Georgetown Public Policy Review recently published a paper that cited a report by CSET's Jack Corrigan, Emily S. Weinstein, Dahlia Peterson, and Ryan Fedasiuk, and CSET alumni Remco Zwetsloot and Diana Gehlhaus. The report examines data on STEM PhD graduation rates and predicts their growth in the next five years, during which China is expected to increase its lead over the United States.

Chips Act Will Test Whether U.S. Can Reverse Semiconductor Exodus

The Wall Street Journal
| February 22, 2023

The Wall Street Journal referred to a policy brief by CSET's Will Hunt, which analyzes how incentives from the CHIPS Act should be allocated among various types of chips. The article discusses the $53 billion CHIPS Act program, which aims to revive the domestic semiconductor industry by subsidizing manufacturing and funding research and development.

BBC News cited a report authored by CSET's Josh Goldstein, Micah Musser, and CSET alumna Katerina Sedova in partnership with OpenAI and Stanford Internet Observatory. Alongside the report, BBC News quoted Josh Goldstein regarding the current status of AI systems.

WIRED highlighted CSET Research Analyst Micah Musser in an article that references a report published by CSET in collaboration with OpenAI and Stanford Internet Observatory. The report examines how language models could be misused in future influence operations and offers a framework for evaluating potential countermeasures.

Disinformation Researchers Raise Alarms About A.I. Chatbots

The New York Times
| February 8, 2023

A report by CSET's Josh Goldstein, Micah Musser, and CSET alumna Katerina Sedova in collaboration with OpenAI and Stanford Internet Observatory was cited in an article published by The New York Times about the potential dangers of AI-powered chatbots.

Forbes referred to a report authored by CSET alumni Diana Gehlhaus and Santiago Mutis. The report delves into the domestic AI workforce, providing an initial evaluation of its makeup, size, and essential features.

A report by CSET's Josh Goldstein, Micah Musser, and CSET alumna Katerina Sedova in collaboration with OpenAI and Stanford Internet Observatory was cited in an article published by Grid. The report examines how language models could be misused in future influence operations and proposes a framework for evaluating possible countermeasures.

A report by CSET’s Josh Goldstein, Micah Musser, and CSET alumna Katerina Sedova in collaboration with OpenAI and Stanford Internet Observatory was cited in an article published on Medium. The report explores how language models could be misused for influence operations in the future, and it provides a framework for assessing potential mitigation strategies.

In an interview with CyberScoop, Research Fellow Josh A. Goldstein discussed his research, conducted in collaboration with OpenAI and Stanford Internet Observatory, on the use of large language models to deploy propaganda.

Compute Accounting Principles Can Help Reduce AI Risks

Tech Policy Press
| November 30, 2022

In an opinion piece for Tech Policy Press, CSET's Krystal Jackson, Karson Elmgren, Jacob Feldgoise, and their coauthor Andrew Critch wrote about computational power as a key factor driving AI progress.