Worth Knowing
Meta’s Language Models — Released to Researchers — Leak Online: On February 24, Meta released a family of language models — dubbed LLaMA (Large Language Model Meta AI) — to the broader research community. The performance of the models, which range in size from 7 billion to 65 billion parameters, is solid but not groundbreaking — in an accompanying paper, Meta researchers say the 13B-parameter model outperforms OpenAI’s GPT-3 on “most benchmarks,” while the 65B-parameter model is “competitive with the best models,” such as DeepMind’s Chinchilla and Google’s PaLM. The bigger story is availability: while those comparable models are either inaccessible to the public or only accessible through their developers’ APIs, Meta released both the source code and the model weights to researchers from academia and industry on a case-by-case basis. Much as with last year’s semi-public release of the less-sophisticated OPT-175B, Meta said the relatively unrestricted LLaMA release was part of its “commitment to open science” and expressed hope that it would help researchers identify and mitigate the problems of large language models. The restrictions that did exist were quickly rendered moot, however, as the model weights were (predictably) leaked on 4chan soon after the release — making LLaMA, in effect, the most powerful language model available “in the wild.” Meta has defended its release strategy, but it is worth wondering whether the company will continue to be so open with new models in the future. CEO Mark Zuckerberg announced plans last week to create a “top-level product group” focused on generative AI. It remains to be seen whether Meta’s “commitment to open science” can coexist with a more AI-dependent business strategy.
- More: Meta’s powerful AI language model has leaked online — what happens now? | Forecasting Potential Misuses of Language Models for Disinformation Campaigns — and How to Reduce Risk
Expert Take: “The evaluation that the city of Rotterdam produced of its own model’s performance is one of the worst I’ve ever seen, and shows a model that should never have been trusted to make impactful decisions about individuals. What’s interesting about both this article and the 2016 ProPublica report is the decision to frame these models’ problems as problems of bias — which invites complicated statistical and philosophical arguments about fairness — instead of as an even worse failure to function with meaningful accuracy in the first place.” — Micah Musser, Research Analyst
- More: Suspicion Machines | Dutch scandal serves as a warning for Europe over risks of using algorithms
Government Updates
The Commerce Department Opens Up Applications for CHIPS Funds: Last week, the Commerce Department announced the first funding opportunity for CHIPS for America semiconductor manufacturing incentives — a key step toward disbursing the $39 billion in incentives provided by the CHIPS and Science Act of 2022. The funding opportunity specifically seeks applications for projects related to the construction, expansion, or modernization of U.S. facilities that produce “leading-edge, current-generation, and mature-node” chips (the full Notice of Funding Opportunity details the differences between each type of chip). Future funding opportunities — which the Commerce Department said would be announced later this year — will solicit applications for facilities related to R&D and semiconductor materials and equipment. In a speech by Commerce Secretary Gina Raimondo at Georgetown’s Walsh School of Foreign Service (with a subsequent question-and-answer session moderated by CSET’s Emily Weinstein) and in a “Vision for Success” document that accompanied last week’s announcement, the Commerce Department laid out the “cross-cutting themes” guiding its implementation plan and the specific objectives the CHIPS Program Office has targeted. The former include “catalyzing private investment,” “building a skilled and diverse workforce,” and “enforcing guardrails” (such as blocking stock buybacks), while the latter include plans to have two or more “new large-scale clusters of leading-edge logic fabs,” two or more “high-volume advanced packaging facilities,” high-volume DRAM production, and increased current-generation and mature-node chip production in the United States by the end of the decade. The Commerce Department will begin accepting full applications for leading-edge facilities on March 31 and for current-generation, mature-node, and back-end facilities on June 26.
The FTC Warns Companies About Overselling Their AI Products: In an item posted on its blog last week, the Federal Trade Commission issued new guidance warning businesses to “keep [their] AI claims in check.” AI hype cycles are nothing new, but the current boom — fueled in large part by generative AI systems like those powering OpenAI’s ChatGPT and the image-generating Stable Diffusion — has sparked what some observers are calling “a new gold rush.” Officials from the FTC’s Division of Advertising Practices seem concerned that marketers might oversell their products’ capabilities in their rush to cash in on the AI hype. The agency’s new guidance outlines four questions the FTC will look to answer while scrutinizing potentially misleading AI marketing claims: whether companies are exaggerating their AI products’ capabilities, whether they are promising (without sufficient evidence) that their products perform better than a non-AI competitor, whether they have made an effort to address the “reasonably foreseeable risks and impact” of putting their AI products on the market, and whether the product even uses AI. This isn’t the first time the FTC has weighed in on issues concerning AI — in 2021, the agency warned companies about the potential legal ramifications of deploying biased AI systems — and it is unlikely to be the last. As we covered last month, the FTC launched a technology office to support its tech-related enforcement efforts, including those related to AI.
The Air Force Wants to Ramp Up Its “Loyal Wingman” Program in 2024: President Biden’s FY2024 budget request — which he is set to unveil later today in Philadelphia — is expected to include plans for a “significant investment” in collaborative combat aircraft for the Air Force. In a speech earlier this week, Air Force Secretary Frank Kendall said the service plans to eventually deploy approximately 1,000 CCAs — uncrewed “loyal wingman” aircraft that work together with planes piloted by humans to carry out elements of a mission — though he later said in comments to the press that the number was a “reasonable starting point” and could go even higher. In previous comments, Kendall has said that as many as five CCAs could ultimately be assigned to a crewed aircraft, but the current plan calls for fewer: two each for 300 F-35s and 200 still-in-development “Next Generation Air Dominance” fighters. Kendall added that the CCA-related request would fund development of the program and “not the ultimate CCA.” Exact details will almost certainly be classified, but last year Kendall said that part of the budget request would go toward funding a 2024 competition to choose the best-suited CCA.
In Translation
CSET’s translations of significant foreign language documents on AI
PRC ML Security Standard: Information Security Technology-Security Specification and Assessment Methods for Machine Learning Algorithms. This is a draft Chinese national standard for the security of machine learning (ML) algorithms. Drafted by Chinese government entities and tech companies, the standard provides security guidelines for developers of ML models and procedures for assessing ML algorithms’ susceptibility to cyberattacks.
South Korean Industrial Technology Protection Law: Act on Prevention of Divulgence and Protection of Industrial Technology. This document is South Korea’s industrial technology protection law, as amended in January 2023. The law aims to prevent technologies vital to South Korean national security or economic competitiveness from being divulged to or shared with foreign countries or corporations without the government’s knowledge.
If you have a foreign-language document related to security and emerging technologies that you’d like translated into English, CSET may be able to help! Click here for details.
What’s New at CSET
REPORTS
- Examining Singapore’s AI Progress by Kayla Goode, Heeu Millie Kim and Melissa Deng
- Reducing the Risks of Artificial Intelligence for Military Decision Advantage by Wyatt Hoffman and Heeu Millie Kim
PUBLICATIONS, PODCASTS AND MORE
- CSET: Data Snapshot: Analyzing the Directionality of Citations in the Map of Science by Autumn Toney and Melissa Flagg
- CSET and the U.S. Mission to NATO: Twitter Space: The Challenges of China’s Tech Advances to Transatlantic Security featuring Emily Weinstein and Lynne Weil
- ABROADcast: Another Player On the Field – Technology Companies and Conflict featuring Emelia Probasco
TESTIMONY
- U.S.-China Economic and Security Review Commission: On February 24, Director of Biotechnology Programs and Senior Fellow Anna Puglisi and Research Analyst Dahlia Peterson testified before the U.S.-China Economic and Security Review Commission at a hearing on “China’s Challenges and Capabilities in Educating and Training the Next Generation Workforce.” Read Puglisi’s testimony for the panel on “Advancing Growth, Knowledge, and Innovation through Higher Education,” read Peterson’s testimony for the panel on “The Role of Education in Promoting China’s Strategic and Emerging Industries,” and watch the full hearing.
EVENT RECAPS
- On February 23, CSET and Georgetown’s Walsh School of Foreign Service hosted U.S. Secretary of Commerce Gina Raimondo for an event at which she delivered a speech on “The CHIPS Act and a Long-term Vision for America’s Technological Leadership” and participated in a moderated conversation with CSET Research Fellow Emily Weinstein. Watch a full recording of the event.
IN THE NEWS
- The Wall Street Journal: In an article about the CHIPS and Science Act’s potential to coax chipmakers back to the United States, Yuka Hayashi cited key numbers from Will Hunt’s 2022 brief, Sustaining U.S. Competitiveness in Semiconductor Manufacturing: Priorities for CHIPS Act Incentives.
- The Wall Street Journal: In an op-ed about the House Select China Committee, Adam White cited two CSET papers: John VerWey’s 2021 brief No Permits, No Fabs: The Importance of Regulatory Reform for Semiconductor Manufacturing and Owen Daniels and Will Hunt’s 2022 primer, Sustaining and Growing the U.S. Semiconductor Advantage.
- The Wire China: For an article about venture capital investment into China, Eliot Chen cited Emily Weinstein and Ngor Luong’s recent brief, U.S. Outbound Investment into Chinese AI Companies.
- Roll Call: Gopal Ratnam cited Weinstein and Luong’s brief in a story on Treasury Department efforts to regulate such investment flows, interviewing the two authors about the ins and outs of the unique data resources they relied upon.
- The National Interest: An Eric Mandel and Sarit Zehavi piece on open-source intelligence cited Lead Analyst William Hannas’ December 2022 comments to The Wall Street Journal about China’s OSINT activities.
- GovInfoSecurity: Matthew J. Schwartz quoted CSET Non-Resident Research Fellow Chris Rohlf in an analysis of the recently announced U.S. government national cybersecurity strategy.
- South China Morning Post: Coco Feng spoke with Research Analysts Dahlia Peterson and Hanna Dohmen for a piece with Ben Jiang about China’s struggles to develop a ChatGPT equivalent.
What We’re Reading
Paper: High-Resolution Image Reconstruction With Latent Diffusion Models From Human Brain Activity, Yu Takagi and Shinji Nishimoto (December 2022)
Article: Special Operations Forces Require Greater Proficiency in Artificial Intelligence, Kelley Jhong, War on the Rocks (February 2023)
Article: Algorithmic Black Swans, Noam Kolt, Washington University Law Review (February 2023)
Upcoming Events
- March 30: CSET Event, Betting the House: Strengthening the Full Microelectronics Supply Chain, featuring John VerWey and In-Q-Tel’s Yan Zheng
What else is going on? Suggest stories, documents to translate & upcoming events here.