Work at CSET
We’re hiring! If you’re interested in joining our team, check out the positions in the “Job Openings” section below or consult our careers page.
Worth Knowing
Chip Giants Propose a Way to Speed Up AI Training: Last week, researchers from Nvidia, Arm and Intel published a whitepaper detailing a proposed computer number format they say will offer significant improvements in training efficiency without major loss of accuracy. Historically, most AI training required allocating 32 bits of memory to store each model parameter with a high degree of precision. Many AI developers have since switched to using 16 bits per parameter, which reduces memory requirements and makes model training less computationally costly. Some researchers have experimented with reducing precision even further, down to as few as 8 bits, though this risks degrading accuracy. The new whitepaper presents an 8-bit floating-point (FP8) specification for deep learning that Nvidia, Arm and Intel claim avoids significant loss in accuracy relative to FP16 approaches. The proposed format, which could allow larger models to be trained on the same amount of compute, could help address a key issue facing AI developers: compute usage in AI is growing at a potentially unsustainable rate. Nvidia, Arm and Intel announced they had made the FP8 format open and license-free in the hope of encouraging broader industry adoption.
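The memory arithmetic behind these precision trade-offs is simple to sketch. The snippet below is a back-of-the-envelope illustration (not taken from the whitepaper), and the 175-billion-parameter figure is just a hypothetical GPT-3-scale example:

```python
# Back-of-the-envelope illustration (not from the whitepaper): memory
# required just to store a model's parameters at different precisions.
def param_memory_gib(num_params: int, bits_per_param: int) -> float:
    """Storage in GiB: params * bits / 8 gives bytes; divide by 2**30."""
    return num_params * bits_per_param / 8 / 2**30

# A hypothetical 175-billion-parameter model (roughly GPT-3 scale):
for bits in (32, 16, 8):
    print(f"FP{bits:>2}: {param_memory_gib(175_000_000_000, bits):8,.1f} GiB")
```

Each halving of the bit width halves the footprint, so an FP8 copy of the weights takes a quarter of the FP32 storage. (In real training runs, optimizer state and activations add substantially more memory on top of the parameters themselves.)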
Nvidia’s New Consumer Graphics Cards — What They Mean for AI: On Tuesday, Nvidia launched the first in its new line of 4000-series consumer graphics cards, which it says offer significant performance gains compared to its 3000-series line. Consumer GPUs helped launch the deep learning revolution — Alex Krizhevsky used two Nvidia GTX 580 GPUs to train the influential “AlexNet” convolutional neural network in 2012 — but their role has been largely usurped by data center-specific chips like Nvidia’s H100, which offer more memory bandwidth and greater networking capabilities (we covered the H100 announcement back in March). But while the biggest breakthroughs in AI will likely come from large models trained on massive clusters of H100s or equivalent data center chips, consumer cards (which are primarily marketed to video gamers) still play an important role in AI research. For researchers looking to experiment with smaller models and less complex applications, consumer GPUs can offer ample performance at a fraction of the price of data center cards (while Nvidia prohibited the use of consumer cards for “data center deployments” in 2017, it often makes exceptions for university users). Additionally, consumer GPUs will likely play a significant role in the way non-researchers engage with AI — the growing popularity of open-source AI models means that consumer GPUs can be used to fine-tune large language models, generate art locally, and even generate deepfakes. With Nvidia touting twice the performance of the previous generation’s chips, those workloads can be processed much more quickly.
- More: Nvidia Reportedly Asks TSMC to Rush Lucrative GPU Orders Before US-China Sanctions Bite | Nvidia launches new services for training large language models
Government Updates
Chinese Tech Investments — Biden Provides Additional Guidance on CFIUS: Last week, President Biden signed an executive order meant to provide additional guidance on reviewing Chinese investment in critical U.S. tech companies. The order (fact sheet available here) directs the Committee on Foreign Investment in the United States — an interagency panel tasked with reviewing foreign investments in U.S. companies — to consider the following factors in its reviews:
- A transaction’s effect on critical supply chains.
- A transaction’s effect on U.S. leadership in critical technologies, including AI, microelectronics, and biotechnology.
- A transaction’s effect in the context of broader industry investment trends.
- Investments that may create a cybersecurity risk.
- Investments that pose a risk to “U.S. persons’ sensitive data.”
NIST Partners with Google to Design Dozens of Research Chips: The National Institute of Standards and Technology announced a partnership with Google to develop and produce up to 40 new chip designs optimized for research applications. NIST will work with partners from a number of universities to develop and tailor the chip designs to suit their needs, and will make the resulting designs open-source so they can be further adapted and refined by other academic or industry researchers. The chips will be manufactured by Minnesota-based SkyWater Technology, with Google paying for the initial setup costs and subsidizing the first production run. While the chips will be manufactured using SkyWater’s relatively antiquated SKY130 node, as the Register’s Tobias Mann notes, these types of simple chips can be helpful for testing out experimental uses. Designing and manufacturing chips for such niche applications would normally be prohibitively expensive for most academic and small business researchers, but NIST hopes that the project will bring down costs significantly.
NIH to Spend $130M to Advance AI Health Research: The National Institutes of Health announced plans to spend $130 million over four years to accelerate AI adoption in biomedical and behavioral research. The awards are the first issued by the NIH Common Fund for its “Bridge to Artificial Intelligence” (Bridge2AI) program, which aims to make the biomedical and behavioral research fields AI-ready. To date, AI has had a mixed record when it comes to healthcare. In some cases, it has proved genuinely useful. But in other instances, it has been ineffective or worse, with poorly trained AI systems producing biased results that exacerbate existing inequities. Encouragingly, Bridge2AI’s first grants reflect a considered approach to the potential pitfalls of AI adoption: four of the seven grants are for data generation projects meant to produce “high-quality ethically sourced data sets” on which AI systems can be trained, and the other three are to create a Bridge Center that will develop best practices, disseminate information on the use of data and AI systems, and evaluate their application.
In Translation
CSET’s translations of significant foreign language documents on AI
Demand for AI Talent in China: Artificial Intelligence Industry Talent Development Report (2019–2020 Edition). This document, published by China’s Ministry of Industry and Information Technology in 2020 on the basis of job posting data and surveys, describes the country’s strong demand for — and insufficient supply of — artificial intelligence talent. The authors make several recommendations to increase China’s supply of AI talent and to ensure that AI training in universities prepares graduates with the practical skills valued by employers.
If you have a foreign-language document related to security and emerging technologies that you’d like translated into English, CSET may be able to help! Click here for details.
Job Openings
We’re hiring! Please apply or share the roles below with candidates in your network:
- Visual Communications Specialist: The Visual Communications Specialist will support the work of the External Affairs Team to raise the profile of CSET’s research through a variety of outreach activities such as graphics, multimedia, and publications. Apply by October 3.
- Research Fellow — AI Applications: This Research Fellow will focus on helping decision makers evaluate and translate new and emerging technologies, particularly in the field of AI, into novel capabilities by separating real trends and strategic opportunities from technological hope and hype. Rolling application — Apply today.
What’s New at CSET
PUBLICATIONS
- CSET: Data Snapshot: Upcoming Survey Work at CSET by Ronnie Kinoshita
- The Harvard Kennedy School Misinformation Review: Research Note: This Salesperson Does Not Exist: How Tactics From Political Influence Operations on Social Media Are Deployed for Commercial Lead Generation by Renée DiResta and CSET’s Josh A. Goldstein
- American Political Science Association: Director of Data Science Catherine Aiken presented a CSET working paper entitled “Persuasive AI: An Experimental Test of GPT-3’s Targeted Political Statements” with co-authors Data Scientist James Dunham and Senior Fellow Andrew Lohn at the American Political Science Association annual meeting last weekend in Montreal.
- Vox: Deputy Director of Analysis and Research Fellow Margarita Konaev spoke with Ellen Ioanes about the state of Russia’s military after seven months of war in Ukraine.
- CyberScoop: Research Fellow Emily Weinstein weighed in on the lack of language capabilities at the Commerce Department’s Bureau of Industry and Security in an article by Suzanne Smalley on BIS shortcomings in carrying out its mission.
What We’re Reading
Report: Rewire: Semiconductors and U.S. Industrial Policy by Chris Miller, Center for a New American Security (September 2022)
Report: Software-Defined Warfare: Architecting the DOD’s Transition to the Digital Age by Nand Mulchandani and Jack Shanahan, Center for Strategic & International Studies (September 2022)
Paper: Robust Speech Recognition via Large-Scale Weak Supervision by Alec Radford, Jong Wook Kim, Tao Xu, Greg Brockman, Christine McLeavey and Ilya Sutskever, OpenAI (September 2022)
Upcoming Events
- September 22: CSET Webinar, The Biotechnology Landscape: How Understanding Global Biology Research Activity Can Inform Pandemic Preparedness, featuring Amesh Adalja, Caroline Schuerger and Anna Puglisi
- September 28: Chatham House Security and Defence 2022 conference, New Frontiers of Conflict: The Role of Big Data and Artificial Intelligence in Warfare, featuring Margarita Konaev
- September 29: Andrew W. Marshall Foundation, Reconfigurations and Revolutions: Presentations of the Inaugural Andrew W. Marshall Paper Prizes, featuring Owen J. Daniels
- October 12: American Conference Institute, U.S.-China Trade Controls conference, featuring Anna Puglisi
What else is going on? Suggest stories, documents to translate & upcoming events here.