Recent advances in the field of artificial intelligence (AI) have put the world on edge. AI experts, policymakers, journalists, and others are expressing concern about the potential existential risks that AI may pose, including risks related to large language models (LLMs) such as ChatGPT and GPT-4. Experts across the policy and technology community who believe in the urgency of action, including OpenAI’s Sam Altman and Senator Chuck Schumer, have called for regulation and controls on AI development and proliferation. Others in the broader tech policy community have expressed concerns about the existential risk posed by AI systems, even going as far as to argue that “mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war.”
The nearer-term concern, however, is military applications. LLMs and other AI-enabled systems and capabilities are likely to become militarily applicable soon, if they are not already. More specifically, these capabilities could soon be deployed on the battlefield by hostile nations and actors against U.S. and allied forces, or used for bioterrorism or the development of weapons of mass destruction (WMDs). Regardless of which timescale you are most concerned about, existing U.S. government tools and approaches may help mitigate some of the issues worrying observers. We believe that the long-standing dual-use “catch-all” controls, administered by the Department of Commerce’s Bureau of Industry and Security (BIS), can be used to address some of these threats.
Today, existing controls would already make it difficult for U.S. or foreign persons to use U.S.-origin LLMs or other AI systems to help other countries or entities develop weapons. For more than 30 years, the Export Administration Regulations (EAR) have prohibited exports, reexports, and transfers by U.S. and foreign persons of any type of U.S.-origin commodity, software, or technology if there is knowledge that the activity would support the development, production, or use of a missile, chemical or biological weapon, or nuclear weapon.
The EAR also prohibit U.S. persons from providing support for the development, production, or use of these WMDs — even when all the underlying commodities, technologies, and software are not U.S.-origin or otherwise subject to export controls. It is important to remember that “U.S. persons” in this context refers to more than just people—it can also refer to U.S. companies.1
“Catch-All” Controls
Most policy discussions of late have focused on two aspects of export controls:
- item-based controls that impose license requirements on specific items on a list, or
- end-user controls, including the Entity List, which imposes restrictions on shipments to specific entities.
For instance, export control classification number (ECCN) 4A004 is a list-based control on certain types of computers, including “neural computers” — computing devices designed to mimic the behavior of neurons, learning by adjusting the weights and interconnections among their computational components based on prior data.
With respect to AI-related end-user controls, BIS has added entities such as Chinese chip firm Cambricon to the Entity List after the firm allegedly attempted “to acquire U.S.-origin items in support of China’s military modernization.” BIS pointed out that Cambricon (and its listed subsidiaries and affiliates) is a “major AI chip research and development company whose manufacturing and sales entities are, or have close ties to, government organizations that support the Chinese military and the defense industry.”2
There is, however, a third leg of the export control compliance stool that is well known to compliance professionals but less discussed in think tanks or the media—the end-use “catch-all” controls. These controls were created in the early 1990s to regulate the export, reexport, and transfer of widely available items that were nonetheless useful in the development, production, or use of missiles, chemical and biological weapons, or nuclear weapons. They “catch all” exports of unlisted items destined for these types of end uses.
Specifically, EAR sections 744.2, 744.3, and 744.4 prohibit any U.S. or foreign person from exporting, without a license, any type of commodity, software, or technology that is U.S.-origin (or in some cases, foreign-origin) to a foreign country if there is “knowledge” that the item will be used directly or indirectly in the development, production, or use of:
- Nuclear energy or explosives;
- Rocket systems, missiles, or certain unmanned aerial vehicles; or
- Chemical or biological weapons.
The EAR’s definition of “knowledge” in this context is not limited to situations in which there is positive knowledge that the item in question will be used for one of these end uses. It also applies when there is a “high probability” that an export will be for one of these end uses. An exporter that deliberately avoids learning how an item will be used will still be deemed to have had such knowledge if the export is, in fact, for one of these end uses. Even when someone does not have “knowledge,” as defined, of a prohibited end use, the EAR’s “Know Your Customer” Guidance effectively obligates people to determine whether there are any abnormal circumstances in a transaction indicating that a prohibited end use (or end user or destination) might be involved. If so, the exporter is obligated to resolve its suspicions — or “red flags” — before proceeding with the export.
In situations where an exporter does not know whether an export could involve a prohibited end-use, BIS has the authority to inform any U.S. or foreign person that a transaction could involve a prohibited end-use or end-user. Once so informed, whether orally or in writing, the person may not provide the items at issue — even though the items would otherwise not require a license to ship.
What could this look like for AI? If a hypothetical foreign company were planning to export a U.S.-origin AI model (technology or software) to another entity overseas, and BIS believed there was a risk that the model could be used for, as an example, the development of a missile, then BIS would have the authority to inform the foreign company that its planned actions involving otherwise uncontrolled technology and software require a BIS license, which would likely be denied.
“U.S. Person” Controls
In addition to these “catch-all” controls, the EAR (specifically, Section 744.6) also impose controls over activities of “U.S. persons” wherever they are located, if the activities are in support of the development, production, or use of:
- Nuclear energy or explosives;
- Rocket systems, missiles, or certain unmanned aerial vehicles; or
- Chemical or biological weapons.
Thus, for example, if a U.S. person wanted to use an AI model in the development of a novel biological weapon anywhere in the world, that activity would already be prohibited by EAR section 744.6(b)(3). Failure to abide by this prohibition could result in significant civil and criminal penalties for the U.S. person — whether an individual citizen or a U.S. company.
The EAR’s definition of “support” is extremely broad and encompasses not only exporting an item but also “facilitating” (i.e., making easier) any shipments or performing any contracts or services that may assist or benefit the prohibited end use in any way. In the context of AI, for instance, this means that if a “U.S. person” were found to have “supported” the development, production, or use of a bioweapon using an AI model (regardless of the country of origin of that model), that person would be in violation of U.S. export controls.
Similar to the “catch-all” controls, these “U.S. person” controls also give BIS the authority to inform U.S. persons — whether directly or through regulation — that a particular type of activity “could involve” an end use of concern. Indeed, it was under this “could involve” authority that, in October 2022, BIS imposed controls on the activities of U.S. persons if they were in support of the development or production in China of:
- semiconductors at facilities that develop or produce advanced node semiconductors, or
- specific equipment used to produce them.
BIS described these new controls as necessary to prevent companies in China from using advanced node semiconductors in advanced computing items and supercomputers that could be used, among other things, to “enhance data processing and analysis capabilities, including through AI applications.”
The EAR’s “U.S. person” controls apply even when all the underlying software or technology is the result of fundamental research, is wholly foreign, is open source or published research, or is otherwise uncontrolled and widely available. These controls do not create any First Amendment “prior restraint” issues because they apply to the activities of the “U.S. persons,” not to the information or software they release to the world. In other words, the item in question is not being controlled; rather, the activities of the “U.S. person” using that item would be the controlled event.
In enacting these controls, BIS has put the act of U.S. persons supporting the development or production of advanced node semiconductors and related equipment on the same legal and policy footing as a U.S. person’s support for the development or production of nuclear weapons, chemical or biological weapons, or missiles. Whether these two issues should be brought under the same “could involve” authority umbrella is a critical topic of debate that we do not address here. Nevertheless, EAR section 744.6 already contains a regulatory structure for doing so once a decision is made about what should be controlled.
Other Considerations
In addition to these U.S. person controls, in December 2022, Congress (in section 5589(b) of the National Defense Authorization Act) gave BIS additional authorities to impose controls over “U.S. person” activities that would support not only military-intelligence end uses in countries of concern (such as China and Russia), but also any type of foreign “military, security, or intelligence service.” This change is notable because the statute previously referred only to “military-intelligence” end uses and end users, which limited the scope to the intelligence arms of military entities. This new authority should be of interest to those thinking about how to address AI-related concerns because it can be applied even when the underlying AI-related technology or software is open source or otherwise so widely available that it cannot be controlled. Using this authority to control the activities of U.S. persons could be an option for U.S. policymakers if and when they are unable to define, in list-based controls, clear and enforceable technical thresholds for AI systems of concern.
The Biden-Harris Administration has not stated how or when it plans to implement this new authority. It will likely be useful, however, in controlling U.S. persons who use AI technologies and applications, regardless of their source, to support the intelligence and surveillance activities of foreign governments of concern.
The State Department’s International Traffic in Arms Regulations (ITAR) also regulate the provision of “defense services” to foreign persons. These controls prohibit, among other things, any U.S. person from providing, without authorization, any assistance or training to a foreign person in developing, producing, or using a defense article, i.e., an item described on the U.S. Munitions List. Thus, if a U.S. person were to use an AI system to help a foreign person develop a better weapons system, that act — even if the underlying AI technology were not controlled — would already be prohibited absent an authorization. Violating these controls can also result in significant civil or criminal penalties.
Limitations of U.S. Export Controls Authorities
There are many activities to which the EAR and ITAR controls and authorities do not, or could not, apply, such as those:
- involving the development or production of AI or other unlisted applications without knowledge or information from BIS that the activities could support the development, production, or use of a WMD;
- by foreign persons outside the United States when the underlying foreign-made items are not or could not be subject to U.S. regulatory jurisdiction; and
- that occur completely within the United States when no foreign persons are involved.
Conclusion
Although most experts agree that AI has not yet reached its full potential, ensuring that legal authorities exist to handle these emerging threats is in our best interest. Fortunately, in many cases, the U.S. government may already have the authorities necessary to address the use of AI in more traditional national security realms. Taking a step back to determine whether our existing laws are sufficient will save time in the long run and help prevent regulatory overreach that could harm the competitiveness of U.S. technology and innovation.
- According to EAR § 772.1, a “U.S. person” includes: 1) any individual who is a citizen of the United States, a permanent resident alien of the United States, or a protected individual as defined by 8 U.S.C. 1324b(a)(3); 2) any juridical person organized under the laws of the United States or any jurisdiction within the United States, including foreign branches; and 3) any person in the United States. Foreign persons outside the United States are also affected by U.S. person controls if they are acting for and on behalf of a U.S. person.
- As evidence of how aggressively BIS is controlling activities that could support military modernization in China using AI-related items, BIS designated Cambricon and other listed companies involved in AI-related development as “Footnote 4” entities. This is a somewhat novel Entity List designation that BIS created in October 2022 to subject even wholly foreign-made, otherwise uncontrolled commercial items without any U.S.-origin content to U.S. export controls if they are developed or produced with U.S. technology or equipment and shipped from outside the United States to anyone when a Footnote 4 entity is involved in the transaction. This authority is referred to as the Footnote 4 Entity List Foreign Direct Product rule, which is far more complicated in application than this summary suggests.