Digitally manufactured weapons: can they be controlled?
In previous articles on this blog, Vincent Müller and Wouter Tebbens have discussed the threats posed by dangerous information available online, particularly information that can then be used by digitally driven devices to make potentially dangerous artefacts. One striking example is the so-called “Ghost Gunner”, a CNC mill used to manufacture firearm parts. The resulting firearm has no serial number and is therefore untraceable. As Müller points out, it can circumvent existing forms of gun control. (Another example is 3D-printed plastic guns like the "Liberator" pistol, as pictured above.)
Here I will assume with both Müller and Tebbens that some form of gun control is desirable, among other things to forestall the sort of tragedies we regularly witness across the Atlantic. (Some will of course object that gun control is not the best way of avoiding such tragedies; that is a debate for another time.) Both also seem rather pessimistic about the prospects for successfully preserving such control in the age of digitally manufactured weapons. Tebbens doubts that there is any efficient way of restricting access to the information required to make such weapons (i.e. gun designs). He does, however, suggest some alternative avenues. Here I want to endorse one of Tebbens’s suggestions, namely that we might focus on controlling some of the basic materials needed to make the final product. Before doing so, however, I propose to take a closer look at some alternative solutions, including controlling access to dangerous information.
To justify his skepticism about the feasibility of this latter form of control, Tebbens points to the war on illegal filesharing: while individual sites hosting copyright-infringing files have been taken down, such files have regularly migrated to new sites, like The Pirate Bay, which despite many attempts to take it down has proved surprisingly resilient. Or they might simply be concealed in the hidden part of the web dubbed the “Dark Web”, access to which requires special software. Does this make any attempt at regulating access to such data completely pointless? I am not so sure. Consider the fight against child pornography. Files of that nature have occasionally been uploaded to sites like The Pirate Bay; however, when this has happened, they have been taken down, despite the site’s official policy of “zero censorship”. And even though most such files are now shared via the Dark Web, some spectacular crackdowns have nevertheless been conducted by law enforcement agencies in this context as well. Admittedly, such interventions haven’t fully eradicated the production and distribution of child pornography, yet it does seem reasonable to assume that they have had a deterrent effect on some potential offenders. Why couldn’t similar measures, in principle at least, be applied to the online sharing of gun designs? No doubt, this may not be an option in the present context: whereas the harms to the victims of child pornography are all too real and beyond dispute, mass shootings or accidental deaths involving the use of digitally manufactured guns are (luckily) still a matter of speculation. But if this threat were to become a reality, the social consequences might warrant (and be viewed as warranting) such drastic measures, with the police resources they require. (True, the viability of this method would partly depend on the total number of people illicitly sharing the files.)
Assuming a person has already gotten hold of a gun design without proper authorization, an alternative option would be to build a safety mechanism into each digitally driven device (e.g. a 3D printer or a CNC mill) that would prevent it from making a dangerous artefact upon identifying the design for that artefact. The Danish company Create It REAL, for instance, claims to have developed software that can stop the 3D printing of a gun. This method is not without its shortcomings: besides the fact that the software in question could be hacked and rendered inoperative (which would only be a fatal problem if it were relatively easy to do so), digitally manufactured guns are often assembled from a number of smaller parts. The software we are considering would thus need to identify designs for those constituent parts and prevent them from being made. Whether this would be feasible depends on whether those parts possess the characteristic of versatility. By something’s being versatile, I mean that it can fulfill several functions, not just that of firearm part, as a result of which it cannot automatically be assumed that a person digitally manufacturing that component is planning to make a gun; they could be making something very different, say a plastic container for food. If a component is versatile in this sense, preventing it from being manufactured in the absence of proper authorization would be equivalent to requiring a gun license to use a 3D printer or CNC mill at all – a rather extreme measure. (For similar reasons, it wouldn’t be appropriate simply to ban devices like the Ghost Gunner, which its manufacturer describes as a “general purpose CNC mill” that “anyone can program anything for”.) But if a digitally manufactured gun part is not versatile, and can be identified as such from its design, then the method just outlined might work.
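To make the idea of such a safety mechanism concrete, here is a minimal sketch of how a printer's firmware might check an uploaded design against a blocklist of known restricted designs. Everything here is an illustrative assumption: Create It REAL has not published how its software works, and all function names and the blocklist are hypothetical. Note too the sketch's obvious weakness, which mirrors the worries in the text: an exact-hash check is defeated by trivially rotating or rescaling the model, so a real system would need robust geometric matching of the part itself.

```python
# Hypothetical sketch of a design-blocklist check in 3D-printer control
# software. All names are illustrative; no real vendor's method is implied.

import hashlib

# Illustrative blocklist of SHA-256 digests of known restricted design files.
RESTRICTED_DESIGN_HASHES = {
    hashlib.sha256(b"example restricted design").hexdigest(),  # placeholder entry
}

def is_restricted(design_bytes: bytes) -> bool:
    """Return True if the uploaded design file matches a known restricted design."""
    digest = hashlib.sha256(design_bytes).hexdigest()
    return digest in RESTRICTED_DESIGN_HASHES

def start_print(design_bytes: bytes) -> str:
    """Refuse designs on the blocklist; otherwise proceed with the print job."""
    if is_restricted(design_bytes):
        return "refused: restricted design"
    return "printing"
```

Even this toy version illustrates the versatility problem from the text: the check can only refuse designs it can positively identify as firearm parts, and a versatile component that doubles as, say, a food container would either slip through or force the system to refuse legitimate jobs.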
In light of those uncertainties, it seems that we do need one of the options suggested by Tebbens (perhaps in conjunction with the alternatives just sketched): namely, we should seek to impose appropriate controls over some of the materials used to produce dangerous artefacts, materials that cannot themselves be digitally produced. In the specific context of digitally made guns, the best candidate for such a component would be the ammunition – and indeed, calls to shift our focus to ammunition control have already been made (e.g. see here and here). While bullets can actually be 3D printed, gunpowder is still needed, and it cannot be digitally manufactured, hence its promise as a potential target for regulation.
It would be unfortunate if innocent people had to die before the need to regulate the digital manufacturing of dangerous weapons were generally acknowledged. Should we come to that point, however, the considerations above suggest that at least some courses of action are open to us to address the problem – we need not stand powerless as our societies turn into the Wild West.
(I thank Vincent Müller for discussion that helped shape some of the ideas defended in this entry.)
Alexandre Erler http://www.didiy.eu/people/alexandre-erler