A public policy blog from AEI
When you buy software, how do you know if it is secure? If you are a normal user, you have only its past track record and published reports to go on. But if you are a government or other large customer, you can do a bit better: You can demand that the vendor show you their source code before buying. (Source code is like the blueprints for a computer program; it is the set of directions used to produce a runnable piece of software.)
Reuters recently reported that HP Enterprise (HPE) allowed a Russian government contractor to conduct such an audit of ArcSight, a tool for large-scale data analysis used, for instance, in intrusion detection. This review caused some concern, since the US government also relies on ArcSight. Is our security endangered by this review? What should we do about it? “Code review” is an ambiguous term, and there is less here than one might infer from the headline. The review was conducted over a few hours, on HPE premises, and the reviewers were not allowed to take the code home.
Reviews of that sort are useful and relatively safe. Looking casually through source code will reveal some security-relevant properties: whether the code was designed carefully, the overall level of polish and maturity, and whether it follows particularly risky or safe coding practices. But if the product is any good, it will not be possible to find unobvious security flaws through this sort of short-term inspection. (If a foreign consultant could find a security problem within a few hours, without knowing anything about the internal design beforehand, why couldn’t the company find the same problem itself?)
If a product has serious security weaknesses, keeping the source code secret won’t prevent malicious parties from finding the vulnerabilities. You can learn a lot about a car by reading the blueprints — but you can also learn a lot by opening the hood and disassembling the thing. Similarly, it is routine to reverse engineer software from the compiled artifact and to look for vulnerabilities even without the source code. If we would be endangered by a foreign intelligence agency finding vulnerabilities in a computer program, we shouldn’t allow commercial sale of the program at all. Once we allow copies to be sold to private parties, we should take for granted that foreign governments will have copies and be able to look for vulnerabilities.
To the extent that some software merits special protections, such measures can be imposed without new legislation. We already have several suitable categories. For example, some kinds of software are covered under the State Department’s International Traffic in Arms Regulations, and those rules restrict exactly this sort of transfer of computer code to foreign persons and organizations.
But we should hesitate before imposing such measures. The United States is probably a net exporter of software, but both the government and the private sector buy considerable amounts of software from overseas. An international norm in favor of source-code inspections would likewise let us vet that foreign software before buying it, protecting us from our own ill-advised acquisitions.
Companies are traditionally uncomfortable letting outsiders look at their source code, but there are times when it is appropriate, even with an untrusted reviewer. If the review is structured carefully, the value of the increased trust exceeds the risk of the disclosure. From published accounts, the review of ArcSight on behalf of the Russian government was indeed conducted responsibly and should not give policymakers cause for alarm.
1789 Massachusetts Avenue, NW, Washington, DC 20036
© 2017 American Enterprise Institute