It was hard to believe, but the student insisted it was true. He had discovered that compact discs from a major record company, Sony BMG, were installing dangerous software on people’s computers, without notice. The graduate student, Alex Halderman (now a professor at the University of Michigan), was a wizard in the lab. As experienced computer security researchers, Alex and I knew what we should do: First, go back to the lab and triple-check everything. Second, warn the public.
But by this point, in 2005, the real second step was to call a lawyer. Security research was increasingly becoming a legal minefield, and we wanted to make sure we wouldn’t run afoul of the Digital Millennium Copyright Act. We weren’t afraid that our research results were wrong. What scared us was having to admit in public that we had done the research at all.
Meanwhile, hundreds of thousands of people were inserting tainted music CDs into their computers and receiving spyware. In fact, the CDs went beyond installing unauthorized software on the user’s computer. They also installed a “rootkit”—they modified the Windows operating system to create an invisible area that couldn’t be detected by ordinary measures, and in many cases couldn’t be discovered even by virus checkers. The unwanted CD software installed itself in the invisible area, but the rootkit also provided a safe harbor for any other virus that wanted to exploit it. Needless to say, this was a big security problem for users. Our professional code told us that we had to warn them immediately. But our experience with the law told us to wait.
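To give a concrete sense of how this kind of cloaking is usually uncovered, here is a minimal, purely illustrative sketch of a "cross-view" check: you compare the file names the running (possibly hooked) operating system reports against a listing of the same directory obtained out of band, for instance from a clean boot environment, and flag anything the running system omits. This is not the tool we used, and the file names and listings below are hypothetical.

    # Illustrative "cross-view diff": compare what the running (possibly
    # compromised) system reports against an out-of-band listing of the
    # same directory. Anything the running system omits but the raw
    # listing contains may be cloaked by a rootkit.
    # The listings and names below are made up for illustration.

    def cross_view_diff(api_listing, out_of_band_listing):
        """Return names visible out of band but missing from the API view."""
        return sorted(set(out_of_band_listing) - set(api_listing))

    if __name__ == "__main__":
        # What the running system reports:
        api_view = ["config.dat", "player.exe"]
        # What a listing taken from a clean boot environment reports:
        raw_view = ["config.dat", "player.exe", "hidden_agent.dll"]

        hidden = cross_view_diff(api_view, raw_view)
        if hidden:
            print("Possibly cloaked files:", ", ".join(hidden))

A mismatch between the two views does not prove infection on its own, but it is exactly the kind of discrepancy that points an investigator at an "invisible area" like the one these CDs created.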
The law that we feared, the DMCA, was passed in 1998 but has been back in the news lately because it prohibits unlocking cellphones and interferes with access by people with disabilities. But its impact on research has been just as dramatic. Security researchers have long studied consumer technologies, to understand how they work, how they can fail, and how users can protect themselves from malfunctions and security flaws. This research benefits the public by making complex technologies more transparent. At the same time, it teaches the technology community how to design better, safer products in the future. These benefits depend on researchers being free to dissect products and talk about what they find.
We were worried about the part of the DMCA called 17 U.S.C. § 1201(a)(1), which says that “No person shall circumvent a technological measure that effectively controls access to a work protected under [copyright law].” We had to disable the rootkit to detect what it was hiding, and we had to partially disable the software to figure out what it was doing. An angry record company might call either of those steps an act of circumvention, landing us in court. Instead of talking to the public, we talked to our lawyer.
This wasn’t the first time the DMCA had interfered with my security research. Back in 2001, my colleagues and I had had to withdraw a peer-reviewed paper about CD copy protection, because the Recording Industry Association of America and others were threatening legal action, claiming that our paper was a “circumvention technology” in violation of another section of the DMCA. We later sued for the right to publish these results, and four months afterward we published them. We had won, but we had also learned firsthand about the uncertainty and chaos that legal threats can cause. I was impressed that some of my colleagues had been willing to risk their jobs for our work, but none of us wanted to relive the experience.
Alex had dealt with his own previous DMCA threat, although this one was more comical than frightening. After he revealed that a CD copy protection product from a company called SunnComm could be defeated by holding down the computer’s Shift key while inserting the disc, the company had threatened him with DMCA action. Given the colorful history of the company—it had started corporate life as a booking agency for Elvis impersonators—and the company’s subsequent backtracking from the threat, we weren’t too worried about being sued. Nevertheless, it showed that the DMCA had become a go-to strategy for companies facing embarrassing revelations about their products.
What was Congress thinking when it passed this part of the DMCA? The act was meant to update copyright law for the 21st century, to shore up the shaky technologies that tried to stop people from copying music and movies. But the resulting law was too broad, ensnaring legitimate research activities.
The research community saw this problem coming and repeatedly asked Congress to amend the bill that would become the DMCA, to create an effective safe harbor for research. There was a letter to Congress from 50 security researchers (including me), another from the heads of major scientific societies, and a third from the leading professional society for computer scientists. But with so much at stake in the act for so many major interests, our voice wasn’t heard. As they say in Washington, we didn’t have a seat at the table.
Congress did give us a research exemption, but it was so narrowly defined as to be all but useless. (So perhaps we did have a seat—at the kids’ table.) I’ll spare you the details, but basically, there is a 116-word section of the Act titled “Permissible Acts of Encryption Research,” and it appears to have been written without consulting any researchers. There may be someone, somewhere, who has benefited from this exemption, but it fails to protect almost all of the relevant research. It didn’t protect Alex and me, because we were investigating spyware that didn’t rely on the mathematical operations involved in encryption.