The Dangers Of USB Drives

Talking to a computer security researcher about Stuxnet is like asking an art critic to describe the finer points of the Mona Lisa. The world's top cybersecurity minds are absolutely in awe. Stuxnet, which was discovered in June and has since spread to millions of machines around the world, is the most sophisticated computer attack we've ever seen. Though its true purpose is unknown—teams of experts across the globe are poring through the code in an effort to divine its intentions—the deviousness of its design has prompted many researchers to call it a "cyber-weapon," one perhaps created by the United States or Israel to disrupt Iran's nuclear program.
Why should we think of Stuxnet as a weapon? Because it's the first digital worm known to infiltrate and secretly reprogram machines that run sensitive industrial processes—power plants, pipelines, telecommunications centers, airports, and ships. Iranian officials have said that Stuxnet infected employee computers at the country's Bushehr nuclear-power plant. Siemens, the German conglomerate, says that Stuxnet has already breached at least 14 factories running its software. (It hasn't caused any damage.) The worm, researchers say, is clearly the product of months or even years of work, perhaps by a large team with specialized knowledge about obscure industrial systems. In order to invade their targets, hackers often try to find a hidden bug—known as a "zero-day vulnerability"—in Windows or some other widely used software. Stuxnet's brilliant authors didn't find just one bug; the worm gets into Windows PCs using four distinct and previously unknown security holes. Its authors also managed to "sign" the worm with digital certificates they'd stolen from two computer companies in Taiwan. These pilfered certificates allow Stuxnet to masquerade as legitimate Windows software.
But what's most interesting about Stuxnet isn't how smart its authors were; it's how dumb they guessed we all would be. How did the worm's creators expect to get it inside some of the most secure installations in the world? After all, sensitive machines often operate behind an "air gap"—that is, their networks are physically separated from the Internet and other dangerous networks where viruses can roam freely. Getting anything inside one of these zones requires the complicity of an employee. That's exactly what Stuxnet got, because its authors designed the worm to piggyback on the perfect delivery system—the ubiquitous, innocent-looking USB flash drive, the planet's most efficient vector of viruses, worms, and other malware.
Look at some of the most spectacular computer attacks in the last few years, and you'll usually find a USB stick at the center. Conficker, the worm that corralled millions of PCs into a giant botnet last year, got into the French navy and the city of Manchester, England—among many, many other organizations—through infected USB disks. (Manchester was temporarily unable to issue parking tickets as a result.) In August, William Lynn, the deputy secretary of defense, disclosed that the U.S. military was hit by a worm called agent.btz two years ago "when an infected flash drive was inserted into a U.S. military laptop at a base in the Middle East." (Lynn says that the attack was a deliberate effort by a "foreign intelligence agency," but security experts are skeptical that it was anything more than a routine infection.) In 2008, a central computer at the Spanish airline Spanair was hit by a virus introduced through a USB drive; the malware slowed down a machine responsible for monitoring airplane failures, which an investigative report later fingered as a contributing factor in one of the deadliest air disasters in Spanish history.
What makes USB drives so great at carrying malware? They're the mosquitoes of the digital world—small, portable, and everywhere, so common as to be nearly invisible. I've got half a dozen USB disks on my desk right now, several of unknown origin—I know I purchased a couple of them, but I've also picked up USB drives from friends, colleagues, and at trade shows, where they're handed out as freely as pens and candy. Funny story: At a conference in Australia last year, IBM handed out thumb drives that turned out to be infected by malware. It was a computer-security conference.
That gets to what's most dangerous about USB drives—many computer users are in the dark about their capacity for trouble. Over the last decade we've all grown used to the dangers of Internet-borne scams and malware. We know we shouldn't click on e-mail attachments from strangers, and we know we should be wary of typing our passwords into shady sites online. But USB disks have somehow evaded our suspicion; few of us look at one and recoil at the dangers that could be lurking inside. Indeed, USB sticks evoke exactly the opposite emotion—if you saw a stray one on the street or lying around your office, wouldn't you pick it up and put it in your computer to try to identify the rightful owner?
Chester Wisniewski, a researcher at the security firm Sophos, says Stuxnet's authors might have exploited this naiveté when designing the worm. JMicron and RealTek, the two companies that own the digital certificates that were stolen for Stuxnet, are located in the same office park in Taiwan. Wisniewski offers the following theory: "What if the attackers dropped a couple USB drives in the parking lot between JMicron and RealTek, and then employees picked them up and stuck them into their computers?" Voila, instant infection.
For Stuxnet, sticking it in is all it takes. Sean Sullivan, a researcher at the security firm F-Secure, points out that most USB-borne malware relies on a Windows feature known as AutoRun. AutoRun was developed in the 1990s to make it easier for people to install software on their computers; when you insert a disk, Windows looks for instructions telling it what to do. Usually these instructions are benign—the disk tells the PC to install a legitimate application—but AutoRun can also be used by hackers to install malware instantly. Over the years, Microsoft, security firms, and IT managers have become much more sophisticated about fighting AutoRun viruses. New versions of Windows prompt users about the software on a disk before running it, and corporate IT staffers often disable Windows' AutoRun features. But Stuxnet evades those measures; it can infect PCs even when AutoRun is turned off. "All you have to do is open up the folder and view the contents, and you're infected," Sullivan says. "It's such a minimal action that's required—something anyone would do just to see what's on the disk. That's why it spread."
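For readers curious about what those "instructions" look like, here is a minimal sketch, in Python, of the conventional AutoRun path described above: a plain-text file named autorun.inf sitting in the root of the drive, telling Windows what to launch. The E: drive letter and the example file contents are assumptions for illustration only, and, as Sullivan notes, Stuxnet does not need this mechanism at all.

```python
# Minimal sketch (not a security tool): inspect a removable drive and report
# what its autorun.inf, if any, would ask Windows to run. A typical file
# looks something like:
#
#   [autorun]
#   open=setup.exe
#   icon=setup.exe,0
#
# The E: drive letter is a placeholder; real autorun.inf files vary in casing
# and encoding, so the parsing here is deliberately forgiving rather than exact.
import configparser
from pathlib import Path

DRIVE_ROOT = Path("E:/")               # hypothetical removable drive
INF_PATH = DRIVE_ROOT / "autorun.inf"

def report_autorun(inf_path: Path) -> None:
    if not inf_path.exists():
        print("No autorun.inf found; AutoRun has nothing to act on.")
        return
    parser = configparser.ConfigParser()
    try:
        parser.read(inf_path, encoding="utf-8-sig")   # tolerate a BOM
    except configparser.Error as err:
        print(f"autorun.inf present but unreadable: {err}")
        return
    # Section names are matched case-insensitively, since files vary in the wild.
    section = next((s for s in parser.sections() if s.lower() == "autorun"), None)
    if section is None:
        print("autorun.inf present but has no [autorun] section.")
        return
    target = parser.get(section, "open", fallback="(nothing specified)")
    print(f"This disk asks Windows to run: {target}")

if __name__ == "__main__":
    report_autorun(INF_PATH)
```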
There is, of course, a failsafe way to prevent Stuxnet from infecting high-security machines—why not just prohibit users from sticking USB devices into computers that have been purposefully separated from the Internet? "That would have worked," says Sophos' Wisniewski, "but the reality is the world is still pretty crappy at security." Companies either don't have such policies or don't enforce them—perhaps because selfish employees (like yours truly) consider USB sticks extremely convenient. If you want to hand over a huge PowerPoint presentation to your colleagues down the hall, what's easier than sticking it on a USB disk?
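For companies that do want to enforce such a policy in software rather than on paper, the mechanics aren't exotic. The sketch below, assuming administrator rights on a single Windows machine, flips two registry settings commonly used for exactly this purpose: the USBSTOR driver's Start value (4 prevents the USB mass-storage driver from loading) and the NoDriveTypeAutoRun policy (0xFF turns AutoRun off for every drive type). In practice an organization would push these through Group Policy rather than a one-off script, and, as noted above, disabling AutoRun alone would not have stopped Stuxnet.

```python
# Sketch only: lock down a single Windows machine by disabling the USB
# mass-storage driver and AutoRun via the registry. Requires administrator
# rights; prefer Group Policy for real deployments.
import winreg

def disable_usb_storage() -> None:
    # Start = 4 prevents the USBSTOR driver from loading (3 is the default).
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE,
                        r"SYSTEM\CurrentControlSet\Services\USBSTOR",
                        0, winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "Start", 0, winreg.REG_DWORD, 4)

def disable_autorun_everywhere() -> None:
    # NoDriveTypeAutoRun = 0xFF disables AutoRun on all drive types.
    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE,
                            r"SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\Explorer",
                            0, winreg.KEY_WRITE) as key:
        winreg.SetValueEx(key, "NoDriveTypeAutoRun", 0, winreg.REG_DWORD, 0xFF)

if __name__ == "__main__":
    disable_usb_storage()
    disable_autorun_everywhere()
    print("USB mass storage and AutoRun disabled; changes apply after a reboot.")
```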
If a company wants to ratchet up security, it's not as simple as banning all thumb drives. To be extra careful, you'd have to ban iPods, cameras, and every other USB-based doohickey—all of those devices are capable of carrying Stuxnet-like viruses, too. I asked Sean Sullivan, of F-Secure, if he could imagine any failsafe IT policy that would have worked to thwart Stuxnet. "Well, in our malware test machines, sometimes we put glue in the USB ports," he joked. Wisniewski, of Sophos, says the only hope is education: Don't trade USB sticks, don't stick an unknown one into your machine, and don't pick one up off the street and plug it in just to see what's inside.
"But I don't know if we're ever going to win that battle," Wisniewski says. "It's human nature. If I were a normal person and I didn't work in this bubble of security? If I found a USB drive, the first thing I would want to do is want to plug it in, too." ( slate.com )