Tuesday, 28 December 2010 18:50

At 27C3, Security Analysts Explore Heart of Stuxnet


Software engineer Bruce Dang led Microsoft's analysis of the Stuxnet worm.

BERLIN — It is a mark of the extreme oddity of the Stuxnet computer worm that Microsoft’s Windows vulnerability team learned of it first from an obscure Belarusian security company that even the Redmond security honchos had never heard of.

The sophisticated worm, which many computer experts believe was created as a targeted attempt to sabotage Iran’s nuclear centrifuges, has written a new chapter in the history of computer security. Because it was written to attack the very Siemens components used at Iran’s facilities, some analysts have even speculated it may have been the work of a state, rather than of traditional underground virus writers.

Much of the attention has focused on the worm’s origin and ultimate effects. But in a standing-room-only session at the Chaos Computer Club (CCC) Congress here Monday, Microsoft’s lead vulnerability analyst on the Stuxnet project offered a blow-by-blow account of the software company’s response to and analysis of the software’s multipronged attack on Windows vulnerabilities.

Much of the technical side — which flaws were attacked, and how they were fixed — is now well known. But the story offered unusual insight into the software company’s race to stay ahead of security firms seeking to peel back the worm’s layers of attack on their own, and into the intense pressure put on the team of analysts.

“We knew a lot of other people were looking, and it’s important to us to know the details before other companies,” said Bruce Dang, the security-software engineer in Microsoft’s Security Response Center who led the analysis. Management “is smart: They know it takes time, but they want results.”

The public Stuxnet story began when Belarusian firm VirusBlokAda first identified the Stuxnet code in June, and contacted Microsoft with a PDF showing a screenshot of the effects. Dang said his team was initially tempted to dismiss the report, thinking it a common and known problem. But a case was opened, and once a team began looking at the code, they realized it was something new.

The code that had been provided to the team was large — close to 1 MB of information, Dang said. A team of 20 to 30 people with expertise in various components of the Windows system was assembled and began quickly exchanging emails.

They traced the apparent problem to code that came from an infected USB stick. By exploiting a vulnerability in the Windows icon shortcut feature, or LNK files, the worm gained the ability to execute commands on the infected computer, but only with the current user’s level of access.
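As a rough illustration of the pattern Dang described (not Microsoft’s shell code, and written in Python rather than anything Windows actually runs), the sketch below models a file viewer that, merely to draw a shortcut’s icon, loads and executes code from whatever path the shortcut names. The shortcut structure and the names render_icon and icon_source are hypothetical.

```python
# Hypothetical model of the LNK-style flaw: rendering an icon should be a
# passive operation, but here the "shortcut" names the code that supplies
# its icon, and that code runs the moment the folder is displayed.
import importlib.util

def render_icon(shortcut: dict):
    # The shortcut (think: a file on a USB stick) points at an external
    # module that supposedly provides its icon.
    icon_source = shortcut["icon_source"]

    # Flaw: to get the icon, the viewer loads and executes that module
    # with the current user's privileges. No double-click is needed;
    # simply browsing the folder triggers this call.
    spec = importlib.util.spec_from_file_location("icon_provider", icon_source)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)  # attacker-controlled code runs here

    return getattr(module, "ICON", None)

# Usage: a malicious shortcut only has to name a module it brought along.
# render_icon({"icon_source": r"E:\payload.py"})
```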

Several fixes were proposed, but others in the company turned down those that would have contradicted guidance already given to outside developers. Dang said the urgency was nevertheless high: the company was getting reports of considerable numbers of infections, and the vulnerability turned out to be extremely simple to exploit.

“A 7-year-old could exploit this. It’s bad news,” Dang said. “Of course it turned out that this vulnerability had been known for several years by some people, but no one told me.”

Case closed. They thought they were finished. But as Dang and another colleague began doing a bit of further analysis, they noticed that extra drivers were being installed on their test computers, both in Windows XP and Windows 7 environments. This was definitely not good, they thought.

Closer investigation showed that scheduled tasks were being added, and XML-based task files were being created and rewritten. Working with a colleague overseas, Dang discovered that the way Windows Vista and later operating systems stored and verified scheduled tasks contained a vulnerability that let the attacking worm (which could already run code with the current user’s privileges) grant itself far broader — and thus more dangerous — privileges on the infected computer.

In short, the two flaws working together allowed the worm to gain code execution, and then to escalate its privileges far enough to install a rootkit.

The team thought again about how to fix the problem, and settled on changing the way the Vista and Windows 7 task scheduler uses hash values to verify files. Once implemented, this would block the dangerous privilege escalation.
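Published analyses of this bug describe the original check as a short, non-cryptographic checksum that the worm could keep constant while rewriting a task file. The sketch below (illustrative Python, not Microsoft’s code) contrasts that kind of check with verification against a cryptographic digest, which is the general shape of the fix Dang describes.

```python
import hashlib
import zlib

def weak_fingerprint(task_xml: bytes) -> int:
    # A CRC-style checksum catches accidental corruption, but an attacker
    # who can pad the file with chosen bytes can rewrite the task and
    # still land on the same 32-bit value.
    return zlib.crc32(task_xml)

def strong_fingerprint(task_xml: bytes) -> str:
    # A cryptographic hash: producing a different file with the same
    # digest is computationally infeasible, so tampering is detected.
    return hashlib.sha256(task_xml).hexdigest()

def verify_task(task_xml: bytes, stored_digest: str) -> bool:
    """Run a scheduled task only if its file still matches the digest
    recorded when the task was registered."""
    return strong_fingerprint(task_xml) == stored_digest

# Usage: record the digest at registration time, re-check before running.
original = b"<Task><Exec>C:\\tools\\backup.exe</Exec></Task>"
digest = strong_fingerprint(original)

tampered = b"<Task><Exec>C:\\dropped\\rootkit.exe</Exec></Task>"
assert verify_task(original, digest) is True
assert verify_task(tampered, digest) is False
```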

Finished, then? Not yet. Dang’s colleague noticed that a particular DLL, or system library, was being loaded in a suspicious way. They looked harder and saw it was happening differently on XP and on Windows 7 systems. But they couldn’t figure this one out immediately.

Dang started going over the binary code line by line, but with more than 1,000 lines, he realized this tactic simply wasn’t going to be fast enough. Management was putting severe pressure on the team to get results, and they had no answers.

He took this one home. He stayed up brainstorming until the small hours of the night, but all his ideas came to nothing. He even tried just letting the exploit run, on the theory that most virus code isn’t perfect, and will ultimately cause a blue-screen system crash, exposing the problem in the crash logs. But no dice: This one ran perfectly 10 times in a row.

“I knew we were getting close,” he said. “I knew it was searching for something, but exactly what wasn’t clear to me.”

The next day, an old kernel-debugger analysis trick finally paid off. The team identified a flaw in the way Windows XP allows users to switch keyboard layouts — from an English keyboard to a German configuration, for example. Once again, this allowed the worm to gain elevated privileges on the infected computer.

Smart, almost chillingly so, Dang said. The task-scheduling attack previously identified worked only on Vista and later systems. The keyboard layout attack worked only on XP. Some people somewhere had set their sights very broadly.

“We felt pretty good at that point,” he said. “How could there be more?”

But there was more. The team got word from the Kaspersky Lab security company that there was strange “remote procedure call” traffic being sent over a network — a kind of communication that allows one computer to trigger activity on another, such as printing from a remote device.
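For readers unfamiliar with the mechanism, the snippet below is a generic illustration of a remote procedure call using Python’s standard xmlrpc module. It shows only the general idea (one machine invoking a function on another as if it were local) and has nothing to do with the Windows RPC interfaces Stuxnet actually abused.

```python
# Generic RPC illustration: a server exposes a function, and a client on
# another machine (here, the same machine for simplicity) calls it by name.
import threading
import time
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

def print_document(name: str) -> str:
    # Stand-in for "activity on the remote machine", e.g. printing a file.
    return f"spooled {name} for printing"

def serve() -> None:
    server = SimpleXMLRPCServer(("127.0.0.1", 8099), logRequests=False)
    server.register_function(print_document)
    server.serve_forever()

threading.Thread(target=serve, daemon=True).start()
time.sleep(0.5)  # give the server a moment to start listening

# The remote call: the caller triggers work on the other end of the wire.
client = ServerProxy("http://127.0.0.1:8099")
print(client.print_document("report.pdf"))  # -> spooled report.pdf for printing
```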

Dang and his team set up a mini-VPN, infected one computer, and went away. They came back to find their entire mini-network had been infected.

“I said, ‘What the f***! This is really weird,’” Dang recounted.

They brought in Microsoft’s printer team, and this time the problem proved simple to uncover. Within five minutes they had traced the source: a print-spooler flaw that allowed remote guest accounts to write executable files directly to disk. A terrible flaw, but one that was luckily fixed quickly.

The flaw gave more insight into the attackers’ intentions. The vulnerable configuration was very uncommon in ordinary corporate networks, but wherever it was present, it allowed an infection to spread widely, Dang said.
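As with the shortcut flaw above, a small hypothetical model (again in Python, not the real spooler code) can make the pattern concrete: a service running with high privileges that writes whatever an unauthenticated “print job” tells it to, wherever the job says. The directory path and function name are illustrative only.

```python
import os

# Hypothetical: a directory the operating system trusts and will later
# execute code from.
PRIVILEGED_DIR = r"C:\Windows\System32"

def handle_print_job(authenticated: bool, filename: str, payload: bytes) -> None:
    # Flawed pattern: the spooler runs with system privileges, ignores
    # whether the submitting client is authenticated (the flag is never
    # checked), and writes the file wherever the job says.
    path = os.path.join(PRIVILEGED_DIR, filename)
    with open(path, "wb") as spool_file:
        spool_file.write(payload)
    # A guest on the network has just written an executable file into a
    # location from which the system will eventually run it.
```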

From the perspective of Microsoft’s vulnerability team, the story essentially ends there. But Stuxnet has been in the wild for a year, and revelations continue as to the breadth of the infection, and the sophistication of its apparent attack on Iran’s nuclear centrifuges.

Dang says several things are clear from his reading of the code. It was written by at least several people, with the different components bearing the fingerprints of different authors. And the creators were careful to make sure that it ran perfectly, with high impact and 100 percent reliability, he said. That’s a goal even commercial software developers often fail to meet.

The total time taken from discovery to the final fix was between three and four days, or about 40 Microsoft staff-hours. But the effects of this sophisticated exploitation of unknown or “zero-day” Windows vulnerabilities will surely continue to resonate for months or even years to come.

Author: John Borland
