Sysinternals Freeware - Mark Russinovich & Bryce Cogswell

Mark's Sysinternals Blog

On My Way to Microsoft!

I’m very pleased to announce that Microsoft has acquired Winternals Software and Sysinternals. Bryce Cogswell and I founded both Winternals and Sysinternals (originally NTInternals) back in 1996 with the goal of developing advanced technologies for Windows. We’ve had an incredible amount of fun over the last ten years working on a diverse range of products such as Winternals Administrator’s Pak, Protection Manager, Defrag Manager, and Recovery Manager, and the dozens of Sysinternals tools, including Filemon, Regmon and Process Explorer, that millions of people use every day for systems troubleshooting and management. There’s nothing more satisfying for me than to see our ideas and their implementation have a positive impact.

That’s what makes being acquired by Microsoft especially exciting and rewarding. I’m joining Microsoft as a technical fellow in the Platform and Services Division, which is the division that includes the Core Operating Systems Division, Windows Client and Windows Live, and Windows Server and Tools. I’ll therefore be working on challenging projects that span the entire Windows product line and directly influence subsequent generations of the most important operating system on the planet. From security to virtualization to performance to a more manageable application model, there’s no end of interesting areas to explore and innovate.

So what’s going to happen to Winternals and Sysinternals? Microsoft is still evaluating the best way to leverage the many different technologies that have been developed by Winternals. Some will find their way into existing Microsoft products or Windows itself and others will continue on as Microsoft-branded products. As for Sysinternals, the site will remain for the time being while Microsoft determines the best way to integrate it into its own community efforts, and the tools will continue to be free to download.

Personally, I remain committed to the Sysinternals and Windows IT pro communities and so I’ll continue to blog here, to write about Windows technologies, and to speak at conferences. Until I know my Microsoft email address and post it, you can continue to contact me at mark@sysinternals.com.

I’m looking forward to making Windows an even better platform for all of us!

posted by Mark Russinovich @ 10:20 AM (130) comments

The Power in Power Users

Placing Windows user accounts in the Power Users security group is a common approach IT organizations take to get users into a least-privilege environment while avoiding the many pains of truly running as a limited user. The Power Users group is able to install software, manage power and time-zone settings, and install ActiveX controls, actions that limited Users are denied.

What many administrators fail to realize, however, is that this power comes at the price of true limited-user security. Many articles, including this Microsoft Knowledge Base article and this blog post by Microsoft security specialist Jesper Johansson, point out that a user who belongs to the Power Users group can easily elevate to full administrative privileges, but I was unable to find a detailed description of the elevation mechanisms they refer to. I therefore decided to investigate.

Before I could start the investigation, I had to define the problem. In the absence of a security flaw such as a buffer overflow, privilege escalation is possible only if an account can configure arbitrary code to execute in the context of a more-privileged account. The default accounts that have more privilege than Power Users include Administrators and the Local System account, in which several Windows service processes run. Thus, if a Power Users member can modify a file executed by one of these accounts, configure one of their executables to load an arbitrary DLL, or add an executable auto-start to these accounts, they can obtain full administrative privileges.

My first step was to see which files and directories the Power Users group has write access to, but limited users do not. The systems I considered were stock Windows 2000 Professional SP4, Windows XP SP2, and Windows Vista. I didn't bother looking at server systems because the most common Power Users scenario is on a workstation.

The brute-force method of seeing what file system objects Power Users can modify requires visiting each file and directory and examining its permissions, something that’s clearly not practical. The command-line Cacls utility that Windows includes dumps security descriptors, but I’ve never bothered learning Security Descriptor Definition Language (SDDL) and parsing the output would require writing a script. The AccessEnum utility that Bryce wrote seemed promising, and it can also look at Registry security, but it’s aimed at showing potential permissions weaknesses, not the accesses available to particular accounts. Further, I knew that I’d also need to examine the security applied to Windows services.

I concluded that I had to write a new utility for the job, so I created AccessChk. You pass AccessChk an account or group name and a file system path, Registry key, or Windows service name, and it reports the effective accesses the account or group has to the object, taking into consideration the account’s group memberships. For example, if the Mark account is granted access to a file, but Mark belongs to a Developers group that is explicitly denied access, then AccessChk shows Mark as having no access.

In order to make the output easy to read, AccessChk prints ‘W’ next to the object name if an account has any permissions that would allow it to modify the object, and ‘R’ if the account can read the object’s data or status. Various switches cause AccessChk to recurse into subdirectories or Registry subkeys, and the -v switch has it report the specific accesses available to the account. A switch I added specifically to seek out objects for which an account has write access is -w.

Armed with this new tool I was ready to start investigating. My first target was a Windows XP SP2 VMWare installation that has no installed applications other than the VMWare Tools. The first command I executed was:

accesschk -ws "power users" c:\windows

This shows all the files and directories under the \Windows directory that the Power Users group can modify. Of course, many of the files under \Windows are part of the operating system or Windows services and therefore execute in the Local System account. AccessChk reported that Power Users can modify most of the directories under \Windows, which allows member users to create files in those directories. Thus, a member of the Power Users group can create files in the \Windows and \Windows\System32 directory, which is a common requirement of poorly written legacy applications. In addition, Power Users needs to be able to create files in the \Windows\Downloaded Program Files directory so that they can install ActiveX controls, since Internet Explorer saves them to that directory. However, simply creating a file in these directories is not a path to privilege elevation.

Despite the fact that Power Users can create files underneath \Windows and most of its subdirectories, Windows configures default security permissions on most files contained in these directories so that only members of the Administrators group and the Local System account have write access. Exceptions include the font files (.fon), many system log files (.log), some help files (.chm), pictures and media clips (.jpg, .gif, and .wmv), and installation files (.inf), but none of these files can be modified or replaced to gain administrative privilege. The device drivers in \Windows\System32\Drivers would allow easy escalation, but Power Users doesn’t have write access to any of them.

I did see a number of .exe’s and .dll’s in the list, though, so I examined them for possible exploits. Most of the executables for which Power Users has write access are interactive utilities or run with reduced privileges. Unless you can trick an administrator into logging into the system interactively, these can’t be used to elevate. But there’s one glaring exception: ntoskrnl.exe:



That’s right, Power Users can replace or modify Windows’ core operating system file. Five seconds after the file is modified, however, Windows File Protection (WFP) will replace it with a backup copy it retrieves, in most cases, from \Windows\System32\Dllcache. Power Users doesn’t have write access to files in Dllcache so it can’t subvert the backup copy. But members of the Power Users group can circumvent WFP by writing a simple program that replaces the file, flushes the modified data to disk, then reboots the system before WFP takes action.

I verified that this approach works, but the question remained of how this vulnerability can be used to elevate privilege. The answer is as easy as using a disassembler to find the function that Windows uses for privilege checks, SeSinglePrivilegeCheck, and patching its entry point in the on-disk image so that it always returns TRUE, which is the result code that indicates that a user has the privilege being checked for. Once a user is running on a kernel modified in this manner they will appear to have all privileges, including Load Driver, Take Ownership, and Create Token, to name just a few of the privileges that they can easily leverage to take full administrative control of a system. Although 64-bit Windows XP prevents kernel tampering with PatchGuard, few enterprises are running on 64-bit Windows.

Replacing Ntoskrnl.exe isn’t the only way to punch through to administrative privilege via the \Windows directory, however. At least one of the DLLs for which default permissions allow modification by Power Users, Schedsvc.dll, runs as a Windows service in the Local System account. Schedsvc.dll is the DLL that implements the Windows Task Scheduler service. Windows can operate successfully without the service, so Power Users can replace the DLL with an arbitrary DLL, such as one that simply adds their account to the Local Administrators group. Of course, WFP protects this file as well, so replacing it requires the use of the WFP-bypass technique I’ve described.

I’d already identified several elevation vectors, but continued my investigation by looking at Power Users access to the \Program Files directory, where I found default permissions similar to those in the \Windows directory. Power Users can create subdirectories under \Program Files, but can’t modify most of the preinstalled Windows components. Again, the exceptions, like Windows Messenger (\Program Files\Messenger\Msmsgs.exe) and Windows Media Player (\Program Files\Windows Media Player\Wmplayer.exe), run interactively.

That doesn’t mean that \Program Files doesn’t have potential holes. When I examined the most recent output I saw that Power Users can modify any file or directory created in \Program Files after the base Windows install. On my test system \Program Files\Vmware\Vmware Tools\Vmwareservice.exe, the image file for the VMware Windows service that runs in the Local System account, was such a file. Another somewhat ironic example is Microsoft Windows Defender Beta 2, which installs its service executable in \Program Files\Windows Defender with default security settings. Replacing these service image files is a quick path to administrative privilege and is even easier than replacing files in the \Windows directory because WFP doesn’t interfere with the replacements.

Next I turned my attention to the Registry by running this command:

accesschk -swk "power users" hklm

The output list was enormous because Power Users has write access to the vast majority of the HKLM\Software key. The first area I studied for possible elevations was the HKLM\System key, because write access to many settings beneath it, such as the Windows service and driver configuration keys in HKLM\System\CurrentControlSet\Services, would permit trivial subversion of the Local System account. The analysis revealed that Power Users doesn’t have write access to anything significant under that key.

Most of the Power Users-writable areas under the other major branch of HKLM, Software, related to Internet Explorer, Explorer and its file associations, and power management configuration. Power Users also has write access to HKLM\Software\Microsoft\Windows\CurrentVersion\Run, allowing them to configure arbitrary executables to run whenever someone logs on interactively, but exploiting this requires a user with administrative privilege to log onto the system interactively (which, depending on the system, may never happen, or happen infrequently). And just as for the \Program Files directory, Power Users has default write access to non-Windows subkeys of HKLM\Software, meaning that third-party applications that configure executable code paths in their system-wide Registry keys could open security holes. VMware, the only application installed on the system, did not.
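To spot-check a single location like the Run key, AccessChk also accepts a specific Registry path; a command along these lines (the exact switch letters may vary between AccessChk versions) reports the write accesses Power Users has to it:

accesschk -kvw "power users" hklm\software\microsoft\windows\currentversion\run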

The remaining area of exploration was Windows services. The only service permissions AccessChk considers to be write accesses are SERVICE_CHANGE_CONFIG and WRITE_DAC. A user with SERVICE_CHANGE_CONFIG can configure an arbitrary executable to launch when a service starts, and given WRITE_DAC, they can modify the permissions on a service to grant themselves SERVICE_CHANGE_CONFIG access. AccessChk revealed the following on my stock Windows XP SP2 system:



I next ran PsService to see the account in which the DcomLaunch service executes:



Thus, members of the Power Users group can simply change the image path of DcomLaunch to point at their own image, reboot the system, and enjoy administrative privileges.
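As a quick cross-check of what AccessChk and PsService report, the built-in Sc.exe utility can dump a service’s configuration, including its image path, as well as its security descriptor (the latter in SDDL form):

sc qc DcomLaunch
sc sdshow DcomLaunch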

Other services can potentially introduce holes through their security settings as well. The default permissions Windows sets on services created by third-party applications do not allow Power Users write access, but some third-party applications might configure custom permissions that do. In fact, on my production 64-bit Windows XP installation AccessChk reveals a hole that not only Power Users can use to elevate themselves, but that limited users can as well:



I’d now finished the major phase of my investigation and had confirmed what everyone has been saying: a determined member of the Power Users group can fairly easily make themselves a full administrator by exploiting weaknesses in the operating system and ones created by third-party applications.

My final step was to see how Microsoft’s approach to the Power Users account has evolved over time. This 1999 Microsoft Knowledge Base article documents the famous screen-saver elevation vulnerability that existed on Windows NT 4, but Microsoft closed that hole before the release of Windows 2000. The KB article also shows that Microsoft was apparently unaware of other vulnerabilities that likely existed. Windows 2000 SP4 also includes holes, but is actually slightly more secure than the default Windows XP SP2 configuration: Power Users don’t have write access to Ntoskrnl.exe or the Task Scheduler image file, but instead of write access to the DcomLaunch service, they can subvert the WMI service, which also runs in the Local System account.

Windows XP SP1 added more Power Users weaknesses, including write access to critical system files like Svchost.exe, the Windows service hosting process, and additional services, WMI and SSDPSRV, with exploitable permissions. Several services even allowed limited users to elevate as described in this Microsoft KB article from March of this year.

Microsoft’s newest operating system, Windows Vista, closes down all the vulnerabilities I’ve described by neutering Power Users so that it behaves identically to limited Users. Microsoft has thus closed the door on Power Users in order to force IT staffs into securing their systems by moving users into limited Users accounts or into administrative accounts where they must acknowledge end-user control over their systems.

The bottom line is that while Microsoft could fix the vulnerabilities I found in my investigation, they can’t prevent third-party applications from introducing new ones while at the same time preserving the ability of Power Users to install applications and ActiveX controls. The lesson is that as an IT administrator you shouldn’t fool yourself into thinking that the Power Users group is a secure compromise on the way to running as limited user.

posted by Mark Russinovich @ 11:01 AM (59) comments

Why Winternals Sued Best Buy

In this post I’m taking a break from my standard technical postings to discuss a disturbing discovery regarding a large corporation’s unauthorized software usage. By now many of you have heard via Slashdot, Ars Technica, Digg, or your local newspaper that Winternals Software, the company I co-founded with Bryce Cogswell in 1996, filed suit in Federal court against Geek Squad and Best Buy for illegal use of the Administrator’s Pak. What the press coverage to date might not have made clear is what Geek Squad and Best Buy did prior to approaching Winternals in October 2005 about a license to our software, what they continued to do after terminating licensing discussions in February 2006, and why we felt we had no alternative but to protect our software through the legal system. This is the first lawsuit Winternals has ever initiated, and we did not approach the decision lightly.

Best Buy acquired the Geek Squad several years ago and has grown the unit to a size of approximately 12,000 employees that analysts estimate will generate over a billion dollars of revenue this year alone. The Geek Squad provides system repair, data salvaging, and installation services in each of the Best Buy retail outlets and, for an additional significant fee, a 911 service that travels to customer homes to perform repairs on site.

The Administrator’s Pak is a collection of powerful system utilities, including enhanced versions of Sysinternals Filemon and Regmon that work remotely and have log-to-file capability, that’s sold to individual systems administrators. The flagship tool is ERD Commander, a Windows Preinstallation Environment (WinPE)-based recovery environment with a familiar Windows user interface that is the latest generation of the original ERD Commander product we released in 1998 and upon which Winternals was built. While Windows includes a rudimentary tool for repairing unbootable systems in the form of the Recovery Console, Microsoft has chosen not to provide an advanced repair, diagnosis, and recovery environment for unbootable systems on par with ERD Commander. The BartPE freeware alternative that clones WinPE offers some of the same functionality as ERD Commander, but is missing key features such as the System Restore Wizard, hotfix and Service Pack uninstaller, password changer, crash analyzer wizard, and integrated Registry editor.

As outlined in our Complaint and Motion for Temporary Restraining Order (which can be found, along with all other legal documents filed in the case, at http://www.winternals.com/legal/), Best Buy and Geek Squad initially contacted us and said that a license was needed to come into compliance. Rather than focus on the degree to which Best Buy and Geek Squad had previously engaged in the unauthorized copying and use of our products, we entered negotiations for a software license and to establish a long-term business relationship. To educate their employees on the software and facilitate these negotiations, we even held a training session at our expense on the Administrator's Pak at their facilities in Minneapolis and offered an eminently reasonable software license for all Geek Squad employees. While surprised that they ultimately decided against a license, we were willing to go our separate way with the hope that they would someday change their mind.

However, after receiving information that Geek Squad employees continued to use ERD Commander frequently in repairing customers' computers, we decided to investigate the situation on our own. The level of unauthorized copying and usage we’ve uncovered in our preliminary investigation is substantial and has apparently taken place over several years. Our evidence includes admissions by highly placed current Geek Squad and Best Buy employees and interviews of many former employees. As alleged in the Complaint, we also found that Geek Squad employees across the country were still using unlicensed copies of our software to repair computers.

In the end we concluded that the only remaining option was to take legal action. Winternals has invested substantial time and capital in developing this software and believes that Geek Squad should not be permitted to allow its 12,000 employees to use unlicensed copies for free while generating substantial profits from those efforts. Our press release provides a summary of the lawsuit and the court's action to date.

posted by Mark Russinovich @ 9:28 AM (148) comments

The Case of the Mysterious Driver

The other day I used Process Explorer to examine the drivers loaded on a home system to see if I’d picked up any Sony- or StarForce-like digital rights management (DRM) device drivers. The DLL view of the System process, which reports the currently loaded drivers and kernel-mode modules (such as the Hardware Abstraction Layer – HAL), listed mostly Microsoft operating system drivers and drivers associated with the DVD burning software I have installed, but one entry, Asctrm.sys, caught my attention because its company information is “Windows (R) 2000 DDK provider”:



This is the company name included in the version information of drivers that are based on sample code from the Windows 2000 Device Driver Kit (DDK), and it’s obviously unusual to see it in production images. The driver’s description is equally unenlightening: “TR Manager”. My suspicions aroused, I set about investigating.

My first step was to right-click on the entry and “Google” the driver image name. The resulting Google search reveals that others have this driver and that in some cases it has been identified as the cause of system crashes, but although several spyware databases have entries for it, none of the ones I checked conclusively tied the driver to an application or vendor.

I next looked for clues in the image itself by double-clicking on the driver entry in the DLL view to open the Process Explorer DLL properties dialog. The image page revealed nothing of interest other than the fact that the driver had been linked in December of 2004. I turned my attention to the Strings tab to look for some hint as to the driver’s reason for existence. None of the few intelligible strings Process Explorer found in the image were unique except for the last one:



When a driver is compiled, the linker stores in the image the path to the debug information file it generates, which has the extension .pdb. The path in this case appears to include the name of a company, “AegiSoft”. However, the http://www.aegisoft.com/ web site describes Aegis Software, Inc. as a company that creates “powerful, sophisticated and easy to use trading software and services for financial companies that demand performance, robustness, availability, and flexibility.” That doesn’t sound like a company that ships device drivers.

On a whim I did a Google search of “aegis” and came across this January 2001 news item announcing RealNetworks’ acquisition of Aegisoft Corp. (notice the difference in name from Aegis Software, Inc.). I knew I had RealPlayer installed on the system so I ran RealPlayer and confirmed that it uses the driver by doing a handle search for “asctrm”, the name of the device object I had seen in one of the driver’s strings:




Newer versions of RealPlayer don’t appear to include a device driver, but I have an old version on this system. I haven't gotten new release notifications because after installing RealPlayer I always use Autoruns to delete the HKLM\Software\Microsoft\Windows\CurrentVersion\Run item that the RealPlayer setup creates to launch the Real Networks Scheduler at each boot. That Run entry, incidentally, is “TkBellExe”, another misleading label.

So the driver is not malicious after all (though it is related to DRM, so agreement with that view depends on your feelings about DRM). However, this example highlights the need for all software vendors (Microsoft included!) to clearly identify their applications and drivers in their version resources and in any associated Registry keys or values.

I’m still researching Vista User Account Control and so will blog on that in the near future.

posted by Mark Russinovich @ 3:52 PM (30) comments

Running as Limited User - the Easy Way

Malware has grown to epidemic proportions in the last few years. Despite applying layered security principles, including running antivirus, antispyware, and a firewall, even a careful user can fall victim to malware. Malware-infected downloads, drive-by exploits of Internet Explorer (IE) vulnerabilities, and a careless click on an Outlook attachment sent by a friend can render a system unusable and lead to several hours with the Windows setup CD and application installers.

As this eWeek study shows, one of the most effective ways to keep a system free from malware, and to avoid reinstalls even if malware happens to sneak by, is to run as a limited user (a member of the Windows Users group). The vast majority of Windows users run as members of the Administrators group simply because so many operations, such as installing software and printers, changing power settings, and changing the time zone, require administrator rights. Further, many applications fail when run in a limited-user account because they’re poorly written and expect to have write access to directories such as \Program Files and \Windows or Registry keys under HKLM\Software.

An alternative to running as a limited user is to instead run only the specific Internet-facing applications that are at greater risk of compromise, such as IE and Outlook, with limited rights. Microsoft promises this capability in Windows Vista with Protected-Mode IE and User Account Control (UAC), but you can achieve a form of it today on Windows 2000 and higher with the new limited-user execution features of Process Explorer and PsExec.

Process Explorer’s Run as Limited User menu item in the File menu opens a dialog that looks and acts like the standard Windows Run dialog, but that runs the target process without administrative privileges:



PsExec with the -l switch accomplishes the same thing from the command line:



An advantage to using PsExec to launch limited-user processes is that you can create PsExec desktop shortcuts for ones you commonly launch. To make a shortcut for Outlook, for example, right-click on the desktop, choose New->Shortcut, enter the path to PsExec in the location field and click Next. Enter Outlook as the name of the shortcut and press Finish. Then right-click on the shortcut to open its properties and add "-l -d" and the path to Outlook (e.g. C:\Program Files\Microsoft Office\Office11\Outlook.exe) to the text in the Target field. Finally, select Change Icon, navigate to the Outlook executable and choose the first icon. Activating the shortcut will result in a Command Prompt window briefly appearing as PsExec launches the target with limited rights.
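Assuming PsExec is stored in C:\Tools (an arbitrary location for this example; substitute wherever you keep it), the finished Target field would read something like this:

"C:\Tools\PsExec.exe" -l -d "C:\Program Files\Microsoft Office\Office11\Outlook.exe"

The -d switch tells PsExec not to wait for Outlook to exit, so the Command Prompt window goes away as soon as the launch completes.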

Both Process Explorer and PsExec use the CreateRestrictedToken API to create a security context, called a token, that’s a stripped-down version of the one their own process runs with, with administrative privileges and group membership removed. After generating a token that looks like one that Windows assigns to standard users, Process Explorer calls CreateProcessAsUser to launch the target process with the new token.
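Here’s a rough illustrative sketch in C of that approach (not Process Explorer’s actual source): it takes the DISABLE_MAX_PRIVILEGE shortcut, which strips every privilege except SeChangeNotifyPrivilege, rather than computing the Users group’s privilege list the way Process Explorer does, and it omits all error handling and cleanup.

    #include <windows.h>

    void RunRestricted(LPTSTR cmdline)   // cmdline must be a writable buffer
    {
        HANDLE hToken, hRestricted;
        SID_IDENTIFIER_AUTHORITY ntAuth = SECURITY_NT_AUTHORITY;
        PSID adminSid;
        SID_AND_ATTRIBUTES disable[1];
        STARTUPINFO si = { sizeof(si) };
        PROCESS_INFORMATION pi;

        // Start with the calling process' own token.
        OpenProcessToken(GetCurrentProcess(),
            TOKEN_DUPLICATE | TOKEN_ASSIGN_PRIMARY | TOKEN_QUERY, &hToken);

        // Build the SID for BUILTIN\Administrators (S-1-5-32-544).
        AllocateAndInitializeSid(&ntAuth, 2, SECURITY_BUILTIN_DOMAIN_RID,
            DOMAIN_ALIAS_RID_ADMINS, 0, 0, 0, 0, 0, 0, &adminSid);
        disable[0].Sid = adminSid;
        disable[0].Attributes = 0;

        // Convert Administrators to a deny-only group and drop privileges.
        CreateRestrictedToken(hToken, DISABLE_MAX_PRIVILEGE,
            1, disable,   // SIDs to mark deny-only
            0, NULL,      // privileges to delete (the flag removes most already)
            0, NULL,      // no restricting SIDs
            &hRestricted);

        // Launch the target with the stripped-down token.
        CreateProcessAsUser(hRestricted, NULL, cmdline, NULL, NULL, FALSE,
            0, NULL, NULL, &si, &pi);
    }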

You can use Process Explorer itself to compare the token of a process running with full administrative rights and one that’s limited by viewing the Security tab in the Process Properties dialog. The properties on the left are for an instance of IE running in an account with administrative group membership and the one on the right for IE launched using Run as Limited User:



The privilege lists immediately stand out as different because the limited-user token has so few privileges. Process Explorer queries the privileges assigned to the Users group and strips out all other privileges, including powerful ones like SeDebugPrivilege, SeLoadDriverPrivilege and SeRestorePrivilege.

The difference between the group lists is more subtle: both tokens contain the Builtin\Administrators group, but the group has a Deny flag in the limited-user version. Fully understanding the effect of that flag requires a quick background on the Windows security model.

Windows stores an object’s permissions in a Discretionary Access Control List (DACL) that consists of zero or more Access Control Entries (ACEs). Each ACE specifies the user or group to which it applies, a type of Allow or Deny, and the accesses (e.g. read, delete) it allows or denies. When a process tries to open an object, Windows normally considers each ACE in the object’s DACL that matches the user or any of the groups in the process’ token. However, when the Deny flag is present on a group, that group is used during a security access check only to deny access to objects, never to grant access.

CreateRestrictedToken marks groups you don’t want present in the resulting token with the Deny flag rather than removing them altogether, to prevent the security hole removal would create: a process using the new token could potentially access objects for which the removed groups have been explicitly denied access, essentially allowing users to bypass permissions by using the API. Consider a directory whose permissions deny the Builtin\Administrators group access, but allow Mark access. That directory wouldn’t be accessible by the original instance of IE above; if the Administrators group were simply removed from the restricted token instead of being marked deny-only, it would be accessible by the limited-user version.

The result of running applications as limited user is that malware invoked by those applications won’t be able to modify system settings, disable antivirus or antispyware, install device drivers, or configure themselves in system-wide autostart locations.

There are some limitations, however: because the limited-user processes are running in the same account and on the same desktop as other processes running with administrative privileges, sophisticated malware could potentially inject itself into more privileged processes or remotely control them using Windows messages. When it comes to security, there’s no single cure-all, and every layer of protection you add could be the one that eventually saves you or your computer.

Next post I’ll take a look inside Vista’s UAC to see how it uses the same approach as Process Explorer and PsExec, but leverages changes to the Windowing system and process object security model to better isolate limited-user processes from those running with higher privilege.

posted by Mark Russinovich @ 10:29 AM (78) comments

Using Rootkits to Defeat Digital Rights Management

The Sony rootkit debacle highlighted the use of rootkits to prevent pirates and authors of CD burning, ripping, and emulation utilities from circumventing Digital Rights Management (DRM) restrictions on access to copyrighted content. It’s therefore ironic, though not surprising, that several CD burning and disc emulation utilities are also using rootkits, though the technology is being used in the opposite way: to prevent DRM software from enforcing copy restrictions.

Because PC game CDs and DVDs do not need to be compatible with set-top players, software vendors can store data on the media in unorthodox ways that require software support to read it. Attempts to make a copy of such media without the aid of the software result in a scrambled version, and the software has DRM measures to detect and foil unauthorized copying.

CD burning and emulation software companies owe a significant amount of their sales to customers that want to store games on their hard drives. The legitimate claim for doing this is that it enables fast, cached access to the game, though it is well known that the capability is also used to make illegal copies of games to share with friends, so content-protected CDs and DVDs present a challenge the companies can’t ignore. One way to deal with the problem is to re-engineer the software that interprets the data stored on the media, but that approach requires enormous and ongoing resources dedicated to deciphering changes and enhancements made to the encoding schemes.

An easier approach is to fool game DRM software into thinking it’s reading the data for playing a game from its original CD rather than from an on-disk copy. DRM software uses a number of techniques to try to defeat that trick, but a straightforward one is simply to detect whether CD emulation software is present on the system and, if so, whether the game is being run from an emulated on-disk copy. That’s where rootkits come in. Two of the most popular CD emulation utilities are Alcohol and Daemon Tools, and they both use rootkits.

Alcohol advertises itself as enabling you “to make a duplicate back-up to recordable media of nearly all your expensive Game/Software/DVD titles, and/or an image that can be mounted and run from any one of Alcohol's virtual drives”. When you run a RootkitRevealer scan of a system on which Alcohol is installed you see several discrepancies:



The first two are data mismatches whereas the last one is a key that’s hidden from Windows. A data mismatch occurs when RootkitRevealer obtains a different value from a Registry API than it sees when it looks at the raw Registry data where the value resides. When you view either of the values in Regedit they appear to be composed of sequences of space characters:



Why would Alcohol want to use data mismatching rather than the typical cloaking technique of hiding the value altogether? The values in question are located in HKLM\Software\Classes\Installer\Products and HKLM\Software\Microsoft\Windows\CurrentVersion\Uninstall, and both areas are where applications store information for use by the Windows Add/Remove Programs (ARP) utility. ARP uses the ProductName value in an application’s Products key as the name it displays in its list of installed applications, so an empty value implies that we should see a product with no name in the list. However, a quick look shows that there are no nameless entries, and the product we know the value is associated with, Alcohol, shows up in the list with its name intact:



Using Regmon to capture a Registry activity trace of ARP, which as a Control Panel applet is implemented as a DLL hosted by Rundll32.exe, confirms that ARP reads the displayed Alcohol text from the mismatched ProductName value whereas Regedit sees only empty data for the same value:



The other mismatched value behaves the same way, and my guess is that Alcohol hides the strings that identify its presence on a system from everything but ARP in order to avoid detection by DRM software, like that included in games that disable themselves in the presence of CD/DVD copying and emulation software. There are many other signs DRM software can use to sense Alcohol’s presence, but the Alcohol developers likely discovered that a check of installed products is, or was, the most commonly used.

The remaining RootkitRevealer discrepancy is the cloaked Jgdd40 key in the Config subkey of the Vax347s driver. Alcohol must include a device driver that presents phantom devices to Windows in order to create virtual CD and DVD devices, and Vax347s is the driver that fills that role. An easy way to see inside a cloaked Registry key is to open the parent of the inaccessible key in Regedit, choose Export from the File menu, and select Registry Hive Files from the format drop-down. Then copy the file to a different system, launch Regedit, navigate to HKLM, and choose Load Hive from the File menu. The name you enter for the key is up to you. When you follow these steps for the cloaked key you see a single value, Ujdew, within it:



The contents are binary data, but my guess is that they describe the volumes that the driver virtualizes. Game DRM software that is Alcohol-aware would be unable to determine whether the volume from which it was executing was a real device or an emulated one. Evidence that supports this theory lies in Jdgg40’s parent key, Config, which also contains a single value named Ujdew, but with slightly different contents than the one that’s hidden. The second value is almost certainly a decoy to throw off DRM developers who determined that it at one time contained virtual drive mappings:



Alcohol, like Sony’s rootkit, uses system call hooking to intercept Registry APIs and manipulate their behavior. This memory dump of the Windows kernel-mode system call table contains addresses that fall outside of the kernel image, the telltale sign of a system-call hook:



The addresses correspond to Registry-related system calls and the debugger confirms that the addresses lie in a second Alcohol driver, Vax347b, that’s responsible for the cloaking:



On a system with Daemon Tools installed RootkitRevealer reports the presence of a single discrepancy:



An interesting aspect of Daemon Tools’s rootkit is that it doesn’t cloak the presence of the key listed, but rather denies access even to RootkitRevealer, which should be able to open any key regardless of the key’s security. Following the same steps I described earlier for gaining access to off-limit keys unveils the key’s contents:



Paralleling the Alcohol example, the key is part of Daemon Tools’ virtual device driver and appears to contain configuration information, implying that Daemon Tools hides the key to fool game anti-emulation software by preventing it from finding a way to distinguish virtual volumes from real ones.

There’s no proof that Alcohol and Daemon Tools use rootkits to evade DRM, but the evidence is compelling. If they do, their usage is clearly unethical and potentially even runs afoul of the US Digital Millennium Copyright Act (DMCA). In any case, there’s no reason for these products, or any product as I’ve stated previously, to employ rootkit techniques.

[2/7/06: Clarification: when I say "their usage is clearly unethical" I'm not referring to users of the products, but to the utilities themselves being designed to circumvent DRM. I've previously defined rootkits and explained their risks.]

Speaking of rootkits, here’s an amusing video of a song named Patch Me Up by the North Sydney band Rootkit.

posted by Mark Russinovich @ 9:27 AM (152) comments

Inside the WMF Backdoor

Steve Gibson (of SpinRite fame) proposed a theory in his weekly Thursday-night podcast last week that, if true, would be the biggest scandal ever to hit Microsoft: that the Windows Metafile (WMF) vulnerability that drew so much media attention last month is actually a backdoor programmed intentionally by Microsoft for unknown reasons. Slashdot picked up the story the next day and I received a flood of emails asking me to look into it. I finished my analysis, which Steve aided by sending me the source code to his WMF-vulnerability tester program (KnockKnock), over the weekend. In my opinion the backdoor is one caused by a security flaw and not one made for subterfuge. I sent my findings to both Steve and to Microsoft Monday morning, but because the issue continues to draw media attention I’ve decided to publicly document my investigation.

Understanding the WMF vulnerability requires a brief background in WMF files. A WMF file is a script for executing graphics commands, called graphics device interface (GDI) functions. Each command is stored as a record in the WMF file and examples of GDI functions include ones to draw lines, fill rectangles, and copy bitmaps. Image files like bitmaps, GIFs, or JPEGs, on the other hand, store the representation of pre-rendered images. Because an application can direct a WMF file’s execution at different devices, like screens and printers, with different graphics resolutions and color depths, their advantage over pre-rendered formats is that they scale to the capabilities of the target device. For this reason, many clipart images, including those used by Microsoft Office, are stored in WMF files.
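For concreteness, here are the two on-disk structures (declared in wingdi.h) that a WMF file is built from; everything that follows hinges on the fact that each record is nothing more than a size, a GDI function number, and that function’s parameters:

    // WMF on-disk layout: a METAHEADER followed by a sequence of
    // variable-length METARECORDs (declarations from wingdi.h).
    typedef struct tagMETAHEADER {
        WORD  mtType;          // 1 = memory metafile, 2 = disk metafile
        WORD  mtHeaderSize;    // header size, in WORDs
        WORD  mtVersion;
        DWORD mtSize;          // total file size, in WORDs
        WORD  mtNoObjects;     // number of objects the player must track
        DWORD mtMaxRecord;     // size of the largest record, in WORDs
        WORD  mtNoParameters;  // reserved
    } METAHEADER;

    typedef struct tagMETARECORD {
        DWORD rdSize;          // this record's size, in WORDs
        WORD  rdFunction;      // which GDI function the record invokes
        WORD  rdParm[1];       // that function's parameters (variable length)
    } METARECORD;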

WMF files originated with early 16-bit versions of Windows that implemented single-threaded cooperative multitasking. In that programming environment a process can’t perform two tasks, such as printing a document and displaying a print-cancel dialog, concurrently. Instead, it has to manually interleave the tasks, periodically polling to see if the user has asked to cancel the printing. The programming model for printing in Windows therefore has the concept of an abort procedure that an application can set before calling the printing API. If such a procedure is registered, Windows periodically calls it to give the application a chance to signal that it wants the print job cancelled. Otherwise there would be no way to abort a long-running print job.
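To make the model concrete, here’s a minimal sketch of the classic Win32 printing pattern built around an abort procedure (an illustration only, not code from any particular application):

    #include <windows.h>

    static BOOL g_userCancelled = FALSE;   // set by a "Cancel Printing" dialog

    // GDI calls this periodically while the job spools. Returning TRUE means
    // "keep printing"; returning FALSE aborts the job.
    BOOL CALLBACK MyAbortProc(HDC hdcPrn, int iError)
    {
        MSG msg;
        // Pump messages so the cancel dialog stays responsive.
        while (!g_userCancelled && PeekMessage(&msg, NULL, 0, 0, PM_REMOVE)) {
            TranslateMessage(&msg);
            DispatchMessage(&msg);
        }
        return !g_userCancelled;
    }

    void PrintOnePage(HDC hdcPrn)
    {
        DOCINFO di = { sizeof(di) };
        di.lpszDocName = TEXT("Sample document");

        SetAbortProc(hdcPrn, MyAbortProc);      // register before StartDoc
        if (StartDoc(hdcPrn, &di) > 0) {
            StartPage(hdcPrn);
            TextOut(hdcPrn, 100, 100, TEXT("Hello, printer"), 14);
            EndPage(hdcPrn);
            EndDoc(hdcPrn);
        }
    }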

The WMF vulnerability stems from the combination of three facts: WMF supports the SetAbortProc API, which is the GDI call that registers an abort procedure; Windows expects the abort procedure’s code to be stored directly in the SetAbortProc WMF record; and Windows will invoke the procedure under certain conditions immediately after processing the record. Thus, if an attacker can get your computer to execute their WMF file through Internet Explorer or Outlook, for example, they can make your system execute arbitrary Windows commands, including downloading malicious applications and launching them.

Steve Gibson’s intentional backdoor theory is based on four suspicious observations he made regarding the vulnerability and the behavior of his tests with WMF files that contain a SetAbortProc record:

  1. There is no need for WMF files to include support for the SetAbortProc API.
  2. Even if an abort procedure is set by a WMF file, Windows shouldn’t execute it unless some abort condition is triggered, which should never occur when executing a WMF file.
  3. He could only get his WMF file’s abort procedure to execute when he specified certain invalid values for the size of the record containing the SetAbortProc command.
  4. Windows executes code embedded within the SetAbortProc record rather than expecting the record to reference a procedure within the application executing the WMF file.

Steve’s belief that WMF files should not support the SetAbortProc API comes from the documentation for how Windows calls an abort procedure registered via SetAbortProc:

It [the abort proc] is called when a print job is to be cancelled during spooling.

The statement implies that Windows detects that a user or printer wants to cancel a print job and informs the application by executing the registered abort procedure. Steve echoes this understanding in a posting on his website’s newsgroup:

[the abort proc] is the address of an application-provided "callback" -- a subroutine provided by the application that is expressly designed to asynchronously accept the news and notification of a printing job being aborted for whatever reason.

Steve reasoned that WMF files are rendered to screens, not printers, and so it makes no sense to abort their execution. Further, his tests showed that Windows calls the abort procedure registered by a WMF file immediately, when there’s no apparent cause for cancellation.

WMF files can be directed at a printer, however, and what’s more, the abort-procedure documentation quoted above is misleading. A more accurate description appears in the Microsoft documentation that covers the basic steps for writing code that prints to a printer:

After the application registers the AbortProc abort procedure, GDI calls the function periodically during the printing process to determine whether to cancel the job.

Thus, the abort procedure really works both ways, providing Windows a way to notify an application of printing errors and the application a way to notify Windows that it wants to cancel printing. With this interpretation Windows’ execution of the abort procedure immediately after one is registered makes sense: Windows is calling the procedure to ask whether the playback of the rest of the metafile should be aborted.

Even still, the question remains as to why WMF files implement the SetAbortProc GDI function at all. My belief is that Microsoft developers decided to implement as much of the GDI function set as possible. Including SetAbortProc makes sense for the same reason that abort procedures for printing make sense: WMF files can consist of many records containing complex GDI commands that can take a long time to execute, especially when sent to a printer and on old hardware like the kind on which the cooperatively multitasked Windows 3.1 operating system ran. The abort procedure gives applications the ability to monitor the progress of a playback and to unilaterally abort it if the user makes UI choices that make a complete playback unnecessary. In addition, if a WMF file is sent to a printer and there’s a printer error, Windows must have a way to know that an application wants to cancel WMF playback, which is another reason to invoke the abort procedure from within the PlayMetaFile loop. This Microsoft article from 1992 confirms that the behavior is by design.

I’ve addressed the first two of Steve’s observations, but what about his claim that the abort procedure only executes when the SetAbortProc record contains certain invalid record sizes? I’ve analyzed the control flow of the PlayMetaFile function that executes WMF file records and found that, if an abort procedure is registered, it calls it after executing each record except the last record of the file. That behavior makes sense since there’s no need to ask an application if playback should be aborted when the playback is already completed.

Steve’s example WMF file contains only one record, the one that specifies SetAbortProc, so under normal circumstances PlayMetaFile will never call his abort procedure. The record sizes he found that trigger its execution cause PlayMetaFile to incorrectly advance its pointer into the WMF file so that it believes there are more records to process, whereas the values he used that don’t trigger execution land the pointer on data indicating there are no more records. So his assertion that only certain magic values open the backdoor is wrong.
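A simplified sketch of the playback loop shows why. This is a reconstruction for illustration only, not Microsoft’s source, and the function name PlayRecords, its parameters, and the end-of-file check are stand-ins for GDI’s internal bookkeeping:

    #include <windows.h>

    // Rough reconstruction of PlayMetaFile's record loop (not actual GDI code).
    static void PlayRecords(HDC hdc, METARECORD *firstRec, BYTE *endOfRecords,
                            HANDLETABLE *handleTable, UINT numObjects,
                            ABORTPROC abortProc)
    {
        METARECORD *rec = firstRec;
        while ((BYTE *)rec < endOfRecords) {
            PlayMetaFileRecord(hdc, handleTable, rec, numObjects);

            // Advance by the record's self-reported size (rdSize is in WORDs).
            rec = (METARECORD *)((WORD *)rec + rec->rdSize);

            // The abort procedure is polled after every record except the last,
            // so a bogus rdSize that leaves the pointer looking like more
            // records remain is exactly what gets a just-registered procedure
            // invoked.
            if ((BYTE *)rec < endOfRecords && abortProc != NULL &&
                    !abortProc(hdc, 0))
                break;
        }
    }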

The remaining question is why PlayMetaFile expects the abort procedure to be in-lined in the metafile. It’s that fact that allows a hacker to transport malicious code within a WMF file. The actual reason is lost with the original developer of the API, but my guess is that he or she was being as flexible as possible. When a WMF file is generated in memory and played back by the application in the same run, as it would be to create a graphics macro and use it multiple times, it generally makes no difference whether the procedure is copied into the metafile or not.

For the code in on-disk WMF files to work, any references it makes to data or code, such as Windows functions, must be to hard-coded addresses. This means that WMF file code won’t work if Windows system DLLs change in size or load into different locations in memory, and therefore WMF vulnerability exploits only work on specific patch levels of Windows. While this might make an argument against a design that includes the abort code in the WMF file, things were different when the format was architected. In the Windows 3.1 “large” memory model code is inherently location-independent and Windows was never patched, so both Windows and an application could simply copy an application function into the WMF file and assume it would work when played back by the same application in a later run session. In any case, it’s not clear that the developers envisioned applications creating on-disk metafiles with abort procedures. Also, as Microsoft’s Stephen Toulouse pointed out in Microsoft’s rebuttal to Steve’s claims, the security landscape in the early 1990s was very different than today’s and all code, including that stored in a WMF file, was inherently trusted.

The vulnerability is subtle enough that the WINE project, whose intent is to implement the Windows API for non-Windows environments, copied it verbatim in their implementation of PlayMetaFile. A secret backdoor would probably have been noticed by the WINE group, and given a choice of believing there was malicious intent or poor design behind this implementation, I’ll pick poor design. After all, there are plenty of such examples all throughout the Windows API, especially in the part of the API that has its roots in Windows 3.1. The bottom line is that I'm convinced that this behavior, while intentional, is not a secret backdoor.

posted by Mark Russinovich @ 11:05 PM (102) comments

