Hackers are using open source software that’s popular with video game cheaters to allow their Windows-based malware to bypass restrictions Microsoft put in place to prevent such infections.
The software comes in the form of two tools, both available on GitHub. Cheaters use them to digitally sign malicious system drivers so they can modify video games in ways that give players an unfair advantage. The drivers clear the considerable hurdle required for the cheat code to run inside the Windows kernel, the fortified layer of the operating system reserved for its most critical and sensitive functions.
Researchers from Cisco’s Talos security team said Tuesday that multiple Chinese-speaking threat groups have repurposed the tools—one called HookSignTool and the other FuckCertVerifyTimeValidity. Instead of using the kernel access for cheating, the threat actors use it to give their malware capabilities it wouldn’t otherwise have.
A new way to bypass Windows driver restrictions
“During our research we identified threat actors leveraging HookSignTool and FuckCertVerifyTimeValidity, signature timestamp forging tools that have been publicly available since 2019 and 2018 respectively, to deploy these malicious drivers,” the researchers wrote. “While they have gained popularity within the game cheat development community, we have observed the use of these tools on malicious Windows drivers unrelated to game cheats.”
With the debut of Windows Vista, Microsoft enacted strict new restrictions on the loading of system drivers that can run in kernel mode. Drivers are essential for antivirus software, printers, and other kinds of software and peripherals to interact with the operating system, but they have also long been a convenient inroad for hackers to run malware in kernel mode. These inroads are available to hackers post-exploit, meaning once they’ve already gained administrative privileges on a targeted machine.
While attackers who gain such privileges can steal passwords and take other liberties, their malware typically must run in the Windows kernel to perform a large number of more advanced tasks. Under the policy put in place with Vista, all such drivers can be loaded only after they’ve been approved in advance by Microsoft and then digitally signed by a trusted certificate authority to verify they are safe.
Malware developers with admin privileges already had one well-known way to easily bypass the driver restrictions: a technique known as “bring your own vulnerable driver.” It works by loading a publicly available third-party driver that has already been signed and is later found to contain a vulnerability allowing system takeover. The hackers install the driver post-exploit and then exploit the driver vulnerability to inject their malware into the Windows kernel.
Although the technique has existed for more than a decade, Microsoft has yet to devise working defenses or provide actionable guidance on mitigating the threat, despite one of its executives publicly lauding the effectiveness of Windows defenses against it.
The technique Talos has discovered represents a new way to bypass Windows driver restrictions. It exploits a loophole that has existed since the start of the policy that grandfathers in older drivers even when they haven’t been reviewed for safety by Microsoft. The exception, designed to ensure older software was still able to run on Windows systems, is triggered when a driver is signed by a Windows-trusted certificate authority prior to July 29, 2015.
“If a driver is successfully signed this way, it will not be prevented from being installed and started as a service,” Tuesday’s Talos post explained. “As a result, multiple open source tools have been developed to exploit this loophole. This is a known technique though often overlooked despite posing a serious threat to Windows systems and being relatively easy to perform due in part to the tooling being publicly available.”
When you come to a roadblock, take a detour
HookSignTool was originally released in 2019 on a Chinese-speaking software-cracking forum by “JemmyLoveJenny,” the moniker used by its author. The tool has been available on GitHub since 2020; FuckCertVerifyTimeValidity has been available on GitHub since 2019. The underlying code in both tools incorporates a software package known as Microsoft Detours to intercept the driver-signing process at critical points so that key pieces of data can be modified along the way. The most important change the tools make is to the date the signing took place.
The threat actors write malware they want to run in kernel mode and sign it with an existing code-signing certificate that expired or was issued prior to July 29, 2015. Normally, the resulting signature wouldn’t be valid because it wouldn’t include cryptographic proof that Microsoft had verified the driver as safe. Using HookSignTool or FuckCertVerifyTimeValidity, the hackers (or game cheaters) tamper with the CertVerifyTimeValidity function used during the signing process to determine whether the code qualifies for the driver-signing exception. Once the malicious driver has been signed using the tool, Windows won’t enforce the requirement that it also be signed through Microsoft’s developer portal.
Talos researchers explained:
By attaching to the CertVerifyTimeValidity function, HookSignTool performs a “detour” to a custom implementation of CertVerifyTimeValidity named NewCertVerifyTimeValidity. This allows HookSignTool to pass a custom time in the “pTimeToVerify” parameter, thereby allowing an invalid time to be verified.
To change the signing timestamp during execution, HookSignTool again uses the DetourAttach function to attach to the Windows API function GetLocalTime and detours to another function named NewGetLocalTime. This detour replaces the local time with the date supplied by the user that is within the valid range for the certificate being used. Once both GetLocalTime and CertVerifyTimeValidity are detoured, HookSignTool can supply and verify an illegitimate timestamp for the target binary.
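For readers who haven’t worked with Detours, the pattern the researchers describe looks roughly like the sketch below: a DLL that, when loaded into the signing tool’s process, swaps GetLocalTime for a replacement reporting an attacker-chosen date. This is an illustrative reconstruction based on Talos’ description, not HookSignTool’s actual source; the hard-coded date is a placeholder.

```cpp
// detour_sketch.cpp - minimal illustration of the Detours hooking pattern the
// Talos researchers describe. Build as a DLL and link against detours.lib.
// Illustrative only; this is not HookSignTool's source code.
#include <windows.h>
#include <detours.h>

// Pointer to the genuine API; DetourAttach rewrites it to point at a trampoline.
static VOID (WINAPI *TrueGetLocalTime)(LPSYSTEMTIME) = GetLocalTime;

// Placeholder date the hook reports instead of the real clock
// (year, month, day-of-week, day, hour, minute, second, millisecond).
static SYSTEMTIME g_spoofedTime = { 2014, 6, 0, 15, 12, 0, 0, 0 };

// Replacement that Detours routes GetLocalTime calls into.
static VOID WINAPI NewGetLocalTime(LPSYSTEMTIME lpSystemTime)
{
    *lpSystemTime = g_spoofedTime;  // hand the caller the chosen date
}

BOOL WINAPI DllMain(HINSTANCE, DWORD reason, LPVOID)
{
    if (reason == DLL_PROCESS_ATTACH) {
        DetourTransactionBegin();
        DetourUpdateThread(GetCurrentThread());
        DetourAttach(&(PVOID&)TrueGetLocalTime, NewGetLocalTime);  // install the detour
        DetourTransactionCommit();
    } else if (reason == DLL_PROCESS_DETACH) {
        DetourTransactionBegin();
        DetourUpdateThread(GetCurrentThread());
        DetourDetach(&(PVOID&)TrueGetLocalTime, NewGetLocalTime);  // remove it
        DetourTransactionCommit();
    }
    return TRUE;
}
```

Per the researchers’ description, HookSignTool applies the same DetourAttach pattern to CertVerifyTimeValidity, so the forged date is also accepted as falling within the certificate’s validity window.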
The researchers continued:
FuckCertVerifyTimeValidity works in a similar fashion to HookSignTool in that it uses the Microsoft Detours package to attach to the “CertVerifyTimeValidity” API call and sets the timestamp to a chosen date. Like HookSignTool, a function must be added to the import table of the legitimate signing tool, but in this case it’s “FuckCertVerifyTimeValidity.dll!test”. Unlike HookSignTool, FuckCertVerifyTimeValidity does not leave artifacts in the binary that it signs, making it very difficult to identify when this tool has been used.
The technique works with any certificate as long as (1) it’s compromised, meaning the attacker has obtained the password and private key corresponding to it, and (2) the certificate expired or was issued before July 29, 2015. Some of the most commonly abused certificates are a batch of 13 that were included in a forked version of FuckCertVerifyTimeValidity. Game cheaters have been using them for years, and more recently threat actors have been doing the same thing.
Three of the certificates came from the 2015 hack of Hacking Team, a developer of software exploits that it sold to governments around the world. The remaining 10 have been available for years on a Chinese-language software-cracking forum. The certificates are:
● Open Source Developer, William Zoltan
● Luca Marcone
● HT Srl
● Beijing JoinHope Image Technology Ltd.
● Shenzhen Luyoudashi Technology Co., Ltd.
● Jiangsu innovation safety assessment Co., Ltd.
● Baoji zhihengtaiye co.,ltd
● Zhuhai liancheng Technology Co., Ltd.
● Fuqing Yuntan Network Tech Co.,Ltd.
● Beijing Chunbai Technology Development Co., Ltd
● 绍兴易游网络科技有限公司
● 善君 韦
● NHN USA Inc.
Whacking moles
Talos researchers notified Microsoft of their findings, and in response the software maker released a Windows update on Tuesday that blocks all of the certificates Talos reported.
“Microsoft has released Windows Security updates (see Security Updates table) that untrust drivers and driver signing certificates for the impacted files and has suspended the partners’ seller accounts,” company officials wrote in an advisory. “Additionally, Microsoft has implemented blocking detections (Microsoft Defender 1.391.3822.0 and newer) to help protect customers from legitimately signed drivers that have been used maliciously in post-exploit activity.” More information is here.
As I noted last October, Windows was failing to properly download and apply updates to the driver blocklist even though one of its executives provided public assurances to the contrary. Microsoft’s attempts to block signed drivers used maliciously have relied on either (1) what’s called memory integrity, also known as HVCI, short for Hypervisor-Protected Code Integrity, or (2) a separate mechanism for preventing bad drivers from being written to disk, known as ASR, or Attack Surface Reduction.
While reporting the October article, I was able to install some of the drivers Microsoft said were blocked on a fully updated Windows 10 machine. Security researchers ran tests that likewise found Windows installed supposedly blocked drivers just fine. One of those researchers was Will Dormann, who found two gaping holes in the protections. One was that the list of drivers hadn’t been updated since 2019, despite Microsoft assurances to the contrary. The other: He could find no evidence of ASR working at all.
“Microsoft vulnerable driver blocking was essentially completely broken last fall,” Dormann told me on Tuesday. “But they’ve fixed it, so that the block list is working and distributed to endpoints.”
Microsoft’s fix, he said, came in the form of distributing a file named driversipolicy.p7b to endpoints “once or twice per year.” With Tuesday’s update, Microsoft released a new file named driver.stl that contains the certificates found by Talos.
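Readers who want to see when that policy last changed on their own machine can check the file’s timestamp directly. The following is a rough sketch, not an official Microsoft tool; the path is where Windows typically stores the blocklist policy Dormann describes and may differ on some systems, and an old last-write date is only a heuristic hint that the endpoint hasn’t received a newer blocklist.

```cpp
// blocklist_check.cpp - rough sketch: report when the vulnerable-driver
// blocklist policy file on this endpoint was last written.
// The path below is an assumption based on where Windows usually keeps it.
#include <windows.h>
#include <cstdio>

int main()
{
    const wchar_t *path =
        L"C:\\Windows\\System32\\CodeIntegrity\\driversipolicy.p7b";

    WIN32_FILE_ATTRIBUTE_DATA info{};
    if (!GetFileAttributesExW(path, GetFileExInfoStandard, &info)) {
        wprintf(L"Blocklist file not found or unreadable: %ls\n", path);
        return 1;
    }

    SYSTEMTIME st{};
    FileTimeToSystemTime(&info.ftLastWriteTime, &st);  // FILETIME is UTC
    wprintf(L"%ls last written: %04u-%02u-%02u %02u:%02u UTC\n",
            path, (unsigned)st.wYear, (unsigned)st.wMonth, (unsigned)st.wDay,
            (unsigned)st.wHour, (unsigned)st.wMinute);
    return 0;
}
```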
Microsoft’s actions continue the company’s whack-a-mole approach to the problem of malicious drivers used in post-exploit scenarios, meaning after a hacker has already gained admin privileges. The approach is to block drivers known to be used maliciously but to do nothing to close the gaping loophole. That leaves attackers free to simply use a new batch of drivers to do the same thing. As demonstrated in the past and again now, Microsoft often fails to detect drivers that have been used maliciously for years.
In fairness to Microsoft, a working solution is elusive because many vulnerable drivers continue to be used legitimately by large numbers of paying customers. A revocation of such drivers could cause crucial software worldwide to suddenly stop working.
Given that drivers can be used maliciously only after a hacker has obtained system rights, the best defense is to prevent compromises in the first place. People who want to check if their systems have been infected through the technique Talos discovered can search for artifacts that one of the two tools sometimes leaves behind. As Microsoft noted, many AV programs can also detect apps installed with the drivers.