It currently takes security specialists an average of 312 days to discover a software vulnerability, and it can take weeks, months, or sometimes years for that vulnerability to be fixed across all exposed systems. That leaves ample time for malware, ransomware, and other malicious software to exploit the flaw and maximize damage. The Cyber Grand Challenge was designed to improve the current state of cybersecurity by testing the concept of using automation to discover and repair software vulnerabilities, a technological ability that does not currently exist. The lessons learned in the Cyber Grand Challenge could lead to a world in which computer viruses, malware, and other attacks are discovered and fixed in a matter of days, hours, or even seconds.
Source: DARPA
Starting with more than 100 teams made up of some of the top security researchers and hackers in the world, the Defense Advanced Research Projects Agency (DARPA) pitted seven finalist teams against each other in the Cyber Grand Challenge final event, held August 4 in Las Vegas. During the competition, each team's Cyber Reasoning System (CRS) automatically identified software flaws and scanned a purpose-built, air-gapped network to identify affected hosts. For nearly twelve hours, teams were scored on how capably their systems protected hosts, scanned the network for vulnerabilities, and maintained the correct function of software.
The Cyber Grand Challenge featured never-before-seen autonomous systems and highly trained experts, many of whom compete regularly on a global "Capture the Flag" tournament circuit. Drawing on the best traditions of computer security competitions, CGC challenged these experts to have their specially engineered systems compete against each other to evaluate software, test for vulnerabilities, generate security patches, and apply them to protected computers on a network. In the end, CGC validated the concept of automated cyber defense, bridging the gap between the best security software and cutting-edge program analysis research.
The winning computer system, dubbed Mayhem, was created by a team known as ForAllSecure, one of seven teams that competed for nearly $4 million in prizes in today's all-day competition, held in front of 5,000 computer security professionals and others at the Paris Las Vegas Conference Center.
Xandra, a computer system designed by team TECHx of Ithaca, N.Y., and Charlottesville, Va., was declared the presumptive second-place winner. And Mechanical Phish, a system designed by team Shellphish of Santa Barbara, Calif., was named the presumptive third-place winner. Judges will spend the night verifying those preliminary results, and winners will be officially crowned at an award ceremony Friday morning, immediately before the launch of DEF CON, the nation's largest hacker convention, also being hosted at the Paris Hotel. First place in the CGC carries a cash award of $2 million; the second- and third-place teams will receive $1 million and $750,000, respectively.
At Friday’s ceremony, DEF CON organizers are expected to formally invite Mayhem to participate in this year’s DEF CON Capture the Flag competition, marking the first time a machine will be allowed to play in that historically all-human tournament.
“I’m enormously gratified that we achieved CGC’s primary goal, which was to provide clear proof of principle that machine-speed, scalable cyber defense is indeed possible,” said Mike Walker, the DARPA program manager who launched the challenge in 2013. “The effort by the teams, the DARPA leadership and staff, and all the hundreds of people who helped make this unique, open-to-the-public test happen was enormous. I’m confident it will speed the day when networked attackers no longer have the inherent advantage they enjoy today.”
DARPA’s Cyber Grand Challenge was designed to accelerate the development of advanced, autonomous systems that can detect, evaluate, and patch software vulnerabilities before adversaries have a chance to exploit them. The seven competing teams in today’s final event were composed of white-hat hackers, academics, and private-sector cyber systems experts.
The need for automated, scalable, machine-speed vulnerability detection and patching is large and growing fast as more and more systems, from household appliances to major military platforms, get connected to and become dependent on the internet. Today, the process of finding and countering bugs, hacks, and other cyber infection vectors is still effectively artisanal. Professional bug hunters, security coders, and other security experts work long hours, searching millions of lines of code to find and fix vulnerabilities before they can be exploited by malicious actors.
The Heartbleed security bug existed in many of the world’s computer systems for nearly two and a half years, for example, before it was discovered and a fix circulated in spring 2014. By that time, the bug had rendered an estimated half million of the internet’s secure servers vulnerable to theft and other mischief. Analysts have estimated that, on average, such flaws go unremediated for 10 months before being discovered and patched, giving nefarious actors ample opportunity to wreak havoc in affected systems before they move on to exploit new terrain.
Today's event was the first head-to-head competition among developers of some of the most sophisticated automated bug-hunting systems ever built. For almost 10 hours, competitors played the classic cybersecurity exercise of Capture the Flag in a specially created computer testbed laden with an array of bugs hidden inside custom, never-before-analyzed software. The machines were challenged to find and patch, within seconds rather than the usual months, flawed code that was vulnerable to being hacked, and to find their opponents' weaknesses before the defending systems did. The entire event was visualized for attendees on giant monitors and livestreamed for remote viewers, with expert "sportscasters" documenting the historic competition.