Virtual CSI: How Real-Life Digital Forensics Investigators Track Down Hackers
As society becomes increasingly data-driven, the importance of ensuring this data’s security and fidelity grows exponentially. While prevention is still the best medicine, tracking down those who have already entered your systems is also vital. This has driven the growth of digital forensics - a field of cybersecurity focused on tracking down those who have hijacked systems and networks for their own nefarious ends.
Television shows such as CSI and NCIS have long presented a stylised version of digital-forensics teams, doing battle with mysterious hackers who bombard their screens with flashing red skull-and-crossbones graphics and other such clichés. The reality, however, is far more subtle, further-reaching, and more important.
To find out the realities of life in the field, BDJ spoke to Brett Shavers, a former police officer and digital forensics investigator assigned to various state and federal cases. Brett has written several award-winning books on digital forensics and incident response (DFIR), and is an adjunct professor in digital forensics at the University of Washington.
An Evolving Field
Shavers founded his police department’s digital-forensics division in a small storage closet. Today, he commands large crowds at conferences, companies, and universities. He has been on the frontline during the rapid evolution of computer security and analysis, and has watched the talent needed to deal with it evolve too.
“The granularity of information we can pull from data today is incredible compared to years ago, when I first started in the field,” says Shavers. “We’ve learnt to dig deep in logs and databases to determine incredible details of computer-user activity that we never before knew existed on storage media. Today’s tools are amazing at being able to recover this data and give output that is easy to interpret. Coupled with the fact that universities now award advanced degrees in digital forensics, we have both the technology and educated practitioners we didn’t have a decade ago.”
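As an illustration of the kind of log mining Shavers describes, the sketch below parses a hypothetical activity log into a per-user timeline. The log format, field names, and events are invented for this example; real artefacts (event logs, browser databases, registry hives) each need their own dedicated parser.

```python
import re
from collections import defaultdict

# Hypothetical log lines, invented purely for illustration.
LOG_LINES = [
    "2019-03-01T09:12:03 alice LOGIN workstation-7",
    "2019-03-01T09:14:41 alice USB_MOUNT workstation-7",
    "2019-03-01T17:02:19 alice LOGOUT workstation-7",
]

# Assumed layout: timestamp, user, action, host, space-separated.
PATTERN = re.compile(r"^(\S+) (\S+) (\S+) (\S+)$")

def timeline(lines):
    """Group parsed events by user to reconstruct their activity over time."""
    events = defaultdict(list)
    for line in lines:
        match = PATTERN.match(line)
        if match:
            timestamp, user, action, host = match.groups()
            events[user].append((timestamp, action, host))
    return dict(events)
```

Real forensic tooling does essentially this at scale, correlating many such parsed timelines across different artefact sources.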
“We’ve learnt to dig deep in logs and databases to determine incredible details of computer-user activity that we never before knew existed on storage media”
While the field of cybersecurity has evolved, however, the scale of data collected today is such that individuals can feel powerless to protect themselves. Public knowledge needs to keep pace with technological progress, or that progress will count for nothing.
“The average person is overwhelmed by how much data is stored and can be recovered by not only the government, but by criminal actors,” says Shavers. “Indeed, the amount of information available overwhelms them to the point that they may feel there is nothing they can do to protect their data, especially as it’s a constant effort to maintain control of personal information. Until we realise that personal protection requires constant effort, few will make the effort last longer than an initial or sporadic short-run effort.”
Until this happens, there will inevitably be cyber attacks. And this means digital forensics.
Industry Need For Digital Forensics
This lack of understanding of what constitutes a sustainable plan to safeguard data is inevitably carried over into the workplace. Smartphones let personal and professional email accounts sync onto one device, and apps such as Google Drive further blur the line between work and personal use. Granted, many workers are still issued with company smartphones, but this does not necessarily mean their usage of those phones is any better when it comes to adequately protecting data.
“I typically see practically nothing proactive being done – or if it is done, companies doing it completely wrong – or I see incredible work in setting up defenses against malicious actors”
DFIR can become a necessary expenditure for a company when the policies governing employee data practices are unclear or non-existent. This lack of safeguarding gives malicious actors a much greater chance of compromising internal systems. Despite the flexibility companies have in creating digital-defense strategies, Shavers notes that the reality is far more black and white:
“I haven’t seen much of a middle ground. I typically see practically nothing proactive being done – or if it is done, companies doing it completely wrong – or I see incredible work in setting up defenses against malicious actors in incident response.
“From what I have seen, it really depends on the individual company as to how much effort and expense they choose to go to up front to prevent theft and damage to their systems. I don’t believe that being a victim in the past makes as much difference as the leadership choosing to be proactive.”
Digital Forensics vs Manufacturer Encryption
Good practices such as not opening randomly-received .exe files, or never sending sensitive details in reply to unsolicited emails, have long been hammered home to consumers. With Google, Facebook, the NSA and many others now harvesting tremendous amounts of user data, however, the idea of truly private user data is increasingly laughable. It’s no longer just a case of stopping hackers stealing information from a home computer, but of understanding the breadth of information that can be collected with implied consent.
Following the 2015 San Bernardino terrorist attack in the US, the FBI and Apple became locked in a series of disputes over whether manufacturers can be compelled to unlock mobile phones for law enforcement agencies. iPhones that cryptographically protect their data cannot be forcibly unlocked, even by Apple itself. The FBI requested that Apple write dedicated software to circumvent the encryption and four-digit passcode, allowing complete access to the device of one of the attack’s perpetrators. Apple declined, and the FBI brought a case against the company in the United States District Court for the Central District of California.
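The technical crux here is that a four-digit passcode on its own offers only 10,000 combinations; the real protection comes from the device’s rate limiting and erase-after-failures safeguards, which the requested software would have disabled. A minimal sketch of why the keyspace alone is no defence (the key-derivation function, salt, and passcode below are illustrative assumptions, not Apple’s actual scheme):

```python
import hashlib

def derive_key(passcode, salt):
    # Stand-in key-derivation function; all parameters are illustrative only.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 10_000)

def brute_force(target_key, salt):
    # With no rate limiting, all 10,000 four-digit codes can simply be tried.
    for n in range(10_000):
        candidate = f"{n:04d}"
        if derive_key(candidate, salt) == target_key:
            return candidate
    return None

salt = b"device-unique-salt"
locked = derive_key("0042", salt)
print(brute_force(locked, salt))  # prints 0042
```

On real hardware, the per-attempt delays and the auto-erase threshold are what turn this trivial search into an impractical one.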
This legal situation, in particular, highlighted the degree of encryption that iPhone users perhaps didn’t expect, and the lengths to which certain manufacturers will go to protect their users’ privacy.
“Ideally, private companies protect the data of their users because the users are paying for the services,” says Shavers. “Practically speaking, any government – local, state, and federal – can compel any private company to provide any data that a court demands through subpoenas and search warrants. The only options for private companies to not cooperate are to argue in court against government demands, or stop providing services that collect user data.”
“Practically speaking, any government – local, state, and federal – can compel any private company to provide any data that a court demands through subpoenas and search warrants”
The above case highlights a complex matter for digital forensics. As the adoption of consumer devices such as smartphones continues to grow, so will the sophistication and number of applications they can run. Edward Snowden alleged in 2013 that various surveillance agencies, including the UK’s GCHQ, could access almost all user data on iOS, Android and BlackBerry phones, which likely contributed to Apple’s increased security and encryption standards in iOS 9. Digital forensics teams are now faced with levels of encryption that the manufacturers themselves don’t have the root access to break.
Legislation and Digital Forensics
Lawmakers are at odds as to how to deal with this reality. Following the aforementioned FBI-Apple case, US Senator Dianne Feinstein has twice tried to spearhead new legislation that would require manufacturers such as Apple to provide access to users’ encrypted data should a legal demand arise. Her initial attempt in 2017 didn’t make it to the floor of Congress, and she initiated a new attempt in 2018.
“Legislation will always be behind current technology,” Shavers says, “much like anyone working in the field is behind, simply because major advances occur faster than any law or person can effectively respond. Knowing this is the norm shouldn’t prevent attempts to keep up, but should encourage constant improvements in response to incidents, and constant updates to laws relevant to changes in technology.”
As the FBI-Apple case highlights, updating laws as technology progresses isn’t easily achieved. Encryption technology has not only progressed; the complexity of products from the likes of Apple now gives consumers protection standards that new legislation would seek to remove.
“One of the fields that needs improvement is the updating of outdated laws or, in some cases, their complete removal – some not only don’t apply any more but may criminalise acts that, by virtue of changing technology, are no longer criminal. Ethical hacking, for instance,” says Shavers.
“Some [laws] may criminalise acts that, by virtue of changing technology, are no longer criminal. Ethical hacking, for instance”
The skill set of digital forensics lends itself well to hacking, which many large companies have recognised and sought to harness. Ethical hacking has allowed companies such as Google and Facebook to reward individuals who discover exploits in major consumer platforms and products through so-called ‘bug bounty’ programs.
The Future of Digital Forensics
With data production increasing, there is no question about the ongoing need to decrypt and analyse devices and online storage resources. However, what remains to be seen is the ability of digital-forensics experts to effectively tackle the vast data streams.
“With electronic data propagating like bunnies, I see digital forensics focusing more on relevant data captures over the more traditional complete data captures,” says Shavers. “For example, rather than creating full images of terabytes of data to later sift for evidence, triage and disperse, it may become more common to selectively target known areas of compromise or locations where electronic evidence is typically stored.”
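A selective capture of the kind Shavers describes might look like the sketch below: instead of imaging whole disks, it walks a short list of target directories and records each file’s metadata and SHA-256 hash, fixing its content at collection time. The target paths here are hypothetical; real triage targets depend on the operating system and the case.

```python
import hashlib
import os

# Hypothetical "known areas of compromise" to triage first;
# real targets vary by operating system and case.
TARGET_DIRS = ["./browser_history", "./system_logs"]

def collect(paths):
    """Selectively record metadata and hashes rather than imaging the whole disk."""
    inventory = []
    for root_dir in paths:
        for dirpath, _, filenames in os.walk(root_dir):
            for name in sorted(filenames):
                path = os.path.join(dirpath, name)
                with open(path, "rb") as handle:
                    digest = hashlib.sha256(handle.read()).hexdigest()
                inventory.append({
                    "path": path,
                    "size": os.path.getsize(path),
                    "sha256": digest,  # fixes the file's content at collection time
                })
    return inventory
```

The recorded hashes also let an examiner later demonstrate that the collected evidence hasn’t changed since acquisition.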
Another area that holds the potential to benefit the digital-forensics field is the adoption of blockchain on a wider personal and industrial scale. In any kind of investigation, a forensics expert is concerned with the ease of access to the data in question, but also with ascertaining the data’s authenticity and fidelity. Because of blockchain’s foundation of providing an immutable ledger of all transactions, it could theoretically act as a whole new medium via which digital-forensics professionals can streamline their data analysis.
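The property at work is the hash chain at blockchain’s core: each entry commits to the one before it, so altering any earlier record invalidates everything after it. A toy evidence log illustrating the idea (a full blockchain adds distributed consensus on top, which this sketch omits):

```python
import hashlib
import json

GENESIS = "0" * 64

def add_entry(chain, record):
    """Append a record whose hash also covers the previous entry's hash."""
    prev = chain[-1]["hash"] if chain else GENESIS
    body = json.dumps({"record": record, "prev": prev}, sort_keys=True)
    chain.append({"record": record, "prev": prev,
                  "hash": hashlib.sha256(body.encode()).hexdigest()})
    return chain

def verify(chain):
    """Recompute every link; any altered record breaks all later checks."""
    prev = GENESIS
    for entry in chain:
        body = json.dumps({"record": entry["record"], "prev": entry["prev"]},
                          sort_keys=True)
        if entry["prev"] != prev or entry["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
        prev = entry["hash"]
    return True
```

An evidence log built this way makes tampering evident, though it cannot by itself prove the first entry was honest to begin with.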
“I can’t wait to see how blockchain technology will be practically applied to digital forensics and incident response,” says Shavers. “Up to this point, most of what we see is the marketing tactic of simply using blockchain as a buzzword. I do see some application of the blockchain in the field of DFIR, but I’m keen to see how companies will actually make a use case out of it.”
“I can’t wait to see how blockchain technology will be practically applied to digital forensics and incident response. Up to this point, most of what we see is the marketing tactic of simply using blockchain as a buzzword”
Blockchain’s infancy as a platform, combined with a saturation of speculation concerning its potential applications, does make it difficult to effectively predict just how useful it may prove to digital-forensics experts seeking new methods of collecting high-quality, uncompromised data. With experts such as Brett Shavers taking an interest in its development, however, it’s likely that companies that are proceeding with their own blockchain development will have factored in its digital-forensics implications.
Since its inception, social media has essentially been built on encouraging users to share their personal information publicly: status updates, photographs and videos, as well as more personal details like birthdays and interests. Facebook, in particular, has been successful at this. It has reassured users that it cares about their privacy while profiting from their sharing - and, as a number of controversies have shown, being embarrassingly open to data breaches.
This March, though, Mark Zuckerberg announced in a blog post that his new ‘vision’ for social networking was one of privacy. He compared Facebook and Instagram to a digital “town hall”, and said that people increasingly want to spend time in a digital “living room”. Zuckerberg’s announcement relates more to Facebook’s proposed plan to integrate the messaging elements of WhatsApp, Messenger and Instagram than to its news feed. The announcement was met with immediate scepticism and suspicion, thanks in large part to Facebook’s history. Its business model is predicated on the immense volumes of data it collects about its users - is it ready to give this up in the name of privacy?
Illustrations by Kseniya Forbender
To contact the editor responsible for this story:
Margarita Khartanovich at [email protected]