Digital forensics
Digital forensics is a branch of forensic science encompassing the recovery, investigation, examination, and analysis of material found in digital devices, often in relation to mobile devices and computer crime. The term "digital forensics" was originally used as a synonym for computer forensics but has been expanded to cover investigation of all devices capable of storing digital data. With roots in the personal computing revolution of the late 1970s and early 1980s, the discipline evolved in a haphazard manner during the 1990s, and it was not until the early 21st century that national policies emerged.
Digital forensics investigations have a variety of applications. The most common is to support or refute a hypothesis before criminal or civil courts. Criminal cases involve the alleged breaking of laws that are defined by legislation and enforced by the police and prosecuted by the state, such as murder, theft, and assault against the person. Civil cases, on the other hand, deal with protecting the rights and property of individuals, but may also be concerned with contractual disputes between commercial entities where a form of digital forensics referred to as electronic discovery may be involved.
Forensics may also feature in the private sector, such as during internal corporate investigations or intrusion investigations.
The technical aspect of an investigation is divided into several sub-branches related to the type of digital devices involved: computer forensics, network forensics, forensic data analysis, and mobile device forensics. The typical forensic process encompasses the seizure, forensic imaging, and analysis of digital media, followed by the production of a report of the collected evidence.
As well as identifying direct evidence of a crime, digital forensics can be used to attribute evidence to specific suspects, confirm alibis or statements, determine intent, identify sources, or authenticate documents. Investigations are much broader in scope than other areas of forensic analysis, often involving complex timelines or hypotheses.
History
Prior to the 1970s, crimes involving computers were dealt with using existing laws. The first computer crimes were recognized in the 1978 Florida Computer Crimes Act, which included legislation against the unauthorized modification or deletion of data on a computer system. Over the next few years, the range of computer crimes being committed increased, and laws were passed to deal with issues of copyright, privacy/harassment, and child pornography. It was not until the 1980s that federal laws began to incorporate computer offences. Canada was the first country to pass legislation, in 1983. This was followed by the US Federal Computer Fraud and Abuse Act in 1986, Australian amendments to their crimes acts in 1989, and the British Computer Misuse Act in 1990. Digital forensics methods are also increasingly being applied to preserve and authenticate born-digital cultural materials in heritage institutions.
1980s–1990s: Growth of the field
The growth in computer crime during the 1980s and 1990s caused law enforcement agencies to begin establishing specialized groups, usually at the national level, to handle the technical aspects of investigations. For example, in 1984, the FBI launched a Computer Analysis and Response Team, and the following year a computer crime department was set up within the British Metropolitan Police fraud squad. As well as being law enforcement professionals, many of the early members of these groups were also computer hobbyists and became responsible for the field's initial research and direction.

One of the first practical examples of digital forensics was Cliff Stoll's pursuit of hacker Markus Hess in 1986. Stoll, whose investigation made use of computer and network forensic techniques, was not a specialized examiner. Many of the earliest forensic examinations followed the same profile.
Throughout the 1990s, there was high demand for these new, and basic, investigative resources. The strain on central units led to the creation of regional, and even local, level groups to help handle the load. For example, the British National Hi-Tech Crime Unit was set up in 2001 to provide a national infrastructure for computer crime, with personnel located both centrally in London and with the various regional police forces.
During this period, the science of digital forensics grew from the ad-hoc tools and techniques developed by these hobbyist practitioners. This is in contrast to other forensics disciplines, which developed from work by the scientific community. It was not until 1992 that the term "computer forensics" was used in academic literature; a paper by Collier and Spaul attempted to justify this new discipline to the forensic science world. This swift development resulted in a lack of standardization and training. In his 1995 book, High-Technology Crime: Investigating Cases Involving Computers, K. Rosenblatt wrote the following:
2000s: Developing standards
Since 2000, in response to the need for standardization, various bodies and agencies have published guidelines for digital forensics. The Scientific Working Group on Digital Evidence produced a 2002 paper, Best Practices for Computer Forensics; this was followed, in 2005, by the publication of an ISO standard. A European-led international treaty, the Budapest Convention on Cybercrime, came into force in 2004 with the aim of reconciling national computer crime laws, investigative techniques, and international co-operation. The treaty has been signed by 43 nations and ratified by 16.

The issue of training also received attention. Commercial companies began to offer certification programs, and digital forensic analysis was included as a topic at the UK specialist investigator training facility, Centrex.
In the late 1990s, mobile devices became more widely available, advancing beyond simple communication devices, and were found to be rich sources of information, even for crimes not traditionally associated with digital forensics. Despite this, digital analysis of phones has lagged behind that of traditional computer media, largely due to problems over the proprietary nature of devices.
Focus has also shifted onto internet crime, particularly the risk of cyber warfare and cyberterrorism. A February 2010 report by the United States Joint Forces Command concluded the following:
The field of digital forensics still faces unresolved issues. A 2009 paper, "Digital Forensic Research: The Good, the Bad and the Unaddressed" by Peterson and Shenoi, identified a bias towards Windows operating systems in digital forensics research. In 2010, Simson Garfinkel identified issues facing digital investigations in the future, including the increasing size of digital media, the wide availability of encryption to consumers, a growing variety of operating systems and file formats, an increasing number of individuals owning multiple devices, and legal limitations on investigators. The paper also identified continued training issues, as well as the prohibitively high cost of entering the field.
Development of forensic tools
During the 1980s, very few specialized digital forensic tools existed. Consequently, investigators often performed live analysis on media, examining computers from within the operating system using existing sysadmin tools to extract evidence. This practice carried the risk of modifying data on the disk, either inadvertently or otherwise, which led to claims of evidence tampering. A number of tools were created during the early 1990s to address the problem.

The need for such software was first recognized in 1989 at the Federal Law Enforcement Training Center, resulting in the creation of IMDUMP and, in 1990, SafeBack. Similar software was developed in other countries; DIBS was released commercially in the UK in 1991, and Rob McKemmish released Fixed Disk Image free to Australian law enforcement. These tools allowed examiners to create an exact copy of a piece of digital media to work on, leaving the original disk intact for verification. By the end of the 1990s, as demand for digital evidence grew, more advanced commercial tools such as EnCase and FTK were developed, allowing analysts to examine copies of media without using any live forensics. More recently, a trend towards "live memory forensics" has grown, resulting in the availability of tools such as WindowsSCOPE.
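As a rough illustration of the imaging-and-verification idea these tools embody, the sketch below (plain Python, not any of the named products) copies a source device to an image file and records a SHA-256 hash so the working copy can later be checked against the acquisition; the device path and output filename are hypothetical placeholders.

```python
# Minimal sketch of forensic imaging with hash verification, assuming a readable
# block device and sufficient disk space. Not a substitute for write-blocking
# hardware or validated imaging tools.
import hashlib


def image_and_hash(source_path: str, image_path: str, chunk_size: int = 1024 * 1024) -> str:
    """Copy source_path to image_path, returning the SHA-256 of the data copied."""
    digest = hashlib.sha256()
    with open(source_path, "rb") as src, open(image_path, "wb") as dst:
        while True:
            chunk = src.read(chunk_size)
            if not chunk:
                break
            digest.update(chunk)
            dst.write(chunk)
    return digest.hexdigest()


def hash_file(path: str, chunk_size: int = 1024 * 1024) -> str:
    """Independently re-hash a file, e.g. to confirm the image still matches."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


if __name__ == "__main__":
    acquisition_hash = image_and_hash("/dev/sdb", "evidence.img")  # hypothetical device
    # Verification step: the image should reproduce the acquisition hash.
    assert hash_file("evidence.img") == acquisition_hash
```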
More recently, the same progression of tool development has occurred for mobile devices; initially investigators accessed data directly on the device, but soon specialist tools such as XRY or Radio Tactics Aceso appeared.
Police forces have begun implementing risk-based triage systems to manage the overwhelming demand for digital forensic services.
Forensic process
A digital forensic investigation commonly consists of three stages:
- acquisition or imaging of exhibits,
- analysis, and
- reporting.
An alternative approach combines digital forensics and ediscovery processes. This approach has been embodied in a commercial tool called ISEEK that was presented together with test results at a conference in 2017.
During the analysis phase an investigator recovers evidence material using a number of different methodologies and tools. In 2002, an article in the International Journal of Digital Evidence referred to this step as "an in-depth systematic search of evidence related to the suspected crime." In 2006, forensics researcher Brian Carrier described an "intuitive procedure" in which obvious evidence is first identified and then "exhaustive searches are conducted to start filling in the holes."
The actual process of analysis can vary between investigations, but common methodologies include conducting keyword searches across the digital media, recovering deleted files, and extracting registry information.
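As a simplified illustration of the keyword-search methodology, the sketch below (plain Python, not a specific forensic tool) scans a raw image in overlapping chunks for a set of byte-string terms and reports the absolute offset of each match; real tools additionally handle text encodings, file-system structure, and deleted data. The image path and search terms are hypothetical.

```python
# Sketch of a chunked keyword search over a raw disk image, assuming the terms
# appear as literal byte strings. Chunks overlap so matches spanning chunk
# boundaries are not missed.
def keyword_search(image_path: str, keywords: list, chunk_size: int = 4 * 1024 * 1024):
    """Yield (keyword, absolute_offset) for every occurrence found in the image."""
    overlap = max(len(k) for k in keywords) - 1
    tail = b""       # last `overlap` bytes of the previous chunk
    offset = 0       # absolute offset of the current chunk within the image
    with open(image_path, "rb") as img:
        while True:
            chunk = img.read(chunk_size)
            if not chunk:
                break
            data = tail + chunk
            base = offset - len(tail)  # absolute offset of data[0]
            for keyword in keywords:
                start = 0
                while (pos := data.find(keyword, start)) != -1:
                    # Matches wholly inside the carried-over tail were already
                    # reported in the previous iteration, so skip them here.
                    if pos + len(keyword) > len(tail):
                        yield keyword, base + pos
                    start = pos + 1
            tail = data[-overlap:] if overlap else b""
            offset += len(chunk)


if __name__ == "__main__":
    for term, pos in keyword_search("evidence.img", [b"invoice", b"password"]):
        print(f"{term.decode()} found at byte offset {pos}")
```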
The evidence recovered is analyzed to reconstruct events or actions and to reach conclusions, work that can often be performed by less specialized staff. When an investigation is complete, the data is presented, usually in the form of a written report, in layperson's terms.