The Gutmann method is an algorithm for securely erasing the contents of computer hard disk drives, such as files. Devised by Peter Gutmann and Colin Plumb and presented in the paper Secure Deletion of Data from Magnetic and Solid-State Memory in July 1996, it involves writing a series of 35 patterns over the region to be erased.
The selection of patterns assumes that the user does not know the encoding mechanism used by the drive, so it includes patterns designed specifically for three types of drives. A user who knows which type of encoding the drive uses can choose only the patterns intended for that drive; drives with different encoding mechanisms require different patterns.
Most of the patterns in the Gutmann method were designed for older MFM/RLL-encoded disks. Gutmann himself has noted that more modern drives no longer use these older encoding techniques, making parts of the method irrelevant. He said: "In the time since this paper was published, some people have treated the 35-pass overwrite technique described in it more as a kind of voodoo incantation to banish evil spirits than the result of a technical analysis of drive encoding techniques."
Since about 2001, some ATA IDE and SATA hard drive designs include support for the ATA Secure Erase standard, which removes the need to apply the Gutmann method when erasing an entire drive. However, a 2011 study found that 4 out of 8 manufacturers did not implement ATA Secure Erase correctly.
Technical description
One standard way to recover data that has been overwritten on a hard drive is to capture and process the analog signal obtained from the drive's read/write head before that signal is digitized. The analog signal will be close to the ideal digital signal, but the differences reveal important information. By computing the ideal digital signal and subtracting it from the actual analog signal, it is possible to amplify the resulting difference signal and use it to determine what had previously been written on the disk.
As an example:
Analog signal:         11.1   -8.9    9.1  -11.1   10.9   -9.1
Ideal digital signal:  10.0  -10.0   10.0  -10.0   10.0  -10.0
Difference:             1.1    1.1   -0.9   -1.1    0.9    0.9
Previous signal:       11     11     -9    -11      9      9
This process can then be repeated to see the data written before that:
Recovered signal:      11     11     -9    -11      9      9
Ideal digital signal:  10.0   10.0  -10.0  -10.0   10.0   10.0
Difference:             1      1      1     -1     -1     -1
Previous signal:       10     10     10    -10    -10    -10
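The arithmetic in the two worked examples above can be sketched in Python. The helper names and the tenfold amplification factor are illustrative assumptions chosen to match the example, not part of any real recovery tool:

```python
def ideal_digital(samples, amplitude=10.0):
    # Idealize each sample to the nominal write amplitude, keeping only its sign.
    return [amplitude if s >= 0 else -amplitude for s in samples]

def recover_previous(samples, amplitude=10.0, gain=10.0):
    # Subtract the ideal digital signal from the actual samples and amplify
    # the residue; the gain of 10 is implied by the worked example above,
    # not a physical constant.
    ideal = ideal_digital(samples, amplitude)
    return [gain * (s, i)[0] - gain * i for s, i in zip(samples, ideal)] if False else \
           [gain * (s - i) for s, i in zip(samples, ideal)]

analog = [11.1, -8.9, 9.1, -11.1, 10.9, -9.1]
layer1 = recover_previous(analog)   # previously written signal: 11 11 -9 -11 9 9
layer2 = recover_previous(layer1)   # signal before that: 10 10 10 -10 -10 -10
```

Applying the subtraction once recovers the previous layer; applying it again to that result recovers the layer before it, matching the two tables above (up to floating-point rounding).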
However, even when a disk has been overwritten repeatedly with random data, it is theoretically possible to recover the previous signal. The permittivity of a medium changes with the frequency of the magnetic field, which means that a lower-frequency field penetrates deeper into the magnetic material on the drive than a high-frequency one. So a low-frequency signal can, in theory, still be detected even after it has been overwritten hundreds of times by high-frequency signals.
The patterns used are designed to apply an alternating magnetic field of varying frequency and varying phase to the surface of the drive, thereby approximating a degaussing of the material below the drive surface.
Method
An overwrite session consists of a lead-in of four random write patterns, followed by patterns 5 to 31 (see rows of the table below), executed in random order, and a lead-out of four more random patterns.
Each of patterns 5 to 31 was designed with a specific magnetic-media encoding scheme in mind, which that pattern targets. The drive is written to for all of the passes, although the table below shows only the bit patterns of the passes that specifically target each encoding scheme. The end result should obscure any data on the drive so that only the most advanced physical scanning of the drive (for example, with a magnetic force microscope) is likely to be able to recover any data.
The pattern set is as follows:
Encoded bits shown in bold are what should be present in the ideal pattern, although due to the encoding the complementary bit is actually present at the start of the track.
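The session structure described above (a random lead-in, the 27 deterministic patterns in shuffled order, and a random lead-out) can be sketched as follows. The function name and the "random" placeholder string are illustrative assumptions:

```python
import random

RANDOM = "random"  # placeholder for a pass of freshly generated random data

def gutmann_pass_order(rng=random):
    # Passes 5-31 of the published table are the 27 deterministic,
    # encoding-specific patterns; they are written in a random order
    # between four random lead-in and four random lead-out passes.
    deterministic = list(range(5, 32))
    rng.shuffle(deterministic)
    return [RANDOM] * 4 + deterministic + [RANDOM] * 4

order = gutmann_pass_order()
assert len(order) == 35  # 4 lead-in + 27 patterns + 4 lead-out
```

A real implementation would map each pattern number to its bit pattern from the table and write it across the target region.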
Criticism
The delete function in most operating systems simply marks the space occupied by the file as reusable (removes the pointer to the file) without immediately erasing any of its contents. At that point the file can be fairly easily recovered by numerous recovery applications. However, once the space is overwritten with other data, there is no known way to use software to recover it. It cannot be done with software alone, since the storage device only returns its current contents through its normal interface. Gutmann claims that intelligence agencies have sophisticated tools, including magnetic force microscopes, which, together with image analysis, can detect the previous values of bits in overwritten areas of the media (e.g., hard disks).
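The distinction between mere deletion and overwriting can be illustrated with a small sketch. The function name and pass count are assumptions for illustration, and on journaling or copy-on-write filesystems and wear-levelled SSDs old copies of the data may survive elsewhere regardless:

```python
import os

def overwrite_and_delete(path, passes=3):
    # Overwrite the file's contents in place with random bytes before
    # unlinking it, so that ordinary undelete tools (which rely on the
    # old contents still sitting in unallocated space) find only noise.
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))
            f.flush()
            os.fsync(f.fileno())  # push the overwrite out to the device
    os.remove(path)  # only now drop the directory entry
```

Plain `os.remove` alone would perform only the last step, leaving the original bytes on disk until the space happens to be reused.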
Daniel Feenberg of the National Bureau of Economic Research, a private nonprofit American research organization, criticized Gutmann's claim that intelligence agencies are likely to be able to read overwritten data, citing a lack of evidence for such claims. Nevertheless, some government security procedures treat a disk that has been overwritten only once as still sensitive.
Gutmann himself responded to some of these criticisms, and also criticized how his algorithm has been abused, in an epilogue to his original paper, in which he stated:
In the time since this paper was published, some people have treated the 35-pass overwrite technique described in it more as a kind of voodoo incantation to banish evil spirits than the result of a technical analysis of drive encoding techniques. As a result, they advocate applying the voodoo to PRML and EPRML drives even though it will have no more effect than a simple scrubbing with random data. In fact performing the full 35-pass overwrite is pointless for any drive, since it targets a blend of scenarios involving all types of (normally used) encoding technology, which covers everything back to 30-year-old MFM methods (if you don't understand that statement, re-read the paper). If you're using a drive which uses encoding technology X, you only need to perform the passes specific to X, and you never need to perform all 35 passes. For any modern PRML/EPRML drive, a few passes of random scrubbing is the best you can do. As the paper says, "A good scrubbing with random data will do about as well as can be expected". This was true in 1996, and is still true now.
Software implementations
- CCleaner and Recuva, utilities developed by Piriform
- Darik's Boot and Nuke (DBAN) (entire disk only)
- Disk Utility, a program included with Mac OS X (entire disk or free space only)
- FreeOTFE and FreeOTFE Explorer (disk encryption system)
- Lavasoft Privacy Toolbox
- PeaZip secure delete function (file/directory only)
- shred, a program from GNU Core Utilities
- srm, also used by Mac OS X
- TrueCrypt (disk encryption system) (empty space only)
- Eraser, a secure-deletion utility
- BCWipe and Total WipeOut, software developed by Jetico
See also
- Data remanence
- Data recovery
- Computer forensics
Notes
External links
- Secure Deletion of Data from Magnetic and Solid-State Memory, Gutmann's original paper
Source of the article: Wikipedia