Our Editorial Values and Testing Process

At Data Recovery Mentor, our goal is to provide the most accurate and trustworthy data recovery content on the web, ranging from in-depth software reviews to detailed guides to comprehensive listicles. To learn more about who we are and what we do, head over to the about us page.

Our Editorial Values

As data recovery professionals, we have a wealth of first-hand experience with the worst data loss scenarios possible, and it’s our mission to share this experience with you, our readers, to help you avoid losing important data.

Whether you’re a casual computer user or an IT professional responsible for other people’s data, we hope that our content will provide the answers to all your questions.

All articles on this website reflect our core content values:

  • 🔬 Information accuracy: There’s no shortage of advice on the web on how to recover lost data. The problem is that most articles are, to a smaller or larger extent, inaccurate. In the worst cases, they contain serious mistakes that cause more harm than good. To provide our readers with the most accurate data recovery content, we always personally test all solutions and techniques we describe, and we update published articles on a regular basis to keep them relevant.
  • 🧪 Lack of bias: When describing data recovery methods, reviewing available software applications, and creating all kinds of other content, we always follow our repeatable testing methodologies instead of relying on subjective measures of evaluation. That way, we prevent personal bias from seeping into our content.
  • 🛡️ Delivering value to readers: We do what we do because we’re on a mission to help our readers recover important files. That’s why we would never promote a solution that doesn’t deliver value, regardless of how profitable such a promotion would be for us.

In other words, Data Recovery Mentor strives to be the best source of data recovery information on the web, a destination that you can always rely on for advice when the situation seems utterly desperate. Visit our blog to see what makes our content special.

How We Test Data Recovery Software

Data recovery software testing is a huge part of what we do here at Data Recovery Mentor, and our comprehensive testing guidelines reflect this.

Real-Life Recovery Challenges

The true effectiveness of data recovery software can be revealed only when it’s used to address real-life recovery challenges, such as:

  • 🖼️ Raw photo recovery
  • 📹 Video formats recovery
  • 📄 Document format recovery

When supported by the tested software, we always test how well real-life recovery challenges can be addressed using the following two data recovery modes:

  1. Signature Scan. This method reconstructs files by looking for known patterns in the raw data stored on a storage device. Each data recovery application has its own database of known patterns, so two different applications can achieve vastly different recovery results on an identical set of data.
    ⚠️ Note: Signature scan mode can only reconstruct data that was not fragmented.
  2. Quick & Clever Deep Scan. Quick and Clever Deep scan methods analyze the file system instead of raw data stored on a storage device. This allows the two methods to recover files with intact file names to their original locations and do so even if the files are fragmented.
    ⚠️ Note: Quick and Clever Deep scan methods do not support signature recovery; they only restore access to files through the file system.
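
To make the distinction concrete, the core idea behind a signature scan can be sketched in a few lines of Python. This is a simplified illustration, not any vendor’s actual implementation; the three magic-byte signatures below are a tiny hypothetical sample of a real signature database:

```python
# Simplified signature scan: walk a raw disk image sector by sector and
# report every sector that begins with a known file header ("signature").
# The signature table is a tiny illustrative sample, not a real database.

SIGNATURES = {
    b"\xFF\xD8\xFF": "jpg",          # JPEG
    b"\x89PNG\r\n\x1a\n": "png",     # PNG
    b"II*\x00": "tif/cr2",           # TIFF-based (Canon CR2 uses this header)
}

def signature_scan(image_path, sector_size=512):
    """Return a list of (byte offset, file type) for sectors that start
    with a known signature."""
    hits = []
    with open(image_path, "rb") as f:
        offset = 0
        while True:
            sector = f.read(sector_size)
            if not sector:
                break
            for magic, ftype in SIGNATURES.items():
                if sector.startswith(magic):
                    hits.append((offset, ftype))
            offset += sector_size
    return hits
```

Note how the sketch never consults a file system: it sees only raw sectors, which is exactly why a signature scan cannot recover file names or reassemble fragmented files.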

Signature Scan Testing Steps

Here are detailed steps that show how we test signature scan:

  1. We search the web for different data sets. For example, for raw photo files, we try to find raw photos taken with as many different digital cameras as possible. This helps us evaluate whether the tested software supports both newer and older versions of raw photo formats (Canon, for instance, constantly updates its CR2 and CR3 formats).
  2. When we’re happy with the amount of collected data, we check each downloaded file. We verify that the file has the right signature by viewing its hex representation (raw binary data) and comparing it with the vendor’s specification for the file format. Then, we check whether the file can be opened without any issues.
  3. After creating a valid data set, we create a virtual hard drive for each file format:
    • Step 3.1. We open the Windows Disk Management tool;
    • Step 3.2. In the main menu, we choose Action → Create VHD. In the “Create and attach Virtual Hard disk” window, we input the size of the data set created in step 2 plus 5 GB to reserve space for the file table and avoid fragmentation. We also change “Virtual hard disk format” to “VHD” and “Virtual hard disk type” to “Fixed size”;
  4. Then, we initialize the disk using the MBR (master boot record) partition style.
  5. After that, we format the partition to the NTFS, exFAT, or FAT32 file system. Generally, we choose FAT32 for smaller data sets (< 4 GB) and NTFS for larger data sets.
  6. We copy the prepared data to VHD, typically using Total Commander.
  7. We check the copied data for fragmentation using the O&O Defrag software. This app can show a real map of file fragments and tell us which files are located in each sector of the disk. If there is no fragmentation, we proceed to the next step. If we see red files in the “File status” section of O&O Defrag, we do the following:
    • Step 7.1. We remember the name of the fragmented file;
    • Step 7.2. Copy the fragmented file to a temporary place;
    • Step 7.3. Launch WinHex app to securely wipe (shred) the fragmented file;
    • Step 7.4. Copy the file from step 7.2 (from the temporary folder) back to the VHD;
    • Step 7.5. Check again if fragmentation is gone;
    • Step 7.6. Repeat for each fragmented file;
  8. We erase the VHD’s file system. This is needed to accurately evaluate Signature Scan capabilities: with the file system intact, data recovery applications could cheat and use different scan modes instead. To clear file system nodes from the drive, we use R-Studio, O&O Defrag, Windows Disk Management, and WinHex. R-Studio can show us where the existing file system is located. With WinHex, we can zero all file system clusters. O&O Defrag can show us which file comes first in the file table and on the VHD.
    • Step 8.1. By analyzing the VHD drive with O&O Defrag, we find which file is the first to start on the drive. Pressing the “Analyze” button gives us a map of files and hidden file system attributes;
    • Step 8.2. By clicking on the map (visually presented as squares), we can get additional info on what data is located in each block. We look for the first file record and note the associated file name. We then locate this file on the VHD disk;
    • Step 8.3. Then, we launch WinHex and drag & drop the file from step 8.2;
    • Step 8.4. In WinHex, we press F9; in the Physical Storage Devices section, we select the VHD device and click OK;
    • Step 8.5. In the tab where the file is opened, we select the first 512 bytes in offset and copy them as hex values (Ctrl + Shift + C or Main Menu → Edit → Copy Block → Hex Values);
    • Step 8.6. In the tab where the VHD is opened, we search for the hex value from step 8.5 (Ctrl + Alt + X or Main Menu → Search → Find Hex Values). We insert the data copied in step 8.5 and press OK;
    • Step 8.7. When the data is found, we must erase the beginning of the VHD drive, where the MFT, boot sector, MFT mirror, bitmap, and log data are typically stored. This is needed to make the drive unmountable. To do this, we select the amount of data to be zeroed in WinHex with Main Menu → Edit → Define Block: at the beginning, we type 0; at the end, we choose the option “Current position”. Then, we choose Main Menu → Edit → Fill Block, type 0 in the “Fill with hex values” field, press OK, and save changes to the VHD device.
    • Step 8.8. We close the two open tabs in WinHex;
    • Step 8.9. We go to Windows Disk Management and detach the VHD file by right-clicking on it and choosing “Detach VHD”;
    • Step 8.10. We mount the VHD file again to make Windows understand that now there is no file system on it. We double-click the VHD file and ignore all messages that may appear. We repeat step 8.4 (open VHD file in WinHex);
    • Step 8.11. We launch R-Studio and begin to examine the VHD by pressing the “Scan” button. R-Studio can show us all file system records that are left on the VHD file. We are interested in the following file system records: NTFS boot sectors, NTFS restore points, FAT table entries, NTFS MFT extents, NTFS log file, FAT directory entries, NTFS directory entries, and FAT boot sector;
    • Step 8.12. When the scan is finished and we are presented with a map of found data, we search for the start offset (in sectors) of each file system record from step 8.11 by double-clicking the square responsible for file system records;
    • Step 8.13. In WinHex, we go to Main Menu → Navigation → Go to sector. In LBA, we input the value from step 8.12 and press OK. This will bring us to the file system records;
    • Step 8.14. Then, we scroll through WinHex and analyze where the file system record ends. When we find the end of the record, we erase it with zeroes. For this, we select Main Menu → Edit → Define Block: at the beginning, we type the value from step 8.12; at the end, we choose the option “Current position”. We then fill the block with zeroes as in step 8.7 and save changes to the VHD file;
    • Step 8.15. We repeat steps 8.12–8.14 until all file system records are zeroed;
    • Step 8.16. When all file system records are zeroed, we rescan the VHD file in R-Studio to make sure that there isn’t a single record left. If R-Studio does not find any records from step 8.11, our job is done: we have deleted the file system from the drive. If file system records are left, we repeat steps 8.12–8.15;
    • Step 8.17. As a precautionary measure, we detach the VHD file and make it read-only. This ensures that nothing can modify the file;
  9. When the VHD files are prepared, we mount them:
    • Step 9.1. On Windows, by double-clicking and ignoring any messages that may appear.
    • Step 9.2. On Mac, by changing the file’s extension to dmg and dragging and dropping it onto the Disk Drill → Storage Devices page.
  10. Then we begin to scan the attached VHD/DMG file and restore raw data.
  11. After that, we manually analyze the recovered data for quality and quantity. For raw photo files, we use Adobe Lightroom or Photoshop to determine if their original quality has been preserved. For videos and professional video data, we use vendor software to determine if the files are still playable, with their original codec and bitrate preserved. Then we begin another detailed analysis:
    • We compare the recovered file size with the original size
    • We analyze how many files were not found or were duplicates
    • We analyze how many files can’t be opened after recovery
    • We analyze how many files changed their format
    • We analyze how many files lost their original quality (applicable to videos and photos)
    • We analyze if the application can display filenames based on recovered metadata
  12. Based on results from step 11, we give points to software for each recovered file type (signature).
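
The WinHex zero-filling at the heart of step 8 boils down to overwriting chosen sector ranges of the image with zeroes. As a rough sketch, assuming a fixed-size VHD (which stores the raw disk data first and a 512-byte footer last, so virtual sector N sits at byte offset N × 512):

```python
# Sketch of the "Define Block + Fill Block with zeroes" operation from
# step 8, applied programmatically to a fixed-size VHD image. In a
# fixed-size VHD, sector N of the virtual disk is at byte offset N * 512.

SECTOR_SIZE = 512

def zero_sectors(image_path, start_sector, sector_count):
    """Overwrite sector_count sectors, starting at start_sector, with zeroes."""
    with open(image_path, "r+b") as f:
        f.seek(start_sector * SECTOR_SIZE)
        f.write(b"\x00" * (sector_count * SECTOR_SIZE))
```

For example, zeroing sectors 0 through 63 wipes the region where the boot sector and the start of the file table live, which is what makes the drive unmountable.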

Quick & Clever Deep Scan Testing Steps

Here are detailed steps that show how we test Quick & Clever Deep scan:

  1. We prepare data sets with different types of data:
    • Folder “!Signatures” – contains modern file formats such as photos, audio, video, archives, and documents. Each format is located in a separate folder.
    • Folder “Long_Path1” – contains nested child folders with a total path length longer than 260 characters. The deepest folder contains a txt file.
    • Folder “DOCX” – contains 5,000 docx documents in one folder, each with a file size smaller than 12 KB.
    • Folder “Long_Path2” – contains nested child folders with a total path length longer than 260 characters. The deepest folder stores some user data – png, flac, etc.
    • Folder “PNG” – contains 99,999 png files in one folder, each with a file size smaller than 1 KB.
    • In the root folder, checksum files are stored.
  2. All data is copied to a real SD card or VHD.
  3. To test Quick Scan, we permanently delete files.
  4. To test Clever Deep Scan, we format the drive to the FAT32 file system or make it RAW by erasing the first 512 bytes of the drive.
  5. We then put the drive into read-only mode to prevent files from being overwritten. An SD card can be put into read-only mode with Disk Drill or via its physical write-protection tab; the VHD file is given the read-only attribute.
  6. Then we scan the whole drive and begin analyzing the results (we do not analyze raw data because that’s the job of Signature Scan!). Here’s what we look for:
    • Whether the folder structure and file names are corrupted or restored to their original state
    • Whether files in the root folder were found
    • Whether the program found the large numbers of files in the DOCX and PNG folders
    • Whether the program correctly supports long paths
    • Whether the program found the user data in the signature folder
  7. When we see a folder structure, we analyze each found file by comparing its SHA checksum with that of the original file. Based on the data received from step 6, we give points to the software.
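
The checksum comparison from step 7 can be sketched as follows; the directory layout and function names are illustrative, not part of our tooling:

```python
# Sketch of the step-7 checksum comparison: hash every file in the original
# data set, look up the same relative path in the recovered tree, and sort
# files into intact / corrupted / missing buckets.

import hashlib
from pathlib import Path

def sha256_of(path):
    """Compute the SHA-256 hex digest of a file, reading in 1 MB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def compare_trees(original_dir, recovered_dir):
    """Return (intact, corrupted, missing) lists of relative file paths."""
    intact, corrupted, missing = [], [], []
    for orig in Path(original_dir).rglob("*"):
        if not orig.is_file():
            continue
        rel = orig.relative_to(original_dir)
        rec = Path(recovered_dir) / rel
        if not rec.is_file():
            missing.append(rel)
        elif sha256_of(orig) == sha256_of(rec):
            intact.append(rel)
        else:
            corrupted.append(rel)
    return intact, corrupted, missing
```

Comparing by relative path also exercises the long-path folders: a recovery tool that truncates or flattens deep paths will show up here as “missing” files.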

General User Experience

As important as data recovery performance is, it’s not the only criterion that determines the overall quality of data recovery software. That’s why we also focus on the following:

  1. 🎉 Karma: This unique evaluation criterion encompasses everything from the update frequency to operating system support to customer support. It determines how reputable the developer of the tested data recovery software application is. 
  2. 💰 Bang for the buck: Data recovery software that reliably recovers all kinds of lost files can hardly be considered the best solution if its price is out of reach for most people. To reflect this, we evaluate how much bang for the buck the tested software delivers.
  3. 🔎 Usability: Contrary to popular belief, no expert skills or advanced technical knowledge are required to recover lost data—at least not if you choose an easy-to-use data recovery tool. Our usability score indicates how intuitive tested software is, putting confusing tools at a disadvantage.
  4. 🎁 Extras: Many developers of data recovery software believe that data loss is a complex issue that’s best addressed before it happens. That’s why such developers bundle their applications with extra data backup and management tools. While not essential, such tools are nice to have, and our review scores reflect this.