HDD GURU FORUMS
http://forum.hddguru.com/

Mapping blocks to files?
http://forum.hddguru.com/viewtopic.php?f=7&t=38636
Page 1 of 1

Author:  rubenatch [ July 13th, 2019, 13:50 ]
Post subject:  Mapping blocks to files?

Probably this is an already answered question, but...

My laptop HDD has developed some bad blocks. It has the standard stuff (a hidden NTFS partition and a system NTFS partition), and it has too many 7z and zip files to take care of. The computer still boots, but it obviously has some corrupted DLLs.

So, I'm looking for utilities (it doesn't matter if they run under DOS, Windows or Linux, or whether they have a GUI or CLI) that can perform these tasks:

- A tool that can map a physical block to a file, so I can know which files have been corrupted. It should support disks with more than one partition, and FAT/NTFS/ext filesystems. I know there is a tool that can take a ddrescue log file and find which files may be corrupted, but it needs a previously created image, and I don't have enough disk space for that.
- A tool that can create a file that occupies a physical block, so I can be sure that important data won't be written there (until I get a new disk).
- A tool that can check every compressed file in a directory, looking for corrupted data (e.g. check all zip files under a folder and all its subfolders). This is not a hard disk tool, but it would be handy.

Thanks in advance.

Author:  fzabkar [ July 13th, 2019, 17:20 ]
Post subject:  Re: Mapping blocks to files?

rubenatch wrote:
So, I'm looking for utilities (it doesn't matter if they run under DOS, Windows or Linux, or whether they have a GUI or CLI) that can perform these tasks:

- A tool that can check every compressed file in a directory, looking for corrupted data (e.g. check all zip files under a folder and all its subfolders).

Here is a command-line example that uses 7-Zip to recurse through the subdirectories of the Recovery folder and perform an integrity check of all ZIP files. The results are written to zip_tests.txt.

    for /r C:\Recovery_Root_Dir %f in (*.zip) do "c:\program files\7-zip\7z" t %f >> C:\zip_tests.txt

Author:  maximus [ July 13th, 2019, 18:36 ]
Post subject:  Re: Mapping blocks to files?

Quote:
My laptop HDD has developed some bad blocks. It has the standard stuff (a hidden NTFS partition and a system NTFS partition), and it has too many 7z and zip files to take care of. The computer still boots, but it obviously has some corrupted DLLs.

First I am going to state the obvious. Back up all your important data now, before the drive completely fails.
Quote:
- A tool that can map a physical block to a file, so I can know which files have been corrupted. It should support disks with more than one partition, and FAT/NTFS/ext filesystems. I know there is a tool that can take a ddrescue log file and find which files may be corrupted, but it needs a previously created image, and I don't have enough disk space for that.

You do not need an image or clone (although it is highly recommended to image or clone your drive before it fails). You can run ddrescue with a destination of /dev/null; that will create a log file without creating an actual data backup. Then you can use the ddrutility programs with the ddrescue log, choosing the failing disk as the input.
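A minimal sketch of that workflow, assuming the failing disk is /dev/sdb (the device name and file names are placeholders; adjust them to your system):

```shell
# Pass 1: read the failing disk, discarding the data but keeping the map file.
# /dev/null swallows the reads, so only the log of good/bad areas is recorded.
# -f is required because the destination is a device node.
sudo ddrescue -f /dev/sdb /dev/null rescue.map

# Then point ddru_ntfsfindbad at the raw device plus that map file;
# it lists the NTFS files that overlap the unreadable sectors.
sudo ddru_ntfsfindbad /dev/sdb rescue.map
```

This only scans; nothing is written to the source disk. ddru_ntfsfindbad writes its results to a report file in the current directory.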
Quote:
- A tool that can create a file that occupies a physical block, so I can be sure that important data won't be written there (until I get a new disk).

I don’t think such a tool exists. The closest thing would be to run chkdsk with the /r option (chkdsk /r). This will try to recover data in the bad blocks (clusters), and when it fails it will add the bad cluster to $BadClus, so the OS will not use that cluster. Just make sure you have backed up your data before running chkdsk, because it could destroy files, corrupt the file system, and possibly kill the disk.

Author:  maximus [ July 13th, 2019, 20:23 ]
Post subject:  Re: Mapping blocks to files?

I would like to add that you should get a replacement disk as soon as possible, and use the computer as little as possible (not at all would be best), including not performing any of the above-mentioned actions. The more you mess with the disk, the sooner it will fail further and leave you with no data.

Author:  rubenatch [ July 14th, 2019, 2:44 ]
Post subject:  Re: Mapping blocks to files?

fzabkar wrote:
    for /r C:\Recovery_Root_Dir %f in (*.zip) do "c:\program files\7-zip\7z" t %f >> C:\zip_tests.txt


This will test the current directory, not subfolders. I guess a script using find on Linux would work, but...
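A find-based sketch of that idea, assuming p7zip's 7z binary is on the PATH and the archives live under /mnt/data (both are placeholders):

```shell
# Recurse through /mnt/data, testing the integrity of every zip and 7z archive.
# -print0 with read -d '' keeps filenames containing spaces intact.
find /mnt/data -type f \( -iname '*.zip' -o -iname '*.7z' \) -print0 |
while IFS= read -r -d '' f; do
    if ! 7z t "$f" > /dev/null; then
        echo "CORRUPT: $f"
    fi
done > zip_tests.txt
```

Afterwards zip_tests.txt holds one line per archive that failed its integrity check.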

maximus wrote:
First I am going to state the obvious. Back up all your important data now, before the drive completely fails.


All the data is backed up, but I'm wondering if there is some data corrupted. That's why I want to check those compressed files.

maximus wrote:
You do not need an image or clone (although it is highly recommended to image or clone your drive before it fails). You can run ddrescue with a destination of /dev/null; that will create a log file without creating an actual data backup. Then you can use the ddrutility programs with the ddrescue log, choosing the failing disk as the input.


You're right. I was thinking that ddru_ntfsfindbad needed an image file, but it can be used with device files. The problem is that it doesn't support FAT or ext filesystems...

maximus wrote:
I would like to add that you should get a replacement disk as soon as possible, and use the computer as little as possible (not at all would be best), including not performing any of the above-mentioned actions. The more you mess with the disk, the sooner it will fail further and leave you with no data.


I know, but I'm still waiting for the new disk to come.

Thanks to all you guys.

Author:  maximus [ July 14th, 2019, 21:33 ]
Post subject:  Re: Mapping blocks to files?

If you really want to know which files are corrupt, wait until you get the replacement disk, use ddrescue (or hddsuperclone) to clone the disk, then use the fill function to fill the unrecovered sectors with a pattern. Then use grep on Linux to recursively search the files for the pattern. That is the old-school ddrescue way of finding the corrupt files: slow but effective.
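A rough outline of that procedure, assuming the clone is clone.img with map file rescue.map, and a partition starting at sector 2048 (all names and the offset are placeholders):

```shell
# 1. Make a small file holding the marker pattern.
printf 'BAD-SECTOR ' > marker.bin

# 2. Fill the areas the map recorded as bad ('-') in the clone
#    with copies of that pattern; the source disk is not touched.
ddrescue --fill-mode=- marker.bin clone.img rescue.map

# 3. Mount the cloned filesystem read-only and search for the marker;
#    every file listed overlaps at least one unrecovered sector.
sudo mount -o loop,ro,offset=$((2048*512)) clone.img /mnt/clone
grep -rl 'BAD-SECTOR' /mnt/clone
```

The offset arithmetic assumes 512-byte sectors; multiply the partition's start sector by the sector size to get the byte offset for mount.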

Author:  fzabkar [ July 15th, 2019, 0:33 ]
Post subject:  Re: Mapping blocks to files?

rubenatch wrote:
fzabkar wrote:
    for /r C:\Recovery_Root_Dir %f in (*.zip) do "c:\program files\7-zip\7z" t %f >> C:\zip_tests.txt


This will test the current directory, not subfolders.

https://www.computerhope.com/forhlp.htm

Quote:
FOR /R [[drive:]path] %variable IN (set) DO command [command-parameters]

Walks the directory tree rooted at [drive:]path, executing the FOR statement in each directory of the tree. If no directory specification is specified after /R, then the current directory is assumed. If set is only a single period (.) character, then it will enumerate the directory tree.

Author:  abolibibelot [ July 17th, 2019, 19:43 ]
Post subject:  Re: Mapping blocks to files?

Other suggestions (for NTFS partitions) :
nfi.exe
fsutil
Defraggler (the only tool I know of that can provide a list of all files included within a given interval of “blocks” of data; unfortunately it is currently not accurate with regard to the size or the exact location of the analyzed “blocks”) (other free defragmenting software may have a more accurate grid but lacks this particular feature)
Other tools that can provide that kind of information, but are not recommended with a failing drive as they attempt to access the requested sectors (which can further spread the surface damage):
WinHex
HD Sentinel
R-Studio (with the “show files in HexEditor” feature). It works, although not flawlessly: many sectors that are allocated are shown with no allocation information, and I couldn't figure out why yet. It's also quite convoluted and confusing, and there's an inconsistency: the contents of a file are shown by requesting its logical sector number, but the allocation information is shown at the absolute sector number, so you have to add the partition offset to the expected values. For instance, if a JPG file starts at sector 12345 of a partition which starts at sector 2048 relative to the beginning of the device, a JPG header will be visible at sector 12345 of R-Studio's hex viewer, but the name of the file will appear when requesting sector 12345 + 2048 = 14393.

https://superuser.com/questions/1266135 ... bad-sector
https://superuser.com/questions/1267334 ... rs-in-ntfs
http://www.disktuna.com/finding-out-whi ... ad-sector/
(The comments posted under the name “Gabriel” are my own.)

Indeed, you shouldn't be running that drive at all if its condition is bad enough to cause such symptoms.
How many reallocated and/or pending sectors are there according to its SMART status?
