First you need to clone the bad drive to a new one; only then can you worry about recovery and folder structure, unless of course you have very expensive hardware-based tools like PC-3000 DE, MRT DE, HRT-DRE, RapidSpar, etc., which allow you to check the master file table and clone/image only the portion of the drive containing the files.
If the filesystem is NTFS, ddru_ntfsbitmap, which is part of ddrutility, can do just that, i.e. skip the unallocated sectors (by checking the $Bitmap instead of the $MFT). It's useful if the drive is far from full, it's a quick process, and it normally won't harm the drive. It doesn't allow you to selectively extract the data corresponding to particular files/folders, though, which the aforementioned professional tools might be able to do.
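The idea behind skipping unallocated sectors can be sketched quickly. This is a simplified illustration (not ddru_ntfsbitmap's actual code): each bit in the NTFS $Bitmap marks one cluster as allocated or free, and from those bits you can build the list of byte ranges worth reading, which is essentially what a ddrescue domain mapfile expresses.

```python
def allocated_ranges(bitmap: bytes, cluster_size: int = 4096):
    """Yield (start_byte, length_bytes) ranges of allocated clusters.

    Simplified sketch of the principle behind ddru_ntfsbitmap: in the
    NTFS $Bitmap, bit N (LSB-first within each byte) is 1 if cluster N
    is allocated. Only the allocated ranges need to be cloned.
    """
    start = None
    n_clusters = len(bitmap) * 8
    for cluster in range(n_clusters):
        allocated = bitmap[cluster // 8] >> (cluster % 8) & 1
        if allocated and start is None:
            start = cluster                     # range begins
        elif not allocated and start is not None:
            yield (start * cluster_size, (cluster - start) * cluster_size)
            start = None                        # range ends
    if start is not None:                       # bitmap ends mid-range
        yield (start * cluster_size, (n_clusters - start) * cluster_size)

# Example: 0b00001111 -> clusters 0-3 allocated, 4-7 free
print(list(allocated_ranges(bytes([0b00001111]))))  # -> [(0, 16384)]
```

On a half-empty drive this typically cuts the amount of data to read (and the stress on the failing heads) roughly in half.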
It also provides the option of recovering the whole MFT first, which is crucial to “keep folders in order”, and it's wise to get it right away, as the state of the drive might worsen during the cloning process (especially considering that there are already many reallocated sectors here: 7B8 in hexadecimal is 1976 in decimal), to the point where it may no longer be possible to extract it fully, even if it contained no bad sectors initially. That's what happened to me with a 3TB Seagate HDD: I could save everything except 6 files containing bad sectors, but when I tried to get the most out of those six files the drive quickly became very unstable, and I didn't have the whole MFT saved; some of it was at the very end of the drive/partition, and I hadn't thought of using that tool to get it in full (I know I acted like a fool even though I knew the rule, which was really uncool...). But if there are too many damaged areas in the MFT, it's better not to insist on getting the most out of it, and to try to get the rest instead, which means that many files will have to be recovered in “raw” mode, i.e. without the metadata and folder structure, but that's better than nothing...
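The hex-to-decimal conversion of that SMART raw value is easy to verify, e.g. with a one-liner:

```python
# SMART raw values are often displayed in hexadecimal;
# 0x7B8 reallocated sectors converts to decimal like so:
reallocated = int("7B8", 16)
print(reallocated)  # -> 1976
```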
That G-Sense error rate means that the drive suffered some sort of shock, caused by dropping it or hitting it, etc.
I recently examined a Toshiba HDD from a laptop computer, which also had a quite high number in the G-Sense field (not that high, but still high: 441, even though it had been running for only 232 hours). As someone replied in a thread about that drive, the value is vendor-specific, so it could be that Toshiba HDDs happen to be very sensitive. Does anyone else have G-Sense figures for Toshiba drives, and are they indeed higher than average?
(from “maximus”)
Looking up how the drive may produce the G-Sense Error Rate provided a possible answer that could be very simple: it could be a count of how many times the drive experienced a level of g-force that caused it to abort a write. This is a safety feature, and it could take only a very small bump to trigger it. Values are vendor-specific (and likely also drive-specific), so unless you can find the exact meaning of the value for that drive, I wouldn't read too much into it.