Ah.Ah.Ahh...........
I had actually been speaking to their technical dept. (they did a "special" version; we got it up to the point where the program had access to 270 GB of memory (18 GB of RAM plus swap)).
Took another look at it yesterday and now I think I know what is happening.
The recovery algorithms are possibly sub-optimal (so to speak).
I was under the assumption that the scan just built up a metadata file of the types of blocks found. Instead it actually attempts to mark each block as being part of a particular file, which is not a particularly smart move, because when that data is saved the database can be massive.
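To put a rough number on why that matters (the figures below are my own assumptions, not anything published by R-Studio), here is a quick sketch comparing a summary of block types, which scales with the number of regions found, against tagging every block with the file it belongs to, which scales with the number of blocks on the disk:

```python
# Back-of-envelope comparison; block size, disk size and record size are
# all assumed values, not R-Studio's actual on-disk format.
BLOCK_SIZE = 4096                 # assumed scan granularity
DISK_BYTES = 2 * 1024**4          # e.g. a 2 TB disk
RECORD_BYTES = 24                 # assumed size of one saved record

blocks = DISK_BYTES // BLOCK_SIZE

# "block types only": a handful of contiguous regions per recognised type
regions_estimate = 10_000
type_metadata = regions_estimate * RECORD_BYTES

# "every block tagged with its file": one record per block
per_block_metadata = blocks * RECORD_BYTES

print(f"blocks scanned:       {blocks:,}")
print(f"type-region metadata: {type_metadata / 1024**2:,.2f} MiB")
print(f"per-block metadata:   {per_block_metadata / 1024**3:,.2f} GiB")
```

On those assumed numbers the per-block approach comes out around 12 GiB of metadata for a single 2 TB disk, which is at least consistent with the memory and scn-file sizes I have been seeing.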
Under certain circumstances the algorithms end up treating some image files as directory content. For example:
If I have one disk image stored on a device and that file gets deleted, R-Studio, rather than finding that one file, actually goes off into the image and finds the sub-files (which is understandable, if not wrong).
The result is that with a disk holding a couple of hundred image files, things can get very complicated. This in itself has implications for people who work a lot with disk images of operating systems.
(It also accounts for the massive number of files being found, and hence the size of the .scn file.)
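Here is a minimal sketch of that effect (hypothetical scanner logic and made-up file counts, not R-Studio's actual code): once the scanner decides a recovered file is itself a filesystem image and descends into it, every such image multiplies the number of entries in the results.

```python
# Hypothetical recursive enumeration: if a recovered file looks like a disk
# image, its contents are reported as well, so N deleted images become
# N + N * (files inside each image) entries in the scan results.
from dataclasses import dataclass, field

@dataclass
class Entry:
    name: str
    is_image: bool = False                  # the file is itself a disk image
    children: list["Entry"] = field(default_factory=list)

def enumerate_found(entry: Entry, results: list, path: str = "") -> None:
    full = f"{path}/{entry.name}"
    results.append(full)
    if entry.is_image:
        # Descend into the embedded filesystem and report its files too.
        for child in entry.children:
            enumerate_found(child, results, full)

# One deleted image holding five files is reported as six entries; a couple
# of hundred OS images at ~50,000 files each would be over ten million.
image = Entry("os_backup_001.img", is_image=True,
              children=[Entry(f"file_{i}") for i in range(5)])
found = []
enumerate_found(image, found)
print(len(found), "entries from a single image file")
```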
Interestingly, it seems this has been raised before, but in a general way:
http://forum.r-tt.com/viewtopic.php?f=13&t=74
As of 15-JUN-2010 they say "We're trying to fix the problem."
It has actually been a profitable exercise.