
R-Studio 5.2

Posted: June 24th, 2010, 23:20
by code_slave
Hi,
I'm seeing really poor performance when loading ".scn" files in R-Studio 5.2.
The scanned drive is 680 GB; the resulting scan file is 7.3 GB of data.

2.8 GHz dual processor, 3 GB RAM, SATA drives.
Re-load time of the metadata .scn file = 1 hr 20 minutes.

Is this about right?

Re: R-Studio 5.2

Posted: June 25th, 2010, 9:26
by drc
You're saying the scan file itself is 7GB?

Re: R-Studio 5.2

Posted: June 25th, 2010, 19:27
by code_slave
Yep...
the drive is actually 1 TB; the partition analyzed is 686 GB.

The file is 7.35 GB.
It was saved using "save scan information" from an HFS+ volume.

It is supposedly just a metadata "map" file of all the blocks on the partition, so it works out at about 1% of the actual volume size.

On the Windows version it takes 1:18 to load; on the Mac version, 17 minutes.

Someone had asked me to do an analysis/evaluation of R-Studio, with a view to them using it in their recovery situations.

Re: R-Studio 5.2

Posted: June 25th, 2010, 21:34
by drc
In my experience, if your scan takes more than a minute or two to load, there is some severe corruption/damage to the filesystem.

Typical .scn files should be much smaller than that, I think.

Re: R-Studio 5.2

Posted: June 26th, 2010, 3:44
by code_slave
Hi,
I figured the best way to "test" recovery software was to freshly format a drive, dump about 100 files on it,
and then erase them.

Once that was tested, I would try a more difficult case.

What I think is that the "software" is looking inside the files and assuming the data content is metadata.

But you answered my question: it should only take a couple of minutes. Now I just need to analyze exactly what is going on.

Re: R-Studio 5.2

Posted: June 26th, 2010, 10:12
by drc
You could also try asking on the R-Studio forums.

Re: R-Studio 5.2

Posted: June 26th, 2010, 20:30
by code_slave
Ah.Ah.Ahh...........

I had actually been speaking to their technical dept. (They made a "special" version; we got it up to situations where the program had access to 270 GB of memory: 18 GB of RAM plus swap.)

Took another look at it yesterday, and now I think I know what is happening.
The recovery algorithms are possibly sub-optimal (so to speak).

I was under the assumption that the scan just built up a metadata file of the types of blocks, but instead it actually attempts to mark the blocks as being parts of files, which is not a particularly smart move, because when the data is saved the database can be massive.

Under certain situations the algorithms end up treating some image files as directory content.
For example:
if I have one disk image stored on a device and that file gets deleted, R-Studio, rather than finding the one file, actually goes off into the image and finds the sub-files (which is understandable, if not wrong).

The result is that with a disk holding a couple of hundred image files, things can get very complicated. This in itself has implications for people who work a lot with disk images of operating systems.
(It also accounts for the massive number of files being found, and hence the size of the .scn file.)
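To see why descending into deleted disk images would inflate the scan database, here is a rough back-of-envelope sketch. This is my own assumption about how entries accumulate, not R-Studio's actual internals: if the scanner records an entry for every file it can reconstruct, each embedded image contributes all of its internal files on top of the files that actually live on the device.

```python
# Back-of-envelope sketch (assumed model, not R-Studio internals):
# compare the number of scan entries when embedded disk images are
# treated as single files vs. descended into.

def scan_entries(files_on_device, images, files_per_image):
    """Return (entries treating images as plain files, entries descending into them)."""
    flat = files_on_device
    recursed = files_on_device + images * files_per_image
    return flat, recursed

# A device with 200 disk images, each holding ~50,000 files:
flat, recursed = scan_entries(files_on_device=200, images=200, files_per_image=50_000)
print(flat, recursed)  # 200 vs 10000200 entries
```

The entry count (and hence the .scn file) grows with the *contents* of every image, not with the number of files the user actually stored.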

Interestingly, it seems this has been flagged before, but in a general way:
http://forum.r-tt.com/viewtopic.php?f=13&t=74

As of 15-JUN-2010 they say "We're trying to fix the problem."


It has actually been a profitable exercise.

Re: R-Studio 5.2

Posted: July 7th, 2010, 4:21
by scratchy
code_slave wrote:actually goes off into the image and finds the sub files.(which is understandable, if not wrong)


Not wrong, really. In scan mode it will scan each sector, whether it belongs to a file or not. The idea behind this type of scanning is recovering lost data (e.g. from a formatted drive), so the only way it can logically do this is to scan each sector for old file system information, ignoring any existing/current file system info, and then attempt to rebuild the structure.

So when it happens to start scanning an image file, it will interpret it as a partition and present the files from within that partition - it doesn't know it is a file at this stage.

In non-scan mode, it will just look at the existing file system info and build a directory tree from that.
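The sector-by-sector signature scan described above can be sketched roughly like this (hypothetical code; the signature list and offsets are simplified, and a real scanner checks many more structures at exact on-disk offsets). The key point: a hit inside an embedded image file looks exactly like a hit on the real device.

```python
SECTOR = 512

# Simplified magic numbers (illustrative only; real on-disk checks use
# exact offsets, e.g. the HFS+ "H+" signature sits at byte 1024).
SIGNATURES = {
    b"\xff\xd8\xff": "JPEG header",
    b"PK\x03\x04": "ZIP local file header",
}

def raw_scan(device):
    """Scan sector by sector, ignoring any existing file system metadata."""
    hits = []
    for off in range(0, len(device) - SECTOR + 1, SECTOR):
        sector = device[off:off + SECTOR]
        for magic, label in SIGNATURES.items():
            if sector.startswith(magic):
                hits.append((off, label))
    return hits

# A toy 4-sector "device": a deleted JPEG at sector 1, a ZIP member
# (perhaps sitting inside some larger image file) at sector 3.
device = bytearray(4 * SECTOR)
device[1 * SECTOR:1 * SECTOR + 3] = b"\xff\xd8\xff"
device[3 * SECTOR:3 * SECTOR + 4] = b"PK\x03\x04"
print(raw_scan(bytes(device)))  # [(512, 'JPEG header'), (1536, 'ZIP local file header')]
```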

Re: R-Studio 5.2

Posted: July 13th, 2010, 18:22
by code_slave
Then, if this is the case, it is doomed to failure on any reasonably complex task,

since there is no block 'translation' feature to convert the file system's BTL (block translation level) mappings to the BTL storage mappings of the file.

I.e. it is a nested BTL system, and it is treating an OS image file as part of the larger block-mapped device rather than as an image of a block-mapped device. As a result, any attempt to locate file parts will only undergo one block-mapped translation rather than two, which may result in less data being recovered and in an intermixing of the outer BTL with the inner BTL contents of an image file.

What could then get really messy is if I have image files stored in an image file of a disk, which is then stored in an examined file system.

The program needs some sort of override functionality whereby it can be instructed not to treat an embedded imaged file system as part of the device exploration but rather as a unique and distinct file.

The search routines need to be made more intelligent.
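The two-level mapping described above can be sketched as follows (toy mappings of my own invention, using the post's BTL terminology): recovering a file stored inside a disk-image file needs an inner translation (embedded-file block -> block within the image) followed by an outer one (image-file block -> device block); applying only the outer translation mixes the two layers.

```python
# Sketch of nested block translation (assumed toy mappings, not real FS data).
outer = {0: 100, 1: 107, 2: 250}   # image-file blocks -> device blocks
inner = {0: 2, 1: 0}               # embedded-file blocks -> image-file blocks

def device_blocks(file_blocks):
    """Correct recovery of an embedded file: inner translation, then outer."""
    return [outer[inner[b]] for b in file_blocks]

def device_blocks_one_level(file_blocks):
    """What a single translation does: it treats embedded-file blocks as
    image-file blocks, landing on the wrong device blocks."""
    return [outer[b] for b in file_blocks]

print(device_blocks([0, 1]))            # [250, 100]
print(device_blocks_one_level([0, 1]))  # [100, 107] -- wrong blocks
```

With image files nested inside image files, each extra level adds another mapping that a single-translation scan silently skips.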

Re: R-Studio 5.2

Posted: July 13th, 2010, 22:32
by drc
code_slave wrote:The program needs some sort of override functionality whereby it can be instructed not to treat an embedded imaged file system as part of the device exploration but rather as a unique and distinct file.
How is it supposed to know that it is looking at an image file if you are doing a raw scan?

Re: R-Studio 5.2

Posted: July 14th, 2010, 19:24
by code_slave
From the allocation structure?
And from the fact that the data is in a place it should not be, if it were system data for the drive?

Re: R-Studio 5.2

Posted: July 14th, 2010, 21:26
by drc
If you're doing a raw scan, the program is not paying attention to allocation structure (FAT table, MFT, whatever).

Re: R-Studio 5.2

Posted: September 22nd, 2010, 4:05
by einstein9
code_slave wrote:Yep...
the drive is actually 1 TB; the partition analyzed is 686 GB.

The file is 7.35 GB.
It was saved using "save scan information" from an HFS+ volume.

It is supposedly just a metadata "map" file of all the blocks on the partition, so it works out at about 1% of the actual volume size.

On the Windows version it takes 1:18 to load; on the Mac version, 17 minutes.

Someone had asked me to do an analysis/evaluation of R-Studio, with a view to them using it in their recovery situations.


Well, I will tell you something here about R-Studio; read it carefully.

If you are using a pirated version of 5.x or higher, then that is the problem. I tried cracked versions of R-Studio 5.0, 5.1, and 5.2, and with any 1 TB HDD they all do the same thing; sometimes you feel the loading indicator is not moving, but it is running in the background.

PC speed doesn't really matter a lot, but it makes a difference. About the scan file size you mentioned: it is TOO BIG.
I remember that for 1 TB it might, or should, be no more than 40-50 MB.

I suggest using another app, not R-Studio, if you're using a cracked version.
BTW, I bought the latest version, and it works 100% with a 1 TB drive while the cracked version does not.

Re: R-Studio 5.2

Posted: September 22nd, 2010, 5:47
by code_slave
Er, actually my copy is 100% legal,

both for Windows and OS X. The reason it does not work is embedded image files.

Re: R-Studio 5.2

Posted: September 22nd, 2010, 9:10
by einstein9
code_slave wrote:Er, actually my copy is 100% legal,

both for Windows and OS X. The reason it does not work is embedded image files.


PM sent