Got a defective SSD with a Marvell 88SS9175 controller and 4x SanDisk 05055 064G (id 453ea782) chips on it.
When saving the "decrypted data" (i.e. descrambled data) using the functionality under "Tests -> Utility tests -> Dump savings", we do get clear-text data. However, we are unsure about one setting that must be specified on the "Tests -> Utility tests -> NAND" settings page. There you are asked to enter "Blocks per plane", and this term is a bit ambiguous because PC3000 uses different terminology from e.g. Rusolut VNR.
If we identify the chip with Rusolut VNR, we get the following data:
- Page size: 8832 bytes
- Block size: 256 pages/block = 2260992 bytes
- Plane size (nominal): 8192 blocks/plane = 18522046464 bytes
- Plane size (real): 4196 blocks/plane = 9487122432 bytes
- Chip size (nominal): 8 planes/chip = 148176371712 bytes
- Chip size (real): 8 planes/chip = 75896979456 bytes
Now the interesting question is: what do these numbers translate into for the PC3000 settings? PC3000 specifies:
- 4 chips
- STAR count: 8
- Banks per STAR: 1
- Channel count: 4
- CE count: 1
- LUN: 8
- Planes: 4
- Page size: 8832 bytes
- Block size: 256 pages/block = 2260992 bytes
In order to get descrambled data at all, we have to specify this "Blocks per plane" value in PC3000 as well, but what is the correct value? The differing terminology between the two tools makes it a bit confusing.
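To make the nominal vs. real distinction in the VNR readout explicit, here is the plain arithmetic behind the numbers above (just a cross-check of the reported sizes, not an answer to which value PC3000 expects):

```python
# Cross-check of the Rusolut VNR geometry numbers reported above.
PAGE_SIZE = 8832                    # bytes per page
PAGES_PER_BLOCK = 256
BLOCK_SIZE = PAGE_SIZE * PAGES_PER_BLOCK   # 2260992 bytes per block
PLANES_PER_CHIP = 8

BLOCKS_PER_PLANE_NOMINAL = 8192     # addressable (nominal) blocks
BLOCKS_PER_PLANE_REAL = 4196        # physically present (real) blocks

plane_nominal = BLOCK_SIZE * BLOCKS_PER_PLANE_NOMINAL   # 18522046464 bytes
plane_real = BLOCK_SIZE * BLOCKS_PER_PLANE_REAL         # 9487122432 bytes
chip_nominal = plane_nominal * PLANES_PER_CHIP          # 148176371712 bytes
chip_real = plane_real * PLANES_PER_CHIP                # 75896979456 bytes

print(plane_nominal, plane_real, chip_nominal, chip_real)
```

All four products match the VNR readout, so the two chip sizes differ only in whether the nominal (8192) or real (4196) blocks-per-plane figure is used.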
Also, in order to reconstruct the data: would it be possible to first read each NAND chip with PC3000 using the descrambled-data functionality (which gives data stripped of the spare area bytes), then do a raw NAND read of the same chips (which includes the spare area bytes), and splice only the spare area bytes from the raw read into the descrambled read, in order to translate the data to a file system?
E.g. if the page structure of the raw NAND is [160 bytes of spare/ecc | 2048 bytes SCRAMBLED data] | [160 bytes of spare/ecc | 2048 bytes SCRAMBLED data] | [160 bytes of spare/ecc | 2048 bytes SCRAMBLED data] | [160 bytes of spare/ecc | 2048 bytes SCRAMBLED data]
and the PC3000 descrambled data is just 8192 bytes of data per page with no spare area, could we split each page of the PC3000 descrambled data into 4x 2048-byte chunks, prepend the 160 bytes of spare/ECC from the raw dump to each chunk, and reconstruct a page structure looking like
[160 bytes of spare/ecc | 2048 bytes DE-SCRAMBLED data] | [160 bytes of spare/ecc | 2048 bytes DE-SCRAMBLED data] | [160 bytes of spare/ecc | 2048 bytes DE-SCRAMBLED data] | [160 bytes of spare/ecc | 2048 bytes DE-SCRAMBLED data]
and then work our way from this?
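The splicing step described above could be sketched roughly as follows. This assumes the raw page layout really is [160 B spare/ECC | 2048 B data] x 4 (= 8832 bytes) and that the descrambled dump carries exactly 8192 data bytes per page in the same page order; the file paths and function names are placeholders, not PC3000 output names:

```python
# Sketch: re-attach spare/ECC bytes from a raw NAND dump to a descrambled
# (spare-stripped) dump, rebuilding the original page structure.
RAW_PAGE = 8832      # raw page: 4 x (160 spare + 2048 data)
CLEAN_PAGE = 8192    # descrambled page: 4 x 2048 data, no spare
SPARE = 160
SECTOR = 2048
SECTORS_PER_PAGE = 4

def rebuild_page(raw_page: bytes, clean_page: bytes) -> bytes:
    """Interleave spare bytes from the raw page with descrambled data."""
    assert len(raw_page) == RAW_PAGE and len(clean_page) == CLEAN_PAGE
    out = bytearray()
    for i in range(SECTORS_PER_PAGE):
        # Spare bytes come from the raw dump, data from the clean dump.
        spare = raw_page[i * (SPARE + SECTOR): i * (SPARE + SECTOR) + SPARE]
        data = clean_page[i * SECTOR: (i + 1) * SECTOR]
        out += spare + data
    return bytes(out)

def rebuild_dump(raw_path: str, clean_path: str, out_path: str) -> None:
    """Walk both dumps page by page, writing the reconstructed pages."""
    with open(raw_path, "rb") as raw, open(clean_path, "rb") as clean, \
         open(out_path, "wb") as out:
        while True:
            r = raw.read(RAW_PAGE)
            c = clean.read(CLEAN_PAGE)
            if len(r) < RAW_PAGE or len(c) < CLEAN_PAGE:
                break
            out.write(rebuild_page(r, c))
```

Note this only works if both dumps enumerate pages in the same physical order; if PC3000's descrambled dump reorders pages (e.g. by LUN or plane interleave), the spare bytes would be attached to the wrong pages.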