ZenHAX | Free Game Research Forum
PostPosted: Sat Sep 15, 2018 4:33 pm
Demonslayerx8

Joined: Thu Sep 14, 2017 12:02 am
Posts: 15
Would anyone be willing to make a script/tool to unpack the files from Black Clover Quartet Knights?
I'd upload the files, but they're pretty big and would take too long to upload. It looks like the DATH files hold the file directories, I think; a sample can be grabbed below.
[screenshots of the archive files]


Attachments:
GxArchivedFile000.rar [87.4 KiB]
Downloaded 8 times
PostPosted: Sat Sep 15, 2018 5:36 pm 

Joined: Sat Sep 15, 2018 5:28 pm
Posts: 1
Hello!
I need some help extracting the files of this game.
Can anyone help me, please? Is there a script for this type of archive?

I need to find the texts, and I think they are in that folder.
[screenshot]

Thank you!


PostPosted: Sat Sep 15, 2018 6:46 pm
aluigi (Site Admin)

Joined: Wed Jul 30, 2014 9:32 pm
Posts: 9267
The format is very simple, but there are no filenames stored, so be ready for some headache browsing about 13000 nameless files (13000 just for the first dat, so double that number overall):
http://aluigi.org/bms/black_cover_gdath2.bms
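
A usage sketch (standard quickbms invocation; the file name here is just the sample from this thread, and whether the script wants the .dath or the .dat as input is an assumption, so check the script's comments):
Code:
quickbms black_cover_gdath2.bms GxArchivedFile000.dath extracted_files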


PostPosted: Sun Sep 16, 2018 12:44 am
akderebur

Joined: Wed Nov 15, 2017 1:54 pm
Posts: 55
Chrrox's bms script for another game seems relevant for this archive too: http://forum.xentax.com/viewtopic.php?p=104389#p104389 . At the least, it is possible to set a file extension based on "FLDRHASH".

Also, are you sure that the files are not compressed? I was looking at one of the model files: http://www.mediafire.com/file/46cqv2woi ... l.rar/file . I am not sure, but looking at the bone names around 0xB0, they seem compressed, like some variation of RLE maybe?

Edit: Indeed, the files are compressed with lz4.


PostPosted: Sun Sep 16, 2018 11:27 am
aluigi (Site Admin)

Joined: Wed Jul 30, 2014 9:32 pm
Posts: 9267
Yes, it's lz4:
Code:
comtype lz4
goto 0x80                     # skip the 0x80-byte file header
get DUMMY long
get SIZE long                 # decompressed size
get ZSIZE long                # compressed size
get DUMMY long
savepos OFFSET                # compressed data starts here
get NAME basename
clog NAME OFFSET ZSIZE SIZE   # decompress ZSIZE bytes into SIZE bytes


PostPosted: Sun Sep 16, 2018 1:54 pm
akderebur

Joined: Wed Nov 15, 2017 1:54 pm
Posts: 55
There is still something off, though. The output is fine for small files, but with slightly larger ones it seems inaccurate. Vertex/index buffer sizes don't match the sizes given in the header, and you can also see that the indices are weird (compared to the correct output on smaller files). Maybe there is padding (unnecessary bytes) that messes up the decompression?

I am not sure how I can find the source of the problem, but I will give it a try.

Edit: It wasn't unnecessary bytes, but chunk sizes. The file is split into chunks. The smaller files with a single chunk were working fine, but files with multiple chunks were failing. So the second "DUMMY" long in your script is actually the chunk size: after you read that many bytes, there is another long which is the size of the next chunk.
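
Putting that together, the container seems to look like this (reconstructed from this thread; the field names are mine, not official):
Code:
/* offset 0x00: 0x80 bytes of file header, fields mostly unknown */
struct dat_header {          /* at offset 0x80 */
    uint32_t chunk_size;     /* decompressed size of each full chunk */
    uint32_t total_size;     /* total decompressed size of the file */
    uint32_t total_zsize;    /* sum of the compressed chunk sizes */
};
/* then, repeated until total_zsize compressed bytes are consumed: */
struct chunk {
    uint32_t zsize;          /* compressed size of this chunk */
    /* followed by zsize bytes of raw lz4 block data */
};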


PostPosted: Sun Sep 16, 2018 6:58 pm
aluigi (Site Admin)

Joined: Wed Jul 30, 2014 9:32 pm
Posts: 9267
Upload that new sample.


PostPosted: Sun Sep 16, 2018 7:22 pm
akderebur

Joined: Wed Nov 15, 2017 1:54 pm
Posts: 55
Here, a file with 4 chunks.


Attachments:
chunked_model.rar [70.18 KiB]
Downloaded 7 times
PostPosted: Sun Sep 16, 2018 8:19 pm
aluigi (Site Admin)

Joined: Wed Jul 30, 2014 9:32 pm
Posts: 9267
There is something weird about the chunks: apparently the chunk boundaries are ignored and the decompression must be applied to the whole data, basically maintaining the "context" across the chunks instead of decompressing each chunk separately.
That's something quickbms can't support directly, so I tried collecting all the chunks in a buffer and decompressing that buffer, which is indeed what's expected, but test.mdl failed:
Code:
comtype lz4
goto 0x80
get CHUNK_SIZE long   # decompressed size of each chunk
get SIZE long         # total decompressed size
get ZSIZE long        # total compressed size
get NAME basename

log MEMORY_FILE 0 0   # reset the memory file
append
for MEM_SIZE = 0 != ZSIZE
    get CHUNK_ZSIZE long                 # compressed size of this chunk
    savepos OFFSET
    log MEMORY_FILE OFFSET CHUNK_ZSIZE   # collect the raw chunk data
    math OFFSET + CHUNK_ZSIZE
    goto OFFSET
next MEM_SIZE + CHUNK_ZSIZE
append
clog NAME 0 ZSIZE SIZE MEMORY_FILE   # decompress the whole buffer at once
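
Outside quickbms, that "maintain the context" behaviour is exactly what lz4's streaming API provides. A minimal C sketch under the chunk layout described above (an illustration, not code anyone in this thread ran):
Code:
/* Per-chunk lz4 decompression that keeps the decoding context across
   chunks, via the streaming API in lz4.h. Assumes a little-endian host
   and the layout found in this thread: a uint32 compressed size before
   each chunk, with total_zsize = sum of the compressed chunk sizes. */
#include <stdint.h>
#include <stdio.h>
#include <lz4.h>

int decode_chunks(FILE *f, uint32_t total_zsize, char *out, int out_cap)
{
    LZ4_streamDecode_t sd;
    LZ4_setStreamDecode(&sd, NULL, 0);     /* start with an empty context */

    static char zbuf[0x100000];
    uint32_t done = 0;
    int      op   = 0;

    while (done < total_zsize) {
        uint32_t zsize;
        if (fread(&zsize, 4, 1, f) != 1 || zsize > sizeof(zbuf)) return -1;
        if (fread(zbuf, 1, zsize, f) != zsize) return -1;

        /* _continue lets matches in this chunk reference bytes decoded
           from the earlier chunks (the "context") */
        int n = LZ4_decompress_safe_continue(&sd, zbuf, out + op,
                                             (int)zsize, out_cap - op);
        if (n < 0) return -1;

        op   += n;
        done += zsize;
    }
    return op;   /* total decompressed size */
}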

Just for reference, this is the test script for the case where the chunks are decompressed one by one, as usually happens (I leave it here just in case):
Code:
comtype lz4
goto 0x80
get CHUNK_SIZE long   # decompressed size of each chunk
get SIZE long         # total decompressed size
get ZSIZE long        # total compressed size
get NAME basename
log NAME 0 0          # create the output file empty
append                # and append each decompressed chunk to it
for MEM_SIZE = 0 < SIZE
    get CHUNK_ZSIZE long
    savepos OFFSET
    clog NAME OFFSET CHUNK_ZSIZE CHUNK_SIZE   # decompress one chunk alone
    math OFFSET + CHUNK_ZSIZE
    goto OFFSET
next MEM_SIZE + CHUNK_SIZE
append


PostPosted: Sun Sep 16, 2018 9:55 pm
chrrox

Joined: Thu Aug 07, 2014 10:28 pm
Posts: 186
If I use the -e option and just read the whole file starting at 0x90 as the zsize, it extracts the large files.
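
Something like this, presumably (my reconstruction of that approach, not the exact script; run quickbms with -e so any decompression errors are ignored):
Code:
comtype lz4
goto 0x84
get SIZE long        # total decompressed size from the header
get ZSIZE asize      # treat everything from 0x90 onward as one stream
math ZSIZE - 0x90
get NAME basename
clog NAME 0x90 ZSIZE SIZE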


PostPosted: Sun Sep 16, 2018 10:06 pm
akderebur

Joined: Wed Nov 15, 2017 1:54 pm
Posts: 55
The problem is that the last lz4 block before a new chunk isn't complete: the 2-byte offset (the offset for copying from the output buffer) is missing. In my own implementation the 2-byte offset was required even when the token is 0, so it caused problems. I haven't checked how you implemented lz4 in quickbms, but maybe quickbms also expects it?
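
For context, this is the raw lz4 block layout the above refers to; in the standard format the last sequence of a block legitimately ends right after its literals, with no 2-byte offset. A minimal decoder sketch (no bounds checking, and not the code used here):
Code:
#include <stddef.h>
#include <stdint.h>

size_t lz4_block_decode(const uint8_t *in, size_t in_len, uint8_t *out)
{
    size_t ip = 0, op = 0;
    while (ip < in_len) {
        uint8_t token = in[ip++];

        size_t lit = token >> 4;               /* literal count, high nibble */
        if (lit == 15) {                       /* 15 means "more bytes follow" */
            uint8_t b;
            do { b = in[ip++]; lit += b; } while (b == 255);
        }
        while (lit--) out[op++] = in[ip++];    /* copy the literals */

        if (ip >= in_len) break;               /* last sequence: no offset */

        uint16_t off = in[ip] | (in[ip + 1] << 8);   /* little-endian */
        ip += 2;

        size_t mlen = token & 0x0F;            /* match length, low nibble */
        if (mlen == 15) {
            uint8_t b;
            do { b = in[ip++]; mlen += b; } while (b == 255);
        }
        mlen += 4;                             /* minimum match is 4 bytes */
        while (mlen--) { out[op] = out[op - off]; op++; }
    }
    return op;   /* decompressed size */
}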

I had to change my implementation so that for the last block only the literals are copied; copying from the output buffer (with offset and token) is skipped. I am not sure how this can be achieved in quickbms. The closest I could get to correct output was by inserting a 0 as a 2-byte short (like a dummy offset). It runs on test.mdl without error, but the output is still not exactly correct:
Code:
comtype lz4
goto 0x80
get CHUNK_SIZE long   # decompressed size of each chunk
get SIZE long         # total decompressed size
get ZSIZE long        # total compressed size
get NAME basename
set DYN_SIZE ZSIZE    # ZSIZE plus the dummy offsets added below

log MEMORY_FILE 0 0
append
for MEM_SIZE = 0 != ZSIZE
    get CHUNK_ZSIZE long
    savepos OFFSET
    log MEMORY_FILE OFFSET CHUNK_ZSIZE
    put 0 short MEMORY_FILE   # append a dummy 2-byte offset after each chunk
    math DYN_SIZE + 2
    math OFFSET + CHUNK_ZSIZE
    goto OFFSET
next MEM_SIZE + CHUNK_ZSIZE
append
clog NAME 0 DYN_SIZE 20000000 MEMORY_FILE   # 20000000 = oversized output size

Anyway, I got this working with custom lz4 code; I just wanted to help with the bms script. I am not sure if this can be done without changing the actual lz4 code, though.

chrrox wrote:
If I use the -e option and just read the whole file starting at 0x90 as the zsize, it extracts the large files.

Is the unpacked data correct? It might seem fine at the beginning of the file, but I suspect it becomes nonsense as you go further.


PostPosted: Sun Sep 16, 2018 11:16 pm
chrrox

Joined: Thu Aug 07, 2014 10:28 pm
Posts: 186
Yeah, you are right, the data looks wrong.


PostPosted: Sun Sep 16, 2018 11:55 pm
aluigi (Site Admin)

Joined: Wed Jul 30, 2014 9:32 pm
Posts: 9267
The implementation of lz4 in quickbms is very simple:
size = LZ4_decompress_safe_partial(in, out, zsize, size, *outsize);

and this is an important comment about why I opted for safe_partial:
Quote:
// hard choice here:
// LZ4_decompress_safe returns errors if there are additional bytes after the compressed stream (because it's raw)
// LZ4_decompress_safe_partial returns no errors if the stream is valid
// currently I opt for the second one because gives more freedom to quickbms and its scanner
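
For comparison, the two calls side by side (both from lz4.h, with the parameter names used above):
int n = LZ4_decompress_safe(in, out, zsize, outsize);               /* < 0 if bytes follow the stream */
int m = LZ4_decompress_safe_partial(in, out, zsize, size, outsize); /* ok once 'size' bytes are out */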


If you are interested in how the various algorithms are implemented in quickbms, check the perform_compression function in perform.c.


PostPosted: Mon Sep 17, 2018 12:45 am
akderebur

Joined: Wed Nov 15, 2017 1:54 pm
Posts: 55
aluigi wrote:
If interested in how the various algorithms are implemented in quickbms you must check the perform_compression function in perform.c

I should really do that some time. The number of algorithms that quickbms supports is insane :D I could definitely learn a few things.

I have attached a little program for decompressing the files. Drag and drop a file onto the exe. I have only tested it with mdl and tex files.


Attachments:
BC_Dec_U1.rar [2.77 KiB]
Downloaded 11 times


Last edited by akderebur on Tue Sep 18, 2018 1:02 am, edited 1 time in total.
PostPosted: Mon Sep 17, 2018 9:58 am
Demonslayerx8

Joined: Thu Sep 14, 2017 12:02 am
Posts: 15
I say, your BC_Dec is failing to download for me. Chrome, IE, Edge, and JDownloader just don't want to download it because a virus is being detected; you should get that checked out.


PostPosted: Mon Sep 17, 2018 10:06 am
akderebur

Joined: Wed Nov 15, 2017 1:54 pm
Posts: 55
Try this: http://www.mediafire.com/file/q3q19k5qj ... 1.rar/file


Last edited by akderebur on Tue Sep 18, 2018 1:01 am, edited 1 time in total.

PostPosted: Mon Sep 17, 2018 10:17 am
Demonslayerx8

Joined: Thu Sep 14, 2017 12:02 am
Posts: 15
It still failed even with that; I had to disable Windows anti-virus to grab it. Sadly the file goes away again after turning it back on.

Edit: So what's the best way to unpack those DAT files, aluigi's script and then BC_Dec?


PostPosted: Mon Sep 17, 2018 10:24 am
akderebur

Joined: Wed Nov 15, 2017 1:54 pm
Posts: 55
Demonslayerx8 wrote:
It still failed even with that; I had to disable Windows anti-virus to grab it.

Weird, seems like a false positive in some anti-virus programs.
Demonslayerx8 wrote:
Edit: So what's the best way to unpack those DAT files, aluigi's script and then BC_Dec?

Exactly.


PostPosted: Mon Sep 17, 2018 10:34 am
Demonslayerx8

Joined: Thu Sep 14, 2017 12:02 am
Posts: 15
akderebur wrote:
Demonslayerx8 wrote:
Edit: So what's the best way to unpack those DAT files, aluigi's script and then BC_Dec?

Exactly.

Gotcha. Any way to make it process a whole folder instead of one file at a time? Would be easier for me lol


PostPosted: Mon Sep 17, 2018 10:40 am
akderebur

Joined: Wed Nov 15, 2017 1:54 pm
Posts: 55
Yeah, I can do that; I will update it when I have the time.
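
In the meantime, a plain cmd loop should work, assuming the exe takes a single file path as its argument (same as the drag-and-drop; the folder path is just an example):
Code:
for %f in ("C:\extracted\*") do BC_Dec_U1.exe "%f"

(Inside a .bat file, double the percent signs: %%f.)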


   