There appears to be a bug in gzip decompression in QuickBMS. I can extract a few files at a time this way, but when I extract multiple gzip archives with one script the program runs out of memory; yet if I restart halfway through the same file, it extracts the files it originally failed on.
Here is a sample for you to see: http://www.sendspace.com/file/mpb4fz I had to make QuickBMS export the files as .gz, otherwise I would get random memory errors.
uhmm, I used the virtua_fighter_5.bms script from my website and had no problems during the extraction. I even ran it in a loop ("for goto 0 ... next", see the sketch below) and it still used only about 50 megabytes of RAM in total.
Do you have a way to replicate the problem you see?
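Roughly, the loop test looks like this; it is only a sketch, with the script's normal extraction commands standing in for the comment:

    for
        goto 0
        # ... the usual extraction commands of virtua_fighter_5.bms ...
    next

The bare for/next pair repeats indefinitely and the goto 0 rewinds the archive, so memory usage can be watched over many full passes.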
I did not know you had made a script; I will try it on some files I had problems with. Why do you need to set the endian to little for gzip? I did not set it to little in the script I had made, and I was able to extract all the files, though only about 1-3 per pass. I'll try to replicate the problem for you.
I guess that is the reason for the memory consumption you saw. In short, gzip files store the size of the original file at their end, and to read it I used the function that follows the global endianness. In the next version I will remove that, because this is proof that gzip keeps the little endian size at the end even on big endian systems.
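In script terms: reading those last 4 bytes with the global endianness means that under a big endian setting a small size becomes a huge byte-swapped number, and that is what the tool then tries to allocate. The correct read forces little endian just for that one field, something like this sketch (the variable names are only illustrative):

    get GZSIZE asize       # total size of the .gz file
    math GZSIZE -= 4       # the ISIZE field is the last 4 bytes
    goto GZSIZE
    endian little          # gzip always stores ISIZE little endian
    get SIZE long          # uncompressed size modulo 2^32

RFC 1952 defines ISIZE as the size of the original input modulo 2^32, stored least significant byte first, so it never depends on the host's byte order.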
Ah, that explains a lot; I understand what was happening now. Well, I would keep that in mind for your tool in the future, but for now I can just wait for the new version. Thanks again.