I know there have already been similar posts to this, but I couldn’t find any that work.
I’ve used the exact same decompress function from this post. It didn’t seem to work and only returned “???”.
The reason I didn’t use an existing TMX loader is that I don’t really like the way linking packages works in C++, and I mostly want to write the code inside the project instead.
How exactly should I be decompressing the zlib-compressed data?
You will need a base64 library and a zlib library. Make sure you decode the base64 data first, then pass the result into the zlib decoder. Note that the decompressed data is binary and should be interpreted as an array of integers, so you may still get garbage if you try to print it without interpreting it.
I’ve written about this in a bit more detail in my tip sheet.
Edit: Base64 and zlib libraries are available as header-only libraries, so you should be able to just drop those into your project without any linking difficulties.
I already got the base64 decoder part, it’s only the zlib decompression I’m having issues with. I already have the zlib library installed.
I haven’t tried printing only one part of the decompressed data, I’ll see if it works that way.
Just tried it, it only seems to return ?. Am I supposed to be using a different type than std::vector<char>?
Like I said, the tile layer data, after being decoded and decompressed, should be an array of integers, specifically an array of unsigned int. A char is 1 byte and an unsigned int is 4 bytes, so you’re printing some random byte of the int and expecting it to make sense xP
You can copy or cast the data into an unsigned int array (like in the example I linked), or you can manually interpret every 4 bytes of your byte array as an int (like in the Tiled GID documentation), either will work.
I tried the code from your site instead and it seems to have worked. I didn’t really have much hope that the previous decompression function I used would ever work.