How to read base64 zlib data in TMX files in C++?

Hello,

I am trying to make a map loader for my game and want to load the TMX file format. Now I have stumbled across a problem with decompressing the data after I have decoded it. I know my decoding code works, because it works just fine for the uncompressed file.

After decompressing I can see correct data, but after a while it fails/corrupts and just repeats the last decoded values.

std::vector<std::string> dd = split(prop->value(), ' ');
std::string d;

for (std::string tmp : dd) // I want to remove the \n\r and spaces from the beginning of the string
{
	if (tmp.size() > 10)
	{
		d = tmp;
		break;
	}
}

//decode
std::vector<char> tmp = base64_decode(d);

//decompress
std::vector<char> tmp2 = decompress_data(tmp);

// Copy the data over. 
Layer.Data.resize(tmp2.size() / sizeof(unsigned int));
memcpy(&Layer.Data[0], &tmp2[0], tmp2.size());
int gg = 0; //break point

And my borrowed decompress function:

std::vector<char> decompress_data(const std::vector<char>& data)
{
	// why not use http://www.cs.unc.edu/Research/compgeom/gzstream/#src
	std::vector<char> newdata;
	newdata.resize(5000);

	// https://gist.github.com/arq5x/5315739
	// STEP 2: inflate b into c
	// zlib struct
	z_stream infstream;
	infstream.zalloc = Z_NULL;
	infstream.zfree = Z_NULL;
	infstream.opaque = Z_NULL;

	// setup "b" as the input and "c" as the compressed output
	infstream.avail_in = (uInt)data.size();       // size of input
	infstream.next_in = (Bytef *)data.data();     // input char array
	infstream.avail_out = (uInt)newdata.size();   // size of output
	infstream.next_out = (Bytef *)newdata.data(); // output char array

	// the actual DE-compression work
	inflateInit(&infstream);
	inflate(&infstream, Z_NO_FLUSH);
	inflateEnd(&infstream);

	return newdata;
}

I fixed the size at 5000; the actual map should contain 4000 entries. Do any of you have an idea what is wrong here?
Let me know if you need more information.

What engine is that TMX loader for? What about using one of the existing loaders, would those work for you? If not, they might still serve as good examples where you can copy & paste some code from.

I’ve also started early access for TilemapKit’s C++11 TMX loader here, in case you’re using cocos2d-x. (Though I may eventually port this to other engines; the loader code and map model classes are completely unaware of any engine and can be used even with custom engines - in case you’d rather just buy stuff than re-invent the wheel, wink wink nudge nudge :wink: ):
https://tilemapkit.com/downloads/tilemapkit-cocos2d-x-source/

I’ll just lazily paste the decompress routines, maybe that helps since you appear to be using zlib/libz as well. :wink:

#define TK_BUFFER_INC_FACTOR (1.5)
		
static int32_t inflateMemoryWithHintImp(uint8_t *in, uint32_t inLength, uint8_t **out, uint32_t *outLength, uint32_t outLengthHint)
{
	int32_t err = Z_OK;
	uint32_t bufferSize = outLengthHint;
	*out = (uint8_t*) malloc(bufferSize);
			
	z_stream d_stream; /* decompression stream */
	d_stream.zalloc = (alloc_func)0;
	d_stream.zfree = (free_func)0;
	d_stream.opaque = (voidpf)0;
	d_stream.next_in  = in;
	d_stream.avail_in = inLength;
	d_stream.next_out = *out;
	d_stream.avail_out = bufferSize;
			
	/* windowBits 15 (32K window) + 32 enables automatic zlib/gzip header detection */
	if ((err = inflateInit2(&d_stream, 15 + 32)) != Z_OK)
		return err;
			
	for (;;) {
		err = inflate(&d_stream, Z_NO_FLUSH);
		if (err == Z_STREAM_END)
			break;
				
		switch (err) {
			case Z_NEED_DICT:
				err = Z_DATA_ERROR;
			case Z_DATA_ERROR:
			case Z_MEM_ERROR:
				inflateEnd(&d_stream);
				return err;
		}
				
		// output buffer too small? grow it and keep inflating
		if (err != Z_STREAM_END) {
			uint32_t newBufferSize = (uint32_t)(bufferSize * TK_BUFFER_INC_FACTOR);
			uint8_t *tmp = (uint8_t*)realloc(*out, newBufferSize);
			if (!tmp) {
				inflateEnd(&d_stream);
				return Z_MEM_ERROR;
			}
					
			*out = tmp;
			d_stream.next_out = *out + bufferSize;
			d_stream.avail_out = newBufferSize - bufferSize; // only the newly added space is free
			bufferSize = newBufferSize;
		}
	}
			
	*outLength = bufferSize - d_stream.avail_out;
	err = inflateEnd(&d_stream);
	return err;
}
		
uint32_t inflateMemoryWithHint(uint8_t *in, uint32_t inLength, uint8_t **out, uint32_t outLengthHint)
{
	uint32_t outLength = 0;
	int32_t err = inflateMemoryWithHintImp(in, inLength, out, &outLength, outLengthHint);
	if (err != Z_OK || *out == NULL) {
		if (err == Z_MEM_ERROR)
			fprintf(stderr, "TKMapReader: Out of memory while inflating tile data!");
		else if (err == Z_VERSION_ERROR)
			fprintf(stderr, "TKMapReader: Incompatible zlib version!");
		else if (err == Z_DATA_ERROR)
			fprintf(stderr, "TKMapReader: Incorrect zlib compressed data!");
		else
			fprintf(stderr, "TKMapReader: unknown error while inflating tile data!");
		
		free(*out);
		*out = NULL;
		outLength = 0;
	}
	return outLength;
}

And this is how it’s used (included base64 decoding but you already have that covered):

CString dataString = trim(dataNode.text().as_string());
uint32_t expectedLength = (uint32_t)(tileLayer->_size.width * tileLayer->_size.height) * sizeof(TileGID);
uint8_t* tileBuffer = nullptr;
uint32_t tileBufferLength = Func::base64Decode((uint8_t*)dataString, (uint32_t)strlen(dataString), &tileBuffer);
	
if (tileBuffer == nullptr) {
	std::ostringstream ss; ss << "Error decoding tile data, base64Decode() returned NULL for layer " << tileLayer->_name << ".";
	failWithMessage(ss.str());
	return false;
}
	
if (dataFormat != TMXDataFormat::Uncompressed) {
	uint8_t* buffer;
	uint32_t length = Func::inflateMemoryWithHint(tileBuffer, tileBufferLength, &buffer, expectedLength);
		
	free(tileBuffer);
	tileBuffer = buffer;
	tileBufferLength = length;
}
	
if (tileBuffer == nullptr || tileBufferLength != expectedLength) {
	std::ostringstream ss; ss << "Error deflating compressed tile data, inflateMemory() returned NULL for layer " << tileLayer->_name << ".";
	failWithMessage(ss.str());
	return false;
}
	
tileLayer->_tiles = (TileGID*)tileBuffer; // tile layer takes ownership, will free the tileBuffer 
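
If you’d rather keep everything in std::vector like in your original snippet, here’s a rough, untested sketch of a replacement for your decompress_data() that just wraps the inflateMemoryWithHint() routine above (expectedLength being the uncompressed size you calculate from the layer dimensions):

// Untested sketch: wraps inflateMemoryWithHint() for std::vector-based code.
// Needs <vector>, <cstdint>, <cstdlib> and zlib, plus the routine above.
std::vector<char> decompress_data(const std::vector<char>& data, uint32_t expectedLength)
{
	uint8_t* buffer = nullptr;
	uint32_t length = inflateMemoryWithHint((uint8_t*)data.data(), (uint32_t)data.size(), &buffer, expectedLength);

	std::vector<char> newdata;
	if (buffer != nullptr && length > 0) {
		newdata.assign((char*)buffer, (char*)buffer + length);
		free(buffer); // inflateMemoryWithHint() allocates with malloc/realloc
	}
	return newdata;
}

You’d then call it with the layer’s width * height * 4 as the hint, and the rest of your copy-into-Layer.Data code stays the same.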

Hello Steffen,

thanks for your example, it helped a lot. The official documentation is a little confusing, so I was not able to figure out what was failing. I had a two-part problem; one was simply a buffer that was too small. How do you define the “outLengthHint”? Do you calculate it based on the map size?

As for the engine I am using: I just finished school and I want to make my own game, so the engine is 100% my own… well, as much as possible.

outLengthHint is identical to expectedLength, which is simply the size of the uncompressed map data (w * h * bytes per tile, i.e. w * h * 4 bytes, because a GID is a 32-bit unsigned integer):

uint32_t expectedLength = (uint32_t)(tileLayer->_size.width * tileLayer->_size.height) * sizeof(TileGID);
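
For example, for a hypothetical 80 x 50 layer (4000 tiles, roughly your case) that works out to:

uint32_t tileCount = 80 * 50;                           // 4000 GIDs in the layer
uint32_t expectedLength = tileCount * sizeof(uint32_t); // 16000 bytes of uncompressed data

which is why the fixed 5000-byte buffer in your original decompress_data() could only ever hold the first 1250 tiles.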