January 6th, 2013, 02:03 PM
Decoding '%F6'-Like Unicode Characters to a String in C++ (Mac OS X)
I'm having some problems receiving file names from a server on the client side (C++) on Mac OS X. I send a serialized object that holds the file name in a char pointer, or sometimes a string object. When I receive it in the client, the name contains sequences like %F6 or %E9. This issue doesn't arise on Windows, even though it's the same code. Is there any way to decode these '%' sequences back to their original form on Mac OS and Linux?
A few characters I ran into problems with: ǡ ȅ ȉ
It would be difficult to change the code on the server, so if there's a way to decode the characters back to their original form, that would be easier. I'm using the Boost library for serialization, and I'm just looking for a way to decode %F6 back to ȅ in C++. Is there some library available for this?
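For what it's worth, if the names really are URL-style percent-encoded, undoing the `%XX` escapes is straightforward to do by hand. A minimal sketch (the function name `percentDecode` is my own, not from any library); note this only restores the raw bytes, it does not convert between character encodings:

```cpp
#include <cctype>
#include <cstdlib>
#include <string>

// Decode "%XX" hex escapes (URL-style percent-encoding) back to raw bytes.
// The resulting bytes are still in whatever encoding the sender used
// (e.g. Latin-1 or UTF-8); this does not translate encodings.
std::string percentDecode(const std::string& in) {
    std::string out;
    out.reserve(in.size());
    for (std::size_t i = 0; i < in.size(); ++i) {
        if (in[i] == '%' && i + 2 < in.size()
            && std::isxdigit(static_cast<unsigned char>(in[i + 1]))
            && std::isxdigit(static_cast<unsigned char>(in[i + 2]))) {
            const std::string hex = in.substr(i + 1, 2);
            out += static_cast<char>(std::strtol(hex.c_str(), 0, 16));
            i += 2;  // skip the two hex digits just consumed
        } else {
            out += in[i];  // pass everything else through unchanged
        }
    }
    return out;
}
```

For example, `percentDecode("f%F6o")` yields the three bytes `f`, `0xF6`, `o`; a lone `%` not followed by two hex digits is left alone.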
January 6th, 2013, 04:47 PM
Re: Decoding '%F6' Like Unicode Characters to string in C++(Mac OS X)
>> ... i'm just looking for ways to decode %F6 back to ȅ ...
That doesn't make sense. Latin-small-letter-E-with-double-grave has no encoding that includes a 0xF6 byte - assuming "%F6" is a hex byte.
>> ... I'm using Boost Library for Serialization ...
What are you serializing - a std::string? You'll need to make sure the assumed encoding of the string is the same on both the client and the server. UTF-8, for example, is often used as the encoding for char strings.
When using a serialization library, whatever you "encode" should be identical once it is later "decoded". So I question whether the serialization library is being misused, or if this is just an encoding/interpretation issue. Where exactly does "%F6" come from? Can you provide a simple, compilable app that demonstrates the issue?
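To illustrate the encoding-agreement point: if the server's bytes turn out to be Latin-1 (where 0xF6 is 'ö') but the client expects UTF-8, the fix is a conversion step, not a change to the serialization. A minimal sketch, assuming the input really is Latin-1 (the function name `latin1ToUtf8` is mine):

```cpp
#include <string>

// Convert a Latin-1 (ISO-8859-1) byte string to UTF-8 so that both
// client and server can agree on UTF-8 as the wire encoding.
// Latin-1 code points are all < 0x100, so each byte maps to at most
// two UTF-8 bytes: 0xF6 ('ö') becomes 0xC3 0xB6, for example.
std::string latin1ToUtf8(const std::string& in) {
    std::string out;
    out.reserve(in.size());
    for (std::string::size_type i = 0; i < in.size(); ++i) {
        const unsigned char c = static_cast<unsigned char>(in[i]);
        if (c < 0x80) {
            out += static_cast<char>(c);          // ASCII passes through
        } else {
            out += static_cast<char>(0xC0 | (c >> 6));    // lead byte
            out += static_cast<char>(0x80 | (c & 0x3F));  // continuation
        }
    }
    return out;
}
```

If the server cannot be changed, the client could apply this (after undoing any %XX escapes) before handing the name to APIs that expect UTF-8, as the Mac OS X filesystem layer does.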