-
November 13th, 2006, 04:38 PM
#1
convert unicode codepoint to character
How do I convert a unicode code point into a UTF-16 encoded string in Visual C++ 6?
I believe there is a .NET method called ConvertFromUtf32 which does what I want.
-
November 13th, 2006, 05:08 PM
#2
Re: convert unicode codepoint to character
Originally Posted by Bob H
How do I convert a unicode code point into a UTF-16 encoded string in Visual C++ 6?
You can simply insert the 16-bit hexadecimal codes into a UNICODE string literal by escaping them with "\x" (provided that your project is built for UNICODE):
Code:
LPCTSTR str = _T("Some text: \x00C4\x00D6\x00DC");
-
November 14th, 2006, 06:59 AM
#3
Re: convert unicode codepoint to character
Thank you. I have been searching for days on how to do this.
-
November 14th, 2006, 07:31 AM
#4
Re: convert unicode codepoint to character
How do I go the other direction -- from a unicode string to code points?
-
November 14th, 2006, 08:19 AM
#5
Re: convert unicode codepoint to character
I am still struggling with the first problem. You created a string using hex constants. How do I create that string if the code points are variables? In other words, how do I form an escaped hex number programmatically?
For example, the following won't work:
Code:
TCHAR buffer[10];
_stprintf(buffer, _T("%04x"), uCode);
CString sHex = buffer;
CString sEscaped = _T("\x") + sHex;