  1. #1
    Join Date
    Apr 1999
    Posts
    123

    convert unicode codepoint to character

    How do I convert a Unicode code point into a UTF-16 encoded string in Visual C++ 6?

    I believe there is a .NET method, Char.ConvertFromUtf32, which does what I want.

  2. #2
    Join Date
    Sep 2002
    Location
    14° 39'19.65"N / 121° 1'44.34"E
    Posts
    9,815

    Re: convert unicode codepoint to character

    Quote Originally Posted by Bob H
    How do I convert a Unicode code point into a UTF-16 encoded string in Visual C++ 6?
    You can simply insert the 16-bit hexadecimal codes into a UNICODE string literal by escaping them with "\x" (provided that your project is built for UNICODE):
    Code:
    LPCTSTR str = _T("Some text: \x00C4\x00D6\x00DC");

  3. #3
    Join Date
    Apr 1999
    Posts
    123

    Re: convert unicode codepoint to character

    Thank you. I have been searching for days on how to do this.

  4. #4
    Join Date
    Apr 1999
    Posts
    123

    Re: convert unicode codepoint to character

    How do I go the other direction, from a Unicode string back to code points?
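
    One way (a minimal sketch, not from the thread; Utf16ToCodePoints is a hypothetical helper name, and it assumes well-formed input and 16-bit wchar_t, as in VC++ on Windows) is to walk the string and combine surrogate pairs into single code points:
    Code:
    #include <string>
    #include <vector>

    // Decode a UTF-16 string into Unicode code points.
    // Assumes well-formed UTF-16 and 16-bit wchar_t (true for VC++ on Windows).
    std::vector<unsigned long> Utf16ToCodePoints(const std::wstring& s)
    {
        std::vector<unsigned long> cps;
        for (std::wstring::size_type i = 0; i < s.size(); ++i) {
            unsigned long cp = s[i];
            if (cp >= 0xD800 && cp <= 0xDBFF && i + 1 < s.size()) {
                unsigned long lo = s[i + 1];
                if (lo >= 0xDC00 && lo <= 0xDFFF) {
                    // High surrogate followed by low surrogate:
                    // combine the pair into one code point.
                    cp = 0x10000 + ((cp - 0xD800) << 10) + (lo - 0xDC00);
                    ++i; // consumed two code units
                }
            }
            cps.push_back(cp);
        }
        return cps;
    }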

  5. #5
    Join Date
    Apr 1999
    Posts
    123

    Re: convert unicode codepoint to character

    I am still struggling with the first problem. You created a string using hex constants. How do I create that string if the code points are variables? In other words, how do I form an escaped hex number programmatically?

    For example, the following won't work:
    Code:
    TCHAR buffer[10];
    _stprintf(buffer, _T("%04x"), uCode);
    CString sHex = buffer;
    CString sEscaped = _T("\x") + sHex;
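
    The reason it fails: \x escapes are resolved by the compiler inside string literals, so one cannot be assembled at run time. At run time you store the code-unit value itself, and emit a surrogate pair for code points above U+FFFF. A minimal sketch (CodePointToUtf16 is a hypothetical helper name; assumes 16-bit wchar_t, as in a VC++ UNICODE build):
    Code:
    #include <string>

    // Encode one Unicode code point as UTF-16.
    // Assumes cp is a valid code point and not itself a surrogate value.
    std::wstring CodePointToUtf16(unsigned long cp)
    {
        std::wstring s;
        if (cp <= 0xFFFF) {
            // Basic Multilingual Plane: one code unit.
            s += static_cast<wchar_t>(cp);
        } else if (cp <= 0x10FFFF) {
            // Supplementary plane: surrogate pair.
            cp -= 0x10000;
            s += static_cast<wchar_t>(0xD800 + (cp >> 10));   // high surrogate
            s += static_cast<wchar_t>(0xDC00 + (cp & 0x3FF)); // low surrogate
        }
        return s;
    }

    With that, the example above reduces to CString sChar = CodePointToUtf16(uCode).c_str(); and no escaping is involved at all.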
