CodeGuru Home VC++ / MFC / C++ .NET / C# Visual Basic VB Forums Developer.com

  1. #1
    Join Date
    Dec 2003
    Location
    Middletown, DE
    Posts
    67

    Binary to ASCII number conversion algorithms

    I'm curious to know what algorithms are known for converting binary numbers to ASCII decimal.

    When I first got started in assembly (on the 6502), I came across someone's code that had a 24-bit conversion routine. Since the 6502 has no mul or div opcodes, nor floating-point instructions, all computation was done with addition, subtraction, and bit shifting. What was most interesting is that the numbers were converted to ASCII starting with the ones place and working up. I was easily able to adapt it to n-bit number conversion.

    Unfortunately, the code to this routine got lost somewhere and I never understood it enough to know how it worked.

    So I was curious to know if others know this algorithm or others like it. I'm looking for efficiency.
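    The routine described above sounds like the classic shift-and-subtract approach: divide by 10 without a div instruction, peel off the remainder as the ones digit, and repeat. A sketch in C (my own reconstruction, not the lost 6502 code; the `div10` approximation is the well-known shift/add trick for 32-bit unsigned values):

    ```c
    #include <string.h>

    /* Divide n by 10 using only shifts, adds and subtracts -- the same kind of
       trick CPUs without a div instruction rely on. The shift/add sequence
       approximates n * (1/10); the result can be low by 1, corrected at the end. */
    static unsigned div10(unsigned n, unsigned *rem)
    {
        unsigned q = (n >> 1) + (n >> 2);
        q += q >> 4;
        q += q >> 8;
        q += q >> 16;
        q >>= 3;                                 /* q ~= n / 10, maybe 1 too small */
        unsigned r = n - ((q << 3) + (q << 1));  /* r = n - q*10, no mul needed    */
        if (r > 9) { q++; r -= 10; }             /* fix the possible off-by-one    */
        *rem = r;
        return q;
    }

    /* Convert n to an ASCII decimal string. Digits come out ones place first,
       exactly as the 6502 routine described above, then get reversed. */
    static void to_decimal(unsigned n, char *out)
    {
        char tmp[12];
        int i = 0, j = 0;
        do {
            unsigned r;
            n = div10(n, &r);
            tmp[i++] = (char)('0' + r);   /* ones place is produced first */
        } while (n);
        while (i)
            out[j++] = tmp[--i];          /* reverse into reading order   */
        out[j] = '\0';
    }
    ```

    On a 6502 you would instead do the divide-by-10 as an 8-shift loop of compare/subtract on a multi-byte value, but the structure (remainder becomes the next ASCII digit, quotient feeds the next pass) is the same.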

  2. #2
    Join Date
    Dec 2003
    Location
    Middletown, DE
    Posts
    67

    Re: Binary to ASCII number conversion algorithms

    One more shot... Can anyone provide some input?

  3. #3
    Join Date
    May 2004
    Location
    Pell City, Alabama
    Posts
    126

    Re: Binary to ASCII number conversion algorithms

    Couldn't get much simpler than this...

    #define __toascii(_c) ( (_c) & 0x7f )

  4. #4
    Join Date
    Dec 2003
    Location
    Middletown, DE
    Posts
    67

    Re: Binary to ASCII number conversion algorithms

    Quote Originally Posted by Mutilated1
    Couldn't get much simpler than this...

    #define __toascii(_c) ( (_c) & 0x7f )
    Well... all I can say is that doesn't filter out non-ASCII control codes. Nor does it convert binary numbers to ASCII decimal.

    Can anyone else share their knowledge? I can't seem to find anything on this subject online.

  5. #5
    Join Date
    May 2004
    Location
    Pell City, Alabama
    Posts
    126

    Re: Binary to ASCII number conversion algorithms

    Oh, so you want to take a binary 0000000000000010 and convert it to "2"? Is that what you want? Your first post didn't say anything about non-ASCII control codes either.
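    For that single-digit case the conversion is just an offset: ASCII digits start at '0' (0x30), so adding '0' to a value 0-9 yields its character. A minimal sketch (the helper name is my own, not from the thread; multi-digit values need a full algorithm like the one the OP describes):

    ```c
    /* A single binary value 0..9 becomes its ASCII character by adding '0' (0x30). */
    char digit_to_ascii(unsigned n)
    {
        return (char)('0' + n);   /* e.g. 0x02 -> 0x32, the character '2' */
    }
    ```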

  6. #6
    Join Date
    Nov 2004
    Posts
    4

    Re: Binary to ASCII number conversion algorithms

    Quote Originally Posted by Mutilated1
    Oh, so you want to take a binary 0000000000000010 and convert it to "2"? Is that what you want? Your first post didn't say anything about non-ASCII control codes either.
    Something...
    I'm curious to know what algorithms are known for converting binary numbers to ASCII decimal.

  7. #7
    Join Date
    Jun 2007
    Posts
    1

    Re: Binary to ASCII number conversion algorithms

    #include <iostream>

    int main()
    {
        char buffer[8] = "0010000";   // ASCII '0'/'1' digits, most significant first
        int iVal = 0;

        for (int i = 0; i < 7; i++)
        {
            iVal = (iVal << 1) + (buffer[i] & 1);   // '0' & 1 == 0, '1' & 1 == 1
        }
        std::cout << iVal << std::endl;   // 0010000 binary = 16
        return 0;
    }
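    Note the code above goes the other direction (an ASCII binary string into an integer). For the original question, a well-known mul/div-free algorithm is double dabble (shift-and-add-3), which walks the bits most-significant first, the mirror image of the ones-place-first routine the OP remembers. A sketch in C (16-bit input assumed; not from the thread):

    ```c
    #include <string.h>

    /* Double dabble: convert a 16-bit value to ASCII decimal using only shifts,
       adds and compares. Before each left shift, any BCD digit >= 5 gets +3 so
       the shift carries correctly into the next decimal place. */
    static void double_dabble(unsigned short n, char *out)
    {
        unsigned char bcd[5] = {0};   /* 5 BCD digits cover 0..65535 */
        int i, d, j = 0, started = 0;

        for (i = 15; i >= 0; i--) {
            for (d = 0; d < 5; d++)                  /* adjust step      */
                if (bcd[d] >= 5) bcd[d] += 3;
            for (d = 4; d > 0; d--)                  /* shift BCD left,  */
                bcd[d] = (unsigned char)(((bcd[d] << 1) | (bcd[d - 1] >> 3)) & 0x0F);
            bcd[0] = (unsigned char)(((bcd[0] << 1) | ((n >> i) & 1)) & 0x0F);
        }                                            /* bring in bit i   */

        for (d = 4; d >= 0; d--) {                   /* emit, skipping   */
            if (bcd[d]) started = 1;                 /* leading zeros    */
            if (started) out[j++] = (char)('0' + bcd[d]);
        }
        if (!j) out[j++] = '0';
        out[j] = '\0';
    }
    ```

    This maps nicely onto 6502-style code: the BCD "register" is a few bytes shifted with ROL, and the adjust step is compares and adds, no mul or div anywhere.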
