  1. #1
    Join Date
    Dec 2003
    Location
    Middletown, DE
    Posts
    67

    Binary to ASCII number conversion algorithms

    I'm curious to know what algorithms are known for converting binary numbers to ASCII decimal.

    When I first got started in assembly (6502), I came across someone's code that had a 24-bit conversion routine. Since the 6502 has no mul or div op-codes, and no floating-point instructions, all computation was done with addition, subtraction, and bit shifting. What was most interesting was that the numbers were converted to ASCII starting with the ones place and working up. I was easily able to adapt it to n-bit number conversion.

    Unfortunately, the code to this routine got lost somewhere and I never understood it enough to know how it worked.

    So I was curious to know if others know this algorithm or others like it. I'm looking for efficiency.

  2. #2
    Join Date
    Dec 2003
    Location
    Middletown, DE
    Posts
    67

    Re: Binary to ASCII number conversion algorithms

    One more shot... Can anyone provide some input?

  3. #3
    Join Date
    May 2004
    Location
    Pell City, Alabama
    Posts
    126

    Re: Binary to ASCII number conversion algorithms

    Couldn't get much simpler than this...

    #define __toascii(_c) ( (_c) & 0x7f )

  4. #4
    Join Date
    Dec 2003
    Location
    Middletown, DE
    Posts
    67

    Re: Binary to ASCII number conversion algorithms

    Quote Originally Posted by Mutilated1
    Couldn't get much simpler than this...

    #define __toascii(_c) ( (_c) & 0x7f )
    Well... all I can say is that it doesn't filter out non-ASCII control codes, nor does it convert binary numbers to ASCII decimal.

    Can anyone else spare their knowledge? I can't seem to find anything on this subject anywhere online.

  5. #5
    Join Date
    May 2004
    Location
    Pell City, Alabama
    Posts
    126

    Re: Binary to ASCII number conversion algorithms

    Oh, so you want to take binary 0000000000000010 and convert it to "2"? Is that what you want? Your first post didn't say anything about non-ASCII control codes either.

  6. #6
    Join Date
    Nov 2004
    Posts
    4

    Re: Binary to ASCII number conversion algorithms

    Quote Originally Posted by Mutilated1
    Oh, so you want to take binary 0000000000000010 and convert it to "2"? Is that what you want? Your first post didn't say anything about non-ASCII control codes either.
    Something...
    I'm curious to know what algorithms are known for converting binary numbers to ASCII decimal.

  7. #7
    Join Date
    Oct 2002
    Posts
    1,134

    Re: Binary to ASCII number conversion algorithms

    It sounds like your major problem is dividing by 10 without using a DIV instruction. I came across the following algorithm by R.J. Mitchell when I was looking into a similar problem. It may help you:

    Code:
    unsigned int dividend = 1234;     // The number to be converted
    unsigned int divisor = 10;
    unsigned int answer = 0;
    unsigned int count;
    unsigned int MSB = 0x80000000;    // The most significant bit of a 32-bit unsigned number
    
    // Normalize the divisor 10 for the size of an integer, i.e. shift it up
    // until its top bit reaches the MSB. This is a constant, of course, but
    // I've included its calculation here to show how it's arrived at.
    count = 1;
    while( (divisor & MSB) == 0 )
    {
      divisor <<= 1;
      ++count;
    }
    
    // Standard restoring (shift-and-subtract) long division: each pass makes
    // one trial subtraction and shifts one quotient bit into "answer".
    answer = 0;
    while( count > 0 )
    {
      answer <<= 1;
      if( divisor <= dividend )
      {
        dividend -= divisor;
        ++answer;
      }
      divisor >>= 1;
      --count;
    }
    // Quotient is in "answer", remainder is in "dividend"
    The result of the division is in "answer", and the remainder is in "dividend".

    To convert a binary number to ASCII, you would iteratively apply the above algorithm to the binary number and add hex 30 to each remainder.
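
    As an illustration, a minimal C sketch of that full conversion might look like the following (a 32-bit unsigned int is assumed, and the names divmod10 and utoa10 are made up for the example):

    Code:
    #include <stdio.h>
    
    // Divide by 10 using only shifts and subtraction, as in the routine
    // above; returns the quotient and stores the remainder through *rem.
    static unsigned int divmod10(unsigned int dividend, unsigned int *rem)
    {
        unsigned int divisor = 10;
        unsigned int answer = 0;
        unsigned int count = 1;
    
        while( (divisor & 0x80000000u) == 0 )
        {
            divisor <<= 1;
            ++count;
        }
        while( count > 0 )
        {
            answer <<= 1;
            if( divisor <= dividend )
            {
                dividend -= divisor;
                ++answer;
            }
            divisor >>= 1;
            --count;
        }
        *rem = dividend;
        return answer;
    }
    
    // Convert an unsigned value to an ASCII decimal string. The digits come
    // out ones place first, so they are collected and then reversed.
    void utoa10(unsigned int value, char *buf)
    {
        char tmp[10];               // enough digits for a 32-bit value
        int i = 0;
        do
        {
            unsigned int rem;
            value = divmod10(value, &rem);
            tmp[i++] = (char)(rem + 0x30);   // add hex 30 to each remainder
        } while( value != 0 );
        while( i > 0 )              // reverse into the caller's buffer
            *buf++ = tmp[--i];
        *buf = '\0';
    }
    
    int main(void)
    {
        char buf[11];
        utoa10(1234u, buf);
        printf("%s\n", buf);        // prints 1234
        return 0;
    }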

    After looking at this, and other associated overhead, I came to the conclusion that the hardware DIV instruction is a lot more efficient....
    Regards
    Robert Thompson

  8. #8
    Join Date
    Dec 2003
    Location
    Middletown, DE
    Posts
    67

    Re: Binary to ASCII number conversion algorithms

    Interesting bit of code. However, you are all still misunderstanding the requirements. I merely stated that I knew of a code snippet that did it without div/mul because the processor didn't support them. Obviously, it's fine to use mul/div, or even fmul/fdiv, on today's Intel processors. I just want to see what other algorithms are known for binary to ASCII number conversion, meaning converting binary 1234 to the string "1234".

    So we know one obvious method of converting, based on the above code: divide by powers of 10, starting at the highest place, to determine each decimal digit of the ASCII string, and subtract as you go. E.g., with 1234, divide by 1000, then 100, then 10, and the remainder is the ones digit.
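
    A minimal sketch of that highest-place-first method in C (the name utoa10_msd is invented here, a 32-bit unsigned int is assumed, and the ordinary / operator stands in for the divisions) might look like:

    Code:
    #include <stdio.h>
    
    // Highest-place-first conversion: divide by each power of 10 in turn,
    // emit the digit, and subtract as you go.
    void utoa10_msd(unsigned int value, char *buf)
    {
        unsigned int power = 1000000000u;   // largest power of 10 in 32 bits
        int started = 0;
    
        for( ; power > 0; power /= 10 )
        {
            unsigned int digit = value / power;   // 0..9
            value -= digit * power;               // subtract as you go
            if( digit != 0 || started || power == 1 )
            {
                *buf++ = (char)('0' + digit);
                started = 1;                      // suppress leading zeros
            }
        }
        *buf = '\0';
    }
    
    int main(void)
    {
        char buf[11];
        utoa10_msd(1234u, buf);
        printf("%s\n", buf);   // prints 1234
        return 0;
    }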

    The method in the lost code snippet I mentioned went the opposite direction, by determining the number starting with the one's place and working upwards. I thought this was pretty interesting. I'd like to find the algorithm to do it, since it appeared to be a very efficient method.

  9. #9
    Join Date
    Oct 2002
    Posts
    1,134

    Re: Binary to ASCII number conversion algorithms

    The code actually works the way you initially described. Dividing 1234 by 10 gives a quotient of 123 and a remainder of 4 (low order digit). Dividing the quotient by 10 gives a new quotient of 12 and a remainder of 3 (next low-order digit), etc.

    I suspect your 6502 code used the fact that some shift operations work with two registers "end-to-end", while other operations work with a single register. The division by 10 - which is the real crux here - then becomes easier. There are similar operations in the Intel world, too, but my assembler is far too rusty to attempt programming such an algorithm.
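
    For illustration, that "two registers end-to-end" style of long division might look something like this C sketch (a 32-bit unsigned int is assumed and the name div10_longdiv is made up here); the remainder and the dividend/quotient share one left shift per bit, the way chained ROLs do on the 6502:

    Code:
    #include <stdio.h>
    
    // Long division by 10 with the remainder and dividend shifted left
    // "end-to-end": quotient bits fill the dividend register from the low
    // end as the dividend bits leave it from the high end.
    unsigned int div10_longdiv(unsigned int dividend, unsigned int *rem)
    {
        unsigned int remainder = 0;
        for( int i = 0; i < 32; ++i )
        {
            remainder = (remainder << 1) | (dividend >> 31);  // shared shift
            dividend <<= 1;
            if( remainder >= 10 )        // trial subtraction
            {
                remainder -= 10;
                dividend |= 1;           // quotient bit enters at the low end
            }
        }
        *rem = remainder;
        return dividend;                 // the quotient has replaced the dividend
    }
    
    int main(void)
    {
        unsigned int r;
        unsigned int q = div10_longdiv(1234u, &r);
        printf("%u r %u\n", q, r);       // prints 123 r 4
        return 0;
    }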

    As you note, R.J. Mitchell's algorithm is interesting, but hardly of any practical use (other than as an assignment to some poor unsuspecting Computer Science class).
    Regards
    Robert Thompson

  10. #10
    Join Date
    Dec 2003
    Location
    Middletown, DE
    Posts
    67

    Re: Binary to ASCII number conversion algorithms

    I hadn't thought of it that way. That's very straightforward. So then the only real optimization point is to divide by 10 to get a quotient and remainder.
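
    For the record, one well-known way to optimize exactly that step without a DIV instruction is the shift-and-add reciprocal approximation from Hacker's Delight. This isn't from the thread; the name divmod10_fast is made up, and a 32-bit unsigned int is assumed:

    Code:
    #include <stdio.h>
    
    // Approximate q = n / 10 with shifts and adds (the estimate is never
    // high, and low by at most 1), then fix it up from the remainder.
    unsigned int divmod10_fast(unsigned int n, unsigned int *rem)
    {
        unsigned int q = (n >> 1) + (n >> 2);        // q ~ n * 0.75
        q += (q >> 4);
        q += (q >> 8);
        q += (q >> 16);                              // q ~ n * 0.8
        q >>= 3;                                     // q ~ n / 10, maybe 1 low
        unsigned int r = n - (((q << 2) + q) << 1);  // r = n - q * 10
        if( r >= 10 )                                // correct the estimate
        {
            r -= 10;
            ++q;
        }
        *rem = r;
        return q;
    }
    
    int main(void)
    {
        unsigned int r;
        unsigned int q = divmod10_fast(1234u, &r);
        printf("%u r %u\n", q, r);   // prints 123 r 4
        return 0;
    }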

  11. #11
    Join Date
    Jun 2007
    Posts
    1

    Re: Binary to ASCII number conversion algorithms

    Code:
    #include <stdio.h>
    
    int main(void)
    {
        // Convert an ASCII binary string to its numeric value
        char buffer[8] = "0010000";
        int iVal = 0;
    
        for( int i = 0; i < 7; i++ )
        {
            iVal = (iVal << 1) + (buffer[i] & 1);   // '0' & 1 == 0, '1' & 1 == 1
        }
    
        printf("%d\n", iVal);   // prints 16
        return 0;
    }
