-
November 5th, 2004, 09:20 AM
#1
Binary to ASCII number conversion algorithms
I'm curious to know what algorithms are known for converting binary numbers to ASCII decimal.
When I first got started in assembly (on the 6502), I came across someone's code that had a 24-bit conversion routine. Since the 6502 has no mul or div opcode, and no floating-point instructions, all computation was done with addition, subtraction, and bit shifting. What was most interesting was that the numbers were converted to ASCII starting with the ones place and working up. I was easily able to adapt it to n-bit number conversion.
Unfortunately, the code for this routine got lost somewhere, and I never understood it well enough to know how it worked.
So I'm curious whether others know this algorithm or others like it. I'm looking for efficiency.
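My best guess at the technique, from what I remember: a bitwise long division by 10 done purely with shifts and subtracts, emitting the ones digit first on each pass. Here is a C sketch of that idea (my reconstruction, not the lost routine; to_decimal is just a name I picked):

#include <stdio.h>

/* Sketch: unsigned value -> ASCII decimal using only shifts,
   adds, and subtracts. Digits come out ones place first, then
   the buffer is reversed, matching the routine described above. */
static void to_decimal(unsigned long v, char *out)
{
    char tmp[24];
    int n = 0;
    do {
        /* restoring long division: q = v / 10, r = v % 10,
           computed one bit at a time with shift and subtract */
        unsigned long q = 0, r = 0;
        for (int bit = (int)(sizeof v * 8) - 1; bit >= 0; bit--) {
            r = (r << 1) | ((v >> bit) & 1);
            q <<= 1;
            if (r >= 10) { r -= 10; q |= 1; }
        }
        tmp[n++] = (char)('0' + r);   /* ones place first */
        v = q;
    } while (v != 0);
    while (n > 0)                     /* reverse into the output */
        *out++ = tmp[--n];
    *out = '\0';
}

int main(void)
{
    char buf[24];
    to_decimal(12345678UL, buf);
    printf("%s\n", buf);              /* prints 12345678 */
    return 0;
}

On the 6502, the same inner loop would be built from ASL/ROL for the shifts and SBC for the subtracts, working a byte at a time.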
-
November 18th, 2004, 04:15 PM
#2
Re: Binary to ASCII number conversion algorithms
One more shot... Can anyone provide some input?
-
November 18th, 2004, 04:44 PM
#3
Re: Binary to ASCII number conversion algorithms
Couldn't get much simpler than this...
#define __toascii(_c) ( (_c) & 0x7f )
-
November 18th, 2004, 09:08 PM
#4
Re: Binary to ASCII number conversion algorithms
Originally Posted by Mutilated1
Couldn't get much simpler than this...
#define __toascii(_c) ( (_c) & 0x7f )
Well... all I can say is that it doesn't filter out control codes, nor does it convert binary numbers to ASCII decimal.
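To make that concrete, here's a quick demonstration (the values are just examples I picked):

#include <stdio.h>

#define __toascii(_c) ( (_c) & 0x7f )

int main(void)
{
    /* masking to 7 bits only folds a value into the ASCII range;
       it does not turn a number into its decimal digits */
    printf("%d\n", __toascii(0x82));   /* prints 2: that's STX, a control code */
    printf("%c\n", __toascii(178));    /* prints '2' only because 178 & 0x7f == 50 == '2' */
    return 0;
}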
Can anyone else share their knowledge? I can't seem to find anything on this subject anywhere online.
-
November 19th, 2004, 01:47 PM
#5
Re: Binary to ASCII number conversion algorithms
Oh, so you want to take a binary 0000000000000010 and convert it to "2"? Is that what you want? Your first post didn't say anything about control codes either.
-
November 21st, 2004, 08:50 AM
#6
Re: Binary to ASCII number conversion algorithms
Originally Posted by Mutilated1
Oh, so you want to take a binary 0000000000000010 and convert it to "2"? Is that what you want? Your first post didn't say anything about control codes either.
Something like that, yes. As I said in my first post: I'm curious to know what algorithms are known for converting binary numbers to ASCII decimal.
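In C terms, what I want is the effect of this, but implemented by hand, since the 6502 has no divide instruction (a small illustration, not the algorithm itself):

#include <stdio.h>

int main(void)
{
    unsigned int value = 2;        /* binary 0000000000000010 */
    char buf[12];

    sprintf(buf, "%u", value);     /* buf now holds the ASCII text "2" */
    printf("%s\n", buf);
    return 0;
}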
-
June 2nd, 2007, 04:59 PM
#7
Re: Binary to ASCII number conversion algorithms
#include <stdio.h>

int main(void)
{
    /* convert an ASCII string of binary digits to its integer value */
    char buffer[8] = "0010000";
    int iVal = 0;
    for (int i = 0; i < 7; i++)
    {
        iVal = (iVal << 1) + (buffer[i] & 1);
    }
    printf("%d\n", iVal);   /* prints 16 */
    return 0;
}
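Note that this goes the opposite direction (an ASCII string of binary digits to an integer). For the direction asked about, binary to ASCII decimal without mul or div, one classic technique is the "double dabble" (shift-and-add-3) algorithm. A C sketch of it for 16-bit values (double_dabble16 is just an illustrative name, and leading zeros are not suppressed):

#include <stdio.h>

/* Sketch of double dabble: binary -> packed BCD -> ASCII using
   only shifts, adds, and compares, well suited to CPUs without
   mul/div such as the 6502. */
static void double_dabble16(unsigned short v, char out[6])
{
    unsigned char bcd[3] = {0, 0, 0};   /* six nibbles, enough for "65535" */

    for (int i = 0; i < 16; i++) {
        /* before each shift, add 3 to every BCD nibble that is 5 or more */
        for (int j = 0; j < 3; j++) {
            if ((bcd[j] & 0x0F) >= 0x05) bcd[j] += 0x03;
            if ((bcd[j] & 0xF0) >= 0x50) bcd[j] += 0x30;
        }
        /* shift the combined bcd:v register pair left one bit */
        int carry = (v & 0x8000) != 0;
        v = (unsigned short)(v << 1);
        for (int j = 0; j < 3; j++) {   /* bcd[0] is the least significant byte */
            int next = (bcd[j] & 0x80) != 0;
            bcd[j] = (unsigned char)((bcd[j] << 1) | carry);
            carry = next;
        }
    }
    /* unpack the BCD nibbles to ASCII, most significant digit first */
    out[0] = (char)('0' + (bcd[2] & 0x0F));
    out[1] = (char)('0' + (bcd[1] >> 4));
    out[2] = (char)('0' + (bcd[1] & 0x0F));
    out[3] = (char)('0' + (bcd[0] >> 4));
    out[4] = (char)('0' + (bcd[0] & 0x0F));
    out[5] = '\0';
}

int main(void)
{
    char buf[6];
    double_dabble16(54321, buf);
    printf("%s\n", buf);   /* prints 54321 */
    return 0;
}

The add-3 adjustment before each shift is what keeps every nibble a valid decimal digit as the value doubles: any nibble of 5 or more would exceed 9 when doubled, and adding 3 first makes the overflow carry into the next decimal place correctly.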