I'm curious to know what algorithms are known for converting binary numbers to ASCII decimal.
When I first got started in assembly (6502), I came across someone's code that had a 24-bit conversion routine. Since the 6502 has no mul or div opcodes, nor floating-point instructions, all computation was done with addition, subtraction, and bit shifting. What was most interesting was that the numbers were converted to ASCII starting with the ones place and working up. I was easily able to adapt it to n-bit conversion.
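To illustrate the behavior I remember (digits emitted ones place first, using only subtraction and shifts), here's a minimal C sketch of one way to get the same result: repeated division by 10 done with shift-and-subtract long division. This is my own reconstruction for illustration, not the original 6502 routine.

```c
#include <stdint.h>
#include <stdio.h>

/* Divide a 32-bit value by 10 using only shifts and subtraction
   (restoring long division). Returns the quotient; the remainder
   is stored through *rem. */
static uint32_t div10(uint32_t n, uint32_t *rem)
{
    uint32_t quotient = 0;
    uint32_t remainder = 0;

    for (int i = 31; i >= 0; i--) {
        /* Shift the next dividend bit into the remainder. */
        remainder = (remainder << 1) | ((n >> i) & 1);
        quotient <<= 1;
        if (remainder >= 10) {      /* trial subtraction */
            remainder -= 10;
            quotient |= 1;
        }
    }
    *rem = remainder;
    return quotient;
}

/* Convert an unsigned 32-bit value to ASCII decimal. Digits are
   produced from the ones place upward, then reversed for printing. */
static void to_decimal(uint32_t n, char *buf)
{
    char tmp[11];               /* 2^32 - 1 has at most 10 digits */
    int  len = 0;

    do {
        uint32_t digit;
        n = div10(n, &digit);
        tmp[len++] = (char)('0' + digit);   /* ones place first */
    } while (n != 0);

    for (int i = 0; i < len; i++)           /* reverse into caller's buffer */
        buf[i] = tmp[len - 1 - i];
    buf[len] = '\0';
}

int main(void)
{
    char buf[12];
    to_decimal(16777215u, buf);     /* largest 24-bit value */
    printf("%s\n", buf);            /* prints 16777215 */
    return 0;
}
```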
Unfortunately, the code for this routine got lost somewhere, and I never understood it well enough to know how it worked.
So I'm curious whether others know this algorithm or ones like it. I'm mainly interested in efficiency.