-
May 14th, 2009, 05:11 PM
#1
'long long' not standard
Hi,
I was doing a bit of reading into the C++03 standard and I noticed that 'long long' is not standard. If it is not standard, why have so many compilers (Especially MSVC) made up a non-standard integral type?
On 32-bit Windows, 'unsigned int' happens to be 32 bits, and 'unsigned long' is also 32 bits. They could have eliminated the need for 'long long' entirely by making 'long' 64 bits on 32-bit architectures.
Why didn't things turn out this way? I want to be as standards conforming as possible in my code, which means I can't use 'long long'. However, I need access to 64-bit integrals on 32-bit architectures (Windows XP, in this example). Is there anything I can do?
Thanks.
--MrDoomMaster
--C++ Game Programmer
Don't forget to rate me if I was helpful!
-
May 14th, 2009, 05:18 PM
#2
Re: 'long long' not standard
Just use long long. C++0x will standardize it anyway, and you don't have any better options.
By the way, long is a 32-bit type on 64-bit Windows, too. That's the part that's *really* strange. (It's 64 bits on 64-bit Linux.)
-
May 14th, 2009, 05:28 PM
#3
Re: 'long long' not standard
Lang Lang is a fabulous pianist. I heard him play the first Tchaikovsky concerto with the Cleveland Orchestra a few seasons ago.
-
May 14th, 2009, 05:45 PM
#4
Re: 'long long' not standard
Originally Posted by Lindley
Just use long long. C++0x will standardize it anyway, and you don't have any better options.
By the way, long is a 32-bit type on 64-bit Windows, too. That's the part that's *really* strange. (It's 64 bits on 64-bit Linux.)
On 64-bit Windows, long is only 32 bits? What the? That means 'int' must be less than or equal to 32 bits!!
--MrDoomMaster
--C++ Game Programmer
Don't forget to rate me if I was helpful!
-
May 14th, 2009, 09:31 PM
#5
Re: 'long long' not standard
Technically it's a property of the compiler rather than OS:
http://en.wikipedia.org/wiki/64-bit#...ic_data_models
-
May 15th, 2009, 09:20 AM
#6
Re: 'long long' not standard
Originally Posted by 0xC0000005
Lang Lang is a fabulous pianist. I heard him play the first Tchaikovsky concerto with the Cleveland Orchestra a few seasons ago.
I think he's a bit overrated. I also heard him a couple of years back and thought he was good but not fabulous. Or maybe it's just that here in Germany too much fuss is being made about him...
More computing sins are committed in the name of efficiency (without necessarily achieving it) than for any other single reason - including blind stupidity. --W.A.Wulf
Premature optimization is the root of all evil --Donald E. Knuth
Please read Information on posting before posting, especially the info on using [code] tags.
-
May 16th, 2009, 12:25 PM
#8
Re: 'long long' not standard
Originally Posted by 0xC0000005
Lang Lang is a fabulous pianist. I heard him play the first Tchaikovsky concerto with the Cleveland Orchestra a few seasons ago.
And Ling Ling was a famous zoo animal.
Anyway, the "bitness" used by a compiler is usually tied to the CPU architecture. However nothing stops a compiler to have integers of any bits and on any platform. In these cases, the compiler would emulate an "n-bit" integer via software, not hardware,
This was the case with old DOS compilers that had 32-bit longs. Even though the Intel architecture at that time was 16-bit, 32-bit longs were implemented by using emulation (if you debugged into the calls that generated the 32-bit longs, you actually could see a 10 or so line assembly routine just to generate the long numbers).
Regards,
Paul McKenzie
-
May 18th, 2009, 02:30 PM
#9
Re: 'long long' not standard
To expand on what Paul said, IIRC, the standard only says:
Code:
sizeof(char) <= sizeof(short) <= sizeof(int) <= sizeof(long)
So, theoretically, if a compiler implements a char as 7 bits, your longs could also be 7 bits!
Viggy
-
May 18th, 2009, 02:51 PM
#10
Re: 'long long' not standard
Originally Posted by MrViggy
So, theoretically, if a compiler implements a char as 7 bits, your longs could also be 7 bits!
As I pointed out in the input unsigned long from file thread, there are minimum ranges as well. A compiler that implements a char as 7 bits would not conform to the C++ standard, since the C++ standard refers to the C standard, which mandates that CHAR_BIT be at least 8.
-
May 18th, 2009, 03:54 PM
#11
Re: 'long long' not standard
Originally Posted by MrDoomMaster
I need access to 64-bit integrals on 32-bit architectures (Windows XP, in this example). Is there anything I can do?
You'll have to look at the compiler specification. Say you use the Microsoft compiler:
http://msdn.microsoft.com/en-us/library/cc953fe1.aspx
Then you define your own long primitive somewhere globally, like
typedef __int64 LongInt;
You use LongInt in places in the code where you want to make sure you have a 64-bit int. If you change compilers, you just adjust the one global typedef.
Last edited by nuzzle; May 18th, 2009 at 04:51 PM.