
Fixed Data Size


Miquella

Programmer
Mar 11, 2005
41
US
I've been looking for the best way to ensure a data type size, cross-platform. I need to ensure that the integers I'm using are 8 and 32 bits.

I've found Microsoft's __intn, and OS X's SIntn/UIntn (which I think will also work on Unix, but I haven't had a chance to check yet).

Is there a more basic, possibly C++ standard way to do this?

Thanks in advance,
Miq
 
__intn is a Microsoft thing - it may work on some versions of Unix, but not all. I don't know of a standard way - it is normally a matter of company culture, and every company is different. The easiest approach is to typedef.
Code:
typedef unsigned char UInt8;
typedef signed char SInt8;   /* plain char may be unsigned on some compilers */
typedef unsigned short UInt16;
typedef short SInt16;
typedef unsigned long UInt32;
typedef long SInt32;
#ifdef MICROSOFT /* whatever you use to identify MS */
typedef __int64 SInt64;
typedef unsigned __int64 UInt64;
#else
typedef long long SInt64;
typedef unsigned long long UInt64;
#endif
That will work on both platforms. Just don't use int/unsigned int for anything that needs a fixed size - reserve them for things like loop control variables, where int is fine because it is the most efficient type in terms of code generation.

I find that SInt and UInt lead to a lot of typos - I keep typing Sint and Uint. It would be better as SINT/UINT or sint/uint. SInt may look prettier, but it is a pig to type, especially when you have tons of them.
 
Thank you for your response. However, it doesn't look like Microsoft supports SIntn/UIntn, so I'll have to use the Microsoft-specific __intn. But I can't find anything with a defined size on Linux systems - apparently UIntn doesn't work there either.

I have to avoid using the pre-defined char/int/long types, because they can change from one compiler to another, and the things that I'm using them for have to have defined lengths.

Thanks in advance again,
Miq
 
On all the compilers I've used since I first started using C in 1986,

char is always 8 bits even though sizeof(char) can vary.
short is always 16 bits
long is always 32 bits
long long, if it is supported is always 64 bits

I don't think you will have a problem as long as you don't use int. int is the machine's natural word size, which varies from machine to machine.

It is only different if you are using something strange like a Honeywell level 66 which has 36 bit words or a CDC which has 60 bit words. That is pretty rare nowadays.
 
Hmm... interesting! Thank you very much!

I'll give that a try as soon as I get a chance!

Many thanks,
Miq
 
> char is always 8 bits even though sizeof(char) can vary.
Wrong way round I'm afraid.
sizeof(char) is by definition always 1, irrespective of the number of bits in it. It refers to the smallest quantity of memory which has a unique address, and is the base quantity of all memory allocations.

The actual number of bits in a char is defined by CHAR_BIT in limits.h.

Also in limits.h are macro constants for short, int and long.

> short is always 16 bits
The standard states all limits as minimum ranges, not absolute ranges.

--
 
Okay, but is limits.h a standardized include file? And on a further note, am I allowed to override these values?

Thanks again,
Miq
 
Yes <limits.h> (or <climits> in C++) is a standard header file.
If you mean changing the values in the standard header files, I would strongly advise against it.

From what I've always read, the data sizes are as follows:

char: 8 bits
short: 16 bits
int: one machine word (16 bits on 16-bit DOS, 32 bits on 32-bit OSes)
long: 32 bits
 
Okay, so we seem to have established that:

short: is okay at 16 bits
long: is okay at 32 bits
long long: is sometimes okay at 64 bits

But that char can vary in the number of bits that it contains. Is there a different way?

Is perhaps byte cross-platform?

Thanks again guys,
Miq
 
> But that char can vary in the number of bits that it contains.
Every type can vary in the number of bits from one platform to another. Some embedded DSP chips for example have 32 bit characters (along with 32 bit shorts, 32 bit ints and 32 bit longs). Very strange to be sure but within the standards.

The standard only sets a MINIMUM value for the number of bits in each quantity. For chars, the minimum number of bits is 8.

> Is perhaps byte cross-platform?
There's no such thing as a byte, unless you typedef it.
But then what would you typedef it as?

--
 
Exactly - Salem has stated my question possibly better than I have. The standard only sets a minimum for the number of bits in each type.

I am looking for a way, on each platform, to specify the number of bits for our numeric variables. I am going to typedef each of the types that we use, but I need to know what to typedef them as. Again, any ideas, even if they are specific to each platform? I know __intn works for MS Windows. Any suggestions for Linux? Or Mac OS X? Unix?

Thanks to everyone who has contributed so far,
Miq
 
If you're not using any crazy platforms and are just planning on using the usual Windows & UNIX OSes, xwb's typedefs should be fine.
 