Hi,
I'm having trouble finding a compiler option that will force an int declaration to use 32 bit integers in x86-64 mode. I am bandwidth constrained, and I'm thinking it might be useful if I could cut down the size of the integers. If I am not bandwidth constrained, would using 32 bit integer operations slow me down in x86-64 mode?
Thanks,
-Jeff
6 Replies
Quoting - jeff_keasler
If I am not bandwidth constrained, would using 32 bit integer operations slow me down in x86-64 mode?
Quoting - jeff_keasler
Hi,
I'm having trouble finding a compiler option that will force an int declaration to use 32 bit integers in x86-64 mode. I am bandwidth constrained, and I'm thinking it might be useful if I could cut down the size of the integers. If I am not bandwidth constrained, would using 32 bit integer operations slow me down in x86-64 mode?
Thanks,
-Jeff
Could you try "-m32"? It generates code for a 32-bit environment, which probably sets int, long, and pointer to 32 bits.
Quoting - srimks
Could you try "-m32" which generates code for a 32-bit, probably the 32-bit environment sets int, long and pointer to 32 bits.
Can you please tell me which compiler you are using this -m32 switch with?
Quoting - manugupt1
Can you please tell me in which compiler are you using this switch -m32
Probably both GNU and ICC support it.
Quoting - srimks
Both GNU & ICC supports pobably..
No, gcc -m32 selects the 32-bit gcc compiler, if installed. That's a prerequisite for the icc ia32 compiler, but icc doesn't act on -m32. Anyway, it has little bearing on the original question.
long int does change from a 32-bit type in the 32-bit compiler to a 64-bit type in the 64-bit compiler, on Linux. I guess the idea is that long is the largest type with efficient native support.
I suppose the original question was somewhat ambiguous, as it didn't specify clearly that 64-bit mode was intended, although it didn't make sense otherwise.
Quoting - tim18
No, gcc -m32 selects the 32-bit gcc compiler, if installed. That's a prerequisite for the icc ia32 compiler, but icc doesn't act on -m32. Anyway, it has little bearing on the original question.
long int does change from a 32-bit type in the 32-bit compiler to a 64-bit type in the 64-bit compiler, on Linux. I guess the idea is that long is the largest type with efficient native support.
I suppose the original question was somewhat ambiguous, as it didn't specify clearly that 64-bit mode was intended, although it didn't make sense otherwise.
Sorry, I had a small test where I had absentmindedly printed sizeof(double) when I had meant to type sizeof(int). I was under the impression that in x86-64 mode, int was 64-bit. I'm actually a bit more troubled now, since we will soon be scaling to hundreds of thousands of processors where I work, and int will need to be 64-bit for those runs.
Thanks to everyone who tried to answer my original bogus question.
-Jeff