bas:
Currently in C, if you want to write portable code, you often avoid plain int, since its width changes between platforms. Other recent languages (like Rust) do still have a general pointer-sized integer type (isize/usize in Rust). I'm not sure whether there is a real performance penalty (and how big it is) for using a 32-bit integer on a 64-bit system. If there is a big one, we should introduce a generic 'int'-like type.
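
To make the current situation concrete, here is a minimal C sketch of the usual workaround: reaching for the fixed-width types from <stdint.h> instead of plain int (the variable names are just for illustration).

#include <stdint.h>
#include <stdio.h>

int main(void) {
    /* plain 'int' is only guaranteed to be at least 16 bits wide, so
       portable C code tends to use the fixed-width types instead. */
    int32_t samples = 48000;                 /* exactly 32 bits everywhere */
    int64_t total   = (int64_t)samples * 60; /* widen before multiplying   */

    printf("sizeof(int)     = %zu\n", sizeof(int));
    printf("sizeof(int32_t) = %zu\n", sizeof(int32_t));
    printf("total           = %lld\n", (long long)total);
    return 0;
}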

lerno:
I would use it exclusively for loops, BTW; for any data structure I'd use explicit sizes.

I did a little research, and it seems the question is a bit deeper than I first thought. The choice between int/i32/i64 affects register allocation, so performance depends a bit on the processor.

Something "good to have" out of the box is getting information form the compiler directly into the macro system to know things like data width, # and types of registers available, endianness and so on.

bas:
It's common to store pointers in integers. Code like that often keeps track of what the value means: it could be a plain number or a pointer.
For code like this, having a usize/isize type is a real requirement.
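
A minimal sketch of that pattern in C terms, using uintptr_t as the usize-like type. The value struct and helper names are made up for illustration, and the tagging trick assumes pointers are at least 2-byte aligned so their low bit is free.

#include <assert.h>
#include <stdint.h>

/* A tagged value: the low bit marks a small integer, otherwise the bits
   are a pointer. This only works because uintptr_t is wide enough to
   hold a pointer, i.e. exactly what a usize/isize type provides. */
typedef struct { uintptr_t bits; } value;

static value from_int(intptr_t n) { return (value){ ((uintptr_t)n << 1) | 1u }; }
static value from_ptr(void *p)    { return (value){ (uintptr_t)p }; }
static int   is_int(value v)      { return (v.bits & 1u) != 0; }
static void *as_ptr(value v)      { return (void *)v.bits; }

int main(void) {
    int x = 42;
    value a = from_int(7);
    value b = from_ptr(&x);
    assert(is_int(a));
    assert(!is_int(b) && *(int *)as_ptr(b) == 42);
    return 0;
}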

lerno:
I would have preferred using the well-known Java convention of:

byte = i8
short = i16
int = i32
long = i64
float = f32
double = f64

Then extend to unsigned:

char = u8
ushort = u16
uint = u32
ulong = u64

However, if we want to support i/u24, i/u48, and f128, then the numeric naming scheme is better; once numbers get added, I prefer the shorter names. For the real horror example, take NSUInteger in Objective-C on the Mac: it is 32 or 64 bits depending on the architecture. Aside from the obvious 32/64-bit problem, the real pain is typing four uppercase letters followed by lowercase ones. It's hard to type as well as overly long.

lerno:
I've recently discovered that i32 can be significantly faster than a "register-sized" int.
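
Without claiming numbers, here is a minimal C sketch of where the width choice tends to show up; whether sum32 actually beats sum64 depends on the compiler and CPU, so it's worth measuring on the real target.

#include <stddef.h>
#include <stdint.h>

/* Summing an array with 32-bit vs 64-bit elements. With 32-bit elements a
   SIMD register holds twice as many lanes, so an auto-vectorised sum32 can
   touch more values per instruction; on the other hand, a 32-bit loop index
   on a 64-bit target may cost an extra extension per iteration. */
int64_t sum32(const int32_t *a, size_t n) {
    int64_t s = 0;
    for (size_t i = 0; i < n; i++) s += a[i];
    return s;
}

int64_t sum64(const int64_t *a, size_t n) {
    int64_t s = 0;
    for (size_t i = 0; i < n; i++) s += a[i];
    return s;
}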
