Hi, I'm a newbie C programmer. I've been doing some small exercises using the gcc compiler on OS X, and I'm a bit confused.
In C I was told that when you declare an int but do not assign it a value (int i;), it will take on an arbitrary value.
However, on my Mac, int i; gives i the value 0, more like Java.
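For reference, here is a minimal sketch of the kind of thing I'm compiling (the exact code is just illustrative):

    #include <stdio.h>

    int main(void)
    {
        int i;                    /* declared but never assigned */
        printf("i = %d\n", i);    /* on my Mac this prints 0 */
        return 0;
    }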
Could anyone clarify this for me? Are ints supposed to assume an arbitrary value?
Thank you
Oxy
we are all of us living in the gutter.
But some of us are looking at the stars.