
C files from Mac to Unix cause segmentation fault!


Oxymoron

Dear all,

Please help. I've been working really hard on a uni project and have finished it; my C program works fine when written, compiled, and run on my Mac iBook under Darwin with the gcc compiler that comes with it.
I always compile it with the -ansi flag too.
When I tried compiling the same files on a Linux machine, I first got a missing-newline warning on each of the three files, but they compiled and produced an executable.
Running that executable causes a segmentation fault, though!

Why? Surely source code is source code? I'm using Unix-style linefeeds and told my editor to save the source file as C source.

Has anyone got any ideas? I'd be soooo grateful!
Thanks every1,
Oxy

We are all of us living in the gutter.
But some of us are looking at the stars.
 
There are still bugs in your code.

If you take a random pointer, say
[tt]int *a;[/tt]
and try to use it, say
[tt]*a = 42;[/tt]

Then whether it crashes or not is also random. Unfortunately, you've equated "not crashing" with "bug free".

You have to associate each pointer with a valid memory location before you try to dereference it.
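
For instance (just a sketch; the variable names are made up), either point the pointer at an existing object or give it storage of its own before writing through it:
[tt]
#include <stdlib.h>

int main(void)
{
    int x = 0;
    int *p;
    int *q;

    p = &x;                   /* p now refers to a real int */
    *p = 42;                  /* fine: this writes to x */

    q = malloc(sizeof *q);    /* q gets its own storage */
    if (q != NULL) {
        *q = 42;              /* fine: this writes to the allocated int */
        free(q);
    }
    return 0;
}
[/tt]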

Another popular mistake is forgetting to allocate enough memory.
[tt]int *array = malloc( 10 );[/tt]
This does NOT allocate space for 10 integers; malloc counts in bytes, so it allocates just 10 of them. Using it as though it held 10 integers will have unpredictable consequences. You can usually tell that you've done something like this because, when it crashes, it usually crashes in some unrelated malloc/free call elsewhere in the program.

[tt]int *array = malloc( 10 * sizeof *array );[/tt]
is the correct way to allocate it, and it ensures you get enough memory as well.
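
Putting it together (again only a sketch), allocating room for 10 ints with the usual checks looks something like this:
[tt]
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    int *array;
    int i;

    array = malloc(10 * sizeof *array);   /* room for 10 ints, not 10 bytes */
    if (array == NULL) {
        fprintf(stderr, "out of memory\n");
        return 1;
    }
    for (i = 0; i < 10; i++) {
        array[i] = i * i;                 /* safe: indices 0..9 are inside the block */
    }
    free(array);                          /* hand it back when you're done */
    return 0;
}
[/tt]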


If you go back to your Darwin version and try, say, turning on the optimiser:
[tt]gcc -W -Wall -ansi -O2 prog.c[/tt]
this will have the effect of rearranging all the random memory locations you're using, and could well cause a segmentation fault there too.

Depending on how you view these things, you could say:
You were lucky that it ran on Darwin.
You were lucky that Linux found a bug you didn't know about.

--
 
