TheObserver
Programmer
I have a character array that ultimately ends up holding binary/character data. It has four elements, and of course each element is eight bits wide.
So I have (for example):
char length[4];
and the data therein is:
length[0] = 00000000
length[1] = 00000000
length[2] = 00111111
length[3] = 01111100
Or, alternately, as characters:
length[0] = " "
length[1] = " "
length[2] = "?"
length[3] = "|"
when output to the console.
I need to convert these into a number: not one number per element, but a single 32-bit number built from all four array elements, thus:
00000000000000000011111101111100
and then convert that into an int or a long.
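To make the goal concrete, here is a rough sketch of the kind of thing I'm picturing, assuming length[0] should end up as the most significant byte (the 0x3F and 0x7C values are just my example data from above):

#include <stdio.h>

int main(void)
{
    unsigned char length[4] = { 0x00, 0x00, 0x3F, 0x7C };  /* example data from above */

    /* Shift each byte into position and OR them together,
       treating length[0] as the most significant byte. */
    unsigned long value = ((unsigned long)length[0] << 24)
                        | ((unsigned long)length[1] << 16)
                        | ((unsigned long)length[2] <<  8)
                        |  (unsigned long)length[3];

    printf("%lu\n", value);  /* should print 16252 for this data */
    return 0;
}

Shifting like this would sidestep any dependence on the machine's byte order, unlike copying the bytes straight into an int with memcpy.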
I know about atoi, atol, etc., but they only work on strings of numeric characters, and as you can see above, I definitely need to plan on receiving characters that aren't numeric at all.
I've looked all over and haven't found anything. Attempts to do this on my own have been unsuccessful. Any input on this matter would be very much appreciated.
Thanks for your time.