If anyone saw my last wonderfully perplexing post: I've almost finished my binary-conversion lib.
However, while writing it I've come across a couple of simple questions that Google doesn't seem to want to answer for me, so I once again find myself pulling my hair out over something trivial.
So mighty coders of Dani-Web, does anyone know the answers to the following:
C++ standard questions:
------------------------------------------------------------------------------------
The C standard states that the minimum number of bits in a char is 8 (CHAR_BIT in <limits.h>), and C++ inherits this. I don't believe the standard states a maximum, is that correct?
Am I correct in assuming a char will always be 8 bits?
I've never seen sizeof(char) report anything but one byte. Does the standard state anywhere that a char must be one byte? (I'm pretty lame at reading standards... I just about hacked my way through the OpenGL one.)
sizeof() can't return anything less than 1, so if a char IS 8 bits, does that mean the system byte must also be 8 bits? Or does the char only occupy the first 8 bits of a larger system byte?
If both of the above are true, does that mean char = 8 bits = one byte on all implementations?
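
For reference, this is the kind of sanity check I'm imagining; a minimal sketch, assuming a C++11 compiler for static_assert (nothing here is part of my lib yet):

#include <climits>   // CHAR_BIT
#include <iostream>

int main()
{
    // CHAR_BIT is the number of bits in a char on this implementation.
    // The standard only guarantees CHAR_BIT >= 8, not CHAR_BIT == 8.
    std::cout << "bits in a char: " << CHAR_BIT << '\n';

    // sizeof(char) is 1 by definition: in C/C++ terms a "byte" IS a char,
    // however many bits wide that happens to be on the platform.
    std::cout << "sizeof(char):   " << sizeof(char) << '\n';

    // Compile-time guard if the library genuinely requires 8-bit chars:
    static_assert(CHAR_BIT == 8, "this library assumes 8-bit chars");
    return 0;
}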
--------------------------------------------------------------------------------
Platform questions:
I'm currently calculating the endianness and sign-bit location for ALL data types:
e.g. Int_Endian, char_endian, etc., etc.
Has anyone actually ever heard of a system (any system) that has a C++ compiler and stores different data types with different endianness?
Can I reasonably expect the system to use the same endianness and sign-bit location for all data types?
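
For what it's worth, here's a minimal sketch of one common run-time probe (the helper name is just for illustration, and I'm assuming <cstdint> fixed-width types are available; the memcpy sidesteps any aliasing issues):

#include <cstdint>
#include <cstring>
#include <iostream>

// Probe byte order by writing a known 32-bit value and inspecting
// which byte lands first in memory.
bool is_little_endian()
{
    std::uint32_t value = 1;
    unsigned char first_byte;
    std::memcpy(&first_byte, &value, 1);
    return first_byte == 1;  // low-order byte stored first => little-endian
}

int main()
{
    std::cout << (is_little_endian() ? "little" : "big") << "-endian\n";
    return 0;
}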
Conversion library questions:
-------------------------------------------------------------------------------
Due to the way I'm currently storing my data, the caller must state what data type needs to be extracted from my binary data (which is effectively a single very, very, very long block of 10101011010101... like you'd see in a 1970s sci-fi film). Obviously I have no way of ensuring that the data being recalled actually is that type, so the program could request a char and receive half a short (nonsense data).
Would you (the reader) consider this good coding, or would you prefer safeguards despite the processing-speed loss?
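
One safeguard I've been toying with, purely as a sketch (the TypeTag enum and function names are hypothetical, not part of the lib): prefix every stored value with a one-byte type tag and verify it on extraction, at a cost of one byte per value and one comparison per read:

#include <cstddef>
#include <cstring>
#include <stdexcept>
#include <vector>

// One-byte tag written in front of every value in the stream.
enum TypeTag { TAG_CHAR = 0, TAG_SHORT = 1, TAG_INT = 2 };

// Append an int, preceded by its tag.
void store_int(std::vector<unsigned char>& stream, int value)
{
    stream.push_back(static_cast<unsigned char>(TAG_INT));
    const unsigned char* bytes =
        reinterpret_cast<const unsigned char*>(&value);
    stream.insert(stream.end(), bytes, bytes + sizeof value);
}

// Read an int back, throwing if the tag doesn't match the request.
int load_int(const std::vector<unsigned char>& stream, std::size_t& pos)
{
    if (stream.at(pos) != static_cast<unsigned char>(TAG_INT))
        throw std::runtime_error("type mismatch: expected int");
    ++pos;
    int value;
    std::memcpy(&value, &stream[pos], sizeof value);
    pos += sizeof value;
    return value;
}

int main()
{
    std::vector<unsigned char> stream;
    std::size_t pos = 0;
    store_int(stream, 42);
    return load_int(stream, pos) == 42 ? 0 : 1;
}

A debug-only variant (tags compiled out of release builds) would recover most of the speed, if that matters.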