#define AUID 34567
#include <stdio.h>
#include <math.h>

int main() {
    float id = AUID;
    int floatSize = sizeof(float);
    unsigned char *bits;
    int i, part2;
    int mant = 128;
    bits = (unsigned char*) &id;
    for (part2 = 0; part2 < sizeof(bits); part2++) {
        printf("+");
        for (i = 0; i < 8; i++) {
            if ((bits[part2] & mant) > 0) {
                printf("%d", 1);
            } else {
                printf("%d", 0);
            }
            mant = mant >> 1;
        }
        mant = 128;
    }
    printf("%f\n", id);
}

This is the code I have modified. The output is meant to print the full mantissa bits of the AUID, as well as its floating-point value. Where am I going wrong?

>Where am I going wrong?
Here, there and everywhere ;)
1. What is floatSize for? You never use it.
2. sizeof(bits) is the size of a pointer to unsigned char - it bears no relation to the problem. It just happens to equal 4 on a 32-bit CPU...
3. 128 as a bit mask is binary 10000000 - it selects the most significant bit of each byte, but you want the least significant bit.
4. Why is "+" printed?
5. You are trying to print all 32 bits (incorrectly), but a float's mantissa has only 23 bits plus 1 implicit bit - see the sketch after this list.
6. You did not take the byte order of the representation on a little-endian CPU into account...
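
For point 5, here is a minimal sketch of one common way to pull the three fields apart without worrying about byte order: copy the float's bits into a uint32_t first and mask them out. This assumes float is a 32-bit IEEE-754 type; the variable names are mine, not from the original code.

#include <stdio.h>
#include <string.h>
#include <stdint.h>

int main(void) {
    float id = 34567.0f;          /* the AUID from this thread */
    uint32_t u;
    memcpy(&u, &id, sizeof u);    /* reinterpret the float's 32 bits */

    unsigned sign     = u >> 31;           /* 1 sign bit */
    unsigned exponent = (u >> 23) & 0xFF;  /* 8 biased exponent bits */
    unsigned mantissa = u & 0x7FFFFF;      /* 23 stored mantissa bits */

    /* for 34567.0f: sign=0 exponent=142 (unbiased 15) mantissa=0x070700 */
    printf("sign=%u exponent=%u (unbiased %d) mantissa=0x%06X\n",
           sign, exponent, (int)exponent - 127, mantissa);
    return 0;
}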

See http://en.wikipedia.org/wiki/IEEE_754-1985 and try again...
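
To tie those points together, here is one possible rewrite of the original program - only a sketch, assuming a 32-bit IEEE-754 float on a little-endian CPU (on a big-endian machine the byte indexing would flip):

#define AUID 34567
#include <stdio.h>

int main(void) {
    float id = AUID;
    unsigned char *bytes = (unsigned char *) &id;
    int bit;

    /* Walk the 32 bits from most to least significant.
       Little-endian: the most significant byte is bytes[3]. */
    for (bit = 31; bit >= 0; bit--) {
        int byte = bit / 8;
        int mask = 1 << (bit % 8);
        printf("%d", (bytes[byte] & mask) ? 1 : 0);
        if (bit == 31 || bit == 23)
            printf(" ");   /* separate sign | exponent | mantissa */
    }
    printf("\n%f\n", id);
    return 0;
}

For 34567 this should print 0 10001110 00001110000011100000000 and then 34567.000000.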

I haven't looked at this in ages, but it seems related.

Any advice from this?

Thanks for that, I'll keep it updated :)
