Hi, I have a problem storing a value in a 16-bit variable. Here is how I am doing it.

typedef unsigned int UINT;
typedef UINT* ID_PTR;

unsigned short int myclass[5];
const ID_PTR  arr16_bit[5] = {
                               (UINT*) &myclass[0],
                               (UINT*) &myclass[1],
                               (UINT*) &myclass[2],
                               (UINT*) &myclass[3],
                               (UINT*) &myclass[4],
                             };

Now I access myclass like this.

*(*(arr16_bit + 2)) =  65000;

I can access the array, but the 65000 is cut to 8 bits. Why?

/thanks
kursist

How did you decide this is the case?

And how big is your "int"?

Sorry dude, but it accesses fine (for the case you gave).

#include <iostream>
#include <cstdlib>
using namespace std;

int main() {
   *(*(arr16_bit + 2)) = 65000;

   cout << dec << **(arr16_bit + 2) << " " << *arr16_bit[2] << endl;

   system("pause");
   return 0;
}

You'll probably find that using a pointer to a 32-bit int to write to a 16-bit int overwrites the next 16-bit int. In other words, make sure you access 16-bit numbers through a pointer to a 16-bit type.

So use typedef unsigned short int UINT; (Note the addition of 'short').

Thanks, this solved the problem.

unsigned short* ptr = *(arr16_bit + 2);

*ptr = 10000;

:)
