I have this code:
char ch;
for (int i = 0; i < 10; i++)
{
    ch = (char)('a' + i);          // here
    Console.WriteLine(ch);
    ch = (char)((int)ch & 65503);  // and here
    Console.WriteLine(ch + "");
    Console.ReadLine();
}
What I would like to ask is: on the line marked "here", how does the expression ('a' + i) work?
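From what I can tell, 'a' gets promoted to its numeric code (97) when an int is added to it, so the result is an int and needs the cast back to char. This little sketch is just my own attempt to check that idea:

    int code = 'a' + 3;        // 'a' is promoted to int 97, so code == 100
    char letter = (char)code;  // casting 100 back to char gives 'd'
    Console.WriteLine(code);   // prints 100
    Console.WriteLine(letter); // prints d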
Along with this, I don't understand the statement on the line marked "and here", ch = (char)((int)ch & 65503). I know 65503 is the decimal value of the binary 1111 1111 1101 1111, and I know that ANDing ch with 65503 changes the sixth bit to 0, which turns the lowercase letter into its uppercase equivalent (e.g. 'a' into 'A').
But I just cannot understand how the AND operation changes that bit to 0...
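Here is how I have tried to write the AND out on paper, along with a small test sketch of my own (assuming I have the 16-bit binary right):

    // 'a'  = 97    = 0000 0000 0110 0001
    // mask = 65503 = 1111 1111 1101 1111   (every bit set except the one worth 32)
    // AND  = 65    = 0000 0000 0100 0001   = 'A'
    int result = 'a' & 65503;
    Console.WriteLine(result);                        // prints 65
    Console.WriteLine((char)result);                  // prints A
    Console.WriteLine(Convert.ToString((int)'a', 2)); // prints 1100001
    Console.WriteLine(Convert.ToString(65503, 2));    // prints 1111111111011111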
Hope this makes sense and someone can help!
Thanks
James