hey there,
I have the following code:

import java.io.*;

public class Karakters {
    public static void main(String[] args) throws IOException {
        // FileWriter uses the platform's default character encoding.
        FileWriter in = new FileWriter("karakters.txt");
        PrintWriter outfile = new PrintWriter(in);
        char x = 'A';

        for (int i = 0; i < 300; i++) {
            System.out.println((i + 1) + " " + (char) (x + i));
            outfile.println((char) (x + i));
        }
        outfile.close();
    }
}

This class creates a text file named karakters.txt and writes 300 consecutive characters, starting at 'A', into the file, one character per line.

The following class is supposed to generate the binary representation of each character using Java's Integer.toBinaryString(char) method. However, it is not working properly: for characters whose values range between 127 and 160 (inclusive), the same result is returned every time, namely 111111 (decimal 63).
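For what it's worth, Integer.toBinaryString itself handles these values fine as long as the char only lives in memory; the char is widened to an int before the call. A quick check (the class name BinDemo is just for illustration):

```java
public class BinDemo {
    public static void main(String[] args) {
        // A char widens to int before the call, so the full value is used.
        char c = (char) 130;
        System.out.println(Integer.toBinaryString(c));   // prints "10000010" (decimal 130)
        System.out.println(Integer.toBinaryString('?')); // prints "111111" (decimal 63)
    }
}
```

So if 130 comes back as 111111, the character must already have been turned into '?' before toBinaryString ever saw it.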

import java.io.*;

public class Aski {
    public static void main(String[] args) throws IOException {
        // FileReader also uses the platform's default character encoding.
        FileReader x = new FileReader("karakters.txt");
        BufferedReader outfile = new BufferedReader(x);
        int i = 0;
        String s;
        // readLine() returns null at end of file; ready() only reports whether
        // the next read would block, so it is not a reliable end-of-file test.
        while ((s = outfile.readLine()) != null) {
            i++;
            System.out.println(i + " " + s.charAt(0) + " " + Integer.toBinaryString(s.charAt(0)));
        }
        outfile.close();
    }
}

There are other ranges, such as 256 to 364, for which the same thing occurs. I would expect that a character generated as follows:
char x = (char) 270;
would yield the binary equivalent of 270 when passed to the Integer.toBinaryString() method.

Please help! Thank you.

Any character above 255 (more precisely, any character your platform's default encoding cannot represent) gets written to the file as the String "?",
which is decimal 63, hex 3F, binary 111111.
So when you read "?" back, it is translated to 111111,
unless your PC is configured with an extended (e.g. Arabic/Kanji) character set.

well, even for a character corresponding to a decimal value of 130, the result is the same?

Not in the file generated by cutting and pasting your code:

A
B
C
D
E
F
G
H
I
J
K
L
M
N
O
P
Q
R
S
T
U
V
W
X
Y
Z
[
\
]
^
_
`
a
b
c
d
e
f
g
h
i
j
k
l
m
n
o
p
q
r
s
t
u
v
w
x
y
z
{
|
}
~

?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?

¡
¢
£
¤
¥
¦
§
¨
©
ª
«
¬
­
®
¯
°
±
²
³
´
µ

·
¸
¹
º
»
¼
½
¾
¿
À
Á
Â
Ã
Ä
Å
Æ
Ç
È
É
Ê
Ë
Ì
Í
Î
Ï
Ð
Ñ
Ò
Ó
Ô
Õ
Ö
×
Ø
Ù
Ú
Û
Ü
Ý
Þ
ß
à
á
â
ã
ä
å
æ
ç
è
é
ê
ë
ì
í
î
ï
ð
ñ
ò
ó
ô
õ
ö
÷
ø
ù
ú
û
ü
ý
þ
ÿ
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
?
Œ
œ
?
?
?
?
?
?
?
?
?
?
?
?
Š
š
?
?
?
?
?
?
?
?
?
?
?

{ 123
| 124
} 125
~ 126
127
? 128
? 129
? 130
? 131 and so on :)
?
?
?
?
?
?
?
?

Yes, those values are not defined in standard (7-bit) ASCII.

I am marking this thread as solved, although no solution has been found. For anyone encountering the same problem, the answer may lie in extended ASCII or character encodings.
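For later readers: the root cause is charset substitution during file I/O, not Integer.toBinaryString. You can reproduce the '?' replacement directly with String.getBytes, which substitutes '?' (0x3F) for characters the target charset cannot encode (the class name ReplacementDemo is just for illustration):

```java
import java.nio.charset.StandardCharsets;

public class ReplacementDemo {
    public static void main(String[] args) {
        String s = String.valueOf((char) 270);
        // Encoding an unmappable character replaces it with '?' (0x3F).
        byte[] ascii = s.getBytes(StandardCharsets.US_ASCII);
        System.out.println(ascii[0]); // prints 63
        // UTF-8 can represent every char, so the round trip is lossless.
        byte[] utf8 = s.getBytes(StandardCharsets.UTF_8);
        System.out.println(new String(utf8, StandardCharsets.UTF_8).charAt(0) == (char) 270); // prints true
    }
}
```

This is exactly what FileWriter did above with the platform's default encoding: the substitution happened on the way into the file, so the reader only ever saw '?'.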
