Hi,

I have a problem while reading a JSP file called scan.jsp. It is a very big file, about 9000 lines, containing JavaScript code, some imported files, and so on. I am unable to read all lines of data when I run the following code.



public static final void startReadProcess(String sourcFile) {
    LineNumberReader lnr = null;
    try {
        lnr = new LineNumberReader(new FileReader(sourcFile));
        while ((lnr.readLine()) != null) {
            System.out.println(lnr.readLine());
        }
    } catch (Exception e) {
        e.printStackTrace();
        PrintErrorLog.printError("Error while readingSoureFile");
    } finally {
        closeReaderObj(lnr);
    }
}



Can anyone please let me know why data is being lost while reading big files?

Perhaps you should use BufferedReader.

// Here sourceFile is a java.nio.file.Path (e.g. Paths.get("scan.jsp"))
// and charset is a java.nio.charset.Charset (e.g. StandardCharsets.UTF_8).
try (BufferedReader reader = Files.newBufferedReader(sourceFile, charset)) {
    String line = null;
    while ((line = reader.readLine()) != null) {
        System.out.println(line);
    }
} catch (IOException x) {
    System.err.format("IOException: %s%n", x);
}

Can anyone please let me know why data is being lost while reading big files?

Because each readLine call reads in a line and advances the reader's position in the file. In your current code, the call in the while condition reads a line and throws its result away, and the call inside the loop body reads and prints the next one, so you are basically discarding every other line. You need to assign the result of readLine to a variable (as shown above) and print that variable to get the expected behaviour.
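
To make the fix concrete against your original method, here is a minimal corrected sketch; it assumes the same PrintErrorLog and closeReaderObj helpers from your posted code and only changes the loop so readLine() is called once per iteration:

public static final void startReadProcess(String sourcFile) {
    LineNumberReader lnr = null;
    try {
        lnr = new LineNumberReader(new FileReader(sourcFile));
        String line;
        // readLine() is called exactly once per iteration and its result
        // is kept, so no line is read and then thrown away.
        while ((line = lnr.readLine()) != null) {
            System.out.println(line);
        }
    } catch (Exception e) {
        e.printStackTrace();
        PrintErrorLog.printError("Error while reading source file");
    } finally {
        closeReaderObj(lnr); // same cleanup helper as in the original code
    }
}

With this change every line should be printed; the size of the file was never the problem.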

commented: Excellent focused and correct answer. +6

Thanks for your replies. I've got the idea now.
