Hi,
I have a Swing application that works on a CSV file. It reads the full file line by line, computes some required statistics, and shows the output.
The upper part of the output screen shows each record from the file, in order, in a JTable, whereas the lower part shows statistics computed from that data. The problem is that the JVM takes about 4 times the memory of the file size (while processing an 86 MB file, the heap uses 377 MB of space; memory utilization checked using jVisualVM).
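(As a rough sanity check, my arithmetic rather than anything measured: Java Strings hold text as UTF-16, so 86 MB of mostly single-byte file data already becomes roughly 172 MB of char data once in memory, and splitting every line into fields adds an object header for each field String plus its backing char[], so a heap around 4x the file size is about what keeping everything resident would cost.)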
Note:
1. I have used LineNumberReader for reading the file (because of a specific requirement; I can change it if that helps memory usage).
2. Every line is read with readLine(), and then split(",") is called on that line (a String) to get the individual fields of the record (a sketch of this loop is shown below).
3. Each record is stored in a Vector for displaying in the JTable, whereas other statistics are stored in a HashMap and a TreeMap, and summary data in a JavaBean class. Also, one graph is plotted using JFreeChart.
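For reference, a minimal sketch of the loop described in notes 1 and 2 (the class and method names are mine, purely illustrative):

import java.io.FileReader;
import java.io.IOException;
import java.io.LineNumberReader;
import java.util.Vector;

public class CsvLoader {
    // Reads the whole file and keeps every record in memory; this is
    // what drives the heap up to several times the file size.
    public static Vector<String[]> loadAll(String path) throws IOException {
        Vector<String[]> records = new Vector<>();
        try (LineNumberReader reader = new LineNumberReader(new FileReader(path))) {
            String line;
            while ((line = reader.readLine()) != null) {
                // Each field becomes a separate String (UTF-16 chars plus
                // an object header and a char[] header), so one byte on
                // disk easily turns into four or more bytes on the heap.
                records.add(line.split(","));
            }
        }
        return records;
    }
}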

Please suggest how to reduce memory utilization, as I need to process a 2 GB file.

I got the same link as corrupt, so I'd recommend it myself! It taught me a lot just by skimming through it.

If you are storing multiple copies of the complete data in Vectors etc., then the way you read the file is not the real problem. From your description, with a 2 GB file you will create a 2 GB Vector, which you will then use to display 2 GB of data in a JTable. That's just not realistic. The challenge here is to design something that computes summaries from the whole file and acts as a "window" into the full file for viewing; one way to sketch that is shown below.
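One possible shape for that "window" (a sketch only; the class and method names are invented, and it assumes a single-byte text encoding, since RandomAccessFile.readLine does not decode multi-byte charsets): make one streaming pass that records each line's starting byte offset, accumulating the statistics in the same loop, then back the JTable with a TableModel that seeks and re-reads only the rows it is asked to display.

import java.io.IOException;
import java.io.RandomAccessFile;
import java.util.ArrayList;
import java.util.List;
import javax.swing.table.AbstractTableModel;

public class CsvWindowModel extends AbstractTableModel {
    private final RandomAccessFile file;
    private final long[] lineOffsets; // byte offset of the start of each line
    private final int columnCount;

    public CsvWindowModel(RandomAccessFile file, long[] lineOffsets, int columnCount) {
        this.file = file;
        this.lineOffsets = lineOffsets;
        this.columnCount = columnCount;
    }

    // One streaming pass over the file: remember where each line starts.
    // The statistics for the lower panel can be accumulated in this same
    // loop, so no record needs to stay in memory afterwards.
    public static long[] indexLines(RandomAccessFile file) throws IOException {
        List<Long> offsets = new ArrayList<>();
        file.seek(0);
        long pos = 0;
        while (file.readLine() != null) {
            offsets.add(pos);
            pos = file.getFilePointer();
        }
        long[] result = new long[offsets.size()];
        for (int i = 0; i < result.length; i++) {
            result[i] = offsets.get(i);
        }
        return result;
    }

    @Override public int getRowCount() { return lineOffsets.length; }
    @Override public int getColumnCount() { return columnCount; }

    // Only the rows the JTable actually paints are re-read from disk.
    @Override public Object getValueAt(int row, int col) {
        try {
            file.seek(lineOffsets[row]);
            String[] fields = file.readLine().split(",", -1);
            return col < fields.length ? fields[col] : "";
        } catch (IOException e) {
            return "<I/O error>";
        }
    }
}

The offset index costs 8 bytes per line (for example, about 160 MB for a 2 GB file averaging 100 bytes per line), far cheaper than keeping the records themselves; in practice you would also cache the most recently read block of rows so that painting a visible page doesn't trigger a disk seek per cell.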
