Hello everyone,
I currently have an application that reads in an entire Excel file, then iterates through the records from the file and queries our database for a match. If a match is found, the record is ignored; if no match is found, the record gets added to the database.
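For context, the flow is roughly the sketch below (simplified; I'm assuming Apache POI and JDBC here, and the table/column names and key column index are made up for illustration):

```java
import java.io.File;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

import org.apache.poi.ss.usermodel.DataFormatter;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.ss.usermodel.WorkbookFactory;

public class ExcelImporter {
    private final DataFormatter fmt = new DataFormatter();

    public void importFile(File excelFile, Connection conn) throws Exception {
        try (Workbook wb = WorkbookFactory.create(excelFile);
             PreparedStatement exists = conn.prepareStatement(
                     "SELECT 1 FROM records WHERE record_key = ?");   // hypothetical table/column
             PreparedStatement insert = conn.prepareStatement(
                     "INSERT INTO records (record_key) VALUES (?)")) {
            Sheet sheet = wb.getSheetAt(0);
            for (Row row : sheet) {                       // one row per record
                String key = fmt.formatCellValue(row.getCell(0));
                exists.setString(1, key);
                try (ResultSet rs = exists.executeQuery()) {
                    if (!rs.next()) {                     // no match -> add it
                        insert.setString(1, key);
                        insert.executeUpdate();
                    }
                }
            }
        }   // workbook closed here
    }
}
```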
Now, there are ~2500 records per Excel file, and each of those records has a few attributes linked to it. My problem is that when I try to read in certain files I get a GC overhead limit exceeded error or an OutOfMemoryError. To resolve this I know I could probably read in a chunk of records at a time instead of the entire file (roughly the batching sketch below), but I also get the error when I try to read in another file right after the first one.
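By "a chunk of records" I mean something like the following: accumulate a small batch, process it, then clear the list so those objects become unreachable and eligible for GC. The batch size of 500 and the flush() helper are made up. (One caveat I'm aware of: POI's usermodel parses the whole workbook into memory regardless, so this only limits the memory my own objects take.)

```java
import java.util.ArrayList;
import java.util.List;

import org.apache.poi.ss.usermodel.DataFormatter;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;

public class ChunkedImport {
    private static final int BATCH_SIZE = 500;   // arbitrary
    private final DataFormatter fmt = new DataFormatter();

    public void processSheet(Sheet sheet) {
        List<String> batch = new ArrayList<>(BATCH_SIZE);
        for (Row row : sheet) {
            batch.add(fmt.formatCellValue(row.getCell(0)));
            if (batch.size() == BATCH_SIZE) {
                flush(batch);
                batch.clear();   // drop references so the strings can be GC'd
            }
        }
        if (!batch.isEmpty()) {
            flush(batch);        // remaining partial batch
        }
    }

    private void flush(List<String> batch) {
        // hypothetical: check each key against the DB and insert the misses
    }
}
```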
Is there any way to get rid of the old data from the JVM's heap once I am done uploading it? Or is there some obvious way to do this more efficiently that I may be missing?
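Here is roughly what I mean by reading one file right after another (simplified sketch, processing omitted). My understanding is that once each workbook goes out of scope, nothing from the previous file should still be reachable, so I'd expect the GC to reclaim it before the next file is loaded:

```java
import java.io.File;
import java.util.List;

import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.ss.usermodel.WorkbookFactory;

public class MultiFileImport {
    public void importAll(List<File> files) throws Exception {
        for (File f : files) {
            try (Workbook wb = WorkbookFactory.create(f)) {
                // ... process wb, keeping no references to it outside the loop
            }   // wb is unreachable after this block, so GC can reclaim it
        }
    }
}
```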
Thanks for your help!!