I am new to Python. I wrote a program that reads in large text files line by line using xreadlines, stores some of the values in lists (and arrays), and does some calculations. It runs fine for an individual file, but when I try to process the files in a folder one after another, I get a memory error.
My program looks like this:

    f = open(fileName)
    for line in f.xreadlines():
        tokens = line.split(';')
        list1.append(tokens[2])
        list2.append(tokens[3])
        ...
    ...
    outfile.write(results)
When I enter fileName manually and run it for one file, it works fine. But when I do:

    for fileName in os.listdir(folder):
        # code as above
I get a memory error after processing the first file.
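To make the question self-contained, here is a condensed, runnable sketch of what the whole loop does. The function name process_folder, the output format, and the column indices are placeholders for my real code, not the actual program:

```python
import os

def process_folder(folder, out_path):
    # Placeholder version of my per-file processing:
    # read each line, split on ';', collect two of the columns.
    list1, list2 = [], []
    with open(out_path, "w") as outfile:
        for name in sorted(os.listdir(folder)):
            path = os.path.join(folder, name)
            with open(path) as f:
                for line in f:  # iterate the file lazily, like xreadlines
                    tokens = line.rstrip("\n").split(";")
                    list1.append(tokens[2])
                    list2.append(tokens[3])
            # ... calculations on list1/list2 would go here ...
            outfile.write("%s: %d rows so far\n" % (name, len(list1)))
    return list1, list2
```

The lists keep growing across files because I never clear them between iterations, which is where I suspect the memory goes.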
I have tried to delete the lists manually after the calculations, either with aList = [] or with del(aList), but it doesn't seem to help. So, how can I free the memory?

Platform: Windows XP.
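Concretely, the cleanup I tried after each file's calculations looks like this (list1 and list2 stand in for my real lists):

```python
list1 = ["a", "b"]
list2 = ["c", "d"]

# Attempt 1: rebind the name to a fresh empty list,
# hoping the old list gets garbage-collected.
list1 = []

# Attempt 2: delete the name entirely, then recreate it
# before processing the next file.
del list2
list2 = []
```

Both leave the lists empty as far as I can tell, yet the memory error still occurs on the next file.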