Hello, I am developing a program that reads files with many lines of text (perhaps 20 thousand), searches for a piece of text on each line, and calls a function whenever it finds a match.
My problem is that because the files have so many lines, the program is very slow, but I need it to be fast. The program has to search for the text in every file, so it is bound to be slow.
What can I do? Is there any alternative to this approach?
(Sorry for any mistakes, I'm using an online translator.)

Too slow? I just made a program that loads 60000 words, and it managed that in about 1/10th of a second. Use simple file streams (std::ifstream) and vectors, and speed shouldn't be a problem.
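
For reference, a minimal sketch of what I mean (the file name, the search text, and the on_match callback are just placeholders, not from your program):

#include <fstream>
#include <iostream>
#include <string>
#include <vector>

// Hypothetical callback invoked for each matching line.
void on_match(const std::string& line)
{
    std::cout << "match: " << line << '\n';
}

int main()
{
    std::ifstream in("data.txt");          // example file name
    if (!in) {
        std::cerr << "cannot open file\n";
        return 1;
    }

    const std::string pattern = "needle";  // example text to look for
    std::vector<std::string> lines;
    std::string line;

    while (std::getline(in, line)) {
        lines.push_back(line);             // keep the line if you need it later
        if (line.find(pattern) != std::string::npos)
            on_match(line);                // call the function on a hit
    }
    return 0;
}

Even with 20000 lines, a loop like this finishes far faster than you would notice.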

Okay, I'll try, but the program will run this routine as often as every second, so it will probably generate a lot of I/O.
I'm trying to think of a way to do this efficiently, because opening and scanning the text file every time will be slow.

How about
- 30 milliseconds to create a text file with 100000 lines (10 Mb)
- 400 milliseconds to scan 100000 lines, search pattern and call a function
on AMD 5000+/Windows XP/VC++ 2008 release with standard i/o streams only?
;)
PS. Of course, it's not an optimal solution - no async I/O, no worker thread pool, etc. - that's ordinary synchronous I/O in a single thread...
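
Here is roughly the kind of single-threaded scan I timed, as a sketch (path, pattern, and the handle function are placeholders; timing uses std::clock, which works fine in VC++ 2008):

#include <cstdio>
#include <ctime>
#include <fstream>
#include <string>

// Hypothetical handler called for every line that contains the pattern.
void handle(const std::string& line) { /* ... */ }

int main()
{
    const char* path = "big.txt";          // example file name
    const std::string pattern = "needle";  // example search text

    std::clock_t start = std::clock();

    std::ifstream in(path);
    std::string line;
    long hits = 0;
    while (std::getline(in, line)) {
        if (line.find(pattern) != std::string::npos) {
            handle(line);                  // call the function on a hit
            ++hits;
        }
    }

    double ms = 1000.0 * (std::clock() - start) / CLOCKS_PER_SEC;
    std::printf("%ld hits in %.0f ms\n", hits, ms);
    return 0;
}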
