How can I read files faster? I have millions of files; my goal is to read all of them, extract a particular pattern, and then create a hash from those extracted values. What I did is open each file and read it line by line, but it is taking too much time.
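For reference, a minimal Python sketch of the line-by-line approach described above; the files/* glob, the regex, and the captured value are placeholders for whatever is actually being matched:

import glob
import re

pattern = re.compile(r'ifdef\s+(\w+)')   # placeholder for the real pattern
counts = {}                              # the hash, keyed by extracted value

for path in glob.iglob('files/*'):
    with open(path, errors='ignore') as fh:
        for line in fh:
            m = pattern.search(line)
            if m:
                counts[m.group(1)] = counts.get(m.group(1), 0) + 1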
It is possible to set up your own search engine on a desktop PC. It will index all your data and store it locally. If the machine with a million files on it is considered safe, then building your own personal search engine on the same machine should not be a privacy issue.
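To make the indexing idea concrete, here is a tiny Python sketch of an inverted index (word mapped to the files containing it); real desktop search engines do this far more efficiently, and the files/* path is an assumption carried over from the example below:

import glob
import re

index = {}                                   # word -> set of file paths
for path in glob.iglob('files/*'):
    with open(path, errors='ignore') as fh:
        for word in set(re.findall(r'\w+', fh.read())):
            index.setdefault(word, set()).add(path)

# After the one-time indexing pass, lookups are essentially instant:
print(index.get('SUM', set()))               # files that mention 'SUM'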
Having said that, grep could still be the tool for you:
$ grep 'ifdef.*SUM' files/*
This will search all those files for lines that contain the string 'ifdef' followed by 'SUM' on the same line. Grep is quite fast.
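If you go the grep route, one way to tie it back to building the hash is to let grep do the fast scanning and post-process its output in Python. This is only a sketch: the -r flag and the exact value being extracted are assumptions, and a recursive search also avoids expanding millions of filenames on the command line:

import re
import subprocess

# Let grep do the scanning; -r recurses, -h drops filenames, -E uses extended regex.
proc = subprocess.run(['grep', '-rhE', 'ifdef.*SUM', 'files/'],
                      capture_output=True, text=True)

extract = re.compile(r'SUM\w*')              # placeholder for the value to keep
counts = {}
for line in proc.stdout.splitlines():
    m = extract.search(line)
    if m:
        counts[m.group(0)] = counts.get(m.group(0), 0) + 1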
What Peter meant is that if this is something you want to use repeatedly (searching for this pattern today, a different pattern tomorrow, and so on), then you might be better off using a search engine or a database. You can also run a search engine locally without any privacy issues.
If it's just a one-off operation, you probably won't need that complexity.