Dave, I need to conduct a number of searches through more than 2500 text files. Each search is for a different specific text string. The text files are on my Mac hard drive (Mac OS X 10.3.7) and are arranged into folders within folders within folders. I want the result in a new text file. I think I should be able to do this using some sort of grep or script but cannot figure out how to do it. Please help.
Ah, the serendipity is marvelous! I’ve just been writing about the find command for the new Tiger edition of my best-selling book Learning Unix for Mac OS X so it’s all very fresh in my mind.
Whenever you have a nested file hierarchy that you want to search, you should always, automatically reach for the find monkey wrench, coupled with its partner command xargs. But let’s step through this slowly so you can see how these work together, because we’re going to use three different commands in a pipe to accomplish what you seek.
First off, the find command has some of the weirdest syntax in Unix, so if you want to learn more about it, use the man find command within Terminal. For now, just follow along. 🙂 To find all files below the current point in the file system that are HTML files, you’d use:
$ find . -name "*html" -print
Notice that by not using the pattern *.html this also matches files that have the suffix “shtml” too (typically server-side include HTML). This generates a long list of filenames. To search through them for a specific pattern, you want to use the grep command, as you know, but the wrinkle is that you can’t just do something like find | grep because grep just isn’t expecting a list of filenames from standard input (stdin).
That’s when our pal xargs comes in. The xargs command does expect a list of filenames from stdin and it then acts as a wrapper for Unix commands that don’t work that way.
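Here’s a tiny self-contained demonstration of what xargs does with that file list (the demo directory and sample files here are invented purely for illustration). It also shows the -print0/-0 variant, which passes NUL-separated names so that filenames containing spaces survive the trip:

```shell
# Sample files, invented for this demo; one contains the pattern we seek
mkdir -p demo
printf 'copyright 2004\n' > demo/a.html
printf 'no match here\n' > "demo/b c.html"

# find emits NUL-separated names with -print0, and xargs -0 reads them,
# so even "b c.html" reaches grep as a single argument:
find demo -name "*html" -print0 | xargs -0 grep -l 2004
# prints demo/a.html
```

Plain `xargs` splits its input on whitespace, so a name like "b c.html" would be handed to grep as two bogus arguments; the -print0/-0 pairing sidesteps that entirely.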
Putting them all together, here’s how you could find all HTML files that have a 2004 copyright notice in them, just as a topical example:
$ find . -name "*html" -print | xargs grep 2004 | grep '©'
Or, if you want to get fancy, use grep -E '© .* 2004' which is probably better, but it’s a bit more complex because it’s a regular expression, not just a simple pair of patterns. Either way, the result will be a list of filenames and the lines from those files that match.
Now, the last step of your task is to save that output to a new file, which can be done with a simple redirect:
$ find . -name "*html" -print | xargs grep 2004 | grep '©' > copyright.2004.txt
I hope that gets you going. Drop that into a shell script, then tweak it to match the specific pattern and output file naming scheme you need to use.
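Since each of your 2,500-file searches is for a different string, one way to script the whole batch is to loop over a file of search strings. This is just a sketch: the patterns.txt name, the texts directory, and the sample data are all placeholders you’d swap for your own layout.

```shell
# patterns.txt is a hypothetical file holding one search string per line
printf 'alpha\nbeta\n' > patterns.txt
mkdir -p texts
printf 'alpha here\n' > texts/one.txt          # sample data for the demo

# One find/grep pass per pattern, with a header labeling each batch.
# grep -H forces the filename prefix even when only one file is searched,
# and "|| true" keeps the loop going when a pattern matches nothing
# (grep exits nonzero on no match).
while IFS= read -r pat; do
    echo "=== $pat ==="
    find texts -name "*.txt" -print0 | xargs -0 grep -H -- "$pat" || true
done < patterns.txt > results.txt
```

The result file then contains one labeled section per search string, which keeps 2,500 searches from blurring together.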
You can use the find command within a loop, Deepu, but if there are too many files, you’ll hit the buffer limit within the shell and that won’t solve the problem. Try one search for files more than 18 months old and another for files 12 to 18 months old, for example, to chop things into narrower slices.
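If it helps, those age cutoffs can be expressed with find’s -mtime test, which works in days (540 days is roughly 18 months). A small sketch, with a demo directory invented just for illustration:

```shell
mkdir -p agedemo
touch -t 200001010000 agedemo/ancient.txt   # backdate mtime to Jan 2000
touch agedemo/fresh.txt                     # mtime is right now

# Files whose contents changed more than ~18 months (540 days) ago:
find agedemo -type f -mtime +540 -print
# prints agedemo/ancient.txt

# Files between ~12 and ~18 months old:
find agedemo -type f -mtime +365 -mtime -540 -print
```

Splitting one huge run into date bands like this keeps each invocation’s result list to a manageable size.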
Hi,
I have a find command which lists files older than 1 year. As the total number of files being picked up is huge, the find command fails saying it cannot list the files.
Is it ok to use the find command inside a for loop?
Question: Why is the technique in this article better than just:
$cd TextfileDirectoryName
$sudo grep -lr SearchString *.html >OutputTextFile.txt
?
Thanks,
Mike
As an addendum to the above comment: be aware that the Spotlight indexes are not necessarily complete or accurate. There are directories and files that do not get included in the index. That’s the tradeoff you get for the speed… instead of giving you true, current information about your system, Spotlight gives you incomplete, not-always-up-to-date information about your system.
But on the upside, it fails to give you accurate information much faster than anything else out there!
BTW, when I tried them, none of the command lines above gave me anything but “xargs: unterminated quote”. The quest to be able to search my hard drive continues…
Now that Tiger (Mac OS X 10.4) has shipped, people who find this helpful article on the net will want to know that a dramatically faster command-line tool is included with Tiger.
mdfind uses the Spotlight index. Life has suddenly become too short to wait for the old fashioned UNIX find — mdfind rocks.
If you can’t remember how to use mdfind, just type it with no parameters, and it provides nice examples, like this:
$ mdfind
mdfind: no query specified.
Usage: mdfind [-live] [-onlyin directory] query
list the files matching the query
query can be an expression or a sequence of words
-live Query should stay active
-onlyin Search only within given directory
-0 Use NUL ("\0") as a path separator, for use with xargs -0.
example: mdfind image
example: mdfind "kMDItemAuthor == '*MyFavoriteAuthor*'"
example: mdfind -live MyFavoriteAuthor
— end of usage —
I’d also recommend throwing a sed in there to take care of files with spaces in their names:
find ./ -name "*html" -print | sed -e 's/.*/"&"/' | xargs grep '2004' | grep '©'
the sed -e 's/.*/"&"/' will wrap each line in double quotes:
~ $ ls | sed -e 's/.*/"&"/'
“Desktop”
“DiscBlaze Temp Folder”
“Documents”
…
Nice Post btw… people forget the wizardry of [Uu]nix
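The quote-wrapping trick above is easy to try in isolation (the sample folder names are invented for the demo). It works because BSD and GNU xargs both parse quotes in their input by default; note that find -print0 with xargs -0, shown earlier, is the sturdier route when filenames might themselves contain quote characters:

```shell
printf 'Desktop\nDiscBlaze Temp Folder\n' | sed -e 's/.*/"&"/'
# prints:
# "Desktop"
# "DiscBlaze Temp Folder"
```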