Argument list too long (command line too long)
Submitted by Russell, Tue 07 Mar 06. Edited Wed 21 Apr 10.
Sometimes, when a script goes crazy or something else happens (usually as a result of my own stupidity), I end up with a directory with thousands of files in it.
[russell@localhost temp2]$ ls |nl|tail -2
8000 99977.7487154121
8001 makefiles.pl
[russell@localhost temp2]$ rm -v *
-bash: /bin/rm: Argument list too long
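As an aside, that limit is enforced by the kernel, not by rm or bash: it caps the combined size of the argument list and environment passed to a program. You can ask the system what the number is:
getconf ARG_MAX    # maximum bytes of argument list plus environment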
Now if I just wanted to delete all the files, I could just change directory ( cd .. ) to the parent and delete the whole directory. But more often than not, I don't want to delete all of it, just prune out the stuff I don't need, and bash (and just about every other shell I have tried) has limits on how big a command line or argument list can be. In all seriousness, 8001 parameters is a lot, and I don't know that I should seriously expect that to just work. I have devised a few workarounds for this problem:
To delete some of the files (the first ones in sort order), use:
rm -v `ls |head -1000`
This uses backticks to embed another command, and deletes the first 1000 files. (The -v is optional, but I like to see what I am doing; it is faster without -v.) You will need to issue this command over and over until no files are left, but it's a lot faster than, say, manually typing 8000 filenames. :-) One thousand files usually fits in the argument list; if the filenames are very long, you may need to use fewer than 1000. This will fail on filenames that contain spaces. If just a few contain spaces, ignore the errors and delete those manually. If they all contain spaces, use the zip method below.
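If you get tired of reissuing the command, a small loop can repeat the batch delete until the directory is empty. This is just a sketch of the same idea, with the same caveat about spaces in filenames:
# repeat the 1000-at-a-time delete until nothing is left
while [ -n "$(ls)" ]; do
    rm `ls | head -1000`
done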
To delete only the oldest files first:
rm -v `ls -tr |head -1000`
The -t option on ls sorts files by modification date (newest first) and -r reverses the order, so the oldest files come out first. Refer to the ls man page for more options.
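A variation I find handy is to keep the newest files and delete everything older; the 100 here is just an example number:
# delete all but the 100 newest files; pipe through head -1000 and
# repeat if the leftovers still overflow the argument list
rm -v `ls -t | tail -n +101`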
There are other options as well: you can still select files with a filespec, even if the full result would be too long for the command line.
rm -v `ls |grep '^[0-9][0-9]' |head -1000`
This deletes the first 1000 files whose names start with two digits. (Quote the pattern so the shell does not try to expand the brackets itself.)
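The same trick works with any pattern grep can match. For instance, to clear out temp files by extension (the .tmp pattern is just for illustration):
# delete the first 1000 files ending in .tmp
rm -v `ls | grep '\.tmp$' | head -1000`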
Another trick I have found for dealing with huge lists of files is to pipe the names into zip.
ls |grep '^[0-9][0-9]' |zip -m@ tempfile.zip
This pipes all the files that start with two digits into a zip file; -m deletes the files after zipping, and -@ tells zip to read the file list from the pipe. The cool thing about this form is that it works even if some of the filenames contain spaces (filenames with spaces sneak right past the backtick trick above). The other nice thing about using zip is that you can then delete files inside the zipfile without regard for argument list length.
zip -d tempfile.zip "4*.*"
This deletes all the files that start with a 4 inside the zipfile. You must put quotes around the parameter, or else the shell will try to expand it. Of course, if you don't want any of it, just rm tempfile.zip and be done with it.
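Before (or after) pruning the archive, you can peek inside it, and even pull the files back out if you change your mind:
unzip -l tempfile.zip    # list the archive contents without extracting
unzip tempfile.zip       # restore the files to the current directory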
This kind of stuff can also be done using find and its -exec option, but frankly, that scares me too much. It's too powerful, and one small typo could create great havoc. What I like about my method is that I can test it first. I run the ls |head -1000 command (or whatever) to check that it will get the files I want, then I press up-arrow to retrieve the last command, add rm -v ` to the beginning and ` to the end, and press enter to issue the new command. This way, I reduce the chance of a typo deleting something I didn't want deleted.
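Spelled out, that test-first workflow is just two steps:
ls -tr | head -1000            # step 1: preview which files would be hit
rm -v `ls -tr | head -1000`    # step 2: same pipeline, wrapped in backticks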
It goes without saying that you can use these tricks with mv (move), cp (copy), or just about any other Linux command.
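For example, to sweep the 1000 oldest files into a holding directory instead of deleting them (the directory name here is made up):
mkdir -p ../old-temp
mv `ls -tr | head -1000` ../old-temp/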
I'd love to hear about a better way to do this. For me, this comes up more often than I'd care to admit. ... What I really need to do is fix some scripts so they don't barf out lots of unneeded little temp files.
Some have commented that some of this can be done using the -exec option of find.
find . -exec some_command {} \;
(you need that escaped semicolon at the end)
In truth, it's really powerful. Too powerful, I feel. I would NOT use this command unless you have played with it on some test files and are 100% sure it will do what you expect. I have also noticed (no specific data taken) that find is SLOWER than issuing commands with long command lines. I believe this is because the command is run once for each file, so it has to start up every time, instead of just once per batch.
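For completeness, here is what the find version of the two-digit cleanup would look like. Newer finds also accept + in place of the escaped semicolon, which batches filenames into as few invocations as possible and avoids most of that per-file startup cost:
# -maxdepth 1 (a GNU find option) keeps find out of subdirectories
find . -maxdepth 1 -name '[0-9][0-9]*' -exec rm -v {} \;   # one rm per file
find . -maxdepth 1 -name '[0-9][0-9]*' -exec rm -v {} +    # one rm per batch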