UNIX Shell Scripting question

I’m having a problem.

I’m writing some scripts to do various things to lots of files in a directory. Primarily, working on mp3 filenames.

My problem is this:
Suppose I want to rename all my mp3s to a specific naming scheme. I already wrote a script using sed and grep to do just that, but the problem is that I have too many files to say `ls *.mp3`. I had to `ls | grep mp3$` instead.

The bigger problem is that I like to use for loops. My mp3 renaming script depended heavily (in its original conception) on `for i in *.mp3; do blah blah; done`. Except that most of my filenames have spaces in them. Let me illustrate the trouble:

file number 1.mp3
file number 2.mp3
file number 3.mp3

are the files I’m working on. When I use `for i in *.mp3; do foo $i; done`, this is what happens:
file not found
number not found
1.mp3 not found
file not found
number not found
2.mp3 not found
file not found
number not found
3.mp3 not found

On the other hand, if I use ``for i in "`ls | grep mp3$`"; do foo "$i"; done``, I get this instead:

"file number 1.mp3file number 2.mp3file number 3.mp3" not found

So. How do I get the shell to feed me each complete filename as a single line of input that can be quoted, instead of either a block of text containing all filenames, or each token as a separate line?

If I’m not explaining this clearly, let me know and I’ll try again.

Here ya go:

```
:
find . -name '*.sql' | sed 's/^\.\///' >foo.dat
cat foo.dat | while read filename ; do
    echo "$filename"
done
```

Um, and change `'*.sql'` to `'*.mp3'`.

OK, to fix your “not found” errors, put quotes around the variable when you use it (braces alone won’t protect the spaces).
This should work:
```
for file in *.mp3; do foo "${file}"; done
```
I’m assuming you’re using bash/sh/ksh. I don’t think this works in csh/tcsh.

For the “too many arguments” error, you’ll need to use xargs.
Unix has a limit to how many arguments each command can take, but xargs breaks up the command line so you never run into that limit.
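As a sketch of how that plays out (the scratch directory and filenames below are made up, and `echo` stands in for the real command; note that `-print0`/`-0` are common GNU and BSD extensions rather than strict POSIX):

```shell
#!/bin/sh
# Demo setup: a scratch directory with space-laden names (made up).
dir=$(mktemp -d)
cd "$dir" || exit 1
touch "file number 1.mp3" "file number 2.mp3" "file number 3.mp3"

# -print0 emits each path followed by a NUL byte; xargs -0 splits on
# NULs only, so embedded spaces survive.  xargs also batches the list
# so no single invocation exceeds the argument-size limit.
find . -name '*.mp3' -print0 | xargs -0 -n 1 echo renaming:
```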

An easy way is to just use a “while…do” instead of a “for…do”:

```
ls | grep 'mp3$' | while read f; do echo "$f"; done
```

This will read your filenames in one at a time regardless of spaces.
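For instance, a rename along these lines (the `renamed_` prefix is just a made-up scheme for illustration):

```shell
#!/bin/sh
# Demo setup in a scratch directory (filenames are made up).
dir=$(mktemp -d)
cd "$dir" || exit 1
touch "file number 1.mp3" "file number 2.mp3"

# read pulls in one whole line per iteration; -r stops backslashes
# from being treated as escapes.  Quoting "$f" keeps each name intact.
ls | grep 'mp3$' | while read -r f; do
    mv "$f" "renamed_$f"
done
ls
```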

Thanks for all the suggestions. I’m off to try them all out. :)

Actually, I didn’t describe my problem properly.

The way I was getting my input was with a command substitution:

```
for i in `ls | grep *.mp3`
do echo "$i"
done
```

Now this is where the problem happens. Doing `for i in *.mp3` works fine, and is how I ended up working around it. The problem is that everything inside the backticks gets fed to the loop as a single block of text (i.e. all the filenames at once).

So I guess my actual question is whether there’s a convenient way to break that up into individual filenames. I’m thinking at the moment of piping it to tr to get rid of the newline characters. Who knows.

Backticks change the newlines to spaces: the shell word-splits the captured output on whitespace. If you capture the output of ls with `` `ls | grep *.mp3` ``, you won’t be able to break it at newlines because they’re gone, and you won’t be able to break it at spaces since your filenames contain spaces. `for i in *.mp3` is your best bet.
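A quick demonstration of both behaviors described above (the scratch directory and filenames are made up):

```shell
#!/bin/sh
dir=$(mktemp -d)
cd "$dir" || exit 1
touch "file number 1.mp3" "file number 2.mp3"

echo "--- command substitution: split on every space and newline ---"
for i in `ls`; do echo "$i"; done    # six words from two files

echo "--- glob expansion: one word per file, spaces intact ---"
for i in *.mp3; do echo "$i"; done   # two whole filenames
```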

Incidentally, you could do it with find(1) if you want to process the files recursively:
```
find . -name '*.mp3' -exec foo {} \;
```
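A minimal sketch of that approach (the subdirectory is made up and `echo` stands in for foo). `-exec` passes each matched path to the command as a single argument, so spaces are safe, and the terminating `;` has to be escaped so the shell doesn’t swallow it:

```shell
#!/bin/sh
dir=$(mktemp -d)
cd "$dir" || exit 1
mkdir subdir
touch "a b.mp3" "subdir/c d.mp3"

# {} is replaced by each matched path; the command runs once per file.
find . -name '*.mp3' -exec echo found: {} \;
```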