
I am trying to write a bash script to list the size of each file/subdir of the current directory, as follows:

for f in $(ls -A)
do
    du -sh $f
done

I used ls -A because I need to include hidden files/dirs starting with a dot, like .ssh. However, the script above cannot handle file names that contain spaces. For example, I have a file called:

books to borrow.doc

and the above script will return:

du: cannot access `books': No such file or directory
du: cannot access `to': No such file or directory
du: cannot access `borrow.doc': No such file or directory

There is a similar question, Shell script issue with filenames containing spaces, but there the list of names comes from expanding * (instead of ls -A). The answer to that question was to add double quotes around $f. I tried the same, i.e., changing

    du -sh $f

to

    du -sh "$f"

but the result is the same. My question is how to write the script to handle spaces here?

Thanks.

5
  • You could use find ... -execdir ... instead. Commented Sep 14, 2014 at 21:47
  • The problem is that space is the internal field separator (IFS). Use find -print0 … or temporarily overwrite the IFS variable. Commented Sep 14, 2014 at 21:51
  • Something like shopt -s nullglob dotglob; for f in *; do du -sh "$f"; done ? Commented Sep 14, 2014 at 21:58
  • See mywiki.wooledge.org/ParsingLs Commented Sep 14, 2014 at 22:00
  • @lxg, modifying IFS doesn't prevent globbing, so it's not sufficient to make unquoted expansions safe (and if nullglob is set, it can have effects in circumstances one might otherwise not expect). Commented Sep 15, 2014 at 17:38
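A quick sketch of the pitfall described in the last comment: changing IFS turns off word splitting on spaces, but an unquoted expansion still undergoes pathname expansion (globbing), so only quoting is actually safe.

```shell
#!/bin/bash
# Demo of the comment above: IFS changes do not prevent globbing.
cd "$(mktemp -d)"
touch a.txt b.txt
f='*.txt'
IFS=$'\n'           # word splitting on spaces is off now...
printf '%s\n' $f    # ...but the unquoted glob still expands: a.txt b.txt
printf '%s\n' "$f"  # quoted: prints the literal string *.txt
```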

5 Answers

3

Don't parse the output of ls. When a file name contains a space, $f holds only one space-separated part of the name at a time, so the double quotes never see the whole file name.

The following will work and does the same as your script:

GLOBIGNORE=".:.."  # ignore . and ..
shopt -s dotglob   # make * match hidden files (those starting with .) too
for f in *
do
    #echo "==$f=="
    du -sh "$f"  # double quotes are essential (!!!)
done
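The same idea appears in one of the comments under the question; a variant sketch that runs in a subshell so the shell options don't leak into the rest of the script (nullglob added so an empty directory produces no iterations):

```shell
#!/bin/bash
# Variant in a subshell: dotglob/nullglob are reset when the subshell exits.
(
    shopt -s dotglob nullglob   # match dotfiles; expand to nothing if empty
    for f in *
    do
        du -sh "$f"             # quotes keep names with spaces intact
    done
)
```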

3 Comments

Shouldn't need to directly set GLOBIGNORE to .:.., since The file names . and .. are always ignored when GLOBIGNORE is set and not null.
Yes, but it doesn't hurt, and it ensures that the variable is really set to something.
True. And I was actually a bit confused about how this worked when writing my previous comment. . and .. still won't match a naked *, even if GLOBIGNORE is nulled out and dotglob is set (presumably since . and .. aren't really filenames). They will match .* for some reason, unless you set GLOBIGNORE to a non-null value.
2

Unless the directory contains so many entries that the expanded list of names is too long for a single command line:

du -sh * .*

Be aware that this will include . and .., though. If you want to eliminate .. (probably a good idea), you can use:

for file in * .*
do
    [ "$file" = ".." ] && continue
    du -sh "$file"  # Double quotes important
done

You can consider assigning the names to an array and then working on the array:

files=( * .* )
for file in "${files[@]}"
do
    ...
done

You might use variations on that to run du on groups of names, but you could also consider using:

printf "%s\0" "${files[@]}" | xargs -0 du -sh

4 Comments

( shopt -s dotglob; du -sh * ) would eliminate . and .. as well. Still a bit confused as to why .* matches . and ..
I tried the second option. files=( * .*) ..., all that I got was 1.7G . 1.7G . 1.7G . ... Am I missing something?
It depends on what you put in place of the three dots. If it was du -sh "$file", you should be OK. If you put something else, there are certainly ways you can get the same size over and over again (du -sh "$files", with the plural name, would be one such).
I was using du -sh $file. Now that I realize that I forgot to add quotes, I have tried exactly your option 1, I got a bunch of du: invalid zero-length file name. I am using MinGW32 from mingw.org. What might the issue be?
1

I generally prefer using the program find if a for loop would cause headaches. In your case, it is really simple:

$ find . -maxdepth 1 -exec du -sh '{}' \;

There are a number of security issues with using -exec which is why GNU find supports the safer -execdir that should be preferred if available. Since we are not recursing into directories here, it doesn't make a real difference, though.

The GNU version of find also has an option (-print0) to print out matched file names separated by NUL bytes but I find the above solution much simpler (and more efficient) than first outputting a list of all file names, then splitting it at NUL bytes and then iterating over it.
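For completeness, a sketch of that -print0 route, assuming GNU find and bash (whose read -d '' splits input on NUL bytes):

```shell
#!/bin/bash
# NUL-delimited iteration: survives spaces and even newlines in names.
find . -maxdepth 1 -mindepth 1 -print0 |
while IFS= read -r -d '' f
do
    du -sh "$f"
done
```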

2 Comments

-execdir is itself a non-traditional option -- so you're already not supporting pure POSIX find.
@CharlesDuffy Thanks, I forgot this; updated my answer.
0

Try this:

    ls -A |
    while IFS= read -r line
    do
        du -sh "$line"
    done

Instead of processing the ls -A output word by word, the while loop reads it line by line (IFS= preserves leading and trailing whitespace in each name). Note that this still breaks for file names that contain newlines.

Comments

0

Time to summarize. Assuming you are using Linux, this should work in most (if not all) cases.

find -maxdepth 1 -mindepth 1 -print0 | xargs -r -0 du -sh

Comments
