I am working on a script that needs to perform an action in every sub-directory of a specific folder.
What is the most efficient way to write that?
Watch out for unquoted expansions: after mkdir 'foo * bar', iterating over the name unquoted will cause foo and bar to be iterated over even though they don't exist, and the * will be replaced with a list of all filenames, even non-directory ones.
Similarly, mkdir -p '/tmp/ /etc/passwd /' -- if someone runs a script following this practice on /tmp to, say, find directories to delete, they could end up deleting /etc/passwd.
A version that avoids creating a sub-process:
for D in *; do
if [ -d "${D}" ]; then
echo "${D}" # your processing here
fi
done
Or, if your action is a single command, this is more concise:
for D in *; do [ -d "${D}" ] && my_command; done
Or an even more concise version (thanks @enzotib). Note that in this version each value of D
will have a trailing slash:
for D in */; do my_command; done
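If the trailing slash gets in the way, it can be stripped with parameter expansion; a minimal sketch (the directory names are invented):

```shell
# Set up a scratch directory with two subdirectories (names are made up).
tmp=$(mktemp -d) && cd "$tmp"
mkdir -p alpha beta

for D in */; do
    D="${D%/}"       # strip the trailing slash the glob leaves behind
    echo "$D"        # alpha, then beta
done
```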
for D in `find . -type d`
do
    # Do whatever you need with D
done
Adjust the find parameters if you want recursive or non-recursive behavior.
find . -mindepth 1 -type d
That will split a directory created with mkdir 'directory name with spaces' into four separate words.
Add -mindepth 1 -maxdepth 1, or the result includes the current directory and recurses arbitrarily deep.
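To see the splitting happen, compare a quoted glob with an unquoted command substitution; a small demonstration (the directory name is invented):

```shell
tmp=$(mktemp -d) && cd "$tmp"
mkdir -p 'directory name with spaces'

# Glob: one iteration, the name stays intact.
for d in */; do echo "GLOB: $d"; done

# Unquoted $(find ...): word splitting yields four iterations.
for d in $(find . -mindepth 1 -maxdepth 1 -type d); do echo "SPLIT: $d"; done
```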
The simplest non-recursive way is:
for d in */; do
echo "$d"
done
The / at the end tells it to use directories only.
There is no need for find, awk, etc.
Use shopt -s dotglob to include dotfiles/dotdirs when expanding wildcards. See also: gnu.org/software/bash/manual/html_node/The-Shopt-Builtin.html
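A minimal sketch of the dotglob effect (directory names are made up):

```shell
tmp=$(mktemp -d) && cd "$tmp"
mkdir -p visible .hidden

for d in */; do echo "$d"; done   # only visible/ is listed

shopt -s dotglob                  # wildcards now match dot names too
for d in */; do echo "$d"; done   # .hidden/ and visible/
shopt -u dotglob                  # back to the default
```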
You can use /* instead of */, with / representing the path you want to use: /* would be for an absolute path, whereas */ would include the subdirectories from the current location.
${d%/*} can be used if you need the value without the trailing slash.
Use the find command. In GNU find, you can use the -execdir parameter:
find . -type d -execdir realpath "{}" ';'
or by using the -exec parameter:
find . -type d -exec sh -c 'cd -P "$0" && pwd -P' {} \;
or with the xargs command:
find . -type d -print0 | xargs -0 -L1 sh -c 'cd "$0" && pwd && echo Do stuff'
Or using a for loop:
for d in */; { echo "$d"; }
For recursion, try recursive globbing (**/) instead (enable with: shopt -s globstar).
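Spelled out, globstar makes ** traverse every depth, so the recursive loop is just (a sketch; the tree below is invented, and bash >= 4 is assumed):

```shell
tmp=$(mktemp -d) && cd "$tmp"
mkdir -p a/b/c

shopt -s globstar        # bash >= 4: ** matches recursively
for d in **/; do
    echo "$d"            # a/  a/b/  a/b/c/  (each with a trailing slash)
done
```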
For more examples, see: How to go to each directory and execute a command? at SO
-exec {} + is POSIX-specified; -exec sh -c 'owd=$PWD; for arg; do cd -- "$arg" && pwd -P; cd -- "$owd"; done' _ {} + is another legal option, and it invokes fewer shells than -exec sh -c '...' {} \;.
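For readability, that batched invocation laid out across lines (the identical command, just reformatted; the trailing _ fills $0 so the directory names land in the positional parameters):

```shell
find . -type d -exec sh -c '
    owd=$PWD
    for arg; do                 # loops over the whole batch of directories
        cd -- "$arg" && pwd -P  # your per-directory action goes here
        cd -- "$owd"
    done
' _ {} +
```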
Handy one-liners
for D in *; do echo "$D"; done
for D in *; do find "$D" -type d; done ### Option A
find * -type d ### Option B
Option A handles folder names with spaces correctly. It is also generally faster, since Option B prints each word of such a folder name as a separate entity.
# Option A
$ time for D in ./big_dir/*; do find "$D" -type d > /dev/null; done
real 0m0.327s
user 0m0.084s
sys 0m0.236s
# Option B
$ time for D in `find ./big_dir/* -type d`; do echo "$D" > /dev/null; done
real 0m0.787s
user 0m0.484s
sys 0m0.308s
find . -type d -print0 | xargs -0 -n 1 my_command
substituting your own command for my_command.
This will create a subshell (which means that variable values will be lost when the while
loop exits):
find . -type d | while read -r dir
do
something
done
This won't:
while read -r dir
do
something
done < <(find . -type d)
Either one will work if there are spaces in directory names.
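The difference matters as soon as the loop accumulates state; a minimal sketch counting directories (names invented):

```shell
tmp=$(mktemp -d) && cd "$tmp"
mkdir -p a b c

# Pipeline: the while body runs in a subshell, so count is lost.
count=0
find . -mindepth 1 -type d | while read -r dir; do
    count=$((count + 1))
done
echo "$count"                 # still 0

# Process substitution: the loop runs in the current shell.
count=0
while read -r dir; do
    count=$((count + 1))
done < <(find . -mindepth 1 -type d)
echo "$count"                 # 3
```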
For full safety, pair find ... -print0 with while IFS= read -r -d $'\000' dir.
-d ''
is less misleading about bash syntax and capabilities, since -d $'\000'
implies (falsely) that $'\000'
is in some way different from ''
-- indeed, one could readily (and again, falsely) infer from it that bash supports Pascal-style strings (length-specified, able to contain NUL literals) rather than C strings (NUL delimited, unable to contain NULs).
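Putting those pieces together, a sketch of the fully NUL-safe loop:

```shell
while IFS= read -r -d '' dir; do
    printf '%s\n' "$dir"   # survives spaces, leading blanks, even newlines
done < <(find . -type d -print0)
```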
You could try:
#!/bin/bash
### $1 == the first args to this script
### usage: script.sh /path/to/dir/
for f in `find . -maxdepth 1 -mindepth 1 -type d`; do
cd "$f"
<your job here>
done
or similar...
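A safer variant of the same idea, using the glob instead of splitting find output, and a subshell so each cd doesn't leak into the next iteration (a sketch; $1 is assumed to be the target directory):

```shell
#!/bin/bash
### usage: script.sh /path/to/dir/
for f in "$1"/*/; do
    [ -d "$f" ] || continue   # skip the literal pattern when nothing matches
    (
        cd "$f" || exit
        # <your job here>
        pwd
    )
done
```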
Explanation:
find . -maxdepth 1 -mindepth 1 -type d
: Only find directories with a maximum recursive depth of 1 (only the subdirectories of $1) and minimum depth of 1 (excludes current folder .
)
Quoting in cd "$f" doesn't help here. When the output from find is string-split, you'll have the separate pieces of the name as separate values in $f, making how well you do or don't quote $f's expansion moot.
find's output is line-oriented (one name to a line, in theory -- but see below) with the default -print action.
find -print is not a safe way to pass arbitrary filenames, since someone can run something like mkdir -p $'foo\n/etc/passwd\nbar'
and get a directory that has /etc/passwd
as a separate line in its name. Handling names from files in /upload
or /tmp
directories without care is a great way to get privilege escalation attacks.
The accepted answer will break on white space if the directory names contain it, and the preferred command-substitution syntax for bash/ksh is $(). Use GNU find's -exec option with +, e.g.:
find .... -exec mycommand {} +  # this is the same as passing to xargs
or use a while loop
find .... | while read -r D
do
# use variable `D` or whatever variable name you defined instead here
done
Note the difference between find -exec and passing to xargs: find will ignore the exit value of the command being executed, while xargs will fail on a nonzero exit. Either might be correct, depending on your needs.
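A quick way to observe the difference, using false as the command (so every invocation fails):

```shell
tmp=$(mktemp -d) && cd "$tmp"
mkdir -p a

find . -type d -exec false {} \;
echo "find: $?"    # 0 -- the command's failure is swallowed

find . -type d -print0 | xargs -0 -n 1 false
echo "xargs: $?"   # 123 -- xargs reports the failure
```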
find ... -print0 | while IFS= read -r d
is safer -- supports names that begin or end in whitespace, and names that contain newline literals.
You can avoid the if or [ test with: for D in */; do
for D in *; do [ -d "${D}" ] && my_command; done
or a combination of the two latest: for D in */; do [ -d "$D" ] && my_command; done
Use for D in .* *; do instead of for D in *; do if you also want hidden directories.
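A sketch of that hidden-directory variant, filtering out the . and .. entries that the .* glob also matches:

```shell
for D in .* *; do
    case "$D" in .|..) continue ;; esac   # skip the special entries
    [ -d "$D" ] || continue               # keep directories only
    echo "$D"
done
```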