
find -exec with multiple commands

I am trying to use find -exec with multiple commands without any success. Does anybody know if commands such as the following are possible?

find *.txt -exec echo "$(tail -1 '{}'),$(ls '{}')" \;

Basically, I am trying to print the last line of each .txt file in the current directory, followed by a comma and the filename.

As far as checking for the possibility of the command, did you not try it out on your system?
From the find manual page: "There are unavoidable security problems surrounding use of the -exec option; you should use the -execdir option instead." unixhelp.ed.ac.uk/CGI/man-cgi?find
@JVE999 link is broken, alternative at ss64.com/bash/find.html

Alan W. Smith

find accepts multiple -exec portions to the command. For example:

find . -name "*.txt" -exec echo {} \; -exec grep banana {} \;

Note that in this case the second command will only run if the first one returns successfully, as mentioned by @Caleb. If you want both commands to run regardless of their success or failure, you could use this construct:

find . -name "*.txt" \( -exec echo {} \; -o -exec true \; \) -exec grep banana {} \;

how to grep twice? this is failing: find ./* -exec grep -v 'COLD,' {} \; -exec egrep -i "my_string" {} \;
@rajeev The second exec will only run if the return code for the first returns success, otherwise it will be skipped. This should probably be noted in this answer.
Note the use of -n in some of the other answers to suppress the newline generated by echo, which is handy if your second command produces only one line of output and you want them to be easier to read.
Pipe the results of the first -exec into grep? find . -iname "*.srt" -exec xattr -l {} | grep "@type" \; > text.txt
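The pipe in that last comment is consumed by the outer shell, not by find; one way around it (a sketch, keeping the macOS-specific xattr from the comment) is to wrap the pipeline in its own shell:

find . -iname "*.srt" -exec sh -c 'xattr -l "$1" | grep "@type"' _ {} \; > text.txt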
bensiu
find . -type d -exec sh -c "echo -n {}; echo -n ' x '; echo {}" \;

If you want to run Bash instead of Bourne you can also use ... -exec bash -c ... instead of ... -exec sh -c ....
Never embed {} in shell code. See unix.stackexchange.com/questions/156008/…
+1 @Kusalananda Injecting filenames is fragile and insecure. Use parameters. see SC2156
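A sketch of the same command written the way SC2156 suggests, passing the filename as a parameter instead of embedding {} in the shell code (printf stands in for the non-portable echo -n):

find . -type d -exec sh -c 'printf "%s x %s\n" "$1" "$1"' _ {} \;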
Dennis Williamson

One of the following:

find *.txt -exec awk 'END {print $0 "," FILENAME}' {} \;

find *.txt -exec sh -c 'echo "$(tail -n 1 "$1"),$1"' _ {} \;

find *.txt -exec sh -c 'echo "$(sed -n "\$p" "$1"),$1"' _ {} \;

What is the underscore before {} for?
@qed: It is a throw-away value that holds the place of $0. Try this with "foobar" instead of "_": find /usr/bin -name find -exec sh -c 'echo "[$0] [$1]"' foobar {} \; - the output: "[foobar] [/usr/bin/find]".
@XuWang: Yes, I would say that's the case. As you know, $0 is usually the program name (ARGV[0]).
It is critical, for this method, that the script passed to sh -c is in single quotes, not double. Otherwise $1 is in the wrong scope.
@Nick The quotes have nothing to do with it - you can write '$1' within double quotes as long as you escape the dollar sign ("\$1"). You can escape other characters as well ("\"").
Souvik Ghosh

Another way is like this:

multiple_cmd() { 
    tail -n 1 "$1";
    ls "$1"
}; 
export -f multiple_cmd; 
find *.txt -exec bash -c 'multiple_cmd "$0"' {} \;

in one line

multiple_cmd() { tail -1 "$1"; ls "$1"; }; export -f multiple_cmd; find *.txt -exec bash -c 'multiple_cmd "$0"' {} \;

"multiple_cmd()" - is a function

"export -f multiple_cmd" - will export it so any other subshell can see it

"find *.txt -exec bash -c 'multiple_cmd "$0"' {} \;" - find that will execute the function on your example

In this way multiple_cmd can be as long and as complex as you need.

Hope this helps.


perfect, just what I needed!
@Thomas it does but try this one liner, tested in osx. I made a directory called 'aaa' with some files/dirs in there and CDd to it. Then, ~/aaa$ acmd() { echo x \"$1\" x; }; export -f acmd; find . -exec bash -c 'acmd {}' \;
It stays stuck on ">"
Camilo Martin

There's an easier way:

find ... | while read -r file; do
    echo "look at my $file, my $file is amazing";
done

Alternatively:

while read -r file; do
    echo "look at my $file, my $file is amazing";
done <<< "$(find ...)"

filenames can have newlines in them, this is why find has the -print0 argument and xargs has the -0 argument
@abasterfield I always hope never to find those in the wild lol
what I wanted to do was "find ... -exec zcat {} | wc -l \;" which didn't work. However, find ... | while read -r file; do echo "$file: zcat $file | wc -l"; done does work, so thank you!
In comment above I have "back ticks" around "zcat $file | wc -l". Unfortunately SO turns those into formatting, so I've added it as an actual answer with the correct code visible
@GregDougherty You can escape the backticks ` to do that you use backslashes like so: \​` (still, that's another good reason to use $() instead).
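For the newline concern above, the loop can be made null-delimited; this variant is bash-specific because of read -d '':

find ... -print0 | while IFS= read -r -d '' file; do
    echo "look at my $file, my $file is amazing";
done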
user9869932

Extending @Tinker's answer,

In my case, I needed to run command | command | command inside the -exec to print both the filename and the matched text in files containing a certain string.

I was able to do it with:

find . -name config -type f \( -exec  grep "bitbucket" {} \; -a -exec echo {} \;  \) 

the result is:

    url = git@bitbucket.org:a/a.git
./a/.git/config
    url = git@bitbucket.org:b/b.git
./b/.git/config
    url = git@bitbucket.org:c/c.git
./c/.git/config

You can also print the filename and the grep'd content on a single line by passing /dev/null as a second argument to the grep command with one -exec parameter: find . -name config -type f -exec grep "bitbucket" {} /dev/null \;
In this case, you could do: $ find . -name config -type f -exec grep -nl "bitbucket" {} \; And it will print only the names of the files that match
Andrea Spadaccini

I don't know if you can do this with find, but an alternate solution would be to create a shell script and to run this with find.

lastline.sh:

#!/bin/sh
echo "$(tail -1 "$1"),$1"

Make the script executable

chmod +x lastline.sh

Use find:

find . -name "*.txt" -exec ./lastline.sh {} \;

backticks are deprecated; please encourage the use of $(...), which is more readable regardless of font and easier to nest. Thank you.
Greg Dougherty

Thanks to Camilo Martin, I was able to answer a related question:

What I wanted to do was

find ... -exec zcat {} | wc -l \;

which didn't work. However,

find ... | while read -r file; do echo "$file: `zcat $file | wc -l`"; done

does work, so thank you!
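The pipeline can also stay inside -exec if it is wrapped in its own shell; a rough equivalent of the loop above would be:

find ... -exec sh -c 'echo "$1: $(zcat "$1" | wc -l)"' _ {} \;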


Eric Duruisseau

Dennis's first answer solves the problem, but it is no longer a find with several commands in only one -exec, as the title suggests. To run several commands inside a single -exec we have to look for something else. Here is an example:

Keep the last 10000 lines of every .log file modified in the last 7 days, using a single -exec and several {} references.

1) See what the command will do, and on which files:

find / -name "*.log" -a -type f -a -mtime -7 -exec sh -c "echo tail -10000 {} \> fictmp; echo cat fictmp \> {} " \;

2) Do it (note that the escaped "\>" is now a plain ">"; this is intentional):

find / -name "*.log" -a -type f -a -mtime -7 -exec sh -c "tail -10000 {} > fictmp; cat fictmp > {} ; rm fictmp" \;


But this will break if one of the filenames has a space, I believe.
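A sketch of the second command with the filename passed as a positional parameter instead of being substituted into the shell string, which should survive spaces in names:

find / -name "*.log" -a -type f -a -mtime -7 -exec sh -c 'tail -10000 "$1" > fictmp; cat fictmp > "$1"; rm fictmp' _ {} \;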
Johannes Braunias

I usually embed the find in a small for loop one-liner, where the find runs in a command substitution with $().

Your command would look like this then:

for f in $(find *.txt); do echo "$(tail -1 $f), $(ls $f)"; done

The good thing is that instead of {} you just use $f and instead of the -exec … you write all your commands between do and ; done.

Not sure what you actually want to do, but maybe something like this?

for f in $(find *.txt); do echo $f; tail -1 $f; ls -l $f; echo; done

It's worth noting that according to ShellCheck it's not the best solution - SC2044: For loops over find output are fragile. Use find -exec or a while read loop. There is a great example and description on ShellCheck wiki github.com/koalaman/shellcheck/wiki/Sc2044
This also is exactly what BashPitfalls #1 advises against.
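A null-delimited while read loop of the kind ShellCheck recommends (bash-specific because of read -d '') would look roughly like this:

find . -name "*.txt" -print0 | while IFS= read -r -d '' f; do echo "$f"; tail -1 "$f"; ls -l "$f"; echo; done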
smapira

should use xargs :)

find *.txt -type f -exec tail -1 {} \; | xargs -ICONSTANT echo $(pwd),CONSTANT

another one (working on osx)

find *.txt -type f -exec echo ,$(pwd) {} + -exec tail -1 {} + | tr ' ' '/'

This overlooks a major use case for find - situations where the number of matching files is too large for a command line. -exec is a way to get around this limit. Piping out to a utility misses that benefit.
xargs -n exists to choose the number of matches per invocation. xargs -n 1 foocmd will execute foocmd {} for every match.
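For example (foocmd stands in for whatever command you want to run, as in the comment above; -print0/-0 keep unusual filenames intact):

find . -name "*.txt" -print0 | xargs -0 -n 1 foocmd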
ccpizza

A find+xargs answer.

The example below finds all .html files and creates a copy with the .BAK extension appended (e.g. 1.html > 1.html.BAK).

Single command with multiple placeholders

find . -iname "*.html" -print0 | xargs -0 -I {} cp -- "{}" "{}.BAK"

Multiple commands with multiple placeholders

find . -iname "*.html" -print0 | xargs -0 -I {} echo "cp -- {} {}.BAK ; echo {} >> /tmp/log.txt" | sh

# if you need to do anything bash-specific then pipe to bash instead of sh

This command will also work with files that start with a hyphen or contain spaces, such as -my file.html, thanks to parameter quoting and the -- after cp, which signals the end of options and the beginning of the actual file names.

-print0 pipes the results with null-byte terminators.

for xargs the -I {} parameter defines {} as the placeholder; you can use whichever placeholder you like; -0 indicates that input items are null-separated.


xargs -I{} sh -c '...{}...' has major security problems, and xargs -I{} echo '...{}...' | sh is just as bad. What happens when you get a filename that contains $(/tmp/evil) in its name as literal text? (Yes, every character in that string is valid in a filename). Or $(rm -rf ~)'$(rm -rf ~)' -- yes, again, single quotes can exist in filenames on UNIX.
The safe thing is to keep your names out-of-band from your code. find ... -exec bash -c 'for arg; do something_with "$arg"; done' _ {} + keeps the filenames as arguments, out-of-band from the string interpreted by the shell as code.
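Applied to the .BAK example above, that safer pattern would look roughly like this:

find . -iname "*.html" -exec bash -c 'for arg; do cp -- "$arg" "$arg.BAK"; echo "$arg" >> /tmp/log.txt; done' _ {} +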
James Bond

Here is my bash script that you can use to find multiple files and then process them all using a command.

Example of usage. This command applies the Linux file command to each found file:

./finder.sh file fb2 txt

Finder script:

#!/bin/bash
# Find files and process them using an external command.
# Usage:
#   ./finder.sh ./processing_script.sh txt fb2 fb2.zip doc docx

counter=0
find_results=()
for ext in "${@:2}"
do
    # @see https://stackoverflow.com/a/54561526/10452175
    readarray -d '' ext_results < <(find . -type f -name "*.${ext}" -print0)

    for file in "${ext_results[@]}"
    do
        counter=$((counter+1))
        find_results+=("${file}")
        echo ${counter}") ${file}"
    done
done
countOfResults=$((counter))
echo -e "Found ${countOfResults} files.\n"


echo "Processing..."
counter=0
for file in "${find_results[@]}"
do
    counter=$((counter+1))
    echo -n ${counter}"/${countOfResults}) "
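    # Note: eval re-parses this string, so a filename containing a single quote will break it (or allow code injection)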
    eval "$1 '${file}'"
done
echo "All files have been processed."