
Setting an environment variable before a command in Bash is not working for the second command in a pipe

In a given shell, normally I'd set a variable or variables and then run a command. Recently I learned about the concept of prepending a variable definition to a command:

FOO=bar somecommand someargs

This works... kind of. It doesn't work when you're changing an LC_* variable (which seems to affect the command, but not its arguments, such as '[a-z]' character ranges), or when piping output to another command like this:

FOO=bar somecommand someargs | somecommand2  # somecommand2 is unaware of FOO

I can prepend "FOO=bar" to somecommand2 as well, which works but adds unwanted duplication, and it doesn't help with arguments that are interpreted depending on the variable (for example, '[a-z]').
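The argument half of the problem shows up with any variable, not just LC_*: the shell expands the command's arguments before the temporary assignment takes effect. A minimal sketch (assuming FOO is not already set):

```shell
unset FOO
# The assignment IS visible to the command itself:
FOO=bar bash -c 'echo "child sees: $FOO"'
# But $FOO in the arguments is expanded by the parent shell *before*
# the temporary FOO=bar assignment applies, so this prints an empty value:
FOO=bar echo "parent expanded: $FOO"
```

The first line prints "child sees: bar"; the second prints "parent expanded: " with nothing after it, because the expansion happened too early.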

So, what's a good way to do this on a single line?

I'm thinking something on the order of:

FOO=bar (somecommand someargs | somecommand2)  # Doesn't actually work

I got lots of good answers! The goal is to keep this a one-liner, preferably without using "export". The method using a call to Bash was best overall, though the parenthetical version with "export" in it was a little more compact. The method of using redirection rather than a pipe is interesting as well.

(T=$(date) echo $T) will not print the new value: the shell expands $T in the arguments before the assignment takes effect.
In the context of cross-platform (incl. Windows) scripts or npm-based projects (JS or otherwise), you might want to take a look at the cross-env module.
I was hoping one of the answers would also explain why this only sort of works, i.e. why it's not equivalent to exporting the variable before the call.
The why is explained here: stackoverflow.com/questions/13998075/…

Dennis Williamson
FOO=bar bash -c 'somecommand someargs | somecommand2'

This satisfies my criteria (one-liner without needing "export")... I take it there's no way to do this without calling "bash -c" (e.g., creative use of parentheses)?
@MartyMacGyver: None that I can think of. It won't work with curly braces either.
Note that if you need to run your somecommand as sudo, you need to pass sudo the -E flag to pass through variables, because variables can introduce vulnerabilities. stackoverflow.com/a/8633575/1695680
Note that if your command already has two levels of quotes then this method becomes extremely unsatisfactory because of quote hell. In that situation exporting in subshell is much better.
Odd: on OS X, FOO_X=foox bash -c 'echo $FOO_X' works as expected, but with specific var names it fails: DYLD_X=foox bash -c 'echo $DYLD_X' echoes blank. Both work using eval instead of bash -c.
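A runnable sketch of this approach, with echo and sed standing in for somecommand and somecommand2, shows that both sides of the pipe see the variable, because the entire pipeline runs inside the child bash:

```shell
# Both pipeline stages run inside the child bash, which has FOO
# in its environment, so both commands can see it:
FOO=bar bash -c 'echo "first: $FOO" | sed "s/^/second sees: /"'
# prints: second sees: first: bar
```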
0xC0000022L

How about exporting the variable, but only inside the subshell?:

(export FOO=bar && somecommand someargs | somecommand2)

Keith has a point, to unconditionally execute the commands, do this:

(export FOO=bar; somecommand someargs | somecommand2)

I'd use ; rather than &&; there's no way export FOO=bar is going to fail.
@MartyMacGyver: && executes the left command, then executes the right command only if the left command succeeded. ; executes both commands unconditionally. The Windows batch (cmd.exe) equivalent of ; is &.
In zsh I don't seem to need the export for this version: (FOO=XXX ; echo FOO=$FOO) ; echo FOO=$FOO yields FOO=XXX\nFOO=\n.
@PopePoopinpants: why not use source (aka .) in that case? Also, the backticks shouldn't be used anymore these days and this is one of the reasons why, using $(command) is waaaaay safer.
So simple, yet so elegant. And I like your answer better than the accepted answer, as it will start a subshell equal to my current one (which may not be bash but could be something else, e.g. dash), and I don't run into any trouble if I must use quotes within the command args (someargs).
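A quick sketch (with echo/cat standing in for the real commands, and assuming FOO is not set beforehand) shows the export stays confined to the subshell:

```shell
unset FOO
# The export applies only within the ( ... ) subshell:
(export FOO=bar; bash -c 'echo "inside: $FOO"' | cat)
# Back in the parent shell, FOO is still unset:
echo "outside: ${FOO:-unset}"
```

This prints "inside: bar" followed by "outside: unset".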
Teemu Leisti

You can also use eval:

FOO=bar eval 'somecommand someargs | somecommand2'

Since this answer with eval doesn't seem to please everyone, let me clarify something: when used as written, with the single quotes, it is perfectly safe. It is good as it will not launch an external process (like the accepted answer) nor will it execute the commands in an extra subshell (like the other answer).

As we get a few regular views, it's probably good to give an alternative to eval that will please everyone, and has all the benefits (and perhaps even more!) of this quick eval “trick”. Just use a function! Define a function with all your commands:

mypipe() {
    somecommand someargs | somecommand2
}

and execute it with your environment variables like this:

FOO=bar mypipe

@Alfe: Did you also downvote the accepted answer? because it exhibits the same “problems” as eval.
@Alfe: unfortunately I don't agree with your critique. This command is perfectly safe. You really sound like a guy who once read eval is evil without understanding what's evil about eval. And maybe you're not really understanding this answer after all (and really there's nothing wrong with it). On the same level: would you say that ls is bad because for file in $(ls) is bad? (And yeah, you didn't downvote the accepted answer, and you didn't leave a comment either.) SO is such a weird and absurd place sometimes.
@Alfe: when I say You really sound like a guy who once read eval is evil without understanding what's evil about eval, I'm referring to your sentence: This answer lacks all the warnings and explanations necessary when talking about eval. eval is not bad or dangerous; no more than bash -c.
Votes aside, the comment provided @Alfe does somehow imply that the accepted answer is somehow safer. What would have been more helpful would have been for you to describe what you believe to be unsafe about the usage of eval. In the answer provided the args have been single quoted protecting from variable expansion, so I see no problem with the answer.
I think the function solution is clearer, though somewhat more verbose, than the eval solution.
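For a concrete sketch of the function approach (with hypothetical names in place of mypipe and the real commands): both stages of the pipe run inside the function, which is executed in the current shell, so the prefixed variable is visible throughout:

```shell
# Hypothetical stand-in for mypipe: echo | tr as the pipeline.
upcase_pipe() {
    echo "$GREETING" | tr 'a-z' 'A-Z'
}

# The prefixed assignment is in effect for the whole function call:
GREETING=hello upcase_pipe
# prints: HELLO
```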
benjimin

Use env.

For example, env FOO=BAR command. Note that the environment variables will be restored/unchanged again when command finishes executing.

Just be careful about shell substitution happening; i.e., if you want to reference $FOO explicitly on the same command line, you may need to escape it so that your shell interpreter doesn't perform the substitution before it runs env.

$ export FOO=BAR
$ env FOO=FUBAR bash -c 'echo $FOO'
FUBAR
$ echo $FOO
BAR

This is the exactly correct answer, the entire purpose of env is to solve the stated question.
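Note that, like a bare prefix assignment, env sets the variable only for the single command it launches, which in a pipeline is just the first command. A sketch combining env with a child shell (echo/tr as stand-ins) covers the whole pipe:

```shell
# env passes FOO to the sh it launches; the entire pipeline runs
# inside that sh, so both stages see FOO:
env FOO=fubar sh -c 'echo "$FOO" | tr "a-z" "A-Z"'
# prints: FUBAR
```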
Akhil

A simple approach is to make use of ;

For example:

ENV=prod; ansible-playbook -i inventories/$ENV --extra-vars "env=$ENV"  deauthorize_users.yml --check

command1; command2 executes command2 after executing command1, sequentially. It does not matter whether the commands were successful or not.


It works because it defines ENV in the environment of the same shell in which the commands that follow the semicolon execute. How this differs from the other answers, though, is that this one defines ENV for all subsequent references in the shell and not just those on the same line. I believe that the original question intended to alter the environment only for the references on the same line.
Adding ;unset ENV to the same line would keep it a one-liner, but I left it out as it doesn't seem necessary.
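A sketch of this approach with an echo stand-in, including the unset mentioned above so the variable doesn't linger in the shell session:

```shell
unset ENV
# ENV persists in the current shell after the semicolon unless unset:
ENV=prod; echo "deploying to $ENV"; unset ENV
echo "after: ${ENV:-unset}"
```

This prints "deploying to prod" followed by "after: unset".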
Peter Mortensen

Use a shell script:

#!/bin/bash
# myscript
FOO=bar
somecommand someargs | somecommand2

> ./myscript

You still need export; otherwise $FOO will be a shell variable, not an environment variable, and therefore not visible to somecommand or somecommand2.
It'd work but it defeats the purpose of having a one-line command (I'm trying to learn more creative ways to avoid multi-liners and/or scripts for relatively simple cases). And what @Keith said, though at least the export would stay scoped to the script.
@KeithThompson Your comment had something new to teach me. I never thought of shell vs environment variables. I used to think of global vs local environment variables. Now I know the correct terminology.
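Per the first comment, a corrected sketch of the script would use export so FOO becomes an environment variable visible to the child processes (echo-based stand-ins for the real commands):

```shell
#!/bin/bash
# myscript (corrected): export makes FOO an environment variable,
# so child processes on both sides of the pipe can see it
export FOO=bar
bash -c 'echo "$FOO"' | cat
# prints: bar
```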