Yuck, I was always instinctively put off by [[, now I finally have some arguments to justify it.
IMO safe shell scripting is kind of dead. I can do it if I really have to, but too many external programs have tricky "convenience" features like interpreting flags after positional parameters, etc.
Ack to yuck, but dead... definitely not. Pretty sure large amounts of shell still get written, mostly because it's the default scripting interface to the operating system.
A lot of this behavior is only a major problem if you're putting arbitrary input in, and especially, externally sourced input.
The "good news" is that bash is so full of ways to get command execution that people blow their foot off and get compromised long before these little details are what are compromising their system. People get popped putting in user input at the base string layer where all you have to do is slap down a semi-colon to get arbitrary command execution long before they're getting popped by obscure "test" behaviors.
Curious what you use instead of bash? When you spin up a server somewhere, what's the first thing you like to install that replaces what we typically use bash for?
Do these apply to NuShell? I think something like that is the way forward. Something with real data types rather than implicitly doing weird array processing. I would be pretty happy with something similar to Python but with easier IO redirection and subprocess management.
xonsh is neat in principle, but painful in actual usage IME. And I suspect it's vulnerable to similar issues around the Python-bash interop.
Let's say you need to install some third party software that is pretty standard `./configure && make && make install`, what would you do? Port `configure` to python?
The first function is not particularly well-written, but harmless. The quoting of `${num}` is completely useless. Inside `[[` bash does not do any word splitting after variable expansion. Double quotes never prevent variable expansion. I am not sure what the author is talking about. Shellcheck is correct to not complain. I stopped reading there.
> Double quotes never prevent variable expansion. I am not sure what the author is talking about. Shellcheck is correct to not complain. I stopped reading there.
I think it would behoove you to read the rest of the post. The double quotes are not the operative part of the example there; they're only there to demonstrate that the code execution doesn't come from splatting or word splitting.
The actual code execution in Case #1 comes from the fact that bash (and other ksh descendants) run arithmetic evaluation on some strings in arithmetic contexts, regardless of their double or single quoting. That evaluation, in turn, can run arbitrary shell commands.
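A minimal sketch of that (payload shape from the article, with the destructive command swapped for a harmless marker file; variable names are mine — and note the behavior may vary across bash versions):

```shell
#!/usr/bin/env bash
marker=$(mktemp -u)               # a path only; the file does not exist yet
num="a[\$(touch $marker)] + 42"   # attacker-controlled "number"

# Despite the quotes, -eq evaluates $num as an arithmetic expression;
# expanding the array subscript runs the embedded command substitution.
if [[ "$num" -eq 42 ]]; then
  echo "compared equal to 42"
fi
[ -e "$marker" ] && echo "embedded command executed"
```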
`-eq` is for arithmetic comparison; `=` is for string comparison. They don't do the same thing, and it's unsound to uniformly replace either with the other.
The dangerous thing here is that an undefined number of contexts exist where Bash treats strings as arithmetic expressions, which can contain arbitrary code despite not being quoted for expansion. `-eq` is just one example of that; others have linked other examples.
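For contrast, a quick sketch of why `=` is inert here: it's a string/pattern match, so the same payload never reaches arithmetic evaluation (marker-file trick and names are mine):

```shell
#!/usr/bin/env bash
marker=$(mktemp -u)
payload="a[\$(touch $marker)] + 42"

# String comparison: the payload is treated purely as text.
if [[ "$payload" = 42 ]]; then
  echo "string-equal"
else
  echo "not string-equal: no arithmetic, no command substitution"
fi
[ -e "$marker" ] || echo "marker never created"
```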
(This is all for case #1. With case #2, `[` and `test` are also susceptible so long as their builtin variants are used.)
Oh, hex. Another bashism. Not sure when I last needed that in a shell script. So in most cases just using `[` solves the problem. If you want to accept hex from untrusted user input, you need to validate the input first. Yes, the bash programmer needs to be aware of many pitfalls. I wasn't, but I would call myself more a bash avoider than a bash programmer. And yes, I use bash interactively; I'm talking only about scripting.
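One hedged sketch of that validation step (function names are mine; decimal-only for brevity):

```shell
#!/usr/bin/env bash
# Reject anything that isn't a plain decimal integer before it can
# reach an arithmetic context. Deliberately POSIX-sh compatible.
is_int() {
  case "$1" in
    ''|*[!0-9]*) return 1 ;;   # empty, or contains a non-digit
    *) return 0 ;;
  esac
}

check() {
  if is_int "$1" && [ "$1" -eq 42 ]; then
    echo "correct"
  else
    echo "rejected or wrong"
  fi
}

check 42                         # passes validation, compares equal
check 'a[$(echo pwned)] + 42'    # fails validation; never reaches -eq
```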
I typically don't script in bash myself. Most of the extras like `[[` are not needed; you can do everything in dash. Arrays are the only feature that comes to mind where bash would be handy.
I... don't understand. I thought the whole reason for using `[[` and breaking POSIX compatibility was to prevent just this kind of vulnerability. Why would bash do this?
or `if test $num -eq 42`, which is the most sensible way to do it in my view, since it makes the point clear that what you're really evaluating is the exit status of the evaluated command
From what I understand, based on the premise that this results from switching into 'arithmetic' mode, you don't even need test. The following will also work with the proposed attack:
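(The snippet appears to have been lost from this comment; my reconstruction of its likely shape, a bare arithmetic command with no `test` at all. Whether the embedded command substitution actually fires through this indirection seems to vary by bash version, per the reply, so the demo below sticks to a benign input:)

```shell
#!/usr/bin/env bash
# Reconstruction, not the original commenter's code: (( )) evaluates
# its contents as arithmetic, just like the operands of -eq do.
guess() {
  local num="$1"
  if (( num == 42 )); then
    echo "correct"
  else
    echo "wrong"
  fi
}

guess 42
```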
Why, I couldn't guess, but an example similar to the article's does not immediately execute (version 5.2.37(1)-release) when indirected through a variable as you show, although other arithmetic evaluation does still happen when indirected. You can `echo "${num}"` and it shows the passed string. If you change it to `declare -i num; num="${1}"` then it does immediately execute.
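A sketch of that `declare -i` variant (marker file instead of the article's payload; per the parent this fires on bash 5.2):

```shell
#!/usr/bin/env bash
marker=$(mktemp -u)
declare -i num
# With the integer attribute set, plain assignment evaluates the RHS as
# arithmetic, which expands the subscript and runs the command inside it.
num="a[\$(touch $marker)] + 42"
echo "num=$num"
[ -e "$marker" ] && echo "command ran at assignment time"
```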
Honestly I just don't write shell scripts anymore, bash or otherwise. By the time any system I use is up, Python is available. I don't know if I've found a true need for shell in anything application level. I'll even fire up a Python shell for something simple like mass renaming files, simply because the string manipulation is so much easier.
I have a related question: is integer/`((math))` logic really safer (in bash) than `[normal]`?
I usually try hard to use `declare -i iMyVar` for as many applicable variables as possible. But evaluation of strings is still usually a hellhole... I mean hole hell.
Question: why does the evaluation inside a[] (which does not produce a value) not result in a bad array subscript error in this case?
if you try to evaluate this kind of thing as an arithmetic expression directly, it will fail with a bad subscript error (mind you, the attack will still work though).
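Sketching what the parent describes (marker file instead of the real payload; the exact error text may vary by version):

```shell
#!/usr/bin/env bash
marker=$(mktemp -u)
# Direct arithmetic expansion: the command substitution in the subscript
# runs during expansion; the now-empty subscript then makes the expression
# invalid, so the evaluation itself fails.
bash -c "echo \$(( a[\$(touch $marker)] ))" || echo "evaluation errored"
[ -e "$marker" ] && echo "...but the command ran first"
```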
Yep, this is specifically a bashism (by way of being a kshism). However, it's worth noting that the second variant (`test -v`) will work in `[` and `test`.
(It's also still a bashism, but IME people don't realize how little of `test` is actually POSIX.)
Did you run it in bash, or in sh? It won't work in a strictly POSIX sh (in that context, I assume `test` will treat `-v` as an ordinary string operand rather than a primary, and nothing gets evaluated).
I ran it by creating a file named "guess.sh" with the function and a `guess "$@"` call to it, then passing 'a[$(cat /etc/passwd > /tmp/pwned)] + 42' as a parameter to the script. Bash 5.2.
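That reproduction, sketched end-to-end with a harmless payload (file path and function body are my guesses at the thread's `guess.sh`):

```shell
#!/usr/bin/env bash
# Same shape as the thread's experiment: a plain script, no interactive use.
cat > /tmp/guess.sh <<'EOF'
#!/usr/bin/env bash
guess() {
  if [[ "${1}" -eq 42 ]]; then
    echo "right"
  else
    echo "wrong"
  fi
}
guess "$@"
EOF

marker=$(mktemp -u)
bash /tmp/guess.sh "a[\$(touch $marker)] + 42"
[ -e "$marker" ] && echo "payload executed from a script"
```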
> (It's also still a bashism, but IME people don't realize how little of `test` is actually POSIX.)
I just declare all of my shell scripts to use bash, since I've got no idea how much of anything is a bashism versus POSIX, and I hate shell scripts enough that I don't care to learn.
You are defining a function and then using it interactively. That does not demonstrate that bash scripting is dangerous. Can you demonstrate the problem in a script?
Yes, you can do dangerous things in bash scripts. This might be one of them. Not at my computer now and no time to experiment.
Basically, the -v case was by design, so for `-v 'hash[$key]'`, "$key is expanded before the array subscript evaluation, and then the whole array plus expanded index is evaluated in a second pass". "Newer versions of bash (5.0 and higher) have a assoc_expand_once option which will suppress the multiple evaluations"
Note that the `-v` case doesn't really work the way one may infer from reading the OP:
> $ key='$(cat /etc/passwd > /tmp/pwned)'
> $ [[ -v 'x[$key]' ]]
> bash: $(cat /etc/passwd > /tmp/pwned): syntax error: operand expected (error token is "$(cat /etc/passwd > /tmp/pwned)")
> $ [[ -v "${x[$key]}" ]]
> bash: $(cat /etc/passwd > /tmp/pwned): syntax error: operand expected (error token is "$(cat /etc/passwd > /tmp/pwned)")