# The Fastest Way to Make Shell Scripts Bulletproof

Just run them through a static analysis tool.

Most programming languages have built-in diagnostics: if a semicolon or bracket is missing, the interpreter or compiler points to the offending source line. Here's an example of a basic error in Bash:

```
$ echo "fi" > test.sh
$ chmod +x ./test.sh
$ bash test.sh
test.sh: line 1: syntax error near unexpected token `fi'
test.sh: line 1: `fi'
$
```

Bash printed this error because it could not possibly execute the statement. However, it will never print warnings about deprecated features or bad practices. For this reason, *even if a shell script runs without any apparent issues, it may still have unpredictable behaviour.* This becomes an even bigger problem when trying to achieve strict POSIX compliance. Here's a common example - an unquoted file path:

```
$ echo "cat \$1" > test.sh # $1 is unquoted
$ echo "File 1" > "file.txt" # No space in filename
$ echo "File 2" > "file 2.txt" # Space in the filename
$ dash test.sh "file.txt" # Apparently working!
File 1
$ dash test.sh "file 2.txt" # Fails due to word splitting
cat: file: No such file or directory
cat: 2.txt: No such file or directory
$
```
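The fix is simply to quote the expansion. A minimal sketch of the corrected script (run with `sh` here; `dash` behaves the same way):

```
# Corrected test.sh: quoting "$1" keeps a path with spaces as one argument
echo 'cat "$1"' > test.sh
echo "File 2" > "file 2.txt"
sh test.sh "file 2.txt"   # prints: File 2
```

With the quotes in place, the shell passes `file 2.txt` to `cat` as a single word instead of splitting it on the space.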

A shell's own diagnostics are not sophisticated enough to ensure a script is safe. This is why it is a good idea to always run *static analysis* on shell scripts using a program like `shellcheck`. The target shell is specified with the `-s` argument:

```
$ shellcheck -s sh test.sh

In test.sh line 1:
cat $1
^-- SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean:
cat "$1"

For more information:
https://www.shellcheck.net/wiki/SC2086 -- Double quote to prevent globbing ...
$
```
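Note that the warning mentions globbing as well as word splitting: an unquoted expansion whose value contains `*` or `?` is also matched against files in the current directory. A quick sketch of why that matters:

```
# An unquoted expansion is glob-expanded against the current directory
mkdir -p demo && cd demo
touch a.txt b.txt
pattern="*.txt"
echo $pattern     # unquoted: expands to the matching files: a.txt b.txt
echo "$pattern"   # quoted: prints the literal value: *.txt
```

A script that only ever handled "safe" values for `$pattern` would never reveal this behaviour in testing, which is exactly the class of bug SC2086 flags.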

It's easy to see how problems like this could fly under the radar if the script were only tested on file paths without spaces. Of course, passing `shellcheck` won't guarantee the script is problem-free, but it will root out many quoting, syntax, and portability issues.