Running remote scripts (cloud scripts) locally --- as validly and securely as possible
I use CentOS with Bash, and I would like to download a script, execute it, and then delete the downloaded file (i.e., run a remote/cloud script locally).
I usually load my own shell scripts from my own GitHub account --- typically small scripts of no more than about 25 lines of code.
I tried to execute a remote script ending in a `while true; do ... case ... esac ... done` loop, using:
wget -O - https://raw.githubusercontent.com/<username>/<project>/<branch>/<path>/<file> | bash
But then I got an endless loop of `echo` output inside the `case ... esac` for some reason (Ctrl+C stopped it), so I turned to a more "traditional" way of running remote scripts, such as:
cd DESTINATION && wget https://raw.githubusercontent.com/<username>/<project>/<branch>/<path>/<file> && source FILENAME && rm FILENAME
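Regarding the endless loop with the piped form: this is consistent with a known pitfall of `wget -O - ... | bash`. The script text itself occupies bash's stdin, so any `read` inside the script (e.g. in a menu loop) gets EOF instead of keyboard input, and a `while true` loop around it then spins forever. A minimal local demonstration, using a hypothetical one-line "remote" script (no network involved):

```shell
#!/usr/bin/env bash
# Hypothetical one-line "remote" script that expects keyboard input:
script='read -r answer; echo "got:$answer"'

# Piped form: the script text is bash's stdin, so by the time `read`
# runs, stdin is already exhausted and it receives nothing.
piped=$(echo "$script" | bash)

# Process substitution passes the script as a file instead,
# leaving stdin free for the script's own reads:
subst=$(echo "hello" | bash <(echo "$script"))

echo "piped: $piped"    # got:
echo "subst: $subst"    # got:hello
```

So `bash <(wget -qO- URL)` is one way to keep interactive scripts working when you don't want an on-disk file at all; this is an illustration of the stdin issue, not necessarily the cause in your exact script.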
How would you make that "traditional" code more robust? More secure?
An example of a current problem: the downloaded file can have a trivial name such as `install.sh` and collide with similarly named files already in the destination (the `rm` is especially problematic here, I think).
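For reference, here is one way the collision and `rm` concerns are commonly addressed: download into a unique temporary directory (so a generic `install.sh` can never clash with existing files, and `rm` only ever touches the sandbox) and verify a pinned checksum before executing. This is a sketch, not a definitive answer --- the function name, URL, and checksum below are placeholders you would supply yourself:

```shell
#!/usr/bin/env bash
# Sketch: collision-safe "download, verify, run, clean up" helper.
# run_remote_script is an illustrative name, not a standard command.
set -u

run_remote_script() {
    local url=$1 expected_sha256=$2
    local tmpdir script rc

    # mktemp -d yields a unique directory, so a trivial name like
    # install.sh cannot collide with anything already on disk.
    tmpdir=$(mktemp -d) || return 1
    script="$tmpdir/script.sh"

    wget -qO "$script" "$url" || { rm -rf "$tmpdir"; return 1; }

    # Refuse to run anything whose checksum differs from the version
    # you actually reviewed (guards against a changed/tampered file).
    if ! echo "$expected_sha256  $script" | sha256sum -c --quiet -; then
        rm -rf "$tmpdir"
        return 1
    fi

    # Run in a child bash rather than `source`, so the remote script
    # cannot alter your calling shell; `rm` only removes our sandbox.
    bash "$script"
    rc=$?
    rm -rf "$tmpdir"
    return $rc
}

# Usage (placeholders as in the question):
# run_remote_script \
#   "https://raw.githubusercontent.com/<username>/<project>/<branch>/<path>/<file>" \
#   "<known-good-sha256>"
```

Use `source` only when the script must modify the caller's environment; in that case the unique temp dir still keeps the `rm` safe.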