Automatically install all packages needed
When running various Python scripts, I often need to do this annoying dance:
$ python script.py
...
ModuleNotFoundError: No module named 'foo'
$ pip install foo
$ python script.py
...
ModuleNotFoundError: No module named 'bar'
$ pip install bar
$ python script.py
...
ModuleNotFoundError: No module named 'baz'
$ pip install baz
$ python script.py
(correct output from script)
Yes, I know this can be solved by creating a requirements.txt file, packaging the script, etc. But I'm asking about cases where that ship has sailed. All I have is a script that optimistically imports stuff.
Python already knows what package it's supposed to be, since it's named in the ModuleNotFoundError. Is there a way to tell Python to react by attempting a pip install on that, rather than raising an exception?
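To make the desired behaviour concrete, here is a hand-rolled sketch for a single, hypothetical package name; what I'm looking for is a way to get this to happen automatically for every import in an existing script, without editing it:

import importlib
import subprocess
import sys

def import_or_install(name):
    # Try the import; if the module is missing, pip-install the same
    # name and retry once.
    try:
        return importlib.import_module(name)
    except ModuleNotFoundError:
        subprocess.check_call([sys.executable, "-m", "pip", "install", name])
        return importlib.import_module(name)

foo = import_or_install("foo")  # "foo" is a placeholder package name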
I am aware that:

- ModuleNotFoundError may be raised not just for missing packages, but also for missing modules (e.g. foo.py) in the same directory. I'm happy with a solution that blindly assumes it's always a PyPI package.
- Some packages use a different name for pip install and import. I'm happy with a solution that fails or installs the wrong package in this case.
- It is dangerous to blindly install packages from PyPI. I'm okay with the risks.
3 answers
A compromise exists between automatically inferring package names (unreliable and potentially dangerous) and writing out an explicit separate requirements.txt file: script-running tools such as pip-run may offer the ability to parse requirements declared explicitly within the source file itself, and also allow specifying dependencies explicitly on the command line. Above and beyond managing dependencies, tools like this are designed to provide an isolated environment for the script (by automatically creating a virtual environment) - among other things, this means an error in the dependencies won't mess up your existing environment.
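As a rough sketch of the in-file approach (pip-run recognizes a top-level __requires__ list in the script, as far as I know; check its documentation for the exact details), the script names its own dependencies and the runner provisions a throwaway environment for it:

# script.py (hypothetical example)
__requires__ = ["requests"]  # dependencies declared inside the script itself

import requests
print(requests.get("https://example.com").status_code)

It would then be run with something like pip-run -- script.py, with the dependencies installed into a temporary virtual environment instead of your current one.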
When the script is your own code, then, it's easy to leave a note about dependencies that won't require later remembering that e.g. up-to-date versions of PIL need to be provided by pip install pillow instead. For others' code, there is at least a simpler way for the author to communicate the exact requirements without distributing a separate file.
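Continuing the hypothetical example above, the declared requirement and the imported name simply don't have to match:

# The distribution name (pillow) differs from the import name (PIL);
# declaring it in the script means nobody has to remember that later.
__requires__ = ["pillow"]

from PIL import Image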
The proposed PEP 722 (currently being heavily discussed on the Python Discourse forum) is an attempt to standardize a format for such metadata, so that it can become a target for more tools (note that these would not be necessary for packaging!) going forward. According to the author of the PEP, pipx also has support for this feature implemented; but as far as I can tell, this refers to a development version not yet released.
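For reference, the format proposed in the current draft of PEP 722 (it may still change) is a plain comment block listing one requirement per line:

# Script Dependencies:
#     requests
#     rich

import requests
import rich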
Speaking of which, the discussion for the PEP just pointed me at another tool that seems reasonably fit for purpose, and much more lightweight than the above script-runners: viv. The model here is that scripts can explicitly register their dependencies by importing the main Viv module and telling it to .use a third-party package (identified by a string name).
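Based on that description (I haven't verified the exact call against viv's documentation), the top of a script would look roughly like this:

# Register third-party dependencies with viv before importing them.
__import__("viv").use("requests")

import requests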
0 comment threads
You can use pipreqs (example run sketched below):
- It will automate the generation of a requirements file.
- This can spare you the annoying dance without necessarily mixing environment setup and script execution.
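For instance, assuming the script lives in the current directory, a run might look like:

$ pip install pipreqs
$ pipreqs .                        # scans the .py files here and writes requirements.txt
$ pip install -r requirements.txt
$ python script.py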
1 comment thread
The best approach is probably to just check the script beforehand; something like the following
grep import script.py
should list all imports, and you can then evaluate and install them.
If you really want to automate things, you can write a short shell script to loop and install missing modules, for example this one in Bash:
#!/usr/bin/env bash
# Re-run the given command until it succeeds, installing any package
# reported missing via ModuleNotFoundError along the way.
while :; do
    # Capture the command's stderr in $STDERR while letting stdout pass through.
    { STDERR="$( { "${@}"; } 2>&1 1>&3 3>&- )"; } 3>&1
    if [[ $? -eq 0 ]]; then
        break
    else
        # Pull the quoted module name out of the ModuleNotFoundError message.
        module=$(printf '%s' "$STDERR" | grep ModuleNotFoundError | cut -d\' -f 2)
        if [[ -z "$module" ]]; then
            # The failure wasn't a missing module; show the error and stop
            # instead of looping forever.
            printf '%s\n' "$STDERR" >&2
            exit 1
        fi
        pip install "$module"
    fi
done
When saved as install.sh in the current directory, it can be used as bash install.sh python script.py.
If you create a virtual environment beforehand with something like python -m venv env; source env/bin/activate (which you should do anyway to avoid polluting your global packages), you can generate a requirements.txt with pip freeze > requirements.txt afterwards and save yourself the hassle in future.
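Putting the pieces together (with env, install.sh and script.py named as above), the whole workflow might look something like:

$ python -m venv env
$ source env/bin/activate
$ bash install.sh python script.py   # installs missing packages until the script succeeds
$ pip freeze > requirements.txt      # record what ended up installed, for next time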
1 comment thread