It’s pretty hard to be clever if you’re an executable file. Understanding how your programming interpreter views the world can save you days of development time and release you from the perils of virtual environments.
After you hand your interpreter some code, you will likely be including libraries. The path is what will be used to find available modules and code.
Let’s check it out in Python.
```
python -c "import sys; print(sys.path)"
```
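On a typical Linux box the output looks roughly like this (illustrative only – the exact entries depend on your distro, your Python version, and what you have installed):

```
['', '/usr/lib/python310.zip', '/usr/lib/python3.10',
 '/usr/lib/python3.10/lib-dynload',
 '/home/trevor/.local/lib/python3.10/site-packages',
 '/usr/lib/python3/dist-packages']
```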
Some quick things to notice.
- Python will crack open zip files and use them just like they are directories. Awesome.
- `/usr/lib/python3.10...` – python ships with some core libraries, and those get stored in a shared part of the file system, available to more than one user.
- `/home/trevor...` – libraries get installed in your user’s home dir – best choice yet.
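To see the zip trick from the first bullet in action, here is a minimal sketch (the `hello.py` module and `bundle.zip` names are just examples): Python will happily import a module straight out of an archive that sits on its path.

```
# build a tiny module, zip it up, and import it from the archive
mkdir demo && echo 'GREETING = "hello from a zip"' > demo/hello.py
(cd demo && zip -q ../bundle.zip hello.py)
PYTHONPATH=bundle.zip python -c "import hello; print(hello.GREETING)"
```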
Now let’s go custom.
```
PYTHONPATH=.:.venv
```
I like to live dangerously and include the present working directory, `.`, and if there is something in the folder named `.venv`, it’s always something I will want to include. `venv` works too, but it conflicts with python’s `venv` package setup.
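If you want to see what that buys you, prefix the variable onto a run and look at the front of the path – the `.` and `.venv` entries land ahead of the system directories:

```
PYTHONPATH=.:.venv python -c "import sys; print(sys.path)"
```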
Between these directories and my environment variables I can completely control the state of my environment. If I want to install a lib for all of my apps, go for it: `python -m pip install {lib}` – but I never want to do that. Maybe I’ll do it begrudgingly if there is a command-line executable I want available.

When I want to install a module for one project (which I always do, hard disk is cheap), I install to `.venv`. For pip this is the `-t` argument.
```
python -m pip install -t .venv {lib}
```
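Putting the two together – `requests` here is just a stand-in for whatever library you actually need – the package lands inside `.venv/` and imports cleanly once the path from earlier is set:

```
python -m pip install -t .venv requests
PYTHONPATH=.:.venv python -c "import requests; print(requests.__file__)"
# the printed path should sit inside ./.venv/
```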
And that’s pretty much my virtual environment. You don’t have to activate or deactivate anything. There is no state being changed in my environment to confuse me. Every time I run `python`, everything works exactly as I expect it to. If something were misbehaving, I’d have full control over my environment and would be able to debug it quickly.
Is it inconvenient to remember the `-t` flag? Maybe. I place a `Makefile` in my project and I haven’t typed a `pip install` since. The Makefile I use is idempotent as well, so I can `test`, `run`, or `serve` and it will only install or update packages once, exactly when it needs to.
This doesn’t solve running multiple versions of your interpreter. I download mine individually at this point. It’s pretty hard to get organizations to stay bleeding-edge on interpreters, so this hasn’t been a problem for me. Once again, the binary I’m using gets checked into the `Makefile`. If I’m testing across multiple binaries, I use a framework like tox. If I’m running something, I like being explicit about what I’m using.
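For reference, here is a minimal sketch of what such a Makefile can look like. It is not my exact file – `requirements.txt`, `main.py`, pytest, and the pinned `python3.10` binary are placeholders – but it shows the stamp-file trick that keeps the install step idempotent and the interpreter pinned in one place.

```
# the interpreter binary gets pinned once, up here
PYTHON ?= python3.10

# reinstall only when requirements.txt changes
.venv/.stamp: requirements.txt
	$(PYTHON) -m pip install -t .venv -r requirements.txt
	touch $@

test: .venv/.stamp
	PYTHONPATH=.:.venv $(PYTHON) -m pytest

run: .venv/.stamp
	PYTHONPATH=.:.venv $(PYTHON) main.py

serve: .venv/.stamp
	PYTHONPATH=.:.venv $(PYTHON) -m http.server 8000

.PHONY: test run serve
```

Recipe lines are tab-indented, as make requires.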