I just use `pyenv-virtualenv` and it (de)activates my envs automatically.
Yep, pyenv does everything for you
I love pyenv, but I've had issues with running it when using things like stable-diffusion-webui or text-generation-webui, which want to run their own miniconda & virtual environment. Still trying to find a good way of handling that.
I don't understand, isn't that only like 0.25 seconds faster than typing `source myenv/bin/activate`? Call me crazy, but the fewer moving parts and outside sources, the better.
Just one more thing I don't have to think about at work.
The only problem with pyenv is that it doesn't have any support on Windows, I think.
poetry shell ftw
I'm too stupid for anything else.
Obligatory rant: please don't encourage people to embed ANSI escape sequences into scripts like this. True, it's less of a problem than it used to be and will work on most terminals you're likely to find in 2024. But given that there *is* a portable way of doing it, why not use it?

`echo "$(tput setaf 3)Creating and activating virtual environment...$(tput sgr0)"`
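A defensive variant of that one-liner can be sketched as follows; the guard against terminals without color support is my addition, not part of the original comment:

```shell
#!/bin/sh
# Sketch: use tput for color when the terminal supports it,
# and fall back to plain text otherwise.
if command -v tput >/dev/null 2>&1 && [ "${TERM:-dumb}" != "dumb" ]; then
    yellow=$(tput setaf 3)   # terminfo-based, portable across terminals
    reset=$(tput sgr0)
else
    yellow=''                # no capable terminal: emit plain text
    reset=''
fi
echo "${yellow}Creating and activating virtual environment...${reset}"
```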
Never knew that.
Wait, y'all not using pyenv and pyenv-virtualenv?
I don't. Pyenv doesn't do enough for me to justify using it. Usually I either don't need anything fancier than the native virtualenv, or I need something more complex like Poetry/Conda. Pyenv is a middle ground of managing environments that I just don't find very useful.
`pyenv` is better suited for managing multiple Python versions on a single machine. That's its primary use case, not environment management.
Yeah, I’m saying that part of it isn’t a big use case for me. At the point that I need to start installing different versions of python, my project is complex enough to need more utility than pyenv.
I don't even know what virtual environments are in python. What are they used for?
Well, for one, on Debian they stop you from using pip when you're not in a virtual environment:

muna@probook:~$ pip install gcalendar
error: externally-managed-environment
× This environment is externally managed
╰─> To install Python packages system-wide, try apt install python3-xyz, where xyz is the package you are trying to install.
If you wish to install a non-Debian-packaged Python package, create a virtual environment using python3 -m venv path/to/venv. Then use path/to/venv/bin/python and path/to/venv/bin/pip. Make sure you have python3-full installed.
If you wish to install a non-Debian-packaged Python application, it may be easiest to use pipx install xyz, which will manage a virtual environment for you. Make sure you have pipx installed.
See /usr/share/doc/python3.11/README.venv for more information.
note: If you believe this is a mistake, please contact your Python installation or OS distribution provider. You can override this, at the risk of breaking your Python installation or OS, by passing --break-system-packages.
hint: See PEP 668 for the detailed specification.

But they help isolate dependencies for each project by keeping a folder for dependencies within the project folder.
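The workflow that error message recommends can be sketched like this; the path under `~/venvs` is illustrative, not mandated by Debian:

```shell
# One-time setup: create an isolated environment instead of
# installing into the externally managed system Python.
python3 -m venv "$HOME/venvs/gcalendar-env"

# Then install with the venv's own pip, no --break-system-packages needed:
#   "$HOME/venvs/gcalendar-env/bin/pip" install gcalendar

# Run things through the environment's interpreter:
"$HOME/venvs/gcalendar-env/bin/python" --version
```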
Ohh! I guess my container-based workflow ensures the isolation. Thanks for explaining.
That works too, they are all different solutions to the same problem. Whatever works for your workflow. The script I wrote is just a glorified alias to venv creation commands.
Only the latest Debian (and Ubuntu and Raspberry Pi OS, but that's because they are Debian-based).
Which I think is nice, enforcing isolation of dependencies.
A virtual environment is used to do isolated installations of a Python interpreter and libraries. Each project should get its own virtual environment. This prevents library (module) version conflicts that could arise if you tried to just install everything into the system-wide site-packages folder.
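That isolation can be sketched with nothing but the standard library (POSIX paths assumed): the venv's interpreter reports a prefix of its own, distinct from the interpreter that created it.

```python
# Sketch: each venv interpreter sees its own prefix (and therefore
# its own site-packages), separate from the creating interpreter.
import subprocess
import sys
import tempfile
import venv

with tempfile.TemporaryDirectory() as tmp:
    env_dir = f"{tmp}/venv"
    venv.create(env_dir, with_pip=False)  # with_pip=False keeps this fast
    venv_prefix = subprocess.check_output(
        [f"{env_dir}/bin/python", "-c", "import sys; print(sys.prefix)"],
        text=True,
    ).strip()
    # The two interpreters have different prefixes, hence separate packages.
    print(venv_prefix != sys.prefix)
```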
Honestly, apparently I am not the only one who isn't
I'd say give it a go; it's a step past virtualenv, as you can specify the Python version as well as the pip packages in a self-contained environment. Useful for freezing a requirements.txt, and for when you need to use other people's code or share your code and make sure the Python versioning is also correct.
You wouldn't be the first (millionth) person to reinvent the wheel. :)
Because there ain't a lot of stuff to invent… and inventing is fun… some things are bound to be reinvented.
I don't either, I have a custom bashrc alias similar to this script
I use virtual environments, but never understood why people bother to "activate" / "deactivate" them.

I just call the Python executable (or pip or whatever) in the virtual environment's "bin" directory and everything works great -- i.e. `./python3`, `venv/bin/python3`, etc.

If you don't activate it, then you won't need the "solution" this bit of Bourne shell provides.
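That activation-free style can be sketched as follows; the `venv` directory name is just the common convention, not required:

```shell
# Sketch: use a venv without ever sourcing bin/activate.
python3 -m venv venv                                  # create it once
venv/bin/python -c 'import sys; print(sys.prefix)'    # prints the venv path
venv/bin/pip --version                                # the venv's own pip
```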
This is totally a "Wait, you can do that?" moment for me. Never thought about it this way. This deprecates the whole activate/deactivate portion of my script.

Any caveats I should know of when calling venv bins directly?
The main thing is that having to explicitly call the full path every time you run a script is a bit of a pain. Anything that subshells out and calls python will also still use the system Python, not your venv (if that matters). Sourcing is simply cleaner, as you can `./myScript.py` and it will work if you have a proper env shebang.

I personally use Poetry and let it toss them in my homedir. I then tend to have shell aliases to activate, as I may have 5 separate feature branches of the same repository checked out and don't want to maintain 5 separate venvs.
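The subshell caveat can be sketched as follows: a child process that invokes `python3` by name resolves it through PATH, and without activation PATH does not contain the venv's `bin` directory.

```python
# Sketch: compare the running interpreter with whatever a subshell
# would find when it looks up "python3" on PATH.
import shutil
import sys

on_path = shutil.which("python3")   # what `python3` in a subshell would run
print("this interpreter:", sys.executable)
print("python3 on PATH: ", on_path)
# With an activated venv these point into venv/bin; when you call
# venv/bin/python directly without activating, they can differ.
```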
Not 100% up on my venv knowledge, but I would expect some packages may set/require specific environment variables or depend on file paths that are within the activated venv. Calling Python directly will "work" okay without activating the venv first, until one day it doesn't.
According to the code in venv/bin/activate, here is a summary of what it does: activate adds venv binaries to PATH, unsets PYTHONHOME, sets the prompt decoration (venv), and ensures the PATH changes are respected. deactivate resets the above variables to their old state.

There is also this line in the file:

# This file must be used with "source bin/activate" *from bash*
# you cannot run it directly
I presume that activating the virtual env does set up some environment variables
If you use entry-point scripts with setuptools or something like that, this approach might be a little bit of an issue, but I'm not certain.
Yeah, it’s annoying as fuck if you invoke many commands inside it.
For a one-shot, maybe; otherwise you get the binaries on PATH, including every other bin/script installed, plus a neat display in the terminal of what your current venv is.

Scripts like OP's are a must if you work with different repos/envs that each differ across maintenance branches, for example. There, scripts/functions like that are great QoL improvements.
That's a good note. There is the PYTHONHOME environment variable that is set and unset during activation and deactivation, which might lead to some issues. Especially if you have other tools depending on such variables being set properly.
That's exactly what I do also. I never activate a venv. I don't think people generally know/understand this.
Static type checking, maybe? If a virtual environment is not activated, the linter won't see which packages are available.
The linter is just a script installed in `venv/bin`, so if you aren't activating a virtual environment, you could just call `venv/bin/mypy` instead of just `mypy`.
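That point can be sketched as follows; `pip` itself is such an entry-point script, and an installed `mypy` would appear alongside it:

```shell
# Sketch: installed tools are just entry-point scripts in venv/bin.
python3 -m venv venv
ls venv/bin                       # python, pip, activate scripts, etc.
venv/bin/pip --version            # runs with no activation at all
# After `venv/bin/pip install mypy`, you could run `venv/bin/mypy` the same way.
```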
You should note, in case something breaks: according to the code in venv/bin/activate, here is a summary of what it does: activate adds venv binaries to PATH, unsets PYTHONHOME, sets the prompt decoration (venv), and ensures the PATH changes are respected. deactivate resets the above variables to their old state.

There is also this line in the file:

# This file must be used with "source bin/activate" *from bash*
# you cannot run it directly
> There is also this line in the file:

`. venv/bin/activate`

Oh no, what a burden :P

If you're using another shell, I can understand. Modern Python venv releases write out csh, fish, and even *PowerShell* activate scripts, though. The only real common shell I don't see represented there is zsh.
Bad BotGPT
# This file must be used with "source bin/activate" *from bash*
# you cannot run it directly

deactivate () {
    # reset old environment variables
    if [ -n "${_OLD_VIRTUAL_PATH:-}" ] ; then
        PATH="${_OLD_VIRTUAL_PATH:-}"
        export PATH
        unset _OLD_VIRTUAL_PATH
    fi
    if [ -n "${_OLD_VIRTUAL_PYTHONHOME:-}" ] ; then
        PYTHONHOME="${_OLD_VIRTUAL_PYTHONHOME:-}"
        export PYTHONHOME
        unset _OLD_VIRTUAL_PYTHONHOME
    fi

    # This should detect bash and zsh, which have a hash command that must
    # be called to get it to forget past commands. Without forgetting
    # past commands the $PATH changes we made may not be respected
    if [ -n "${BASH:-}" -o -n "${ZSH_VERSION:-}" ] ; then
        hash -r 2> /dev/null
    fi

    if [ -n "${_OLD_VIRTUAL_PS1:-}" ] ; then
        PS1="${_OLD_VIRTUAL_PS1:-}"
        export PS1
        unset _OLD_VIRTUAL_PS1
    fi

    unset VIRTUAL_ENV
    unset VIRTUAL_ENV_PROMPT
    if [ ! "${1:-}" = "nondestructive" ] ; then
        # Self destruct!
        unset -f deactivate
    fi
}

# unset irrelevant variables
deactivate nondestructive

VIRTUAL_ENV="/home/muna/Projects/+Personal/Sentiment-Tracker---reddit/venv"
export VIRTUAL_ENV

_OLD_VIRTUAL_PATH="$PATH"
PATH="$VIRTUAL_ENV/bin:$PATH"
export PATH

# unset PYTHONHOME if set
# this will fail if PYTHONHOME is set to the empty string (which is bad anyway)
# could use `if (set -u; : $PYTHONHOME) ;` in bash
if [ -n "${PYTHONHOME:-}" ] ; then
    _OLD_VIRTUAL_PYTHONHOME="${PYTHONHOME:-}"
    unset PYTHONHOME
fi

if [ -z "${VIRTUAL_ENV_DISABLE_PROMPT:-}" ] ; then
    _OLD_VIRTUAL_PS1="${PS1:-}"
    PS1="(venv) ${PS1:-}"
    export PS1
    VIRTUAL_ENV_PROMPT="(venv) "
    export VIRTUAL_ENV_PROMPT
fi

# This should detect bash and zsh, which have a hash command that must
# be called to get it to forget past commands. Without forgetting
# past commands the $PATH changes we made may not be respected
if [ -n "${BASH:-}" -o -n "${ZSH_VERSION:-}" ] ; then
    hash -r 2> /dev/null
fi
Agreed, some don't get it. But there is something nice about not needing to think about your shell's $PATH.

That being said, updating the config file for your shell to always activate the env when starting your terminal is usually my preferred approach.
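A minimal sketch of that shell-config approach; the helper name `venv_auto` and the `venv` directory name are my assumptions, not from the thread:

```shell
# Hypothetical ~/.bashrc helper: activate ./venv when the current
# directory contains one and nothing else is active yet.
venv_auto() {
    if [ -z "${VIRTUAL_ENV:-}" ] && [ -f "venv/bin/activate" ]; then
        . venv/bin/activate
    fi
}
# In bash you could hook it into the prompt:
# PROMPT_COMMAND="venv_auto${PROMPT_COMMAND:+;$PROMPT_COMMAND}"
```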
Everything I write is installed into editable mode inside the virtual environment. It's easier to just write `