this post was submitted on 23 Jun 2024
35 points (97.3% liked)


So, I have a Python script I'd like to run from time to time from the CLI (on Linux), and it resides inside a venv. What's the recommended/intended way to do this?
Write a wrapper shell script that activates the virtual environment, runs the Python script, and deactivates the venv again, and put it inside a $PATH-accessible directory? This seems a bit convoluted, but I can't think of a better way.

top 22 comments
[–] [email protected] 26 points 6 months ago (3 children)

Use venv/bin/python app.py to run it.

[–] [email protected] 5 points 6 months ago (1 children)

That works nicely. Thanks πŸ‘

[–] [email protected] 2 points 6 months ago (1 children)

I use my own Zsh project (zpy) to manage venvs, stored like ~/.local/share/venvs/HASH-OF-PROJECT-PATH/venv, so I use zpy's vpy function to launch a script with its associated Python executable ad hoc, or add a full-path shebang to the script with zpy's vpyshebang function.

vpy and vpyshebang in the docs
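
For illustration, a script with a full-path shebang ends up looking something like this (the path below is just a placeholder following the layout above, not necessarily the exact line vpyshebang writes):

#!/home/user/.local/share/venvs/HASH-OF-PROJECT-PATH/venv/bin/python
# Hypothetical path above: the shebang points straight at the venv's own
# interpreter, so after chmod +x the script runs without activating anything.
import sys

print(sys.executable)  # prints the venv interpreter actually running this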

If anyone else is a Zsh fan and has any questions, I'm more than happy to answer or demo.

[–] [email protected] 1 points 6 months ago (1 children)

@Andy The convention is to place the venv in a .venv/ subfolder. Follow the convention!

This is shell-agnostic.

Learn pyenv and minimize shell scripts (shell should only live within a Makefile).

Shell scripts within Python packages are deprecated.

[–] [email protected] 2 points 6 months ago (1 children)

The convention

That's one convention. I don't like it, I prefer to keep my venvs elsewhere. One reason is that it makes it simpler to maintain multiple venvs for a single project, using a different Python version for each, if I ever want to. It shouldn't matter to anyone else, as it's my environment, not some aspect of the shared repo. If I ever needed it there for some reason, I could always ln -s $VIRTUAL_ENV .venv.

Learn pyenv

I have used pyenv. It's fine. These days I use mise instead, which I prefer. But neither of them dictate how I create and store venvs.

Shell scripts within Python packages are deprecated

I don't understand how what you're referencing relates to my comment.

[–] [email protected] 1 points 6 months ago (1 children)

Multiple venvs for different Python versions sounds exactly like what tox does

Then set up a GitHub Action that does nightly builds, which will catch issues caused by changes that were only tested against one Python version or on one platform

py313 is a good version to test against cuz many modules were removed or deprecated, and some APIs changed

Good luck. Hope some of my advice is helpful

[–] [email protected] 2 points 6 months ago (2 children)

Thanks, yes, I use nox and github actions for automated environments and testing in my own projects, and tox instead of nox when it's someone else's project. But for ad hoc, local and interactive multiple environments, I don't.

[–] [email protected] 1 points 1 month ago (1 children)

Are you using GitHub Actions locally? I feel silly making GH Actions and workflows when only GitHub runs them.

[–] [email protected] 1 points 1 month ago (1 children)

No, I don't use GHA locally, but the actions are defined to run the same things that I do run locally (e.g. invoke nox). I try to keep the GHA-exclusive boilerplate to a minimum. Steps can be like:

- name: fetch code
  uses: actions/checkout@v4

- uses: actions/setup-python@v5
  with:
    allow-prereleases: true
    python-version: |
      3.13
      3.12
      3.11
      3.10
      3.9
      3.8
      3.7

- run: pipx install nox

- name: run ward tests in nox environment
  run: nox -s test test_without_toml combine_coverage --force-color
  env:
    PYTHONIOENCODING: utf-8

- name: upload coverage data
  uses: codecov/codecov-action@v4
  with:
    files: ./coverage.json
    token: ${{ secrets.CODECOV_TOKEN }}

Sometimes if I want a higher level interface to tasks that run nox or other things locally, I use taskipy to define them in my pyproject.toml, like:

[tool.taskipy.tasks]
fmt = "nox -s fmt"
lock = "nox -s lock"
test = "nox -s test test_without_toml typecheck -p 3.12"
docs = "nox -s render_readme render_api_docs"

[–] [email protected] 1 points 1 month ago (1 children)

Thanks for the introduction to taskipy. I think if I need macros, a Makefile is the way to go. It supports running targets in parallel, and I like performing a check to ensure the virtual environment is activated, or the command won't run.

.ONESHELL:
.DEFAULT_GOAL := help
SHELL := /bin/bash
APP_NAME := logging_strict

# virtual environment check: issue a warning if none is activated
# not activated: is_venv is empty
# activated:     is_venv = 1
ifeq ($(VIRTUAL_ENV),)
$(warning virtualenv not activated)
is_venv =
else
is_venv = 1
VENV_BIN := $(VIRTUAL_ENV)/bin
VENV_BIN_PYTHON := python3
PY_X_Y := $(shell $(VENV_BIN_PYTHON) -c 'import platform; t_ver = platform.python_version_tuple(); print(".".join(t_ver[:2]));')
endif

.PHONY: mypy
mypy:					## Static type checker (in strict mode)
ifeq ($(is_venv),1)
	@$(VENV_BIN_PYTHON) -m mypy -p $(APP_NAME)
endif

make mypy without the virtualenv activated will print a warning message explaining why it's not working!

[–] [email protected] 1 points 1 month ago* (last edited 1 month ago)

Sure, but nox is the closer counterpart for in-venv-task definitions. List "sessions" with -l, pick specific sessions to run with -s.

import nox
from nox.sessions import Session

nox.options.reuse_existing_virtualenvs = True
APP_NAME = 'logging_strict'

@nox.session(python='3.12')
def mypy(session: Session):
    """Static type checker (in strict mode)"""
    session.install('-U', 'mypy', '.')
    session.run('mypy', '-p', APP_NAME, *session.posargs)

Unfortunately it doesn't currently do any parallel runs, but if anyone wants to track/encourage/contribute in that regard, see nox#544.

[–] [email protected] 1 points 1 month ago

Thanks for the heads-up on nox. The syntax seems like tox meets pytest.

[–] [email protected] 2 points 6 months ago (1 children)

This. I've experimented with pex and one or two other kinds of executable Python wrappers before, and they suck. Just do as lakeeffect says.

[–] [email protected] 1 points 6 months ago

Yep. This is the way.

[–] [email protected] 1 points 6 months ago (2 children)

I think the path to the venv should be absolute, right?

[–] [email protected] 1 points 6 months ago

Yeah, for the most part, but it really depends on what you're trying to do specifically.

[–] [email protected] 1 points 6 months ago

Just activate the venv and then put it out of your mind. You can activate it with either a relative or absolute path; it doesn't matter which.

[–] [email protected] 3 points 6 months ago

I use pipenv and pyenv together. This works pretty well, also in cron jobs. Just add pipenv run python script.py to the cron table.

[–] [email protected] 2 points 1 month ago

As someone's new comments just brought me back to this post, I'll point out that these days there's another good option: uv run.
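
As a sketch of why that's convenient (the dependency and URL below are just stand-ins): uv run also understands PEP 723 inline script metadata, so a standalone script can declare its own dependencies in a comment block:

# /// script
# requires-python = ">=3.11"
# dependencies = ["requests"]
# ///
import requests

# uv creates (and caches) an environment with the declared dependencies,
# then runs the script with it.
print(requests.get("https://example.com").status_code)

Then uv run script.py takes care of the venv for you; no manual activation or wrapper script needed.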

[–] [email protected] 2 points 6 months ago

Just in case this comment didn't make it explicitly clear, you can just invoke the python binary inside your venv directly and it will automatically locate all the libraries that are installed in your virtual environment.

To show how this works, you can look at the sys.path variable to see which paths python will search for modules when you run import statements. Try running python3 -c 'import sys; print(sys.path)' using your system python, and you will only see system python library paths. Then, try running it again after replacing python3 with the full path to the python3 binary in your venv, and you will see an additional entry in the output with the lib directory in your venv, which shows that python will also look there for modules when an import statement is executed.
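
As a rough illustration of what that looks like from inside a script (exact paths will differ per machine):

import sys

# The interpreter actually running this script; launched via venv/bin/python,
# it points inside the venv.
print(sys.executable)

# Inside a venv, sys.prefix differs from sys.base_prefix.
print(sys.prefix == sys.base_prefix)  # False in a venv

# The venv's site-packages directory appears on the module search path.
print([p for p in sys.path if "site-packages" in p])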

[–] [email protected] 2 points 6 months ago

You could package it and install it with pipx.

[–] santa -2 points 6 months ago

Does it need access to anything local? If not, you could run it as an AWS Lambda on a schedule.
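
If you went that route, the entry point is just a function with Lambda's (event, context) signature; a minimal sketch, where do_work stands in for whatever the script currently does:

def do_work():
    # placeholder for the existing script's logic
    print("running scheduled job")

def handler(event, context):
    # invoked by the scheduled trigger (e.g. an EventBridge rule)
    do_work()
    return {"status": "ok"}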