
I'm developing several different Python packages with my team. Say that we have ~/src/pkg1, ~/src/pkg2, and ~/src/pkg3. How do we add these to PYTHONPATH without each of us having to manage dot-files?

We could add, say, ~/src/site/sitecustomize.py, which is added once to PYTHONPATH, but is it "guaranteed" that there won't be a global sitecustomize.py?

virtualenv seems like the wrong solution, because we don't want to have to build/install the packages after each change.

  • You have tagged your question with virtualenv; have you tried using it? Commented Jul 27, 2014 at 14:52
  • With virtualenv you don't build / install packages after each change; you just activate the environment (. bin/activate) before running the code (that's assuming you're using symlinked development eggs for your own packages). Commented Jul 27, 2014 at 15:01
  • If someone adds a new package, I'd like to avoid having to run "pip install -e" in ~/src/pkg1, ~/src/pkg2, ~/src/pkg3, etc. for every member of the team. Commented Jul 27, 2014 at 15:28

4 Answers


You have a lot of options...

1) Why not dotfiles?

You could centralize dotfile management in a shared repository, optionally under version control. I use a Dropbox folder named dotfiles, but many people use GitHub or similar services to manage their dotfiles.

If you do that, everyone on your development team is guaranteed to share the same dotfiles. You could then define a dotfile, say .python_proys, which exports the appropriate PATH and PYTHONPATH and which, by convention, every developer sources in their environment.

Suppose pkg1 is only a script, pkg2 is both a script and a module, and pkg3 is only a module. Then .python_proys could look like:

export PATH=$PATH:~/src/pkg1:~/src/pkg2
export PYTHONPATH=$PYTHONPATH:~/src/pkg2:~/src/pkg3

Then every developer has to source this dotfile somewhere, by convention. Each one can do it however they like: one could source the dotfile manually before using the packages, another could source it from .bashrc, .zshenv, or whatever startup file applies to them.

The idea is to have one centralized point of coordination and only one dotfile to maintain: the .python_proys dotfile.

2) Use symlinks

You could define directories in your home, like ~/dist (for modules) and ~/bin (for scripts), put symbolic links there to the specific packages in ~/src/, and have every developer use this PATH and PYTHONPATH setting:

export PATH=$PATH:~/bin
export PYTHONPATH=$PYTHONPATH:~/dist

So, using the same example as in Why not dotfiles?, where pkg1 is only a script, pkg2 is both a script and a module, and pkg3 is only a module, you could symlink like this:

cd ~/bin
ln -s ../src/pkg1
ln -s ../src/pkg2
cd ~/dist
ln -s ../src/pkg2
ln -s ../src/pkg3

Those commands could be run automatically by a script. You could write a bootstrap script, or simply copy the commands into a shell script. Either way, maintain it and centralize it the same way described above.

This way the dotfiles never change; only the script defining the symlinks does.
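
For instance, a minimal sketch of such a bootstrap script, assuming the pkg1/pkg2/pkg3 split from the example above (adjust the lists as packages are added):

#!/bin/sh
# Recreate the symlinks for scripts (~/bin) and modules (~/dist).
mkdir -p ~/bin ~/dist
for pkg in pkg1 pkg2; do        # packages that provide scripts
    ln -sfn ~/src/"$pkg" ~/bin/"$pkg"
done
for pkg in pkg2 pkg3; do        # packages that provide modules
    ln -sfn ~/src/"$pkg" ~/dist/"$pkg"
done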


I suggest looking into creating a name.pth path configuration file as outlined in the site module's documentation. These files can hold multiple paths that will be added to sys.path, and they are easily edited since they're simply text files.
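
For example, a sketch of writing such a file into the per-user site-packages directory (the file name dev-packages.pth is arbitrary; python -m site --user-site prints the directory where per-user .pth files are processed):

# Put a path configuration file in the per-user site-packages directory.
# Every line that names an existing directory gets appended to sys.path.
SITE_DIR=$(python -m site --user-site)
mkdir -p "$SITE_DIR"
printf '%s\n' "$HOME/src/pkg1" "$HOME/src/pkg2" "$HOME/src/pkg3" > "$SITE_DIR/dev-packages.pth"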

2 Comments

  • These are development packages in people's home directories, not in the site directory.
  • The paths put in a .pth file can reference directories anywhere.

First, you don't add a Python module to PYTHONPATH; you add the directory that contains it.

If you want your whole team to work on some Python package, you can install the package as editable with pip's -e option in a virtual environment.

This way you can continue development and you don't have to mess with PYTHONPATH. Keep in mind that the directory of the script you run (or the current working directory, in interactive mode) is always added to sys.path, so unless you have an external requirement, you don't strictly need a virtual environment; just keep the source in your working directory.

Your workflow would be the following:

  1. Create virtual environment
  2. Create a .pth file to add your source directories to the environment's sys.path.
  3. Work as usual.

This would be my preferred option. If you have a standard layout across your projects, you can distribute a customized bootstrap script which will create the environment, and then adjust the PYTHONPATH automatically. Share this bootstrap script across the team, or add it as part of the source repository.
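
A minimal sketch of such a bootstrap script, assuming the ~/src/pkg1..pkg3 layout from the question (the venv location and the .pth file name are just placeholders):

#!/bin/sh
# Create a shared-layout virtual environment and extend its sys.path with a
# .pth file pointing at the in-development packages (no per-user dotfiles needed).
python3 -m venv ~/src/venv
SITE_DIR=$(~/src/venv/bin/python -c 'import sysconfig; print(sysconfig.get_paths()["purelib"])')
printf '%s\n' ~/src/pkg1 ~/src/pkg2 ~/src/pkg3 > "$SITE_DIR/dev-packages.pth"

If the packages have setup.py or pyproject.toml files, the script could instead run ~/src/venv/bin/pip install -e ~/src/pkg1 (and so on) to get the same editable behaviour.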

1 Comment

  • Each package has modules: ~/src/pkg1/mod1/__init__.py, ~/src/pkg2/mod2/__init__.py, etc. The directories ~/src/pkg1, etc. therefore need to be on the path for mod1, mod2, etc. to be importable. With pip install -e, every coworker would have to run it for every project; I'm trying to avoid that.

I assume that your other modules are at a predictable path relative to the script being run ($0).

We can compute the absolute path of the script's directory:

import os, sys
script_dir = os.path.dirname(os.path.realpath(sys.argv[0]))

then derive your module path from it and append it:

sys.path.append(os.path.join(script_dir, "modules"))  # hypothetical relative location; adjust to your layout
