pipenv/pip install from git commit/revision id

I would like to install a package from a git repository, specifying a commit id, using pipenv (I believe it would be very similar with plain pip).
So far I tried:
pipenv install "git+ssh://git@bitbucket.org/<username>/<repository>.git/<commit_id>#egg=mypackage"
which appends the following line to the Pipfile and produces no errors:
<package-name> = {git = "ssh://git@bitbucket.org/<username>/<repository>.git/<commit_id>"}
If I import the package (import mypackage), it is found, but its dependencies are missing.
The setup.py of mypackage looks like:
import setuptools

with open("README.md", "r") as readme:
    long_description = readme.read()

with open("./requirements.txt", "r") as fh:
    requirements = fh.readlines()

setuptools.setup(
    name='mypackage',
    url='https://bitbucket.org/<username>/<repository>',
    packages=setuptools.find_packages(),
    install_requires=[req for req in requirements if req[0] not in ["#", "-"]],
)

Just figured it out by reading the pip docs on VCS support: the revision id should be specified after an @:
pipenv install "git+ssh://git@bitbucket.org/<username>/<repository>.git@<commit_id>#egg=<package_name>"
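Equivalently, the pin can be written straight into the Pipfile using the ref key (a sketch using the same placeholders as above; ref accepts a branch name, tag, or commit hash):

mypackage = {git = "ssh://git@bitbucket.org/<username>/<repository>.git", ref = "<commit_id>"}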

Related

Installing local package with "pip install -e ." does not update installation

I am working on a small package locally and want to install it in editable mode, so that when I make changes to it I don't have to reinstall after every change. When I install using pip install . everything works as expected; however, when I change it to pip install -e ., the package does not update the modules contained in utils.py on install, nor when the modules are changed. Here is my __init__.py:
from .utils import *
and my setup.py:
from setuptools import setup, find_packages

with open("README.md", "r", encoding="utf-8") as fh:
    long_description = fh.read()

setup(
    name="mypackage",
    version="0.1.4",
    packages=find_packages(include=["mypackage"]),
    include_package_data=True,
    author="myname",
    description="placeholder",
    long_description=long_description,
    url="github.here",
    license="GNU GPLv3",
    install_requires=[
        "mypackage",
        "pandas",
        "matplotlib",
        "numpy",
        "ipykernel",
    ],
)
I have already tried changing to packages=find_packages() and to install_requires=["pandas", "matplotlib", "numpy", "ipykernel"], which did not fix the issue.
Edit: here is the project layout:
mypackage
|---mypackage
|---|---__pycache__
|---|---__init__.py
|---|---utils.py
|---mypackage.egg-info
|---notebooks
|---|---test.ipynb
|---README.md
|---.gitattributes
|---setup.py
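A quick way to check which copy of the package Python is actually importing (a diagnostic sketch, not from the original post; with an editable install the path should point into the source tree rather than site-packages):

python -c "import mypackage; print(mypackage.__file__)"  # where the import resolves
python -m pip show mypackage  # what pip thinks is installed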

Unable to use locally built python package in development mode

I'm trying to spin up a super simple package as a proof of concept, and I can't see what I'm missing.
My aim is to be able to do the following:
python3
>>> import mypackage
>>> mypackage.add2(2)
4
GitHub link
I created a public repo to reproduce the issue:
git clone https://github.com/OliverFarren/testPackage
Problem
I have a basic file structure as follows:
src/
    mypackage/
        __init__.py
        mymodule.py
setup.cfg
setup.py
pyproject.toml
setup.cfg is pretty boilerplate, taken from the standard packaging tutorial.
setup.py is just to allow pip install in editable mode:
import setuptools
setuptools.setup()
I ran the following commands at the top-level directory in my PyCharm virtual env:
python3 -m pip install --upgrade build
python3 -m build
That created my dist and build directories and the mypackage.egg-info directory, so now the project looks like this:
testpackage
    build/
        bdist.linux-x86_64/
    dist/
        mypackage-0.1.0.tar.gz
        mypackage-0.1.0-py3-none-any.whl
    src/
        mypackage/
            mypackage.egg-info/
            __init__.py
            mymodule.py
    setup.cfg
    setup.py
    pyproject.toml
I've then tried installing the package as follows:
sudo pip3 install -e .
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing wheel metadata ... done
Installing collected packages: mypackage
Running setup.py develop for mypackage
Successfully installed mypackage
Which I think should have installed it. Except that when I try to import the package, I get a ModuleNotFoundError.
I'm wondering whether this is a permissions issue of some sort. When I try:
sudo pip3 list
pip3 list
I notice I'm getting different outputs; I can see my package present in the list and in my sys.path:
~/testpackage/src/mypackage
I just don't understand what I'm missing here. Any advice would be appreciated!
OK, so I found the issue. Posting the solution and leaving the GitHub repo live, with the fix, in case anyone else has this issue.
It turns out my setup.cfg wasn't boilerplate after all.
Here was my incorrect code:
[metadata]
# replace with your username:
name = mypackage
author = Oliver Farren
version = 0.1.0
description = Test Package
classifiers =
    Programming Language :: Python :: 3
    License :: OSI Approved :: MIT License
    Operating System :: OS Independent

[options]
package_dir =
    = src/mypackage
packages = find:
python_requires = >=3.6

[options.packages.find]
where = src/mypackage
src/mypackage should be src; it was looking inside the package for packages.
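Applying that fix, the options sections become:

[options]
package_dir =
    = src
packages = find:
python_requires = >=3.6

[options.packages.find]
where = src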
A key step in debugging this issue was checking the mypackage.egg-info files. SOURCES.txt contains a list of all the files in the built package, and I could clearly see that in the incorrect build src/mypackage/mymodule.py and src/mypackage/__init__.py were missing. So the package was correctly installed by pip, but being empty it made for a very confusing error message.

How to make own local package importable the same way as pip installed packages?

How can I save my own package to a conda environment, so it is importable from any location once the environment is activated?
When we conda activate my_env and pip install package, the package can be imported no matter where file.py is located. What can I do to have my own_local_package importable the same way once my_env is activated?
You can use pip locally to install packages and then import mypackage the same way you do with any other module. The correct approach is:
python -m pip install -e /path_to_package/mypackage/
python -m ensures you are using the pip from the same Python installation you are currently using.
-e makes it editable, i.e. import mypackage will pick up your changes without reinstalling, instead of using a cached copy.
mypackage must contain an __init__.py file and a basic setup.py (or a pyproject.toml).
the package structure must be like this:
mypackage/
    setup.py
    mypackage/
        __init__.py
minimal setup.py
from setuptools import find_packages, setup

setup(
    name='mypackage',  # Required
    version='0.0.1',  # Required
    packages=find_packages(),  # Required
)
for a more elaborate package:
the package structure must be like this:
mypackage/
    setup.py
    mypackage/
        src/
            __init__.py
            __main__.py
            additional python files
            ...
minimal setup.py
from setuptools import find_packages, setup

setup(
    name='mypackage',  # Required
    version='0.0.1',  # Required
    packages=find_packages(where="src"),  # Required
)
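Note that find_packages(where=...) only controls where setuptools searches for packages; for the installed distribution to be importable, a package_dir mapping is normally needed as well. A minimal sketch of the conventional src layout (an assumption on my part: it nests the package directory under src/, slightly differently from the tree above):

mypackage/
    setup.py
    src/
        mypackage/
            __init__.py

from setuptools import find_packages, setup

setup(
    name='mypackage',  # Required
    version='0.0.1',  # Required
    package_dir={'': 'src'},  # map the root package namespace to src/
    packages=find_packages(where='src'),  # Required
)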

Install dependencies of used namespaced packages

Let's say I have the following package structure:
package/
    mynamespace-subpackage-a/
        setup.py
        mynamespace/
            subpackage_a/
                __init__.py
    mynamespace-subpackage-b/
        setup.py
        mynamespace/
            subpackage_b/
                __init__.py
                module_b.py
with setup.py in package a:
from setuptools import find_packages, setup

setup(
    name='mynamespace-subpackage-a',
    ...
    packages=find_packages(),
    namespace_packages=['mynamespace'],
    install_requires=['pandas']
)
and package b:
from setuptools import find_packages, setup

setup(
    name='mynamespace-subpackage-b',
    ...
    packages=find_packages(),
    namespace_packages=['mynamespace'],
    install_requires=[]
)
package b uses package a, but it does not reference the pandas library itself. So pandas is not listed in b's install_requires, but it should still be installed when pip install . is executed inside package b, and package a should be installed along with it.
What should be added in the second setup file to achieve this, and is it even possible? Or should pandas be in the requirement list of package b as well?
I would suspect something like:
install_requires=['mynamespace.subpackage_a']
From what I understood from the question, I believe it should be:
package/mynamespace-subpackage-b/setup.py:

#...
setup(
    name='mynamespace-subpackage-b',
    # ...
    install_requires=[
        'mynamespace-subpackage-a',
        # ...
    ],
)
This obviously assumes that pip can find a when installing b, meaning a distribution of a should be published on some kind of index (such as PyPI). If that is not possible, then maybe one of the following alternatives could help:
Place distributions of a and b (wheel or source distribution) in a local directory, and then use pip's --find-links option (see the sketch after this list): pip install --find-links=path/to/distributions mynamespace-subpackage-b
Use a direct reference file URL as seen in PEP 440: install_requires=['a @ file:///path/to/a.whl']
Use a direct remote URL (a VCS such as git would work); the URL could point to a private repository or to the local file system: install_requires=['mynamespace-subpackage-a @ git+file:///path/to/mynamespace-subpackage-a@master']; this assumes setup.py is at the root of the repository.
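For the --find-links route, the workflow could look like this (a sketch; directory names are placeholders):

cd package/mynamespace-subpackage-a
python -m pip wheel --no-deps --wheel-dir=../dist .
cd ../mynamespace-subpackage-b
python -m pip install --find-links=../dist .

pip then resolves mynamespace-subpackage-a from the local ../dist directory instead of downloading it from an index.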

Windows with Python 3.5: What to do since pip won't work with dependency_links?

I need to install a specific public package for one of my projects. I tried to do:
from setuptools import setup, find_packages

setup(
    name='aaa',
    install_requires=['hyperas==0.3'],
    dependency_links=['git+https://github.com/maxpumperla/hyperas.git#egg=hyperas-0.3']
)
But I gave up because I couldn't get it to work, and dependency_links is deprecated. What is the correct way to tell pip to download a package from a particular URL rather than from PyPI?
The URL I need is https://github.com/maxpumperla/hyperas, rather than https://pypi.python.org/pypi/hyperas. Is this possible?
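dependency_links has since been removed from pip entirely. The modern replacement is a direct reference in install_requires (PEP 440), the same mechanism as in the previous answer; a minimal sketch (note that a package declaring direct URL requirements cannot itself be uploaded to PyPI):

from setuptools import setup

setup(
    name='aaa',
    install_requires=[
        # direct URL requirement; replaces dependency_links
        'hyperas @ git+https://github.com/maxpumperla/hyperas.git',
    ],
)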
