Installing local package with "pip install -e ." does not update installation - pip

I am working on a small package locally and want to install it in editable mode so that I don't have to reinstall after every change. When I install with pip install . everything works as expected; however, when I switch to pip install -e ., the package does not pick up the contents of utils.py on install, nor when they are later changed. Here is my __init__.py:
from .utils import *
and my setup.py:
from setuptools import setup, find_packages

with open("README.md", "r", encoding="utf-8") as fh:
    long_description = fh.read()

setup(
    name="mypackage",
    version="0.1.4",
    packages=find_packages(include=["mypackage"]),
    include_package_data=True,
    author="myname",
    description="placeholder",
    long_description=long_description,
    url="github.here",
    license="GNU GPLv3",
    install_requires=[
        "mypackage",
        "pandas",
        "matplotlib",
        "numpy",
        "ipykernel",
    ],
)
I have already tried changing to packages=find_packages() and install_requires=["pandas", "matplotlib", "numpy", "ipykernel"], which did not fix the issue.
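For reference, a sketch of that variant (identical metadata to the setup.py above, with find_packages() and the package removed from its own install_requires):

from setuptools import setup, find_packages

with open("README.md", "r", encoding="utf-8") as fh:
    long_description = fh.read()

setup(
    name="mypackage",
    version="0.1.4",
    packages=find_packages(),   # instead of find_packages(include=["mypackage"])
    include_package_data=True,
    author="myname",
    description="placeholder",
    long_description=long_description,
    url="github.here",
    license="GNU GPLv3",
    install_requires=[          # "mypackage" itself removed from the list
        "pandas",
        "matplotlib",
        "numpy",
        "ipykernel",
    ],
)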
edit: here is the project file structure:
mypackage
|---mypackage
|---|---__pycache__
|---|---__init__.py
|---|---utils.py
|---mypackage.egg-info
|---notebooks
|---|---test.ipynb
|---README.md
|---.gitattributes
|---setup.py

Related

Unable to use locally built python package in development mode

I'm trying to spin up a super simple package as a proof of concept and I can't see what I'm missing.
My aim is to be able to do the following:
$ python3
>>> import mypackage
>>> mypackage.add2(2)
4
Github link
I created a public repo to reproduce the issue here
git clone https://github.com/OliverFarren/testPackage
Problem
I have a basic file structure as follows:
src/
    mypackage/
        __init__.py
        mymodule.py
setup.cfg
setup.py
pyproject.toml
setup.cfg is pretty much boilerplate from here.
setup.py is just to allow pip install in editable mode:
import setuptools
setuptools.setup()
I ran the following commands at the top-level directory in my PyCharm virtual env:
python3 -m pip install --upgrade build
python3 -m build
That created my dist and build directories and the mypackage.egg-info directory, so now the project looks like this:
testpackage/
    build/
        bdist.linux-x86_64/
    dist/
        mypackage-0.1.0.tar.gz
        mypackage-0.1.0-py3-none-any.whl
    src/
        mypackage/
            mypackage.egg-info/
            __init__.py
            mymodule.py
    setup.cfg
    setup.py
    pyproject.toml
I've then tried to install the package as follows:
sudo pip3 install -e .
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing wheel metadata ... done
Installing collected packages: mypackage
Running setup.py develop for mypackage
Successfully installed mypackage
I think that should have installed it, except that when I try to import the package I get a ModuleNotFoundError.
I'm wondering whether this is a permissions issue of some sort. When I run:
sudo pip3 list
pip3 list
I notice I'm getting different outputs. I can see my package present in the list, and in my sys.path:
~/testpackage/src/mypackage
I just don't understand what I'm missing here. Any advice would be appreciated!
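One quick check in this situation (a debugging sketch, not something from the original post) is pip show, which prints the Location each installation actually resolves to:

python3 -m pip show mypackage        # the pip of the interpreter you normally run
sudo python3 -m pip show mypackage   # the pip the sudo install used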
OK, so I found the issue. Posting the solution and leaving the GitHub repo live, with the fix, in case anyone else runs into this.
It turns out my setup.cfg wasn't boilerplate after all.
Here was my incorrect code:
[metadata]
# replace with your username:
name = mypackage
author = Oliver Farren
version = 0.1.0
description = Test Package
classifiers =
    Programming Language :: Python :: 3
    License :: OSI Approved :: MIT License
    Operating System :: OS Independent

[options]
package_dir =
    = src/mypackage
packages = find:
python_requires = >=3.6

[options.packages.find]
where = src/mypackage
src/mypackage should be src; as written, it was looking inside the package for packages.
A key step in debugging this issue was checking the mypackage.egg-info files. SOURCES.txt contains a list of all the files in the built package, and in the incorrect build I could clearly see that src/mypackage/mymodule.py and src/mypackage/__init__.py were missing. So the package was correctly installed by pip, but being empty it made for a very confusing error message.
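For reference, the corrected sections would then read as follows (a sketch based on the fix described above, with the rest of the file unchanged):

[options]
package_dir =
    = src
packages = find:
python_requires = >=3.6

[options.packages.find]
where = src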

pipenv/pip install from git commit/revision id

I would like to install a package from a git repository, specifying a commit id, using pipenv (I believe it should be very similar if I were using pip).
So far I tried:
pipenv install "git+ssh://git#bitbucket.org/<username>/<repository>.git/<commit_id>#egg=mypackage"
which appends the following line to the Pipfile and produces no errors:
<package-name> = {git = "ssh://git@bitbucket.org/<username>/<repository>.git/<commit_id>"}
If I import the package with import mypackage it is found, but its dependencies are missing.
The setup.py of mypackage looks like:
import setuptools

with open("README.md", "r") as readme:
    long_description = readme.read()

with open("./requirements.txt", "r") as fh:
    requirements = fh.readlines()

setuptools.setup(
    name='mypackage',
    url='https://bitbucket.org/<username>/<repository>',
    packages=setuptools.find_packages(),
    install_requires=[req for req in requirements if req[0] not in ["#", "-"]],
)
Just figured it out by reading this: the revision id should be specified after a #:
pipenv install "git+ssh://git@bitbucket.org/<username>/<repository>.git#<commit_id>#egg=<package_name>"

How to make own local package importable the same way as pip installed packages?

How can I save my own package to conda environment so it is importable from any location once environment is activated?
When we conda activate my_env and pip install package the package can be imported no matter what the location of file.py is. What can I do to have my own_local_package importable the same way once my_env is activated?
You can use pip locally to install packages and then use import mypackage the same way you do with any other module. The correct approach is:
python -m pip install -e /path_to_package/mypackage/
python -m ensures you are using the pip from the same Python installation you are currently running.
-e makes the install editable, i.e. import mypackage will pick up your changes instead of using a cached copy.
mypackage must contain an __init__.py file and a basic setup.py (or a pyproject.toml file).
the package structure must be like this:
mypackage/
    setup.py
    mypackage/
        __init__.py
minimal setup.py
from setuptools import find_packages, setup

setup(
    name='mypackage',  # Required
    version='0.0.1',  # Required
    packages=find_packages(),  # Required
)
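Once installed, a quick sanity check (a sketch, assuming the minimal package above) is to confirm that the import resolves into the source tree rather than site-packages:

python -m pip install -e ./mypackage
python -c "import mypackage; print(mypackage.__file__)"   # should print a path inside your source tree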
for a more elaborate package:
the package structure must be like this:
mypackage/
    setup.py
    mypackage/
        src/
            __init__.py
            __main__.py
            additional python files
            ...
minimal setup.py
from setuptools import find_packages, setup

setup(
    name='mypackage',  # Required
    version='0.0.1',  # Required
    packages=find_packages(where="src"),  # Required (a relative path, not "/src")
)

ModuleNotFoundError for 'modin' even though it is installed by poetry

On the import modin.pandas as modin_pd line I get ModuleNotFoundError: No module named 'modin'. I am using poetry and JupyterLab. If I type !poetry add modin in a cell, I get a ValueError saying Package modin is already present.
So it cannot install modin because it is already installed, but it cannot import it either. Is there an obvious solution that I am missing?
The pip freeze command also shows modin as installed. I also tried installing it via pip install, but in the end nothing let me import this module.
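One check that often explains this kind of mismatch (a sketch, not from the original question) is to print which interpreter the notebook kernel is actually running, since poetry installs into its own virtualenv:

import sys
print(sys.executable)   # should point into the poetry virtualenv, not a system Python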
The problem may be this one: KeyError: 'CPU'.
It can be solved by running pip install psutil.

How to make cython a requirement for a pip install?

When creating a Python package and uploading it to PyPI, pip will automatically install the requirements listed in the setup.py file under install_requires, e.g.
from distutils.core import setup

setup(
    name='a_package',
    packages=['a_package'],
    install_requires=['another_package'],
)
When the package has a cython extension (and .pyx files instead of .c/.cpp files), the setup.py file will need to import cython to create an installable extension, e.g.
from distutils.core import setup
from distutils.extension import Extension
from Cython.Distutils import build_ext

setup(
    name='a_package',
    packages=['a_package'],
    install_requires=['another_package'],
    cmdclass={'build_ext': build_ext},
    ext_modules=[Extension('the_extension', sources=['a_file.pyx'])],
)
But since Cython is imported before the setup() call runs, installing this package through pip from source (rather than from a wheel) downloaded from PyPI fails: cython cannot be imported, because pip has not yet reached the part that declares the requirements.
I'm wondering what can be done to ensure that a pip install of this package from PyPI installs cython before it is imported. Adding a requirements.txt with cython does not seem to add an automatically installed requirement for files downloaded from PyPI.
Now, I realize it's possible to just pip install cython before pip install thispackage, but I'm wondering if there's a better fix that would allow installing the package along with cython directly from PyPI when it's not possible to run an additional command (without resorting to uploading the .c files and adjusting the setup.py file to use them instead of the .pyx).
What you're describing is a "build time dependency", and this is precisely the use case "PEP 518 -- Specifying Minimum Build System Requirements for Python Projects" was created for.
You can specify cython as a build-time dependency by adding a pyproject.toml file like:
[build-system]
requires = ["cython"]
Then when installing your package with a modern version of pip (or another PEP 518 compatible installer), cython will be installed into the build environment before your setup.py script is run.
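Since the requires list replaces the default build requirements, it usually needs to list setuptools (and wheel) as well; a slightly fuller sketch of the file, assuming the standard setuptools backend:

[build-system]
requires = ["setuptools", "wheel", "cython"]
build-backend = "setuptools.build_meta"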
