Let's say I have the following package structure:
package/
    mynamespace-subpackage-a/
        setup.py
        mynamespace/
            subpackage_a/
                __init__.py
    mynamespace-subpackage-b/
        setup.py
        mynamespace/
            subpackage_b/
                __init__.py
                module_b.py
with setup.py in package a:
from setuptools import find_packages, setup

setup(
    name='mynamespace-subpackage-a',
    ...
    packages=find_packages(),
    namespace_packages=['mynamespace'],
    install_requires=['pandas']
)
and package b:
from setuptools import find_packages, setup

setup(
    name='mynamespace-subpackage-b',
    ...
    packages=find_packages(),
    namespace_packages=['mynamespace'],
    install_requires=[]
)
Package b uses package a, but it has no references to the pandas library itself, so pandas is not listed in its install_requires. Still, pandas should be installed when pip install . is executed inside package b, and package a should be pulled in along with it.
What should be added in the second setup file to achieve this, and is it even possible? Or should pandas be in the requirements list of package b as well?
I would suspect something like:
install_requires = ['mynamespace.subpackage_a']
From what I understood of the question, I believe it should be:
package/mynamespace-subpackage-b/setup.py:

#...
setup(
    name='mynamespace-subpackage-b',
    # ...
    install_requires=[
        'mynamespace-subpackage-a',
        # ...
    ],
)
This obviously assumes that pip can find a when installing b, meaning a distribution of a should be published on some kind of index (such as PyPI, for example). If that is not possible, then maybe one of the following alternatives could help:
- Place distributions of a and b (wheel or source distribution) in a local directory, and then use pip's --find-links option (doc): pip install --find-links=path/to/distributions mynamespace-subpackage-b (see the sketch just below)
- Use a direct reference file URL, as seen in PEP 440: install_requires=['a @ file:///path/to/a.whl']
- Use a direct remote URL (a VCS URL such as git would work); the URL could point to a private repository or to the local file system: install_requires=['mynamespace-subpackage-a @ git+file:///path/to/mynamespace-subpackage-a@master']. This assumes setup.py is at the root of the repository.
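For that first alternative, the workflow could look something like this (a sketch, assuming the built distributions are dropped into a shared dists/ directory next to the two projects):

# build a wheel for package a (run inside package/mynamespace-subpackage-a/)
python setup.py bdist_wheel --dist-dir ../dists

# install package b (run from package/); pip resolves mynamespace-subpackage-a
# from the local dists/ directory instead of an index
pip install --find-links=dists ./mynamespace-subpackage-b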
Related
I'm trying to spin up a super simple package as a proof of concept and I can't see what I'm missing.
My aim is to be able to do the following:

$ python3
>>> import mypackage
>>> mypackage.add2(2)
4
GitHub link
I created a public repo to reproduce the issue here:
git clone https://github.com/OliverFarren/testPackage
Problem
I have a basic file structure as follows:
src/
    mypackage/
        __init__.py
        mymodule.py
setup.cfg
setup.py
pyproject.toml
setup.cfg is pretty much boilerplate from here
setup.py is just to allow pip install in editable mode:
import setuptools
setuptools.setup()
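pyproject.toml is presumably just the standard build-system declaration from the same tutorial:

[build-system]
requires = ["setuptools>=42", "wheel"]
build-backend = "setuptools.build_meta"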
I ran the following commands from the top-level directory in my PyCharm virtual env:
python3 -m pip install --upgrade build
python3 -m build
That created my dist and build directories and the mypackage.egg-info directory, so now the project looks like this:

testpackage/
    build/
        bdist.linux-x86_64/
    dist/
        mypackage-0.1.0.tar.gz
        mypackage-0.1.0-py3-none-any.whl
    src/
        mypackage/
            mypackage.egg-info/
            __init__.py
            mymodule.py
    setup.cfg
    setup.py
    pyproject.toml
I've then tried installing the package as follows:
sudo pip3 install -e .
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing wheel metadata ... done
Installing collected packages: mypackage
Running setup.py develop for mypackage
Successfully installed mypackage
Which I think should have installed it. Except when I try to import the package, I get a ModuleNotFoundError.
I'm wondering whether this is a permissions issue of some sort. When I try:
sudo pip3 list
pip3 list
I notice I'm getting different outputs. I can see my package present in the list, and in my sys.path I can see:
~/testpackage/src/mypackage
I just don't understand what I'm missing here. Any advice would be appreciated!
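One way to check whether sudo pip3 and pip3 target different Python installations (a hypothetical debugging session, not from the original post):

# each command prints the path of the Python installation it belongs to
sudo pip3 --version
pip3 --version

# run pip through the interpreter you actually use, then test the import
python3 -m pip show mypackage
python3 -c "import mypackage; print(mypackage.__file__)"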
OK, so I found the issue. Posting the solution and leaving the GitHub repo live, with the fix, in case anyone else has this issue.
It turns out my setup.cfg wasn't boilerplate after all.
Here was my incorrect code:
[metadata]
# replace with your username:
name = mypackage
author = Oliver Farren
version = 0.1.0
description = Test Package
classifiers =
    Programming Language :: Python :: 3
    License :: OSI Approved :: MIT License
    Operating System :: OS Independent

[options]
package_dir =
    = src/mypackage
packages = find:
python_requires = >=3.6

[options.packages.find]
where = src/mypackage
src/mypackage should just be src: it was looking inside the package itself for packages.
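So the corrected sections read:

[options]
package_dir =
    = src
packages = find:
python_requires = >=3.6

[options.packages.find]
where = src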
A key step in debugging this issue was checking the mypackage.egg-info files. SOURCES.txt contains a list of all the files in the built package, and I could clearly see that in the incorrect build src/mypackage/mymodule.py and src/mypackage/__init__.py were missing. So the package was correctly installed by pip, but being empty it made for a very confusing error message.
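After the fix, SOURCES.txt should list the package's module files; an illustrative excerpt based on the tree above:

pyproject.toml
setup.cfg
setup.py
src/mypackage/__init__.py
src/mypackage/mymodule.py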
How can I save my own package to a conda environment so that it is importable from any location once the environment is activated?
When we conda activate my_env and pip install package, the package can be imported no matter what the location of file.py is. What can I do to have my own_local_package importable the same way once my_env is activated?
You can use pip to install packages locally and then use import mypackage the same way you do with any other module. The correct approach is:
python -m pip install -e /path_to_package/mypackage/
python -m ensures you are using the pip from the same Python installation you are currently using.
-e makes the install editable, i.e. import mypackage will pick up your changes after you edit the source, instead of using a cached copy.
mypackage must contain an __init__.py file, and a basic setup.py (or a pyproject.toml file).
the package structure must be like this:
mypackage/
    setup.py
    mypackage/
        __init__.py
minimal setup.py
from setuptools import find_packages, setup

setup(
    name='mypackage',  # Required
    version='0.0.1',  # Required
    packages=find_packages(),  # Required
)
for a more elaborate package:
the package structure must be like this:
mypackage/
    setup.py
    mypackage/
        src/
            __init__.py
            __main__.py
            additional python files
            ...
minimal setup.py
from setuptools import find_packages, setup

setup(
    name='mypackage',  # Required
    version='0.0.1',  # Required
    packages=find_packages(where='src'),  # Required: a relative path such as 'src', not the absolute '/src'
)
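Since this layout ships a __main__.py, the installed package can also be executed with python -m mypackage (assuming the installed package ends up importable as mypackage). A minimal sketch of such a file, hypothetical and not part of the original answer:

# src/__main__.py, run by `python -m mypackage`
def main():
    print("mypackage invoked as a module")

if __name__ == "__main__":
    main()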
I'm creating a module that has only one PyPI dependency. This dependency has two packages on PyPI: one that makes use of a system library, and one that packages a binary distribution of that library. They look like:
theirmodule
theirmodule-binary
My module depends on theirmodule, but I want users of my module to be able to decide whether they want the lib version of the dependency or the binary version. I see in the docs about Extras that I could do:
setup(
    name="MyModule",
    ...
    extras_require={
        "BIN": ["theirmodule-binary>=1.2"]
    }
)
But then if the user does pip install mymodule[BIN], pip will install both theirmodule and theirmodule-binary. That would be a conflict, since both have the same underlying import string, e.g.:
import theirmodule
is used for both. How can this be handled without providing two separate PyPI packages?
Maybe something like the following:
setup.py
import setuptools

setuptools.setup(
    name='My-Project',
    # ...
    extras_require={
        'Extra_Dependency_As_Binary': ['Dependency-Project-Binary>=1.2'],
        'Extra_Dependency_As_Library': ['Dependency-Project-Library<=3.4'],
    },
)
And then instruct the users of My-Project (maybe in the README file) to install it by explicitly specifying one of the extras. For example, with pip it would be one or the other of:
path/to/pythonX.Y -m pip install 'My-Project[Extra_Dependency_As_Binary]'
path/to/pythonX.Y -m pip install 'My-Project[Extra_Dependency_As_Library]'
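Either extra then provides the same import name, so the code inside My-Project stays identical no matter which one the user picked at install time. A sketch with hypothetical names (dependency_project standing in for whatever import name both distributions actually expose):

# my_project/core.py (hypothetical module inside My-Project)
# Both Dependency-Project-Binary and Dependency-Project-Library expose
# the same import name, so this code works with either extra installed.
import dependency_project

def do_work():
    return dependency_project.run()  # hypothetical API call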
When creating a Python package and uploading it to PyPI, pip will automatically install the requirements that are listed in the setup.py file under install_requires, e.g.
from distutils.core import setup

setup(
    name = 'a_package',
    packages = ['a_package'],
    install_requires=['another_package']
)
When the package has a Cython extension (and .pyx files instead of .c/.cpp files), the setup.py file needs to import Cython to create an installable extension, e.g.
from distutils.core import setup
from distutils.extension import Extension
from Cython.Distutils import build_ext

setup(
    name = 'a_package',
    packages = ['a_package'],
    install_requires=['another_package'],
    cmdclass = {'build_ext': build_ext},
    ext_modules = [Extension('the_extension', sources=['a_file.pyx'])]
)
But since Cython is imported before the setup() part is executed, when trying to install this package through pip from source (rather than from a wheel) downloaded from PyPI, the installation fails because it cannot import Cython: at that point pip has not yet reached the part where the requirements are declared.
I'm wondering what can be done to ensure that a pip install of this package from PyPI installs Cython before setup.py tries to import it. Adding a requirements.txt with cython does not seem to add automatically-installed requirements for files downloaded from PyPI.
Now, I realize it's possible to just pip install cython before pip install thispackage, but I'm wondering if there's a better fix that would allow installing the package along with Cython directly from PyPI when it's not possible to run an additional command (without resorting to uploading the .c files and adjusting the setup.py file to use them instead of the .pyx files).
What you're describing is a "build-time dependency", and this is precisely the use case "PEP 518 -- Specifying Minimum Build System Requirements for Python Projects" was created for.
You can specify cython as a build-time dependency by adding a pyproject.toml file like:

[build-system]
requires = ["setuptools", "wheel", "cython"]

Note that this requires list replaces the default build requirements, so setuptools (and wheel, for building wheels) must be listed explicitly alongside cython.
Then when installing your package with a modern version of pip (or another PEP 518 compatible installer), cython will be installed into the build environment before your setup.py script is run.
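With that in place, setup.py can import Cython safely. For example, a sketch using setuptools and Cython's cythonize helper instead of the distutils build_ext approach shown in the question:

from setuptools import setup
from setuptools.extension import Extension
from Cython.Build import cythonize  # safe to import: PEP 518 installs cython into the build environment first

setup(
    name='a_package',
    packages=['a_package'],
    install_requires=['another_package'],
    ext_modules=cythonize([Extension('the_extension', sources=['a_file.pyx'])]),
)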
I need to install a specific public package for one of my projects. I tried to do:
from setuptools import setup, find_packages

setup(
    name = 'aaa',
    install_requires = ['hyperas==0.3'],
    dependency_links = ['git+https://github.com/maxpumperla/hyperas.git#egg=hyperas-0.3']
)
But I gave up because I couldn't get it to work, and dependency_links was being deprecated anyway. What is the correct way to tell pip to download a package from a particular URL rather than from PyPI?
The URL I need is https://github.com/maxpumperla/hyperas, rather than https://pypi.python.org/pypi/hyperas. Is this possible?
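The deprecated dependency_links mechanism has been replaced by the PEP 508 direct-reference syntax shown in the first answer above. A sketch of what that would look like here, untested:

from setuptools import setup, find_packages

setup(
    name='aaa',
    packages=find_packages(),
    # PEP 508 direct reference: pip fetches hyperas from GitHub instead of PyPI
    install_requires=['hyperas @ git+https://github.com/maxpumperla/hyperas.git'],
)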