How to make cython a requirement for a pip install?

When creating a Python package and uploading it to PyPI, pip will automatically install the requirements listed in the setup.py file under install_requires, e.g.
from distutils.core import setup

setup(
    name='a_package',
    packages=['a_package'],
    install_requires=['another_package'],
)
When the package has a Cython extension (with .pyx files instead of .c/.cpp files), the setup.py file needs to import Cython to build the extension, e.g.
from distutils.core import setup
from distutils.extension import Extension
from Cython.Distutils import build_ext

setup(
    name='a_package',
    packages=['a_package'],
    install_requires=['another_package'],
    cmdclass={'build_ext': build_ext},
    ext_modules=[Extension('the_extension', sources=['a_file.pyx'])],
)
But since Cython is imported at the top of setup.py, before setup() is even executed, installing this package through pip from a source distribution (rather than from a wheel) downloaded from PyPI fails: the import of Cython runs before pip ever reaches the requirements.
I'm wondering what can be done to ensure that a pip install of this package from PyPI installs Cython before setup.py tries to import it. Adding a requirements.txt with cython does not seem to add automatically installed requirements for files downloaded from PyPI.
Now, I realize it's possible to just pip install cython before pip install thispackage, but I'm wondering if there's a better fix that would allow installing the package along with Cython directly from PyPI when it's not possible to run an additional command (without resorting to uploading the .c files and adjusting the setup.py file to use them instead of the .pyx files).

What you're describing is a "build-time dependency", and this is precisely the use case "PEP 518 -- Specifying Minimum Build System Requirements for Python Projects" was created for.
You can specify cython as a build-time dependency by adding a pyproject.toml file like:
[build-system]
requires = ["cython"]
Then when installing your package with a modern version of pip (or another PEP 518 compatible installer), cython will be installed into the build environment before your setup.py script is run.
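Note that an explicit requires list replaces pip's default build requirements, so a fuller pyproject.toml would typically name setuptools as well. A minimal sketch, assuming a setuptools-based build (the backend line and the unpinned requirements are illustrative):

[build-system]
requires = ["setuptools", "wheel", "cython"]
build-backend = "setuptools.build_meta"

With that in place, setup.py can import Cython unconditionally:

from setuptools import setup
from setuptools.extension import Extension
from Cython.Build import cythonize

setup(
    name='a_package',
    packages=['a_package'],
    install_requires=['another_package'],
    # Cython is importable here because pip installs the build-system
    # requirements into the build environment before running setup.py.
    ext_modules=cythonize([Extension('the_extension', sources=['a_file.pyx'])]),
)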

Related

Install dependencies of used namespaced packages

Let's say I have the following package structure:
package/
    mynamespace-subpackage-a/
        setup.py
        mynamespace/
            subpackage_a/
                __init__.py
    mynamespace-subpackage-b/
        setup.py
        mynamespace/
            subpackage_b/
                __init__.py
                module_b.py
with setup.py in package a:
from setuptools import find_packages, setup

setup(
    name='mynamespace-subpackage-a',
    ...
    packages=find_packages(),
    namespace_packages=['mynamespace'],
    install_requires=['pandas'],
)
and package b:
from setuptools import find_packages, setup

setup(
    name='mynamespace-subpackage-b',
    ...
    packages=find_packages(),
    namespace_packages=['mynamespace'],
    install_requires=[],
)
Package b uses package a, but has no references to the pandas library itself, so pandas is not listed in its install_requires. Still, pandas should be installed when pip install . is executed inside package b, and package a should be pulled in along with it.
What should be added in the second setup file to achieve this, and is it even possible? Or should pandas be in the requirements list of package b as well?
I would suspect something like:
install_requires=['mynamespace.subpackage_a']
From what I understood from the question, I believe it should be:
package/mynamespace-subpackage-b/setup.py:

# ...
setup(
    name='mynamespace-subpackage-b',
    # ...
    install_requires=[
        'mynamespace-subpackage-a',
        # ...
    ],
)
This obviously assumes that pip can find a when installing b, meaning a distribution of a should be published on some kind of index (such as PyPI, for example). If that is not possible, then maybe one of the following alternatives could help:
- Place distributions of a and b (wheel or source distribution) in a local directory, and then use pip's --find-links option (doc), as sketched below: pip install --find-links=path/to/distributions mynamespace-subpackage-b
- Use a direct reference file URL, as described in PEP 440: install_requires=['a @ file:///path/to/a.whl']
- Use a direct remote URL (a VCS such as git would work); the URL could point to a private repository or to the local file system: install_requires=['mynamespace-subpackage-a @ git+file:///path/to/mynamespace-subpackage-a@master']. This assumes setup.py is at the root of the repository.
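As an illustration of the first alternative, the whole workflow could look like this (a sketch; the paths are placeholders):

path/to/pythonX.Y -m pip wheel --wheel-dir=path/to/distributions path/to/mynamespace-subpackage-a
path/to/pythonX.Y -m pip install --find-links=path/to/distributions mynamespace-subpackage-b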

Pip extras dependency substitution

I'm creating a module that has only one PyPI dependency, but that dependency has two packages on PyPI: one that makes use of a system library, and one that bundles a binary distribution of that library. They look like:
theirmodule
theirmodule-binary
My module depends on theirmodule, but I want users of my module to be able to decide whether they want the library version of the dependency or the binary version. I have seen the docs about extras. I could do:
setup(
    name="MyModule",
    ...
    extras_require={
        "BIN": ["theirmodule-binary>=1.2"]
    }
)
But then if the user does pip install mymodule[BIN], pip will install both theirmodule and theirmodule-binary. That would be a conflict, since both provide the same import name, e.g.:
import theirmodule
is used for both. How can this be handled without providing 2 separate PyPI packages?
Maybe something like the following:
setup.py
import setuptools

setuptools.setup(
    name='My-Project',
    # ...
    extras_require={
        'Extra_Dependency_As_Binary': ['Dependency-Project-Binary>=1.2'],
        'Extra_Dependency_As_Library': ['Dependency-Project-Library<=3.4'],
    },
)
And then instruct the users of My-Project (maybe in the README file) to install it by specifying exactly one of the extras explicitly. For example, with pip it could be one or the other of:
path/to/pythonX.Y -m pip install 'My-Project[Extra_Dependency_As_Binary]'
path/to/pythonX.Y -m pip install 'My-Project[Extra_Dependency_As_Library]'
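On top of that, My-Project could fail fast at import time when neither extra was installed, pointing users at the right command. A minimal sketch, assuming the shared import name theirmodule from the question (the message text is illustrative):

# My_Project/__init__.py (sketch)
try:
    import theirmodule  # provided by either extra
except ImportError as exc:
    raise ImportError(
        "No provider for theirmodule is installed; install "
        "'My-Project[Extra_Dependency_As_Binary]' or "
        "'My-Project[Extra_Dependency_As_Library]'"
    ) from exc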

What happens with "pure" Python + Cython packages when the build fails during installation?

I just read the Cython Pure Python Mode documentation and I'm not sure if I understand it right. It sounds as if I could keep all my Python files as they are and add *.pxd files where I declare Cython types. In setup.py, I still add:
from setuptools import setup
from Cython.Build import cythonize

setup(
    ext_modules=cythonize(
        "A.py",
        compiler_directives={'language_level': "3"}
    )
)
When I run python setup.py build_ext --inplace it actually builds the .so file.
What happens when I create the sdist/bdist, upload them to PyPI, and a user does not have a matching platform? They will download the sdist, sure. I guess pip/setuptools will automatically try to compile the extension module (A.py), and if that works, it is fine. But what if cythonize fails? Will the package still be installed, using the pure Python code?
I don't think so. I believe a failure in setup.py aborts installation completely.
You can try to declare the extension optional, but there are reports that this doesn't really work; it could be an issue with older setuptools versions.
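For reference, "optional" here means passing optional=True to the Extension object, which asks setuptools to report a failed compile without aborting the installation. A sketch based on the setup.py above (whether cythonize preserves the flag may depend on the Cython and setuptools versions, per the reports mentioned):

from setuptools import setup
from setuptools.extension import Extension
from Cython.Build import cythonize

# optional=True: a compile failure should not abort installation,
# leaving the pure Python A.py in place.
extensions = [Extension("A", sources=["A.py"], optional=True)]

setup(
    ext_modules=cythonize(
        extensions,
        compiler_directives={'language_level': "3"}
    )
)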

Recompile Cython extension when setup.py has changed with pip editable

How do I make pip recompile Cython extensions when I have only changed setup.py and I am installing in editable mode? Currently it always skips the extensions. There are many questions related to this for distutils, but I can't see any answers for pip.
To be clear, I have a setup.py like the following
from distutils.core import setup, Extension
from Cython.Distutils import build_ext
import numpy

ext_modules = []

# simulate_fast
ext_modules += [
    Extension("adio.simulating.simulate_fast_c",
              sources=["./adio/simulating/simulate_fast/simulate_fast_c.pyx",
                       "./adio/simulating/simulate_fast/c/simulate_fast.c",
                       "./adio/simulating/simulate_fast/c/matrix.c"],
              include_dirs=[numpy.get_include()],
              extra_compile_args=["-Ofast", "-ffast-math", "-march=native"],
              language='c',
              libraries=["gsl", "openblas"],
              define_macros=[('FLOAT32', 1)])
]

setup(
    name="adio",
    packages=["adio"],
    cmdclass={'build_ext': build_ext},
    ext_modules=ext_modules
)
Now if I delete the ('FLOAT32', 1) in define_macros, of course I would like the Cython extension to recompile.
However, when I run
python3 -m pip install --editable -U . -v
I receive the following as part of the output
running develop
running egg_info
writing adio.egg-info/PKG-INFO
writing dependency_links to adio.egg-info/dependency_links.txt
writing top-level names to adio.egg-info/top_level.txt
reading manifest file 'adio.egg-info/SOURCES.txt'
writing manifest file 'adio.egg-info/SOURCES.txt'
running build_ext
skipping './adio/simulating/simulate_fast/simulate_fast_c.c' Cython extension (up-to-date)
I have tried the -I and --force-reinstall flags with pip, but it always skips the Cython extension. If I am not using editable mode, then I can run
python3 -m pip install -U . -v
and this does recompile. How can I achieve the same thing when using the --editable flag?
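One workaround to try (a sketch building only on build_ext's own options): remove the generated C file so Cython has to re-run, and pass --force so build_ext rebuilds even when it considers the extension up to date:

rm ./adio/simulating/simulate_fast/simulate_fast_c.c
python3 setup.py build_ext --inplace --force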
Related
distutils ignores changes to setup.py when building an extension?

python package can be installed by pip but not conda

I need the sacred package for a new code base I downloaded:
https://pypi.python.org/pypi/sacred
conda install sacred fails with
PackageNotFoundError: Package missing in current osx-64 channels:
- sacred
The instructions on the package site only explain how to install with pip. What do you do in this case?
That package is not available as a conda package at all. You can search for packages on anaconda.org: https://anaconda.org/search?q=sacred (the type of package is shown in the 4th column). Other Python packages may be available as conda packages, for instance NumPy: https://anaconda.org/search?q=numpy
As you can see, the conda package numpy is available from a number of different channels (the channel is the name before the slash). If you want to install a package from a different channel, you can add the -c/--channel option to the install/create command, or add the channel to your configuration with conda config --add channels channel-name.
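For example, to install numpy from the conda-forge channel (assuming the package is available there):

conda install -c conda-forge numpy
conda config --add channels conda-forge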
If no conda package exists for a Python package, you can either install it via pip (if available) or build your own conda package. This isn't usually too difficult to do for pure Python packages, especially if you can use conda skeleton to build a recipe from a package on PyPI.
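A sketch of that recipe workflow (it assumes the conda-build package is installed):

conda skeleton pypi sacred
conda build sacred
conda install --use-local sacred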
This issue has happened to me before. If your system's default Python environment is conda, you can download the files from https://pypi.python.org/pypi/sacred#downloads
and install them manually with
pip install C:/Desktop/some-file.whl
