Python package can be installed by pip but not conda

I need the sacred package for a new code base I downloaded:
https://pypi.python.org/pypi/sacred
conda install sacred fails with
PackageNotFoundError: Package missing in current osx-64 channels:
- sacred
The instructions on the package site only explain how to install with pip. What do you do in this case?

That package is not available as a conda package at all. You can search for packages on anaconda.org: https://anaconda.org/search?q=sacred
The type of each result is shown in the 4th column. Other Python packages may be available as conda packages, for instance NumPy: https://anaconda.org/search?q=numpy
As you can see, the conda package numpy is available from a number of different channels (the channel is the name before the slash). If you want to install a package from a different channel, you can pass the -c/--channel option to the install/create command, or add the channel to your configuration with conda config --add channels channel-name.
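For example, to install numpy from the conda-forge channel instead of the default one:
conda install -c conda-forge numpy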
If no conda package exists for a Python package, you can either install it via pip (if available) or build your own conda package. This usually isn't too difficult for pure Python packages, especially if you can use conda skeleton to generate a recipe from a package on PyPI.
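For example, assuming the conda-build package is installed, a recipe for sacred could be generated, built, and installed with something like:
conda skeleton pypi sacred
conda build sacred
conda install --use-local sacred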

A similar issue happened to me before. If your system's default Python environment is the conda one, you can download the distribution files from https://pypi.python.org/pypi/sacred#downloads
and install them manually with pip:
pip install C:/Desktop/some-file.whl

Related

Pip install a package not from PyPI whose name is the same as one on PyPI (with dependencies from PyPI)

I have a Python package hosted on Azure (VSTS), not on PyPI, whose dependencies are packages that live on PyPI.
I discovered that my package has the same name as a package that lives on PyPI.
Is there a way to install my package with pip, specifying that my package must be looked up on VSTS first, while the dependencies can be grabbed from PyPI?
If I use the --index-url option:
pip install <my-package> --index-url https://<my-package>:<PAT>@<url>/<proj>/_packaging/<my-package>/pypi/simple/
pip is able to locate my package and tries to install it, but it fails to install any dependency, because it searches for all of them at the same URL. That is wrong, since I am not hosting, say, my own version of numpy or other packages on VSTS.
(This is the problem: pip install producing "Could not find a version that satisfies the requirement".)
If instead I use the --extra-index-url option:
pip install <my-package> --extra-index-url https://<my-package>:<PAT>@<url>/<proj>/_packaging/<my-package>/pypi/simple/
all the dependencies are found, but this does not install my package; it installs the package with the same name that lives on PyPI!
Even with --extra-index-url present, it seems that PyPI is given priority, so my package at the URL I specified is shadowed and never gets found and installed.
Is there a way to, say, tell pip that it should give priority to my --extra-index-url? Or to give pip an --index-url that applies only to one package but not to its dependencies?
You need the index URL pointing to VSTS and an extra index URL pointing to PyPI:
pip install --index-url=https://<my-package>:<PAT>@<url>/<proj>/_packaging/<my-package>/pypi/simple/ --extra-index-url=https://pypi.org/simple/ <my-package>
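If you install from this feed often, the same options can also live in a pip configuration file instead of being repeated on every command line (a sketch; the file is ~/.pip/pip.conf on Linux/macOS or %APPDATA%\pip\pip.ini on Windows, with the placeholders filled in as above):
[global]
index-url = https://<my-package>:<PAT>@<url>/<proj>/_packaging/<my-package>/pypi/simple/
extra-index-url = https://pypi.org/simple/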

Pip extras dependency substitution

I'm creating a module that has only one PyPI dependency. This dependency has two packages on PyPI: one that makes use of a system library, and one that bundles a binary distribution of that library. They look like:
theirmodule
theirmodule-binary
My module depends on theirmodule, but I want users of my module to be able to decide whether they want the lib version of the dependency or the binary version. I see in the docs about Extras that I could do:
setup(
    name="MyModule",
    ...
    extras_require={
        "BIN": ["theirmodule-binary>=1.2"]
    }
)
But then if the user does pip install mymodule[BIN], pip will install both theirmodule and theirmodule-binary. That would be a conflict, since both provide the same import name, e.g.
import theirmodule
is used for both. How can this be handled without publishing two separate PyPI packages?
Maybe something like the following:
setup.py
import setuptools

setuptools.setup(
    name='My-Project',
    # ...
    extras_require={
        'Extra_Dependency_As_Binary': ['Dependency-Project-Binary>=1.2'],
        'Extra_Dependency_As_Library': ['Dependency-Project-Library<=3.4'],
    },
)
And then instruct the users of My-Project (maybe in the README file) to install it by explicitly specifying one of the extras. For this to work, the dependency must not also be listed in install_requires, otherwise both variants could end up installed together. For example, with pip it could be one or the other of:
path/to/pythonX.Y -m pip install 'My-Project[Extra_Dependency_As_Binary]'
path/to/pythonX.Y -m pip install 'My-Project[Extra_Dependency_As_Library]'
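Since neither extra is installed by default, My-Project can also fail fast with a helpful message if the user forgets to pick one. A minimal sketch for My-Project's __init__.py, assuming dependency_project is the hypothetical import name shared by both distributions:
try:
    import dependency_project  # provided by either extra
except ImportError as exc:
    raise ImportError(
        "My-Project needs one of its extras; install it with either\n"
        "pip install 'My-Project[Extra_Dependency_As_Binary]' or\n"
        "pip install 'My-Project[Extra_Dependency_As_Library]'"
    ) from exc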

How to install deprecated/unsupported Python 3.4 on conda environment?

Since the deprecation of Python 3.4, conda has removed it from its package list. Is there a way, however, that I can install it?
I need it in order to use software written in this older version.
EDIT:
My question is different from the suggested duplicate, because I am referring to deprecated and unsupported versions. I already know how to create a conda environment with a specific Python version, but executing:
conda create --name py34env python=3.4
results in an error (listed at the end), which is due to the lack of a package for Python 3.4.
One can see the currently supported versions of Python by executing conda search python, and can confirm that Python 3.4 is not on the list.
This is the error output when trying to create a Python 3.4 conda environment:
$ conda create --name py34env python=3.4
Collecting package metadata (current_repodata.json): done
Solving environment: failed with repodata from current_repodata.json, will retry with next repodata source.
Collecting package metadata (repodata.json): done
Solving environment: failed
PackagesNotFoundError: The following packages are not available from current channels:
- python=3.4
Current channels:
- https://repo.anaconda.com/pkgs/main/linux-64
- https://repo.anaconda.com/pkgs/main/noarch
- https://repo.anaconda.com/pkgs/r/linux-64
- https://repo.anaconda.com/pkgs/r/noarch
To search for alternate channels that may provide the conda package you're
looking for, navigate to
https://anaconda.org
and use the search bar at the top of the page.
When Anaconda dropped its free channel (technically, Conda 4.7+ just no longer looks there), some older package versions that had never been ported to main became inaccessible.
Option 1: Globally enable free channel searching
However, there is an option to restore access to the free channel, namely restore_free_channel.
# Not generally recommended
conda config --set restore_free_channel True
conda create -n py34 python=3.4
This isn't generally recommended (see blog post), but if you will be working in Python v3.4 frequently and will require other older compatible packages, it might be the best option.
Option 2: Temporarily include free channel
A more temporary solution is to include the free channel using the ad hoc --channel/-c argument. For example,
# slightly better
conda create -n py34 -c defaults -c free python=3.4
Note that I include defaults prior to free so that the latter will only be used if the package cannot be sourced from the former. This assumes the channel_priority setting is set to flexible (the default).
Option 3: Use Conda Forge
Alternatively, Conda Forge has Python v3.4.5, and using it won't force you to change a global configuration option.
conda create -n py34 -c conda-forge python=3.4
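Afterwards, you can verify that the new environment really uses the old interpreter:
conda activate py34
python --version  # should print Python 3.4.x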

How to make cython a requirement for a pip install?

When you create a Python package and upload it to PyPI, pip will automatically install the requirements listed in the setup.py file under install_requires, e.g.
from distutils.core import setup

setup(
    name='a_package',
    packages=['a_package'],
    install_requires=['another_package']
)
When the package has a Cython extension (with .pyx files instead of .c/.cpp files), the setup.py file needs to import Cython to create an installable extension, e.g.
from distutils.core import setup
from distutils.extension import Extension
from Cython.Distutils import build_ext

setup(
    name='a_package',
    packages=['a_package'],
    install_requires=['another_package'],
    cmdclass={'build_ext': build_ext},
    ext_modules=[Extension('the_extension', sources=['a_file.pyx'])]
)
But since Cython is imported before setup() runs, trying to install this package through pip from the source distribution on PyPI (rather than from a wheel) fails: Cython cannot be imported, because pip has not yet reached the part that declares the requirements.
I’m wondering what can be done to ensure that a pip install of this package from PyPI installs Cython before trying to import it. Adding a requirements.txt with cython does not add automatically installed requirements for files downloaded from PyPI.
Now, I realize it’s possible to just pip install cython before pip install thispackage, but I’m wondering if there’s a better fix that would allow installing the package along with Cython directly from PyPI when it’s not possible to run an additional command (without resorting to uploading the .c files and adjusting the setup.py file to use them instead of the .pyx files).
What you're describing is a "build time dependency", and this is precisely the use case "PEP 518 -- Specifying Minimum Build System Requirements for Python Projects" was created for.
You can specify cython as a build-time dependency by adding a pyproject.toml file like:
[build-system]
requires = ["setuptools", "wheel", "cython"]
Then, when installing your package with a modern version of pip (or another PEP 518 compatible installer), cython will be installed into the isolated build environment before your setup.py script is run. Note that setuptools and wheel must be listed as well, since the requires list replaces the default build requirements.
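For completeness, a minimal sketch of a matching setup.py using setuptools and Cython's cythonize helper, reusing the hypothetical names from the question:
from setuptools import setup, Extension
from Cython.Build import cythonize

setup(
    name='a_package',
    packages=['a_package'],
    install_requires=['another_package'],
    ext_modules=cythonize([Extension('the_extension', sources=['a_file.pyx'])]),
)
With the pyproject.toml above, pip builds the package in an isolated environment where Cython is already present, so the import at the top succeeds.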

install "sub package" Jupyter Notebook

I am struggling to install packages and "sub-packages" in Jupyter Notebook; I suspect I am missing some of the basic concepts around installing packages.
I understand that to install a package within the notebook I use
! pip install --user <package>
What I don't understand is how to install a "sub-package" (feel free to advise on the correct terminology), such as the one below.
from nltk.tagger import *
If I try:
!pip install nltk.tagger
I get the following error:
Collecting nltk.tagger
  Could not find a version that satisfies the requirement nltk.tagger (from versions: )
No matching distribution found for nltk.tagger
So my first question is: how do I install this nltk.tagger subpackage? Also, if tagger is a subpackage of NLTK, how come it isn't installed when I do pip install nltk?
Although the error mentions versions, searching online I can't find any reference even to a subpackage called "tagger". Any advice or links explaining this would be appreciated.
If you pip install nltk, the subpackage nltk.tagger and the other subpackages and dependencies will be installed too.
This is generally true for all packages: installing a package with pip brings along all of its subpackages.
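For example, in a notebook cell (note that in recent NLTK releases the tagging subpackage is named nltk.tag; nltk.tagger only existed in very old versions, so the script you found likely targets an old API):
!pip install --user nltk
import nltk.tag  # available as soon as nltk itself is installed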
