Conda environment.yml Package Versions

If I run conda info sphinx from the command line, the last entry as of August 24, 2017, is
sphinx 1.6.3 py36_0
-------------------
file name : sphinx-1.6.3-py36_0.tar.bz2
name : sphinx
version : 1.6.3
build string: py36_0
...
What is the meaning of the build string, which is mirrored above in the package file name? Is this the minimum version of the Python interpreter required by the package?

The first part of the build string (pyXX) of this package tells you the exact version of the Python interpreter that this package can be used for. Most likely, there are other packages for other versions of Python (py27, py35, etc.). The second part (after the underscore) tells you the build number of this package. The build number is typically incremented when there is a change in the build recipe, but no change in the version of the software being built. You can find more information in the description of the info/index.json fields.
Note, however, that the build string will be changing with conda build 3.0.
Package maintainers can customize their build strings via meta.yaml (see the Build section of the conda-build documentation).
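As an illustration, here is a minimal sketch (plain Python, not a conda API) that splits a simple build string like py36_0 into those two parts:
# split a simple build string such as "py36_0" (sketch only; build
# strings can also carry a dependency hash, as discussed further below)
build_string = "py36_0"
variant, _, build_number = build_string.rpartition("_")
print(variant)            # py36 -> built for Python 3.6
print(int(build_number))  # 0    -> build number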

Related

Conda - search for package, specify Python version

I'm trying to search for a package, specifying the Python version. I have tried:
# Find Python 2.7 packages for 'numpy'
conda search "numpy=py27_0" --info
No match found for: numpy=py27_0. Search: *numpy*=py27_0
conda search "numpy==py27_0" --info
No match found for: numpy==py27_0. Search: *numpy*==py27_0
How can I specify the Python version, e.g. 2.7, 3.6?
There isn't a direct mechanism to constrain the Python version outside of running the actual solver. However, the major channels use the Python version (say, 3.8) to generate a string (say, "py38") that is included in the build string. This can thus be used as a proxy, by searching with constraints on the build string. For example, the following (equivalent) expressions should pick up all Python 2.7 builds of numpy in the configured channels:
## search all versions with 'py27' build string
conda search 'numpy=*=*py27*'
## alternative (MatchSpec) syntax
conda search 'numpy[build=*py27*]'
The first form must explicitly mark the version as unconstrained ("*"); the second specifies the build constraint directly, with the unconstrained version implied.
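If you need this filtering programmatically, one option is to parse conda's JSON output instead. A rough sketch, assuming conda search --json keeps its current output shape of {"numpy": [{"version": ..., "build": ...}, ...]}:
import json
import subprocess

# run a broad search and parse the JSON records
out = subprocess.run(
    ["conda", "search", "numpy", "--json"],
    capture_output=True, text=True, check=True,
).stdout
records = json.loads(out).get("numpy", [])

# use the build string as a proxy for the Python version, as above
py27_builds = [(r["version"], r["build"]) for r in records if "py27" in r["build"]]
print(py27_builds)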

What is after the second = in a conda dependencies YAML? Example: pandas=1.2.4=py38h1abd341_0

I am generating a conda dependencies YAML and I don't entirely understand the information presented there. In pandas=1.2.4=py38h1abd341_0 I know that pandas is at version 1.2.4, but what is py38h1abd341_0?
That is the build string, documented here. The py38 indicates the package is built for Python 3.8. The next segment, an h followed by seven hexadecimal characters, is a hash of the package dependencies, used to differentiate variants built against different dependencies (think glibc on Linux or the MSVCRT on Windows).
After the underscore is the build number, which is incremented when the package recipe changes but the package version does not.
The new hash was introduced with Conda Build 3.
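To make that anatomy concrete, here is a minimal sketch that pulls the parts out of such a build string. It assumes the common layout of pyXY, then h plus a seven-character hash, then the build number; since maintainers can customize build strings, it will not match everything:
import re

# pyXY + 'h' + 7 hex chars (dependency hash) + '_' + build number
pattern = re.compile(r"^py(?P<py>\d+)h(?P<hash>[0-9a-f]{7})_(?P<num>\d+)$")

m = pattern.match("py38h1abd341_0")
if m:
    print(m.group("py"))        # 38      -> Python 3.8
    print(m.group("hash"))      # 1abd341 -> dependency-variant hash
    print(int(m.group("num")))  # 0       -> build number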

Specify a chosen default version of a conda package through conda-forge / conda install

I'd like to distribute multiple versions of a package through conda. Specifically, I'd like to do something like this:
...
package-v1.2-dev
package-v1.2
package-v1.1-dev
package-v1.1
package-v1.0
The trick is that I'd like to have the "latest" or default package be the release versions that do not have -dev. As I understand it, conda install <package> without a version number will install the newest build. In my case, that will always be -dev. Is it possible to make the default a specific version number?
You can achieve this by specifying a custom "label" for your dev packages. Keep using the default main label for your release packages, but use a non-main label (e.g. dev) for the other packages.
First, a quick note about version numbers: conda package versions must not contain the - character, so v1.2-dev is not a valid version. For the following examples, I'll use v1.2.dev.
Here's how to upload your packages:
anaconda upload mypackage-v1.2.tar.bz2
anaconda upload --label dev mypackage-v1.2.dev.tar.bz2
(You can also manipulate the labels for existing packages via your account on the http://anaconda.org website.)
By default, your users will only download your main packages. Users who want the dev packages will have two choices:
They can specify the dev label on the command-line:
conda install -c mychannel/label/dev mypackage
OR
They can add your dev label to their .condarc config
# .condarc
channels:
- mychannel/label/dev # dev label
- mychannel # main label only
- conda-forge
- defaults
And then there's no need to specify the channel on the command-line:
conda install mypackage
PS -- Here's a side note about something you wrote above:
As I understand it, conda install <package> without a version number will install the newest build
Just to clarify, it doesn't install the "newest" in a chronological sense, but rather the highest compatible version according to conda's VersionOrder logic. That logic is designed to be largely compatible with the relevant Python conventions (e.g. PEP440), but with some affordances for compatibility with other languages' conventions, too.
Please note: As far as conda (and PEP440) is concerned, 1.2.dev comes BEFORE 1.2. (Maybe you already knew that, but I don't consider it obvious.)
$ python
>>> from conda.models.version import VersionOrder
>>> VersionOrder('1.2.dev') < VersionOrder('1.2')
True
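A quick way to sanity-check how a whole set of candidate versions will be ranked is to sort them with the same class:
from conda.models.version import VersionOrder

# dev releases sort before the corresponding final release
candidates = ['1.2.dev', '1.1', '1.2']
print(sorted(candidates, key=VersionOrder))
# ['1.1', '1.2.dev', '1.2']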

ModuleNotFoundError: No module named 'yaml'

I have used a YAML file and have imported PyYAML into my project.
The code works fine in PyCharm; however, after creating an egg and running it from the command prompt, I get a module-not-found error.
You have not provided quite enough information for an exact answer, but, for missing python modules, simply run
py -m pip install PyYaml
or, in some cases
python -m pip install PyYaml
You may have imported it in your project (in PyCharm), but you have to make sure it is also installed on the system where the Python interpreter actually runs it, outside of the IDE.
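To confirm the module is visible to the interpreter that actually runs the egg (not just to PyCharm's), a quick check:
# run this with the same interpreter that executes the egg; it raises
# ModuleNotFoundError if PyYAML is missing from that environment
import yaml
print(yaml.__version__)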
I have not made an .egg for some time (you should really consider using wheels for distributing packages), but IIRC an .egg should have a requires.txt file with an entry that specifies the dependency on pyyaml.
You normally get that when setup() in your setup.py has an argument install_requires:
setup(
    ...
    install_requires=['pyyaml<4'],
    ...
)
(PyYAML 4.1 was retracted because there were problems with that version, but it might still be in your local PyPI cache, as it was in my case; hence the <4, which restricts installation to the latest 3.x release.)

import local package over global package

I'm working on a support library for a large Python project which heavily uses relative imports by appending various project directories to sys.path.
Using The Hitchhiker's Guide to Packaging as a template I attempted to create a package structure which will allow me to do a local install, but can easily be changed to a global install later if desired.
One of the dependencies of my package is the pyasn1 package for the encoding and decoding of ASN.1 annotated objects. I have to include the pyasn1 library separately as the version supported by the CentOS 6.3 default repositories is one major version back and has known bugs that will break my custom package.
The top-level of the library structure is as follows:
MyLibrary/
    setup.py
    setup.cfg
    LICENSE.txt
    README.txt
    MyCustomPackage/
    pyasn1-0.1.6/
In my setup configuration file I define the install directory for my library to be a local directory called .lib. This is desirable as it allows me to do absolute imports by running the command import site; site.addsitedir("MyLibrary/.lib") in the project's main application without requiring our engineers to pass command line arguments to the setup script.
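For reference, a sketch of what that looks like in the main application (using the .lib directory named above). Note that site.addsitedir appends to sys.path rather than prepending, which is relevant to the import-precedence problem described below:
# in the project's main application
import site
site.addsitedir("MyLibrary/.lib")  # added to the END of sys.path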
setup.cfg
[install]
install-lib=.lib
setup.py
from setuptools import setup

setup(
    name='MyLibrary',
    version='0.1a',
    package_dir={'pyasn1': 'pyasn1-0.1.6/pyasn1'},
    packages=[
        'MyCustomPackage',
        'pyasn1',
        'pyasn1.codec',
        'pyasn1.compat',
        'pyasn1.codec.ber',
        'pyasn1.codec.cer',
        'pyasn1.codec.der',
        'pyasn1.type',
    ],
    license='',
    long_description=open('README.txt').read(),
    data_files=[],
)
The problem I've run into with doing the installation this way is that when my package tries to import pyasn1 it imports the global version and ignores the locally installed version.
As a possible workaround I have tried installing the pyasn1 package under a different name than the global package (eg pyasn1_0_1_6) by doing package_dir = {'pyasn1_0_1_6':'pyasn1-0.1.6/pyasn1'}. However, this fails since the imports used internally to the pyasn1 package do not use the pyasn1_0_1_6 name.
Is there some way to either a) force Python to import a locally installed package over a globally installed one or b) force a package to install under a different name?
Use virtualenv to ensure that your application runs in a fully known configuration, independent of the OS's versions of libraries.
EDIT: a quick (Unix) solution is setting the PYTHONPATH environment variable, which works just like PATH does for Python modules (a module is loaded from the first path in which it is found, so simply put your directory at the beginning of PYTHONPATH). Anyway, I strongly recommend that you proceed with virtualenv, since it was specifically engineered for handling situations like the one you are facing.
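As a minimal sketch of that path-precedence idea (the .lib directory name is taken from the question above), the same can be done from inside Python:
import sys

# prepend the local install dir so it shadows the system-wide copy
sys.path.insert(0, "MyLibrary/.lib")

import pyasn1  # now resolves to the local 0.1.6 install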
Rationale
The process is easily automatable if you write a setuptools script specifying dependencies with install_requires. For a complete example, refer to this one I wrote.
Setup
Note that you can easily insert the steps below in a setup.sh shell script.
First create a virtualenv and enter it:
$ virtualenv $name
$ cd $name
Activate it:
$ source bin/activate
Now cd to your project directory and run the installer script:
$ cd $my_project_dir
$ python ./setup.py install --prefix=$path_to_virtualenv
Note the install --prefix=$path_to_virtualenv, which tells the script to install into the virtualenv instead of system-wide. Call this after activating the virtualenv. Note that all the dependencies are automatically downloaded and installed into the virtualenv.
Then you are done. When you want to leave the virtualenv, issue:
$ deactivate
On subsequent runs, you will only need to activate the virtualenv (the activation step above), maybe via a runawesomeproject.sh script if you like.
As noted on the virtualenv website, you should use virtualenv >= 1.9, as the previous versions did not download dependencies via HTTPS. If you consider plain HTTP to be sufficient, then any version should do.
You might also try relocatable virtualenvs: set one up and copy the folder to your host. Note, however, that this feature is still experimental.
