Poetry sees an indirect dependency (numpy for python) is outdated, but doesn't update it - python-poetry

I'm trying out poetry 1.1.11 and have pandas in my pyproject.toml in the tool.poetry.dependencies section. Pandas depends on numpy:
pandas 1.3.3 Powerful data structures for data analysis, time series, and statistics
├── numpy >=1.17.3
├── python-dateutil >=2.7.3
│ └── six >=1.5
└── pytz >=2017.3
When I called poetry add pandas it correctly installed numpy 1.21.1. numpy has since bumped to 1.21.2 and poetry recognizes this:
poetry show --outdated
numpy 1.21.1 1.21.2 NumPy is the fundamental package for array computing with Python.
But numpy isn't updated by poetry.
poetry update
Updating dependencies
Resolving dependencies... (0.3s)
No dependencies to install or update
Is this expected? How / when would numpy be updated?
EDIT 2: per @finswimmer's request, here's the TOML for a simpler case than the one in the first edit. It's just an empty project from poetry new; then run poetry add numpy as shown below.
[tool.poetry]
name = "delete_me4"
version = "0.1.0"
description = ""
authors = ["Your Name <you@example.com>"]
[tool.poetry.dependencies]
python = "^3.9"
[tool.poetry.dev-dependencies]
pytest = "^5.2"
[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
❯ poetry add numpy
Using version ^1.21.2 for numpy
Updating dependencies
Resolving dependencies... (0.0s)
SolverProblemError
The current project's Python requirement (>=3.9,<4.0) is not compatible with some of the required packages Python requirement:
- numpy requires Python >=3.7,<3.11, so it will not be satisfied for Python >=3.11,<4.0
Because numpy (1.21.2) requires Python >=3.7,<3.11
and no versions of numpy match >1.21.2,<2.0.0, numpy is forbidden.
So, because delete-me4 depends on numpy (^1.21.2), version solving failed.
at ~/.local/share/pypoetry/venv/lib/python3.9/site-packages/poetry/puzzle/solver.py:241 in _solve
237│ packages = result.packages
238│ except OverrideNeeded as e:
239│ return self.solve_in_compatibility_mode(e.overrides, use_latest=use_latest)
240│ except SolveFailure as e:
→ 241│ raise SolverProblemError(e)
242│
243│ results = dict(
244│ depth_first_search(
245│ PackageNode(self._package, packages), aggregate_package_nodes
• Check your dependencies Python requirement: The Python requirement can be specified via the `python` or `markers` properties
For numpy, a possible solution would be to set the `python` property to ">=3.9,<3.11"
https://python-poetry.org/docs/dependency-specification/#python-restricted-dependencies,
https://python-poetry.org/docs/dependency-specification/#using-environment-markers
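For reference, the dependency-level `python` property mentioned in that hint goes in pyproject.toml roughly like this; a minimal sketch using the versions from the error above:
[tool.poetry.dependencies]
python = "^3.9"
numpy = { version = "^1.21.2", python = ">=3.9,<3.11" }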

The suggestion in the help text above addresses the issue. poetry add numpy works in an empty project with this TOML change:
[tool.poetry.dependencies]
python = ">=3.9,<3.11"
I'm not sure what triggered the need to exclude Python >= 3.11, but it solved the problem for me once I tried the repro.
Note that constraining Python further, as follows, allows scipy to move from 1.6.1 to 1.7.1:
[tool.poetry.dependencies]
python = ">=3.9,<3.10"

Related

Sklearn installation

I installed the sklearn library using
pip install sklearn
but when importing it I get an error saying there is no library called sklearn (an ImportError).
Afterwards I tried installing it again with the same command mentioned above, but it says the requirement is already satisfied.
Why is it showing this? What might the solution be?
Here is what happens in CMD:
C:\Users\scann>pip install sklearn
Requirement already satisfied: sklearn in c:\users\scann\appdata\local\programs\python\python310\lib\site-packages (0.0.post1)
C:\Users\scann>pip install -U sklearn
Requirement already satisfied: sklearn in c:\users\scann\appdata\local\programs\python\python310\lib\site-packages (0.0.post1)
I tried many installation methods, including installing from GitHub and using -U, but didn't find a working solution.
As of today (2023-01-09), pip install sklearn is in a "brownout" period, and installing this way will eventually be removed.
The preferred installation method is:
pip install scikit-learn
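If the placeholder sklearn distribution (the 0.0.post1 shown above) is already present, it may help to remove it first, install the real package, and then verify the import; a sketch of the commands, not part of the original answer:
pip uninstall sklearn
pip install scikit-learn
python -c "import sklearn; print(sklearn.__version__)"
Note that the import name stays sklearn even though the project name on PyPI is scikit-learn.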
The reason for deprecation is listed as:
sklearn package on PyPI exists to prevent malicious actors from using the sklearn package, since sklearn (the import name) and scikit-learn (the project name) are sometimes used interchangeably. scikit-learn is the actual package name and should be used with pip.
Further reading:
https://github.com/scikit-learn/sklearn-pypi-package
https://github.com/scikit-learn/scikit-learn/issues/24204

Version of a built `conda-forge` package is different between `pip list` and the `conda list` (it should be the same)

I recently added the package typepigeon to conda-forge. On conda-forge it is currently at version 1.0.9; however, when installing typepigeon via conda install, the output of pip list shows its version to be 0.0.0.post2.dev0+a27ab2a instead of 1.0.9.
conda list:
typepigeon 1.0.9 pyhd8ed1ab_0 conda-forge
pip list:
typepigeon 0.0.0.post2.dev0+a27ab2a
I think the issue arises from the way I am assigning the version (I am using dunamai to extract the Git tag as the version number). This version extraction is done within setup.py of typepigeon.
import warnings

from dunamai import Version

try:
    __version__ = Version.from_any_vcs().serialize()
except RuntimeError as error:
    warnings.warn(f'{error.__class__.__name__} - {error}')
    __version__ = '0.0.0'
When conda-forge builds the feedstock, I think it might be looking at the Git tag of the feedstock repository instead of the version from PyPI (as it is locally executing setup.py).
How can I modify the Conda Forge recipe to force the PyPI version?
I've figured out a solution; it might not be the best possible way to do this, but it works for my workflow.
I injected the version into the setup.py by looking for an environment variable (that I called __version__):
import os
import warnings

if '__version__' in os.environ:
    __version__ = os.environ['__version__']
else:
    from dunamai import Version

    try:
        __version__ = Version.from_any_vcs().serialize()
    except RuntimeError as error:
        warnings.warn(f'{error.__class__.__name__} - {error}')
        __version__ = '0.0.0'
Then, in the conda-forge recipe, I added an environment variable (__version__) to the build step:
build:
  noarch: python
  script: export __version__={{ version }} && {{ PYTHON }} -m pip install . -vv
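For context, the {{ version }} Jinja variable used in that script line is the one conda-forge recipes conventionally define at the top of meta.yaml; a sketch of the surrounding recipe (the actual typepigeon feedstock may differ):
{% set version = "1.0.9" %}

package:
  name: typepigeon
  version: "{{ version }}"

build:
  noarch: python
  script: export __version__={{ version }} && {{ PYTHON }} -m pip install . -vv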

How do I update xarray?

How can I update xarray? I tried:
>>> import xarray
>>> xarray.show_versions
<function show_versions at 0x7fcfaf2aa820>
But I cannot find any documentation on how to read this, or on how to update to a new version of xarray.
I was not the person who installed it on this computer, so I do not know if it was installed through Anaconda or something else. Is there a way to find this out?
xarray.show_versions is a function, which prints the versions of xarray and its dependencies.
To get just the version of xarray, you can check the __version__ property of the module.
Updating xarray is best done with pip or conda, depending on how you installed it in the first place.
import xarray as xr
print(xr.__version__)
# '0.18.2'
xr.show_versions()
INSTALLED VERSIONS
------------------
commit: None
python: 3.8.8 (default, Feb 19 2021, 18:07:06)
[GCC 8.3.0]
python-bits: 64
OS: Linux
OS-release: 5.11.0-27-generic
machine: x86_64
processor:
byteorder: little
LC_ALL: C.UTF-8
LANG: C.UTF-8
LOCALE: ('en_US', 'UTF-8')
libhdf5: 1.12.0
libnetcdf: 4.7.4
xarray: 0.18.2
pandas: 1.2.4
numpy: 1.20.3
scipy: 1.6.3
netCDF4: 1.5.6
pydap: None
h5netcdf: None
h5py: None
Nio: None
zarr: 2.8.3
cftime: 1.5.0
nc_time_axis: None
PseudoNetCDF: None
rasterio: 1.2.3
cfgrib: None
iris: None
bottleneck: 1.3.2
dask: 2021.05.0
distributed: 2021.05.0
matplotlib: 3.4.2
cartopy: None
seaborn: None
numbagg: None
pint: None
setuptools: 53.0.0
pip: 21.1.1
conda: None
pytest: None
IPython: 7.23.1
sphinx: None
To update xarray:
pip install --upgrade xarray
or
conda update xarray
To see if it was installed using conda or pip, run conda list xarray. If it was installed using pip, it should state pypi in the Channel column.
This is for those who want to do it through a GUI, using software like PyCharm, Spyder, or other similar IDEs.
Try finding 'Python interpreter' in the settings. Most of these tools list the existing packages with their current and latest versions (PyCharm, for example, shows this in a table).
There is also an option to select the version you want; for example, when a module is in a beta phase and not yet stable, you can specify the latest stable version instead. This applies to any module, not just xarray.

Python error using pyarrow - ArrowNotImplementedError: Support for codec 'snappy' not built

Using Python, Parquet, and Spark and running into ArrowNotImplementedError: Support for codec 'snappy' not built after upgrading to pyarrow=3.0.0. My previous version without this error was pyarrow=0.17. The error does not appear in pyarrow=1.0.1 and does appear in pyarrow=2.0.0. The idea is to write a pandas DataFrame as a Parquet Dataset (on Windows) using Snappy compression, and later to process the Parquet Dataset using Spark.
import numpy as np
import pandas as pd
import pyarrow as pa
import pyarrow.parquet as pq
df = pd.DataFrame({
    'x': [0, 0, 0, 1, 1, 1],
    'a': np.random.random(6),
    'b': np.random.random(6)})
table = pa.Table.from_pandas(df, preserve_index=False)
pq.write_to_dataset(table, root_path=r'c:/data', partition_cols=['x'], flavor='spark')
Something is wrong with the conda install pyarrow method. I removed it with conda remove pyarrow and after that installed it with pip install pyarrow. This ended up working.
The pyarrow package you had installed did not come from conda-forge, and it does not appear to match the package on PyPI. I did a bit more research: pypi_0 just means the package was installed via pip; it does not mean it actually came from PyPI.
I'm not really sure how this happened. You could check your conda log (envs/YOUR-ENV/conda-meta/history), but given that this was installed externally to conda, I'm not sure there will be any meaningful information in there. Perhaps you tried to install Arrow after the version was bumped to 3 and before the wheels were uploaded, so your system fell back to building from source?
I had the exact same issue. I did a fresh install of Anaconda (Python 3.8), then ran conda install -c conda-forge pyarrow (https://anaconda.org/conda-forge/pyarrow). It chokes through the install, failing the frozen/flexible solve, and conda keeps trying different variants until it finally installs. You can then import pyarrow, but when you try to open a Parquet file, you get the 'snappy' codec error that is the subject of this thread.
I then did conda remove pyarrow so I was back to a clean install. Then pip install pyarrow, and I could successfully load the parquet file.
I managed to get it to work by doing pip install pyarrow from the conda prompt.
I'm not 100% sure, but it could be because since version 1.0.0 the default Arrow build has been slimmed down and snappy became an optional component.
I think you would have to rebuild Arrow with -DARROW_WITH_SNAPPY=ON, but this can be quite difficult and tedious to get to work.
Another option would be to disable snappy:
pq.write_to_dataset(table, root_path=r'c:/data', partition_cols=['x'], flavor='spark', compression="NONE")
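If it's unclear whether a given pyarrow build has snappy compiled in, a quick check is to try compressing a small buffer with that codec; this is a sketch using pyarrow's compress helper, not something from the original answers:
import pyarrow as pa

# Attempt a snappy compression; builds without the codec raise ArrowNotImplementedError
try:
    pa.compress(b'check', codec='snappy')
    print('snappy codec is available')
except pa.ArrowNotImplementedError:
    print('snappy support was not built into this pyarrow')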

How to make cython a requirement for a pip install?

When you create a Python package and upload it to PyPI, pip will automatically install the requirements listed in the setup.py file under install_requires, e.g.
from distutils.core import setup
setup(
    name = 'a_package',
    packages = ['a_package'],
    install_requires=['another_package']
)
When the package has a cython extension (and .pyx files instead of .c/.cpp files), the setup.py file will need to import cython to create an installable extension, e.g.
from distutils.core import setup
from distutils.extension import Extension
from Cython.Distutils import build_ext
setup(
    name = 'a_package',
    packages = ['a_package'],
    install_requires=['another_package'],
    cmdclass = {'build_ext': build_ext},
    ext_modules = [Extension('the_extension', sources=['a_file.pyx'])]
)
But since Cython is imported before the setup() call runs, installing this package from source through pip (rather than from a wheel) downloaded from PyPI will fail: Cython cannot be imported because pip has not yet reached the part that lists the requirements.
I'm wondering what can be done to ensure that a pip install of this package from PyPI installs Cython before setup.py tries to import it. Adding a requirements.txt with cython does not seem to make it an automatically installed requirement for packages downloaded from PyPI.
Now, I realize it's possible to just pip install cython before pip install thispackage, but I'm wondering if there's a better fix that would allow installing the package along with Cython directly from PyPI when it's not possible to run an additional command (without resorting to uploading the .c files and adjusting the setup.py file to use them instead of the .pyx files).
What you're describing is a "build time dependency", and this is precisely the use case "PEP 518 -- Specifying Minimum Build System Requirements for Python Projects" was created for.
You can specify cython as a build-time dependency by adding a pyproject.toml file like:
[build-system]
requires = ["cython"]
Then when installing your package with a modern version of pip (or another PEP 518 compatible installer), cython will be installed into the build environment before your setup.py script is run.
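In practice the build requirements usually also need to include setuptools itself (and, for older toolchains, wheel), and a build backend is typically declared as well; a fuller pyproject.toml sketch:
[build-system]
requires = ["setuptools", "wheel", "cython"]
build-backend = "setuptools.build_meta"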
