Why does poetry remove virtualenv? - python-poetry

My package doesn't require virtualenv directly; some third-party package does. However, when running tests in tox, poetry install -E test -vvv always fails because:
poetry first removes virtualenv, which was created by tox,
then it tries to remove other packages and fails: since virtualenv has been removed, some packages can no longer be found.
The tox.ini:
[testenv]
skip_install = true
deps = poetry
commands =
    poetry install -E test -vvv
The errors:
Project environment contains an empty path in sys_path, ignoring.
Installing dependencies from lock file
Finding the necessary packages for the current system
Package operations: 73 installs, 1 update, 16 removals, 68 skipped
• Removing virtualenv (20.16.3): Pending...
• Removing virtualenv (20.16.3): Removing...
• Removing virtualenv (20.16.3)
• Removing webencodings (0.5.1): Pending...
• Removing webencodings (0.5.1): Removing...
• Removing webencodings (0.5.1): Failed
Command '['/apps/backtest/.tox/py38/bin/python', '/apps/backtest/.tox/py38/lib/python3.8/site-packages/virtualenv/seed/wheels/embed/pip-22.2.2-py3-none-any.whl/pip', 'uninstall', 'webencodings', '-y']' returned non-zero exit status 2.
Command ['/apps/backtest/.tox/py38/bin/python', '/apps/backtest/.tox/py38/lib/python3.8/site-packages/virtualenv/seed/wheels/embed/pip-22.2.2-py3-none-any.whl/pip', 'uninstall', 'zipp', '-y'] errored with the following return code 2, and output:
/apps/backtest/.tox/py38/bin/python: can't open file '/apps/backtest/.tox/py38/lib/python3.8/site-packages/virtualenv/seed/wheels/embed/pip-22.2.2-py3-none-any.whl/pip': [Errno 2] No such file or directory
Of course pip doesn't exist, since it belongs to virtualenv, which has been removed.
My questions are:
How can I find which third-party packages require virtualenv?
How can I stop poetry from removing virtualenv (it removes it only to install it again later) if I can't drop the dependency on virtualenv?

You are mixing two different installation concepts and the second overrides the first.
deps = poetry
This installs poetry (and its dependencies, including virtualenv) into the virtual environment created by tox. The deps section is a tox concept that installs packages required for testing, separate from the installation of the package itself.
Then the commands run.
poetry install -E test -vvv
The poetry command detects that it is inside a virtualenv and installs the dependencies into that virtualenv, but it also removes packages that your package does not need. Thus, poetry uninstalls its own dependencies, causing the errors you're encountering.
The solution is documented here; use case 1 does the trick for me.
You would need to include the pyproject.toml in your question, as that would be necessary to identify any erroneous setup there.
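For illustration, a minimal tox.ini sketch along the lines of that documented approach, assuming poetry is installed outside the test environment and invoked as an external command (allowlist_externals, commands_pre and the pytest call are assumptions about your setup, not taken from the question):
[testenv]
skip_install = true
allowlist_externals = poetry
commands_pre =
    poetry install -E test
commands =
    poetry run pytest
With deps = poetry removed, tox no longer installs poetry (and virtualenv) into the very environment that poetry then prunes.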

Related

pip install is looking for required dependency within the package

I'm building my package with setup.py, which has install_requires=get_requirements('requirements.txt'), and in requirements.txt the first required dependency is bottle==0.12.18.
Then I ran
python setup.py sdist
twine upload --repository testpypi dist/*
After the upload succeeded, I installed with pip install -i https://test.pypi.org/simple/ my_package==version_num, which gives me the error:
Collecting bottle==0.12.18 (from mypakage==version_num)
Could not find a version that satisfies the requirement bottle==0.12.18 (from mypakage==version_num) (from versions: )
No matching distribution found for bottle==0.12.18 (from mypakage==version_num)
Looks like it's looking for the dependency within my package, which will definitely fail. This error suddenly started happening and I've never seen it before. Do you have any idea why it's happening, and how can I make it look for the dependency the way pip install -r requirements.txt does?
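One common way out of this kind of failure (not from the original thread, shown here only as a hedged example) is to keep the real PyPI available as a fallback index, so dependencies such as bottle are resolved there while your own package still comes from TestPyPI:
pip install --index-url https://test.pypi.org/simple/ --extra-index-url https://pypi.org/simple/ my_package==version_num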

Installing specific versions of pyproj from github

I have been trying to install obspy and have been running into a lot of problems. obspy has a dependency on pyproj, but apparently it only works with pyproj 1.9.5.1, which I tried installing using pip (pip3 install pyproj==1.9.5.1), only to get errors like:
_proj.c:7488:13: error: ‘PyThreadState’ {aka ‘struct _ts’} has no member named ‘exc_traceback’; did you mean ‘curexc_traceback’?
Digging deeper, I found that it might be a Cython problem, and that installing pyproj directly from GitHub might help, because it would apparently make Cython recompile all the necessary files. Something along the lines of:
pip3 install git+https://github.com/jswhit/pyproj.git
However, this one gives the error:
ERROR: Minimum supported proj version is 6.2.0, installed version is 5.2.0.
I did try installing a higher version of libproj-dev (sudo apt install libproj-dev=6.2.0), but it shows that there is no candidate for 6.2.0. I tried downloading the .deb file and installing from that using:
sudo apt-get install ~/Downloads/libproj-dev_6.2.0-1_amd64.deb
which just leads to the error:
The following packages have unmet dependencies:
libproj-dev : Depends: libproj15 (= 6.2.0-1) but it is not installable
E: Unable to correct problems, you have held broken packages.
But I think this is not the right way to install it for me anyway, since I need a specific version. Hence I tried installing directly from the tarball of the release:
pip3 install https://github.com/pyproj4/pyproj/archive/v1.9.5.1rel.tar.gz
Which leads to the first error I had, evidently due to Cython.
With errors on everything I tried to do to fix this, I am not sure what is even relevant to my problem now.
Any help is appreciated, and if this site is not the correct place for this question, please help me migrate it to its proper destination.
I am on Ubuntu 18.10.
The problem is that Cython-generated C files don't work with Python 3.7 if they were generated with Cython versions up to (at least) 0.27.3: the setup.py of pyproj (at least in version 1.9.5.1) doesn't regenerate _proj.c, which was generated with Cython 0.23.2, and thus the installation cannot succeed.
You have the following options:
stay on Python 3.6, where everything works out of the box.
regenerate _proj.c with a current Cython version.
For the second option:
download and unzip your preferred version from https://github.com/pyproj4/pyproj/releases/tag/v1.9.5.1rel and switch to the created folder pyproj-1.9.5.1rel.
check that the Cython version is >=0.27.3 via cython --version.
regenerate the _proj.c file via cython -3 _proj.pyx (_proj.pyx looks like Python 3 code, but language_level=2, i.e. cython -2 _proj.pyx, will probably work as well).
install by running pip install .
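Put together as shell commands, the second option might look roughly like this (the archive URL is the one from the question; everything else, such as using wget and upgrading Cython via pip, is an assumption):
pip3 install --upgrade cython
wget https://github.com/pyproj4/pyproj/archive/v1.9.5.1rel.tar.gz
tar -xzf v1.9.5.1rel.tar.gz
cd pyproj-1.9.5.1rel
cython -3 _proj.pyx
pip3 install .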
pyproj 1.9.5.1 was released on Jan 7, 2016. At that time, the latest Python version was 3.5. In my tests, pyproj 1.9.5.1 failed to install on Python 3.7.4 but succeeded on Python 3.5.7.
You need to create an environment with Python 3.5 using pyenv or conda.
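A hedged sketch of the conda route (environment name is arbitrary; older conda versions use source activate instead of conda activate):
conda create -n pyproj35 python=3.5
conda activate pyproj35
pip install pyproj==1.9.5.1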
References
pyproj 1.9.5.1 release
Python release history

python package can be installed by pip but not conda

I need the sacred package for a new code base I downloaded.
https://pypi.python.org/pypi/sacred
conda install sacred fails with
PackageNotFoundError: Package missing in current osx-64 channels:
- sacred
The instruction on the package site only explains how to install with pip. What do you do in this case?
That package is not available as a conda package at all. You can search for packages on anaconda.org: https://anaconda.org/search?q=sacred. You can see the type of package in the 4th column. Other Python packages may be available as conda packages, for instance NumPy: https://anaconda.org/search?q=numpy
As you can see, the conda package numpy is available from a number of different channels (the channel is the name before the slash). If you wanted to install a package from a different channel, you can add the option to the install/create command with the -c/--channel option, or you can add the channel to your configuration conda config --add channels channel-name.
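For instance (channel name purely illustrative), installing NumPy from the conda-forge channel would look like:
conda install -c conda-forge numpy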
If no conda package exists for a Python package, you can either install via pip (if available) or build your own conda package. This isn't usually too difficult to do for pure Python packages, especially if you can use conda skeleton to build a recipe from a package on PyPI.
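As a rough illustration of that skeleton route (this assumes conda-build is installed; the generated recipe may still need manual tweaks):
conda skeleton pypi sacred
conda build sacred
conda install --use-local sacred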
I ran into a similar issue before. If your system's default Python environment is Conda, you can download the files from https://pypi.python.org/pypi/sacred#downloads
and manually install them with
pip install C:/Desktop/some-file.whl

Why can't pip find pysvn?

I'm working on a project which was written in Python 2, and I'm upgrading it to Python 3. So far, I've just been finding minor syntax errors which are easily fixable. What I've done is create a new project in Python 3, ensure that it worked, and copy chunks of code from the old project into the new one.
Right now, I'm having trouble with pysvn. Initially, I was getting this error:
ImportError: No module named 'pysvn'
At this point, I tried using pip install pysvn, which didn't work. I got the following:
pip install pysvn
Collecting pysvn
Could not find a version that satisfies the requirement pysvn (from versions:)
No matching distribution found for pysvn
So then after a bit of research, I went to the pysvn download site and tried:
>pip install --index-url http://pysvn.tigris.org/project_downloads.html pysvn, which gave me this error:
Collecting pysvn
The repository located at pysvn.tigris.org is not a trusted or secure host and is being ignored. If this repository is available via HTTPS it is recommended to use HTTPS instead, otherwise you may silence this warning and allow it anyways with '--trusted-host pysvn.tigris.org'.
and also the same error as when I tried >pip install pysvn.
My next step was to manually download the .exe file for the version I needed, and I was able to successfully install pysvn. I have checked the site-packages directory, and pysvn is indeed there, but pip still can't tell me anything about it:
>pip show pysvn
>
When I do this for another installed module, selenium for example, I get the following:
pip show selenium
Metadata-Version: 1.1
Name: selenium
Version: 2.49.2
Summary: Python bindings for Selenium
Home-page: https://github.com/SeleniumHQ/selenium/
Author: UNKNOWN
Author-email: UNKNOWN
License: UNKNOWN
Location: ...\lib\site-packages
Requires:
I was able to verify that the installation of pysvn was successful because my project now runs instead of giving me that ImportError.
So why can pip not give me information for another module in the same directory that was successfully installed?
As it turns out, because I didn't use pip install for pysvn, pip didn't know that pysvn existed. Because it wasn't available from PyPI (the Python Package Index), there was no way that pip could see it (because that's where pip goes first to find packages that it's attempting to install).
From the pip user guide:
pip supports installing from PyPI, version control, local projects, and directly from distribution files.
Since I had eventually downloaded pysvn from its own download site (which was not any of the above 4 options) and ran the .exe manually, pip simply doesn't know about it even though it's in the same directory as other packages installed by pip.
I suppose I could've also retrieved the distribution files and used pip with those, but my workaround did the trick.
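For example (the filename below is purely hypothetical, since pysvn's own downloads are .exe installers), installing from a local distribution file is what would have let pip track the package:
pip install path\to\pysvn-<version>.whl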
My way on Linux:
Get sources from here
tar -zxf pysvn-1.9.10.tar.gz
apt-get install subversion libsvn1 libsvn-dev make g++
cd pysvn-1.9.10/Source
python setup.py configure --pycxx-dir=/pysvn-1.9.10/Import/pycxx-7.1.3/
make
Here I got the errors below:
Compile: /pysvn-1.9.10/Import/pycxx-7.1.3/Src/cxxsupport.cxx into cxxsupport.o
/pysvn-1.9.10/Import/pycxx-7.1.3/Src/cxxsupport.cxx:42:10: fatal error: Src/Python3/cxxsupport.cxx: No such file or directory
#include "Src/Python3/cxxsupport.cxx"
Compile: /pysvn-1.9.10/Import/pycxx-7.1.3/Src/cxxextensions.c into cxxextensions.o
/pysvn-1.9.10/Import/pycxx-7.1.3/Src/cxxextensions.c:42:10: fatal error: Src/Python3/cxxextensions.c: No such file or directory
#include "Src/Python3/cxxextensions.c"
You need to edit those files:
vi /pysvn-1.9.10/Import/pycxx-7.1.3/Src/cxxsupport.cxx
change #include "Src/Python3/cxxsupport.cxx" to
#include "Python3/cxxsupport.cxx"
and do the same in the second file. Then run make again:
make clean && make
...
Compile: /code/pysvn-1.9.10/Import/pycxx-7.1.3/Src/cxxextensions.c into cxxextensions.o
Compile: /code/pysvn-1.9.10/Import/pycxx-7.1.3/Src/IndirectPythonInterface.cxx into IndirectPythonInterface.o
Compile: /code/pysvn-1.9.10/Import/pycxx-7.1.3/Src/cxx_exceptions.cxx into cxx_exceptions.o
Link pysvn/_pysvn_3_7.so
Then just copy it into site-packages (adjust to your directory):
mkdir /usr/local/lib/python3.7/site-packages/pysvn
cp /code/pysvn-1.9.10/Source/pysvn/__init__.py /usr/local/lib/python3.7/site-packages/pysvn/
cp /code/pysvn-1.9.10/Source/pysvn/_pysvn*.so /usr/local/lib/python3.7/site-packages/pysvn/
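A quick sanity check after copying, just to confirm the module now resolves from site-packages:
python3 -c "import pysvn; print(pysvn.__file__)"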

"Placeholder too short" error during anaconda installation of ncurses

I'm trying to install rpy2 with anaconda using:
conda install -c https://conda.anaconda.org/r rpy2
While conda is updating dependencies and linking packages, it stops with this error:
Linking packages ...
Error: ERROR: placeholder '/root/miniconda3/envs/_build_placehold_placehold_placehold_placehold_placehold_p' too short in: ncurses-5.9-4
Here's info for the installation.
Current conda install:
platform : linux-64
conda version : 3.18.2
conda-build version : 1.14.1
python version : 2.7.10.final.0
requests version : 2.8.0
Does anyone know what this error means and how to resolve it?
When Conda installs files, some of them have the build prefix in them. That's the placeholder you see. We have to change that before packages will work on your system. That's "relocatability." The prefix that you are trying to install to is longer than the prefix that the package was built with. We can replace longer strings with shorter strings in the replacement, but not vice versa.
We have increased the path length of the build prefix in Conda-Build 2.0.0, which is in beta right now. Once people begin using this, these problems should go away. However, it will only be truly effective by rebuilding all packages that have binary-embedded prefixes. This will take quite a while.
TL;DR: try to install to a shorter folder path, if at all possible.
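For example (paths illustrative), create the environment under a short prefix and install there:
conda create -p /opt/renv python=2.7
conda install -p /opt/renv -c https://conda.anaconda.org/r rpy2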
