pip install from local directory doesn't add [build-system] dependencies - pip

In our project (Locust) we use setuptools_scm for versioning, so it is needed for all installations from a local directory.
We used to have this specified in setup.py:
setup(
    setup_requires=["setuptools_scm>=6.2"],
    ...
)
But we have upgraded to use setup.cfg and pyproject.toml:
[build-system]
requires = ["setuptools_scm>=6.2", ...]
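(For reference, a fuller [build-system] table along these lines might look roughly like the sketch below; the exact pins, the build-backend line and the [tool.setuptools_scm] table are assumptions based on typical setuptools_scm setups, not Locust's actual file.)
[build-system]
requires = ["setuptools>=45", "setuptools_scm[toml]>=6.2"]
build-backend = "setuptools.build_meta"

[tool.setuptools_scm]
# an empty table is enough to enable setuptools_scm when configured via pyproject.toml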
This works nicely in most cases, but it does not install setuptools_scm if someone does pip install -e . (it doesn't work for pip install . either, but that is less important).
With no setuptools_scm installed, the local version becomes 0.0.0:
~/git/locust pip install -e .
Looking in indexes: https://pypi.org/simple
...
Running setup.py develop for locust
Successfully installed locust-0.0.0
... and that makes me very sad.
What is the appropriate way to make pip install setuptools_scm when installing from source?
I could of course add it as a regular dependency in setup.cfg, but that would make thousands of users download setuptools_scm even when it is not needed (when installing from PyPI).

Related

Does pip only use PyPI or does it use other domains to find packages?

I know that by default, pip uses PyPI to look for packages. I would like to know whether pip uses any domains other than PyPI.
pip can install from:
PyPI
VCS project URLs
Local project directories
Local or remote source archives
To install from a local package directory, you can run pip install /opt/mypackage
Finally, run pip install --help to see all installation options.
pip can install from many different sources; you can find the whole list here.
You can also set up your own Python package repository and configure pip to install from there.
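As a rough illustration (the URLs and package names below are placeholders, not taken from the answers above):
pip install requests                                                 # from PyPI
pip install git+https://github.com/psf/requests.git                  # from a VCS project URL
pip install /opt/mypackage                                           # from a local project directory
pip install ./mypackage-1.0.tar.gz                                   # from a local source archive
pip install --index-url https://pypi.example.org/simple/ mypackage   # from your own repository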

How to install a package with Poetry that requires CLI args?

Here's an example that requires it:
poetry add pycurl
... gives:
ImportError: pycurl: libcurl link-time ssl backend (openssl) is different from compile-time ssl backend (none/other)
The fix is given in that post:
pip install --compile --install-option="--with-openssl" pycurl
How to package-manage my project now?
Must I use poetry for everything else, and manually pip install pycurl?
Or can I somehow fold it into my pyproject.toml?
Although this is old, I would like to share an alternative: re-installing the package with the CLI options.
I do not know whether running this alongside Poetry could cause any damage, but I believe it does not.
First you'd have to activate your Poetry environment with poetry shell.
After that, get the currently installed version of pycurl using pip list, and copy that value.
Now you can run pip install --compile --install-option="--with-openssl" --upgrade --force-reinstall pycurl==<version> to install it with the correct options.
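Put together, the workflow might look roughly like this (the version number is a placeholder; use whatever pip list reports):
poetry shell
pip list | grep pycurl      # note the installed version, e.g. 7.45.2
pip install --compile --install-option="--with-openssl" --upgrade --force-reinstall pycurl==7.45.2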

How to avoid pip installing a package again when it was already installed with conda?

guys:
I use conda install tensorflow-gpu to install TensorFlow 2.0, and
numpy=1.20.2 is one of the packages installed. Then I use python3 -m pip install SOMEPACKAGE; this SOMEPACKAGE needs numpy to be installed as well, but pip does not seem to check or realize that numpy is already installed...
I would like to show everything I know so far:
1. I know the packages installed via conda install go to anaconda3/envs/YOUR_ENV/lib/site-packages
2. I use python3 -m pip install -t anaconda3/envs/YOUR_ENV/lib/site-packages to force the package to be installed to the same place conda install uses.
However, pip still tries to download the *.whl file and install the package again. I do not want this installation to happen again, although pip did mention that I can use --upgrade to replace the existing package...
So I would like to know:
How do pip and conda install check whether the target package already exists before they actually go through the install process?
I think that by running python3 you are not using the interpreter from your current conda environment, so the package gets installed elsewhere.
Running python -m pip install (or simply pip install) from your activated environment should work and should treat dependencies installed by conda as already satisfied if they meet the requirements.
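A quick way to check which interpreter and site-packages you are actually targeting (illustrative commands; the environment name is assumed):
conda activate YOUR_ENV
which python                        # should point inside anaconda3/envs/YOUR_ENV/bin
python -m pip --version             # shows the site-packages directory pip will use
python -m pip install SOMEPACKAGE   # reuses the conda-installed numpy if it satisfies the requirement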

How do you install Python modules/packages so that a script can be run with pypy3 instead of Python 3? ImportError: No module named

I have Python 3.8 built from source on my Debian 10 Xfce desktop (binaries are not available in the Debian repositories). That said, whenever I can, I run my Python scripts with pypy3 for the sake of performance.
Now, when I run the following code with pypy3:
#!/usr/bin/env python3.8
import requests
from bs4 import BeautifulSoup
url = input("What is the address of the web page in question?")
response = requests.get(url)
soup = BeautifulSoup(response.text, 'html.parser')
print(soup.title.string)
I get from pypy3:
ImportError: No module named 'requests'
The same script runs under Python 3.8 without any problems.
I assume that I would have to install the module in a similar way to how I did it for Python, that is: sudo pip3.8 install requests.
Based on my research of a similar problem described on Stack Overflow, I tried:
pypy3 -m pip3.8 install requests
and got the following from my pypy3:
Error while finding module specification for 'pip3.8' (ImportError: No module named 'pip3')
Then I also tried to run:
pypy3 -m pip install requests
And got the following:
No module named pip
My pip3.8 works fine for Python 3.8, but not for my pypy3.
How should I look for modules in pypy3, and how should I install them?
Is the problem with installing and importing modules one of the reasons for the low usage of pypy3?
Run this once to install pip itself: pypy3 -m ensurepip
The next version of PyPy will improve the error message to describe this command explicitly when you do pypy3 -m pip and pip is not installed yet.
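For example, the whole fix might look roughly like this (beautifulsoup4 provides the bs4 module used in the script above; the script name is assumed):
pypy3 -m ensurepip
pypy3 -m pip install requests beautifulsoup4
pypy3 myscript.py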
Enable snaps on Debian and install pypy3
Snaps are applications packaged with all their dependencies to run on all popular Linux distributions from a single build. They update automatically and roll back gracefully.
Snaps are discoverable and installable from the Snap Store, an app store with an audience of millions.
Enable snapd:
sudo apt update
sudo apt install snapd
Install pypy3:
sudo snap install pypy3 --classic
Normally, pip and packages are installed as follows.
First of all, you need to install pip.
Install pip for Python 3
Follow the steps below to install Pip for Python 3 on Debian:
First, update the package list with:
sudo apt update
Next, install pip for Python 3 and all of its dependencies by typing:
sudo apt install python3-pip
Verify the installation by printing the pip version:
pip3 --version
The version number may be different, but it will look something like the one below:
pip 9.0.1 from /usr/lib/python3/dist-packages (python 3.5)
Pip Usage
With pip, we can install packages from PyPI, version control, local projects, and distribution files, but in most cases you will install packages from PyPI.
If we want to install a package, for example requests, we can do that by issuing the following command:
pip install requests
To uninstall a package run:
pip uninstall requests

How to uninstall packages installed with pip3 and their dependencies?

I was installing apache-airflow on my CentOS 8 machine. Only pip3 works in my environment. I did something with the environment variable which created two config files for airflow. I am not able to find the other config file to delete it. So, I was trying to uninstall airflow. I used
pip3 uninstall apache-airflow
It removed the package, but the other dependent files that were installed are still there. I googled and found pip-autoremove, but it doesn't work for pip3.
I am trying to find a way to do a clean install of airflow again by removing all the old files and dependent packages. Is there a way to use autoremove with pip3, or are there any other alternatives for my issue?
Maybe you could make a new virtual environment and then install your package inside it:
python3 -m venv /path/to/new/virtual/environment
source <venv>/bin/activate.csh
pip3 install apache-airflow
pip3 freeze > dependencies.txt
The pip3 freeze gives you a list of everything installed in that clean environment, which is apache-airflow and its dependencies. Now you can go back to your original working environment and uninstall exactly those packages:
pip3 uninstall -r <path>/dependencies.txt
Delete all the files under $AIRFLOW_HOME (default path: ~/airflow). Airflow looks for its config file at $AIRFLOW_HOME/airflow.cfg. Then reinstall airflow and set $AIRFLOW_HOME to the place where you want to keep all your config files and DAGs, as described in https://airflow.apache.org/start.html.
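A rough sketch of that clean-reinstall sequence (the paths and the final command are illustrative assumptions, not taken from the answer):
rm -rf ~/airflow                # or wherever $AIRFLOW_HOME currently points
export AIRFLOW_HOME=~/airflow   # choose where config files and DAGs should live
pip3 install apache-airflow
airflow version                 # running any airflow command regenerates airflow.cfg under $AIRFLOW_HOME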
