I'm trying to spin up a super simple package as a proof of concept, and I can't see what I'm missing.
My aim is to be able to do the following:
$ python3
>>> import mypackage
>>> mypackage.add2(2)
4
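For reference, the package code itself is trivial; a minimal sketch (assuming add2 simply adds 2, and that __init__.py re-exports it):

# src/mypackage/mymodule.py
def add2(x):
    # hypothetical implementation: return x plus 2
    return x + 2

# src/mypackage/__init__.py
from .mymodule import add2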
Github link
I created a public repo to reproduce the issue here
git clone https://github.com/OliverFarren/testPackage
Problem
I have a basic file structure as follows:
src/
    mypackage/
        __init__.py
        mymodule.py
setup.cfg
setup.py
pyproject.toml
setup.cfg is pretty much boilerplate from here
setup.py is just to allow pip install in editable mode:
import setuptools
setuptools.setup()
I ran the following commands at the top level directory in my Pycharm virtual env:
python3 -m pip install --upgrade build
python3 -m build
That created the dist and build directories and the mypackage.egg-info directory, so now the project looks like this:
testpackage/
    build/
        bdist.linux-x86_64/
    dist/
        mypackage-0.1.0.tar.gz
        mypackage-0.1.0-py3-none-any.whl
    src/
        mypackage/
            mypackage.egg-info/
            __init__.py
            mymodule.py
    setup.cfg
    setup.py
    pyproject.toml
I've then tried to install the package as follows:
sudo pip3 install -e .
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing wheel metadata ... done
Installing collected packages: mypackage
Running setup.py develop for mypackage
Successfully installed mypackage
Which I think should have installed it. Except when I try to import the package, I get a ModuleNotFoundError.
I'm wondering whether this is a permissions issue of some sort. When I try:
sudo pip3 list
pip3 list
I notice I'm getting different outputs. I can see my package present in the list, and it's in my sys.path:
~/testpackage/src/mypackage
I just don't understand what I'm missing here. Any advice would be appreciated!
OK, so I found the issue. Posting the solution and leaving the GitHub repo live, with the fix, in case anyone else has this issue.
It turns out my setup.cfg wasn't boilerplate.
Here was my incorrect code:
[metadata]
# replace with your username:
name = mypackage
author = Oliver Farren
version = 0.1.0
description = Test Package
classifiers =
    Programming Language :: Python :: 3
    License :: OSI Approved :: MIT License
    Operating System :: OS Independent

[options]
package_dir =
    = src/mypackage
packages = find:
python_requires = >=3.6

[options.packages.find]
where = src/mypackage
src/mypackage should be just src in both places: as written, setuptools was looking inside the package itself for packages.
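The corrected sections:

[options]
package_dir =
    = src
packages = find:
python_requires = >=3.6

[options.packages.find]
where = src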
A key step in debugging this issue was checking the mypackage.egg-info files. SOURCES.txt contains a list of all the files in the built package, and I could clearly see that in the incorrect build, src/mypackage/mymodule.py and src/mypackage/__init__.py were missing. So the package was correctly installed by pip, but being empty, it made for a very confusing error message.
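For comparison, a correct build's SOURCES.txt lists the module files, roughly like this (exact contents may vary):

pyproject.toml
setup.cfg
setup.py
src/mypackage/__init__.py
src/mypackage/mymodule.py
src/mypackage.egg-info/PKG-INFO
src/mypackage.egg-info/SOURCES.txt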
Related
How can I save my own package to conda environment so it is importable from any location once environment is activated?
When we conda activate my_env and pip install a package, that package can be imported no matter where file.py is located. What can I do to have my own_local_package importable the same way once my_env is activated?
You can use pip to install the package locally and then use import mypackage the same way you do with any other module. The correct approach is:
python -m pip install -e /path_to_package/mypackage/
python -m ensures you are using the pip from the same Python installation you are currently using.
-e makes it editable, i.e. changes you make to the source are picked up without reinstalling, instead of using a snapshot copied into site-packages.
mypackage must contain an __init__.py file, and a basic setup.py (or a pyproject.toml file).
the package structure must be like this:
mypackage/
    setup.py
    mypackage/
        __init__.py
minimal setup.py
from setuptools import find_packages, setup

setup(
    name='mypackage',  # Required
    version='0.0.1',  # Required
    packages=find_packages(),  # Required
)
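You can sanity-check what find_packages() will discover; run from the directory containing setup.py (illustrative output):

>>> from setuptools import find_packages
>>> find_packages()
['mypackage']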
For a more elaborate package, the structure must be like this:
mypackage/
    setup.py
    src/
        mypackage/
            __init__.py
            __main__.py
            additional python files
            ...
minimal setup.py
from setuptools import find_packages, setup

setup(
    name='mypackage',  # Required
    version='0.0.1',  # Required
    package_dir={'': 'src'},  # packages live under src/
    packages=find_packages(where='src'),  # Required
)
Let's say I have the following package structure:
package/
    mynamespace-subpackage-a/
        setup.py
        mynamespace/
            subpackage_a/
                __init__.py
    mynamespace-subpackage-b/
        setup.py
        mynamespace/
            subpackage_b/
                __init__.py
                module_b.py
with setup.py in package a:
from setuptools import find_packages, setup

setup(
    name='mynamespace-subpackage-a',
    # ...
    packages=find_packages(),
    namespace_packages=['mynamespace'],
    install_requires=['pandas'],
)
and package b:
from setuptools import find_packages, setup

setup(
    name='mynamespace-subpackage-b',
    # ...
    packages=find_packages(),
    namespace_packages=['mynamespace'],
    install_requires=[],
)
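For this old-style namespace_packages setup to work, each copy of mynamespace/__init__.py normally contains only the pkg_resources namespace declaration:

# mynamespace/__init__.py (identical in both subpackages)
__import__('pkg_resources').declare_namespace(__name__)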
Package b uses package a, but it does not reference the pandas library itself, so pandas is not listed in its install_requires. pandas should still be installed when pip install . is executed inside package b, and package a should be pulled in along with it.
What should be added in the second setup file to achieve this, and is this even possible? Or should pandas be in the requirements list of package b as well?
I would suspect something like:
install_requires = ['mynamespace.subpackage_a']
From what I understood from the question, I believe it should be:
package/mynamespace-subpackage-b/setup.py:
# ...
setup(
    name='mynamespace-subpackage-b',
    # ...
    install_requires=[
        'mynamespace-subpackage-a',
        # ...
    ],
)
This obviously assumes that pip can find a when installing b, meaning a distribution of a should be published on some kind of index (such as PyPI, for example). If that is not possible, then maybe one of the following alternatives could help:
Place distributions of a and b (wheel or source distribution) in a local directory, and then use pip's --find-links option (doc): pip install --find-links=path/to/distributions mynamespace-subpackage-b
Use a direct reference file URL as seen in PEP 440: install_requires=['a @ file:///path/to/a.whl']
Use a direct remote URL (a VCS such as git would work); the URL could point to a private repository or the local file system: install_requires=['mynamespace-subpackage-a @ git+file:///path/to/mynamespace-subpackage-a@master']. This assumes setup.py is at the root of the repository. An example of this style is sketched below.
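For example, the direct-reference option in package b's setup.py might look like this (the path and wheel filename are hypothetical):

from setuptools import find_packages, setup

setup(
    name='mynamespace-subpackage-b',
    version='0.0.1',
    packages=find_packages(),
    namespace_packages=['mynamespace'],
    install_requires=[
        # PEP 440 direct reference; path and filename are hypothetical
        'mynamespace-subpackage-a @ file:///path/to/distributions/mynamespace_subpackage_a-0.0.1-py3-none-any.whl',
    ],
)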
When creating a Python package and uploading it to PyPI, it will automatically install the requirements that are put in the setup.py file under install_requires, e.g.
from distutils.core import setup

setup(
    name='a_package',
    packages=['a_package'],
    install_requires=['another_package'],
)
When the package has a cython extension (and .pyx files instead of .c/.cpp files), the setup.py file will need to import cython to create an installable extension, e.g.
from distutils.core import setup
from distutils.extension import Extension
from Cython.Distutils import build_ext

setup(
    name='a_package',
    packages=['a_package'],
    install_requires=['another_package'],
    cmdclass={'build_ext': build_ext},
    ext_modules=[Extension('the_extension', sources=['a_file.pyx'])],
)
But since Cython is imported before the setup part is executed, when trying to install this package through pip from source (rather than from a wheel) downloaded from PyPI, it will fail to install because it cannot import Cython, as it has not reached the part with the requirements yet.
I'm wondering what can be done to ensure that a pip install of this package from PyPI installs Cython before it tries to import it. Adding a requirements.txt with cython does not seem to add automatic install requirements for files downloaded from PyPI.
Now, I realize it's possible to just pip install cython before pip install thispackage, but I'm wondering if there's a better fix that would allow installing the package along with Cython directly from PyPI when it's not possible to run an additional command (without resorting to uploading the .c files and adjusting the setup.py file to use them instead of the .pyx).
What you're describing is a "build time dependency", and this is precisely the use case "PEP 518 -- Specifying Minimum Build System Requirements for Python Projects" was created for.
You can specify cython as a build-time dependency by adding a pyproject.toml file like:
[build-system]
requires = ["cython"]
Then when installing your package with a modern version of pip (or another PEP 518 compatible installer), cython will be installed into the build environment before your setup.py script is run.
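Note that in practice the build backend itself usually needs to be listed as well; a more complete pyproject.toml might look like this:

[build-system]
requires = ["setuptools", "wheel", "cython"]
build-backend = "setuptools.build_meta"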
I am trying to use python-ldap with AWS Lambda. I downloaded the tarball from: https://pypi.python.org/pypi/python-ldap
and wrote the following code to use it in Lambda (lambda_function.py):
from ldap_dir.ldap_query.Lib import ldap
and uploaded the zip to Lambda.
where my directory structure is
ldap_dir/
    ldap_query/
        Lib/
            ldap/
    lambda_function.py
Am I missing something?
python-ldap is built on top of the native OpenLDAP libraries. This article, even though unrelated to the python-ldap module, describes how to bundle Python packages that have native dependencies.
The outline of this is the following:
Create an Amazon EC2 instance with Amazon Linux
Install compiler packages as well as the OpenLDAP developer package. yum install -y gcc openldap-devel
Create a virtual environment: virtualenv env
Activate the virtual environment: source env/bin/activate
Upgrade pip (I am not sure this is necessary, but I got a warning without this): pip install --upgrade pip
Install python-ldap: pip install python-ldap
Create a handler Python script, for example, lambda.py with the following code:
import os
import subprocess

libdir = os.path.join(os.getcwd(), 'local', 'lib')

def handler(event, context):
    # point the dynamic linker at the bundled OpenLDAP libraries,
    # then run the actual LDAP script in a subprocess
    command = 'LD_LIBRARY_PATH={} python ldap.py'.format(libdir)
    subprocess.call(command, shell=True)
Implement your LDAP function, in this example ldap.py:
import ldap
print ldap.PORT
Create a zip package, let's say ldap.zip:
zip -9 ~/ldap.zip ldap.py
zip -9 ~/ldap.zip lambda.py
cd env/lib/python2.7/site-packages
zip -r9 ~/ldap.zip *
cd ../../../lib64/python2.7/site-packages
zip -r9 ~/ldap.zip *
Download the zip to your system (or put it into an S3 bucket). Now you can create your Lambda function using lambda.handler as the function name and use the zip file as the code.
I hope this helps.
One more check on top of the solution above:
If you still get No module named '_ldap', check that the Python version you installed with on the local/EC2 machine is the same as the runtime configured for the Lambda function.
I'm working on a project which was written in Python 2, and I'm upgrading it to Python 3. So far, I've just been finding minor syntax errors which are easily fixable. What I've done is created a new project in Python 3, ensured that it worked, and copied chunks of code from the old project into the new one.
Right now, I'm having trouble with pysvn. Initially, I was getting this error:
ImportError: No module named 'pysvn'
At this point, I tried using pip install pysvn, which didn't work. I got the following:
pip install pysvn
Collecting pysvn
Could not find a version that satisfies the requirement pysvn (from versions:)
No matching distribution found for pysvn
So then after a bit of research, I went to the pysvn download site and tried:
>pip install --index-url http://pysvn.tigris.org/project_downloads.html pysvn
which gave me this error:
Collecting pysvn
The repository located at pysvn.tigris.org is not a trusted or secure host and is being ignored. If this repository is available via HTTPS it is recommended to use HTTPS instead, otherwise you may silence this warning and allow it anyways with '--trusted-host pysvn.tigris.org'.
and also the same error as when I tried >pip install pysvn.
My next step was to manually download the .exe file for the version I needed, and I was able to successfully install pysvn. I have checked the site-packages directory, and pysvn is indeed there, but pip still can't tell me anything about it:
>pip show pysvn
>
When I do this for another installed module, selenium for example, I get the following:
pip show selenium
Metadata-Version: 1.1
Name: selenium
Version: 2.49.2
Summary: Python bindings for Selenium
Home-page: https://github.com/SeleniumHQ/selenium/
Author: UNKNOWN
Author-email: UNKNOWN
License: UNKNOWN
Location: ...\lib\site-packages
Requires:
I was able to verify that the installation of pysvn was successful because my project now runs instead of giving me that ImportError.
So why can pip not give me information for another module in the same directory that was successfully installed?
As it turns out, because I didn't use pip install for pysvn, pip didn't know that pysvn existed. Because it wasn't available from PyPI (the Python Package Index), there was no way for pip to see it (because that's where pip goes first to find packages it's attempting to install).
From the pip user guide:
pip supports installing from PyPI, version control, local projects, and directly from distribution files.
Since I had eventually downloaded pysvn from its own download site (which was not any of the above four options) and run the .exe manually, pip simply doesn't know about it, even though it's in the same directory as other packages installed by pip.
I suppose I could've also retrieved the distribution files and used pip with those, but my workaround did the trick.
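For reference, installing from a downloaded distribution file so that pip records the metadata would have looked something like this (the wheel filename is hypothetical):
pip install C:\path\to\pysvn-1.9.5-cp35-none-win_amd64.whl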
My way on Linux:
Get sources from here
tar -zxf pysvn-1.9.10.tar.gz
apt-get install subversion libsvn1 libsvn-dev make g++
cd pysvn-1.9.10/Source
python setup.py configure --pycxx-dir=/pysvn-1.9.10/Import/pycxx-7.1.3/
make
Here I got the errors below:
Compile: /pysvn-1.9.10/Import/pycxx-7.1.3/Src/cxxsupport.cxx into cxxsupport.o
/pysvn-1.9.10/Import/pycxx-7.1.3/Src/cxxsupport.cxx:42:10: fatal error: Src/Python3/cxxsupport.cxx: No such file or directory
#include "Src/Python3/cxxsupport.cxx"
Compile: /pysvn-1.9.10/Import/pycxx-7.1.3/Src/cxxextensions.c into cxxextensions.o
/pysvn-1.9.10/Import/pycxx-7.1.3/Src/cxxextensions.c:42:10: fatal error: Src/Python3/cxxextensions.c: No such file or directory
#include "Src/Python3/cxxextensions.c"
You need to edit those files:
vi /pysvn-1.9.10/Import/pycxx-7.1.3/Src/cxxsupport.cxx
change #include "Src/Python3/cxxsupport.cxx" to
#include "Python3/cxxsupport.cxx"
and do the same in the second file. Then run make again:
make clean && make
...
Compile: /code/pysvn-1.9.10/Import/pycxx-7.1.3/Src/cxxextensions.c into cxxextensions.o
Compile: /code/pysvn-1.9.10/Import/pycxx-7.1.3/Src/IndirectPythonInterface.cxx into IndirectPythonInterface.o
Compile: /code/pysvn-1.9.10/Import/pycxx-7.1.3/Src/cxx_exceptions.cxx into cxx_exceptions.o
Link pysvn/_pysvn_3_7.so
Then just copy the results into site-packages (adjust to your directory):
mkdir /usr/local/lib/python3.7/site-packages/pysvn
cp /code/pysvn-1.9.10/Source/pysvn/__init__.py /usr/local/lib/python3.7/site-packages/pysvn/
cp /code/pysvn-1.9.10/Source/pysvn/_pysvn*.so /usr/local/lib/python3.7/site-packages/pysvn/
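A quick check that the copy worked (assuming pysvn exposes its usual version tuple):
python3 -c "import pysvn; print(pysvn.version)"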