This is my .github/workflows/push.yaml file:
name: Push
on:
  push:
    branches:
      - main
  pull_request:
    branches:
      - main
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout sources
        uses: actions/checkout@v3
      - name: Setup Python 3.10
        uses: actions/setup-python@v4
        with:
          python-version: "3.10"
      - name: Install dependencies
        run: |
          pip install poetry
          poetry install
          python -m pip install -e .
      - name: Run tests
        run: python -m unittest discover tests
      - name: Check code style
        run: pre-commit run --all-files
These are the dependencies declared in the pyproject.toml file:
[tool.poetry.dependencies]
python = "^3.10"
pyyaml = "^6.0"
protobuf = "^4.21.12"
gspread = "^5.7.2"
openpyxl = "^3.1.0"
[tool.poetry.group.dev.dependencies]
pre-commit = "^3.0.4"
[tool.poetry.group.test.dependencies]
pandas = "^1.5.3"
I get the following error in GitHub when I push some code to my repo:
======================================================================
ERROR: test_to_xlsx (unittest.loader._FailedTest)
----------------------------------------------------------------------
ImportError: Failed to import test module: test_generic
Traceback (most recent call last):
  File "/opt/hostedtoolcache/Python/3.10.9/x64/lib/python3.10/unittest/loader.py", line 436, in _find_test_path
    module = self._get_module_from_name(name)
  File "/opt/hostedtoolcache/Python/3.10.9/x64/lib/python3.10/unittest/loader.py", line 377, in _get_module_from_name
    __import__(name)
  File "/home/runner/work/path/to/tests/test_generic.py", line 4, in <module>
    import pandas as pd
ModuleNotFoundError: No module named 'pandas'
When I check the "Install dependencies" step log in GitHub, pandas is supposedly
being installed:
• Installing pandas (1.5.3)
However, when I add pip install pandas to my Install dependencies step, all tests pass:
- name: Install dependencies
  run: |
    pip install poetry
    poetry install
    python -m pip install -e .
    pip install pandas
----------------------------------------------------------------------
Ran 26 tests in 0.157s
OK
Obviously, I don't want to add pip install pandas like that. However, it shows that the pandas installed by Poetry is not being picked up by the tests. What can I do in order to fix this annoying error?
PS: My poetry.lock file has the pandas package declared.
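A likely explanation (an assumption, since the log doesn't show which interpreter runs each step) is that `poetry install` puts pandas into Poetry's own virtual environment, while the bare `python -m unittest` invocation uses the runner's system interpreter, where only the `pip install -e .` packages live. One hedged fix is to route the test and lint commands through `poetry run` so they execute inside Poetry's environment:

```yaml
- name: Install dependencies
  run: |
    pip install poetry
    poetry install
- name: Run tests
  run: poetry run python -m unittest discover tests
- name: Check code style
  run: poetry run pre-commit run --all-files
```

With this layout the separate `pip install -e .` step also becomes unnecessary, since `poetry install` already installs the project into its environment.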
Related
I have added a machine-learning model and am setting up a CI pipeline in GitHub; however, while executing the workflow, I get an error about a missing required root key on. Below is my yml file code.
ci_pipeline:
  on:
    push:
      branches:
        - main
  steps:
    - uses: actions/checkout@v1
      with:
        fetch-depth: 0
    - name: Set up Python 3.9
      uses: actions/setup-python@v1
      with:
        python-version: 3.9
    - name: Install dependencies
      run: |
        python -m pip install --upgrade pip
        if [ -f requirements.txt ]; then pip install -r requirements.txt; fi
    - name: Format
      run: |
        black app.py
    - name: Lint
      run: |
        pylint --disable=R,C app.py
    - name: Test
      run: |
        python -m pytest -vv test.py
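The error message points at the top-level layout: GitHub Actions requires `on:` (and `jobs:`) as root keys of the workflow file, whereas here everything is nested under `ci_pipeline:`. A minimal sketch of the expected skeleton, reusing the job name and steps from the snippet above (the `runs-on` value is an assumption, since the original omits it):

```yaml
name: CI
on:
  push:
    branches:
      - main
jobs:
  ci_pipeline:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v1
        with:
          fetch-depth: 0
      # ... remaining steps unchanged ...
```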
I have a Docker image with poetry-core installed. Inside the container, I tried to run a pip install of the project, and it failed with a package-not-found error.
root@test-86978fffbb-6rx79:/home# /usr/local/bin/python3 -m pip show poetry-core
Name: poetry-core
Version: 1.0.8
Summary: Poetry PEP 517 Build Backend
Home-page: https://github.com/python-poetry/poetry-core
Author: Sébastien Eustace
Author-email: sebastien@eustace.io
License: MIT
Location: /usr/local/lib/python3.9/site-packages
Requires:
Required-by:
My pyproject.toml
[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
When I run pip install with internet access disabled, it fails because it cannot find poetry-core:
root@test-86978fffbb-6rx79:/app# /usr/local/bin/python3 -m pip install /app/
Processing /app
Installing build dependencies ... error
error: subprocess-exited-with-error
× pip subprocess to install build dependencies did not run successfully.
│ exit code: 1
╰─> [2 lines of output]
ERROR: Could not find a version that satisfies the requirement poetry-core>=1.0.0 (from versions: none)
ERROR: No matching distribution found for poetry-core>=1.0.0
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error
× pip subprocess to install build dependencies did not run successfully.
│ exit code: 1
╰─> See above for output.
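By default, pip builds packages in an isolated environment and tries to download the build backend declared in `[build-system]` from the index, ignoring the poetry-core already present in site-packages. A hedged workaround for the offline case is to disable build isolation so the pre-installed copy is used (assuming any other build requirements are likewise already installed in the image):

```shell
# Reuse the poetry-core already installed in the image instead of
# letting pip download build dependencies into an isolated environment.
python3 -m pip install --no-build-isolation /app/
```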
I want to test my Python package not only when built locally, but also when it is installed via pip. For this, I basically want to do a pip3 install git+https://github.com/…, like this:
name: CI test
on: [push, pull_request]
jobs:
  tests:
    name: Pip test
    runs-on: ubuntu-20.04
    steps:
      - name: Setup basic environment
        run: |
          sudo apt-get update
          sudo apt-get install --no-install-recommends python3-dev python3-pip
      - name: Install my package via pip
        run: |
          pip3 install git+$GITHUB_SERVER_URL/$GITHUB_REPOSITORY@$GITHUB_SHA
      - name: Run tests
        run: |
          python3 -m pytest -s --pyargs mypkg
This works nicely for pushed commits, but the Pull Request actions fail claiming that the GITHUB_SHA is "not a tree".
How can I change this so that it works on both commits and pull requests?
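For pull_request events, GITHUB_SHA refers to a temporary merge commit that exists only in the runner's checkout, not on the remote, so a fresh git clone cannot resolve it ("not a tree"). One hedged approach, an assumption rather than something from the question, is to check the repository out first and point pip at the local clone; this behaves identically for pushes and pull requests:

```yaml
steps:
  - name: Check out the commit under test (merge commit for PRs)
    uses: actions/checkout@v3
  - name: Install my package via pip
    run: pip3 install .
  - name: Run tests
    run: python3 -m pytest -s --pyargs mypkg
```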
I want to set up GitLab CI for my Python project with SonarQube, and I have one problem with that.
I set the SonarQube variables in the GitLab CI settings.
This is my gitlab-ci.yml file:
variables:
  SONARQUBE_ARGUMENTS_NORMAL: -Dsonar.host.url=$SONAR_HOST_URL -Dsonar.login=$SONAR_LOGIN -Dsonar.password=$SONAR_PASSWORD --stacktrace
  SONARQUBE_ARGUMENTS_PREVIEW: -Dsonar.host.url=$SONAR_HOST_URL -Dsonar.login=$SONAR_LOGIN -Dsonar.password=$SONAR_PASSWORD --stacktrace -Dsonar.analysis.mode=preview -Dsonar.gitlab.project_id=$CI_PROJECT_PATH -Dsonar.gitlab.commit_sha=$CI_COMMIT_SHA -Dsonar.gitlab.ref_name=$CI_COMMIT_REF_NAME
  STAGE_ID: ${CI_PROJECT_NAME}_${CI_COMMIT_REF_NAME}_${CI_JOB_NAME}_${CI_JOB_ID}

image: "python:3.7"

before_script:
  - python --version
  - python -c 'import struct;print( 8 * struct.calcsize("P"))'
  - pip install --upgrade pip
  - pip install --upgrade setuptools
  - pip install pytest
  - pip install -r requirements.txt

stages:
  - Static Analysis
  - Test

mypy:
  stage: Static Analysis
  script:
    - python -m pip install mypy
    - pwd
    - ls -l
    - python -m mypy --ignore-missing-imports *.py

flake8:
  stage: Static Analysis
  script:
    - python -m pip install flake8
    - flake8 --max-line-length=120 /*.py

pylint:
  stage: Static Analysis
  script:
    - pip install pylint
    - pylint -d C0301,C0114 *.py

test:
  stage: Test
  script:
    - pwd
    - ls -l
    - export PYTHONPATH="$PYTHONPATH:."
    - python -c "import sys;print(sys.path)"
    - pytest sonarqube $SONARQUBE_ARGUMENTS_NORMAL

after_script:
  - echo "End CI"
Static analysis (flake8, mypy, pylint) works fine, but I have a problem with the test stage, where I use pytest.
In CI log I have that:
$ pytest sonarqube $SONARQUBE_ARGUMENTS_NORMAL
ERROR: usage: pytest [options] [file_or_dir] [file_or_dir] [...]
pytest: error: unrecognized arguments: -Dsonar.host.url=http://127.0.0.0:8000 -Dsonar.login=my_login -Dsonar.password=my_password --stacktrace
Can you give me any idea what I am doing wrong?
I do the same thing in an Android project with Gradle, and that works fine:
./gradlew test sonarqube $SONARQUBE_ARGUMENTS_NORMAL
You need to install the sonar-scanner package on your runner and call the sonar-scanner binary directly, with the needed properties in a sonar-project.properties file.
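pytest knows nothing about the -Dsonar.* flags, so passing $SONARQUBE_ARGUMENTS_NORMAL to it fails; unlike Gradle's sonarqube task, the analysis has to be run by a separate tool. A hedged sketch of a dedicated job using the standalone scanner image (the image tag and stage placement are assumptions):

```yaml
sonarqube:
  stage: Test
  image: sonarsource/sonar-scanner-cli:latest
  script:
    # sonar-scanner reads project settings from sonar-project.properties
    # in the repository root; the -D flags come from the variables above.
    - sonar-scanner $SONARQUBE_ARGUMENTS_NORMAL
```

The existing test job would then run plain `pytest` without the SonarQube arguments.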
I am getting an SSL certificate error while deploying the following code on AWS Lambda using the AWS CodeStar build pipeline.
I looked at multiple community discussions; nothing worked out.
version: 0.2

phases:
  install:
    commands:
      # Upgrade AWS CLI & pip to the latest version
      - pip install --upgrade awscli
      - pip install --upgrade pip
      # Define directories
      - export HOME_DIR=`pwd`
      - export NLTK_DATA=$HOME_DIR/nltk_data
  pre_build:
    commands:
      - cd $HOME_DIR
      # Create a virtualenv to package for Lambda
      - virtualenv venv
      - . venv/bin/activate
      # Install supporting libraries
      - pip install -U scikit-learn
      - pip install -U requests
      # Install WordNet
      - pip install -U nltk
      - python -m nltk.downloader -d $NLTK_DATA punkt
      # Output requirements
      - pip freeze > requirements.txt
      # Discover and run unit tests in the 'tests' directory. For more information, see https://docs.python.org/3/library/unittest.html#test-discovery
      - python -m unittest discover tests
  build:
    commands:
      - cd $HOME_DIR
      - mv $VIRTUAL_ENV/lib/python3.6/site-packages/* .
The only way that worked for me was to download the modules and install them into an nltk_data folder inside my source folder, then create a Lambda environment variable NLTK with the value ./nltk_data.
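That environment variable still has to be picked up at runtime. As a sketch (the variable name NLTK and the bundled folder follow the answer above; the helper function itself is hypothetical), the handler can resolve the data directory and prepend it to nltk's search path:

```python
import os


def resolve_nltk_data_dir(base_dir):
    """Return the bundled nltk_data directory.

    Prefers the NLTK environment variable described above, falling back
    to an nltk_data folder next to the handler code.
    """
    return os.environ.get("NLTK", os.path.join(base_dir, "nltk_data"))


# In the Lambda handler module, assuming nltk is packaged with the code:
#   import nltk
#   nltk.data.path.insert(0, resolve_nltk_data_dir(os.path.dirname(__file__)))
```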