I want to add these jobs on the server, but I get an error (PM2):
apps:
  - name: laravel-queue-worker
    script:php /var/www/html/backendcodes/artisan
    exec_mode: fork
    interpreter: php
    instances: 1
    args:
      - queue:work
      - --queue=sendSms,sendNotification,sendEmail
      - --tries=3
      - --sleep=3
When I run
pm2 start laravel-queue-worker.yml
it returns:
[PM2][ERROR] File laravel-queue-worker.yml malformated
{ [Error: Unable to parse.]
message: 'Unable to parse.',
parsedLine: 3,
snippet: 'script:php /var/www/html/raffle-backend-master/artisan' }
Try with this .yaml; you can use an online linter to check the syntax.
The problem was here -> script:php /var/www/html/backendcodes/artisan
You need to insert a space after script:.
apps:
  -
    args:
      - "queue:work"
      - "--queue=sendSms,sendNotification,sendEmail"
      - "--tries=3"
      - "--sleep=3"
    exec_mode: fork
    instances: 1
    interpreter: php
    name: laravel-queue-worker
    script: "php /var/www/html/backendcodes/artisan"
When I run pre-commit run --all-files, everything passes, but when I try to commit, pylint fails with exit code 32, followed by the list of usage options. The only files changed are .py files:
git status
On branch include-gitlab-arg
Your branch is up to date with 'origin/include-gitlab-arg'.
Changes to be committed:
(use "git restore --staged <file>..." to unstage)
renamed: code/project1/src/Main.py -> code/project1/src/GitLab/GitLab_runner_token_getter.py
renamed: code/project1/src/get_gitlab_runner_token.py -> code/project1/src/GitLab/get_gitlab_runner_token.py
modified: code/project1/src/__main__.py
modified: code/project1/src/control_website.py
deleted: code/project1/src/get_website_controller.py
modified: code/project1/src/helper.py
Error Output:
The git commit -m "some change." command yields the following pre-commit error:
pylint...................................................................Failed
- hook id: pylint
- exit code: 32
usage: pylint [options]

optional arguments:
  -h, --help            show this help message and exit

Commands:
  Options which are actually commands. Options in this group are mutually exclusive.

  --rcfile RCFILE
whereas pre-commit run --all-files passes.
And the .pre-commit-config.yaml contains:
# This file specifies which checks are performed by the pre-commit service.
# The pre-commit service prevents people from pushing code to git that is not
# up to standards. # The reason mirrors are used instead of the actual
# repositories for e.g. black and flake8, is because those repositories also
# need to contain a pre-commit hook file, which they often don't by default.
# So to resolve that, a mirror is created that includes such a file.
default_language_version:
  python: python3.8. # or python3
repos:
  # Test if the python code is formatted according to the Black standard.
  - repo: https://github.com/Quantco/pre-commit-mirrors-black
    rev: 22.6.0
    hooks:
      - id: black-conda
        args:
          - --safe
          - --target-version=py36
  # Test if the python code is formatted according to the flake8 standard.
  - repo: https://github.com/Quantco/pre-commit-mirrors-flake8
    rev: 5.0.4
    hooks:
      - id: flake8-conda
        args: ["--ignore=E501,W503,W504,E722,E203"]
  # Test if the import statements are sorted correctly.
  - repo: https://github.com/PyCQA/isort
    rev: 5.10.1
    hooks:
      - id: isort
        args: ["--profile", "black", --line-length=79]
  ## Test if the variable typing is correct. (Variable typing is when you say:
  ## def is_larger(nr: int) -> bool: instead of def is_larger(nr). It makes
  ## it explicit what type of input and output a function has.
  ## - repo: https://github.com/python/mypy
  # - repo: https://github.com/pre-commit/mirrors-mypy
  #### - repo: https://github.com/a-t-0/mypy
  #   rev: v0.982
  #   hooks:
  #     - id: mypy
  ## Tests if there are spelling errors in the code.
  # - repo: https://github.com/codespell-project/codespell
  #   rev: v2.2.1
  #   hooks:
  #     - id: codespell
  # Performs static code analysis to check for programming errors.
  - repo: local
    hooks:
      - id: pylint
        name: pylint
        entry: pylint
        language: system
        types: [python]
        args:
          [
            "-rn", # Only display messages
            "-sn", # Don't display the score
            "--ignore-long-lines", # Ignores long lines.
          ]
  # Runs additional tests that are created by the pre-commit software itself.
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.3.0
    hooks:
      # Check user did not add large files.
      - id: check-added-large-files
      # Check if `.py` files are written in valid Python syntax.
      - id: check-ast
      # Require literal syntax when initializing empty or zero Python builtin types.
      - id: check-builtin-literals
      # Checks if there are filenames that would conflict if case is changed.
      - id: check-case-conflict
      # Checks if the Python functions have docstrings.
      - id: check-docstring-first
      # Checks if any `.sh` files have a shebang like #!/bin/bash
      - id: check-executables-have-shebangs
      # Verifies json format of any `.json` files in repo.
      - id: check-json
      # Checks if there are any existing merge conflicts caused by the commit.
      - id: check-merge-conflict
      # Checks for symlinks which do not point to anything.
      - id: check-symlinks
      # Checks if xml files are formatted correctly.
      - id: check-xml
      # Checks if .yml files are valid.
      - id: check-yaml
      # Checks if debugger imports are performed.
      - id: debug-statements
      # Detects symlinks changed to regular files with content path symlink was pointing to.
      - id: destroyed-symlinks
      # Checks if you don't accidentally push a private key.
      - id: detect-private-key
      # Replaces double quoted strings with single quoted strings.
      # This is not compatible with Python Black.
      #- id: double-quote-string-fixer
      # Makes sure files end in a newline and only a newline.
      - id: end-of-file-fixer
      # Removes UTF-8 byte order marker.
      - id: fix-byte-order-marker
      # Add <# -*- coding: utf-8 -*-> to the top of python files.
      #- id: fix-encoding-pragma
      # Checks if there are different line endings, like \n and crlf.
      - id: mixed-line-ending
      # Asserts `.py` files in folder `/test/` (by default:) end in `_test.py`.
      - id: name-tests-test
        # Override default to check if `.py` files in `/test/` START with `test_`.
        args: ['--django']
      # Ensures JSON files are properly formatted.
      - id: pretty-format-json
        args: ['--autofix']
      # Sorts entries in requirements.txt and removes incorrect pkg-resources entries.
      - id: requirements-txt-fixer
      # Sorts simple YAML files which consist only of top-level keys.
      - id: sort-simple-yaml
      # Removes trailing whitespaces at end of lines of .. files.
      - id: trailing-whitespace
  - repo: https://github.com/PyCQA/autoflake
    rev: v1.7.0
    hooks:
      - id: autoflake
        args: ["--in-place", "--remove-unused-variables", "--remove-all-unused-imports", "--recursive"]
        name: AutoFlake
        description: "Format with AutoFlake"
        stages: [commit]
  - repo: https://github.com/PyCQA/bandit
    rev: 1.7.4
    hooks:
      - id: bandit
        name: Bandit
        stages: [commit]
  # Enforces formatting style in Markdown (.md) files.
  - repo: https://github.com/executablebooks/mdformat
    rev: 0.7.16
    hooks:
      - id: mdformat
        additional_dependencies:
          - mdformat-toc
          - mdformat-gfm
          - mdformat-black
  - repo: https://github.com/MarcoGorelli/absolufy-imports
    rev: v0.3.1
    hooks:
      - id: absolufy-imports
        files: '^src/.+\.py$'
        args: ['--never', '--application-directories', 'src']
  - repo: https://github.com/myint/docformatter
    rev: v1.5.0
    hooks:
      - id: docformatter
  - repo: https://github.com/pre-commit/pygrep-hooks
    rev: v1.9.0
    hooks:
      - id: python-use-type-annotations
      - id: python-check-blanket-noqa
      - id: python-check-blanket-type-ignore
  # Updates the syntax of `.py` files to the specified python version.
  # It is not compatible with: pre-commit hook: fix-encoding-pragma
  - repo: https://github.com/asottile/pyupgrade
    rev: v3.0.0
    hooks:
      - id: pyupgrade
        args: [--py38-plus]
  - repo: https://github.com/markdownlint/markdownlint
    rev: v0.11.0
    hooks:
      - id: markdownlint
With pyproject.toml:
# This is used to configure the black, isort and mypy such that the packages don't conflict.
# This file is read by the pre-commit program.
[tool.black]
line-length = 79
include = '\.pyi?$'
exclude = '''
/(
\.git
| \.mypy_cache
| build
| dist
)/
'''
[tool.coverage.run]
# Due to a strange bug with xml output of coverage.py not writing the full-path
# of the sources, the full root directory is presented as a source alongside
# the main package. As a result any importable Python file/package needs to be
# included in the omit
source = [
"foo",
".",
]
# Excludes the following directories from the coverage report
omit = [
"tests/*",
"setup.py",
]
[tool.isort]
profile = "black"
[tool.mypy]
ignore_missing_imports = true
[tool.pylint.basic]
bad-names=[]
[tool.pylint.messages_control]
# Example: Disable error on needing a module-level docstring
disable=[
"import-error",
"invalid-name",
"fixme",
]
[tool.pytest.ini_options]
# Runs coverage.py through use of the pytest-cov plugin
# An xml report is generated and results are output to the terminal
# TODO: Disable this line to disable CLI coverage reports when running tests.
#addopts = "--cov --cov-report xml:cov.xml --cov-report term"
# Sets the minimum allowed pytest version
minversion = 5.0
# Sets the path where test files are located (Speeds up Test Discovery)
testpaths = ["tests"]
And setup.py:
"""This file is to allow this repository to be published as a pip module, such
that people can install it with: `pip install networkx-to-lava-nc`.
You can ignore it.
"""
import setuptools

with open("README.md", encoding="utf-8") as fh:
    long_description = fh.read()

setuptools.setup(
    name="networkx-to-lava-nc-snn",
    version="0.0.1",
    author="a-t-0",
    author_email="author#example.com",
    description="Converts networkx graphs representing spiking neural networks"
    + " (SNN)s of LIF neurons, into runnable Lava SNNs.",
    long_description=long_description,
    long_description_content_type="text/markdown",
    url="https://github.com/a-t-0/networkx-to-lava-nc",
    packages=setuptools.find_packages(),
    classifiers=[
        "Programming Language :: Python :: 3",
        "License :: OSI Approved :: AGPL3",
        "Operating System :: OS Independent",
    ],
)
Question
How can I resolve the pylint usage error to ensure the commit passes pre-commit?
The issue was caused by the "--ignore-long-lines" argument in the local pylint hook in .pre-commit-config.yaml. pylint exits with code 32 when it is given an option it cannot parse, which is why every commit fails with the usage: pylint [options] output shown above. I assume the flag also conflicts with the line-length settings for black and in pyproject.toml, which are set to 79.
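For reference, a minimal sketch of the corrected local pylint hook (identical to the entry above, with only the offending argument removed):

  - repo: local
    hooks:
      - id: pylint
        name: pylint
        entry: pylint
        language: system
        types: [python]
        args:
          [
            "-rn", # Only display messages
            "-sn", # Don't display the score
          ]

The bare flag is rejected because pylint's --ignore-long-lines option expects a regular-expression value; either give it a value or drop it as above.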
How does one go about updating a variable that is declared in a GitHub Actions workflow?
Consider the following:
name: Test Variable
on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]
env:
  DAY_OF_WEEK: Monday
jobs:
  job1:
    name: Job1
    runs-on: ubuntu-latest
    env:
      Greeting: Hello
    steps:
      - name: "Say Hello John it's Monday"
        run: |
          echo $Greeting=Holla
          echo "$Greeting $First_Name. Today is $DAY_OF_WEEK!"
        env:
          First_Name: John
      - name: "Eval"
        run: echo $Greeting $First_Name
So here I'm attempting to update Greeting and then eval it later, but GH is throwing:
Invalid workflow file. You have an error in your yaml syntax on line 21.
So, if I were to update Greeting, First_Name and DAY_OF_WEEK, how would I go about doing that?
Update
Fixed the yaml syntax, but the variable is not updated. The output for Eval is:
Run echo $Greeting $First_Name
  echo $Greeting $First_Name
  shell: /usr/bin/bash -e {0}
  env:
    DAY_OF_WEEK: Monday
    Greeting: Hello
Hello
Assign a variable:
run: echo "Greeting=HOLLA" >> $GITHUB_ENV
Using the variable:
run: echo "$Greeting"
docs:
https://docs.github.com/en/actions/using-workflows/workflow-commands-for-github-actions#setting-an-environment-variable
(Also make sure your yml file's indentation is correct.)
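A minimal self-contained sketch of the pattern (assuming a job where Greeting is not also fixed at the job level; note that a value written to $GITHUB_ENV only becomes visible in the steps that run after the one that wrote it, and that a plain echo like echo $Greeting=Holla does not assign anything):

    steps:
      - name: "Assign a variable"
        run: echo "Greeting=HOLLA" >> "$GITHUB_ENV"
      - name: "Eval"
        # This later step sees the Greeting written to GITHUB_ENV above.
        run: echo "$Greeting $First_Name"
        env:
          First_Name: John

If Greeting is also declared under the job's env: as in the workflow above, it may be simpler to drop that job-level default so the only source of the value is $GITHUB_ENV.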
When I commit changes to GitLab and deploy the project on cPanel, I get this error in the pipeline.
I am using Laravel Deployer in this project, and I want to deploy it to cPanel through GitLab.
The command "cd /home/sfd/public_html && (/usr/local/cpanel/3rdparty/lib/pa
th-bin/git clone --recursive https://gitlab.com/nas-project
.git /home/sfd/public_html/releases/1 2>&1)" failed.
Exit Code: 128 (Invalid exit argument)
Host Name: 51.75.174.102
================
Cloning into '/home/sfd/public_html/releases/1'...
fatal: could not read Username for 'https://gitlab.com': No such device or
address
Here I'm posting some of my code (runner config, gitlab-ci.yml, deploy code).
runner code
[[runners]]
name = "DESKTOP-1OOOT34"
url = "https://gitlab.com/nas-project.git"
token = "mytoken"
executor = "shell"
shell = "powershell"
[runners.custom_build_dir]
[runners.cache]
[runners.cache.s3]
[runners.cache.gcs]
[runners.cache.azure]
Deploy.php code
<?php
return [
    'default' => 'basic',
    'strategies' => [
        //
    ],
    'hooks' => [
        'ready' => [
            'artisan:storage:link',
            'artisan:view:clear',
            'artisan:config:cache',
            'artisan:migrate',
        ],
    ],
    'options' => [
        'application' => env('APP_NAME', 'Laravel'),
        'repository' => 'https://gitlab.com/n1063/suntop/nas-project.git',
    ],
    'hosts' => [
        'mysiteIP' => [
            'deploy_path' => '/home/sfd/public_html',
            'user' => 'sfd',
            'multiplexing' => true,
            'sshOptions' => [
                'StrictHostKeyChecking' => 'no',
                // ...
            ],
        ],
    ],
    'localhost' => [
        //
    ],
    'include' => [
        //
    ],
    'custom_deployer_file' => false,
];
gitlab-ci.yml
image: edbizarro/gitlab-ci-pipeline-php:7.4
stages:
  - preparation
  - deploy
composer:
  stage: preparation
  script:
    - php -v
    - composer install --prefer-dist --no-ansi --no-interaction --no-progress --no-scripts --no-suggest
    - cp .env.example .env
    - php artisan key:generate
  artifacts:
    paths:
      - vendor/
      - .env
    expire_in: 1 days
    when: always
  cache:
    paths:
      - vendor/
yarn:
  stage: preparation
  script:
    - yarn --version
    - yarn install --pure-lockfile
  artifacts:
    paths:
      - node_modules/
    expire_in: 1 days
    when: always
.init_ssh_live: &init_ssh_live |
  mkdir -p ~/.ssh
  echo -e "$SSH_PRIVATE_KEY" > ~/.ssh/id_rsa
  chmod 600 ~/.ssh/id_rsa
  [[ -f /.dockerenv ]] && echo -e "Host *\n\tStrictHostKeyChecking no\n\n" > ~/.ssh/config
deploy:
  stage: deploy
  script:
    - *init_ssh_live
    - php artisan deploy mySiteIP -s upload
  environment:
    name: live
    url: mySiteIP
  only:
    - dev
If you want to deploy a Laravel project on a cPanel server through GitLab without any package, this might help you. Config details are in the description: https://gitlab.com/-/snippets/2238596
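As additional context (an observation, not taken from the linked snippet): the fatal: could not read Username for 'https://gitlab.com' error means git on the cPanel host is trying to clone the HTTPS repository URL from Deploy.php without any credentials, which cannot work in a non-interactive deploy. The usual fixes are to point the 'repository' option at an SSH URL the server's deploy key can access, or to embed a GitLab deploy token in the clone URL.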
I created 3 Jenkins jobs linked to the same GitHub project. I'm using wdio v5 and Cucumber, and I want to run each job on a different port, which is why I'm trying to pass the port from the Jenkins post-build task (execute shell).
I tried this
-- --seleniumArgs.seleniumArgs= ['-port', '7777']
then this
-- --seleniumArgs.seleniumArgs= ["-port", "7777"]
then
-- --seleniumArgs.seleniumArgs= '-port: 7777'
but nothing works.
I found a solution.
This is the wdio.conf.js file:
// Arguments passed after "wdio wdio.conf.js" on the command line
var myArgs = process.argv.slice(2);
var Port = myArgs[1]; // second argument: the port number

exports.config = {
    ////////////////////////
    services: ['selenium-standalone'],
    seleniumArgs: {
        seleniumArgs: ['-port', Port]
    },
    //////////////////////
};
myArgs will receive an array of the arguments passed on the command line,
and this is the command:
npm test 7777 -- --port 7777
The 7777 is argument number 2, and thus index 1 in the array;
index 0 is wdio.conf.js, which comes from the "test" script in package.json
===> "test": "wdio wdio.conf.js"
It's my first time using azk in my development (Ruby 2.2.3 + Rails 4 project).
I want to run the RSpec tests.
How do I use the Azkfile to create a specific system for the test environment (test gems + test database + webkit dependencies)?
Add the systems test and postgres-test to your Azkfile.js as in the example below.
To run the provisioning, you must start and then stop the test system:
$ azk start -R test && azk stop test
$ azk shell -- bundle exec rspec spec
Or you can run the provision commands directly in the shell:
$ azk start postgres-test
$ azk shell test
bundle install --path /azk/bundler
bundle exec rake db:create
bundle exec rake db:migrate
$ azk shell -- bundle exec rspec spec
Example:
systems({
  app: {
    // ..
  },
  postgres: {
    // ...
  },
  /* TEST */
  test: {
    extends: "app",
    depends: ["postgres-test"],
    command: "bundle exec rspec spec && exit 0",
    provision: [
      "bundle install --path /azk/bundler",
      "bundle exec rake db:create",
      "bundle exec rake db:migrate",
    ],
    scalable: { default: 0, limit: 1 },
    http: false,
    wait: false,
    envs: {
      RAILS_ENV: "test",
      RACK_ENV: "test",
      BUNDLE_APP_CONFIG: "/azk/bundler",
      HOST: "#{system.name}.#{azk.default_domain}",
    },
  },
  "postgres-test": {
    extends: "postgres",
    scalable: { default: 0, limit: 1 },
    envs: {
      // set instances variables
      POSTGRES_USER: "azk",
      POSTGRES_PASS: "azk",
      POSTGRES_DB: "#{manifest.dir}_test",
    },
  },
});