Running dusk after php artisan serve fails in CircleCI - laravel

I'm working on a CircleCI config file that runs unit tests and browser (Dusk) tests. The tests fail in CircleCI every time on the command php artisan dusk. php artisan serve is also 'canceled' if I add the attribute background: true.
What I've tried:
- Declaring the port: php artisan serve --port=8000
- Using background: true and php artisan serve &
- Running curl http://localhost:8000 (failed)
My CircleCI config file:
version: 2
jobs:
  Test:
    docker:
      - image: circleci/php:7.2-fpm-node-browsers
      - image: circleci/redis:5.0
      - image: circleci/mysql:5.7
        environment:
          MYSQL_ROOT_HOST: '%'
          MYSQL_ROOT_PASSWORD: ''
          MYSQL_USER: homestead
          MYSQL_PASSWORD: secret
          MYSQL_DATABASE: homestead
          MYSQL_ALLOW_EMPTY_PASSWORD: true
    environment:
      APP_ENV: testing
      APP_URL: http://localhost:8000
      APP_KEY: ###
      DB_HOST: 127.0.0.1
      DB_DATABASE: homestead
      DB_USERNAME: homestead
      DB_PASSWORD: secret
      REDIS_HOST: 127.0.0.1
      REDIS_PASSWORD: 'null'
      PUSHER_APP_ID: ###
      PUSHER_APP_KEY: ###
      PUSHER_APP_SECRET: ###
      PUSHER_APP_CLUSTER: ###
      MIX_PUSHER_APP_KEY: ###
      MIX_PUSHER_APP_CLUSTER: ###
    working_directory: ~/workspace
    steps:
      - checkout
      - run:
          name: Prepare Environment
          command: .circleci/kickstart.sh
      - restore_cache:
          keys:
            - composer-v1-{{ checksum "composer.lock" }}
            - composer-v1-
      - run: composer install -n --prefer-dist
      - save_cache:
          key: composer-v1-{{ checksum "composer.lock" }}
          paths:
            - vendor
      - restore_cache:
          keys:
            - node-v1-{{ checksum "package-lock.json" }}
            - node-v1-
      - run: npm install
      - save_cache:
          key: node-v1-{{ checksum "package-lock.json" }}
          paths:
            - node_modules
      - run:
          name: Build Artifacts
          command: npm run dev
      - run:
          name: Waiting for MySQL to be ready
          command: dockerize -wait tcp://127.0.0.1:3306 -timeout 120s
      - run: php artisan config:clear
      - run: php artisan config:cache
      - run: php artisan migrate:refresh --seed --database=mysql --force
      - run: php artisan dusk:install
      - run:
          name: Run PHP Unit Tests
          command: ./vendor/bin/phpunit
      - run:
          name: Run server
          command: php artisan serve --port=8000 &
      # - run:
      #     name: Run server
      #     command: php artisan serve --port=8000
      #     background: true
      - run:
          name: Test connection to server
          command: curl http://localhost:8000
      - run:
          name: Run E2E Tests
          command: php artisan dusk
          environment:
            APP_URL: http://localhost:8000
      - store_artifacts:
          path: ./tests/Browser/console
          destination: console
      - store_artifacts:
          path: ./tests/Browser/screenshots
          destination: screenshots
workflows:
  version: 2
  Build and Test:
    jobs:
      - Test
Output of the failing curl step:
#!/bin/bash -eo pipefail
php artisan serve --port=8000 &
curl http://localhost:8000
curl: (7) Failed to connect to localhost port 8000: Connection refused
Exited with code 7
Dusk should only run once the server is actually up.
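For reference, a sketch of the server and wait steps that is commonly used with Dusk on CircleCI (not verified against this project; the 60s timeout is an arbitrary choice). The Connection refused in the log is consistent with curl firing before php artisan serve has bound the port, so the idea is to keep the server in a background: true step and block until the port answers, reusing the same dockerize binary the MySQL wait step already relies on:
      - run:
          name: Run server
          command: php artisan serve --port=8000
          background: true            # CircleCI keeps background steps running for later steps
      - run:
          name: Wait for server
          # dockerize -wait also accepts http:// URLs, not just tcp://
          command: dockerize -wait http://localhost:8000 -timeout 60s
      - run:
          name: Run E2E Tests
          command: php artisan dusk
          environment:
            APP_URL: http://localhost:8000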

Related

PHP Fatal error on CI/CD run php artisan test

I use docker-compose with Laravel and PostgreSQL, and everything works fine on my local system. The problem is in the CI/CD.
I have changed the CI/CD yml file over and over, but I am stuck!
CI/CD workflow file:
name: CI/CD
on:
  pull_request:
    branches: ['master']
  push:
    branches: ['master']
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: shivammathur/setup-php@v2
        with:
          php-version: '7.4'
      - uses: actions/checkout@v2
      - name: Run Containers
        run: docker-compose -f docker-compose.yml -f docker-compose.dev.yml up -d
      # - name: Run composer install
      #   run: cd companyname_app_dir && composer install
      # - name: Run composer update
      #   run: cd companyname_app_dir && composer update
      # - name: Setup Project
      #   run: |
      #     cd companyname_app_dir
      #     composer update
      #     composer install
      #     php artisan config:clear
      #     php artisan cache:clear
      - name: Run test
        run: cd companyname_app_dir && php artisan test
        env:
          APP_KEY: base64:x06N/IsV5iJ+R6TKlr6sC6Mr4riGgl8Rg09XHHnRZQw=
          APP_ENV: testing
          DB_CONNECTION: companyname-postgres
          DB_DATABASE: db_test
          DB_USERNAME: root
          DB_PASSWORD: 1234
  deploy:
    needs: test
    runs-on: ubuntu-latest
    steps:
      - name: Set up QEMU
        uses: docker/setup-qemu-action@v2
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v2
      - name: Login to Docker Hub
        uses: docker/login-action@v2
        with:
          username: secret
          password: secret
      - name: Build and push
        uses: docker/build-push-action@v3
        with:
          push: true
          file: ./companyname_app_dir/Dockerfile
          tags: company_image:latest
          build-args: |
            "NODE_ENV=production"
The commented-out lines are things I tried, but I couldn't get a test to run successfully with them.
docker-compose.yml:
version: '3'
networks:
  companyname_network:
    driver: bridge
services:
  nginx:
    image: nginx:stable-alpine
    container_name: companyname-nginx
    volumes:
      - ./nginx/default.conf:/etc/nginx/conf.d/default.conf:ro
    restart: always
    depends_on:
      - companyname_app
    networks:
      - companyname_network
  companyname_app:
    restart: 'always'
    image: 'companyname_laravel'
    container_name: companyname-app
    build:
      context: .
      dockerfile: ./Dockerfile
    networks:
      - companyname_network
    depends_on:
      - companyname_db
  companyname_db:
    image: 'companyname_multiple_db'
    container_name: companyname-postgres
    build:
      context: .
      dockerfile: ./DockerfileDB
    restart: 'always'
    volumes:
      - local_pgdata:/docker-entrypoint-initdb.d
    environment:
      - POSTGRES_MULTIPLE_DATABASES=db,db_test
      - POSTGRES_USER=root
      - POSTGRES_PASSWORD=1234
    ports:
      - 15432:5432
    networks:
      - companyname_network
  companyname_dbadmin:
    image: adminer
    container_name: companyname-dbadmin
    restart: 'always'
    depends_on:
      - companyname_db
    ports:
      - 5051:8080
    networks:
      - companyname_network
volumes:
  local_pgdata:
docker-compose.dev.yml:
version: '3'
services:
  nginx:
    ports:
      - 9000:80
  companyname_app:
    build:
      args:
        - NODE_ENV=development
    volumes:
      - ./companyname_app_dir:/app
      - /app/vendor
With this file, I get an error:
Run cd companyname_app_dir && php artisan test
PHP Warning: require(/home/runner/work/companyname_app/companyname_app/companyname_app_dir/vendor/autoload.php): failed to open stream: No such file or directory in /home/runner/work/companyname_app/companyname_app/companyname_app_dir/artisan on line 18
PHP Fatal error: require(): Failed opening required '/home/runner/work/companyname_app/companyname_app/companyname_app_dir/vendor/autoload.php' (include_path='.:/usr/share/php') in /home/runner/work/companyname_app/companyname_app/companyname_app_dir/artisan on line 18
Error: Process completed with exit code 255.
If I use:
- name: Run composer install
  run: cd companyname_app_dir && composer install
- name: Run composer update
  run: cd companyname_app_dir && composer update
in the CI/CD yml and remove the Run Containers part, composer install and composer update run successfully, but php artisan test throws this error:
postgresql can not connect
1. You must use composer install, otherwise you will have no vendor folder at all, so you have nothing to run. That is why you get an error when you don't run composer install.
2. You should not run composer update, because that updates packages to new versions; you never do that in production, you just run composer install --no-dev.
3. You are mixing running Docker with a command OUTSIDE the Docker container.
Related to point 3: if you are using docker-compose, you cannot execute:
- name: Run test
  run: cd companyname_app_dir && php artisan test
  env:
    APP_KEY: base64:x06N/IsV5iJ+R6TKlr6sC6Mr4riGgl8Rg09XHHnRZQw=
    APP_ENV: testing
    DB_CONNECTION: companyname-postgres
    DB_DATABASE: db_test
    DB_USERNAME: root
    DB_PASSWORD: 1234
because you are outside Docker. You should execute docker-compose exec companyname_app php artisan test instead, which will run the tests INSIDE the Docker container, where you correctly have everything set up.
So your code (if I am not missing anything) should be:
- name: Run test
  run: docker-compose exec companyname_app php artisan test
  env:
    APP_KEY: base64:x06N/IsV5iJ+R6TKlr6sC6Mr4riGgl8Rg09XHHnRZQw=
    APP_ENV: testing
    DB_CONNECTION: companyname-postgres
    DB_DATABASE: db_test
    DB_USERNAME: root
    DB_PASSWORD: 1234
But I am not certain what you will get back from that execution; if the tests fail, I have no idea whether the CI/CD (I am assuming you are using GitHub Actions or Bitbucket Pipelines) will truly identify the failure or not.
What I usually do is just install everything on the CI/CD machine directly, instead of using a Dockerfile or a docker-compose yaml. But that is my preference (at least for PHP/Laravel).
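As a rough sketch of that last approach (my illustration only, reusing names from the question where possible; the Postgres version, health-check options, and PHP extension list are assumptions), the test job could run directly on the runner with a Postgres service container:
jobs:
  test:
    runs-on: ubuntu-latest
    services:
      postgres:
        image: postgres:13
        env:
          POSTGRES_DB: db_test
          POSTGRES_USER: root
          POSTGRES_PASSWORD: 1234
        ports:
          - 5432:5432
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5
    steps:
      - uses: actions/checkout@v2
      - uses: shivammathur/setup-php@v2
        with:
          php-version: '7.4'
          extensions: pdo_pgsql
      - name: Install dependencies
        run: cd companyname_app_dir && composer install --no-interaction --prefer-dist
      - name: Run tests
        run: cd companyname_app_dir && php artisan test
        env:
          APP_KEY: base64:x06N/IsV5iJ+R6TKlr6sC6Mr4riGgl8Rg09XHHnRZQw=
          APP_ENV: testing
          DB_CONNECTION: pgsql     # Laravel expects a driver name here, not a container name
          DB_HOST: 127.0.0.1       # the service container is published on the runner's localhost
          DB_PORT: 5432
          DB_DATABASE: db_test
          DB_USERNAME: root
          DB_PASSWORD: 1234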

Can someone look at my yaml file for code deployment using Bitbucket Pipelines?

This is my first attempt at setting up pipelines or even using any CI/CD tool. So, reading the documentation at Bitbucket, I added the bitbucket-pipelines.yml file in the root of my Laravel application for a build. Here is the file.
image: php:7.4-fpm
pipelines:
  default:
    - step:
        name: Build and test
        caches:
          - composer
        script:
          - apt-get update && apt-get install -qy git curl libmcrypt-dev mariadb-client ghostscript
          - yes | pecl install mcrypt-1.0.3
          - docker-php-ext-install pdo_mysql bcmath exif
          - curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer
          - composer install
          - ln -f -s .env.pipelines .env
          - php artisan migrate
          - ./vendor/bin/phpunit
        services:
          - mysql
          - redis
definitions:
  services:
    mysql:
      image: mysql:5.7
      environment:
        MYSQL_DATABASE: "laravel-pipeline"
        MYSQL_RANDOM_ROOT_PASSWORD: "yes"
        MYSQL_USER: "homestead"
        MYSQL_PASSWORD: "secret"
    redis:
      image: redis
The above works fine for building the application, running tests, etc. But when I add the step below to deploy using the scp pipe, I get a notice saying either that I need to include an image or, at times, that there is a bad indentation of a mapping entry.
- step:
    name: Deploy to test
    deployment: test
    # trigger: manual # Uncomment to make this a manual deployment.
    script:
      - pipe: atlassian/scp-deploy:0.3.13
        variables:
          USER: '${remoteUser}'
          SERVER: '${server}'
          REMOTE_PATH: '${remote}'
          LOCAL_PATH: '${BITBUCKET_CLONE_DIR}/*'
I don't really know YAML, and this is my first time working with a CI/CD tool, so I am lost. Can someone point out what I am doing wrong?
Your indentation for name and deployment is not the same as for script. Try putting them all at the same indentation, like this:
- step:
    name: Deploy to test
    deployment: test
    script:
      - pipe: atlassian/scp-deploy:0.3.13
        variables:
          USER: '${remoteUser}'
          SERVER: '${server}'
          REMOTE_PATH: '${remote}'
          LOCAL_PATH: '${BITBUCKET_CLONE_DIR}/*'
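For context, here is a minimal sketch (my own illustration, with the build script abbreviated) of how the two steps line up: the deploy step sits at the same list level as the build step under pipelines: default:.
pipelines:
  default:
    - step:
        name: Build and test
        caches:
          - composer
        script:
          - composer install
          # ...rest of the build script shown earlier
        services:
          - mysql
          - redis
    - step:
        name: Deploy to test
        deployment: test
        script:
          - pipe: atlassian/scp-deploy:0.3.13
            variables:
              USER: '${remoteUser}'
              SERVER: '${server}'
              REMOTE_PATH: '${remote}'
              LOCAL_PATH: '${BITBUCKET_CLONE_DIR}/*'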

Bitbucket Pipeline Laravel with MySQL php_network_getaddresses

I use the php:7.2-fpm-stretch Docker image and MySQL as an attached service. The following command runs successfully:
mysql -h 127.0.0.1 -u username -ppassword
However, when composer wants to run package:discover, it fails with the following error:
SQLSTATE[HY000] [2002] php_network_getaddresses: getaddrinfo failed: Name or service not known
The .env file has the following configuration for the database:
DB_HOST=127.0.0.1
DB_CONNECTION=127.0.0.1
DB_DATABASE=pipeline
DB_USERNAME=username
DB_PASSWORD=password
And my yml file is as below:
image: php:7.2-fpm-stretch
pipelines:
  default:
    - step:
        caches:
          - composer
        script:
          - apt-get update && apt-get install -qy git unzip mysql-client
          # Some other non-related configuration
          - composer install
          - php artisan key:generate
        services:
          - mysql
definitions:
  services:
    mysql:
      image: mysql:5.7
      environment:
        MYSQL_DATABASE: 'pipeline'
        MYSQL_RANDOM_ROOT_PASSWORD: 'yes'
        MYSQL_USER: 'username'
        MYSQL_PASSWORD: 'password'
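One thing that stands out in the .env shown above (an observation, not a verified fix): DB_CONNECTION is set to an IP address, but Laravel expects a driver name there, with the host going into DB_HOST. A corrected block would look roughly like this:
DB_CONNECTION=mysql
DB_HOST=127.0.0.1
DB_PORT=3306   # default MySQL port, assumed
DB_DATABASE=pipeline
DB_USERNAME=username
DB_PASSWORD=password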

How to run laravel app using docker-compose?

Following is what I tried; is there something I am doing wrong?
Step 1: create a simple Laravel app locally.
composer create-project --prefer-dist laravel/laravel laravel-app 5.6
Step 2: create docker-compose.yml:
version: '3'
services:
  php:
    image: php:7-fpm
    ports:
      - "3021:8000"
    volumes:
      - ./laravel-app:/app
  composer:
    image: composer:latest
    volumes:
      - ./laravel-app:/app
    working_dir: /app
    command: ["install","php artisan serve --host=0.0.0.0"]
    depends_on:
      - php
After that, I run docker-compose up --force-recreate -d and open 127.0.0.1:3021 in the browser, but I get nothing.
Then I run docker-compose logs, and it shows me this error message:
Invalid argument php artisan serve --host=0.0.0.0. Use "composer require php artisan serve --host=0.0.0.0" instead to add packages to your composer.json.
How can I fix this issue?
You are mixing commands. Composer does not "serve". PHP has a built-in dev server to "serve".
You can read more about it here: https://laravel.com/docs/4.2/quick
To actually get Laravel up and running please do the following:
1 - Run this in the laravel-app folder: composer install
2 - Create a Dockerfile with the following contents:
FROM php:7
RUN apt-get update -y && apt-get install -y libmcrypt-dev openssl
RUN docker-php-ext-install pdo mcrypt mbstring
RUN curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer
WORKDIR /app
COPY . /app
CMD php artisan serve --host=0.0.0.0 --port=8000
EXPOSE 8000
3 - Build your docker image: docker build -t my-laravel-image .
4 - Finally replace the content of your docker-compose:
version: '3'
services:
  web:
    image: my-laravel-image
    ports:
      - 3021:8000
    volumes:
      - ./laravel-app:/app
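To try it out (a quick usage sketch of steps 3 and 4; the image tag and port follow the answer above):
docker build -t my-laravel-image .   # run where the Dockerfile lives
docker-compose up -d                 # run next to docker-compose.yml
curl http://127.0.0.1:3021           # should return the Laravel welcome page if everything is wired up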
A more complete tutorial can be found here (not mine): https://www.techiediaries.com/docker-compose-laravel/
EDIT:
In order to use the official Composer image, you could simply do this:
version: '3'
services:
  composer:
    image: composer:latest
    working_dir: /app
    entrypoint: php artisan serve --host=0.0.0.0
    depends_on:
      - php
    volumes:
      - ./laravel-app:/app
    ports:
      - "3021:8000"
Make sure ./laravel-app contains a laravel project. Otherwise this won't work!
In the main folder of your Laravel app, create a file named Dockerfile and insert this code:
FROM php:7
RUN apt-get update -y && apt-get install -y openssl zip unzip git
RUN curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer
RUN docker-php-ext-install pdo pdo_mysql
WORKDIR /app
COPY . /app
RUN composer install
CMD php artisan serve --host=0.0.0.0 --port=8181
EXPOSE 8181
In the same folder as the Dockerfile, create a file named docker-compose.yml and insert this code:
version: '2'
services:
  app:
    build: .
    ports:
      - "8009:8000"
    volumes:
      - .:/app
    env_file: .env
    working_dir: /app
    command: bash -c 'php artisan migrate && php artisan serve --host 0.0.0.0'
    depends_on:
      - db
    links:
      - db
  db:
    image: "mysql:5.7"
    environment:
      - MYSQL_ROOT_PASSWORD=yourpassword
      - MYSQL_DATABASE=yourdbname
      - MYSQL_USER=root
      - MYSQL_PASSWORD=yourpassword
    volumes:
      - ./data/:/var/lib/mysql
    ports:
      - "3306:3306"
  phpmyadmin:
    depends_on:
      - db
    image: phpmyadmin/phpmyadmin
    restart: always
    ports:
      - 8090:80
    environment:
      PMA_HOST: db
      MYSQL_ROOT_PASSWORD: yourpassword
Open a terminal, go inside the Laravel folder, and launch these commands:
docker-compose build
docker-compose up -d
If you need to create and migrate the database, or run other commands, launch the Laravel commands this way:
docker-compose run app php artisan
The app will be available at http://0.0.0.0:8009
Source: https://medium.com/@pierangelo1982/dockerize-an-existing-laravel-application-with-docker-compose-a45eb7956cbd

Unable to start MySQL service in docker during gitlab-ci

I have the following .gitlab-ci.yml, taken from the Laravel Dusk CI example:
stages:
  - build
  - test
# Variables
variables:
  MYSQL_ROOT_PASSWORD: root
  MYSQL_USER: root
  MYSQL_PASSWORD: secret
  MYSQL_DATABASE: test
  DB_HOST: mysql
  DB_CONNECTION: mysql
build:
  stage: build
  services:
    - mysql:5.7
  image: chilio/laravel-dusk-ci:stable
  script:
    - composer install --prefer-dist --no-ansi --no-interaction --no-progress --no-scripts
    # - npm install # if you need to install additional modules from your projects package.json
    # - npm run dev # if you need to run dev scripts for example laravel mix
  cache:
    key: ${CI_COMMIT_REF_NAME}
    paths:
      # these are only examples, you should modify them according to your project,
      # or remove cache routines entirely, if they are causing any problems on your next builds..
      # below are 2 safe ones if you use composer install and npm install in your stage script
      - vendor
      - node_modules
      # - /resources/assets/vendors # for example if you put your vendor node-libraries there
test:
  stage: test
  cache:
    key: ${CI_COMMIT_REF_NAME}
    paths:
      - vendor
      - node_modules
    policy: pull
  services:
    - mysql:5.7
  image: chilio/laravel-dusk-ci:stable
  script:
    - cp .env.example .env
    # - cp phpunit.xml.ci phpunit.xml # if you are using custom config for your phpunit tests in CI
    - configure-laravel
    - start-nginx-ci-project
    - ./vendor/phpunit/phpunit/phpunit -v --coverage-text --colors --stderr
    # - phpunit -v --coverage-text --colors --stderr # if you want to use preinstalled phpunit
    - php artisan dusk --colors --debug
  artifacts:
    paths:
      - ./storage/logs # for debugging
      - ./tests/Browser/screenshots
      - ./tests/Browser/console
    expire_in: 7 days
    when: always
However, when the runner executes the job, I keep getting the following warning:
Using Docker executor with image chilio/laravel-dusk-ci:stable ...
Starting service mysql:5.7 ...
Pulling docker image mysql:5.7 ...
Using docker image sha256:66bc0f66b7af6ba3ea96582685d3afcd6dff93c2f8999da0ffadd67b280db548 for mysql:5.7 ...
Waiting for services to be up and running...
*** WARNING: Service runner-237f18d2-project-23-concurrent-0-mysql-0 probably didn't start properly.
Health check error:
ContainerStart: Error response from daemon: Cannot link to a non running container: /runner-237f18d2-project-23-concurrent-0-mysql-0 AS /runner-237f18d2-project-23-concurrent-0-mysql-0-wait-for-service/service
Service container logs:
2018-07-11T19:49:03.214991318Z
2018-07-11T19:49:03.215062485Z ERROR: mysqld failed while attempting to check config
2018-07-11T19:49:03.215067480Z command was: "mysqld --verbose --help"
2018-07-11T19:49:03.215070774Z
2018-07-11T19:49:03.215073778Z mysqld: error while loading shared libraries: libpthread.so.0: cannot stat shared object: Permission denied
I've tried to set the runner to privileged in the config.toml:
privileged = true
To solve the error:
mysqld: error while loading shared libraries: libpthread.so.0: cannot stat shared object: Permission denied
Step 1: update your software (and possibly your kernel):
apt-get update && apt-get upgrade
Step 2: install the Docker dependency packages:
(Ubuntu/Debian): apt-get install apt-transport-https ca-certificates curl gnupg2 software-properties-common
(CentOS/RedHat): yum install yum-utils device-mapper-persistent-data lvm2
Step 3: reboot your server and restart docker-ce:
reboot
systemctl restart docker-ce
