I would like to set up a testing scenario where I can separate the test builds for PHPUnit and Laravel Dusk. The reason is that I would like to run different .env and phpunit.xml files for each test approach:
1. Set the environment for PHPUnit
2. Test PHPUnit
3. Clean the Travis build
4. Set the environment for Laravel Dusk
5. Test Laravel Dusk
I've gone through the Travis documentation about jobs and the build matrix, but I can't find a proper approach to follow.
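Conceptually, what I'm after is something like the following jobs-based split (just a rough sketch based on the Travis jobs/stages syntax; the .env.phpunit / .env.dusk and phpunit.phpunit.xml / phpunit.dusk.xml file names are placeholders I would still have to create):

jobs:
  include:
    - stage: phpunit
      before_script:
        - cp .env.phpunit .env
        - cp phpunit.phpunit.xml phpunit.xml
      script:
        - vendor/bin/phpunit
    - stage: dusk
      before_script:
        - cp .env.dusk .env
        - cp phpunit.dusk.xml phpunit.xml
        - php artisan serve &
      script:
        - php artisan dusk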
My .travis.yml file:
sudo: true
dist: trusty
language: php
php:
  - 7.3
addons:
  chrome: stable
  apt:
    sources:
      - mysql-5.7-trusty
    packages:
      - mysql-server
      - mysql-client
services:
  - mysql
install:
  - composer self-update
  - travis_retry composer install --no-interaction --prefer-dist --no-suggest
before_script:
  - rm composer.lock
  - echo -e "[server]\nmax_allowed_packet=64M" | sudo tee -a /etc/mysql/conf.d/drupal.cnf
  - sudo service mysql restart
  - mysql -e 'CREATE DATABASE testing;'
  - mysql -e 'CREATE DATABASE business_external;'
  - mysql business_external < /home/travis/build/StanBarrows/business/database/data/business_external
  - google-chrome-stable --headless --disable-gpu --remote-debugging-port=9222 http://localhost &
  - cp .env.travis .env
  - cp phpunit.travis.xml phpunit.xml
  - php artisan key:generate
  - php artisan storage:link
  - php artisan serve &
script:
  - vendor/bin/phpunit
  - php artisan dusk
notifications:
  email: false
Can anyone point me in the right direction for Laravel CI/CD development with AWS CodeCommit?
I have gone through a lot of tutorials, but I always fail to connect the database to either Elastic Beanstalk or EC2.
Can someone recommend a good tutorial for this?
This is my build command:
version: 0.2
phases:
  install:
    runtime-versions:
      php: 7.4
      nodejs: 12.x
    commands:
      - apt-get update -y
      - apt-get install -y libpq-dev libzip-dev
      - apt-get install -y php-pgsql
      - curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer
  pre_build:
    commands:
      - cp .env.beta .env
      - composer install
      - npm install
  build:
    commands:
      - npm run production
      - php artisan migrate --force
      - php artisan db:seed
artifacts:
  files:
    - '**/*'
  name: $(date +%Y-%m-%dT%H:%M:%S).zip
proxy:
  upload-artifacts: yes
  logs: yes
Now I am getting an nginx 404 error, even after adding "/public" in Beanstalk.
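One way to point the platform at public/ without touching the console is an .ebextensions option file (a sketch, assuming the standard Elastic Beanstalk PHP platform option namespace):

option_settings:
  aws:elasticbeanstalk:container:php:phpini:
    document_root: /public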
I am building a Laravel application and installing Laravel Jetstream in Docker containers. I have separate containers for Composer and Artisan. When I try to install Jetstream with the command:
docker-compose run --rm artisan jetstream:install inertia
I get an error:
Starting mysql ... done
sh: exec: line 1: composer: not found
Unable to locate publishable resources.
Publishing complete.
Inertia scaffolding installed successfully.
Please execute the "npm install && npm run dev" command to build your assets.
The webpage still doesn't work and shows the error message Class 'Inertia\Inertia' not found. I assume there is a problem with the connection between the composer and artisan containers, but how can I set up this connection?
docker-compose.yml:
composer:
  image: composer:latest
  container_name: composer
  volumes:
    - ./src:/var/www/html
  working_dir: /var/www/html
  depends_on:
    - php
  networks:
    - laravel
artisan:
  build:
    context: .
    dockerfile: Dockerfile
  container_name: artisan
  volumes:
    - ./src:/var/www/html
  depends_on:
    - mysql
  working_dir: /var/www/html
  entrypoint: ['/var/www/html/artisan']
  networks:
    - laravel
Sadly I can't comment yet (< 50 reputation), but I had a similar issue just now. An alternative to running it in different containers is to make a helper container and execute everything inside it.
docker-compose.yml (note: ./code is your Laravel root folder):
version: '3'
services:
  helper:
    build: ./composer-artisan-helper
    volumes:
      - ./code:/app
Make a folder composer-artisan-helper and create a Dockerfile inside:
FROM php:7.4-fpm
# Install git
RUN apt-get update && apt-get install -y git
# Get latest Composer
COPY --from=composer:latest /usr/bin/composer /usr/bin/composer
# Keep running
ENTRYPOINT ["tail", "-f", "/dev/null"]
# Set work-directory
WORKDIR /app
Now run it and drop into that container:
docker-compose exec helper bash
Make sure it now contains your whole Laravel folder by doing a quick ls. If everything looks fine, execute:
php artisan jetstream:install inertia
You should eventually be greeted with Inertia scaffolding installed successfully.
Hope that helps, even though it's not split up into multiple containers (which isn't possible AFAIK anyway).
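If you'd rather not keep an interactive shell open, the same step can also be run one-off from the host (a small usage sketch, assuming the helper service name from the compose file above):

docker-compose up -d helper
docker-compose exec helper php artisan jetstream:install inertia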
I was trying to install it with the command:
docker-compose run --rm artisan jetstream:install inertia
But actually it works when I run it in the other container:
docker-compose run --rm composer php artisan jetstream:install inertia
In my configuration the artisan container doesn't have access to Composer, but the composer container can run Artisan, since the composer image ships with PHP.
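Another option, if you want the artisan container itself to be able to shell out to Composer (jetstream:install does that internally), is to copy the Composer binary into the image the artisan service builds. A minimal sketch, assuming the Dockerfile referenced by the artisan service is based on an official PHP image:

# make the composer binary available inside the artisan image
COPY --from=composer:latest /usr/bin/composer /usr/bin/composer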
There is already a right answer here (thank you Jsowa), but you can also install the related packages separately.
You need to install the Inertia package for Laravel and generate its middleware with the following commands:
docker-compose run --rm composer require inertiajs/inertia-laravel
docker-compose run --rm artisan inertia:middleware
I am new to Docker and docker-compose, and I am developing a Laravel project with Laradock while following a tutorial (I'm not sure whether that is the correct way to describe this situation, though).
I want to install Composer in this environment so that I can use the composer command.
In fact, I wanted to run the seeders to put data into the database I set up with php artisan make:migrate, but this error appeared:
include(/var/www/laravel_practice/vendor/composer/../../database/seeds/AdminsTableSeeder.php): failed to open stream: No such file or directory
So I googled the error to find a solution, and I found one.
It says, "Run composer dump-autoload and try seeding again", so I followed it, and then this error appeared:
bash: composer: command not found
That is because I have not installed Composer in the Docker container.
My Docker setup currently consists of these containers:
・workspace
・mysql
・apache
・php-fpm
Since I have not installed Composer, I have to install it into a Docker container to solve the problem, but I have no idea how to do that.
So could anyone tell me how to install Composer into a Docker container?
Thank you.
Here are the laradock/mysql/Dockerfile and laravelProject/docker-compose.yml:
ARG MYSQL_VERSION=5.7
FROM mysql:${MYSQL_VERSION}
LABEL maintainer="Mahmoud Zalt <mahmoud@zalt.me>"
#####################################
# Set Timezone
#####################################
ARG TZ=UTC
ENV TZ ${TZ}
RUN ln -snf /usr/share/zoneinfo/$TZ /etc/localtime && echo $TZ > /etc/timezone && chown -R mysql:root /var/lib/mysql/
COPY my.cnf /etc/mysql/conf.d/my.cnf
CMD ["mysqld"]
EXPOSE 3306
version: '2'
services:
  db:
    image: mysql:5.7
    ports:
      - "6603:3306"
    environment:
      - MYSQL_ALLOW_EMPTY_PASSWORD=true
      - MYSQL_DATABASE=laravelProject
      - LANG=C.UTF-8
    volumes:
      - db:/var/lib/mysql
    command: mysqld --sql-mode=NO_ENGINE_SUBSTITUTION --character-set-server=utf8 --collation-server=utf8_unicode_ci
  web:
    image: arbiedev/php-nginx:7.1.8
    ports:
      - "8080:80"
    volumes:
      - ./www:/var/www
      - ./nginx.conf:/etc/nginx/sites-enabled/default
volumes:
  db:
You can build your own image and use it in your Docker compose file.
FROM php:7.2-alpine3.8
RUN apk update
RUN apk add bash
RUN apk add curl
# INSTALL COMPOSER
RUN curl -s https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer
# INSTALL NGINX
RUN apk add nginx
I used the PHP Alpine image as my base image because it's lightweight, so you might have to install other dependencies yourself. In your docker-compose file:
web:
  build: path/to/your/Dockerfile/directory
  image: your-image-tag
  ports:
    - "8080:80"
  volumes:
    - ./www:/var/www
    - ./nginx.conf:/etc/nginx/sites-enabled/default
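After building that image, you can quickly check that Composer is callable inside the container (assuming the service is named web as in the snippet above and Composer was installed onto the PATH as in the Dockerfile):

docker-compose build web
docker-compose run --rm web composer --version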
You could do something like this:
FROM php:8.0.2-apache
RUN apt-get update && apt-get upgrade -y
RUN apt-get install -y mariadb-client libxml2-dev
RUN apt-get autoremove -y && apt-get autoclean
RUN docker-php-ext-install mysqli pdo pdo_mysql xml
COPY --from=composer /usr/bin/composer /usr/bin/composer
The COPY --from= instruction should solve your problem.
FROM php:7.3-fpm-alpine
RUN docker-php-ext-install pdo pdo_mysql
RUN docker-php-ext-install mysqli && docker-php-ext-enable mysqli
RUN php -r "readfile('http://getcomposer.org/installer');" | php -- --install-dir=/usr/bin/ --filename=composer
RUN apk update
RUN apk upgrade
RUN apk add bash
RUN alias composer='php /usr/bin/composer'
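With Composer baked into the image, the steps that failed in the question can then be run inside the container, for example (the service name web is just an assumption here; adjust it to whichever of your services runs PHP):

docker-compose exec web composer dump-autoload
docker-compose exec web php artisan db:seed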
I have just added tests with Laravel Dusk.
Everything works when I run the tests on my PC. I thus set up a .travis.yml file:
language: php
sudo: required
dist: trusty
php:
  - 7.1
  - 7.2
addons:
  chrome: stable
services:
  - mysql
install:
  - cp .env.travis .env
  - mysql -e 'create database homestead_test;'
  - travis_retry composer self-update
  - travis_retry composer install --no-interaction
  - php artisan key:generate
  - php artisan migrate:fresh --seed
before_script:
  - google-chrome-stable --headless --disable-gpu --remote-debugging-port=9222 http://localhost &
  - php artisan serve &
script:
  - php artisan code:analyse --level=7
  - php artisan dusk
  - vendor/bin/phpunit
notifications:
  email: false
However, when I push to GitHub I get errors: show travis errors
I do not understand what to do to make my tests pass on Travis.
Would anybody know how to help me on this point? Best regards, Quentin
Update:
The exact commit on GitHub
The issue is SESSION_DRIVER=array in your .env.travis file, change it to SESSION_DRIVER=file.
The login tests aren't working because the sessions vanish after each request.
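For clarity, the relevant line in .env.travis then simply becomes:

SESSION_DRIVER=file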
I have created a CircleCI config which runs my PHPUnit tests against my Laravel application, and that is working 100%. However, I am now trying to add a workflow to SSH in and deploy my app to an AWS EC2 server, and I am getting the following errors:
Your config file has errors and may not run correctly:
2 schema violations found
required key [jobs] not found
required key [version] not found
However, I cannot see an issue with my CircleCI config file. Have I made a mistake somewhere?
version: 2
jobs:
  build:
    docker:
      - image: circleci/php:7.1-browsers
    working_directory: ~/laravel
    steps:
      - checkout
      - run:
          name: Download NodeJS v6
          command: curl -sL https://deb.nodesource.com/setup_6.x | sudo -E bash -
      - run:
          name: Install SQLite and NodeJS 6
          command: sudo apt-get install -y libsqlite3-dev nodejs
      - run:
          name: Setup Laravel testing environment variables for CircleCI test
          command: cp .env.circleci .env
      - run:
          name: Update composer to latest version
          command: composer self-update
      - restore_cache:
          keys:
            - composer-v1-{{ checksum "composer.json" }}
            - composer-v1-
      - run: composer install -n --prefer-dist --ignore-platform-reqs
      - save_cache:
          key: composer-v1-{{ checksum "composer.json" }}
          paths:
            - vendor
      - restore_cache:
          key: dependency-cache-{{ checksum "package.json" }}
      - run:
          name: Install NodeJS Packages
          command: npm install
      - save_cache:
          key: dependency-cache-{{ checksum "package.json" }}
          paths:
            - ./node_modules
      - run:
          name: Create SQLite Database
          command: touch database/database.sqlite
      - run:
          name: Migrate Laravel Database
          command: php artisan migrate --database=sqlite --force
      - run:
          name: Run NPM
          command: npm run production
      # Run Laravel Server for front-end tests
      - run:
          name: Run Laravel Server
          command: php artisan serve
          background: true
      - run:
          name: Run PHPUnit Tests
          command: vendor/bin/phpunit
  deploy:
    machine:
      enabled: true
    steps:
      - run:
          name: Deploy Over SSH
          command: |
            ssh $SSH_USER@$SSH_HOST "cd /var/www/html"
workflows:
  version: 2
  build-and-deploy:
    jobs:
      - build
      - deploy:
          requires:
            - build
          filters:
            branches:
              only: master
Any help is appreciated, thank you!
CircleCI has documentation for AWS deployment. Look here https://circleci.com/docs/1.0/continuous-deployment-with-aws-codedeploy/
I think your problem is with SSH authorization for AWS. You can try it locally and make sure that your authorization succeeds, and then do the same thing with your AWS instance.