Bitbucket Pipeline Laravel with MySQL php_network_getaddresses

I use the php:7.2-fpm-stretch Docker image with MySQL as an attached service. The following command runs successfully:
mysql -h 127.0.0.1 -u username -ppassword
However, when Composer runs package:discover, it fails with the following error:
SQLSTATE[HY000] [2002] php_network_getaddresses: getaddrinfo failed: Name or service not known
The .env file has the following configuration for the database:
DB_HOST=127.0.0.1
DB_CONNECTION=127.0.0.1
DB_DATABASE=pipeline
DB_USERNAME=username
DB_PASSWORD=password
And my yml file is as below:
image: php:7.2-fpm-stretch

pipelines:
  default:
    - step:
        caches:
          - composer
        script:
          - apt-get update && apt-get install -qy git unzip mysql-client
          # Some other non-related configuration
          - composer install
          - php artisan key:generate
        services:
          - mysql

definitions:
  services:
    mysql:
      image: mysql:5.7
      environment:
        MYSQL_DATABASE: 'pipeline'
        MYSQL_RANDOM_ROOT_PASSWORD: 'yes'
        MYSQL_USER: 'username'
        MYSQL_PASSWORD: 'password'
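One thing worth noting about the .env above: Laravel expects DB_CONNECTION to name a connection/driver such as mysql, while the address belongs in DB_HOST, and in Bitbucket Pipelines the mysql service is indeed reachable on 127.0.0.1, as the manual mysql command shows. Assuming the mysql driver was the intent, the database block would presumably read:
DB_CONNECTION=mysql
DB_HOST=127.0.0.1
DB_DATABASE=pipeline
DB_USERNAME=username
DB_PASSWORD=password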


PHP Fatal error on CI/CD when running php artisan test

I use docker-compose with Laravel and PostgreSQL, and everything works fine on my local system. The problem is in the CI/CD.
I have changed the CI/CD YAML file over and over, but I am stuck!
CI/CD
name: CI/CD
on:
  pull_request:
    branches: ['master']
  push:
    branches: ['master']
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: shivammathur/setup-php@v2
        with:
          php-version: '7.4'
      - uses: actions/checkout@v2
      - name: Run Containers
        run: docker-compose -f docker-compose.yml -f docker-compose.dev.yml up -d
      # - name: Run composer install
      #   run: cd companyname_app_dir && composer install
      # - name: Run composer update
      #   run: cd companyname_app_dir && composer update
      # - name: Setup Project
      #   run: |
      #     cd companyname_app_dir
      #     composer update
      #     composer install
      #     php artisan config:clear
      #     php artisan cache:clear
      - name: Run test
        run: cd companyname_app_dir && php artisan test
        env:
          APP_KEY: base64:x06N/IsV5iJ+R6TKlr6sC6Mr4riGgl8Rg09XHHnRZQw=
          APP_ENV: testing
          DB_CONNECTION: companyname-postgres
          DB_DATABASE: db_test
          DB_USERNAME: root
          DB_PASSWORD: 1234
  deploy:
    needs: test
    runs-on: ubuntu-latest
    steps:
      - name: Set up QEMU
        uses: docker/setup-qemu-action@v2
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v2
      - name: Login to Docker Hub
        uses: docker/login-action@v2
        with:
          username: secret
          password: secret
      - name: Build and push
        uses: docker/build-push-action@v3
        with:
          push: true
          file: ./companyname_app_dir/Dockerfile
          tags: company_image:latest
          build-args: |
            "NODE_ENV=production"
Some steps are commented out; I tried using them, but I still couldn't run the tests successfully.
docker-compose
version: '3'
networks:
  companyname_network:
    driver: bridge
services:
  nginx:
    image: nginx:stable-alpine
    container_name: companyname-nginx
    volumes:
      - ./nginx/default.conf:/etc/nginx/conf.d/default.conf:ro
    restart: always
    depends_on:
      - companyname_app
    networks:
      - companyname_network
  companyname_app:
    restart: 'always'
    image: 'companyname_laravel'
    container_name: companyname-app
    build:
      context: .
      dockerfile: ./Dockerfile
    networks:
      - companyname_network
    depends_on:
      - companyname_db
  companyname_db:
    image: 'companyname_multiple_db'
    container_name: companyname-postgres
    build:
      context: .
      dockerfile: ./DockerfileDB
    restart: 'always'
    volumes:
      - local_pgdata:/docker-entrypoint-initdb.d
    environment:
      - POSTGRES_MULTIPLE_DATABASES=db,db_test
      - POSTGRES_USER=root
      - POSTGRES_PASSWORD=1234
    ports:
      - 15432:5432
    networks:
      - companyname_network
  companyname_dbadmin:
    image: adminer
    container_name: companyname-dbadmin
    restart: 'always'
    depends_on:
      - companyname_db
    ports:
      - 5051:8080
    networks:
      - companyname_network
volumes:
  local_pgdata:
docker-compose.dev
version: '3'
services:
  nginx:
    ports:
      - 9000:80
  companyname_app:
    build:
      args:
        - NODE_ENV=development
    volumes:
      - ./companyname_app_dir:/app
      - /app/vendor
With this file, I get an error:
Run cd companyname_app_dir && php artisan test
PHP Warning: require(/home/runner/work/companyname_app/companyname_app/companyname_app_dir/vendor/autoload.php): failed to open stream: No such file or directory in /home/runner/work/companyname_app/companyname_app/companyname_app_dir/artisan on line 18
PHP Fatal error: require(): Failed opening required '/home/runner/work/companyname_app/companyname_app/companyname_app_dir/vendor/autoload.php' (include_path='.:/usr/share/php') in /home/runner/work/companyname_app/companyname_app/companyname_app_dir/artisan on line 18
Error: Process completed with exit code 255.
If I use:
- name: Run composer install
  run: cd companyname_app_dir && composer install
- name: Run composer update
  run: cd companyname_app_dir && composer update
in the CI/CD YAML and remove the Run Containers part, composer install and update run successfully, but php artisan test throws this error:
postgresql can not connect
1. You must run composer install, otherwise you will have no vendor folder at all, so you have nothing to run. That is why you are getting an error if you don't run composer install.
2. You should not run composer update, because that updates packages to new versions; you never do that in production, you just run composer install --no-dev.
3. You are mixing running docker with a command executed OUTSIDE the docker container.
4. Related to point 3, if you are using docker-compose, you cannot execute:
- name: Run test
  run: cd companyname_app_dir && php artisan test
  env:
    APP_KEY: base64:x06N/IsV5iJ+R6TKlr6sC6Mr4riGgl8Rg09XHHnRZQw=
    APP_ENV: testing
    DB_CONNECTION: companyname-postgres
    DB_DATABASE: db_test
    DB_USERNAME: root
    DB_PASSWORD: 1234
because you are outside docker. You should execute docker-compose exec companyname_app php artisan test instead, which will execute the tests INSIDE the docker container, where you correctly have everything set up.
So your code (if I am not missing anything) should be:
- name: Run test
  run: docker-compose exec companyname_app php artisan test
  env:
    APP_KEY: base64:x06N/IsV5iJ+R6TKlr6sC6Mr4riGgl8Rg09XHHnRZQw=
    APP_ENV: testing
    DB_CONNECTION: companyname-postgres
    DB_DATABASE: db_test
    DB_USERNAME: root
    DB_PASSWORD: 1234
But I am not certain what you will get back from that execution; I have no idea whether, if a test fails, the CI/CD (I am assuming you are using GitHub Actions or Bitbucket Pipelines) will truly identify it as a failure or not.
What I usually do is just install everything directly on the CI/CD machine instead of using a Dockerfile or a docker-compose YAML, but that is my preference (at least for PHP/Laravel).
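To make that concrete, here is a minimal sketch of how the test job's steps could look with everything executed inside the compose services (service and directory names come from the question's compose file; the exact flags are assumptions, not a verified workflow). Note also that in Laravel, DB_CONNECTION names the driver (pgsql), while the Postgres container name belongs in DB_HOST, and those values must already be visible inside the app container (for example via its .env or the compose environment section), because env values set on the runner step do not automatically reach the container.
- name: Run Containers
  run: docker-compose -f docker-compose.yml -f docker-compose.dev.yml up -d
- name: Install dependencies inside the app container
  # -T disables pseudo-TTY allocation, which CI runners do not provide
  run: docker-compose exec -T companyname_app composer install --no-interaction --prefer-dist
- name: Run tests inside the app container
  run: docker-compose exec -T companyname_app php artisan test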

Can someone look at my yaml file for code deployment using Bitbucket Pipelines?

This is my first attempt at setting up pipelines or even using any CI/CD tool. So, reading the documentation at Bitbucket, I added the bitbucket-pipelines.yml file in the root of my Laravel application for a build. Here is the file.
image: php:7.4-fpm
pipelines:
  default:
    - step:
        name: Build and test
        caches:
          - composer
        script:
          - apt-get update && apt-get install -qy git curl libmcrypt-dev mariadb-client ghostscript
          - yes | pecl install mcrypt-1.0.3
          - docker-php-ext-install pdo_mysql bcmath exif
          - curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer
          - composer install
          - ln -f -s .env.pipelines .env
          - php artisan migrate
          - ./vendor/bin/phpunit
        services:
          - mysql
          - redis
definitions:
  services:
    mysql:
      image: mysql:5.7
      environment:
        MYSQL_DATABASE: "laravel-pipeline"
        MYSQL_RANDOM_ROOT_PASSWORD: "yes"
        MYSQL_USER: "homestead"
        MYSQL_PASSWORD: "secret"
    redis:
      image: redis
The above works fine for building the application, running tests, etc. But when I add the step below to deploy using the scp pipe, I get a notice saying either that I need to include an image or, at times, that there is a bad indentation of a mapping entry.
- step:
    name: Deploy to test
    deployment: test
    # trigger: manual  # Uncomment to make this a manual deployment.
      script:
        - pipe: atlassian/scp-deploy:0.3.13
          variables:
            USER: '${remoteUser}'
            SERVER: '${server}'
            REMOTE_PATH: '${remote}'
            LOCAL_PATH: '${BITBUCKET_CLONE_DIR}/*'
I don't really know YAML, and this is my first time working with a CI/CD tool, so I am lost. Can someone guide me on what I am doing wrong?
Your indentation for name and deployment is not the same as for the script. Try putting it all on the same indentation like this.
- step:
    name: Deploy to test
    deployment: test
    script:
      - pipe: atlassian/scp-deploy:0.3.13
        variables:
          USER: '${remoteUser}'
          SERVER: '${server}'
          REMOTE_PATH: '${remote}'
          LOCAL_PATH: '${BITBUCKET_CLONE_DIR}/*'
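For reference, here is a sketch of how the deploy step can sit next to the build step: both are items of the same default list, with consistent indentation all the way down (the build script is abbreviated here). The step inherits the top-level php:7.4-fpm image and the pipe runs in its own container, so no extra image line is needed.
pipelines:
  default:
    - step:
        name: Build and test
        caches:
          - composer
        script:
          - composer install
          - ./vendor/bin/phpunit
        services:
          - mysql
          - redis
    - step:
        name: Deploy to test
        deployment: test
        script:
          - pipe: atlassian/scp-deploy:0.3.13
            variables:
              USER: '${remoteUser}'
              SERVER: '${server}'
              REMOTE_PATH: '${remote}'
              LOCAL_PATH: '${BITBUCKET_CLONE_DIR}/*'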

Laravel Dusk testing with GitLab CI error, can't connect to port 9515: Connection refused

So I've been trying to set up Laravel with GitLab. Everything works fine now, except that when the script tries to run my browser tests, I get the following error:
Curl error thrown for http POST to /session with params: {"capabilities":{"firstMatch":[{"browserName":"chrome","goog:chromeOptions":{"binary":"","args":["--disable-gpu","--headless","--window-size=1920,1080"]}}]},"desiredCapabilities":{"browserName":"chrome","platform":"ANY","chromeOptions":{"binary":"","args":["--disable-gpu","--headless","--window-size=1920,1080"]}}}
Failed to connect to localhost port 9515: Connection refused
at vendor/php-webdriver/webdriver/lib/Remote/HttpCommandExecutor.php:331
Here is my .gitlab-ci.yml file:
before_script:
  - apt-get update
  - apt-get install -qq git curl libmcrypt-dev libjpeg-dev libpng-dev libfreetype6-dev libbz2-dev
  - apt-get install zlib1g-dev libzip-dev
  - apt-get clean
  - curl --silent --show-error https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer
  - docker-php-ext-install pdo_mysql zip
  - cp .env.test .env

image: php:7.3

services:
  - mysql:5.7

variables:
  MYSQL_ROOT_PASSWORD: secret
  MYSQL_DATABASE: homestead
  MYSQL_USER: homestead
  MYSQL_PASSWORD: secret
  DB_HOST: mysql
  DB_USERNAME: root

stages:
  - test

browser_test:
  stage: test
  script:
    - echo "Starting pest tests"
    - composer install
    - php artisan dusk:install
    - php artisan dusk:chrome-driver
    - php artisan key:generate
    - php artisan migrate
    - php artisan serve & vendor/bin/pest
And this is my .env.test file:
APP_ENV=local
APP_DEBUG=true
APP_KEY=SomeRandomString
APP_URL=http://localhost
DB_CONNECTION=mysql
DB_HOST=mysql
DB_DATABASE=homestead
DB_USERNAME=homestead
DB_PASSWORD=secret
CACHE_DRIVER=file
SESSION_DRIVER=file
QUEUE_DRIVER=sync
From what I could find, it seems the ChromeDriver isn't running.
Thanks for your answers.
The problem is solved by adding --no-sandbox to the ChromeDriver setup.
In tests/DuskTestCase.php, add this:
protected function driver()
{
    $options = (new ChromeOptions)->addArguments([
        '--disable-gpu',
        '--headless',
        '--window-size=1920,1080',
        '--no-sandbox', // <-- add this line
    ]);

    return RemoteWebDriver::create(
        'http://localhost:9515',
        DesiredCapabilities::chrome()->setCapability(
            ChromeOptions::CAPABILITY, $options
        )
    );
}
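The --no-sandbox argument is what allows Chrome to start when the tests run as root inside a CI container. If the connection to port 9515 is still refused, nothing is listening there at all; one option some CI setups use (an alternative approach, not part of the fix above) is to explicitly start the ChromeDriver binary that dusk:chrome-driver downloads before the test runner, for example:
  script:
    - echo "Starting pest tests"
    - composer install
    - php artisan dusk:install
    - php artisan dusk:chrome-driver
    - php artisan key:generate
    - php artisan migrate
    # Dusk keeps platform-specific ChromeDriver binaries under vendor/laravel/dusk/bin;
    # 9515 is ChromeDriver's default port, matching the URL in DuskTestCase
    - ./vendor/laravel/dusk/bin/chromedriver-linux --port=9515 &
    - php artisan serve &
    - vendor/bin/pest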

Running dusk after php artisan serve fails in CircleCI

I'm working on a CircleCI config file that runs unit tests and browser (Dusk) tests. Tests fail in CircleCI every time on the command php artisan dusk, and php artisan serve is also 'canceled' if I add the attribute background: true. I have tried:
Declaring the port: php artisan serve --port=8000
Using background: true and php artisan serve &
Running curl http://localhost:8000 (failed)
CircleCI config file:
version: 2
jobs:
  Test:
    docker:
      - image: circleci/php:7.2-fpm-node-browsers
      - image: circleci/redis:5.0
      - image: circleci/mysql:5.7
        environment:
          MYSQL_ROOT_HOST: '%'
          MYSQL_ROOT_PASSWORD: ''
          MYSQL_USER: homestead
          MYSQL_PASSWORD: secret
          MYSQL_DATABASE: homestead
          MYSQL_ALLOW_EMPTY_PASSWORD: true
    environment:
      APP_ENV: testing
      APP_URL: http://localhost:8000
      APP_KEY: ###
      DB_HOST: 127.0.0.1
      DB_DATABASE: homestead
      DB_USERNAME: homestead
      DB_PASSWORD: secret
      REDIS_HOST: 127.0.0.1
      REDIS_PASSWORD: 'null'
      PUSHER_APP_ID: ###
      PUSHER_APP_KEY: ###
      PUSHER_APP_SECRET: ###
      PUSHER_APP_CLUSTER: ###
      MIX_PUSHER_APP_KEY: ###
      MIX_PUSHER_APP_CLUSTER: ###
    working_directory: ~/workspace
    steps:
      - checkout
      - run:
          name: Prepare Environment
          command: .circleci/kickstart.sh
      - restore_cache:
          keys:
            - composer-v1-{{ checksum "composer.lock" }}
            - composer-v1-
      - run: composer install -n --prefer-dist
      - save_cache:
          key: composer-v1-{{ checksum "composer.lock" }}
          paths:
            - vendor
      - restore_cache:
          keys:
            - node-v1-{{ checksum "package-lock.json" }}
            - node-v1-
      - run: npm install
      - save_cache:
          key: node-v1-{{ checksum "package-lock.json" }}
          paths:
            - node_modules
      - run:
          name: Build Artifacts
          command: npm run dev
      - run:
          name: Waiting for MySQL to be ready
          command: dockerize -wait tcp://127.0.0.1:3306 -timeout 120s
      - run: php artisan config:clear
      - run: php artisan config:cache
      - run: php artisan migrate:refresh --seed --database=mysql --force
      - run: php artisan dusk:install
      - run:
          name: Run PHP Unit Tests
          command: ./vendor/bin/phpunit
      - run:
          name: Run server
          command: php artisan serve --port=8000 &
      # - run:
      #     name: Run server
      #     command: php artisan serve --port=8000
      #     background: true
      - run:
          name: Test connection to server
          command: curl http://localhost:8000
      - run:
          name: Run E2E Tests
          command: php artisan dusk
          environment:
            APP_URL: http://localhost:8000
      - store_artifacts:
          path: ./tests/Browser/console
          destination: console
      - store_artifacts:
          path: ./tests/Browser/screenshots
          destination: screenshots
workflows:
  version: 2
  Build and Test:
    jobs:
      - Test
curl failing
#!/bin/bash -eo pipefail
php artisan serve --port=8000 &
curl http://localhost:8000
curl: (7) Failed to connect to localhost port 8000: Connection refused
Exited with code 7
Dusk should run after the server is running
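In the failing snippet above, curl fires the moment php artisan serve is backgrounded, before the server has finished booting, so the connection is refused. The usual CircleCI pattern is to mark the server step with background: true and then wait for the port before anything talks to it, the same way dockerize already waits for MySQL; background steps are simply terminated when the job ends, which can show up as 'canceled'. A sketch of that pattern, assuming the same port:
- run:
    name: Run server
    command: php artisan serve --port=8000
    background: true
- run:
    name: Wait for the dev server
    command: dockerize -wait tcp://127.0.0.1:8000 -timeout 60s
- run:
    name: Run E2E Tests
    command: php artisan dusk
    environment:
      APP_URL: http://localhost:8000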

How to run a Laravel app using docker-compose?

Following is what I tried; is there something I am doing wrong?
Step 1: create a simple Laravel app locally.
composer create-project --prefer-dist laravel/laravel laravel-app 5.6
Step 2: create docker-compose.yml
version: '3'
services:
  php:
    image: php:7-fpm
    ports:
      - "3021:8000"
    volumes:
      - ./laravel-app:/app
  composer:
    image: composer:latest
    volumes:
      - ./laravel-app:/app
    working_dir: /app
    command: ["install","php artisan serve --host=0.0.0.0"]
    depends_on:
      - php
After that, I run docker-compose up --force-recreate -d and access 127.0.0.1:3021 in the browser, but I get nothing.
Then I run docker-compose logs, and it shows me this error message:
Invalid argument php artisan serve --host=0.0.0.0. Use "composer require php artisan serve --host=0.0.0.0" instead to add packages to your composer.json.
How can I fix this issue?
You are mixing commands. Composer does not "serve"; PHP has a built-in dev server to "serve".
You can read more about it here: https://laravel.com/docs/4.2/quick
To actually get Laravel up and running, please do the following:
1 - Run this in the laravel-app folder: composer install
2 - Create a Dockerfile with the following contents:
FROM php:7
RUN apt-get update -y && apt-get install -y libmcrypt-dev openssl
RUN docker-php-ext-install pdo mcrypt mbstring
RUN curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer
WORKDIR /app
COPY . /app
CMD php artisan serve --host=0.0.0.0 --port=8000
EXPOSE 8000
3 - Build your docker image: docker build -t my-laravel-image .
4 - Finally replace the content of your docker-compose:
version: '3'
services:
  web:
    image: my-laravel-image
    ports:
      - 3021:8000
    volumes:
      - ./laravel-app:/app
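Because the compose file bind-mounts ./laravel-app over the image's /app, the vendor directory produced by step 1's composer install on the host is what the container actually uses. After building the image as in step 3, bringing it up is just:
docker-compose up -d
and the app should answer on http://127.0.0.1:3021 (host port 3021 is mapped to the container's port 8000).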
A more complete tutorial can be found here (not mine): https://www.techiediaries.com/docker-compose-laravel/
EDIT:
In order to use the official Composer image, you could simply do this:
version: '3'
services:
  composer:
    image: composer:latest
    working_dir: /app
    entrypoint: php artisan serve --host=0.0.0.0
    depends_on:
      - php
    volumes:
      - ./laravel-app:/app
    ports:
      - "3021:8000"
Make sure ./laravel-app contains a laravel project. Otherwise this won't work!
In the main folder of your Laravel app, create a file named Dockerfile and insert this code:
FROM php:7
RUN apt-get update -y && apt-get install -y openssl zip unzip git
RUN curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer
RUN docker-php-ext-install pdo pdo_mysql
WORKDIR /app
COPY . /app
RUN composer install
CMD php artisan serve --host=0.0.0.0 --port=8181
EXPOSE 8181
In the same folder as the Dockerfile, create a file named docker-compose.yml and insert this code:
version: '2'
services:
  app:
    build: .
    ports:
      - "8009:8000"
    volumes:
      - .:/app
    env_file: .env
    working_dir: /app
    command: bash -c 'php artisan migrate && php artisan serve --host 0.0.0.0'
    depends_on:
      - db
    links:
      - db
  db:
    image: "mysql:5.7"
    environment:
      - MYSQL_ROOT_PASSWORD=yourpassword
      - MYSQL_DATABASE=yourdbname
      - MYSQL_USER=root
      - MYSQL_PASSWORD=yourpassword
    volumes:
      - ./data/:/var/lib/mysql
    ports:
      - "3306:3306"
  phpmyadmin:
    depends_on:
      - db
    image: phpmyadmin/phpmyadmin
    restart: always
    ports:
      - 8090:80
    environment:
      PMA_HOST: db
      MYSQL_ROOT_PASSWORD: yourpassword
Open a terminal, go inside the Laravel folder, and launch these commands:
docker-compose build
docker-compose up -d
If you need to create and migrate the database, or use other commands, launch the Laravel commands in this way:
docker-compose run app php artisan
The app will be available at the address http://0.0.0.0:8009
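As a concrete example of that docker-compose run pattern (a usage sketch, assuming the app service defined above):
docker-compose run app php artisan migrate
docker-compose run app php artisan db:seed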
Source: https://medium.com/@pierangelo1982/dockerize-an-existing-laravel-application-with-docker-compose-a45eb7956cbd
