The command '/bin/sh -c yum install yum-utils' returned a non-zero code: 1 - laravel

I am trying to set up a Laravel PHP environment using Docker.
Is something wrong with the Dockerfile or the network configuration?
The Dockerfile I'm using:
FROM centos:7
# Install some must-haves
RUN yum -y install vim wget sendmail
RUN yum -y install libtool make automake autoconf nasm libpng-static
RUN yum -y install git
RUN git --version
# Install PHP 7.1 on CentOS
RUN rpm -Uvh https://dl.fedoraproject.org/pub/epel/epel-release-latest-7.noarch.rpm \
&& rpm -Uvh http://rpms.remirepo.net/enterprise/remi-release-7.rpm
RUN yum install yum-utils
RUN yum install epel-release
RUN yum-config-manager --enable remi-php73
RUN yum --enablerepo=remi-php73 -y install php php-bcmath php-cli php-common php-gd php-intl php-ldap php-mbstring \
php-mysqlnd php-pear php-soap php-xml php-xmlrpc php-zip php-fpm
RUN php -v
# Prepare PHP environment
COPY config/php/php-fpm.conf /etc/php-fpm.conf
COPY config/php/www.conf /etc/php-fpm.d/www.conf
COPY config/php/php.ini /usr/local/etc/php/php.ini
COPY config/php/xdebug.ini /usr/local/etc/php/conf.d/xdebug.ini
# Install Composer
RUN curl -sS https://getcomposer.org/installer | php
RUN mv composer.phar /usr/bin/composer
RUN composer --version
# Install Node.js
RUN curl -sL https://rpm.nodesource.com/setup_7.x | bash -
RUN yum -y install nodejs
RUN yum list installed nodejs
RUN node -v
# Final update and clean up
RUN yum -y update --skip-broken
RUN yum clean all
# Define work directory
WORKDIR /var/www/laravel-boilerplate
# Expose ports
EXPOSE 9000
CMD ["php-fpm", "-F", "-O"]
# CMD ["/bin/sh", "-l", "-c", "php-fpm"]
# CMD ["php-fpm", "-F"]
The command I ran to set up the instances is:
docker-compose up -d
Any idea what went wrong?
Here is my docker-compose file:
version: '2'
services:
  mysql:
    image: mysql:latest
    volumes:
      - "./data/db:/var/lib/mysql"
    ports:
      - "3306:3306"
    restart: always
    environment:
      - MYSQL_ROOT_PASSWORD=test
      - MYSQL_DATABASE=laravel_boilerplate
      - MYSQL_USER=root
      - MYSQL_PASSWORD=secret
  laravel-env:
    build: ./dockerfiles
    depends_on:
      - mysql
    volumes:
      - ".:/var/www/laravel-boilerplate"
      - "./dockerfiles/config/php/php-fpm.conf:/etc/php-fpm.conf"
      - "./dockerfiles/config/php/www.conf:/etc/php-fpm.d/www.conf"
      - "./dockerfiles/config/php/php.ini:/usr/local/etc/php/php.ini"
      - "./dockerfiles/config/php/xdebug.ini:/usr/local/etc/php/conf.d/xdebug.ini"
  nginx:
    image: nginx:latest
    depends_on:
      - laravel-env
    volumes:
      - ".:/var/www/laravel-boilerplate"
      - "./dockerfiles/config/nginx/default.conf:/etc/nginx/conf.d/default.conf"
    ports:
      - "80:80"
    restart: always
Let me know if I missed anything!
Does something have to be removed while building, or does something need to be deleted as a cleanup step? I'm pretty new to setting this up, so your help is much appreciated.
Thanks, folks.
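One likely culprit, for what it's worth: yum install without -y waits at a confirmation prompt, and in a non-interactive docker build the prompt is answered "no", so yum exits with code 1, which matches the non-zero code in the error above. A sketch of the fix for the two affected lines, assuming the rest of the Dockerfile stays unchanged:

```dockerfile
# yum prompts for confirmation by default; in a non-interactive
# `docker build` the prompt fails and yum exits non-zero.
# Adding -y (assume yes) lets these steps run unattended:
RUN yum -y install yum-utils
RUN yum -y install epel-release
```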

Related

How to run laravel job queue in php-fpm image on docker

I have containers for nginx, MailHog, Redis and PHP, all on the same network.
I run Laravel in the PHP container.
I want to make use of the job queue that Laravel has, but I am struggling to run the queue in the PHP container.
I've looked at all the examples, but it seems my lack of understanding of Docker is causing me to not ask the right question.
Below is my docker-compose.yml:
version: '3'
networks:
  devnet:
    external: true
services:
  # lightweight web-server:
  nginx:
    image: nginx:stable-alpine
    container_name: lar-nginx
    ports:
      - 8080:80
      - 4040:443
    volumes:
      - ./:/var/www
      - ./run/nginx:/etc/nginx/conf.d
      - ./local/certs:/etc/nginx/certs
    depends_on:
      - php
    networks:
      - devnet
  # server-side scripting engine
  php:
    build:
      context: .
      dockerfile: Dockerfile
    container_name: lar-php
    volumes:
      - ./:/var/www
    ports:
      - "9000:9000"
    networks:
      - devnet
  # caching server:
  redis:
    image: redis:latest
    container_name: lar-redis
    ports:
      - "6379:6379"
    networks:
      - devnet
  # development email catch-all server & client:
  mailhog:
    image: mailhog/mailhog:latest
    container_name: lar-mailhog
    ports:
      # smtp port for sending mail
      - "1025:1025"
      # mailhog web ui
      - "8025:8025"
    networks:
      - devnet
Dockerfile
FROM php:7.4-fpm
RUN apt-get update
RUN apt-get -y install curl gnupg cron
# RUN curl -sL https://deb.nodesource.com/setup_12.x | bash -
# RUN apt-get -y install nodejs
# RUN npm install
# Install other required PHP extensions and unix utils:
RUN apt-get update && apt-get install -y libmcrypt-dev \
mariadb-client libmagickwand-dev libonig-dev \
libzip-dev libcurl4-openssl-dev redis-server \
zlib1g-dev wget git \
--no-install-recommends \
# && pecl install imagick
# && docker-php-ext-enable imagick
&& docker-php-ext-install pdo_mysql \
&& docker-php-ext-install mbstring \
&& docker-php-ext-install zip \
&& docker-php-ext-install xml \
&& docker-php-ext-install curl \
&& docker-php-ext-install gd \
&& docker-php-ext-install soap
# Configure PHP internal vars:
ENV PHP_MEMORY_LIMIT=256M
# Install Composer
RUN curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer
# Install php apcu pecl package:
RUN pecl install apcu && docker-php-ext-enable apcu
# Install php redis pecl package:
RUN pecl install redis && docker-php-ext-enable redis
# Clear cache
RUN apt-get clean && rm -rf /var/lib/apt/lists/*
# Install extensions
RUN docker-php-ext-install pdo_mysql zip exif pcntl
# Permissions for Laravel
RUN chown -R www-data:www-data /var/www
RUN chmod -R 777 /var/www
COPY entrypoint.bash /usr/sbin
RUN chmod a+x /usr/sbin/entrypoint.bash
ENTRYPOINT /usr/sbin/entrypoint.bash
entrypoint.bash
#!/bin/bash
# turn on bash's job control
set -m
# Start the "main" PHP process and put it in the background
php-fpm &
# Start the helper crond process
crond
# now we bring the primary process back into the foreground
fg %1
In a normal server (LAMP) environment it's pretty simple to work with cron jobs and queues, but I don't know how to start up the queue.
Running php artisan queue:work in the PHP container returns There are no commands defined in the "queue:" namespace. Did you mean this? queue
Running it in Tinker,
\Queue::pushOn('new', new App\Jobs\PublishingClass(array('foo'=>1,'foobar'=>783,'foobarfoo'=>33)));
shows the job gets processed, but I need to do it with a process running in the background.
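One way to approach this (a sketch, assuming the worker should live in the same container as php-fpm; a dedicated worker container or a supervisor process is usually cleaner) is to extend entrypoint.bash so a queue worker is started in the background alongside php-fpm:

```shell
#!/bin/bash
# turn on bash's job control
set -m
# start the "main" php-fpm process in the background
php-fpm &
# start the helper crond process
crond
# start a Laravel queue worker in the background
# (run from the app root, /var/www per the compose volume, so artisan is found)
cd /var/www && php artisan queue:work --tries=3 &
# bring php-fpm back into the foreground as the primary process
fg %1
```

Note that if php-fpm stops, the container exits and the worker dies with it, which is usually the desired coupling for a single-container setup.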
The simplest way is to do it with Tinker.
Tinker is a Laravel command used for debugging; start it by running the following from the project root:
php artisan tinker
To dispatch a job on a specific queue from Tinker:
\Queue::pushOn('rms', new App\Jobs\UpdateRMS());
The first parameter is the queue name; the second is the job instance.
To dispatch multiple jobs at once to a specific queue:
\Queue::bulk([new App\Jobs\UpdateRMS(), new App\Jobs\UpdateRMS()], null, 'rms');
You can use this Docker image; you don't need to configure the scheduler, as it is already implemented, along with various PHP extensions such as Redis and Rdkafka.
Follow these links:
https://hub.docker.com/r/jkaninda/laravel-php-fpm
https://github.com/jkaninda/laravel-php-fpm

How to avoid installing all centos packages everytime I run gitlab ci pipeline?

I'm running a GitLab CI pipeline with a CentOS image.
The pipeline has a before_script that runs a set of commands.
gitlab-ci.yaml
variables:
  WORKSPACE_HOME: '$CI_PROJECT_DIR'
  DELIVERY_HOME: delivery
  PIP_CACHE_DIR: "$CI_PROJECT_DIR/.cache/pip"
default:
  image: centos:latest
cache:
  paths:
    - .cache/pip
before_script:
  - chmod u+x devops/scripts/*.sh
  - devops/scripts/install-ci.sh
  - python3 -m ensurepip --upgrade
  - cp .env.docker.dist .env
  - pip3 install --upgrade pip
  - pip3 install -r requirements.txt
install-ci.sh
sed -i 's/mirrorlist/#mirrorlist/g' /etc/yum.repos.d/CentOS-Linux-* &&\
sed -i 's|#baseurl=http://mirror.centos.org|baseurl=http://vault.centos.org|g' /etc/yum.repos.d/CentOS-Linux-*
yum -y update
yum -y install gcc gcc-c++ make
yum -y install python3.8
yum -y install python3-setuptools
yum -y groupinstall "Development Tools"
yum -y install python3-devel
yum -y install postgresql-server
yum -y install postgresql-devel
yum -y install postgresql-libs
yum -y install python3-pip
timedatectl set-timezone Europe/Paris
yum -y install sqlite-devel
The issue is that every time I run the CI pipeline, it takes time to install all of the CentOS packages.
Is there a way to avoid this, or to cache this operation somewhere?
You could create your own image with all your dependencies installed and use that in your job instead of installing the dependencies all over again. I would create a dedicated project on your GitLab instance, something like "centos-python-postgress", and within this project create a Dockerfile in which you install everything you need (you can either copy in your install-ci.sh or RUN the commands directly in the Dockerfile):
FROM centos:latest
RUN sed -i 's/mirrorlist/#mirrorlist/g' /etc/yum.repos.d/CentOS-Linux-* && sed -i 's|#baseurl=http://mirror.centos.org|baseurl=http://vault.centos.org|g' /etc/yum.repos.d/CentOS-Linux-*
RUN yum -y update
RUN yum -y install gcc gcc-c++ make
...
You can now either build the Dockerfile on your machine and push it manually to the container registry in this project or you create a CI Pipeline that builds and pushes that image automatically:
stages:
  - build

build:
  stage: build
  image:
    name: gcr.io/kaniko-project/executor:debug
    entrypoint: [""]
  script:
    - mkdir -p /kaniko/.docker
    - echo "{\"auths\":{\"${CI_REGISTRY}\":{\"auth\":\"$(printf "%s:%s" "${CI_REGISTRY_USER}" "${CI_REGISTRY_PASSWORD}" | base64 | tr -d '\n')\"}}}" > /kaniko/.docker/config.json
    - >-
      /kaniko/executor
      --context "${CI_PROJECT_DIR}"
      --dockerfile "${CI_PROJECT_DIR}/Dockerfile"
      --destination "${CI_REGISTRY_IMAGE}:latest"
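The manual alternative mentioned above (building locally and pushing to the project's container registry) amounts to roughly the following; the registry path here is only an example and depends on your GitLab instance and project:

```shell
# build the image locally, tagged with the registry path of the dedicated project
docker build -t registry.gitlab.com/your-group/centos-python-postgress:latest .
# authenticate against the GitLab registry, then push
docker login registry.gitlab.com
docker push registry.gitlab.com/your-group/centos-python-postgress:latest
```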
Now, instead of using centos:latest in your original project/job, you can use your own image:
variables:
  WORKSPACE_HOME: '$CI_PROJECT_DIR'
  DELIVERY_HOME: delivery
  PIP_CACHE_DIR: "$CI_PROJECT_DIR/.cache/pip"
default:
  image: registry.gitlab.com/snowfire/centos-python-postgress:latest
cache:
  paths:
    - .cache/pip
before_script:
  - ...

Docker compose work on linux environment but not windows environment

I am using two environments for development: a Linux VM at home and a Windows laptop at the office. The Dockerfile for the Angular environment worked fine until a few days ago, when it started showing the following error when I tried to start the container with Docker Compose on the laptop:
ng | /bin/sh: 1: sudo: not found
ng exited with code 127
However, the same issue does not occur on my Linux VM.
Dockerfile:
FROM node:12
RUN wget -q -O - https://dl-ssl.google.com/linux/linux_signing_key.pub | apt-key add -
RUN sh -c 'echo "deb [arch=amd64] http://dl.google.com/linux/chrome/deb/ stable main" >> /etc/apt/sources.list.d/google.list'
RUN apt-get update && apt-get install -yq google-chrome-stable
RUN mkdir /app
WORKDIR /app
ENV PATH /app/node_modules/.bin:$PATH
COPY package.json package-lock.json /app/
RUN npm install
#RUN npm install -g @angular/cli
COPY . /app
EXPOSE 4200
CMD ng serve --host 0.0.0.0
docker-compose.yaml:
version: "3"
services:
  dj:
    container_name: dj
    build: Backend
    command: python manage.py runserver 0.0.0.0:80
    volumes:
      - ./Backend:/code
    ports:
      - "80:80"
  ng:
    container_name: ng
    build: Frontend/SPort
    volumes:
      - ./Frontend/SPort:/app
    ports:
      - "4200:4200"
I think you want to fix the sh script in your Dockerfile.
Add this:
RUN apt-get update && apt-get install -y dos2unix && dos2unix /path/to/the/script
Hope that helps, since the error comes from CRLF line endings on Windows.
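The CRLF failure mode can be reproduced and repaired outside Docker too. A minimal sketch (the file name here is hypothetical) showing how a carriage return sneaks into a script and how stripping it, which is what dos2unix does, fixes it:

```shell
# simulate a script saved with Windows CRLF line endings
printf 'echo hello\r\n' > demo.sh
# strip the trailing carriage returns (the same transformation dos2unix performs)
sed -i 's/\r$//' demo.sh
# the script now runs cleanly
sh demo.sh
```

To keep the problem from coming back, it also helps to stop Git from converting line endings on Windows checkouts, e.g. git config core.autocrlf input, or a .gitattributes entry such as * text eol=lf.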

How do I install composer in container of docker?

I am new to Docker and docker-compose, and I am developing a Laravel project on Docker with Laradock, following a tutorial (not sure whether that is the correct way to describe this situation, though).
I want to install Composer in this environment so that I can use the composer command.
As a matter of fact, I wanted to seed data into the DB I had made with php artisan make:migrate, but this error appeared:
include(/var/www/laravel_practice/vendor/composer/../../database/seeds/AdminsTableSeeder.php): failed to open stream: No such file or directory
So I googled the error to find a solution, and found one.
It says, "Do composer dump-autoload and try seeding again", so I followed it, and then this error appeared:
bash: composer: command not found
This is because I have not installed Composer in the Docker container.
My Docker setup currently consists of:
・workspace
・mysql
・apache
・php-fpm
Since I have not installed Composer, I have to install it in the Docker container to solve the problem, BUT I have no idea how.
So could anyone tell me how to install Composer in a Docker container?
Thank you.
Here are the laradock/mysql/Dockerfile and laravelProject/docker-compose.yml:
ARG MYSQL_VERSION=5.7
FROM mysql:${MYSQL_VERSION}
LABEL maintainer="Mahmoud Zalt <mahmoud@zalt.me>"
#####################################
# Set Timezone
#####################################
ARG TZ=UTC
ENV TZ ${TZ}
RUN ln -snf /usr/share/zoneinfo/$TZ /etc/localtime && echo $TZ > /etc/timezone && chown -R mysql:root /var/lib/mysql/
COPY my.cnf /etc/mysql/conf.d/my.cnf
CMD ["mysqld"]
EXPOSE 3306
version: '2'
services:
  db:
    image: mysql:5.7
    ports:
      - "6603:3306"
    environment:
      - MYSQL_ALLOW_EMPTY_PASSWORD=true
      - MYSQL_DATABASE=laravelProject
      - LANG=C.UTF-8
    volumes:
      - db:/var/lib/mysql
    command: mysqld --sql-mode=NO_ENGINE_SUBSTITUTION --character-set-server=utf8 --collation-server=utf8_unicode_ci
  web:
    image: arbiedev/php-nginx:7.1.8
    ports:
      - "8080:80"
    volumes:
      - ./www:/var/www
      - ./nginx.conf:/etc/nginx/sites-enabled/default
volumes:
  db:
You can build your own image and use it in your Docker compose file.
FROM php:7.2-alpine3.8
RUN apk update
RUN apk add bash
RUN apk add curl
# INSTALL COMPOSER
RUN curl -s https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer
# INSTALL NGINX
RUN apk add nginx
I used the PHP Alpine image as my base image because it's lightweight, so you might have to install other dependencies yourself. In your docker-compose file:
web:
  build: path/to/your/Dockerfile/directory
  image: your-image-tag
  ports:
    - "8080:80"
  volumes:
    - ./www:/var/www
    - ./nginx.conf:/etc/nginx/sites-enabled/default
You could do something like this:
FROM php:8.0.2-apache
RUN apt-get update && apt-get upgrade -y
RUN apt-get install -y mariadb-client libxml2-dev
RUN apt-get autoremove -y && apt-get autoclean
RUN docker-php-ext-install mysqli pdo pdo_mysql xml
COPY --from=composer /usr/bin/composer /usr/bin/composer
The COPY --from=composer instruction should solve your problem.
FROM php:7.3-fpm-alpine
RUN docker-php-ext-install pdo pdo_mysql
RUN docker-php-ext-install mysqli && docker-php-ext-enable mysqli
RUN php -r "readfile('https://getcomposer.org/installer');" | php -- --install-dir=/usr/bin/ --filename=composer
RUN apk update
RUN apk upgrade
RUN apk add bash

Docker build image for php apache

I am trying to create a Docker image from Ubuntu in which I need to install Laravel.
I have a Dockerfile with this code:
FROM ubuntu:latest
RUN apt-get update && apt-get upgrade -y\
&& apt-get install apache2\
&& apt-get install php libapache2-mod-php php-common php-mbstring php-xmlrpc php-soap php-gd php-xml php-mysql php-cli php-mcrypt php-zip\
&& curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer
EXPOSE 80 443
When I run docker-compose up I get this error:
ERROR: Service 'web' failed to build: The command '/bin/sh -c apt-get update && apt-get upgrade -y && apt-get install apache2 && apt-get install php libapache2-mod-php php-common php-mbstring php-xmlrpc php-soap php-gd php-xml php-mysql php-cli php-mcrypt php-zip && curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer' returned a non-zero code: 1
My docker-compose file is:
version: '3'
services:
  db:
    image: postgres
    restart: always
    environment:
      POSTGRES_PASSWORD: .......
  adminer:
    image: adminer
    restart: always
    ports:
      - 8080:8090
  web:
    build: .
    working_dir: /var/www/html
    volumes:
      - .:/var/www/html
    ports:
      - "80:7000"
    depends_on:
      - db
What should I do to create a Docker image for my Laravel application? I have a Laravel application and need to run it through Docker.
I just used this tutorial: https://www.digitalocean.com/community/tutorials/how-to-set-up-laravel-nginx-and-mysql-with-docker-compose#step-1-%E2%80%94-downloading-laravel-and-installing-dependencies to accomplish exactly this.
If you just need the end result check out my github repo: https://github.com/RyanFletcher86/LaravelDocker
I've managed to distill it down to 2 commands to get everything up and running.
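For anyone debugging the build error itself: apt-get install without -y aborts at the confirmation prompt during docker build, and curl is piped to before it is ever installed. A hedged sketch of corrected instructions (the package list is an assumption and may need adjusting; php-mcrypt in particular is no longer packaged in recent Ubuntu releases):

```dockerfile
FROM ubuntu:latest
# noninteractive frontend avoids tzdata and similar install-time prompts
ENV DEBIAN_FRONTEND=noninteractive
RUN apt-get update && apt-get upgrade -y \
    # -y answers the install prompts; curl must be installed before it is used
    && apt-get install -y apache2 curl \
       php libapache2-mod-php php-common php-mbstring php-xmlrpc php-soap \
       php-gd php-xml php-mysql php-cli php-zip \
    && curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer
EXPOSE 80 443
```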