I am very new to the containerized approach to deploying applications. I am trying to deploy my Laravel app to Azure using Docker and ACI, and I couldn't find any well-explained articles that match my deployment requirements.
I am trying to set up a proper DevOps pipeline with this sequence: push my code to GitHub, run GitHub Actions, build the Docker image, push it to ACR, and pull it in ACI.
I built the Laravel Docker image in my local environment with Nginx and Supervisor in a single image, and it works well. Now I want automated Let's Encrypt SSL on the Nginx server. Rebuilding the image every time I request a new SSL certificate with certbot doesn't seem like the right approach, right? So what is the best way to do it?
Here is my current Dockerfile without SSL:
# Use the official PHP 8.1 image as the base image
FROM php:8.1-fpm
# Install necessary packages
RUN apt-get update && apt-get install -y git zip unzip supervisor libpng-dev libonig-dev libxml2-dev libzip-dev nginx
# Full update system
RUN apt-get upgrade -y
# Install PHP extensions
RUN docker-php-ext-install pdo_mysql gd mbstring exif pcntl bcmath zip
# Set the working directory to /var/www/html
WORKDIR /var/www/html
# Copy the Laravel application files to the container
COPY . .
# Copy the Nginx configuration file
COPY nginx.conf /etc/nginx/sites-available/default
# Install Composer and run it to install the application dependencies
RUN curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer
RUN composer install --no-dev --no-interaction
# Copy the environment file
RUN cp .env.example .env
# Generate the application key
RUN php artisan key:generate
# Set the ownership and permissions of the application files
RUN chown -R www-data:www-data /var/www/html
# Copy the Supervisor configuration file
COPY supervisor.conf /etc/supervisor/conf.d/mysupervisor.conf
# Expose port 80 for the Nginx web server
EXPOSE 80
# Start Nginx and Supervisor
CMD ["/bin/sh", "-c" , "service nginx restart && /usr/bin/supervisord -c /etc/supervisor/supervisord.conf"]
I'm having some issues with running the laravel command in my Docker container.
I use the php Docker image and use the COPY command to get Composer from the composer image. After that I add Composer to my $PATH variable and run the composer global require laravel/installer command.
After building the docker-compose file and running it, I open a shell in my php container. There I try to run the laravel command but get the following error: /bin/sh: laravel: not found.
Looking in the $HOME/.config/composer/vendor folder, I can see laravel there, so I think the installation is correct.
I might be completely wrong here or have made a dumb rookie mistake, so any help is greatly appreciated.
Below is my Dockerfile:
FROM php:8.0.14-apache
RUN docker-php-ext-install pdo pdo_mysql
#apache
RUN a2enmod rewrite
#composer
COPY --from=composer:latest /usr/bin/composer /usr/local/bin/composer
#add composer to path
ENV PATH="$PATH:$HOME/usr/local/bin/composer"
RUN export PATH="$PATH:$HOME/.composer/vendor/bin"
#update
RUN apt-get update
RUN apt-get -y install nano
#add nodejs
RUN apt-get -y install nodejs npm
COPY . /var/www/html/
RUN npm install
#install laravel
RUN composer global require laravel/installer
You copy composer to /usr/local/bin/composer, but you add $HOME/usr/local/bin/composer to the path.
Also, RUN export ... doesn't do anything, because each RUN statement is run in a separate shell. So when the RUN command is done, the shell exits and the exported variable is lost.
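To illustrate the difference (a standalone snippet, not part of either Dockerfile):
RUN export FOO=bar && echo "$FOO"   # prints "bar", but only inside this RUN step
RUN echo "FOO is now: $FOO"         # FOO is empty here; the export did not survive
ENV FOO=bar
RUN echo "FOO is now: $FOO"         # prints "FOO is now: bar"; ENV persists into later steps and the final image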
Try this
FROM php:8.0.14-apache
RUN docker-php-ext-install pdo pdo_mysql
#apache
RUN a2enmod rewrite
#composer
COPY --from=composer:latest /usr/bin/composer /usr/local/bin/composer
#add composer to path
ENV PATH="$PATH:/usr/local/bin/composer"
#update
RUN apt-get update
RUN apt-get -y install nano
#add nodejs
RUN apt-get -y install nodejs npm
COPY . /var/www/html/
RUN npm install
#install laravel
RUN composer global require laravel/installer
I've added the "changed path" reported by the composer global about command to my ENV PATH and appended /vendor/bin. I'm not sure if it's bad practice to add something from the root folder to the $PATH variable.
So the complete line looks like this:
ENV PATH="$PATH:/root/.config/composer/vendor/bin"
By adding this line I'm able to run the laravel command.
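A quick way to verify the PATH change (my own suggestion; the image tag is just an example):
docker build -t laravel-installer-test .
docker run --rm laravel-installer-test laravel --version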
I have been trying for the last few days to set up Laravel in Docker on my WSL2-enabled machine. After digging through various bloated stacks, I've tried to build my own stack for local development. My issue is that I cannot both mount the uncompiled Laravel folder into the container and install dependencies via Composer. Below are my current files. I cannot access the default application because autoload.php has not been created by Composer. If I copy the files into the container via the Dockerfile and then run the Composer dependencies, I end up with a static application that does not reflect changes as I make them in VSCode.
For clarification, my goal is simply to be able to edit my Laravel application without needing to rebuild the image every time.
dockerfile
FROM php:7.4.14-apache
RUN apt-get update -y && apt-get install -y zip unzip
# Install Composer
RUN curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer
# Install Laravel
# VOLUME ./laravel /var/www/html
# WORKDIR /var/www/html
# RUN composer install
docker-compose.yml
version: '3'
services:
  web:
    build:
      context: .
      dockerfile: dockerfile.httpd
    ports:
      - '80:80'
    volumes:
      - './laravel/public:/var/www/html'
  db:
    image: mariadb
    restart: always
    environment:
      - MYSQL_ROOT_PASSWORD=my_secure_pwd
      - MARIADB_USER=mydbuser
      - MARIADB_DATABASE=laravel
      - MARIADB_PASSWORD=mydbuserpwd
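One common way around this (my own sketch, not from the original post) is to install dependencies when the container starts rather than when the image is built, so the bind-mounted source stays editable. Assuming the whole ./laravel directory is mounted to /var/www/html (not just public/) and an entrypoint.sh sits next to the Dockerfile, it could look roughly like this:
dockerfile.httpd (sketch)
FROM php:7.4.14-apache
RUN apt-get update -y && apt-get install -y zip unzip
RUN curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer
WORKDIR /var/www/html
COPY entrypoint.sh /usr/local/bin/entrypoint.sh
RUN chmod +x /usr/local/bin/entrypoint.sh
ENTRYPOINT ["/usr/local/bin/entrypoint.sh"]
CMD ["apache2-foreground"]
entrypoint.sh (sketch)
#!/bin/sh
# install dependencies into the mounted source tree if they are missing, then hand off to Apache
if [ ! -f vendor/autoload.php ]; then
    composer install
fi
exec "$@"
(With the whole project mounted, Apache's DocumentRoot would also need to point at /var/www/html/public.)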
I'm fairly new to Docker, so I'm trying to learn more about it using a Laravel project. I'm following this tutorial:
https://www.digitalocean.com/community/tutorials/how-to-set-up-laravel-nginx-and-mysql-with-docker-compose
I've adjusted the Dockerfile a bit from what the tutorial has, but even the tutorial's file causes the same result.
FROM php:7.3-fpm
# Copy composer.lock and composer.json
COPY composer.lock composer.json /var/www/
# Install dependencies
RUN curl -sL https://deb.nodesource.com/setup_10.x | bash - && \
    apt-get update && apt-get install -y mysql-client nodejs build-essential vim git curl
RUN npm install -g npm
# Clear cache
RUN apt-get clean && rm -rf /var/lib/apt/lists/*
# Install extensions
RUN docker-php-ext-install pdo pdo_mysql
# Install composer
RUN curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer
# Add user for laravel application
RUN groupadd -g 1000 www
RUN useradd -u 1000 -ms /bin/bash -g www www
# Copy existing application directory contents
COPY . /var/www
# Copy existing application directory permissions
COPY --chown=www:www . /var/www
# Change current user to www
USER www
# Set working directory
WORKDIR /var/www
# Expose port 9000 and start php-fpm server
EXPOSE 9000
CMD ["php-fpm"]
But I keep getting the following error when I run docker-compose up -d:
E: Package 'mysql-client' has no installation candidate
ERROR: Service 'app' failed to build: The command '/bin/sh -c curl -sL https://deb.nodesource.com/setup_10.x | bash - && apt-get update && apt-get install -y mysql-client nodejs build-essential vim git curl' returned a non-zero code: 100
Am I missing something?
I expected this to work since I am running apt-get update before installing mysql-client.
Thanks.
php:7.3-fpm now uses Debian 10 (Buster) as its base image, and Buster ships with MariaDB, so replacing mysql-client with mariadb-client should fix it.
If you still want the MySQL client, it's now called default-mysql-client.
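Applied to the Dockerfile above, the install line would become, for example:
RUN curl -sL https://deb.nodesource.com/setup_10.x | bash - && \
    apt-get update && apt-get install -y mariadb-client nodejs build-essential vim git curl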
php:7.2-apache triggers the error as well, but I resolved it by using php:7.2.18-apache.
This worked for me: sudo apt-get update && apt-get install -y git curl libmcrypt-dev default-mysql-client
Alternatively, run apt-cache search mysql-server to find the available server packages, then install them. In my case it was: sudo apt-get install default-mysql-server default-mysql-server-core mariadb-server-10.6 mariadb-server-core-10.6
I am new to Laravel and Docker. I have a Laravel repository and want to clone it and set up Docker for it. How do I set up a Dockerfile for it, and what should I write in the Dockerfile?
Besides using Laradock, a more step-by-step approach is described in "Laravel in Docker" and the example repository buddy-works/laravel-first-steps.
It uses this example Dockerfile:
FROM php:7
RUN apt-get update -y && apt-get install -y openssl zip unzip git
RUN curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer
RUN docker-php-ext-install pdo mbstring
WORKDIR /app
COPY . /app
RUN composer install
CMD php artisan serve --host=0.0.0.0 --port=8181
EXPOSE 8181
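To try it out, something along these lines should work (the image tag is just an example name):
docker build -t laravel-first-steps .
docker run --rm -p 8181:8181 laravel-first-steps
Then open http://localhost:8181 in the browser.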
I want to reach my Docker container's output from my local machine; the following is what I tried.
step1.
Create a php-composer image using this Dockerfile.
FROM php:7
RUN apt-get update
RUN apt-get install curl
RUN curl -sS https://getcomposer.org/installer -o composer-setup.php
RUN php composer-setup.php --install-dir=/usr/local/bin --filename=composer
RUN apt-get install -y git
step2.
Create a container and run the Laravel app.
docker run -p 127.0.0.1:3000:8000 --name MyTest -dt php-composer (to create a container)
docker cp laravelApp/ d4bbb5d36312:/usr/
docker exec -it MyTest bash
cd usr/laravelApp
php artisan serve
After that, the terminal shows a success message:
Laravel development server started: <http://127.0.0.1:8000>
But when I access 127.0.0.1:3000 in the browser, I get nothing.
Why is that?
There are some PHP extensions that Laravel needs, so you have to install them too. This is the full Dockerfile:
FROM php:7
RUN apt-get update -y && apt-get install -y openssl zip unzip git
RUN curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer
RUN docker-php-ext-install pdo mbstring
WORKDIR /app
# this copies all the app files to a folder called `app`
COPY app /app
RUN composer install
CMD php artisan serve --host=0.0.0.0 --port=8000
EXPOSE 8000
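First build the image from the Dockerfile above (the tag is whatever name you choose):
docker build -t <image-name> .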
then to run the container, run this command only:
docker run -p 3000:8000 --name MyTest <image-name>
then go to http://localhost:3000
let me know if it didn't work