Deploying Laravel to Azure - Laravel 8

I have managed to deploy my Laravel application to Azure using Bitbucket. Since I am unable to run composer install on Azure DevOps, I commit the vendor folder to Bitbucket for deployment.
I understand the best practice is to ignore the vendor folder and run composer install on the server to install all dependencies.
Is there a way I can run composer install in an Azure DevOps pipeline so I could gitignore the vendor folder during my git push?

In order to run composer install inside your project folder on the server, you need to grant read and write access. The following commands will do that and install the dependencies:
sudo chmod -R 777 /var/www/html/PROJECT_FOLDER_NAME
sudo composer install
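To address the Azure DevOps part of the question directly: Microsoft-hosted Ubuntu agents ship with PHP and Composer preinstalled, so a pipeline step can build the vendor folder for you. A minimal azure-pipelines.yml sketch (the trigger branch and pool image are assumptions; adapt them to your setup):

trigger:
  - main

pool:
  vmImage: ubuntu-latest

steps:
  - script: composer install --no-dev --optimize-autoloader --no-interaction
    displayName: Install Composer dependencies

The pipeline can then package the project, including the freshly built vendor folder, as its deployment artifact, so vendor can stay in .gitignore.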

Related

Laravel project size is too large. How to deploy?

I made a simple blog with the Laravel framework and now it is finished. But I am new to this, so the question is: how do I deploy my project? The vendor and Clockwork folders have thousands of files. I think they should not be uploaded to the server. Am I wrong?
The project will not work without the vendor dependencies, but it's true that sending them all over SSH takes a long time and the archive will be huge; uploading them yourself every time would be annoying. Usually you do not upload the vendor and node_modules folders to the server, but instead run:
composer install --optimize-autoloader --no-dev
npm install && npm run build && rm -R node_modules
Read more details in Laravel's deployment section.
Maybe try compressing the project into a zip file before uploading.

Laravel/ui package doesn't work after pulling from GitHub

My friend and I are working on a project, and we installed the laravel/ui package on his device to use Auth::routes. But after I pulled from GitHub it says "In order to use the Auth::routes() method, please install the laravel/ui package.", even though I pulled every file he pushed and the .gitignore file is empty. Does this mean I have to install the laravel/ui package on my device too? I don't want our client to have to install all of this again when we finish and send them the folder. We are using Laravel 7.24, by the way.
You just need to regenerate the autoloaded classes:
composer dump-autoload
Open a terminal in your project's root directory and run:
composer install
on your system, because the vendor directory (where the packages are located) has probably been added to .gitignore.
If vendor is not listed in .gitignore, try running:
composer dump-autoload
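For context, a stock Laravel .gitignore already excludes the vendor directory, which is why every collaborator has to run composer install after cloning. A typical excerpt (exact entries vary by Laravel version):

/node_modules
/public/hot
/public/storage
/storage/*.key
/vendor
.env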

How to fix the Laravel 5.2 error "failed to open stream: No such file or directory" without Composer

I've got this error in my Laravel 5.2 project, which is hosted on Debian Linux:
Warning: require_once(/home/u706561288/public_html/sap/vendor/composer/autoload_real.php): failed to open stream: No such file or directory in
/home/u706561288/public_html/sap/vendor/autoload.php on line 5
Fatal error: require_once(): Failed opening required '/home/u706561288/public_html/sap/vendor/composer/autoload_real.php' (include_path='.:/opt/alt/php70/usr/share/pear') in
/home/u706561288/public_html/sap/vendor/autoload.php on line 5
Many forums like Stack Overflow tell me to run composer update, but unfortunately my hosting package does not allow installing Composer. Please tell me how to fix this problem.
I suggest you try these steps:
On localhost, run these two commands: composer update and composer dump-autoload.
Re-upload the entire project to the server.
If the problem is not resolved, you can also delete the vendor folder and the composer.lock file, run composer install, and re-upload the entire project again.
When the hosting service or a PC does not allow installing Composer and an error like mine appears, follow these steps:
Delete the whole Laravel project on the hosting service. I recommend using SmartFTP for good transfer speed and tracked actions.
Back in the localhost project, run composer install --no-scripts and then composer clear-cache.
Re-upload the whole Laravel project.
Don't forget to configure the .env file.
I hope this helps anyone with the same problem in the future.
I had the same error a while back. What I did to solve it was delete the vendor folder from the project root, then install it back by running composer install.
Deleting the vendor folder and the composer.lock file, running composer install, and re-uploading the entire project again is what helped me.
If anyone is still facing the issue after deleting the vendor directory and running composer install, try updating Composer with composer self-update, then repeat the install process.
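A condensed sketch of the local-rebuild workflow described above, assuming shell access on the local machine (the user@host target is a placeholder; if the host only offers FTP, upload with a client such as SmartFTP instead):

# on the local machine, from the project root
rm -rf vendor composer.lock
composer install --no-scripts
composer clear-cache
# re-upload the project; exclude .env so the server's configuration is kept
rsync -az --exclude='.env' ./ user@host:/home/u706561288/public_html/sap/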

How can I run composer with ddev?

I need to run composer on my ddev project and don't have it on my Windows machine. For example, the project requires a composer install before startup. How can I use composer in this environment, especially on Windows?
Updated 2018-11-15 to show native ddev support (ddev composer command)
There are several ways to run composer for your project.
ddev v1.4.0 now has the ddev composer and ddev composer create commands. These run composer inside the container, so you're guaranteed to get composer behavior that matches the in-container hosting environment. (This matters most for Windows users.)
ddev composer require swiftmailer/swiftmailer
ddev composer update
ddev composer install
ddev composer create drupal-composer/drupal-project:8.x-dev --stability dev
Note that ddev composer create is not exactly the same as composer create-project; it's adapted so you don't have to understand the complexities of the underlying filesystem. There are Drupal and TYPO3 ddev composer create examples in the docs.
Nothing here prevents you from using any composer technique that you're comfortable with, but this is a great way to get predictable on-Linux, in-container composer builds. It should be especially valuable for people on Windows, where composer is less available and has some unpredictable behavior.
Install on the host the old-fashioned way: if composer is installed on your computer/host, just run composer install. However, that only works on macOS and Linux, and only if you have the right versions of the PHP-related components. It does not work well at all on Windows (NTFS), because the symlinks composer creates are not compatible with usage inside the (Linux) web container. (Composer is not hard to install on Windows: use Chocolatey and choco install -y composer. You'll want to enable the gd and curl extensions in c:\tools\php72\php.ini.)
All the normal composer behavior has always been available inside your web container, so you can use it whether or not you have composer on your host computer. For example, ddev exec composer install -d /var/www/html will do a composer install in the root of your repository, exactly the same as ddev composer install. You can also ddev ssh and work on the command line inside the container.
Try this hooks approach to running composer install inside the container (on the mounted partition) every time your project starts:
hooks:
  post-start:
    - exec: composer install -d /var/www/html
For some older ideas on composer patterns (mostly obsoleted by ddev composer), see
How to: Use "composer create-project" and DDEV to start a new Drupal 8 site when Composer isn't installed on the host machine and
How to: Set up a D8/Composer site on Pantheon without CircleCI, custom upstreams
To expand on the accepted answer, DDEV now has a composer-specific hook.
hooks:
  post-start:
    - composer: install -d /var/www/html
The reason for using this instead of exec, I assume, is that there are also pre-composer and post-composer hooks, so maybe this also executes those hooks. I'm not sure of that or the actual difference, though.

How to cache package manager downloads for docker builds?

If I run composer install from my host, I hit my local composer cache:
- Installing deft/iso3166-utility (1.0.0)
Loading from cache
Yet when building a container whose Dockerfile contains:
RUN composer install -n -o --no-dev
I download all the things, e.g.:
- Installing deft/iso3166-utility (1.0.0)
Downloading: 100%
It's expected, yet I'd like to avoid it, since even on a rebuild it downloads everything again.
I would like to have a universal cache for composer that I could also share with other docker projects.
I looked into this and found the approach of defining a volume in the Dockerfile:
ENV COMPOSER_HOME=/var/composer
VOLUME /var/composer
I added that to my Dockerfile and expected to download the files only once and hit the cache afterwards. Yet when I modify my composer command, e.g. remove the -o flag, and rerun docker build ., I still download all the vendors again.
How are volumes supposed to work to have a data cache inside a docker container?
Use the experimental Docker BuildKit feature (supported since Docker 18.09 and docker-compose 1.25.4).
In your Dockerfile:
# syntax=docker/dockerfile:experimental
FROM ....
# ......
# The cache mount keeps /var/composer (COMPOSER_HOME above) across builds,
# so Composer reuses downloaded packages instead of fetching them again.
RUN --mount=type=cache,target=/var/composer composer install -n -o --no-dev
Now before building, make sure the env var is exported:
export DOCKER_BUILDKIT=1
docker build ....
If you are using docker-compose, make sure to also export COMPOSE_DOCKER_CLI_BUILD:
export COMPOSE_DOCKER_CLI_BUILD=1 DOCKER_BUILDKIT=1
docker-compose build ...
If it does not work with docker-compose, make sure your docker-compose version is above 1.25.4:
docker-compose version
I found two ways of dealing with this problem, though neither involves composer volumes anymore.
Speed up the composer download process: use hirak/prestissimo
composer global require "hirak/prestissimo:^0.3"
💡 With Composer 2.0, the above step is no longer required for faster downloads. In fact, it won't install on Composer 2.0 environments.
Force docker to use a cached composer install. Docker reuses the cache for a RUN step if the files added before it didn't change. If you only do COPY . /your-php-app, docker build will invalidate all the caches and re-run composer install even if only one unrelated file in the source tree changed. To make docker build run composer install only when the packages change, add the composer.json and composer.lock files before adding the source files. Since the source files are needed anyway, composer install has to run in a different folder and its output has to be rsynced back into the folder added afterwards; furthermore, the post-install scripts then have to be run manually. It should look something like this (untested):
WORKDIR /tmp/
# copy only the dependency manifests so this layer stays cached until they change
COPY composer.json composer.lock ./
RUN composer install -n -o --no-dev --no-scripts
WORKDIR /your-php-app/
COPY . /your-php-app/
# move the pre-built vendor tree into the application folder
RUN rsync -ah /tmp/* /your-php-app/
RUN composer run-script post-install-cmd
or combine the two =)
I would like to have a universal cache for composer that I could also reshare for other docker projects.
Using a shared volume for the Composer cache works great when working with containers. If you want to go broader than just containers and use a shared cache for e.g. local development as well, I've developed a solution for that called Velocita (see its documentation for how it works).
Basically, you use one global Composer plugin both for local projects and inside build containers. This not only speeds up downloads tremendously, it also helps with third-party outages, for example.
I would consider utilizing the $HOME/.composer/cache/files directory. This is where composer reads and writes when you run composer install.
If you are able to mount it from your host into your container, that would work. Alternatively, you could tar it up after each composer install run and drop it back in before the next composer install.
This is loosely how Travis CI recommends doing it.
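As a sketch of the mount-the-host-cache idea for a containerized install (the official composer image and the COMPOSER_CACHE_DIR variable are real; the mount paths are assumptions):

# reuse the host's Composer cache inside a one-off container run
docker run --rm \
  -v "$HOME/.composer/cache:/composer-cache" \
  -e COMPOSER_CACHE_DIR=/composer-cache \
  -v "$PWD:/app" \
  composer:2 install --prefer-dist --no-dev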
Also, consider using the --prefer-dist flag with your composer install command.
Info on that can be found here: https://getcomposer.org/doc/03-cli.md#install
--prefer-dist: Reverse of --prefer-source, composer will install from dist if possible. This can speed up installs substantially on build servers and other use cases where you typically do not run updates of the vendors. It is also a way to circumvent problems with git if you do not have a proper setup.
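For example, combined with the flags already used in the Dockerfile above (a minimal illustration):

composer install --prefer-dist -n -o --no-dev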
Some references on utilizing the composer cache for you:
https://blog.wyrihaximus.net/2015/07/composer-cache-on-travis/
https://github.com/travis-ci/travis-ci/issues/4579
