When I deploy a Laravel 9 project to production, the app throws:
Spatie\LaravelIgnition\Exceptions\ViewException: Vite manifest not found at: /var/www/.../public/build/manifest.json in file /var/www/.../vendor/laravel/framework/src/Illuminate/Foundation/Vite.php on line 139
It turns out the files in the public/build folder are not committed to the git repository, so they are missing on the production server.
Should I:
Install npm on the production server and run npm run build to generate the manifest files, or
Include the manifest files (e.g. manifest.json) of the public/build folder in my repository and pull them on the production server ...
On Heroku you can add buildpacks (scripts that are run when your app is deployed; they install dependencies for your app and configure your environment), which will allow you to run npm. So on Heroku it's straightforward.
But you may happen to be on Fortrabbit, where you can't run npm or vite over SSH. There, the simplest way is to build your assets locally (npm run build or vite build) and push them to production.
Make sure you comment out the public/build entry in .gitignore before pushing to production. This should work on most servers, including Heroku, without adding buildpacks.
Should this fail, make sure your APP_ENV is set to production (APP_ENV=production), or anything other than local, as the Vite documentation states.
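Put together, the local-build workflow looks roughly like this. The npm and git lines are the workflow itself; the .gitignore edit is demonstrated on a sample file in a temp directory so the effect of the sed command is visible (paths assume a standard Laravel layout):

```shell
# Real workflow: npm run build (or: npx vite build), then commit public/build.
# Below, the un-ignore step is shown on a sample .gitignore for illustration.
cd "$(mktemp -d)"
printf '/node_modules\n/public/build\n' > .gitignore
# Comment out the build-folder rule so git stops ignoring it:
sed -i 's|^/public/build$|# /public/build|' .gitignore
cat .gitignore
# then: git add public/build .gitignore && git commit -m "ship built assets" && git push
```

After this, the production server only needs a git pull; no npm or vite required there.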
There are multiple projects inside our repo.
E.g.:
MainProject/
SubProject_1/
SubProject_2/
After I installed Cypress, a Cypress folder was created for each project:
MainProject/
Cypress
SubProject_1/
Cypress
SubProject_2/
Cypress
Now in my package.json file I've got:
"scripts": {
  "cypress:open": "cypress open"
}
When I run npm run cypress:open, it opens the UI for the root directory, which is:
MainProject/
Cypress
If I want to open Cypress for a different folder, as below, how should I modify the script?
SubProject_1/
Cypress
Please note that I've got Cypress v10.
When I run cypress open --project ./SubProject_1/Cypress, it just created a folder /SubProject_1/Cypress.
Thanks
I think you want to specify the config file for the particular project; see Specifying an Alternative Config File:
"open:sub1": "cypress open --config-file SubProject_1/cypress.config.js"
Each project config would specify the folders relative to that project.
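For completeness, a minimal sketch of what such a per-project config might contain. The specPattern and supportFile paths are assumptions based on the Cypress folder layout shown in the question; adjust them to your actual file names:

```javascript
// SubProject_1/cypress.config.js (Cypress 10+ format)
const { defineConfig } = require('cypress')

module.exports = defineConfig({
  e2e: {
    // Paths here resolve relative to this project's folder
    specPattern: 'Cypress/e2e/**/*.cy.js',
    supportFile: 'Cypress/support/e2e.js',
  },
})
```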
I've recently started using lerna to manage a monorepo, and in development it works fine.
Lerna creates symlinks between my various packages, and so tools like 'tsc --watch' or nodemon work fine for detecting changes in the other packages.
But I've run into a problem with creating docker images in this environment.
Let's say we have a project with this structure:
root
packages
common → artifact is a private npm package, this depends on utilities, something-specific
utilities → artifact is a public npm package
something-specific → artifact is a public npm package
frontend → artifact is a docker image, depends on common
backend → artifact is a docker image, depends on common and utilities
In this scenario, in development, everything is fine. I'm running some kind of live reload server and the symlinks work such that the dependencies are working.
Now let's say I want to create a docker image from backend.
I'll walk through some scenarios:
I ADD package.json in my Dockerfile, and then run npm install.
Doesn't work, as the common and utilities packages are not published.
I run my build command in backend, then ADD /build and /node_modules in the Dockerfile.
Doesn't work, as my built backend has require('common') and require('utilities') commands, these are in node_modules (symlinked), but Docker will just ignore these symlinked folders.
Workaround: using cp --dereference to 'unsymlink' the node modules works. See this AskUbuntu question.
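The dereference trick can be sketched in isolation (the directory and package names here are made up for the demo):

```shell
# Demo: cp --dereference turns a symlinked package into a real copy,
# the same trick used to "unsymlink" node_modules before a docker build.
cd "$(mktemp -d)"
mkdir -p pkg node_modules
echo 'module.exports = 1' > pkg/index.js
ln -s ../pkg node_modules/common            # what lerna's symlinking looks like
cp -r --dereference node_modules flat_modules
[ -L flat_modules/common ] && echo "still a symlink" || echo "real copy"
```

Copying the dereferenced tree into the Docker build context sidesteps the symlink problem entirely.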
Step 1, but before I build my docker image, I publish the npm packages.
This works OK, but for someone who is checking out the code base and making a modification to common or utilities, it's not going to work, as they don't have privileges to publish the npm packages.
I configure the build command of backend to not treat common or utilities as an external, and common to not treat something-specific as an external.
I then build something-specific first, then common, then utilities, and then backend.
This way, when the build occurs, using this technique with webpack, the bundle will include all of the code from something-specific, common and utilities.
But this is cumbersome to manage.
It seems like quite a simple problem I'm trying to solve here. The code that is currently working on my machine, I want to pull out and put into a docker container.
Remember the key thing we want to achieve here, is for someone to be able to check out the code base, modify any of the packages, and then build a docker image, all from their development environment.
Is there an obvious lerna technique that I'm missing here, or otherwise a devops frame of reference I can use to think about solving this problem?
We had a similar issue and here is what we did: put the Dockerfile in the root of the monorepo (where the lerna.json is located).
The reason: You really treat the whole repo as a single source of truth, and you want any modification to the whole repo to be reflected in the docker image, so it makes less sense to have separate Dockerfiles for individual packages.
Dockerfile
FROM node:12.13.0
SHELL ["/bin/bash", "-c"]
RUN mkdir -p /app
WORKDIR /app
# Install app dependencies
COPY package.json /app/package.json
COPY yarn.lock /app/yarn.lock
COPY packages/frontend/package.json /app/packages/frontend/package.json
COPY packages/backend/package.json /app/packages/backend/package.json
COPY lerna.json /app/lerna.json
RUN ["/bin/bash", "-c", "yarn install"]
# Bundle app source
COPY . /app
RUN ["/bin/bash", "-c", "yarn bootstrap"]
RUN ["/bin/bash", "-c", "yarn build"]
EXPOSE 3000
CMD [ "yarn", "start" ]
package.json
{
  "private": true,
  "workspaces": [
    "packages/*"
  ],
  "scripts": {
    "bootstrap": "lerna clean --yes && lerna bootstrap",
    "build": "lerna run build --stream",
    "start": "cross-env NODE_ENV=production node dist/backend/main"
  },
  "devDependencies": {
    "lerna": "^3.19.0",
    "cross-env": "^6.0.3"
  }
}
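One addition worth making (not shown in the answer above, so treat it as an assumption): a .dockerignore next to the Dockerfile, so that COPY . /app doesn't drag the host's node_modules (with its symlinks) or stale build output into the image:

```
# .dockerignore — keep host installs and build output out of the context
**/node_modules
**/dist
.git
```

With both files in the root, the image is then built from the monorepo root, e.g. docker build -t myapp . (tag name is illustrative).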
Late to the party, but my approach is using webpack in conjunction with webpack-node-externals and generate-package-json-webpack-plugin; see npmjs.com/package/generate-package-json-webpack-plugin.
With node externals, we can bundle all the dependencies from our other workspaces (libs) into the app (this makes a private npm registry unnecessary). With the generate-package-json plugin, a new package.json is created containing all dependencies except our workspace dependencies. With this package.json next to the bundle, we can run npm install or yarn install in the Dockerfile.
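A rough sketch of that webpack setup. Option names vary between plugin versions (older webpack-node-externals releases call allowlist "whitelist"), so treat this as illustrative and check each plugin's README; the @myorg scope is a made-up placeholder for your workspace packages:

```javascript
// webpack.config.js — bundle workspace libs, externalize everything else
const nodeExternals = require('webpack-node-externals')

module.exports = {
  target: 'node',
  entry: './src/main.js',
  externals: [
    nodeExternals({
      // Bundle our own workspace packages instead of leaving them external
      allowlist: [/^@myorg\//],
    }),
  ],
  // generate-package-json-webpack-plugin goes in `plugins` here, to emit
  // a package.json listing only the remaining (external) dependencies
}
```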
Using Vue's default scripts:
"scripts": {
"serve": "vue-cli-service serve",
"build": "vue-cli-service build"
},
When I run "npm run build" it produces the production build in the "dist" directory, however at the end it says:
Images and other types of assets were omitted.
I honestly don't understand what to do to include them. I don't want to make a specific folder for images and upload it so my web server can serve it; Vue should handle the files in its src/assets folder itself.
So far I have found a solution while googling which says to include:
NODE_ENV = PRODUCTION
But it doesn't work either.
Any clues how to get this fixed? I cannot launch a website without including its logo.
I believe that message just means that they are omitted from the file listing displayed in the console during the build.
Nope, they are omitted from the build. My page is "broken" if I use npm run build and then serve it through Django.
Running npm run serve will produce the right files in Django's static folders, and then serving it from Django (with the npm server stopped) will work.
I'm trying to run a gulp watch when running my Java web-app with Gretty.
I'm using this plugin to run gulp from Gradle.
For the moment, I'm just able to run a single gulp build before the app runs, by doing this: appRun.dependsOn gulp_build
What I would like is that when I run the app, gulp watch also starts (the gulp_default task in Gradle in my case), so that SCSS files are automatically compiled when I save them, without having to restart the app.
I can't just do appRun.dependsOn gulp_default, because gulp_default never returns, so the Gradle task never gets to execute appRun.
Any idea how I can do this?
I found a way, but by using npm to start the app and not gradle.
I used the concurrently package. I start the app with npm start instead of gradle appRun, and I added this to my package.json:
"scripts": {
"start": "concurrently \"gradle appRun\" \"gulp watch\""
}
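Conceptually, concurrently is doing what shell job control does here; a toy demo with echo standing in for the real gradle and gulp commands:

```shell
# Toy illustration of running two commands side by side, which is what
# concurrently "gradle appRun" "gulp watch" does for the real tasks
# (output order of the two lines is not guaranteed).
echo "app starting" & echo "watcher starting" & wait
```

The advantage of concurrently over bare `&` is that it prefixes and colors each command's output and can kill both when one exits.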