By default, the node_modules folder sits under the root directory of my Laravel project and everything works fine: I can compile the scripts with npm run dev without errors.
However, to match the project's folder structure, I moved the node_modules folder into a child folder called backend.
I also moved files like webpack.mix.js and package.json into the backend folder and ran npm install again inside it. But I kept my resources and public folders in their original locations and link them with relative paths from the backend folder.
The folder structure looks like this:
Now, if I run npm run dev inside the backend folder, it fails with many errors like can't resolve '@babel/runtime/regenerator'.
But if I create a symbolic link named node_modules inside the root folder that points to backend/node_modules, it works fine and I can compile the scripts without errors.
My question is: how can I compile the scripts from the child folder without creating a symbolic link like this?
The build probably doesn't know where the node_modules folder is located.
Since laravel-mix is based on webpack, I added the modules path to the webpack config as below, so every import knows where the node_modules folder is located.
const path = require("path");

mix.webpackConfig({
    resolve: {
        modules: [
            path.resolve("./node_modules")
        ]
    }
});
After that, there are no more can't resolve errors.
I have a web app using a NuxtJS frontend being served from a Golang web micro-service.
I am using go:embed dist to ship the frontend assets into the web binary and running it all in a Docker container. For non-Gophers, go:embed is a directive that bundles files or directories into the Go binary. It's a compile error to use go:embed on a directory that doesn't exist.
Usually, you don't commit the dist directory to VCS, but if I don't, then all my CI builds fail because I cannot compile the web service since dist doesn't exist.
I tried adding a .gitignore inside the dist folder like this:
*
!.gitignore
I was hoping this would commit the empty dist folder to VCS, but still not commit any of the assets. This worked fine, until I ran the NuxtJS build and it deleted the .gitignore. I assume it deletes the whole directory and recreates it.
Does anybody know whether there is a configuration option to keep the dist folder around between builds, or at least to not delete the .gitignore within it?
I configured two workspaces in package.json, e.g. example and gatsby-theme, but later I realized I was actually developing a gatsby-starter, so I removed the example workspace (which depended on the gatsby-theme workspace) from package.json.
I wonder: if I moved all files from gatsby-theme/ to the project root directory and overwrote package.json and the other files with gatsby-theme's, would it become a project that could be managed with both npm and yarn?
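For context, the two-workspace setup described above would correspond to a root package.json roughly like this (workspace names taken from the question; both yarn and recent npm read the workspaces field):

```
{
  "private": true,
  "workspaces": [
    "example",
    "gatsby-theme"
  ]
}
```

After removing the example entry and hoisting gatsby-theme's files to the root, the workspaces field would go away entirely and the root package.json would simply be the theme's own.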
Here is my problem: I built a Dockerfile that runs yarn install in a folder containing a package.json and a yarn.lock (both taken from the project I have to set up yarn dependencies for; that project lives on a disconnected server).
Then I ran bash in the container image, retrieved the generated node_modules folder, and copied it onto the disconnected server, into the project's root folder.
But when I launch yarn start from the root folder, it says it cannot find rescripts, despite the fact that the @rescripts folder is present in node_modules.
I tried NODE_PATH=./node_modules yarn start without any success.
Thanks a lot for your help.
Regards
I think I got what I want with:
https://yarnpkg.com/blog/2016/11/24/offline-mirror/
I created a cache folder containing all the downloaded tar.gz dependencies.
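For reference, the offline mirror from that blog post is enabled through a .yarnrc file in the project; a minimal sketch, assuming the cache folder name used in the command below:

```
yarn-offline-mirror "./npm-packages-offline-cache"
yarn-offline-mirror-pruning true
```

With this in place, every yarn add/install also drops the package tarballs into that folder, which is what makes a later --offline install possible.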
But if I remove node_modules and yarn.lock and run yarn install --offline --cache-folder npm-packages-offline-cache/, I get an error saying it could not find the proper dependency in the cache folder. It's as if it cannot recognize any of the tar.gz files inside. Any help will be welcome.
Regards
I've been trying to convert one of our node.js apps into a Lambda function and have been having problems with the node_modules dependencies: Lambda says it can't find certain modules. I started by creating a package.json and running npm install locally, then copying the node_modules folder up to Lambda.
For instance, I have a project that requires sequelize and convict, and I've been getting errors saying it cannot find the moment module as a sub-dependency. I can see that moment is included in the root of my node_modules, but it was not included in the subfolder under the sequelize module.
However, this project runs fine locally. What is different in Lambda, and what's the best practice for deploying a somewhat long list of node modules with it: just a copy of the node_modules folder? On some of my other, simpler projects, the small number of node_modules can be copied up with no problems.
{
    "errorMessage": "Cannot find module 'moment'",
    "errorType": "Error",
    "stackTrace": [
        "Function.Module._resolveFilename (module.js:338:15)",
        "Function.Module._load (module.js:280:25)",
        "Module.require (module.js:364:17)",
        "require (module.js:380:17)",
        "VERSION (/var/task/node_modules/sequelize/node_modules/moment-timezone/moment-timezone.js:14:28)",
        "Object.<anonymous> (/var/task/node_modules/sequelize/node_modules/moment-timezone/moment-timezone.js:18:2)",
        "Module._compile (module.js:456:26)",
        "Object.Module._extensions..js (module.js:474:10)",
        "Module.load (module.js:356:32)",
        "Function.Module._load (module.js:312:12)"
    ]
}
I resolved this by uploading a zip file containing all the data my lambda function needs.
You can just create your project on your local machine, make all the changes you need, and zip it up; the folder you zip should have this same structure. Note that the Lambda console has an option to load your code from a zip file.
This sounds to me like an issue caused by different versions of npm. Are you running the same version of node.js locally as is used by Lambda (i.e. v0.10.36)?
Depending on the version of npm you're using to install the modules locally, the contents of the node_modules directory are laid out slightly differently (mainly in order to de-duplicate things), and that may be why your dependencies can't find their own dependencies in Lambda.
After a bit of digging, it sounds like a clean install (i.e. rm your node_modules directory and run npm install again) might clean things up for you. The reason is that npm doesn't seem to install sub-dependencies if they're already present at the top level (e.g. if you installed moment before sequelize).
I would like to take all the contents of the 'files' folder inside my 'src' folder (js, css, image files, etc.) and add them to a folder in the out directory called, say, v001.
I could then version all my static assets on my Amazon S3 storage and only invalidate my html files when there are changes (and archive the previous version of the assets to save some space). I'd obviously add some logic to the html document templates to pick the correct version of the assets.
Here's my src directory structure:
-src
-files
-fonts
-img
-scripts
-styles
-documents
index.html.eco
-layouts
default.html.eco
And I want my out folder to look like this:
-out
-v001
-fonts
-img
-scripts
-styles
index.html
I'm not sure how to gather the contents of my src 'files' folder and put them in the out directory in a folder called v001. Could anyone help please?
I would integrate grunt or gulp for such a solution.
You could either use the docpad plugins for grunt or gulp, or the tools themselves.
For example, there is a grunt plugin for copying files and folders: https://github.com/gruntjs/grunt-contrib-copy
I'm curious how you would do the invalidation, though.
=====
Update. Here is what I just did and what I recommend to you.
Install gulp in your docpad project: https://github.com/gulpjs/gulp/
Install gulp docpad plugin: https://github.com/terminalpixel/docpad-plugin-gulp
Then install a suitable gulp plugin that will help you copy the files into the proper new folder.
As you can see from the gulp docpad plugin's configuration, you can hook this copy step into the whole docpad build process. Super awesome.