How to install yarn workspace packages without symlinks?

I have a yarn workspaces project which looks something like this:
node_modules
packages
  shared
    test.js
    package.json
  client
    test.js
    package.json
  server
    test.js
    package.json
package.json
server.Dockerfile
As you can see, I have a server.Dockerfile, which builds an image of the server that I can push up to different hosting providers such as Heroku or AWS.
I copy packages and package.json into this container:
COPY packages packages
COPY package.json .
And I then install only the dependencies for the server package:
RUN cd packages/server && yarn install
All the dependencies are now in the node_modules folder. The next step is to delete the packages folder, to remove any unnecessary code (e.g. the client code) from the Docker image:
RUN rm -rf packages
The problem with this is that all the yarn workspace packages inside the node_modules folder are simply symlinks to the packages folder... so I cannot delete that folder.
How do I get yarn install to make a copy of the yarn workspace packages instead of creating symlinks?
Or, is there another way to remove all of the unused code (e.g. the client code) so that my docker image isn't bloated?

You can use yarn-workspace-isolator to extract a package together with its local dependencies, so you don't have to publish them to npm if you don't want to.
isolate-workspace -w my-package -o ~/dist/my-package
Now, as the docs say:
You can simply run yarn install inside of ~/dist/my-package and yarn will install all dependencies as if you had not used workspaces at all, without having to publish any workspace dependency.
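Combined with the Dockerfile from the question, that could look roughly like this (run on the host before docker build; the workspace name and paths are illustrative, not from the tool's docs):
isolate-workspace -w server -o ./dist/server
Then server.Dockerfile only needs the isolated output, which contains no symlinks back into packages:
COPY dist/server .
RUN yarn install --production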

Running yarn install in a workspaces project does the same thing whether you run it inside a package or in the root directory: it installs the modules for every package and symlinks the local ones.
If you want to build a Docker image for just the server, you should copy only that package into the container and install it as an independent package.
If the server has a dependency on the shared lib, you could publish the shared lib to npm so the server can fetch it too.
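A minimal sketch of that approach, assuming shared has been published to npm and is listed as a regular dependency in packages/server/package.json (the base image and entry point are illustrative):
FROM node:18
WORKDIR /app
# Copy only the server package; the rest of the workspace never enters the image
COPY packages/server/package.json ./
RUN yarn install --production
COPY packages/server/ ./
CMD ["node", "test.js"]
Because the install runs against a standalone package.json, node_modules contains real copies rather than workspace symlinks, and there is no packages folder to clean up afterwards.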

Related

Azure pipelines working directory pointing to two different places

Hi everyone, I have a question related to the Pipeline.Workspace variable.
In the example below I have set Pipeline.Workspace as the working directory and in the paths for a CLI command.
npm install creates folders under /home/vsts/work/node_modules/,
while the next command, where I use Pipeline.Workspace, points to ./home/vsts/work/1/.
Am I doing something wrong, or is something off?
- task: Bash@3
  displayName: 'Publish Sentry'
  inputs:
    targetType: 'inline'
    script: |
      npm install @sentry/cli
      .$(Pipeline.Workspace)/node_modules/.bin/sentry-cli releases --org --project new "$(Build.BuildNumber)" --finalize
    workingDirectory: "$(Pipeline.Workspace)"
The node_modules folder under /home/vsts/work/ is pre-generated and may contain some global packages; it is not generated by the npm install command in your script. This situation exists on Microsoft-hosted Ubuntu agents and Microsoft-hosted macOS agents.
When executing the npm install command to install packages locally, there are a few points to pay attention to (a small reproduction follows the examples below):
If no node_modules folder exists in the current working directory, and none exists in any parent directory either, npm install creates a node_modules folder in the current working directory and installs the packages there.
If no node_modules folder exists in the current working directory but one does exist in a parent directory, npm install installs the packages into the existing node_modules folder in the closest parent directory.
For example, given the following paths:
/root/dir1/node_modules
/root/dir1/dir2/node_modules
/root/dir1/dir2/dir3
When executing the npm install command in the directory "/root/dir1/dir2/dir3", the packages will be installed into "/root/dir1/dir2/node_modules".
If a node_modules folder exists in the current working directory, npm install installs the packages into it, regardless of whether one also exists in a parent directory.
For example, given the following paths:
/root/dir1/node_modules
/root/dir1/dir2/node_modules
/root/dir1/dir2/dir3/node_modules
When executing the npm install command in the directory "/root/dir1/dir2/dir3", the packages will be installed into "/root/dir1/dir2/dir3/node_modules".
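A small shell reproduction of the lookup described above (paths are hypothetical and the package name is arbitrary):
mkdir -p /tmp/dir1/node_modules /tmp/dir1/dir2/dir3
cd /tmp/dir1/dir2/dir3
npm install lodash   # no node_modules here, so it lands in /tmp/dir1/node_modules
mkdir node_modules
npm install lodash   # now it lands in /tmp/dir1/dir2/dir3/node_modules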

yarn berry: how to run installed packages

I see that with yarn berry I get the Plug'n'Play feature instead of node_modules/,
but I couldn't find anything to suggest it supports running installed packages.
For example with npm a workflow might be to run the installed version of webpack:
$ npm install --save-dev webpack
$ node node_modules/webpack/bin/webpack ...
A globally installed webpack might not be the same version. Worse yet, during Docker deployment I only get what's installed locally; only node and npm are available globally. I thought I could use a preinstall script that does npm install -g yarn; yarn set version berry, but then I'm not sure how to handle webpack, jest, babel, etc., and the thought of having to install them all globally during the same preinstall workaround seems like several steps backwards.
Is there some way to run from locally-installed packages that I'm missing?
I saw this possibly related question - Yarn Berry - Run a Node Script Directly
But the answer there seems a bit off the point - I'm not running any js, I'm trying to type in a package.json script, i.e. something that can run from the shell.
Why not just use yarn run <bin> (or simply yarn <bin>)? If you are in a repository set up to use yarn berry, that will run any package's bin file.
yarn node <file> will run any .js file with Plug'n'Play set up. There is no need to install those dependencies globally, except maybe yarn classic itself.
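For example, with webpack installed as a devDependency (the package and file names are just illustrations):
yarn webpack --config webpack.config.js   # runs the locally installed bin via PnP
yarn node scripts/build.js                # runs a plain .js file with PnP resolution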
I was trying to do yarn some-bin and kept getting:
Couldn't find a script named "some-bin".
I eventually figured out it was because the package that provides some-bin is installed inside a workspace and not at the root of my project. So instead I had to run:
yarn workspace my-workspace some-bin
And that worked.

Yarn install has been replaced with `add`

On my Windows system I can run yarn install with no issue in my project, but during my Azure build, which runs on Ubuntu 16.04, I get the following message:
error: install has been replaced with add to add new dependencies. Run "yarn add yarn build" instead.
Doing a yarn add gives this message:
error: Running this command will add the dependency to the workspace root rather than the workspace itself, which might not be what you want - if you really meant it, make it explicit by running this command again with the -W flag (or --ignore-workspace-root-check).
In my project I have multiple applications, each with its own package.json file. If I'm reading the message correctly, yarn add will add all the dependencies to the root package.json and not to the directories where the individual package.json files are located.
So how do I install the packages per directory/package.json file using yarn?
Initially I added yarn add --cwd apps/<foldername>/<foldername> to the build script; you can do this for multiple folders to trigger the different builds. But just running yarn from the root also resolved all the different builds.
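For reference, both variants look like this (the app path is a placeholder):
yarn --cwd apps/<foldername>/<foldername> install   # install one app's package.json
yarn install                                        # from the root: resolves every workspace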

Yarn install triggers all scripts in my package.json, is this normal?

The documentation doesn't mention this specific behavior:
yarn install is used to install all dependencies for a project. This is most commonly used when you have just checked out code for a project, or when another developer on the project has added a new dependency that you need to pick up.
yarn install: installs all the dependencies listed within package.json into the local node_modules folder.
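If the scripts being run are lifecycle scripts (preinstall, install, postinstall, prepare), that is expected: yarn executes them as part of every install, the same way npm does. With yarn classic you can skip them:
yarn install --ignore-scripts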

Gulp didn't install on Laravel Homestead VM (from Windows) and no node_modules folder exists in the directory with package.json?

I got Laravel Homestead up and running, except that when I issue this command over SSH:
gulp
I get this error:
Local gulp not found in ~/projects/laravel
Try running: npm install gulp
That's when I noticed there was no node_modules folder at all in this directory. Weird. Is this an issue where the paths were too long for Windows when I did a vagrant up? The host machine for this VM is Windows and I'm sharing folders with the VM (actually I'm not; I'm using phpStorm's sync so that pages load faster on the VM), so when I do an npm install am I still going to encounter the problem? Hmm... I guess Taylor Otwell is using a Mac for development. Does anybody have a solution to this?
You should first install node on your local machine.
Then navigate to your project folder and delete the node_modules directory.
Run this on your local machine inside the project directory:
npm install gulp --save-dev
Solved it!
Yes, it appears that some of the paths from these package installs are too long for Windows, which means you can only install the gulp package after your VM is up. Here's what I did (including a step for phpStorm, since it keeps server pages loading faster on the VM); a condensed shell version follows the steps:
1. SSH into the VM and create a folder called node_modules in your project's directory (the directory where package.json is).
2. In phpStorm, go to File > Settings > Build, Execution, Deployment > Deployment and click on the Excluded Paths tab. Click the Add deployment path button and add the node_modules folder from step 1.
3. SSH back into the VM, and in your project's folder (the directory with package.json) run npm install. This installs all packages listed in your package.json file locally, into the node_modules folder from step 1.
4. Now run the gulp command: gulp
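Condensed, the steps inside the VM look like this (the project path is an example):
cd ~/projects/laravel
mkdir node_modules      # created in the VM so it stays out of the Windows share/sync
npm install             # installs everything from package.json into it
gulp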
UPDATE:
If you really prefer file-sharing through a mounted folder, I've created a Gist that walks through all the challenges I faced and had to resolve, including how to successfully run npm install in the guest environment:
Laravel Homestead for Windows (includes fixes)
