Workspace-global bin-scripts in Yarn 2

I have a yarn 2 workspaces project with two workspaces:
|-foo
\-bar
Now, in the root package.json, I pull in common dev-dependencies:
"devDependencies": {
"#rollup/plugin-commonjs": "^14.0.0",
"#rollup/plugin-node-resolve": "^8.4.0",
"#rollup/plugin-replace": "^2.3.3",
"#rollup/plugin-typescript": "^5.0.2",
"#types/jest": "^26.0.13",
"nollup": "^0.13.10",
"rimraf": "^3.0.2",
"rollup": "^2.23.1",
"ts-jest": "^26.3.0",
"tslib": "^2.0.1",
"typescript": "^4.0.2"
}
How can I easily (without too much boilerplate) now reference rollup, etc. from scripts in the package.json of foo and bar?
Example: foo/package.json
"build": "rollup ...",
Writing "../node_modules/.bin/rollup" sucks.
Note, I don't want to install rollup etc globally.

To run an executable installed in the root workspace, you can say:
"build": "run -T rollup",
-T is for --top-level and is not currently documented anywhere. I was informed about its existence on the Yarn Discord server by one of the maintainers.
https://github.com/yarnpkg/berry/blob/bef203949d9acbd42da9b47b2a2dfe3615764eaf/packages/plugin-essentials/sources/commands/run.ts#L47-L49
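For example, a foo/package.json using this could look roughly like the following (the exact flags are placeholders; the assumption is that typescript and rollup are declared only in the root devDependencies):
"scripts": {
"dev:compile": "run -T tsc -b",
"build": "run -T tsc -b && run -T rollup -c rollup.config.js"
}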

So, I found a not-too-bad solution. In the workspace root, I add some executable files for the commands I want to use in my scripts, e.g.:
tsc:
#!/bin/bash
$(dirname "${BASH_SOURCE[0]}")/node_modules/.bin/tsc -b -f "$#"
rollup:
#!/bin/bash
$(dirname "${BASH_SOURCE[0]}")/node_modules/.bin/rollup "$#"
These basically "forward" the call to the actual binaries and may add some common parameters.
In the package.json of foo and bar, I can now reference those scripts:
"dev:compile": "../tsc",
"build": "../tsc && ../rollup -c rollup.config.js",

Related

Suddenly, NPM script variables no longer work

I use package.json variables like this in NPM scripts:
// package.json
{
"version": "0.12.1",
"scripts": {
"get-version": "echo %npm_package_version%"
}
}
npm run get-version currently echoes %npm_package_version% instead of 0.12.1. In the past, these scripts worked without any problems; suddenly only the variable name comes back, and this happens across multiple repositories. I run Windows 10 2004 and Node.js v15.4.0.
Was there a change for NPM scripts in Node.js 15? Is it a bug or a feature?
UPDATE: Failure to expand environment variables on Windows appears to be a recent high-priority known bug in the npm CLI.
Because this is npm@7 specific, until a fix is released, you can downgrade to npm@6.
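For example, something along the lines of:
npm install -g npm@6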
ORIGINAL ANSWER:
The easiest solution for the specific case in this question is to use node.
"get-version": "node -p process.env.npm_package_version"
This will work on every platform that Node.js supports.
If you need a more general solution and don't want to rewrite a bunch of scripts to use node, you can try cross-var as mentioned by @RobC in the comments.
As for the source of the problem, perhaps you are running under the Windows bash shell, in which case you can use this:
"get-version": "echo $npm_package_version"
That won't work for non-bash Windows environments though.
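If you go the cross-var route, a script could look roughly like this (after npm install --save-dev cross-var); cross-var rewrites the $-style variable so it also expands on Windows:
"get-version": "cross-var echo $npm_package_version"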
I found a simple hack which works perfectly in my case.
Specifically, for your use case:
// package.json
{
"version": "0.12.1",
"scripts": {
"get-version": "node -e \"console.log(process.env.npm_package_version)\""
}
}
Usage
npm run get-version
However, if you want to pass arguments:
// package.json
{
"scripts": {
"get-argument": "node -e \"console.log('your argument:', process.argv[1] )\"",
}
}
Test example
npm run get-argument hello_world
Default values are a great way to handle undefined variables: a predefined value is used instead. Inside an npm script we can achieve that by using the following syntax:
{
"version": "0.12.1",
"scripts": {
"get-version": "echo ${npm_package_version:0.99}"
}
}
And of course, running npm from a bash prompt might help. I guess running from Cmd/PowerShell "could work", but I would be careful about that.
FYI - A related change in Version 7 if you are using the Package config variables:
The variable name changed from npm_package_config_customFooVar in V6 to npm_config_customFooVar in V7
Write these appropriately (as below) for the environment being used (Windows, bash, Linux, etc.), or use a lib like cross-var.
package.json:
{
"config": {
"customFooVar": "bar",
"env": "development"
},
"scripts": {
"get-var": "echo using env1 $npm_config_customFooVar OR env2 %npm_config_customFooVar%"
"build": "npm config set myAppName:env"
"postbuild": "cross-var ng build --configuration=$npm_config_env && cross-var node myOtherBuildSript.js $npm_config_env"
}
}
e.g. npm CLI call (note the space after --: what follows it is passed to the script, not to npm itself):
npm run build -- production
This solution allowed me to use the npm_package_version variable in both Windows and Unix:
Install run-script-os as a dev dependency. Then in your package.json the variable can be used:
"scripts": {
...
"postversion": "yarn postversion-wrapper",
"postversion-wrapper": "run-script-os",
"postversion-wrapper:windows": "echo %npm_package_version%",
"postversion-wrapper:nix": "echo $npm_package_version"
}

NG-ANTD nz-tslint-rules migration not working

I was trying to update our production project, which holds 500+ modules, and we certainly need this tool to make it work, because doing so manually would be a nightmare. I've spent the whole afternoon trying to make it work, even copying and pasting your import example, and haven't managed to do so.
Our imports look like the following throughout the whole project:
import {
NzTableModule,
NzCheckboxModule,
NzInputModule,
NzFormModule,
NzSelectModule,
NzDrawerModule,
NzDividerModule,
NzToolTipModule,
NzDatePickerModule,
} from 'ng-zorro-antd';
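As I understand it, the nz-secondary-entry-imports rule is supposed to rewrite these barrel imports into secondary entry-point imports, roughly like this (the exact entry names here are my assumption):
import { NzTableModule } from 'ng-zorro-antd/table';
import { NzCheckboxModule } from 'ng-zorro-antd/checkbox';
import { NzInputModule } from 'ng-zorro-antd/input';
// ...and so on for the remaining modules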
I'm using the following config:
{
"rulesDirectory": [
"nz-tslint-rules"
],
"rules": {
"nz-secondary-entry-imports": true
}
}
package.json:
"ng-zorro-antd": "^9.3.0",
"typescript": "~3.8.3",
"nz-tslint-rules": "^0.901.2",
"#angular/core": "~9.1.12",
I'm executing the following command from the app root:
"tslint --project ."
I've verified that the rule script is actually executed by adding a console log in the nzSecondaryEntryImportsRule.js file.
Also, I've noticed that:
tsutils.isImportDeclaration(node)
always returns false, so it continues to the next iteration of the for loop.
I'd appreciate any help on this.
I guess it's because your global tslint version is too low.
Here are three solutions:
upgrade your global tslint
npm install tslint -g
add the command to the scripts in package.json, and then use npm run lint:fix
{
"scripts": {
"lint:fix": "tslint --project tsconfig.json --fix"
}
}
run from node_modules/.bin/tslint
node_modules/.bin/tslint --project tsconfig.json --fix

@cypress/code-coverage Can't resolve '../self-coverage-helper'

I try to run this in the Cypress index.js:
require('@cypress/code-coverage/task')
Getting this error:
Module not found: Error: Can't resolve '../self-coverage-helper' in 'C:\repo\patientstrength_codecover\node_modules\@cypress\code-coverage\node_modules\nyc'
Totally lost here. My package.json:
"nyc": "^15.1.0",
"cypress": "^5.0.0",
"cypress-istanbul": "^1.3.0",
"cypress-localstorage-commands": "^1.2.2",
"cypress-multi-reporters": "^1.2.4",
"#cypress/code-coverage": "^3.8.1",
"#babel/core": "^7.11.4",
"#babel/plugin-syntax-dynamic-import": "^7.8.3",
"#babel/plugin-syntax-jsx": "^7.10.4",
"#babel/preset-env": "^7.11.0",
"#babel/preset-react": "^7.10.4",
The issue was that we are using a "root" package.json with basic scripts like gulp, jest and well - also we tried to run Cypress from that root. And beside the coverage, it worked fine.
So we have:
/git/root/package.json
/git/root/solution1/package.json
/git/root/solution2/package.json
We solved the issue by simply installing Cypress and all its dependencies first (!) in
/git/root/solution1/package.json and /git/root/solution2/package.json,
NOT in /git/root/package.json.
The /git/root/package.json now only contains a script invoking the two Cypress installations, and later we merge the results. Sure, there is some redundancy.
The invoke script is as follows:
"test:client1": "cd client1 && cd ClientApp && npm run coverage"
So, a very simple approach. What we couldn't solve is the redundancy regarding Cypress configuration and commands; that could still be optimized.
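To make that concrete, the root /git/root/package.json scripts could look roughly like this (the solution folder names and the combined script are only an illustration of the idea, not the exact setup):
"scripts": {
"test:solution1": "cd solution1 && npm run coverage",
"test:solution2": "cd solution2 && npm run coverage",
"test:all": "npm run test:solution1 && npm run test:solution2"
}
Merging the two coverage outputs afterwards can then be done with something like nyc merge, or whatever reporting tooling you already use.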

Attaching a debugger to a parcel built app

I have my project set up as follows; within my package.json I have the following:
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1",
"dev": "parcel ./public/index.html --open",
"build": "parcel build ./public/index.html NODE_ENV=production --no-source-maps --public-url ./public --out-dir ./dist",
"lint": "eslint --ext .js,.vue src --fix"
},
I can start my project by running npm run dev, which starts as follows:
To debug the "dev" script, make sure the $NODE_DEBUG_OPTION string is specified as the first argument for the node command you'd like to debug.
For example:
"scripts": {
"start": "node $NODE_DEBUG_OPTION server.js"
}
> impcentral@1.0.0 dev /Users/william/imp/src/impCentral
> parcel ./public/index.html --open
Server running at http://localhost:63188 - configured port 1234 could not be used.
As you can tell, it does not stop at my breakpoints within WebStorm. I've tried passing $NODE_DEBUG_OPTION within the package.json, but to no avail.
Any ideas folks, open to trying this in Visual Studio Code too.
You don't need to run your npm configuration in the debugger unless you need to debug Parcel itself. As your application, served by Parcel, runs in the browser, you have to use a JavaScript Debug run configuration to debug it.
start your app by running npm run dev (either in WebStorm or in terminal)
create a JavaScript Debug run configuration with your server URL (http://localhost:1234, or http://localhost:63188 in your case):
select this configuration and press Debug
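For Visual Studio Code, the rough equivalent is a browser launch configuration in .vscode/launch.json, something like the following (the URL, name, and webRoot here are assumptions and need to match your actual dev server port and project layout):
{
"version": "0.2.0",
"configurations": [
{
// assumption: Parcel serves the app on port 1234; adjust to the port it actually reports
"type": "chrome",
"request": "launch",
"name": "Debug Parcel app",
"url": "http://localhost:1234",
"webRoot": "${workspaceFolder}"
}
]
}
Start the app with npm run dev first, then launch this configuration and set breakpoints in the editor.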

Can't get webpack hot reload with create-react-app and Docker on Windows

We are going to develop a React PWA with a Docker setup and publish the app on GitLab Pages during deployment to the master branch.
I work on a Windows device and can't get hot reloading to work in dev mode. Whenever I make a change, the code isn't recompiled; I have to run docker-compose up --build for every single change.
Is there any possible way to get hot reloading working on a Windows/Docker/create-react-app setup?
Here is the package.json:
{
"name": "Appname",
"version": "0.1.0",
"private": true,
"dependencies": {
"react": "^16.7.0",
"react-dom": "^16.7.0",
"react-scripts": "2.1.1"
},
"scripts": {
"start": "react-scripts start",
"build": "react-scripts build",
"buildandserver": "react-scripts build && serve -s build",
"test": "react-scripts test",
"eject": "react-scripts eject"
},
"eslintConfig": {
"extends": "react-app"
},
"browserslist": [
">0.2%",
"not dead",
"not ie <= 11",
"not op_mini all"
]
}
Now the Dockerfile for Dev-Setup:
FROM node:9.6.1
# set working directory
RUN mkdir /usr/src/app
WORKDIR /usr/src/app
# add `/usr/src/app/node_modules/.bin` to $PATH
ENV PATH /usr/src/app/node_modules/.bin:$PATH
# install and cache app dependencies
COPY package.json /usr/src/app/package.json
RUN npm install
RUN npm install react-scripts@1.1.1 -g
# start app
CMD ["npm", "start"]
And finally the docker-compose for the dev setup:
version: '3.5'
services:
App-Name:
container_name: App-Name
build:
context: .
dockerfile: devsetup/Dockerfile
volumes:
- './:/usr/src/app'
- '/usr/src/app/node_modules'
ports:
- '3000:3000'
environment:
- NODE_ENV=development
I'm running Docker for Windows on the device.
I hope someone can help me out here... Thanks!
The problem is mainly caused by the fact you're on Windows.
Why?
Because Docker on Windows does not work well with volumes. To be more precise, it does not notify the container about volume changes. It exposes up-to-date files in the container, but the Linux inside the container "doesn't know" that a file has been changed, which is what triggers the webpack recompilation.
There are a few solutions:
1. Switch to Linux for development (I know it may not be possible, but if you work with Docker a lot and are able to move, do it. Linux containers on Linux work much faster, and there are no issues with volumes, etc.)
2. If you can't, you can use legacy polling in webpack, which is already mentioned in the comments (one way to enable it with create-react-app is sketched at the end of this answer)
3. You can use e.g. https://github.com/merofeev/docker-windows-volume-watcher, a Python-based tool which watches your local files and the container files inside the volumes and notifies the container about changes
I found option 3 working a bit better than option 2, but both come with some performance sacrifice.
I hope it helps. If you have any questions, just comment and I will try to edit the answer to explain better.
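For option 2, one common way to enable polling with create-react-app is the CHOKIDAR_USEPOLLING environment variable, e.g. added to the environment section of your docker-compose service (just a sketch of the relevant part, assuming react-scripts is what runs the dev server):
environment:
- NODE_ENV=development
# makes the react-scripts file watcher poll instead of waiting for change notifications
- CHOKIDAR_USEPOLLING=true
This makes the watcher poll the filesystem, at the cost of some extra CPU.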
