Script name collision in terminal - macOS

I just installed
npm install -g angular-cli
And attempted to initialize a new project
ng new foo
Unfortunately I already have nailgun installed, which also uses the ng command name.
How can I set which has precedence? Better yet, is there an easy way to rename one of them?

The first one found in your PATH is the one that gets called, so re-ordering your PATH is a first option. When you need nailgun, swap the order of the nailgun and global npm module directories in your PATH.
You could also rename ng in the .bin directory of your global install location, but this could have side effects if other Angular tools expect to be able to call ng.
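As a rough sketch (exact paths vary with how npm and nailgun were installed), you can check which ng is winning and then either re-order your PATH or alias one of the tools:
$ which -a ng                            # every ng on your PATH, in resolution order
$ npm prefix -g                          # npm's global prefix; its bin/ subdirectory holds the Angular CLI's ng
# In ~/.zshrc or ~/.bash_profile, list the directory you want to win first:
export PATH="$(npm prefix -g)/bin:$PATH"
# Or sidestep the collision with an alias (ngc is just an invented name):
alias ngc="$(npm prefix -g)/bin/ng"      # call the Angular CLI as ngc, leave ng for nailgun
If both tools end up linked into the same directory, the alias approach is the simpler of the two.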

Related

How to uninstall everything from $GOPATH/bin?

The go clean -i command, run inside a project, deletes the executable of that particular project that was previously installed by go install. How do I delete everything installed by go install commands run from several different projects? Is there a single go command that can do that?
TL;DR
Delete the binary like any other file.
The "install" term means place (something) in a new position ready for use.
Therefore, Go builds a single-file binary and places it in another directory ($GOPATH/bin). It is useful when you add the Go binary directory into the environment variable to call the program.
There's no auxiliary flag such as go clean -bincache to remove all binaries installed by Go 1.16.4.
However, at the current version of GoLang (1.16.4), the right way to remove (or "uninstall" as you said) any installed binary is solely to delete it, like any other file despite you feel it sounds awkward.
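A minimal sketch of that, assuming the default layout (mytool is just a placeholder name):
$ rm "$(go env GOPATH)/bin/mytool"      # remove a single installed binary
$ rm -rf "$(go env GOPATH)/bin/"*       # or remove everything go install has placed there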

Do NPM packages have to be installed globally to access their functionality via the command line?

I am having trouble understanding how the -g flag works in NPM. Specifically I'm struggling to understand how it relates to command-line functionality exposed by NPM modules.
I assumed that the difference between installing a package locally and globally was simply that a local package would not be available outside of the particular project. And of course that a globally installed package would be available in any project. I'm from a Rails background so this for me would be similar to installing a gem into a particular RVM versus installing it into the global RVM. It would simply affect which places it was available.
However there seems to be more significance than just scope in NPM. For packages that have command-line functionality, like wait-on, the package (as far as I can tell) is not available on the command line unless it's installed globally.
Local install doesn't make the command-line functionality available:
$ npm install wait-on
$ wait-on
=> -bash: /usr/local/bin/wait-on: No such file or directory
Global install does expose the command-line functionality
$ npm install wait-on -g
$ wait-on
=> Usage: wait-on {OPTIONS} resource [...resource]
Description:
wait-on is a command line utility which will wait for files, ports,
sockets, and http(s) resources to become available (or not available
using reverse flag). Exits with success code (0) when all resources
are ready. Non-zero exit code if interrupted or timed out.
Options may also be specified in a config file (js or json). For
example --config configFile.js would result in configFile.js being
required and the resulting object will be merged with any
Can you expose the command-line functionality using a local install?
Is it possible to install locally but also get the command line functionality? This would be very helpful for my CI setup as it's far easier to cache local modules than global modules so where possible I'd prefer to install locally.
If you are using npm 5.2.0 or later, the npx command is included by default. It will let you run binaries from the local node_modules: npx wait-on
For reference: https://www.npmjs.com/package/npx
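For example, with only a local install (the URL is just an illustration):
$ npm install wait-on                    # local install; the binary lands in ./node_modules/.bin
$ npx wait-on http://localhost:3000      # npx resolves wait-on from the local .bin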
I think you can access locally installed modules from the command line only if you add them to the "scripts" section of your package.json. To use the locally installed version of wait-on, add an entry to the "scripts" section like "wait-on": "wait-on", then run it with npm run wait-on. You can also use a shorter alias, e.g. "wo": "wait-on", and run npm run wo; whatever comes after run is the script name. Inside node_modules there is a .bin folder, and it contains all the executables you can access this way.
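A sketch of what that package.json could look like (the "wo" alias and the URL below are just example names):
{
  "scripts": {
    "wait-on": "wait-on",
    "wo": "wait-on"
  }
}
Running npm run wo -- http://localhost:3000 then executes the local binary; the -- passes the remaining arguments through to wait-on.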
Installing locally makes the package available to the current project (which stores all of its node modules in node_modules). This is usually only good for using a module in code, e.g. var module = require('module');, or importing it.
It will not be available as a command that the shell can resolve until you install it globally with npm install -g module, where npm puts it in a place your PATH variable can resolve.
You can find a pretty decent explanation here.
It is also useful to put commands in the scripts block of package.json, as npm run automatically resolves local commands. That means you can have a script that depends on a package without having an undocumented dependency on it.
If you need to run it locally from the command line without a script entry, you have to run it by its path inside node_modules.
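Concretely, the local binaries live in ./node_modules/.bin, so (with the URL again just as an illustration):
$ ./node_modules/.bin/wait-on http://localhost:3000    # run the locally installed CLI by path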

Package nodejs application with global packages

We have a project which has to be packaged as a zip so we can distribute it to our clients. With the normal node_modules directory I have no problems: I just put the directory and node.exe together in my project folder and can start our project on every other computer without installing Node or running any npm command.
But now I have a dependency on phantomjs, which needs to be installed as a global package: npm install -g phantomjs.
How do I pack modules like this into our project? I first thought of copying phantomjs into the local node_modules directory and setting the NODE_PATH variable to this directory, but it doesn't find phantomjs.
Development and client platforms are both Windows.
Well, generally it is fine to install global dependencies with the --save flag and call their bins like ./node_modules/phantomjs/bin/phantomjs /* now executes */ (just as an illustrative example).
However, with Phantom it's not that simple, since it downloads binaries and/or even compiles. You have three options:
ssh into the target and just npm install -g phantomjs beforehand, or define it in a manifest, e.g. a Dockerfile, if you are using containers.
Compile it from source as advised here.
If you are using the CLI, then just the --save approach (sketched below).
So I'd strongly advise just making a Docker image out of it and shipping it as a tarball. You can't zip the platform-dependent Phantom installation, unfortunately.
Also, lots of dependencies like karma-runner-phantomjs look for the path of the global dependency to resolve it for their use.
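As a rough sketch of the --save route (it only works if the phantomjs package can fetch a binary for the client's platform during install):
$ npm install --save phantomjs               # records the dependency and installs it locally
$ ./node_modules/.bin/phantomjs --version    # the local binary is then callable by path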

npm global install does not add packages to PATH on Windows 8.1

When I run npm install -g <package> it installs the package in my user/AppData/Roaming/npm/node_modules/ folder. This sub-folder is not in my PATH, so if I try to run the package without explicitly spelling out the entire path, the call fails with '<package>' is not recognized as an internal or external command, operable program or batch file.
What can I do to fix this?
Thanks
I'm using Windows 8.1 and I found that the Node.js installer didn't add the path to the global node modules to the system PATH. Just add %AppData%\npm; to the user PATH variable (the %AppData% directory depends on the user) to fix it.
You will need to log out then log back in for the change to your PATH variable to take effect.
SET PATH=%AppData%\npm;%PATH%
You have to run this line, SET PATH=pathtonodejs;%PATH% (where pathtonodejs is where you installed Node.js), once the Node.js installation is complete, and it should work.
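If you're unsure which directory has to be on PATH, npm itself can tell you where it installs global packages (on Windows the command shims sit directly in this prefix folder):
npm config get prefix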
It turned out the problem was a change in the behaviour of the module I was using.
I'd been following old tutorials for Express.js. The old tutorials assumed Express would be on my path after installing it globally, but as of Express v4.0 the command-line generator is a separate module (express-generator) you have to install in order to get it on your path.

nodeclipse doesn't recognize any module

Any require statement that refers to a global module fails.
The modules are installed globally (-g), and regular node runs just fine from the command line.
redis is failing, mongodb is failing, and so on.
I didn't find any configuration options for that.
express works just fine, but not the other modules.
After sudo npm install -g redis, for example, nodeclipse can't find it.
The node command line works fine.
Both are run as a regular user.
Node includes modules differently than one might expect. From the node docs:
If the module identifier passed to require() is not a native module, and does not begin with '/', '../', or './', then node starts at the parent directory of the current module, and adds /node_modules, and attempts to load the module from that location.
If it is not found there, then it moves to the parent directory, and so on, until the root of the tree is reached.
So yes, npm does install to a global directory when invoked with the -g option; however, that directory is not read by node unless the current module is also located in that same directory or a subdirectory of it, which works as long as the other module was also installed with npm -g.
However, this scheme does not work if the starting .js file is in some other directory.
So, to get this to work, add the directory where npm installed the modules to the NODE_PATH environment variable (e.g. NODE_PATH=/usr/local/lib/node_modules). In nodeclipse this can be set via Run As ... -> Run Configurations -> Environment tab.
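Outside of Eclipse, the shell equivalent looks like this; the path is just the typical default and app.js stands in for your entry script (check the real location with npm root -g):
$ export NODE_PATH="$(npm root -g)"    # e.g. /usr/local/lib/node_modules
$ node app.js                          # require('redis') can now resolve the global install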
Have you just run npm install on those modules?
Give more information in your question, as it is impossible to tell what is different about your environment.
