Any require statement that refers to a globally installed module fails.
The module is installed globally (-g), and regular node on the command line runs just fine.
redis is failing, mongodb is failing, and so on.
I didn't find any configuration option for this.
express works just fine, but not the other modules.
After sudo npm install -g redis, for example, Nodeclipse can't find it.
The node command line works fine.
Both are run as the same regular user.
Node resolves modules differently than one might expect. From the node docs:
If the module identifier passed to require() is not a native module, and does not begin with '/', '../', or './', then node starts at the parent directory of the current module, and adds /node_modules, and attempts to load the module from that location.
If it is not found there, then it moves to the parent directory, and so on, until the root of the tree is reached.
So yes, npm does install to a global directory when invoked with the -g option, but node does not read from that directory unless the current module also lives in that directory or one of its subdirectories, which works as long as the other module was also installed with npm -g.
This scheme does not work, however, if your starting .js file sits in some other directory.
So, to get this working, add the directory where npm installed the modules to the NODE_PATH environment variable (e.g. NODE_PATH=/usr/local/lib/node_modules). In Nodeclipse this can be set via Run As ... -> Run Configurations -> Environment tab.
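For example, a minimal sketch on Linux/macOS, assuming the globals ended up in /usr/local/lib/node_modules and your entry file is app.js (check the actual location with npm root -g):
npm root -g                                   # prints e.g. /usr/local/lib/node_modules
export NODE_PATH=/usr/local/lib/node_modules  # use whatever path was printed above
node app.js                                   # require('redis') can now resolve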
Have you just run npm install for those modules?
Please give more information in your question; as it stands it is impossible to tell what is different about your environment.
Related
I am having trouble understanding how the -g flag works in NPM. Specifically I'm struggling to understand how it relates to command-line functionality exposed by NPM modules.
I assumed that the difference between installing a package locally and globally was simply that a local package would not be available outside of the particular project. And of course that a globally installed package would be available in any project. I'm from a Rails background so this for me would be similar to installing a gem into a particular RVM versus installing it into the global RVM. It would simply affect which places it was available.
However, there seems to be more to it in NPM than just scope. For packages that provide command-line functionality, like wait-on, the command (as far as I can tell) is not available on the command line unless the package is installed globally.
Local install doesn't make the command-line functionality available:
$ npm install wait-on
$ wait-on
=> -bash: /usr/local/bin/wait-on: No such file or directory
Global install does expose the command-line functionality:
$ npm install wait-on -g
$ wait-on
=> Usage: wait-on {OPTIONS} resource [...resource]
Description:
wait-on is a command line utility which will wait for files, ports,
sockets, and http(s) resources to become available (or not available
using reverse flag). Exits with success code (0) when all resources
are ready. Non-zero exit code if interrupted or timed out.
Options may also be specified in a config file (js or json). For
example --config configFile.js would result in configFile.js being
required and the resulting object will be merged with any
Can you expose the command-line functionality using a local install?
Is it possible to install locally but also get the command-line functionality? This would be very helpful for my CI setup, as it's far easier to cache local modules than global modules, so where possible I'd prefer to install locally.
If you are using npm 5.2.0 or later, the npx command is included by default. It lets you run binaries from the local node_modules: npx wait-on
For reference: https://www.npmjs.com/package/npx
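For example (wait-on installed locally in the project; the URL is just a placeholder):
npm install --save-dev wait-on
npx wait-on http://localhost:8080    # runs ./node_modules/.bin/wait-on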
I think you can access locally installed modules from the command line only if you add them to the "scripts" section of your package.json. So, to use the locally installed version of wait-on, add an entry to the "scripts" section of package.json like "wait-on": "wait-on", and then run it with npm run wait-on. You can also use "wo": "wait-on" and run npm run wo; whatever comes after run is the name of the script entry. Inside node_modules there is a .bin folder, and it contains all of the executables you can reach this way.
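A sketch of what that might look like (the script name and the tcp:4000 resource are just examples):
# package.json (excerpt):
#   "scripts": {
#     "wo": "wait-on"
#   }
npm install --save-dev wait-on
npm run wo -- tcp:4000    # arguments after -- are passed through to wait-on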
Installing locally makes the package available to the current project (which keeps all of its node modules in node_modules). This is usually only good for using the module from code, e.g. var module = require('module');, or importing it.
It will not be available as a command that the shell can resolve until you install it globally with npm install -g module, where npm puts it in a place that your PATH variable already resolves.
You can find a pretty decent explanation here.
It is also useful to put commands in the scripts block in package.json, since npm run resolves local commands automatically (node_modules/.bin is added to the PATH for scripts). That means a script can depend on a package without having an undocumented dependency on it.
If you need to run it locally from the command line yourself, you have to call it via its path inside node_modules.
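In other words, something along these lines, assuming a standard local install of wait-on and a placeholder URL:
# call the locally installed binary directly, without npm scripts or npx
./node_modules/.bin/wait-on http://localhost:3000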
I am using a Vue CLI 3 application built from a template, and it stopped working; I don't recall why. When I try to start the application I get this error:
yarn run serve
yarn run v1.16.0
error An unexpected error occurred: "The \"path\" argument must be of type string. Received type object".
info If you think this is a bug, please open a bug report with the information provided in "C:\\node\\TradeTriggers\\yarn-error.log".
info Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command.
I have tried deleting node_modules and package-lock.json.
I cannot do anything with yarn: no yarn install, no yarn run serve, nothing, and npm doesn't seem to want to run the application either. I'm sorry, there are a lot of tools to know in the JS world!
The machine is a Windows 10 machine, and I cannot find yarn in my environment variables, so the issue may lie there. I even tried installing the Yarn MSI, but my version is still the one I installed through npm a while ago; still nothing.
I had a problem that was caused by yarn looking first at the global settings file, i.e. the global registry. It might be similar. On Windows, first check your yarn config path:
yarn config bin
Windows will show the path. Then, in that folder, check whether you have an "rc" file, i.e. the yarn configuration.
Try moving this file out of the folder as a test. Keep a copy somewhere else, so you can restore it if this does not help.
Then, once the file is out, run your yarn commands again as you used to.
Sideline: on Linux, I had to remove a leftover, buggy .yarnrc file in
/usr/local/share/.yarnrc
to get similar things working again. It was not a Vue app, but it was a similar kind of error.
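For example, the Linux test looked roughly like this (the path is the one from my machine; yours may differ):
# move the suspect global yarn config aside, then retry
sudo mv /usr/local/share/.yarnrc /usr/local/share/.yarnrc.bak
yarn install
# restore it if this did not help:
# sudo mv /usr/local/share/.yarnrc.bak /usr/local/share/.yarnrc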
This error can occur when you are using Yarn Workspaces and have incompatible directories in the packages/ directory.
In my scenario, I had used git subtrees to pull another repo into my project's packages/ directory. That directory had a package.json file, but it did not have compatible values for the fields required by Yarn Workspaces.
Moving the problematic package out of packages/ should fix this issue.
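A sketch of that fix, assuming the offending subtree is packages/legacy-lib (a made-up name) and the workspaces glob is packages/*:
# move the incompatible package out of the workspaces directory
mkdir -p vendor
git mv packages/legacy-lib vendor/legacy-lib
yarn install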
I just installed
npm install -g angular-cli
And attempted to initialize a new project
ng new foo
Unfortunately, I already have Nailgun installed, which also uses the ng command.
How can I set which one has precedence? Better yet, is there an easy way to rename one of them?
The first ng found in the PATH is the one that will be called, so re-ordering your PATH is the first option. When you need Nailgun, swap the order of the Nailgun directory and the global npm modules directory in your PATH.
You could rename ng in the .bin directory of your global install location, but this could have side effects if other Angular tools expect to be able to call ng.
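A rough sketch of the re-ordering on a Unix-like shell (directories vary between systems; check where both ng binaries actually live first):
which -a ng                               # every ng on the PATH, in order
export PATH="$(npm prefix -g)/bin:$PATH"  # put npm's global bin directory first
hash -r                                   # clear the shell's cached lookup
ng new foo                                # now resolves to angular-cli's ng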
We have a project that has to be packaged as a zip so we can distribute it to our clients. With the normal node_modules directory I have no problems: I just put the directory and node.exe together in my project folder and can start our project on every other computer without installing node or running any npm command.
But now I have a dependency on phantomjs, which needs to be installed as a global package: npm install -g phantomjs.
How do I pack modules like this into our project? I first thought of copying phantomjs into the local node_modules directory and setting the NODE_PATH variable to that directory, but node doesn't find phantomjs.
Development and client platforms are both Windows.
Well, generally it is fine to install global dependencies with the --save flag and call their bins like ./node_modules/phantomjs/bin/phantomjs /* now executes */ (just as an illustrative example).
However, with Phantom it's not that simple, since it downloads binaries and may even compile them. You have three options:
ssh into the target and npm install -g phantomjs beforehand, or define it in a manifest, e.g. a Dockerfile, if you are using containers.
Compile it from source, as advised here.
If you are only using the CLI, just take the --save approach.
So I would strongly advise making a Docker image out of it and shipping it as a tarball. You can't zip the platform-dependent Phantom installation, unfortunately.
Also, many dependencies such as karma-runner-phantomjs look up the path of the global dependencies to resolve it for their own use.
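For the --save route, a minimal sketch (the phantomjs npm package downloads a platform-specific binary at install time, so this has to run on a machine that matches the client platform):
npm install --save phantomjs
./node_modules/.bin/phantomjs --version        # bin link created by npm
# on Windows, npm creates a shim instead: node_modules\.bin\phantomjs.cmd
# from code, the package also exposes the binary's location:
node -e "console.log(require('phantomjs').path)"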
I'm developing something with node.js and socket.io, but I'm doing my local dev on Windows for my own convenience. The installation instructions for socket.io say to just do npm install socket.io. That is fine for my Linux environment, and I'm guessing node will just find it in node_modules. But on Windows I don't know what to do. I somehow got version 0.6 working fine by tracking down the files I needed.
Now it looks like I need two sets of files, one for the server side and one for the client. There are also two repos on GitHub, socket.io and socket.io-client. So I'm trying to just download all the files I need from there. The issue is that the server one refers to the client one, but the socket.io-client files aren't in the server repo. If I put the server files in and reference them in my node server, it crashes on startup saying Cannot find module 'socket.io-client'.
tl;dr: if I'm just copying files into my project directory rather than doing an npm install, what is the proper file structure to get socket.io version 0.7 running?
I had the same issue here, and I'm not using npm either. But it has nothing to do with Windows: I'm on Ubuntu with the same problem.
You also need to have the socket.io-client module available in your node_modules path, or wherever you keep the server-side socket.io module.
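So, if you are copying files by hand instead of using npm, the layout node expects looks roughly like this (server.js is just a placeholder name for your entry script):
your-project/
  server.js               # does require('socket.io')
  node_modules/
    socket.io/            # contents of the socket.io repo
    socket.io-client/     # contents of the socket.io-client repo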
To solve similar issues I created a runner script that simply sets the NODE_PATH environment variable as needed and then executes my script. I also put my own modules (and the modules I don't want to install via npm) in the node_modules subdirectory of my project. A better explanation is here: http://www.bennadel.com/blog/2169-Where-Does-Node-js-And-Require-Look-For-Modules-.htm
#!/bin/sh
export NODE_ENV=development
# if NODE_PATH is not already set, point it at npm's global module directory
if [ "${NODE_PATH}" = "" ]; then
    export NODE_PATH=$(npm -g root 2>/dev/null)
fi
# run the script passed as the first argument
node "${1}"
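Usage would then be something like this (run-node.sh is just whatever you name the script):
chmod +x run-node.sh
./run-node.sh server.js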