I was installing Apache Sqoop for Hadoop v1. The installation guide says the script /bin/addtowar.sh should be in the Sqoop bin directory, but I don't find it there.
I used this web URL:
https://sqoop.apache.org/docs/1.99.1/Installation.html
Thanks!
Hi, I think the issue is with this command: mv sqoop-(version)-bin-hadoop(hadoop version).tar.gz /usr/lib/sqoop
Please replace that command with this one: mv sqoop-(version)-bin-hadoop(hadoop version) /usr/lib/sqoop
That is, move the extracted directory rather than the tarball. I think that will solve your problem.
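For reference, a minimal sketch of that step, assuming the Sqoop 1.99.1 / Hadoop 1.x tarball name from the linked guide (substitute the file name you actually downloaded):
# extract the downloaded tarball first (file name is an assumption)
tar -xzf sqoop-1.99.1-bin-hadoop100.tar.gz
# move the extracted directory, not the .tar.gz, into place
sudo mv sqoop-1.99.1-bin-hadoop100 /usr/lib/sqoop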
addToWar.sh is no longer used in the install.
I ran into the same thing, and documented the solution here:
http://brianoneill.blogspot.com/2014/10/sqoop-1993-w-hadoop-2-installation.html
I am getting started with Hadoop. I installed Java, set JAVA_HOME to 1.8, and installed Hadoop 2.7.6. I cd'ed into the Hadoop installation directory to run bin/hadoop; however, I do not see any output. I have also tried one of the examples using the command
bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.6.jar
Appreciate your help.
It looks like there is some issue with version 2.7.6. I installed version 2.7.7 and it started to work.
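For what it's worth, the examples jar usually needs an example name and its arguments after the jar path; a hedged sketch using the pi example (the jar file name is assumed to match a 2.7.7 install):
# run the "pi" example with 2 maps and 5 samples per map
bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.7.jar pi 2 5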
I'm trying to set up a cron job for a regularly scheduled import of JSON data into a Mongo database. To conduct the import, I have the following command in the Python script that the cron job runs:
os.system("mongoimport --jsonArray --db %s --collection %s --file .../data.txt" %(db_name,collection_name))
However, the log file of the cronjob keeps displaying the following error:
sh: mongoimport: command not found
I think I need to call mongoimport with the full file path in the code, but I'm not sure where mongodb/mongod/mongoimport is installed on my system. whereis mongoimport, whereis mongodb, whereis mongod all return nothing.
I installed MongoDB with Homebrew. Packages installed with Homebrew are located in /Library/Caches/Homebrew. However, on my system that folder only has a mongodb-2.6.4_1 tar file. Do I have to unpack this tar file to access mongoimport?
Thanks for your help.
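A hedged sketch of how you might locate the binary and point cron at it (the /usr/local/bin location is just Homebrew's usual default, not a certainty on this machine):
# from an interactive shell, see whether the binary is on your normal PATH
command -v mongoimport
# Homebrew usually links binaries into /usr/local/bin
ls /usr/local/bin | grep -i mongo
# if it is there, either call it by its full path in the Python script,
# or give cron a PATH line at the top of the crontab:
PATH=/usr/local/bin:/usr/bin:/bin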
As of June 2020:
I installed the latest version of MongoDB using brew, as per the documentation, and I faced the same issue: command not found: mongoimport.
I had to install mongodb-database-tools:
brew install mongodb/brew/mongodb-database-tools
Then I could use mongoimport.
Just adding this solution in case it helps someone.
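For example, mongoimport could then be invoked like this (database, collection, and file names here are placeholders):
mongoimport --jsonArray --db mydb --collection mycoll --file data.txt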
I got the same issue, but I installed MongoDB via MacPorts. Unfortunately, since version 3 of MongoDB, these tools are maintained as a separate project, so I updated MacPorts to the latest version and then installed the Mongo tools separately:
sudo port install mongo-tools
Hope this helps someone who installed MongoDB via MacPorts.
If you installed MongoDB correctly, you need to create a ~/.bash_profile and add /usr/local/mongodb/bin to the $PATH environment variable.
After that you should be able to access the mongoimport command.
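Roughly, that means a line like the following in ~/.bash_profile (the path is taken from this answer; adjust it if your install lives elsewhere):
# prepend the MongoDB binaries to PATH
export PATH="/usr/local/mongodb/bin:$PATH"
Then run source ~/.bash_profile (or open a new terminal) before retrying mongoimport.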
If you used brew for the installation, mongod is in the /usr/local/bin/ directory. The other utilities (mongoimport, mongoexport, etc.) are in the same path. All you need to do is open another terminal.
Visit https://www.mongodb.com/download-center/community and you can download a tarball for MacOS, which contains all the tools including mongoimport.
Untar it, add it to your PATH, and voilà!
Try using ./mongoimport or sudo ./mongoimport
After following all of these examples, I was able to use it that way from bash.
I am trying to install Hadoop on my Ubuntu OS. I followed each and every step exactly from this link, Hadoop Install Tutorial, and everything was going as expected until I tried to run the
$ start-dfs.sh and $ hadoop jar ./share/hadoop/mapreduce/hadoop-mapreduce-examples-2.2.0.jar pi 2 5 commands. These commands don't work as expected. After some research I somehow came to know that I was using an older Hadoop version, Hadoop 1.0.2, despite having downloaded the latest 2.2.0 version.
As I could not solve this, I tried to uninstall Hadoop completely. Now when I try doing it, it says:
$ sudo dpkg -r hadoop
dpkg: dependency problems prevent removal of hadoop:
hadoop-native depends on hadoop (= 1.0.2-0ubuntu1~hadoop1).
dpkg: error processing hadoop (--remove):
dependency problems - not removing
Errors were encountered while processing:
hadoop
Appreciate any help!
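One possible way around that dependency error, going only by the message above (the package names are taken from the error output), is to remove the dependent package first, or remove both in one command:
# remove the dependent package together with hadoop
$ sudo dpkg -r hadoop-native hadoop
# or purge both, including configuration files
$ sudo apt-get purge hadoop-native hadoop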
I don't know whether this is the proper way to remove Hadoop or not, but I removed it using the method below.
1. I first manually deleted the /usr/local/hadoop folder for all users (if any). If you are not able to remove it due to a lack of permissions, check the permissions of the folder: set them (with sudo) to allow "Creating and deleting files" so that every user can delete it from their instance. Then, from a terminal in /usr/local, $ rm -r hadoop does the job.
2. After this, I checked $ hadoop version again in the terminal, and boom, it still showed its existence. So I went back to the terminal and ran sudo apt-get purge hadoop or sudo apt-get remove hadoop, and then it worked.
Does anybody know how to install a Hadoop patch, like the ones attached to https://issues.apache.org/jira/browse/HDFS-385?
I don't know what to override in the original code, or how, so that the pluggable interface will work.
Could anybody give me a hint?
Thanks
This is a general problem, not a Hadoop-related one...
You can apply .patch files using the patch command in your terminal. You can find help for that command with patch --help and man patch.
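A minimal sketch, assuming you have a Hadoop source tree and have downloaded the .patch attachment from the JIRA issue (the file name here is hypothetical); the -p level depends on how the paths inside the patch were generated, so try -p0 first and -p1 if that fails:
cd hadoop-src                           # your checkout of the Hadoop source
patch -p0 --dry-run < HDFS-385.patch    # dry run: check that it applies cleanly
patch -p0 < HDFS-385.patch              # apply it for real, then rebuild Hadoop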
I set up my Hadoop clusters with Hadoop 2.0.2. Then, today, I tried to test 1.0.0, so I downloaded the deb file from the Hadoop website and installed it: it messed up everything.
Now, when I type "which -a hadoop" I get two results:
one pointing to my old Hadoop installation folder
and the other one pointing to /usr/bin/hadoop.
So the question is: how do I get rid of Hadoop 1.0.0 completely?
Try using dpkg -r hadoop; this should remove the Hadoop package from the system, but leave the config files intact. If you want to lose the config files as well, try dpkg -P hadoop instead.
> echo $HADOOP_HOME
> /home/shiv/hadoop
> sudo rm -r /home/shiv/hadoop
And Hadoop is uninstalled!
I struggled through this for quite a while and then decided to share it here:
The trick is basically to delete all the symlinks pointing back to the locations where the HDP components reside, since that is what causes 80% of the problem. Here is a step-by-step tutorial for that:
http://www.yourtechchick.com/hadoop/how-to-completely-remove-and-uninstall-hdp-components-hadoop-uninstall-on-linux-system/
Hope that helps!
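If it helps, a hedged example of hunting down such symlinks (this assumes HDP's usual layout under /usr/hdp; review the list before deleting anything):
# list symlinks in /usr/bin that point into the HDP install tree
find /usr/bin -maxdepth 1 -lname '/usr/hdp/*'
# once you have reviewed the list, adding -delete to the same command removes them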