Hadoop: start-dfs.sh throwing syntax errors

I am trying to start Hadoop by running ./start-dfs.sh, but I am getting some syntax errors. Could anybody please help?
Gurupads-MacBook-Air:sbin guru$ sudo ./start-dfs.sh
Starting namenodes on [localhost]
/Users/guru/homebrew/Cellar/hadoop/3.1.1/libexec/bin/../libexec/hadoop-functions.sh: line 398: syntax error near unexpected token `<'
/Users/guru/homebrew/Cellar/hadoop/3.1.1/libexec/bin/../libexec/hadoop-functions.sh: line 398: ` done < <(for text in "${input[@]}"; do'
/Users/guru/homebrew/Cellar/hadoop/3.1.1/libexec/bin/../libexec/hadoop-config.sh: line 70: hadoop_deprecate_envvar: command not found
/Users/guru/homebrew/Cellar/hadoop/3.1.1/libexec/bin/../libexec/hadoop-config.sh: line 87: hadoop_bootstrap: command not found
/Users/guru/homebrew/Cellar/hadoop/3.1.1/libexec/bin/../libexec/hadoop-config.sh: line 104: hadoop_parse_args: command not found
/Users/guru/homebrew/Cellar/hadoop/3.1.1/libexec/bin/../libexec/hadoop-config.sh: line 105: shift: : numeric argument required
/Users/guru/homebrew/Cellar/hadoop/3.1.1/libexec/bin/../libexec/hadoop-config.sh: line 244: hadoop_need_reexec: command not found
/Users/guru/homebrew/Cellar/hadoop/3.1.1/libexec/bin/../libexec/hadoop-config.sh: line 252: hadoop_verify_user_perm: command not found
/Users/guru/homebrew/Cellar/hadoop/3.1.1/libexec/bin/hdfs: line 213: hadoop_validate_classname: command not found
/Users/guru/homebrew/Cellar/hadoop/3.1.1/libexec/bin/hdfs: line 214: hadoop_exit_with_usage: command not found
/Users/guru/homebrew/Cellar/hadoop/3.1.1/libexec/bin/../libexec/hadoop-config.sh: line 263: hadoop_add_client_opts: command not found
/Users/guru/homebrew/Cellar/hadoop/3.1.1/libexec/bin/../libexec/hadoop-config.sh: line 270: hadoop_subcommand_opts: command not found
/Users/guru/homebrew/Cellar/hadoop/3.1.1/libexec/bin/../libexec/hadoop-config.sh: line 273: hadoop_generic_java_subcmd_handler: command not found
Starting datanodes
ERROR: Attempting to operate on hdfs datanode as root
ERROR: but there is no HDFS_DATANODE_USER defined. Aborting operation.
Starting secondary namenodes [Gurupads-MacBook-Air.local]
ERROR: Attempting to operate on hdfs secondarynamenode as root
ERROR: but there is no HDFS_SECONDARYNAMENODE_USER defined. Aborting operation.
2018-09-18 21:51:24,380 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

The error states that you are running the hdfs datanode as the root user, but no HDFS_DATANODE_USER is defined.
Solution:
Edit the /Users/guru/homebrew/Cellar/hadoop/3.1.1/etc/hadoop/hadoop-env.sh file by adding the lines below at the end of the file:
export HDFS_NAMENODE_USER=root
export HDFS_DATANODE_USER=root
export HDFS_SECONDARYNAMENODE_USER=root
If you are using YARN, you can add the lines below to the same file:
export YARN_RESOURCEMANAGER_USER=root
export YARN_NODEMANAGER_USER=root
Then restart Hadoop using the start-dfs.sh script.
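After restarting, you can verify that the HDFS daemons came up with jps, which lists running JVM processes (a quick check, assuming a JDK is on your PATH):
jps
The output should include NameNode, DataNode, and SecondaryNameNode entries.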

Related

List directories in localhost using lftp cmd

I'm trying to copy a file to a folder on localhost using the lftp command, but I'm having trouble executing commands.
When I try to list the folders in a directory using !dir; I get the following error:
stderr:
bash.exe: warning: could not find /tmp, please create!
sh: -c: line 0: syntax error near unexpected token `;;'
sh: -c: line 0: `dir;;'
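In lftp, a leading ! hands the rest of the line to the local shell, and lftp itself uses ; as a command separator, which may explain the doubled dir;; in the error. As a hedged first step, try dropping the trailing semicolon and using a command the local shell is sure to have:
!ls
rather than !dir;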

Anyone know how to fix hadoop-functions.sh "syntax error near unexpected token `<'"?

I've configured Hadoop 3.1.1 on my Mac Pro running macOS 10.14.2, and I'm getting the following error when I run start-all.sh:
$ sudo /usr/local/Cellar/hadoop/3.1.1/sbin/start-all.sh
Starting namenodes on [localhost]
/usr/local/Cellar/hadoop/3.1.1/libexec/bin/../libexec/hadoop-functions.sh: line 398: syntax error near unexpected token `<'
/usr/local/Cellar/hadoop/3.1.1/libexec/bin/../libexec/hadoop-functions.sh: line 398: ` done < <(for text in "${input[@]}"; do'
/usr/local/Cellar/hadoop/3.1.1/libexec/bin/../libexec/hadoop-config.sh: line 70: hadoop_deprecate_envvar: command not found
/usr/local/Cellar/hadoop/3.1.1/libexec/bin/../libexec/hadoop-config.sh: line 87: hadoop_bootstrap: command not found
/usr/local/Cellar/hadoop/3.1.1/libexec/bin/../libexec/hadoop-config.sh: line 104: hadoop_parse_args: command not found
/usr/local/Cellar/hadoop/3.1.1/libexec/bin/../libexec/hadoop-config.sh: line 105: shift: : numeric argument required
/usr/local/Cellar/hadoop/3.1.1/libexec/bin/../libexec/hadoop-config.sh: line 244: hadoop_need_reexec: command not found
/usr/local/Cellar/hadoop/3.1.1/libexec/bin/../libexec/hadoop-config.sh: line 252: hadoop_verify_user_perm: command not found
/usr/local/Cellar/hadoop/3.1.1/libexec/bin/hdfs: line 213: hadoop_validate_classname: command not found
/usr/local/Cellar/hadoop/3.1.1/libexec/bin/hdfs: line 214: hadoop_exit_with_usage: command not found
/usr/local/Cellar/hadoop/3.1.1/libexec/bin/../libexec/hadoop-config.sh: line 263: hadoop_add_client_opts: command not found
/usr/local/Cellar/hadoop/3.1.1/libexec/bin/../libexec/hadoop-config.sh: line 270: hadoop_subcommand_opts: command not found
/usr/local/Cellar/hadoop/3.1.1/libexec/bin/../libexec/hadoop-config.sh: line 273: hadoop_generic_java_subcmd_handler: command not found
I get the same errors when starting the datanodes, secondary namenodes, resourcemanager, and nodemanagers.
I have found a similar bug reference online: https://issues.apache.org/jira/browse/HDFS-12571.
Update
After some debugging, the root cause is that the bash process-substitution syntax "< <(command)" is not being accepted for some reason, even though both bash versions on the system (/bin/bash and /usr/local/bin/bash from Homebrew) handle it properly when invoked directly.
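A quick way to check which shells accept the construct (a diagnostic sketch; done < <(...) is bash-only and fails under POSIX sh):
bash -c 'while read -r line; do echo "$line"; done < <(echo ok)'
sh -c 'while read -r line; do echo "$line"; done < <(echo ok)'
The first should print ok; if the second fails with a similar syntax error, the Hadoop scripts are somehow being interpreted by a shell without process-substitution support.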
Maybe you should set HDFS_NAMENODE_USER, HDFS_DATANODE_USER, and so on in hadoop-env.sh to the current user instead of root. Then, before running sudo ./start-all.sh, you may need to recreate the HDFS namenode with hdfs namenode -format.
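For example, a minimal sketch of pointing those variables at your login user in hadoop-env.sh (assuming you start the daemons from your own account):
export HDFS_NAMENODE_USER="$(whoami)"
export HDFS_DATANODE_USER="$(whoami)"
export HDFS_SECONDARYNAMENODE_USER="$(whoami)"
Be aware that hdfs namenode -format erases the namenode metadata, so only run it on a fresh or disposable installation.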

Canopy installation error in ubuntu

While trying to install Canopy, I got the following error:
ubuntu#hari-desktop:~/Downloads$ sh canopy-2.1.3.rh6-x86_64-cp27.sh
canopy-2.1.3.rh6-x86_64-cp27.sh: 32: canopy-2.1.3.rh6-x86_64-cp27.sh: 0: not found
canopy-2.1.3.rh6-x86_64-cp27.sh: 46: canopy-2.1.3.rh6-x86_64-cp27.sh: 0: not found
canopy-2.1.3.rh6-x86_64-cp27.sh: 154: canopy-2.1.3.rh6-x86_64-cp27.sh: Syntax error: word unexpected (expecting ")")
Solution: run the installer with bash instead of sh: bash canopy-2.1.3.rh6-x86_64-cp27.sh
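The errors are characteristic of dash (Ubuntu's /bin/sh) rejecting bash-only constructs such as arrays. A minimal reproduction (hypothetical one-liners, not taken from the installer):
sh -c 'a=(1 2 3)'      # dash: Syntax error: "(" unexpected
bash -c 'a=(1 2 3)'    # accepted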

Hadoop Services not Running

I recently installed the VM at the link below and was attempting to run the command sudo hdfs dfs -mkdir /user/vagrant as the HDFS user. However, Hadoop is throwing the error message below at me.
https://atlas.hashicorp.com/puppetlabs/boxes/centos-6.6-64-nocm
-bash-4.1$ hdfs dfs -mkdir /user/vagrant
: command not founddoop-env.sh: line 15:
: command not founddoop-env.sh: line 16:
: command not founddoop-env.sh: line 21:
: command not founddoop-env.sh: line 26:
: command not founddoop-env.sh: line 29:
: command not founddoop-env.sh: line 32:
: command not founddoop-env.sh: line 35:
: command not founddoop-env.sh: line 38:
: command not founddoop-env.sh: line 46:
: command not founddoop-env.sh: line 49:
: command not founddoop-env.sh: line 52:
: command not founddoop-env.sh: line 55:
: command not founddoop-env.sh: line 58:
: command not founddoop-env.sh: line 61:
: command not founddoop-env.sh: line 66:
: command not founddoop-env.sh: line 69:
: command not founddoop-env.sh: line 72:
: command not founddoop-env.sh: line 75:
: command not founddoop-env.sh: line 77:
: command not founddoop-env.sh: line 78:
mkdir: Call From bigtop1.vagrant/127.0.0.1 to bigtop1.vagrant:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
I believe this means that none of the Hadoop services are running (even running hadoop -fs returns the "command not found" errors), but I am unsure how to resolve this issue. I am also not seeing a /hadoop folder under /usr or /usr/local.
I was hoping to resolve this issue by using vagrant destroy -> vagrant up, but to no avail.
I noticed the line endings were in Windows format. I destroyed the VM, configured git to use Unix line endings with git config --global core.autocrlf false, and remade it. It works perfectly now.
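The mangled output above is the telltale sign of CRLF line endings: the carriage return at the end of each line makes the error message overwrite the start of its own line in the terminal. A couple of ways to confirm and strip them (a sketch; the path to hadoop-env.sh varies by distribution):
file ./hadoop-env.sh               # reports "with CRLF line terminators" if affected
sed -i 's/\r$//' ./hadoop-env.sh   # remove the carriage returns in place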

Hadoop HBase scripts on Linux Mint gives strange errors

I have installed Hadoop and Pig on my Mint (Ubuntu-like) virtual machine. I keep getting strange error messages when running scripts. In fact, I also get errors when I run hadoop commands, but at least they work; with HBase it just fails.
For example, running sh hadoop -rmr /home/myoutput I get:
hadoop: 102: [: fs: unexpected operator
Deleted hdfs://localhost/home/myoutput
When I run start-hbase it starts fine.
When I run sh hbase shell I get:
hbase: 163: hbase: [[: not found
hbase: 163: hbase: [[: not found
hbase: 197: hbase: Syntax error: "(" unexpected
These lines in the hbase script are:
163: if [[ $f = *sources.jar ]]
197: function append_path() {
What am I missing?
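Both of those lines use bash-only constructs ([[ ... ]] tests and the function keyword), and the errors look like what dash, which is /bin/sh on Mint and Ubuntu, prints when it meets them; invoking the script as sh hbase bypasses its bash shebang. A minimal reproduction (hypothetical one-liners):
sh -c 'if [[ a = a ]]; then echo ok; fi'     # sh: 1: [[: not found
bash -c 'if [[ a = a ]]; then echo ok; fi'   # prints ok
Running the script directly (./hbase shell) or as bash hbase shell should avoid the problem.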
Mint is not just Ubuntu-like, it is actually built on Ubuntu, so you should be able to find the answer pretty easily.
Also, my suggestion is to tag this question with ubuntu, not mint.
