Hortonworks sandbox connection refused - Hadoop

I just started to learn Hadoop with the Hortonworks sandbox.
I have HDP 2.3 on VirtualBox, and in the settings I have a Bridged Adapter network and a NAT.
When I start the machine everything is OK: I can run some Hadoop commands and I can connect to Ambari at 127.0.0.1:8080.
But when I run the script in /etc/lib/hue/tools/start_scripts/gen_hosts.sh
to generate hosts with a different IP address, everything goes wrong and I can't execute a simple Hadoop command like hadoop fs -ls /user.
I get this error:
ls: Call From sandbox.hortonworks.com/10.24.244.85 to sandbox.hortonworks.com:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
As I said, I just started to learn Hadoop and I am not a network expert, so I would appreciate any help.
Thank you.

I found that you have to restart the services (HDFS, MapReduce, HBase, ...) from
Ambari when you generate a host.
Hope this will help someone.

Turn on the NameNode / HDFS service and it will work.
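If it still fails after the restart, a quick sanity check from the sandbox shell (assuming the default NameNode RPC port 8020 shown in the error message) is:
telnet sandbox.hortonworks.com 8020   # should connect once the NameNode is up
hdfs dfsadmin -report                 # prints a cluster report when HDFS is reachable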

Related

ConnectionRefused when trying to connect to Hadoop using Cloudera

When I try to run a command that connects to Hadoop, I get an exception.
For example:
command: hdfs dfs -ls /user/cloudera
result: ls: Call From quickstart.cloudera/10.0.2.15 to quickstart.cloudera:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
In addition, when I used such commands before, I did not have that problem.
Thanks
You need to go to Cloudera Manager in the browser. Usually this is bookmarked in Firefox as "Cloudera Manager". Log in with "cloudera/cloudera". Then start the "HDFS" service. The "hadoop fs" commands should start working now.
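If you prefer the terminal, a rough way to confirm HDFS actually came back (nothing Cloudera-specific, just standard commands) is:
ps -ef | grep -i namenode    # a NameNode process should be running
hdfs dfsadmin -report        # lists live DataNodes once the NameNode answers on 8020
hdfs dfs -ls /user/cloudera  # the original command should now succeed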

Hadoop and httpfs Installation difficulties

I am trying to install Hadoop in my VM, and I can access it using:
hdfs dfs -ls hdfs://localhost:9000/
But when I try to access it from another VM using:
hdfs dfs -ls hdfs://hadoop-vm:9000/
I receive a 'connection refused' error.
In the browser I can visit:
http://hadoop-vm:50090 etc
Can anyone tell me how to enable access from another VM using HDFS?
Another question: I cannot install Hadoop httpfs, and I cannot find any info on how to download it at all. Can anyone help me?
- Sounds like a firewall or something is blocking access to port 9000. Try to telnet to it both in the VM and remotely.
- Otherwise, look at the http://wiki.apache.org/hadoop/ConnectionRefused hints.
- Don't worry about httpfs.
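For the telnet check, try it on hadoop-vm first and then from the other VM; if the first works but the second does not, the NameNode is probably only listening on localhost. A minimal sketch of the checks, plus the standard core-site.xml property that should point at a hostname the other VM can resolve (hadoop-vm and port 9000 come from the question):
telnet localhost 9000    # on hadoop-vm itself
telnet hadoop-vm 9000    # from the other VM
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://hadoop-vm:9000</value>
</property>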

Pig ReadTimeOut Exception

I've installed the Hortonworks sandbox on VirtualBox (6092 MB of RAM).
I'm following this tutorial.
When I try to execute a simple script
Using arguments: -useHCatalog
Execute on Tez.
I got this error:
java.net.SocketTimeoutException: Read timed out
What can I do?
It sounds like HiveServer is not running. Can you open Ambari (browser - port 8080) and verify that it is running? Ambari can let you restart Hive if it is not.
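If you would rather check from a shell on the sandbox, a rough test (assuming the default HiveServer2 port 10000) is:
netstat -tlnp | grep 10000               # HiveServer2 listens on 10000 by default
beeline -u jdbc:hive2://localhost:10000  # should connect once Hive is up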

Hadoop on a single node vagrant VM - Connection refused when starting start-all.sh

I have created a Vagrant virtual machine and installed Hadoop on it.
It is only a single-server cluster.
But when I try to start Hadoop on the machine it gives the following error:
mkdir: Call From master/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Any idea? The machine is named master. The server runs Ubuntu.
Thanks!
This is because the HDFS daemons are not running. Go to:
cd $HADOOP_HOME/sbin
./start-all.sh
This will start all processes.
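Once the script finishes, you can verify it worked (9000 being the NameNode port from the error message):
jps                        # NameNode should appear in the list
netstat -tln | grep 9000   # the NameNode RPC port should be listening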

java.net.ConnectException: Connection refused error when running Hive

I'm trying to work through a Hive tutorial in which I enter the following:
load data local inpath '/usr/local/Cellar/hive/0.11.0/libexec/examples/files/kv1.txt' overwrite into table pokes;
This results in the following error:
FAILED: RuntimeException java.net.ConnectException: Call to localhost/127.0.0.1:9000 failed on connection exception: java.net.ConnectException: Connection refused
I see that there are some replies on SO having to do with configuring my IP address and localhost, but I'm not familiar with the concepts in the answers. I'd appreciate anything you can tell me about the fundamentals of what causes this kind of error and how to fix it. Thanks!
This is because Hive is not able to contact your NameNode.
Check whether your Hadoop services have started properly.
Run the command jps to see which services are running.
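On a healthy single-node setup the output typically looks something like this (the PIDs will differ); if NameNode is missing, that is the problem:
4821 NameNode
4935 DataNode
5102 SecondaryNameNode
5267 ResourceManager
5378 NodeManager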
The reason why you get this error is that Hive needs Hadoop as its base, so you need to start Hadoop first.
Here are some steps.
Step 1: download Hadoop and unzip it
Step 2: cd #your_hadoop_path
Step 3: ./bin/hadoop namenode -format
Step 4: ./sbin/start-all.sh
Then go back to #your_hive_path and start Hive again.
An easy way I found is to edit the /etc/hosts file. By default it looks like:
127.0.0.1 localhost
127.0.1.1 user_user_name
Just edit it and change 127.0.1.1 to 127.0.0.1, that's it; restart your shell and restart your cluster with start-all.sh.
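The file then ends up looking roughly like this (user_user_name being whatever your machine's hostname is):
127.0.0.1 localhost
127.0.0.1 user_user_name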
I had the same problem when setting up Hive.
I solved it by changing my /etc/hostname;
formerly it was my user_machine_name.
After I changed it to localhost, it went well.
I guess it is because Hadoop wants to resolve your hostname using this /etc/hostname file, but it directs it to your user_machine_name while the Hadoop service is running on localhost.
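A quick way to check and change it (hostnamectl assumed available on systemd-based systems; otherwise edit /etc/hostname directly):
hostname                                  # shows what Hadoop will try to resolve
sudo hostnamectl set-hostname localhost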
I was able to resolve the issue by executing the command below:
start-all.sh
This ensures that the required Hadoop services have started.
Then starting Hive was straightforward.
I had a similar problem with a connection timeout:
WARN DFSClient: Failed to connect to /10.165.0.27:50010 for block, add to deadNodes and continue. java.net.ConnectException: Connection timed out: no further information
DFSClient was resolving nodes by internal IP. Here's the solution for this:
.config("spark.hadoop.dfs.client.use.datanode.hostname", "true")
