Is there any alternative to installing Ambari from source as described in the link below?
https://cwiki.apache.org/confluence/display/AMBARI/Installation+Guide+for+Ambari+2.5.2
or
installing the HDP version of Ambari?
Is there any easier way to install Ambari, e.g. from pre-built binary files?
You can build and run Apache Ambari on OSX from source (via the doc link you provided).
There are no pre-built Ambari binaries offered for OSX.
Even if you get Ambari up and running on OSX, you will NOT be able to provision an HDP stack on OSX, because the services defined by the HDP stack only support Linux or Windows. Yum, a package manager utility, is required to install the services (e.g. YARN, Spark, etc.) on Linux, and Yum is not available on OSX.
You would have to write a custom stack for it to work on OSX. That would be a good deal of work.
That being said, there is nothing stopping you from installing the services from source, or from pre-packaged binaries in the various Apache repos, on OSX. This will let you run Hadoop processes locally on OSX, but without the niceties Ambari provides for provisioning, managing, and monitoring.
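For reference, a minimal sketch of the source build described in the linked wiki page might look like the following. This assumes a JDK, Maven 3, and Python are already installed; the exact Maven flags follow the Ambari build documentation and may need adjusting for your version:

```shell
# Hedged sketch: building Apache Ambari 2.5.2 from the source tarball.
# Assumes a JDK, Maven 3, and Python are installed; the flags follow the
# Ambari build docs and may differ for other releases.
wget https://archive.apache.org/dist/ambari/ambari-2.5.2/apache-ambari-2.5.2-src.tar.gz
tar xzf apache-ambari-2.5.2-src.tar.gz
cd apache-ambari-2.5.2-src
mvn versions:set -DnewVersion=2.5.2.0.0
mvn -B clean install package -DskipTests -Dpython.ver="python >= 2.6"
```

The resulting server and agent packages then still have to be installed and started by hand, which is why there is no one-step binary install for OSX.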
I need to install Apache Storm as a single node cluster under Windows 10.
The latest instruction I found is:
http://ptgoetz.github.io/blog/2013/12/18/running-apache-storm-on-windows/
In the latest archive, Storm does not contain any .cmd files, so it looks like Windows is not supported at all.
Is there a way to install Storm 2.x under Windows?
Windows is supported, but you need to either use the Powershell script (storm.ps1), or just call the Storm Python script directly (storm.py).
Regarding how to install it, the instructions for Linux and Windows are the same (see this). The only major difference should be whether you use Windows or Linux style paths in the storm.yaml file, and what tool you use to run Storm under supervision (e.g. systemd or Windows services).
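As a sketch, launching the daemons without the old .cmd wrappers could look like the following (paths are illustrative; this assumes Python is on the PATH and you run from the Storm installation directory):

```shell
# Sketch: starting Storm daemons on Windows without .cmd wrappers.
# Paths are illustrative; run from the Storm installation directory.
python bin\storm.py nimbus
python bin\storm.py supervisor
python bin\storm.py ui

# Alternatively, via the PowerShell wrapper:
powershell -ExecutionPolicy Bypass -File bin\storm.ps1 nimbus
```

In production you would wrap each of these in a Windows service (or another supervisor) rather than running them in a foreground console.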
I am new to HDP installation using Ambari. I want to install Hadoop 2.9.0 using Ambari web installation. My Ambari version is 2.7.0.0 and I am using HDP 3.0 which has Hadoop 3.1.0. But I need to install Hadoop 2.9.0. Can someone please let me know if this can be done? And how can this be achieved?
I have not started the cluster installation yet; I have only finished installing Ambari itself.
Ambari uses pre-defined software stacks.
HDP does not offer any stack with Hadoop 2.9.0.
You would therefore need to install that version of Hadoop manually yourself, although you can still manage the servers (but not the Hadoop configuration) using Ambari.
In any case, there is little benefit to installing a lower version of the software, and you won't get Hortonworks support if you do.
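If you do go the manual route, a rough single-node sketch (outside Ambari) might be:

```shell
# Hedged sketch: manual, non-Ambari install of Hadoop 2.9.0 on one node.
# The URL follows Apache's standard archive layout for old releases;
# /opt is an illustrative install location.
wget https://archive.apache.org/dist/hadoop/common/hadoop-2.9.0/hadoop-2.9.0.tar.gz
tar xzf hadoop-2.9.0.tar.gz -C /opt
export HADOOP_HOME=/opt/hadoop-2.9.0
export PATH="$HADOOP_HOME/bin:$PATH"
hadoop version
```

You would then configure core-site.xml, hdfs-site.xml, etc. by hand on every node, since Ambari will not manage this out-of-stack installation.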
I want to install the Orange3-spark add-on.
I checked the requirements: Apache Spark, Pandas, and Orange3.
Problem Description
I installed everything in the order below and added the Spark add-on via the GUI. There were no error messages, but no widgets are shown in the Spark ML section; it is empty.
Installation Steps
Installed Apache Spark 2.1.1 with Hadoop 2.7 on a Windows 10 machine.
Scala 2.11.8 (comes with Spark)
Checked Spark using (spark-shell) in command prompt
Installed Anaconda 4.4.0 Python 3.6 version
Verified that Pandas is installed within Anaconda
Installed Orange version 3.4.4
Installed the Spark add-on (GUI way, from within Orange)
Can you please instruct me on what to do?
I have never used Python before. I know what most of the above components do; however, this is the first time I have installed any of them, so please bear with me and be clear in your comments ;))
I had a similar problem with the network add-on in Orange3 and overcame it by installing from the terminal with root access (sudo) instead of the canvas GUI install. Try a command-line install running as administrator, e.g. via runas on Windows.
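Concretely, a command-line install of the add-on could look like the following. This assumes you use the pip belonging to the same Python environment Orange runs in (e.g. the Anaconda prompt), otherwise Orange will not see the new widgets:

```shell
# Sketch: installing the add-on from an elevated shell instead of the
# canvas GUI. Use the pip from Orange's own Python environment.

# Linux/macOS:
sudo pip install Orange3-spark

# Windows (from an Administrator command prompt, or launched via runas):
pip install Orange3-spark
```

Then restart Orange so that the newly installed widgets are discovered.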
I have an installed CDH cluster and ran hadoop version, but it returns only the Hadoop version. Is there any way to get the version numbers of all installed components, ideally on a graphical interface? And which command gets, for example, the Spark version number?
Open CM (hostname:portnumber) -> Hosts tab -> Host Inspector to find which versions of CM and CDH are installed across all hosts in the cluster, as well as a list of installed CDH components with version details.
The Spark version can be checked using:
spark-submit --version
Spark was developed separately from Hadoop HDFS and Hadoop MapReduce as a standalone tool that can be used alongside Hadoop; as such, most of its interfaces are different from Hadoop's.
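If you want just the bare number for scripting, the banner printed by spark-submit --version (which goes to stderr in Spark 2.x) can be filtered, e.g.:

```shell
# Sketch: extract only the version number from spark-submit's banner.
# Assumes the banner contains a line with "version x.y.z" (true for 2.x).
SPARK_VERSION=$(spark-submit --version 2>&1 | grep -o 'version [0-9.]*' | head -1 | cut -d' ' -f2)
echo "$SPARK_VERSION"
```

The same grep/cut pattern works on the output of other component CLIs (e.g. hadoop version) if you need several versions in one script.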
I am having problems while trying to install Hadoop on 32-bit Windows.
I have followed the Yahoo tutorial on installing Hadoop:
http://developer.yahoo.com/hadoop/tutorial/
and have successfully downloaded VMware Player and the Hadoop virtual machine image,
but I am not able to configure the Eclipse plug-in.
Using the jar I am able to write a basic MapReduce program in Eclipse. Now, after writing it, how do I execute it on Hadoop? With the Hadoop VM image I cannot find the bin directory.
Hortonworks has released a Windows version of Hadoop by replacing the shell script wrappers with Windows PowerShell scripts. You can install Hadoop directly on your Windows machine.
Use the following link to download the Windows version:
http://s3.amazonaws.com/public-repo-1.hortonworks.com/HDP-Win/1.1/Beta/HDP-WIN-1.1.0_BETA-hadoop.zip