What are the steps to install HDP 2.0 through Ambari? I have tried the steps described in the Hortonworks documentation, but the installation is not successful.
Ambari supports RHEL/CentOS 6 and 7, SLES 11 SP3, Ubuntu 12 and 14, and Debian 7.
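For reference, here is a minimal sketch of the usual Ambari bootstrap on a CentOS node; the repo URL and version below are examples, so substitute the ones for your OS and Ambari release from the Hortonworks documentation:
wget -O /etc/yum.repos.d/ambari.repo http://public-repo-1.hortonworks.com/ambari/centos6/2.x/updates/2.1.2/ambari.repo
yum install ambari-server      # pulls in the Ambari server and its dependencies
ambari-server setup -s         # -s accepts the defaults (embedded PostgreSQL, default JDK)
ambari-server start
# then browse to http://<ambari-host>:8080 (default login admin/admin) and run the Cluster Install Wizard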
Related
I am installing CentOS to learn Hadoop, but I want to know which Base Environment I should select during the CentOS installation.
https://www.golinuxhub.com/2014/11/step-by-step-centos-7-64-bit.html
Have you installed it yet? If not, try downloading a CentOS image directly and spinning up a VM.
If you are using VirtualBox, try the images below:
https://www.osboxes.org/centos/
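For example, to fetch a CentOS 7 minimal ISO directly and check its integrity (the mirror path and ISO name are illustrative; older point releases move to vault.centos.org):
curl -O http://mirror.centos.org/centos/7/isos/x86_64/CentOS-7-x86_64-Minimal-2009.iso
sha256sum CentOS-7-x86_64-Minimal-2009.iso   # compare against the sha256sum.txt published next to the ISO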
I am new to CentOS 7, and I am configuring a Hadoop 2.7.1 cluster, so I need to install OpenJDK as a prerequisite. I installed it with the command
yum install java-1.7.0-openjdk
and the output of the java -version command is
java version "1.7.0_131"
OpenJDK Runtime Environment (rhel-2.6.9.0.el7_3-x86_64 u131-b00)
OpenJDK 64-Bit Server VM (build 24.131-b00, mixed mode)
My problem is that I want to use the jps command, which is found in
java-1.7.0-openjdk-1.7.0.101-2.6.6.1.el7_2.x86_64
so I want to install that RPM, and I used the commands
cd /usr/lib/jvm
rpm -ivh --nodeps ftp://mirror.switch.ch/pool/4/mirror/scientificlinux/7.0/x86_64/updates/security/java-1.7.0-openjdk-1.7.0.101-2.6.6.1.el7_2.x86_64.rpm
But because a newer version of the JDK is already installed, I wasn't able to install this RPM; it failed with the error
package java-1.7.0-openjdk-1:1.7.0.131-2.6.9.0.el7_3.x86_64 (which is newer than java-1.7.0-openjdk-1:1.7.0.101-2.6.6.1.el7_2.x86_64) is already installed
I don't know if I am going about making the jps command work the right way. What should I do to get the jps command? And is it right to install an older release of OpenJDK (update 101) when a newer one (update 131) already exists?
The java-1.7.0-openjdk package contains only the JRE; jps is part of the OpenJDK development (devel) package.
Try
yum install java-1.7.0-openjdk-devel
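On CentOS 7 that looks like the following; the exact version yum resolves depends on your mirrors, and there is no need to pin the older update 101 build:
yum install -y java-1.7.0-openjdk-devel   # the devel package carries the JDK tools, including jps
which jps                                 # typically resolves to /usr/bin/jps via alternatives
jps -m                                    # should now list running JVM processes such as NameNode and DataNode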
I am trying to install MarkLogic-RHEL6-8.0-5.x86_64.rpm on CentOS 7 and am getting this error:
[root@localhost marklogic]# rpm -i MarkLogic-RHEL6-8.0-5.x86_64.rpm
error: Failed dependencies:
libsasl2.so.2()(64bit) is needed by MarkLogic-8.0-5.x86_64
libc.so.6(GLIBC_2.11) is needed by MarkLogic-8.0-5.x86_64
I could not find a way to resolve this using yum or by any other means.
The OS version is:
[root@localhost marklogic]# cat /etc/*elease
CentOS Linux release 7.2.1511 (Core)
NAME="CentOS Linux"
VERSION="7 (Core)"
ID="centos"
ID_LIKE="rhel fedora"
VERSION_ID="7"
PRETTY_NAME="CentOS Linux 7 (Core)"
ANSI_COLOR="0;31"
CPE_NAME="cpe:/o:centos:centos:7"
HOME_URL="https://www.centos.org/"
BUG_REPORT_URL="https://bugs.centos.org/"
CENTOS_MANTISBT_PROJECT="CentOS-7"
CENTOS_MANTISBT_PROJECT_VERSION="7"
REDHAT_SUPPORT_PRODUCT="centos"
REDHAT_SUPPORT_PRODUCT_VERSION="7"
CentOS Linux release 7.2.1511 (Core)
CentOS Linux release 7.2.1511 (Core)
Thanks in advance - help would be appreciated.
You used the installer for Red Hat 6 / CentOS 6. Try the one for Red Hat Enterprise Linux, Version 7: http://developer.marklogic.com/products
HTH!
RHEL 7 uses the newer libsasl2.so.3; however, MarkLogic requires libsasl2.so.2, and unfortunately there is no symlink to libsasl2.so.2 by default.
For MarkLogic 8 on RHEL 7.x and CentOS 7.x, you need to manually create a symbolic link in /usr/lib64:
/usr/lib64/libsasl2.so.2 --> /usr/lib64/[your SASL version; mine is libsasl2.so.3.0.0]
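A minimal sketch of that workaround on the command line; the .so.3.0.0 filename is an example, so use whatever ls reports on your box, and note that rpm checks its provides database rather than the filesystem, hence the --nodeps:
ls /usr/lib64/libsasl2.so.*                                        # find the installed SASL library
sudo ln -s /usr/lib64/libsasl2.so.3.0.0 /usr/lib64/libsasl2.so.2   # compatibility symlink MarkLogic expects
sudo rpm -i --nodeps MarkLogic-RHEL6-8.0-5.x86_64.rpm              # retry the install, skipping the dependency check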
I'm new to Hadoop and want to find out how to install Spark 1.5.1 on an existing Hadoop cluster: 4 nodes, Ubuntu 14.04, Hadoop 2.3.2, Ambari version 2.1.2.1. I followed a tutorial, but the Spark packages there are for Ubuntu 12, and I cannot install them on our system, so I got stuck after step 1:
sudo apt-get install spark_2_3_2_1_12-master -y
Got an error:
Reading package lists... Done
Building dependency tree
Reading state information... Done
E: Unable to locate package spark_2_3_2_1_12-master
Can anyone provide us with some guidance on how to install 1.5?
Currently we have Spark 1.4 installed, up, and running, but we need 1.5 for the functionality it provides!
Ubuntu 14.04 Trusty Tahr is not officially supported by HDP. If you look at the HDP stack public repos available for stack updates, they only have repos for CentOS, Red Hat, and Oracle Linux. Did you try using Spark's Simple Build Tool (sbt) to build the Spark 1.5 source against your Hadoop install? You would need to set SPARK_HADOOP_HOME to your Hadoop location. See this step-by-step guide for Ubuntu 14.04 and an earlier version of Spark; I don't see why the same steps would fail with Spark 1.5.
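As a rough sketch of that build path (the download URL, build profiles, and Hadoop path below are assumptions; match them to your cluster):
wget http://archive.apache.org/dist/spark/spark-1.5.1/spark-1.5.1.tgz
tar -xzf spark-1.5.1.tgz && cd spark-1.5.1
export SPARK_HADOOP_HOME=/usr/local/hadoop                       # point at your existing Hadoop install
build/sbt -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 assembly    # choose the profile closest to your Hadoop version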
I need to upgrade Hadoop from CDH4 to CDH5. I have 5 nodes.
Can I upgrade using Cloudera Manager with parcels?
What is the easiest way to upgrade Hadoop? Can someone provide me with the steps?
Thanks,
Raj Baba
In the Cloudera Hadoop distribution, Cloudera Manager (CM) and CDH are separate components. To upgrade from CDH4 to CDH5, you need to upgrade your Cloudera Manager to CM5 first. You cannot use parcels to upgrade Cloudera Manager itself, as it is the base component. Depending on your Linux distribution, package tools (yum/apt) can be used to upgrade CM.
If you are using CentOS or RHEL, you can use yum to upgrade CM. Update your /etc/yum.repos.d/cloudera-manager.repo file as follows:
[cloudera-manager]
# Packages for Cloudera Manager, Version 5, on RedHat or CentOS 6 x86_64
name=Cloudera Manager
baseurl=http://archive-primary.cloudera.com/cm5/redhat/6/x86_64/cm/5.2.0/
gpgkey = http://archive.cloudera.com/cm5/redhat/6/x86_64/cm/RPM-GPG-KEY-cloudera
gpgcheck = 0
After updating this file, you can run yum upgrade cloudera-manager-* to perform the upgrade.
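A condensed sketch of the package-level steps (run the agent commands on every host; the full procedure is in the Cloudera upgrade guide):
sudo service cloudera-scm-server stop    # on the Cloudera Manager host
sudo service cloudera-scm-agent stop     # on each cluster host
sudo yum clean all
sudo yum upgrade 'cloudera-manager-*'    # upgrades the server, agent, and daemons packages
sudo service cloudera-scm-server start
sudo service cloudera-scm-agent start    # again on each host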
Once CM is at version 5, parcels are the best option for upgrading CDH4 to CDH5. The following Cloudera documentation can be used for the upgrade:
http://www.cloudera.com/content/cloudera/en/documentation/core/latest/topics/cm_mc_upgrade_tocdh5_using_parcels.html