Failed dependencies when installing pxf service - hawq

When I install the pxf service rpm in hawq, I get these errors:
error: Failed dependencies:
hadoop >= 2.6.0 is needed by pxf-service-0:3.0.0-root.noarch
hadoop-hdfs >= 2.6.0 is needed by pxf-service-0:3.0.0-root.noarch
What's your advice here?

Please make sure the PXF rpm matches your OS architecture and version. For example, if the PXF rpm is built for RHEL 6 and you are installing on RHEL 7, then you may see dependency issues.
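A quick way to compare the two is to inspect the package and the host side by side (a sketch; the rpm filename is inferred from the error message above):
# show the package metadata, including the OS/arch it was built for
rpm -qpi pxf-service-3.0.0-root.noarch.rpm
# show what the host itself is running
cat /etc/redhat-release
uname -m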

Could you also check the version of Hadoop you are running in the cluster? I suspect you might be running an older version. You need at least Hadoop 2.6 to run the current version of PXF.
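For example, you can check the running Hadoop version and, if Hadoop was installed from the Bigtop rpms, the package versions PXF depends on:
hadoop version
rpm -q hadoop hadoop-hdfs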

The wiki here uses the Bigtop rpm packages for Hadoop:
https://cwiki.apache.org/confluence/display/HAWQ/Build+Package+and+Install+with+RPM
That means if I install with rpm (HAWQ 2.2.0), the other ways (using a binary Hadoop without rpm, e.g. installed from a tarball) are not supported.
So if I install Hadoop from a tarball, I must build HAWQ from source code for now.
Please refer to:
https://issues.apache.org/jira/browse/HAWQ-1568
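This also explains the dependency failure above: rpm only consults its own database, so a tarball-installed Hadoop can never satisfy the pxf-service dependency. A quick sketch of how to check whether any installed rpm provides it:
rpm -q --whatprovides hadoop hadoop-hdfs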

Related

Maven version error while trying to install ambari on centos

I am trying to install Ambari 2.7.5 on CentOS 7, following the instructions given on this link. When I run the build command for Maven I get the following error:
I have installed Maven 3.6.3, and I could not find Maven 3.1.0 in the Apache downloads. So how do I resolve this error?
@VK, for ambari-web you have to edit some files and set your versions. If you are having a hard time with the npm/node versions, make sure they match the versions I posted. On CentOS 7 I am using the EPEL repository. It sounds like you already have a current Maven, so you are good to go there.
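For reference, a minimal sketch of setting up and checking that toolchain on CentOS 7 (the package names are the stock EPEL ones and may differ in your environment):
# enable EPEL and install node/npm for the ambari-web build
sudo yum install -y epel-release
sudo yum install -y nodejs npm
# verify the versions before building
node --version
npm --version
mvn --version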
Also this post is pretty much a duplicate of:
Building Ambari 2.7.5 on CentOS 7 from source, Worked 2 weeks ago, now fails
It's also discussed here:
Ambari 2.7.5 installation failure on CentOS 7
And a jira with another bower file sample:
https://issues.apache.org/jira/browse/AMBARI-25519

Hadoop installation status

I'm running Debian and I'm new to Hadoop. Some time back I tried to install Hadoop, and I'm not sure whether the installation succeeded. When I enter this command at the terminal:
hadoop version
I see the output:
Hadoop 2.7.1
Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r 15ecc87ccf4a0228f35af08fc56de536e6ce657a
Compiled by jenkins on 2015-06-29T06:04Z
Compiled with protoc 2.5.0
From source with checksum fc0a1a23fc1868e4d5ee7fa2b28a58a
This command was run using /home/xxxxxxx/java/hadoop-2.7.1/share/hadoop/common/hadoop-common-2.7.1.jar
Is Hadoop installed properly? If not, what other tests do I have to do? If yes, is there some simple "get started" tutorial/exercise you're aware of that can help me get started?
Thank you!
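The output above already shows a working client installation. A common next test (a sketch, assuming the tarball layout from your hadoop version output) is to run one of the example jobs that ship with Hadoop:
cd /home/xxxxxxx/java/hadoop-2.7.1
bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.1.jar pi 2 10
If that prints an estimate of pi, the basic runtime works; the official single-node setup guide is a good "get started" follow-up.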

Install Spark 1.5 in existing Hortonworks HDP Cluster

I'm new to Hadoop and want to find out how to install Spark 1.5.1 on the existing Hadoop cluster: 4 nodes, Ubuntu 14.04, Hadoop 2.3.2, Ambari version 2.1.2.1. I followed the tutorial, but the Spark packages there are for Ubuntu 12, and I cannot install them on our system, so I got stuck after step 1:
sudo apt-get install spark_2_3_2_1_12-master -y
Got an error:
Reading package lists... Done
Building dependency tree
Reading state information... Done
E: Unable to locate package spark_2_3_2_1_12-master
Can anyone provide us with some guidance on how to install 1.5?
Currently we have Spark 1.4 installed, up, and running, but due to required functionality we need 1.5!
Ubuntu 14.04 Trusty Tahr is not officially supported by HDP. If you look at the repos available for stack updates (the HDP stack public repos), they only have ones up for CentOS, Red Hat, and Oracle Linux. Did you try using Spark's Simple Build Tool to build the Spark 1.5 source against your Hadoop install? You would need to set SPARK_HADOOP_HOME to your Hadoop location. See this for a step-by-step guide with Ubuntu 14.04 and an earlier version of Spark. I don't see why the same steps would fail with Spark 1.5.
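For what it's worth, a rough sketch of such a build with the sbt wrapper bundled in the Spark 1.5 source tree (the hadoop-2.6 profile, Hadoop version, and HDP path are assumptions; pick the ones matching your cluster):
# run from the extracted spark-1.5.1 source directory
export SPARK_HADOOP_HOME=/usr/hdp/current/hadoop-client   # your hadoop location, per the note above
build/sbt -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 assembly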

How to determine the cloudera minor release in the one-click install debian package? (i.e., 5.1? 5.2?)

I've managed to get a Cloudera single-node Hadoop cluster up and running from this package: http://archive.cloudera.com/cdh5/one-click-install/precise/amd64/cdh5-repository_1.0_all.deb
My colleagues asked me what minor release of Cloudera this is installing, and I am stumped as to which. Is there some info, readme, config, or license file that gives this information for Cloudera Hadoop distros once they are installed?
Or maybe someone just knows which minor release the above URL will install (if you could provide that info along with a link to a documentation source, that would be fantastic).
Thanks in advance
-chris
The one-click install repo currently points to the latest Cloudera version, which is 5.3.0 as of earlier this week.
To check the version you installed, just list the installed packages. There should be a version number like '5.2.x' appended to each package name. An example command:
dpkg -l | grep 'cloudera'
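If you want the version of one specific package rather than the whole list, dpkg can print its Version field directly (hadoop-hdfs-namenode is just an example name; substitute whichever CDH package you installed):
dpkg -s hadoop-hdfs-namenode | grep '^Version'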

Ambari install script location(s)

I'm setting up an HDP 2.1 cluster with Apache Ambari. All servers run SLES 11 SP3. The setup fails if I select to install Ganglia, because of a dependency conflict:
Installing package apache2?mod_php* ('/usr/bin/zypper --quiet install --auto-agree-with-licenses --no-confirm apache2?mod_php*')
Problem: apache2-mod_php53-5.3.17-0.27.1.x86_64 conflicts with apache2-mod_php5 provided by apache2-mod_php5-5.2.14-0.7.30.50.1.x86_64
Solution 1: Following actions will be done:
do not install apache2-mod_php5-5.2.14-0.7.30.50.1.x86_64
deinstallation of php5-5.2.14-0.7.30.50.1.x86_64
deinstallation of php5-xmlwriter-5.2.14-0.7.30.50.1.x86_64
[... more PHP 5.2.x packages ...]
Solution 2: do not install apache2-mod_php53-5.3.17-0.27.1.x86_64
Apparently the regex picks the 5.3 version, although a 5.2 version would also be available.
So my question is: where is the install script that Ambari runs here stored? I would like to replace the regex with the correct version of the package.
Information about what packages are to be installed is stored in
/var/lib/ambari-server/resources/stacks/HDP/2.0.6/services/GANGLIA/metainfo.xml
Change the value and restart the Ambari Server for the changes to take effect.
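A short sketch of that workflow (the grep pattern is only there to locate the entry):
# find the offending package pattern in the service definition
grep -n 'apache2' /var/lib/ambari-server/resources/stacks/HDP/2.0.6/services/GANGLIA/metainfo.xml
# after editing the package name, restart so Ambari picks up the change
ambari-server restart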
