The software I used is as follows:
And I get the following error:
The system is CentOS 7 and the JDK is 1.7.0_07. Can anyone help me solve this problem?
You might be missing some of the libraries required to build Hadoop. Do you have all of these installed?
yum -y install wget gcc gcc-c++ autoconf automake libtool zlib-devel cmake openssl openssl-devel snappy snappy-devel bzip2 bzip2-devel protobuf protobuf-devel
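If you want to quickly check which of those build tools are already present, here is a rough sketch. It only covers tools that install a binary of the same name (note that the gcc-c++ package provides `g++`); the `-devel` library packages would need `rpm -q <package>` instead, and the `status_msg` variable is just for illustration:

```shell
# Rough check for the build tools from the yum line above.
# Only tools with a same-named binary are covered; -devel packages
# must be checked with `rpm -q <package>` instead.
missing=""
for tool in wget gcc g++ autoconf automake libtool cmake; do
  command -v "$tool" >/dev/null 2>&1 || missing="$missing $tool"
done
if [ -z "$missing" ]; then
  status_msg="all build tools found"
else
  status_msg="missing:$missing"
fi
echo "$status_msg"
```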
If missing libraries are not the cause, you may find this project useful. It is a bash script that builds Hadoop from scratch, so perhaps you'll spot a step you've missed:
https://github.com/hadoopfromscratch/hadoopfromscratch
I'm trying to compile Boost against the zlib package built into Ubuntu, but it cannot find it and reports `zlib : no` when I run `./b2`.
I definitely have it installed, as dpkg -L zlib1g-dev and dpkg -L zlib1g give me their locations.
I've tried manually passing in the directories as command-line options as suggested here, and I've also tried creating a user-config.jam file as suggested here, but nothing has worked.
Any ideas as to what might be the problem?
For unclear reasons, the solution was simply to delete everything in the Boost directory, then rebuild and reinstall Boost. After that it found zlib (and bzip2) without any problems.
Hello, I posted the same question yesterday, but my explanation was poor.
Apache Hadoop Common FAILURE while installing Hadoop
(That is my previous post.)
I want to install Hadoop on a Raspberry Pi running the Raspbian OS.
http://data.andyburgin.co.uk/post/157450047463/running-hue-on-a-raspberry-pi-hadoop-cluster
(This is the site I am following.)
Compile Hadoop
We now need to compile the Hadoop binaries: download and unpack the Hadoop 2.6.4 source, tweak pom.xml so it bypasses the documentation generation (which fails on the Pi), apply the HADOOP-9320 patch, and build the binary.
cd
apt-get install oracle-java8-jdk
wget http://apache.mirror.anlx.net/hadoop/common/hadoop-2.6.4/hadoop-2.6.4-src.tar.gz
tar -zxvf hadoop-2.6.4-src.tar.gz
vi hadoop-2.6.4-src/pom.xml
Disable the problematic step by adding the following inside <properties>…</properties>:
<additionalparam>-Xdoclint:none</additionalparam>
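For context, the property goes alongside the existing entries in the file's properties block, roughly like this (the comment is a placeholder for whatever properties the file already contains):

```xml
<properties>
  <!-- ...existing properties stay as they are... -->
  <additionalparam>-Xdoclint:none</additionalparam>
</properties>
```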
Next, apply the HADOOP-9320 patch:
cd hadoop-2.6.4-src/hadoop-common-project/hadoop-common/src
wget https://issues.apache.org/jira/secure/attachment/12570212/HADOOP-9320.patch
patch < HADOOP-9320.patch
cd ~/hadoop-2.6.4-src/
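If you have not used patch(1) before, here is a minimal self-contained illustration of what applying a unified diff does; the file names below are made up and have nothing to do with the Hadoop tree (in the Hadoop step above, `patch < HADOOP-9320.patch` reads the target file name from the diff headers instead):

```shell
# Create a scratch file, a modified copy, and a unified diff between them,
# then apply the diff back to the original with patch(1).
workdir=$(mktemp -d)
cd "$workdir"
printf 'line one\nline two\n' > sample.txt
printf 'line one\nline TWO\n' > sample.new
diff -u sample.txt sample.new > sample.patch || true  # diff exits 1 when files differ
patch sample.txt < sample.patch                       # sample.txt now matches sample.new
tail -n1 sample.txt
```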
Next, install a whole bunch of build tools and libraries:
apt-get install maven build-essential autoconf automake libtool cmake zlib1g-dev pkg-config libssl-dev libfuse-dev libsnappy-dev libsnappy-java libbz2-dev
(I have completed everything up to this line.)
But the error occurs at the next step: `sudo mvn package -Pdist,native -DskipTests -Dtar`
How do I fix this error? Please help, thank you. :)
When I try to build the Apache Thrift source in Cygwin, I get an error saying "Couldn't find libtoolize!". How can I install libtoolize in Cygwin?
You will need GNU M4 1.4.6 or later to install LibTool (which includes libtoolize).
The good news is that you can do this easily by re-running the Cygwin installer (no worries, it keeps your existing Cygwin installation and adds any new packages you select). So all you need to do is select these two packages:
GNU M4
Libtool
libtoolize is part of libtool. You can download the latest version of libtool from http://ftp.gnu.org/gnu/libtool/, extract it, then run ./configure and make install from a Cygwin terminal.
Note that pip will not help here: libtoolize is not a Python package, so a command like `python -m pip install libtoolize` will fail. Install the Cygwin libtool package instead, which provides libtoolize.
I am new to Unix-like operating systems.
After installing Hadoop as per the instructions below,
http://wiki.apache.org/hadoop/Running_Hadoop_On_OS_X_10.5_64-bit_(Single-Node_Cluster)
I am trying to build the examples as given in the same URL using
ant examples
This fails with the error below:
compile-mapred-classes:
Trying to override old definition of task jsp-compile
[javac] /Users/hadoop/hadoop-1.2.1/build.xml:549: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds
create-native-configure:
BUILD FAILED
/Users/hadoop/hadoop-1.2.1/build.xml:634: Execute failed: java.io.IOException: Cannot run program "autoreconf" (in directory "/Users/hadoop/hadoop-1.2.1/src/native"): error=2, No such file or directory
What exactly do I need on my Mac to get past this?
As the error says, you need autoreconf, which ships with autoconf. The easiest way to install it is through Homebrew: brew install autoconf (after installing Homebrew itself).
Incidentally, Hadoop itself can also be installed through Homebrew.
I had the same issue, and was able to resolve it by installing automake, autoconf and libtool:
brew install automake autoconf libtool
Running brew unlink {formula} && brew link {formula} on each of automake, autoconf and libtool did the trick for me.
Can I install gcc-c++ on CentOS 6.x without `yum install gcc-c++`?
Is there a .tar or .rpm package available for download?
Yum installs RPMs from its repositories, so I don't understand why you want to avoid it: it resolves dependencies and installs those as well.
However, here is an official RPM repository mirror (one of many):
http://centos.arminco.com/5/os/i386/CentOS/
Here is a list of all mirrors: http://www.centos.org/modules/tinycontent/index.php?id=30
You will need at least 3 RPMs:
gcc-4.4.6-3.el6.i686.rpm
gcc-c++-4.4.6-3.el6.i686.rpm
libgcc-4.4.6-3.el6.i686.rpm
For compiling C/C++ you will also need libstdc++, glibc, and so on.
When you run
yum install gcc
everything is done for you.
As you did not specify an architecture I assume i386, but the URL is very similar for x86_64:
http://centos.arminco.com/6/os/x86_64/Packages/
If you want to install it as a local user (or as a superuser), GNU GSRC provides an easy way to do so.
Link: http://www.gnu.org/software/gsrc/
After checking it out via bzr, simply run:
./bootstrap
./configure --prefix=~/local
make -C gnu/gcc
(or make -C gnu/gcc MAKE_ARGS_PARALLEL="-jN" to speed up the build on an N-core system)
make -C gnu/gcc install