Trouble running Nimbus in Apache Storm 2.0.0 on Windows - apache-storm

In Apache Storm 2.0.0, when I run Nimbus, it stops after a while. The error message is:
java.lang.Error: java.lang.UnsatisfiedLinkError: C:\Users\AppData\Local\Temp\librocksdbjni4098681609019942941.dll: A dynamic link library (DLL) initialization routine failed.
In Storm 1.2.3 I did not have this trouble.

You most likely don't have the MSVC runtime installed that RocksDB needs on Windows. Please see https://github.com/facebook/rocksdb/wiki/RocksJava-Basics#maven. You probably need to go to https://www.microsoft.com/en-us/download/details.aspx?id=48145 and install the Visual C++ Redistributable offered there.
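One quick way to check whether that runtime is already present is to look for its DLLs in System32. This is a sketch only, and it assumes the DLL names of the 2015 runtime (msvcp140.dll, vcruntime140.dll), which are not mentioned in the original answer:

```shell
REM Windows cmd: search System32 for the VC++ 2015 runtime DLLs
where /R C:\Windows\System32 msvcp140.dll
where /R C:\Windows\System32 vcruntime140.dll
```

If neither command finds a file, installing the redistributable from the link above is the likely fix.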

You should run Apache Storm as an administrator user on Windows 10. I logged in to the system as administrator and the problem was solved.

Related

Is there a way to set up the latest version of Apache Storm under Windows 10?

I need to install Apache Storm as a single node cluster under Windows 10.
The latest instruction I found is:
http://ptgoetz.github.io/blog/2013/12/18/running-apache-storm-on-windows/
The latest archive of Storm does not contain any .cmd files, so it looks like Windows is not supported at all.
Is there a way to install version 2+ of Apache Storm under Windows?
Windows is supported, but you need to either use the PowerShell script (storm.ps1) or call the Storm Python script directly (storm.py).
Regarding how to install it, the instructions for Linux and Windows are the same (see this). The only major difference should be whether you use Windows or Linux style paths in the storm.yaml file, and what tool you use to run Storm under supervision (e.g. systemd or Windows services).
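To make the path point concrete, a minimal single-node storm.yaml on Windows might look like the sketch below. The hostnames and paths are placeholders, not taken from the question; note that backslashes must be escaped inside YAML double-quoted strings (forward slashes also work, since the JVM accepts them on Windows):

```yaml
# Minimal single-node storm.yaml sketch (placeholder values)
storm.zookeeper.servers:
  - "localhost"
nimbus.seeds: ["localhost"]
# Windows-style path; backslashes escaped inside double quotes
storm.local.dir: "C:\\storm\\storm-local"
```

You would then start the daemons from the Storm bin directory, e.g. `python storm.py nimbus` and `python storm.py supervisor`, or the storm.ps1 equivalents.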

How to Install Apache Ambari on MacOS?

Is there any alternative procedure other than installing from source as described in the link below:
https://cwiki.apache.org/confluence/display/AMBARI/Installation+Guide+for+Ambari+2.5.2
or
installing the HDP version of Ambari?
Is there any other way to install Ambari more easily, e.g. from binary files?
You can build and run Apache Ambari on OSX from source (via the doc link you provided).
There are no binary files that are pre-built and offered by Ambari for OSX.
Even if you get Ambari up and running on OSX, you will NOT be able to provision an HDP stack on OSX, because the services defined by the HDP stack only support Linux or Windows. Yum, a package-manager utility, is required to install the services (i.e. YARN, Spark, etc.) on Linux, and yum is not available on OSX.
You would have to write a custom stack for it to work on OSX. That would be a good deal of work.
That being said, there is nothing stopping you from installing services from source or as pre-packaged binaries from the various Apache repos on OSX. This will let you run Hadoop processes locally on OSX, but without the niceties Ambari provides for provisioning, managing, and monitoring.

How to reinstall a crashed Ambari

I installed Ambari and Hortonworks HDP on CentOS 7.
I had many problems with the Kerberos configuration, so I tried to reinstall Ambari.
I followed this link, but I always get errors in the "Install and Test" step.
So what is the best way to reinstall the Ambari server?
Errors during the "Install & Test" step are usually caused by conflicts with already-installed HDP packages.
Did you try to follow this link? http://web.archive.org/web/20170816163504/http://www.yourtechchick.com/hadoop/how-to-completely-remove-and-uninstall-hdp-components-hadoop-uninstall-on-linux-system/
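In case the link goes stale, the gist of such a cleanup on CentOS 7 is roughly the following. This is a sketch only: the exact package and directory names vary by Ambari/HDP version, so check what is actually installed before erasing anything.

```shell
# Stop the Ambari server and agent before removing anything
ambari-server stop
ambari-agent stop

# Erase the Ambari packages (yum is the CentOS package manager)
yum erase -y ambari-server ambari-agent

# Remove leftover state so a fresh install starts clean
rm -rf /var/lib/ambari-server /var/lib/ambari-agent \
       /etc/ambari-server /etc/ambari-agent
```

After the cleanup, a fresh `ambari-server setup` should no longer hit package conflicts during "Install & Test".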

Orange3-spark addon installed but no widgets inside

I want to install the Orange3-spark add-on.
I checked the requirements, which are Apache Spark, pandas, and Orange3.
Problem Description
I installed everything in the order below and added the Spark add-on (GUI approach). There were no error messages, but no widgets appear in the Spark ML section; it is empty.
Installation Steps
Installed Apache Spark 2.1.1 with Hadoop 2.7 on a Windows 10 machine.
Scala 2.11.8 (comes with Spark)
Checked Spark using (spark-shell) in command prompt
Installed Anaconda 4.4.0 Python 3.6 version
Verified that Pandas is installed within Anaconda
Installed Orange version 3.4.4
Installed Spark Addon (GUI way from orange)
Can you please instruct me on what to do?
I have never used Python before. I know what most of the above-mentioned components do, but this is the first time I have installed any of them, so please bear with me and be clear in your comments ;)
I had a similar problem with the network add-on in Orange3 and overcame it by doing a terminal install with root access via sudo instead of the canvas GUI install. Try a command-line install running as administrator via runas.
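For example, assuming the add-on is published on PyPI under the name Orange3-spark (an assumption; verify the exact package name first), a command-line install from an elevated prompt would look roughly like this:

```shell
# Windows: open the Anaconda Prompt as administrator, then:
pip install Orange3-spark

# Linux/macOS equivalent, using root access via sudo:
# sudo pip install Orange3-spark
```

Installing from an elevated prompt ensures the add-on lands in the same Python environment Orange runs from, which is a common reason GUI installs appear to succeed but show no widgets.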

Eclipse setup for Hadoop development

I am using Hadoop version 1.0.3.
I tried configuring Hadoop with Eclipse Indigo,
but it failed to start DFS and showed a "failed to login" error.
Please suggest what the problem might be with Map/Reduce.
Edit: I am using Windows 7, so I first installed Cygwin, then Hadoop 1.0.4, and successfully started the services at
http://127.0.0.1:50030/ and http://localhost:50070/.
Check whether you installed Hadoop properly. Try installing the Hadoop plugin, as described in install plugin. You can see a tutorial for this in the Hadoop video tutorial. Also, take a look at Eclipse setup for Hadoop development.
Eclipse Indigo is version 3.7, and per http://www.orzota.com/eclipse-setup-for-hadoop-development/: "Any eclipse before 3.6 is compatible to the eclipse plugin. (Doesn't work with Eclipse 3.7)"
