Multi-Node Hadoop in Kubernetes

I have already installed minikube, the single-node Kubernetes cluster. I just want some help with how to deploy a multi-node Hadoop cluster inside this Kubernetes node. I need a starting point, please!

For clarification, do you want Hadoop to leverage k8s components to run jobs, or do you just want it to run as k8s pods?
Unfortunately, I could not find an example of Hadoop built to use Kubernetes as its scheduler. You can probably still run it in pods, similar to the Spark example.
Update: Spark now ships with better integration for Kubernetes. Information can be found here.
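As a concrete starting point, one way to run Hadoop as plain Kubernetes pods is to give the DataNodes stable network identities with a headless Service plus a StatefulSet. Below is a minimal, hedged sketch using the official Kubernetes Python client; the image name, ports, and object names are placeholder assumptions, not a published chart:

```python
from kubernetes import client, config

config.load_kube_config()  # talks to the minikube cluster via kubeconfig

IMAGE = "example/hadoop:3.3"  # placeholder: any image with HDFS daemons

core = client.CoreV1Api()
apps = client.AppsV1Api()

# Headless Service so each DataNode pod gets a stable DNS name.
core.create_namespaced_service(
    namespace="default",
    body=client.V1Service(
        metadata=client.V1ObjectMeta(name="datanode"),
        spec=client.V1ServiceSpec(
            cluster_ip="None",
            selector={"app": "datanode"},
            ports=[client.V1ServicePort(port=9866)],  # DataNode transfer port
        ),
    ),
)

# StatefulSet running three DataNode pods.
apps.create_namespaced_stateful_set(
    namespace="default",
    body=client.V1StatefulSet(
        metadata=client.V1ObjectMeta(name="datanode"),
        spec=client.V1StatefulSetSpec(
            service_name="datanode",
            replicas=3,
            selector=client.V1LabelSelector(match_labels={"app": "datanode"}),
            template=client.V1PodTemplateSpec(
                metadata=client.V1ObjectMeta(labels={"app": "datanode"}),
                spec=client.V1PodSpec(
                    containers=[client.V1Container(name="datanode", image=IMAGE)]
                ),
            ),
        ),
    ),
)
```

On minikube all of these pods still land on the single physical node, but the same objects carry over unchanged to a real multi-node cluster, which is what makes this a reasonable starting point.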

Related

How can I set up RStudio and sparklyr on an auto-scaling cluster managed by Slurm?

I have an AWS HPC auto-scaling cluster managed by Slurm, and I can submit jobs using sbatch. However, I want to use sparklyr on this cluster so that Slurm increases the cluster size based on the workload of the sparklyr code in the R script. Is this possible?
Hi Amir, is there a reason you are using Slurm here? sparklyr has better integration with Apache Spark, and it would be advisable to run it over a Spark cluster. You can follow this blog for the steps to set this up with Amazon EMR, which is a managed service for running Spark clusters on AWS: https://aws.amazon.com/blogs/big-data/running-sparklyr-rstudios-r-interface-to-spark-on-amazon-emr/
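If you do go the EMR route, the cluster itself can be launched programmatically before following the RStudio/sparklyr installation steps in that blog. A minimal sketch using boto3; the region, instance types, counts, and log bucket below are placeholder assumptions, not values from the blog:

```python
import boto3

# Hedged sketch: launch an EMR cluster with Spark installed.
# Region, instance types/counts, and the log bucket are placeholders.
emr = boto3.client("emr", region_name="us-east-1")

response = emr.run_job_flow(
    Name="sparklyr-cluster",
    ReleaseLabel="emr-5.8.0",
    Applications=[{"Name": "Spark"}],
    Instances={
        "MasterInstanceType": "m4.xlarge",
        "SlaveInstanceType": "m4.xlarge",
        "InstanceCount": 3,
        "KeepJobFlowAliveWhenNoSteps": True,
    },
    LogUri="s3://my-emr-logs/",  # hypothetical bucket
    JobFlowRole="EMR_EC2_DefaultRole",
    ServiceRole="EMR_DefaultRole",
)
print(response["JobFlowId"])
```

RStudio and sparklyr are then installed on the master node (typically via an EMR bootstrap action, as the blog describes), and Slurm drops out of the picture because YARN handles the scaling.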

Dask on Hadoop Kubernetes

I've installed Hadoop via a helm chart on my microk8s Kubernetes cluster.
I would like to know how to create a Dask cluster across the machines in this Hadoop cluster. I tried following the tutorials on the Dask website, but I keep getting errors because it is looking for a local YARN/Hadoop installation. How do I point Dask at the Hadoop running on Kubernetes so I can create the cluster?
If you want to launch Dask on YARN, we recommend using https://yarn.dask.org
However, if you are using Kubernetes already, you might consider https://kubernetes.dask.org, which is more commonly used today; a minimal sketch of that route follows.
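For the Kubernetes route, here is a minimal sketch assuming the classic dask-kubernetes KubeCluster API; worker-spec.yml is a hypothetical pod template naming the worker image and resources (the dask-kubernetes docs show a full example):

```python
from dask.distributed import Client
from dask_kubernetes import KubeCluster

# Hedged sketch using the classic dask-kubernetes API: workers run as pods
# created from a pod template, so nothing looks for a local YARN/Hadoop.
cluster = KubeCluster.from_yaml("worker-spec.yml")  # hypothetical template file
cluster.scale(4)          # request four worker pods

client = Client(cluster)  # connect a Dask client to the scheduler
print(client.scheduler_info())
```

Note that this bypasses YARN entirely: Kubernetes schedules the workers, which is why it is the simpler fit when Hadoop itself is already running as pods.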

How to provision a Hadoop ecosystem cluster with OpenShift?

We are searching for a viable way to provision a Hadoop ecosystem cluster with OpenShift (based on Docker). We are looking to build up a cluster using the services of the Hadoop ecosystem, i.e. HDFS, YARN, Spark, Hive, HBase, ZooKeeper, etc.
My team has been using Hortonworks HDP on on-premise hardware but will now switch to an OpenShift-based infrastructure. Hortonworks Cloudbreak does not seem to be suitable for OpenShift-based infrastructures. I have found this article that describes the integration of YARN into OpenShift, but it seems no further information is available.
What is the easiest way to provision a Hadoop ecosystem cluster on OpenShift? Manually adding all the services feels error-prone and hard to administer. I have stumbled upon the Docker images for these separate services, but they are not comparable to the automated provisioning you get with a platform like Hortonworks HDP. Any guidance is appreciated.
If you install OpenStack within OpenShift, Sahara allows provisioning of OpenStack Hadoop clusters.
Alternatively, Cloudbreak is Hortonworks' tool for provisioning container-based cloud deployments.
Both provide Ambari, giving you the same interface for cluster administration as HDP.
FWIW, I personally don't see the case for putting Hadoop in containers. Your DataNodes are locked to specific disks, and there's no improvement in running several smaller ResourceManagers on a single host. Plus, for YARN, you'd be running containers within containers. And for the NameNode, you must have a persistent, replicated FsImage + EditLog, because the container could be placed on any host.
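To make that last point concrete: if the NameNode does run in a container, its metadata directory has to outlive any individual pod. A minimal sketch with the official Kubernetes Python client that requests persistent storage for it; the namespace, claim name, and size are placeholder assumptions:

```python
from kubernetes import client, config

config.load_kube_config()  # use the local kubeconfig

# Hypothetical claim so NameNode metadata (FsImage + EditLog) survives the
# pod being rescheduled onto another host; name and size are placeholders.
pvc = client.V1PersistentVolumeClaim(
    metadata=client.V1ObjectMeta(name="namenode-metadata"),
    spec=client.V1PersistentVolumeClaimSpec(
        access_modes=["ReadWriteOnce"],
        resources=client.V1ResourceRequirements(requests={"storage": "10Gi"}),
    ),
)
client.CoreV1Api().create_namespaced_persistent_volume_claim(
    namespace="hadoop", body=pvc
)
```

The NameNode pod would then mount this claim at its metadata directory; that mitigates, but does not remove, the placement concerns above.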

Is it possible to start a multi-physical-node Hadoop cluster using Docker?

I've been searching for a way to start Docker containers on multiple physical machines and connect them into a Hadoop cluster; so far I have only found ways to start a cluster locally on one machine. Is there a way to do this?
You can certainly provision a multi-node Hadoop cluster with Docker.
Please look at the posts below, which will give you some insights on doing it; a minimal multi-host sketch follows the links:
http://blog.sequenceiq.com/blog/2014/06/19/multinode-hadoop-cluster-on-docker/
Run a hadoop cluster on docker containers
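The missing piece for spanning physical machines is a multi-host network, for example a Docker overlay network on a Swarm. A minimal sketch with the Docker SDK for Python; the image name is a placeholder, and the hosts are assumed to already be joined into a Swarm:

```python
import docker

client = docker.from_env()

# An attachable overlay network spans every host in the Swarm, so containers
# on different physical machines can resolve each other by name.
client.networks.create("hadoop-net", driver="overlay", attachable=True)

# Placeholder image: start one container per Hadoop role, on whichever host
# you like, all attached to the same overlay network.
client.containers.run(
    "example/hadoop:3.3",
    name="namenode",
    hostname="namenode",
    network="hadoop-net",
    detach=True,
)
```

Run the same snippet (with a different name/hostname) against the Docker daemon on each physical machine to add DataNodes to the cluster.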

Is there Docker orchestration for a Hadoop cluster?

I was looking at Rancher (an orchestration engine for Docker). I think there isn't built-in support for a Hadoop setup.
Take a look at the latest version of Rancher; it has a catalog function that includes Hadoop deployment out of the box.
This is in the latest 0.49 release of Rancher for sure.
One source of information would be "Docker Releases Orchestration Tool Kit", which mentions Docker Machine, Docker Swarm and, more importantly, Mesosphere, which is built on top of the Swarm API. Quoting the article:
Mesosphere’s technology is the only way for an organization to run a Docker Swarm workload in a highly elastic way on the same cluster as other types of workloads.
For example, you can run Cassandra, Kafka, Storm, Hadoop, and Docker Swarm workloads alongside each other on a single Mesosphere cluster, all sharing the same resources.
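If you do end up on a Mesosphere/Marathon stack, long-running services like Hadoop daemons are described as JSON app definitions posted to Marathon's REST API. A hedged sketch using Python's requests; the Marathon endpoint and image are placeholder assumptions:

```python
import requests

# Hypothetical Marathon endpoint and Hadoop image; a real deployment would
# also declare volumes, networking, and per-role configuration.
app = {
    "id": "/hadoop/datanode",
    "instances": 3,
    "cpus": 1.0,
    "mem": 2048,
    "container": {
        "type": "DOCKER",
        "docker": {"image": "example/hadoop:3.3"},
    },
}

resp = requests.post("http://marathon.example.com:8080/v2/apps", json=app)
resp.raise_for_status()
print(resp.json()["id"])  # Marathon echoes the created app definition
```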
