I have a cluster of 100 nodes, all currently running CentOS. For some reason (however bad it may be :<), I need to install Windows 8.1 on all of them.
My plan is to install Windows 8.1 on one node and set up the software it needs.
Can I then make a ghost image of that node and distribute it to the other nodes? Has anyone had a similar experience?
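For concreteness, the distribution step I have in mind is roughly the sketch below, assuming each node can boot (e.g. over PXE) into a minimal Linux environment with Python and pull the captured image over HTTP; the server name, image path, and target disk are all made up:

    import shutil
    import urllib.request

    # Hypothetical names: the image server, image file, and target disk
    # depend entirely on your environment.
    IMAGE_URL = "http://deploy-server/images/win81-golden.img"
    TARGET_DISK = "/dev/sda"  # writing here DESTROYS all data on the disk

    # Stream the golden image straight onto the node's boot disk in
    # 4 MiB chunks, so the full image never has to fit in memory.
    with urllib.request.urlopen(IMAGE_URL) as src, open(TARGET_DISK, "wb") as dst:
        shutil.copyfileobj(src, dst, length=4 * 1024 * 1024)

(Imaging tools like Clonezilla do essentially this, plus partition-aware copying and compression, so in practice a ready-made tool is probably the safer choice.)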
thanks :D
Related
I'm trying to create my own Hadoop cluster. All of my data nodes run Ubuntu 18 and the name node runs Ubuntu 14.
Is it mandatory for the name node and the data nodes to run the same OS version?
It is recommended to run at least the same major OS version across the cluster, to avoid kernel-level incompatibilities. If you run into these low-level issues, they are very difficult to debug.
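A quick way to see what each host is actually running is to query them over SSH; a minimal sketch, assuming passwordless SSH and made-up hostnames:

    import subprocess

    # Hypothetical node names; substitute your own.
    HOSTS = ["namenode", "datanode1", "datanode2"]

    for host in HOSTS:
        # PRETTY_NAME is e.g. 'Ubuntu 18.04.6 LTS'.
        result = subprocess.run(
            ["ssh", host, "grep", "^PRETTY_NAME=", "/etc/os-release"],
            capture_output=True, text=True,
        )
        print(host, result.stdout.strip())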
As @piyush-p said, it's not recommended, but as long as you are running the same Java version across all the hosts you should be okay. You probably won't want to
do this if you are using a commercial distribution of Hadoop (HDP, Cloudera), as their
respective setup tools (Ambari, Cloudera Manager) will probably disallow it.
See "HDP Support for mix of OS Releases within a cluster" for more details.
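To check the Java side across the cluster, something along these lines (same assumptions: passwordless SSH, hypothetical hostnames) will flag mismatches:

    import subprocess

    # Hypothetical node names; substitute your own.
    HOSTS = ["namenode", "datanode1", "datanode2"]

    versions = {}
    for host in HOSTS:
        # Note: 'java -version' writes to stderr, not stdout.
        result = subprocess.run(
            ["ssh", host, "java", "-version"],
            capture_output=True, text=True,
        )
        versions[host] = result.stderr.splitlines()[0] if result.stderr else "unknown"

    for host, version in versions.items():
        print(f"{host}: {version}")

    if len(set(versions.values())) > 1:
        print("WARNING: Java versions differ across hosts.")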
I am currently reading a book about Hadoop. The book says you need to download and install VMware Workstation Player (Windows 7 version):
https://my.vmware.com/en/web/vmware/free#desktop_end_user_computing/vmware_workstation_player/14_0
Then, apparently, I need to download and install CentOS 6 from here:
https://sourceforge.net/projects/centos-6-vmware/
Once VMware Player is running, you go to File > Open and open the CentOS image. The problem I am having is that I can't start the VM; I get an error message that reads 'This host supports Intel VT-x, but Intel VT-x is disabled'. I Googled this, but my BIOS has none of the menus that the answers mention (no Processor submenu, no Chipset, no Advanced CPU Configuration, and no Northbridge). There must be an easier way to install Hadoop, I hope. What is the easiest way to get this up and running on a Windows 7 machine? I just want to follow some of the steps from the book. Thanks.
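(For anyone hitting the same error: VT-x is enabled in the BIOS/UEFI setup screens, not from software, and vendors hide the option in different menus. Once some Linux system is booted on the machine, you can at least confirm whether hardware virtualization is being exposed to the OS; a minimal, Linux-only check:)

    # Linux-only: look for the Intel VT-x (vmx) or AMD-V (svm) CPU flags.
    with open("/proc/cpuinfo") as f:
        flags = {word for line in f if line.startswith("flags") for word in line.split()}

    if "vmx" in flags or "svm" in flags:
        print("Hardware virtualization is exposed to this OS.")
    else:
        print("No vmx/svm flag: VT-x/AMD-V is disabled in firmware or unsupported.")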
I need to install Windows updates (the OS updates and Microsoft Security Essentials) on multiple clients from a CentOS server. I'm not very familiar with Linux systems and I can't find an appropriate tutorial on the internet.
Give OPSI a try; it is an open-source deployment solution which runs on CentOS:
http://www.opsi.org/
It is an integrated system for deploying full installations as well as simple updates or rollouts.
yum installs RPM packages on RedHat/CentOS/Fedora systems. It has nothing to do with Windows; it doesn't understand .exe files or anything like that.
I'm not even sure where to begin to work out what you are actually trying to ask... unless your question really is as confused as it sounds and you are missing the difference between package-managed Linux systems and Windows systems.
Hi, I want to learn Hadoop. I have a basic idea of how Hadoop works with the MapReduce framework.
Now I want to practice on my local PC, so I'd like to know how to install Hadoop on a single node.
I installed VMware Workstation 10 and tried to install a Linux operating system on it for Hadoop, but I am not able to load Ubuntu into VMware Workstation; I get an 'Exiting intel ..., Operating system not found' error message.
Can anyone please give me the steps for getting started with a Hadoop installation?
Should I go for one of the distributions (Cloudera, Hortonworks, MapR)? If that is simpler, then tell me how to install them. (I even tried importing the Cloudera VMware file into VMware Workstation, but it did not work for me.)
You can use the VM provided by Udacity for its course on Hadoop. I found it really easy to set up.
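Once you have any single-node Hadoop running, a classic first exercise is word count, and with Hadoop Streaming you can write the mapper and reducer as plain Python scripts instead of Java. A minimal sketch (the file names are my own):

mapper.py:

    #!/usr/bin/env python3
    # Reads text lines on stdin; emits one "word<TAB>1" pair per word.
    import sys

    for line in sys.stdin:
        for word in line.split():
            print(f"{word}\t1")

reducer.py:

    #!/usr/bin/env python3
    # Hadoop Streaming sorts mapper output by key before the reducer runs,
    # so all counts for a given word arrive consecutively.
    import sys

    current_word, current_count = None, 0
    for line in sys.stdin:
        word, _, count = line.rstrip("\n").partition("\t")
        if word == current_word:
            current_count += int(count)
        else:
            if current_word is not None:
                print(f"{current_word}\t{current_count}")
            current_word, current_count = word, int(count)
    if current_word is not None:
        print(f"{current_word}\t{current_count}")

You submit the two scripts with the streaming jar that ships with Hadoop (hadoop jar .../hadoop-streaming-*.jar -files mapper.py,reducer.py -mapper mapper.py -reducer reducer.py -input <in> -output <out>; the jar's exact path varies by version).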
I have CentOS on two nodes and Ubuntu on the other two nodes. Can I install Cloudera 4.5 or later on these servers?
I have searched on the internet but could not find any relevant information.
How can I install Cloudera on these 4 servers?
No, you cannot install Cloudera on a cluster with a heterogeneous mix of operating systems; Cloudera Manager expects every host to run the same OS.
This is a limitation of the distribution's management tooling rather than of Hadoop itself.