Create Zookeeper Node using zkCli.cmd directly from DOS command line - cmd

I am trying to use the zkCli.cmd client utility to create a node on a remote zookeeper server directly from the DOS command line (i.e. without going into the client utility itself). Is this possible?
I have tried the following:
D:\apps\zookeeper-3.4.6\bin>zkCli.cmd -server 192.168.1.3:2181 create /test-node test-data
But it doesn't create any nodes on that zookeeper server.
My final aim is to be able to create nodes on a remote zookeeper server directly via a .bat file, for example.
Thanks,
PM.

I was also facing the same issue on Linux; then I found that you can create a znode by providing its data value:
create -s /ha/masternode "highavailability"
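For the .bat use case, a possible sketch (untested, reusing the server address and node from the question, and assuming zkCli.cmd accepts its commands on standard input) is to pipe the commands in so the client runs them and exits:

@echo off
REM create-node.bat - hypothetical sketch; adjust the path and address to your setup
(
echo create /test-node test-data
echo quit
) | D:\apps\zookeeper-3.4.6\bin\zkCli.cmd -server 192.168.1.3:2181

On Linux the same idea can be tried by echoing the commands into zkCli.sh.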

Related

How to keep the logstash service running when I log out from the remote server

I configured the logstash service following the instructions in the link https://www.elastic.co/guide/en/logstash/current/running-logstash-windows.html (logstash as a service using nssm), but I noticed that the service does not actually keep running once I disconnect from the remote server I installed it on.
Is there a way to fix this problem?
thanks,
g
The same thing also happens when running logstash manually (I mean, running the appropriate bat file in a command prompt).
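For context, a typical nssm-based installation along the lines of the linked guide looks roughly like the following; the service name and paths here are placeholders, not taken from the question:

REM register logstash with the Windows service manager via nssm (placeholder paths)
nssm install logstash "C:\logstash\bin\logstash.bat"
nssm set logstash AppParameters "-f C:\logstash\config\pipeline.conf"
nssm set logstash AppDirectory "C:\logstash"
nssm start logstash

A service registered this way is started by the service control manager rather than by your interactive session, so it should keep running after you log off, whereas launching the bat file manually in a command prompt ties logstash to that session.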

Is there an in-built processor in Apache NiFi which can create a password-enabled SSH connection?

I have an HDInsight cluster set up on Azure Cloud. I have also installed Apache NiFi on a separate VM. Please note I have SCP & SSH access enabled from the VM to my cluster. I am trying to set up some processors as per my requirement; the first one in the list is an "ExecuteProcess" processor. What I am trying to achieve through it is to establish an SSH connection with my HDInsight cluster and, once that's successful, pass the result (connection established = 'Y') through a FlowFile to my second processor, a "GetFile" processor that will basically read a JSON file from a particular path in that HDInsight cluster.
I have added "ExecuteProcess" processor and in the Configure option -> Properties section, have set the below:
Command : ssh sshdepuser#demodepdata-ssh.azurehdinsight.net
command arguments: sshdepuser#demodepdata-ssh.azurehdinsight.net
Batch Duration : No Value Set
Redirect Error System : True
Working Directory : No Value Set
Argument Delimiter : No Value Set
Please note sshdepuser@demodepdata-ssh.azurehdinsight.net is the server hostname for my HDInsight cluster, to which I am trying to establish connectivity from my VM (server DNS name: dep-hadoop.eastus.cloudapp.azure.com).
I am afraid it doesn't work that way: you are not going to be able to pass an SSH connection as a FlowFile. But you can try a workaround: in the ExecuteProcess processor, instead of making only an SSH connection, also copy the file to a local folder; then you can use the GetFile processor.
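A rough sketch of that workaround, assuming key-based authentication is in place (ExecuteProcess cannot answer an interactive password prompt) and using a made-up remote path and local landing directory:

Command : scp
Command Arguments : -o BatchMode=yes sshdepuser@demodepdata-ssh.azurehdinsight.net:/path/to/file.json /tmp/nifi-input/
Redirect Error Stream : true

The GetFile processor's Input Directory would then point at /tmp/nifi-input.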

Run post processing commands on remote server from informatica cloud

I am running a job on Informatica Cloud. It picks up a file from a remote server and dumps the data into Salesforce. I want to run post-processing commands from Informatica Cloud on the source file, which is present on the remote server, after the Informatica job finishes. Is it possible?
Files need to be present on the machine where the Agent is installed.
The post-processing command file cannot be present in a remote location.

Bash script to write Log Processing output to remote DB2

I have a bash script running on a server to capture and process the logs.
The script is invoked by a utility for processing the logs. After processing the logs, the state of the logs should be stored in a db2 table located on some other server.
Say I have the shell script on 110.88.99.10 and db2 on 110.88.99.11.
I need to save the processed result to db2. Any suggestions?
It seems that you need to install the DB2 client (IBM Data Server Client). Once you have installed it, you configure the remote instance and database (catalog TCPIP node and catalog database), and then you can integrate db2 commands (db2 insert ... to log the result) into your script.
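A minimal sketch of those steps, assuming the remote instance listens on the default port 50000 and using hypothetical node, database, table and credential names ($log_name and $state would come from your processing step):

# one-time setup on 110.88.99.10: register the remote DB2 node and database
db2 catalog tcpip node lognode remote 110.88.99.11 server 50000
db2 catalog database logdb at node lognode

# in the log-processing script: connect and record the result
db2 connect to logdb user db2user using db2password
db2 "INSERT INTO log_status (log_name, processed_at, state) VALUES ('$log_name', CURRENT TIMESTAMP, '$state')"
db2 connect reset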

Accessing Riak node from a remote machine (riak-admin backup)

While trying to run a riak-admin backup riak@ec2-xxx.compute-1.amazonaws.com riak /home/user/backup.dat all on a remote machine (ec2 instance) I encounter the following error message
{"init terminating in do_boot",{{nocatch,{could_not_reach_node,'riak#ec2-xxx.compute-1.amazonaws.com'}},[{riak_kv_backup,ensure_connected,1,[{file,"src/riak_kv_backup.erl"},{line,171}]},{riak_kv_backup,backup,3,[{file,"src/riak_kv_backup.erl"},{line,40}]},{erl_eval,do_apply,6,[{file,"erl_eval.erl"},{line,572}]},{init,start_it,1,[]},{init,start_em,1,[]}]}}
I assume there's a connection / permission error, since the same backup command works if run locally on the instance (with a local node IP, of course). I should note that the server (Node.js) can remotely connect to that IP, so the port (8098) is open and accessible. Any advice on how to make the backup work remotely?
It would appear that the riak-admin backup command doesn't work remotely - and certainly it's not something I've ever tried to do. I'd recommend setting up a periodic backup (via cron or similar) and then using rsync to get your backup file down to the local machine.
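If you go that route, the pieces might look something like this (the schedule, paths and local node name are assumptions):

# crontab entry on the EC2 instance: run a local backup every night at 02:00
0 2 * * * riak-admin backup riak@ip-local-ec2 riak /home/user/backup.dat all

# from your local machine, pull the latest dump down afterwards
rsync -avP ec2-xxx.compute-1.amazonaws.com:/home/user/backup.dat .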
Alternatively, you could try the following hacky untested idea for a single script.
#!/bin/bash
ssh ec2-xxx.compute-1.amazonaws.com "riak-admin backup riak@ip-local-ec2 riak /home/user/backup.dat all"
rsync -avP ec2-xxx.compute-1.amazonaws.com:/home/user/backup.dat .
