I am running a job on Informatica Cloud. It picks up a file from a remote server and loads the data into Salesforce. After the Informatica job finishes, I want to run post-processing commands from Informatica Cloud on the source file, which is on the remote server. Is this possible?
Files need to be present on the machine where the Secure Agent is installed.
The post-processing command file cannot be in a remote location.
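One workaround, since the command file itself must live on the Agent machine, is a post-processing script on the Agent that reaches out to the remote server. A minimal sketch, assuming SSH access from the Agent machine to the remote server (the host and paths are examples, not part of the original setup):

# post_process.sh, stored on the Secure Agent machine
# archive the processed source file on the remote server over SSH
ssh user@remote-server "mv /data/inbox/source.csv /data/archive/source.csv"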
I configured the Logstash service following the instructions at https://www.elastic.co/guide/en/logstash/current/running-logstash-windows.html (Logstash as a service using NSSM), but I noticed that the service is not actually running once I disconnect from the remote server on which I installed it.
Is there a way to fix this problem?
The same thing also happens when running Logstash manually (I mean, running the appropriate .bat file in a command prompt).
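For reference, the registration described in the linked guide boils down to a few NSSM commands; the paths below are assumptions for a typical install and need adjusting:

rem register Logstash as a Windows service with NSSM
nssm install logstash C:\logstash\bin\logstash.bat -f C:\logstash\config\logstash.conf
rem make the service start in the Logstash bin directory
nssm set logstash AppDirectory C:\logstash\bin
nssm start logstash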
I want to automate the process of transferring files between servers, which is currently done using WinSCP. Can this be automated through JMeter using the OS Process Sampler? Please help me with the command to transfer files and with how to connect to the servers from JMeter.
Use the JMeter plugin jmeter-ssh-sampler for this.
You can then use the scp command to transfer a file from one server to another.
To copy a file from the local system to a remote system, run the following command:
scp file.txt remote_username@10.10.0.2:/remote/directory
Below is the link to the plugin:
jmeter-ssh-sampler link
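Note that an unattended transfer (whether through the SSH sampler or the OS Process Sampler) must not prompt for a password, so key-based authentication is the usual approach. A minimal sketch, with the key path as a placeholder:

# one-time setup: create a key pair and install the public key on the remote host
ssh-keygen -t rsa -f ~/.ssh/jmeter_key -N ""
ssh-copy-id -i ~/.ssh/jmeter_key.pub remote_username@10.10.0.2

# command JMeter can then invoke without a password prompt
scp -i ~/.ssh/jmeter_key file.txt remote_username@10.10.0.2:/remote/directory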
The docs for the Workload Scheduler for Node.js say:
"Important: Before running a database step, download and install the
JDBC database client driver on the agent where you want to run the
step. Specify the client jar file path in the JDBC jar class path."
How can I download and install the necessary JAR files on the agent? I see from this question that they should be installed at /home/wauser/utils, but I cannot figure out how to access the agent to install them.
I tried an FTP step to move the file to the agent, but that was also unsuccessful.
From your description I assume you are trying to run SQL steps on the xx_CLOUD agent.
Where is the database running, and which type of database is it? Is it on Bluemix or somewhere else?
Currently the best way to run SQL steps is to use a Workload Scheduler agent installed on a VM or a Docker image, so that you can easily install the JDBC jar files.
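For example, with the agent on a VM you can copy the driver jar into the directory mentioned above. A minimal sketch, assuming SSH access to the VM and a DB2 source (db2jcc4.jar is the DB2 JDBC driver; substitute your database's driver and the real host name):

scp db2jcc4.jar wauser@agent-vm:/home/wauser/utils/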
I am working on PostgreSQL 9.2 streaming replication. I have finished the setup on the master, and on the standby I want to set the parameters in the recovery.conf file.
However, I could not find the file, so I created a new file named recovery.conf, copied the contents of the recovery.conf.sample file into it, and edited the parameters.
When I save it and start the PostgreSQL service, it gives the error
"service on local computer started and stopped....."
But when I remove the recovery.conf file, the service starts.
I need help.
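For comparison, a standby's recovery.conf on 9.2 usually only needs a few parameters; the host, user, and password below are placeholders for your environment:

# minimal recovery.conf sketch for a 9.2 streaming-replication standby
standby_mode = 'on'
primary_conninfo = 'host=192.168.0.10 port=5432 user=replicator password=secret'
trigger_file = '/tmp/postgresql.trigger'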
I have a bash script running on a server to capture and process logs.
The script is invoked by a utility for processing the logs. After the logs are processed, their state should be stored in a DB2 table located on another server.
Say I have the shell script on 110.88.99.10 and DB2 on 110.88.99.11.
I need to save the processed result to DB2. Any suggestions?
It seems that you need to install the DB2 client (IBM Data Server Client). Once you have installed it, you configure the remote instance and database (catalog TCPIP node and catalog database), and then you can integrate db2 commands into your script (db2 insert ... to log the result).
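A minimal sketch of what that looks like from the shell, assuming the DB2 server listens on port 50000 and a hypothetical LOG_STATUS table; the node, database, and credential names are placeholders:

# one-time setup on 110.88.99.10: catalog the remote node and database
db2 catalog tcpip node lognode remote 110.88.99.11 server 50000
db2 catalog database logdb as logdb at node lognode

# inside the log-processing script: connect and record the result
db2 connect to logdb user db2user using secret
db2 "insert into log_status (log_file, state) values ('app.log', 'PROCESSED')"
db2 connect reset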