Spring XD REST API job launch with jobParameters responding with 'jobParameters' is not recognized as an internal or external command - spring-xd

I have a Spring XD job already deployed which expects 2 jobParameters (absoluteFilePath and fileName). Currently this job is triggered by a JMS stream whose output provides those 2 jobParameters in JSON format, and that works fine. I want to launch the job with the REST API like:
curl -X POST http://localhost:9393/jobs/executions?jobname=loadData&jobParameters=%7B%22absoluteFilePath%22%3A%22C%3A%2FUB%2Fdev%2FBM.txt%22%2C%22fileName%22%3A%22BM.txt%22%7D
Error I'm getting:
'jobParameters' is not recognized as an internal or external command, operable program or batch file.
Wondering if there is anything wrong with the curl command or if jobParameters is not supported?
I'm able to launch a job without jobParameters with the following curl command, but since the job expects parameters it fails.
curl -X POST http://localhost:9393/jobs/executions?jobname=loadData

Have you tried launching via the XD shell? It sends the jobName and jobParameters in the request body...
public void launchJob(String name, String jobParameters) {
    String uriTemplate = resources.get("jobs/executions").toString();
    MultiValueMap<String, Object> values = new LinkedMultiValueMap<String, Object>();
    values.add("jobParameters", jobParameters);
    values.add("jobname", name);
    restTemplate.postForObject(uriTemplate, values, Object.class);
}
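For the job in the question, the call would look roughly like this (just a sketch; the resources map and restTemplate fields are assumed to be initialized as in the XD shell's REST client):
launchJob("loadData", "{\"absoluteFilePath\":\"C:/UB/dev/BM.txt\",\"fileName\":\"BM.txt\"}");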
This looks like an OS shell message; try putting quotes around the URL.
Google makes me think this is on Windows (not sure where you're getting curl from); for the Windows shell you'll probably need double quotes around the URL.
(The problem is the &: the shell treats it as a command separator, so everything after it is run as a separate command, hence the 'jobParameters' is not recognized message.)
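For example, the command from the question with the URL quoted and nothing else changed:
curl -X POST "http://localhost:9393/jobs/executions?jobname=loadData&jobParameters=%7B%22absoluteFilePath%22%3A%22C%3A%2FUB%2Fdev%2FBM.txt%22%2C%22fileName%22%3A%22BM.txt%22%7D"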

Related

Rundeck Webhook to pass an argument to a shell script

I have a shell script to which I would like to pass two arguments. The script accepts a hostname ($1) and a directory name ($2) as arguments:
just_pull.sh HOSTNAME CONFIG_DIR
I have created a simple Rundeck job to run this script when the webhook is called. I have gone through the documentation but, being new to Rundeck and its web APIs, I couldn't find a way to do this. Passing the JSON value in the URL also yields a null value. I believe I am sending or receiving the data in an improper manner.
How can I define the arguments section in the job, and how do I correctly add {option.hostname} in the webhook's arguments section?
Thanks #MegaDrive68k for the answer.
To elaborate on the answer to my question:
I had to modify the job as well. The 'Argument' section has to be filled with ${option.hostname} ${option.conf}.
Additionally the webhook should have -hostname ${data.field1} -conf ${data.field2} as 'Options'.
To call the webhook, run the following command:
curl -H "Content-Type: application/json" -X POST -d '{"field1" : "localhost", "field2" : "conf"}' http://rundeckurl.com/api/38/webhook/aZmoByl0Hmasked8RkxBT8Oda#webhookname
The above command will pass the arguments to my script as:
just_pull.sh localhost conf
Basically, you need to define the arguments in the "Options" field of the webhook definition.

Getting/using output of the CMD window of JMeter

I'm running a Java file from a BeanShell Sampler in JMeter, and the output shows up fine in JMeter's CMD window. The output comprises a series of logger lines; I need to extract only a specific string from the CMD window and use it in another sampler.
Given you run your program using, for example, ProcessBuilder, you should be able to access its output via the Process.getInputStream() method:
Process process = new ProcessBuilder('c:\\apps\\jmeter\\bin\\jmeter.bat', '-v').start()
String output = org.apache.commons.io.IOUtils.toString(process.getInputStream(),'UTF-8')
log.info('My program output is:')
log.info(output)
Also, I would recommend considering switching to the JSR223 Sampler and the Groovy language, as that way it will be much faster and easier:
def output = "jmeter.bat -v".execute().text
log.info('My program output is:')
log.info(output)
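If you then need only a specific string from that output for another sampler, one rough sketch is to apply a regex and store the match as a JMeter variable (the "token=" marker here is hypothetical; substitute whatever pattern your log lines actually use, and the 'output' variable is assumed to come from the snippet above):
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// match something like "token=abc123" in the captured process output
Matcher m = Pattern.compile("token=(\\S+)").matcher(output);
if (m.find()) {
    // exposes the value as ${extractedValue} to later samplers
    vars.put("extractedValue", m.group(1));
}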
This BeanShell command makes the console output of JMeter (i.e. stdout) get written to a file:
System.setOut(new PrintStream(new BufferedOutputStream(new FileOutputStream("D:\\dir1\\dir2\\abc.out")), true));
Make sure the path to the file uses double backslashes.

Java 8 Access Denied while running a process with arguments

In my program I need to run an exe file in a process. I'm doing it with ProcessBuilder. When I put only the directory and exe name into the code, the process runs normally, but I want to pass arguments. When I try that, I get an exception with an Access Denied message.
It's my code:
Process process = new ProcessBuilder("C:\\Directory", "file.exe", argument1).start();
What is wrong with it?
My earlier code, which worked but without arguments, was:
String folder = "C:\\Directory";
String exe = "File.exe";
ProcessBuilder pb = new ProcessBuilder();
pb.command(folder + exe);
pb.start();
With this code I was able to see the started process in the process manager.
Your code is trying to execute C:\\Directory which is not allowed.
The full path of the executable must be in the first argument to the constructor, so:
Process process = new ProcessBuilder("C:\\Directory\\file.exe", argument1).start();
This is assuming C:\Directory\file.exe is the program you are trying to run.
Update: In your original code you have:
String folder = "C:\\Directory";
String exe = "File.exe";
so folder + exe is C:\DirectoryFile.exe, and the equivalent code is:
Process process = new ProcessBuilder("C:\\DirectoryFile.exe", argument1).start();
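A minimal sketch of the same fix that keeps the directory and file name in separate variables, using java.io.File to join them so the path separator can't be lost (the class name and the "argument1" value are placeholders):
import java.io.File;
import java.io.IOException;

public class LaunchExe {
    public static void main(String[] args) throws IOException {
        // joining parent and child this way resolves to C:\Directory\file.exe
        File exe = new File("C:\\Directory", "file.exe");
        Process process = new ProcessBuilder(exe.getAbsolutePath(), "argument1").start();
    }
}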

Groovy script can't execute() external process

Main question: Would Groovy's execute() method allow me to run a command that takes a file as an argument, and maybe run the command in background mode?
Here is my issue. I was able to use groovy's execute() for simple commands like ls for example. Suppose now I want to start a process like Kafka from a groovy script (end result is to replace bash files with groovy scripts). So I start with these lines:
def kafkaHome = "Users/mememe/kafka_2.11-0.9.0.1"
def zkStart = "$kafkaHome/bin/zookeeper-server-start.sh"
def zkPropsFile = "$kafkaHome/config/zookeeper.properties"
Now, executing the command below from my Mac terminal:
/Users/mememe/kafka_2.11-0.9.0.1/bin/zookeeper-server-start.sh /Users/mememe/kafka_2.11-0.9.0.1/config/zookeeper.properties
starts up the process just fine. And, executing this statement:
println "$zkStart $zkPropsFile"
prints the above command line as is. However, executing this command from within the groovy script:
println "$zkStart $zkPropsFile".execute().text
simply hangs! And trying this:
println "$zkStart $zkPropsFile &".execute().text
where I make it a background process goes further, but starts complaining about the input file and throws this exception:
java.lang.NumberFormatException: For input string: "/Users/mememe/kafka_2.11-0.9.0.1/config/zookeeper.properties"
Trying this gives the same exception as above:
def proc = ["$zkStart", "$zkPropsFile", "&"].execute()
println proc.text
What am I missing please? Thank you.
Yes, try using the consumeProcessOutputStream() method (and note that execute() does not go through a shell, so the & is passed to the script as a literal argument rather than backgrounding the process):
def os = new File("/some/path/toyour/file.log").newOutputStream()
"$zkStart $zkPropsFile".execute().consumeProcessOutputStream(os)
You can find the method in the Groovy docs for the Process class:
http://docs.groovy-lang.org/docs/groovy-1.7.2/html/groovy-jdk/java/lang/Process.html
Which states:
Gets the output and error streams from a process and reads them to keep the process from blocking due to a full output buffer. The stream data is thrown away but blocking due to a full output buffer is avoided. Use this method if you don't care about the standard or error output and just want the process to run silently - use carefully however, because since the stream data is thrown away, it might be difficult to track down when something goes wrong. For this, two Threads are started, so this method will return immediately.

Hadoop command line -D options not working

I am trying to pass a variable (not a property) using the -D command line option in Hadoop, like -Dmapred.mapper.mystring=somexyz. I am able to set a conf property in the driver program and read it back in the mapper.
So I could use that to pass my string as an additional parameter and set it in the driver, but I want to see if the -D option can be used to do the same.
My command is:
$HADOOP_HOME/bin/hadoop jar /home/hduser/Hadoop_learning_path/toolgrep.jar /home/hduser/hadoopData/inputdir/ /home/hduser/hadoopData/grepoutput -Dmapred.mapper.mystring=somexyz
Driver program
String s_ptrn=conf.get("mapred.mapper.regex");
System.out.println("debug: in Tool Class mapred.mapper.regex "+s_ptrn + "\n");
This gives null.
But this works:
conf.set("DUMMYVAL","100000000000000000000000000000000000000"); in the driver is read back properly in the mapper via the get() method.
My question is: if all of the Internet says I can use the -D option, then why can't I? Is it that this cannot be used for arbitrary arguments, only for properties, which we can only read by putting them in a file that I load in the driver program? Something like
Configuration conf = new Configuration();
conf.addResource("~/conf.xml");
in the driver program, and is this the only way?
As Thomas wrote, you are missing the space. You are also passing the variable mapred.mapper.mystring on your CLI, but in the code you are trying to get mapred.mapper.regex. If you want to use the -D parameter, you should be using the Tool interface. More about it is here - Hadoop: Implementing the Tool interface for MapReduce driver.
Or you can parse your CLI arguments like this:
@Override
public int run(String[] args) throws Exception {
    Configuration conf = this.getConf();
    String[] otherArgs = new GenericOptionsParser(conf, args).getRemainingArgs();
    String yourVariable = null;
    int i = 0;
    while (i < otherArgs.length) {
        if (otherArgs[i].equals("-x")) {
            // save your CLI argument
            yourVariable = otherArgs[++i];
        }
        i++;
    }
    // then save yourVariable into conf for use in the map phase
    conf.set("your.key", yourVariable);
    // ... build and submit the job with the remaining otherArgs
    return 0;
}
Then your command can look like this:
$HADOOP_HOME/bin/hadoop jar /home/hduser/Hadoop_learning_path/toolgrep.jar /home/hduser/hadoopData/inputdir/ /home/hduser/hadoopData/grepoutput -x yourVariable
Hope it helps
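For reference, a minimal sketch of a driver that implements Tool, so that -D generic options are parsed into the Configuration before run() is called (the class name and job wiring are illustrative, not the asker's actual code):
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class GrepDriver extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        Configuration conf = getConf();
        // picked up from e.g.: hadoop jar toolgrep.jar GrepDriver -D mapred.mapper.regex=somexyz <in> <out>
        String pattern = conf.get("mapred.mapper.regex");
        System.out.println("debug: mapred.mapper.regex = " + pattern);
        // ... build and submit the Job here, using args[0] and args[1] as the input and output paths
        return 0;
    }

    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new Configuration(), new GrepDriver(), args));
    }
}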
To use the -D option with the hadoop jar command correctly, the syntax below should be used:
hadoop jar {hadoop-jar-file-path} {job-main-class} -D {generic options} {input-directory} {output-directory}
Hence the -D option should be placed after the job main class name, i.e. in the third position. When we issue the hadoop jar command, the hadoop script invokes the RunJar class's main(). That main() parses the first argument to put the job jar file on the classpath and uses the second argument to invoke the job class's main().
Once the job class's main() is called, control is transferred to GenericOptionsParser, which first parses the generic command line arguments (if any), sets them in the job's Configuration object, and then calls the job class's run() with the remaining arguments (i.e. the input and output paths).
