How to get the intermediate debug logs of cwl-runner while running a workflow? - common-workflow-language

How do I get the logs of all the steps that cwl-runner prints to the terminal? I want to be able to store these logs somewhere, so that I can use the outputs of intermediate steps.
Is there a flag I can use while running cwl-runner? I tried different flags like --tmp-prefix, --output-dir, etc., but they are not what I want.
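If your cwl-runner is the reference implementation (cwltool), here is a sketch of what usually works; the file names below are illustrative. cwltool writes its per-step log to stderr, so redirecting stderr captures it, and --cachedir keeps each step's outputs on disk instead of discarding them with the temporary directories:

cwl-runner --debug \
  --outdir ./results \
  --cachedir ./step-cache \
  workflow.cwl inputs.yml 2> workflow-debug.log

The --debug flag turns on verbose per-step logging, workflow-debug.log then holds the full log, and the intermediate step outputs remain under ./step-cache.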

Related

Configure Apache Solr logging to show warnings and slow queries via global config file

I start Solr in the foreground like so: C:\solr-8.10.1\bin\solr start -p 8983 -m 1536m -f -v
It shows a command window and it logs a massive amount of DEBUG info, which I don't need.
I want to reduce the amount of logging here, and I found this: https://solr.apache.org/guide/8_5/configuring-logging.html
This seems exactly like what I need for my scenario:
I have many cores, each with their own solrconfig.xml:
C:\solr-8.10.1\server\solr\core1
C:\solr-8.10.1\server\solr\core2
C:\solr-8.10.1\server\solr\core3
C:\solr-8.10.1\server\solr\coreX
I don't want to have to make the logging changes to each core separately; I want one global setting that applies to all of them.
I don't use the Solr API; I want to be able to change settings via config files.
I want errors to be logged, and also any slow queries.
After reading the tutorial, I decided I need to:
start Solr using solr start -p 8983 -m 1536m -f -q
add an element <slowQueryThresholdMillis>1000</slowQueryThresholdMillis>
However, it's that last part where I have questions. I see a reference made to so-called configsets, but I have no idea whether that's the place where I need to configure my global settings.
I inspected the sample files, e.g. \solr-8.10.1\server\solr\configsets\sample_techproducts_configs\conf\solrconfig.xml
But I can't figure out if that's the right config file or how it would even apply to all other cores without any reference to the other cores.
I've had a look at these already, but they seem to want to handle things via code, whereas I'm looking for a file configuration:
configure Logger via global config file
Use of readConfiguration method in logging activities
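Not an answer from the thread, but a sketch of how this usually splits up on a default Solr 8.x layout (paths and values below are illustrative, so verify them against your install): the node-wide log level is a file-based setting, while the slow-query threshold is a per-core or per-configset setting.
edit server\resources\log4j2.xml and raise the root (or org.apache.solr) logger level to WARN or ERROR; this file applies to every core on the node without referencing them individually
put <slowQueryThresholdMillis>1000</slowQueryThresholdMillis> inside the <query> section of solrconfig.xml, either in each core or once in a shared configset if your cores use one
start quietly so the console only shows WARN and above: C:\solr-8.10.1\bin\solr start -p 8983 -m 1536m -f -q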

Gitlab CI Display times in job log

Is there a way to display times in Gitlab CI in the job log?
As shown in the image below, for some commands (I guess automated setup commands) there are times shown on the right.
Is there a way to show these times on custom script commands?
I know that I can use e.g. bash's date command, but that does not show the time difference from the initialization time, and it does not look as good. Also, the time displays are already present somewhere, so I wonder whether they can be used directly.
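As a manual fallback along the lines of the date idea (a sketch, assuming a bash-based runner; my_custom_step.sh is a placeholder), you can time each command yourself with bash's SECONDS counter, which counts from when the job's shell started:

start=$SECONDS                      # seconds since the job's shell started
./my_custom_step.sh                 # placeholder for your own script command
echo "my_custom_step took $((SECONDS - start))s, finished at $(date +%T)"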

Go CD failure message available in environment

There is a list of env variables available for GoCD at:
https://docs.gocd.org/current/faq/environment_variables.html
However I'm looking for something like: GO_BUILD_ERROR or similar.
I would like to have the failure reason or message when a build fails, so that I can pass it to an external script or message.
There seems to be nothing in the documentation.
GoCD doesn't have any such variables. I feel the reason is mostly that GoCD is very generic about what commands constitute a build for a material. You might want to parse the logs manually to figure that out.
Also, in GoCD, environment variables are used as input to the stages, not as output from them. If you're planning to build a plugin or wrapper for the commands that run, consider storing the results as properties on the jobs; that way they can also be queried later if required.
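A rough illustration of that wrapper idea, as a sketch only (build.sh and notify-external.sh below are placeholders, not anything GoCD provides): run the real build command under a wrapper that tees its output to a file and, on failure, hands the tail of that file to the external script.

set -o pipefail                         # keep the build's exit code through the pipe
if ! ./build.sh 2>&1 | tee build.log; then
  reason=$(tail -n 20 build.log)        # the last lines usually contain the failure
  ./notify-external.sh "$reason"        # placeholder for your external script
  exit 1                                # still fail the GoCD job
fi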

Is there a way in Laravel to send an output to both console and log?

I am writing a command which has a lot of service information that I need to see while the command is running.
I am outputting this info simply by running echo "some text", and that way I can see what happens when running this command. When the same command is run by the scheduler, I have to log all this info, so I have to duplicate all the same messages with Log::info("some text").
If I want to avoid duplication, I can create a helper class that holds all of this, include it in all the service classes related to this command, and use it to avoid code duplication, but I still feel that this is not an ideal solution. Is there maybe a built-in way in Laravel to send output to both the console and the log at the same time?
You could add ->appendOutputTo('path') to the scheduled task that executes your command, to store the output messages in your log file. Although I'm not sure whether this will log all console I/O (it would be good to know in case you test it).
Check this.
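Outside the scheduler, a shell-level way to get the same effect (a sketch; the command name and log path are illustrative) is to pipe the artisan command through tee, which prints to the console and appends to a file at the same time:

php artisan my:command | tee -a storage/logs/my-command.log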

How to run spark-jobs outside the bin folder of spark-2.1.1-bin-hadoop2.7

I have an existing spark-job. Its function is to connect to a Kafka server, get the data, and then store the data into Cassandra tables. Right now this spark-job runs on the server from inside spark-2.1.1-bin-hadoop2.7/bin, but whenever I try to run this spark-job from another location, it does not run. The spark-job contains some JavaRDD-related code.
Is there any chance I can run this spark-job from outside that directory as well, by adding a dependency to the pom or something else?
whenever I am trying to run this spark-job from another location, it's not running
spark-job is a custom launcher script for a Spark application, perhaps with some additional command-line options and packages. Open it, review the content and fix the issue.
If it's too hard to figure out what spark-job does and there's no one nearby to help you out, it's likely time to throw it away and replace it with the good ol' spark-submit.
Why don't you use it in the first place?!
Read up on spark-submit in Submitting Applications.
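For reference, a sketch of what the plain spark-submit call could look like; the class name, master URL, package coordinate and jar path are illustrative, not taken from the question. Because you invoke spark-submit by its full path, the working directory no longer matters, and connector dependencies can come from an uber jar built by your pom or from --packages:

/path/to/spark-2.1.1-bin-hadoop2.7/bin/spark-submit \
  --class com.example.KafkaToCassandraJob \
  --master spark://your-master-host:7077 \
  --packages org.apache.spark:spark-streaming-kafka-0-10_2.11:2.1.1 \
  /path/to/your-application.jar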
