I have Elasticsearch running on one server, and I need to execute a bash script located on a different machine/server. My question is: can I use the exec plugin in Logstash to have Logstash execute that bash script on the other server?
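For what it's worth, the Logstash exec input plugin runs its command on the machine where Logstash itself is running, not on a remote server, so a common workaround is to wrap the remote call in ssh. A minimal sketch, assuming passwordless SSH keys are set up from the Logstash host; the user, host name, and script path are placeholders:

input {
  exec {
    # ssh itself runs locally on the Logstash host; the script runs on the remote server
    command => "ssh user@remote-server 'bash /path/to/script.sh'"
    interval => 300
  }
}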
I'm trying to create a cron job on my VM server that runs a shell script which uploads the Apache log to a BigQuery table. The script has these statements:
#!/bin/bash
bq load --project_id=myproject --field_delimiter=" " mylogs.mylogsfont /var/log/apache2/access.log /var/log/apache2/schema_access.txt
rm /var/log/apache2/access.log
/etc/init.d/apache2 restart
but the bq load command does not run.
Is there a way to run this command in a shell script?
I've looked through the web and found the boto files and gcsfuse, but both deal with Cloud Storage jobs and make no mention of BigQuery jobs.
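For reference, cron jobs run with a minimal environment (PATH is typically just /usr/bin:/bin), so tools like bq that work in an interactive shell often fail silently under cron. A minimal sketch of the script with that in mind; the bq path and log file location are assumptions, so check yours with "which bq":

#!/bin/bash
# use the absolute path to bq, since cron's PATH is minimal,
# and capture output so failures are visible in a log file
/usr/local/bin/bq load --project_id=myproject --field_delimiter=" " mylogs.mylogsfont /var/log/apache2/access.log /var/log/apache2/schema_access.txt >> /var/log/bq_load.log 2>&1
rm /var/log/apache2/access.log
/etc/init.d/apache2 restart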
I use NiFi on a Windows 10 machine with the Linux subsystem (Ubuntu) installed.
My task is to execute bash scripts and commands using NiFi. I tried ExecuteProcess and ExecuteStreamCommand with simple commands like 'bash' or 'bash ls' for test purposes, but all I got was:
ExecuteProcess[id=4f530725-0171-1000-d1b1-7df587eada7e] /bin/ls:
/bin/ls: cannot execute binary file
If I pass basic Windows commands, everything is OK.
Is there a way to run bash commands in my case?
I'm not a Windows user, but according to the docs, in order to get at the Linux-style commands you have to run Bash.exe, so I'm guessing you'll need to specify -c as an argument, followed by the Linux bash command you want to run (as a string), something like:
bash -c "ls"
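In ExecuteProcess that maps to something like the following property values (a sketch; the command is a placeholder, and on Windows you may need the full path to bash.exe from the Linux subsystem rather than just bash):

Command: bash
Command Arguments: -c "ls /tmp"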
I need to run a bash script continuously, for an indefinite time, inside a Docker container in Azure via the Azure Container Instances (ACI) service. My bash script has a while loop that keeps it running, and the Azure container's restart policy is set to OnFailure so the container restarts if it fails.
After running the container for about two days, I see that the container status is Running. However, the bash script that was running in the foreground and sending logs to the Azure container console seems to have died and is no longer sending logs to the console. I also see it's not doing what it is supposed to do.
How can I reliably keep this bash script running for an indefinite time in an Azure container?
The bash script, which contains the internal while loop, is launched as follows:

bash my-while-loop-script.sh
To solve this issue, I replaced the while loop inside my-while-loop-script.sh with crond, which executes a Python application as a cron job. Below is the line inside my-while-loop-script.sh that starts the cron daemon; it executes the my-cron.cron contents shown further down:
./busybox crond -f
To achieve that, I used the busybox 1.30.1 tools. To install busybox in your Docker image:
ADD busybox-1.30.1/ /busybox
WORKDIR /busybox
RUN make defconfig
RUN make
You also need to add the cron settings to the crontabs directory.
RUN mkdir -p /var/spool/cron/crontabs/
# Copy cron settings
ADD my-cron.cron /var/spool/cron/crontabs/root
A sample my-cron.cron looks just like a normal cron file:
* * * * * python my-app.py
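For completeness, a sketch of how those pieces might fit together in one Dockerfile; the base image, the location of my-app.py, and the assumption that crond runs jobs from root's home directory are mine, not from the original setup:

FROM python:3
# the full python image ships the gcc/make toolchain needed to build busybox
ADD busybox-1.30.1/ /busybox
WORKDIR /busybox
RUN make defconfig
RUN make
# install the cron table for root, plus the app it runs
RUN mkdir -p /var/spool/cron/crontabs/
ADD my-cron.cron /var/spool/cron/crontabs/root
ADD my-app.py /root/my-app.py
# run crond in the foreground as PID 1 so the container stays alive
CMD ["./busybox", "crond", "-f"]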
I want to execute the Cassandra "nodetool refresh" command from outside the container.
Is there a command I can execute via a shell script?
My shell script nodetool_refresh.sh is on the host.
I want to mention that I don't want to use the docker exec command.
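One way to do this without docker exec is to run nodetool on the host and point it at the container's JMX port. A sketch, assuming nodetool is installed on the host, remote JMX is enabled in the container (e.g. LOCAL_JMX=no in cassandra-env.sh), and port 7199 is reachable; the container, keyspace, and table names are placeholders:

#!/bin/bash
# nodetool_refresh.sh -- runs on the host, not inside the container
# look up the container's IP on its Docker network
CONTAINER_IP=$(docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' cassandra)
# point nodetool at the container's JMX port
nodetool -h "$CONTAINER_IP" -p 7199 refresh my_keyspace my_table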
I am just wondering: is it possible to run one script (e.g. shell script, Python script, etc.) across different environments?
For example, I want to run my script from a Linux shell and have it continue inside a Docker container's shell (where the container is created by the script itself). In other words, the script should keep executing the rest of its commands inside the container (after entering it).
run.sh (shell script):

sudo docker exec -it some_containers bash # this command takes me into the docker container environment
apt-get install curl # I also want this command to execute inside the docker container, after I have entered its environment
# this is all just one script
Your question is not very clear, but it sounds like this is a job requiring two scripts: the first script runs in your "Linux shell" and needs to cause the second script to be placed into the container (perhaps by way of the Dockerfile), at which point the first script can use docker exec to run it.
Please see the answers on this question for more information.
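As a concrete sketch of that two-script idea (the container name, package, and paths are placeholders): the outer script stays on the host, and everything that must happen inside the container is passed to docker exec, either as a quoted command string or as a second script copied in beforehand.

#!/bin/bash
# run.sh -- runs on the host
# everything inside the quotes executes in the container, not on the host
sudo docker exec some_container bash -c "apt-get update && apt-get install -y curl"

# or copy a second script into the container and run it there
sudo docker cp inner.sh some_container:/tmp/inner.sh
sudo docker exec some_container bash /tmp/inner.sh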