How to run Google CLI commands in a Linux shell script? - shell

I'm trying to create a cron job on my VM server that runs a shell script that uploads the Apache log to a BigQuery table. The script has these statements:
#!/bin/bash
bq load --project_id=myproject --field_delimiter=" " mylogs.mylogsfont /var/log/apache2/access.log /var/log/apache2/schema_access.txt
rm /var/log/apache2/access.log
/etc/init.d/apache2 restart
but the bq load command does not run.
Is there a way to run this command in a shell script?
I've looked through the web and found the boto files and gcsfuse, but both deal with Cloud Storage jobs and make no mention of BigQuery jobs.
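If the script works when run manually but not from cron, one common cause (an assumption here, not stated in the thread) is that cron's minimal PATH does not include the Cloud SDK's bin directory where bq lives. A minimal sketch, assuming the SDK install path below (adjust it to your machine):
#!/bin/bash
# cron provides a minimal PATH, so add the Cloud SDK bin directory explicitly
# (assumed install location; adjust to your install)
export PATH="$PATH:/usr/local/google-cloud-sdk/bin"
bq load --project_id=myproject --field_delimiter=" " mylogs.mylogsfont /var/log/apache2/access.log /var/log/apache2/schema_access.txt
rm /var/log/apache2/access.log
/etc/init.d/apache2 restart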

Related

How to load the results of a bash command into GCS using Airflow?

I'm trying to save JSON that is the result of a bash command into a GCS bucket. When I execute the bash command in my local terminal, everything works properly and the data is loaded into GCS. Unfortunately, the same bash command doesn't work via Airflow. Airflow marks the task as successfully done, but in GCS I see an empty file. I suspect this happens because Airflow runs out of memory, but I am not sure. If so, can someone explain how and where the results are stored in Airflow? I see in the BashOperator documentation that Airflow creates a temporary directory which is cleaned up after execution. Does that mean the results of the bash command are also cleaned up afterwards? Is there any way to save the results to GCS?
This is my dag:
get_data = BashOperator(
    task_id='get_data',
    bash_command='curl -X GET -H "XXX: xxx" some_url | gsutil cp -L manifest.txt - gs://bucket/folder1/filename.json; rm manifest.txt',
    dag=dag,
)
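One thing worth checking (an assumption, not confirmed in the thread): if curl fails, the pipe still runs gsutil with empty input, so an empty file lands in GCS and the task still "succeeds". A sketch of a bash_command body that fails loudly instead:
# fail the task if any command in the pipeline fails
set -e -o pipefail
# -f makes curl return a non-zero exit code on HTTP errors
curl -sf -X GET -H "XXX: xxx" some_url | gsutil cp - gs://bucket/folder1/filename.json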

logstash with bash script execution on a different server

I have Elasticsearch running on one server, and I need to execute a bash script that is located on a different machine/server. My question is whether I can use the exec command plugin in Logstash to have Logstash execute a bash script that lives on another server.
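Logstash's exec mechanisms only run commands on the machine where Logstash itself runs. A common workaround (an assumption here, not something confirmed in the thread) is to make that local command an ssh call that runs the script on the remote server, assuming key-based SSH access is set up. The user, host, and script path below are placeholders:
ssh logstash@other-server 'bash /opt/scripts/my_script.sh'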

cassandra nodetool refresh from host

I want to execute the Cassandra "nodetool refresh" command from outside the container.
Is there a command I can run via a shell script?
My shell script nodetool_refresh.sh is on the host.
I want to mention that I don't want to use the docker exec command.
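One approach that avoids docker exec (an assumption, not from the thread): nodetool can talk to a remote JMX endpoint, so if the container exposes Cassandra's JMX port (7199 by default) to the host, nodetool_refresh.sh on the host could run something like the sketch below. Host, port, keyspace, and table names are placeholders:
#!/bin/bash
# requires JMX to be reachable from the host
# (e.g. the container publishes port 7199)
nodetool -h 127.0.0.1 -p 7199 refresh my_keyspace my_table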

error running .sh script via Jenkins to execute aws command

My problem is that I'm trying to execute a shell script via Jenkins that copies files created by MSBuild to AWS S3.
I added a new build step "Execute Shell" set to run the script with the command sh publishS3.sh, but nothing happens and the files don't appear in the S3 bucket.
My Jenkins runs on a local Windows server.
When I run the shell script by typing sh publishS3.sh in the Jenkins local directory, everything is OK and the files are copied successfully to the S3 bucket, but when I run it from Jenkins nothing happens. My publishS3.sh script is:
#!/bin/bash
aws s3 cp Com.VistaDraft.Common.dll s3://download.vistadraft.com/MVP
I tried to check what output I get after execution by adding > output.txt to the end of the command, but Jenkins generates an empty file. If I do the same locally, I get a message that the files were copied successfully to S3. I set the Jenkins shell executable path to C:\Program Files\Git\git-bash.exe and I'm using git-bash.exe locally too. Does anyone know where the problem is? Please advise.
You could try adding -ex to the first line of the script so you can see what it's doing and ease the debugging:
#!/bin/bash -ex
# rest of script
Make sure the aws tool is in the PATH of the environment where Jenkins runs your script. It might help to specify the full path to the command.
You could put which aws in your script to see what's going on.
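Putting those suggestions together, a sketch of the script with debugging enabled (the AWS CLI install path below is an assumption; adjust it to your Jenkins node):
#!/bin/bash -ex
# show which aws binary, if any, is on the PATH Jenkins uses
which aws || echo "aws not found in PATH"
# assumed install location of the AWS CLI on the Windows node (git-bash path style)
"/c/Program Files/Amazon/AWSCLIV2/aws" s3 cp Com.VistaDraft.Common.dll s3://download.vistadraft.com/MVP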

mongodb backup script

I need to run mongodump on my database every day.
How do I automate this reasonably? Every day I want a new folder created with the timestamp and the dump data inside.
Thanks.
Look at
https://github.com/micahwedemeyer/automongobackup
Otherwise use standard tools like cron or shell scripting for wrapping the mongodump call.
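As a sketch of the cron approach (paths and the database name are placeholders): a small wrapper script that dumps into a fresh timestamped folder each day, plus a crontab entry to run it.
#!/bin/bash
# mongo_backup.sh - dump the database into a folder named with today's date
BACKUP_DIR="/var/backups/mongo/$(date +%Y-%m-%d)"
mkdir -p "$BACKUP_DIR"
mongodump --db myDatabaseName --out "$BACKUP_DIR"
Then in crontab (crontab -e), run it every day at 02:00:
0 2 * * * /usr/local/bin/mongo_backup.sh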
I have a super quick handy script. Sometimes I create a cron job for one of my databases.
ssh root@hostname "mongodump --db myDatabaseName --out /tmp/mongo-backup ; zip -r /tmp/mongo-backup$(date "+%Y.%m.%d").zip /tmp/mongo-backup ; rm -rf /tmp/mongo-backup"
scp root@hostname:/tmp/mongo-backup$(date "+%Y.%m.%d").zip ./
The above script does two things:
1. Runs mongodump on the remote host and builds a ZIP file named like mongo-backup2017.03.02.zip.
2. Downloads that file via SCP to your local machine.
You can use the cron scheduler to run a mongodump shell script every day. Or you can even use iCal by creating an event, editing it, and selecting Run Script.