How to email a time-out notice from Google Cloud Storage?

I have a gsutil script that periodically backs up data to Google Cloud Storage.
The gsutil backup script runs on my local box.
I would like to run a script (or service) on Google Cloud Storage that emails a warning to the administrator when no backup has been made in 24 hours.
I am new to cloud services. Please point me in the right direction.
Where would such a script be located? Is there a similar example script?
Thank you.

There's no built-in feature that does this. However, you could accomplish something like it with a separate monitoring program.
For example, I might edit my backup script such that after successfully completing a backup, it writes the current time to a "last_successful_backup.txt" file. Then, I'd put a cronjob wherever I keep my monitors and alerting systems that would check the "last_successful_backup.txt" file every few hours and set off an alarm if the time it contains is older than 24 hours.
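A minimal sketch of such a monitor, assuming the backup script writes a Unix timestamp (date +%s) to the file after each success and a mail command is available on the monitoring box; the path and address are placeholders:
#!/bin/bash
# check_backup.sh: alert if the last successful backup is older than 24 hours.
# Assumes the backup script writes `date +%s` to this file after each success.
STAMP_FILE=/var/backups/last_successful_backup.txt   # placeholder path
ALERT_TO=admin@example.com                           # placeholder address
last=$(cat "$STAMP_FILE" 2>/dev/null)
last=${last:-0}   # treat a missing or empty file as "never backed up"
now=$(date +%s)
if [ $(( now - last )) -gt 86400 ]; then
    echo "No successful backup since $(date -d @"$last")." \
        | mail -s "Backup overdue" "$ALERT_TO"
fi
You'd then run it from cron every few hours, e.g. 0 */4 * * * /usr/local/bin/check_backup.sh.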

What about spinning up a Google VM and sending the emails from the instance, using, say, SendGrid, Mailgun, or Mailjet?
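For instance, with SendGrid's v3 Mail Send API the warning can be a single curl call from the VM; the API key and both addresses below are placeholders you'd supply yourself:
# Send a warning email via SendGrid's v3 API (key and addresses are placeholders):
curl -s https://api.sendgrid.com/v3/mail/send \
  -H "Authorization: Bearer $SENDGRID_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"personalizations": [{"to": [{"email": "admin@example.com"}]}],
       "from": {"email": "alerts@example.com"},
       "subject": "Backup overdue",
       "content": [{"type": "text/plain", "value": "No backup in the last 24 hours."}]}'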

Related

How can I run a .sh script on Google Cloud Shell on schedule?

I have a .sh script in Google Cloud Shell that automates my instance shutdown, backup, restart sequence.
How can I run a .sh script on a schedule (i.e. daily) in the simplest possible way?
I am not a professional and I've read all the documentation about cron jobs, Cloud Scheduler, Cloud Tasks... but none of the examples in the documentation appear to detail the simple task that I need, and I do not have enough knowledge yet to understand these multiple services in detail... I just need a simple direction pointer to understand how to connect my Google Cloud Shell .sh script with any form of scheduler, as in:
Run a .sh script that I have in my virtual 5 GB Cloud Shell storage on a schedule (daily at a specific time), instead of manually opening the Google Cloud Console and using a terminal to run the same script with the "bash" command?
I just need to know what I need to learn/do to make this happen.
Thank you for your input.
That's not going to be possible. Cloud Shell will turn off shortly after you close the tab. For this you'll need to use an actual VM. You can run one for free using the e2-micro instance.
https://cloud.google.com/free/docs/gcp-free-tier/#compute
Once you have this set up, you can use crontab to run your script on a schedule.
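Once the VM is up, a single crontab entry is enough. A sketch, assuming you copy your script to /home/you/backup.sh on the VM (the path is a placeholder): run crontab -e and add a line like
# Run the shutdown/backup/restart script daily at 03:00, logging its output:
0 3 * * * /bin/bash /home/you/backup.sh >> /home/you/backup.log 2>&1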

Executing gcloud commands in bash

I've spent 3 days beating my head against this before coming here in desperation.
So long story short, I thought I'd fire up a simple PHP site to give moderators of a gaming group I'm in the ability to start GCP servers on demand. I'm no developer, so I'm looking at this from a systems perspective to find the simplest solution to do the job.
I fired up an Ubuntu 18.04 machine on GCP, set it up with the Google SDK, authorised it for access to the project, and was able to simply run gcloud commands, which worked fine. I had some issues with the PHP file calling the shell script to run the same commands, but with some testing I can see it's now calling the shell script no worries (it broadcasts wall "test" to the console every time I click the button on the PHP page).
However, what does not happen is the execution of the gcloud command. If I manually run this shell script it starts up the instance no worries and broadcasts wall; if I click the button it broadcasts, but that's it. I've set the files to have execution rights and I've even given the user nginx runs as sudo rights; putting sudo sh in front of the command in the PHP file also made no difference. Please find the bash script below:
#!/bin/bash
/usr/lib/google-cloud-sdk/bin/gcloud compute instances start arma3s1-prod --zone=australia-southeast1-b
wall "test"
Any help would be greatly appreciated, this coupled with an automated shut down would allow our gaming group to save money by only running the servers people want to play on.
Any more detail you want about the underlying system please let me know.
So I asked a PHP dev at work about this and in two seconds flat she pointed out the issue and now I feel stupid. In /etc/passwd the www-data user had /usr/sbin/nologin and after I fixed that running the script gcloud wanted permissions to write a log file to /var/www. Fixed those and it works fine. I'm not terribly worried about the page or even server being hacked and destroyed, I can recreate them pretty easily.
Thanks for the help though! Sometimes I think I just need to take a step back and get a fresh set of eyes on the problem.
When you launch a command while logged in, you have your account's access rights to the Google Cloud API, but the PHP account doesn't have those.
Even if you give the www-data user root rights, that won't fix the problem; it may create some security issues but nothing more.
If you really want to do this, you should create a service account that only has rights on the Compute Engine instances inside your project, and point the GOOGLE_APPLICATION_CREDENTIALS environment variable at its JSON key file. This way your PHP should have enough rights to do what you are asking of it.
Note that the issue with this method is that if you are hacked, there is a chance the instance hosting your PHP could be deleted too.
You could also make a call to a prepared Cloud Function which will create the instance; this way, even if your instance is deleted, the Cloud Function would still be there.
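A sketch of the service-account setup from the web server's shell, assuming you have created a service account restricted to a compute-only role and downloaded its key; the key path is a placeholder:
# Make the key available to Google client libraries (path is hypothetical):
export GOOGLE_APPLICATION_CREDENTIALS=/etc/keys/start-server-sa.json
# The gcloud CLI needs the service account activated explicitly:
gcloud auth activate-service-account --key-file=/etc/keys/start-server-sa.json
# The start command now runs with only that account's rights:
gcloud compute instances start arma3s1-prod --zone=australia-southeast1-b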

How can I do a daily database backup for my VPS?

I currently have a VPS and I have the fear of something happening to it, either by me or by the hosting company, and so I need to have a daily backup sent to servers unrelated to that of the hosting company.
Essentially I need my server to automatically export my database into an SQL file and then send it to a third-party server, say, Google or whatever, on a daily basis or even a few times every day, so if something happens to the server the SQL files will be accessible regardless.
How can I achieve that?
We are not supposed to write you a solution, only to help you with coding errors etc.
Here's what you can do:
Create a shell script on the remote server where you want to save the database; this can be a Mac or a Linux box. We need cron and a shell.
Create a cron job to run daily.
Shell script example (dbBackup.sh):
#!/bin/bash
# Dump the remote database into a dated local file.
today=$(date '+%Y-%m-%d')
ssh root@remoteServer.com "mysqldump -u root --password=SomeDiffPassword databaseName" > /home/user/DailyDatabaseBackups/database_$today.sql
Cron example (daily at 02:00):
0 2 * * * /home/user/dbBackup.sh
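If you also want the dump stored offsite with a third party such as Google, you could append an upload step to the same script; a sketch assuming gsutil is configured and a bucket named my-db-backups exists (both are assumptions):
# Copy the day's dump to a Google Cloud Storage bucket (bucket name is a placeholder):
gsutil cp /home/user/DailyDatabaseBackups/database_$today.sql gs://my-db-backups/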

How to schedule a shell script using Google Cloud Shell?

I have a .sh file that is stored in GCS. I am trying to schedule the .sh file through Google Cloud Shell.
I can run the same file using gsutil cat gs://miptestauto/baby.sh | sh command but not able to schedule it.
Following is my code for scheduling the file:
16 17 * * * gsutil cat gs://miptestauto/baby.sh | sh
It displays the message "auto saving...done", but the scheduled job is not displayed when I use crontab -l
# contents of .sh file
#!/bin/bash
bq load --source_format=CSV babynames.baby_destination13 gs://testauto/yob2010.txt name:string,gender:string,count:integer
Can anyone please tell me how to schedule it using Google Cloud Shell?
I am not using compute engine/app engine. Just wanted to schedule it using the cloud shell.
thank you in advance :)
As per the documentation, Cloud Shell is intended for interactive use only. The Cloud Shell instances are provisioned on a per-user, per-session basis and sessions are terminated after an hour of inactivity.
In order to schedule a daily cron job, the instance needs to be up and running all time but this doesn’t happen with Cloud Shell and I believe your jobs are not running because of this.
When you start Cloud Shell, it provisions an f1-micro instance, which is the same machine type you can get for free if you are eligible for "Always Free". Therefore you can create an f1-micro instance, configure the cron job on it and leave it running so it can execute the daily job.
You can check free usage limits at https://cloud.google.com/compute/pricing#freeusage
You can also use the Cloud Scheduler product https://cloud.google.com/scheduler which is a serverless managed Cron like scheduler.
To schedule a script you first have to create a project if you don’t have one. I assume you already have a project so if that’s the case just create the instance that you want for scheduling this script.
To create the new instance:
At the Google Cloud Platform Console click on Products & Services which is the icon with the four bars at the top left hand corner.
On the menu go to the Compute section and hover on Compute Engine and then click on VM Instances.
Go to the menu bar above the instance section and there you will see a Create Instance button. Click it and fill in the configuration values that you want your new instance to have. The values that you select will determine your VM instance features. You can choose, among other values, the name, zone and machine type for your new instance.
In the Machine type section click the drop-down menu tab to select an “f1-micro instance”.
In the Identity and API access section, give access scope to the Storage API so that you can read and write to your bucket in case you need to do so; the default access scope only allows you to read. Also enable the BigQuery API, since the script calls bq load.
Once you have the instance created and access to the bucket, just create your cron job inside your new instance: in the user account under which the cron job will execute, run crontab -e and edit this file to add the cron job that will execute your baby.sh script. The crontab documentation should help you with this.
Please note, if you want to view output from your script you may need to redirect it to your current terminal.
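For reference, the crontab entry on the new instance could look like this, reusing the schedule and bucket path from the question; the log redirect is optional but makes it easy to view the output later:
# Run the script from GCS daily at 17:16 and keep the output for inspection:
16 17 * * * gsutil cat gs://miptestauto/baby.sh | bash >> /tmp/baby.log 2>&1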

In rsyslog, how can I trigger an hourly email aggregating entries in one of our logs?

One of our apps has been configured to log certain errors to a log on a remote server using rsyslog. I've been asked to provide an hourly email that lists the errors logged within the last hour. I've looked at ommail, but it doesn't seem to do exactly this. Any suggestions on how best to do this?
I would go low tech on this:
put error messages in a separate file like
*.err /var/log/error.log
then rotate it hourly via logrotate
From logrotate, you can run a script in the prerotate or postrotate part, where you can take the contents of the file and send them via Email.
ommail is more for sending logs matching a certain filter, so it would be hacky to make it send such "digests".
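A sketch of the logrotate side, assuming a mail command is available on the host; note that logrotate itself must be invoked hourly (e.g. from /etc/cron.hourly) for the hourly directive to take effect, and the file path /etc/logrotate.d/error-digest is just a suggestion:
# /etc/logrotate.d/error-digest (hypothetical path)
/var/log/error.log {
    hourly
    rotate 24
    missingok
    notifempty
    copytruncate
    postrotate
        # error.log.1 now holds the hour just rotated out; mail it as the digest
        mail -s "Hourly error digest" admin@example.com < /var/log/error.log.1
    endscript
}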
