How can I do a daily database backup for my VPS? - vps

I currently have a VPS and I'm afraid of something happening to it, whether caused by me or by the hosting company, so I need a daily backup sent to servers unrelated to the hosting company.
Essentially I need my server to automatically export my database to an SQL file and send it to a third-party server (Google, for example) on a daily basis, or even a few times a day, so that if something happens to the server the SQL files remain accessible regardless.
How can I achieve that?

We are not supposed to write a full solution for you, only to help with coding errors and the like.
Here's what you can do:
Create a shell script on the machine where you want to save the database. This can be a Mac or a Linux box; all we need is cron and a shell.
Create a cron job to run daily.
Shell script example (dbBackup.sh):
#!/bin/bash
today=$(date '+%Y-%m-%d')
ssh root@remoteServer.com "mysqldump -u root --password=SomeDiffPassword databaseName" > /home/user/DailyDatabaseBackups/database_$today.sql
Cron example (daily at 2am; note that "* * * * *" would run the script every minute):
0 2 * * * /home/user/dbBackup.sh

Related

Transfer from SMB to Nextcloud

I already have an SMB network share set up. Every user has their own shared home folder. Now I want to switch to Nextcloud, since SMB is quite slow over VPN. There is probably a way to fix that, but as far as I know Nextcloud is faster, and since I'm not a network expert it's just too big a time sink. I want to keep my old SMB structure and have the files shared both from SMB and from Nextcloud. But Nextcloud is not aware of files added via SMB. How can I tell Nextcloud to "scan" for new files? I'm guessing there is some command I can run to check whether new files have been added.
To enable this, you need to perform a full file scan with Nextcloud. As the folders get bigger, the file scans take more and more time, which is why triggering a scan for every newly added file is not worth it. The better option is to run a cronjob once or twice a day at times when the cloud is least likely to be in use.
To configure the cronjob for www-data the following changes have to be made:
Type "sudo crontab -u www-data -e" in your terminal.
Append the following line to trigger a file scan every day at 2am: "0 2 * * * php "path_to_nextcloud"/occ files:scan --all".
Now all files in the data directory of Nextcloud will be scanned every day at 2am.
If you are not familiar with setting up a cronjob, https://crontab.guru/ is a helpful reference.
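If a nightly full scan still feels heavy, occ can also scan a narrower target instead of everything. A crontab sketch for www-data, assuming "path_to_nextcloud" and the user name "alice" are placeholders for your own setup:

```shell
# Full scan nightly at 2am
0 2 * * * php path_to_nextcloud/occ files:scan --all
# Alternatively, scan only one user's tree to keep the scan short
30 2 * * * php path_to_nextcloud/occ files:scan alice
```

Scanning per user is a reasonable middle ground when only a few home folders actually receive files over SMB.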

How to schedule a shell script using Google Cloud Shell?

I have a .sh file stored in GCS. I am trying to schedule it through Google Cloud Shell.
I can run the file with the command gsutil cat gs://miptestauto/baby.sh | sh, but I am not able to schedule it.
Following is my code for scheduling the file:
16 17 * * * gsutil cat gs://miptestauto/baby.sh | sh
It displays the message "auto saving..done", but the scheduled job is not shown when I run crontab -l.
# contents of .sh file
#!/bin/bash
bq load --source_format=CSV babynames.baby_destination13 gs://testauto/yob2010.txt name:string,gender:string,count:integer
Can anyone please tell me how to schedule it using Google Cloud Shell?
I am not using Compute Engine or App Engine; I just want to schedule it using Cloud Shell.
Thank you in advance :)
As per the documentation, Cloud Shell is intended for interactive use only. The Cloud Shell instances are provisioned on a per-user, per-session basis and sessions are terminated after an hour of inactivity.
In order to schedule a daily cron job, the instance needs to be up and running all the time, but this doesn't happen with Cloud Shell, and I believe your jobs are not running because of this.
When you start Cloud Shell, it provisions a f1-micro instance which is the same machine type you can get for free if you are eligible for “Always Free”. Therefore you can create a f1-micro instance, configure the cron job on it and leave it running so it can execute the daily job.
You can check free usage limits at https://cloud.google.com/compute/pricing#freeusage
You can also use the Cloud Scheduler product, https://cloud.google.com/scheduler, which is a serverless managed cron-like scheduler.
To schedule a script you first have to create a project if you don't have one. I assume you already have a project, so in that case just create the instance you want to use for scheduling this script.
To create the new instance:
At the Google Cloud Platform Console click on Products & Services which is the icon with the four bars at the top left hand corner.
On the menu go to the Compute section and hover on Compute Engine and then click on VM Instances.
Go to the menu bar above the instance section and there you will see a Create Instance button. Click it and fill in the configuration values that you want your new instance to have. The values that you select will determine your VM instance features. You can choose, among other values, the name, zone and machine type for your new instance.
In the Machine type section click the drop-down menu tab to select an “f1-micro instance”.
In the Identity and API access section, give access scope to the Storage API so that you can read and write to your bucket in case you need to do so; the default access scope only allows you to read. Also enable BigQuery API.
Once you have the instance created and access to the bucket, just create your cron job inside your new instance: in the user account under which the cron job will execute, run crontab -e and add the entry that will execute your baby.sh script.
Please note, if you want to view output from your script you may need to redirect it to your current terminal.
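Putting the pieces together on the f1-micro VM, the crontab entry from the question can be kept as-is, with output redirected to a log file so failures are visible later (the log path is a placeholder):

```shell
# Run baby.sh daily at 17:16 and keep a log for troubleshooting
16 17 * * * gsutil cat gs://miptestauto/baby.sh | sh >> /home/user/baby.log 2>&1
```

Since the VM (unlike Cloud Shell) stays up, crontab -l will now show the job and it will actually fire at the scheduled time.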

How to email time-out notice from Google Cloud Storage?

I have a gsutil script that periodically backs up data to Google Cloud Storage.
The gsutil backup script runs on my local box.
I would like to run a script (or service) that emails a warning to the administrator when no backup has been made in 24 hours.
I am new to cloud services. Please point me in the right direction.
Where would such a script be located? Is there a similar example script?
Thank you.
There's no built-in feature that accomplishes this. However, you could accomplish something like this with another monitor program.
For example, I might edit my backup script such that after successfully completing a backup, it writes the current time to a "last_successful_backup.txt" file. Then, I'd put a cronjob wherever I keep my monitors and alerting systems that would check the "last_successful_backup.txt" file every few hours and set off an alarm if the time it contains is older than 24 hours.
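A minimal sketch of such a monitor, assuming the backup script writes the output of date +%s into last_successful_backup.txt after each successful run; the stamp path and admin address are placeholders, and mail(1) must be configured on the monitoring box:

```shell
#!/bin/bash
# check_backup.sh - warn the admin if the last successful backup
# is older than 24 hours.
set -u

STAMP_FILE="${1:-/var/backups/last_successful_backup.txt}"
MAX_AGE_SECONDS=$((24 * 60 * 60))

now=$(date +%s)
# The backup script writes `date +%s` to the stamp file after each success;
# a missing file counts as "never backed up"
last=$(cat "$STAMP_FILE" 2>/dev/null || echo 0)
age=$((now - last))

if [ "$age" -gt "$MAX_AGE_SECONDS" ]; then
    echo "WARNING: last successful backup was $age seconds ago" \
        | mail -s "Backup overdue" admin@example.com
fi
```

Run it from cron every few hours, e.g. "0 */6 * * * /home/user/check_backup.sh".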
What about spinning up a Google VM and sending the emails from the instance, using, say, SendGrid, Mailgun, or Mailjet?

Running background services in Ruby

I have to run a couple of scripts that crawl a few thousand web pages and save some information, every 10 minutes.
I am using DreamHost shared hosting for my PHP site.
What would be the appropriate way to configure these scripts in cron so that they execute 24x7?
Please let me know which host I can use for this.
If you can SSH into your server, run "crontab -e" to edit your cron jobs and then add a line like this:
*/10 * * * * /path/to/ruby /path/to/your/script.rb
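Since crawling thousands of pages can take longer than the 10-minute interval, it may also be worth guarding against overlapping runs. A crontab sketch assuming flock(1) from util-linux is available on the host and the paths are placeholders:

```shell
# Skip a run if the previous crawl still holds the lock,
# and log output so errors are visible
*/10 * * * * /usr/bin/flock -n /tmp/crawler.lock /usr/bin/ruby /home/user/script.rb >> /home/user/crawler.log 2>&1
```

The -n flag makes flock give up immediately instead of queueing runs behind a slow crawl.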

need script to export file to remote server

AIX 5.3
DB 10.2.0.4
My requirement is to create a script that exports a big table from the source database and sends it to a remote destination server, and, on the destination database server, another script that imports the exported table.
The export script also needs to purge the exported data, and it should be scheduled on a weekly basis.
Thanks,
There's a lot to this question. I would recommend you break it down and investigate how to accomplish each piece. For example:
Copy files between computers: see scp
Execute tasks on remote server: see ssh
Schedule on weekly basis: see cron
Your question may be better served on ServerFault, as well.
