Transfer from SMB to Nextcloud while keeping SMB

I already have an SMB network share set up, and every user has their own home folder that is shared. Now I want to switch to Nextcloud, because SMB is quite slow over VPN. There is probably a way to fix that, but as far as I know Nextcloud is faster, and since I'm not a network expert it's just too big a time sink to chase down. I want to keep my old SMB structure and have the files shared both over SMB and through Nextcloud. The problem is that Nextcloud is not aware of files added via SMB. How can I tell Nextcloud to "scan" for new files? I'm guessing there is some command I can run to check whether new files have been added.

To enable this, you need to have Nextcloud perform a full file scan. As the folders grow, these scans take longer and longer, which is why triggering a scan for every newly added file is not worth it. The more practical option is to run a cronjob once or twice a day, at times when the cloud is least likely to be in use.
To configure the cronjob for www-data, the following changes have to be made:
Type "sudo crontab -u www-data -e" in your terminal.
Append the following line to trigger a file scan every day at 2am: 0 2 * * * php "path_to_nextcloud"/occ files:scan --all (an example crontab is shown below).
Now all files in Nextcloud's data directory will be scanned every day at 2am.
If you are not familiar with cron syntax, https://crontab.guru/ is a helpful reference.
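For reference, the www-data crontab could end up looking like the sketch below; "path_to_nextcloud" stays a placeholder for your installation directory, and the commented-out line shows the per-user variant of occ files:scan ("alice" is a hypothetical user id).

    # m h dom mon dow  command
    0 2 * * * php "path_to_nextcloud"/occ files:scan --all
    # alternative: scan only one user's files instead of everything
    # 0 2 * * * php "path_to_nextcloud"/occ files:scan alice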

Related

How to email time-out notice from Google Cloud Storage?

I have a gsutil script that periodically backs up data to Google Cloud Storage.
The gsutil backup script runs on my local box.
I would like to run a script (or service) on Google Cloud Storage that emails a warning to the administrator when no backup has been made in 24 hours.
I am new to cloud services. Please point me in the right direction.
Where would such a script be located? Is there a similar example script?
Thank you.
There's no built-in feature that accomplishes this. However, you could accomplish something like this with another monitor program.
For example, I might edit my backup script such that after successfully completing a backup, it writes the current time to a "last_successful_backup.txt" file. Then, I'd put a cronjob wherever I keep my monitors and alerting systems that would check the "last_successful_backup.txt" file every few hours and set off an alarm if the time it contains is older than 24 hours.
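As a rough sketch of such a monitor (this is not an existing Google Cloud feature), the Go program below checks the marker file's modification time instead of parsing the timestamp written inside it, and simply prints an alert that a real setup would turn into an email via SendGrid, Mailgun, or similar; the file path is an assumption.

    package main

    import (
        "fmt"
        "os"
        "time"
    )

    func main() {
        // Path touched by the backup script after each successful run (hypothetical location).
        const marker = "/var/backups/last_successful_backup.txt"

        info, err := os.Stat(marker)
        if err != nil {
            fmt.Fprintf(os.Stderr, "ALERT: cannot read %s: %v\n", marker, err)
            os.Exit(1)
        }

        // If the marker is older than 24 hours, no backup has completed recently.
        if time.Since(info.ModTime()) > 24*time.Hour {
            fmt.Printf("ALERT: last backup was at %s, more than 24h ago\n",
                info.ModTime().Format(time.RFC3339))
            // Here you would call your mail provider's API instead of printing.
            os.Exit(2)
        }
        fmt.Println("OK: backup is recent")
    }

Run it from the cronjob every few hours and turn the printed alert into whatever notification channel you already use.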
What about spinning up a Google VM and sending the emails from that instance, using, say, SendGrid, Mailgun, or Mailjet?

Golang file and folder replication / mirroring across multiple servers

Consider this scenario: in a load-balanced environment, I have 3 separate instances of a CMS running on 3 different physical servers. These 3 running instances of the application share the same database.
On each server, the CMS has a /media folder where all media subfolders and files reside. My question is how I'd implement/code a file replication service/functionality in Golang, so when a subfolder or file is added/changed/deleted on one of the servers, it'll get copied/replicated/deleted on all other servers?
What packages would I need to look in to, or perhaps you have a small code snippet to help me get started? That would be awesome.
Edit:
This question has been marked as a "duplicate", but it is not. It is, however, an alternative to setting up a shared network file system. I'm thinking that keeping a copy of the same files on all servers, and keeping them synchronized and up to date, might be better than sharing them.
You probably shouldn't do this. Use a distributed file system, object storage (a la S3 or GCS), or a syncing program like btsync or syncthing.
If you still want to do this yourself, it will be challenging. You are basically building a distributed database and they are difficult to get right.
At first blush you could check out something like etcd or raft, but unfortunately etcd doesn't work well with large files.
You could, on upload, also copy the file to every other server using ssh. But then what happens when a server goes down? Or what happens when two people update the same file at the same time?
Maybe you could design it such that every file gets a unique id (perhaps based on the hash of its contents so you can safely dedupe) and those files can never be updated or deleted, only added. That would solve the simultaneous update problem, but you'd still have the downtime problem.
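To make the "unique id from the file's contents" idea concrete, here is a minimal Go sketch (the file path is hypothetical): hashing the bytes with SHA-256 gives an id that is identical on every server for identical content, which is what makes deduplication safe.

    package main

    import (
        "crypto/sha256"
        "encoding/hex"
        "fmt"
        "io"
        "os"
    )

    // fileID returns a content-addressed id for a file: the hex-encoded SHA-256
    // of its bytes. A changed file simply gets a new id, so files are only ever
    // added, never updated in place.
    func fileID(path string) (string, error) {
        f, err := os.Open(path)
        if err != nil {
            return "", err
        }
        defer f.Close()

        h := sha256.New()
        if _, err := io.Copy(h, f); err != nil {
            return "", err
        }
        return hex.EncodeToString(h.Sum(nil)), nil
    }

    func main() {
        id, err := fileID("media/example.jpg") // hypothetical path
        if err != nil {
            fmt.Fprintln(os.Stderr, err)
            os.Exit(1)
        }
        fmt.Println(id)
    }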
One approach would be for each server to maintain an append-only version log when a file is added:
VERSION | FILE HASH
1 | abcd123
2 | efgh456
3 | ijkl789
With that, you can pull every file from a server, and a single number is sufficient to know when files have been added. (For example, if you think Server A is on version 5 and you are told it is now on version 7, you know you need to sync 2 files.)
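In Go, such a log could be as small as the sketch below; the types and method names are made up for illustration, and persistence and locking are left out. A peer only has to remember the last version number it saw from each server.

    package main

    import "fmt"

    // Entry records one added file: its position in the log and its content hash.
    type Entry struct {
        Version  int
        FileHash string
    }

    // Log is an append-only version log for a single server.
    type Log struct {
        entries []Entry
    }

    // Append records a newly added file and returns the new version number.
    func (l *Log) Append(hash string) int {
        v := len(l.entries) + 1
        l.entries = append(l.entries, Entry{Version: v, FileHash: hash})
        return v
    }

    // Since returns the hashes added after the given version -- exactly the
    // files a peer that last saw that version still needs to fetch.
    func (l *Log) Since(version int) []string {
        var hashes []string
        for _, e := range l.entries {
            if e.Version > version {
                hashes = append(hashes, e.FileHash)
            }
        }
        return hashes
    }

    func main() {
        var l Log
        l.Append("abcd123")
        l.Append("efgh456")
        l.Append("ijkl789")
        // A peer that thinks we are on version 1 needs the last two files.
        fmt.Println(l.Since(1)) // [efgh456 ijkl789]
    }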
You could do this with a database table:
ID | LOCAL_SERVER_ID | REMOTE_SERVER_ID | VERSION | FILE HASH
Which you could periodically poll and do your syncing via ssh or http between machines. If a server was down you could just retry until it works.
Or, if you didn't want a centralized database for this, you could use a library like memberlist. The local metadata for each node could be its version.
Either way, there will be some delay between when a file is uploaded to a single server and when it's available on all of them. Handling that well is hard, which is why you probably shouldn't do this.

Syncing a file from a client to a server

I'm trying to keep a file updated with the server in real time. It's more like real-time syncing with a very small delay. Is there any application that lets me do this? Or would you suggest using a local host as the server?
I don't know how you are connected to your server, but I assume it will be something like SCP / SFTP / FTP, and I don't know your OS. WinSCP will do exactly what you need: you can set it to watch your filesystem (a specified folder), and it will update the server files as soon as the file on your drive changes.
It also supports command-line scripting, so you can use it from within your own applications (a rough sketch follows below).
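As a sketch of that command-line use, the following assumes a saved WinSCP session named "mysite" and placeholder folder paths; keepuptodate is WinSCP's "keep remote directory up to date" feature, but double-check the exact options against the WinSCP scripting documentation for your version.

    winscp.com /command ^
        "open mysite" ^
        "keepuptodate C:\local\folder /remote/folder" ^
        "exit"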

Copying large files over

I have a dedicated server where I host a large website. We need to do an upgrade on the website, and I want to create a development copy on a test URL (on a different cPanel account) but on the same server.
The files are around 1GB in total size and 70,000 in number.
I have tried WS FTP pro but it has only copied 10% in around 20 hours.
What's the easiest and quickest method to create a replica on my development URL?
I am a newbie so please give detailed instructions.
Thanks
I would think the easiest method would be this:
Create the new account in WHM
Login via SSH
Navigate to your existing account folder
Copy the files to the new account folder
This should be pretty easy for you, as long as you know how to access your server via SSH. It's pretty simple:
Login via SSH
Type su and enter your root password (this is only necessary if you SSH into your server using an account other than root - a good practice, in my opinion)
Find and navigate to your source account. I'm assuming you're probably set up to have your web accounts in the /home folder, so try typing something like cd /home/source_folder
Once you're in the correct source directory, type cp -R * /home/destination_folder
That's pretty much it. The -R option recursively copies all the files from your source to your destination, and if you're copying a HUGE number of files, you might consider adding --verbose after the -R option so you can see it working. I apologize in advance if I've gone a little more granular than needed.
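Put together, the SSH session boils down to something like this; source_folder and destination_folder are placeholders for the two cPanel account directories:

    su                                           # become root (enter the root password)
    cd /home/source_folder                       # the existing account
    cp -R --verbose * /home/destination_folder   # copy everything, printing each file as it goes
    # note: a bare * skips hidden files such as .htaccess; to include them, copy the
    # directory contents instead:  cp -R /home/source_folder/. /home/destination_folder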

Encrypted FTP Storage

I guess this is kind of a programming question, because I'm going to write a program if this doesn't exist.
So I found a very cheap web host (I don't really care about the actual web hosting). They will give me a domain name and an FTP server with a ton of storage space. Anyway, I want to back up a few hundred gigs of data (mostly family photos and scans of important documents). I also want to back up any future family photos / documents. I don't care if everything on my local NAS dies in a fire; I just want the photos and important documents backed up off-site.
So I want a program that lets me select folders locally and schedules them to be backed up to the FTP server. I'm a bit of a security nut, so I'd like the files to be encrypted locally before being transferred up to the server.
I know I can do this with TrueCrypt volumes, but I don't want to transfer an entire encrypted volume blob up to the server every time I change a file in it. I could use multiple TrueCrypt volumes, but that would be a pain to manage.
Also this must be mac/linux compatible although I'll primarily be on linux.
I basically need rsync + truecrypt + cron + sftp all rolled into a cryptographically secure program.
I've been searching for days with no luck. Any ideas?
mozyBackup does this - it doesn't use FTP; it has a custom uploader.
P.S. Remember that a typical home ADSL connection only does about 1Gb/day upstream.
Linux option.
An out-of-the-box option is probably duplicity (for example, see http://www.howtoforge.com/creating-encrypted-ftp-backups-with-duplicity-and-ftplicity-on-debian-lenny); a command sketch is shown below.
Otherwise, if these are basically rarely-changed archive copies of files, I would roll my own: gnupg (or dpad) for individual file encryption, a file-changed script, and ftp or rsync.
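As a rough sketch of the duplicity route (host, account names, and paths are placeholders; see the linked howto for a full walk-through), an encrypted backup to an FTP host boils down to the lines below. duplicity encrypts with GnuPG on the local machine before anything is uploaded, which matches the "encrypt locally first" requirement.

    export FTP_PASSWORD='your-ftp-password'
    export PASSPHRASE='your-gpg-passphrase'    # used for the local GnuPG encryption
    duplicity /home/you/photos ftp://ftpuser@ftp.example.com/backups/photos
    # later, to restore:
    duplicity ftp://ftpuser@ftp.example.com/backups/photos /home/you/photos-restored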
