Cloud Backup from Network Drive? - cloud-storage

My company keeps a document repository on a network drive. I need access to a small subset of directories on that remote drive (maybe 30 GB or so), but accessing the drive via VPN when I'm working remotely is unreliable.
I'm willing to buy some cloud storage, but I'm not sure how to configure it to include just some of the subdirectories on the network drive. (I know that active backups/syncing will only happen when my laptop is on the intranet and "sees" the network drive; this is fine.)
Is there a way to set up a local directory structure on my laptop, perhaps using symbolic links, that mirrors just the remote-drive subdirectories I need? Then I could point a cloud storage service at that local directory. Would this work?
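For example, something like this (paths are made up; on Windows I assume I'd use mklink /D instead of ln -s):

    # Local mirror containing only the subdirectories I need
    mkdir -p ~/repo-mirror
    ln -s /mnt/netdrive/docs/specs   ~/repo-mirror/specs
    ln -s /mnt/netdrive/docs/reports ~/repo-mirror/reports
    # ...then point the cloud sync client at ~/repo-mirror

(I realize some sync clients don't follow symlinks, which is part of what I'm asking.)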

I have been in your situation, and I found a solution that accomplishes what I needed: backing up my files to the cloud while still being able to access, share, and recover files anytime. I tried a couple of services like Carbonite and Mozy Pro (they were okay but not great), but Infrascale did pass the test. You could try contacting them, or even try Intronis.

Related

How can I create automatic file backups saved to cloud storage?

I have 4 Windows workstations that I need to back up to cloud storage, and I would like this to happen automatically. Is it possible?
You could set up a scheduled task on each workstation that regularly runs a gcloud/gsutil rsync from the correct local folder on that workstation to a dedicated folder in Google Cloud Storage.
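A minimal sketch of that idea (bucket name, folder, and schedule are placeholders):

    rem Sync a local folder to a per-workstation prefix in a GCS bucket
    gsutil -m rsync -r "C:\Data" gs://my-backup-bucket/workstation-1/

    rem Register it as a daily scheduled task (from an elevated prompt)
    schtasks /Create /TN "GcsBackup" /SC DAILY /ST 02:00 /TR "gsutil -m rsync -r C:\Data gs://my-backup-bucket/workstation-1/"

Note that gsutil rsync is one-way mirroring; add -d if you also want local deletions propagated to the bucket (use with care).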

Backup strategy for an Ubuntu/Laravel app

I am searching for a backup strategy for my web application files.
I am hosting my Laravel application on an Ubuntu (18.04) server in the cloud and currently have around 80 GB of storage that needs to be backed up (and this grows fast). The biggest files are around ~30 MB; the rest are small jpg/txt/pdf files.
I want to make a full backup of the storage directory at least twice a day and store it as a zip file on a local server. I have two reasons for this: independence from cloud providers, and archiving.
My first backup strategy was to zip all the contents of the storage folder and rsync the zip. This goes well until a couple of gigabytes; then the server gets completely stuck on CPU usage.
My second approach was plain rsync, but with this I can't track when a file is deleted or added.
I am looking for a good backup strategy that preferably generates zips before or after the backup and stores them, so we can browse and examine things back in time.
Strangely enough, I could not find anything that suits me. I hope someone can help me out.
I agree with @RobertFridzema that the whole server becomes unresponsive when using the ZIP functionality from the spatie package.
I had the same situation with a customer project. My suggestion is to keep the source code files under version control, back up only the dynamic/changing files with rsync (incremental works best and is fast), and create a separate database backup strategy. For example, with MySQL/MariaDB: mysqldump, then encrypt the resulting file and move it to external storage as well.
If ZIP creation is still a problem, I would use storage that is already set up with RAID functionality, or, if that is not possible, I would definitely not run the ZIP creation on the live server: rsync incrementally to another server and run the backup strategy there.
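As a rough sketch of that approach (paths and database name are placeholders):

    #!/usr/bin/env bash
    # Snapshot-style incremental backup: unchanged files are hard-linked
    # to the previous snapshot, so each run is fast but fully browsable.
    SRC=/var/www/app/storage
    DEST=/backups/app
    STAMP=$(date +%Y%m%d-%H%M)

    # (On the very first run the "latest" link does not exist yet;
    # rsync then simply copies everything.)
    rsync -a --delete --link-dest="$DEST/latest" "$SRC/" "$DEST/$STAMP/"
    ln -sfn "$DEST/$STAMP" "$DEST/latest"

    # Separate, encrypted database dump
    mysqldump --single-transaction mydb \
      | gpg --symmetric --cipher-algo AES256 -o "$DEST/$STAMP-db.sql.gpg"

Since every snapshot is a complete directory tree, you can browse back in time without unzipping anything, and added/deleted files show up when diffing two snapshots.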
Spatie has a package for Laravel backups that can be scheduled with the Laravel job scheduler. It will create zips of the entire project, including the storage dirs:
https://github.com/spatie/laravel-backup
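If you want to try it, installation and a manual run look roughly like this (destination disks and schedules are configured in config/backup.php):

    composer require spatie/laravel-backup
    php artisan vendor:publish --provider="Spatie\Backup\BackupServiceProvider"
    php artisan backup:run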

pyRevit Profile keeps changing

First, I'm not the user of this myself; I am implementing it for a couple of users.
We use VDI machines with all user profiles on the server. I have managed to clone the Git repo and leave a copy on the server, which I then use Robocopy to copy out to the users.
This has worked great, but we are facing an issue when they want to change some settings: we get an error. The settings work fine if the config file points to the UNC path (\\domain.local\share\users\username), but if it points to the drive letter of the share (T:\users\username) or to the C drive (C:\users\username), we get an error.
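For reference, the copy step is roughly this (example paths):

    rem Mirror the server-side pyRevit clone into the user's profile
    robocopy "\\domain.local\share\pyRevit" "C:\Users\%USERNAME%\pyRevit" /MIR /R:2 /W:5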
I'll look for the errors and upload it.
Cheers
Isaac

How to save/back up an Amazon instance locally

I would like to lower the cost I'm paying to Amazon.
There are stopped instances that I want to back up and save on my local, on-prem server.
After creating an image from the instance, is there any way I can copy the AMI to my local server and remove it from Amazon? Then, on the day I need it back, I can transfer it from my local server to Amazon and use it again.
The instance was first created on Amazon. I would rather save the instance on-premises as a file, not as a virtual server.
The main issue is: how can I transfer and save the image of an instance that was created on Amazon as a file on my local server, and how can I return it to Amazon in case I need to build the instance again?
Is there any way to do it?
Thanks a lot!
You can use some backup software (Duplicati, CloudBerry, or anything else):
Install the backup software on your EC2 instance
Make an image backup to S3 cloud storage
Install the backup software on your physical machine
Restore the image from S3 cloud storage to the physical machine, or to your local storage to keep the backup locally.
And last but not least:
Good luck!)))
You would need to use the VM Import/Export tool for that. Read the docs to make sure you know how to upload it again.
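For example, with the AWS CLI (bucket name, AMI ID, and object key are placeholders, and VM Import/Export requires its vmimport service role to be set up first):

    # Export the AMI to S3 as a VMDK, then download it
    aws ec2 export-image --image-id ami-0123456789abcdef0 \
        --disk-image-format VMDK \
        --s3-export-location S3Bucket=my-export-bucket,S3Prefix=exports/
    aws s3 cp s3://my-export-bucket/exports/export-ami-0abc123.vmdk .

    # Later, upload the file and import it back as a new AMI
    aws s3 cp export-ami-0abc123.vmdk s3://my-export-bucket/exports/
    aws ec2 import-image --disk-containers \
        "Format=VMDK,UserBucket={S3Bucket=my-export-bucket,S3Key=exports/export-ami-0abc123.vmdk}"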
As for the cost, I am not sure how Amazon calculates it; that is something you have to check in your account. Once you create the image, it is on your account, and I'm not sure whether AWS still charges you after you download it.
You can create an image file from your current drive, but it will be quite expensive:
create another instance
attach your volume there as the second drive
use something like dd if=/dev/xvd0 of=drive.img ... to copy the volume to a file
rsync / ftp / etc. the file to your local drive.
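Concretely, the copy-and-transfer steps might look like this (device name and destination host are placeholders):

    # On the helper instance, with the old volume attached as /dev/xvdf
    sudo dd if=/dev/xvdf of=/tmp/drive.img bs=1M status=progress
    gzip /tmp/drive.img            # mostly-empty volumes compress well
    rsync -avP /tmp/drive.img.gz user@onprem-server:/backups/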
You will be billed for the second instance and for the transfer. When you want to restore the machine, you'll be billed again.
Have you checked the free tier? You get a year of free access to AWS for small instances and volumes.
You need a tool to get what you want. Take, e.g., CloudBerry: create an image, store it at Amazon, then restore it, and you are done. This is the best option for you; there are no other ways.

Media folder on another server using a symlink? (Magento)

Is it possible to have my Magento store on one server and have the store's media folder on another server, using a symlink?
Would I run into any problems if I were to do this?
I don't have any experience with symlinks and am not exactly sure how they work.
You have to mount the second server's directory on your first server, and then you can create a symlink to that mounted point. sshfs is the tool I use for mounting.
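A minimal sketch of that setup (hosts and paths are examples):

    # On the Magento server: mount the remote media directory with sshfs
    sudo apt-get install sshfs
    mkdir -p /mnt/media-server
    sshfs magento@media-server:/var/www/media /mnt/media-server -o allow_other,reconnect

    # Then symlink it into the Magento docroot
    ln -s /mnt/media-server /var/www/magento/media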
But I would not recommend doing this, because performance-wise it will be slower. On the other hand, if you can set up a domain/subdomain on that second server, you can provide your remote media URL in Magento's system configuration, just like a CDN server configuration, and that will improve performance.
