Creating continuous backups with duplicity, uploading them later - macOS

I would like to use duplicity as a second and, primarily, a remote backup for my MacBook Air, and I would like to set it up as a regular cron job. I travel a lot, so I cannot count on a fast internet connection to my remote backup space, or on any connection at all.
Does anyone have an idea how to create regular backups with duplicity and upload them only when an internet connection is detected?

Try duplicity together with Dropbox: point duplicity at the local Dropbox directory as its backup target. When you have an internet connection, the Dropbox client will sync your backup.
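A minimal sketch of what that could look like (the paths, the exclude, and the passphrase handling are assumptions for illustration, not part of the answer):

    # Back up the home directory into the local Dropbox folder as an encrypted
    # duplicity archive; the Dropbox client uploads it whenever it is online.
    export PASSPHRASE="my-secret-passphrase"

    # Exclude the Dropbox folder itself so the backup is not backed up into itself.
    duplicity --exclude "$HOME/Dropbox" \
        "$HOME" "file://$HOME/Dropbox/Backups/macbook-air"

    unset PASSPHRASE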

I have a cron-driven duplicity setup whose target is a local iCloud Drive folder; once there is connectivity, the delta is uploaded automagically.
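The scheduling side can be as simple as a crontab entry like the following (the script path, log path, and time are made up for illustration):

    # crontab -e: run the duplicity wrapper every night at 02:30; the target
    # folder is then synced by iCloud/Dropbox whenever the machine is online.
    30 2 * * * /Users/me/bin/backup-to-synced-folder.sh >> /Users/me/Library/Logs/duplicity.log 2>&1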

Related

How to keep the data of two PostgreSQL installations in sync

I work with a desktop computer and a laptop. I develop on both machines, but only one at a time. When I start using my laptop I sync all its data with the data directory of the server: text, applications, etc. When I'm back I sync my laptop back to the server. This works fine.
I recently started to use Postgres (I'm a complete noob with databases) and I expect that things are not that simple in that area. How can I keep the data of two Postgres installations in sync, given the way I work (sync server with laptop, use laptop, sync laptop with server)? Is it as simple as copying the modified data over the existing data, or should another scheme be adopted?
I am not sure whether it matters, but on the desktop I use win7-64 and on the laptop win8-64. I am trying to get acquainted with Linux on virtual machines, so a platform-independent solution is appreciated :-)
Thanks in advance!
Here are some possibilities:
Filesystem-level synchronization
PostgreSQL stores all data in a data directory, which is selected during installation. In my opinion the simplest solution is to stop the PostgreSQL service on both computers and synchronize the data directories: copy the data directory from one computer to the other. The PostgreSQL versions should be the same, and never try to synchronize in both directions!
Create a backup of the source database and restore it on the target server (a sketch of this follows below).
(Advanced) Use proper replication solutions.
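For the backup-and-restore option, a minimal sketch might look like this (the database name "mydb" and the "postgres" superuser are placeholders, not from the answer):

    # On the source machine: dump one database to a compressed custom-format file.
    pg_dump -U postgres -Fc -f mydb.dump mydb

    # Copy mydb.dump to the other machine (USB stick, network share, ...),
    # then recreate the database there from the dump.
    dropdb -U postgres mydb
    createdb -U postgres mydb
    pg_restore -U postgres -d mydb mydb.dump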

Syncing folders from local computer to remote computer?

I have installed the Bitnami WAMP stack on my local computer as well as on the remote computer.
I do all the development locally. What I want is to sync the public htdocs folder of the local system with the remote system.
FTP is slow and has to be done manually. What I want is a single sync command, similar to GitHub push and pull. I don't want to use GitHub either.
Basically, I am looking for a solution that can sync modified files only, instead of uploading complete folders again and again over FTP.
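For illustration only (rsync is not mentioned in the question, and the host and paths here are made up): a single command that transfers only changed files could look like the following; on Windows it would need an rsync port such as the one shipped with Cygwin or WSL.

    # Push only new/modified files from the local htdocs to the remote server,
    # deleting remote files that no longer exist locally.
    rsync -avz --delete \
        /path/to/local/htdocs/ \
        user@remote-server:/path/to/remote/htdocs/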

Windows Server Backup: New Disk & Keeping Backup History

I have been doing backups for a couple of years on Windows Server 2008 R2 using Windows Server Backup. The 'drive' is attached via iSCSI and has been working fine. Now I have a new SAN device and I want to back up to it over iSCSI instead of the old location. Everything works as far as mounting the drives goes, but is there any way I can copy the backup history over to the new drive? I need to completely remove the old drive from the system and use only the new drive. I can do that, but I don't know how to do it without losing years of incremental backup history.
If I add both drives to the backup schedule, will this copy the history from the first drive over to the second? Thanks!

P4V getting files as writable

Perforce is downloading files to the external hard drive connected to my MacBook Pro as writable ("777"). It's as if the "allwrite" option were set in my workspace, but it's not.
I thought Perforce was supposed to mark files read-only until I check them out. Is there a setting somewhere I missed?
Rev. P4V/MACOSX104U/2009.2/236331
MacBookPro OSX 10.5.8
Is your external hard drive formatted as HFS+? If it's FAT32, it will be 777 anyway.
Have you checked whether Windows thinks the files are read-only after syncing with the Mac client?
Perforce does not like it when you access the same disk location from two different workspaces, nor the same workspace from two different hosts. This is because the server tracks the state of the files on the client; you're begging for your local store to lose synchronization with the depot.
What are you really trying to accomplish here?
I would recommend that you forget about FAT32; put your Windows workspace on an NTFS volume and your Mac workspace on an HFS+ volume. Submit & sync to share the data. Storage is cheap.
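If it helps, a quick sanity check along the lines suggested above could look like this (the volume name is made up; p4 and diskutil are the standard command-line tools):

    # Check whether the client spec really has "allwrite" in its Options: line.
    p4 client -o | grep Options

    # Check how the external drive is formatted; a FAT32 volume reports all
    # files as world-writable regardless of Perforce's read-only handling.
    diskutil info /Volumes/External | grep "File System"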

The ideal background filesystem backup

I am thinking about a script/program that can run in the background and attempt to back up or synchronize a given filesystem path to a mirror location (probably on an external/separate storage device).
This should apply to Windows, but it could just as well be used under Linux.
Differential/incremental backups are a bonus.
Windows System State backups are a bonus too.
Keeping the origin free of metadata (unlike version control) is essential.
Searching by file or activity date would be interesting (as in version control).
Backup repositories should be easy to browse and take little space.
Deleted files should be available for recovery for a period of time.
Windows Backup is tedious, bloated, and limited.
Tar-gzipping is not accessible.
User interaction during backup should be nonexistent.
Amanda is the ultimate full-featured open-source backup solution, and there's a (relatively) new Zmanda Windows Client.
Duplicity is free and creates encrypted, incremental, compressed offsite backups. It's a Linux app, but you could run it in Cygwin or a small virtual machine.
I've written a Perl script that runs it via a cron job to back up several very big directories over DSL, and it works great.
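The kind of invocation such a nightly wrapper might make could look like this (the sftp URL, GPG key id, and directories are placeholders, not from the answer):

    # Incremental by default after the first full run; encrypted and compressed.
    # Force a fresh full backup once a month.
    duplicity --encrypt-key ABCD1234 --full-if-older-than 1M \
        /home/me/projects \
        sftp://backupuser@backup.example.com//srv/backups/projects

    # Prune old backup chains so the offsite space stays bounded.
    duplicity remove-older-than 6M --force \
        sftp://backupuser@backup.example.com//srv/backups/projects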
Check out AJCBackup. Does an excellent job at a good price.
Acronis True Image is great. It's not free, but the Home edition is pretty cheap for what it does and it works reliably. It does image- and file-based backups, scheduling, instant backup of chosen folders from the Explorer context menu, and incremental/differential backups; it can also mount backup files as Windows volumes so you can browse them, copy files out, etc. It has saved my ass a few times already.
