Restore Déjà Dup backup (on Windows)

I have a backup of user folders from an Ubuntu 15.04 system.
The backup was done via the System Settings control, which, as I understand it, uses Déjà Dup as the standard backup program.
The backup is now on a flash drive; its files look like this:
duplicity-full.vol01.difftar.gz
...
duplicity-full.vol87.difftar.gz
as well as:
duplicity-full.manifest
duplicity-full.sigtar.gz
duplicity-full-signatures.sigtar.gz
duplicity-inc.33Z.to.914Z.manifest
duplicity-inc.33Z.to.914Z.vol1.difftar.gz
duplicity-new-signatures.33Z.to.914Z.sigtar.gz
If possible, I'd like to restore these backup files on Windows, or on a live Ubuntu system and then copy them over to Windows.

(from personal experience)
Requirements: the partition/drive (c) containing the backup, plus 2 extra drives (a + b).
1) Create a drive (a) that boots Ubuntu, e.g. a live USB drive.
(see: http://www.ubuntu.com/download/desktop/create-a-usb-stick-on-windows)
2) Boot the Ubuntu system, then format the second extra drive (b) with an ext3 file system.
3) Use the duplicity command that ships with Ubuntu to restore the backup (from c) to the ext3 drive (b):
duplicity restore file:///path-to-backup-folder /path-to-restore-folder-on-b
(see: http://duplicity.nongnu.org/duplicity.1.html)
Remarks:
- The files shown in the question indicate not just one clean backup but a history as well; I deleted the incremental files by hand before running the duplicity command.
- I saved the backup files on an NTFS system but was not able to restore onto an NTFS system, which is the reason for the ext3 drive. This was most likely due to one specific file and may not be mandatory.
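The steps above can be sketched as a shell session. The mount points below are hypothetical placeholders for wherever the live session mounts the flash drive (c) and the ext3 drive (b); the sketch only prints the command so it can be reviewed before running.

```shell
# Step 3 as a concrete command, assuming hypothetical mount points for
# the backup flash drive (c) and the freshly formatted ext3 drive (b);
# adjust both paths to whatever the live session actually mounts.
BACKUP_SRC="file:///media/ubuntu/FLASH/backup"
RESTORE_DEST="/media/ubuntu/ext3/restored"
CMD="duplicity restore $BACKUP_SRC $RESTORE_DEST"
echo "$CMD"   # review the command, then run it
```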

Related

How to recover all data if OS changes from windows to raspbian?

I've installed Raspbian OS on my PC, so all my data on the Windows 10 drives may have been deleted. Can I recover it? (It's a very critical situation.)
I assume that you didn't wipe the disk. I also assume that you stopped using your disk immediately to avoid overwriting your files.
TestDisk will not work for you, but scalpel will. I will explain two ways to use it, depending on your resources and tools.
First, I assume that you have another laptop/desktop with a Linux OS and a SATA-to-USB converter. If these apply to you, for faster results, do the following.
Extract the disk from your PC
Connect it via USB to another PC
Install the package scalpel
With your favourite editor (vi/vim/nano, etc) open the file /etc/scalpel/scalpel.conf and uncomment the preferred lines
Run the following command ( scalpel /dev/sdX# -o output ), which will scan your disk and recover any file of the selected types that has not been overwritten; some recovered files may be corrupted.
Secondly, I assume that you have a USB disk with a Linux ISO. This way is cheaper but slower. In that case, do the following.
Boot from the USB to the linux OS
Install the package scalpel
With your favourite editor (vi/vim/nano, etc) open the file /etc/scalpel/scalpel.conf and uncomment the preferred lines
Run the following command ( scalpel /dev/sdX# -o output ), which will scan your disk and recover any file of the selected types that has not been overwritten; some recovered files may be corrupted.
The folder named output will be created in the path from which you ran scalpel. It contains a text file named audit.txt and one folder per selected file type; inside those folders are the recovered files, many of which may be corrupted.
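Both paths boil down to the same invocation. A sketch, assuming the old Windows disk shows up as /dev/sdb with the data partition as number 2 (both hypothetical; check with lsblk first):

```shell
# The scalpel invocation spelled out; device name is a hypothetical
# stand-in, so the sketch only prints the command for review.
TARGET="/dev/sdb2"
OUTDIR="output"
CMD="scalpel $TARGET -o $OUTDIR"
echo "$CMD"   # run with sudo once the device name is confirmed
```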
Wish you luck!

Unable to get Unison to sync current folder onto remote folder

I am trying to automate backup of my project folders through a script. I found Unison, which seemed to match my needs, and have written a script that can reliably mount and unmount a Samba folder (Mac backing up to a Windows box). The only problem I am having is getting my unison command to reliably back up my (currently test) directory. It says:
Looking for changes
Reconciling changes
Nothing to do: replicas have not changed since last sync.
Which doesn't make sense to me.
Here is my command:
unison ~/Documents/test ~/hm_mnt/test/ -fat -auto -force ~/Documents/test -noupdate ~/Documents/test -nodeletion ~/Documents/test
I simply want to maintain a copy of the current state of my test folder. This worked once and has not synced since.
My initial run (with a slightly different command) ran fine, and I have been refining it to the above state while trying to get manual updates to a test file to move over. These have not propagated, and even when I cleaned out the remote folder, no differences were detected.
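For what it's worth, one thing that stands out in the command above: unison takes exactly two roots, and -force, -noupdate and -nodeletion all naming the same root pull in different directions. A minimal sketch of a one-way push, not a verified fix, with the question's ~-relative paths written out as literal stand-ins:

```shell
# Sketch: two roots only, with -force making the local replica win and
# -batch suppressing prompts. Paths are hypothetical literal versions
# of the ones in the question.
LOCAL="/Users/me/Documents/test"
REMOTE="/Users/me/hm_mnt/test"
CMD="unison $LOCAL $REMOTE -fat -auto -batch -force $LOCAL"
echo "$CMD"
```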
Thanks

Get actual mount point of network volume in OS X CLI

In an automated system, I copy files to a mounted network volume with a shell script.
Basically I do cp file.pdf /Volumes/NetworkShare/.
This works well until the remote system goes down.
So before copying I can ping the server to detect whether it's online.
But when it comes back online, OS X often remounts the share on a different path, /Volumes/NetworkShare-1/.
The old path /Volumes/NetworkShare/ still exists, although it's useless.
So, how can I find the actual mount point of this share from the OS X CLI?
I found that diskutil does something like this for local disks, but not for network volumes. Is there an equivalent of diskutil for network volumes?
The mount command (just on its own) will list all mounted filesystems. As for why OS X is creating that extra directory, that is pretty odd. Did you manually mount the filesystem, by any chance? If you created the “NetworkShare” directory yourself, OS X’s auto mounter might do what you’re suggesting.
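Building on that, a script can parse the `mount` listing to find where the share currently lives. A sketch: the sample line mimics OS X smbfs output after a re-mount (server and user names are hypothetical); in a real script, replace the echo with a call to `mount` itself.

```shell
# Pull the live mount point of a share out of `mount`-style output.
sample='//me@fileserver/NetworkShare on /Volumes/NetworkShare-1 (smbfs, nodev, nosuid, mounted by me)'

# Field 1 is the remote spec, field 3 the mount point; match the share
# name at the end of the remote spec.
MOUNT_POINT=$(echo "$sample" | awk '$1 ~ /\/NetworkShare$/ {print $3}')
echo "$MOUNT_POINT"
```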

backup collabnet subversion edge to another hard disk

I've installed Collabnet Subversion Edge and would like to make sure I have it backed up properly. I would prefer NOT to use the CloudBackup service offered.
I went to the administration interface for CollabNet (localhost:3343) and then to Repositories > Backup Schedule. There, one can choose between 3 different 'Types of Job':
Cloud Services Backup
Full Dump Backup
Hotcopy Backup
None of them lets you choose where to copy the backup. I've tried looking up how this works, but the documentation seems to be lacking.
What is the best way to backup such a repository? Shall I just keep a copy of the entire collabnet folder (c:\csvn)?
The Subversion Edge admin UI lets you specify the folder for backups. It defaults to a folder inside the normal data folder, but you can specify a different value. So, for example, if you have a D:\ drive that you want the backups to go on you can just specify that folder in the settings and the backups will go to that folder.
It does need to be a physically accessible hard drive though.
See the Backup Directory configuration item in this screenshot:
https://ctf.open.collab.net/sf/projects/svnedge/screenshots/screens/config/config.png
You can use Windows Server Backup to back up Subversion repositories. It allows you to schedule backups
to a network share, a dedicated backup volume, or writable media. For example, the wbadmin command-line tool allows you to safely back up your repositories. This simple command performs a one-time copy backup of C:\foo\bar to the X:\ volume:
wbadmin start backup -backupTarget:X: -include:C:\foo\bar -vsscopy
(To install Windows Server Backup, run ocsetup WindowsServerBackup in an elevated command prompt.)
You can setup backup in different ways:
wbadmin command-line tool,
PowerShell cmdlets, good for automation and customization of backup actions,
Windows Server Backup wizard (control panel, actually) MMC snap-in.
It's not required to stop the server's service when you run the backup, because the FSFS repository backend is always in a consistent state.
Here are general tips about recovering a Subversion repository from a backup:
Recover the repository backup to an empty directory, to make sure that the restored repository files won't mix with the files of the broken one. After the repository is recovered, you can delete the broken repository and replace it with the recovered one.
Stop and restart your Subversion server after recovering a repository from a backup.
If your clients get errors after the recovery, run svnadmin recover against the repository. The command finishes instantly and makes the repository accessible again.
If you have access to the repository directories then you should be able to use hotcopy directly and specify where the backups go.
It's enough to take a periodic backup of just the csvn/data directory, where all your repositories and configuration files are stored.
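The hotcopy route mentioned above comes down to one command per repository. A sketch, with a hypothetical repository name and target drive (hotcopy yields a consistent copy even while the server is running), printed rather than executed:

```shell
# One repository backed up with svnadmin hotcopy; both paths are
# hypothetical stand-ins for a real csvn layout.
REPO="C:/csvn/data/repositories/myrepo"
DEST="D:/svn-backups/myrepo"
CMD="svnadmin hotcopy $REPO $DEST"
echo "$CMD"
```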
Visit this link for backup (and upgrade) options. The contents of the link are included below. Hope it helps.
Manual Upgrade/Reinstallation Steps
Subversion Edge includes an integrated mechanism for installing updates. This is the preferred way to do an upgrade as it handles whatever steps are needed to perform the upgrade and can be done remotely from your web browser. However, there are scenarios where you might want or need to do an upgrade manually, for example your Subversion Edge server might not be able to access the Internet to pull down the updates or maybe one or more critical installation files have become corrupted and you need to reinstall using the same version. Here are the steps for performing a manual upgrade or reinstallation:
Windows
If your existing Subversion Edge installation was installed using the installer from Subversion Edge 2.0.0 or later, then all you need to do to upgrade is download the latest installer and run it. This will uninstall the current version and install the new version (which is how the Windows Installer (.msi) process works for upgrades).
If you are not sure what version you installed with, you can always safely use this approach:
1. Stop the existing services and uninstall the current version from the Windows Control Panel. This will leave behind your C:\csvn folder and any files in it that have been modified since the original install.
2. Delete everything in the C:\csvn folder EXCEPT the data folder, so you should be left with just the C:\csvn\data folder.
3. Install the new version. The installer will pick up the existing data folder, and when the services start it will basically just be an upgrade to the new version.
WARNING: Take note of this reported bug and backup the svn_access_file first:
artf7081 - Using Windows installer for updates can overwrite the svn_access_file
Linux/Solaris
To upgrade a Linux/Solaris installation, this is the safest way to do it:
1. Stop the servers: $ bin/csvn stop and $ bin/csvn-httpd stop
2. Rename the csvn folder: $ mv csvn csvn-old
3. Untar the new release as a non-root user
4. Move the data folder back into the new release: $ mv csvn-old/data csvn
5. Important! Copy the "dist" configuration files to the data folder: $ cp -f csvn/dist/*.dist csvn/data/conf
6. Start the servers: $ bin/csvn start and $ bin/csvn-httpd start

How can I move MySQL data directory on Mac OS 10.5? (and related questions)

I've managed to mess up my MySQL database (on Mac OS X 10.5) and need help recovering!
I tried to add an index to a fairly large table (190 million records) and in the course of this ran out of disk space. I subsequently realized that the partition with the data directory is too small, so I need to move it.
Initially I thought that I would just copy the data directory to another location, then bung a symlink in place of the original data directory.
BUT it refuses to move!
sudo cp -r /usr/local/mysql/data .
cp: ./data: Permission denied
(I have stopped the mysqld process before attempting this move)
Help!
This isn't a mysql question, but rather an OS question.
I would guess that you either don't have permission to write to the current directory, or there's already a directory there named 'data' that you don't have permission on, etc.
In my experience, MySQL doesn't like running out of disk space at all. Make sure the last records are OK after you bring the engine back up.
Also, don't use the symlink - change the mysql config. In Unix, this would be the 'datadir' setting in /etc/my.cnf.
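A sketch of the whole move under that advice, with a hypothetical destination volume; it only prints the commands, which should be run with mysqld stopped:

```shell
# Hypothetical new location; -Rp (rather than plain -r) preserves
# ownership and permissions. The earlier "Permission denied" typically
# means the destination directory is not writable by the invoking user,
# hence sudo on both steps. The MySQL user may be mysql or _mysql on
# OS X; check the owner of the current data dir with ls -l.
OLD="/usr/local/mysql/data"
NEW="/Volumes/BigDisk/mysql-data"
echo "sudo cp -Rp $OLD $NEW"
echo "sudo chown -R mysql:mysql $NEW"
# then point mysqld at it in /etc/my.cnf:
#   [mysqld]
#   datadir=/Volumes/BigDisk/mysql-data
```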
