What is the best way to back up an Xcode project?

How is it recommended I back up my Xcode projects? Is there a way to back up the whole thing to the web so it can be pulled back down in case the files are somehow deleted from my computer?

Not sure which is the "best"; it surely depends on your budget, development environment, resources, time, etc. Here are some ideas, listed in order of preference, starting with the free ones.
"Git" in effect creates a backup, even if strictly speaking it's a version control tool; push the repository to a remote host and you have an off-machine copy (see the first sketch after this list).
You can save a copy to your iCloud Drive, Google Drive, SkyDrive, etc.; of course you need to duplicate it there manually on a regular basis.
You can certainly get commercial tools to do this; do a Google/Wikipedia search and read some reviews.
"Time Machine" is perhaps overkill, although you may be able to tweak it to focus on your projects.
"rdist" is a UNIX utility that you could set up in a "crontab" to copy changed files on a schedule, although that is a riskier strategy (see the cron sketch below).
Or just use Finder to duplicate the entire folder manually; you only need to remember to do it.
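For the Git option, here is a minimal sketch of turning a project into an off-machine backup; the remote URL is a placeholder for whatever host you use (GitHub, Bitbucket, a private server):

    cd ~/Projects/MyApp                 # your Xcode project folder (example path)
    git init                            # start tracking the project
    git add .                           # consider a .gitignore for build output
    git commit -m "Initial backup"
    git remote add origin git@github.com:yourname/MyApp.git   # hypothetical remote
    git push -u origin master           # the project now lives off your machine too

From then on, a quick commit and git push at the end of a work session keeps the online copy current.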
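For the scheduled-copy idea, a rough sketch using rsync in place of rdist; the destination path is only an example (an external disk or a network mount):

    # Run crontab -e and add a line like this to copy your projects
    # to another disk at 02:00 every night.
    0 2 * * * rsync -a "$HOME/Projects/" "/Volumes/Backup/Projects/"

Like any one-way copy, this is only as safe as the last successful run, hence the "riskier" caveat above.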

Related

Why is project and file saving/management so awkward in programming as compared to other digital media?

TL;DR: What is the reason for the complex file-management systems in place, such as GitHub repositories, when working in Visual Studio?
This has been bothering me for a while. I've finished a diploma course in Digital Media and have started another course in programming. One thing that stuck out immediately after coming from 3D art is how incredibly awkward and obtuse basic file management is when working with Visual Studio. Presumably the same issues arise with other development environments; if they didn't, I can't imagine why anybody would ever use VS.
For example, let's say I want to work on a project in 3ds Max. It's stored on a shared network drive, so I don't want to risk two people accessing it at the same time and saving over each other's work. I simply grab the folder or file that I want to use, copy and paste it with a new name, and then I'm good to go.
Saving things with a new name is easy, just save as, rename it. I can work from network drives, local drives, portable drives. The file can come from anywhere and be saved anywhere. Everything is fast, painless, and clear.
If I were to try to do the same thing in VS, for starters, it wouldn't let me build the program while it was saved to the network drive, so I'd have to copy it over to a local drive. Presumably this is to prevent the "multiple people accessing, saving over each other" issues that are easily avoided by just renaming the thing.
If I wanted to do iteration saves, that is, to frequently save the project under a version-numbered name to allow easy rollbacks and troubleshooting, I'm not even sure how I'd do it. Renaming projects/solutions has proven so hard to do that I've had to delete projects and recreate them with a new name, rather than try to figure out how to do it properly.
There are all sorts of complex file management systems that VS seems to require for any large project work, all of which would be completely unnecessary if you could just copy, paste, rename and save-as with any real ease.
I'm obviously rather new to this, and I'm certain that there is an important reason why it's so awkward to manage files, I just don't know what that reason is. I feel like I'd have a far better understanding of how all these file management systems actually work if I knew why they existed in the first place. At the moment, just trying to be able to work from a network drive is taking up hours when it would be a non-issue in every other digital media field I've worked with.

Possible to selective sync dropbox or other cloud storage from multi-platform command line?

Going to be working with a medium-sized remote group on a large (but independent) project that will be generating many GB to TB of data.
To keep users from having to store 500GB of data on their personal machines, and to keep everyone in sync, we need a command-line/Python utility to control selective syncing of dependencies on multiple operating systems, or at least OS X and Linux.
So, for example, someone who needs to work on the folder:
startrek/startrekiii
May require the folders:
startrek/nimoy/common
startrek/nimoy/[user]
startrek/shatner/common
startrek/shatner/[user]
but not:
startrek/startrekii, startrek/nimoy/[some_other_user], etc
From their command line (or a UI) they would run:
sync startrekiii
And they'd also receive startrek/nimoy/common, etc
Likewise, we'll have an unsync command that, as long as those dependent folders are not in use by another sync, unsyncs them and removes them from the user's HD (a sketch of the general idea appears after this question).
Of cloud sync/storage solutions, dropbox seems to offer the most granular control over this, allowing you to sync specific folders and subfolders - however from everything I can find this granular control is strictly limited to their UI.
We're completely open to alternative solutions if you have them, we just need something as easily deployable as possible and don't have the budget for Aspera or something to that effect.
Two other important notes:
Because of one very central part of our pipeline which pulls files from those dependent folders (over which we have limited API control), the paths need to be consistent on their respective platform. So ~/Dropbox/startrek/nimoy can never be ~/Dropbox/startrek/startrekiii/nimoy.
Many of the people using this will be artists and otherwise non-technical people, whose experience with csh or bash extends only to simple things like changing directories and moving files around.
Has anyone found a way to hack into Dropbox's selective sync, and/or know of a better alternative?
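No ready-made answer here, but to make the requirement concrete, here is a minimal sketch of what such a wrapper could look like if the data lived on a plain file server reachable over SSH instead of Dropbox. The server name, root path and dependency list are assumptions for illustration, not features of any existing tool.

    #!/bin/sh
    # Hypothetical "sync" wrapper: pull one target folder plus its common and
    # per-user dependencies with rsync, keeping the same layout locally.
    set -e
    TARGET="$1"                                  # e.g. startrekiii
    REMOTE="fileserver:/projects/startrek"       # placeholder server and root path
    LOCAL="$HOME/startrek"                       # consistent path on every platform
    mkdir -p "$LOCAL"
    for dir in "$TARGET" "nimoy/common" "nimoy/$USER" "shatner/common" "shatner/$USER"; do
        rsync -a --relative "$REMOTE/./$dir" "$LOCAL/"
    done

An unsync command would then just remove the local copy with rm -rf, after checking that no other synced target still depends on the folder.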

Is it possible to do version control on my computer system files themselves?

Here's my problem. I have OSX Lion and I do Web development, BUT I have no real comprehension of what I'm doing when I'm using brew, pear, and the terminal in general. I am working on leveling up, but I still have to work in the meantime. That's why I very often mess up my system files (just tried to install PHPUnit, didn't work, so I deleted other pear directories, still didn't work, and now I end up with a mess).
It would feel better and relieve a lot of stress to know I can revert back my changes when I mess up. So my question is, can I set up a version control like git on all my computer files themselves, so that before any big change, I can save the state of my computer? Or is there any other way to get that same result?
I think creating different users on my Mac is not enough, because I want to build up my system and add things to it, so that doesn't really help. And I'm not sure, but isn't Time Machine made just to get some files back, rather than to revert my system to a previous state, or can it do that?
Help would be greatly greatly appreciated, thanks!
Seems to me you need to use a VM.
Take snapshots and work without worries; if you mess up, you just revert to your last known-good snapshot.
You can do this - you can version control anything... but I wouldn't recommend it (at least not with Git/SVN/etc.; perhaps there's some software designed for this purpose that I'm unaware of).
You'd be tracking version changes for tons of files: temporary files, settings files, binaries, etc. Files would be changing all the time and you'd need to stay on top of commits and so forth. Instead, I'd recommend just copying folders (as a backup), making changes, verifying your changes work, and then deleting the backups (see the sketch below).
It's very easy to overuse version control.
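A minimal sketch of that copy, verify, delete routine, using a PEAR directory as the example; the exact path is an assumption and will differ between setups.

    cp -R /usr/local/pear /usr/local/pear.bak    # snapshot before touching anything
    # ...install PHPUnit, edit configs, confirm everything still works...
    rm -rf /usr/local/pear.bak                   # happy? then drop the backup
    # if it went wrong instead, restore the snapshot:
    #   rm -rf /usr/local/pear && mv /usr/local/pear.bak /usr/local/pear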
Having an external drive with Time Machine, and letting it back up often, will allow you to revert certain parts (or all) of the file system to a certain date.
Since you're under OS X, I'd suggest Time Machine; it is better suited to what you want than a version control system. TM is pretty decent at backing up, but there are other solutions if this one doesn't fit your needs.
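Time Machine also has a command-line front end, tmutil (introduced with Lion, if I remember correctly). A quick sketch; the backup path below is an example only, so check man tmutil before relying on it:

    tmutil startbackup               # kick off a backup right now
    tmutil listbackups               # list the snapshots you can restore from
    sudo tmutil restore \
      "/Volumes/TM/Backups.backupdb/MyMac/2012-05-01-120000/Macintosh HD/usr/local/pear" \
      /usr/local/pear                # pull a single directory back from a snapshot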
EDIT: as commented by @dstarh, brew isolates everything it installs and uses symbolic links when needed, so use it whenever you can; it leaves your system clean. There are instructions on how to uninstall a piece of software, and in the worst case you can look at the source of the software's formula and find out what to delete.
Long story short: yes, you could, but there are far easier and less painful ways to do this.

Is it possible to explore an SVN repo as an ordinary folder in Windows (for example, mount it as a remote drive)?

So, I need to set up file storage for our team. I also have an SVN server. The ability to roll back changes and to control who created or deleted a file is very necessary and important for our project.
Any ideas? Maybe without SVN. I can connect using WebDAV, but only in read-only mode (because there is no LOCK support in it).
You can set up the SVN server to allow exactly that.
Read the chapter in the SVN book about WebDAV and Autoversioning
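For reference, a rough sketch of what that setup looks like on the server side, assuming Apache with mod_dav_svn; the config file path, location and repository path are placeholders. With SVNAutoversioning on, a plain WebDAV save becomes an automatic commit, so the share can then be mapped as a normal drive in Windows.

    # Sketch only: in the Apache config that loads mod_dav_svn, publish the
    # repository with autoversioning switched on (placeholder paths):
    #
    #   <Location /repos>
    #     DAV svn
    #     SVNPath /var/svn/repos
    #     SVNAutoversioning on
    #   </Location>
    #
    sudo apachectl graceful          # reload Apache so the new location takes effect

On the Windows side, the repository can then be mapped through the usual "Map network drive" dialog pointed at http://yourserver/repos, and every save shows up as a new revision.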
So, what you want is the ability to roll back changes and to limit who can make them, but without the bother of checking files in and out?
Maybe Subversion isn't for you. I've done similar sharing with Dropbox, and there's now Box.net, which is supposed to be like Dropbox on steroids. Dropbox (and I assume Box.net too) has some features that are very nice:
You can set up folder sharing between particular teams. That way, you can say who can and cannot access these files.
Dropbox automatically saves each and every version of a file, so you can always go back to previous versions -- even if that file has been deleted.
Files are stored locally. All a user has to know is to save a particular file in a particular folder, and everyone has access to it. I've successfully used Dropbox to collaborate with managers that make the Pointed Hair boss in Dilbert look like a high tech genius.
There's also SkyDrive and Google Drive, but I don't find them as universal as Dropbox or as easy to use. It's possible to use Dropbox without ever going to the Dropbox website. To the non-geek, it appears to be magic as files I've written and edited appear on their drive. It took me a few weeks to train one person that he didn't have to email me his document when he made changes, because I already had it.
Dropbox gives you 2 GB of space for free, which doesn't sound like a lot. However, my first hard drive was a whopping 20 MB, which was twice the size of the standard 10 MB drive at that time. If you're not storing a lot of multimedia presentations or doing a lot of Photoshop, 2 GB might be more than enough for your project.
I know Windows 7 and later has some sort of versioning system built into it. I know this because anytime someone mentions that Mac OS X has Time Machine, some Wingeek pipes in stating that Windows has the same thing, only better! Unfortunately, Windows is not my forte, so I don't know too much about this specific feature. I believe the default is once per day, but it can be changed. This might be the perfect solution if everyone is on Windows.
Subversion can do autoversioning as Stefan stated. Considering his position in the Subversion community (especially his work on TortoiseSVN), he knows his stuff. Unfortunately I don't know too much about it since I've never used or seen this feature implemented. It's probably due to the fact that I work mainly with developers who know what a version control system is, and therefore have no need for something that does the versioning for them.
Also, don't forget to check whether you can use your corporate SharePoint, which does something very much like what you want. I am not too impressed with SharePoint, but if the facility is there and your company can give you the support, it is something you probably want to look into.

Can anyone recommend a good backup "system" for a developer? [closed]

I'm known around the office as "the backup guy". As a developer, I often jump back and forth between projects, and as a result I don't always remember exactly what changes were present in each when I return to them. I usually have to compare my local changes versus those in our source control system, and then I'll eventually remember it all. Thing is, I don't always have the luxury of doing this. Sometimes I have to build something for a client quickly, and so I make a backup of the working directory, and that way I can get the latest files from source control, and build the DLL quickly - all while knowing that the other (in-progress) changes are safe.
The problem is that I've now accumulated a bunch of backup folders in each project directory, which makes it harder to find the specific change I was looking for. While my practices have evolved to the point that I always take the time to give each backup folder an informative name, I'm starting to think I'd be better off writing my own tool.
For example: if I select a few folders in Windows Explorer, I'd like to have my own context menu item that triggers my own backup application. This application would prompt me for a backup name and description. It would then move the selected folders to a specific, centralized backup directory, where it would also generate a 'readme.txt' file outlining the backup details. The backups would also be organized by date/time (a rough sketch of this idea appears after the question). I feel this would refine my backup procedure and facilitate future lookups.
But yet, I can't help but wonder if such tools already exist. Surely, someone must be as obsessive as me when it comes to backups.
Do you know of any tools that could help me improve my backups?
I'm aware of this post, but it isn't exactly aligned with what I want. I'd prefer to keep the backups on the same machine; I'll handle moving them over to other machines myself.
Update
To clarify: if I'm working on Task A and suddenly I need to build something for a client (Task B), I have to back up what I have so far for Task A and get the latest from source control into the working directory. I then start and finish Task B, and then restore Task A. This is the ideal, neat scenario. But sometimes I only get back to Task A a week down the line, or later, because I get hit with Task C, Task D, etc., all of which affect the same project. Now, if these changes are scheduled to be checked in, then I would probably benefit from checking them in as I progress (but to be honest, at this company we usually wait until the work is complete before we check it in; that means fewer check-ins of unfinished code). So I'm not sure whether each of my backups should equal a branch, because I'm sometimes excessive with my backups.
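Not an existing tool, but to make the idea above concrete, here is a small sketch of the kind of helper described in the question. All names and paths are made up, it is written as a shell script for brevity (a Windows version would be a batch or PowerShell port), and a real version would add error handling and the Explorer context-menu hookup.

    #!/bin/sh
    # Hypothetical backup helper: move the folders given on the command line
    # into a dated, named backup area and write a readme.txt describing them.
    set -e
    BACKUP_ROOT="$HOME/dev-backups"              # central backup directory (example)
    printf 'Backup name: ';  read -r NAME
    printf 'Description: '; read -r DESC
    DEST="$BACKUP_ROOT/$(date +%Y-%m-%d_%H%M)_$NAME"
    mkdir -p "$DEST"
    {
      echo "Name: $NAME"
      echo "Date: $(date)"
      echo "Description: $DESC"
    } > "$DEST/readme.txt"
    mv "$@" "$DEST"/                             # the selected folders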
I think what you want is a distributed version control system, such as git.
First, your existing source control system can probably already support this, in the form of branches. Instead of just copying the working directory, commit it as a separate branch, where you can keep that client's version of the application.
However, as skiphoppy said, a distributed source control system would be much better suited for this. I quite like Bazaar, but git is very popular too (although I don't know how good its Windows support is, since it is primarily a *nix tool developed for the Linux kernel)
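In git terms, the branch idea above looks roughly like this; the branch names are made up for illustration:

    git checkout -b taskA-wip         # park the in-progress work on its own branch
    git commit -am "WIP: Task A"      # -a commits the modified tracked files
    git checkout master               # back to the latest official code
    # ...build and ship the client DLL from here...
    git checkout taskA-wip            # later, pick Task A up exactly where it was

The backup folders become branches, and the informative folder names become branch names and commit messages.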
Subversion using TortoiseSVN will provide you with this functionality. The concepts are different (revisions, not "backup names"). The readme.txt that you mention is summarized in the Subversion log: any comment that you provide can be used to guide others looking at the revision. Check out the Wikipedia page on Subversion as well as the homepage to download it and TortoiseSVN.
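In day-to-day use that boils down to a few commands (TortoiseSVN wraps the same operations in its Explorer menus); the message and revision number below are just examples:

    svn commit -m "Task A in progress: reworking the export module"   # the 'readme'
    svn log -l 5                      # the log is the searchable list of snapshots
    svn update -r 1234                # roll the working copy back to an earlier revision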
Clonezilla backs up your entire hard drive partition; it's free and reliable. I use it in place of Acronis Echo Server, and it restores my entire system in 8 minutes.
As skiphoppy says, a distributed version control system can really help. Git offers the ability to shelve the stuff you're working on now, so that your working copy is clean yet you can pull your current working set off the shelf when you're done (see the stash sketch below). That seems like what you really want.
If you're using Perforce, there are a couple of tar-based utilities that do this too, but I haven't yet used them.
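The "shelf" mentioned above is git's stash command; a quick sketch (older git versions spell the first line git stash save "message"):

    git stash push -m "Task A, half-finished changes"   # clears the working copy
    # ...update to the latest code, build the client DLL, check that in...
    git stash list                    # see what is parked on the shelf
    git stash pop                     # put Task A's changes back and carry on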
How about changing the way you work? It sounds like one day things will go tits up if you carry on as you are. Fair enough on the need to build a DLL midway through a change and having to back up your work in progress, but once the release is done, re-integrate your changes with the release version immediately. I'd never allow myself to have multiple backups of the same app, but hey, that's just me.
I use Hybrid Backup (www.hybridbackup.com.au). They're based in Australia and were the only real people I could speak to who could handle exactly what I wanted. I don't have DLL problems; I have over 1000 files that change every day, every time anyone in my office does anything, and well over 250 GB of live data that needs to be backed up every night with every single change I have ever made. Basically, I can be fairly lazy and copy files and directories all over the place to make sure everything is backed up again, knowing that every single thing I change each day (including my directory backups) is backed up, and that I can remember a file I know I had and see my backups exactly as they were 5 months ago. The big thing is that it syncs to two different places, Brisbane and Sydney, so I know everything is safe. They even sent me an external backup vault/server to store everything on. It costs a bit, but business is data where I'm from, and I'm sure that's true for most other people too.
Anyway, I'm just trying to point out that you should have an awesome backup system so you don't have to worry about those things in the first place.
I think it's a pretty reasonable practice to check in every night. Sometimes I check in 3 or 4 times a day, sometimes 20 (every time my code is working, actually).
If your code is always checked in, you should easily be able to just sync to a different branch without backing anything up.
If you can't check in your changes by the end of the day, a very reasonable answer is to discard them. You are most likely in some hole that you will have trouble digging yourself out of, and the next day you will replicate the work in an hour and do it MUCH BETTER than you did the first time. Also, if you go that long with broken code, how do you test?
Finally, if your code REALLY can't be checked into the build every day (it actually does happen in some situations, regardless of what I said in the previous paragraph), branch.
No more backups.
I use:
ZenOK Online Backup for my documents and small files (photos, videos and large files)
Love it.
