Why is project and file saving/management so awkward in programming as compared to other digital media? - visual-studio

TL;DR: What is the reason for the complex file management systems in place, such as GitHub repositories, when working in Visual Studio?
This has been bothering me for a while. I've finished a diploma course in Digital Media and have started another course in programming. One thing that stuck out immediately after coming from 3D art is how incredibly awkward and obtuse basic file management is when working with Visual Studio. Presumably the same issues arise with other development environments; if they didn't, I can't imagine why anybody would ever use VS.
For example, let's say I want to work on a project in 3ds Max. It's stored on a shared network drive, so I don't want to risk two people accessing it at the same time and saving over each other's work. I simply grab the folder or file that I want to use, copy and paste it with a new name, and then I'm good to go.
Saving things with a new name is easy: just Save As and rename it. I can work from network drives, local drives, portable drives. The file can come from anywhere and be saved anywhere. Everything is fast, painless, and clear.
If I were to try and do the same thing in VS, for starters, it wouldn't let me build the program while it was saved to the network, so I'd have to copy it over to a local drive. Presumably this is to prevent the "multiple people accessing, saving over each other" issues that are easily avoided by just renaming the thing.
If I wanted to iteration-save, that is, to frequently save the project under a version-numbered name to allow easy rollbacks and troubleshooting, I'm not even sure how I'd do it. Renaming projects/solutions has proven so hard to do that I've had to delete projects and make them again with a new name, rather than try and figure out how to do it properly.
There are all sorts of complex file management systems that VS seems to require for any large project work, all of which would be completely unnecessary if you could just copy, paste, rename and save-as with any real ease.
I'm obviously rather new to this, and I'm certain that there is an important reason why it's so awkward to manage files; I just don't know what that reason is. I feel like I'd have a far better understanding of how all these file management systems actually work if I knew why they existed in the first place. At the moment, just trying to be able to work from a network drive is taking up hours, when it would be a non-issue in every other digital media field I've worked with.

Related

Unity Scene Files not transferring properly through TFS between Windows and OSX

This is a strange circumstance that my boss and I just got into this morning as we were trying to import my scene from the Team Foundation Server onto his machine. I created a Unity Scene file and built my scene over the course of the past month or so, and when I was finished I uploaded everything to TFS so he could pull it down and use it (we are quite far apart from each other, so we can't just USB-drive everything over to see what the problem might be). When he pulled the scene file down (and all the supporting scripts), one of the scripts in the scene had changed. It started out as a script called Smart_HUD4, but when he went into the Inspector the script was now called Smart_HUD2 and was an entirely different script from what I had written; I don't even have a Smart_HUD2 script on my machine. The same also applied to another script called laser (now called Laser1 and, again, not something I wrote, nor is it on my machine).
Has anyone else come up against a problem like this? Found any solutions? It's strange because I went ahead and re-downloaded the files I uploaded, and everything was exactly how it was supposed to be: proper names and scripts in their proper places. We think the issue might be that we are moving from Windows to OSX, and the differences between the operating systems might be leaving behind some residual code or something that is causing things to be switched around.
TFS never asked me to merge any files, so if it is the case that TFS is auto-merging files, none of the files it's merging even share the same name; it's just picking ones whose names are close and merging those (the scene file's name was completely unique, so it has nothing to merge that with).
OK, a few things to check:
Have you checked that files with similar names don't already exist in TFS, regardless of whether TFS requested any name changes?
Do you have any other projects affected?
Have you checked (I'm guessing you have) that the Unity versions match exactly?
Now, something to consider: Unity does vary between Windows and macOS, mainly in that it makes use of Windows-based features that are not present on a Mac, and this can cause file issues.
When moving between systems, Unity packages the scene data on Windows in a global and local format, so sometimes the folder structure can change.
From experience, the most likely issue has more to do with file locations than OS discrepancies; it might have been a clash between your project and another similar one.
Remote file management can be odd at times when collaborating. Have you checked with your other colleagues and made sure they don't have anything similar uploaded?
But scripts changing names and data is not something that I would expect, so my guess is that the issue lies with the upload to TFS and not the macOS vs. Windows move.
Hope this helps,
Glenn

What is the best way to back up an Xcode project

How is it recommended I back up my Xcode projects? Is there a way to back up the whole thing to the web so it can be pulled down in case the files are somehow deleted from my computer?
Not sure which is the "best"; it must surely depend on your budget, development environment, resources, time, etc. Here are some ideas, listed in order of preference, starting with the free ones.
"Git" in effect creates a backup, even if, strictly speaking, it's a version control tool.
You can create and save a copy on your iCloud Drive, Google Drive, SkyDrive, etc.; of course, you need to manually duplicate it on a regular basis.
You can for sure get commercial tools to do this; do a Google/Wikipedia search and read some reviews.
"Time Machine" is perhaps overkill, although maybe you can tweak it to focus on your projects.
"rdist" is a UNIX utility that you could set up in a "crontab" to do regular copies of files that have changed, although that's a risky strategy.
Just use Finder to manually duplicate the entire folder; you just need to remember to do it.

Is it possible to explore an SVN repo as an ordinary folder in Windows (for example, mount it as a remote drive)?

So, I need to set up file storage for our team. I also have an SVN server. The ability to do rollbacks and to track who created or deleted a file is very necessary and important for our project.
Any ideas? Maybe without SVN. I can connect using WebDAV, but only in read-only mode (because there is no LOCK support in it).
You can set up the SVN server to allow exactly that.
Read the chapter in the SVN book about WebDAV and Autoversioning
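For reference, autoversioning is switched on in the Apache httpd / mod_dav_svn configuration. A minimal sketch, with a made-up location and repository path:

    <Location /repos>
      DAV svn
      # hypothetical repository path
      SVNPath /var/svn/team-files
      # turn every WebDAV write (PUT, MKCOL, DELETE, ...) into an automatic commit
      SVNAutoversioning on
    </Location>

With something like that in place, the repository URL can be mapped as an ordinary WebDAV network drive in Windows, and each save lands as a new revision that can be rolled back like any other commit.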
So, what you want is the ability to roll back changes and limit who can make the changes, but without the bother of checking files in and out?
Maybe Subversion isn't for you. I've done similar sharing with Dropbox, and there's now Box.net, which is supposed to be like Dropbox on steroids. Dropbox (and I assume Box.net too) has some features that are very nice:
You can set up folder sharing between particular teams. That way, you can say who can and cannot access these files.
Dropbox automatically saves each and every version of a file, so you can always go back to previous versions -- even if that file has been deleted.
Files are stored locally. All a user has to know is to save a particular file in a particular folder, and everyone has access to it. I've successfully used Dropbox to collaborate with managers who make the Pointy-Haired Boss in Dilbert look like a high-tech genius.
There's also SkyDrive and Google Drive, but I don't find them as universal as Dropbox or as easy to use. It's possible to use Dropbox without ever going to the Dropbox website. To the non-geek, it appears to be magic as files I've written and edited appear on their drive. It took me a few weeks to train one person that he didn't have to email me his document when he made changes because I already had it.
Dropbox gives you 2 GB of space for free, which doesn't sound like a lot. However, my first hard drive was a whopping 20 MB, which was twice the size of the standard 10 MB drive at that time. If you're not storing a lot of multimedia presentations or doing a lot of Photoshop, 2 GB might be more than enough for your project.
I know Windows 7 and later has some sort of versioning system built into it. I know this because any time someone mentions that Mac OS X has Time Machine, some Wingeek pipes in stating that Windows has the same thing, only better! Unfortunately, Windows is not my forte, so I don't know too much about this specific feature. I believe the default is once per day, but it can be changed. This might be the perfect solution if everyone is on Windows.
Subversion can do autoversioning as Stefan stated. Considering his position in the Subversion community (especially his work on TortoiseSVN), he knows his stuff. Unfortunately I don't know too much about it since I've never used or seen this feature implemented. It's probably due to the fact that I work mainly with developers who know what a version control system is, and therefore have no need for something that does the versioning for them.
Also, don't forget to check whether you can use your corporate SharePoint, which does something very much like what you want. I am not too impressed with SharePoint, but if the facility is there and your company can give you the support, it is something you probably want to look into.

Graceful File Reading without Locking

Whiteboard Overview
(Two whiteboard diagrams, "Internal" and "Global", were attached here as 1000 x 750 px, ~130 kB JPEGs hosted on ImageShack.)
Additional Information
I should mention that each user (of the client boxes) will be working straight off the /Foo share. Due to the nature of the business, users will never need to see or work on each other's documents concurrently, so conflicts of this nature will never be a problem. Access needs to be as simple as possible for them, which probably means mapping a drive to their respective /Foo/username sub-directory.
Additionally, no one but my applications (in-house and the ones on the server) will be using the FTP directory directly.
Possible Implementations
Unfortunately, it doesn't look like I can use off-the-shelf tools such as WinSCP, because some other logic needs to be intimately tied into the process.
I figure there are two simple ways for me to accomplish the above on the in-house side.
Method one (slow):
Walk the /Foo directory tree every N minutes.
Diff with previous tree using a combination of timestamps (can be faked by file copying tools, but not relevant in this case) and check-summation.
Merge changes with off-site FTP server.
Method two:
Register for directory change notifications (e.g., using ReadDirectoryChangesW from the WinAPI, or FileSystemWatcher if using .NET).
Log changes.
Merge changes with off-site FTP server every N minutes.
I'll probably end up using something like the second method due to performance considerations; a rough sketch of the watcher side follows.
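Here is a self-contained C# sketch of that watcher side, with a made-up D:\Foo path, a made-up five-minute flush interval, and the actual FTP upload left as a placeholder:

    using System;
    using System.Collections.Concurrent;
    using System.IO;
    using System.Threading;

    class ChangeLogger
    {
        // Paths of files that changed since the last flush (newest change time wins).
        static readonly ConcurrentDictionary<string, DateTime> Pending =
            new ConcurrentDictionary<string, DateTime>();

        static void Main()
        {
            using (var watcher = new FileSystemWatcher(@"D:\Foo"))
            {
                watcher.IncludeSubdirectories = true;
                watcher.NotifyFilter = NotifyFilters.FileName | NotifyFilters.LastWrite | NotifyFilters.Size;
                watcher.Created += (s, e) => Pending[e.FullPath] = DateTime.UtcNow;
                watcher.Changed += (s, e) => Pending[e.FullPath] = DateTime.UtcNow;
                watcher.Renamed += (s, e) => Pending[e.FullPath] = DateTime.UtcNow;
                watcher.EnableRaisingEvents = true;

                // Every N minutes, hand the accumulated change list to the uploader.
                using (new Timer(_ => FlushPending(), null,
                                 TimeSpan.FromMinutes(5), TimeSpan.FromMinutes(5)))
                {
                    Console.ReadLine();   // keep the process alive for the sketch
                }
            }
        }

        static void FlushPending()
        {
            foreach (var path in Pending.Keys)
            {
                DateTime ignored;
                if (Pending.TryRemove(path, out ignored))
                {
                    // Placeholder for the "merge changes with the off-site FTP server" step.
                    Console.WriteLine("Would upload: " + path);
                }
            }
        }
    }

In practice you would also want to debounce rapid successive change events for the same file, since many editors write a file several times during a single save.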
Problem
Since this synchronization must take place during business hours, the first problem that arises is during the off-site upload stage.
While I'm transferring a file off-site, I effectively need to prevent the users from writing to the file (e.g., use CreateFile with FILE_SHARE_READ or something) while I'm reading from it. The internet upstream speeds at their office are nowhere near symmetrical to the file sizes they'll be working with, so it's quite possible that they'll come back to the file and attempt to modify it while I'm still reading from it.
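For illustration, the managed equivalent of that CreateFile/FILE_SHARE_READ idea is a FileStream opened with FileShare.Read: while the handle below is open, other processes can still read the file, but any attempt to open it for writing fails with a sharing violation. The method name and the way it is called are made up:

    using System.IO;

    static class Uploader
    {
        static void UploadWhileBlockingWriters(string path)
        {
            // FileShare.Read ~ FILE_SHARE_READ: other readers are allowed, writers
            // are denied for as long as this stream stays open.
            using (var source = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read))
            {
                // ...stream 'source' to the off-site FTP server here (placeholder)...
            }
        }
    }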
Possible Solution
The easiest solution to the above problem would be to create a copy of the file(s) in question elsewhere on the file-system and transfer those "snapshots" without disturbance.
The files (some will be binary) that these guys will be working with are relatively small, probably ≤20 MB, so copying (and therefore temporarily locking) them will be almost instant. The chances of them attempting to write to the file in the same instant that I'm copying it should be close to nil.
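A minimal sketch of that snapshot idea, assuming a hypothetical X:\staging directory; the source file is only locked for the duration of the File.Copy call:

    using System;
    using System.IO;

    static class Snapshotter
    {
        static string TakeSnapshot(string sharePath)
        {
            string staging = @"X:\staging";
            Directory.CreateDirectory(staging);

            // Unique name so repeated snapshots of the same file never collide.
            string snapshot = Path.Combine(staging, Guid.NewGuid() + "_" + Path.GetFileName(sharePath));

            File.Copy(sharePath, snapshot);   // brief lock on the source, released right after
            return snapshot;                  // upload this copy at leisure, then delete it
        }
    }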
This solution seems kind of ugly, though, and I'm pretty sure there's a better way to handle this type of problem.
One thing that comes to mind is something like a file system filter that takes care of the replication and synchronization at the IRP level, kind of like what some A/Vs do. This is overkill for my project, however.
Questions
This is the first time that I've had to deal with this type of problem, so perhaps I'm thinking too much into it.
I'm interested in clean solutions that don't require going overboard with the complexity of their implementations. Perhaps I've missed something in the WinAPI that handles this problem gracefully?
I haven't decided what I'll be writing this in, but I'm comfortable with: C, C++, C#, D, and Perl.
After the discussions in the comments, my proposal would be as follows:
Create a partition on your data server, about 5 GB for safety.
Create a Windows Service project in C# that monitors your data drive / location.
When a file has been modified, create a local copy of the file, preserving the same directory structure, and place it on the new partition.
Create another service that would do the following:
Monitor bandwidth usage.
Monitor file creations on the temporary partition.
Transfer several files at a time (use threading) to your FTP server, abiding by the bandwidth usage at the current time, decreasing / increasing the worker threads depending on network traffic.
Remove the files from the partition that have successfully transferred.
So basically you have your drives:
C: Windows Installation
D: Share Storage
X: Temporary Partition
Then you would have following services:
LocalMirrorService - Watches D: and copies to X: with the same directory structure (a rough copy-step sketch follows the list)
TransferClientService - Moves files from X: to ftp server, removes from X:
It also uses multiple threads to move several files at once and monitors bandwidth.
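A rough sketch of the LocalMirrorService copy step, assuming hypothetical D:\Share and X:\Mirror roots; it simply re-creates the file's relative path under the staging partition:

    using System.IO;

    static class LocalMirror
    {
        const string ShareRoot  = @"D:\Share";
        const string MirrorRoot = @"X:\Mirror";

        static void MirrorFile(string changedFile)
        {
            // e.g. D:\Share\alice\report.docx  ->  alice\report.docx
            string relative = changedFile.Substring(ShareRoot.Length).TrimStart('\\');
            string target = Path.Combine(MirrorRoot, relative);

            Directory.CreateDirectory(Path.GetDirectoryName(target));
            File.Copy(changedFile, target, overwrite: true);
        }
    }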
I would bet that this is the idea that you had in mind, but it seems like a reasonable approach as long as you're really good with your application development and you're able to create a solid system that handles most issues.
When a user edits a document in Microsoft Word, for instance, the file will change on the share and it may be copied to X: even though the user is still working on it. Within Windows there should be an API to see whether the file handle is still open by the user; if that is the case, you can just create a hook to watch for when the user actually closes the document, so that all their edits are complete, and then migrate it to drive X:.
That being said, if the user is working on the document and their PC crashes for some reason, the document's file handle may not get released until the document is opened at a later date, thus causing issues.
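One common way to approximate that "is the handle still open?" check, rather than relying on a specific Windows API, is simply to attempt an exclusive open and treat a sharing violation as "still in use". A small sketch:

    using System.IO;

    static class FileState
    {
        static bool IsStillOpenElsewhere(string path)
        {
            try
            {
                // FileShare.None asks for exclusive access; Word (or anything else)
                // still holding the file makes this throw a sharing violation.
                using (new FileStream(path, FileMode.Open, FileAccess.ReadWrite, FileShare.None))
                {
                    return false;   // we got exclusive access, so nobody else has it open
                }
            }
            catch (IOException)
            {
                return true;        // sharing violation (or other I/O error): assume in use
            }
        }
    }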
For anyone in a similar situation (I'm assuming the person who asked the question implemented a solution long ago), I would suggest an implementation of rsync.
rsync.net's Windows Backup Agent does what is described in method 1, and can be run as a service as well (see "Advanced Usage"). Though I'm not entirely sure if it has built-in bandwidth limiting...
Another (probably better) solution that does have bandwidth limiting is Duplicati. It also properly backs up currently-open or locked files. Uses SharpRSync, a managed rsync implementation, for its backend. Open source too, which is always a plus!

Supporting both This-User-Only and Local-Machine settings

I have an application that has to support modifying some registry data depending on the kind of 'installation' that is desired. At present, I have no problem hard-coding it to get elevation and make the changes to the entire local machine, but that is far from nice; ideally, I would also like to support per-user installations. I could hard-code that, but then I lose the local-machine stuff. To be precise, the changes in question involve file association changes, COM stuff, etc.
How can I properly support both usage scenarios? Currently I use a set of ON/OFF checkboxes for the variety of associations.
Should I base this on, for example, a MachineInstall file existing in my app's directory, and if not, assume a per-user install?
Is it an expected/valid/whatever usecase to say that someone might want to do some things for the entire machine, and some things only for the user? (E.g. mixing of the two.)
Or should I change the entire UI, move away from checkboxes, and move to some sort of combobox with 'None/User/Local' options? Then again, I think this might break down once you involve multiple users and combinations.
To give an indication, I personally expect the application in question to have its uses for everyone on a computer and as such lean towards the Local-Machine as a 'default', if that makes any sort of difference.
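For what it's worth, the per-user / per-machine split for file associations usually comes down to writing the same data under HKEY_CURRENT_USER\Software\Classes versus HKEY_LOCAL_MACHINE\Software\Classes (both merge into HKEY_CLASSES_ROOT). A hedged sketch, with a made-up .foo extension and ProgID; the machine-wide branch requires the process to be elevated:

    using Microsoft.Win32;

    static class AssociationInstaller
    {
        static void RegisterExtension(bool perMachine, string exePath)
        {
            // Per-machine goes to HKLM (needs admin rights); per-user goes to HKCU.
            RegistryKey root = perMachine ? Registry.LocalMachine : Registry.CurrentUser;

            using (RegistryKey classes = root.CreateSubKey(@"Software\Classes"))
            {
                using (RegistryKey ext = classes.CreateSubKey(".foo"))
                    ext.SetValue("", "MyApp.Document");   // default value points at the ProgID

                using (RegistryKey cmd = classes.CreateSubKey(@"MyApp.Document\shell\open\command"))
                    cmd.SetValue("", "\"" + exePath + "\" \"%1\"");
            }
        }
    }

A 'None/User/Machine' choice in the UI would then just select which root key (if any) the same write goes to, which also covers the mixing question: each individual setting can carry its own scope.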
I am likely overthinking this quite a bit, so any and all input is very much appreciated. :)
P.S.
Now, someone is probably going to say 'do not do all that stuff from your app, do it from the installer instead'. And they probably have a point, but the point here is to allow easy changing of these settings from within the application. To top it off, I am not using .MSI install packages because they make working with 32/64-bit-specific executables a disaster, requiring merge modules, spawning other MSIs depending on the situation, and so forth (I forgot the details the last time I dug into it and then dropped the matter). I don't have that knowledge, nor the time to learn all the intricacies of MSI installations, so it is out as far as I am concerned. To boot, my application is perfectly capable of functioning without any of those registry entries being present, and that is by design. In a way, one might compare it to Process Explorer from Sysinternals, which does not require an installer, but can be unzipped and take over the task manager etc. without a problem if a user wants, or simply run stand-alone.
