We've been using VSS 6.0 since time began, but yesterday I nabbed VSS 2005 off of our MSDN subscription. It wouldn't let me install from the ISO through Daemon Tools (not sure why, but I submitted an error report to MS...). I noticed the ISO had a program files directory right on it, so I just copied the folder onto my hard drive. Well, I opened up the client and behold: a glamorous version of VSS 6.0, connected to the exact same DB.
Anyone know if I'm going to destroy everything by using it?
We moved from VSS6 to VSS2005 just over a year ago. The database structure is identical. The only caveat we found arose when some people still used VSS6 on a database where others were using VSS2005: VSS2005 treats Unicode text files as text files, whereas VSS6 does not. This means that when VSS2005 adds a Unicode text file, VSS6 sees it as binary (this affects csproj files, among others).
Other than that, VSS2005 supports proper HTTP access to the database (provided server extensions are installed), improved LAN performance (again, with server extensions), and better file system dialogs (the nasty old ones are gone). However, the new file add dialog shows ALL files, not just the ones that aren't included.
Also, VSS2005 allows the provision of custom editors and differencing tools by file extension, which is very useful. For example, some of our XML files are encrypted, so we run a decryption tool before the difference tool by using this system, which has increased the efficiency of our review processes substantially.
There are also other tweaks here and there, mostly good but occasionally annoying.
Finally, nothing has been destroyed. In fact, there appears to have been less corruption in the database since the transition, but I wouldn't put this down to the new VSS, as it wasn't a comprehensive test.
I'm pretty sure there is no more danger of destroying anything than when using VSS 6.0.
It's been quite a long time since I last used VSS, but we also updated from version 6 to version 2005. As far as I remember, there were only some cosmetic changes in the client (VSS Explorer); the format of the database and the available features were exactly the same as in VSS 6.
You should be fine.
Since VSS just uses a file share for everything, and there's nothing that is really server-based, you're fine. Not much has changed in the format of the database; it's mostly client-side stuff.
This is a strange circumstance that my boss and I just got into this morning as we were trying to import my scene from Team Foundation Server onto his machine. I created a Unity scene file and built my scene over the course of the past month or so, and when I was finished I uploaded everything to TFS so he could pull it down and use it (we are quite far apart from each other, so we can't just USB-drive everything over to see what the problem might be). When he pulled the scene file down (and all the supporting scripts), one of the scripts in the scene had changed. It started out as a script called Smart_HUD4, but when he went into the inspector, the script was now called Smart_HUD2 and was an entirely different script than what I had written; I don't even have a Smart_HUD2 script on my machine. The same applied to another script called laser (now called Laser1, and again, not something I wrote, nor is it on my machine).
Has anyone else come up against a problem like this? Found any solutions? It's strange, because I went ahead and re-downloaded the files I uploaded, and everything was exactly how it was supposed to be: proper names and scripts in their proper places. Could the issue be that we are moving from Windows to OS X, and the differences between operating systems are leaving behind some residual code or something that is causing things to be switched around?
TFS never asked me to merge any files, so if it is the case that TFS might be auto-merging files, none of the files it's merging even share the same name; it's just picking ones whose names are close and merging those (the scene file's name was completely unique, so there was nothing to merge it with).
OK, a few things to check:
Have you checked that files with similar names don't already exist in TFS, regardless of whether TFS requested name changes?
Are any other projects affected?
Have you checked (I'm guessing you have) that the Unity versions match exactly?
Now, something to consider: Unity does vary between Windows and macOS, mainly in that Unity makes use of Windows-based features that are not present on a Mac, and this can cause file issues.
When moving between systems, Unity packages the scene data on Windows in a global and local format, so sometimes the folder structure can change.
From experience, the most likely issue is more to do with file locations than OS discrepancies; it might have been a clash between your project and another similar one.
Remote file management is odd at times when doing collaborative work. Have you checked with other colleagues and made sure they don't have anything similar uploaded?
But scripts changing names and data is not something that I would expect, so my guess is that the issue lies with the upload to TFS and not the macOS vs. Windows move.
Hope this helps,
Glenn
So, I need to set up file storage for our team. I also have an SVN server. The ability to do rollbacks and to control who created or deleted a file is very necessary and important for our project.
Any ideas? Maybe even without SVN. I can connect using WebDAV, but only in read-only mode (because there is no LOCK support in it).
You can set up the SVN server to allow exactly that.
Read the chapter in the SVN book about WebDAV and Autoversioning
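For reference, autoversioning is switched on with a single directive in the Apache mod_dav_svn configuration. A minimal sketch, assuming your repository lives at /var/svn/repos and is served at /repos:

    <Location /repos>
      DAV svn
      SVNPath /var/svn/repos
      SVNAutoversioning on
    </Location>

With that in place, a WebDAV client can mount the share read-write, and every save becomes an automatic commit, which gives you both the rollback ability and the record of who changed what.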
So, what you want is the ability to roll back changes, and to limit who can make the changes, but without the bother of checking files in and out?
Maybe Subversion isn't for you. I've done similar sharing with Dropbox, and there's now Box.net, which is supposed to be like Dropbox on steroids. Dropbox (and I assume Box.net too) has some features that are very nice:
You can set up folder sharing between particular teams. That way, you can say who can and cannot access these files.
Dropbox automatically saves each and every version of a file, so you can always go back to previous versions -- even if that file has been deleted.
Files are stored locally. All a user has to know is to save a particular file in a particular folder, and everyone has access to it. I've successfully used Dropbox to collaborate with managers who make the Pointy-Haired Boss in Dilbert look like a high-tech genius.
There's also SkyDrive and Google Drive, but I don't find them as universal as Dropbox or as easy to use. It's possible to use Dropbox without ever going to the Dropbox website. To the non-geek, it appears to be magic as files I've written and edited appear on their drive. It took me a few weeks to train one person that he didn't have to email me his document when he made changes, because I already had it.
Dropbox gives you 2 GB of space for free, which doesn't sound like a lot. However, my first hard drive was a whopping 20 MB, which was twice the size of the standard 10 MB drive at that time. If you're not storing a lot of multimedia presentations or doing a lot of Photoshop, 2 GB might be more than enough for your project.
I know Windows 7 and later have some sort of versioning system built in (Previous Versions, I believe it's called). I know this because anytime someone mentions that Mac OS X has Time Machine, some Wingeek pipes in stating that Windows has the same thing, only better! Unfortunately, Windows is not my forte, so I don't know too much about this specific feature. I believe the default is once per day, but it can be changed. This might be the perfect solution if everyone is on Windows.
Subversion can do autoversioning as Stefan stated. Considering his position in the Subversion community (especially his work on TortoiseSVN), he knows his stuff. Unfortunately I don't know too much about it since I've never used or seen this feature implemented. It's probably due to the fact that I work mainly with developers who know what a version control system is, and therefore have no need for something that does the versioning for them.
Also, don't forget to check whether you can use your corporate SharePoint, which does something very much like what you want. I am not too impressed with SharePoint, but if the facility is there, and your company can give you the support, it is something you probably want to look into.
I have a serious issue: my hard disk crashed yesterday, and I had tons of projects on it. I lost most of them, but what I did recover are the debug folders I sent to clients (most of them are desktop applications). I am unable to recover the source code for most of the work, but I do have the debug folders. My question is: is there any way I can recover my code from them?
I am sure someone out there has gone through this in the past; please help if you have any information regarding this.
Files in the debug folder (example: I make an application called apple):
apple.exe type=Application
apple.pdb type=PDB File
apple.vshost.exe type=Application
apple.vshost.exe.manifest type=MANIFEST File
ADDITIONAL INFO:
My laptop hard disk crashed, so I am currently using it as a USB drive with another laptop. I had 3 partitions, but now I see 4: I, J, K, L. One of them, which used to be my D: drive, is working fine; it shows 72 GB free out of 150 GB. The rest of them are just there with no info; when I click them, nothing happens for minutes, and then it asks to format the drive, etc. If you know how to fix that, that would be wonderful.
Thank you
You really lucked out with having access to the Debug folders containing your compiled binaries. The fact that you're working in a managed language (C#) means that you can use one of the many .NET decompilers to display the source code that they contain in a readable format. It may not be exactly the same as what you initially typed into Visual Studio, but it will be pretty darn close—way better than can normally be expected in the event of a system crash.
I used to recommend Redgate's .NET Reflector for this task, but they recently decided to eliminate the free version of their decompiler utility and adopted some business practices that I personally disagreed with. Then again, their tools are probably still the best around, so you might consider downloading their 30-day trial to attempt to get your code back. Who knows, you might like it so much that you buy!
If you're a cheapskate like me, or a devotee of truly free software, you can try one of the free alternatives that cropped up after Reflector stopped being free, like ILSpy, developed by the same people who develop SharpDevelop. Even more alternatives are listed here.
Whichever decompiler utility you choose, download a copy and open its executable. Then from the "File" menu, choose "Open", and navigate to the first compiled .exe from which you want to recover source. The utility will display the name of your application and some metadata about your assembly. From here, you can make sure that you opened the correct file.
In both ILSpy and .NET Reflector, you can click the [+] toggle next to your application's name to expand its listing. You'll see a bunch more expandable items, like References (the DLLs that your application uses), Resources (the resource files compiled into your application), and the namespaces defined in your code. Expanding an individual namespace will show you all of the types defined in that namespace, and expanding a type will show you all of the types, methods, members, etc. defined in that type, and so on down the hierarchy. Clicking on individual items in the source tree to the left will display the decompiled code in the output pane to the right; both ILSpy and .NET Reflector support displaying the code as C#, which should look very readable to you.
For example, a good first test is to use ILSpy to open the ILSpy.exe application itself and browse its own decompiled source.
You really can't break anything in here, so navigate around, explore, and see what can be recovered; you'll be amazed at how well this works. Everything works just as well with DLLs as it does with EXEs.
Then get started copying and pasting...
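If hand-copying gets tedious, ILSpy also has a command-line front end, ilspycmd, distributed as a .NET global tool, which can dump a whole assembly to source files on disk. A sketch, assuming your recovered binary is apple.exe (flags as I remember them; check ilspycmd --help):

    dotnet tool install --global ilspycmd
    ilspycmd apple.exe -p -o recovered

The -p switch asks for a Visual Studio project rather than loose files, and -o names the output directory. If the matching .pdb files survived the crash, keep them next to the executables; decompilers can sometimes use them to recover friendlier names.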
The next order of business is getting your system stable again. If you had a hard disk crash, you definitely don't want to trust that drive ever again! Run out and buy a new one immediately, wipe it, and reload Windows.
Once you finish with that, you definitely want to set up a source/version/revision control system to use in the future to store your code. All smart developers use one, for so many reasons. Find more information with a Google search. There are lots of different options; which one you pick is not important. The important thing is that you pick one!
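To make that concrete, here is a minimal local Subversion setup, sketched with hypothetical paths:

    svnadmin create C:\svn\repos
    svn import C:\dev\apple file:///C:/svn/repos/apple/trunk -m "Initial import"
    svn checkout file:///C:/svn/repos/apple/trunk C:\dev\apple-wc

Any version control system (Subversion, Git, Mercurial, TFS...) would do just as well; the point is that your next disk crash costs you an hour, not a code base.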
I have just upgraded to VS 2010, and I have performance problems which I did not have before (in VS 2008).
The most annoying thing is that it freezes while I work in the text editor. Sometimes when it freezes I see that it is saving auto recovery information, but not always.
Almost anything I do results in an unacceptably long delay: saving, starting to debug, ending a debug session, switching between design and code view, and doing WinForms design.
I have some parts of my home directory on a mapped network drive. I suspect that might be part of the problem. Is it possible to configure VS 2010 to use exclusively the local disk for its "internal" work, perhaps?
Any hints would be appreciated! Has anyone else experienced these kinds of problems?
Edit:
I forgot to give my specs:
Win 7 64-bit
4 GB memory
No addins, just standard installation
The project folder is on the network drive
One interesting thing is that I feel that I have better performance in a VM running XP (where the VM runs on the same PC).
VS is great if you do what Microsoft recommends and work on a local copy of your projects.
As soon as you start trying to open projects in remote locations, you will get this issue.
Recommendations:
use a source control solution.
create a copy of your project locally and run the solution from that (for example, via a checkout, as sketched below).
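A sketch of that workflow with Subversion, using a hypothetical repository URL and local path:

    svn checkout http://svnserver/repos/MyProject/trunk C:\dev\MyProject

Open the solution from C:\dev\MyProject on the local disk, work there, and commit changes back with svn commit; the IDE never has to touch the network share while you edit.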
Also ...
I think it does its clever stuff in the background; I found that the more I use it, the faster it gets, especially on long-running projects that I regularly go back to.
If you think it might be the WPF framework (the VS2010 shell is WPF-based), you may want to try switching off Aero (as a test). If that helps, the problem is likely that your graphics hardware is not very good at effects or 3D-based output, so it's struggling.
Also try reducing the number of background services and apps you have running.
On Windows 7 these days, 4 GB of RAM is considered standard, so while it should perform fine, maybe consider putting more RAM in if you are trying to handle large datasets or similar business applications.
Another thing you could try is running a repair install over the top of your existing installation; it may not have installed something cleanly. Unlikely, but it may help.
If you can, buy an SSD disk and move all your projects locally.
I find VS2010 super intensive on disk.
It flies on my home machine with an SSD, but it's almost unusable on my work machine (Win7, 4 GB RAM, but a standard disk).
Try setting the number of parallel builds to half the number of cores you have (I think it's in Tools -> Options -> Projects and Solutions -> Build and Run). I had it set to 8, which was too much: it spawned 8 msbuild.exe processes, and rebuilding a solution with 70 projects bottlenecked the disk when they all tried to read/write similar pre-compiled headers. Those msbuild processes stick around even after you close the IDE.
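The same cap applies if you build from the command line; a sketch with a hypothetical solution name:

    msbuild MySolution.sln /m:4

The /m switch (short for /maxcpucount) limits MSBuild to four concurrent build processes instead of one per core, which eases the disk contention described above.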
Also, I disabled the "gather browsing info for implicit files" option, which made IntelliSense parsing quicker.
An old post I know, but in case it helps others (as the previous answers focused on source code)...
I found that it wasn't my source code that was the issue (that was held locally, along with all the references) but the default locations (projects, project templates, and item templates), as these were held on a networked drive. These can be altered under Tools -> Options -> Projects and Solutions.
Alternatively, you could change the frequency of the auto-recovery saves, or turn them off altogether, via Tools -> Options -> Environment -> AutoRecover.
For a solo developer using Visual Studio 2010 Beta 2, is TFS Basic a better option than Subversion (with VisualSVN or Ankh) and (optionally) something like CruiseControl?
I don't need distributed source or even remote access. I don't really care about drilldowns and all that reporting. I just want version control & potentially automated testing & building.
EDIT: to respond to Bob Aman's questions (Thanks Bob)
I was considering self-hosting, but off-site is a good idea, as you say. I back up regularly. It is really only me who will have access to the repository, so access control would not be complicated. I do have an MSDN subscription, so cost is not an issue. The repository won't get particularly large; I'm not that productive. :/
If you're not a large company beholden to Microsoft, I'd recommend Subversion.
TFS source control is - how can I put this - unfortunate.
It works very well if the only thing you ever do is work within Visual Studio. However, as soon as you want to interact with it from outside (for example, to add a text file or some batch files that aren't inside a Visual Studio solution), it becomes painful and problematic.
This is mainly because it marks all files as read-only on the file system unless you explicitly check them out. Visual Studio does this automatically, but for anything else you always have to explicitly check out files using either the command-line tf.exe or the TFS Power Tools. It gets really annoying really quickly.
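For a flavour of what that looks like day to day (hypothetical file names; commands from memory, see tf help):

    tf checkout build.bat
    tf add deploy-notes.txt
    tf checkin

The first clears the read-only bit so you can edit, the second tells TFS a brand-new file exists, and the third commits the pending changes; forget any one of these and the server simply never sees your work.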
Another side effect is the tracking of new files. Subversion et al. will tell you about new, untracked files and directories when you do a diff or status check, whereas TFS will ignore them entirely. This has led to many, many broken builds because a developer forgot to check in a new file; as far as TFS was concerned, it didn't exist at all until they explicitly added it.
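By contrast, a quick status check in Subversion makes the forgotten file jump out (illustrative output with hypothetical file names):

    > svn status
    M       Program.cs
    ?       NewHelper.cs

The M row is a tracked file with local modifications; the ? row is a file Subversion knows nothing about yet, which is your cue to svn add it before committing.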
TFS is also strongly tied to the Windows domain model of user authentication. You need to add Windows user accounts for people to have access to it, which is a pain if you're not already running on a corporate domain network.
Depends very much on a couple of factors. Are you hosting the repository yourself, or outsourcing it to a company that handles that for you? (The latter is highly recommended, since it usually means you get off-site backup and redundancy for cheap; things that can sometimes be hard to get right.) It's a lot easier to find stellar hosting for Subversion; I believe there are only one or two options on the market for TFS hosting. How many people will have access to the repository? Do you need to set permissions on portions of the repository? How do you want to handle access control? If you need to do anything particularly complicated, it's either not possible in TFS or it's very expensive. Subversion can usually handle it, though it's not always easy to set up.
Almost everything that TFS's version control can do can also be done on Subversion, either out-of-the-box or with the aid of some additional tool. Subversion also integrates pretty well with Visual Studio, though personally I always preferred TortoiseSVN. It's also a lot less expensive, assuming you don't already have TFS through MSDN subscriptions of some sort.
However, if you ever get into the extreme realms of version control (absurdly huge repositories, or gigantic binaries in the repository, for example), what you really want is Perforce.