How to reduce the size of Qt build? - visual-studio-2010

I'm trying to build Qt with visual studio (2010), but the build has so far taken up over 16GB of my tiny hard drive. I've already uninstalled practically every program I have.
Is there any way I can compile Qt without it hogging so much space, yet still get every feature? And once the build completes, will the unneeded files be cleaned up, or do I have to do that manually? How big will the build get (as I said, I've already passed 16 GB)?
I'm new to this, so please speak in layman's terms. Thanks.

Start with 40-50 GB of free space. The build generates a lot of temporary files, which you can clean up manually later.
If that is too much for your computer, get an external hard disk. A 1 TB drive should be less than $100.
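Once the build has finished, the intermediate object files can be deleted without losing the compiled libraries. Assuming an nmake-based Qt build run from a Visual Studio command prompt (the usual setup with MSVC), something like this from the Qt source root is the typical cleanup:

nmake clean

This removes the .obj files, which account for much of the space, while leaving the built libraries and headers in place.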

You can reduce its size by leaving out modules such as Qt WebKit, etc. (if you want). Check this link.
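To make that concrete: with Qt 4.x and MSVC, the optional modules are switched off at configure time, before running nmake. A sketch of such a configure line follows; the exact option names vary between Qt versions, so check configure -help rather than taking this as definitive:

configure -platform win32-msvc2010 -no-webkit -no-phonon -nomake examples -nomake demos

WebKit is one of the largest modules, so skipping it alone saves a substantial amount of intermediate build output.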

Related

Speeding up WIX compiles

I have a WIX 3.0 installer that is building 88 slightly different builds (the cross product of 32- and 64-bit, 11 locales, and four editions: Beta, Retail, Evaluation, Different Evaluation).
Each build has slightly different contents in addition to localized UI, so I can't just build one configuration with multiple locales.
The resulting MSI is about 120MB. I'm already using the CabCache.
The installer takes about 3-5 minutes per release to build, resulting in a pretty lengthy overall build time.
The installer build appears to be heavily disk-bound during linking (light.exe).
Clearly making the disks faster could help. Does anybody have advice on how to set up a machine that could crank through these installers faster? (or advice on reconfiguring my WIX project to build more efficiently?)
Get an SSD, like one of those with an internal RAID architecture from e.g. OCZ. An SSD is every developer's upgrade of the decade. Add more RAM too if swapping is an issue.
If you have common parts (ones that are not localized), you can create a merge module for them and then just add the differing content to each build.
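A minimal sketch of what that looks like in WiX 3.0 source, assuming a hypothetical common.msm that holds the shared, non-localized files:

<DirectoryRef Id="TARGETDIR">
  <Merge Id="CommonParts" SourceFile="common.msm" Language="1033" DiskId="1" />
</DirectoryRef>
<Feature Id="MainFeature" Title="Main" Level="1">
  <MergeRef Id="CommonParts" />
</Feature>

The merge module is compiled once and then referenced from each of the 88 builds, instead of each build recompiling the shared content.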
I am not sure if you have any say or communication with the developers of the application you are installing, but if you have to create that many MSIs mainly because of languages, have you considered offering one language MSI that delivers all the language-specific files to a resources directory, so the user can choose which language they would like to use (and only install it if they need something other than the default language)? It might also be worth having the product built so that the user can pick the best language from within the application, with all the languages installed from the start.
As for your question about speeding up the build, that is a tricky one. Merge modules I would rule out right away, as I don't see any actual gain coming from them here. Updating the hardware (as you said) will give some results, but it is hard to tell how much of a gain without knowing how big a jump you would be making. I think it might be best to go over your WXS with a fine-tooth comb and see what is really going on in there. You can sometimes find leftovers from the development of the package, or from a previous tool, that are really slowing you down. One example: my company recently switched to WiX from a more automated setup-creation utility (leaving the name out on purpose because I am listing its problems :P ), and it had automatically created every folder under Windows that might possibly be needed by a running application, as well as the common files folder, the current user profile, and many more. I ended up erasing over 100 empty directories that the old tool had been nice enough to add for me. That is just one example of the optimization that was done; it is amazing what you can find when you take the time to REALLY review what is going on under the hood.
In your .wixproj file, add this just before the end of the file, inside a <PropertyGroup> tag:
<IncrementalGet>true</IncrementalGet>
This tells WiX to recompile only the files that have changed since the previous build.
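In context, the tail of the .wixproj would look roughly like this. It is a plain MSBuild property, so adding it to an existing <PropertyGroup> works just as well as a new one:

  <PropertyGroup>
    <IncrementalGet>true</IncrementalGet>
  </PropertyGroup>
</Project>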

VS 2010 very slow

I have just upgraded to VS 2010, and I have performance problems which I did not have before (in VS 2008).
The most annoying thing is that it freezes while I work in the text editor. Sometimes when it freezes I see that it is saving auto recovery information, but not always.
Almost anything I do gives an unacceptable long delay, like saving, starting to debug, ending debug session, switching between design and code view, and doing WinForms designing.
I have some parts of my home directory on a mapped network drive. I suspect that that might be a part of the problem. Is it possible to configure VS 2010 to use exclusively local disk for its "internal" work perhaps?
Any hints would be appreciated! Has anyone else experienced these kinds of problems?
Edit:
I forgot to give my specs:
Win 7 64-bit
4 GB memory
No addins, just standard installation
The project folder is on the network drive
One interesting thing is that I feel that I have better performance in a VM running XP (where the VM runs on the same PC).
VS is great if you do what Microsoft recommends and work on a local copy of your projects.
As soon as you start trying to open projects in remote locations, you will get this issue.
Recommendations:
Use a source control solution.
Create a copy of your project locally and run the solution from that.
Also ...
I think it does its clever stuff in the background; I found the more I use it the faster it gets, especially on long-running projects that I regularly go back to.
If you think it might be the aforementioned WPF framework, try switching off Aero (as a test). If that helps, the problem is likely that your graphics hardware is not very good at effects or 3D output, so it's struggling.
Also try reducing the number of background services and apps you have running.
On Windows 7 these days 4 GB of RAM is considered standard, so while it should perform fine, consider adding more RAM if you are handling large datasets or similar business applications.
Another thing you could try is running a repair install over the top of your existing one; it may not have installed something cleanly. Unlikely, but it may help.
If you can, buy an SSD disk and move all your projects locally.
I find VS2010 super intensive on disk.
It flies on my home machine with an SSD, but it's almost unusable on my work machine (Win 7, 4 GB RAM, but a standard disk).
Try setting the number of parallel builds to half the number of cores you have (it's under Tools -> Options -> Projects and Solutions -> Build and Run; a command-line equivalent is sketched below). I had it set to 8, which was too much: it spawned 8 msbuild.exe processes, and rebuilding a solution with 70 projects bottlenecked the disk when they all tried to read/write similar precompiled headers. Those msbuild processes stick around even after you close the IDE.
I also disabled the gathering of browsing info for implicit files, which made IntelliSense parsing quicker.
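The command-line equivalent mentioned above: MSBuild caps concurrency with its /m (maxcpucount) switch, so a solution build limited to four concurrent processes looks like this (the solution name is a placeholder):

msbuild MySolution.sln /m:4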
An old post I know, but in case it helps others (as the previous answers focused on source code)...
I found that it wasn't my source code that was the issue (that was held locally along with all the references), but the default locations (projects, project templates, and item templates), as these pointed to a networked drive. They can be altered under Tools -> Options -> Projects and Solutions.
Alternatively you could change the frequency of the saves or turn them off altogether via Tools -> Options -> AutoRecover

How can I determine why a build runs slowly in Visual Studio 2005?

I would like to find out why a Visual Studio 2005 (MSBuild) build is taking a long time to build a project.
Suddenly we are getting 7-minute build times on some computers, while others take less, such as 4 minutes.
So I think I need to identify changes that were made to the project and are causing a longer build time.
Any ideas on how I can do that?
Take a look at MSBuild Profiler to analyze where the slow down is. Based on that information, dig into what each task is doing and factor in the things that Chris mentions in his answer.
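If a dedicated profiler is not an option, MSBuild's own diagnostic verbosity prints per-target and per-task performance summaries at the end of the log. A sketch, with the project name as a placeholder, that captures the output to a file for comparison across machines:

msbuild MyProject.csproj /verbosity:diagnostic > build.log

Note that VS 2005 builds C++ projects through vcbuild rather than MSBuild, so this applies to the managed projects in a solution.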
Is it a C++ project? I had the same problem when I moved my project from Visual Studio 6.0. Turning off code optimization saved a lot of time; it was almost impossible to work with optimization enabled.
It re-evaluates the references on every build. If any of your references are on network drives, that could be slowing it down.
Some computers are naturally going to perform builds faster than others. This is, in part, a function of processor, RAM, and hard-disk speeds.
As for why you see 7-minute build times, there could be any number of reasons: the amount of code in the project(s), the number of projects in the solution, the number of pre- and post-build steps, the speed of the network when pulling anything it needs from source control, the number of other processes running on the machine, and the amount of RAM and other resources available to perform the builds.
You get the idea.

Why do some installations take so much time?

Pretty much all of us have built an installer here and there, and all of us have installed some behemoth of a program. Why do some installations take so much time? Case in point: the Adobe CS suite (with newer versions you can take a vacation) or Visual Studio.
I know there are files to copy, and most of the time to unpack as well. There are some registry keys to set (if under Windows), and maybe a service or two to start. Some installations probably even check the hardware/software combination. All of this still does not justify the slooow installation times of some programs.
How can I speed it up?
It obviously depends what you're installing. As Colin Pickard pointed out, you'll be shifting huge quantities of data onto the disk (plus an optional virus check, etc.).
For installations I've built recently, we have to request the shut down of some Windows services, wait for that, and check that they really have shut down before continuing. That takes time.
I confess that in the above, the shutdowns are not parallelised, whereas they could be. I suspect installers are not necessarily optimised: they may well be the last thing the team puts together before release, and the team may figure you're only going to run it once (and forget the pain upon completion). Obviously not an ideal state of affairs!
Visual Studio on my machine is 3.03 GB: 16,842 files in 1,979 folders. Passing 3 GB through virus scanning and auditing software and onto the filesystem is too much for my (dual-core, 2 GB, SATA2) system; it's CPU- or IO-bound the whole way through the process. That's why it takes so long.
Most installers not only pack but also compress their contents, so at installation time all of those files must be decompressed, and all of the decompressed data must then be written to disk. Look at the time a zip operation over the same files takes; that's slow too.
Many installers also maintain a log that is flushed to disk after each primitive operation, so that even if the installation hits a fatal failure the log is preserved and can be sent to the software vendor. All that flushing adds up and contributes significantly to the overall time.

ReSharper sluggishness

I like ReSharper, but it is a total memory hog. It can quickly swell up and consume half a gig of RAM without too much effort and bog down the IDE. Does anybody know of any way to configure it so it isn't as slow?
Turn off the on-the-fly compilation (which, unfortunately, is one of its best features)
The next release, 4.5, is going to be based around performance and memory footprint.
See Ilya Ryzhenkov's blog.
ReSharper 4.5 has been released.
From my experience it is less of a memory hog, but I can still run out of memory.
I had an issue where it was taking upwards of 10 minutes to load a solution of 100+ projects. Once loaded, VS performance would be OK, though it would oddly flutter back and forth between OK and really bad.
The short answer: Eliminating Resharper warnings seems to improve overall VS/R# performance.
The biggest problem ultimately was that we had a number of files of binary data (encrypted stuff) being included as embedded resources, which happened to have .xml extensions. Resharper was trying really really hard to analyze those files. Eventually it'd get through but would generate 100K+ errors in the process. Changing the extension to one Resharper did not automatically analyze (.bin in this case) solved the problem.
We still have about 10 files which when they or a file they depend on is edited performance tanks for a while. These files are the partial parts of a single class definition where each file averages 3000 LOC. Yes, that's right, it's about a 30K line class. It also happens to be rather poor code for other reasons, many of which Resharper flags making the right hand gutter bar practically a solid orange line. Editing often causes Resharper to reanalyze the whole thing. While that analysis runs, performance is noticeably affected.
I've come to the conclusion that the fewer errors/warnings there are for R# to identify, the better it performs. My anecdotal evidence, gathered while cleaning up/refactoring this project, seems to support that.
A lot of folks complain of perf problems with Resharper. If you have even a few big ugly code files with lots of Resharper warnings, then a little time spent cleaning that code up might yield better performance overall. It has for us.
Not sure how big your solutions are, but I stopped using 4.5 for the same reason I stopped using all previous versions: memory usage.
Code analysis and unit test support were the main reasons I bought it; turning them off means the rationale for using it is gone.
Workstation has 4GB of memory, and I can easily kill it with ReSharper when running our end-to-end stack in debuggers.
You can see how much memory ReSharper uses:
ReSharper -> General -> Show managed memory usage in status bar.
If you are working on large source files, ReSharper does get sluggish (I'm using version 5.0 at the time of writing).
You can view the memory usage of Resharper by clicking on Resharper options -> General -> Show memory use in status bar.
When I first did this, I noticed ReSharper had clocked up hundreds of megabytes of memory usage! However, the next step worked for me in (temporarily) fixing the sluggishness:
Right-click the memory usage and select "Collect garbage"; this seemed to fix the sluggishness for me straight away.
Regarding memory hogging - I've found that my VS2008 memory footprint grows every time I close one solution and open another. This is true even if I close a solution and re-open that same solution.
The new ReSharper 4.5 works a lot better than the previous 4.x releases. I would recommend you try that one.
In previous versions I had the same problem; when 4.0 came out those problems seemed to go away. Now with 4.1 I do not feel the huge slowdown I used to have. My IDE does not freeze up anymore.
Have you tried upgrading?
Try the 4.5 beta. 4.1 was killing my 2GB dev machine, but it's back to running incredibly smoothly with the beta. Others have had the opposite experience, though, so YMMV.
Yes, 4.5 works much better. My understanding is that 4.5 was to address the performance issues.
My colleagues and I are also having huge performance issues with ReSharper; just now it took 1.1 GB of memory. Visual Studio slows down especially when writing JavaScript; it's unbearable. You can turn off the on-the-fly compilation, but it's the best feature it has...
Edit: Everybody in this thread seems to have ReSharper 4.x; my version is 6.0.
