I'm running out of disk space on my system. Space is filling up steadily even when I'm not installing any software; free space went from 100 GB to 70 GB in a few weeks.
When I checked my installed programs, the following were shown to take up a lot of space.
As you may notice, these are the versions:
10.0.18362.1
10.0.17763.132
10.0.17134.12
10.0.16299.15
Can I safely uninstall these from my system?
If not all, then which ones may I?
Are these redundant copies?
Also, I'm not sure if these are needed for Visual Studio or Visual Studio Code. Either way, I was planning to remove VS Code and Visual Studio anyway. Assuming I do, will it be safe to uninstall the four SDKs in the image above?
OS X 10.6.6 is installed inside VMware on a Windows 7 host. The overall performance is great; however, compile time has increased dramatically (1 hour versus 2-3 minutes on a real Mac). It's a modern machine with a Core i5 and 4 GB of RAM.
Here are the XBench results:
http://db.xbench.com/merge.xhtml?doc1=517768&doc2=1&setCookie=true
I think the problem could be the extremely slow 4K write score, but I don't know how to improve it.
Is there any way to increase performance?
UPD1: Swap is not used; there is enough memory for all operations.
Disk speed is also not the cause, since another MacBook of mine shows even worse results yet compiles hundreds of times faster.
UPD2: problem solved, see my answer below
Sharing my experience and solution.
My Xcode was running fine, but when I built a project (even an empty one), it would take up to 10 minutes.
SOLUTION:
Go to Xcode -> Preferences -> Source Control and disable Source Control.
Now projects build and run in a matter of seconds.
In VMware, you should have a setting where you can dedicate one or more cores entirely to the virtual machine. Assuming you have a quad-core, maybe give Mac OS X 2 or 3 cores? If you have a dual-core and you've already allocated 1 core to the VM (and the problem still persists), I can't say much then!
It's good that your problem is solved, but I want to share my experience with improving VMware performance. Please install VMware Tools for Mac OS; they are provided in a .iso file.
Steps to install VMware Tools for Mac OS:
1) Power on your VM.
2) At the bottom right there are some pop-up icons (these are usually not visible in full-screen mode). Right-click the CD/DVD icon.
3) Click Settings. In this window, make sure that darwin.iso is selected.
4) Close this window and right-click the CD/DVD icon again.
5) Click Connect. An icon named darwin(300) will appear.
6) The tools are inside this image. Install them!
The problem was: VERY SLOW recursive searching of include paths. With non-recursive paths, everything works smoothly.
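For anyone hitting the same thing: in Xcode's build settings, a trailing /** marks a search path as recursive. As a rough sketch (the $(SRCROOT)/include paths below are placeholders, not from the original project), the difference in an .xcconfig file looks like this:
// slow: recursively searches everything under include
HEADER_SEARCH_PATHS = $(SRCROOT)/include/**
// fast: non-recursive; list the exact directories instead
HEADER_SEARCH_PATHS = $(SRCROOT)/include $(SRCROOT)/include/vendor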
I also ran into the same problem, but I want to share my personal experience here.
My machine has 4 GB of RAM, and I allocated 3.5 GB of it to VMware; because of this, the entire operating system was very slow.
One day I looked closely at the VMware settings and finally found the solution: if you allocate more RAM than recommended, the operating system hangs. For my system the recommended RAM is 2048 MB, and after adjusting this the OS is fast.
You can adjust the RAM in the Hardware settings, under Devices. For clarification, I am attaching a screenshot.
I had the same problem and I fixed it as follows.
The biggest boost came from changing my VMware config file to stop backing guest memory with a .vmem file. In my .vmx file I added:
mainMem.useNamedFile = "FALSE"
prefvmx.minVmMemPct = "100"
I also set the maximum number of cores for the guest.
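For reference, a minimal sketch of how that might look together in a .vmx file; the numvcpus and memsize values below are assumptions (2 virtual CPUs and 2 GB of guest RAM) that you would adapt to your own hardware:
mainMem.useNamedFile = "FALSE"
prefvmx.minVmMemPct = "100"
numvcpus = "2"
memsize = "2048"
The first line keeps guest memory out of a .vmem backing file on disk, the second keeps the VM's memory in host RAM, and the last two set the virtual CPU count and guest memory size.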
When programming with Swift and Xcode, remove all /* */ comments that aren't really needed.
I have a laptop with the configuration below:
Core 2 Duo @ 1.4 GHz
4 GB RAM
320 GB hard drive
Windows 7
Is this sufficient for installing VS 2010? The processor speed is 1.4 GHz, but Microsoft's website gives a minimum of 1.6 GHz. Can anyone tell from their experience?
Thanks in advance.
It will most likely install; however, I would expect it to run slowly. It depends on what sort of work you are doing. Small console apps would be OK, but I doubt full-blown WPF/Silverlight apps would be speedy. Also, connecting to a local SQL instance, etc., could add overhead.
To sum up:
It will install.
Work will be tedious.
Another SO post for reference: VS 2010 Requirements
The main issue is the way that VS2010 uses WPF; you might find that large files behave a little jerkily in the text editor, but I don't think it'll be unusable.
I've not tried VS2010, but I do have VS2008 + SQL Server Express installed on a netbook with a few-years-old Atom CPU and 2 GB of RAM, and it works fine, though it's obviously a bit slow. So I'd assume you'll have no problems: even if the requirements for VS2010 are higher, your laptop is a much higher spec than that netbook.
It will work, but you might have some performance issues in the editor/designer. I had a machine with an almost identical configuration and used it for Silverlight development. I always had problems with the design preview of XAML files; they took longer to load than expected.
I am trying to build a solution for Windows XP in Visual Studio 2005. This solution contains 81 projects (static libs, exes, DLLs) and is being successfully used by our partners. I copied the solution bundle from their repository and tried setting it up on 3 similar machines of people in our group. I was successful on two machines, and the solution failed to build on my machine.
The build on my machine encountered two problems:
During a simple build, creation of the biggest static library (about 522 MB in debug mode) would fail with the message "13>libd\ui1d.lib : fatal error LNK1106: invalid file or disk full: cannot seek to 0x20101879".
A full solution rebuild creates this library; however, when it comes to linking the library into the main .exe file, devenv.exe spawns link.exe, which consumes about 80 MB of physical memory and 250 MB of virtual memory and then spawns another link.exe, which does the same. This goes on until the system runs out of memory. On my colleagues' PCs, where the build succeeds, there is only one link.exe process, which uses all the memory required for linking (about 500 MB physical).
There is plenty of hard drive space on my machine, and the file system is NTFS.
All three of our systems are similar: Core 2 Quad processors, 4 GB of RAM, Windows XP SP3. We are using Visual Studio installed from the same source.
I tried using different RAM and a different CPU, using a dedicated graphics adapter to rule out video memory sharing influencing the build, putting the solution files in a different location, using different editions of VS 2005 (Professional, Standard, and Team Suite), changing the amount of available virtual memory, running memtest86, and building the project from scratch (i.e. a clean bundle).
I have read what MSDN says about LNK1106; none of the cases apply to me except maybe "out of heap space", but I am not sure how I should fight that.
The only idea I have left is reinstalling the OS; however, I am not sure it would help, and I am not sure my situation wouldn't repeat itself on a different machine.
Would anyone have any sort of advice for me?
Thanks
Yes, 522 megabytes is about as large a contiguous chunk of virtual memory as can be allocated on the 32-bit version of Windows. That's one hulking large library; you'll have to split it up.
You might be able to postpone the inevitable by building on a 64-bit version of Windows, where 32-bit programs get a much larger virtual memory space, close to 4 GB.
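If splitting it up has to wait, a workaround people sometimes suggest for linker out-of-heap errors is to mark link.exe itself as large-address-aware so it can use that bigger space on 64-bit Windows. A hedged sketch, not something I've verified against VS 2005; the path below is the default install location and may differ on your machine:
rem run from a Visual Studio command prompt, and back up the original first
copy "C:\Program Files\Microsoft Visual Studio 8\VC\bin\link.exe" "C:\Program Files\Microsoft Visual Studio 8\VC\bin\link.exe.bak"
editbin /LARGEADDRESSAWARE "C:\Program Files\Microsoft Visual Studio 8\VC\bin\link.exe"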
I need to run a few Visual Studio instances on Windows XP, and they seem to take up a lot of memory. I am also running ReSharper, which is a memory hog.
I am running 32-bit XP. How much memory can I put into my machine before the OS hits its limit?
Also, are there any other ways of running multiple Visual Studio instances without such slow performance?
32-bit operating systems are limited to 4 GB of RAM, which may or may not be enough for you. Also, I think Windows shows about 3 GB of RAM if you install 4 GB.
I suggest you switch to 64-bit and upgrade to 8 GB if you can.
UPDATE: See Jeff's blog post on the subject: Dude, Where's My 4 Gigabytes of RAM?
The maximum amount of memory that can be seen by 32-bit WinXP is somewhere between 3 and 4 gigabytes, depending on your chipset.
I have also run into issues running multiple instances of VS when I had ReSharper installed. The only thing you can do is run 64-bit XP with more memory, or not use ReSharper (which is a bummer).
The 32-bit Windows kernel divides the 4 GB virtual address space into 2 GB/2 GB partitions. If you feed the /3GB switch to NTLDR, it will offer 1 GB of kernel space and 3 GB of user-mode space. Note that this does NOT imply that you can't write software that takes advantage of more than 4 GB of physical memory on a 32-bit CPU.
A workaround is a hardware-supported feature for accessing the remaining memory in banks, or "windows", since the CPU still sees a maximum of 4 GB of addressable space at once. Some database and GIS software offers this possibility. This is called Physical Address Extension (PAE) and allows using (though not addressing at once) up to 64 GB with 36-bit addresses. WinXP offers AWE, an API built on top of PAE.
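For reference, both switches go on the OS entry in boot.ini. A sketch, where the multi(0)... ARC path is the common default and may differ on your machine:
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /fastdetect /3GB /PAE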
That's the theory. For using Visual Studio, you can get the full 4 GB for your system, or upgrade to a 64-bit OS with more RAM; the latter fully pays off only if VS offers a 64-bit version.
"Also, any other ways of running multiple visual studio without such slow performance."
+1 trick: you should use a RAM disk (download) to accelerate I/O.
If you're using a source-management system (e.g. Subversion), and hopefully you are, you can simply check out your projects there. VS.NET makes tons of I/O calls, and RAM disks are much faster than real disks.
CAUTION! If you turn off your computer, the RAM disk's contents disappear.
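As a sketch of the workflow, here is how it might look with the free ImDisk driver (my choice of tool, not the poster's; the repository URL is hypothetical):
rem create a 512 MB NTFS RAM disk as drive R: (assumes ImDisk is installed)
imdisk -a -s 512M -m R: -p "/fs:ntfs /q /y"
rem check out the working copy onto the RAM disk
svn checkout http://yourserver/svn/yourproject/trunk R:\yourproject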
I'm setting up a complete .NET development environment on my Macbook Pro.
I'm using Visual Studio 2008 Team Suite, SQL Server 2008, MS Office, and other tools (like FinalBuilder, RegexBuddy, Beyond Compare).
How big should my windows 7 (beta currently) partition be? Will 100GB be enough?
NOTE: I wasn't sure if this was programming-related enough for SO, so I'll just let the community decide if this question is relevant.
100 GB should be more than enough for all those apps. I've installed Win7 in a virtual machine, and the virtual HDD ended up with a size of 7 GB (that's only the OS, of course). Trying the same with Vista, for example, uses about 25 GB. It seems they're making it lighter.
You described my laptop. 100 GB would leave approximately 40 GB for the Users directory.
100 GB will be plenty. You'll have the OS and apps, but no music, pictures, or videos. 100 GB is probably overkill, especially if you can resize the partition later if needed.
I have Windows 7 installed on a laptop with two 100 GB hard drives.
Currently I'm using 18 GB, and that's with most of the primary stuff installed, but not Visual Studio or SQL Server; those probably won't use more than 10 GB (I reckon). I do have Virtual XP Mode installed, which is probably quite large too.
The Windows folder is about 9.3 GB.
The Users folder is 3.2 GB (but I have some large files on my desktop).
The Program Files folder is 3.0 GB.
The rest of the files on the OS drive are mostly driver files, which you don't have to leave on the drive itself.
So 100 GB would probably even be overkill, but it does give you some headroom!
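If you want to check the same folder sizes on your own machine, a quick sketch in PowerShell (which ships with Windows 7):
# sum the file sizes under C:\Windows and print the total in GB
$sum = (Get-ChildItem C:\Windows -Recurse -ErrorAction SilentlyContinue | Measure-Object -Property Length -Sum).Sum
"{0:N1} GB" -f ($sum / 1GB)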
Windows 7 is going to be a little bit smaller than Windows Vista, so if you create a partition big enough for Windows Vista, it will be perfect for Windows 7.
See the Engineering7 blog for more information about disk space in Windows 7.
I would give as much as you can to Windows 7, since it will probably become your primary OS. I find that I rarely use my OS X partition, except for cracking WEP.
100 GB is barely enough. You can install Windows 7 and the mentioned programs along with lots of other stuff, but once you accumulate some clutter here and there, plus movies and such that you happen to download, it gets cramped.
Unless you're relying on some other device for things other than those tools, I recommend a larger allocation of at least 150 GB.