How can I always start Visual Studio 2010 with the 3GB memory limit, i.e. as set with this:
bcdedit /set increaseuserva 3072
"How can I get the operating system started" would be the more typical question these days. Video adapters eat up too much physical address space for this option to still work. Get a 64-bit operating system and you'll have 4 gigabytes. And editbin.exe /largeaddressaware on devenv.exe. It voids the warranty I imagine.
Once you set that limit, it is system-wide: your whole system will always start with it. However, remember that when you use .NET, no single object can exceed 2GB.
For applications to work above the 2GB boundary, they must have been linked with the /LARGEADDRESSAWARE flag. I am not sure whether that flag was used to build Visual Studio itself, or the .NET Framework. If not, your system as a whole can consume more than 2GB, but applications without that flag set, including Visual Studio, cannot exceed that limit in their own address space.
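If in doubt, you can check whether a given binary carries the flag with dumpbin, which ships with Visual Studio; when the bit is set, the header dump contains the line "Application can handle large (>2GB) addresses":

    dumpbin /headers devenv.exe | findstr /C:"large"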
Related
I'm running out of disk space on my system. Space is filling up steadily even when I'm not installing any software; free space went from 100 GB to 70 GB in a few weeks.
When I checked my installed software, these were shown to take up a lot of space.
As you may notice, the following are the versions:
10.0.18362.1
10.0.17763.132
10.0.17134.12
10.0.16299.15
I want to ask if I can uninstall these safely from my system?
If not all, then which ones may I?
Are these redundant copies?
Also, I'm not sure whether these are needed for Visual Studio or Visual Studio Code. Either way, I was planning to remove both VS Code and VS. Assuming that, will it be safe to uninstall the 4 SDKs in the image above?
I am trying to build a solution for Windows XP in Visual Studio 2005. The solution contains 81 projects (static libs, EXEs, DLLs) and is being used successfully by our partners. I copied the solution bundle from their repository and tried setting it up on 3 similar machines belonging to people in our group. I was successful on two machines, but the solution failed to build on my machine.
The build on my machine encountered two problems:
During a simple build, creation of the biggest static library (about 522 MB in debug mode) would fail with the message "13>libd\ui1d.lib : fatal error LNK1106: invalid file or disk full: cannot seek to 0x20101879".
A full solution rebuild creates this library; however, when it comes to linking the library into the main .exe, devenv.exe spawns link.exe, which consumes about 80 MB of physical memory and 250 MB of virtual memory and then spawns another link.exe, which does the same. This goes on until the system runs out of memory. On the PCs of my colleagues where the build succeeds, there is only one link.exe process, which uses all the memory required for linking (about 500 MB physical).
There is plenty of hard drive space on my machine, and the file system is NTFS.
All three of our systems are similar: Core2Quad processors, 4 GB of RAM, Windows XP SP3. We are using Visual Studio installed from the same source.
I tried using different RAM and a different CPU, using a dedicated graphics adapter to rule out video memory sharing influencing the build, putting the solution files in a different location, using different editions of VS 2005 (Professional, Standard, and Team Suite), changing the amount of available virtual memory, running memtest86, and building the project from scratch (i.e. from a clean bundle).
I have read what MSDN says about LNK1106; none of the cases apply to me except perhaps "out of heap space", but I am not sure how to fight that.
The only idea I have left is reinstalling the OS, but I am not sure it would help, and I am not sure my situation wouldn't repeat itself on a different machine.
Would anyone have any sort of advice for me?
Thanks
Yes, 522 megabytes is about as large a contiguous chunk of virtual memory as can be allocated on the 32-bit version of Windows. That's one hunking large library; you'll have to split it up.
You might be able to postpone the inevitable by building on a 64-bit version of Windows, where 32-bit programs get a much larger virtual memory space, close to 4 GB.
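If you want to see how little contiguous room a 32-bit process actually has to work with, a quick probe can help. This is a minimal illustrative sketch (not part of any Microsoft tool): it asks VirtualAlloc for progressively smaller reservations until one succeeds, which reveals the largest contiguous block currently available:

    #include <windows.h>
    #include <stdio.h>

    int main() {
        // Probe downward from 2GB for the largest single reservation
        // that succeeds; DLLs scattered through the address space mean
        // the answer is usually well under the theoretical limit.
        SIZE_T size = 0x80000000u;            // start at 2GB
        while (size >= 0x00100000u) {         // give up below 1MB
            void* p = VirtualAlloc(NULL, size, MEM_RESERVE, PAGE_NOACCESS);
            if (p != NULL) {
                printf("Largest reservable block: %Iu MB\n",
                       size / (1024 * 1024));
                VirtualFree(p, 0, MEM_RELEASE);
                return 0;
            }
            size -= 0x00100000u;              // step down 1MB and retry
        }
        printf("No sizable contiguous block available\n");
        return 0;
    }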
I am investigating a strange problem with my application, where the behaviour is different on 2 versions of Windows:
Windows XP (32-bit)
Windows Server 2008 (64-bit)
My findings are as follows.
Windows XP (32-bit)
When running my test scenario, the XML parser fails at a certain point during the parsing of a very large configuration file (see this question for more information).
At the time of failure, the process size is approximately 2.3GB. Note that a registry key has been set to allow the process to exceed the default maximum process size of 2GB (on 32-bit operating systems).
The symptom of the failure is a call to IXMLDOMDocument::load() failing, as described in the question linked above.
Windows Server 2008 (64-bit)
I run exactly the same test scenario in Windows Server 2008 -- the only variable is the operating system. When I look at my process under Task Manager, it has a * 32 next to it, which I am assuming means it is running in 32-bit compatibility mode.
What I am noticing is that at the point where the XML parsing fails on Windows XP, the process size on Windows Server 2008 is only about 1GB (IOW, approximately half the process size as on Windows XP).
The XML parsing does not fail on Windows Server 2008; it all works as it should.
My questions are:
Why would a 32-bit application (running in 32-bit mode) consume half the amount of memory on a 64-bit operating system? Is it really using half the memory, is it using virtual memory differently, or is it something else?
Given that my application seems to be using half the amount of memory on Windows Server 2008, does anyone have any ideas as to why the XML parsing would fail on Windows XP? Every time I run the test case, the error retrieved via IXMLDOMParseError (see this answer) is different. Because this appears to be non-deterministic, it suggests to me that I am running into a memory usage problem rather than dealing with malformed XML.
You didn't say how you observed the process; I'll assume you used Taskmgr.exe. Beware that its default view gives very misleading values in the Memory column: it shows the working set size, the amount of RAM being used by the process. That has nothing to do with the source of your problem, which is running out of virtual memory space. There is not much reason to assume that Windows 2008 would show the same value as XP; it has a significantly different memory manager.
You can see the virtual memory size as well, use View + Columns.
The reason your program doesn't bomb on a 64-bit operating system is because 32-bit processes have close to 4 gigabytes of addressable virtual memory. On a 32-bit operating system, it needs to share the address space with the operating system and gets only 2 gigabytes. More if you use the /3GB boot option.
Use the SAX parser to avoid consuming so much memory.
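For illustration, here is a minimal sketch of the SAX approach with MSXML 6 (assuming MSXML 6 is installed; the file name is a placeholder and the content handler implementation is omitted). The trade-off is that you process the document as a stream of events rather than a random-access tree, so memory use stays flat regardless of file size:

    #include <windows.h>
    #include <msxml6.h>
    #include <atlbase.h>

    int main() {
        CoInitialize(NULL);
        {
            // Create the MSXML 6 SAX reader instead of a DOM document.
            CComPtr<ISAXXMLReader> reader;
            HRESULT hr = reader.CoCreateInstance(__uuidof(SAXXMLReader60));
            if (SUCCEEDED(hr)) {
                // A class implementing ISAXContentHandler would be attached
                // here to receive startElement/characters/endElement events:
                // reader->putContentHandler(&myHandler);

                // SAX streams the file as events; it never builds an
                // in-memory tree the way IXMLDOMDocument::load() does.
                hr = reader->parseURL(L"huge-config.xml");
            }
        }
        CoUninitialize();
        return 0;
    }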
Not only are there differences in available memory between 32-bit and 64-bit (as discussed in previous answers), but it's the availability of contiguous memory that may be killing your app on 32-bit.
On a 32-bit machine your app's DLLs will be littering the memory landscape in the first 2GB of memory (the app at 0x00400000, OS DLLs up at 0x7xxx0000, other DLLs elsewhere). Most likely the largest contiguous block you have available is about 1.1GB.
On a 64-bit machine (which gives you the 4GB address space with /LARGEADDRESSAWARE) you'll have at least one block in that 4GB space that is 2GB or more in size.
So there is your difference: if your XML parser relies on one large blob of memory rather than many small blobs, it may be running out of contiguous usable space on 32-bit while having plenty of it on 64-bit.
If you want to visualize this on the 32-bit OS, grab a copy of VMValidator (free) and look at the Virtual view for a visualization of your memory, and at the Pages and Paragraphs views to see the data for each memory page/paragraph.
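If you'd rather measure this than visualize it, a small sketch like the following (my illustration, not part of VMValidator) walks the address space with VirtualQuery and reports the largest free region, i.e. the biggest single allocation that could currently succeed in the process:

    #include <windows.h>
    #include <stdio.h>

    int main() {
        MEMORY_BASIC_INFORMATION mbi;
        SIZE_T largest = 0;
        unsigned char* p = 0;
        // Walk every region in the process address space; VirtualQuery
        // fails once we step past the top of user-mode address space.
        while (VirtualQuery(p, &mbi, sizeof(mbi)) == sizeof(mbi)) {
            if (mbi.State == MEM_FREE && mbi.RegionSize > largest)
                largest = mbi.RegionSize;
            p = (unsigned char*)mbi.BaseAddress + mbi.RegionSize;
            if (p <= (unsigned char*)mbi.BaseAddress)
                break;                      // wrapped around the top
        }
        printf("Largest free contiguous region: %Iu MB\n",
               largest / (1024 * 1024));
        return 0;
    }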
I need to run a few instances of Visual Studio on Windows XP, and they seem to take up a lot of memory. I am also running ReSharper, which is a memory hog.
I am running 32-bit XP. How much memory can I put into my machine before the OS hits its limit?
Also, are there any other ways of running multiple instances of Visual Studio without such slow performance?
32-bit operating systems are limited to 4 GB of RAM, which may or may not be enough for you. Also, Windows will typically show only about 3 GB if you install 4 GB.
I suggest you switch to 64-bit and upgrade to 8 GB if you can.
UPDATE: See Jeff's blog post on the subject: Dude, Where's My 4 Gigabytes of RAM?
The maximum amount of memory that can be seen by 32-bit WinXP is somewhere between 3 and 4 gigabytes, depending on your chipset.
I have also run into issues running multiple instances of VS when I had ReSharper installed. The only thing you can do is run 64-bit XP with more memory, or not use ReSharper (which is a bummer).
The 32-bit Windows kernel divides the 4GB virtual address space into 2GB/2GB partitions. If you feed the /3GB switch to NTLDR, it will instead offer 1GB of kernel space and 3GB of user-mode space. Note that this does NOT imply that software on a 32-bit CPU cannot take advantage of more than 4GB of memory.
The workaround is a hardware-supported feature that accesses the remaining memory in banks or "windows", since the CPU still sees a maximum of 4GB of addressable space at once. Some database and GIS software offers this possibility. The mechanism is called Physical Address Extension (PAE); it allows using (though not addressing at once) up to 64GB via 36-bit addresses. WinXP offers AWE, an API built on top of PAE.
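To make the AWE idea concrete, here is a hedged sketch of the pattern (error handling trimmed; it also assumes the account holds the "Lock pages in memory" privilege, which AWE requires). Physical pages are allocated outside the process's virtual address space and mapped into a small virtual "window" on demand:

    #include <windows.h>

    int main() {
        SYSTEM_INFO si;
        GetSystemInfo(&si);

        ULONG_PTR pageCount = 256;               // 1MB worth of 4KB pages
        ULONG_PTR* pfns = new ULONG_PTR[pageCount];

        // Grab physical pages; these do not consume virtual address space.
        AllocateUserPhysicalPages(GetCurrentProcess(), &pageCount, pfns);

        // Reserve a virtual window to map them into.
        void* window = VirtualAlloc(NULL, pageCount * si.dwPageSize,
                                    MEM_RESERVE | MEM_PHYSICAL,
                                    PAGE_READWRITE);

        // Map the physical pages into the window. Remapping different
        // page sets into the same window is how software uses more RAM
        // than it can address at once.
        MapUserPhysicalPages(window, pageCount, pfns);

        // ... use 'window' as ordinary memory ...

        MapUserPhysicalPages(window, pageCount, NULL);   // unmap
        FreeUserPhysicalPages(GetCurrentProcess(), &pageCount, pfns);
        VirtualFree(window, 0, MEM_RELEASE);
        delete[] pfns;
        return 0;
    }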
That's the theory. In practice, for Visual Studio you can either get the full 4GB into your system or upgrade to a 64-bit OS with more RAM; the latter pays off fully only if VS offers a 64-bit version.
"Also, any other ways of running multiple visual studio without such slow performance."
+1 trick: you should use a RAM disk (download) to accelerate I/O.
If you're using a source management system (e.g. Subversion), and hopefully you are, you can simply check your projects out onto the RAM disk. VS.NET makes tons of I/O calls, and RAM disks are much faster than real disks.
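For instance, assuming the RAM disk is mounted as drive R: (the drive letter and repository URL here are placeholders), the checkout would look like:

    svn checkout http://server/repo/trunk R:\projects\trunk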
CAUTION! If you turn off your computer, the RAM disk disappears.
If a Windows application has IMAGE_FILE_LARGE_ADDRESS_AWARE set in its image header (via the /LARGEADDRESSAWARE linker flag), this typically allows a 32-bit application to use more than 2GB of memory (which only makes sense if the 32-bit operating system has the /3GB switch set in boot.ini). See the MSDN article on /3GB for more info.
My question is: what happens if you run this application on a system that does NOT have the /3GB switch set? Is the flag simply ignored? Or will the app try to use a 3GB heap and get out-of-memory errors because the user space only has 2GB available?
I keep hearing anecdotally that the LARGEADDRESSAWARE switch is ignored on systems with a 2GB user space, but I cannot find any official Microsoft documentation on this.
Thanks in advance.
Basically, IMAGE_FILE_LARGE_ADDRESS_AWARE tells the system: "I know that addresses with the high bit set are not negative, and I can handle them."
If the system is prepared to provide user-mode addresses above 2GB, it will. If the system is not prepared to give out those addresses (i.e., a 32-bit Windows OS without the /3GB setting), the process can't get those addresses anyway, but no harm is done.
Also note that if an image has the IMAGE_FILE_LARGE_ADDRESS_AWARE bit set, it will get access to address space above 2GB on Win64 systems, which do not support (or need) the /3GB switch. A 32-bit application will get an address space of close to 4GB, and a 64-bit application will get a huge address space of 7TB to 8TB, depending on the platform (64-bit builds set the bit by default).
http://msdn.microsoft.com/en-us/library/aa366778.aspx#memory_limits
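To see both halves of this in action, a small self-check is easy to write. The sketch below (my illustration, not from the linked article) reads the large-address-aware bit out of the running module's PE header and asks the OS how much user-mode virtual address space the process actually received:

    #include <windows.h>
    #include <stdio.h>

    int main() {
        // GetModuleHandle(NULL) points at this module's DOS header.
        IMAGE_DOS_HEADER* dos = (IMAGE_DOS_HEADER*)GetModuleHandle(NULL);
        IMAGE_NT_HEADERS* nt =
            (IMAGE_NT_HEADERS*)((BYTE*)dos + dos->e_lfanew);
        BOOL laa = (nt->FileHeader.Characteristics &
                    IMAGE_FILE_LARGE_ADDRESS_AWARE) != 0;

        // Ask the OS how much user-mode address space we were given.
        MEMORYSTATUSEX ms;
        ms.dwLength = sizeof(ms);
        GlobalMemoryStatusEx(&ms);

        printf("LARGEADDRESSAWARE: %s\n", laa ? "yes" : "no");
        printf("Total user-mode address space: %I64u MB\n",
               ms.ullTotalVirtual / (1024 * 1024));
        return 0;
    }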
The switch is ignored, if you can call it that. For once, Microsoft actually managed to come up with a descriptive name.
The flag means exactly what it says. This image file is aware that large addresses exist.
That is, it won't crash if it is given a pointer above the 2GB boundary.
And that's all. The OS doesn't have to treat the process special in any way. It simply indicates that if the OS is able to provide more than 2GB memory, this process can handle it without crashing.
You can make a simple hello world application which never uses more than 1.5MB, and still has this flag set. It doesn't mean "I want to use 3GB of memory", it means "When I request memory, I don't care if it's above or below the 2GB boundary".
So since the flag doesn't require the OS to do anything special, the OS simply won't do anything special if there is nothing special it can do.
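As an illustration of that last point, the flag costs nothing to set on even the most trivial program. Assuming the Visual C++ command-line tools, something like this links a hello-world with the bit set, and dumpbin /headers will then report "Application can handle large (>2GB) addresses":

    cl hello.cpp /link /LARGEADDRESSAWARE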