Not enough storage is available to complete this operation - visual-studio-2010

Environment:
Visual Studio Ultimate 2010
Windows XP
WPF Desktop Application using .NET 4.0
We have a desktop application which plays a video. This video is part of a project and the project is packaged into the installer. Every once in a while building the installer project shows this error message:
Not enough storage is available to complete this operation
If I restart Visual Studio it works.
Is there a way to avoid this? Is there a better way to package videos in an installer?

This usually happens when the build process needs a lot of RAM and cannot get it. Since restarting Visual Studio fixes the problem, that is most likely the case here too.
Try closing some of your running applications. You can also try adding more RAM to your machine or increasing the size of the page file.

I came across this question when trying to compile my C# solution in Visual Studio 2010 on Windows XP. One project had a fair number of embedded resources in it (the size of the resultant assembly was ~140 MiB) and I couldn't compile the solution because I was getting the
Not enough storage is available to complete this operation
error in my build output.
None of the answers on this question helped, but I did find an answer to "Not enough storage is available to complete this operation" by ScottBurton42 on social.msdn.microsoft.com. It suggests adding the 3GB switch to the Boot.ini file, and making devenv.exe large-address aware. Adding the 3GB switch to my Boot.ini file was what worked for me (I think devenv.exe for Visual Studio 2010 and above is already large-address aware).
My answer is based on that answer.
Solution 1: Set the /3GB Boot.ini switch
The page Memory Support and Windows Operating Systems on MSDN says:
The virtual address space of processes and applications is still limited to 2 GB unless the /3GB switch is used in the Boot.ini file. The /3GB switch allocates 3 GB of virtual address space to an application that uses IMAGE_FILE_LARGE_ADDRESS_AWARE in the process header. This switch allows applications to address 1 GB of additional virtual address space above 2 GB.
The following example shows how to add the /3GB parameter to the Boot.ini file to enable application memory tuning:
[boot loader]
timeout=30
default=multi(0)disk(0)rdisk(0)partition(2)\WINNT
[operating systems]
multi(0)disk(0)rdisk(0)partition(2)\WINNT="????" /3GB
Note "????" in the previous example is be the programmatic name of the operating system.
In Windows XP, the Boot.ini file can be modified by going to
System Properties → Advanced → Startup and Recovery → Settings → System Startup → Edit
The page on the /3GB switch on MSDN says:
On 32-bit versions of Windows, the /3GB parameter enables 4 GT RAM Tuning, a feature that enlarges the user-mode virtual address space to 3 GB and restricts the kernel-mode components to the remaining 1 GB.
The /3GB parameter is supported on Windows Server 2003, Windows XP, and Windows 2000. On Windows Vista and later versions of Windows, use the IncreaseUserVA element in BCDEdit.
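For reference, on Windows Vista and later the same tuning is done with BCDEdit from an elevated command prompt, for example (3072 MB shown):
bcdedit /set IncreaseUserVA 3072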
Restarting the machine will then cause the setting to take effect.
Solution 2: Make devenv.exe large address aware
Open up a Visual Studio Command Prompt (or a Developer Command Prompt, depending on the version of Visual Studio)
Type and execute the following command line:
editbin /LARGEADDRESSAWARE {path}\devenv.exe
where {path} is the path to devenv.exe (you can find this by going to the properties of the Visual Studio shortcut).
This will allow devenv.exe to access 3GB of memory instead of 2GB.
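To verify the change (or to check whether your devenv.exe already has the flag compiled in), you can dump its PE headers with dumpbin from the same command prompt and look for the line "Application can handle large (>2GB) addresses":
dumpbin /headers devenv.exe | findstr /i "large"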

Problem
In my case, the problem was with a test project containing a very large (1.5GB) test file as an embedded resource. I have 16GB RAM in my machine with 8GB free when this occurred, so RAM was not the issue.
It is possible that we are hitting the 2 GB limit that the CLR has on any single object. Without delving into what MSBuild is doing under the hood, I can only speculate that during compile time, the embedded resource is loaded into an object graph that is hitting this limit.
The error message is very unhelpful. My first thought when I saw it was, "Have I run out of disk space?"
Solution
It is a file validation test project. One of the requirements is to be able to handle files of this size, so on face value my team thought it reasonable to embed it for use in test cases.
We fixed the error by moving the file onto the network (the same way it would be accessed by the validator in production) and marking the test as an integration test instead of a unit test. After all, aren't unit tests supposed to be fast-running?
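For illustration, a minimal sketch of the reworked test (the share path, validator type, and assertion are hypothetical; the point is that the 1.5 GB file is opened from disk at run time instead of being compiled into the assembly):
using System.IO;
using Microsoft.VisualStudio.TestTools.UnitTesting;
[TestClass]
public class LargeFileValidationTests
{
    // Hypothetical network share, mirroring how the validator reads files in production.
    const string LargeTestFilePath = @"\\fileserver\testdata\large-input.bin";
    [TestMethod]
    [TestCategory("Integration")] // keeps it out of the fast unit-test run
    public void Validator_HandlesVeryLargeFile()
    {
        using (var stream = File.OpenRead(LargeTestFilePath))
        {
            var result = new FileValidator().Validate(stream); // FileValidator is illustrative
            Assert.IsTrue(result.IsValid);
        }
    }
}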

Cleaning and rebuilding the solution worked for me.

For Visual Studio, you can try to do the following:
Close all Visual Studio instances.
Open the Visual Studio Developer Command Prompt in Administrator mode.
Navigate to:
C:\Program Files (x86)\Microsoft Visual Studio\2017\Professional\Common7\IDE
Type the following:
editbin /LARGEADDRESSAWARE devenv.exe
It's also worth restarting your PC.
Hope this helps.

In my case, I had very little space left on my C drive. I cleared a few items from it and tried again, and it worked.

I might be late to answer, but for future reference you might want to check the Windows dump file settings (and probably set them to none).
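If you'd rather script that than click through System Properties → Advanced → Startup and Recovery, something like the following should work (an untested sketch; run elevated, 0 = no dump file):
wmic recoveros set DebugInfoType = 0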

In my case the server I was executing the code on couldn't handle my parallelized code.
Normally I'm running a setup like the following:
new ParallelOptions { MaxDegreeOfParallelism = Math.Max(1, Environment.ProcessorCount / 2) }
Introducing a variable so that the number of cores used could be locked down to 1 (resulting in code like the following) resolved this issue for me.
new ParallelOptions { MaxDegreeOfParallelism = 1 }
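A slightly fuller sketch of that pattern (the config key name is an assumption) reads the degree of parallelism from app settings, so it can be dialled down to 1 on a constrained server without recompiling:
using System;
using System.Configuration; // requires a reference to System.Configuration
using System.Threading.Tasks;
static class ParallelRunner
{
    public static void Process(int[] items)
    {
        // Fall back to half the cores (never below 1) unless config overrides it.
        int maxDop;
        if (!int.TryParse(ConfigurationManager.AppSettings["MaxDegreeOfParallelism"], out maxDop))
            maxDop = Math.Max(1, Environment.ProcessorCount / 2);
        var options = new ParallelOptions { MaxDegreeOfParallelism = maxDop };
        Parallel.ForEach(items, options, item =>
        {
            // ... per-item work ...
        });
    }
}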

The key for me:
We had embedded a huge database template (testing had filled it with lots of data) into the application. I have not seen this issue arise since removing the Embedded Resource build action and moving the database to a resource folder, so that the project builds properly.

I fixed this problem by deleting or excluding (disabling) the large *.rpt files, and I've optimized my reports!

I am late to answer, but this may be useful for others.
In my case, just restarting Visual Studio fixed the problem.

Related

Memory consumption of Windows application

I hope the question is not too vague and someone can shed some light on my problem.
I created a Windows application (makefile) with the Chromium project (I already asked about this problem in the Chromium forum) and Visual Studio 2019.
The application starts some processes, each of which used around 20 KB of memory, but strangely the same application uses over 200 KB per process on some PCs with the same Windows version.
(Memory usage after starting the application, nothing else done)
I fought for a couple of days with the compiler/linker options with no success. Still huge memory usage.
The Chromium examples did not show this problem when built with my makefile, which made me even crazier.
In the end I changed the name of the exe file, from app.exe to just app1.exe, and... problem gone: normal memory usage on all the PCs which had shown this problem.
I changed the name in the makefile to generate the same executable with a different name, and I also renamed the original, problematic exe file in Windows Explorer, with the same positive result.
When I renamed the good app1.exe back to app.exe in Windows Explorer, the problem appeared again...
I am now searching for some kind of Windows configuration or program which could cause this problem, but no luck so far. Windows Firewall is already disabled.
Or could this be some kind of virus?
Problem solved... thanks to the Process Hacker tool I found out that the library "verifier.dll" was loaded in the bad case.
This DLL is part of the Windows Application Verifier tool. I do not remember ever using this tool before; it is not even activated.
I then found out that there is a registry entry:
Computer\HKEY_LOCAL_MACHINE\SOFTWARE\WOW6432Node\Microsoft\Windows NT\CurrentVersion\Image File Execution Options\
where I could find the name of my application. After renaming this entry everything works as expected.
I will now investigate what the variables inside this entry mean:
GlobalFlag = 0x02000000 and PageHeapFlags = 0x2
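For what it's worth, GlobalFlag 0x02000000 is FLG_HEAP_PAGE_ALLOCS, the page-heap flag that Application Verifier / GFlags sets under Image File Execution Options, so instead of renaming the key it should also be possible to clear it with the gflags tool from the Debugging Tools for Windows, along the lines of:
gflags /p /disable app.exe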

Visual Studio 100% disk usage

I have VS 2013 and Microsoft Windows 8.1
The issue appeared at the end of last week. Without any update or significant change, when I do certain things in VS, disk usage reaches 100%. For example, when I click the "Check In" button in the "Team Explorer" window, disk usage rises to 100%. Sometimes this problem happens on a simple right-click in the text editor.
I googled the 100% disk usage problem; there is some material about it for Windows 8.1, but on my computer all other applications run without any problem. Only VS 2013 has the "full disk usage" problem.
Some information about my system:
OS Name: Microsoft Windows 8.1 Pro
OS Version: 6.3.9600 N/A Build 9600
System Type: x64-based PC
Processor(s): 1 Processor(s) Installed.
Intel64 Family 6 Model 60 Stepping 3 GenuineIntel ~3500 MHz
Total Physical Memory: 8,131 MB
Available Physical Memory: 3,836 MB
Virtual Memory: Max Size: 10,947 MB
Virtual Memory: Available: 5,275 MB
Virtual Memory: In Use: 5,672 MB
Page File Location(s): C:\pagefile.sys
(Comment for others landing here, as @Marta explains that the problem no longer persists on their machine.)
In general, any performance issue in Visual Studio should be reported to Microsoft. It's easy to do this directly from VS using the Report a Problem tool. That feature will automatically attach logs/traces which are shared privately with Microsoft. Internally, tooling will analyse those attachments to assign a ticket to the relevant team. With such attachments, there is a high likelihood that the problem can be diagnosed and fixed in a future release of Visual Studio.
Instructions on the Report a Problem tool:
https://learn.microsoft.com/en-us/visualstudio/ide/how-to-report-a-problem-with-visual-studio?view=vs-2019
If you prefer to diagnose high disk IO yourself, FileMon can be a useful tool:
https://learn.microsoft.com/en-us/sysinternals/downloads/filemon
I am using VS Code 1.71.2 as well as Visual Studio 2022 (Community Edition) on Windows 10, and I am facing the same issue.
After a lot of checking, I found that disabling Superfetch mitigates this issue. But then Windows and application startup take a lot longer.
As a workaround, I found that clearing the %temp% folder after using Visual Studio or VS Code eliminates the issue and disk activity returns to normal.
But I may not always remember this cleanup, and I hate forgetting it :(
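One way to avoid forgetting: a tiny batch file (an untested sketch) that clears the temp folder, run after closing Visual Studio or from a scheduled task:
del /q /f /s "%TEMP%\*"
for /d %%d in ("%TEMP%\*") do rd /s /q "%%d"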
Hope this helps someone in a similar situation.
It could be related to Visual Studio updates - which would show under C:\ProgramData\Package Cache.
A disk space management tool like TreeSize Pro will help you figure it out; it will show which directories are using the most space. You can then target whatever aspect of Visual Studio is eating up your drive space.
There is a free trial at https://www.jam-software.com/treesize/
You can also use this tool to export a screenshot / report of the usage and post it here; it may help identify what is going on.
I had a similar issue that turned out to be the built-in git provider having issues with large codebases containing a moderate-to-large amount of changes before a commit.
Changing to a third-party one fixed the issue.
The operating system manages the resources (CPU cores, disk drives, GPU) to deliver what you have asked of it.
Ideally (what the OS designers are hoping for), when you perform an action, all the resources spin up into action and due to a well balanced system, they all go to 100% utilization, for a brief length of time, then go back to idle.
This form of utilization is, in practice, impossible to achieve, as the PC builders would have to know in advance what your system is going to be used for.
When the task manager describes the CPU as 100% utilized, it means that all the cores on the box, are busy running code, and are the bottleneck.
When the task manager describes the disk as 100% utilized, it (as far as I can tell), means that there is always a queue of items to be read, or written to/from the disk. Even with 100% utilization, it may be that the metric is the only reason you are concerned, and the system is otherwise responsive.
In either of these cases, it shows that for a given workload, the CPU or the disk drive has become the rate determining step.
In practice, it should not matter, unless the length of time the system is at 100% is longer than a few minutes, or that your machine feels otherwise sluggish.
Further diagnosis can be performed using the Sysinternals tool Procmon, or the Microsoft ADK.
Using Procmon, I would look at which files are being accessed during the 100% disk usage period, and decide whether:
The behaviour is sensible (if not, raise a bug with Microsoft)
The machine is working usably (if not, consider a hybrid or SSD disk)
I've had some exasperating problems with disk usage and source control explorer.
What fixed the issue for me was making sure that I never opened Source Control Explorer in more than one project at a time, keeping it closed when I could, and limiting the number of VS instances I had open.
An SSD may solve this... Are you sure this is caused by Visual Studio? When I was using Windows 8.1, Windows Defender hit 100% disk usage from time to time. If you're sure it occurs when you use Visual Studio, you can try to repair it using the installer. Hope this helps.
Try moving the source code to an SSD.
HDDs have much slower disk I/O performance than SSDs.
Generally, on Windows, the C drive is an SSD.

What are the implications of making Visual Studio 2010 able to use more than 2GB of RAM?

Alright, I found this guide and a few others on the internet which suggest running the following command from the VS 2010 IDE directory using the Visual Studio Command Prompt:
editbin /largeaddressaware devenv.exe
I've run this, and everything so far seems to work fine (I haven't run into any issues yet). But what I can't find information on is what negative implications, if any, there are by making Visual Studio 2010 use more than 2GB of RAM? Visual Studio was built to use a max of 2 GB of RAM. If VS was meant to use more than 2 GB of RAM, then I wouldn't have to hack the binary lol. While I love flying by the seat of my pants and trying new things without preparing for the worst (it's all I'm good at, haha), I'd at least like to know what issues I should be prepared to deal with should something go wrong.
TL;DR;: What negative implications are there, if any, by using the "editbin" command above to make Visual Studio 2010 aware of memory addresses greater than 2 GB?
The negative implication of enabling largeaddressaware is that the application could crash or corrupt memory in strange ways. The program was written assuming that no pointer value it had to deal with would be > 2 GB, and that assumption can be broken in subtle ways. The canonical example is probably calculating the midpoint address between two pointers.
ptrMid = (ptr1 + ptr2) / 2;
That will work great if all of your pointers are < 2GB, but if they aren't you will get an incorrect result due to overflow.
ptrMid = (0x80000000 + 0x80000004) / 2 = 0x00000002, not 0x80000002, because the 32-bit sum wraps around to 0x00000004 before the division.
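The same wraparound is easy to reproduce with 32-bit unsigned arithmetic, for example in C# (a minimal sketch; real code would hit this with raw pointers):
uint ptr1 = 0x80000000;
uint ptr2 = 0x80000004;
uint naive = (ptr1 + ptr2) / 2;        // the sum wraps to 0x00000004, so this yields 0x00000002
uint safe = ptr1 + (ptr2 - ptr1) / 2;  // 0x80000002, immune to the overflow
System.Console.WriteLine("{0:X8} vs {1:X8}", naive, safe); // prints 00000002 vs 80000002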
And not only do you have to worry about Visual Studio not being able to handle pointers > 2GB, any add-in would be affected by this as well.
For some more things that have to be checked before enabling largeaddressaware, see this question: What to do to make application Large Address Aware?
You really should never use editbin to change largeaddressaware on an application you don't control.
After reading this discussion and checking the existing headers, it looks like VS2010 already has this capability applied, at least for my installation (64-bit Win7). If it was already compiled in, I don't think you need to worry about bad side effects.
This appears to be by design.
Recall that even when the /3GB switch is set, 32-bit programs receive only 2GB of address space unless they indicate their willingness to cope with addresses above 2GB by passing the /LARGEADDRESSAWARE flag. This flag means the same thing on 64-bit Windows. But since 64-bit Windows has a much larger address space available to it, it can afford to give the 32-bit Windows program the entire 4GB of address space to use. This is mentioned almost incidentally in Knowledge Base article Q889654 in the table "Comparison of memory and CPU limits in the 32-bit and 64-bit versions of Windows".
In other words, certain categories of 32-bit programs (namely, those tight on address space) benefit from running on a 64-bit Windows machine, even though they aren't explicitly taking advantage of any 64-bit features.
http://blogs.msdn.com/b/oldnewthing/archive/2005/06/01/423817.aspx
Editbin is a Microsoft utility, so they're basically claiming that it works.

Issue with Visual C++ 2010 (Express) External Tools command

I posted this on SuperUser...but I was hoping the pros here at SO might have a good idea about how to fix this as well....
Normally we develop in VS 2005 Pro, but I wanted to give VS 2010 a spin. We have custom build tools based off of GNU make tools that are called when creating an executable.
This is the error that I see whenever I call my external tool:
...\gnu\make.exe): *** couldn't commit memory for cygwin heap, Win32 error 487
The caveat is that it still works perfectly fine in VS2005, as well as being called straight from the command line. Also, my external tool is setup exactly the same as in VS 2005.
Is there some setting somewhere that could cause this error to be thrown?
From problem with heap, win32 error 487:
Each Cygwin app gets a special heap area to hold stuff which is inherited by child processes. E.g. all file descriptor structures are stored in that heap area (called the "cygheap"). The cygheap has room for at least 4000 file descriptor structures. But - that's the clue - it's a fixed size. The cygheap can't grow. Its size is reserved at the application's start and its blocks are committed on demand.
For some reason your server application needs all the cygheap space when running under the described conditions.
A possible solution might be found in Changing Cygwin's Maximum Memory:
Cygwin's heap is extensible. However, it does start out at a fixed size, and attempts to extend it may run into memory which has been previously allocated by Windows. In some cases, this problem can be solved by adding an entry in either the HKEY_LOCAL_MACHINE (to change the limit for all users) or HKEY_CURRENT_USER (for just the current user) section of the registry. Add the DWORD value heap_chunk_in_mb and set it to the desired memory limit in decimal MB. It is preferred to do this in Cygwin using the regtool program included in the Cygwin package. (For more information about regtool or the other Cygwin utilities, see the section called "Cygwin Utilities" or use the --help option of each util.) You should always be careful when using regtool since damaging your system registry can result in an unusable system. This example sets the memory limit to 1024 MB:
regtool -i set /HKLM/Software/Cygwin/heap_chunk_in_mb 1024
regtool -v list /HKLM/Software/Cygwin
Exit all running Cygwin processes and restart them. Memory can be allocated up to the size of the system swap space minus the size of any running processes. The system swap should be at least as large as the physically installed RAM and can be modified under the System category of the Control Panel.
It wouldn't hurt to ensure that the maximum size of your Windows swap file is large enough.
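If you'd rather set that value from a plain Windows command prompt than via regtool inside Cygwin, the equivalent reg.exe invocation (untested here; run elevated, and on 64-bit Windows mind the WOW6432Node redirection) should be along these lines:
reg add "HKLM\Software\Cygwin" /v heap_chunk_in_mb /t REG_DWORD /d 1024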
To summarize: the environment doesn't allocate enough heap space for the Cygwin executables. For some reason the problem is more acute with VS2010 Express. You need to either fix the environment, use a Linux port other than Cygwin, or use Microsoft utilities.
From the cygwin email lists it looks like other people have run into similar situations, even when not running via Visual Studio, to which they've found that the solution is often to play with Cygwin's maximum memory settings:
http://www.cygwin.com/cygwin-ug-net/setup-maxmem.html
(note: it's worth reading this conversation, from above, about some values that did and didn't work).
Others have also reported issues with Anti-Virus software (recommendation is to unload from memory for some reason), and possibly also compatibility settings (try with it set to XP) which can affect cygwin in certain cases. See: http://www.avrfreaks.net/index.php?name=PNphpBB2&file=viewtopic&p=377066
As for Visual Studio: Are you on a 64bit machine and if so are you usually running the tool in a 64bit environment?
I've found that because Visual Studio 2010 runs in 32bit, tools launched from it are launched as 32bit processes (for a good illustration of this, add "cmd" as a tool). I'm not sure why this wouldn't be affected on 2005 (unless 2005 lets the system launch the process (64bit) and 2010 handles it itself (32bit)).

Visual Studio 2005 Memory Usage

I find that quite often Visual Studio memory usage will average ~150-300 MB of RAM.
As a developer who very often needs to run with multiple instances of Visual Studio open, are there any performance tricks to optimize the amount of memory that VS uses?
I am running VS 2005 with one add-in (TFS)
From this blog post:
[...]
These changes are all available from the Options dialog (Tools –> Options):
Environment
General:
Disable “Animate environment tools”
Documents:
Disable “Detect when file is changed outside the environment”
Keyboard:
Remove the F1 key from the Help.F1Help command
Help\Online:
Set “When loading Help content” to “Try local first, then online” or “Try local only, not online”
Startup:
Change the “At startup” option to “Show empty environment”
Projects and Solutions
General:
Disable “Track Active Item in Solution Explorer”
Text Editor
General (for each language you want):
Disable “Navigation bar” (this is the toolbar that shows the objects and procedures drop down lists allowing you to choose a particular object in your code.
Disable “Track changes”
Windows Forms Designer
General:
Set “AutotoolboxPopulate” to false.
Set “EnableRefactoringOnRename” to false.
Upgrade to a 64-bit OS. My instances of VS were taking ~700 MB each (very large solutions), and you rapidly run out of room with that.
Everyone on my team that has switched to 64-bit (and 8GB RAM) has wondered why they didn't do it sooner.
Minimize and re-maximize the main VS window to get VS to release memory.
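That trick works because Windows trims a process's working set when its main window is minimized. The same effect can be requested programmatically; a minimal illustrative sketch using the documented SetProcessWorkingSetSize API:
using System;
using System.Diagnostics;
using System.Runtime.InteropServices;
static class WorkingSetTrimmer
{
    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool SetProcessWorkingSetSize(IntPtr process, IntPtr min, IntPtr max);
    public static void Trim()
    {
        // Passing -1 for both sizes asks Windows to trim the working set,
        // which is essentially what happens when a window is minimized.
        SetProcessWorkingSetSize(Process.GetCurrentProcess().Handle,
                                 (IntPtr)(-1), (IntPtr)(-1));
    }
}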
By uninstalling (and re-installing) Visual Assist the problem got solved for me.
The number 1 thing you can do is switch to Windows 8.
It uses memory sharing / combining if the same DLL or memory page is loaded into multiple processes. Obviously there's a lot of overlap when running two instances of VS.
In my case, with 4 Visual Studio instances running, the shared memory column in Task Manager (you need to enable this column for it to be visible) shows how much memory is being shared.
So on Windows 7 this would use 2454 MB, but here I'm saving 600+ MB that are shared with the other devenv processes.
Chrome also has a lot of savings (because each browser tab is a new process). So overall I've still got 2 GB free where I'd normally be maxed out.
