Win32 Error 8 during ccppc compile on Windows 7 x86 - gcc

I'm trying to compile a number of C source files with ccppc (version 3.3-e500). Most of the time, the compilation fails with the following error:
0 [main] cc1 {PID} sigproc_init: cannot create wait_sig thread, Win32 Error 8
I've looked up this Win32 error, and it corresponds to ERROR_NOT_ENOUGH_MEMORY. I'm quite certain it's not a physical-memory issue: the machine has 4GB, with no more than 1.5GB in use at compile time. Unfortunately, I don't have local admin on my machine, and therefore cannot investigate issues with the pagefile.
After several failures, the source will eventually compile, and once everything is compiled there are no apparent issues with the resulting .o files. However, this issue turns a 10-minute build into an hour-plus build.
This issue is not present on XP x86 machines; I have not investigated x64 for either OS.
Has anyone run into an issue like this with ccppc or other gcc binaries on Windows 7? Any guidance in finding a solution would be awesome.

On moving from compiling under Windows XP 32-bit to Windows 7 64-bit, we found that the cc1.exe program from gcc 2.96 for PowerPC was failing with the same error on some compiles. The investigation, using the Visual Studio debugger attached to cc1.exe and monitoring with Sysinternals tools, showed the following:
When cc1.exe succeeded, it spawned 3 threads.
When cc1.exe failed, the 3rd CreateThread() call failed with ERROR_NOT_ENOUGH_MEMORY.
The image header of cc1.exe, as reported by the Visual Studio dumpbin /HEADERS command, showed a stack reserve size of 400000000 bytes; i.e., each thread created requires 400000000 bytes of contiguous virtual address space for its stack.
When the CreateThread() call failed with ERROR_NOT_ENOUGH_MEMORY, the Sysinternals VMMap tool showed that the largest free virtual memory region was smaller than the requested stack reserve size of 400000000 bytes.
cc1.exe is a 32-bit process, which has 2GB of virtual address space. Windows 7 has ASLR (address space layout randomization), which randomizes the load addresses of DLLs. I think the ASLR is effectively fragmenting the virtual address space, such that on some invocations of cc1.exe the addresses used by the loaded DLLs don't leave a free region large enough for the thread stacks. In addition to ASLR, Windows 7 loaded roughly four times as many DLLs into the cc1.exe process as Windows XP did, which could be contributing to the problem.
Based on the above investigation, the Visual Studio editbin /STACK:67108864 command was used to reduce the stack reserve size of cc1.exe to 64MB; 64MB was chosen because that is the stack reserve size of the cc1plus.exe program used for C++ code. Since reducing the stack reserve size, the ERROR_NOT_ENOUGH_MEMORY errors have not been seen again.
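For reference, both the check and the fix can be run from a Visual Studio command prompt; a minimal sketch (the path to cc1.exe is assumed):
rem Inspect the current stack reserve size (listed under OPTIONAL HEADER VALUES):
dumpbin /HEADERS cc1.exe | findstr /i "stack"
rem Reduce the stack reserve to 64MB:
editbin /STACK:67108864 cc1.exe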

So, I never really found a solution, but I did manage to come up with a viable workaround.
Since the compile will eventually work after multiple re-makes, you can wrap your make command in a loop that checks the ERRORLEVEL variable and repeats until it indicates success (see the sketch below).
This obviously doesn't solve the "takes forever to compile" problem, but your compile will eventually succeed.
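A minimal batch sketch of that retry loop (assuming make is on the PATH and returns a nonzero exit code on failure):
rem Re-run make until it exits with ERRORLEVEL 0.
:retry
make
if errorlevel 1 goto retry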

Related

Visual Studio Express: fatal error c1060, the compiler is out of heap space

I'm trying to build a program from its source code with VC 11. When the compiler is about to finish, it raises the error mentioned in the title of this post.
As I've read here and in other forums, I tried both closing as many programs as possible and enlarging the Windows swap file... neither worked.
I've read about a parameter called /Zm but I don't understand how to use it.
Can you please help me?
Take a look at this documentation which gives possible solutions:
I also had that problem and found the documentation useful. Main points:
If the compiler also issues errors C1076 and C3859, use the /Zm compiler option to lower the memory allocation limit (see the example after this list). More heap space is available to your application if you lower the remaining memory allocation.
If the /Zm option is already set, try removing it. Heap space might be exhausted because the memory allocation limit specified in the option is too high. The compiler uses a default limit if you remove the /Zm option.
If you are compiling on a 64-bit platform, use the 64-bit compiler toolset. For information, see How to: Enable a 64-Bit Visual C++ Toolset on the Command Line.
On 32-bit Windows, try using the /3GB boot.ini switch.
Increase the size of the Windows swap-file.
Close other running programs.
Eliminate unnecessary include files.
Eliminate unnecessary global variables, for example, by allocating memory dynamically instead of declaring a large array.
Eliminate unused declarations.
Split the current file into smaller files.
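As an illustration of the first point, /Zm takes a percentage of the compiler's default memory allocation limit, so lowering it looks like this (the value 50 and the file name are just examples):
rem Lower the compiler's memory allocation limit to 50% of the default:
cl /Zm50 /c myfile.cpp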
I can't tell much about the /Zm parameter, but I had the same issue (compiler is out of heap space).
What helped me was the /m:4 parameter (4 being the number of your CPUs), so that the build uses multiple CPUs.
Hope that helps you as well.
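For reference, the switch is passed on the msbuild command line like this (the solution name is a placeholder):
msbuild MySolution.sln /m:4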
Also, if you are running on x64, be sure that the x64 versions of msbuild.exe and cl.exe are being used. I had the issue that even when using e.g. the x64 MS PowerShell, the compiler would still choose the 32-bit version of msbuild.exe (shown in Task Manager as "msbuild.exe *32" on Windows 7).
In addition to the other answers here (and in my case), fatal error C1060: compiler is out of heap space can be caused by a syntax error. The following code (in certain circumstances) can cause this error even with correct compiler options -- for instance if you've previously successfully compiled the same program.
r.push_back(e[1];
instead of
r.push_back(e[1]);
It seems to only cause this error rather than the standard error C2143: syntax error: missing ')' before ';' when r and e are of certain types, but it's worth checking any code you've edited recently if the program previously compiled without errors.
We had a similar problem: a relatively simple program (albeit full of templates, using the Eigen library) persistently failed to compile on one of the computers. All were using MSVC2013 x64, but only one was unable to compile the program due to the C1060 error. We tried different compiler flags, setting/unsetting -Zm, but failed to resolve it without modifying code.
We did get some pointers, however, when we switched from the x64/x64 version of the compiler (64-bit compiler producing a 64-bit executable) to the x86/x86 version (32-bit compiler producing a 32-bit executable). The x86 compiler gave us the exact locations of the problematic parts of the program: calls to template functions taking heavily templated objects. We rewrote those as normal functions (built in a different object file), and that solved the problem...
VS: Visual Studio 2015
OS: Windows 10
If you are using VS2015 as your IDE, maybe there is another solution:
Install the VS2015 Update 3 package and everything should work smoothly.
In my case, a main program would not compile in VS 2022 Community Edition (free). It had many include files. By process of elimination, I managed to compile it once I removed the "volatile" modifiers from declarations that had them.
A very strange bug, to say the least!
I got this error when compiling OnnxRuntime with MS Visual C++ 17 2022.
The solution for this issue was to close all other programs and compile using a single thread (in this case, removing the --parallel argument from the build.bat call).
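In other words, the build invocation changed roughly like this (the --config value is just an example; flags other than --parallel are unchanged):
rem before: parallel build could exhaust compiler heap
build.bat --config Release --parallel
rem after: single-threaded build
build.bat --config Release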

What are the implications of making Visual Studio 2010 able to use more than 2GB of RAM?

Alright, I found this guide and a few others on the internet which suggest running the following command from the VS 2010 IDE directory using the Visual Studio Command Prompt:
editbin /largeaddressaware devenv.exe
I've run this, and everything so far seems to work fine (I haven't run into any issues yet). But what I can't find information on is what negative implications, if any, there are by making Visual Studio 2010 use more than 2GB of RAM? Visual Studio was built to use a max of 2 GB of RAM. If VS was meant to use more than 2 GB of RAM, then I wouldn't have to hack the binary lol. While I love flying by the seat of my pants and trying new things without preparing for the worst (it's all I'm good at, haha), I'd at least like to know what issues I should be prepared to deal with should something go wrong.
TL;DR;: What negative implications are there, if any, by using the "editbin" command above to make Visual Studio 2010 aware of memory addresses greater than 2 GB?
The negative implication of enabling largeaddressaware is that the application could crash or corrupt memory in strange ways. The program was written assuming that no pointer value it had to deal with would be > 2GB, and that assumption can be broken in subtle ways. The canonical example is probably calculating the midpoint address between two pointers.
ptrMid = (ptr1 + ptr2) / 2;
That will work great if all of your pointers are < 2GB, but if they aren't, you will get an incorrect result due to overflow.
ptrMid = (0x80000000 + 0x80000004) / 2 = 0x00000002, not 0x80000002
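For illustration, here is a C++ sketch of the safe formulation (a hypothetical midpoint helper, not from the original answer):
#include <cstdint>

// Midpoint of two pointers into the same buffer, assuming ptr1 <= ptr2.
char* midpoint(char* ptr1, char* ptr2) {
    // Overflow-prone on 32-bit: (uintptr_t)ptr1 + (uintptr_t)ptr2 can wrap
    // past 2^32 when both addresses are above 2GB, as in the example above.
    // Safe: the difference always fits in ptrdiff_t, so offset from ptr1.
    return ptr1 + (ptr2 - ptr1) / 2;
}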
And not only do you have to worry about Visual Studio not being able to handle pointers > 2GB; any add-in would be affected by this as well.
For some more things that have to be checked before enabling largeaddressaware, see this question: What to do to make application Large Address Aware?
You really should never use editbin to change largeaddressaware on an application you don't control.
After reading this discussion and checking the existing headers, it looks like VS2010 already has this capability applied, at least for my installation anyway (64-bit Windows 7). If it was already compiled in, I don't think you need to worry about bad side effects.
This appears to be by design.
Recall that even when the /3GB switch is set, 32-bit programs receive only 2GB of address space unless they indicate their willingness to cope with addresses above 2GB by passing the /LARGEADDRESSAWARE flag.
This flag means the same thing on 64-bit Windows. But since 64-bit Windows has a much larger address space available to it, it can afford to give the 32-bit Windows program the entire 4GB of address space to use. This is mentioned almost incidentally in Knowledge Base article Q889654 in the table "Comparison of memory and CPU limits in the 32-bit and 64-bit versions of Windows".
In other words, certain categories of 32-bit programs (namely, those tight on address space) benefit from running on a 64-bit Windows machine, even though they aren't explicitly taking advantage of any 64-bit features.
http://blogs.msdn.com/b/oldnewthing/archive/2005/06/01/423817.aspx
Editbin is a Microsoft utility, so they're basically claiming that it works.

Memory Error in Visual Studio, but plenty of memory available

This line of code produces the following error:
rs[se_idx][ev_idx][re_idx].trs = new re_class[report_size];
std::bad_alloc at memory location 0x0037c29c
I think this is related to 'not enough memory'. When I decrease the amount being allocated, it runs fine.
I have plenty of memory (16 GB) in the machine, and Resource Monitor shows only a tiny fraction of it being used by Visual Studio. I added the compiler options /F 4000000000 and /LARGEADDRESSAWARE, but I still get the error.
How can this be solved?
Are you sure your operating system can take advantage of the entire 16GB, and that you're using a 64-bit version of VC++?
http://msdn.microsoft.com/en-us/library/h2k70f3s%28v=vs.90%29.aspx
http://msdn.microsoft.com/en-us/library/9yb4317s%28v=vs.90%29.aspx
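A quick sanity check for the second point is to print the pointer size of the build (a trivial standalone C++ snippet, not specific to this program):
#include <iostream>

int main() {
    // Prints 4 for a 32-bit build (2GB default address space), 8 for 64-bit.
    std::cout << "pointer size: " << sizeof(void*) << " bytes\n";
}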

Microsoft's ASLR is weird

I watched an ASLR'd DLL image's base address in a 32-bit process.
It's not full randomization; the base just flips between two values with 1/2 probability each.
For example, once I load a DLL, the image is loaded at 0x12345678.
When I load the image again, it is loaded at 0x23456789. (The base address changed!)
But when I keep loading the image, the base address just alternates:
0x12345678
0x23456789
0x12345678
0x23456789
...
Why did they implement it like this?
Is it for crash-report frequency? (To get the same crash addresses from re-deployed DLLs.)
This is by design. Normally, Windows selects a preferred base address for an ASLR DLL when the DLL is first loaded, and then it keeps using that address until the system is rebooted. That way the DLL will be mapped at the same address in every process that loads it, allowing code pages to be shared.
However, if a DLL has been unloaded from every process, the system may sometimes select a different base address the next time the DLL is loaded. It does this to reduce virtual address space fragmentation, not for security reasons. This is what seems to be happening in your case.
It's documented as being loaded at 1 of 256 possible starting addresses.
But I didn't think it even applied to process images, only to shared DLLs.
ASLR is not on by default for process images; it's an opt-in thing, for compatibility. (3)
Address Space Layout Randomization (ASLR)
ASLR moves executable images into random locations when a system boots, making it harder for exploit code to operate predictably. For a component to support ASLR, all components that it loads must also support ASLR. For example, if A.exe consumes B.dll and C.dll, all three must support ASLR. By default, Windows Vista and later will randomize system DLLs and EXEs, but DLLs and EXEs created by ISVs must opt in to support ASLR using the /DYNAMICBASE linker option.
ASLR also randomizes heap and stack memory:
When an application creates a heap in Windows Vista and later, the heap manager will create that heap at a random location to help reduce the chance that an attempt to exploit a heap-based buffer overrun succeeds. Heap randomization is enabled by default for all applications running on Windows Vista and later.
When a thread starts in a process linked with /DYNAMICBASE, Windows Vista and later moves the thread's stack to a random location to help reduce the chance that a stack-based buffer overrun exploit will succeed.
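For ISV code, opting in is just the linker switch named above; a minimal sketch (app.cpp is a placeholder source file):
rem Compile and link with ASLR opted in:
cl app.cpp /link /DYNAMICBASE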
I installed the new Win8 RC x64 yesterday.
Watch out!
Kernel32.dll (the 64-bit version) has a different base address in different processes (within a single session, of course). Only the ntdll.dll base address remains constant. I had to change my code; you can no longer rely on LoadLibrary returning a permanent address.
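A small standalone check that shows this (run it twice and compare the printed addresses; it uses only documented Win32 calls):
#include <windows.h>
#include <cstdio>

int main() {
    // GetModuleHandleA returns the base address the DLL is mapped at in
    // the current process; ntdll and kernel32 are always already loaded.
    std::printf("ntdll.dll    base: %p\n", (void*)GetModuleHandleA("ntdll.dll"));
    std::printf("kernel32.dll base: %p\n", (void*)GetModuleHandleA("kernel32.dll"));
    return 0;
}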

Issue with Visual C++ 2010 (Express) External Tools command

I posted this on SuperUser...but I was hoping the pros here at SO might have a good idea about how to fix this as well....
Normally we develop in VS 2005 Pro, but I wanted to give VS 2010 a spin. We have custom build tools based off of GNU make tools that are called when creating an executable.
This is the error that I see whenever I call my external tool:
...\gnu\make.exe): *** couldn't commit memory for cygwin heap, Win32 error 487
The caveat is that it still works perfectly fine in VS2005, as well as when called straight from the command line. Also, my external tool is set up exactly the same as in VS 2005.
Is there some setting somewhere that could cause this error to be thrown?
From "problem with heap, win32 error 487":
Each Cygwin app gets a special heap area to hold stuff which is inherited by child processes. E.g. all file descriptor structures are stored in that heap area (called the "cygheap"). The cygheap has room for at least 4000 file descriptor structures. But - that's the clue - it's fixed size. The cygheap can't grow. Its size is reserved at the application's start and its blocks are committed on demand.
For some reason your server application needs all the cygheap space when running under the described conditions.
A possible solution might be found in Changing Cygwin's Maximum Memory:
Cygwin's heap is extensible. However, it does start out at a fixed size, and attempts to extend it may run into memory which has been previously allocated by Windows. In some cases, this problem can be solved by adding an entry to either the HKEY_LOCAL_MACHINE (to change the limit for all users) or HKEY_CURRENT_USER (for just the current user) section of the registry. Add the DWORD value heap_chunk_in_mb and set it to the desired memory limit in decimal MB. It is preferred to do this in Cygwin using the regtool program included in the Cygwin package. (For more information about regtool or the other Cygwin utilities, see the section called "Cygwin Utilities" or use the --help option of each util.) You should always be careful when using regtool, since damaging your system registry can result in an unusable system. This example sets the memory limit to 1024 MB:
regtool -i set /HKLM/Software/Cygwin/heap_chunk_in_mb 1024
regtool -v list /HKLM/Software/Cygwin
Exit all running Cygwin processes and restart them. Memory can be allocated up to the size of the system swap space minus the size of any running processes. The system swap should be at least as large as the physically installed RAM and can be modified under the System category of the Control Panel.
It wouldn't hurt to ensure that the maximum size of your windows swap file is large enough.
To summarize: the environment doesn't allocate enough heap space for the Cygwin executables. For some reason the problem is more acute with VS2010 Express. You need to either fix the environment, use a non-Cygwin port of the Linux tools, or use Microsoft utilities.
From the Cygwin mailing lists, it looks like other people have run into similar situations even when not running via Visual Studio, and have found that the solution is often to adjust Cygwin's maximum memory settings:
http://www.cygwin.com/cygwin-ug-net/setup-maxmem.html
(note: it's worth reading this conversation, from above, about some values that did and didn't work).
Others have also reported issues with anti-virus software (the recommendation is to unload it from memory, for some reason), and possibly also with compatibility settings (try setting them to XP), which can affect Cygwin in certain cases. See: http://www.avrfreaks.net/index.php?name=PNphpBB2&file=viewtopic&p=377066
As for Visual Studio: Are you on a 64bit machine and if so are you usually running the tool in a 64bit environment?
I've found that because Visual Studio 2010 runs as a 32-bit process, tools launched from it are launched as 32-bit processes (for a good illustration of this, add "cmd" as a tool). I'm not sure why this doesn't affect 2005 (unless 2005 lets the system launch the process, as 64-bit, while 2010 launches it itself, as 32-bit).
