How do I turn off the fault tolerant heap? - windows-7

I've recently started seeing this line in my Visual Studio 2005 output window when launching my application:
FTH: (7156): *** Fault tolerant heap shim applied to current process. This is usually due to previous crashes. ***
I've tried turning off the fault tolerant heap using the instructions here:
http://msdn.microsoft.com/en-us/library/dd744764(VS.85).aspx
I'm running Windows 7 64-bit edition, so I have made the changes to both the 32-bit and 64-bit registries, and run the "Rundll32.exe fthsvc.dll,FthSysprepSpecialize" command using both the 32-bit and 64-bit versions of Rundll32.exe.
However, after rebooting I am still getting the fault tolerant heap when trying to debug my application!
This is a real problem since it masks the bug I am trying to reproduce, and it also kills performance.
Does anyone have any other suggestions how to disable the fault tolerant heap?

To disable it for a single application
Go to the HKEY_LOCAL_MACHINE and HKEY_CURRENT_USER versions of
Software\Microsoft\Windows NT\CurrentVersion\AppCompatFlags\Layers\your_application.exe
and delete the FaultTolerantHeap entry.
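A hedged sketch of the same edit from an elevated command prompt, assuming the per-application entry sits exactly where the quote above describes it (your_application.exe is a placeholder, and it is worth confirming the layout in regedit first):

rem Repeat for the HKCU hive as well, as described above
reg delete "HKLM\Software\Microsoft\Windows NT\CurrentVersion\AppCompatFlags\Layers\your_application.exe" /v FaultTolerantHeap /f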

Set this registry value to 0:
HKEY_LOCAL_MACHINE\Software\Microsoft\FTH\Enabled
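For example, from an elevated command prompt (a minimal sketch; on 64-bit Windows the question above notes the change is needed in both the 32-bit and 64-bit registries):

reg add "HKLM\Software\Microsoft\FTH" /v Enabled /t REG_DWORD /d 0 /f

A reboot may still be needed before the change takes effect.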

You can add the name of your executable to the ExclusionList.
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\FTH\ExclusionList
Works for me.
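As an illustration, a hedged reg.exe sketch (note that /d rewrites the whole ExclusionList value, so query it first and include any existing entries; the executable names are placeholders):

reg query "HKLM\SOFTWARE\Microsoft\FTH" /v ExclusionList
reg add "HKLM\SOFTWARE\Microsoft\FTH" /v ExclusionList /t REG_MULTI_SZ /d "MyApp.exe\0AnotherApp.exe" /f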

You can edit the application manifest to exclude your program from PCA.
See also: How to reset Program Compatibility Assistant for testing

You can clear the list of applications tracked by FTH, without stopping the service, by following these steps:
Click the Start menu.
Right-click Computer and click Manage.
Click Event Viewer -> Applications and Services Logs -> Microsoft ->
Windows -> Fault-Tolerant-Heap.
View the FTH events.
You will find a log named Operational; right-click it and choose Clear Log.
Then you can run your program again and the warning message will disappear.
This worked for me without restarting the operating system.
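The same thing can be done from an elevated command prompt with wevtutil, assuming the Event Viewer path above corresponds to the channel name below (verify it first with wevtutil el):

wevtutil el | findstr /i fault
wevtutil cl Microsoft-Windows-Fault-Tolerant-Heap/Operational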

On Windows 10 the registry location is:
Computer\HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\FTH
You can remove your executable from the list in:
Computer\HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\FTH\State
or you can run this command from an elevated command prompt:
Rundll32.exe fthsvc.dll,FthSysprepSpecialize
You may need to reboot your machine.
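If you prefer the command line, a hedged sketch (this assumes the entries under FTH\State are values named after the flagged executable; list them first and copy the exact value name):

reg query "HKLM\SOFTWARE\Microsoft\FTH\State"
reg delete "HKLM\SOFTWARE\Microsoft\FTH\State" /v "C:\path\to\MyApp.exe" /f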

"Rundll32.exe fthsvc.dll,FthSysprepSpecialize" looks to only clear the list of currently flagged applications. if your application still causes oddities, the FTH should still step in and take over.
as already mentioned:
Set this registry value to 0: HKEY_LOCAL_MACHINE\Software\Microsoft\FTH\Enabled
this should disable FTH for the whole system.

I had to rename the file as well because the registry entries associated with this key were empty of applicable data. I expect that they populate if you have a misbehaving application. But in my case I was debugging my own application within Visual Studio. So in that case, it was my process that was somehow loading the FTH whether the FTH Service was running or not. And in fact I had no applications listed that were previously tagged as misbehaving.
But I had to follow these instructions:
http://billroper.livejournal.com/960825.html
because it wouldn't let me rename the file until I took ownership and made sure I had full control.

I had a similar issue when running a unit test using Microsoft::VisualStudio::CppUnitTestFramework.
Somehow I had violated some heap allocation, and the next time I tried to debug I received the message "Fault tolerant heap shim applied to current process. This is usually due to previous crashes." and the debug environment froze.
To get it working again, I had to remove the test case, recompile, add it back and recompile; then I could set a breakpoint and step into the test.

Also ran into this. Renaming/deleting AcXtrnal.dll inside Windows\AppPatch seems to work for me. I like how the Microsoft-recommended action (which I tried first) does nothing.
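If the rename is blocked by permissions (as the earlier answer about taking ownership suggests), a sketch from an elevated command prompt:

takeown /f C:\Windows\AppPatch\AcXtrnal.dll
icacls C:\Windows\AppPatch\AcXtrnal.dll /grant Administrators:F
ren C:\Windows\AppPatch\AcXtrnal.dll AcXtrnal.dll.bak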

Related

Alternative to GFLAG.exe or registry setting

To get a crash dump I used the registry setting below on a Windows 7 machine, and also tried gflags.exe.
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\Windows Error Reporting\LocalDumps]
"DumpFolder"=hex(2):[path goes here in hex value]
"DumpType"=dword:00000002
"DumpCount"=dword:0000000a
This works well in most cases and I am able to get the crash dump when my software crashes. But in one case, when I use my software in integration with another custom software2, I am not able to get a crash dump.
I did multiple tests and confirmed that whenever custom software2 is running along with the main software, the crash dumps are not generated. The registry settings are not helping, and we need to have custom software2 running along with the main software.
Is there any alternative way (other than the registry setting or GFLAGS.exe), or any tool, to generate crash dumps in this scenario?
I can't debug it because the issue is on the deployed machine.
Since none of the utilities are helping, I am using Task Manager to get the crash dump. When my application crashes, Windows displays the error window; at that point I generate the crash dump manually from Task Manager. For a 32-bit application, use the Task Manager from the SysWOW64 folder.

foxpro program freezes while it is running

I am using FoxPro apps under Windows 7. While compiling one of my programs it suddenly started freezing until I move the mouse or press a key, and this happens all the time while I am working with the program.
This happens when I move only the data to a mapped directory on the host. If my application, FoxPro and the data are all in the same directory on the virtual machine, there is no problem.
This happens when my data is not on the virtual machine.
Can it be a caching issue?
Change the registry:
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\WOW]
Save an export as a backup first, then change the value "DefaultSeparateVDM" to "yes".
If you are on 64-bit Windows, you need to create a file for 16-bit applications that uses the built-in start command to launch them in a separate memory space, as follows:
Start /separate command
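A minimal sketch of both steps, assuming DefaultSeparateVDM is a string value and using placeholder paths:

rem Export the WOW key as a backup first, then apply the setting described above
reg export "HKLM\SYSTEM\CurrentControlSet\Control\WOW" wow-backup.reg
reg add "HKLM\SYSTEM\CurrentControlSet\Control\WOW" /v DefaultSeparateVDM /t REG_SZ /d yes /f

rem Launch the application in its own memory space (only meaningful for 16-bit programs)
start "" /separate C:\apps\myfoxapp.exe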
Also have a look at this article: http://www.reddit.com/r/Database/comments/2kz0x5/dbf_file_getting_corrupt/
It describes a somewhat similar problem; who knows, maybe it will be helpful for you.
I have run into something similar in the past, and it didn't have to do with the VFP application and data residing in the same folder. What happened to me involved the debugger. Your mention of "...while I am working with the prog" tells me you are in the VFP development environment, not running the built app itself. If your debugger has breakpoints, or other flags that have somehow become corrupted, I have run
CLEAR DEBUGGER
But that goes back several years and MIGHT be what you are encountering.

Not enough storage is available to complete this operation

Environment:
Visual Studio Ultimate 2010
Windows XP
WPF Desktop Application using .NET 4.0
We have a desktop application which plays a video. This video is part of a project and the project is packaged into the installer. Every once in a while building the installer project shows this error message:
Not enough storage is available to complete this operation
If I restart Visual Studio it works.
Is there a way to avoid this? Is there a better way to package videos in an installer?
This usually happens when the build process needs a lot of RAM and cannot get it. Since restarting Visual Studio fixes the problem, that is most likely also your case.
Try closing some of the running applications. You can also try adding more RAM to your machine or increasing the page file.
I came across this question when trying to compile my C# solution in Visual Studio 2010 on Windows XP. One project had a fair number of embedded resources in it (the size of the resultant assembly was ~140MiB) and I couldn't compile the solution because I was getting the
Not enough storage is available to complete this operation
error in my build output.
None of the answers on this question helped, but I did find an answer to "Not enough storage is available to complete this operation" by ScottBurton42 on social.msdn.microsoft.com. It suggests adding the 3GB switch to the Boot.ini file, and making devenv.exe large-address aware. Adding the 3GB switch to my Boot.ini file was what worked for me (I think devenv.exe for Visual Studio 2010 and above is already large-address aware).
My answer is based on that answer.
Solution 1: Set the /3GB Boot.ini switch
The page Memory Support and Windows Operating Systems on MSDN says:
The virtual address space of processes and applications is still limited to 2 GB unless the /3GB switch is used in the Boot.ini file.
The /3GB switch allocates 3 GB of virtual address space to an application that uses IMAGE_FILE_LARGE_ADDRESS_AWARE in the process header. This switch allows applications to address 1 GB of additional virtual address space above 2 GB.
The following example shows how to add the /3GB parameter in the Boot.ini file to enable application memory tuning:
[boot loader]
timeout=30
default=multi(0)disk(0)rdisk(0)partition(2)\WINNT
[operating systems]
multi(0)disk(0)rdisk(0)partition(2)\WINNT="????" /3GB
Note "????" in the previous example is be the programmatic name of the operating system.
In Windows XP, the Boot.ini file can be modified by going to
System Properties → Advanced → Startup and Recovery → Settings → System Startup → Edit
The page on the /3GB switch on MSDN says:
On 32-bit versions of Windows, the /3GB parameter enables 4 GT RAM Tuning, a feature that enlarges the user-mode virtual address space to 3 GB and restricts the kernel-mode components to the remaining 1 GB.
The /3GB parameter is supported on Windows Server 2003, Windows XP, and Windows 2000. On Windows Vista and later versions of Windows, use the IncreaseUserVA element in BCDEdit.
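On Vista and later, the equivalent of the /3GB switch is set from an elevated command prompt and takes effect after a reboot:

bcdedit /set IncreaseUserVA 3072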
Restarting the machine will then cause the setting to take effect.
Solution 2: Make devenv.exe large address aware:
Open up a Visual Studio Command Prompt (or a Developer Command Prompt, depending on the version of Visual Studio)
Type and execute the following command line:
editbin /LARGEADDRESSAWARE {path}\devenv.exe
where {path} is the path to devenv.exe (you can find this by going to the properties of the Visual Studio shortcut).
This will allow devenv.exe to access 3GB of memory instead of 2GB.
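To check whether the flag actually took effect, you can dump the headers from the same command prompt (the path below is the default VS 2010 install location; adjust as needed):

dumpbin /headers "C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\devenv.exe" | findstr /i large

If the flag is set, the output includes "Application can handle large (>2GB) addresses".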
Problem
In my case, the problem was with a test project containing a very large (1.5GB) test file as an embedded resource. I have 16GB RAM in my machine with 8GB free when this occurred, so RAM was not the issue.
It is possible that we are hitting the 2 GB limit that the CLR has on any single object. Without delving into what MSBuild is doing under the hood, I can only speculate that during compile time, the embedded resource is loaded into an object graph that is hitting this limit.
The Error message is very unhelpful. My first thought when I saw it was, "Have I run out of disk space?"
Solution
It is a file validation test project. One of the requirements is to be able to handle files of this size, so on face value my team thought it reasonable to embed it for use in test cases.
We fixed the error by moving the file onto the network (in the same way that it would be accessed by the validator in production) and marking the test as an integration test instead of a unit test. After-all, aren't unit tests supposed to be fast-running?
Cleaning and rebuilding the solution worked for me.
For Visual Studio, you can try to do the following:
Close All Visual Studio instances.
Open a Visual Studio Developer Command Prompt in Administrator mode.
Navigate to:
C:\Program Files (x86)\Microsoft Visual Studio\2017\Professional\Common7\IDE.
Type the following:
editbin /LARGEADDRESSAWARE devenv.exe.
It's also worth restarting the PC.
Hope this helps )
In my case, I had very little space left on the C drive. I cleared a few items from the C drive and tried again. It worked.
I might be late to answer but for future reference, you might want to check the Windows dump file settings (and probably set it to none).
In my case the server I was executing the code on couldn't handle my parallelized code.
Normally I'm running a setup like the following
new ParallelOptions { MaxDegreeOfParallelism = Math.Max(1, Environment.ProcessorCount / 2) }
Introducing a variable and allowing the degree of parallelism to be locked down to 1 (resulting in code like the following) resolved this issue for me.
new ParallelOptions { MaxDegreeOfParallelism = 1 }
The key for me:
We had embedded a huge database template (testing had filled it with lots of data) into the application. I have not seen this issue arise since removing the embedded resource and moving the database to a resource folder.
I fixed this problem by deleting or disabling (excluding) the *.rpt files that were very large, and I optimized my reports!
I am late to answer, but this may be useful for others:
In my case, just restarting Visual Studio fixed the problem.

Killing processes on Windows 7

I'm debugging plugins on Windows 7 and of course the plugin host (Cubase5.exe) occasionally crashes because of errors in the plugin. On XP or Vista, I could always restart it immediately and continue working. But on Windows 7, even though Cubase appears to close, it is still visible in Task Manager and I cannot kill it by any means. After a minute or two, it disappears by itself. In the mean time, I can't work because the plugin DLL is still locked by the process.
Does anyone know why this happens on Windows 7? I've already tried disabling Automatic Error Reporting but that didn't help. I've tried attaching cdb to Cubase, but I get:
Cannot debug pid 5252, NTSTATUS 0xC0000001
"{Operation Failed} The requested operation was unsuccessful."
Debuggee initialization failed, NTSTATUS 0xC0000001
"{Operation Failed} The requested operation was unsuccessful."
I tried following the instructions here but it appears this is only possible if I connect a second machine to my computer to debug it remotely.
I finally found the solution, using this article:
http://blogs.technet.com/b/markrussinovich/archive/2005/08/17/unkillable-processes.aspx
This required installing the Windows Debugging Tools for Windows (nice name) and LiveKd, but by following the steps outlined I was able to track which driver was causing the process to hang: it turned out to be the 64-bit driver for the M-Audio Oxygen 8 V2 controller I'm using. Unfortunately no driver update is available.
Anyway, if anyone encounters a similar problem, this is the way to solve it.
Have you tried Process Explorer by Mark Russinovich? It is really useful for "killing":)
If you have error reporting enabled, it's possible that werfault.exe has Cubase open to write a minidump for crash reporting purposes.
This is just a stab in the dark but it might be your problem.
One thing you can try is to check with Process Monitor what Cubase is doing. Set a filter so that everything with a process name containing "cubase" will be recorded. It could be that you are facing some timeout issue when Cubase wants to exit.
You can end the process the service is running under. You can find this process by going to the Services tab of the Task Manager, right-clicking, and selecting Go To Process (you need to click the Show processes from all users button). Note that one process may host multiple services (especially if it's svchost.exe), and ending the process will kill all those services. Also, this is an unclean exit, and may cause data corruption depending on what the service(s) were doing when you killed it.
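A sketch of the same idea from an elevated command prompt (the service name and PID are placeholders):

rem Find the PID hosting the service
sc queryex SomeServiceName

rem Or see which services live in which process
tasklist /svc

rem Force-kill that process (an unclean exit, as noted above)
taskkill /f /pid 1234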
Depending on which specific service you are trying to stop, there may be a cleaner way to simulate failure.

How to prevent Windows from caching Com Class info?

Windows 7 is caching some of the COM class information. Older OSs didn't do this. After the OS looks up the HKCU\Software\Classes\CLSID\{GUID}\LocalServer32 value, it caches the value and doesn't look it up again.
When we update our software, we place the new updates in a different directory, and then update the HKCU\Software\Classes\CLSID\{GUID}\LocalServer32 value to reflect the new path. The next time the software runs, it will use the latest files if running under older Windows OSs. However, on Windows 7, it will continue to use the older file, until the OS is rebooted.
I ran process monitor, and discovered that under Windows 7, it never reads the registry key again, after the first read. On older OSs, it reads that key every time.
My question is: Is there any way to force Windows 7 to re-read the LocalServer32 information from the HKCU hive each time a new out of proc COM object is created?
I have only been able to solve this problem by...
1: Stopping the process
2: Explicitly unregistering the library using regsvr32 (or exename /unregserver)
3: Registering the new component
4: Starting the process back up.
I would suspect that it is the unregister part that is failing for you. If you are just changing the registry key directly, then you should call RegSvr32 /u instead.
Also make sure the new directory location is the current directory when you call RegSvr32.
Note that I have always stopped the process and then unregistered, this is probably a significant detail.
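A hedged sketch of the whole sequence from an elevated command prompt; every name and path here is a placeholder, and /regserver and /unregserver are the usual conventions for out-of-proc (LocalServer32) servers rather than something every COM server is guaranteed to support:

taskkill /f /im MyComHost.exe

rem In-proc DLL server:
regsvr32 /u "C:\OldVersion\MyComServer.dll"
regsvr32 "C:\NewVersion\MyComServer.dll"

rem Out-of-proc EXE server:
"C:\OldVersion\MyComServer.exe" /unregserver
"C:\NewVersion\MyComServer.exe" /regserver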
As this is a top result in Google for this narrow-ish problem, I thought it would be valuable to add my troubleshooting outcome for this problem.
I found this response on SO: C# : How to change windows registry and take effect immediately
And linked solution from that answer: Registry Watcher C#
Both of which seem viable options for managing changed keys without forcing a reboot. For us (like the OP) this was when installing updates. For us (possibly unlike the OP) this is infrequent and we decided the effort to implement and test a fix as described was outweighed by the simple solution of requiring a reboot: a process Windows users have come to expect with installing software anyway.
