I was looking into a bug where PdhLookupPerfNameByIndex was giving me a buffer size of 0, and discovered that there are two counters for "% Processor Time"; we were using the last one in the list (the indices are 6 and 4676). The Spanish language pack has this counter only once (index 6). I am curious why there would be two counters for the same thing in English and, if there is a valid reason for that, why the second one is not included in the Spanish language pack.
Using Windows Server 2012 R2
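For reference, this is roughly the lookup pattern we use (a minimal C++ sketch; the counter index is hard-coded here for illustration):

#include <windows.h>
#include <pdh.h>
#include <pdhmsg.h>
#include <iostream>
#include <vector>
#pragma comment(lib, "pdh.lib")

int main()
{
    DWORD size = 0;
    // First call with a null buffer asks PDH for the required buffer size.
    PDH_STATUS status = PdhLookupPerfNameByIndexW(NULL, 6, NULL, &size);
    if (status != PDH_MORE_DATA || size == 0)
    {
        // This is where index 4676 gives us size == 0 under the Spanish pack.
        std::wcerr << L"size query failed: 0x" << std::hex << status << L"\n";
        return 1;
    }
    std::vector<wchar_t> name(size);
    status = PdhLookupPerfNameByIndexW(NULL, 6, name.data(), &size);
    if (status == ERROR_SUCCESS)
        std::wcout << name.data() << L"\n";    // "% Processor Time"
    return 0;
}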
I'm running a 64-bit version of Ghostscript (9.50) on a 64-bit processor with 16 GB of RAM under Windows 7.
Ghostscript returns a random-ish error message (it tells me that I have a typecheck error in the array operator) when I try to allocate one too many arrays, totalling more than 2 GB of RAM.
To be clear, I am watching the growth of the memory usage in the Windows Task Manager, not from within Ghostscript.
I'd like to know why this is so.
More importantly, I'd like to know if I can override this behavior.
Edit: This code produces the error --
/TL 25000 def                                      % total number of arrays to allocate
/TL- TL 1 sub def                                  % loop limit: TL - 1
/G TL array def                                    % master array that holds the sub-arrays
0 1 TL- { dup == flush G exch TL array put } for   % print the index, then store a fresh TL-element array at it
The error looks like this (here's the last bit of the messages I get):
5335
5336
5337
5338
5339
5340
5341
5342
5343
5344
5345
Unrecoverable error: typecheck in array
Operand stack: --nostringval--
--- Begin offending input ---
/TL 25000 def /TL- TL 1 sub def /G TL array def 0 1 TL- { dup == flush G exch TL array put }for
--- End offending input ---
file offset = 0
gsapi_run_string_continue returns -20
The amount of RAM is almost certainly not the limiting factor, but it would help if you were to post the actual error message. It may be 'random-ish' to you, but it's meaningful to people who program in PostScript.
More than likely you've tripped over some other internal limit, for example the operand stack size, but without seeing the PostScript program or the error message I cannot say any more than that. I can say that (64-bit) Ghostscript will happily address more than 2 GB of RAM; I was running a file last week which had Ghostscript using 8.1 GB.
Note that PostScript itself is basically a 32-bit language; while Ghostscript has extended many of the architectural limits documented in the PostScript Language Reference Manual (such as the 64K-element limit on arrays and strings), moving beyond 32-bit limits is essentially unspecified.
As to whether you can change the behaviour, that depends on exactly what the problem is, and I can't tell from what's here.
Edit
Here's a screenshot of Ghostscript running the test file to completion, along with the Task Manager display showing the amount of memory the process is using. Not shown is the output of vmstatus, which I ran from the PostScript environment afterwards. It showed that Ghostscript thinks it's using 10,010,729,850 bytes from a maximum of 10,012,037,312. My calculator says that 9,562.8 MB comes out at 10,027,322,572.4 bytes, so a pretty close match.
To answer the points in the comments this is (as you can probably tell) on a 64-bit Windows 10 installation with quite a lot of memory.
The difference is, almost certainly, something which has been fixed since the release of 9.52. The 9.52 64-bit binary does exit with a VMerror after (for me) 5360 iterations. Obviously trying to use vast amounts of PostScript memory (as opposed to, say, canvas memory) is not a common occurrence, not least because many PostScript interpreters simply won't allow it, so this doesn't get exercised much.
The Ghostscript Git repository is here if you want to go through the commits and try to figure out which one caused the change. You only have to go back to March this year; anything before about the 19th of March would have been in 9.52.
Beyond simple curiosity, is there a reason to try and use up loads of memory in PostScript?
I want to analyze a Windows EXE file at the binary level without any Windows API calls (because I will do it from another OS). I want to distinguish 2 × 2 types:
Is it a windowed program or a command line program?
Is it a Win32 or a Win64 program?
I hope that there are general bit structures which I can query.
The Microsoft PE and COFF Specification was useful, but a little tricky. Here is my result so far:
Every Windows program starts with a DOS stub showing a text like "This program cannot be run in DOS mode" or similar. The length of this DOS block can differ, so the "real Windows program" section begins at a variable offset. That offset is stored as a little-endian value at offset 0x3C (formally four bytes, though in practice the two low bytes suffice: 0x3D holds the high byte and 0x3C the low byte, so you calculate 256 * byte[0x3D] + byte[0x3C] to get the offset of the real Windows program).
The real Windows program begins with four signature bytes: "PE" followed by two null bytes. The next two bytes are the little-endian machine field: the bytes 0x4C 0x01 (value 0x014C) mean a Win32 (x86) program, and 0x64 0x86 (value 0x8664) mean a Win64 (x64) program.
To check whether the program is text-based, read the byte at offset 0x5C (counted from "PE" = 0x00): a value of 3 means a text-based (console) program, a value of 2 means a windowed GUI program.
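Putting this together, here is a minimal C++ sketch that reads those fields without any Windows API call (layout per the PE/COFF spec as I understand it; multi-byte values are assembled byte by byte, so host endianness doesn't matter):

#include <cstdio>

int main(int argc, char** argv)
{
    if (argc < 2) { std::fprintf(stderr, "usage: %s file.exe\n", argv[0]); return 1; }
    std::FILE* f = std::fopen(argv[1], "rb");
    if (!f) { std::perror("fopen"); return 1; }

    // e_lfanew: offset of the PE header, little-endian at file offset 0x3C.
    unsigned char lf[4];
    std::fseek(f, 0x3C, SEEK_SET);
    std::fread(lf, 1, 4, f);
    long pe = lf[0] | (lf[1] << 8) | (lf[2] << 16) | ((long)lf[3] << 24);

    // Signature "PE\0\0", then the two-byte little-endian machine field.
    unsigned char sig[4], mach[2], subsys[2];
    std::fseek(f, pe, SEEK_SET);
    std::fread(sig, 1, 4, f);
    if (sig[0] != 'P' || sig[1] != 'E' || sig[2] != 0 || sig[3] != 0) {
        std::fprintf(stderr, "no PE signature\n");
        return 1;
    }
    std::fread(mach, 1, 2, f);
    unsigned m = mach[0] | (mach[1] << 8);
    std::printf("%s\n", m == 0x014C ? "Win32 (x86)" : m == 0x8664 ? "Win64 (x64)" : "other machine type");

    // Subsystem: two bytes at offset 0x5C from the "PE" signature.
    std::fseek(f, pe + 0x5C, SEEK_SET);
    std::fread(subsys, 1, 2, f);
    unsigned s = subsys[0] | (subsys[1] << 8);
    std::printf("%s\n", s == 3 ? "console (text-based)" : s == 2 ? "windowed GUI" : "other subsystem");

    std::fclose(f);
    return 0;
}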
Today I finally found out what has been stalling my development process: even though no error code is set, the function wglChoosePixelFormatARB returns 0 pixel formats.
I am trying to set up an OpenGL context in my C++ application and I have managed to retrieve the function pointers for the extensions.
glGetIntegerv(GL_MAJOR_VERSION, &maj)
returns 4, so naturally I assumed it would be possible to create an OpenGL 3.2 context. However, after finding out there were no matches, I started to comment out some of my requirements in the attribList parameter. There were no matches whatsoever.
Only when I, just to be certain, commented out
WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
WGL_CONTEXT_MINOR_VERSION_ARB, 2,
did I finally get matches. Out of the 8 pixel formats that meet the other requirements, not ONE of them seems to support version 3 of OpenGL.
Has anyone ever run into this? I have tried updating/reinstalling my video drivers, but nothing has changed. I am running this on Windows 7, MS Visual Studio 2008, and my graphics card is one from the AMD Radeon HD 7700 Series.
The WGL_CONTEXT_MAJOR_VERSION_ARB, WGL_CONTEXT_MINOR_VERSION_ARB and related attributes are not attributes of the Windows pixel format.
You must not use them with wglChoosePixelFormatARB().
Those options belong in the attribute list of wglCreateContextAttribsARB, as defined by the WGL_ARB_create_context extension.
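Roughly, the split looks like this (a sketch, assuming the two function pointers were already retrieved via wglGetProcAddress as you describe, and hdc is a valid device context):

#include <windows.h>
#include <GL/gl.h>
#include "wglext.h"   // Khronos header for the WGL_* tokens and typedefs

// Both pointers retrieved earlier via wglGetProcAddress on a dummy context.
extern PFNWGLCHOOSEPIXELFORMATARBPROC    wglChoosePixelFormatARB;
extern PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB;

HGLRC CreateGL32Context(HDC hdc)
{
    // Pixel-format attributes only -- no version numbers here.
    const int pixelAttribs[] = {
        WGL_DRAW_TO_WINDOW_ARB, GL_TRUE,
        WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
        WGL_DOUBLE_BUFFER_ARB,  GL_TRUE,
        WGL_PIXEL_TYPE_ARB,     WGL_TYPE_RGBA_ARB,
        WGL_COLOR_BITS_ARB,     32,
        WGL_DEPTH_BITS_ARB,     24,
        0
    };
    int format = 0;
    UINT numFormats = 0;
    if (!wglChoosePixelFormatARB(hdc, pixelAttribs, NULL, 1, &format, &numFormats) || numFormats == 0)
        return NULL;

    PIXELFORMATDESCRIPTOR pfd = { sizeof(pfd) };
    DescribePixelFormat(hdc, format, sizeof(pfd), &pfd);
    if (!SetPixelFormat(hdc, format, &pfd))
        return NULL;

    // The version request goes to wglCreateContextAttribsARB instead.
    const int contextAttribs[] = {
        WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
        WGL_CONTEXT_MINOR_VERSION_ARB, 2,
        0
    };
    return wglCreateContextAttribsARB(hdc, NULL, contextAttribs);
}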
Hi,
I wrote a function that should give me the number of cores of a Windows system. It works on all systems except XP 64-bit. Here's how I get the information:
$objWMIService = ObjGet("winmgmts:\\.\root\CIMV2") ; connect to WMI (setup line omitted in the original snippet)
$objWMIItems = $objWMIService.ExecQuery("SELECT * FROM Win32_Processor")
If Not IsObj($objWMIItems) Then
    ;~ errorhandling
Else
    For $objElement In $objWMIItems
        $nCoreNumber = $objElement.NumberOfCores ; physical cores reported by WMI
    Next
EndIf
Regarding "NumberOfCores", Microsofts MSDN page tells me "Windows Server 2003, Windows XP, and Windows 2000: This property is not available". Somewhere I read, it is possible with having SP3 installed. I suppose that's true, because it works that way on XP 32 bit systems. But there is no SP3 for XP 64...
Is there another way to get the information?
Thanks
I think it's easiest to read the NUMBER_OF_PROCESSORS environment variable.
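For example (a trivial C++ sketch; note this counts logical processors, not physical cores):

#include <cstdio>
#include <cstdlib>

int main()
{
    // NUMBER_OF_PROCESSORS is set by Windows for every process.
    const char* n = std::getenv("NUMBER_OF_PROCESSORS");
    std::printf("logical processors: %s\n", n ? n : "unknown");
    return 0;
}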
Do you want "cores" or "number of logical processors including hyperthreading"? (In other words, do you want to count hyperthreaded units as "cores"?)
In any case, copying my answer from a similar question a while back:
If you actually need to distinguish between actual cores, chips, and logical processors, the API to call is GetLogicalProcessorInformation.
Use GetSystemInfo if you just want to know how many logical processors are on a machine (with no differentiation for hyperthreading).
I am running a VB6 application on Windows 2003 Server.
When I run it, it gives an "Overflow" error (run-time error 6).
Can anyone tell me why this is so?
An "Overflow" (run-time error 6) typically happens when:
You perform a division and both the numerator and the denominator are 0 (in VB6, 0/0 raises Overflow rather than Division by zero).
You try to assign a bigger type to a smaller one (like assigning a Long value to a Byte).
You multiply numbers and the result gets too big for the result type.
Check your divisions, and check whether your data types are big enough to hold the results of your operations.