Setting Size of String Buffer When Accessing Windows Registry - vbscript

I'm working on a VBScript web application that has a newly-introduced requirement to talk to the registry to pull connection string information instead of using hard-coded strings. I'm doing performance profiling because of the overhead this will introduce, and noticed in Process Monitor that reading the value returns two BUFFER OVERFLOW results before finally returning a success.
Looking online, Mark Russinovich posted about this topic a few years back, indicating that since the size of the registry entry isn't known, a default buffer of 144 bytes is used. Since there are two buffer overflow responses, the amount of time taken by the entire call is approximately doubled (and yes, I realize the difference is 40 microseconds, but with 1,000 or more page hits per second, I'm willing to invest some time in optimization).
My question is this: is there a way to tell WMI what the size of the registry value is before it tries to get it? Here's a sample of the code I'm using to access the registry:
svComputer = "." ' Local machine is simply "."
ivHKey = &H80000002 ' HKEY_LOCAL_MACHINE = &H80000002 (from WinReg.h)
svRegPath = "SOFTWARE\Path\To\My\Values"
Set oRegistry = GetObject("winmgmts:{impersonationLevel=impersonate}!\\" & svComputer & "\root\default:StdRegProv")
oRegistry.GetStringValue ivHKey, svRegPath, "Value", svValue

In VBScript, strings are just strings: they are however long they need to be, and you don't pre-define their length, so there is nothing for you to size ahead of the call. Also, if performance is that much of an issue for you, you should consider using a compiled instead of an interpreted language (or a cache for values you have already read).

Related

Storing the data using Parallel Programming

The requirement is as follows:
We have a third-party client from which we need to take data and (ultimately) store it in a database.
The client shares the data through DLLs built from C++ code that expose functions; we call those functions with appropriate parameters and get the result.
Declare Function wcmo_deal Lib "E:\IGB\System\Intex\vcmowrap\vcmowr64.dll" (
ByRef WCMOarg_Handle As String,
ByRef WCMOarg_User As String,
ByRef WCMOarg_Options As String,
ByRef WCMOarg_Deal As String,
ByRef WCMOarg_DataOut As String,
ByRef WCMOarg_ErrOut As String) _
As Long
wcmo_deal(wcmo_deal_WCMOarg_Handle, WCMOarg_User, WCMOarg_Options, WCMOarg_Deal, WCMOarg_DataOut, WCMOarg_ErrOut)
Here WCMOarg_DataOut is the data we get, and that is what needs to be stored.
Similar to the above, we have 10 more methods (so 11 methods in total) that pull data; each string is around 500 KB to 1 MB and is stored to a file using the method below:
File.WriteAllText(logPath & sDealName & ".txt", sDealName & " - " & WCMOarg_ErrOut & vbCrLf)
These method calls run for each deal, so for a single deal we get output in 11 different folders, each holding a text file with the data received from the client.
There are 5,000 deals in total for which we need to call these methods and store the data in files.
This functionality has been implemented using parallel programming with a master-child relationship, as follows:
Dim opts As New ParallelOptions
opts.MaxDegreeOfParallelism = System.Environment.ProcessorCount
Parallel.ForEach(dealList, opts,
    Sub(deal)
        If Len(deal) > 0 Then
            Dim dealPass As String = ""
            Try
                If dealPassDict.ContainsKey(deal.ToUpper) Then
                    dealPass = dealPassDict(deal.ToUpper)
                End If
                Dim p As New Process()
                p.StartInfo.FileName = "E:\IGB_New\CMBS Intex Data Deal v2.0.exe"
                p.StartInfo.Arguments = deal & "|" & keycode & "|" & dealPass & "|" & clArgs(1) & "|"
                p.StartInfo.UseShellExecute = False
                p.StartInfo.CreateNoWindow = True
                p.Start()
                p.WaitForExit()
            Catch ex As Exception
                exceptions.Enqueue(ex)
            End Try
        End If
    End Sub)
Here CMBS Intex Data Deal v2.0.exe is the child program, which executes 5,000 times since dealList contains 5,000 deals.
The child program contains the code that calls the DLLs and stores the data in the files mentioned above.
Issues faced:
When the master and child code were run in one single process, we got an out-of-memory exception after about 3,000 deals (32 GB RAM, processor count = 16).
The master-child code above also takes up a lot of memory. It runs fine up to 4,800 deals in one hour (memory usage gradually reaches 100% at around 4,800 deals), and then the remaining 200 deals take close to another hour, so two hours in total (32 GB RAM, processor count = 16).
The master-child split was tried on the assumption that the GC would take care of disposing of all the objects in each child.
After the data is stored in the text files, a Perl script runs and loads the data into the database.
Approach tried:
Instead of keeping the data in text files and then loading it into the database, I tried storing the data in the DB directly without writing files (assuming the I/O operations consume a lot of memory), but this didn't work either: the DB crashed or hung every time.
Note:
All the handles related to the DLLs are properly closed.
The calls to the DLLs' methods consume a lot of memory, but nothing can be done to reduce that since the DLLs are not under our control.
The reason for the parallel approach is that a sequential approach would take many hours to fetch and load the data, and we need to run this twice a day because the data keeps changing, so we need to stay up to date with the latest data from the client.
There was a CPU maxing-out issue as well, but that has been resolved by setting MaxDegreeOfParallelism = System.Environment.ProcessorCount.
Question:
Is there a way to reduce the time taken by the process to complete?
Currently it takes 2 hours, but that could be because no memory remains by the time it reaches 4,800 deals, and without memory it cannot process any further.
Is there a way to reduce memory consumption here, either by executing this a different way or by changing something in the same code to make it work?
Parallelism is most likely totally useless here. You are bound by the I/O, not by the CPU; the I/O is the bottleneck, and parallelization might even make it worse. You might experiment with a RAM drive and then copy all the output to the actual storage.

Windows (ReFS,NTFS) file preallocation hint

Assume I have multiple processes writing large files (20 GB+). Each process writes its own file; assume that a process writes x MB at a time, then does some processing, writes x MB again, and so on.
This write pattern causes the files to be heavily fragmented, since the blocks of the different files end up interleaved on disk rather than each file's blocks being allocated consecutively.
Of course it is easy to work around this issue by using SetEndOfFile to "preallocate" the file when it is opened and then setting the correct size before it is closed. But an application accessing these files remotely, which is able to parse the in-progress files, then sees zeroes at the end of the file and takes much longer to parse it.
I do not have control over this reading application, so I can't optimize it to take zeros at the end into account.
Another dirty fix would be to run defragmentation more often, run Sysinternals' contig utility, or even implement a custom "defragmenter" that would process my files and consolidate their blocks.
Another, more drastic, solution would be to implement a minifilter driver that reports a "fake" file size.
But obviously both solutions listed above are far from optimal. So I would like to know: is there a way to provide a file-size hint to the filesystem so that it "reserves" consecutive space on the drive, but still reports the right file size to applications?
Writing larger chunks at a time obviously helps with fragmentation too, but it still does not solve the issue.
EDIT:
Since the usefulness of SetEndOfFile in my case seems to be disputed I made a small test:
#include <windows.h>
#include <cstdio>
#include <iostream>

int main() {
    LARGE_INTEGER size;
    LARGE_INTEGER a;
    char buf = 'A';
    DWORD written = 0;
    DWORD tstart;
    std::cout << "creating file\n";
    tstart = GetTickCount();
    HANDLE f = CreateFileA("e:\\test.dat", GENERIC_ALL, FILE_SHARE_READ, NULL, CREATE_ALWAYS, 0, NULL);
    size.QuadPart = 100000000LL;
    SetFilePointerEx(f, size, &a, FILE_BEGIN);
    SetEndOfFile(f);
    printf("file extended, elapsed: %d\n", GetTickCount() - tstart);
    getchar();
    printf("writing 'A' at the end\n");
    tstart = GetTickCount();
    SetFilePointer(f, -1, NULL, FILE_END);
    WriteFile(f, &buf, 1, &written, NULL);
    printf("written: %d bytes, elapsed: %d\n", written, GetTickCount() - tstart);
    CloseHandle(f);
    return 0;
}
While the application waits for a keypress after SetEndOfFile, I examined the on-disk NTFS structures:
The image shows that NTFS has indeed allocated clusters for my file. However, the unnamed DATA attribute has a StreamDataSize of 0.
Sysinternals DiskView also confirms that clusters were allocated.
When pressing Enter to allow the test to continue (and after waiting for quite some time, since the file was created on a slow USB stick), the StreamDataSize field was updated.
Since I wrote 1 byte at the end, NTFS now really had to zero everything, so SetEndOfFile does indeed help with the issue that I am "fretting" about.
I would appreciate it very much if answers/comments also provided an official reference to back up the claims being made.
Oh and the test application outputs this in my case:
creating file
file extended, elapsed: 0
writing 'A' at the end
written: 1 bytes, elapsed: 21735
Also, for the sake of completeness, here is an example of how the DATA attribute looks when setting FileAllocationInfo (note that I created a new file for this picture).
Windows file systems maintain two public sizes for file data, which are reported in the FileStandardInformation:
AllocationSize - a file's allocation size in bytes, which is typically a multiple of the sector or cluster size.
EndOfFile - a file's absolute end of file position as a byte offset from the start of the file, which must be less than or equal to the allocation size.
Setting an end of file that exceeds the current allocation size implicitly extends the allocation. Setting an allocation size that's less than the current end of file implicitly truncates the end of file.
Starting with Windows Vista, we can manually extend the allocation size without modifying the end of file via SetFileInformationByHandle: FileAllocationInfo. You can use Sysinternals DiskView to verify that this allocates clusters for the file. When the file is closed, the allocation gets truncated to the current end of file.
If you don't mind using the NT API directly, you can also call NtSetInformationFile: FileAllocationInformation. Or even set the allocation size at creation via NtCreateFile.
FYI, there's also an internal ValidDataLength size, which must be less than or equal to the end of file. As a file grows, the clusters on disk are lazily initialized. Reading beyond the valid region returns zeros. Writing beyond the valid region extends it by initializing all clusters up to the write offset with zeros. This is typically where we might observe a performance cost when extending a file with random writes. We can set the FileValidDataLengthInformation to get around this (e.g. SetFileValidData), but it exposes uninitialized disk data and thus requires SeManageVolumePrivilege. An application that utilizes this feature should take care to open the file exclusively and ensure the file is secure in case the application or system crashes.

Find MAC address using VB6

I want to find the MAC address of a network adapter. I have working code too, but my problem is that it only works if the adapter is IP-enabled; otherwise it won't get the address. Is there any solution to find the MAC address even if IPEnabled is false?
If it's possible, update me with sample code.
Here is my code to find the MAC:
Dim myWMI As Object, myObj As Object, Itm
Set myWMI = GetObject("winmgmts:\\.\root\cimv2")
Set myObj = myWMI.ExecQuery("SELECT * FROM " & _
"Win32_NetworkAdapterConfiguration " & _
"WHERE IPEnabled = True")
For Each Itm In myObj
    Dim macme As String
    Dim NewMac As String
    macme = Itm.MACAddress
    NewMac = Replace(macme, ":", "")
    Text1.Text = NewMac
    Exit For
Next
Querying the heavyweight WMI service is not the best choice for application programs. It is a convenience feature meant mainly for admin scripting, and the service could be disabled or even uninstalled on older versions of Windows.
Instead you can go directly to the source. The GetAdaptersInfo function is one of the more obvious candidates.
Using this involves more code than can be conveniently inlined here. Look for the many examples of use in VB6 posted with working source code at more programmer-friendly sites that permit uploaded attachments.
No matter how you go about it, the value of doing this is questionable. Many integrated network adapters allow users to set their own MAC address values, and quite a few PCs today rely on USB network adapters that are not plugged in all the time and may get swapped around between devices.

Excel Not Responding During Macro

I have a macro that runs in a continuous loop 24/7. The computer will occasionally freeze, but there is no error in the Excel code. The code uses the following in order to run efficiently:
DoEvents
Application.DisplayAlerts = False
Application.ScreenUpdating = False
Application.Calculation = xlCalculationManual
startagain:
'code
'calculations, alerts, and screen are updated
GoTo startagain
I am also using what I believe is an efficient method of copying and pasting (the bulk of the code pastes values and formulas):
Length = Range("C1").Value
Set rng = Sheets("Linked Data").Range("A2:AA" & Length)
Range("A2").Value = rng.Value
I have changed the processor priority on the computer to "high" for EXCEL.exe, set the computer's performance options to maximum performance, disabled all unnecessary add-ins, and turned off AutoRecover saving.
Despite all of the above, the computer will sometimes freeze and become unresponsive. Does anyone know anything that can be done to improve the reliability?
Avoid GoTo and use the Application.OnTime method to run a procedure repeatedly.
You can also think about creating an Automation add-in, which may lead to a performance improvement since it is compiled code.

Out of memory error in VB6 application

Before anyone says it, I know this isn't the way it should be done, but it's the way it was done and I'm trying to support it without rewriting it all.
I can assure you this isn't the worst bit by far.
The problem occurs when the application reads an entire file into a string variable.
Normally this works OK because the files are small, but one user created a file of 107 MB and that falls over.
intFreeFile = FreeFile
Open strFilename For Binary Access Read As intFreeFile
ReadFile = String(LOF(intFreeFile), " ")
Get intFreeFile, , ReadFile
Close intFreeFile
Now, it doesn't fall over at the line
ReadFile = String(LOF(intFreeFile), " ")
but on the
Get intFreeFile, , ReadFile
So what's going on here? Surely the String() call has already done the memory allocation, so why would it complain about running out of memory on the Get?
Usually reading a file involves some buffering, which takes space. I'm guessing here, but I'd look at the space needed for byte-to-character conversion. VB6 strings are 16-bit, but (binary) files are 8-bit: you'll need 107 MB for the file content plus 214 MB for the converted result, and the string allocation only reserves the 214 MB.
You do not need that Get call at all; you can read the file into the string directly with Input:
ReadFile = Input(LOF(intFreeFile), intFreeFile)
I got the same error, and Task Manager was showing 100% resource usage. We found that an updater application was taking too much RAM and killed it, which solved the issue for me.
One more thing: we also went into the system configuration settings (Start -> Run -> msconfig), opened the Startup tab, and unchecked anything that looked like an updater or some odd application that we don't use.
