SAS stack overflow: PROC SQL reading dictionary.columns

I have a program that reads dictionary.columns. It is part of a big program with a lot of code before and after the segment that reads dictionary.columns.
The program used to work fine, but it is now giving an error. I have executed the program 5 times and it gave the same error each time: STACK Overflow.
However, when I execute the same PROC SQL step in a different program, it runs fine.
Any suggestions on where the problem might be, and any possible solutions?

Usually, the fix for a stack overflow in SAS is simply to close the session and start again. If you want to keep the datasets you created in the WORK library, you can go to the SAS Temporary Files folder (search for it; its location differs between machines, but it is usually under Program Files/SAS Institute/... ). In the SAS Temporary Files folder you will find some TD_XXXX folders; the most recent one probably contains your WORK library datasets.

Related

Migrating an old old old VB + MSAccess program to a different computer

I have an old program that was made for us a looong time ago. It consists of a large MDB (Access) file with all the data (no encryption; I can manually open the file and browse all the data) and an EXE file (probably VB?) that was custom-made to easily manage the data in the file.
I'm trying to move this program to another user's laptop.
First I tried just copying all the files, but I got missing-file errors for MSCOM, GRD, LST, and OCX files. I tracked them all down and regsvr32'd them, and the program seemed to get a little further.
Then I got an ODBC connector error. Playing with the ODBC data sources manager, I added an entry with the name of the program that points to the specific MDB file. This helped too.
Now the program starts and shows all the menus, buttons, and everything. However, the default record that should be on screen is empty, and as soon as I hit any control (next record, list, etc.) it crashes with VB error 91:
Run-time Error '91': Object variable or With block variable not set
So it looks like the program can open the database file itself but it can't really access the data inside.
What else can I try to see what I need to set it all up correctly? Is there anything that "spies" inside a VB program to see how it's trying to access the MDB file?
Any help would be appreciated!
The probable cause of your problem is some missing DLL/OCX file referenced by your application. Open your EXE file with Notepad (or Notepad++), find all occurrences of .dll and .ocx file names, and check whether those files exist on the user's laptop. If not, just copy them from your working machine and regsvr32 them.
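The Notepad search above can also be automated. Here is a minimal Python sketch (the function name is my own, and the string heuristic is an assumption: it simply pulls printable, filename-like runs ending in .dll or .ocx out of the raw bytes of the binary):

```python
import re
import sys

def find_module_references(exe_path):
    """Scan a binary for embedded .dll/.ocx file names.

    Looks for runs of filename-like characters ending in .dll or .ocx,
    the same strings you would spot by opening the EXE in Notepad.
    """
    with open(exe_path, "rb") as f:
        data = f.read()
    # Filename-like ASCII runs ending in .dll or .ocx (case-insensitive).
    pattern = re.compile(rb"[A-Za-z0-9_.\-]{1,60}\.(?:dll|ocx)", re.IGNORECASE)
    return sorted({m.group().decode("ascii") for m in pattern.finditer(data)})

if __name__ == "__main__" and len(sys.argv) > 1:
    for name in find_module_references(sys.argv[1]):
        print(name)
```

This only finds names stored as plain ASCII strings; modules loaded via constructed names would not show up, which is why checking with a dependency tool as well is still worthwhile.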
I will go with #smith's suggestion.
Looking at the error message on Microsoft's website, below is the explanation that applies to your scenario:
"The object is a valid object, but it wasn't set because the object library in which it is described hasn't been selected in the Add References dialog box."
So ensure all files are correctly copied to new system.

Debugging software that is launched by batch file (x64dbg)

I've been getting into reverse engineering a bit and have come across a piece of software that I'm unsure how to start on. The software is launched by a batch file (which also calls a second batch file) before it calls the executable. If I load the batch file into x64dbg I get a PE file error, which is expected. But I can't run the executable directly, as I get a missing DLL error. Any idea how I might get around this to get started? Cheers.
I ended up using Ollydbg instead as it seems to load batch files.
For future reference, you can copy the DLL(s) into the directory the PE is in. I'm not sure what software this is, but the downside I can see is possibly running into dependency hell. You won't really know until your executable stops complaining about missing DLLs.

Fewer stack frames using StackWalk64

I built test.exe, which crashes and generates a .dmp file using MiniDumpWriteDump, and parser.exe, which reads and prints information from that .dmp file.
In parser.exe I use StackWalk64 to get the stack traces of all threads in the .dmp file.
But I found that I get fewer stack frames than Visual Studio does.
I've tried all the solutions I could find on Google, Stack Overflow, and CodeProject; nothing changed.
The following is what parser.exe does:
SymInitialize
MiniDumpReadDumpStream to read all the information
SymLoadModuleEx & SymFindFileInPath to load the pdb/exe/dll files specified in the .dmp file
Initialize STACKFRAME64 and call StackWalk64 in a loop.
I want to know how to get the same number of stack frames as Visual Studio.
I could paste more code here if needed.
Any help will be appreciated.
StackWalk64 isn't robust enough to follow the full stack trace, especially through frames that have been optimized (for example, see this Stack Overflow question).
The best approach is to actually use the debug engine supplied with WinDbg. Here are a couple of blog posts that show how to use the debug engine API:
Getting the Stack from a .DMP File (Automating Crash Dump Analysis, Part 2)
MiniDumps and "Bad" Stacks

Program to help sort files

I'm going through a lot of computers and a lot of data here and there.
I'm moving it all to a server so everybody has access to it.
There I have a folder for each computer, but a lot of the data is the same.
Is there any program to help me combine the data that is the same?
It is hell trying to do this manually.
Basically I want to tell this program: hey, check this folder C:/test, and if there are any duplicated files, delete one of them.
If you need a tool for manual comparison of large directory structures, try Beyond Compare.
If you want automatic comparison, Cygwin's diff is good, possibly embedded in a shell script.
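If you'd rather script it yourself, the usual approach is to group files by size first (cheap) and only hash the candidates that could possibly match. A small Python sketch; all names here are illustrative, and it only prints what it would delete rather than deleting anything:

```python
import hashlib
import os
from collections import defaultdict

def find_duplicates(root):
    """Return lists of paths under `root` whose contents are identical.

    Files are first grouped by size; same-size files are then compared
    by the SHA-256 hash of their contents.
    """
    by_size = defaultdict(list)
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            by_size[os.path.getsize(path)].append(path)

    duplicates = []
    for paths in by_size.values():
        if len(paths) < 2:
            continue  # unique size, cannot be a duplicate
        by_hash = defaultdict(list)
        for path in paths:
            h = hashlib.sha256()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(65536), b""):
                    h.update(chunk)
            by_hash[h.hexdigest()].append(path)
        duplicates.extend(group for group in by_hash.values() if len(group) > 1)
    return duplicates

# Dry run: report what would be deleted instead of deleting it.
for group in find_duplicates(r"C:/test"):
    keep, *redundant = sorted(group)
    print(f"keeping {keep}; redundant copies: {redundant}")
```

Once the dry-run output looks right, swapping the print for os.remove on the redundant paths makes it destructive, so keep a backup first.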

Get a look at the temporary files a process creates

I'm trying to reverse-engineer a program that does some basic parsing: text in, text out. I've got an executable "reference implementation" and the source code to what must be a different version, since the compiled source output != executable output.
The process creates and deletes temporary files very quickly in a multi-step parsing process. If I could take a look at the individual temporary files, I could get some great diagnostic data to narrow down where my source differs from the binary.
Is there any way to do any of the following?
Freeze a directory so that file creation will work but file deletion will fail silently?
Run a program in "slow motion" so that I can look at the files that it creates?
Log everything that a program does, including any data written out to files?
Running a tool like NTFS Undelete should give you a chance to recover the temporary files the process creates and then deletes. Combine this with ProcMon from Sysinternals to get the right filenames.
You didn't mention what OS you're doing this on, but assuming you're using Windows...
You might be able to make use of SysInternals tools like Process Explorer and Process Monitor to get a better idea of the files being accessed. As far as I know, there's no "write-only" option on folders. For "slowing down" the program, you'd just need to use a slower computer. For logging, the SysInternals tools will help out quite a bit. Once you have the name(s) of the files being created, you could try preventing their deletion by holding them open from another process; that would stop the system from being able to delete them.
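If Process Monitor tells you which directory the temp files land in, one crude, dependency-free option is to poll that directory and copy each new file out before the program deletes it. A Python sketch, under the assumption that the files live long enough to be caught between polls (very short-lived files can still be missed, in which case ProcMon's own logging is the better tool):

```python
import os
import shutil
import time

def snapshot_new_files(watch_dir, save_dir, duration=10.0, interval=0.01):
    """Poll `watch_dir` for `duration` seconds, copying new files to `save_dir`.

    Returns the list of copies made. Copies are prefixed with a counter
    so successive temp files with the same name don't overwrite each other.
    """
    os.makedirs(save_dir, exist_ok=True)
    seen = set(os.listdir(watch_dir))
    deadline = time.monotonic() + duration
    copied = []
    while time.monotonic() < deadline:
        for name in set(os.listdir(watch_dir)) - seen:
            seen.add(name)
            try:
                dest = os.path.join(save_dir, f"{len(copied):04d}_{name}")
                shutil.copy2(os.path.join(watch_dir, name), dest)
                copied.append(dest)
            except OSError:
                pass  # file vanished (or was locked) between listing and copying
        time.sleep(interval)
    return copied
```

Start this before launching the target program, pointed at the temp directory you identified, and the surviving copies give you the intermediate parsing output to diff against.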
There are two ways to attack this:
Run various small test cases through both systems and notice the differences. Since the test cases are small, you should be able to figure out why your code works differently than the executable.
Disassemble the executable and remove all the "delete temp file" instructions. Depending on how this works, this could be a very complex task (say when there is no central place where it happens).
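For the first approach, the diffing itself is easy to script. A minimal Python sketch, assuming both programs read stdin and write stdout (the function name and calling convention are illustrative, not from either binary):

```python
import subprocess

def compare_outputs(reference_cmd, candidate_cmd, text_in):
    """Feed the same input to both programs; report the first differing line.

    Returns None when the outputs are identical, otherwise a short
    description of where they diverge.
    """
    ref = subprocess.run(reference_cmd, input=text_in,
                         capture_output=True, text=True).stdout
    cand = subprocess.run(candidate_cmd, input=text_in,
                          capture_output=True, text=True).stdout
    for i, (a, b) in enumerate(zip(ref.splitlines(), cand.splitlines()), 1):
        if a != b:
            return f"line {i}: reference={a!r} candidate={b!r}"
    if ref != cand:
        return "outputs differ in length"
    return None  # identical
```

Run it over a pile of small inputs and the first divergence usually points straight at the parsing step where your source differs from the reference binary.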
