How to "guard against ever-changing result" in ExcelDna - excel-dna

I am implementing, and adapting, the sample code for array resizing found here: https://github.com/Excel-DNA/ExcelDna/blob/master/Distribution/Samples/ArrayResizer.dna
I have implemented one or two other sorts of functions (say Resize(...) and NameRange(...)), following a similar approach (using ExcelAsyncUtil.QueueAsMacro() to perform some worksheet operation).
The functions seem to work just fine when called separately. However, when nested (say NameRange(Resize(...))), Excel seems to go into an infinite re-calculation loop, particularly when there are other functions that reference the newly named range.
The code sample mentions including a guard against ever-changing results. Is this what's causing the re-calculations? What would/should such a guard look like?
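One plausible reading of that guard is to remember the last result produced for each calling cell and only queue the resize macro again when the result has actually changed. Below is a rough, untested C# sketch along those lines, assuming the Resize() helper from the ArrayResizer.dna sample; the GuardedResizer class, the dictionary, and the key format are purely illustrative.

using System.Collections.Generic;
using ExcelDna.Integration;

public static class GuardedResizer
{
    // Last result size seen per calling cell (illustrative guard state).
    static readonly Dictionary<string, int[]> _lastSize = new Dictionary<string, int[]>();

    public static object ResizeGuarded(object[,] result)
    {
        var caller = XlCall.Excel(XlCall.xlfCaller) as ExcelReference;
        if (caller == null) return result;

        string key = caller.SheetId + "!" + caller.RowFirst + "," + caller.ColumnFirst;
        int rows = result.GetLength(0), cols = result.GetLength(1);

        // If the size is unchanged since the last call from this cell, return
        // the value as-is instead of queueing another resize; this is what
        // breaks the otherwise endless resize -> recalculate cycle.
        int[] prev;
        if (_lastSize.TryGetValue(key, out prev) && prev[0] == rows && prev[1] == cols)
            return result;

        _lastSize[key] = new[] { rows, cols };
        return ArrayResizer.Resize(result);   // the sample's resizing helper
    }
}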

Related

Does cscope have a query language/api?

I am trying to do some deep code analysis on a python2 codebase that is large and messy enough that most analysis tools I've tried have not worked. I have however been able to use pycscope to generate a cscope database. I can even use it to do basic things like find function usages or all functions called directly from a given function.
To me, it seems that the fact that there is a database, and that it can be used for simple things, means it should also be possible to use it for more complex things. For example, finding all functions that are called from within a given function, recursively, or all code paths that depend on that function.
However, the cscope documentation is very light and I'm not much of a C expert. Is there an actual query language for it, or an extension that knows how to use the database in this manner?
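For scripting, cscope's line-oriented mode is the closest thing it has to a query API: cscope -L runs a single numbered query (for example -2 for "functions called by this function" and -3 for "functions calling this function") and prints one match per line. Below is a rough Python sketch of walking the call graph recursively on top of that, assuming the pycscope-built database sits in the current directory; the output parsing and names are illustrative.

import subprocess

def called_by(func):
    """Functions called directly by `func`, via cscope query -2."""
    out = subprocess.run(["cscope", "-d", "-L", "-2", func],
                         capture_output=True, text=True, check=True).stdout
    # Each result line looks like: "<file> <function> <line> <source text>"
    return {line.split()[1] for line in out.splitlines() if line.strip()}

def transitive_calls(func, seen=None):
    """Everything reachable from `func` through direct calls, recursively."""
    if seen is None:
        seen = set()
    for callee in called_by(func):
        if callee not in seen:
            seen.add(callee)
            transitive_calls(callee, seen)
    return seen

print(sorted(transitive_calls("main")))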

Run user-submitted code in Go

I am working on an application which allows users to compare the execution of different string comparison algorithms. In addition to several algorithms (including Boyer-Moore, KMP, and other "traditional" ones) that are included, I want to allow users to put in their own algorithms (these could be their own algorithms or modifications to the existing ones) to compare them.
Is there some way in Go to take code from the user (for example, from an HTML textarea) and execute it?
More specifically, I want the following characteristics:
I provide a method signature and they fill in whatever they want in the method.
A crash or a syntax error in their code should not cause my whole program to crash. It should instead allow me to catch the error and display an error message.
(In this case, I am not worried about security against malicious code because users will only be executing my program on their own machines, so security is their own responsibility.)
If it is not possible to do this natively with Go, I am open to embedding one of the following languages to use for the comparison functions (in order of preference): JavaScript, Python, Ruby, C. Is there any way to do any of those?
A clear No.
But you can do fancy stuff: why not recompile the program with the user-provided code included?
Split the program in two: a driver which collects the user code, recompiles the actual comparison program, executes it, and reports the outcome (a sketch follows below).
Embedding interpreters for other languages is also possible, e.g. Otto is a JavaScript interpreter written in Go. (C will be hard :-)
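A rough sketch of that recompile-and-run driver, using only the standard library; the skeleton template, the temp-file layout, and the Compare signature are assumptions for illustration only.

package main

import (
    "fmt"
    "os"
    "os/exec"
    "path/filepath"
)

// Skeleton the user's code gets pasted into; the Compare signature is made up.
const skeleton = `package main

import "fmt"

%s

func main() {
    fmt.Println(Compare("abracadabra", "abra"))
}
`

func runUserCode(userFunc string) (string, error) {
    dir, err := os.MkdirTemp("", "usercode")
    if err != nil {
        return "", err
    }
    defer os.RemoveAll(dir)

    src := filepath.Join(dir, "main.go")
    if err := os.WriteFile(src, []byte(fmt.Sprintf(skeleton, userFunc)), 0o600); err != nil {
        return "", err
    }

    // Compile errors and panics in the user code show up here as a normal
    // error plus the compiler/runtime output, instead of crashing the driver.
    out, err := exec.Command("go", "run", src).CombinedOutput()
    return string(out), err
}

func main() {
    out, err := runUserCode(`func Compare(text, pattern string) []int { return []int{0} }`)
    if err != nil {
        fmt.Println("user code failed:", err)
    }
    fmt.Print(out)
}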
Have you considered doing something similar to the gopherjs playground? According to this, the compilation is being done client-side.

How can one get a list of Mathematica's built-in global rewrite rules?

I understand that over a thousand built-in rewrite rules in Mathematica populate the global rules table by default. Is there any way to get Mathematica to give a full or even partial list of those rules?
The best way is to get a job at Wolfram Research.
Failing that, I think that for things not completely compiled into the kernel you can recover most of the rules/definitions. Look at
Attributes[fn]
where fn is the command that you're interested in. If it returns
{Protected, ReadProtected}
then there's something you can get a look at (although often it's just a MakeBoxes (formatting) definition or an AutoLoad/Stub-type definition). To see what's there, run
Unprotect[fn];
ClearAttributes[fn, ReadProtected];
??fn
Quite often you'll have to run an example of the command to load it if it was a stub. You'll also have to dig down from the user-facing commands to the back-end implementations.
Eventually you'll most likely reach a core command that is compiled into the kernel, whose details you cannot see.
I previously mentioned this in tips for creating Graph diagrams and it got a mention in What is in your Mathematica tool bag?.
A good example, with a nice bite-sized and digestible bit of code, is Experimental`AngularSlider[], mentioned in Circular/Angular slider. I'll leave it up to you to look at the code produced.
Another example is something like BoxWhiskerChart, where you need to call it once in order to load all of the code. Then you see that BoxWhiskerChart proceeds to call Charting`iBoxWhiskerChart, which you'll have to unprotect to look at, and so on.
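For example, following the recipe above (the exact Charting` context symbols may differ between versions):

BoxWhiskerChart[{{1, 2, 3, 4, 5}}];   (* call it once so the stub definition loads *)
Unprotect[Charting`iBoxWhiskerChart];
ClearAttributes[Charting`iBoxWhiskerChart, ReadProtected];
??Charting`iBoxWhiskerChart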

Including a Netlogo source file into another

How can I include the procedures from one Netlogo file into another? Basically, I want to separate the code of a genetic algorithm from my (quite complicated) fitness function, but, obviously, I want the fitness reporter, which will reside in "fitness.nlogo", to be available in the genetic algorithm code, probably "genetic.nlogo".
If it can be done, how are the procedures imported, and the code executed? Is it like Python, where importing a module pretty much executes everything in the module, or like C/C++, where the file is blindly "joined"?
This may be a stupid question, but I couldn't find anything on Google. The NetLogo documentation says something about __includes, an experimental keyword that may do the trick, but there's not much explained there, and no example either.
Any hints? Should I go with __includes? How does it work?
To include a file you use
__includes["libfile.nls"]
After adding this and pressing the “Check” button, a new button will appear next to the Procedures drop-down menu. There you can create and manage multiple source files.
The libfile.nls is just a text file that contains NetLogo code. It is not a NetLogo model, which always ends in .nlogo; a NetLogo model contains a lot of other information besides the code.
Including a file is equivalent to inserting all of its contents at that point. To make it work like a reusable library, write procedures that take agentsets and parameters as inputs, so they are independent of global definitions and interface settings.
The feature is documented in the NetLogo User Manual at http://ccl.northwestern.edu/netlogo/docs/programming.html#includes.
You can create a file libfile.nls and in the same folder create your main model model.nlogo.
After that, go to your model.nlogo and write:
__includes["libfile.nls"]
This file contains your reporters and procedures that you can call in your model.
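A minimal sketch of that layout, with made-up file and reporter names:

;; fitness.nls -- a plain text file next to genetic.nlogo
to-report fitness-of [candidate]
  report sum candidate        ;; stand-in for the real fitness calculation
end

;; in genetic.nlogo
__includes ["fitness.nls"]

to test-fitness
  show fitness-of [1 2 3]     ;; prints 6, using the included reporter
end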

How can I make an external toolbox available to a MATLAB Parallel Computing Toolbox job?

As a continuation of this question and the subsequent answer, does anyone know how to have a job created using the Parallel Computing Toolbox (using createJob and createTask) access external toolboxes? Is there a configuration parameter I can set when creating the job to specify toolboxes that should be loaded?
According to this section of the documentation, one way you can do this is to specify either the 'PathDependencies' property or the 'FileDependencies' property of the job object so that it points to the functions you need the job's workers to be able to use.
You should be able to point it at the KbCheck function in Psychtoolbox, along with any other functions or directories needed for KbCheck to work properly. It would look something like this:
obj = createJob('PathDependencies',{'path_to_KbCheck',...
'path_to_other_PTB_functions'});
A few comments, based on my work troubleshooting this:
It appears that there are inconsistencies in how well nested functions and anonymous functions work with the Parallel Computing Toolbox. I was unable to get them to work, while others have been able to. (Also see here.) As such, I would recommend storing each function in its own file and including those files using the PathDependencies or FileDependencies properties, as described by gnovice above.
It is very hard to troubleshoot the Parallel Computing Toolbox, as everything happens outside your view. Use breakpoints liberally in your code, and the inspect command is your friend. Also note that if there is an error, task objects will contain an error parameter, which in turn will contain an ErrorMessage string, and possibly an MException object with further causes (see the sketch after this list). Both of these were immensely useful in debugging.
When including Psychtoolbox, you need to do it as follows. First, create a jobStartup.m file with the following lines:
PTB_path = '/Users/eliezerk/Documents/MATLAB/Psychtoolbox3/';
addpath( PTB_path );
cd( PTB_path );
SetupPsychtoolbox;
However, since the Parallel Computing Toolbox can't handle any graphics functionality, running SetupPsychtoolbox as-is will actually cause your thread to crash. To avoid this, you need to edit the PsychtoolboxPostInstallRoutine function, which is called at the very end of SetupPsychtoolbox. Specifically, you want to comment out the line AssertOpenGL (line 496, as of the time of this answer; this may change in future releases).
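To illustrate the error-inspection point from the list above, the pattern looks roughly like this (untested sketch; the property names follow the older job/task API used in this answer and may differ in newer releases, and myWorkerFunction and someInput are placeholders):

job  = createJob('PathDependencies', {'/path/to/Psychtoolbox'});
task = createTask(job, @myWorkerFunction, 1, {someInput});   % placeholder worker and input
submit(job);
waitForState(job, 'finished');
errMsg = get(task, 'ErrorMessage');   % empty if the task succeeded
if ~isempty(errMsg)
    disp(errMsg);
    err = get(task, 'Error');         % MException with further detail (e.g. err.cause)
end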
