How can one get a list of Mathematica's built-in global rewrite rules?

I understand that over a thousand built-in rewrite rules in Mathematica populate the global rules table by default. Is there any way to get Mathematica to give a full or even partial list of those rules?

The best way is to get a job at Wolfram Research.
Failing that, I think that for things not completely compiled into the kernel you can recover most of the rules/definitions. Look at
Attributes[fn]
where fn is the command that you're interested in. If it returns
{Protected, ReadProtected}
then there's something you can get a look at (although often it's just a MakeBoxes (formatting) definition or an autoload/stub-type definition). To see what's there, run
Unprotect[fn];
ClearAttributes[fn, ReadProtected];
??fn
Quite often you'll have to run an example of the command to load it if it was a stub. You'll also have to dig down from the user-facing commands to the back-end implementations.
Eventually you'll most likely reach a core command that is compiled into the kernel, whose details you cannot see.
I previously mentioned this in tips for creating Graph diagrams and it got a mention in What is in your Mathematica tool bag?.
A good example, with a nice bite-sized and digestible bit of code, is Experimental`AngularSlider[], mentioned in Circular/Angular slider. I'll leave it up to you to look at the code produced.
Another example is something like BoxWhiskerChart, where you need to call it once in order to load all of the code. You then see that BoxWhiskerChart calls Charting`iBoxWhiskerChart, which you'll have to unprotect to look at, and so on.
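For instance, a rough sketch of that dig (a hedged example: the exact definitions printed, and the internal names they dispatch to, vary between versions):
BoxWhiskerChart[RandomReal[1, 20]];                        (* run it once so the autoloaded definitions exist *)
Unprotect[BoxWhiskerChart];
ClearAttributes[BoxWhiskerChart, ReadProtected];
??BoxWhiskerChart                                          (* top-level rules, which call Charting`iBoxWhiskerChart *)
Unprotect[Charting`iBoxWhiskerChart];
ClearAttributes[Charting`iBoxWhiskerChart, ReadProtected];
??Charting`iBoxWhiskerChart                                (* the next layer down; repeat until you hit a kernel built-in *)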

Related

Run user-submitted code in Go

I am working on an application that lets users compare the execution of different string comparison algorithms. In addition to the several algorithms that are included (Boyer-Moore, KMP, and other "traditional" ones), I want to allow users to supply their own algorithms, or modifications of the existing ones, and compare them.
Is there some way in Go to take code from the user (for example, from an HTML textarea) and execute it?
More specifically, I want the following characteristics:
I provide a method signature and they fill in whatever they want in the method.
A crash or a syntax error in their code should not cause my whole program to crash. It should instead allow me to catch the error and display an error message.
(In this case, I am not worried about security against malicious code because users will only be executing my program on their own machines, so security is their own responsibility.)
If it is not possible to do this natively with Go, I am open to embedding one of the following languages to use for the comparison functions (in order of preference): JavaScript, Python, Ruby, C. Is there any way to do any of those?
A clear No.
But you can do fancy stuff: why not recompile the program with the user-provided code included?
Split it into two parts: a driver that collects the user code, recompiles the actual comparison program with that code included, executes it, and reports the outcome.
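As a rough sketch of that driver idea (the package layout and the protocol between driver and child are assumptions; the point is only that the user code runs in a separate process, so compile errors and crashes stay in the child):
package main

import (
	"fmt"
	"os"
	"os/exec"
	"path/filepath"
)

// buildAndRun writes the user's code (a complete Go file implementing the
// agreed-upon function plus a main that prints its result) to a temp dir and
// runs it as a separate process. A compile error or a panic only kills the
// child process; the driver just gets a non-nil error and the captured output.
func buildAndRun(userSrc string) (string, error) {
	dir, err := os.MkdirTemp("", "usercode")
	if err != nil {
		return "", err
	}
	defer os.RemoveAll(dir)
	file := filepath.Join(dir, "main.go")
	if err := os.WriteFile(file, []byte(userSrc), 0o644); err != nil {
		return "", err
	}
	out, err := exec.Command("go", "run", file).CombinedOutput()
	return string(out), err
}

func main() {
	out, err := buildAndRun("package main\nimport \"fmt\"\nfunc main() { fmt.Println(\"user code ran\") }")
	fmt.Print(out, err)
}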
Embedding interpreters for other languages can also be done; Otto, for example, is a JavaScript interpreter written in Go. (C will be hard. :-)
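For the embedded-interpreter route, a minimal sketch using Otto (github.com/robertkrimen/otto); the function name search and its signature are assumptions made up for the example, and JavaScript syntax or runtime errors come back through the returned error rather than crashing the host:
package main

import (
	"fmt"

	"github.com/robertkrimen/otto"
)

// runUserSearch evaluates the user's JavaScript, which is expected to define
// search(text, pattern), then calls it and returns the resulting index.
func runUserSearch(userSrc, text, pattern string) (int64, error) {
	vm := otto.New()
	if _, err := vm.Run(userSrc); err != nil { // syntax errors surface here
		return -1, err
	}
	result, err := vm.Call("search", nil, text, pattern) // runtime errors surface here
	if err != nil {
		return -1, err
	}
	return result.ToInteger()
}

func main() {
	src := `function search(text, pattern) { return text.indexOf(pattern); }`
	idx, err := runUserSearch(src, "hello world", "world")
	fmt.Println(idx, err) // 6 <nil>
}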
Have you considered doing something similar to the gopherjs playground? According to this, the compilation is being done client-side.

Can I debug a dynamically added Ruby method?

I want to store brief snippets of code in the database (following a standard signature) and "inject" them at runtime. One way would be using eval(my_code). Is there some way to debug the injected code using breakpoints, etc? (I'm using Rubymine)
I'm aware I can just log to console, etc, but I'd prefer IDE-style debugging if possible.
Hm. Let's analyze your question. Firstly, it does not seem to have anything to do with databases: you simply store a code block in source form somewhere, be it a file or a database. Secondly, what you really want is not IDE-style "debugging" but TDD-style verification. (But don't concentrate on that question now.)
What you need is assertions about your code. That is, you need to describe what output your code should produce for some example inputs, then run the code and see whether its behavior matches those expectations. Furthermore, if you are not sure about the source of your snippets, run them in a sandbox (with $SAFE = 4). If your code fails the expectations, raise informative errors (TypeError, or better still, your own custom exception), which you can then rescue and, e.g., fall back to some default code snippets...
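A minimal sketch of that check-then-fall-back idea (the snippet source, the expected result, and the fallback are all made-up examples):
# imagine this string came from the database
snippet = "lambda { |a, b| a + b }"
begin
  fn = eval(snippet)                                  # SyntaxError if the snippet is malformed
  raise TypeError, "unexpected result" unless fn.call(2, 3) == 5
rescue ScriptError, StandardError => e
  warn "snippet rejected: #{e.message}"
  fn = ->(a, b) { a + b }                             # fall back to a default implementation
end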
... but maybe I'm not actually answering the same question that you are asking. If that's the case, then let me share a link to the sourcify gem, which lets you recover a block's source, so that you can insert a breakpoint (say, by adding require 'rdebug' in the middle of the code) or even convert the code to sexps. That's all I know.

Scala dynamic class management

I would like to know if the following is possible in Scala (but I think the question can be applied also to Java):
Create a Scala file dynamically (ok, no problem here)
Compile it (I don't think this would be a real problem)
Load/Unload the new class dynamically
Aside from knowing whether dynamic class loading/reloading is possible (it is possible in Java, so I think it should be feasible in Scala too), I would also like to know the performance implications (I could have many, many classes, with no name clashes, but really a lot of them!).
TIA!
P.S.: I know other questions about class loading in Scala exist, but I haven't been able to find an answer about performance!
Yes, everything you want to do is certainly possible. You might like to take a look at ScalaMock, which is an example of creating Scala source code dynamically. And at SBT which is an example of calling the compiler from code. And then there are many different systems that load classes dynamically - look at the documentation for loadLibrary as a starting point.
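As a sketch of just the load/unload step (the compile step, e.g. shelling out to scalac or driving the compiler API, is not shown; the paths and class names are illustrative): each batch of generated classes gets its own class loader, and the JVM can only unload those classes once that loader and every instance created through it become unreachable.
import java.io.File
import java.net.{URL, URLClassLoader}

object DynamicLoad {
  // Load a freshly compiled class from classesDir through a throwaway loader
  // and instantiate it reflectively. Dropping all references to the loader and
  // to the instance later makes the class eligible for garbage collection.
  def loadFresh(classesDir: File, className: String): Any = {
    val loader = new URLClassLoader(Array[URL](classesDir.toURI.toURL), getClass.getClassLoader)
    val clazz  = loader.loadClass(className)
    clazz.getDeclaredConstructor().newInstance()
  }
}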
But, depending on what you want to achieve, you might like to look at Scala Macros instead. They provide the same kind of flexibility as you would get by generating source code and then compiling it, but without many of the downsides of that approach. The original version of ScalaMock used to work by generating source code, but I'm in the process of moving to using macros instead.
It's all possible in Scala, as is clearly demonstrated by the REPL. It's even going to be relatively easy with Scala 2.10.

On the use of Internal`Bag, and any official documentation?

(Mathematica version: 8.0.4)
lst = Names["Internal`*"];
Length[lst]
Pick[lst, StringMatchQ[lst, "*Bag*"]]
gives
293
{"Internal`Bag", "Internal`BagLength", "Internal`BagPart", "Internal`StuffBag"}
The Mathematica GuideBook for Programming by Michael Trott, page 494, says of the Internal` context:
"But similar to Experimental` context, no guarantee exists that the behavior and syntax of the functions will still be available in later versions of Mathematica"
Also, here is a mention of Bag functions:
Implementing a Quadtree in Mathematica
But since I've seen a number of Mathematica experts here suggest Internal`Bag functions and use them themselves, I assume it is reasonably safe to use them in actual code. If so, I have the following question:
Where can I find a more official description of these functions (the API, etc.) like one finds in the Documentation Center? Right now there is nothing about them:
??Internal`Bag
Internal`Bag
Attributes[Internal`Bag]={Protected}
If I am to start using them, I find it hard to learn about new functions just by looking at a few examples and using trial and error to see what they do. I wonder if someone here has a more complete and self-contained document on their use, describing the API and so on beyond what is already out there, or a link to such a place.
The Internal` context is exactly what its name says: meant for internal use by Wolfram developers.
This means, among other things, that the following hold for anything you might find in there:
You most likely won't be able to find any official documentation on it, as it's not meant to be used by the public.
It's not necessarily as robust about invalid arguments. (Crashing the kernel can easily happen on some of them.)
The API may change without notice.
The function may disappear completely without notice.
Now, in practice some of them may be reasonably stable, but I would strongly advise you to steer away from them. Using undocumented APIs can easily leave you in for a lot of pain and a nasty surprise in the future.
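With that caveat in place, the usage pattern that shows up in community examples (such as the quadtree answer linked above) looks roughly like this; since the functions are undocumented, this is inferred behavior, not an official API:
bag = Internal`Bag[];                      (* an empty, efficiently growable container *)
Do[Internal`StuffBag[bag, i^2], {i, 5}]    (* append elements in place *)
Internal`BagLength[bag]                    (* 5 *)
Internal`BagPart[bag, 3]                   (* 9 *)
Internal`BagPart[bag, All]                 (* {1, 4, 9, 16, 25} *)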

Is there an easy way to convert HTTP_ACCEPT_LANGUAGE to Oracle NLS_LANG settings?

When adding internationalisation capabilities to an Oracle web application (built on mod_plsql), I'd like to interpret the HTTP_ACCEPT_LANGUAGE header and use it to set various NLS_* settings in the Oracle session.
For example:
HTTP_ACCEPT_LANGUAGE=de
alter session set nls_territory=germany;
alter session set nls_lang=...
However, you could get something more complicated I suppose...
HTTP_ACCEPT_LANGUAGE=en-us,en;q=0.5
How have folks tackled this sort of thing before?
EDIT - following on from Curt's detailed answer below
Thanks for the clear and detailed reply, Curt. I didn't really make myself clear though, as I was really asking whether there are any existing Oracle widgets that handle this.
I'm already down the road of manually parsing the HTTP_ACCEPT_LANGUAGE variable and - as Curt indicated in his answer - there are a few subtle areas of complexity. It feels like something that must have been done many times before. As I wrote more and more code I had that sinking "I'm reinventing the wheel" feeling. :)
There must be an existing Oracle approach for this - probably something in iAS??
EDIT - stumbled across the answer
While looking for something else, I stumbled across the UTL_I18N package, which does exactly what I'm after (see my answer below).
Sure, and it's not too tough, if you break up the problem properly and don't get too ambitious at first.
You need, essentially, two functions: one to parse the HTTP_ACCEPT_LANGUAGE and produce a language code, and one to take that and generate the appropriate set commands.
The former can get pretty sophisticated: if you're given only 'en', you probably want to generate 'en-us'; you need to deal with choosing among multiple options when nothing matches perfectly; you need to deal with malformed header values; and so on. Don't try to tackle this all at once: just do something very simple at first, and extend it later.
The same more or less goes for the other half of it, generating the set commands, but this is pretty simple in and of itself anyway; it's really just a lookup function, though it may get a bit more sophisticated depending on what is provided to it.
What will really make or break your programming experience on something like this is your unit tests. This is an ideal problem for unit testing and test-driven development. Unit tests will make sure that when you change things, old functionality keeps working, and make it easier to add new functionality and fix bugs, because you just add another test and you have that to guide you from that point on. (You'll also find it easier to do a complete rewrite of one of the functions if you find out you've gone terribly wrong at some point, because you can easily confirm that the new version isn't breaking anything.)
How you do unit testing in your environment is probably a bit beyond the scope of this question, but let me add a few hints. First, if there's a unit test framework ("pl-sql-unit?") available for your environment, that's great. If not, don't panic. You don't need anything sophisticated: just a set of inputs and expected outputs, and a way to run them through the function and either say "all OK!" or show any incorrect results. You can probably write a single, simple PL/SQL function that reads the inputs and expected outputs from a table and does this for you.
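As a sketch of that home-grown harness (the table lang_tests, its columns, and the function parse_accept_language are assumptions made up for illustration):
declare
  v_failures pls_integer := 0;
begin
  for t in (select accept_header, expected_lang from lang_tests) loop
    if parse_accept_language(t.accept_header) <> t.expected_lang then
      v_failures := v_failures + 1;
      dbms_output.put_line('FAIL: ' || t.accept_header || ' expected ' || t.expected_lang);
    end if;
  end loop;
  dbms_output.put_line(case when v_failures = 0 then 'all OK!' else v_failures || ' failure(s)' end);
end;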
Finally stumbled across the answer. The Oracle package UTL_I18N contains functions to map from the browser language codes to Oracle NLS settings:
utl_i18n.map_language_from_iso;
utl_i18n.map_territory_from_iso;
The mapping doesn't seem to cope very well with multi-language settings, e.g. en-us,en;q=0.5, but as long as you just use the first 5 characters the functions seem to work ok.
HTTP_ACCEPT_LANGUAGE: ar-lb,en-gb;q=0.5
v_language:
v_territory:
HTTP_ACCEPT_LANGUAGE: ar-lb
v_language: ARABIC
v_territory: LEBANON
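Putting that together, a rough sketch of how it could look inside a mod_plsql procedure (full multi-value header parsing and error handling are omitted; only the first 5 characters of the header are used, as noted above):
declare
  v_accept    varchar2(4000) := owa_util.get_cgi_env('HTTP_ACCEPT_LANGUAGE');
  v_iso       varchar2(5)    := substr(v_accept, 1, 5);   -- e.g. 'en-us' or 'ar-lb'
  v_language  varchar2(64)   := utl_i18n.map_language_from_iso(v_iso);
  v_territory varchar2(64)   := utl_i18n.map_territory_from_iso(v_iso);
begin
  if v_language is not null then
    execute immediate 'alter session set nls_language = ''' || v_language || '''';
  end if;
  if v_territory is not null then
    execute immediate 'alter session set nls_territory = ''' || v_territory || '''';
  end if;
end;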
