I have a bunch of user-defined functions that I use frequently in Mathematica. I wonder if I can store them in separate files and have Mathematica load them on startup and treat them as built-in functions, so that I don't have to repeat the definitions whenever I create a new .nb file. Something similar to functions in MATLAB...
Thanks!
You can create a package in $UserBaseDirectory/Autoload. This will be loaded at Kernel initialization time.
Your package should have a Kernel/init.m file, i.e.:
MyPackage/Kernel/init.m
Reference documentation on Mathematica packages:
http://reference.wolfram.com/mathematica/tutorial/SettingUpMathematicaPackages.html
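For illustration, a minimal package might look like this (the package and function names are invented for this example):

```mathematica
(* File: $UserBaseDirectory/Autoload/MyPackage/Kernel/init.m *)
BeginPackage["MyPackage`"]

MySquare::usage = "MySquare[x] returns x^2."

Begin["`Private`"]
MySquare[x_] := x^2
End[]

EndPackage[]
```

After restarting the kernel, MySquare is then available in every notebook, just like a built-in function.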
DeclarePackage[] is a lazy-loading mechanism for symbols and their definitions. The associated package is loaded only when one of its symbols is first used:
http://reference.wolfram.com/mathematica/ref/DeclarePackage.html
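For example, assuming a package MyPackage` that exports a function MySquare (both names invented for this sketch), you could put this in your init.m so the package file is only read when MySquare is first evaluated:

```mathematica
(* Load MyPackage` on first use of MySquare; names are illustrative *)
DeclarePackage["MyPackage`", {"MySquare"}]
```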
I'm a little hazy on the differences between (use) and (import) in Chicken. Similarly, how do (load), (require) and (require-extension) differ?
These things don't seem to be mentioned much on the website.
Load and require are purely run-time, procedural actions. Load accepts a string argument and loads the file with that name (which can be source or compiled code) into the running Scheme, so that whatever it defines becomes available. Require does the same thing, but checks if the file has already been loaded by seeing if provide has been called on the same name (typically by the file as it is loaded). They are relatively rare in Scheme programming, corresponding to plug-ins in other languages, where code unknown at compile time needs to be loaded. See the manual page for unit eval for more details.
Import is concerned with modules rather than files. It looks for the named module, which must have already been loaded (but see below for Chicken 5), and makes the names exported from that module visible in the current context. In order to successfully import a module, there must be an import library for it. It is syntax, so the module name must appear explicitly in the call and cannot be computed at run time. See the manual page on modules for more details.
Require-library does the Right Thing to load code. If the code is already part of the running Scheme, either because it is built into Chicken or because it has already been loaded, it does nothing. Otherwise it will load a core library unit if there is one, or will call require as a last resort. At compile time, it does analogous things to make sure that the environment will be correct at run time. See the "Non-standard macros and special forms" page in the manual for more details.
Use does a require-library followed by an import on the same name. It is the most common way to add functionality to your Chicken program. However, we write (import scheme) and (import chicken) because the functionality of these modules is already loaded. Require-extension is an exact synonym for use, provided for SRFI 55 compatibility. In R7RS mode, import is also a synonym for use.
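Under Chicken 4, the relationship can be sketched like this (assuming the srfi-1 egg is installed):

```scheme
(use srfi-1)              ; most common: load the extension and import its names

;; (use srfi-1) is equivalent to:
(require-library srfi-1)  ; load the code, unless it is already present
(import srfi-1)           ; make the exported names visible here

(first '(1 2 3))          ; => 1
```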
Update for Chicken 5
Use has been removed from the language, and import now does what use used to do: it loads (if necessary) and then imports. Require-extension is consequently now a synonym for import.
In addition, Chicken-specific procedures and macros have been broken into modules with names like (chicken base) and (chicken bitwise).
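So a Chicken 5 program simply writes, for example:

```scheme
;; Chicken 5: import loads the library (if necessary) and imports its names
(import (chicken bitwise))

(bitwise-and 12 10)       ; => 8
```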
I have some question regarding dynamic initialization (i.e. constructors before main) and DLL link ordering - for both Windows and POSIX.
To make it easier to talk about, I'll define a couple terms:
Load-Time Libraries: libraries which have been "linked" at compile time such that, when the system loads my application, they get loaded in automatically (i.e. ones put in CMake's target_link_libraries command).
Run-Time Libraries: libraries which I load manually by dlopen or equivalents. For the purposes of this discussion, I'll say that I only ever manually load libraries using dlopen in main, so this should simplify things.
Dynamic Initialization: if you're not familiar with the C++ spec's definition of this, please don't try to answer this question.
Ok, so let's say I have an application (MyAwesomeApp) and it links against a dynamic library (MyLib1), which in turn links against another library (MyLib2). So the dependency tree is:
MyAwesomeApp -> MyLib1 -> MyLib2
For this example, let's say MyLib1 and MyLib2 are both Load-Time Libraries.
What's the initialization order of the above? It is obvious that all static initialization, including linking of exported/imported functions (Windows-only), will occur first... But what happens with Dynamic Initialization? I'd expect the overall ordering:
ALL import/export symbol linking
ALL Static Initialization
ALL of MyLib2's Dynamic Initialization
ALL of MyLib1's Dynamic Initialization
ALL of MyAwesomeApp's Dynamic Initialization
MyAwesomeApp's main() function
But I can't find anything in the specs that mandates this. I DID see something in the ELF spec that hinted at it, but I need guarantees in a spec in order to do what I'm trying to do.
Just to make sure my thinking is clear: I'd expect library loading to work very much like import in Python, in that if a library hasn't been loaded yet, it'll be loaded fully (including any initialization) before I do anything... and if it has been loaded, then I'll just link to it.
To give a more complex example, to make sure there isn't another reading of my first example that yields a different answer:
MyAwesomeApp depends on MyLib1 & MyLib2
MyLib1 depends on MyLib2
I'd expect the following initialization:
ALL import/export symbol linking
ALL Static Initialization
ALL of MyLib2's Dynamic Initialization
ALL of MyLib1's Dynamic Initialization
ALL of MyAwesomeApp's Dynamic Initialization
MyAwesomeApp's main() function
I'd love any help pointing out specs that say this is how it is. Or, if this is wrong, any spec saying what REALLY happens!
Thanks in advance!
-Christopher
Nothing in the C++ standard mandates how dynamic linking works.
Having said that, Visual Studio ships with the C Runtime (aka CRT) source, and you can see where static initializers get run in dllcrt0.c.
You can also derive the relative ordering of operations if you think about what constraints need to be satisfied to run each stage:
Import/export resolution just needs .dlls.
Static initialization just needs .dlls.
Dynamic initialization requires all imports to be resolved for the .dll.
Steps 1 & 2 do not depend on each other, so they can happen independently. Step 3 requires 1 & 2 for each .dll, so it has to happen after both.
So any specific loading order that satisfies the constraints above will be a valid loading order.
In other words, if you need to care about the specific ordering of specific steps, you probably are doing something dangerous that relies on implementation specific details that will not be preserved across major or minor revisions of the OS. For example, the way the loader lock works for .dlls has changed significantly over the various releases of Windows.
gperftools documentation says that libprofiler should be linked into a target program:
$ gcc myprogram.c -lprofiler
(without changing the program's code).
And then program should be run with a specific environment variable:
CPUPROFILE=/tmp/profiler_output ./a.out
The question is: how does libprofiler get a chance to start and stop the profiler when it is merely loaded, but its functions are never called?
There is no constructor function in that library (proof).
None of the occurrences of "CPUPROFILE" in the library code refer to a place where the profiler is started.
I am out of ideas about where to look next.
As per the documentation on the linked webpage, under "Linking in the library", the -lprofiler step is equivalent to pre-loading the shared object file with LD_PRELOAD.
The shared object file isn't the same as just the header file. The header file contains function declarations, which are looked up when compiling a program so that the names of the functions resolve; but the names are just names, not implementations. The shared object file (.so) contains the implementations of the functions. For more information, see the following StackOverflow answer.
The source file /trunk/src/profiler.cc has a CpuProfiler constructor on Line 182 that checks whether profiling should be enabled based on the CPUPROFILE environment variable (Line 187 and Line 230).
It then calls the Start function on Line 237. As per the comments in this file, the destructor calls the Stop function on Line 273.
To answer your question: I believe Line 132, CpuProfiler CpuProfiler::instance_;, is the line where the CpuProfiler is instantiated.
This lack of clarity in the gperftools documentation is a known issue; see here.
I think the profiler gets initialized by the REGISTER_MODULE_INITIALIZER macro seen at the bottom of profile-handler.cc (as well as in heap-checker.cc, heap-profiler.cc, etc.). The macro, defined in src/base/googleinit.h, declares a dummy static object whose constructor is called when the library is loaded. That dummy constructor then calls ProfileHandlerRegisterThread(), which uses a pthread_once variable to initialize the singleton object (ProfileHandler::instance_).
The REGISTER_MODULE_INITIALIZER macro plays a role similar to the module_init()/module_exit() functions seen in Linux loadable kernel modules.
(my answer is based on the 2.0 version of gperftools)
How can I include the procedures from one NetLogo file in another? Basically, I want to separate the code of a genetic algorithm from my (quite complicated) fitness function, but, obviously, I want the fitness reporter, which will reside in "fitness.nlogo", to be available in the genetic algorithm code, probably "genetic.nlogo".
If it can be done, how are the procedures imported, and how is the code executed? Is it like Python, where importing a module pretty much executes everything in it, or like C/C++, where the file is blindly "joined"?
This may be a stupid question, but I couldn't find anything on Google. The NetLogo documentation says something about __includes, an experimental keyword that may do the trick, but there's not much explanation there. No example either.
Any hints? Should I go with __includes? How does it work?
To include a file you use
__includes["libfile.nls"]
After adding this and pressing the “Check” button, a new button will appear next to the Procedures drop-down menu. There you can create and manage multiple source files.
The libfile.nls file is just a text file that contains NetLogo code. It is not a NetLogo model, which always ends in .nlogo; a NetLogo model contains a lot of other information besides the NetLogo code.
Including a file is equivalent to inserting its entire contents at that point. To make included files work like reusable libraries, write procedures that take agentsets and parameters as inputs, so that they are independent of global definitions and interface settings.
The feature is documented in the NetLogo User Manual at http://ccl.northwestern.edu/netlogo/docs/programming.html#includes.
You can create a file libfile.nls and in the same folder create your main model model.nlogo.
After that, go to your model.nlogo and write:
__includes["libfile.nls"]
This file contains your reporters and procedures that you can call in your model.
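As a sketch (the file, reporter, and names below are invented for illustration), the library file might hold a fitness reporter that takes everything it needs as arguments:

```netlogo
;; libfile.nls -- plain NetLogo code only, no interface or model metadata
to-report fitness [t]                  ; t: the turtle to evaluate
  report [xcor ^ 2 + ycor ^ 2] of t   ; no reliance on globals or the interface
end
```

and the model's Code tab would contain:

```netlogo
;; model.nlogo (Code tab)
__includes ["libfile.nls"]

to setup
  clear-all
  create-turtles 10
  ask turtles [ show fitness self ]   ; call the included reporter directly
end
```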
Beyond creating a DLL with all the same functions, the same interface, and the same calling conventions, does the replacement DLL need to exactly duplicate the original's export map, including ordinal numbers, so that not only explicit loading via GetProcAddress works, but implicit linking as well?
(edit: this is an unmanaged C/C++ Windows DLL I'm talking about, not .NET)
You will need to mimic every export that any other client is using; you don't need to mimic "dead" exports that no one uses. You need to keep the ordinals only if clients link by ordinal instead of by export name (which is quite rare).
There is one more thing to keep in mind: if the DLL contains C++ classes and is not using extern "C", then you need to maintain binary compatibility, meaning the classes in the replacement DLL need to have the same fields in the same order as the original classes. If you're using interfaces, you need to keep the vtable layout the same, with the same arguments for each method.
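If a client does link by ordinal, a module-definition (.def) file lets the replacement DLL pin its exports to the same ordinals the original used (the names and ordinals below are illustrative):

```
; replacement.def -- keep ordinals identical to the original DLL's export map
LIBRARY MyReplacement
EXPORTS
    DoWork      @1
    DoMoreWork  @2
```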