How to compile a function that computes the Hessian?

I am looking to see how a function that computes the Hessian of a log-likelihood can be compiled, so that it can be efficiently used with different sets of parameters.
Here is an example.
Suppose we have a function that computes the log-likelihood of a logit model, where y is a vector, x is a matrix, and beta is a vector of parameters.
pLike[y_, x_, beta_] :=
Module[
{xbeta, logDen},
xbeta = x.beta;
logDen = Log[1.0 + Exp[xbeta]];
Total[y*xbeta - logDen]
]
Given the following data, we can use it as follows
In[1]:= beta = {0.5, -1.0, 1.0};
In[2]:= xmat =
Table[Flatten[{1,
RandomVariate[NormalDistribution[0.0, 1.0], {2}]}], {500}];
In[3]:= xbeta = xmat.beta;
In[4]:= prob = Exp[xbeta]/(1.0 + Exp[xbeta]);
In[5]:= y = Map[RandomVariate[BernoulliDistribution[#]] &, prob] ;
In[6]:= Tally[y]
Out[6]= {{1, 313}, {0, 187}}
In[9]:= pLike[y, xmat, beta]
Out[9]= -272.721
We can write its Hessian as follows:
hessian[y_, x_, z_] :=
Module[{},
D[pLike[y, x, z], {z, 2}]
]
In[10]:= z = {z1, z2, z3}
Out[10]= {z1, z2, z3}
In[11]:= AbsoluteTiming[hess = hessian[y, xmat, z];]
Out[11]= {0.1248040, Null}
In[12]:= AbsoluteTiming[
Table[hess /. {z1 -> 0.0, z2 -> -0.5, z3 -> 0.8}, {100}];]
Out[12]= {14.3524600, Null}
For efficiency reasons, I can compile the original likelihood function as follows
pLikeC = Compile[{{y, _Real, 1}, {x, _Real, 2}, {beta, _Real, 1}},
Module[
{xbeta, logDen},
xbeta = x.beta;
logDen = Log[1.0 + Exp[xbeta]];
Total[y*xbeta - logDen]
],
CompilationTarget -> "C", Parallelization -> True,
RuntimeAttributes -> {Listable}
];
which yields the same answer as pLike
In[10]:= pLikeC[y, xmat, beta]
Out[10]= -272.721
I am looking for an easy way to similarly obtain a compiled version of the Hessian function, given my interest in evaluating it many times.

Leonid already beat me, but I'll post my line of thought anyway just for laughs.
The main problem here is that compilation works for numerical functions, whereas D needs symbolics. So the trick would be to first define the pLike function with the same number of variables as needed for the particular size of matrices you intend to use, e.g.,
pLike[{y1, y2}, {{x1, x2, x3}, {x12, x22, x32}}, {z1, z2, z3}]
The Hessian:
D[pLike[{y1, y2}, {{x1, x2, x3}, {x12, x22, x32}}, {z1, z2, z3}], {{z1, z2, z3}, 2}]
This function should be compilable as it depends on numerical quantities only.
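To make this concrete, here is a minimal sketch for the 2-observation, 3-parameter toy case above (hessSmallC is just an illustrative name; all arguments are scalars, and the symbols are assumed to be free):
(* sketch: build the fixed-size symbolic Hessian, then compile it over scalar arguments *)
hessSym = D[pLike[{y1, y2}, {{x1, x2, x3}, {x12, x22, x32}}, {z1, z2, z3}],
   {{z1, z2, z3}, 2}];
hessSmallC = Compile[{y1, y2, x1, x2, x3, x12, x22, x32, z1, z2, z3},
   Evaluate[hessSym]];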
To generalize for various vectors one could build something like this:
Block[{ny = 2, nx = 3, z1, z2, z3},
hessian[
Table[ToExpression["y" <> ToString[i] <> "_"], {i, ny}],
Table[ToExpression["xr" <> ToString[i] <> "c" <> ToString[j] <> "_"],
{i, ny}, {j, nx}], {z1_, z2_, z3_}
] =
D[
pLike[
Table[ToExpression["y" <> ToString[i]], {i, ny}],
Table[ToExpression["xr" <> ToString[i] <> "c" <> ToString[j]],
{i, ny}, {j, nx}], {z1, z2, z3}
],
{{z1, z2, z3}, 2}
]
]
And this can of course be easily generalized for variable nx and ny.
And now for the Compile part. It's an ugly piece of code, consisting of the above plus a Compile, made suitable for a variable y size. I like ruebenko's code more than mine.
ClearAll[hessianCompiled];
Block[{z1, z2, z3},
hessianCompiled[yd_] :=
(hessian[
Table[ToExpression["y" <> ToString[i] <> "_"], {i, yd}],
Table[ToExpression["xr" <> ToString[i]<>"c"<>ToString[j] <>"_"],{i,yd},{j,3}],
{z1_, z2_, z3_}
] =
D[
pLike[
Table[ToExpression["y" <> ToString[i]], {i, yd}],
Table[ToExpression["xr" <> ToString[i] <> "c" <> ToString[j]], {i,yd},{j,3}],
{z1, z2, z3}
], {{z1, z2, z3}, 2}
];
Compile[{{y, _Real, 1}, {x, _Real, 2}, {z, _Real, 1}},
hessian[Table[y[[i]], {i, yd}], Table[x[[i, j]], {i, yd}, {j, 3}],
Table[z[[i]], {i, 3}]] // Evaluate] // Quiet
)
]
hessianCompiled[500][y, xmat, beta] // Timing
{1.497, {{-90.19295669, -15.80180276, 6.448357845},
{-15.80180276, -80.41058154, -26.33982586},
{6.448357845, -26.33982586, -72.92978931}}}
ruebenko's version (including my edits):
(cf = mkCHessian[500, 3]; cf[y, xmat, beta]) // Timing
{1.029, {{-90.19295669, -15.80180276, 6.448357845},
{-15.80180276, -80.41058154, -26.33982586},
{6.448357845, -26.33982586, -72.92978931}}}
Note that both tests include compilation time. Running the calculation on its own:
h = hessianCompiled[500];
Do[h[y, xmat, beta], {100}]; // Timing
Do[cf[y, xmat, beta], {100}]; // Timing
(* timing for 100 hessians:
==> {0.063, Null}
==> {0.062, Null}
*)

Here is an idea based on the previous post(s): We construct the input to Compile symbolically.
mkCHessian[{y_, ys_Integer}, {x_, xs_Integer}, {beta_, bs_Integer}] :=
With[{
args = MapThread[{#1, _Real, #2} &, {{y, x, beta}, {1, 2, 1}}],
yi = Quiet[Part[y, #] & /@ Range[ys]],
xi = Quiet[Table[Part[x, i, j], {i, xs}, {j, xs}]],
betai = Quiet[Part[beta, #] & /@ Range[bs]]
},
Print[args];
Print[yi];
Print[xi];
Print[betai];
Compile[Evaluate[args],
Evaluate[D[pLike[yi, xi, betai], {betai, 2}]]]
]
And then generate the compiled function.
cf = mkCHessian[{y, 3}, {x, 3}, {beta, 3}];
You then call that compiled function
cf[y, xmat, beta]
Please verify that I did not make a mistake; in de Vries's post y is of length 2, while mine is length 3, and I am not sure which is correct. And of course, the Prints are just for illustration...
Update
A version with slightly improved dimension handling and with variables localized:
ClearAll[mkCHessian];
mkCHessian[ys_Integer, bs_Integer] :=
Module[
{beta, x, y, args, xi, yi, betai},
args = MapThread[{#1, _Real, #2} &, {{y, x, beta}, {1, 2, 1}}];
yi = Quiet[Part[y, #] & /@ Range[ys]];
xi = Quiet[Table[Part[x, i, j], {i, ys}, {j, bs}]];
betai = Quiet[Part[beta, #] & /@ Range[bs]];
Compile[Evaluate[args], Evaluate[D[pLike[yi, xi, betai], {betai, 2}]]]
]
Now, with asim's definitions in In[1] to In[5]:
cf = mkCHessian[500, 3];
cf[y, xmat, beta]
(* ==> {{-8.852446923, -1.003365612, 1.66653381},
{-1.003365612, -5.799363241, -1.277665283},
{1.66653381, -1.277665283, -7.676551252}} *)
Since y is a random vector, results will vary.
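As a quick sanity check (a sketch; it assumes the symbolic hess and the values of y, xmat, and beta from the question are still in the session), the compiled Hessian should agree with the symbolic one up to machine rounding:
Max[Abs[cf[y, xmat, beta] - (hess /. Thread[{z1, z2, z3} -> beta])]]
(* should be on the order of machine precision *)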

Related

Generating the Sierpinski triangle iteratively in Mathematica?

I have written code which draws the Sierpinski fractal. It is really slow since it uses recursion. Do any of you know how I could write the same code without recursion in order for it to be quicker? Here is my code:
midpoint[p1_, p2_] := Mean[{p1, p2}]
trianglesurface[A_, B_, C_] := Graphics[Polygon[{A, B, C}]]
sierpinski[A_, B_, C_, 0] := trianglesurface[A, B, C]
sierpinski[A_, B_, C_, n_Integer] :=
Show[
sierpinski[A, midpoint[A, B], midpoint[C, A], n - 1],
sierpinski[B, midpoint[A, B], midpoint[B, C], n - 1],
sierpinski[C, midpoint[C, A], midpoint[C, B], n - 1]
]
edit:
I have written it with the Chaos Game approach in case someone is interested. Thank you for your great answers!
Here is the code:
random[A_, B_, C_] := Module[{a, result},
a = RandomInteger[2];
Which[a == 0, result = A,
a == 1, result = B,
a == 2, result = C]]
Chaos[A_List, B_List, C_List, S_List, n_Integer] :=
Module[{list},
list = NestList[Mean[{random[A, B, C], #}] &,
Mean[{random[A, B, C], S}], n];
ListPlot[list, Axes -> False, PlotStyle -> PointSize[0.001]]]
This uses Scale and Translate in combination with Nest to create the list of triangles.
Manipulate[
Graphics[{Nest[
Translate[Scale[#, 1/2, {0, 0}], pts/2] &, {Polygon[pts]}, depth]},
PlotRange -> {{0, 1}, {0, 1}}, PlotRangePadding -> .2],
{{pts, {{0, 0}, {1, 0}, {1/2, 1}}}, Locator},
{{depth, 4}, Range[7]}]
If you would like a high-quality approximation of the Sierpinski triangle, you can use an approach called the chaos game. The idea is as follows - pick three points that you wish to define as the vertices of the Sierpinski triangle and choose one of those points randomly. Then, repeat the following procedure as long as you'd like:
Choose a random vertex of the triangle.
Move from the current point to the halfway point between its current location and that vertex of the triangle.
Plot a pixel at that point.
As you can see in this animation, this procedure will eventually trace out a high-resolution version of the triangle. If you'd like, you can multithread it to have multiple processes plotting pixels at once, which will end up drawing the triangle more quickly.
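A minimal sketch of that procedure in Mathematica (the one-liners further down do the same thing more compactly; the vertices and the number of points are arbitrary):
verts = N[{{0, 0}, {1, 0}, {1/2, Sqrt[3]/2}}];
(* start at a vertex, repeatedly jump halfway towards a randomly chosen vertex *)
chaosPts = NestList[(# + RandomChoice[verts])/2 &, First[verts], 50000];
Graphics[{PointSize[Tiny], Point[chaosPts]}]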
Alternatively, if you just want to translate your recursive code into iterative code, one option would be to use a worklist approach. Maintain a stack (or queue) that contains a collection of records, each of which holds the vertices of the triangle and the number n. Initially put into this worklist the vertices of the main triangle and the fractal depth. Then:
While the worklist is not empty:
Remove the first element from the worklist.
If its n value is not zero:
Draw the triangle connecting the midpoints of the triangle.
For each subtriangle, add that triangle with n-value n - 1 to the worklist.
This essentially simulates the recursion iteratively.
Hope this helps!
You may try
l = {{{{0, 1}, {1, 0}, {0, 0}}, 8}};
g = {};
While [l != {},
k = l[[1, 1]];
n = l[[1, 2]];
l = Rest[l];
If[n != 0,
AppendTo[g, k];
(AppendTo[l, {{#1, Mean[{#1, #2}], Mean[{#1, #3}]}, n - 1}] & @@ #) & /@
NestList[RotateLeft, k, 2]
]]
Show@Graphics[{EdgeForm[Thin], Pink, Polygon@g}]
And then replace the AppendTo by something more efficient. See for example https://mathematica.stackexchange.com/questions/845/internalbag-inside-compile
Edit
Faster:
f[1] = {{{0, 1}, {1, 0}, {0, 0}}, 8};
i = 1;
g = {};
While[i != 0,
k = f[i][[1]];
n = f[i][[2]];
i--;
If[n != 0,
g = Join[g, k];
{f[i + 1], f[i + 2], f[i + 3]} =
({{#1, Mean[{#1, #2}], Mean[{#1, #3}]}, n - 1} & @@ #) & /@
NestList[RotateLeft, k, 2];
i = i + 3
]]
Show@Graphics[{EdgeForm[Thin], Pink, Polygon@g}]
Since the triangle-based functions have already been well covered, here is a raster-based approach.
This iteratively constructs Pascal's triangle, then takes it modulo 2 and plots the result.
NestList[{0, ##} + {##, 0} & @@ # &, {1}, 511] ~Mod~ 2 // ArrayPlot
Clear["`*"];
sierpinski[{a_, b_, c_}] :=
With[{ab = (a + b)/2, bc = (b + c)/2, ca = (a + c)/2},
{{a, ab, ca}, {ab, b, bc}, {ca, bc, c}}];
pts = {{0, 0}, {1, 0}, {1/2, Sqrt[3]/2}} // N;
n = 5;
d = Nest[Join @@ sierpinski /@ # &, {pts}, n]; // AbsoluteTiming
Graphics[{EdgeForm@Black, Polygon@d}]
(*sierpinski=Map[Mean, Tuples[#,2]~Partition~3 ,{2}]&;*)
Here is a 3D version: https://mathematica.stackexchange.com/questions/22256/how-can-i-compile-this-function
ListPlot@NestList[(# + RandomChoice[{{0, 0}, {2, 0}, {1, 2}}])/2 &,
N@{0, 0}, 10^4]
With[{data =
NestList[(# + RandomChoice@{{0, 0}, {1, 0}, {.5, .8}})/2 &,
N@{0, 0}, 10^4]},
Graphics[Point[data,
VertexColors -> ({1, #[[1]], #[[2]]} & /@ Rescale@data)]]
]
With[{v = {{0, 0, 0.6}, {-0.3, -0.5, -0.2}, {-0.3, 0.5, -0.2}, {0.6,
0, -0.2}}},
ListPointPlot3D[
NestList[(# + RandomChoice[v])/2 &, N@{0, 0, 0}, 10^4],
BoxRatios -> 1, ColorFunction -> "Pastel"]
]

How to parallelize integration in Mathematica 8

Does somebody have an idea how to use all cores for calculating the integration? I need to use Parallelize or ParallelTable, but how?
f[r_] := Sum[(((-1)^n*(2*r - 2*n - 7)!!)/(2^n*n!*(r - 2*n - 1)!))*
x^(r - 2*n - 1), {n, 0, r/2}];
Nw := Transpose[Table[f[j], {i, 1}, {j, 5, 200, 1}]];
X1 = Integrate[Nw . Transpose[Nw], {x, -1, 1}];
Y1 = Integrate[D[Nw, {x, 2}] . Transpose[D[Nw, {x, 2}]], {x, -1, 1}];
X1//MatrixForm
Y1//MatrixForm
I changed the integration of a list into a list of integrations so that I can use ParallelTable:
X1par=ParallelTable[Integrate[i, {x, -1, 1}], {i, Nw.Transpose[Nw]}];
X1par==X1
(* ===> True *)
Y1par = ParallelTable[Integrate[i,{x,-1,1}],{i,D[Nw,{x,2}].Transpose[D[Nw,{x,2}]]}]
Y1 == Y1par
(* ===> True *)
In my timings, with {j, 5, 30, 1} instead of {j, 5, 200, 1} to restrict the time used somewhat, this is about 3.4 times faster on my quad-core. But it can be done even faster with:
X2par = Parallelize[Integrate[#, {x, -1, 1}] & /@ (Nw.Transpose[Nw])]
X2par == X1par == X1
(* ===> True *)
This is about 6.8 times faster, a factor of 2.3 of which is due to Parallelize.
Timing and AbsoluteTiming are not very trustworthy where parallel execution is concerned. I used AbsoluteTime before and after each line and took the difference.
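For reference, a sketch of that pattern (shown here for the X2par line above):
t0 = AbsoluteTime[];
X2par = Parallelize[Integrate[#, {x, -1, 1}] & /@ (Nw.Transpose[Nw])];
AbsoluteTime[] - t0  (* wall-clock seconds, including the work done on the parallel kernels *)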
EDIT
We shouldn't forget ParallelMap:
At the coarsest list level (1):
ParallelMap[Integrate[#, {x, -1, 1}] &, Nw.Transpose[Nw], {1}]
At the deepest list level (most fine-grained parallelization):
ParallelMap[Integrate[#, {x, -1, 1}] &, Nw.Transpose[Nw], {2}]
If one helps Integrate a bit by expanding the matrix elements first, things become doable with a little bit of effort. On a quad-core laptop with Windows and Mathematica 8.0.4, the code below runs in about 13 minutes for the requested DIM=200, and in about 6 seconds for DIM=50.
$starttime = AbsoluteTime[]; Quiet[LaunchKernels[]];
DIM = 200;
Print["$Version = ", $Version, " ||| ", "Number of Kernels : ", Length[Kernels[]]];
f[r_] := f[r] = Sum[(((-1)^n*(-(2*n) + 2*r - 7)!!)*x^(-(2*n) + r - 1))/(2^n*n!*(-(2*n) + r - 1)!), {n, 0, r/2}];
Nw = Transpose[Table[f[j], {i, 1}, {j, 5, DIM, 1}]];
nw2 = Nw . Transpose[Nw];
Print["Seconds for expanding Nw.Transpose[Nm] ", Round[First[AbsoluteTiming[nw3 = ParallelMap[Expand, nw2]; ]]]];
Print["do the integral once: ", Integrate[x^n, {x, -1, 1}, Assumptions -> n > -1]];
Print["the integration can be written as a simple rule: ", intrule = (pol_Plus)?(PolynomialQ[#1, x] & ) :>
(Select[pol, !FreeQ[#1, x] & ] /. x^(n_.) /; n > -1 :> ((-1)^n + 1)/(n + 1)) + 2*(pol /. x -> 0)];
Print["Seconds for integrating Nw.Transpose[Nw] : ", Round[First[AbsoluteTiming[X1 = ParallelTable[row /. intrule, {row, nw3}]; ]]]];
Print["expanding: ", Round[First[AbsoluteTiming[preY1 = ParallelMap[Expand, D[Nw, {x, 2}] . Transpose[D[Nw, {x, 2}]]]; ]]]];
Print["Seconds for integrating : ", Round[First[AbsoluteTiming[Y1 = ParallelTable[py /. intrule, {py, preY1}]; ]]]];
Print["X1 = ", (Shallow[#1, {4, 4}] & )[X1]];
Print["Y1 = ", (Shallow[#1, {4, 4}] & )[Y1]];
Print["seq Y1 : ", Simplify[FindSequenceFunction[Diagonal[Y1], n]]];
Print["seq X1 0 : ",Simplify[FindSequenceFunction[Diagonal[X1, 0], n]]];
Print["seq X1 2: ",Simplify[FindSequenceFunction[Diagonal[X1, 2], n]]];
Print["seq X1 4: ",Simplify[FindSequenceFunction[Diagonal[X1, 4], n]]];
Print["overall time needed in seconds: ", Round[AbsoluteTime[] - $starttime]];

Plotting arrows at the edges of a curve

Inspired by this question at ask.sagemath, what is the best way of adding arrows to the end of curves produced by Plot, ContourPlot, etc...? These are the types of plots seen in high school, indicating the curve continues off the end of the page.
After some searching, I could not find a built-in way or up-to-date package to do this. (There is ArrowExtended, but it's quite old).
The solution given in the ask.sagemath question relies on the knowledge of the function and its endpoints and (maybe) the ability to take derivatives. Its translation into Mathematica is
f[x_] := Cos[12 x^2]; xmin = -1; xmax = 1; small = .01;
Plot[f[x],{x,xmin,xmax}, PlotLabel -> y==f[x], AxesLabel->{x,y},
Epilog->{Blue,
Arrow[{{xmin,f[xmin]},{xmin-small,f[xmin-small]}}],
Arrow[{{xmax,f[xmax]},{xmax+small,f[xmax+small]}}]
}]
An alternative method is to simply replace the Line[] objects generated by Plot[] with Arrow[]. For example:
Plot[{x^2, Sin[10 x], UnitStep[x]}, {x, -1, 1},
PlotStyle -> {Red, Green, {Thick, Blue}},
(*AxesStyle -> Arrowheads[.03],*) PlotRange -> All] /.
Line[x__] :> Sequence[Arrowheads[{-.04, .04}], Arrow[x]]
But this has the problem that any discontinuities in the lines generate arrowheads where you don't want them (this can often be fixed by the option Exclusions -> None). More importantly, this approach is hopeless with ContourPlot. E.g., try
ContourPlot[x^2 + y^3 == 1, {x, -2, 2}, {y, -2, 1}] /.
Line[x__] :> Sequence[Arrowheads[{-.04, .04}], Arrow[x]]
(The problems in the above case can be fixed by the rule, e.g., {a___, l1_Line, l2_Line, b___} :> {a, Line[Join[l2[[1]], l1[[1]]]], b}, or by using appropriate single-headed arrows.)
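For concreteness, a sketch of that suggested rule applied to the contour plot above (the Line segments are joined first, then converted to arrows):
ContourPlot[x^2 + y^3 == 1, {x, -2, 2}, {y, -2, 1}] /.
   {a___, l1_Line, l2_Line, b___} :> {a, Line[Join[l2[[1]], l1[[1]]]], b} /.
  Line[x__] :> Sequence[Arrowheads[{-.04, .04}], Arrow[x]]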
As you can see, neither of the above (quick hacks) are particularly robust or flexible. Does anyone know an approach that is?
The following seems to work, by sorting the segments first:
f[x_] := {E^-x^2, Sin[10 x], Sign[x], Tan[x], UnitBox[x],
IntegerPart[x], Gamma[x],
Piecewise[{{x^2, x < 0}, {x, x > 0}}], {x, x^2}};
arrowPlot[f_] :=
Plot[{#}, {x, -2, 2}, Axes -> False, Frame -> True, PlotRangePadding -> .2] /.
{Hue[qq__], a___, x___Line} :> {Hue[qq], a, SortBy[{x}, #[[1, 1, 1]] &]} /.
{a___,{Line[x___], d___, Line[z__]}} :>
List[Arrowheads[{-.06, 0}], a, Arrow[x], {d},
Arrowheads[{0, .06}], Arrow[z]] /.
{a___, {Line[x__]}} :> List[Arrowheads[{-.06, 0.06}], a, Arrow[x]] & /@ f[x];
arrowPlot[f]
Inspired by both Alexey's comment and belisarius's answers, here's my attempt.
makeArrowPlot[g_Graphics, ah_: 0.06, dx_: 1*^-6, dy_: 1*^-6] :=
Module[{pr = PlotRange /. Options[g, PlotRange], gg, lhs, rhs},
gg = g /. GraphicsComplex -> (Normal[GraphicsComplex[##]] &);
lhs := Or @@ Flatten[{Thread[Abs[#[[1, 1, 1]] - pr[[1]]] < dx],
Thread[Abs[#[[1, 1, 2]] - pr[[2]]] < dy]}] &;
rhs := Or @@ Flatten[{Thread[Abs[#[[1, -1, 1]] - pr[[1]]] < dx],
Thread[Abs[#[[1, -1, 2]] - pr[[2]]] < dy]}] &;
gg = gg /. x_Line?(lhs[#] && rhs[#] &) :> {Arrowheads[{-ah, ah}], Arrow @@ x};
gg = gg /. x_Line?lhs :> {Arrowheads[{-ah, 0}], Arrow @@ x};
gg = gg /. x_Line?rhs :> {Arrowheads[{0, ah}], Arrow @@ x};
gg
]
We can test this on some functions
Plot[{x^2, IntegerPart[x], Tan[x]}, {x, -3, 3}, PlotStyle -> Thick]//makeArrowPlot
And on some contour plots
ContourPlot[{x^2 + y^2 == 1, x^2 + y^2 == 6, x^3 + y^3 == {1, -1}},
{x, -2, 2}, {y, -2, 2}] // makeArrowPlot
One place where this fails is where you have horizontal or vertical lines on the edge of the plot;
Plot[IntegerPart[x],{x,-2.5,2.5}]//makeArrowPlot[#,.03]&
This can be fixed by options such as PlotRange->{-2.1,2.1} or Exclusions->None.
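For example, with the first of those options:
Plot[IntegerPart[x], {x, -2.5, 2.5}, PlotRange -> {-2.1, 2.1}] // makeArrowPlot[#, .03] &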
Finally, it would be nice to add an option so that each "curve" can have arrowheads only on its boundaries. This would give plots like those in Belisarius's answer (it would also avoid the problem mentioned above). But this is a matter of taste.
The following construct has the advantage of not messing with the internal structure of the Graphics object, and is more general than the one suggested at ask.sagemath, as it manages PlotRange and infinities better.
f[x_] = Gamma[x]
{plot, evals} =
Reap[Plot[f[x], {x, -2, 2}, Axes -> False, Frame -> True,
PlotRangePadding -> .2, EvaluationMonitor :> Sow[{x, f[x]}]]];
{{minX, maxX}, {minY, maxY}} = Options[plot, PlotRange] /. {_ -> y_} -> y;
ev = Select[evals[[1]], minX <= #[[1]] <= maxX && minY <= #[[2]] <= maxY &];
seq = SortBy[ev, #[[1]] &];
arr = {Arrow[{seq[[2]], seq[[1]]}], Arrow[{seq[[-2]], seq[[-1]]}]};
Show[plot, Graphics[{Red, arr}]]
Edit
As a function:
arrowPlot[f_, interval_] := Module[{plot, evals, within, seq, arr},
within[p_, r_] :=
r[[1, 1]] <= p[[1]] <= r[[1, 2]] &&
r[[2, 1]] <= p[[2]] <= r[[2, 2]];
{plot, evals} = Reap[
Plot[f[x], Evaluate@{x, interval /. List -> Sequence},
Axes -> False,
Frame -> True,
PlotRangePadding -> .2,
EvaluationMonitor :> Sow[{x, f[x]}]]];
seq = SortBy[Select[evals[[1]],
within[#,
Options[plot, PlotRange] /. {_ -> y_} -> y] &], #[[1]] &];
arr = {Arrow[{seq[[2]], seq[[1]]}], Arrow[{seq[[-2]], seq[[-1]]}]};
Show[plot, Graphics[{Red, arr}]]
];
arrowPlot[Gamma, {-3, 4}]
Still thinking about what is better for ListPlot et al.

Mathematica Interpolation[] that remains constant when outside range

I want to "modify" Mathematica's Interpolation[] function (in 1
dimension) by replacing extrapolation with constant values when the
input is out of range.
In other words, if the interpolation domain is [1,20] and f[1]==7 and
f[20]==12, I want:
f[x] = 7 for x<=1
f[x] = 12 for x>=20
f[x] = Interpolation[...]
However, this fails:
(* interpolation w cutoff *)
interpcut[r_] := Module[{s, minpair, maxpair},
(* sort array by x coord *)
s = Sort[r, #1[[1]] < #2[[1]] &];
(* find min x value and corresponding y value *)
minpair = s[[1]];
(* ditto for max x value *)
maxpair = s[[-1]];
(* return the pure function representing cutoff interpolation *)
Piecewise[{
{minpair[[2]] &, #1 < minpair[[1]] &},
{maxpair[[2]] &, #1 > maxpair[[1]] &},
{Interpolation[r], True}
}]]
test = Table[{x,Prime[x]},{x,1,10}]
InputForm[interpcut[test]]
Piecewise[{{minpair$59[[2]] & , #1 < minpair$59[[1]] & },
{maxpair$59[[2]] & , #1 > maxpair$59[[1]] & }},
InterpolatingFunction[{{1, 10}}, {3, 1, 0, {10}, {4}, 0, 0, 0, 0},
{{1, 2, 3, 4, 5, 6, 7, 8, 9, 10}}, {{2}, {3}, {5}, {7}, {11}, {13}, {17},
{19}, {23}, {29}}, {Automatic}]]
I'm sure I'm missing something basic. What?
Function definition
interpcut[r_, x_] :=
Module[{s},(*sort array by x coord*)
s = SortBy[r, First];
Piecewise[
{{First[s][[2]], x < First[s][[1]]},
{Last [s][[2]], x > Last [s][[1]]},
{Interpolation[r][x], True}}]];
Test
test = Table[{x, Prime[x]}, {x, 1, 10}];
f[x_] := interpcut[test, x]
Plot[f[x], {x, -10, 30}]
Edit
Answering your comment about pure functions.
I did it that way just for clarity, not for cheating. For using pure functions just "follow the recipe":
interpcut[r_] := Module[{s},
s = SortBy[r, First];
Function[Piecewise[
{{First[s][[2]], # < First[s][[1]]},
{Last [s][[2]], # > Last [s][[1]]},
{Interpolation[r][#], True}}]]
]
test = Table[{x, Prime[x]}, {x, 1, 10}];
f = interpcut[test] // InputForm
Plot[interpcut[test][x], {x, -10, 30}]
Let me add an update to this old thread. Since V9 you can use the native (but still experimental) "ExtrapolationHandler" parameter:
test = Table[{x, Prime[x]}, {x, 1, 10}];
g = Interpolation[test, "ExtrapolationHandler" ->
{If[# <= test[[1, 1]], test[[1, 2]], test[[-1, 2]]] &,
"WarningMessage" -> False}];
Plot[g[x], {x, -10, 30}]
Here's a possible alternative to belisarius's answer:
interpcut[r_] := Module[{s}, s = SortBy[r, First];
Composition[Interpolation[r], Clip[#, Map[First, Through[{First, Last}[s]]]] &]]
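Usage then mirrors the earlier examples (a quick sketch):
test = Table[{x, Prime[x]}, {x, 1, 10}];
f = interpcut[test];
Plot[f[x], {x, -10, 30}]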

In Mathematica, how to make the initial condition a variable in NDSolve?

I'd like to have something like this:
w[w1_] :=
NDSolve[{y''[x] + y[x] == 2, y[0] == w1, y'[0] == 0}, y, {x, 0, 30}]
This seems like it works better, but I think I'm missing something:
w := NDSolve[{y''[x] + y[x] == 2, y[0] == w1, y'[0] == 0},
y, {x, 0, 30}]
w2 = Table[y[x] /. w, {w1, 0.0, 1.0, 0.5}]
because when I try to make a table, it doesn't work:
Table[Evaluate[y[x] /. w2], {x, 10, 30, 10}]
I get an error:
ReplaceAll::reps: {<<1>>[x]} is neither a list of replacement rules nor a valid dispatch table, and so cannot be used for replacing. >>
PS: Is there a better place to ask questions like this? Mathematica doesn't have supported forums and only has the MathGroup e-mail list. It would be nice if Stack Overflow had more specific Mathematica tags like simplify, ndsolve, and plot manipulation.
There are a lot of ways to do that. One is:
w[w1_] := NDSolve[{y''[x] + y[x] == 2,
    y[0] == w1, y'[0] == 0},
   y[x], {x, 0, 30}];
Table[Table[{w1,x,y[x] /. w[w1]}, {w1, 0., 1.0, 0.5}]/. x -> u, {u, 10, 30, 10}]
Output:
{{{0., 10, {3.67814}}, {0.5, 10, {3.25861}}, {1.,10, {2.83907}}},
{{0., 20, {1.18384}}, {0.5, 20, {1.38788}}, {1.,20, {1.59192}}},
{{0., 30, {1.6915}}, {0.5, 30, {1.76862}}, {1.,30, {1.84575}}}}
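And, closer to the questioner's first attempt, a sketch using y (rather than y[x]) as the second argument of NDSolve, so that the solutions come back as InterpolatingFunctions (wI is just an illustrative name to avoid clobbering the w defined above):
wI[w1_] := NDSolve[{y''[x] + y[x] == 2, y[0] == w1, y'[0] == 0}, y, {x, 0, 30}];
w2 = Table[y /. First[wI[w1]], {w1, 0.0, 1.0, 0.5}];  (* a list of InterpolatingFunctions *)
Table[Table[fun[u], {fun, w2}], {u, 10, 30, 10}]      (* y at x = 10, 20, 30 for each w1 *)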
I see you already chose an answer, but I want to toss up this solution for families of equations anyway. Specifically, this models a slight variation on Lotka-Volterra.
(*Put everything in a module to scope x and y correctly.*)
Module[{x, y},
(*Build a function to wrap NDSolve, and pass it
the initial conditions and range.*)
soln[iCond_, tRange_, scenario_] :=
NDSolve[{
x'[t] == -scenario[[1]] x[t] + scenario[[2]] x[t]*y[t],
y'[t] == (scenario[[3]] - scenario[[4]]*y[t]) -
scenario[[5]] x[t]*y[t],
x[0] == iCond[[1]],
y[0] == iCond[[2]]
},
{x[t], y[t]},
{t, tRange[[1]], tRange[[2]]}
];
(*Build a plot generator*)
GeneratePlot[{iCond_, tRange_, scen_,
window_}] :=
(*Find a way to catch errors and perturb iCond*)
ParametricPlot[
Evaluate[{x[t], y[t]} /. soln[iCond, tRange, scen]],
{t, tRange[[1]], tRange[[2]]},
PlotRange -> window,
PlotStyle -> Thin, LabelStyle -> Medium
];
(*Call the plot generator with different starting conditions*)
graph[scenario_, tRange_, window_, points_] :=
{plots = {};
istep = (window[[1, 2]] - window[[1, 1]])/(points[[1]]+1);
jstep = (window[[2, 2]] - window[[2, 1]])/(points[[2]]+1);
Do[Do[
AppendTo[plots, {{i, j}, tRange, scenario, window}]
, {j, window[[2, 1]] + jstep, window[[2, 2]] - jstep, jstep}
], {i, window[[1, 1]] + istep, window[[1, 2]] - istep, istep}];
Map[GeneratePlot, plots]
}
]
We can then use Animate (or Table, but Animate is awesome):
tRange = {0, 4};
window = {{0, 8}, {0, 6}};
points = {5, 5}
Animate[Show[graph[{3, 1, 8, 2, 0.5},
{0, t}, window, points]], {t, 0.01, 5},
AnimationRunning -> False]
