CATransformLayer and 3D rotation, venetian blind or accordion like, keeping tiles adjacent? - calayer

I have a set of tiles arranged vertically. They are meant to rotate around the X axis. Think of an accordion. I want each tile n+1 to always hang off tile n.
The cyan tile, in the last two screenshots, ought to be attached to the yellow tile. It is not.
I do not understand why. Even less so why, when I rotate row0, the next 5 tiles rotate along with it (which is what I want), but when I try to "correct" starting from some later tile by rotating in the other direction, I end up with a disconnect between the 3rd and 4th tiles.
For reference, here's the setup:
CGRect tileRect = (CGRect) {CGPointZero, {self.bounds.size.width, 30.0f}} ;
CGRect pageRect = tileRect ; pageRect.size.height *= 3 ;
CGPoint baseCenter = (CGPoint) {self.bounds.size.width / 2.0f, 0} ;
CGPoint anchorMidTop = (CGPoint) {0.5f, 0.0f} ;
CGPoint anchorTopLeft = (CGPoint) {0.0f, 0.0f} ;
CGPoint positionNW = (CGPoint) {0.0f, 0.0f} ;
CALayer * (^setupLayerGeometry)(CALayer *, CGRect, CGPoint, CGPoint) =
    ^(CALayer * layer, CGRect b, CGPoint a, CGPoint p) {
        layer.bounds = b ;
        layer.anchorPoint = a ;
        layer.position = p ;
        return layer ;
    } ;
CALayer * (^setupColor)(CALayer *, UIColor *) = ^(CALayer * layer, UIColor * color) {
    layer.backgroundColor = color.CGColor ;
    layer.opacity = 0.850f ;
    return layer ;
} ;
CALayer * (^stdLayer)(UIColor *, CGPoint, CGPoint) =
    ^CALayer * (UIColor * color, CGPoint anchor, CGPoint pos) {
        return setupLayerGeometry(
            setupColor([CALayer layer], color)
            , tileRect
            , anchor
            , pos) ;
    } ;
CATransformLayer * (^transformLayer)(CGRect, CGPoint, CGPoint) =
    ^CATransformLayer * (CGRect bounds, CGPoint anchor, CGPoint pos) {
        return (CATransformLayer *) setupLayerGeometry(
            [CATransformLayer layer]
            , bounds
            , anchor
            , pos) ;
    } ;
self.baseLayer = transformLayer(tileRect, anchorTopLeft, baseCenter) ;
CATransform3D initialTransform = self.baseLayer.sublayerTransform ;
initialTransform.m34 = 1.0f / -200.0f ;
self.baseLayer.sublayerTransform = initialTransform ;
[self.layer addSublayer:self.baseLayer] ;
CALayer * (^wrap0) (CALayer *) = ^CALayer * (CALayer * layer) {
    CALayer * wrap = transformLayer(tileRect, anchorTopLeft, positionNW) ;
    [wrap addSublayer:layer] ;
    return wrap ;
} ;
CALayer * (^wrap)(CALayer *) = wrap0 ;
Now I'm creating six tiles, as plain CALayers.
CALayer * row0 = stdLayer([UIColor redColor], anchorMidTop, (CGPoint) {0, 0}) ;
CALayer * row1 = stdLayer([UIColor blueColor], anchorMidTop, (CGPoint) {0, 30}) ;
CALayer * row2 = stdLayer([UIColor yellowColor], anchorMidTop, (CGPoint) {0, 60}) ;
CALayer * row3 = stdLayer([UIColor cyanColor], anchorMidTop, (CGPoint) {0, 90}) ;
CALayer * row4 = stdLayer([UIColor purpleColor], anchorMidTop, (CGPoint) {0, 120}) ;
CALayer * row5 = stdLayer([UIColor magentaColor], anchorMidTop, (CGPoint) {0, 150}) ;
Now I'm wrapping each tile in a parent CATransformLayer, which I then add as a sublayer of the previous row's wrapper, with the intention of having each CATransformLayer's transform affect all of its sublayers.
[self.baseLayer addSublayer:wrap(row0)] ;
[row0.superlayer addSublayer:wrap(row1)] ;
[row1.superlayer addSublayer:wrap(row2)] ;
[row2.superlayer addSublayer:wrap(row3)] ;
[row3.superlayer addSublayer:wrap(row4)] ;
[row4.superlayer addSublayer:wrap(row5)] ;
CGFloat angle = M_PI / 10.0f ;
row0.superlayer.sublayerTransform = CATransform3DMakeRotation(-angle, 1, 0, 0);
At this point I get this:
And I thought I had nailed it until I replaced the last line with:
row3.superlayer.sublayerTransform = CATransform3DMakeRotation(-angle, 1, 0, 0);
And here's what I get:
If, instead, I replace the above line with:
row0.superlayer.sublayerTransform = CATransform3DMakeRotation(-angle, 1, 0, 0);
row3.superlayer.sublayerTransform = CATransform3DMakeRotation( angle, 1, 0, 0);
I get this:
I have read and re-read Apple's documentation, paying close attention to this:
Anchor Points Affect Geometric Manipulations
Geometry related
manipulations of a layer occur relative to that layer’s anchor point,
which you can access using the layer’s anchorPoint property. The
impact of the anchor point is most noticeable when manipulating the
position or transform properties of the layer. The position property
is always specified relative to the layer’s anchor point, and any
transformations you apply to the layer occur relative to the anchor
point as well.
(emphasis mine)
It is fair to say that I am now utterly confused. Can anyone see/explain what I am doing wrong?
How can I arrange those layers so that my tiles are always adjacent, top to bottom?
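For what it's worth, the rule the docs describe can be written down directly: with no transform applied, a layer's frame origin is its position minus the anchor point scaled by the bounds size. A minimal Python sketch of that rule (not Core Animation code, just the arithmetic), using the row0 values from the hierarchy dump below:

```python
# Core Animation's geometry rule for an untransformed layer:
#   frame.origin = position - anchorPoint * bounds.size
def frame_origin(position, anchor, size):
    return (position[0] - anchor[0] * size[0],
            position[1] - anchor[1] * size[1])

# row0: pos {0, 0}, anchor {0.5, 0}, bounds 200x30
print(frame_origin((0, 0), (0.5, 0), (200, 30)))  # -> (-100.0, 0)
```

This is why every tile's frame starts at x == -100 in the dump: the mid-top anchor shifts the frame left by half the width, while the transform layers (anchor {0, 0}) keep their frames where their positions are.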
Here's a dump of the layer hierarchy:
<CATransformLayer> frame: {{100, 0}, {200, 30}} pos: {100, 0} anchor: {0, 0}
  <CATransformLayer> frame: {{0, 0}, {200, 30}} pos: {0, 0} anchor: {0, 0}
    <CALayer> frame: {{-100, 0}, {200, 30}} pos: {0, 0} anchor: {0.5, 0}
    <CATransformLayer> frame: {{0, 0}, {200, 30}} pos: {0, 0} anchor: {0, 0}
      <CALayer> frame: {{-100, 30}, {200, 30}} pos: {0, 30} anchor: {0.5, 0}
      <CATransformLayer> frame: {{0, 0}, {200, 30}} pos: {0, 0} anchor: {0, 0}
        <CALayer> frame: {{-100, 60}, {200, 30}} pos: {0, 60} anchor: {0.5, 0}
        <CATransformLayer> frame: {{0, 0}, {200, 30}} pos: {0, 0} anchor: {0, 0}
          <CALayer> frame: {{-100, 90}, {200, 30}} pos: {0, 90} anchor: {0.5, 0}
          <CATransformLayer> frame: {{0, 0}, {200, 30}} pos: {0, 0} anchor: {0, 0}
            <CALayer> frame: {{-100, 120}, {200, 30}} pos: {0, 120} anchor: {0.5, 0}
            <CATransformLayer> frame: {{0, 0}, {200, 30}} pos: {0, 0} anchor: {0, 0}
              <CALayer> frame: {{-100, 150}, {200, 30}} pos: {0, 150} anchor: {0.5, 0}

I got it. Finally!
Here's the code:
- (void) reloadData {
for (CALayer * layer in [self.baseLayer.sublayers copy]) {
    [layer removeFromSuperlayer] ;
}
CGRect tileRect = (CGRect) {CGPointZero, {self.bounds.size.width, 30.0f}} ;
CGFloat halfWidth = self.bounds.size.width / 2.0f ;
The helper blocks:
CALayer * (^setupLayerGeometry)(CALayer *, CGRect, CGPoint, CGPoint, NSString *) =
    ^(CALayer * layer, CGRect b, CGPoint a, CGPoint p, NSString * n) {
        layer.bounds = b ;
        layer.anchorPoint = a ;
        layer.position = p ;
        layer.name = n ;
        return layer ;
    } ;
CALayer * (^setupColor)(CALayer *, UIColor *) = ^(CALayer * layer, UIColor * color) {
    layer.backgroundColor = color.CGColor ;
    layer.opacity = 0.850f ;
    return layer ;
} ;
CALayer * (^stdLayer)(UIColor *, CGPoint, CGPoint, NSString *) =
    ^CALayer * (UIColor * color, CGPoint anchor, CGPoint pos, NSString * n) {
        return setupLayerGeometry(
            setupColor([CALayer layer], color)
            , tileRect
            , anchor
            , pos
            , n) ;
    } ;
CATransformLayer * (^transformLayer)(CGRect, CGPoint, CGPoint, NSString *) =
    ^CATransformLayer * (CGRect bounds, CGPoint anchor, CGPoint pos, NSString * n) {
        return (CATransformLayer *) setupLayerGeometry(
            [CATransformLayer layer]
            , bounds
            , anchor
            , pos
            , n) ;
    } ;
Now for the baseLayer, which is the root CATransformLayer:
#define ANC(x,y) (CGPoint) {x, y}
#define POS(x,y) (CGPoint) {x, y}
[self.baseLayer removeFromSuperlayer] ;
self.baseLayer = transformLayer(tileRect, ANC(0, 0), POS(halfWidth, 0), @"baseLayer") ;
Creating each row with a CATransformLayer as its root, which also acts as the superlayer of the next row:
CALayer * row0 = stdLayer([UIColor redColor], ANC(0.0, 0.0), POS(-halfWidth, 0), @"row0") ;
CATransformLayer * trn0 = (CATransformLayer *) transformLayer(tileRect, ANC(0, 0), POS(0, 0), @"trn0") ;
[self.baseLayer addSublayer:trn0] ;
[trn0 addSublayer:row0] ;
CALayer * row1 = stdLayer([UIColor blueColor], ANC(0.0, 0.0), POS(-halfWidth, 0), @"row1") ;
CATransformLayer * trn1 = (CATransformLayer *) transformLayer(tileRect, ANC(0, 0), POS(0, 30), @"trn1") ;
[trn1 addSublayer:row1] ;
[trn0 addSublayer:trn1] ;
CALayer * row2 = stdLayer([UIColor yellowColor], ANC(0.0, 0.0), POS(-halfWidth, 0), @"row2") ;
CATransformLayer * trn2 = (CATransformLayer *) transformLayer(tileRect, ANC(0, 0), POS(0, 30), @"trn2") ;
[trn2 addSublayer:row2] ;
[trn1 addSublayer:trn2] ;
CALayer * row3 = stdLayer([UIColor cyanColor], ANC(0.0, 0.0), POS(-halfWidth, 0), @"row3") ;
CATransformLayer * trn3 = (CATransformLayer *) transformLayer(tileRect, ANC(0, 0), POS(0, 30), @"trn3") ;
[trn3 addSublayer:row3] ;
[trn2 addSublayer:trn3] ;
CALayer * row4 = stdLayer([UIColor purpleColor], ANC(0.0, 0.0), POS(-halfWidth, 0), @"row4") ;
CATransformLayer * trn4 = (CATransformLayer *) transformLayer(tileRect, ANC(0, 0), POS(0, 30), @"trn4") ;
[trn4 addSublayer:row4] ;
[trn3 addSublayer:trn4] ;
CALayer * row5 = stdLayer([UIColor magentaColor], ANC(0.0, 0.0), POS(-halfWidth, 0), @"row5") ;
CATransformLayer * trn5 = (CATransformLayer *) transformLayer(tileRect, ANC(0, 0), POS(0, 30), @"trn5") ;
[trn5 addSublayer:row5] ;
[trn4 addSublayer:trn5] ;
CALayer * row6 = stdLayer([UIColor lightGrayColor], ANC(0.0, 0.0), POS(-halfWidth, 0), @"row6") ;
CATransformLayer * trn6 = (CATransformLayer *) transformLayer(tileRect, ANC(0, 0), POS(0, 30), @"trn6") ;
[trn6 addSublayer:row6] ;
[trn5 addSublayer:trn6] ;
CALayer * row7 = stdLayer([UIColor darkGrayColor], ANC(0.0, 0.0), POS(-halfWidth, 0), @"row7") ;
CATransformLayer * trn7 = (CATransformLayer *) transformLayer(tileRect, ANC(0, 0), POS(0, 30), @"trn7") ;
[trn7 addSublayer:row7] ;
[trn6 addSublayer:trn7] ;
Finally, setting the transforms. Since each row is the parent of the next row (thanks to the embedded CATransformLayer), the angle needed is relative to the immediate parent, and thus independent of the row number at which a particular tile is displayed.
CGFloat angle = M_PI / 10.0f ;
trn0.sublayerTransform = CATransform3DMakeRotation(1.0*angle, 1, 0, 0);
trn1.sublayerTransform = CATransform3DMakeRotation(-1.0*angle, 1, 0, 0);
trn2.sublayerTransform = CATransform3DMakeRotation(-1.0*angle, 1, 0, 0);
trn3.sublayerTransform = CATransform3DMakeRotation(1.0*angle, 1, 0, 0);
trn4.sublayerTransform = CATransform3DMakeRotation(1.0*angle, 1, 0, 0);
trn5.sublayerTransform = CATransform3DMakeRotation(-1.0*angle, 1, 0, 0);
trn6.sublayerTransform = CATransform3DMakeRotation(-1.0*angle, 1, 0, 0);
trn7.sublayerTransform = CATransform3DMakeRotation(2.0*angle, 1, 0, 0);
CATransform3D initialTransform = self.baseLayer.sublayerTransform ;
initialTransform.m34 = 1.0f / -200.0f ;
self.baseLayer.sublayerTransform = initialTransform ;
[self.layer addSublayer:self.baseLayer] ;
#undef ANC
#undef POS
}
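A quick way to see why this hierarchy works: nested rotations about the same axis simply add, so each tile's absolute tilt is the sum of its ancestors' angles. A small Python sketch (a stand-in for the matrix math Core Animation does, not actual CA code) composing the trn0..trn3 coefficients 1, -1, -1, 1 from the code above:

```python
import math

def rot_x(theta):
    # 3x3 rotation matrix about the x axis
    c, s = math.cos(theta), math.sin(theta)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

angle = math.pi / 10
coeffs = [1, -1, -1, 1]          # trn0..trn3 sublayerTransform coefficients
m = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
for c in coeffs:
    m = matmul(m, rot_x(c * angle))

# X-rotations compose by adding angles: 1 - 1 - 1 + 1 = 0,
# so row3's tile ends up parallel to the screen again.
total = math.atan2(m[2][1], m[1][1])
print(round(abs(total), 6))  # -> 0.0
```

Each tile only ever needs the fold angle relative to its neighbor, which is exactly what the accordion requires.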
Now I only have to get rid of those stupid colors, antialias the edges, add some shadowy gradient layer, and deal with the perspective, which is annoying me (the last "gray" row is too tall). That means more tweaking of m34 and/or the baseLayer's anchorPoint/position.
Replacing the angle and coefficients with:
CGFloat angle = M_PI / 3.0f ;
trn0.sublayerTransform = CATransform3DMakeRotation(-1.0*angle, 1, 0, 0);
trn1.sublayerTransform = CATransform3DMakeRotation(2.0*angle, 1, 0, 0);
trn2.sublayerTransform = CATransform3DMakeRotation(-2.0*angle, 1, 0, 0);
trn3.sublayerTransform = CATransform3DMakeRotation(2.0*angle, 1, 0, 0);
trn4.sublayerTransform = CATransform3DMakeRotation(-2.0*angle, 1, 0, 0);
trn5.sublayerTransform = CATransform3DMakeRotation(2.0*angle, 1, 0, 0);
trn6.sublayerTransform = CATransform3DMakeRotation(-2.0*angle, 1, 0, 0);
trn7.sublayerTransform = CATransform3DMakeRotation(2.0*angle, 1, 0, 0);
And the perspective to:
initialTransform.m34 = 1.0f / -400.0f ;
We now get:

Related

VPython Object Revolution

I have to use VPython currently, and I want to make a model of the Solar System.
I currently have all the planets and the orbital rings; however, the actual orbit is what I'm finding very difficult.
GlowScript 2.7 VPython
from visual import *
# Declaring Celestial Body Objects
Sun = sphere(pos = vec(0, 0, 0), radius = 10, color = color.yellow)
Mercury = sphere(pos = vec(25, 0, 0), radius = 2, color = color.green)
Venus = sphere(pos = vec(40, 0, 0), radius = 2.5, color = color.red)
Earth = sphere(pos = vec(50, 0, 0), radius = 2.65, color = color.blue)
Mars = sphere(pos = vec(70, 0, 0), radius = 2.3, color = color.red)
Jupiter = sphere(pos = vec(90, 0, 0), radius = 3, color = color.orange)
Saturn = sphere(pos = vec(105, 0, 0), radius = 2.9, color = color.orange)
Uranus = sphere(pos = vec(117.5, 0, 0), radius = 2.9, color = color.orange)
Neptune = sphere(pos = vec(135, 0, 0), radius = 2.8, color = color.blue)
Pluto = sphere(pos = vec(165, 0, 0), radius = 1.5, color = color.white)
# Declaring Orbital Rings of Celestial Body Objects
Mercury.ring = ring(pos = vec(0, 0, 0), axis = vec(0, 1, 0), size = vec(0.1, Mercury.pos.x * 2, Mercury.pos.x * 2))
Venus.ring = ring(pos = vec(0, 0, 0), axis = vec(0, 1, 0), size = vec(0.1, Venus.pos.x * 2, Venus.pos.x * 2))
Earth.ring = ring(pos = vec(0, 0, 0), axis = vec(0, 1, 0), size = vec(0.1, Earth.pos.x * 2, Earth.pos.x * 2))
Mars.ring = ring(pos = vec(0, 0, 0), axis = vec(0, 1, 0), size = vec(0.1, Mars.pos.x * 2, Mars.pos.x * 2))
Jupiter.ring = ring(pos = vec(0, 0, 0), axis = vec(0, 1, 0), size = vec(0.1, Jupiter.pos.x * 2, Jupiter.pos.x * 2))
Saturn.ring = ring(pos = vec(0, 0, 0), axis = vec(0, 1, 0), size = vec(0.1, Saturn.pos.x * 2, Saturn.pos.x * 2))
Uranus.ring = ring(pos = vec(0, 0, 0), axis = vec(0, 1, 0), size = vec(0.1, Uranus.pos.x * 2, Uranus.pos.x * 2))
Neptune.ring = ring(pos = vec(0, 0, 0), axis = vec(0, 1, 0), size = vec(0.1, Neptune.pos.x * 2, Neptune.pos.x * 2))
Pluto.ring = ring(pos = vec(0, 0, 0), axis = vec(0, 1, 0), size = vec(0.1, Pluto.pos.x * 2, Pluto.pos.x * 2))
# Infinite Loop
while 1 == 1:
Mercury.rotate(angle = radians(360), axis = vec(Mercury.pos.y, Mercury.pos.x, 0), origin = vec(0, 0, 0))
rate(50)
print("Error! Escaped While Loop!")
When I switch out the rotate method with Mercury.rotate(angle = 0.0174533, axis = vec(0, Mercury.pos.x, 0), origin = vec(0, 0, 0)), it properly rotates... yet only for a quarter of the rotation. I've read about everything to do with this, to no avail.
After the quarter revolution, the planet sometimes decides to "seizure" violently when the angle is a larger number. It just seems like a barrier of sorts.
You should write axis = vec(0, 1, 0); the axis of rotation needs to always point straight up. With axis = vec(0, Mercury.pos.x, 0), the axis collapses to the zero vector as the planet crosses x == 0 (which happens after a quarter revolution) and flips direction once x goes negative, which is what stalls the orbit.
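A plain-Python sketch (no VPython needed) showing that repeated rotation about the fixed world y axis carries a point through a full, stable orbit; the starting position is a hypothetical stand-in for Mercury.pos:

```python
import math

def rotate_y(p, angle):
    # rotate point p = (x, y, z) about the world y axis through the origin
    x, y, z = p
    c, s = math.cos(angle), math.sin(angle)
    return (c * x + s * z, y, -s * x + c * z)

pos = (25.0, 0.0, 0.0)          # start on the +x axis, like Mercury
step = math.radians(1)          # ~ the 0.0174533 rad used in the question
for _ in range(360):            # a full revolution in 1-degree steps
    pos = rotate_y(pos, step)

# after 360 steps the planet is back where it started; the orbit
# never stalls because the axis never degenerates or flips
print(round(pos[0]), round(pos[1], 6), round(abs(pos[2]), 6))  # -> 25 0.0 0.0
```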

Mathematica code doesn't work when using version 9

I have written the following code, which works fine in Mathematica 8. However, when I open the very same notebook in Mathematica 9, the following message appears: InterpolatingFunction::dmval: "Input value {0.000408163} lies outside the range of data in the interpolating function. Extrapolation will be used.", and there is no graph.
Here's the code:
Manipulate[
ParametricPlot[
Evaluate[{x1[t], a YP[[2]]/YP[[1]] x1[t] + (1 - a) YP[[2]] x3[t]} /.
Quiet@NDSolve[
{x1'[t] == x2[t],
x2'[t] == -1/
Mass (c x2[t] +
a YP[[2]]/YP[[1]] x1[t] + (1 - a) YP[[2]] x3[t] -
Fmax Sin[(2 π)/T t]),
x3'[t] ==
x2[t]/YP[[
1]] (1 - Abs[x3[t]]^n (γ Sign[x2[t] x3[t]] + (1 - γ))),
x1[0] == 0,
x2[0] == 0,
x3[0] == 0},
{x1[t], x2[t], x3[t]},
{t, 0, tTotal}]],
{t, 0, tTotal},
ImageSize -> {450, 450}, PlotRange -> 10, AxesLabel -> {"u", "F"}],
{{tTotal, 20, "Total time"}, 0.5, 100, Appearance -> "Labeled"},
{{Mass, 2.86, "m"}, 0.1, 10, 0.01, Appearance -> "Labeled"},
{{T, 4.0, "T"}, 0.1, 10, 0.01, Appearance -> "Labeled"},
{{Fmax, 8.0, "Fmax"}, 0.1, 10, 0.01, Appearance -> "Labeled"},
{{n, 2.0, "n"}, 0.1, 10, 0.01, Appearance -> "Labeled"},
{{c, 0.0, "c"}, 0.0, 10, 0.01, Appearance -> "Labeled"},
{{a, 0.05, "a"}, 0.0, 1, 0.01, Appearance -> "Labeled"},
{{γ, 0.5, "γ"}, 0.01, 1, 0.01, Appearance -> "Labeled"},
{{YP, {0.111, 2.86}}, {0, 0}, {10, 10}, Locator}]
Any ideas?
TIA
Summarising the problem: this works in versions 7 and 8 but fails in version 9:
tTotal = 20; Mass = 2.86; T = 4.0;
Fmax = 8.0; n = 2.0; c = 0.0; a = 0.05;
\[Gamma] = 0.5; YP = {0.111, 2.86};
NDSolve[{x1'[t] == x2[t],
x2'[t] == -1/Mass (c x2[t] + a YP[[2]]/YP[[1]] x1[t] +
(1 - a) 2.86 x3[t] - Fmax Sin[(2 \[Pi])/4.0 t]),
x3'[t] == x2[t]/0.111 (1 -
Abs[x3[t]]^2.0 (\[Gamma] Sign[x2[t] x3[t]] + (1 - \[Gamma]))),
x1[0] == 0, x2[0] == 0, x3[0] == 0},
{x1[t], x2[t], x3[t]}, {t, 0, 20}]
Use a more specific method in version 9.
NDSolve[... , Method -> {"DiscontinuityProcessing" -> False}]

Plotting Interpolations in Mathematica

It's apparently very simple but I can't find my mistake. The Plot gives me no points at all.
tmax = 1.;
nmax = 10;
deltat = tmax/nmax;
h[t_, s_] := t^2 + s^2;
T = Table[{{n*deltat}, {n*deltat}, h[n*deltat, n*deltat]}, {n, 0, nmax}]
inth = ListInterpolation[T]
Plot3D[inth[s, t], {s, 0, 1}, {t, 0, 1}]
Any help would be mostly welcome!
Marco
I think your "T" is supposed to be a list of 3D points, in which case you should generate it with:
tmax = 1.;
nmax = 10;
deltat = tmax/nmax;
h[t_, s_] := t^2 + s^2;
T = Table[{n*deltat, n*deltat, h[n*deltat, n*deltat]}, {n, 0, nmax}]
inth = ListInterpolation[T]
Plot3D[inth[s, t], {s, 0, 1}, {t, 0, 1}]
Now T[[1]] = {0., 0., 0.} and not {{0.}, {0.}, 0.} as before.

Transform(align) a plane plot into a 3D plot in Mathematica

I have an ODE and I solve it with NDSolve, then I plot the solution on a simplex in 2D.
(screenshot: http://ompldr.org/vY2c5ag/simplex.jpg)
Then I need to transform (align or just plot) this simplex in 3D at coordinates (1,0,0),(0,1,0),(0,0,1), so it looks like this scheme:
(scheme: http://ompldr.org/vY2dhMg/simps.png)
I use ParametricPlot to do my plot so far. Maybe all I need is ParametricPlot3D, but I don't know how to call it properly.
Here is my code so far:
Remove["Global`*"];
phi[x_, y_] = (1*x*y)/(beta*x + (1 - beta)*y);
betam = 0.5;
betaf = 0.5;
betam = s;
betaf = 0.1;
sigma = 0.25;
beta = 0.3;
i = 1;
Which[i == 1, {betam = 0.40, betaf = 0.60, betam = 0.1,
betaf = 0.1, sigma = 0.25 , tmax = 10} ];
eta[x2_, y2_, p2_] = (betam + betaf + sigma)*p2 - betam*x2 -
betaf*y2 - phi[x2, y2];
syshelp = {x2'[t] == (betam + betaf + sigma)*p2[t] - betam*x2[t] -
phi[x2[t], y2[t]] - eta[x2[t], y2[t], p2[t]]*x2[t],
y2'[t] == (betaf + betam + sigma)*p2[t] - betaf*y2[t] -
phi[x2[t], y2[t]] - eta[x2[t], y2[t], p2[t]]*y2[t],
p2'[t] == -(betam + betaf + sigma)*p2[t] + phi[x2[t], y2[t]] -
eta[x2[t], y2[t], p2[t]]*p2[t]};
initialcond = {x2[0] == a, y2[0] == b, p2[0] == 1 - a - b};
tmax = 50;
solhelp =
Table[
NDSolve[
Join[initialcond, syshelp], {x2, y2, p2} , {t, 0, tmax},
AccuracyGoal -> 10, PrecisionGoal -> 15],
{a, 0.01, 1, 0.15}, {b, 0.01, 1 - a, 0.15}];
functions =
Map[{y2[t] + p2[t]/2, p2[t]*Sqrt[3]/2} /. # &, Flatten[solhelp, 2]];
ParametricPlot[Evaluate[functions], {t, 0, tmax},
PlotRange -> {{0, 1}, {0, 1}}, AspectRatio -> Automatic]
Third day with Mathematica...
You could find a map from the triangle in the 2D plot to the one in 3D using FindGeometricTransform and use that in ParametricPlot3D to plot your function, e.g.
corners2D = {{0, 0}, {1, 0}, {1/2, 1}};
corners3D = {{1, 0, 0}, {0, 1, 0}, {0, 0, 1}};
fun[pts1_, pts2_] := FindGeometricTransform[Append[pts2, Mean[pts2]],
PadRight[#, 3] & /@ Append[pts1, Mean[pts1]],
"Transformation" -> "Affine"][[2]]
ParametricPlot3D[Evaluate[fun[corners2D, corners3D][{##, 0}] & @@@ functions],
{t, 0, tmax}, PlotRange -> {{0, 1}, {0, 1}, {0, 1}}]
Since your solution has the property that x2[t]+y2[t]+p2[t]==1 it should be enough to plot something like:
functions3D = Map[{x2[t], y2[t], p2[t]} /. # &, Flatten[solhelp, 2]];
ParametricPlot3D[Evaluate[functions3D], {t, 0, tmax},
PlotRange -> {{0, 1}, {0, 1}, {0, 1}}]
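The 2D projection used in functions is the standard equilateral embedding of the simplex. A quick Python check (mirroring the Mathematica expressions, not runnable Mathematica) that the three corners of x + y + p == 1 land on an equilateral triangle:

```python
import math

def to_2d(x, y, p):
    # the question's 2D simplex projection: {y2 + p2/2, p2*Sqrt[3]/2}
    return (y + p / 2, p * math.sqrt(3) / 2)

# corners of the barycentric triangle x + y + p == 1
for corner in [(1, 0, 0), (0, 1, 0), (0, 0, 1)]:
    print(corner, '->', to_2d(*corner))
# (1,0,0) -> (0, 0), (0,1,0) -> (1, 0), (0,0,1) -> (0.5, sqrt(3)/2):
# an equilateral triangle with unit sides.
```

And since every solution satisfies x2 + y2 + p2 == 1, plotting (x2, y2, p2) directly (the second ParametricPlot3D above) puts each trajectory on the 3D simplex through (1,0,0), (0,1,0), (0,0,1) with no transform at all.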

Lighting and OpenGL ES

I'm working on getting simple lighting right in my OpenGL ES iPhone scene. I'm displaying a simple object centered on the origin, and using an arcball to rotate it by touching the screen. All this works nicely, except that when I try to add one fixed light (fixed w.r.t. the eye position), the result is badly screwed up: the whole object (an icosahedron in this example) is lit uniformly, i.e. it all appears in the same color.
I have simplified my code as much as possible so it's standalone and still reproduces what I experience:
glClearColor (0.25, 0.25, 0.25, 1.);
glClear (GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glEnable (GL_DEPTH_TEST);
glEnable(GL_LIGHTING);
glMatrixMode (GL_PROJECTION);
glLoadIdentity ();
glOrthof(-1, 1, -(float)backingWidth/backingHeight, (float)backingWidth/backingHeight, -10, 10);
glMatrixMode (GL_MODELVIEW);
glLoadIdentity ();
GLfloat ambientLight[] = { 0.2f, 0.2f, 0.2f, 1.0f };
GLfloat diffuseLight[] = { 0.8f, 0.8f, 0.8, 1.0f };
GLfloat specularLight[] = { 0.5f, 0.5f, 0.5f, 1.0f };
GLfloat position[] = { -1.5f, 1.0f, -400.0f, 0.0f };
glEnable(GL_LIGHT0);
glLightfv(GL_LIGHT0, GL_AMBIENT, ambientLight);
glLightfv(GL_LIGHT0, GL_DIFFUSE, diffuseLight);
glLightfv(GL_LIGHT0, GL_SPECULAR, specularLight);
glLightfv(GL_LIGHT0, GL_POSITION, position);
glShadeModel(GL_SMOOTH);
glEnable(GL_NORMALIZE);
float currRot[4];
[arcball getCurrentRotation:currRot];
glRotatef (currRot[0], currRot[1], currRot[2], currRot[3]);
float f[4];
f[0] = 0.5; f[1] = 0; f[2] = 0; f[3] = 1;
glMaterialfv (GL_FRONT_AND_BACK, GL_AMBIENT, f);
glMaterialfv (GL_FRONT_AND_BACK, GL_DIFFUSE, f);
f[0] = 0.2; f[1] = 0.2; f[2] = 0.2; f[3] = 1;
glMaterialfv (GL_FRONT_AND_BACK, GL_SPECULAR, f);
glEnableClientState (GL_VERTEX_ARRAY);
drawSphere(0, 0, 0, 1);
where the drawSphere function actually draws an icosahedron:
static void drawSphere (float x, float y, float z, float rad)
{
glPushMatrix ();
glTranslatef (x, y, z);
glScalef (rad, rad, rad);
// Icosahedron
const float vertices[] =
{ 0., 0., -1., 0., 0., 1., -0.894427, 0., -0.447214, 0.894427, 0.,
0.447214, 0.723607, -0.525731, -0.447214, 0.723607, 0.525731,
-0.447214, -0.723607, -0.525731, 0.447214, -0.723607, 0.525731,
0.447214, -0.276393, -0.850651, -0.447214, -0.276393, 0.850651,
-0.447214, 0.276393, -0.850651, 0.447214, 0.276393, 0.850651,
0.447214 };
const GLubyte indices[] =
{ 1, 11, 7, 1, 7, 6, 1, 6, 10, 1, 10, 3, 1, 3, 11, 4, 8, 0, 5, 4, 0,
9, 5, 0, 2, 9, 0, 8, 2, 0, 11, 9, 7, 7, 2, 6, 6, 8, 10, 10, 4, 3,
3, 5, 11, 4, 10, 8, 5, 3, 4, 9, 11, 5, 2, 7, 9, 8, 6, 2 };
glVertexPointer (3, GL_FLOAT, 0, vertices);
glDrawElements (GL_TRIANGLES, sizeof(indices)/sizeof(indices[0]), GL_UNSIGNED_BYTE, indices);
glPopMatrix ();
}
A movie of what I see as the result is here. Thanks to anyone who can shed some light on this (no kidding!). I'm sure it will look embarrassingly trivial to someone, but I swear I have looked at many lighting tutorials before this and am stuck.
Try adding vertex normals using glNormalPointer() (and enable the array with glEnableClientState(GL_NORMAL_ARRAY)). It looks like OpenGL ES is just using the default normal for everything, so every vertex gets the same lighting.
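One easy way to generate those normals: since the icosahedron in drawSphere approximates a unit sphere centered at the origin, each vertex's normal is just its normalized position. A sketch of that loop (in Python for brevity; it translates line-for-line to C):

```python
import math

def vertex_normals_for_sphere(vertices):
    # For a sphere-like mesh centered at the origin, a good per-vertex
    # normal is simply the normalized vertex position.
    normals = []
    for i in range(0, len(vertices), 3):
        x, y, z = vertices[i:i + 3]
        n = math.sqrt(x * x + y * y + z * z) or 1.0  # guard the zero vector
        normals += [x / n, y / n, z / n]
    return normals

# first two icosahedron vertices from the question
verts = [0., 0., -1., 0.894427, 0., 0.447214]
print(vertex_normals_for_sphere(verts))
```

The resulting array would then be fed to glNormalPointer(GL_FLOAT, 0, normals) alongside the existing glVertexPointer call, with GL_NORMAL_ARRAY enabled.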
