How to make an animation in GeoServer using the "dimension" parameter

I have one layer in GeoServer and I would like to make a time animation with it. I'm able to modify the "dimension" parameter. How can I visualize the animation after I've set this up?

There is a very detailed tutorial on how to use the GeoServer Animator. Basically, you make a call to the animator endpoint:
http://localhost:8080/geoserver/wms/animate?layers=cite:ne_50m_coastline,cite:earthquakes
with your required layer(s) specified. For an actual animation you need to provide two more parameters:
aparam - the parameter to change from frame to frame
avalues - the values to set aparam to, one per frame.
So, in the following:
aparam=time&avalues=1900,%201901,%201902,%201903,%201904
I am setting the time dimension to 1900, then 1901, 1902, and so on.
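Putting the pieces together (the hostname, workspace and layer names here are just the ones from the example above), the complete animator request would look something like:
http://localhost:8080/geoserver/wms/animate?layers=cite:ne_50m_coastline,cite:earthquakes&aparam=time&avalues=1900,1901,1902,1903,1904
Other standard WMS GetMap parameters (bbox, width, height and so on) can be appended to the request in the same way.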
This returns an animated GIF like this:
(Though I had to use decades here, as the year-by-year version was too big to upload to Stack Overflow.)

Related

Define timeframe for animation in KML

Is it possible to define a time window for an animation of KML objects? For example, if I have two occurrences animated (polygon 1 appears on 1/1/2018 and polygon 2 on 6/10/2018), is there any way to define that the whole animation should last for, e.g., 30 or 45 seconds? I only see that Google Earth always interpolates the animation time depending on the given
<TimeSpan><begin>2004-03</begin><end>2004-04</end></TimeSpan>
dates of the document.
In the current Google Earth Pro interface, there is no way to specify the duration (in playback-time) of an animation like this. As you noted, it expands the time slider to include the dates & times from all loaded KMLs, and plays across the slider at a preset speed (adjustable in the slider settings).
One way you could apply time playback with precise control is to set up a KML Tour which animates between two views (with time tags applied) over a specific number of seconds. Then you could have your user play it back using the Tour interface instead, and see the timing you want. Unfortunately, KML touring is rather complex, with a long learning curve. There are simple things you can do (possibly including something like your request) using the tour recording interfaces in Earth Pro, but to really harness the full power of touring you'll need to write custom KML code, so consider yourself warned. :-)

Is it possible to name rotations/joints in three.js, and how can they be set?

I am doing a bit of work with three.js, and I am wondering whether it is possible to name rotations or joints.
So it seems possible to write code like:
arm.rotateZ( 180 ).name="ARM_ANGLE";
But then how does one subsequently access and set the same rotation?
I know this is possible in X3D, so I was thinking it should be possible in three.js as well. In X3D, one can define a reference as:
<Transform DEF="ArmAngle" rotation="0 0 1 3.19">
And then later define a route to reference it like:
<ROUTE fromNode='spinarm' fromField='value_changed' toNode='ArmAngle' toField='set_rotation'></ROUTE>
What you are describing sounds like animation keys or transform key frames.
You can define these in a modeller like Blender and export them, or generate them programmatically.
Generally, though, what you are describing from X3D would have to be a layer built on top of three.js if you really want that style of interface. Honestly, it's pretty straightforward to use the scene-graph style of manipulation, i.e. finding an object and setting its position and rotation, OR defining an animation in a modeller and then calling that animation. The advantage of using animations is that you can then blend between them.
You CAN name objects in three.js, so for instance you could name your arm and then find it using scene.getObjectByName("arm"). getObjectByName is a method of all Object3Ds.
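As a minimal sketch (the names here are purely illustrative), you could name the arm once and then look it up and set its rotation later. Note that three.js rotation methods such as rotateZ() expect radians, not degrees:
arm.name = "ARM_ANGLE";
scene.add( arm );

// later, e.g. in an event handler or the render loop
var armObj = scene.getObjectByName( "ARM_ANGLE" );
if ( armObj ) {
  armObj.rotation.z = Math.PI; // set the joint angle directly, in radians
}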

WebAudio: changing the orientation of the Listener and/or Panner

I am trying to understand how the Web Audio API works. I have two objects, one representing the listener and one the source, and have used the link below as an example. I am able to move the source, and the sound position changes.
https://mdn.github.io/webaudio-examples/panner-node/
The commands to change the orientation are provided, viz. this.panner.setOrientation and this.listener.setOrientation. The question I have is: if I have a source or listener object (in canvas mode using three.js, i.e. we know its position and rotation), how do I change the orientation of either the panner or the listener (as the case may be) via JavaScript?
An example would help greatly. Thanks.
Any reason not to use THREE's PositionalAudio object? https://threejs.org/docs/index.html#api/audio/PositionalAudio. That lets you add a sound to a mesh object, and THREE will take care of moving it. If you want to source a different sound than an AudioBuffer, just connect the audio source to the .gain of the PositionalAudio object.
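A minimal sketch of that approach (assuming an existing three.js scene and camera, a mesh called balloonMesh, and a sound file path you would replace with your own):
// attach the listener to the camera so it follows the camera's position/orientation
var listener = new THREE.AudioListener();
camera.add( listener );

// create a positional sound and hang it off the mesh
var sound = new THREE.PositionalAudio( listener );
new THREE.AudioLoader().load( 'sounds/engine.ogg', function ( buffer ) { // hypothetical file
  sound.setBuffer( buffer );
  sound.setRefDistance( 20 );
  sound.setLoop( true );
  sound.play();
} );
balloonMesh.add( sound ); // three.js keeps the underlying panner in sync with the mesh transform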

Three.js calling clipAction.play() makes animated objects vanish

In Three.js, calling action.play() makes objects simply vanish, without any error or warning on the console.
I use THREE.ObjectLoader to load a JSON file created in Blender. The SRT (position/scale/quaternion) animation is in the generated file, as are the morph targets. To optimise file size I animated the SRT as a series of null objects. The morph-target tracks are in the main object, which I clone 5 times to build the characters (balloons, to be exact).
I previously did extensive testing to introduce shape/morph animation. After being successful, I finalised all the animations, only to be trumped by the disappearing models. The SRT (position/scale/quaternion) animation was working fine before, but after refactoring the code to be less spaghetti-like, the objects vanish exactly when action.play() is called. Echoing the mixers and the array containing the clips, everything looks correct (i.e. I see the tracks, the names are right, etc.). Also, examining the newly generated JSON, it seems the same and correct (and I have not changed the SRT animations, only introduced the shape animation).
So I am lost, and think this looks more and more like a bug. From previous experience I do know it works (or has worked).
I created a jsfiddle: https://jsfiddle.net/oompol/3ya6sqed/
[edit] I have turned on action.play(); you can call the function from the link in the div. [/edit] Please note that I originally commented out the call to action.play(), so you can see that the load and init work. See the function listed below:
function playScene(scene) {
  for (var parentName in srtMixers) {
    var clpName = "balloon1_fly";
    var clp = THREE.AnimationClip.findByName(animLib, clpName);
    var action = srtMixers[parentName].clipAction(clp);
    action.clampWhenFinished = true;
    console.log("playScene:", clpName, clp, parentName, srtMixers);
    // this is when the problem happens
    action.play();
  }
}
This is the JSON I am loading:
https://rawgit.com/bakajin/2e3d2f6a722103ed4aefd76f6250ec08/raw/28cad35c20060d478499c0cd40a2753611993720/oomp-scene_balloons-oomp-6.9.4.json
OK, there was something very wrong with the scaling indeed.
The io_three JSON exporter for Blender (r87 dev) writes incorrect matrix transformation data in the geometry object (really tiny scaling values). The animation track with the scaling keys was correctly written as 1,1,1, so all the objects just scaled out of view immediately.
This is hard to see because the geometry has no separate scaling value, only a matrix. It seems to happen when you set "Scene" to true on export.
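One way to confirm this after loading (a hedged sketch; loadedObject stands in for whichever object you suspect) is to decompose its matrix and inspect the scale component:
var pos = new THREE.Vector3();
var quat = new THREE.Quaternion();
var scl = new THREE.Vector3();
loadedObject.matrix.decompose( pos, quat, scl );
console.log( 'geometry-level scale:', scl ); // tiny values here point at the exporter bug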
I worked around the problem by entering the correct scaling value in the keyframe tracks, but this will only work if you have no scaling animation (so the keys are all 1).
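A programmatic variant of that workaround (a sketch only, and again only safe when the scale keys are meant to be all 1) is to overwrite the values of the scale tracks in the clip before creating the action:
clp.tracks.forEach( function ( track ) {
  if ( track.name.indexOf( '.scale' ) !== -1 ) {
    for ( var i = 0; i < track.values.length; i++ ) {
      track.values[ i ] = 1; // force every x/y/z scale key to 1
    }
  }
} );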
Meanwhile, I have extensively edited the JSON by hand, because this is not the only incorrect data: the formatting of the animation object is also wrong, the durations for the morphTargetInfluence keys are incorrect, and the formatting of these keys is not always correct.
Hope this helps some other people.

How can I dynamically change keyframe values from code in Unity 5?

I have a GameObject with an Animator attached to it, and the Animator has animation curves. I need to dynamically change keyframe values from code. How do I access them?
I already asked this in the Live Session. The answer is that you can't, since the .anim file is the core of what the Animation Controller runs.
The alternative they gave me is to use the legacy SpriteRenderer instead.
The Animation Controller doesn't support dynamically changing the values. Instead, the Animator lets you switch between .anim files, so consider making different .anim files and wiring them into your Animator if you don't want to work with SpriteRenderer.
If "dynamically" still means editor-time, then you could use UnityEditor.AnimationUtility, which provides methods to get and set curves and key frames and more.
One can retrieve the bindings with AnimationUtility.GetCurveBindings() or AnimationUtility.GetObjectReferenceCurveBindings(). With AnimationUtility.GetObjectReferenceCurve() one can get the keyframes, make modifications, and apply them with AnimationUtility.SetObjectReferenceCurve().
At runtime: probably only with workarounds.
For example, via some animated relay value within a custom script that then applies the wanted value to the actual property in Update(), a kind of custom constraint if you will. Within that you could then apply custom modifications via code at runtime. At editor time, however, and when animating, the preview would be broken, because you would not be animating the property directly, only the relay value. Maybe one could use [ExecuteInEditMode] on that custom constraint behaviour and AnimationMode.InAnimationMode() to simulate a preview, but all of this would be experimental.
