Extracting relationships between two sentences - stanford-nlp

I am looking for an algorithm which can help me identify relationships between two sentences.
For example:
a. Planets revolve around stars.
b. Satellites revolve around planets.
c. The Sun is a star.
d. The Earth revolves around the Sun.
e. The Moon revolves around the Earth.
The inference should be:
The Moon is a satellite which revolves around the Earth, which is a planet. The Earth revolves around the Sun, which is a star.
Help me with this, please.
Thanks in advance.
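One common pipeline is to run an extractor (e.g. Stanford CoreNLP's OpenIE annotator) to turn each sentence into a (subject, relation, object) triple, and then chain the triples with inference rules. Below is a minimal sketch of the chaining step only; the triples are hand-written as an extractor might produce them, and the single inference rule is my own illustration, not a CoreNLP feature:

```javascript
// Toy fact base of (subject, relation, object) triples, written out by
// hand here as an extractor such as CoreNLP's OpenIE annotator might
// produce them from sentences (a)-(e).
const triples = [
  ["planet", "revolves_around", "star"],
  ["satellite", "revolves_around", "planet"],
  ["sun", "is_a", "star"],
  ["earth", "revolves_around", "sun"],
  ["moon", "revolves_around", "earth"],
];

// One hand-written inference rule, applied to a fixpoint:
//   X revolves_around Y, Y is_a Z, G revolves_around Z  =>  X is_a G
// e.g. earth -> sun, sun is_a star, planet -> star  =>  earth is_a planet,
// and on the next pass moon -> earth then yields moon is_a satellite.
function saturate(kb) {
  const facts = kb.map(t => t.join("|"));
  let changed = true;
  while (changed) {
    changed = false;
    const cur = facts.map(f => f.split("|"));
    for (const [x, r1, y] of cur) {
      if (r1 !== "revolves_around") continue;
      for (const [y2, r2, z] of cur) {
        if (r2 !== "is_a" || y2 !== y) continue;
        for (const [g, r3, z2] of cur) {
          if (r3 !== "revolves_around" || z2 !== z || g === x) continue;
          const fact = [x, "is_a", g].join("|");
          if (!facts.includes(fact)) { facts.push(fact); changed = true; }
        }
      }
    }
  }
  return facts.map(f => f.split("|"));
}
```

Real sentences would of course need the extraction step first, and more rules than this one; the point is only that the chained inference itself is a small fixpoint loop over triples.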

Related

Threejs Loads of the same Plane Mesh

I've written a simple 3D dungeon generator using Three.js, but since I use a lot of spotlights for the torches in the dungeon I'm starting to see an FPS drop.
I know the lights are the problem, but before I tackle the light issue I was wondering whether the level itself could be optimized. The level is built using nothing but a 200x200 plane with a wall texture. I've read about instancing; is that what I want in this scenario? The walls won't move, and if any do I can make separate meshes for the moving ones.
For the lighting I'm using LambertMaterial, which should be the fastest, but besides that I've done nothing to improve performance on that front. I tried to bake the room lights into the textures with https://github.com/mem1b/lightbaking but failed.
So in the end, is instancing the approach for optimizing the level's polygons? I've read a little about it but could not fully understand it.
Say you have 100 torches distributed along a flat wall. Each one pretty much only affects the part of the wall it's closest to; area-wise that's about 1/100th of the wall.
Now what if you divide the wall into wall-segment + torch structs? Instead of giving the global scene all 100 lights, you'd give each wall segment a single local light.
Cool, now if you render the entire wall you get 100 lights at the cost of one... computationally. But you also introduce 100 draw calls instead of one.
Instancing helps here. You can instance your wall-segment geometry, and then for each instance set a light attribute instead of a uniform.
This way you can draw 100 lights on 100 wall segments in one draw call, which is much faster than issuing 100 separate draws.
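As a sketch of the data layout this implies (the sizes, attribute name, and torch placement are made up for illustration, not taken from your scene): pack one light per wall-segment instance into a flat array, which in Three.js would back an `InstancedBufferAttribute` on the shared segment geometry:

```javascript
// One wall segment per instance, one local light per segment, packed as
// itemSize-4 floats: local light position x/y/z plus intensity.
const SEGMENTS = 100;
const SEGMENT_WIDTH = 2.0; // e.g. a 200-unit wall split into 100 segments

const lightData = new Float32Array(SEGMENTS * 4);
for (let i = 0; i < SEGMENTS; i++) {
  lightData[i * 4 + 0] = i * SEGMENT_WIDTH + SEGMENT_WIDTH / 2; // light x (segment centre)
  lightData[i * 4 + 1] = 1.5;  // light y: torch height (made-up value)
  lightData[i * 4 + 2] = 0.2;  // light z: offset out from the wall
  lightData[i * 4 + 3] = 1.0;  // intensity
}

// In Three.js this array would become a per-instance attribute that a
// custom shader reads instead of a uniform, e.g.:
//   geometry.setAttribute('instanceLight',
//       new THREE.InstancedBufferAttribute(lightData, 4));
// A single InstancedMesh with this geometry then draws all 100 lit
// segments in one draw call.
```

The fragment shader would then light each segment from its own `instanceLight` value rather than looping over scene lights.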

Mapping 3D data on a unit sphere

I have a 3D data set and I am trying to map it onto the surface of a unit sphere. I don't know how to start; I found some papers on the web, but it is not clear to me how to use them. Can you share your thoughts on this algorithm?
In my data I have three values at each point. Take a coffee mug, for example: I have the exact coordinates of each point on the mug's surface, and now I want to map the mug onto a unit sphere. Any help in this regard.
I also found a similar post here on Stack Exchange, Messed up Sphere, but they are using OpenGL. I want to use this in plain C or Fortran code.
Thank you.
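A proper spherical parameterization of a closed (genus-0) surface is what those papers cover, but a baseline worth trying first, when the surface is star-shaped about its centroid (every ray from the centroid crosses it exactly once), is plain radial projection: subtract the centroid and normalize each point. The sketch below is JavaScript, but the loop translates line-for-line to C or Fortran:

```javascript
// Radially project a point cloud onto the unit sphere: subtract the
// centroid, then scale each point to unit length. This is only a
// bijection for star-shaped surfaces; a mug with a handle will fold
// over itself and needs a true spherical parameterization instead.
function mapToUnitSphere(points) {
  const n = points.length;
  const c = [0, 0, 0]; // centroid of the cloud
  for (const p of points) {
    c[0] += p[0] / n; c[1] += p[1] / n; c[2] += p[2] / n;
  }
  return points.map(p => {
    const d = [p[0] - c[0], p[1] - c[1], p[2] - c[2]];
    const r = Math.hypot(d[0], d[1], d[2]);
    return d.map(x => x / r); // unit-length direction from the centroid
  });
}
```

For surfaces that are not star-shaped, methods like conformal or area-preserving sphere mapping (the kind those papers describe) relax an initial projection like this one into a fold-free map.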

How to rotate a circle in response to a collision?

I am in the process of developing a very simple physics engine. The only non-static objects in it will be circles, and the only collision detection I will be performing is between circles and line segments.
For this purpose I am utilizing the principles described in Advanced Character Physics. That is, I do integration using a simple Verlet integrator. I perform collision detection and response simply by calculating the distance between the circles and the line segments; when that distance is less than a circle's radius, I project the circle out of the line segment.
This works very well, and the result is a practically perfect moving circle. The current state of the engine can be seen here: http://jsfiddle.net/8K4Wj/. This, however, also shows the one major problem I am facing: the circle does not rotate at all.
As far as I can figure out, there are three different collision cases that will have to be dealt with separately:
When the circle collides with a line vertex and is not rolling along the line.
When the circle has just hit or rolled off a line. Then the exact point of impact has to be calculated (how?) and the circle rotated according to the distance between the impact position and the projected position.
When the circle is rolling along a line. Then it is simply rotated according to the distance traveled since the last frame.
Here is the closest I have got to solving the problem: http://jsfiddle.net/vYjzt/. But as the demo shows, it doesn't handle the edge cases properly.
I have searched for a solution online, but I cannot find any material that deals with this specific problem (as I said, the physics engine is relatively simple and I do not want to bother with complex physics simulation concepts).
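For the rolling case the relationship is just arc length: rolling without slipping means the angle turned each frame is the signed distance travelled along the line divided by the radius. A minimal sketch (the argument shapes are my own, not from the fiddle):

```javascript
// Rolling without slipping: rotation per frame is distance travelled
// along the surface divided by the circle's radius.
// `prev` / `curr` are the circle centre on consecutive frames and
// `tangent` is a unit vector along the line being rolled on.
function rollAngle(prev, curr, tangent, radius) {
  const dx = curr.x - prev.x;
  const dy = curr.y - prev.y;
  const ds = dx * tangent.x + dy * tangent.y; // signed distance along the line
  return ds / radius;                         // radians; the sign gives spin direction
}
```

With a Verlet integrator the previous position is already at hand, so this drops straight into the per-frame update.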
What looks wrong in your demo is that you're not considering angular momentum and energy when determining the motion.
For the easy case, when the wheel is dropping to the floor in your demo, it stops spinning while in free fall. Angular momentum should keep it going.
A more complicated situation is when the wheel finally lands on the floor: it moves with the same horizontal velocity it had before hitting the floor. This would be correct if it weren't rolling, but since it is rolling, some of the kinetic energy has to go into the spinning motion, and this should slow it down. As a clearer example, consider the opposite case, where the wheel is spinning quickly but has no linear momentum. When this wheel is set on the floor, it should take off and the spinning should slow. Likewise, as the wheel rolls down a hill, it accelerates more slowly because the energy has to go into both linear and rotational motion.
It's not too hard to do, but to show a rolling object in a way that looks intuitively correct, I think you'll need to consider the kinetic energy and angular momentum associated with rolling. By "not too hard", I mean that all of your equations will essentially be twice as long, with one term for linear motion and another for angular. I won't recite all of the equations; it's basically just the chapter on rotational motion from any physics text.
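To make the energy split concrete: with rolling without slipping, v = ωr and the kinetic energy is ½mv² + ½Iω², so a rolling body accelerates down a slope as if its mass were inflated by the factor 1 + I/(mr²). A quick numeric check (the function name is mine, not from the demo):

```javascript
// Acceleration down an incline for a body rolling without slipping.
// KE = 1/2 m v^2 + 1/2 I w^2 with v = w r, so the linear acceleration
// g*sin(angle) is divided by (1 + I/(m r^2)).
const g = 9.81;
function rollingAccel(angleRad, inertiaFactor /* I / (m r^2) */) {
  return (g * Math.sin(angleRad)) / (1 + inertiaFactor);
}

const slidingBlock = rollingAccel(Math.PI / 6, 0);   // no rotation: g/2
const rollingDisc  = rollingAccel(Math.PI / 6, 0.5); // uniform disc, I = 1/2 m r^2
// rollingDisc < slidingBlock: part of the energy goes into spinning.
```

The same inertia factor shows up in the landing case: matching v = ωr on contact transfers momentum between the linear and angular terms instead of conserving each separately.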
(Nice demo, btw!)

OpenGL: Line jittering with large scene and small values

I'm currently drawing a 3D solar system and I'm trying to draw the paths of the planets' orbits. The calculated data is correct in 3D space, but when I travel out towards Pluto, the orbit line shakes all over the place until the camera has come to a complete stop. I don't think this is unique to this particular planet, but given the distance the camera has to travel I think it's more visible at this range.
I suspect it's something to do with the frustum, but I've been plugging values into each of its components and I can't seem to find a solution. To see anything I'm having to use very small numbers (E-5 magnitude) for the planet and nearby orbit points, but up to E+2 magnitude for the farther regions (maybe I need to draw it twice with different frustums?).
Any help greatly appreciated...
Thanks all for answering, but my solution was to draw the orbit with the same matrices that were drawing the planet, since the planet wasn't bouncing around. So the solution really is to code better, sorry.
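That fix works because it sidesteps the usual cause of this kind of jitter: 32-bit float precision. At Pluto-scale coordinates the GPU's floats can no longer resolve small camera movements. A common general fix is relative-to-eye rendering: keep positions in doubles on the CPU, subtract the camera position before upload, and render with a view matrix whose translation is zero. A sketch (names are illustrative):

```javascript
// Relative-to-eye trick: do the large subtraction in 64-bit on the CPU,
// so the GPU only ever sees small, well-conditioned float32 values.
function toCameraRelative(positions /* Float64Array of xyz triples */, cam) {
  const out = new Float32Array(positions.length);
  for (let i = 0; i < positions.length; i += 3) {
    out[i]     = positions[i]     - cam[0];
    out[i + 1] = positions[i + 1] - cam[1];
    out[i + 2] = positions[i + 2] - cam[2];
  }
  return out; // upload this; render with the view translation zeroed out
}
```

At a coordinate of 1e8, adjacent float32 values are 8 units apart, so a sub-unit camera move snaps vertices around; after the subtraction the same offsets are represented exactly.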

Any ideas on 3D reconstruction of real-life rocks from a single view?

So in general, when we think of single-view reconstruction we think of working with planes, simple textures and so on: generally, objects far simpler than what nature produces. But what about something like wet beach stones? I wonder if there are any algorithms that could help with reconstructing 3D from a single picture of stones?
Shape from shading would be my first angle of attack.
Smooth, wet rocks, such as those in the first image, may exhibit predictable specular properties, allowing one to estimate the surface normal based only on the brightness value and the relative angle between the camera and the light source (the sun).
If you are able to segment individual rocks, like those in the second photo, you could probably estimate the parameters of the ground plane by making some assumptions about all the rocks in the scene being similar in size and lying on said ground plane.
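To make the shading cue concrete: under a simple Lambertian model, I = albedo * max(0, n · l), a pixel's intensity pins down only the angle between the surface normal and the (known) sun direction; full shape-from-shading then integrates many such constraints into a surface. A minimal sketch of the per-pixel step, assuming a known, constant albedo:

```javascript
// One pixel's shading constraint under the Lambertian model
// I = albedo * max(0, dot(n, l)): the intensity fixes the angle between
// the normal and the light direction, not the normal itself (the normal
// can lie anywhere on a cone around l at this angle).
function slantAngle(intensity, albedo) {
  const cosTheta = Math.min(1, Math.max(0, intensity / albedo));
  return Math.acos(cosTheta); // radians between normal and light direction
}
```

The specular highlights on wet rocks add a second, sharper cue: a highlight pixel's normal is close to the half-vector between the view and sun directions, which helps disambiguate the cone of candidate normals.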