I want to make it so that when I fire my gun the slide shoots back and then comes forward, like on a real gun. However, I don't know how I should start, because I can't find any relevant information on Google and I don't know what to search for. All I need is the logic behind how to do it; I can code it myself.
I would agree with zambari.
Just create an animation and play it when your gun gets fired.
Edit:
Since you are talking about 3D, you could either:
move the pivot of the object to the point where the trigger would be attached. This way, all you need to do is change the object's rotation for the animation.
use joints.
This is the best tool for VR guns with interactable parts. I HIGHLY recommend that ANYONE making a VR game look at this:
https://github.com/Oyshoboy/weaponReloadVR
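If you'd rather drive the slide from code instead of an Animation clip, here is a minimal sketch of the logic in Unity C#. The `slide` field, travel distance, and timings are made-up illustration values, not anything from the answers above:

```csharp
using System.Collections;
using UnityEngine;

public class GunSlide : MonoBehaviour
{
    // Hypothetical fields -- assign the slide child transform in the Inspector.
    public Transform slide;          // the moving slide mesh
    public float travel = 0.05f;     // how far the slide moves back, in local units
    public float backTime = 0.03f;   // seconds to snap back
    public float returnTime = 0.08f; // seconds to return forward

    Vector3 restPos;

    void Start()
    {
        restPos = slide.localPosition;
    }

    // Call this from your firing code.
    public void Fire()
    {
        StopAllCoroutines();
        StartCoroutine(Cycle());
    }

    IEnumerator Cycle()
    {
        // Assumes the slide travels along its local -Z axis.
        Vector3 backPos = restPos - Vector3.forward * travel;

        // Snap back quickly...
        for (float t = 0f; t < backTime; t += Time.deltaTime)
        {
            slide.localPosition = Vector3.Lerp(restPos, backPos, t / backTime);
            yield return null;
        }

        // ...then return forward a bit more slowly.
        for (float t = 0f; t < returnTime; t += Time.deltaTime)
        {
            slide.localPosition = Vector3.Lerp(backPos, restPos, t / returnTime);
            yield return null;
        }

        slide.localPosition = restPos;
    }
}
```

The same idea works with an Animator: put the back-and-forward motion in a clip and fire it with a trigger parameter when the gun shoots.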
Is there a way to animate wind for certain game objects?
For example, branches of trees should gently move, like there's a breeze in game. Not gameplay, more like a special background effect.
If it's not possible in code, what would be the best way to create proper sprite images?
I see a few options for actually achieving that wind effect.
SKFieldNode: it allows you to apply physics field effects to nodes. If you want real tree branches that can move based on the physics, you should combine SKFieldNode with SKPhysicsJoint. When you combine those two you can create independent branch nodes that receive some kind of force to simulate a wind effect. To understand what you can do with SKPhysicsJoint, check out this guide. This solution can get really complex and hard to achieve with a superficial understanding of the SpriteKit engine, but you can create an amazing effect using it. Personally, I would not recommend it if you have deadlines to meet; physics always gets buggy if you lose your grasp on what you are doing, and you may invest a lot of time trying to achieve this using physics.
Create different animations of your tree responding to wind movement and control which animation frame to use in each specific case. I would highly recommend this one, because you will have control over what is going on with your tree, and people who play games don't pay that much attention to what is going on in the background, although it is a good thing to think about from a game-experience point of view.
Create your tree textures and change the anchor point to the place where you want the effect to be strongest and most responsive. The farther a coordinate in the node is from the anchor point's coordinate, the less effect it will get from your SKAction.
Sorry for my English; it is not my native language.
My question is just about choosing the right approach, because I'm not sure about the solution.
I have a 3D model in my project, and at some point I want to show an animated disassembly. The object is made of something like 200 pieces,
so animating them one by one with keyframes is time-consuming.
The animation I'm looking for is like an explosion from the center of the object, so the parts just move outward from its center.
Example image:
What would you do?
What is the best way to manage such a task?
I would code it. Maybe I am biased because I am a programmer, but animating it would be a pain.
So I would import the model into Unity3d. Then I would grab all the parts and store them in a list. Once I have the 200 parts then I can do anything I want to them.
I would then proceed to attach rigidbodies and box colliders to them all -- this can be done programmatically. Then you can initiate the explosion by adding a velocity to each part. If you want to be fairly realistic and have something that is fairly random, you can give each object a mass and then use the equation F = ma for the explosion. That is, each part will get a different acceleration depending on its mass.
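A rough sketch of that approach in Unity C# (assuming the 200 pieces are child objects of a single imported root; the `impulse` value and class name are just placeholders for illustration):

```csharp
using System.Collections.Generic;
using UnityEngine;

public class ExplodedView : MonoBehaviour
{
    public float impulse = 5f; // hypothetical overall strength of the "explosion"

    readonly List<Rigidbody> parts = new List<Rigidbody>();

    void Start()
    {
        // Grab every child piece and give it a collider and a rigidbody programmatically.
        foreach (Transform child in GetComponentsInChildren<Transform>())
        {
            if (child == transform) continue;

            if (child.GetComponent<Collider>() == null)
                child.gameObject.AddComponent<BoxCollider>();

            Rigidbody rb = child.GetComponent<Rigidbody>();
            if (rb == null)
                rb = child.gameObject.AddComponent<Rigidbody>();

            rb.isKinematic = true; // keep the model assembled until we explode it
            parts.Add(rb);
        }
    }

    // Call this to start the disassembly.
    public void Explode()
    {
        Vector3 center = transform.position;

        foreach (Rigidbody rb in parts)
        {
            rb.isKinematic = false;

            // Direction from the model's center to this part.
            Vector3 dir = (rb.worldCenterOfMass - center).normalized;

            // Same impulse for every part, so lighter parts fly out faster (F = ma).
            rb.AddForce(dir * impulse, ForceMode.Impulse);
        }
    }
}
```

Because every part gets the same impulse, the velocity change works out to impulse divided by mass, which gives the mass-dependent behaviour described above without any per-part tuning.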
I'm using Farseer and XNA on WP7. I have 2 objects in my game. The first one is a wall generated from a bitmap. The second one is a player controller - in fact, it's just a circle object. This circle follows the player's finger.
I need a certain behavior - probably it's very basic, but I can't figure out how to google it. It's collision detection that just won't allow the controller to enter the wall. It shouldn't bounce; it should just keep trying to follow the finger without entering the wall.
I know it's not hard to implement it on my own, but if I'm using a physics engine and it happens to offer such a functionality it would be a shame not to take advantage of it. :)
You need to use a BoundingBox object and check collisions against it (you should create a BoundingSphere wrapping the circle controller).
http://www.riemers.net/eng/Tutorials/XNA/Csharp/Series2/Collision_detection.php
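A small sketch of how that check could look with XNA's built-in types (assuming the wall can be approximated by one or more BoundingBoxes derived from the bitmap; the method and variable names are made up for illustration):

```csharp
using System.Collections.Generic;
using Microsoft.Xna.Framework;

static class PlayerCollision
{
    // Only move the circle to the finger position if the new position would not
    // overlap any wall box; otherwise keep the old position (no bounce, it just stops).
    public static Vector3 TryMove(Vector3 current, Vector3 target, float radius,
                                  List<BoundingBox> wallBoxes)
    {
        BoundingSphere proposed = new BoundingSphere(target, radius);

        foreach (BoundingBox box in wallBoxes)
        {
            if (proposed.Intersects(box))
                return current; // blocked: stay where we are
        }

        return target; // free: follow the finger
    }
}
```

In a 2D game you would just keep Z at 0. A nicer variant would slide the circle along the wall instead of stopping it dead, but the stop-at-the-wall behavior above matches what the question asks for.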
Can someone tell me what math would be required for moving the small ball around in the bar (meter) according to the touch motion? I need a clear idea of the logic.
The logic depends on how you want to model the interaction between the finger and the ball. Is it moving through a fluid, like a ball bearing? A gravitational field? Do you want it to decelerate as if under the influence of friction from the sides of the bar? Or do you want to simply have the ball track the finger, without fancy damping of the motion?
My point is that the "logic" can be pretty complicated, depending on how you decide to model it. I'd recommend looking for something canned in JavaScript before coding it yourself.
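If you go with the simplest model - the ball just tracks the finger - the core logic is only a clamp plus optional smoothing. A tiny sketch (written in C# since the platform isn't stated; the parameter names are illustrative):

```csharp
static class MeterBall
{
    // Returns the ball's new position along the bar: the touch position is clamped
    // to the bar's range, and the ball optionally eases toward it (simple damping).
    public static float Update(float ballX, float touchX, float barMin, float barMax,
                               float smoothing, float deltaTime)
    {
        // Keep the target inside the bar.
        float target = touchX < barMin ? barMin : (touchX > barMax ? barMax : touchX);

        if (smoothing <= 0f)
            return target; // no damping: the ball tracks the finger exactly

        // Exponential approach: cover a fraction of the remaining distance each frame,
        // which reads as friction/damping without a full physics model.
        float t = 1f - (float)System.Math.Exp(-smoothing * deltaTime);
        return ballX + (target - ballX) * t;
    }
}
```

Calling Update once per frame with smoothing = 0 gives exact tracking; a positive smoothing value makes the ball ease toward the finger as if damped.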
I have a hierarchical animated model in DirectX which loads and animates based on the following DirectX sample: http://msdn.microsoft.com/en-us/library/ee418677%28VS.85%29.aspx
As good as the sample is, it does not really go into some of the details of animation that I'd like. For example, if I have a mesh which has a running animation and a throwing animation as separate animation sets, how can I get the throwing animation to occur for bones above the hip and the running animation to occur for bones underneath the hip?
Also, if I wanted to, for example, have the person lean left or right, would I simply have to find the bone for the hip and multiply a rotation matrix by its matrix? In this case I think the matrix is m_amxBoneOffsets?
Composing multiple animations to a single one is usually the job of an animation system, something that is way out of scope of the D3D sample.
Let's look at your 2 examples:
running and throwing
Well, in this case you could apply the animation for the lower part of the body from the running animation and the animation for the upper part of the body from the throwing animation. And you'd get a very crappy result.
The "how" is just a matter of knowing which bones are where in the bone palette (something that depends on how they are stored, and in which order, but nothing inherently hard; the definitive reference should be the documentation of the tool generating the animation data).
In practice, you're better off with a blending of the 2 animations. This, in general, is hard, and software packages exist out there that do this for you (Gamebryo, e.g.).
Or, an animation of a running guy who throws is different enough from a standing guy who throws that you might be better off having 2 animations.
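To make the "which bones come from which clip" part concrete, here is a rough sketch of a per-bone mask with a blend weight. It uses XNA math types purely for convenience; the bone layout, the set of upper-body bone indices, and the blend weight are assumptions, not part of the D3DX sample:

```csharp
using System.Collections.Generic;
using Microsoft.Xna.Framework;

struct BonePose
{
    public Quaternion Rotation;
    public Vector3 Translation;
}

static class AnimationMixer
{
    // Combine two sampled poses: bones in upperBodyBones take the throwing pose,
    // everything else takes the running pose. blend in (0,1] softens the seam
    // around the hip by interpolating instead of switching hard.
    public static BonePose[] Combine(BonePose[] running, BonePose[] throwing,
                                     HashSet<int> upperBodyBones, float blend)
    {
        var result = new BonePose[running.Length];

        for (int bone = 0; bone < running.Length; bone++)
        {
            float w = upperBodyBones.Contains(bone) ? blend : 0f;

            result[bone].Rotation = Quaternion.Slerp(running[bone].Rotation,
                                                     throwing[bone].Rotation, w);
            result[bone].Translation = Vector3.Lerp(running[bone].Translation,
                                                    throwing[bone].Translation, w);
        }

        return result;
    }
}
```

The combined local poses would then be turned back into matrices and pushed through the same offset-matrix and hierarchy combination the sample already performs.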
Leaning
If you apply a rotation matrix to the root bone, you'll simply rotate your whole character.
Now if you rotate the next bone in the hierarchy (from the spine), you'll get all the bones that depend on it to rotate likewise. It will probably do what you want, but there's a sure way to find out. Try it!
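The reason it works is that a bone's world matrix is its local matrix multiplied by its parent's world matrix, so an extra rotation injected at a spine bone is inherited by every bone below it. A sketch of that propagation (again with XNA matrices for convenience; the array layout, the leanBone index, and the multiplication order are assumptions that depend on your conventions):

```csharp
using Microsoft.Xna.Framework;

static class BoneHierarchy
{
    // localTransforms[i] is bone i's transform relative to its parent,
    // parentIndex[i] is the index of that parent (-1 for the root).
    // extraLocal is an extra rotation (e.g. a lean) applied to one chosen bone.
    public static Matrix[] ComputeWorldTransforms(Matrix[] localTransforms,
                                                  int[] parentIndex,
                                                  int leanBone, Matrix extraLocal)
    {
        var world = new Matrix[localTransforms.Length];

        // Assumes parents come before children in the arrays.
        for (int i = 0; i < localTransforms.Length; i++)
        {
            Matrix local = localTransforms[i];
            if (i == leanBone)
                local = extraLocal * local; // inject the lean here (order depends on your convention)

            world[i] = (parentIndex[i] < 0)
                ? local
                : local * world[parentIndex[i]];

            // Every descendant of leanBone inherits the lean through this multiply.
        }

        return world;
    }
}
```

For a lean, extraLocal could be something like Matrix.CreateRotationZ(MathHelper.ToRadians(15f)), with leanBone set to the index of the lowest spine bone.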
Well, the thing is, the running animation SHOULD affect the throwing animation slightly. What you need to look into is animation blending.
I'm sure Valve wrote a good paper on how they implemented it in Counter-Strike many years ago. It's not on the Valve site though, so I'm not sure where I got this memory from ...