To demonstrate my problem, I use libGDX's ShadowMappingTest.
This test builds a model with a box and a sphere:
ModelBuilder modelBuilder = new ModelBuilder();
modelBuilder.begin();
MeshPartBuilder mpb = modelBuilder.part("parts", GL20.GL_TRIANGLES, Usage.Position | Usage.Normal | Usage.ColorUnpacked,
        new Material(ColorAttribute.createDiffuse(Color.WHITE)));
mpb.setColor(1f, 1f, 1f, 1f);
mpb.box(0, -1.5f, 0, 10, 1, 10);
mpb.setColor(1f, 0f, 1f, 1f);
mpb.sphere(2f, 2f, 2f, 10, 10);
model = modelBuilder.end();
instance = new ModelInstance(model);
And on screen:
Then I replaced the sphere with a triangle:
val modelBuilder = ModelBuilder()
modelBuilder.begin()
val mpb = modelBuilder.part("parts",
        GL20.GL_TRIANGLES,
        (Usage.Position or Usage.Normal or Usage.ColorUnpacked).toLong(),
        Material(ColorAttribute.createDiffuse(Color.WHITE)))
mpb.setColor(1f, 1f, 1f, 1f)
mpb.box(0f, -1.5f, 0f, 10f, 1f, 10f)
mpb.setColor(1f, 0f, 1f, 1f)
mpb.triangle(Vector3(0f, 2f, 3f), // <<<<<< look
        Vector3(3f, 2f, 0f),
        Vector3(0f, 2f, 0f))
model = modelBuilder.end()
instance = ModelInstance(model)
And I got:
Where is the shadow of the triangle?
How can I fix it?
Related
I am trying to make my 3D object move. Using the default VS2022 template, I managed to import a model from a .ply file and load it into the pixel shader / vertex shader. The object displays on screen, but I have no idea how to move it. I know I could edit all the vertices, but I think there is a better way of doing it using a matrix.
I was trying to do:
XMStoreFloat4x4(&m_constantBufferData.model, XMMatrixTranslation(0.5f, 0.0f, 0.0f));
But it shredded the textures.
The upper image is after the move.
Can someone help me and explain how to do this?
Camera initialization:
void Sample3DSceneRenderer::CreateWindowSizeDependentResources()
{
    Size outputSize = m_deviceResources->GetOutputSize();
    float aspectRatio = outputSize.Width / outputSize.Height;
    float fovAngleY = 70.0f * XM_PI / 180.0f;

    if (aspectRatio < 1.0f)
    {
        fovAngleY *= 2.0f;
    }

    XMMATRIX perspectiveMatrix = XMMatrixPerspectiveFovRH(
        fovAngleY,
        aspectRatio,
        0.01f,
        100.0f
    );

    XMFLOAT4X4 orientation = m_deviceResources->GetOrientationTransform3D();
    XMMATRIX orientationMatrix = XMLoadFloat4x4(&orientation);

    XMStoreFloat4x4(
        &m_constantBufferData.projection,
        XMMatrixTranspose(perspectiveMatrix * orientationMatrix)
    );

    static const XMVECTORF32 eye = { 0.0f, 0.7f, 8.0f, 0.0f };
    static const XMVECTORF32 at = { 0.0f, -0.1f, 0.0f, 0.0f };
    static const XMVECTORF32 up = { 0.0f, 1.0f, 0.0f, 0.0f };

    XMStoreFloat4x4(&m_constantBufferData.view, XMMatrixTranspose(XMMatrixLookAtRH(eye, at, up)));
}

void Sample3DSceneRenderer::Move()
{
    XMStoreFloat4x4(&m_constantBufferData.model, XMMatrixTranslation(0.5f, 0.0f, 0.0f));
}
My CreateRegisterView function returns the result I expect.
Here are the last lines of my function:
var content = new StackLayout
{
    Margin = 0,
    Padding = 0,
    BackgroundColor = Color.Gold
};

content.Children.Add(forms);
content.Children.Add(new BoxView
{
    Margin = 0,
    Color = Color.Red,
    VerticalOptions = LayoutOptions.FillAndExpand
});

AbsoluteLayout.SetLayoutFlags(content, AbsoluteLayoutFlags.PositionProportional);
AbsoluteLayout.SetLayoutBounds(content, new Rectangle(0f, 0f, AbsoluteLayout.AutoSize, AbsoluteLayout.AutoSize));
AbsoluteLayout.SetLayoutFlags(_ActivityIndicator, AbsoluteLayoutFlags.PositionProportional);
AbsoluteLayout.SetLayoutBounds(_ActivityIndicator, new Rectangle(0.5, 0.5, AbsoluteLayout.AutoSize, AbsoluteLayout.AutoSize));

var overlay = new AbsoluteLayout();
overlay.Children.Add(content);
overlay.Children.Add(_ActivityIndicator);

return overlay;
But as soon as I add an activity indicator, I get this result:
I don't see any definition of _ActivityIndicator. The only explanation for what you're facing would be that _ActivityIndicator refers to the same instance as content in some code you've omitted.
If that's not the case, try changing your bounds, getting rid of AutoSize for the content view.
Like this:
AbsoluteLayout.SetLayoutBounds(content, new Rectangle(0f, 0f, 1f, 1f));
AbsoluteLayout.SetLayoutFlags(content, AbsoluteLayoutFlags.All);
AbsoluteLayout.SetLayoutBounds(_ActivityIndicator, new Rectangle(0.5, 0.5, AbsoluteLayout.AutoSize, AbsoluteLayout.AutoSize));
AbsoluteLayout.SetLayoutFlags(_ActivityIndicator, AbsoluteLayoutFlags.PositionProportional);
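For completeness, here is a hypothetical definition of _ActivityIndicator (the question doesn't show one); the key point is that it must be its own view instance, not the content view:

// Hypothetical sketch: the question omits how _ActivityIndicator is created.
// It just needs to be its own Xamarin.Forms view, distinct from the content
// view, and it is centered by the PositionProportional bounds above.
ActivityIndicator _ActivityIndicator = new ActivityIndicator
{
    IsRunning = true,
    Color = Color.Black
};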
I had a working shape, but when I try to change the coordinates it disappears.
Here is what I did.
These are the class-level variables:
private BasicEffect _effect;
private VertexPositionColor[] _vertices = new VertexPositionColor[5];
Then in the initialization method I put this:
float aspectRatio = (float)GraphicsDevice.Viewport.Bounds.Width / GraphicsDevice.Viewport.Bounds.Height;
Matrix projection = Matrix.CreatePerspectiveFieldOfView(MathHelper.ToRadians(45), aspectRatio, 0.1f, 1000.0f);
Matrix view = Matrix.CreateLookAt(new Vector3(0, 0, 10), Vector3.Zero, Vector3.Up);
_effect = new BasicEffect(GraphicsDevice);
_effect.LightingEnabled = false;
_effect.TextureEnabled = false;
_effect.VertexColorEnabled = true;
_effect.Projection = projection;
_effect.View = view;
_effect.World = Matrix.Identity;
Color color = Color.Black;
_vertices[0] = new VertexPositionColor(new Vector3(-1, -1, 0), color);
_vertices[1] = new VertexPositionColor(new Vector3(-1, 1, 0), color);
_vertices[2] = new VertexPositionColor(new Vector3(1, -1, 0), color);
_vertices[3] = new VertexPositionColor(new Vector3(1, 1, 0), color);
And in the draw method I put:
foreach (EffectPass pass in _effect.CurrentTechnique.Passes)
{
    // Apply the pass
    pass.Apply();

    // Draw the square
    GraphicsDevice.DrawUserPrimitives(PrimitiveType.TriangleStrip, _vertices, 0, 2);
}
This works fine, but when I change it to the following, it doesn't work anymore:
_vertices[1].Position = new Vector3(-0.10f, 1.37f, -3.0f);
_vertices[0].Position = new Vector3(-0.15f, 1.40f, -3.0f);
_vertices[2].Position = new Vector3(-0.00f, 1.40f, -3.0f);
_vertices[4].Position = new Vector3(0.15f, 1.40f, -3.0f);
_vertices[3].Position = new Vector3(0.10f, 1.37f, -3.0f);
Another change that I made:
GraphicsDevice.DrawUserPrimitives<VertexPositionColor>(PrimitiveType.TriangleStrip, _vertices, 0, 3);
Any idea?
Thanks in advance.
It may be that you are drawing your polygons backwards, meaning you are declaring your vertices in the wrong winding order and may need to reverse them. I am not sure off the top of my head which winding direction is the correct one.
Alternatively, you can turn off back-face culling, which will draw both sides of every polygon. This has a performance cost, since both faces get rasterized. You could try turning culling off first, as in the sketch below; if that fixes it, you know the winding order from my first point is the cause.
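A minimal sketch of the culling route, assuming MonoGame/XNA and the GraphicsDevice, _effect, and _vertices names from the question:

// Remember the current rasterizer state so it can be restored afterwards.
RasterizerState previous = GraphicsDevice.RasterizerState;

// CullNone disables back-face culling, so triangles are drawn regardless
// of their winding order.
GraphicsDevice.RasterizerState = RasterizerState.CullNone;

foreach (EffectPass pass in _effect.CurrentTechnique.Passes)
{
    pass.Apply();
    GraphicsDevice.DrawUserPrimitives(PrimitiveType.TriangleStrip, _vertices, 0, 3);
}

// Restore the previous state if the rest of the scene relies on culling.
GraphicsDevice.RasterizerState = previous;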
I made my borders with this:
class Maze
{
    private Body _agentBody;
    private Sprite _box;
    private GameplayScreen _screen;
    private float _offset;

    public Maze(World world, GameplayScreen screen, Vector2 position)
    {
        _agentBody = BodyFactory.CreateBody(world, position);
        _agentBody.BodyType = BodyType.Dynamic;
        _agentBody.IsStatic = true;
        _agentBody.Restitution = 0.2f;
        _agentBody.Friction = 0.2f;
        _offset = ConvertUnits.ToDisplayUnits(1f);

        // bottom
        _agentBody.CreateFixture(new PolygonShape(PolygonTools.CreateRectangle(1f, 0.05f, new Vector2(0f, 1f), 0), 1f));
        // top
        _agentBody.CreateFixture(new PolygonShape(PolygonTools.CreateRectangle(1f, 0.05f, new Vector2(0f, -1f), 0), 1f));
        // right side
        _agentBody.CreateFixture(new PolygonShape(PolygonTools.CreateRectangle(0.05f, 1f, new Vector2(1f, 0f), 0), 1f));
        // left side
        _agentBody.CreateFixture(new PolygonShape(PolygonTools.CreateRectangle(0.05f, 1f, new Vector2(-1f, 0f), 0), 1f));

        _screen = screen;

        // GFX
        AssetCreator creator = _screen.ScreenManager.Assets;
        _box = new Sprite(creator.TextureFromVertices(PolygonTools.CreateRectangle(1f, 0.05f),
            MaterialType.Blank, Color.White, 1f));
    }

    public Body Body
    {
        get { return _agentBody; }
    }

    public void Draw()
    {
        SpriteBatch batch = _screen.ScreenManager.SpriteBatch;
        batch.Draw(_box.Texture, ConvertUnits.ToDisplayUnits(_agentBody.Position), null,
            Color.White, _agentBody.Rotation, _box.Origin + new Vector2(0f, _offset), 1f, SpriteEffects.None, 0f);
        batch.Draw(_box.Texture, ConvertUnits.ToDisplayUnits(_agentBody.Position), null,
            Color.White, _agentBody.Rotation, _box.Origin + new Vector2(0f, -_offset), 1f, SpriteEffects.None, 0f);
        batch.Draw(_box.Texture, ConvertUnits.ToDisplayUnits(_agentBody.Position), null,
            Color.White, _agentBody.Rotation + MathHelper.Pi / 2f, _box.Origin + new Vector2(0f, _offset), 1f, SpriteEffects.None, 0f);
        batch.Draw(_box.Texture, ConvertUnits.ToDisplayUnits(_agentBody.Position), null,
            Color.White, _agentBody.Rotation + MathHelper.Pi / 2f, _box.Origin + new Vector2(0f, -_offset), 1f, SpriteEffects.None, 0f);
    }
}
And these are my little particles:
for (int i = 0; i < 8; i++)
{
    _sands[i] = BodyFactory.CreateRectangle(_world, 0.05f, 0.05f, 1f);
    _sands[i].IsStatic = false;
    _sands[i].Restitution = 0.1f;
    _sands[i].Friction = 0.1f;
    _sands[i].Position = new Vector2(1.8f + i * 0.2f, 2.2f);
}

_sand = new Sprite(ScreenManager.Assets.TextureFromShape(_sands[0].FixtureList[0].Shape,
    MaterialType.Dots,
    Color.SandyBrown, 0.8f));
I draw it this way:
foreach (Body sand in _sands)
{
    spriteBatch.Draw(_sand.Texture, ConvertUnits.ToDisplayUnits(sand.Position), null, Color.SandyBrown, sand.Rotation, _sand.Origin, 1f, SpriteEffects.None, 0f);
}

_maze.Draw();
But I can't figure out why the particles stay in place when I rotate the borders. I tried changing the restitution of the particles: when it is 1f they bounce just fine, and when I rotate the borders they bounce off the borders' new position; but with the settings above the particles just fall down. The ones inside the borders stop at the bottom border, and the others fall straight down. So after starting I get the first image, and after rotating the borders I get the second image. What am I doing wrong? Why do they bounce with a restitution of 1f, but not with 0.2f?
Edit:
New lines in the maze constructor:
_agentBody = BodyFactory.CreateBody(world, position);
_agentBody.BodyType = BodyType.Dynamic;
_agentBody.IgnoreGravity = true;
_agentBody.Restitution = 0.1f;
_agentBody.Friction = 1f;
_offset = ConvertUnits.ToDisplayUnits(1.5f);
FixtureFactory.AttachRectangle(3f, 0.1f, 1f, new Vector2(0, 1.55f), _agentBody);
FixtureFactory.AttachRectangle(3f, 0.1f, 1f, new Vector2(0f, -1.55f), _agentBody);
FixtureFactory.AttachRectangle(width, 3f, 1f, new Vector2(-1.55f, 0f), _agentBody);
FixtureFactory.AttachRectangle(width, 3f, 1f, new Vector2(1.55f, 0f), _agentBody);
This is how it looks with the debug view:
Rotating the body:
public override void HandleInput(GameTime gameTime, InputState input)
{
    if (input == null)
        throw new ArgumentNullException("input");

    // Read in our gestures
    foreach (GestureSample gesture in input.Gestures)
    {
        if (gesture.GestureType == GestureType.HorizontalDrag)
        {
            if (gesture.Delta.X < 0)
            {
                _maze.Body.Rotation += 0.02f;
            }
            else if (gesture.Delta.X > 0)
            {
                _maze.Body.Rotation -= 0.02f;
            }
        }
    }
}
The little particle bodies are probably going to sleep once they stop moving. The frame around them is a static body, so the engine does not expect it to ever move, but you are moving it; in this situation the sleeping particle bodies will not wake up.
You will need to configure the particle bodies so that they cannot sleep, or make the frame around them a dynamic or kinematic body (see the sketch below). As a general rule, if a body is going to move, don't make it a static body.
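A minimal sketch of both options, assuming Farseer Physics Engine 3.x and the _sands and _agentBody fields from the question:

// Option 1: keep the particles from ever sleeping.
foreach (Body sand in _sands)
{
    sand.SleepingAllowed = false;
}

// Option 2: make the frame kinematic instead of static, so the engine
// expects it to move and keeps waking up anything resting against it.
_agentBody.BodyType = BodyType.Kinematic;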
I'm loading a bitmap into the renderer class, and I want to render the texture centered on the screen, with the correct aspect ratio, and as big as it can be.
The vertex and texture coordinate data:
private final float[] mVerticesData =
{
        -1f,  1f, 0.0f,   // Position 0
        0.0f, 0.0f,       // TexCoord 0
        -1f, -1f, 0.0f,   // Position 1
        0.0f, 1.0f,       // TexCoord 1
        1f, -1f, 0.0f,    // Position 2
        1.0f, 1.0f,       // TexCoord 2
        1f,  1f, 0.0f,    // Position 3
        1.0f, 0.0f        // TexCoord 3
};

private final short[] mIndicesData =
{
        0, 1, 2, 0, 2, 3
};
And the draw method:
public void onDrawFrame(GL10 glUnused)
{
    // Set the viewport
    GLES20.glViewport(0, 0, bitmap.width, bitmap.height);

    // Clear the color buffer
    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);

    // Use the program object
    GLES20.glUseProgram(mProgramObject);

    // Load the vertex position
    mVertices.position(0);
    GLES20.glVertexAttribPointer(mPositionLoc, 3, GLES20.GL_FLOAT,
            false, 5 * 4, mVertices);

    // Load the texture coordinate
    mVertices.position(3);
    GLES20.glVertexAttribPointer(mTexCoordLoc, 2, GLES20.GL_FLOAT,
            false, 5 * 4, mVertices);

    GLES20.glEnableVertexAttribArray(mPositionLoc);
    GLES20.glEnableVertexAttribArray(mTexCoordLoc);

    // Bind the texture
    GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTextureId);

    // Set the sampler texture unit to 0
    GLES20.glUniform1i(mSamplerLoc, 0);

    GLES20.glDrawElements(GLES20.GL_TRIANGLES, 6, GLES20.GL_UNSIGNED_SHORT, mIndices);
}
Right now I'm setting the viewport to the bitmap dimensions (not working). How should I calculate the viewport dimensions so that the texture fits the screen while keeping its aspect ratio?
Thank you.
Maybe you should use a ModelViewProjection matrix. Look at this source:
http://code.google.com/p/android-opengles-samples-for-vikulov-blogspot-com/source/browse/trunk/OpenglTemplate/src/com/vikulov/opengl/MyGLRenderer.java