For all the other shader stages there is a VSSetShader or PSSetShader function on the device context, but I don't see the equivalent for the geometry shader. Thanks!
You're looking for ID3D11DeviceContext::GSSetShader.
I would like to convert the shader pass below into a shader material.
https://github.com/felixturner/bad-tv-shader
Is it possible, and how should I proceed? It seems to me it should be straightforward if the ShaderMaterial is applied to a PlaneGeometry.
I tried this conversion in this codesandbox: https://codesandbox.io/s/r3f-wavey-image-shader-forked-4fb238
The only thing I have done so far is copy the code of the shader pass into an object.
Thanks.
I am working on a Three.js project. I am hoping to apply a custom shader to the entire scene, rather than just to an object. I'm very inexperienced with shaders, so I'm not even sure this concept makes sense, but is it possible to apply a shader to the renderer? Or maybe to the camera?
It is a fragment shader, if that makes a difference, but it would be nice to be able to apply any type of shader in the future.
You can apply shaders to the scene with ✨ post-processing ✨.
Here is the link to the three.js documentation about post-processing 🔽🔽🔽
https://threejs.org/docs/#manual/en/introduction/How-to-use-post-processing
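For example, here is a minimal sketch of the wiring, assuming you already have a renderer, scene, and camera; your effect goes in the fragment shader, which receives the rendered scene through the tDiffuse uniform:

```js
import { EffectComposer } from 'three/examples/jsm/postprocessing/EffectComposer.js';
import { RenderPass } from 'three/examples/jsm/postprocessing/RenderPass.js';
import { ShaderPass } from 'three/examples/jsm/postprocessing/ShaderPass.js';

// Render the scene into the composer instead of directly to the screen.
const composer = new EffectComposer(renderer);
composer.addPass(new RenderPass(scene, camera));

// Wrap a custom fragment shader in a ShaderPass. three.js feeds the
// rendered scene into the `tDiffuse` uniform automatically.
const MyShader = {
  uniforms: { tDiffuse: { value: null } },
  vertexShader: `
    varying vec2 vUv;
    void main() {
      vUv = uv;
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }`,
  fragmentShader: `
    uniform sampler2D tDiffuse;
    varying vec2 vUv;
    void main() {
      // Replace this with your own effect; here we just invert the colors.
      vec4 color = texture2D(tDiffuse, vUv);
      gl_FragColor = vec4(1.0 - color.rgb, color.a);
    }`
};
composer.addPass(new ShaderPass(MyShader));

// In the animation loop, call composer.render() instead of renderer.render().
composer.render();
```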
My program is a rain particle system that uses a compute shader to advance the raindrops and a rendering pipeline (vertex shader, geometry shader, pixel shader) to draw them.
I use the DrawInstancedIndirect draw call to feed the results of the compute shader into the rendering step.
My problem is in the rendering step, in the geometry shader, where I'm trying to draw a billboard for each raindrop. If I just draw a normal rectangle, it renders fine, but when I change to a billboard, nothing reaches the render target. I'm trying to find a way to debug this geometry shader. I used the following tools, but they did not work out for me.
Graphics Debugger in VS2012. It seems this tool does not support the DrawInstancedIndirect draw call.
GPU PerfStudio. It supports the vertex and pixel shaders, but not the geometry shader. I tried passing intermediate values from the geometry shader to the pixel shader to inspect them, and they are all zero. But I need to dig into the geometry shader to find the error.
Nsight by NVIDIA. My graphics card is a 720M, and sadly Nsight only supports the 730M and up. Maybe that is why the shader list is empty while I am debugging.
I'm desperate now and see no way to find the problem. I hope you can suggest a way to debug this geometry shader. Thanks so much!
You can try RenderDoc by Crytek; it's really easy to use and you can inspect every buffer at every stage.
I have a Collada object which I load into my scene with Three.js.
Now I want to change some vertex positions of the model in the vertex shader, which is no problem.
But to do this I have to skip the exported Collada material and use a ShaderMaterial instead.
The problem is that I now have to calculate the complete lighting of the scene in my fragment shader.
Before, with the Collada material, the complete lighting was calculated by the framework, using a directional light and a hemisphere light.
So my question is whether there is a solution where I can leave the fragment shader untouched and all the colors are calculated as if I were not using a ShaderMaterial.
I tried to use THREE.ShaderLib and pass only the fragmentShader of the Phong shader together with my own vertexShader. But this only gave errors saying that the two shaders do not declare the same varyings.
Unfortunately, you are correct. You have to create your own ShaderMaterial, and it will be tedious to incorporate scene lighting.
If your scene lighting is not that important, you can hack in some ambient light and a single light at the camera location in your fragment shader.
If your scene lighting is important, then you need to set the ShaderMaterial parameter lights: true, and you will have access to the scene light uniforms in your vertex and fragment shaders.
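A rough sketch of that setup (uniform names and formats differ between three.js versions, and myVertexShader/myFragmentShader stand in for your own shaders, so treat this as an outline rather than drop-in code):

```js
var material = new THREE.ShaderMaterial({
  lights: true, // ask three.js to fill in the scene light uniforms
  uniforms: THREE.UniformsUtils.merge([
    THREE.UniformsLib["lights"],  // declares the light uniforms
    { diffuse: { type: "c", value: new THREE.Color(0xcccccc) } }
  ]),
  vertexShader: myVertexShader,     // your custom vertex displacement
  fragmentShader: myFragmentShader  // can now read the light uniforms
});
```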
three.js r.63
I'm using OpenGL ES + GLKit. I've never been this low-level before in my life, so I still have a lot to learn. I've developed Unity games before, where you just give it a .obj file and the corresponding texture and it's done. (The UV mapping happens to be inside the .obj file?)
I want to develop a kind of special Toon Shader with some different characteristics for use with a 3D model. So I need to write a vertex shader (.vsh) and a fragment shader (.fsh), right?
However, I have just learned that in order to apply a texture to a model with the correct UV coordinates, you have to do this in the shader? (Am I right?) With a "texture shader"?
So, if I want to both apply the texture with UV mapping and apply my special toon shader, do I have to write both in the same shader? Is there no way I can create a plug-and-play toon shader that I can use with anything?
As a side question, in which file format are the UV coordinates stored, and how do I get them into a shader program? What kind of attribute variable do I need?
So I need to write a vertex shader (.vsh) and fragment shader (.fsh), right?
Yes.
However, in order to apply a texture to a model with the correct UV coordinates, you have to do this in the shader?
True.
There is no way I can create a plug-and-play toon shader so I can use it with anything?
Check uber-shaders: one big shader whose features are switched on and off with preprocessor defines, so the same toon code can be reused with any material.
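The idea, sketched as a GLSL ES fragment shader in a JavaScript string (GLSL ES is the same shading language GLKit uses, so the shader part carries over directly; the define names here are made up for illustration):

```js
// One shader source, many variants: features are compiled in or out
// with preprocessor defines prepended before compilation.
const fragSource = `
  #ifdef USE_TEXTURE
    uniform sampler2D u_texture;
    varying vec2 v_texCoord;
  #endif
  void main() {
    vec4 color = vec4(1.0);
    #ifdef USE_TEXTURE
      color = texture2D(u_texture, v_texCoord);
    #endif
    #ifdef TOON
      color.rgb = floor(color.rgb * 4.0) / 4.0; // quantize for a toon look
    #endif
    gl_FragColor = color;
  }`;

// A textured toon variant and a plain variant from the same source:
const toonTextured = '#define USE_TEXTURE\n#define TOON\n' + fragSource;
const plain = fragSource;
```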
And how can I take that into a shader program? What kind of attribute variable?
You define your attributes in the shader yourself. Check this GLSL tutorial.
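For instance, here is a sketch in WebGL, whose API mirrors OpenGL ES 2.0 one-to-one (gl.getAttribLocation is glGetAttribLocation in GLKit, and so on); the gl context, compiled program, and UV buffer are assumed to be set up already:

```js
// In the vertex shader: UVs arrive as a vec2 attribute and are handed
// to the fragment shader through a varying.
const vsSource = `
  attribute vec4 a_position;
  attribute vec2 a_texCoord;  // one UV pair per vertex
  varying vec2 v_texCoord;
  void main() {
    v_texCoord = a_texCoord;
    gl_Position = a_position;
  }`;

// On the application side: bind a buffer of floats to that attribute.
// (gl, program, and uvBuffer are assumed to exist already.)
const loc = gl.getAttribLocation(program, 'a_texCoord');
gl.bindBuffer(gl.ARRAY_BUFFER, uvBuffer);        // Float32Array of u,v pairs
gl.enableVertexAttribArray(loc);
gl.vertexAttribPointer(loc, 2, gl.FLOAT, false, 0, 0); // 2 floats per vertex
```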