I'd like to add a second crate texture to the shader using the shader/technique provided with this example:
https://threejs.org/examples/#webgl_buffergeometry_instancing_dynamic
I figured I could pass in another uniform to add a second map.
crateMaterial = new THREE.RawShaderMaterial( {
    uniforms: {
        map:  { value: new THREE.TextureLoader().load( './img/textures/crate/crate.gif' ) },
        map2: { value: new THREE.TextureLoader().load( './img/textures/crate2/crate2.gif' ) }
    },
    // vertexShader / fragmentShader as in the example
} );
However, I'm struggling to figure out how to "tag" specific crates and then have the shader draw those vertices with the correct texture, as my experience and skill with GLSL are quite limited.
Could I just pass in another uniform consisting of (vertex) indices to specify where the shader should apply the second texture? I.e.:
crateMaterial.uniforms.cratesTexturemap = [];
for ( let i = 0; i < cratesToRender; i++ ) {
    /* set position */
    ...
    this._instancePositions.push( position.x, position.y, position.z );
    if ( drawCrate2 ) {
        crateMaterial.uniforms.cratesTexturemap.push( i );     /* correlates to (vertex) position index */
        crateMaterial.uniforms.cratesTexturemap.push( i + 1 );
        crateMaterial.uniforms.cratesTexturemap.push( i + 2 );
    }
    ...
}
Also, performance/memory-wise, is it better to have a (dynamic) array of textures passed to the shader, or is passing them one by one (a uniform value for every texture, as above) more advantageous?
Example shader code for reference:
<script id="vertexShader" type="x-shader/x-vertex">
precision highp float;
uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;
attribute vec3 position;
attribute vec3 offset;
attribute vec2 uv;
attribute vec4 orientation;
varying vec2 vUv;
// http://www.geeks3d.com/20141201/how-to-rotate-a-vertex-by-a-quaternion-in-glsl/
vec3 applyQuaternionToVector( vec4 q, vec3 v ){
return v + 2.0 * cross( q.xyz, cross( q.xyz, v ) + q.w * v );
}
void main() {
vec3 vPosition = applyQuaternionToVector( orientation, position );
vUv = uv;
gl_Position = projectionMatrix * modelViewMatrix * vec4( offset + vPosition, 1.0 );
}
</script>
<script id="fragmentShader" type="x-shader/x-fragment">
precision highp float;
uniform sampler2D map;
varying vec2 vUv;
void main() {
gl_FragColor = texture2D( map, vUv );
}
</script>
Can't you just pass an array of textures and choose which one to apply using an index passed in as an int uniform?
So for every crate you could do
crate.material.uniforms.index.value = 2
to choose the 3rd texture in the array
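One way to do the tagging with the instanced setup from that example, so everything still renders in one draw call, is a per-instance attribute that selects the texture. A minimal sketch; texIndex is a made-up attribute name, drawCrate2 is the flag from the snippet above, and geometry is the THREE.InstancedBufferGeometry from the example:

// JS: one float per crate instance, 0.0 -> map, 1.0 -> map2
var texIndex = new Float32Array( cratesToRender );
for ( var i = 0; i < cratesToRender; i ++ ) {
    texIndex[ i ] = drawCrate2 ? 1.0 : 0.0;
}
geometry.addAttribute( 'texIndex', new THREE.InstancedBufferAttribute( texIndex, 1 ) ); // setAttribute on newer revisions

// Vertex shader additions (RawShaderMaterial, so declare everything yourself)
attribute float texIndex;
varying float vTexIndex;
// in main(): vTexIndex = texIndex;

// Fragment shader: pick between the two crate textures per instance
uniform sampler2D map;
uniform sampler2D map2;
varying float vTexIndex;
varying vec2 vUv;
void main() {
    vec4 c1 = texture2D( map, vUv );
    vec4 c2 = texture2D( map2, vUv );
    gl_FragColor = mix( c1, c2, step( 0.5, vTexIndex ) ); // 0.0 -> map, 1.0 -> map2
}

This keeps the per-crate choice on the GPU side instead of a growing uniform array, which is usually cheaper than switching uniforms or materials per crate.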
Related
I'm trying to use the shader from this post https://stackoverflow.com/a/27764539/6736544 with a ThreeJS geometry, so far without success.
This is the original shader code from @gman:
<script src="https://webglfundamentals.org/webgl/resources/webgl-utils.js"></script>
<script id="vs" type="foo">
attribute vec4 a_position;
attribute float a_v;
varying float v_v;
void main() {
// PS: In a real WebGL app you'd probably need to multiply a_position by
// a matrix at a minimum
gl_Position = a_position;
v_v = a_v;
}
</script>
<script id="fs" type="foo">
precision mediump float;
varying float v_v;
uniform float u_borderSize;
uniform vec4 u_baseColor;
uniform vec4 u_borderColor;
void main() {
float mixAmount = step(u_borderSize, v_v);
gl_FragColor = mix(u_baseColor,
u_borderColor,
mixAmount);
}
</script>
<canvas id="c" width="256" height="256"></canvas>
This is my attempt:
<script>
var vertexShader = `
    varying vec2 vUv;
    attribute float alpha;
    varying float vAlpha;
    attribute vec3 center;
    varying vec3 vCenter;
    void main() {
        vUv = uv;
        vAlpha = alpha;
        vCenter = center;
        gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }
`;
var fragmentShader = `
    //precision mediump float;
    uniform vec3 color1;
    uniform float color1Alpha;
    uniform vec3 borderColor;
    uniform float borderThickness;
    varying vec2 vUv;
    varying float vAlpha;
    varying vec3 vCenter;
    void main() {
        float u_borderSize = borderThickness;
        vec4 u_baseColor = vec4(color1, color1Alpha);
        vec4 u_borderColor = vec4(borderColor, 1.0);
        // This is where I get stuck. No idea how to get a correct 'v_v' value.
        float v_v = vUv.y;
        float mixAmount = step(u_borderSize, v_v);
        gl_FragColor = mix(u_baseColor, u_borderColor, mixAmount);
    }
`;
</script>
See the jsfiddle here: https://jsfiddle.net/vuqarejz/
This is what I'm trying to achieve:
Questions:
Not sure what I'm missing, nor where the 'a_v' attribute value is supposed to come from.
Maybe it is the way UVs are unwrapped?
Maybe this is not the correct shader to use for my purpose?
Solved by tfoller in ThreeJS forum:
https://discourse.threejs.org/t/unwrapping-a-custom-shapebuffergeometry/39565/4
I want to render a texture to a plane using custom shaders. This texture has an 'offset' property set, which works correctly when I use a standard three.js material. However, I cannot figure out how to access this offset in my custom fragment shader; it simply renders the whole texture over the whole plane:
shaders:
<script id="vertex_shader" type="x-shader/x-vertex">
varying vec2 vUv;
void main() {
vUv = uv;
gl_Position = projectionMatrix *
modelViewMatrix *
vec4(position,1.0);
}
</script>
<script id="fragment_shader" type="x-shader/x-fragment">
uniform sampler2D texture1;
varying vec2 vUv;
void main()
{
gl_FragColor = texture2D(texture1, vUv);
}
</script>
If I could somehow say something like:
gl_FragColor = texture2D(texture1, vUv + texture1.offset);
then maybe that would work, but obviously that throws an error.
UPDATE:
So I sent the texture offset in as a uniform and that works. Don't know why I didn't think of that.
If I understand your question correctly, then the answer should be to add the uniform mat3 uvTransform; declaration to your fragment shader and use it there.
THREE will look for and populate that uniform with the texture transformation (which includes texture1.offset) when rendering the texture onto your geometry.
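For reference, this is roughly how the built-in three.js shaders consume that matrix. If it isn't populated automatically in your setup, it can be filled from the texture's own matrix (a sketch, assuming a newer three.js revision that has Texture.updateMatrix, and that the uniform was declared in the material):

// Fragment shader sketch: apply the full texture transform (offset / repeat / rotation)
uniform mat3 uvTransform;
uniform sampler2D texture1;
varying vec2 vUv;
void main() {
    vec2 transformedUv = ( uvTransform * vec3( vUv, 1.0 ) ).xy;
    gl_FragColor = texture2D( texture1, transformedUv );
}

// JS side (assumes material.uniforms.uvTransform = { value: new THREE.Matrix3() })
texture1.updateMatrix();                                      // composes offset / repeat / rotation / center
material.uniforms.uvTransform.value.copy( texture1.matrix );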
Alternatively, you can pass the offset in directly as its own uniform and use it to shift your texture sampling as follows:
<script id="vertex_shader" type="x-shader/x-vertex">
varying vec2 vUv;
void main() {
vUv = uv;
gl_Position = projectionMatrix * modelViewMatrix * vec4(position,1.0);
}
</script>
<script id="fragment_shader" type="x-shader/x-fragment">
// [UPDATE] The uv offset uniform we defined below
uniform vec2 uvOffset;
// [UPDATE] The texture uniform we defined below
uniform sampler2D texture;
varying vec2 vUv;
void main()
{
// [UPDATE] Apply offset to texture lookup
gl_FragColor = texture2D(texture, vUv + uvOffset);
}
</script>
You would then accompany the vertex and fragment shaders above, with the following THREE.ShaderMaterial:
<script>
var material = new THREE.ShaderMaterial({
    uniforms: THREE.UniformsUtils.merge([
        {
            // Declare the texture uniform used in the fragment shader
            texture: { type: 't', value: null },
            // Declare the texture offset uniform used in the fragment shader
            uvOffset: { type: 'v2', value: new THREE.Vector2(0, 0) }
        }
    ]),
    vertexShader:
        document.getElementById('vertex_shader').textContent,
    fragmentShader:
        document.getElementById('fragment_shader').textContent
});

// Shader uniforms can be updated like so
material.uniforms.texture.value = yourTexture;
material.uniforms.uvOffset.value = yourTextureOffsetVector2;
</script>
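If the offset already lives on the texture object itself (as in the question), the uniform can simply be kept in sync with it, e.g.:

// Mirror the texture's own offset (a THREE.Vector2) into the shader uniform whenever it changes
material.uniforms.uvOffset.value.copy( yourTexture.offset );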
I want to see a wireframe of an object without the diagonals, like this:
Currently I add lines according to the vertices; the problem is that after I have several of those I experience major performance degradation.
The examples here are either too new for my version of Three or don't work (I commented there about it).
So I want to try to implement a shader instead.
I tried to use this shader: https://stackoverflow.com/a/31610464/4279201 but it breaks the shape into parts and I'm getting WebGL errors.
That's how I use it:
const vertexShader = `
varying vec2 vUv;
void main() {
vUv = uv;
gl_Position = projectionMatrix * modelViewMatrix * vec4(position,1.0);
}
`
const fragmentShader = `
#version 150 compatibility
flat in float diffuse;
flat in float specular;
flat in vec3 edge_mask;
in vec2 bary;
uniform float mesh_width = 1.0;
uniform vec3 mesh_color = vec3(0.0, 0.0, 0.0);
uniform bool lighting = true;
out vec4 frag_color ;
float edge_factor(){
vec3 bary3 = vec3(bary.x, bary.y, 1.0 - bary.x - bary.y);
vec3 d = fwidth(bary3);
vec3 a3 = smoothstep(vec3(0.0, 0.0, 0.0), d * mesh_width, bary3);
a3 = vec3(1.0, 1.0, 1.0) - edge_mask + edge_mask * a3;
return min(min(a3.x, a3.y), a3.z);
}
void main() {
float s = (lighting && gl_FrontFacing) ? 1.0 : -1.0;
vec4 Kdiff = gl_FrontFacing ?
gl_FrontMaterial.diffuse : gl_BackMaterial.diffuse;
float sdiffuse = s * diffuse;
vec4 result = vec4(0.1, 0.1, 0.1, 1.0);
if (sdiffuse > 0.0) {
result += sdiffuse * Kdiff +
specular * gl_FrontMaterial.specular;
}
frag_color = (mesh_width != 0.0) ?
mix(vec4(mesh_color, 1.0), result, edge_factor()) :
result;
}`
...
const uniforms = {
    color: {
        value: new THREE.Vector4(0, 0, 1, 1),
        type: 'v4'
    }
}

const material = new THREE.ShaderMaterial({
    fragmentShader: data.fragmentShader,
    vertexShader: data.vertexShader,
    uniforms
})

this._viewer.impl.matman().addMaterial(
    data.name, material, true)

const fragList = this._viewer.model.getFragmentList()

this.toArray(fragIds).forEach((fragId) => {
    fragList.setMaterial(fragId, material)
})
So to implement this shader, would the right approach be to basically check the angle between every two vertices, and draw a line only if the angle is 90 degrees?
How can I have access to all the vertices of the shape from the vertex shader?
And how can I tell the fragment shader to draw a line between two vertices that match the above condition? (also to leave the default shading for everything else as is)
I'm using Autodesk viewer that uses Three.js rev 71.
// -- Vertex Shader --
precision mediump float;
// Input from buffers
attribute vec3 aPosition;
attribute vec3 aBaryCoord;
// Value interpolated across pixels and passed to the fragment shader
varying vec3 vBaryCoord;
// Uniforms
uniform mat4 uModelMatrix;
uniform mat4 uViewMatrix;
uniform mat4 uProjMatrix;
void main() {
    vBaryCoord = aBaryCoord;
    gl_Position = uProjMatrix * uViewMatrix * uModelMatrix * vec4(aPosition, 1.0);
}
// ---------------------
// -- Fragment Shader --
// This shader doesn't perform any lighting
// fwidth() needs the standard derivatives extension in WebGL1
#extension GL_OES_standard_derivatives : enable
precision mediump float;
varying vec3 vBaryCoord;
uniform vec3 uMeshColour;
float edgeFactor() {
    vec3 d = fwidth(vBaryCoord);
    vec3 a3 = smoothstep(vec3(0.0, 0.0, 0.0), d * 1.5, vBaryCoord);
    return min(min(a3.x, a3.y), a3.z);
}
void main() {
    gl_FragColor = vec4(uMeshColour, (1.0 - edgeFactor()) * 0.95);
}
// ---------------------
/*
This code isn't tested so take it with a grain of salt
Idea taken from
http://codeflow.org/entries/2012/aug/02/easy-wireframe-display-with-barycentric-coordinates/
*/
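To feed aBaryCoord from three.js, the geometry needs to be non-indexed so each triangle owns its three vertices, and each vertex gets one corner of the barycentric basis. A rough sketch under that assumption (the exact attribute wiring differs between r71 and current revisions):

// Sketch: give every triangle's corners (1,0,0) / (0,1,0) / (0,0,1).
// Assumes a non-indexed THREE.BufferGeometry, i.e. each consecutive trio of vertices is one triangle.
const vertexCount = geometry.attributes.position.array.length / 3;
const bary = new Float32Array( vertexCount * 3 );
for ( let i = 0; i < vertexCount; i += 3 ) {
    bary.set( [ 1, 0, 0 ], ( i + 0 ) * 3 );
    bary.set( [ 0, 1, 0 ], ( i + 1 ) * 3 );
    bary.set( [ 0, 0, 1 ], ( i + 2 ) * 3 );
}
geometry.addAttribute( 'aBaryCoord', new THREE.BufferAttribute( bary, 3 ) ); // setAttribute on newer revisions

Note that with THREE.ShaderMaterial the built-in names (position, modelViewMatrix, projectionMatrix) are injected automatically, so the aPosition/uModelMatrix-style names in the shader above would need to be renamed accordingly or used with a RawShaderMaterial-style setup.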
My vertex shader:
varying vec2 texCoord;
void main() {
texCoord = uv;
gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}
My fragment shader:
varying vec2 texCoord;
uniform sampler2D texture1;
uniform sampler2D texture2;
uniform float multiplier;
void main( void ) {
vec3 tex1 = texture2D(texture1, texCoord).xyz ;
vec3 tex2 = texture2D(texture2, texCoord).xyz ;
vec3 finaltex = mix( tex1, tex2, multiplier) ;
gl_FragColor = vec4(finaltex , 1.0);
}
Now this works very well when I run it using the two textures; check http://shaderfrog.com/app/view/68 for the multiplier action.
But now what I want: I have a single texture like this:
So with the single texture I want to offset my texCoord so that I only need to sample one texture and can get three representations from it, like:
var shaderMaterial = new THREE.ShaderMaterial({
uniforms:{
texture1: { type: "t", value: texture1 }
},
vertexShader: document.getElementById( 'vertexShader' ).textContent,
fragmentShader: document.getElementById( 'fragmentShader' ).textContent
});
Can I offset my tri-color texture in the fragment shader? Or can someone help me modify the fragment shader so that I can pass a uniform to index the tri-color strip into the individual yellow, pink and red sections?
So help from either the shader side or the three.js side would work for me.
I referenced the two-texture version because I want a cross-fade effect between textures, and I want the same cross-fade in the fragment shader. (Independently of this, I have already achieved it with texture.offset.x = currentColumn / horizontal and texture.offset.y = currentRow / vertical.)
I found the answer to this question and even implemented it in my application :D
vertexShader:
varying vec2 texCoord;
void main() {
    texCoord = uv;
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}
FragmentShader:
varying vec2 texCoord;
uniform sampler2D texture1;
uniform vec2 offset_current;
uniform vec2 offset_next;
uniform float multiplier;
void main( void ) {
    vec2 offset1 = offset_current + vec2( texCoord.x * 1.0, texCoord.y * 0.333333 );
    vec2 offset2 = offset_next + vec2( texCoord.x * 1.0, texCoord.y * 0.333333 );
    vec3 tex1 = texture2D( texture1, offset1 ).xyz;
    vec3 tex2 = texture2D( texture1, offset2 ).xyz;
    vec3 mixCol = mix( tex1, tex2, multiplier );
    vec4 fragColor = vec4( mixCol, 1.0 );
    if ( fragColor.a == 0.0 )
        discard;
    gl_FragColor = fragColor;
}
Explanation:
Since I have a vertical texture with three different bands, I make my offset 0.3333 in the y direction, because texture coordinates run over [0, 1]. I have extended the same code for the horizontal direction.
If someone wants to make this dynamic, then instead of hard-coding 0.3333 it can be computed from the number of bands, taking inspiration from the linked answer.
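For completeness, the three.js side that drives those uniforms could look roughly like this (a sketch, assuming a vertical 3-band strip; atlasTexture and the script ids are placeholders):

<script>
var bands = 3;                       // three colour bands stacked vertically
var bandHeight = 1.0 / bands;        // the hard-coded 0.333333 in the shader

var material = new THREE.ShaderMaterial({
    uniforms: {
        texture1:       { type: 't',  value: atlasTexture },
        offset_current: { type: 'v2', value: new THREE.Vector2( 0, 0 ) },
        offset_next:    { type: 'v2', value: new THREE.Vector2( 0, bandHeight ) },
        multiplier:     { type: 'f',  value: 0 }
    },
    vertexShader: document.getElementById( 'vertexShader' ).textContent,
    fragmentShader: document.getElementById( 'fragmentShader' ).textContent
});

// Point the shader at two bands, then animate multiplier from 0 to 1 to cross-fade between them
function setBands( current, next ) {
    material.uniforms.offset_current.value.set( 0, current * bandHeight );
    material.uniforms.offset_next.value.set( 0, next * bandHeight );
    material.uniforms.multiplier.value = 0;
}
</script>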
Hi, can anyone help me with this? I have this shader; it works with THREE.Mesh but doesn't with THREE.ParticleSystem.
I want each particle to have a portion of a given map (texture) and to change its position accordingly, something like this: http://www.chromeexperiments.com/detail/webcam-displacement/?f=webgl
<script id="vs" type="x-shader/x-vertex">
uniform sampler2D map;
varying vec2 vUv;
void main() {
vUv = uv;
vec4 color = texture2D( map, vUv );
float value = ( color.r + color.g + color.b ) / 3.0;
vec4 pos = vec4( position.xy, value * 100.0, 1.0 );
gl_PointSize = 20.0;
gl_Position = projectionMatrix * modelViewMatrix * pos;
}
</script>
<script id="fs" type="x-shader/x-fragment">
uniform sampler2D map;
varying vec2 vUv;
void main() {
gl_FragColor = texture2D( map, vUv );
}
</script>
ParticleSystem doesn't really support UVs as there aren't faces, just single points. Texture mapping particles is done with gl_PointCoord (IIRC), but that gives you same mapping for every particle. In order to give different portion of the same texture to each particle, you should use BufferGeometry, which in the latest version of three.js supports all attributes including custom ones (and it is very efficient and fast!). You'd then supply a vec2 offset attribute for each particle: you get the correct UV from this offset and the gl_PointCoord.
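A rough sketch of that idea with a more recent three.js (THREE.Points has since replaced ParticleSystem). The attribute and uniform names here are my own, texture is assumed to be a THREE.Texture loaded elsewhere, and the fragment shader combines the per-particle offset with gl_PointCoord so each point samples only its own cell of the map:

var COLS = 64, ROWS = 48;                        // one particle per sampled cell of the map
var count = COLS * ROWS;

var geometry = new THREE.BufferGeometry();
var positions = new Float32Array( count * 3 );
var uvOffsets = new Float32Array( count * 2 );   // which part of the texture this particle shows

var i = 0;
for ( var y = 0; y < ROWS; y ++ ) {
    for ( var x = 0; x < COLS; x ++, i ++ ) {
        positions[ i * 3 + 0 ] = x - COLS / 2;
        positions[ i * 3 + 1 ] = y - ROWS / 2;
        positions[ i * 3 + 2 ] = 0;
        uvOffsets[ i * 2 + 0 ] = x / COLS;
        uvOffsets[ i * 2 + 1 ] = y / ROWS;
    }
}
geometry.setAttribute( 'position', new THREE.BufferAttribute( positions, 3 ) ); // addAttribute on old revisions
geometry.setAttribute( 'uvOffset', new THREE.BufferAttribute( uvOffsets, 2 ) );

var material = new THREE.ShaderMaterial( {
    uniforms: {
        map: { value: texture },
        cellSize: { value: new THREE.Vector2( 1 / COLS, 1 / ROWS ) }
    },
    vertexShader: `
        uniform sampler2D map;
        attribute vec2 uvOffset;
        varying vec2 vUvOffset;
        void main() {
            vUvOffset = uvOffset;
            vec4 color = texture2D( map, uvOffset );
            float value = ( color.r + color.g + color.b ) / 3.0;
            vec4 pos = vec4( position.xy, value * 100.0, 1.0 );
            gl_PointSize = 20.0;
            gl_Position = projectionMatrix * modelViewMatrix * pos;
        }
    `,
    fragmentShader: `
        uniform sampler2D map;
        uniform vec2 cellSize;
        varying vec2 vUvOffset;
        void main() {
            // gl_PointCoord runs 0..1 across the point sprite; flip its y if the texture appears upside down
            vec2 uv = vUvOffset + gl_PointCoord * cellSize;
            gl_FragColor = texture2D( map, uv );
        }
    `
} );

var points = new THREE.Points( geometry, material );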