three.js emissive material maps

I'm currently experimenting a bit in three.js, and I'd like to use an emissive map. I've tried just loading a texture into the emissive property of a phong material, but it doesn't work like that, unfortunately. Here's my code:
var params = {
    emissive: THREE.ImageUtils.loadTexture( emissive ),
    shininess: shininess,
    map: THREE.ImageUtils.loadTexture( map ),
    normalMap: THREE.ImageUtils.loadTexture( normalMap ),
    normalScale: new THREE.Vector2( 0, -1 ),
    envMap: this.reflectionCube,
    combine: THREE.MixOperation,
    reflectivity: 0.05
};
var material = new THREE.MeshPhongMaterial( params );
Can anyone point me in the right direction to get the emissive map working?

You can make a material with emissive (glow) map support by extending the shaders of existing three.js materials (MeshPhongMaterial, MeshLambertMaterial, etc.).
The benefit is you retain all the functionality from the standard three.js material and also add glow map support.
For the purposes of this example, I'll use the three.js Phong shader as a starting point:
Make a "PhongGlowShader" by extending (via UniformsLib/ShaderChunk) the existing Phong shader
Add glow map uniforms:
"glowMap" : { type: "t", value: null },
"glowIntensity": { type: "f", value: 1 },
Add a glow map factor to its fragment shader:
float glow = texture2D(glowMap, vUv).x * glowIntensity * 2.0; // optional * 2.0 and clamp
gl_FragColor.xyz = texelColor.xyz * clamp(emissive + totalDiffuse + ambientLightColor * ambient + glow, 0.0, 2.0) + totalSpecular;
Create a new THREE.ShaderMaterial using that shader, and pass the glow texture along with its usual uniforms
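Putting those pieces together, here's a minimal sketch of that last step, written against the legacy-era API used elsewhere in this answer; glowTexture and the two patched Phong shader strings are placeholders you supply:
var uniforms = THREE.UniformsUtils.merge( [
    THREE.ShaderLib[ 'phong' ].uniforms,
    {
        "glowMap":       { type: "t", value: null },
        "glowIntensity": { type: "f", value: 1 }
    }
] );
uniforms[ 'glowMap' ].value = glowTexture; // assign after merge, since merge clones the values

var material = new THREE.ShaderMaterial({
    uniforms: uniforms,
    vertexShader: phongGlowVertexShader,     // copy of the Phong vertex shader
    fragmentShader: phongGlowFragmentShader, // Phong fragment shader with the glow factor above patched in
    lights: true // so the Phong lighting uniforms are populated
});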
For further details, take a look at this fiddle: http://jsfiddle.net/Qr7Bb/2/.
You'll notice I made a "MeshPhongGlowMaterial" class that inherits from THREE.ShaderMaterial. That's purely optional; you can also just have a function that creates a new THREE.ShaderMaterial with the above shaders and uniforms.
The standard "emissive" property affects the entire surface of the mesh, it has nothing to do with the glow map (instead use the custom "glowIntensity" property for that).

Related

Applying Alpha Map to model using material causes model to show up as white

I'm attempting to use an alpha map to reveal an underlying model according to the alpha map. Currently the alpha map overrides the model texture and shows as all white.
I've tried loading the alpha map alone and applying it to a material which then gets attached to the object --> Shows the relevant alpha map as white, does not reveal the underlying model as expected
I've tried loading up the alpha map and also the texture into a MeshBasicMaterial --> Shows the full texture with none of it transparent
let material = new THREE.MeshBasicMaterial({
    // map: iceTexture,
    color: '0xffffff',
    transparent: false,
    side: THREE.DoubleSide,
    alphaTest: 0.5,
    alphaMap: this.alphaMaps[0]
});
The current result is that the alpha map shows as all white --> Current Output
This is the underlying texture I expect to be showing (just where the alpha map allows it to) --> Underlying Texture
NOTE: I am currently not using any shaders; the underlying model is a simple glb with the ice texture above applied
NOTE 2: In this answer it says to add a second object behind the first object... that does not work; it just shows the object on top with no transparency applied
I think the line that's messing you up is transparent: false, when it should be transparent: true. I just tried the code below (click here for a live CodePen demo), and the transparency works as expected. I also don't think you need alphaTest: 0.5, since it seems you have an animation sequence that moves the gradient.
In the demo, I use this image as the alphaMap:
... and this image as the regular map:
The essence of the code is below:
// Load the textures
const texLoader = new THREE.TextureLoader();
const alphaTexture = texLoader.load("https://i.imgur.com/aH0jI5N.png");
const mapTexture = texLoader.load("https://i.imgur.com/qdWJkbc.jpg");

// Create geometry
const geometry = new THREE.PlaneBufferGeometry(10, 10, 10, 10);

// Create a basic material with the alpha map and transparency enabled
const material = new THREE.MeshBasicMaterial({
    map: mapTexture,
    alphaMap: alphaTexture,
    transparent: true
});

const plane = new THREE.Mesh( geometry, material );
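To put it on screen, add the mesh to a scene and render as usual (standard boilerplate, assumed rather than copied from the demo):
scene.add( plane );
renderer.render( scene, camera );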

Color a tetrahedron in three.js

In three.js, I'm trying to draw a tetrahedron using THREE.TetrahedronGeometry where each face is a different color. When I use MeshNormalMaterial, each vertex has a different color, but the faces are color gradients between the vertices. This works for a BoxGeometry, but not for TetrahedronGeometry.
I tried using PhongMaterial with shading: THREE.FlatShading but that just gives me black or white faces.
I tried writing my own ShaderMaterial and coloring by the normal vector in the fragment shader, but that also produces the gradient effect.
I'm sure I'm missing something obvious, but can't see it...
For versions of three.js prior to r125*, this is how you do it:
var geo = new THREE.TetrahedronGeometry( sphereRadius, 0 );
for ( var i = 0; i < geo.faces.length; i ++ ) {
    geo.faces[ i ].color.setHex( Math.random() * 0xffffff );
}
var material = new THREE.MeshBasicMaterial({
    side: THREE.DoubleSide,
    shading: THREE.FlatShading,
    vertexColors: THREE.VertexColors
});
var mesh = new THREE.Mesh( geo, material );
So you need THREE.FlatShading, THREE.VertexColors, and then you need to assign the face colors.
For later versions, see how to render a tetrahedron with different texture on each face (using three.js)?.
* THREE.Geometry will be removed from core with r125
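For reference, here's a hedged sketch of the same idea on r125 and later, where per-face colors become per-vertex colors on a BufferGeometry (TetrahedronGeometry is non-indexed in recent releases, so each face already owns three distinct vertices):
const geo = new THREE.TetrahedronGeometry( sphereRadius, 0 );
const positions = geo.getAttribute( 'position' );
const colors = [];
const color = new THREE.Color();
for ( let i = 0; i < positions.count; i += 3 ) {
    color.setHex( Math.random() * 0xffffff );
    // write the same color to all three vertices of this face
    for ( let j = 0; j < 3; j ++ ) colors.push( color.r, color.g, color.b );
}
geo.setAttribute( 'color', new THREE.Float32BufferAttribute( colors, 3 ) );
const material = new THREE.MeshBasicMaterial( { side: THREE.DoubleSide, vertexColors: true } );
const mesh = new THREE.Mesh( geo, material );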

Rotating a texture using PointCloudMaterial

When I used CanvasRenderer and SpriteMaterial, I was able to set a texture's rotation using the rotation attribute in the material. So, say for instance the texture is a cone, and I want to rotate it by 180 degrees:
material = new THREE.SpriteMaterial({
    map: texture,
    transparent: true,
    rotation: Math.PI
});
But that doesn't seem to work with PointCloudMaterial in the WebGLRenderer. For example:
material = new THREE.PointCloudMaterial({
    depthWrite: true,
    alphaTest: 0.1,
    map: texture,
    transparent: true,
    vertexColors: THREE.VertexColors,
    rotation: Math.PI
});
So how can I go about rotating a texture with PointCloudMaterial and a PointCloud mesh? Note that in both instances, the texture is loaded as a base64 string, as follows:
var image = document.createElement('img');
var texture = new THREE.Texture(image);
image.src = /* The base64 string */
Thanks so much!
I ended up using the canvas to do this, following the pattern described at Three.js Rotate Texture. This pretty much worked like a charm.
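For reference, that canvas pattern boils down to something like this sketch (base64String stands in for your data URI, and the 180-degree rotation matches the SpriteMaterial example above):
var image = document.createElement( 'img' );
image.onload = function () {
    // redraw the image rotated 180 degrees on a 2D canvas
    var canvas = document.createElement( 'canvas' );
    canvas.width = image.width;
    canvas.height = image.height;
    var ctx = canvas.getContext( '2d' );
    ctx.translate( canvas.width / 2, canvas.height / 2 );
    ctx.rotate( Math.PI );
    ctx.drawImage( image, -image.width / 2, -image.height / 2 );
    // use the canvas itself as the texture source
    texture = new THREE.Texture( canvas );
    texture.needsUpdate = true;
};
image.src = base64String;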

Three.js use framebuffer as texture

I'm using an image in a canvas element as a texture in Three.js, performing image manipulations on the canvas using JavaScript, and then setting needsUpdate on the texture. This works, but it's quite slow.
I'd like to perform the image calculations in a fragment shader instead. I've found many examples which almost do this:
Shader materials: http://mrdoob.github.io/three.js/examples/webgl_shader2.html This example shows image manipulations performed in a fragment shader, but that shader is functioning as the fragment shader of an entire material. I only want to use the shader on a texture, and then use the texture as a component of a second material.
Render to texture: https://threejsdoc.appspot.com/doc/three.js/examples/webgl_rtt.html This shows rendering the entire scene to a WebGLRenderTarget and using that as the texture in a material. I only want to pre-process an image, not render an entire scene.
Effects composer: http://www.airtightinteractive.com/demos/js/shaders/preview/ This shows applying shaders as a post-process to the entire scene.
Edit: Here's another one:
Render to another scene: http://relicweb.com/webgl/rt.html This example, referenced in Three.js Retrieve data from WebGLRenderTarget (water sim), uses a second scene with its own orthographic camera to render a dynamic texture to a WebGLRenderTarget, which is then used as a texture in the primary scene. I guess this is a special case of the first "render to texture" example listed above, and would probably work for me, but seems over-complicated.
As I understand it, ideally I'd be able to make a new framebuffer object with its own fragment shader, render it on its own, and use its output as a texture uniform for another material's fragment shader. Is this possible?
Edit 2: It looks like I might be asking something similar to this: Shader Materials and GL Framebuffers in THREE.js ...though the question doesn't appear to have been resolved.
"Render to texture" and "Render to another scene" as listed above are the same thing, and they are the technique you want. To explain:
In vanilla WebGL the way you do this kind of thing is by creating a framebuffer object (FBO) from scratch, binding a texture to it, and rendering it with the shader of your choice. Concepts like "scene" and "camera" aren't involved, and it's kind of a complicated process. Here's an example:
http://learningwebgl.com/blog/?p=1786
But this also happens to be essentially what Three.js does when you use it to render a scene with a camera: the renderer outputs to a framebuffer, which in its basic usage goes straight to the screen. So if you instruct it to render to a new WebGLRenderTarget instead, you can use whatever the camera sees as the input texture of a second material. All the complicated stuff is still happening, but behind the scenes, which is the beauty of Three.js. :)
So: To replicate a WebGL setup of an FBO containing a single rendered texture, as mentioned in the comments, just make a new scene containing an orthographic camera and a single plane with a material using the desired texture, then render to a new WebGLRenderTarget using your custom shader:
// new render-to-texture scene
myScene = new THREE.Scene();

// you may need to modify these parameters
var renderTargetParams = {
    minFilter: THREE.LinearFilter,
    stencilBuffer: false,
    depthBuffer: false
};

myImage = THREE.ImageUtils.loadTexture( 'path/to/texture.png',
    new THREE.UVMapping(), function() { myCallbackFunction(); } );

imageWidth = myImage.image.width;
imageHeight = myImage.image.height;

// create buffer
myTexture = new THREE.WebGLRenderTarget( imageWidth, imageHeight, renderTargetParams );

// custom RTT material
myUniforms = {
    colorMap: { type: "t", value: myImage }
};
myTextureMat = new THREE.ShaderMaterial({
    uniforms: myUniforms,
    vertexShader: document.getElementById( 'my_custom_vs' ).textContent,
    fragmentShader: document.getElementById( 'my_custom_fs' ).textContent
});

// Setup render-to-texture scene
myCamera = new THREE.OrthographicCamera( imageWidth / - 2,
    imageWidth / 2,
    imageHeight / 2,
    imageHeight / - 2, -10000, 10000 );

var myTextureGeo = new THREE.PlaneGeometry( imageWidth, imageHeight );
myTextureMesh = new THREE.Mesh( myTextureGeo, myTextureMat );
myTextureMesh.position.z = -100;
myScene.add( myTextureMesh );

renderer.render( myScene, myCamera, myTexture, true );
Once you've rendered the new scene, myTexture will be available for use as a texture in another material in your main scene. Note that you may want to trigger the first render with the callback function in the loadTexture() call, so that it won't try to render until the source image has loaded.
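For completeness, a hedged sketch of consuming the result (scene, camera, and renderer are assumed from your main setup; with the legacy API above the render target itself can be assigned as a map, while current three.js expects myTexture.texture and renderer.setRenderTarget() instead):
// use the render-to-texture result as a map in the main scene
var displayMat = new THREE.MeshBasicMaterial( { map: myTexture } );
var displayMesh = new THREE.Mesh( new THREE.PlaneGeometry( imageWidth, imageHeight ), displayMat );
scene.add( displayMesh );

// in the animation loop: update the RTT scene first, then draw the main scene
renderer.render( myScene, myCamera, myTexture, true );
renderer.render( scene, camera );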

textures without color distortion through shading/lighting needed

I want to use a map/texture which isn't affected by lighting/shading. Which Three.js shader should I choose? In other 3D programs, you would probably use a map on the ambient color.
You need to use MeshBasicMaterial. It is not affected by lights.
material = new THREE.MeshBasicMaterial( { color: 0xffffff, map: texture } );
