three.js ShaderMaterial not working at all

I've been fiddling around with three.js a bit, and I'm stumped when it comes to THREE.ShaderMaterial. It started out with copy/pasting directly from the example shown here: http://threejs.org/examples/#webgl_materials_normalmap
I copied it into a function that just returns the material. It wouldn't work (error; I'll get to the specifics later), so I removed all the uniforms that had been set and went for a completely blank material, just to see if the same error would still show.
So here's my code:
var testmaterial = function (params) {
    var shader = THREE.ShaderLib[ "normalmap" ];
    var uniforms = THREE.UniformsUtils.clone( shader.uniforms );
    var parameters = { fragmentShader: shader.fragmentShader, vertexShader: shader.vertexShader, uniforms: uniforms };
    var material = new THREE.ShaderMaterial( parameters );
    return material;
};
Nothing fancy, you'd say, and I'd agree; however, the browser seems to disagree. Here's the error I just can't seem to get rid of:
error X6077: texld/texldb/texldp/dsx/dsy instructions with r# as source cannot be used inside dynamic conditional 'if' blocks, dynamic conditional subroutine calls, or loop/rep with break*.
Does anyone have the slightest clue about what I'm doing wrong? Any help would be greatly appreciated.

The shader you chose, namely normalmap, requires some input uniforms to be set.
If you look at https://github.com/mrdoob/three.js/blob/r68/src/renderers/shaders/ShaderLib.js#L595 you will see the variables that are null:
"tDisplacement": { type: "t", value: null },
"tDiffuse" : { type: "t", value: null },
"tCube" : { type: "t", value: null },
"tNormal" : { type: "t", value: null },
"tSpecular" : { type: "t", value: null },
"tAO" : { type: "t", value: null },
So either you need to set these or, since you are just fiddling around, try another simple shader that does not require inputs. Most of the others do not require input uniforms to be set.
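A minimal sketch of what "setting these" means before building the material. A plain deep copy stands in for THREE.UniformsUtils.clone here so the idea runs without the three.js library, and the string values are hypothetical placeholders for real THREE.Texture objects loaded with THREE.ImageUtils.loadTexture:

```javascript
// Stand-in for THREE.UniformsUtils.clone: copy the uniform template so the
// shared ShaderLib entry is never mutated.
function cloneUniforms(template) {
    var clone = {};
    for (var name in template) {
        clone[name] = { type: template[name].type, value: template[name].value };
    }
    return clone;
}

// Two of the normalmap shader's null uniforms, as listed above.
var normalmapTemplate = {
    tDiffuse: { type: "t", value: null },
    tNormal:  { type: "t", value: null }
};

var uniforms = cloneUniforms(normalmapTemplate);
// In real code these would be THREE.ImageUtils.loadTexture(...) results.
uniforms.tDiffuse.value = "diffuseTexture";
uniforms.tNormal.value = "normalTexture";
```

The cloned object, with its values filled in, is what would then be passed as the uniforms parameter to new THREE.ShaderMaterial; the template itself stays untouched for the next material.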
EDIT:
You also need to compute the model's tangents.
But for that you need to use a different pattern:
var geometry = new THREE.SphereGeometry(100, 50, 50);
geometry.computeTangents();

var material = myShaderMaterial({
    //enableAO: 0,
    enableDiffuse: 1,
    //enableSpecular: 0,
    //enableReflection: 0,
    enableDisplacement: 1,
    tDisplacement: THREE.ImageUtils.loadTexture('textures/planets/earthbump1k.jpg'),
    tDiffuse: THREE.ImageUtils.loadTexture('textures/planets/earthbump1k.jpg'),
    //tCube: planet.maps.planet.
    tNormal: THREE.ImageUtils.loadTexture('textures/planets/earthbump1k.jpg'),
    //tSpecular: planet.maps.planet.
    //tAO: planet.maps.planet.
});
var mesh = new THREE.Mesh(geometry, material);

Related

A-Frame: Geometry Caching / Registering new Geometry from GLTF

I'm hoping to get some help using geometry instancing with A-Frame. I was trying to figure out the bottleneck for my web app and after implementing pooling for physics objects being created in the scene, saw that the number of draw calls was increasing with each new object -- I had thought that by utilizing the asset management system in A-Frame my models were automatically cached, but I think I was mistaken.
I was wondering: if I register the geometry of the model using AFRAME.registerGeometry, would I be able to utilize geometry instancing? I saw that creating from a pool of objects using the A-Frame geometry primitives did not increase the geometry count of the scene on a per-entity basis. I took a shot at loading my GLTF and registering the geometry from the mesh, but I'm getting an error from a-node that I don't understand:
<script>
AFRAME.registerGeometry('ramshorn', {
    schema: {
        depth: {default: 1, min: 0},
        height: {default: 1, min: 0},
        width: {default: 1, min: 0},
    },
    init: function (data) {
        var model = null;
        var geometry = null;
        var manager = new THREE.LoadingManager();
        manager.onLoad = function () {
            console.log(geometry);
            this.geometry = geometry;
            console.log(this.geometry);
        };
        var gltfLoader = new THREE.GLTFLoader(manager);
        gltfLoader.setCrossOrigin('anonymous');
        const src = "./assets/ramsHorn/Ram's Horn 2.gltf";
        gltfLoader.load(src, function (gltf) {
            console.log("Gltf: " + gltf);
            model = gltf.scene;
            console.log("Model: " + model);
            model.children.forEach((child) => {
                console.log(child);
            });
            gltf.scene.traverse(function (o) {
                if (o.isMesh) {
                    geometry = o.geometry;
                    //console.log(geometry);
                    //tried assigning "this.geometry" here
                }
            });
        }, undefined, function (error) {
            console.error(error);
        });
        //tried assigning "this.geometry" here
    }
});
</script>
Error:
core:a-node:error Failure loading node: TypeError: "t is undefined"
aframe-master.min.js:19:658
Any help with this would be appreciated! Thanks
The code at 19:658 in aframe-master.min.js is trying to use a variable t that has not been defined.
By switching to the unminified aframe-master.js you would get a more meaningful error message.
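One thing worth checking, independent of the minified stack trace: inside manager.onLoad, this is not the geometry component, so this.geometry = geometry never reaches the component. A minimal sketch of the usual capture-this fix, with a plain synchronous stand-in for the loader callback since neither A-Frame nor three.js is loaded here:

```javascript
// Stand-in component showing why `this.geometry` never sticks inside a plain
// callback: `this` in the callback is not the init() context.
var component = {
    geometry: null,
    init: function () {
        var self = this; // capture the component before entering the callback
        var onLoad = function (loadedGeometry) {
            // `this` here would NOT be the component; use the captured reference
            self.geometry = loadedGeometry;
        };
        onLoad("ramshornGeometry"); // a real loader would call this asynchronously
    }
};
component.init();
```

Note that even with the binding fixed, GLTFLoader.load is asynchronous, so init would likely return before this.geometry is ever assigned, which may well be what the a-node error is reporting.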

Achieving texture feedback with THREE.EffectComposer

I'm moving some old effects into alteredq's EffectComposer, and I'm currently unsure how to achieve texture feedback effects.
In other applications this is a really straightforward process: I send a signal into a feedback shader (which scales/multiplies/rotates), send the signal out to another effect chain, and then back into the feedback shader for another pass. It hasn't been as straightforward with Three, but I've managed to make it work by ping-ponging the FBOs.
I'm interested in streamlining this by using EffectComposer, but I'm finding it difficult to recreate the effect.
Currently I can conceptualize the use of four composers:
First, an input composer that applies effects to the current scene, and renders to a target.
this.inputComposer = new THREE.EffectComposer(this.manager.renderer, this.rawTexture);
this.inputComposer.setSize(this.manager.width, this.manager.height);

const inputPass = new THREE.RenderPass(this.manager.scene, this.manager.camera.getCamera());
this.inputComposer.addPass(inputPass);
Second, a feedback shader that accepts the output from the inputComposer, and also an input from a subsequent effect composer.
this.feedbackComposer = new THREE.EffectComposer(this.manager.renderer, this.textureB);
this.feedbackComposer.setSize(this.manager.width, this.manager.height);

this.shader_feedback = new THREE.ShaderMaterial({
    uniforms: {
        fb: { value: this.textureA.texture },
        feedback: { value: 0.6 },
        scale: { value: 0.992 },
        vPoint: { value: [0.5, 0.5] }
    },
    vertexShader: feedback.vert,
    fragmentShader: feedback.frag
});

const inputPass = new THREE.TexturePass(this.rawTexture.texture, 1.0);
const feedbackPass = new THREE.ShaderPass(this.shader_feedback, "feedback");
this.feedbackComposer.addPass(inputPass);
this.feedbackComposer.addPass(feedbackPass);
Third, an intermediate composer, responsible for adding mid-feedback shaders. This should be fed back into the feedback shader.
this.intermediateComposer = new THREE.EffectComposer(this.manager.renderer, this.intermediateTarget);
this.intermediateComposer.setSize(this.manager.width, this.manager.height);

this.shader_sharpen = new THREE.ShaderMaterial({
    uniforms: {
        width: { value: 0.8 }
    },
    vertexShader: sharpen.vert,
    fragmentShader: sharpen.frag
});

const inputPass = new THREE.TexturePass(this.textureB.texture, 1.0);
const sharpenPass = new THREE.ShaderPass(this.shader_sharpen, "sharpen");
this.intermediateComposer.addPass(inputPass);
this.intermediateComposer.addPass(sharpenPass);
Fourth, a post-feedback output composer, for whatever post effects should be on top of all else.
this.finalComposer = new THREE.EffectComposer(this.manager.renderer);
this.finalComposer.setSize(this.manager.width, this.manager.height);

this.shader_chroma = new THREE.ShaderMaterial({
    uniforms: {
        barrelPower: { value: 0.4 },
        zoom: { value: 1.0 }
    },
    vertexShader: barrelBlurChroma.vert,
    fragmentShader: barrelBlurChroma.frag
});

const inputPass = new THREE.TexturePass(this.textureC.texture, 1.0);
const chromaPass = new THREE.ShaderPass(this.shader_chroma, "chroma");
chromaPass.renderToScreen = true;
this.finalComposer.addPass(inputPass);
this.finalComposer.addPass(chromaPass);
and then finally, in render:
this.inputComposer.render(delta);
this.feedbackComposer.render(delta);
this.intermediateComposer.render(delta);
this.finalComposer.render(delta);
(1) I am not entirely sure whether or not I'm overlooking a more elegant approach.
(2) I can't figure out for the life of me which targets need to be swapped. I know that EffectComposer has swapBuffers(), which does the same, but I'm not entirely sure where to go from here.
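For (2), the essence of any feedback setup is that the feedback pass must sample the target it wrote into on the previous frame, so the read and write roles swap every frame. A minimal sketch of that bookkeeping, with plain objects standing in for THREE.WebGLRenderTarget (the names textureA/textureB mirror the uniforms in the question, but the stand-ins are hypothetical):

```javascript
// Plain-object stand-ins for the two render targets being ping-ponged.
var targetA = { name: "A", texture: "textureA" };
var targetB = { name: "B", texture: "textureB" };

var readTarget = targetA;  // sampled by the feedback shader (the `fb` uniform)
var writeTarget = targetB; // rendered into this frame

function renderFrame() {
    // ...render the feedback pass here: sample readTarget.texture and
    // draw into writeTarget...
    // Then swap the roles so the next frame reads what was just written.
    var tmp = readTarget;
    readTarget = writeTarget;
    writeTarget = tmp;
}

renderFrame(); // after one frame the roles have swapped
```

EffectComposer's swapBuffers() performs exactly this exchange on its internal readBuffer/writeBuffer pair, which is why a composer whose feedback uniform points at a fixed external texture never "loops": the uniform has to be repointed at the freshly written target each frame, or the swap has to be managed manually as above.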

How to put a Three.js building on Mapbox at its real place

Currently I've loaded an Eiffel Tower .obj file and rendered it using Three.js, but how can I put the building on the map at its place in the real world? I use mapbox-gl-js to handle the map side, for its convenience with 3D maps.
style: {
    "version": 8,
    "sources": {
        "satellite": {
            "type": "raster",
            "url": "mapbox://mapbox.satellite",
            "tileSize": 256
        },
        "canvas": {
            type: 'canvas',
            canvas: 'idOfMyHTMLCanvas',
            // animate: true,
            coordinates: [
                [-74.02204952394804, 40.706782422418456],
                [-73.99115047610259, 40.706782422418456],
                [-73.99115047610259, 40.72021689994298],
                [-74.02204952394804, 40.72021689994298]
            ],
            contextType: 'webgl'
        }
    },
    "layers": [{
        "id": "satellite",
        "type": "raster",
        "source": "satellite"
    }, {
        "id": "video",
        "type": "raster",
        "source": "canvas"
    }]
}
Thank you for any help.
You may want to check out Threebox, which is designed to sync a Three.js scene graph with a Mapbox GL JS map.
This question is quite old, but indeed, as suggested by @lauren-budorick, it took me 5 minutes to build this sample using the latest version of threebox, and the result looks like this:
<script>
mapboxgl.accessToken = 'PASTE HERE YOUR TOKEN';
var origin = [2.294514, 48.857475];
var map = new mapboxgl.Map({
    container: 'map',
    style: 'mapbox://styles/mapbox/satellite-v9',
    center: origin,
    zoom: 18,
    pitch: 60,
    bearing: 0
});
map.on('style.load', function () {
    map.addLayer({
        id: 'custom_layer',
        type: 'custom',
        renderingMode: '3d',
        onAdd: function (map, mbxContext) {
            window.tb = new Threebox(
                map,
                mbxContext,
                {
                    defaultLights: true,
                }
            );
            // import tower from an external glb file, downscaling it to real size
            // IMPORTANT: .glb is not a standard MIME TYPE, you'll have to add it to your web server config,
            // otherwise you'll receive a 404 error
            var options = {
                obj: '/3D/eiffel/eiffel.glb',
                type: 'gltf',
                scale: 0.01029,
                units: 'meters',
                rotation: { x: 0, y: 0, z: 0 }, // default rotation
                adjustment: { x: -0.5, y: -0.5, z: 0 } // place the center in one corner for perfect positioning and rotation
            };
            tb.loadObj(options, function (model) {
                model.setCoords(origin); // position
                model.setRotation({ x: 0, y: 0, z: 45.7 }); // rotate it
                tb.add(model);
            });
        },
        render: function (gl, matrix) {
            tb.update();
        }
    });
});
</script>
I just stumbled across this question and wanted to provide an updated answer for anyone else who ends up here. At the time the question was asked, this was not possible in Mapbox GL JS without a plugin but it can be achieved now with the CustomLayerInterface.
Here's an example of adding a Three.js model to a Mapbox GL JS map.
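The CustomLayerInterface contract itself is small: an object with an id, type: 'custom', and a render method (plus an optional onAdd) that Mapbox GL JS calls with the WebGL context and the map's projection matrix. A skeleton of that shape, with the three.js rendering stubbed out as comments since the libraries are not loaded here:

```javascript
// Skeleton of a Mapbox GL JS custom layer. The bodies that would create a
// THREE.Scene / THREE.Camera / THREE.WebGLRenderer are stubbed as comments.
var customLayer = {
    id: "3d-model",
    type: "custom",
    renderingMode: "3d",
    onAdd: function (map, gl) {
        // Here: create the camera and scene, load the model, and build a
        // renderer sharing the map's canvas and GL context, e.g.
        // new THREE.WebGLRenderer({ canvas: map.getCanvas(), context: gl })
        this.ready = true;
    },
    render: function (gl, matrix) {
        // Here: combine `matrix` with a translate/scale transform for the
        // model's geographic anchor, set it as the camera projection matrix,
        // then render the scene and call map.triggerRepaint().
    }
};
customLayer.onAdd(null, null); // Mapbox calls this when the layer is added
```

Threebox is essentially a convenience wrapper around this interface: it owns the camera/matrix bookkeeping so models can be positioned with setCoords instead of hand-built transforms.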

How to pass an array of objects as uniforms to custom shader in Three.js

I would like to pass an array of JavaScript objects to my custom shader written with THREE.RawShaderMaterial, but I still haven't been able to figure out how to achieve this.
Specifically, I would like to manually perform the same thing as THREE.PointLight. I am currently achieving this by using Three.js's built-in functionality:
var matrixUniforms = {
    projectionMat: { value: new THREE.Matrix4() },
    modelViewMat: { value: new THREE.Matrix4() },
    normalMat: { value: new THREE.Matrix3() },
};

var material = new THREE.RawShaderMaterial( {
    uniforms: THREE.UniformsUtils.merge( [
        matrixUniforms,
        THREE.UniformsLib[ "lights" ],
    ] ),
    vertexShader: $( "#" + vShader ).text(),
    fragmentShader: $( "#" + fShader ).text(),
    lights: true,
} );
Let's say I have
var pointLights = [
    { position: new THREE.Vector3(0, 0, 0),
      color: new THREE.Vector3(1, 1, 1) },
    { position: new THREE.Vector3(10, 10, 10),
      color: new THREE.Vector3(2, 2, 2) }
];
How can I add it to the uniforms of THREE.RawShaderMaterial? Is there an easy way to pass an array of objects that all have the same properties? I may have found a solution for an older version of Three.js, but I couldn't find a way to do it with the latest version (r84).
EDIT:
I found this thread and this thread. However, it seems that it is not documented anywhere...
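For what it's worth, recent three.js versions map a uniform whose value is an array of plain objects onto a GLSL struct array, matching object property names to struct field names (the GLSL counterpart would be along the lines of struct PointLight { vec3 position; vec3 color; }; uniform PointLight pointLights[2];). A sketch of the JavaScript side, with plain arrays standing in for THREE.Vector3 so it runs without the library:

```javascript
// Uniform layout for a GLSL struct array: an array of objects whose property
// names match the struct field names. Plain arrays stand in for THREE.Vector3.
var pointLights = [
    { position: [0, 0, 0],    color: [1, 1, 1] },
    { position: [10, 10, 10], color: [2, 2, 2] }
];

// This `uniforms` object is what would be passed to
// new THREE.RawShaderMaterial({ uniforms: uniforms, ... }).
var uniforms = {
    pointLights: { value: pointLights }
};
```

In the shader, each entry is then addressed as pointLights[0].position, pointLights[1].color, and so on; the array length must match the declared GLSL array size.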

How to add reflectivity to the golden ring in Three.js?

I have an STL file of a gold ring mesh exported from Blender. When I import it into Three.js and add an image texture to get a gold metal effect, the texture is not applied properly.
Any suggestions?
Here is the code,
loader.load( objaddress, stladdress, function ( object ) {
    object.traverse( function ( child ) {
        if ( child instanceof THREE.Mesh ) {
            material = new THREE.MeshPhongMaterial({ map: THREE.ImageUtils.loadTexture("Gold.jpg") });
            child.material = material;
        }
    } );
} );
This is the jpg file url I've used:
http://cdn.designbeep.com/wp-content/uploads/2012/10/7.gold-textures.jpg
Once the geometry is loaded from the external file, use Three.js's ShaderMaterial to create the material for the loaded geometry. Inside the ShaderMaterial, specify the image texture along with the smoothness, noise, and wrapping to get a better look and feel.
For reference: http://threejs.org/docs/#Reference/Materials/ShaderMaterial
Here is the code to create the shader material for the geometry:
var material = new THREE.ShaderMaterial({
    uniforms: {
        tMatCap: {
            type: 't',
            value: THREE.ImageUtils.loadTexture('Gold.jpg')
        },
        time: {
            type: 'f',
            value: 0
        },
        bump: {
            type: 'f',
            value: 0
        },
        noise: {
            type: 'f',
            value: 0.04
        },
        useNormal: {
            type: 'f',
            value: 0
        },
        normalScale: {
            type: 'f',
            value: 0.5
        },
        normalRepeat: {
            type: 'f',
            value: 1
        }
    }
    // the matching vertexShader and fragmentShader strings for the matcap
    // effect must also be supplied here
});
