Music visualization with WebGL and three.js

Can you help me? I'm trying to make a music visualization, http://webmaster9.ru/freelance/mysicusa/ , based on http://iacopoapps.appspot.com/hopalongwebgl/ .
I have a signal (the music level) but I can't change the visual effects according to the music. It uses:
var materials = new THREE.ParticleBasicMaterial( { size: 3, map: sprite1, blending: THREE.AdditiveBlending, depthTest: false, transparent: true } );
materials.color.setHSV(hueValues[s], DEF_SATURATION, DEF_BRIGHTNESS);
var particles = new THREE.ParticleSystem( geometry, materials );
particles.myMaterial = materials;
1. Can I change opacity or brightness according to the music? Probably using controls? Can you show me examples?
2. Can I change the texture according to the music?
3. Can I add a light and change it according to the music?
Thank you

You can certainly read the generated audio data from the DOM and use it in the WebGL/JavaScript part to drive values, or even let THREE.js handle those values.
Just take a look at how the Web Audio API works.
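As a minimal sketch (not from the original thread): the Web Audio API's AnalyserNode can feed a per-frame level into the material from the question. The <audio> element id and the helper names here are illustrative assumptions.

var audioCtx = new AudioContext();
var audioElement = document.getElementById( 'track' ); // hypothetical <audio> element
var source = audioCtx.createMediaElementSource( audioElement );
var analyser = audioCtx.createAnalyser();
analyser.fftSize = 256;
source.connect( analyser );
analyser.connect( audioCtx.destination );
var bins = new Uint8Array( analyser.frequencyBinCount );

function updateVisuals() {
    analyser.getByteFrequencyData( bins ); // fills bins with values 0..255
    var sum = 0;
    for ( var i = 0; i < bins.length; i++ ) sum += bins[ i ];
    var level = sum / bins.length / 255; // average level, 0..1
    // 1. drive opacity/brightness from the level
    materials.opacity = 0.2 + 0.8 * level;
    // 2. swapping material.map by level is also possible (set materials.needsUpdate = true)
    // 3. or drive a light's intensity, assuming a light was added to the scene:
    // pointLight.intensity = 2.0 * level;
}

Call updateVisuals() from your render loop, before renderer.render( scene, camera ).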

Related

Why doesn't transparency work on my mesh?

I have two meshes in my scene: one cylinder and one classic plane in the middle.
I applied a PNG texture to my cylinder so we can see through it. It seems to work for the cylinder.
On this screenshot you'll easily see my issue: I don't understand why my image is not visible behind my cylinder.
The code I used for my cylinder :
myCylinderMesh.material.transparent = true;
myCylinderMesh.material.side = THREE.DoubleSide;
How can I manage to see the part of the image hidden behind the cylinder?
EDIT 1:
I added the code that @ScieCode sent me:
myCylinderMesh.material.alphaTest = 0.5;
Here's the result:
It works better: now I can see the missing part of my image. But one thing is still missing: the opacity of my cylinder. I'm supposed to see my image behind the letters too.
Currently I have this opacity:
myCylinderMesh.material.opacity = 0.7;
Do you know what I'm missing? Thanks
EDIT 2:
Here's the code for my two meshes:
Cylinder:
geoCylinder = new THREE.CylinderBufferGeometry( 0.4, 0.4, 2*Math.PI*0.4/(2048/128), 64, 1, true );
matCylinder = new THREE.MeshBasicMaterial( { map:texture, transparent:true, color:0x000000, alphaTest: 0.5, opacity: 0.6, side: THREE.DoubleSide } );
meshCylinder = new THREE.Mesh( geoCylinder, matCylinder );
Plane:
geoPlane = new THREE.PlaneBufferGeometry( 0.8, 0.8 );
matPlane = new THREE.MeshBasicMaterial( { map: texturePlane, transparent:true} );
meshPlane = new THREE.Mesh( geoPlane, matPlane );
This behavior happens because of how transparency rendering works internally. Transparent objects need to be sorted and rendered separately from opaque objects, which ensures that objects render as expected in the final image (not always, though).
The problem here is that your plane geometry is inside the cylinder geometry, so when sorting happens the plane is rendered either entirely before or entirely after the cylinder, which in turn causes the artifacts you are experiencing. Transparency rendering is a lot more complex than I'm making it out to be here.
Since your plane object doesn't need to be translucent, you can simply set the alphaTest property of its material, which renders only the fragments with alpha greater than this value. This also prevents the object from being considered transparent, so it will always be rendered first, fixing the artifacts in your scene.
JSFiddle
Additional info: when using a transparent material with DoubleSide, you might experience self-transparency problems. This happens for the same reason I just explained, but between faces of the same object. A possible solution is to set depthWrite = false, which prevents the object from writing to the depth buffer, so every face gets rendered regardless of whether another face occludes it.
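As a short sketch of the suggested fix, using the variable names from EDIT 2 (the exact threshold value is illustrative):

// The plane only needs hard cut-out transparency, not blending:
matPlane.alphaTest = 0.5;
matPlane.transparent = false; // now sorted and rendered with the opaque objects
matPlane.needsUpdate = true;  // alphaTest changes require a material rebuild

// If the double-sided cylinder shows self-transparency artifacts:
matCylinder.depthWrite = false;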

How to scale/resize Mesh Outline?

In my current project, I need a way to outline a mesh. This colored outline will represent the object's current state, which is relevant for me.
The problem is that it is a custom mesh, loaded using JSONLoader.
I've tried different approaches, following (mainly) these two examples: https://stemkoski.github.io/Three.js/Outline.html and
THREEx.geometricglow. In both cases, I scale the mesh outline to be a bit bigger than the original. My main problem is that scaling equally on all axes will not cover my object the way I intended.
Here is the code I'm using:
var outlineMaterial2 = new THREE.MeshBasicMaterial( { color: 0x00ff00, side: THREE.BackSide, transparent: true, opacity:0.5 } );
var outlineMesh2 = new THREE.Mesh( object.geometry, outlineMaterial2 );
outlineMesh2.position.copy(object.position);
outlineMesh2.scale.copy(object.scale);
outlineMesh2.scale.multiplyScalar(1.1);
scene.add( outlineMesh2 );
With a simple cube mesh, the outline looks good.
But with my custom mesh, the scale does not fit the shape correctly.
Here is a image demonstrating: http://s13.postimg.org/syujtd75z/print1.png
Also, using Stemkoski's approach, the outlining mesh also shows in front of the object, not just around it (as seen in the picture above).
My question is: how should I resize the mesh? From what I know, it might have something to do with face normals.
Thanks for your time.
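One common alternative to scaling (a hedged sketch, not from the thread): instead of scaling the shell, push each vertex out along its normal in a vertex shader, which keeps the outline thickness constant on arbitrary shapes. The uniform name and offset value below are illustrative.

var outlineMaterial = new THREE.ShaderMaterial( {
    uniforms: {
        offset: { type: "f", value: 0.05 } // outline thickness, in model units
    },
    vertexShader: [
        "uniform float offset;",
        "void main() {",
        "  // displace every vertex outward along its normal",
        "  vec4 pos = modelViewMatrix * vec4( position + normal * offset, 1.0 );",
        "  gl_Position = projectionMatrix * pos;",
        "}"
    ].join( "\n" ),
    fragmentShader: [
        "void main() {",
        "  gl_FragColor = vec4( 0.0, 1.0, 0.0, 0.5 );",
        "}"
    ].join( "\n" ),
    side: THREE.BackSide,
    transparent: true
} );
var outlineMesh = new THREE.Mesh( object.geometry, outlineMaterial );
outlineMesh.position.copy( object.position );
outlineMesh.scale.copy( object.scale );
scene.add( outlineMesh );

Note this only looks right if the geometry's normals are correct, which matches the suspicion about face normals above.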

Three.js crossfade three backgrounds continuously

I'm currently working on my first three.js project and getting quite an education. But I've hit a wall and am seeking a generalized outline of what to do.
I have three images that I want to use as background images. I want them to crossfade at a specified interval: let's say every 5 seconds the background crossfades to the next one. After the last background is displayed, it crossfades into the first one, and so forth in a loop.
I've found a few examples where there's crossfading between two objects, like this fiddle, but that seems to depend on having two cameras. I've taken other examples I've found as far as I could, with nothing worthy of posting.
I don't understand enough about three.js, which is why I'm seeking help. If someone could help me define my approach, that would be fantastic. Should I be altering the opacity of my meshes? Doing something with shaders? Something else?
Here, at least, is how I'm adding one background:
camera = new THREE.PerspectiveCamera( 75, SCREEN_WIDTH / SCREEN_HEIGHT, 1, 10000 );
camera.position.z = 450;
scene = new THREE.Scene();
// Load the background texture
var summerTexture = THREE.ImageUtils.loadTexture( 'tree-animation/images/summer.png' );
summerMesh = new THREE.Mesh(
    new THREE.PlaneGeometry( 2, 2, 0 ),
    new THREE.MeshBasicMaterial( {
        map: summerTexture
    } ) );
summerMesh.material.depthTest = false;
summerMesh.material.depthWrite = false;
backgroundCamera = new THREE.Camera();
summerScene = new THREE.Scene();
summerScene.add(backgroundCamera);
summerScene.add(summerMesh);
Any direction would be most appreciated!
This can be achieved by writing a custom shader: use the mix() or smoothstep() function to blend between the images, and add a clock to your render loop so you can update the shader uniforms and drive the transition over time.
Here is an example of a static blend of textures that can easily be integrated into your own project:
http://stemkoski.github.io/Three.js/Shader-Heightmap-Textures.html
Check the fragment shader.
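A minimal sketch of the time-driven version (not from the linked example; the second texture and the uniform names are illustrative, using the old type-annotated uniform style that matches the era of this code):

var crossfadeMaterial = new THREE.ShaderMaterial( {
    uniforms: {
        texture1: { type: "t", value: summerTexture },
        texture2: { type: "t", value: winterTexture }, // hypothetical second texture
        mixRatio: { type: "f", value: 0.0 }
    },
    vertexShader: [
        "varying vec2 vUv;",
        "void main() {",
        "  vUv = uv;",
        "  gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );",
        "}"
    ].join( "\n" ),
    fragmentShader: [
        "uniform sampler2D texture1;",
        "uniform sampler2D texture2;",
        "uniform float mixRatio;",
        "varying vec2 vUv;",
        "void main() {",
        "  // blend the two backgrounds by the current ratio",
        "  gl_FragColor = mix( texture2D( texture1, vUv ), texture2D( texture2, vUv ), mixRatio );",
        "}"
    ].join( "\n" ),
    depthTest: false,  // matches the background setup in the question
    depthWrite: false
} );
// In the render loop, drive the fade with a THREE.Clock, e.g. over 5 seconds:
// crossfadeMaterial.uniforms.mixRatio.value = Math.min( clock.getElapsedTime() / 5.0, 1.0 );

For three backgrounds, rebind texture1/texture2 to the next pair and reset the clock each time a fade completes.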

In three.js, when do we need PointCloud.sortParticles enabled?

I found that some of the examples enable this property, but there aren't enough comments to make me understand why it matters.
Could you please tell me the best practice for this property?
EDIT: PointCloud is now Points, and the .sortParticles property has been removed. That means the points are rendered in the order they exist in the buffer. The answer below is outdated. three.js r.73
This is a perfect example of why it is important to understand at least the basic concepts of WebGL if you are using three.js. (See this SO post.)
You need to set PointCloud.sortParticles = true if the rendering order matters.
Here is an example where the rendering order matters:
var material = new THREE.PointCloudMaterial( {
    map: texture, // has transparent areas
    transparent: true
} );
var pointCloud = new THREE.PointCloud( geometry, material );
pointCloud.sortParticles = true; // the default is false
In this case, the renderer will sort the points by depth, so points further from the camera are rendered first, and show through the transparent areas of the closer ones.
Here is an example where rendering order does not matter:
var material = new THREE.PointCloudMaterial( {
    map: texture,
    blending: THREE.AdditiveBlending,
    depthTest: false,
    transparent: true
} );
// point cloud
var pointCloud = new THREE.PointCloud( geometry, material );
Since sorting occurs on the CPU side, it is best to choose your material settings wisely, so you can avoid the overhead -- especially for large systems.
I suggest you build a testbed and experiment. There are many, many possible cases to consider.
three.js r.69

Three.js custom objLoader geometry lighting

I have this object I'm loading with THREE.OBJLoader and then create a mesh with it like so:
mesh = new THREE.SceneUtils.createMultiMaterialObject(
    geometry,
    [
        new THREE.MeshBasicMaterial( { color: 0xFEC1EA } ),
        new THREE.MeshBasicMaterial( {
            color: 0x999999,
            wireframe: true,
            transparent: true,
            opacity: 0.85
        } )
    ]
);
In my scene I then add a DirectionalLight. It works and I can see my object; however, it's as if the DirectionalLight were an ambient one: no face gets darker or lighter as it should.
The object is filled with the color, but no lighting is applied to it.
If someone can help me with that it would be much appreciated :)
What could I be missing?
JSFiddle here: http://jsfiddle.net/5hcDs/
Ok folks, thanks to Maël Nison and mr doob I was able to understand the few things I was missing, being the total 3D noob that I am... I believe people starting to get into 3D may find a little recap useful:
Basic 3D concepts
A 3D face is made of points (vertices) and a vector called the normal, indicating the direction of the face (which side is the front and which one is the back).
Not having normals can be really bad, because by default lighting is applied to the front side only. Hence the black model when trying to apply a LambertMaterial or PhongMaterial.
An OBJ file is a way to describe 3D information. Want more info on this? Read this Wikipedia article (en). Also, the French page provides a cube example which can be useful for testing.
Three.js tips and tricks
When normals are not present, lighting can't be applied, hence the black model render. Three.js can actually compute vertex and face normals with geometry.computeVertexNormals() and/or geometry.computeFaceNormals(), depending on what's missing.
When you do so, there's a chance Three.js' normal calculation will be wrong and your normals will be flipped. To fix this, you can simply loop through your geometry's faces array like so:
/* Compute normals */
geometry.computeFaceNormals();
geometry.computeVertexNormals();
/* The next 3 lines seem not to be mandatory */
mesh.geometry.dynamic = true
mesh.geometry.__dirtyVertices = true;
mesh.geometry.__dirtyNormals = true;
mesh.flipSided = true;
mesh.doubleSided = true;
/* Flip normals */
for ( var i = 0; i < mesh.geometry.faces.length; i++ ) {
    mesh.geometry.faces[i].normal.x = -1 * mesh.geometry.faces[i].normal.x;
    mesh.geometry.faces[i].normal.y = -1 * mesh.geometry.faces[i].normal.y;
    mesh.geometry.faces[i].normal.z = -1 * mesh.geometry.faces[i].normal.z;
}
You have to use a MeshPhongMaterial. MeshBasicMaterial does not take light into account when computing the fragment color.
However, when using a MeshPhongMaterial, your mesh becomes black. I've never used the OBJ loader, but are you sure your model normals are right?
Btw: you probably want to use a PointLight instead. And its position should probably be set to the camera position (light.position = camera.position should do the trick, as it will allow the light to move when the camera position is edited by the controls).
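As a sketch of the suggested change, swapping the flat MeshBasicMaterial for a lit MeshPhongMaterial in the question's multi-material object, after computing the missing normals (the wireframe overlay is kept as in the question):

/* Compute the normals the OBJ file was missing, then build a lit mesh */
geometry.computeFaceNormals();
geometry.computeVertexNormals();
mesh = new THREE.SceneUtils.createMultiMaterialObject(
    geometry,
    [
        new THREE.MeshPhongMaterial( { color: 0xFEC1EA } ), // reacts to scene lights
        new THREE.MeshBasicMaterial( {
            color: 0x999999,
            wireframe: true,
            transparent: true,
            opacity: 0.85
        } )
    ]
);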
