Working around gl_PointSize limitations in three.js / WebGL

I'm using three.js to create an interactive data visualisation. This visualisation involves rendering 68000 nodes, where each node has its own size and color.
Initially I tried to do this by rendering meshes, but that proved to be very expensive. My current attempt is to use a three.js particle system, with each point being a node in the visualisation.
I can control the color and size of each point, but only up to a limit. On my card, the maximum size for a GL point seems to be 63 pixels. As I zoom in to the visualisation, points get larger - up to that limit - and then remain at 63 pixels.
I'm using a vertex & fragment shader currently:
Vertex shader:
attribute float size;
attribute vec3 ca;
varying vec3 vColor;
void main() {
    vColor = ca;
    vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );
    gl_PointSize = size * ( 300.0 / length( mvPosition.xyz ) );
    gl_Position = projectionMatrix * mvPosition;
}
Fragment shader:
uniform vec3 color;
uniform sampler2D texture;
varying vec3 vColor;
void main() {
    gl_FragColor = vec4( color * vColor, 1.0 );
    gl_FragColor = gl_FragColor * texture2D( texture, gl_PointCoord );
}
These are copied almost verbatim from one of the three.js examples.
I'm totally new to GLSL, but I'm looking for a way to draw points larger than 63 pixels. Can I do something like draw a mesh for any points larger than a certain size, but use a gl_point otherwise? Are there any other workarounds I can use to draw points larger than 63 pixels?

You can make your own point system by making arrays of unit quads plus the center point, then expanding by size in GLSL.
So you'd have 2 buffers. One buffer is just a 2D unit quad repeated for however many points you want to draw.
var unitQuads = new Float32Array([
    -0.5, 0.5, 0.5, 0.5, -0.5, -0.5, 0.5, -0.5,
    -0.5, 0.5, 0.5, 0.5, -0.5, -0.5, 0.5, -0.5,
    -0.5, 0.5, 0.5, 0.5, -0.5, -0.5, 0.5, -0.5,
    -0.5, 0.5, 0.5, 0.5, -0.5, -0.5, 0.5, -0.5,
    -0.5, 0.5, 0.5, 0.5, -0.5, -0.5, 0.5, -0.5,
]);
The second one is your points, except each position needs to be repeated 4 times:
var points = new Float32Array([
    p1.x, p1.y, p1.z, p1.x, p1.y, p1.z, p1.x, p1.y, p1.z, p1.x, p1.y, p1.z,
    p2.x, p2.y, p2.z, p2.x, p2.y, p2.z, p2.x, p2.y, p2.z, p2.x, p2.y, p2.z,
    p3.x, p3.y, p3.z, p3.x, p3.y, p3.z, p3.x, p3.y, p3.z, p3.x, p3.y, p3.z,
    p4.x, p4.y, p4.z, p4.x, p4.y, p4.z, p4.x, p4.y, p4.z, p4.x, p4.y, p4.z,
    p5.x, p5.y, p5.z, p5.x, p5.y, p5.z, p5.x, p5.y, p5.z, p5.x, p5.y, p5.z,
]);
Set up your buffers and attributes:
// unit quad buffer
var buf = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, buf);
gl.bufferData(gl.ARRAY_BUFFER, unitQuads, gl.STATIC_DRAW);
gl.enableVertexAttribArray(unitQuadLoc);
gl.vertexAttribPointer(unitQuadLoc, 2, gl.FLOAT, false, 0, 0);
// point position buffer
buf = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, buf);
gl.bufferData(gl.ARRAY_BUFFER, points, gl.STATIC_DRAW);
gl.enableVertexAttribArray(pointLoc);
gl.vertexAttribPointer(pointLoc, 3, gl.FLOAT, false, 0, 0);
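The snippets above never issue a draw call. A minimal sketch of that missing piece, assuming the 5 points from the arrays above: since each quad is two triangles, you also need an index buffer with 6 indices per quad.
var indices = [];
for (var i = 0; i < 5; ++i) {
    var o = i * 4; // first vertex of quad i
    indices.push(o, o + 1, o + 2, o + 2, o + 1, o + 3);
}
var indexBuf = gl.createBuffer();
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, indexBuf);
gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, new Uint16Array(indices), gl.STATIC_DRAW);
// draw all 5 quads (30 indices) in one call
gl.drawElements(gl.TRIANGLES, indices.length, gl.UNSIGNED_SHORT, 0);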
In your GLSL shader, compute the gl_PointSize you want, then multiply the unit quad by that size in view space or screen space. Screen space would match what gl_PointSize does, but often people want their points to scale in 3D like normal geometry, in which case view space is what you want.
attribute vec2 a_unitQuad;
attribute vec4 a_position;
uniform mat4 u_view;
uniform mat4 u_viewProjection;
void main() {
    float fake_gl_pointsize = 150.0;
    // Get the xAxis and yAxis in view space.
    // These are unit vectors, so they represent moving perpendicular to the view.
    vec3 x_axis = u_view[0].xyz;
    vec3 y_axis = u_view[1].xyz;
    // multiply them by the desired size
    x_axis *= fake_gl_pointsize;
    y_axis *= fake_gl_pointsize;
    // multiply them by the unitQuad to make a quad around the origin
    vec3 local_point = x_axis * a_unitQuad.x + y_axis * a_unitQuad.y;
    // add in the position where you actually want the quad
    local_point += a_position.xyz;
    // now do the normal math you'd do in a shader
    gl_Position = u_viewProjection * vec4(local_point, 1.0);
}
I'm not sure that made any sense, but there's a more complicated yet working sample here.

Can I do something like draw a mesh for any points larger than a certain size, but use a gl_point otherwise?
Not in WebGL.
You can draw your particle system as a series of quads (i.e. two triangles). But that's about it.

Related

Use 2 meshes + shader materials with each a different fragment shader in 1 scene (three.js)

I have 2 meshes, each with a ShaderMaterial and each with a different fragment shader. When I add both meshes to my scene, only one will show up. Below you can find my 2 fragment shaders (see both images to see what they look like). They're basically the same.
What I want to achieve: use mesh1 as a mask and put the other one, mesh2 (the purple blob), on top of the mask.
Purple blob:
// three.js code
const geometry1 = new THREE.PlaneBufferGeometry(1, 1, 1, 1);
const material1 = new THREE.ShaderMaterial({
    uniforms: this.uniforms,
    vertexShader,
    fragmentShader,
    defines: {
        PR: window.devicePixelRatio.toFixed(1)
    }
});
const mesh1 = new THREE.Mesh(geometry1, material1);
this.scene.add(mesh1);
// fragment shader
void main() {
    vec2 res = u_res * PR;
    vec2 st = gl_FragCoord.xy / res.xy - 0.5;
    st.y *= u_res.y / u_res.x * 0.8;
    vec2 circlePos = st;
    float c = circle(circlePos, 0.2 + 0. * 0.1, 1.) * 2.5;
    float offx = v_uv.x + sin(v_uv.y + u_time * .1);
    float offy = v_uv.y * .1 - u_time * 0.005 - cos(u_time * .001) * .01;
    float n = snoise3(vec3(offx, offy, .9) * 2.5) - 2.1;
    float finalMask = smoothstep(1., 0.99, n + pow(c, 1.5));
    vec4 bg = vec4(0.12, 0.07, 0.28, 1.0);
    vec4 bg2 = vec4(0., 0., 0., 0.);
    gl_FragColor = mix(bg, bg2, finalMask);
}
Blue mask
// three.js code
const geometry2 = new THREE.PlaneBufferGeometry(1, 1, 1, 1);
const material2 = new THREE.ShaderMaterial({
    uniforms,
    vertexShader,
    fragmentShader,
    defines: {
        PR: window.devicePixelRatio.toFixed(1)
    }
});
const mesh2 = new THREE.Mesh(geometry2, material2);
this.scene.add(mesh2);
// fragment shader
void main() {
    vec2 res = u_res * PR;
    vec2 st = gl_FragCoord.xy / res.xy - 0.5;
    st.y *= u_res.y / u_res.x * 0.8;
    vec2 circlePos = st;
    float c = circle(circlePos, 0.2 + 0. * 0.1, 1.) * 2.5;
    float offx = v_uv.x + sin(v_uv.y + u_time * .1);
    float offy = v_uv.y * .1 - u_time * 0.005 - cos(u_time * .001) * .01;
    float n = snoise3(vec3(offx, offy, .9) * 2.5) - 2.1;
    float finalMask = smoothstep(1., 0.99, n + pow(c, 1.5));
    vec4 bg = vec4(0.12, 0.07, 0.28, 1.0);
    vec4 bg2 = vec4(0., 0., 0., 0.);
    gl_FragColor = mix(bg, bg2, finalMask);
}
Render Target code
this.rtWidth = window.innerWidth;
this.rtHeight = window.innerHeight;
this.renderTarget = new THREE.WebGLRenderTarget(this.rtWidth, this.rtHeight);
this.rtCamera = new THREE.PerspectiveCamera(
    this.camera.settings.fov,
    this.camera.settings.aspect,
    this.camera.settings.near,
    this.camera.settings.far
);
this.rtCamera.position.set(0, 0, this.camera.settings.perspective);
this.rtScene = new THREE.Scene();
this.rtScene.add(this.purpleBlob);
const geometry = new THREE.PlaneGeometry(window.innerWidth, window.innerHeight, 1);
const material = new THREE.MeshPhongMaterial({
    map: this.renderTarget.texture,
});
this.mesh = new THREE.Mesh(geometry, material);
this.scene.add(this.mesh);
I'm still new to shaders so please be patient. :-)
There are probably infinite ways to mask in three.js. Here are a few:
Use the stencil buffer
The stencil buffer is similar to the depth buffer in that for every pixel in the canvas or render target there is a corresponding stencil pixel. You need to tell three.js you want a stencil buffer, and then, while rendering, you can tell it what to do with the stencil buffer as you draw things.
You set the stencil settings on Material. You tell three.js:
what to do if the pixel you're drawing fails the stencil test
what to do if the pixel you're drawing fails the depth test
what to do if the pixel you're drawing passes the depth test
The things you can tell it to do for each of those conditions are: keep (do nothing), increment, decrement, increment with wraparound, decrement with wraparound, or set to a specific value.
You can also specify what the stencil test is by setting Material.stencilFunc.
So, for example, you can clear the stencil buffer to 0 (the default?), set the stencil test so it always passes, and set the conditions so that if the depth test passes you set the stencil to 1. You then draw a bunch of things. Everywhere they are drawn there will now be a 1 in the stencil buffer.
Now you change the stencil test so it only passes if the stencil equals 1 (or 0), then draw more stuff; it will only be drawn where the stencil equals the value you set.
This example uses the stencil
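As a rough sketch of how that can look in three.js (assuming a version recent enough to expose the stencil settings on Material; the mesh and material names here are made up):
// Mask pass: render invisibly, but write 1 into the stencil wherever the mask draws.
const maskMaterial = new THREE.MeshBasicMaterial({ colorWrite: false, depthWrite: false });
maskMaterial.stencilWrite = true;
maskMaterial.stencilRef = 1;
maskMaterial.stencilFunc = THREE.AlwaysStencilFunc; // always pass the stencil test
maskMaterial.stencilZPass = THREE.ReplaceStencilOp; // on depth pass, write stencilRef
// Masked pass: only draw where the stencil buffer equals 1.
const maskedMaterial = new THREE.MeshBasicMaterial({ color: 0x8833ff });
maskedMaterial.stencilWrite = true;
maskedMaterial.stencilRef = 1;
maskedMaterial.stencilFunc = THREE.EqualStencilFunc;
// Make sure the mask mesh is rendered before the masked mesh.
maskMesh.renderOrder = 1;
maskedMesh.renderOrder = 2;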
Mask with an alpha mask
In this case you need 2 color textures and an alpha texture. How you get those is up to you. For example, you could load all 3 from images, or generate all 3 using 3 render targets. Finally you pass all 3 to a shader that mixes them, as in
gl_FragColor = mix(colorFromTexture1, colorFromTexture2, valueFromAlphaTexture);
This example uses this alpha mixing method
Note that if one of your 2 color textures has an alpha channel, you could use just 2 textures: you'd simply pass one of the color textures as your mask.
Or of course you could calculate a mask based on the colors in one image or the other or both. For example
// assume you have function that converts from rgb to hue,saturation,value
vec3 hsv = rgb2hsv(colorFromTexture1.rgb);
float hue = hsv.x;
// pick one or the other if color1 is close to green
float mixAmount = step(abs(hue - 0.33), 0.05);
gl_FragColor = mix(colorFromTexture1, colorFromTexture2, mixAmount);
The point here is not that exact code; it's that you can make any formula you want for the mask, based on whatever you want: color, position, random math, sine waves based on time, some formula that generates a blob, whatever. The most common approach is code that just looks up a mixAmount from a texture, which is what the linked example above does.
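For concreteness, a minimal fragment shader built around that idea might look like this (a sketch; the sampler names u_color1, u_color2, u_mask and the varying v_uv are assumptions, not taken from the linked example):
uniform sampler2D u_color1;
uniform sampler2D u_color2;
uniform sampler2D u_mask;
varying vec2 v_uv;
void main() {
    vec4 color1 = texture2D(u_color1, v_uv);
    vec4 color2 = texture2D(u_color2, v_uv);
    float mixAmount = texture2D(u_mask, v_uv).r; // mask stored in the red channel
    gl_FragColor = mix(color1, color2, mixAmount);
}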
ShaderToy style
Your code above appears to be a ShaderToy-style shader drawing a fullscreen quad. Instead of drawing 2 separate things, you can just draw them both in the same shader:
vec4 computeBlueBlob() {
    ...
    return blueBlobColor;
}
vec4 computeWhiteBlob() {
    ...
    return whiteBlobColor;
}
void main() {
    vec4 color1 = computeBlueBlob();
    vec4 color2 = computeWhiteBlob();
    float mixAmount = color2.a; // note: this could be any formula
                                // to decide which colors to draw
    gl_FragColor = mix(color1, color2, mixAmount);
}
Note, just like above, how you compute mixAmount is up to you. Base it off anything: color1.r, color2.r, some formula, some hue, some other blob-generating function, whatever.

OpenGL - Texture origin is top-left

I have a little problem:
I know that in UVs, the origin (0, 0) is at the bottom left.
As I've progressed in discovering OpenGL (I use LWJGL 3), I wanted to make an interface.
My thinking is that to display a button I just need to draw a texture that includes the text.
So I created, as usual, a new shader (vertex shader + fragment shader).
But to my great surprise, my texture is upside down! Why?
It's very weird.
This is my vertex shader:
#version 330
in vec3 position;
in vec2 texCoords;
out vec2 pass_texCoords;
void main() {
    gl_Position = vec4(position, 1);
    pass_texCoords = texCoords;
}
This is my fragmentShader:
#version 330
in vec2 pass_texCoords;
out vec4 out_Color;
uniform sampler2D texSampler;
void main() {
    out_Color = texture(texSampler, pass_texCoords);
}
This is how I create my VAO
(for the moment I plan to display a triangle):
my vertices: -0.5, 0.5, 0, -0.5, -0.5, 0, 0.5, -0.5, 0
my texCoords: 0, 1, 0, 0, 1, 0
my indices: 0, 1, 2
And now how I display it:
//bind shader, bind VAO, enable VBOs
GL13.glActiveTexture(GL13.GL_TEXTURE0);
GL11.glBindTexture(GL11.GL_TEXTURE_2D, texture);
GL11.glDrawElements(GL11.GL_TRIANGLES, 3, GL11.GL_UNSIGNED_INT, 0);
//disableVBOs, unbind VAO, unbind shader
My triangle looks like this (but the texture inside is inverted):
| \
| \
| \
|_________
Need more code?
I know how to invert the texture manually, but I think the best way is to find the actual problem...
And if someone knows a better way to add a user interface, I'm more than interested!

Convert ndc coordinates to world coordinates in fragment shader threejs

My goal is to draw a circle around my mouse cursor over a plane.
I get NDC coordinates (-1 to +1) that represent my cursor position:
const rect = targetHTML.getBoundingClientRect();
const mousePositionX = event.clientX - rect.left;
const mousePositionY = event.clientY - rect.top;
this._currentPoint = {
    x: (mousePositionX / targetHTML.clientWidth * 2 - 1),
    y: (mousePositionY / targetHTML.clientHeight * -2 + 1),
};
I pass it to my fragment shader via uniforms:
this._cursorMaterial.uniforms.uBrushPosition.value =
new window.THREE.Vector2(this._currentPoint.x, this._currentPoint.y);
In my fragment shader, I want to convert it to a world coordinate in order to compare it to the fragment world location.
// vertex shader
varying vec4 vPos;
void main() {
    vPos = modelMatrix * vec4( position, 1.0 );
    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
// fragment shader
varying vec4 vPos;
uniform vec2 uBrushPosition;
void main() {
    // convert uBrushPosition to world space
    vec3 brushWorldPosition = ?
    if (distance(brushWorldPosition, vPos.xyz) < 10.) {
        gl_FragColor = vec4(1., 0., 0., .5);
    } else {
        discard;
    }
}
Not in the shader, but you can send it in as a uniform.
var mouseWorld = new THREE.Vector3( mouse.x, mouse.y, distanceFromCamera )
mouseWorld.unproject( camera )
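Note that the z you pass to unproject is an NDC depth in [-1, 1], not a literal distance. If you want a world point at an exact distance from the camera, one way (a sketch; the uniform name uBrushWorldPos is made up) is to unproject at an arbitrary depth and then walk along the resulting ray:
const ndc = new THREE.Vector3(mouse.x, mouse.y, 0.5); // any z strictly between -1 and 1
ndc.unproject(camera); // now a world-space point on the picking ray
const dir = ndc.sub(camera.position).normalize(); // ray direction from the camera
const mouseWorld = camera.position.clone().add(dir.multiplyScalar(distanceFromCamera));
material.uniforms.uBrushWorldPos.value.copy(mouseWorld);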

Flickering of THREE.Points based on camera position and texture coordinates, but only on Nvidia cards

I have a problem with flickering of THREE.Points depending on their UV coordinates, as seen in the following codepen: http://codepen.io/anon/pen/qrdQeY?editors=0010
The code in the codepen is condensed down as much as possible (171 lines),
but to summarize what I'm doing:
Rendering sprites using THREE.Points
BufferGeometry contains spritesheet index and position for each sprite
RawShaderMaterial with a custom vertex and pixel shader to look up the UV coordinates of the sprite for the given index
a 128x128px spritesheet with 4x4 cells contains the sprites
Here's the code:
/// FRAGMENT SHADER ===========================================================
const fragmentShader = `
precision highp float;
uniform sampler2D spritesheet;
// number of spritesheet subdivisions both vertically and horizontally
// e.g. for a 4x4 spritesheet this number is 4
uniform float spritesheetSubdivisions;
// vParams[i].x = sprite index
// vParams[i].z = sprite alpha
varying vec3 vParams;
/**
* Maps regular UV coordinates spanning the entire spritesheet
* to a specific sprite within the spritesheet based on the given index,
* which points into a spritesheet cell (depending on spritesheetSubdivisions
* and assuming that the spritesheet is regular and square).
*/
vec2 spriteIndexToUV(float idx, vec2 uv) {
    float cols = spritesheetSubdivisions;
    float rows = spritesheetSubdivisions;
    float x = mod(idx, cols);
    float y = floor(idx / cols);
    return vec2(x / cols + uv.x / cols, 1.0 - (y / rows + (uv.y) / rows));
}
void main() {
    vec2 uv = spriteIndexToUV(vParams.x, gl_PointCoord);
    vec4 diffuse = texture2D(spritesheet, uv);
    float alpha = diffuse.a * vParams.z;
    if (alpha < 0.5) discard;
    gl_FragColor = vec4(diffuse.xyz, alpha);
}
`
// VERTEX SHADER ==============================================================
const vertexShader = `
precision highp float;
uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;
uniform float size;
uniform float scale;
attribute vec3 position;
attribute vec3 params; // x = sprite index, y = unused, z = sprite alpha
attribute vec3 color;
varying vec3 vParams;
void main() {
    vParams = params;
    vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );
    gl_Position = projectionMatrix * mvPosition;
    gl_PointSize = size * ( scale / - mvPosition.z );
}
`
// THREEJS CODE ===============================================================
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000);
const renderer = new THREE.WebGLRenderer({canvas: document.querySelector("#mycanvas")});
renderer.setSize(window.innerWidth, window.innerHeight);
renderer.setClearColor(0xf0f0f0)
const pointGeometry = new THREE.BufferGeometry()
pointGeometry.addAttribute("position", new THREE.BufferAttribute(new Float32Array([
-1.5, -1.5, 0,
-0.5, -1.5, 0,
0.5, -1.5, 0,
1.5, -1.5, 0,
-1.5, -0.5, 0,
-0.5, -0.5, 0,
0.5, -0.5, 0,
1.5, -0.5, 0,
-1.5, 0.5, 0,
-0.5, 0.5, 0,
0.5, 0.5, 0,
1.5, 0.5, 0,
-1.5, 1.5, 0,
-0.5, 1.5, 0,
0.5, 1.5, 0,
1.5, 1.5, 0,
]), 3))
pointGeometry.addAttribute("params", new THREE.BufferAttribute(new Float32Array([
0, 0, 1, // sprite index 0 (row 0, column 0)
1, 0, 1, // sprite index 1 (row 0, column 1)
2, 0, 1, // sprite index 2 (row 0, column 2)
3, 0, 1, // sprite index 3 (row 0, column 3)
4, 0, 1, // sprite index 4 (row 1, column 0)
5, 0, 1, // sprite index 5 (row 1, column 1)
6, 0, 1, // ...
7, 0, 1,
8, 0, 1,
9, 0, 1,
10, 0, 1,
11, 0, 1,
12, 0, 1,
13, 0, 1,
14, 0, 1,
15, 0, 1
]), 3))
const img = document.querySelector("img")
const texture = new THREE.TextureLoader().load(img.src);
const pointMaterial = new THREE.RawShaderMaterial({
    transparent: true,
    vertexShader: vertexShader,
    fragmentShader: fragmentShader,
    uniforms: {
        spritesheet: { type: "t", value: texture },
        spritesheetSubdivisions: { type: "f", value: 4 },
        size: { type: "f", value: 1 },
        scale: { type: "f", value: window.innerHeight / 2 }
    }
});
const points = new THREE.Points(pointGeometry, pointMaterial)
scene.add(points)
const render = function (timestamp) {
    requestAnimationFrame(render);
    camera.position.z = 5 + Math.sin(timestamp / 1000.0);
    renderer.render(scene, camera);
};
render();
// resize viewport
window.addEventListener( 'resize', onWindowResize, false );
function onWindowResize() {
    camera.aspect = window.innerWidth / window.innerHeight;
    camera.updateProjectionMatrix();
    renderer.setSize( window.innerWidth, window.innerHeight );
}
If you have an Nvidia card you will see three sprites flicker while the camera
is moving back and forth along the Z axis. On integrated Intel graphics chips
the problem does not occur.
I'm not sure how to solve this problem. The affected UV coordinates seem kind of random. I'd be grateful for any pointers.
The mod()/floor() calculations inside your spriteIndexToUV() function are causing problems in certain configurations (when the sprite index is a multiple of spritesheetSubdivisions).
I could fix it by tweaking the cols variable with a small epsilon:
vec2 spriteIndexToUV(float idx, vec2 uv)
{
    float cols = spritesheetSubdivisions - 1e-6; // subtract epsilon
    float rows = spritesheetSubdivisions;
    float x = mod(idx, cols);
    float y = floor(idx / cols);
    return vec2(x / cols + uv.x / cols, 1.0 - (y / rows + (uv.y) / rows));
}
PS: That codepen stuff is really cool, didn't know that this existed :-)
edit: It might be even better/clearer to write it like this:
float cols = spritesheetSubdivisions;
float rows = spritesheetSubdivisions;
float y = floor((idx + 0.5) / cols);
float x = idx - cols * y;
That way, we keep totally clear of any critical situations in the floor operation -- plus we get rid of the mod() call.
As to why floor(idx/4) sometimes produces 0 instead of 1 when idx should be exactly 4.0, I can only speculate that the varying vec3 vParams is subjected to some interpolation on its way from the vertex-shader to the fragment-shader stage, so the fragment shader sees e.g. 3.999999 instead of exactly 4.0.
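If that is indeed the cause, another defensive option (my own sketch, not part of the fix above) is to snap the interpolated index back to the nearest integer at the top of the fragment shader:
// round the interpolated sprite index so that e.g. 3.999999 becomes 4.0 again
float idx = floor(vParams.x + 0.5);
vec2 uv = spriteIndexToUV(idx, gl_PointCoord);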

glClipPlane - Is there an equivalent in webGL?

I have a 3D mesh. Is there any possibility to render the sectional view (clipping) like glClipPlane in OpenGL?
I am using Three.js r65.
The latest shader that I have added is:
Fragment Shader:
uniform float time;
uniform vec2 resolution;
varying vec2 vUv;
void main( void )
{
    vec2 position = -1.0 + 2.0 * vUv;
    float red = abs( sin( position.x * position.y + time / 2.0 ) );
    float green = abs( cos( position.x * position.y + time / 3.0 ) );
    float blue = abs( cos( position.x * position.y + time / 4.0 ) );
    if( position.x > 0.2 && position.y > 0.2 )
    {
        discard;
    }
    gl_FragColor = vec4( red, green, blue, 1.0 );
}
Vertex Shader:
varying vec2 vUv;
void main()
{
    vUv = uv;
    vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );
    gl_Position = projectionMatrix * mvPosition;
}
Unfortunately, the OpenGL ES specification against which WebGL is specified has no clip planes, and the vertex shader stage lacks the gl_ClipDistance output by which plane clipping is implemented in modern OpenGL.
However, you can use the fragment shader to implement per-fragment clipping: test the position of the incoming fragment against your set of clip planes and, if the fragment does not pass the test, discard it.
Update
Let's have a look at how clip planes are defined in fixed function pipeline OpenGL:
void ClipPlane( enum p, double eqn[4] );
The value of the first argument, p, is a symbolic constant, CLIP_PLANEi, where i is an integer between 0 and n − 1, indicating one of n client-defined clip planes. eqn is an array of four double-precision floating-point values. These are the coefficients of a plane equation in object coordinates: p1, p2, p3, and p4 (in that order). The inverse of the current model-view matrix is applied to these coefficients, at the time they are specified, yielding
p' = (p'1, p'2, p'3, p'4) = (p1, p2, p3, p4) inv(M)
(where M is the current model-view matrix; the resulting plane equation is undefined if M is singular and may be inaccurate if M is poorly conditioned) to obtain the plane equation coefficients in eye coordinates. All points with eye coordinates (x_e, y_e, z_e, w_e)^T that satisfy
(p'1, p'2, p'3, p'4) · (x_e, y_e, z_e, w_e)^T ≥ 0
lie in the half-space defined by the plane; points that do not satisfy this condition do not lie in the half-space.
So what you do is: add uniforms by which you pass the clip plane parameters p', and add another out/in pair of variables between the vertex and fragment shader to pass the vertex's eye-space position. Then, in the fragment shader, the first thing you do is perform the clip plane equation test; if it doesn't pass, you discard the fragment.
In the vertex shader
in vec3 vertex_position;
out vec4 eyespace_pos;
uniform mat4 modelview;
void main()
{
    /* ... */
    eyespace_pos = modelview * vec4(vertex_position, 1);
    /* ... */
}
In the fragment shader
in vec4 eyespace_pos;
uniform vec4 clipplane;
void main()
{
    if( dot( eyespace_pos, clipplane ) < 0.0 ) {
        discard;
    }
    /* ... */
}
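To fill that clipplane uniform you need the plane coefficients in eye space. Plane coefficients transform by the inverse transpose of the matrix that transforms points, so for a world-space plane (a, b, c, d) a sketch in three.js terms (the uniform and variable names are illustrative) could be:
// camera.matrixWorld is the inverse of the view matrix, so its transpose
// is the inverse transpose of the view matrix.
const planeWorld = new THREE.Vector4(a, b, c, d); // world-space plane coefficients
const invTransposeView = new THREE.Matrix4().copy(camera.matrixWorld).transpose();
const planeEye = planeWorld.clone().applyMatrix4(invTransposeView);
material.uniforms.clipplane.value.copy(planeEye); // p' in eye coordinates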
In the newer versions (> r.76) of three.js clipping is supported in the THREE.WebGLRenderer. There is an array property called clippingPlanes where you can add your custom clipping planes (THREE.Plane instances).
For three.js you can check these two examples:
1) WebGL clipping (code base here on GitHub)
2) WebGL clipping advanced (code base here on GitHub)
A simple example
To add a clipping plane to the renderer you can do:
var normal = new THREE.Vector3( -1, 0, 0 );
var constant = 0;
var plane = new THREE.Plane( normal, constant );
renderer.clippingPlanes = [plane];
Here is a fiddle to demonstrate this.
You can also clip at the object level by adding a clipping plane to the object's material. For this to work you have to set the renderer's localClippingEnabled property to true.
// set renderer
renderer.localClippingEnabled = true;
// add clipping plane to material
var normal = new THREE.Vector3( -1, 0, 0 );
var constant = 0;
var color = 0xff0000;
var plane = new THREE.Plane( normal, constant );
var material = new THREE.MeshBasicMaterial({ color: color });
material.clippingPlanes = [plane];
var mesh = new THREE.Mesh( geometry, material );
Note: In r.77 some of the clipping functionality in the THREE.WebGLRenderer was moved to a separate THREE.WebGLClipping class; check here for reference in the three.js master branch.
