I get this error
"XNA Framework Reach profile requires TextureAddressMode to be Clamp when using texture sizes that are not powers of two"
for line
GraphicsDevice.DrawUserPrimitives<VertexPositionTexture>
(PrimitiveType.TriangleStrip, verts, 0, 2);
What should I do?
Thanks,
When using the Reach profile, either use power-of-two texture sizes or set the TextureAddressMode to Clamp.
The TextureAddressMode is part of the GraphicsDevice's SamplerState. You need to set this state before your draw call. The following code shows how to set the first texture sampler to one of the built-in sampler states.
GraphicsDevice.SamplerStates[0] = SamplerState.LinearClamp;
GraphicsDevice.DrawUserPrimitives<VertexPositionTexture>
(PrimitiveType.TriangleStrip, verts, 0, 2);
In my case, the Model.fx file was setting the address mode to Wrap, and that was what caused the error.
Here's the correct sampler_state from my Model.fx:
sampler TextureSampler = sampler_state
{
    Texture = (Texture);
    MinFilter = Linear;
    MagFilter = Linear;
    MipFilter = Linear;
    AddressU = Clamp;
    AddressV = Clamp;
};
More info can be found here: http://www.packtpub.com/article/xna-hsl
I am trying to assign a standardised color to each point of a 3D point cloud on load; the point cloud is represented with a BufferGeometry in Three.js. The buffer geometry does not have an initial color attribute attached to it at load.
The issue I'm facing is that every time I reload the point cloud, its color changes even though the color is supposed to be hardcoded in. The RGB values in the color attribute array even remain the same across the separate loads, as seen in the following image.
Thus, I am wondering where the additional shade of color comes from that causes my buffer geometry to appear different from the way I coded it. Thank you for your help.
The following is the code I used to assign the color to the points on the point cloud:
if (!points.geometry.attributes.color) {
  // set a default color attribute
  const colorsArr = []
  for (let i = 0, n = points.geometry.attributes.position.count; i < n; i++) {
    colorsArr.push(0.69, 0.78, 1)
  }
  points.geometry.setAttribute(
    'color',
    new THREE.Float32BufferAttribute(colorsArr, 3, false)
  )
}
points.geometry.attributes.color.needsUpdate = true
I have a basic three.js game working and I'd like to add particles. I've been searching online, including multiple questions here, and the closest I've come to getting a 'particle system' working is using a THREE.BufferGeometry, a THREE.BufferAttribute and a THREE.Points mesh. I set it up like this:
const particleMaterial = new THREE.PointsMaterial( { size: 10, map: particleTexture, blending: THREE.AdditiveBlending, transparent: true } );
const particlesGeometry = new THREE.BufferGeometry();
const particlesCount = 300;
const posArray = new Float32Array(particlesCount * 3);
for (let i = 0; i < particlesCount * 3; i++) {
  posArray[i] = Math.random() * 10;
}
const particleBufferAttribute = new THREE.BufferAttribute(posArray, 3);
particlesGeometry.setAttribute( 'position', particleBufferAttribute );
const particlesMesh = new THREE.Points(particlesGeometry, particleMaterial);
particlesMesh.counter = 0;
scene.add(particlesMesh);
This part works and displays the particles fine, at their initial positions, but of course I'd like to move them.
I have tried all manner of things in my 'animate' function, but I haven't hit upon the right combination. I'd like to move the particles, ideally one vertex per frame.
The current thing I'm doing in the animate function - which does not work! - is this:
particleBufferAttribute.setXYZ( particlesMesh.counter, objects[0].position.x, objects[0].position.y, objects[0].position.z );
particlesGeometry.setAttribute( 'position', particleBufferAttribute );
//posArray[particlesMesh.counter] = objects[0].position;
particlesMesh.counter++;
if (particlesMesh.counter > particlesCount) {
  particlesMesh.counter = 0;
}
If anyone has any pointers about how to move Points mesh vertices, that would be great.
Alternatively, if this is not at all the right approach, please let me know.
I did find Stemkoski's ShaderParticleEngine, but I could not find any information about how to make it work (the docs are very minimal and do not seem to include examples).
You don't need to re-set the attribute, but you do need to tell the renderer that the attribute has changed.
particleBufferAttribute.setXYZ( particlesMesh.counter, objects[0].position.x, objects[0].position.y, objects[0].position.z );
particleBufferAttribute.needsUpdate = true; // This is the kicker!
By setting needsUpdate to true, the renderer knows to re-upload that attribute to the GPU.
This might not be a concern for you, but just know that moving particles in this way is expensive, because you re-upload the position attribute every single frame, including all the position data for every particle you aren't moving.
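For reference, a minimal sketch of the animate loop with that fix applied might look like this (renderer, scene and camera are the usual setup from your game, which isn't shown in the question; the wrap-around check also uses >= here so the index stays in range):
function animate() {
  requestAnimationFrame(animate);

  // Move one particle per frame to the tracked object's position.
  particleBufferAttribute.setXYZ(
    particlesMesh.counter,
    objects[0].position.x,
    objects[0].position.y,
    objects[0].position.z
  );
  particleBufferAttribute.needsUpdate = true; // re-upload the changed positions to the GPU

  particlesMesh.counter++;
  if (particlesMesh.counter >= particlesCount) {
    particlesMesh.counter = 0;
  }

  renderer.render(scene, camera);
}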
I need to sample the RGBA values of a texture, and I'm using the alpha channel to store custom data, not really as opacity. The problem is that as alpha approaches 0, my RGB values also get multiplied towards 0, but I need them to remain independent.
My first render pass gets rendered into a WebGLRenderTarget, and this is how I'm storing data in GLSL into the alpha channel:
gl_FragColor.rgb = mainColor;
gl_FragColor.a = depthData;
The result shows the alpha fading to white as expected:
My second render pass gets rendered to the canvas, so I read the RGB values from the first texture in GLSL, with alpha set to 1:
gl_FragColor.rgb = texture2D(tColor, vUv).rgb;
gl_FragColor.a = 1.0;
The result should be the regular RGB values of the spheres, but it seems that the RGB values fade to black when Alpha approaches 0.
Is there a way to use the alpha channel as data storage without affecting the RGB values? I've tried setting the target texture's premultiplyAlpha = false as follows, but it doesn't change anything:
var target = new THREE.WebGLRenderTarget(
  window.innerWidth,
  window.innerHeight, {
    format: THREE.RGBAFormat,
    type: THREE.UnsignedByteType
  });
target.texture.premultiplyAlpha = false;
Update:
I was building a working demo here, and I was unable to reproduce the bug. It looks like I'm going to have to rip apart my entire project until I get to the bottom of this...
I am working on a project that displays buildings. The requirement is to let the building gradually fade out (transparent) based on the distance between the camera and the buildings. Also, this effect has to follow the camera's movement.
I considered using THREE.Fog(), but the fog seems to only change the material's color.
Above is a picture of the building with white fog.
The buildings are in tiles; each tile is a single geometry (I merged all the buildings into one) using
var bigGeometry = new THREE.Geometry();
bigGeometry.merge(smallGeometry);
The purple/blue thing is the ground, with ground.material.fog = false; so the ground won't interact with the fog.
My question is:
Is it possible to let the fog interact with the building material's opacity instead of its color? (more white translates to more transparent)
Or should I use a shader to control the material's opacity based on the distance to the camera? I have no idea how to do this.
I also considered adding an alphaMap. If so, each building tile would have to map an alphaMap, and all these alphaMaps would have to interact with the camera's movement. That would be a ton of work.
So any suggestions?
Best Regards,
Arthur
NOTE: I suspect there are probably easier/prettier ways to solve this than opacity. In particular, note that partially-opaque buildings will show other buildings behind them. To address that, consider using a gradient or some other scene background, and choosing a fog color to match that, rather than using opacity. But for the sake of trying it...
Here's how to alter an object's opacity based on its distance. This doesn't actually require THREE.Fog; I'm not sure how you would use the fog data directly. Instead I'll use THREE.NodeMaterial, which (as of three.js r96) is fairly experimental. The alternative would be to write a custom shader with THREE.ShaderMaterial, which is also fine.
const material = new THREE.StandardNodeMaterial();
material.transparent = true;
material.color = new THREE.ColorNode( 0xeeeeee );
// Calculate alpha of each fragment roughly as:
// alpha = 1.0 - saturate( distance / cutoff )
//
// Technically this is distance from the origin, for the demo, but
// distance from a custom THREE.Vector3Node would work just as well.
const distance = new THREE.Math2Node(
  new THREE.PositionNode( THREE.PositionNode.WORLD ),
  new THREE.PositionNode( THREE.PositionNode.WORLD ),
  THREE.Math2Node.DOT
);
const normalizedDistance = new THREE.Math1Node(
  new THREE.OperatorNode(
    distance,
    new THREE.FloatNode( 50 * 50 ),
    THREE.OperatorNode.DIV
  ),
  THREE.Math1Node.SAT
);
material.alpha = new THREE.OperatorNode(
  new THREE.FloatNode( 1.0 ),
  normalizedDistance,
  THREE.OperatorNode.SUB
);
Demo: https://jsfiddle.net/donmccurdy/1L4s9e0c/
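To try this on the merged building geometry from the question, the material is applied like any other (bigGeometry is the merged geometry from above; the mesh name is just illustrative):
const buildings = new THREE.Mesh(bigGeometry, material);
scene.add(buildings);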
I am the OP. After spending some time reading about how to use Three.js's ShaderMaterial, I got some code that works as desired.
Here's the code: https://jsfiddle.net/yingcai/4dxnysvq/
The basic idea is:
1. Create a uniform that contains controls.target (a Vector3 position).
2. Pass the vertex position as a varying in the vertex shader, so that the fragment shader can access it.
3. Get the distance between each vertex position and controls.target, and calculate an alpha value based on that distance.
4. Assign the alpha value to the vertex color.
Another important thing: because the fade-out mask should follow the camera's movement, don't forget to update the control value in the uniforms every frame.
// Create uniforms that contains control position value.
uniforms = {
  texture: {
    value: new THREE.TextureLoader().load("https://threejs.org/examples/textures/water.jpg")
  },
  control: {
    value: controls.target
  }
};
// In the render() method.
// Update the uniforms value every frame.
uniforms.control.value = controls.target;
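For anyone who doesn't want to open the fiddle, a rough sketch of the ShaderMaterial described above could look like this (the fade radius, output color and variable names are illustrative, not copied from the fiddle):
const material = new THREE.ShaderMaterial({
  uniforms: uniforms,
  transparent: true,
  vertexShader: `
    varying vec3 vWorldPosition;
    void main() {
      // pass the world-space vertex position to the fragment shader
      vWorldPosition = (modelMatrix * vec4(position, 1.0)).xyz;
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }
  `,
  fragmentShader: `
    uniform vec3 control;        // controls.target, updated every frame
    varying vec3 vWorldPosition;
    void main() {
      float dist = distance(vWorldPosition, control);
      // opaque near the orbit target, fading out over ~500 units (illustrative cutoff)
      float alpha = 1.0 - clamp(dist / 500.0, 0.0, 1.0);
      gl_FragColor = vec4(vec3(0.8), alpha);
    }
  `
});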
I had the same issue - a few years later - and solved it with the .onBeforeCompile function, which is maybe more convenient to use.
There is a great tutorial here.
The code itself is simple and could easily be changed for other materials. It just uses the fogFactor as the alpha value in the material.
Here is the material function:
alphaFog() {
  const material = new THREE.MeshPhysicalMaterial();
  material.onBeforeCompile = function (shader) {
    const alphaFog = `
      #ifdef USE_FOG
        #ifdef FOG_EXP2
          float fogFactor = 1.0 - exp( - fogDensity * fogDensity * vFogDepth * vFogDepth );
        #else
          float fogFactor = smoothstep( fogNear, fogFar, vFogDepth );
        #endif
        gl_FragColor.a = saturate(1.0 - fogFactor);
      #endif
    `;
    shader.fragmentShader = shader.fragmentShader.replace(
      '#include <fog_fragment>', alphaFog
    );
    material.userData.shader = shader;
  };
  material.transparent = true;
  return material;
}
and afterwards you can use it like
const cube = new THREE.Mesh(geometry, this.alphaFog());
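One thing to keep in mind: the injected code is wrapped in #ifdef USE_FOG, so it only takes effect when the scene actually has fog configured, for example (the color and distances here are just placeholder values):
scene.fog = new THREE.Fog(0xffffff, 10, 100);        // linear fog (fogNear / fogFar)
// or, for the FOG_EXP2 branch:
// scene.fog = new THREE.FogExp2(0xffffff, 0.02);    // exponential fog (fogDensity)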
I'm trying to take a user's mouse/touch-drawn line and then alpha-fade out the result using a tween. The problem is that when the cap and joint style are set to round, the joint points fade behind the rest of the line. It looks fine when they are set to miter or bevel.
What I want is a smooth solid fade of the shape. Any ideas?
Fiddle: http://jsfiddle.net/mcfarljw/ZNGK2/
Function for drawing the line based on user input:
function handleMouseMove(event) {
  var midPt = new createjs.Point(oldPt.x + stage.mouseX >> 1, oldPt.y + stage.mouseY >> 1);
  drawingCanvas.graphics.setStrokeStyle(stroke, 'round', 'round').beginStroke(color).moveTo(midPt.x, midPt.y).curveTo(oldPt.x, oldPt.y, oldMidPt.x, oldMidPt.y);
  oldPt.x = stage.mouseX;
  oldPt.y = stage.mouseY;
  oldMidPt.x = midPt.x;
  oldMidPt.y = midPt.y;
  stage.update();
}
Tween applied to the shape after line is finished:
createjs.Tween.get(drawingCanvas).to({
  alpha: 0
}, 2000).call(function() {
  drawingCanvas.alpha = 1;
  drawingCanvas.graphics.clear();
});
You'll want to cache the whole shape before fading it out. See the updates I have made to the fiddle; mainly, take a look at line 52, in the handleMouseUp event.
drawingCanvas.cache(0, 0, 800, 800);
Then, when your fade is complete, make sure to uncache before showing the object again; otherwise your graphics.clear() won't work.
drawingCanvas.uncache();
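Putting both snippets together with the tween from the question, the mouse-up logic might look roughly like this (the 800x800 cache bounds are the ones used above; everything else mirrors the question's code):
// Cache the shape so it is rasterized to a single bitmap, then fade that bitmap.
drawingCanvas.cache(0, 0, 800, 800);
createjs.Tween.get(drawingCanvas).to({
  alpha: 0
}, 2000).call(function() {
  drawingCanvas.uncache();        // drop the cache so the cleared graphics show again
  drawingCanvas.graphics.clear(); // remove the drawn line
  drawingCanvas.alpha = 1;        // reset alpha for the next drawing
});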