SKShader performance slowdown

I need to implement a custom shader node using SpriteKit.
In the simulator everything is fine. On a device (3rd-generation iPad) the shader animation is smooth only for the first ~30 seconds; after that the shader's frame rate gradually drops until it looks like a slideshow (1 fps or even less).
It is worth noting that SpriteKit reports 60 fps, and so does Xcode.
The CPU is ~75% busy, but the shader itself renders at ~1 fps.
I own only a 3rd-generation iPad and currently have no opportunity to test on other devices.
The shader code is taken from a WWDC session, but the issue reproduces with any animated shader I've tried.
Shader itself:
void main(void)
{
    float currTime = u_time;
    vec2 uv = v_tex_coord;
    vec2 circleCenter = vec2(0.5, 0.5);
    vec3 circleColor = vec3(0.8, 0.5, 0.7);
    vec3 posColor = vec3(uv, 0.5 + 0.5 * sin(currTime)) * circleColor;
    float illu = pow(1. - distance(uv, circleCenter), 4.) * 1.2;
    illu *= (2. + abs(0.4 + cos(currTime * -20. + 50. * distance(uv, circleCenter)) / 1.5));
    gl_FragColor = vec4(posColor * illu * 2.0, illu * 2.0);
}
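For reference, the shader's brightness math can be reproduced off-GPU. Here is a plain-JavaScript transcription of the illu term (illustrative only, not SpriteKit API), which confirms the radial falloff from the circle center:

```javascript
// Transcription of the shader's brightness ("illu") term for a texel (u, v) at time t.
function illuAt(u, v, t) {
  const d = Math.hypot(u - 0.5, v - 0.5);           // distance(uv, circleCenter)
  let illu = Math.pow(1 - d, 4) * 1.2;              // radial falloff from the center
  illu *= 2 + Math.abs(0.4 + Math.cos(t * -20 + 50 * d) / 1.5); // animated rings
  return illu;
}

console.log(illuAt(0.5, 0.5, 0)); // center: bright
console.log(illuAt(0.0, 0.0, 0)); // corner: nearly dark
```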
GameScene:
import SpriteKit

class GameScene: SKScene {
    override func didMoveToView(view: SKView) {
        addChild(shaderSprite(view.center))
    }

    func shaderSprite(position: CGPoint) -> SKSpriteNode {
        let sprite = SKSpriteNode(texture: SKTexture(imageNamed: "dummy"), color: nil, size: CGSizeMake(100, 100))
        sprite.shader = SKShader(fileNamed: "wwdc")
        sprite.position = position
        return sprite
    }
}
Result in simulator or first ~30 seconds on iPad3:
Result on iPad3 after ~2 minutes:
Please point out my mistake, or just build the project on any other device to check whether it's a device-specific issue. Thanks in advance.
Test project: https://github.com/alexburtnik/SKShaderTest

This is an issue with the u_time uniform. I worked around it by adding my own float uniform and using it instead of u_time.
Shader
void main(void)
{
    float currTime = myUniform;
    ...
}
Update
override func update(currentTime: CFTimeInterval) {
    // Assumes the sprite is named "shadedNode" and its shader was created
    // with an SKUniform named "myUniform"
    let shadedNode = self.childNodeWithName("shadedNode") as SKSpriteNode
    let uniform = shadedNode.shader!.uniformNamed("myUniform")
    uniform!.floatValue = Float(currentTime)
}

Gelan Almagro's answer above works perfectly. I'm adding the Objective-C code for the update: method.
SKSpriteNode *shadedNode = (SKSpriteNode *)[self childNodeWithName:@"shadedNode"];
SKUniform *uniform = [shadedNode.shader uniformNamed:@"myUniform"];
uniform.floatValue = (float)currentTime;

Related

Autodesk Forge Viewer - smoke particle effect in Forge Viewer

I've been trying to make a smoke particle effect in Forge Viewer for several days. I want to achieve something like this in the Forge Viewer. I found some three.js particle engine samples, but none of them work with r71 (the version Forge Viewer uses), so I decided to write my own particle engine. There's a problem, though, that I can't figure out.
At first I tried it in plain three.js (r71, of course, not in Forge Viewer) and got something like this. That looked good to me, so I figured I could start writing my particle engine. But when I tested it in Forge Viewer, things didn't go well.
Back in Forge Viewer, I tested a point cloud with a custom shader and it worked well. I can customize every attribute, such as color, size, and position, for every single point. But when I try to sample an image texture using texture2D in my fragmentShader, the browser shows some warnings and nothing appears in the viewer.
Here are the warnings showed by browser:
WebGL: INVALID_OPERATION: getUniformLocation: program not linked
WebGL: INVALID_OPERATION: getAttribLocation: program not linked
WebGL: INVALID_OPERATION: useProgram: program not valid
vertexShader:
attribute float customSize;

void main() {
    vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );
    gl_Position = projectionMatrix * mvPosition;
    gl_PointSize = customSize;
}
fragmentShader:
uniform sampler2D texture;

void main() {
    vec4 color = texture2D(texture, gl_PointCoord); // when I comment out this line, everything works well
    gl_FragColor = color;
}
Function for creating points:
createPoints() {
  const width = 100;
  const height = 100;
  const pointCount = width * height;
  const positions = new Float32Array(pointCount * 3);
  //const colors = new Float32Array(pointCount * 4);
  const sizes = new Float32Array(pointCount);
  const geometry = new THREE.BufferGeometry();
  const material = new THREE.ShaderMaterial({
    uniforms: {
      texture: { type: "t", value: this.particleTexture }
    },
    vertexShader: this.vertexShader,
    fragmentShader: this.fragmentShader,
    transparent: true,
    depthTest: true,
    blending: THREE.NormalBlending
  });
  material.supportsMrtNormals = true;
  let i = 0;
  for (var x = 0; x < width; x++) {
    for (var y = 0; y < height; y++) {
      const u = x / width, v = y / height;
      positions[i * 3] = u * 20;
      positions[i * 3 + 1] = v * 20;
      positions[i * 3 + 2] = Math.sin(u * 20) + Math.cos(v * 20);
      sizes[i] = 1 + THREE.Math.randFloat(1, 5);
      // color writes commented out to match the disabled colors buffer above
      //colors[i * 4] = Math.random();
      //colors[i * 4 + 1] = Math.random();
      //colors[i * 4 + 2] = Math.random();
      //colors[i * 4 + 3] = 1;
      i++;
    }
  }
  //const colorsAttribute = new THREE.BufferAttribute(colors, 4);
  //colorsAttribute.normalized = true;
  geometry.addAttribute("position", new THREE.BufferAttribute(positions, 3));
  geometry.addAttribute("customSize", new THREE.BufferAttribute(sizes, 1));
  //geometry.addAttribute("customColor", colorsAttribute);
  geometry.computeBoundingBox();
  geometry.isPoints = true;
  const points = new THREE.PointCloud(geometry, material);
  viewer.impl.createOverlayScene('pointclouds');
  viewer.impl.addOverlay('pointclouds', points);
}
In the createPoints() function, this.particleTexture comes from:
THREE.ImageUtils.loadTexture("../img/smokeparticle.png")
The vertexShader, fragmentShader, and createPoints() function are identical between the stand-alone three.js test app in the browser and the Forge Viewer app, but everything only works outside Forge Viewer.
I have searched a lot of tutorials and blogs, but just can't find a solution that fits my case. Can anyone help? Or maybe there's a better way to make a smoke effect in Forge Viewer? Thanks for the help!
(If I missed any information, just tell me and I'll update the question!)
I tried your code and managed to get it working by changing the uniform name from texture to tex.
I think texture is a reserved word in WebGL2 (in GLSL ES 3.00, texture() is the name of a built-in function).
It also works if you switch to WebGL1 (as described in this article: Custom shader materials in Forge).
Fragment shader:
uniform sampler2D tex;

void main() {
    vec4 color = texture2D(tex, gl_PointCoord);
    gl_FragColor = color;
}
Replace the uniform name in the material constructor:
const material = new THREE.ShaderMaterial({
  uniforms: {
    tex: { type: "t", value: this.particleTexture }
  },
  vertexShader: this.vertexShader,
  fragmentShader: this.fragmentShader,
  transparent: true,
  depthTest: true,
  blending: THREE.NormalBlending
});
Can you try it?
My colleague blogged about adding 3D markups to the viewer using THREE.Points and a custom shader with textures: https://forge.autodesk.com/blog/3d-markup-icons-and-info-card. This sounds quite close to what you're trying to do.

How can I make waves from the center of a Plane Geometry in Three.JS using the vertex shader?

I've been learning Three.js and I can't seem to wrap my head around shaders. I have an idea of what I want, and I know the mathematical tools GLSL provides and what they do in simple terms, but I don't understand how they work together.
I have a plane geometry with a shader material. I want to create waves radiating from the center of the plane in the vertex shader, but I am unsure how to accomplish this.
Also, if you can point me to a course or documentation that explains simple concepts regarding vertex and fragment shaders, that would be great!
This is what I have done so far:
varying vec2 vUv;
varying float vuTime;
varying float vElevation;
uniform float uTime;

void main() {
    vec4 modelPosition = modelMatrix * vec4(position, 1.0);
    float elevation = sin(modelPosition.x * 10.0 - uTime) * 0.1;
    modelPosition.y += elevation;

    vec4 viewPosition = viewMatrix * modelPosition;
    vec4 projectedPosition = projectionMatrix * viewPosition;
    gl_Position = projectedPosition;

    vuTime = uTime;
    vUv = uv;
    vElevation = elevation;
}
I have set up a simple animation using the sin function and a time uniform passed to the shader, which creates a simple wave effect without using noise. I am trying to create a circular wave emanating from the center of the plane geometry.
What I THINK I have to do is use PI to offset the position away from the center while the wave moves with uTime. To get to the center of the plane geometry, I need to offset the position by a 0.5 float.
That is my understanding right now, and I would love to know whether I'm correct in my thinking, or what the correct way of accomplishing this is.
I am also passing the varying variable to the fragment shader to control the color at the elevation.
Thanks for any help you provide; I appreciate it!
In your shader code, try to change this line
float elevation = sin(modelPosition.x * 10.0 - uTime) * 0.1;
to this
float elevation = sin(length(modelPosition.xz) * 10.0 - uTime) * 0.1;
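To see why length() produces a circular wave, here is the same elevation function in plain JavaScript: any two points at equal distance from the center get the same height, so the crests form rings instead of parallel stripes.

```javascript
// Radial wave: elevation depends only on distance from the center, not on direction.
function elevation(x, z, t) {
  return Math.sin(Math.hypot(x, z) * 10.0 - t) * 0.1;
}

// Points one unit from the origin in different directions share the same elevation.
console.log(elevation(1, 0, 0.5), elevation(0, 1, 0.5), elevation(-1, 0, 0.5));
```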
You can use either UV coords or position.
let scene = new THREE.Scene();
let camera = new THREE.PerspectiveCamera(60, innerWidth / innerHeight, 1, 1000);
camera.position.set(0, 10, 10).setLength(10);
let renderer = new THREE.WebGLRenderer();
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);
let controls = new THREE.OrbitControls(camera, renderer.domElement);

scene.add(new THREE.GridHelper(10, 10, "magenta", "yellow"));

let g = new THREE.PlaneGeometry(10, 10, 50, 50);
let m = new THREE.ShaderMaterial({
  wireframe: true,
  uniforms: {
    time: {value: 0},
    color: {value: new THREE.Color("aqua")}
  },
  vertexShader: `
    #define PI 3.1415926
    #define PI2 PI*2.
    uniform float time;
    void main(){
      vec3 pos = position;
      pos.z = sin((length(uv - 0.5) - time) * 6. * PI2);
      gl_Position = projectionMatrix * modelViewMatrix * vec4(pos, 1.);
    }
  `,
  fragmentShader: `
    uniform vec3 color;
    void main(){
      gl_FragColor = vec4(color, 1.);
    }
  `
});
let o = new THREE.Mesh(g, m);
o.rotation.x = -Math.PI * 0.5;
scene.add(o);

let clock = new THREE.Clock();
renderer.setAnimationLoop(() => {
  let t = clock.getElapsedTime();
  m.uniforms.time.value = t * 0.1;
  renderer.render(scene, camera);
});
body {
  overflow: hidden;
  margin: 0;
}
<script src="https://threejs.org/build/three.min.js"></script>
<script src="https://threejs.org/examples/js/controls/OrbitControls.js"></script>

Why does this SKShader display correctly in Xcode particle editor but not in the app I'm building?

I'm experimenting with an SKShader attached to an SKEmitterNode.
The shader just draws circles. The particle emitter is based on the firefly particle template in Xcode; I left it as is aside from adding the shader.
The shader looks how I want it in the Xcode particle editor. I created a simple app based on the SpriteKit game template.
When I compile and run the app with the shader attached to an SKEmitterNode, the circles comprising the particles are clipped. What am I doing wrong?
Here is the didMove(to view:) from my game scene:
override func didMove(to view: SKView) {
    let particlePath = Bundle.main.path(
        forResource: "fireflyparticles", ofType: "sks")
    if particlePath != nil {
        let particleNode =
            NSKeyedUnarchiver.unarchiveObject(withFile: particlePath!)
            as! SKEmitterNode
        let circleShader = SKShader(fileNamed: "Circles.fsh")
        particleNode.shader = circleShader
        particleNode.position = CGPoint(x: frame.midX, y: frame.midY)
        self.addChild(particleNode)
    }
}
and here is the shader:
void main() {
    const float radius = 0.5;
    vec2 position = v_tex_coord - vec2(0.5, 0.5);
    if (length(position) > radius) {
        gl_FragColor = vec4(vec3(0.0), 0.0);
    } else {
        gl_FragColor = vec4(vec3(1.0), 1.0);
    }
}
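For what it's worth, the fragment shader's logic is just a distance test in texture space. Restated in plain JavaScript (a sketch, not SpriteKit API), each texel is opaque white inside a radius-0.5 circle and fully transparent outside it, so the circle math itself is platform-independent:

```javascript
// Returns the alpha the shader assigns to texture coordinate (u, v).
function circleAlpha(u, v) {
  const radius = 0.5;
  const dist = Math.hypot(u - 0.5, v - 0.5); // distance from the texture center
  return dist > radius ? 0.0 : 1.0;
}

console.log(circleAlpha(0.5, 0.5)); // center: opaque
console.log(circleAlpha(0.0, 0.0)); // corner: transparent
```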
Here's an image of the particle editor on the left, compared to the app on the right:
Edit:
It seems this is only an issue in the Mac app. I created a new app targeting iOS and it looks fine there. So it works in the particle editor and on iOS, but not in a macOS app.

Three.js: Black artifacts using SubSurface Scattering shader

I would like to use a ShaderMaterial in Three.js that allows some light to go through the mesh. As I learned recently, the effect I need is called "subsurface scattering". I found several examples, but only a few compute it in real time (without additional mapping). The closest one is shown in this snippet:
var container;
var camera, scene, renderer;
var sssMesh;
var lightSourceMesh;
var sssUniforms;
var clock = new THREE.Clock();

init();
animate();

function init() {
  container = document.getElementById('container');
  camera = new THREE.PerspectiveCamera(40, window.innerWidth / window.innerHeight, 1, 3000);
  camera.position.z = 4;
  camera.position.y = 2;
  camera.rotation.x = -0.45;
  scene = new THREE.Scene();
  var boxGeometry = new THREE.CubeGeometry(0.75, 0.75, 0.75);
  var lightSourceGeometry = new THREE.CubeGeometry(0.1, 0.1, 0.1);
  sssUniforms = {
    u_lightPos: {
      type: "v3",
      value: new THREE.Vector3()
    }
  };
  var sssMaterial = new THREE.ShaderMaterial({
    uniforms: sssUniforms,
    vertexShader: document.getElementById('vertexShader').textContent,
    fragmentShader: document.getElementById('fragment_shader').textContent
  });
  var lightSourceMaterial = new THREE.MeshBasicMaterial();
  sssMesh = new THREE.Mesh(boxGeometry, sssMaterial);
  sssMesh.position.x = 0;
  sssMesh.position.y = 0;
  scene.add(sssMesh);
  lightSourceMesh = new THREE.Mesh(lightSourceGeometry, lightSourceMaterial);
  lightSourceMesh.position.x = 0;
  lightSourceMesh.position.y = 0;
  scene.add(lightSourceMesh);
  renderer = new THREE.WebGLRenderer();
  container.appendChild(renderer.domElement);
  onWindowResize();
  window.addEventListener('resize', onWindowResize, false);
}

function onWindowResize(event) {
  camera.aspect = window.innerWidth / window.innerHeight;
  camera.updateProjectionMatrix();
  renderer.setSize(window.innerWidth, window.innerHeight);
}

function animate() {
  requestAnimationFrame(animate);
  render();
}

function render() {
  var delta = clock.getDelta();
  var lightHeight = Math.sin(clock.elapsedTime * 1.0) * 0.5 + 0.7;
  lightSourceMesh.position.y = lightHeight;
  sssUniforms.u_lightPos.value.y = lightHeight;
  sssMesh.rotation.y += delta * 0.5;
  renderer.render(scene, camera);
}
body {
  color: #ffffff;
  background-color: #050505;
  margin: 0px;
  overflow: hidden;
}
<script src="http://threejs.org/build/three.min.js"></script>
<div id="container"></div>
<script id="fragment_shader" type="x-shader/x-fragment">
varying vec3 v_fragmentPos;
varying vec3 v_normal;
uniform vec3 u_lightPos;

void main(void)
{
    vec3 _LightColor0 = vec3(1.0, 0.5, 0.5);
    float _LightIntensity0 = 0.2;
    vec3 translucencyColor = vec3(0.8, 0.2, 0.2);
    vec3 toLightVector = u_lightPos - v_fragmentPos;
    float lightDistanceSQ = dot(toLightVector, toLightVector);
    vec3 lightDir = normalize(toLightVector);
    float ndotl = max(0.0, dot(v_normal, lightDir));
    float inversendotl = step(0.0, dot(v_normal, -lightDir));
    vec3 lightColor = _LightColor0.rgb * ndotl / lightDistanceSQ * _LightIntensity0;
    vec3 subsurfacecolor = translucencyColor.rgb * inversendotl / lightDistanceSQ * _LightIntensity0;
    vec3 final = subsurfacecolor + lightColor;
    gl_FragColor = vec4(final, 1.0);
}
</script>
<script id="vertexShader" type="x-shader/x-vertex">
varying vec3 v_fragmentPos;
varying vec3 v_normal;

void main()
{
    vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );
    v_fragmentPos = (modelMatrix * vec4( position, 1.0 )).xyz;
    v_normal = (modelMatrix * vec4( normal, 0.0 )).xyz;
    gl_Position = projectionMatrix * mvPosition;
}
</script>
The code above works perfectly on simple geometry (a cube in this case), but it draws some strange black faces on more complex meshes. The final objective is to "illuminate" heightmap terrains, but applying this same code I get this:
As you can see, the glass effect is very nice in the lit areas, but in the darker ones (where the light doesn't reach as well) it draws some fully black faces that I don't know how to avoid.
I consider myself a medium-level Three.js user, but this is the first time I've played with shaders, and it's not easy, to be fair. Every time I change something in the shader code, the result is visual garbage. The only thing I've changed successfully is the light colors (both front and back). I also tried increasing the light intensity, but that doesn't help; it only "burns" the shader more, making even more black faces appear.
Can anyone point me in the right direction? What is this effect called in the first place? I don't even know how to search for it in WebGL resources. Any help would be appreciated!
EDIT: It seems that making the terrain thicker (with a higher Z scale) solves the issue. Maybe this has something to do with the angle of the light against the faces? This also happens in the original snippet just as the light enters the cube (you can see a fully black face for a frame). Or maybe I'm just talking nonsense. This is easily the hardest piece of code I've faced in years, and it's just 10 lines! I just want the shader to look as good at the original scale as it does at the thicker one. But the complexity of the physics involved in these formulas is beyond me.
There are a few places where you potentially divide by zero in your pixel shader, which often shows up as black (lines 3 and 4 of the excerpt below):
float ndotl = max(0.0, dot(v_normal, lightDir));
float inversendotl = step(0.0, dot(v_normal, -lightDir));
vec3 lightColor = _LightColor0.rgb * ndotl / lightDistanceSQ * _LightIntensity0;
vec3 subsurfacecolor = translucencyColor.rgb * inversendotl / lightDistanceSQ * _LightIntensity0;
vec3 final = subsurfacecolor + lightColor;
Both ndotl and inversendotl can have a value of 0 (and pretty often, since you clamp them in both cases). Either of the subsequent terms can then go to zero or blow up, rendering black. Try commenting out either subsurfacecolor or lightColor to isolate the faulty one (it might be both).
max() can be kept non-zero by adding a tiny delta: max(0.0001, x).
step() can be replaced with static branching, which also gets rid of the extra dot product: since you just negate the vector, the result is simply the opposite sign.
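The suggested clamp can be checked numerically. Below is a plain-JavaScript sketch of one lighting term with and without the epsilon (function names are illustrative; clamping the squared distance as well is my own addition, guarding the case where the light sits on the surface):

```javascript
// Unclamped term: goes fully black when ndotl is 0, and blows up as distSq -> 0.
function lightTerm(ndotl, distSq, intensity) {
  return (ndotl / distSq) * intensity;
}

// Clamped variant: keeps a tiny positive contribution and a bounded denominator.
function lightTermSafe(ndotl, distSq, intensity) {
  return (Math.max(0.0001, ndotl) / Math.max(0.0001, distSq)) * intensity;
}

console.log(lightTerm(0.0, 1.0, 0.2));     // 0 — renders black
console.log(lightTermSafe(0.0, 1.0, 0.2)); // small but non-zero
```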

cocos2d :my rotation with the accelerometer is not smooth

So I use the accelerometer in cocos2d to rotate my sprite, but the rotation isn't smooth at all. I know that I have to use a filter, but I don't know how to integrate it into my code:
-(id) init
{
    if ((self = [super init])) {
        self.isAccelerometerEnabled = YES;
        // note: 1/60 is integer division (0); use a float literal
        [[UIAccelerometer sharedAccelerometer] setUpdateInterval:1.0/60];
    }
    return self;
}
- (void) accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration {
    ombreoeuf1.rotation = acceleration.y * 90;
}
Sorry for my English, I'm French :/
Here's how to implement a lowpass filter. Experiment a bit with kFilteringFactor until you get nice results.
// Declare a float `accelY` in your class interface and set it to 0 in init
-(void) accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration {
    float kFilteringFactor = 0.1;
    accelY = (acceleration.y * kFilteringFactor) + (accelY * (1.0 - kFilteringFactor));
    ombreoeuf1.rotation = accelY * 90;
}
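The filter above is a standard exponential moving average: each new reading only nudges the stored value by kFilteringFactor, so jitter is averaged away while slow tilts still come through. The same math in plain JavaScript (illustrative, outside cocos2d):

```javascript
// One lowpass step: blend the new raw reading into the running value.
function lowpass(prev, raw, kFilteringFactor) {
  return raw * kFilteringFactor + prev * (1.0 - kFilteringFactor);
}

// Feed a constant reading: the filtered value converges smoothly instead of jumping.
let accelY = 0;
const trace = [];
for (let i = 0; i < 100; i++) {
  accelY = lowpass(accelY, 1.0, 0.1);
  trace.push(accelY);
}
console.log(trace[0], trace[9], trace[99]); // gradual approach toward 1.0
```

A smaller kFilteringFactor gives smoother but laggier rotation; a larger one tracks the device more tightly but lets more jitter through.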
One thing that might help you with the smoothness is to set the update interval to 30 updates per second instead of 60; update your init to:
-(id) init
{
    if ((self = [super init])) {
        self.isAccelerometerEnabled = YES;
        [[UIAccelerometer sharedAccelerometer] setUpdateInterval:1.0/30];
    }
    return self;
}
