UE4: How to get an actor or surface to change its material when clicked?

I have a few questions about UE4. I'd like to have an actor's or a landscape's material change when a specific item is used on it.
For example: if I use a hoe on grass, the grass material should be replaced by a dirt material.
My problem is: how do I get the specific actor, or (part of) the landscape, that I'm hitting?
I'm working with Blueprints in UE4, since I've only just begun studying. I'm looking for the simplest solution, so I can improve from there.

AFAIK you cannot change a landscape material at runtime, unless you do some serious modification to the engine. I would suggest making a dynamic mesh plane and fitting it to the underlying landscape mesh structure, then applying your "hoed" material to it. Still needs a programmer or at least some code experience, though.
As for changing an actor's material - at runtime (in the BeginPlay node), create and store two Dynamic Material Instances. Set one material instance to the actor's default material and immediately apply it to the mesh. Set the other material instance to whatever you need. At the appropriate time, set your actor's mesh material to this second Dynamic Material Instance and voila!

Related

Transparent light-blocking Objects

I want to render a room with a floor and roof that is open to one side. The room contains a point light, and the "outside" is lit by an ambient light (the sun). There is one additional requirement: the user should be able to look inside the room to see what's going on. But I cannot simply remove the roof, because then the room is fully lit by the ambient light.
I think my problem could be solved by having 3D objects that are transparent but still block light.
To give you an idea of my current scene, this is what it looks like:
The grey thing is the wall of my room. The black thing is the floor of the room. The green thing is the ground of the scene. The room contains a point light.
I am currently using two scenes (see Exclude Area from Directional/Ambient Lighting) because I wanted the inside of the room to be unaffected by the ambient light. But now my lights can only affect either the inside of my room (the point light) OR the outside (the ambient light) but not both.
A runnable sample of my scene can be found here:
https://codesandbox.io/s/confident-worker-64kg7m?file=/src/index.js
Again: I think that my problem could be solved by having transparent objects that still block light. If I had that, I would simply put a 3D plane on top of my room (as the roof) and make it transparent... It would block the light that is inside of the room (but still let it out where the room is open), and it would also block the ambient light (partially, if the room is open)...
Maybe there is also another solution that I am not seeing.
Just use one scene instead of two, then enable shadows across the relevant meshes so a light doesn't cross from inside to outside. Once you're using only one scene, the steps to take in your demo are:
Disable AmbientLight, and use DirectionalLight only, since AmbientLight illuminates everything indiscriminately, and that's not what you want.
Place the directional light above your structure, so it shines from the top-down.
Enable shadow-casting on the walls (a sketch of these three steps follows the ceiling code below).
Add a ceiling mesh with the material's side set to side: THREE.BackSide. This will only render the back side of the Mesh, which means it won't be visible from above, but it will still cast shadows.
import { Mesh, MeshStandardMaterial, BackSide } from "three";

const roomCeilMat = new MeshStandardMaterial({
  side: BackSide // render only back faces: invisible from above, still casts shadows
});
// reuse the floor geometry from the demo for the ceiling
const roomCeiling = new Mesh(roomFloorGeo, roomCeilMat);
roomCeiling.position.set(0, 0, 1); // place it at the top of the walls
roomCeiling.castShadow = true;
scene1.add(roomCeiling);
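And a minimal sketch of the first three steps (the renderer, wall, and floor identifiers are assumptions about your demo):
import { DirectionalLight } from "three";

renderer.shadowMap.enabled = true; // shadow mapping must be switched on globally

const sun = new DirectionalLight(0xffffff, 1);
sun.position.set(0, 0, 10); // above the structure, shining top-down (the demo appears z-up, given the ceiling at z = 1)
sun.castShadow = true;
scene1.add(sun);

wallMesh.castShadow = true;     // walls now block light crossing inside/outside
floorMesh.receiveShadow = true; // the floor shows the resulting shadows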
See here for a working copy of your demo:
https://codesandbox.io/s/stupefied-williams-qd7jmi?file=/src/index.js
I would assign a flat, emissive material to the room (or a depth gradient if it becomes terrain), since ambient light doesn't cast shadows. It saves a light and extra geometry or groups, and web model viewers would probably render it better. If you're doing a reveal transition, use a clip plane or a texture alpha mask.
It depends on the presentation versus the output format, and on the complexity of the final floorplan. If your process is simple, it will run Sims Lite on a Raspberry Voxel.

How to add a mesh collider to imported 3D object in A-Frame?

I am working on an A-Frame project where 3D objects are loaded from .obj files. However, the raycaster won't work with the imported objects. I suspect the reason is that, unlike built-in geometries, these imported objects don't have a proper collider set up. Is it possible to add mesh colliders to a generic 3D object, like in Unity? Or is there some other reason the raycaster won't work on these objects?
Meshes do work with the raycaster, so if you have issues, here are two quick solutions:
Make sure the mesh is indeed properly sized and centered. Chances are it's not, and the bounding box is incorrect, so you have to aim at specific parts of the mesh to make it work. Using the inspector lets you see the bounding box. You can then use 3D modeling software such as Blender to fix the mesh.
If the bounding box feels too small for natural interaction, you can add an invisible transparent object, e.g. a sphere or box, that the raycaster will then interact with instead (see the sketch below). This solution only makes sense if you don't mind the volume being imperfect.
Also make sure there is no object between the origin of the raycaster and the mesh! It's a silly problem, but sometimes we forget that we add/remove objects by making them transparent and... they prevent the raycaster from interacting with the object behind them.
PS: if you want a collider in addition to the raycaster, there is the aabb-collider component.
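A minimal sketch of the invisible-collider idea from the second point above (the model path, sizes, and class name are all illustrative):
<a-scene>
  <a-assets>
    <a-asset-item id="treeObj" src="tree.obj"></a-asset-item>
  </a-assets>
  <a-entity obj-model="obj: #treeObj" position="0 1 -3"></a-entity>
  <!-- invisible box roughly matching the model's volume; it is what the raycaster hits -->
  <a-box class="collidable" position="0 1 -3" width="1" height="2" depth="1"
         material="transparent: true; opacity: 0"></a-box>
  <a-entity cursor="rayOrigin: mouse" raycaster="objects: .collidable"></a-entity>
</a-scene>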

Three.js, sharing ShaderMaterial between meshes but with different uniform sets

As the title says, I would like to reuse a given ShaderMaterial for different meshes, but with a different set of uniforms for each mesh (in fact, some uniforms may vary between meshes, but not necessarily all of them): is it possible?
It seems a waste of resources to me to create a full ShaderMaterial for each mesh in this circumstance; the idea is to have a single vertex/fragment shader program but to configure it through different uniforms, whose values would change depending on the mesh. If I create a new ShaderMaterial for each mesh, I will end up with a lot of duplication (vertex + fragment programs, plus all the other data members of the Material / ShaderMaterial classes).
If the engine was able to call a callback before drawing a mesh, I could change the uniforms and achieve what I want to do. Another possibility would be to have a "LiteShaderMaterial" which would hold a pointer to the shared ShaderMaterial + only the specific uniforms for my mesh.
Note that my question is related to this one: Many meshes with the same geometry and material, can I change their colors? But it is still different, as I'm mostly concerned about the waste of resources. Performance-wise, I don't think there would be much difference between multiple ShaderMaterials and a single one, as the engine should be smart enough to notice that all the materials share the same programs and not resend them to the graphics card.
Thanks
When cloning a ShaderMaterial, the attributes and vertex/fragment programs are copied by reference. Only the uniforms are copied by value, which is what you want.
This should work efficiently.
You can prove it to yourself by creating a ShaderMaterial and then using ShaderMaterial.clone() to clone it for each mesh. Then assign each material unique uniform values.
In the console, type "renderer.info". It should show 1 program.
three.js r.64
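A minimal sketch of the cloning approach, written with current three.js syntax (vertexSrc, fragmentSrc, geometry, and renderer are assumed to exist):
// One base material; clones share the compiled program but own their uniforms.
const baseMaterial = new THREE.ShaderMaterial({
  uniforms: { color: { value: new THREE.Color(0xff0000) } },
  vertexShader: vertexSrc,
  fragmentShader: fragmentSrc
});
const meshA = new THREE.Mesh(geometry, baseMaterial);

const materialB = baseMaterial.clone();       // programs are shared by reference
materialB.uniforms.color.value.set(0x00ff00); // uniforms are per-clone copies
const meshB = new THREE.Mesh(geometry, materialB);

console.log(renderer.info.programs.length);   // 1: a single shared program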
You can safely create multiple ShaderMaterial instances with the same parameters, with clone or otherwise. Three.js will do some extra checks as a consequence of material.needsUpdate being initially true for each instance, but then it will be able to reuse the same program for all instances.
In newer releases, another option is to use a single ShaderMaterial but apply changes to uniforms in the objects' onBeforeRender functions. This avoids unnecessary calls to initMaterial in the renderer, but whether or not it is faster overall would have to be tested. It may be a risky solution if you modify too much before rendering, as in the worst case the single material could have to be recompiled multiple times during the render. I recommend this guide for further tips.
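A sketch of that single-material approach (the identifiers are illustrative; uniforms are uploaded per draw call, so writing them in onBeforeRender affects only the mesh about to be drawn):
// One ShaderMaterial shared by both meshes; per-mesh values are pushed
// into the uniforms just before each mesh is rendered.
const shared = new THREE.ShaderMaterial({
  uniforms: { color: { value: new THREE.Color() } },
  vertexShader: vertexSrc,
  fragmentShader: fragmentSrc
});

meshA.material = shared;
meshA.onBeforeRender = () => shared.uniforms.color.value.set(0xff0000);

meshB.material = shared;
meshB.onBeforeRender = () => shared.uniforms.color.value.set(0x00ff00);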

THREE.PointLight

I create a THREE.PlaneGeometry with height displacement and place a THREE.PointLight at the highest point, but it illuminates areas that cannot be seen from that point.
Why?
I want the point light to illuminate only the areas that are visible from it.
By default, the appearance of any given point on a surface is calculated using the lights, their properties and of course the material properties - it does not take the rest of the scene into account, as that would be very computationally expensive. Various ray tracing renderers do this, but they are really slow, and that's not how WebGL and Three.js work.
What you want is shadows. Three.js is capable of rendering shadows using the shadow map method. There are various examples of using shadow maps, both on the net and in the Three.js examples folder.
A word of warning, though: getting shadows to work well can be hard if you don't have the basics down, and you may need to do some studying. Shadows can slow your application down (especially with many lights) and look ugly if not properly configured and fine-tuned. Also, at the time of writing, shadow maps are only supported for SpotLight and DirectionalLight; PointLights are trickier (newer three.js releases do support point-light shadows via cube shadow maps).
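As a rough sketch of the shadow setup with current three.js (the renderer, scene, and terrain identifiers are assumptions):
renderer.shadowMap.enabled = true;  // shadow mapping must be enabled globally

const light = new THREE.PointLight(0xffffff, 1);
light.position.set(0, 50, 0);       // the highest point of the terrain
light.castShadow = true;            // render a (cube) shadow map for this light
scene.add(light);

terrain.castShadow = true;          // the displaced PlaneGeometry mesh
terrain.receiveShadow = true;       // darken the areas hidden from the light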

Particles vs ParticleSystem in three.js

I'm struggling with a visualization I'm working on that involves a stream of repeated images. I have it working with a single sprite with a ParticleSystem, but I can only apply a single material to the system. Since I want to choose between textures I tried creating a pool of Particle objects so that I could choose the materials individually, but I can't get an individual Particle to show up with the WebGL renderer.
This is my first foray into WebGL/Three.js, so I'm probably doing something bone-headed, but I thought it would be worth asking what the proper way to go about this is. I'm seeing three possibilities:
I'm using Particle wrong (initializing with a mapped material, adding to the scene, setting position) and I need to fix what I'm doing.
I need a ParticleSystem for each sprite I want to display.
What I'm doing doesn't fit into particles at all and I really should be using another object type.
All the examples I see using the canvas renderer use Particle directly, but I can't find an example using the WebGL renderer that doesn't use ParticleSystem. Any hints?
OK, I am going from what I have read elsewhere on this GitHub issues page; you should start by reading it. It seems that Particle is simply for the CanvasRenderer, and it will become Sprite in a future edition of Three.js. ParticleSystem, however, is not going to fulfill your needs either, it seems. I don't think these classes are going to help you accomplish this in WebGL in 3D. Depending on what you are doing, you might be better off with the CanvasRenderer anyway. ParticleSystem will only allow you to apply a single material, which serves as the material for each particle in the system, as you suggested.
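For what it's worth, in the later releases where Particle became THREE.Sprite, per-image sprites do work with the WebGLRenderer, each with its own material. A rough sketch (the texture paths are placeholders):
const loader = new THREE.TextureLoader();

// Each sprite owns its material, so each can show a different texture.
const spriteA = new THREE.Sprite(new THREE.SpriteMaterial({ map: loader.load('imageA.png') }));
spriteA.position.set(-1, 0, 0);
scene.add(spriteA);

const spriteB = new THREE.Sprite(new THREE.SpriteMaterial({ map: loader.load('imageB.png') }));
spriteB.position.set(1, 0, 0);
scene.add(spriteB);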
Short answer:
You can render THREE.Particle using THREE.CanvasRenderer only.
