I cannot get a 3D model with texture transparencies to be rendered correctly in Xcode.
The 3D model of hair consists of two geometries (hair and cap) and was created in Maya. It was bought off-the-shelf from here.
The correctly rendered model looks like this (without the head):
I exported the model to COLLADA format (DAE), put it into a model.scnassets folder together with its textures, and added it to Xcode. However, in the Xcode Scene Editor it is rendered like this:
What is wrong here?
Update:
Setting Transparent > Intensity = 5, Settings > Transparency > Blend Mode = Double Sided, and Settings > Transparency > Options > Writes depth = false gives this image, where the hair polygons in front of the blue sphere are rendered correctly, but above the blue sphere the rendered hair polygons are not the ones nearest to the camera; they seem to be the ones on the back side of the hair model. This is apparently caused by Writes depth = false, but disabling it seems to be necessary to render the semi-transparent hair. (I am using a brown texture here instead of the red one, but the result is the same regardless of the chosen texture.)
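For reference, those inspector settings map to SceneKit material properties roughly like this (a sketch; hairMaterial is a stand-in for the hair geometry's material, and "Double Sided" is taken to be the material's double-sided flag):

import SceneKit

let hairMaterial = SCNMaterial()          // stand-in for the hair geometry's material
hairMaterial.isDoubleSided = true         // "Double Sided" in the inspector
hairMaterial.transparent.intensity = 5    // Transparent > Intensity
hairMaterial.writesToDepthBuffer = false  // "Writes depth = false": fixes the blending,
                                          // but back faces can now draw over front ones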
If you want the Transparent slot to give a correct result in the Xcode Scene Editor, you need to use premultiplied TIFF or PNG files for the transparent parts of your 3D object. TIFF and PNG can store four channels (RGBA), whereas JPEG can store only three (RGB).
Premultiplied means the RGB channels are already multiplied by the alpha value and stored together with it (RGB * A, A); the opposite is unpremultiplied, or straight, alpha (RGB, A).
After applying the premultiplied texture to the Transparent slot, set its Intensity property to a value in the range 1...5 and set the Components property to the desired channel in the drop-down menu (All, Red, Green, Blue, Alpha).
From the Apple documentation:
By default, the transparent property’s contents object is a fully opaque black color, causing the property to have no visible effect. Setting the transparent property’s contents to any solid color uniformly fades the opacity of the material based on that color’s opacity value. To make parts of a material appear transparent, set the property’s contents to an image or other texture-mapped content whose alpha channel defines areas of full or partial opacity.
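A minimal code equivalent of those inspector settings might look like this (a sketch; the texture file name is a placeholder):

import SceneKit
import UIKit

let material = SCNMaterial()
// Premultiplied RGBA texture (placeholder file name)
material.transparent.contents = UIImage(named: "hair_transparency.png")
material.transparent.intensity = 1.0
// Take opacity from the texture's alpha channel ("Alpha" in the Components menu)
material.transparencyMode = .aOne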
I have a texture and I want to draw it to a FrameBufferObject. The texture has transparent areas. I would like the following:
For all transparent pixels, let them be drawn transparent in the FrameBufferObject.
For all non-transparent pixels, let them be drawn using a color of my choosing, i.e. pure red (ignoring their actual RGB values)
I've tried Batch.setColor(red) before drawing the texture, but that just tints it - I want all non-transparent pixels to be drawn pure red.
I am also trying to figure out how to achieve this directly in OpenGL; it looks like there may be a way to do it with blending, which could then be carried back to libGDX.
Thanks
You need to write a shader; the shader is the OpenGL program that renders the texture. Your shader would render the transparent pixels unchanged and everything else in the color of your choosing.
The following link has libGDX shaders for palette swapping, i.e. direct reassignment of colors at render time, which you can easily adapt to a single color plus transparency.
https://www.javaer101.com/en/article/12241616.html
I'm using a prefab for a box shape, which has a front and back plane.
My images are PNG and have transparent areas around the edge. I dragged the image onto my front plane, which now has a drop-down box for "Shader".
First I chose Shader: "Standard" but the transparent areas of my PNG image weren't transparent, so in order to fix that I changed it to "Sprites / Diffuse"... now the image looks fine (from the front).
However, when I rotate the shape, the image is also visible from the back. I want a way to not see the image / texture from the back.
How can I make the images only visible from the front side of a plane, whilst also preserving the transparency areas of the image / texture?
If you are using the built-in Standard shader, you need to set the Rendering Mode to Transparent in order for the texture's alpha channel to be rendered as transparency. The Sprites/Diffuse shader, by default, forces the rendering of otherwise invisible back faces, whereas the Standard shader does not.
Is there a way to correctly display more than one transparent texture in three.js? There are no problems if you render a transparent texture on a non-transparent textured plane below it, but if you have more than one transparent plane, the nearest one "deletes" the ones below it, as you can see here:
The left picture shows what I want (achieved by adding depthWrite = false to every transparent material); the right one shows what I get by only setting transparent = true on the materials with RGBA textures.
I have already tried alphaTest, but it isn't what I need, and depthWrite sometimes can't satisfy my needs either (look at the green line bounding the path in the first screenshot, which isn't covered by the house's shadow).
I am trying to cast a shadow on a totally transparent plane in SceneKit on OS X. I have been struggling with this problem for several hours and have not found any solution.
My goal is to generate a screenshot of several objects with a transparent background and just the shadow on an invisible plane.
Do you have any suggestions on how I can do this with Apple's SceneKit?
Do I have to write my own shader, can I make this work with shader modifiers, or can I use built-in functionality?
UPDATE:
I found an alternative solution for anyone who needs it:
Create a white plane under the 3D model; note that the color of the plane must be pure white.
Set the blend mode of the plane's material to SCNBlendModeMultiply.
Set the lighting model of the plane's material to SCNLightingModelLambert.
This works because any color multiplied by white (1, 1, 1) returns itself, and the Lambert lighting model does not take the directional light into account, so the plane always shows the background color, which makes it look transparent. Another benefit of this solution is that you don't need to change the light's shadow rendering mode.
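A minimal sketch of those three steps in code (the plane size is arbitrary, and `scene` stands for your existing SCNScene):

import SceneKit
import UIKit

let scene = SCNScene()                       // your existing scene in practice
let plane = SCNNode(geometry: SCNPlane(width: 10, height: 10))
plane.eulerAngles.x = -.pi / 2               // lay it flat under the model
let material = plane.geometry!.firstMaterial!
material.diffuse.contents = UIColor.white    // 1. must be pure white
material.blendMode = .multiply               // 2. SCNBlendModeMultiply
material.lightingModel = .lambert            // 3. SCNLightingModelLambert
scene.rootNode.addChildNode(plane)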
For people who are used to the Xcode inspector:
According to SceneKit: What's New.
First, add a plane under your model, then prevent it from writing to the color buffer.
Second, change your light's shadow rendering mode to deferred. Note that you must use a light that can cast shadows.
Oily Guo, your solution works. Here is the solution in code:
Configuration of the light source:
// Semi-transparent black shadow, rendered in a deferred pass
light.shadowColor = UIColor(red: 0, green: 0, blue: 0, alpha: 0.4)
light.shadowMode = .deferred
And for the floor (i.e. the SCNFloor underneath your objects):
// Pure white floor that writes no color of its own but still receives the shadow
material.diffuse.contents = UIColor.white
material.colorBufferWriteMask = SCNColorMask(rawValue: 0)  // i.e. an empty color mask
I do not have an answer to your question, however I have a workaround:
Render your scene and keep the image in memory
Change all the materials in your object to pure black, with no specular
Change the plane and the sky to a fully white material, and the lights to white
Render the scene to another image
On the second image, apply the CIColorInvert and CIMaskToAlpha Core Image filters
Using Core Image, apply the alpha mask to the first render
You'll get an image with a correct Alpha channel, and transparent shadows. You will need to tweak the materials and lights to get the results you want.
The shadow may become lighter on the edges, and the only way around that is rendering it as yet another image, and filling it with black after the Mask to Alpha step.
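Here is a hedged sketch of that compositing step with Core Image (beautyPass and mattePass are assumed to be the two renders from the steps above, as CGImages):

import CoreImage

func composite(beautyPass: CGImage, mattePass: CGImage) -> CGImage? {
    let context = CIContext()
    // Matte render: black objects and shadow on a white background
    let matte = CIImage(cgImage: mattePass)
    // Invert so objects/shadow become white, then map luminance to alpha
    let mask = matte
        .applyingFilter("CIColorInvert")
        .applyingFilter("CIMaskToAlpha")
    // Use the mask as the beauty pass's alpha channel
    let beauty = CIImage(cgImage: beautyPass)
    let masked = beauty.applyingFilter("CIBlendWithAlphaMask", parameters: [
        kCIInputBackgroundImageKey: CIImage.empty(),
        kCIInputMaskImageKey: mask
    ])
    return context.createCGImage(masked, from: beauty.extent)
}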
The color of a black image, or the black part of an image, cannot be changed.
I set the image's color to red, but the image is still black.
Is that intended behavior?
What I want is for the image to change to red.
The Unity version is 5.0.1f.
I am using the new UI.
To understand the source of the problem, you have to understand how "changing colors" works. It's nothing but a simple multiplication. In RGB terms, black is the vector (0, 0, 0), and it's pretty obvious that whatever you multiply 0 by, it stays 0, so black stays black.
If you want the template image to be able to change to any color, use white.
To modify colors in a more complex way, you have to understand how the Color property of an Image component works. The UI system hides a lot of complexity underneath (and that's good). Basically, Color modifies the vertex colors of a mesh. Since you don't usually specify a material, a default sprite material is used, and it uses the default sprite shader. Inside this shader, when it paints pixels on the screen, it multiplies the vertex colors by the texture color sampled at each pixel, and that's how it produces the end result. If you want the colors to be combined in a different way, you'll have to write a custom shader, which is really not that hard, but you probably don't need it for what you're trying to do in the scope of this question.
Black stays black.
White, however, will change with whatever Color you choose, so if you use an image editor and make your image be white instead of black, the image can then be whatever color you choose within Unity.
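As a quick illustration of that per-channel multiplication (plain arithmetic, not Unity API):

struct RGB { var r, g, b: Double }

// Tinting multiplies the texture color by the tint color, channel by channel
func tint(_ texel: RGB, by color: RGB) -> RGB {
    RGB(r: texel.r * color.r, g: texel.g * color.g, b: texel.b * color.b)
}

let red = RGB(r: 1, g: 0, b: 0)
print(tint(RGB(r: 0, g: 0, b: 0), by: red)) // black stays black: (0, 0, 0)
print(tint(RGB(r: 1, g: 1, b: 1), by: red)) // white takes the tint: (1, 0, 0)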
Check to see if your image has an alpha channel. You can do this with GIMP or Photoshop. If it does, then check your shader/material to ensure that it's set to Transparent/Diffuse (and not just Diffuse); Diffuse alone doesn't apply the alpha channel to the material.
Also, it doesn't look like you assigned a material...
Make sure you have used yourImage.canvasRenderer.SetColor.
For example:
damageAnim.canvasRenderer.SetColor(new Color(1f, 1f, 1f, 1f)); // Color takes floats in 0..1 (use Color32 for 0..255 values)