How can I see the whole shader's text content, with all the prepended code by Three.js, when using ShaderMaterial or RawShaderMaterial (r148)? - three.js

I know that Three.js prepends the shader text you provide with some code that includes definitions etc. I am not a big fan of "magic" and I would like to be able to see the final text content of the shader(s) that Three.js actually uses.
I have seen that in earlier releases of Three.js, if there was an error in a shader file (either the vertex shader or the fragment shader), the whole text content of the shader was logged in the console, including all the code prepended by Three.js.
I am currently using r148, and in case of an error, only the line(s) containing the error(s) get logged in the console.
Has the functionality of logging the whole shader file in console been removed or is there any way to enable it? Or, more generally, is there any other way to access the actual text content of each of the shaders?
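For reference, one approach I'm considering is pulling the source back out of the WebGL context itself. This is only a sketch: `getAttachedShaders`/`getShaderSource` are core WebGL calls, but the assumption that each entry of `renderer.info.programs` exposes the raw GL program as `.program` is mine, not documented API.

```javascript
// Return the full text of every shader attached to a compiled program,
// i.e. the exact strings handed to the GPU driver, prologue included.
// gl is the WebGLRenderingContext, glProgram a raw WebGLProgram handle.
function dumpShaderSources(gl, glProgram) {
  return gl.getAttachedShaders(glProgram).map((s) => gl.getShaderSource(s));
}

// Usage sketch with three.js (the .program property is an assumption):
// const gl = renderer.getContext();
// for (const entry of renderer.info.programs) {
//   const [vertexSrc, fragmentSrc] = dumpShaderSources(gl, entry.program);
//   console.log(vertexSrc, fragmentSrc);
// }
```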

Related

How do you get Xcode to use syntax editing and code completion on Metal Shader Modifiers?

Beginning my adventure down the path of learning all about both Metal and shaders. Currently I'm using this example from here to experiment with.
As you can see, he's storing his shader in a separate .txt file. While I like the idea of the shader being a resource, I don't like that, because it's plain text, I lose all syntax highlighting, code completion, etc.
I attempted to change the extension to .metal, and Xcode then recognized it as such, with color-coding, but now I can't build the app: Xcode says there are tons of errors in it, which clearly isn't correct, as it works just fine as .txt. To prove that, I simply changed it back to .txt and everything started working as before, though again with no highlighting or code completion.
Note: I also tried leaving it as text but changing the type in the inspector to 'metal', but that didn't work either.
I keep hearing such great things about the metal debugger, but I can't even figure out how to get the metal editor working! Help!
So... how can I include my shaders as separate .metal files (or comparable) and still be able to edit/run with it, both literally and figuratively?
In the example that you provided, @warrenm used shaderModifiers:
A shader modifier is a snippet of source code in the Metal shading language or OpenGL Shading Language (GLSL) that SceneKit injects into its own shader programs at a defined entry point.
A .metal file is a complete vertex/fragment/compute implementation, which is not what shaderModifiers are.
Instead, you can create an SCNProgram:
A complete Metal or OpenGL shader program that replaces SceneKit's rendering of a geometry or material.
You should be able to manually set the file type in the File Inspector:

Issues adding texture to Reality Converter?

Is anyone having issues with Reality Converter by Apple? Mainly, when I add an .obj file, it’s able to display the white object. However, when I go ahead and add texture .png files to the Materials folder, nothing gets updated. I end up with a plain white 3D object (even after restarting/exporting).
The only way it works is if I upload a .glTF folder, where it will actually add in the textures/colors.
Not sure if this is a glitch? Or if I’m doing something wrong?
In order to apply a texture to an .obj file, you need not only the texture file but also its inseparable companion: an .mtl file (Material Template Library), a special material-definitions file for .obj.
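To illustrate, the two files reference each other like this (filenames and the material name are placeholders):

```
# cube.obj (excerpt) – points at its material library and picks a material
mtllib cube.mtl
usemtl painted

# cube.mtl – a minimal material that maps a texture onto the mesh
newmtl painted
Kd 1.0 1.0 1.0
map_Kd texture.png
```

If the .mtl is missing, or doesn't name the texture via map_Kd, most importers fall back to a plain untextured material, which matches the white object you're seeing.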

Why does my three.js canvas masking not work?

I'm trying to reproduce the example by Szenia Zadvornykh presented here: https://medium.com/@Zadvorsky/webgl-masking-composition-75b82dd4cdfd
His demo is based on three.js r80, so I referenced r101, tried to remove most of the unrelated parts, and kept just a scene with a grid and a PNG mask on top.
Here's my code: http://jsfiddle.net/mmalex/y2kt3Lrf/
With // composer.addPass(maskPass) commented out, the grid shows up. But it does not seem that the uniform sampler2D tDiffuse receives the output of the render pass.
I expect to see the grid helper and underlying HTML content under the canvas, where mask makes canvas transparent.
UPDATE: working now, thanks to @Mugen87: http://jsfiddle.net/mmalex/y2kt3Lrf/
There is a mismatch of files in your fiddle. If I use the latest version of three.js and the respective post processing classes, your code works fine:
http://jsfiddle.net/pk24zby7/
three.js deprecated the renderTarget and forceClear parameters of WebGLRenderer.render() in r102. When that change was made, many files in the examples directory had to be updated in order to avoid warnings and even breakage. So using the latest post-processing classes with an older three.js version does not work.
Since the change is listed in the release notes, I suggest you read the respective PRs for more details.
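One way to avoid such mismatches is to pin the core library and the example classes to the same release when including them. The URLs below illustrate the pattern (CDN and version number are examples, not a specific recommendation):

```html
<!-- core library and post-processing classes from the same pinned release -->
<script src="https://unpkg.com/three@0.101.1/build/three.min.js"></script>
<script src="https://unpkg.com/three@0.101.1/examples/js/postprocessing/EffectComposer.js"></script>
<script src="https://unpkg.com/three@0.101.1/examples/js/postprocessing/RenderPass.js"></script>
```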

Blender MakeHuman to Three.js

I'm trying to integrate an animated 3D character into a web browser.
I use MakeHuman 1.02 to create a character, which I import into Blender 2.74 in .mhx format.
I retarget it to a BVH using the MakeWalk plugin for Blender; this provides the motion.
When I try to export the character in .json format (three.js), the following error appears:
MakeHuman is not a valid mesh object.
A mesh object is an object whose properties or vertices we can modify, isn't it?
I tried other formats like .dae (COLLADA), but it seems that the browser doesn't find the skeleton and the textures of the character (even though they are in the same directory) necessary for the character's motion.
How can I get the character as a mesh object? Or does somebody know another process that works?
As Erica pointed out, you need to have a mesh selected to export it. The problem is that this doesn't seem to work if you have multiple meshes: only one will export. That is an issue when using MakeHuman, because its clothes are separate meshes.
One way to fix this is to select all meshes and combine them into one (I believe that's CTRL + J). However, you'd have to somehow merge all your texture files into one big one and I have no idea how to do that.
What I do is to export the entire scene. Then it doesn't matter what is selected. All meshes get exported. You can load it using either the ColladaLoader, which I would recommend since you're retargeting to a BVH (worked great for me), or the new ObjectLoader.
If you have your own Scene object on the page that you want to use, you can still load the scene created by the exporter, traverse it to get the items you care about, and add those items to your scene that will display on the page.
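The "traverse and pick out" step can be sketched without three.js at all. `Object3D.traverse` visits a node and then recurses into its children, and meshes carry an `isMesh` flag on three.js objects; the helper names below are made up for illustration:

```javascript
// Depth-first visit, like three.js Object3D.traverse: the callback sees
// the node itself first, then every descendant.
function traverse(node, visit) {
  visit(node);
  for (const child of node.children || []) traverse(child, visit);
}

// Collect every mesh in a loaded scene so the meshes can be added to your
// own scene afterwards (adding while traversing would mutate the tree
// mid-walk, so collect first).
function collectMeshes(root) {
  const meshes = [];
  traverse(root, (node) => { if (node.isMesh) meshes.push(node); });
  return meshes;
}
```

With a real loader you would then do something like `collectMeshes(collada.scene).forEach((m) => myScene.add(m))`.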

How to import Blender 3D animation to iPhone OpenGL ES?

I am trying to do animations on iPhone using OpenGL ES. I am able to do the animation in the Blender 3D software. I can export a .obj file from Blender to OpenGL, and it works on iPhone.
But I am not able to export my animation work from Blender 3D to OpenGL. Can anyone please help me to solve this?
If you have a look at this article by Jeff LaMarche, you'll find a blender script that will output a 3D model to a C header file. There's also a followup article that improves upon the aforementioned script.
After you've run the script, it's as simple as including the header in your source, and passing the array of vertices through your drawing function. Ideally you'd want a method of loading arbitrary model files at runtime, but for prototyping this method is the simplest to implement.
Seeing as you already have a method of importing models (obj) then the above may not apply. However, the advantage of using a blender script is that you can then modify the script to suit your own needs, perhaps also exporting bone information or model keyframes.
Well first off, I wouldn't recommend .obj for this purpose since the obj file format doesn't support animation, only static 3D models. So you'll need to export the animation data as a separate file that you load at the same time as the obj.
Which file format I would recommend depends on what exactly your animations are. I don't remember off the top of my head which file formats Blender supports, but as I recall it does not export COLLADA files with animation, which would otherwise be the most general recommendation. Other options would be md2 for character animations, or 3ds for simple "rigid objects moving around" animations. I think Blender's FBX exporter will work, although that file format may be too complicated for your needs.
That said, and assuming you only need simple rigid object movements, you could use .obj for the 3D model shapes and then write a simple Python script to export a file from Blender that has at the keyframes listed, with the frame, position, and rotation for each keyframe. Then load that data in your code and play back those keyframes on the 3D model.
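The playback side of that idea can be sketched as plain linear interpolation between the two nearest exported keyframes. The data layout here ({ frame, position, rotation } with arrays of numbers) is an assumption about what such an exporter script would write, and the keys are assumed sorted by frame:

```javascript
// Sample the keyframe track at an arbitrary frame, linearly interpolating
// position and rotation between the two surrounding keys. Frames outside
// the track clamp to the first/last key.
function sampleKeyframes(keys, frame) {
  if (frame <= keys[0].frame) return keys[0];
  const last = keys[keys.length - 1];
  if (frame >= last.frame) return last;
  let i = 0;
  while (keys[i + 1].frame < frame) i++;
  const a = keys[i], b = keys[i + 1];
  const t = (frame - a.frame) / (b.frame - a.frame);
  const lerp = (u, v) => u.map((x, k) => x + (v[k] - x) * t);
  return {
    frame,
    position: lerp(a.position, b.position),
    rotation: lerp(a.rotation, b.rotation),
  };
}
```

Lerping Euler rotation angles like this only behaves well for small steps between keys; a real player would interpolate quaternions instead.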
This is an old question, and since then some new iOS frameworks have been released, such as GLKit. I recommend relying on them as much as possible, since they take care of many inherent conversions like this, though I haven't researched the specifics. Also, while not on iOS, the new scene graph technology for OS X (which will likely arrive on iOS in the future) takes all this quite a bit further, and a crafty individual could do some conversions with that tool and then take the output to iOS.
Also have a look at SIO2.
I haven't used recent versions of Blender, but my understanding is that it supports exporting mesh animation as a sequence of .obj files. If you can already display a single .obj in your app, then displaying several of them one after another will achieve what you want.
Now, note that this is not the most efficient form in which to export this type of animation, since each .obj file will contain a lot of duplicated info. If your mesh topology stays fixed over time (i.e. only the vertices move, with the polygon structure, UV coords, etc. all fixed), then you can import the entire first .obj and, from the rest, read just the vertex array.
If you wanted to optimize this even more, you could compress the vertex arrays so that you only store the differences from the previous frame of the animation.
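That difference-based scheme can be sketched like this, on plain arrays of vertex components (a real implementation would likely also quantize the deltas to get actual size savings):

```javascript
// Encode a sequence of vertex arrays as: first frame verbatim, then
// per-component differences from the previous frame.
function deltaEncode(frames) {
  return frames.map((f, i) =>
    i === 0 ? f.slice() : f.map((v, k) => v - frames[i - 1][k]));
}

// Rebuild the original frames by accumulating the differences.
function deltaDecode(deltas) {
  const frames = [deltas[0].slice()];
  for (let i = 1; i < deltas.length; i++)
    frames.push(deltas[i].map((d, k) => frames[i - 1][k] + d));
  return frames;
}
```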
Edit: I see that Blender 2.59 has export to COLLADA. According to the Blender manual, you can export object transformations, and you can also export baked animation for rigged objects. The benefit for you in supporting the COLLADA format in your iPhone app is that you are free to switch between animation tools, since most of them export this format.
