StereoEffect and CSS3DStereoEffect in three.js

I'm making a VR application. So far I've done the desktop version. For mobile devices I plan to use Google Cardboard with StereoEffect, but I'm using a CSS renderer and a WebGL renderer together.
My question is: how do I use StereoEffect and CSS3DStereoEffect together?
Here's an inline link to my demo app.
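For context, the dual-renderer setup described above usually looks something like the sketch below, using the stock three.js WebGLRenderer and CSS3DRenderer classes (the scene, camera and element names are placeholders); the open question is how to apply StereoEffect and CSS3DStereoEffect to these two passes at once:

```js
// Minimal sketch of a combined WebGL + CSS3D setup (mono, no stereo yet).
// Assumes three.js and the CSS3DRenderer addon are loaded.
var camera = new THREE.PerspectiveCamera(70, window.innerWidth / window.innerHeight, 0.1, 1000);
var glScene = new THREE.Scene();   // WebGL content
var cssScene = new THREE.Scene();  // DOM/CSS3D content

var glRenderer = new THREE.WebGLRenderer({ alpha: true });
glRenderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(glRenderer.domElement);

var cssRenderer = new THREE.CSS3DRenderer();
cssRenderer.setSize(window.innerWidth, window.innerHeight);
cssRenderer.domElement.style.position = 'absolute';
cssRenderer.domElement.style.top = '0';
document.body.appendChild(cssRenderer.domElement);

(function animate() {
  requestAnimationFrame(animate);
  glRenderer.render(glScene, camera);   // WebGL pass
  cssRenderer.render(cssScene, camera); // CSS3D pass
})();
```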

Related

My 3d game now only runs on Internet Explorer

Years ago I made a 3D game in Construct 2 with the 3D plugin "Q3D" (Three.js). It ran well in Chrome and Firefox.
Then one day my game suddenly stopped working.
The URL of the game is http://altiplanet.github.io
It is strange, because the game still runs in Internet Explorer.
I looked at the console, but I don't understand the various errors that appear.
Thanks very much!
Your code uses the setVelocity() method of the PannerNode class from the Web Audio API.
According to MDN, this method is deprecated and should no longer be used.
PannerNode.setVelocity() is called in your C2AudioInstance class, which is not part of the three.js API.
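Since C2AudioInstance belongs to the Construct 2 export rather than to three.js, one pragmatic workaround is to guard the deprecated call so it is simply skipped in browsers that have removed the method. This is a minimal sketch, assuming you can edit the exported runtime; panner, vx, vy and vz stand in for whatever variables the exported code actually uses:

```js
// Hypothetical patch inside the exported C2AudioInstance code: only call
// setVelocity() when the browser still implements it, since the method was
// removed from the Web Audio API and modern browsers no longer provide it.
if (typeof panner.setVelocity === 'function') {
  panner.setVelocity(vx, vy, vz);
}
```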

Native mobile app with React VR

I have a bit of experience with React JS & React Native. Now I have just started with React VR. From what I have learnt so far, one can use React VR to create virtual reality experiences in web pages.
I have a web application which uses Three JS to create a virtual walkthrough (using panoramic images). I wanted to know whether it is possible to create a native mobile app for the same using React VR and React Native.
My main reason for wanting to do this is to harness the power of the native hardware for better performance of the app. I hope to keep most of the existing Three JS code intact by making it work with React VR.
React VR is essentially a React-ive wrapper for Three.js. So yes, you can keep most of your Three.js code intact and drop it into a React VR app: simply put your Three.js code into the client.js file of a React VR project.
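For illustration, this is roughly what the generated client.js of a React VR project looks like (a sketch based on the default template; 'WelcomeToVR' is the template's app name and will differ in your project, and details vary by React VR version). Your existing Three.js setup and per-frame code slot in around the VRInstance:

```js
// client.js - React VR entry point (sketch based on the default template).
import {VRInstance} from 'react-vr-web';

function init(bundle, parent, options) {
  const vr = new VRInstance(bundle, 'WelcomeToVR', parent, {
    ...options,
  });

  // One-time Three.js setup (custom meshes, lights, loaders, ...) can go here.

  vr.render = function(timestamp) {
    // Per-frame Three.js updates from your existing walkthrough code can go here.
  };

  vr.start(); // begin the render loop
  return vr;
}

window.ReactVR = {init};
```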
Some of the newer phones are WebVR-friendly (e.g. the Google Pixel) and, in my experience, work well with React VR apps. To that extent, this does leverage the native hardware.
However, as for combining React VR and React Native: in short, it's not possible. React VR makes Views renderable via WebGL and leverages the WebVR API that many browsers are becoming equipped with these days.
React Native, on the other hand, takes Views and renders them on the device as Java (Android) or Objective-C (iOS) views. In short, React Native, at the time of writing this answer, does not have VR capabilities and cannot be combined with React VR.
Check out this informative link for a more detailed discussion.

Google Tango Unity Examples not working

I've tried the codelab and followed the video tutorial, and I just can't make it work. In both cases the app loads, but (a) no motion tracking permission is requested and (b) the device's motion does not control the camera.
These labs are super simple and don't contain any code, so I am fairly confident I do not have an error.
Unity v5.3.1f1 Personal
Unity Tango SDK FurudUnity5

Custom-colored action buttons, transparent card backgrounds, full-screen custom layout notifications, changing card color

Custom-colored action buttons, transparent card backgrounds, full-screen custom layout notifications, changing card color, TTF fonts for text, changing text color, placing elements in different areas.
Are these things possible? If so, why can't we use them? If not, how did these people achieve them?
I really want to develop apps with these features, but the current public API is really limited; it provides almost nothing, yet the screenshots show beautiful new features.
I like Google Wear and I see a future for it, but compared with the Samsung Galaxy Gear or the Sony SmartWatch, public development is moving slowly. Maybe Google's developers are doing things in the background, but if we don't have the resources, how can developers build apps for it?
I shared a basic app with my wishes and ideas; nobody answered for about two days on Google+, and no one seems to care about posts there. Public support of Google Wear for developers is poor at the moment.
I need more customizable features in Android Wear to build apps, but at the current stage I can't do much with it.
I wish we had the features shown in the screenshots below.
Those things are still not possible with the current preview release of Android Wear.
If you look closely, you'll see that they've been using Photoshop (or some other photo editing software) with these templates to design an app for Android Wear.
For testing the design on the watch, they used Android Design Preview, a tool that lets you mirror a portion of your desktop to your device.

Interactive 3d Web Technology

I would like to use interactive 3D models in a web page. The required functionality is:
Import a DXF file which defines and displays a room.
Add/move prebuilt objects from JavaScript.
Add/move a lamp which casts shadows, from JavaScript.
Return room dimensions to JavaScript.
Return object positions to JavaScript.
Can I import DXF files into any WebGL engine?
I have a small repeat user base, so a browser installation is no problem at all. Is there any plugin technology I could use? Java applets? Unity? Can I use an OpenGL engine as a plugin? How about a Java 3D applet?
I will start with desktop but would need to be targeting tablets soon and mobiles in two years or so.
I am becoming convinced I will need to hire an expert to write this but I want to understand the options. Can you recommend a suitable technology?
I think WebGL is an excellent choice for this application; the graphics functions you describe are well within its capabilities. I can't comment on model loading, though, as I'm not familiar with WebGL engines.
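As a rough illustration of how the lamp and object requirements map to a WebGL engine such as three.js, here is a minimal sketch (placeholder names and sizes; API names are from recent three.js versions, and the DXF import step is left out):

```js
// Sketch: a shadow-casting "lamp" and a movable box, all driven from JavaScript.
var scene = new THREE.Scene();
var camera = new THREE.PerspectiveCamera(60, window.innerWidth / window.innerHeight, 0.1, 100);
camera.position.set(0, 3, 8);

var renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
renderer.shadowMap.enabled = true; // turn on shadow rendering
document.body.appendChild(renderer.domElement);

// "Lamp": a point light that casts shadows and can be repositioned at runtime.
var lamp = new THREE.PointLight(0xffffff, 1);
lamp.position.set(2, 5, 2);
lamp.castShadow = true;
scene.add(lamp);

// Floor that receives shadows (stand-in for the imported room geometry).
var floor = new THREE.Mesh(new THREE.PlaneGeometry(20, 20),
                           new THREE.MeshStandardMaterial({ color: 0x888888 }));
floor.rotation.x = -Math.PI / 2;
floor.receiveShadow = true;
scene.add(floor);

// A prebuilt object that can be added and moved from JavaScript.
var box = new THREE.Mesh(new THREE.BoxGeometry(1, 1, 1),
                         new THREE.MeshStandardMaterial({ color: 0x3366ff }));
box.position.set(0, 0.5, 0);
box.castShadow = true;
scene.add(box);

// Example of driving an object position from your own JavaScript/UI code.
function moveObject(x, z) {
  box.position.set(x, 0.5, z);
}

(function animate() {
  requestAnimationFrame(animate);
  renderer.render(scene, camera);
})();
```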
However, mobile is a big wrinkle. Regarding the techniques you mention:
WebGL is supported in Chrome and Firefox for Android. On iOS, Mobile Safari does not enable WebGL, but an implementation exists (used strictly for in-app ads), so it is likely that there may be broader WebGL support in the future (possibly requiring a custom web view wrapper to enable it).
Java applets are not supported by any browser on either Android or iOS.
Unity is a viable choice; the Unity Web Player browser plugin allows embedding Unity content in a web page. However, there is no such plugin for mobile-OS devices; Unity content may be compiled into an application for iOS or Android, but it cannot be viewed within a web page.
News as of August 2014: Unity has announced that the upcoming Unity 5 will include publishing to JavaScript + WebGL (no plugin required). Assuming this works as promised, you can use Unity if your target platform has a browser with WebGL support.
So if you absolutely need cross-platform 3D including iOS right now, Unity is where it's at. WebGL, however, is a good choice for desktop and Android today, is likely to improve on mobile in the future, and is the only way to embed 3D in a web page (rather than an app) across desktop and Android.
WebGL is not the only way to embed 3D in a web page - see phoria.js:
http://www.kevs3d.co.uk/dev/phoria/
https://github.com/kevinroast/phoria.js
Also, three.js has a CanvasRenderer for use without WebGL.
Phoria.js and others like it will work on iPhone/iPad and Android phones that don't support WebGL. Of course, the performance is MUCH lower, but if you don't have complex models and want it to work everywhere...
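If you go the three.js route, the CanvasRenderer fallback mentioned above can be wired up with a simple capability check. A minimal sketch, assuming a three.js build that still ships THREE.CanvasRenderer (it was later moved out of the core):

```js
// Pick WebGLRenderer when the browser supports WebGL, otherwise fall back
// to the slower CanvasRenderer.
function createRenderer() {
  var canvas = document.createElement('canvas');
  var hasWebGL = !!(window.WebGLRenderingContext &&
      (canvas.getContext('webgl') || canvas.getContext('experimental-webgl')));
  return hasWebGL ? new THREE.WebGLRenderer({ antialias: true })
                  : new THREE.CanvasRenderer();
}
```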
