I have used a video texture (videoTexture = new THREE.VideoTexture(video);) to show a video on a canvas, where video is an HTML element. I want to show a loader in the video texture until the video has loaded and is ready to play. Currently it shows a white screen and then plays the video once it has loaded. Is there any way to achieve what I am looking for?
Maybe you can do it like this: First, you apply a simple, static placeholder texture to your material that will later be used to display the video. Now it's important to replace this texture at the right time with the video (when the video is ready to play). You can try to use the canplay event in order to detect this situation. After you have obtained a reference to the video element, set the event listener like this:
const video = document.getElementById( 'video' );
video.addEventListener( 'canplay', changeTexture );
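For completeness, changeTexture() could look something like this (a minimal sketch; the material variable and the placeholder texture are assumptions about your setup):
// assumes 'material' currently shows a static placeholder texture as its map
function changeTexture() {

    // swap the placeholder for the actual video texture
    material.map = new THREE.VideoTexture( video );
    material.needsUpdate = true;

    video.play();

}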
three.js R91
I want to achieve a result something like this:
I already know how to play FBX animations in three.js, but how do I simply place a video file in a 3D scene like this?
Take a look at the video texture here
https://threejs.org/docs/#api/en/textures/VideoTexture.
With this, you can load the video into a texture and assign that texture to a material.
You may also want to create the video element dynamically rather than with getElementById;
this may help you:
Dynamically create a HTML5 video element without it being shown in the page
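Putting this together, a minimal sketch could look like the following (the file name 'movie.mp4', the plane size, and the scene variable are assumptions about your setup):
// create the video element dynamically, without getElementById
const video = document.createElement( 'video' );
video.src = 'movie.mp4';
video.loop = true;
video.muted = true; // muted videos are allowed to autoplay in most browsers
video.play();

// use the video as a texture on a simple plane in the scene
const texture = new THREE.VideoTexture( video );
const screen = new THREE.Mesh(
    new THREE.PlaneGeometry( 16, 9 ),
    new THREE.MeshBasicMaterial( { map: texture } )
);
scene.add( screen );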
I want to use the camera video feed as a background in A-Frame, while overlaying objects on it. I know it is possible, but I don't know exactly how to do it, so I am here to ask for help!
You can have a simple overlay by adding an element before the scene:
<img src='overlay.jpg' />
<a-scene></a-scene>
fiddle here.
Here is a nice article about using a camera stream, I'll just use a basic version of it:
html
<video autoplay></video>
<a-scene></a-scene>
js
// grab the video element
const video = document.querySelector('video');
// this object needs to be an argument of getUserMedia
const constraints = {
  video: true
};
// when you grab the stream - display it on the <video> element
navigator.mediaDevices.getUserMedia(constraints)
  .then((stream) => { video.srcObject = stream; });
Fiddle here.
I took the above example and turned it into a glitch so the demo could be run on a phone. I also modified the aframe example code to look like a basketball game.
https://glitch.com/edit/#!/3dof-ar?path=index.html%3A33%3A33
I need to take a picture with a frame effect.
Can I preview the camera image with a transparent PNG frame overlaid, and later save the combined image?
Thanks!
Use the nativescript-camera-plus plugin to add a camera preview within your app, and style your frame around it.
Once the picture is taken, use nativescript-bitmap-factory to create a new image that combines your frame and the actual image from the camera.
I'm trying to export a VR scene made with a-frame (three.js-based VR library) into a 360° video.
All I could find was a way to export the canvas to a regular flat MPEG-4 using a Chrome plugin (RenderCan). I would like it to be a video that can be watched from any angle, though.
Is there a way e.g. to snapshot every frame to an equirectangular image or something?
You can take an equirectangular screenshot of an A-Frame scene by doing
document.querySelector('a-scene').components.screenshot.getCanvas('equirectangular');
or a normal perspective screenshot of what you're looking at with
document.querySelector('a-scene').components.screenshot.capture('perspective');
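If you want one image per rendered frame (to assemble into a 360° video afterwards, e.g. with ffmpeg), a rough sketch is a small component that captures on every tick. The component name and the in-memory frames array are assumptions, and capturing a data URL every tick will be slow:
AFRAME.registerComponent('equirect-recorder', {
  init: function () {
    this.frames = [];
  },
  tick: function () {
    // grab an equirectangular snapshot of the current frame
    const canvas = this.el.sceneEl.components.screenshot.getCanvas('equirectangular');
    this.frames.push(canvas.toDataURL('image/png'));
  }
});
Attach it with <a-scene equirect-recorder>.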
Maybe you can try to take a series of screenshots and turn them into a sort of GIF.
I'm trying to make a VR with three.js. Right now, all the 3D objects are showing up in my scene with a stereo effect, but I can't seem to get the event to launch properly when I click on an object of my scene. It seems that the stereo effect is on top of the regular camera and that I'm clicking on the real scene and not the one shown in the stereo effect. Is there a way that I can add an event listener to the object on each camera of the stereo effect? For example, when I click on an image on one side of the stereo effect, the event will be launched.
If that isn't possible, is there a way to know what the center of the camera is looking at? For example, if I'm looking at an image, it would call the event associated with this image (for example, the image would grow). I read that this can be achieved with raycasting, but I don't know how to set the ray to the middle of the camera.
Thank you very much!
Edit : This is my current scene with the stereo effect : http://i.imgur.com/FzbHV2U.png. Also, the code from the stereo effect comes from : http://threejs.org/examples/js/effects/StereoEffect.js
There are game extensions available for Three.js grouped in a library called THREEx. You should take a look at threex.domevents which adds typical Dom events to elements in the canvas.
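A minimal sketch of threex.domevents usage, assuming camera, renderer, and mesh already exist in your scene setup:
const domEvents = new THREEx.DomEvents( camera, renderer.domElement );

domEvents.addEventListener( mesh, 'click', function ( event ) {
    // react to the click, e.g. make the clicked object grow as described in the question
    mesh.scale.multiplyScalar( 1.2 );
}, false );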