I have a WebGL canvas rendering a 3D hologram image using Three.js, and I want to stream it as video to another PC.
I already tried the code below, but it didn't work.
var gl = renderer.getContext(); // get the WebGL context
var canvas = gl.canvas; // get the context's canvas
var video = document.querySelector('video');
var stream = canvas.captureStream();
video.srcObject = stream;
Is it possible to stream a WebGL canvas with WebRTC?
canvas.captureStream() is not yet fully supported in Chrome (Firefox has supported it since version 43); in Chrome you need to enable "Experimental Web Platform features" in chrome://flags.
See this demo for a classical teapot example.
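Once captureStream() is available, the resulting MediaStream can be fed into an RTCPeerConnection like any camera stream. A minimal sketch of the sending side (the signaling exchange is omitted; sendSignal is a hypothetical helper you would implement yourself, e.g. over WebSockets):

```javascript
// Grab a 30 fps MediaStream directly from the Three.js canvas
const stream = renderer.domElement.captureStream(30);

// Create a peer connection and add the canvas video track to it
const pc = new RTCPeerConnection();
stream.getTracks().forEach(track => pc.addTrack(track, stream));

// Start the offer/answer exchange (signaling channel not shown)
pc.createOffer()
  .then(offer => pc.setLocalDescription(offer))
  .then(() => sendSignal(pc.localDescription)); // hypothetical signaling helper
```

On the receiving PC you would attach the incoming track to a `<video>` element in the peer connection's `ontrack` handler.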
Related
I want to use the camera video feed as the background in A-Frame while overlaying objects on it. I know it is possible, but I don't know how exactly to do it, so I am here to ask for help!
You can have a simple overlay by adding an element before the scene:
<img src='overlay.jpg' />
<a-scene></a-scene>
Fiddle here.
Here is a nice article about using a camera stream, I'll just use a basic version of it:
HTML:
<video autoplay></video>
<a-scene></a-scene>
JS:
// grab the video element
const video = document.querySelector('video');
// this object needs to be an argument of getUserMedia
const constraints = {
video: true
};
// when you grab the stream - display it on the <video> element
navigator.mediaDevices.getUserMedia(constraints)
  .then((stream) => { video.srcObject = stream; });
Fiddle here.
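To make the `<video>` element actually sit behind the scene as a full-screen background, you can pin it with a few style properties (a sketch; depending on your setup you may also need to make the A-Frame scene's background transparent so the video shows through):

```javascript
// Pin the <video> element behind the A-Frame canvas as a full-screen background
const bgVideo = document.querySelector('video');
Object.assign(bgVideo.style, {
  position: 'fixed',
  top: '0',
  left: '0',
  width: '100%',
  height: '100%',
  objectFit: 'cover',  // fill the viewport without distorting the feed
  zIndex: '-1'         // keep it behind the <a-scene> canvas
});
```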
I took the above example and turned it into a Glitch so the demo could be run on a phone. I also modified the A-Frame example code to look like a basketball game.
https://glitch.com/edit/#!/3dof-ar?path=index.html%3A33%3A33
I have used a video texture (videoTexture = new THREE.VideoTexture(video);) to show video on a canvas, where video is an HTML element. I want to show a loader in the video texture until the video is loaded and ready to play. Currently it shows a white screen first and then plays the video once it has loaded. Is there any way to achieve what I am looking for?
Maybe you can do it like this: first, apply a simple, static placeholder texture to the material that will later display the video. The important part is then replacing this texture at the right time, when the video is ready to play. You can use the canplay event to detect that moment. After you have obtained a reference to the video element, set the event listener like this:
const video = document.getElementById( 'video' );
video.addEventListener( 'canplay', changeTexture );
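The changeTexture handler itself is not shown above; a sketch of what it might look like, assuming mesh is the existing object in your scene that currently carries the placeholder texture:

```javascript
function changeTexture() {
  // The video can now play, so swap the placeholder for the real video texture
  const videoTexture = new THREE.VideoTexture( video );
  mesh.material.map = videoTexture;   // mesh is your existing scene object
  mesh.material.needsUpdate = true;   // tell three.js the material changed
  video.play();
}
```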
three.js R91
So, the scene includes an earth spinning on its axis, a moon rotating around the earth, and a light source to the right that helps simulate the effect of an eclipse. I thought it would be easy because we've done shadows and transformations before, but I ran into a problem.
In our template we have the following at the top:
// For the assignment where a texture is required you should
// deactivate the Detector and use ONLY the CanvasRenderer. There are some
// issues in using what are called Cross Domain images for textures. You
// can get more details by looking up WebGL and CORS using Google search.
// if ( Detector.webgl )
// var renderer = new THREE.WebGLRenderer();
// else
var renderer = new THREE.CanvasRenderer();
My problem is, when I leave it like that, the spotlight doesn't appear on the scene. However, as was warned, if I activate the Detector, the textures won't work.
But I need both textures and the spotlight. How do I work around this?
You are confusing yourself. Detector.webgl only checks whether the browser supports WebGL. The code below uses the WebGLRenderer if the current browser supports WebGL and falls back to the CanvasRenderer if it does not.
if ( Detector.webgl )
var renderer = new THREE.WebGLRenderer();
else
var renderer = new THREE.CanvasRenderer();
With WebGL, loading textures will run into cross-domain issues. It's best to run the code on a web server, or a local server like http://www.wampserver.com/en/ for Windows or https://www.mamp.info/en/ for Mac, or an npm package like https://github.com/tapio/live-server.
As far as I know, shadows are not supported by the CanvasRenderer. I would ask your assignment head to clarify.
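If you can use the WebGLRenderer (served from a local server so the textures load), the shadow setup looks roughly like this. Note the object names are placeholders for your own earth/moon meshes, and older three.js releases spell the flag renderer.shadowMapEnabled rather than renderer.shadowMap.enabled:

```javascript
var renderer = new THREE.WebGLRenderer();
renderer.shadowMap.enabled = true;       // turn on shadow mapping globally

var spotlight = new THREE.SpotLight( 0xffffff );
spotlight.position.set( 100, 0, 0 );     // light source to the right
spotlight.castShadow = true;

earthMesh.castShadow = true;             // the earth throws the shadow...
moonMesh.receiveShadow = true;           // ...and the moon receives it (eclipse)
```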
I have an HTML page, and I use a generic handler to get a byte array for my img control.
I have recently moved to using the canvas element.
I have got an image loaded into my canvas after I have loaded it into a hidden img variable.
I want to refresh/change this image as quickly as I can.
I have read that canvas takes advantage of hardware accelerated graphics.
Is this handled automatically by the canvas element? Do I need to add any additional code?
Do I need a specific graphics card, or is it down to the graphics driver installed?
This is my code so far:
I call a generic handler ashx page that returns
context.Response.BinaryWrite(data);
This loads an image into an Image variable
var img1x4 = new Image();
And when it has hit the onload event of that image variable it draws onto the canvas:
var c1 = document.getElementById("live1x4");
var ctx1 = c1.getContext("2d");
c1.setAttribute('width', '360');
c1.setAttribute('height', '288');
img1x4.onload = function () {
ctx1.drawImage(img1x4, 0, 0);
};
img1x4.onerror = function () {
$("#divMode5").html('error#1');
};
The HTML5 canvas automatically uses any available GPU to accelerate graphics when it can.
Just be sure you have the latest browser installed (some old browsers don't use the GPU).
If you're just drawing an offscreen img element to the canvas, that's GPU accelerated.
context.drawImage(myImageElement,0,0);
I see that you're using a byte array for something, so I should mention that .getImageData and .putImageData don't use the GPU for acceleration.
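If your handler returns raw image bytes, you can stay on the GPU-accelerated drawImage path by wrapping the bytes in a Blob and an object URL instead of going through putImageData. A sketch, where the fetch URL is a placeholder for your own .ashx handler:

```javascript
// Fetch the bytes from the handler and draw them via the GPU-accelerated path
fetch('handler.ashx')                           // placeholder handler URL
  .then(response => response.blob())            // raw bytes as a Blob
  .then(blob => {
    const img = new Image();
    img.onload = () => {
      const ctx = document.getElementById('live1x4').getContext('2d');
      ctx.drawImage(img, 0, 0);                 // drawImage uses the GPU
      URL.revokeObjectURL(img.src);             // free the object URL
    };
    img.src = URL.createObjectURL(blob);
  });
```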
I have a problem with my Three.js 3D application, at least according to some people I know.
My application rests at [http://176.9.149.205/planungstool/]. Some people who supposedly have the most recent versions of Chrome and Firefox cannot see the textured areas. For example, they do not see the roof or front of the 3D house. They do, however, see the non-textured things like the tree or the floor.
What's weird is that I don't have that problem and most of the other people I asked do not have it as well. Here is what it should look like and does look like for me: [http://176.9.149.205/planungstool/house.jpg]
Does anyone have an idea what could cause this? Could it be some client-side settings? Or maybe some access control policy?
I'm loading the textures like this:
var myTexture = THREE.ImageUtils.loadTexture('gfx/textures/texture.jpg');
And then I just create meshes with lambert material that have this texture as their map.
If you read this and do not know what could cause this error, it would be nice if you could at least tell me if you see the textured areas or not, given you have a recent version of Chrome or Firefox.
I can see the textures in current Chrome on Mac. I had a similar problem with the canvas renderer (anything textured was invisible). For me, changing from ImageUtils.loadTexture to a Texture plus an ImageLoader fixed it:
var texture = new THREE.Texture();
var texLoader = new THREE.ImageLoader();
texLoader.addEventListener( 'load', function(event){
texture.image = event.content;
texture.needsUpdate = true;
} );
texLoader.load('texture.png');
I do, however, still have problems with the canvas renderer in Safari, but you appear to be using only the WebGL renderer. Hope this helps.
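For reference, newer three.js releases bundle this same pattern into THREE.TextureLoader, which creates the texture, assigns the image, and sets needsUpdate for you. A sketch, assuming a recent three.js version and that material is your existing Lambert material:

```javascript
// TextureLoader handles image loading and texture setup in one call
var loader = new THREE.TextureLoader();
loader.load( 'texture.png', function ( texture ) {
    material.map = texture;       // attach the loaded texture to the material
    material.needsUpdate = true;  // flag the material for re-upload
} );
```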