In my use case I would like to record the screen activity and send it to a server (not live). I looked at a few blogs and sample demos for this, but I couldn't find anything related to it. I could find a lot of material on live streaming audio/video, but not on screen recording.
I have the following questions related to this,
Which would be more efficient for this use case: WebRTC or WebSockets?
What is the browser support for WebRTC/WebSockets?
Are there any other methods to achieve this use case?
I am fairly new to WebRTC/WebSockets, so if I have not posted enough information, please comment and I will add it.
Could someone help me with this? Any reference URL or related information about this use case would be really helpful.
Here's how to record the screen in Firefox (Update: try it in this fiddle):
var constraints = { video: { mediaSource: "screen", width: 320, height: 200 } };

var start = ms => navigator.mediaDevices.getUserMedia(constraints)
  .then(stream => record(stream, ms)
    .then(recording => {
      stop(stream);
      video.src = link.href = URL.createObjectURL(new Blob(recording));
      link.download = "recording.blob";
      link.innerHTML = "Download blob";
      log("Playing " + recording[0].type + " recording:");
    })
    .catch(log).then(() => stop(stream)))
  .catch(log);

var record = (stream, ms) => {
  var rec = new MediaRecorder(stream), data = [];
  rec.ondataavailable = e => data.push(e.data);
  rec.start();
  log(rec.state + " for " + (ms / 1000) + " seconds...");
  var stopped = new Promise((r, e) => (rec.onstop = r, rec.onerror = e));
  return Promise.all([stopped, wait(ms).then(() => rec.stop())])
    .then(() => data);
};

var stop = stream => stream.getTracks().forEach(track => track.stop());
var wait = ms => new Promise(resolve => setTimeout(resolve, ms));
var log = msg => div.innerHTML += "<br>" + msg;
<button onclick="start(5000)">Record screen!</button>
<div id="div"></div><br>
<video id="video" height="120" width="160" autoplay></video>
<a id="link"></a>
Warning: Sharing your browser window on the web involves a security risk! Read about it here!
Once you have the blob, you can upload it using regular web sockets (not shown).
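If it helps, here's a minimal sketch of pushing the recorded blob over a plain WebSocket; the ws://example.com/upload endpoint and the "done" marker are placeholders, not something from the answer above:

var uploadBlob = blob => {
  var ws = new WebSocket("ws://example.com/upload"); // placeholder endpoint
  ws.onopen = () => {
    ws.send(blob);    // browsers accept a Blob directly in send()
    ws.send("done");  // simple end-of-upload marker for the server to act on
  };
  ws.onerror = e => log("upload failed: " + e);
};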
The MediaRecorder bits should work in Chrome as well, but unfortunately screen sharing is not yet fully standardized; it works differently in Chrome and requires an extension there.
The solution can be divided into three parts:
Getting hold of the screen MediaStream using getUserMedia. This falls under WebRTC, and since you are sharing the screen, your site will have to be served over HTTPS and your users will probably need extensions (for both Firefox and Chrome). You can find a demo here.
Recording the MediaStream. Firefox has supported this for a while through MediaRecorder, and Chrome started supporting it in version 47. With MediaRecorder you can get hold of a Blob of your recorded file.
How you post this Blob to the server is totally up to you; you could use any channel: WebSockets, an HTTP POST, etc. You could make the server a WebRTC client and send it through an RTCDataChannel, but my guess is that would be overkill for your use case. A sketch of the simplest option follows below.
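For instance, a minimal sketch of posting the recorded Blob with fetch and FormData; the /upload endpoint and the "recording" field name are placeholders, not part of the original answer:

function postRecording(blob) {
  var form = new FormData();
  form.append("recording", blob, "recording.webm"); // field name and filename are up to you
  return fetch("/upload", { method: "POST", body: form }) // placeholder endpoint
    .then(res => {
      if (!res.ok) throw new Error("upload failed: " + res.status);
      return res;
    });
}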
As suggested by @mido, on the client side I would use the MediaRecorder API. Another option would be to record on the server side. For the latter, you can use an open-source media server like Kurento (http://www.kurento.org/).
Related
I'm working with Squarespace developer mode. I wrote some JavaScript code to get my blog articles and events. The problem I have is that when I use an AJAX call to get my events and display them on a page I created, the data loads correctly except for blocks containing audio and video. I had the same issue with images, but SQS provides a solution for images:
var images = document.querySelectorAll('img[data-src]');
for (var i = 0; i < images.length; i++) {
  ImageLoader.load(images[i], {load: true});
}
But this does not fix the audio and video blocks. I've tested every snippet I could find on the SQS forums, but none of them work. I've also tested what was suggested here, with no solution.
Here is my code to get events:
$(document).ready(function() {
  var url = '/test-blog?format=json';
  $.getJSON(url).done(function(data) {
    var items = data.items;
    var $_container = $('#events-container-concert');
    var result = "";
    var appendText = [];
    items.forEach(function(elm) {
      var body = elm.body;
      var $_body = $(body);
      appendText.push("<div class='blog-item'><div id='body'>" + body + "</div></div>");
    });
    $_container.html(appendText.join(" "));
    var images = document.querySelectorAll('img[data-src]');
    for (var i = 0; i < images.length; i++) {
      ImageLoader.load(images[i], {load: true});
    }
  });
});
Has anyone else had this issue on SQS?
No one answered my question on the SQS forums.
Thanks
You may use this to automatically load all the blocks (I believe even the image blocks, but if not then pair it with your call to ImageLoader):
window.Squarespace.AFTER_BODY_LOADED = false;
window.Squarespace.afterBodyLoad();
Of course, you'll want to wait until the content is on the page. You can pull up the page you linked to and then run those two lines via the console as a quick test. That worked well for me in that context.
Reference: https://github.com/Squarespace/squarespace-core/blob/master/src/Lifecycle.js
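For instance, here's a minimal sketch of how those two lines could slot into the .done() callback from the question, after the HTML has been inserted (untested against your site, so treat it as a starting point):

$.getJSON(url).done(function(data) {
  // ... build appendText from data.items and insert it as before ...
  $_container.html(appendText.join(" "));
  // Ask Squarespace to (re)initialize all blocks in the newly inserted content
  window.Squarespace.AFTER_BODY_LOADED = false;
  window.Squarespace.afterBodyLoad();
});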
I'm currently writing a Firefox add-on, which should be very simple. But since I am new to this, some problems have occurred:
The purpose is to add an item to the context menu when the user clicks on an image (an <img> element). I want to manipulate this image heavily (something like visual encryption) using a canvas, and display the result in a new panel or dialog.
To add the new item to the menu, the following code is used.
cm.Item({
  label: _("menu-label-encrypt"),
  context: cm.SelectorContext("img"),
  contentScriptFile: [
    data.url('jquery.js'),
    data.url('encrypt.menu.js'),
  ],
  onMessage: function(cmd) {
    var cryptWorker = imgcrypt();
    cryptWorker.key(cmd.password);
    var ret = cryptWorker.encrypt(cmd.width, cmd.height, cmd.data);
    console.log(ret.length);
  },
});
My idea is to use the contentScriptFile encrypt.menu.js to inject code into the page and fetch the canvas data as an array, which is then posted to the add-on using self.postMessage and processed there:
self.on('click', function(node) {
  var canvasID = 'cache';
  var img = $(node)[0];
  $('<canvas>', {id: canvasID}).appendTo('body').hide();
  var canvas = $('#' + canvasID)[0];
  var ctx = canvas.getContext('2d');
  ctx.canvas.width = img.width;
  ctx.canvas.height = img.height;
  ctx.drawImage(img, 0, 0, img.width, img.height);
  var data = ctx.getImageData(0, 0, img.width, img.height).data,
      dataAry = new Array(data.length);
  for (var i = 0; i < data.length; i++)
    dataAry[i] = data[i];
  var command = {
    'password': 'test',
    'width': img.width,
    'height': img.height,
    'data': dataAry,
  };
  self.postMessage(command);
});
And now the problem, to my surprise: when I tried this add-on on a page hosted at localhost:4000, it works. On a real web page, it shows:
menu.js:14 - SecurityError: The operation is insecure.
I know that this may be caused by a violation of the same-origin policy, but this is a content script injected by an add-on. Is it really impossible to read the image data without the help of an external server, or am I doing something totally wrong?
Thank you.
I've not found any solution for this approach (that is, injecting a content script to get the data from a remotely retrieved image). But the net/xhr module in Mozilla's SDK works well on the add-on side, so as an alternative I wrote an XHR in main.js and used it as a proxy.
On the add-on side, this XMLHttpRequest has no same-origin restrictions and can be used to retrieve the binary data of the image at a given URL. The retrieved data can be returned as an ArrayBuffer by setting xhr.responseType = 'arraybuffer';
The URL given to the add-on's XHR is simply the src of the <img> that was clicked. The browser seems to cache the image, so retrieving it via XHR is very fast and doesn't require another request to the server.
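A minimal sketch of that main.js proxy; fetchImageBytes is just an illustrative name, and it assumes the content script posts the clicked image's src via self.postMessage:

var { XMLHttpRequest } = require("sdk/net/xhr");

function fetchImageBytes(imageUrl, callback) {
  var xhr = new XMLHttpRequest();
  xhr.open("GET", imageUrl, true);
  xhr.responseType = "arraybuffer";          // get the raw image bytes
  xhr.onload = function () {
    callback(new Uint8Array(xhr.response));  // hand the bytes back for processing
  };
  xhr.send(null);
}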
Not sure what would be causing this, but when I upload some images to my remote server via FileTransfer(), the images sometimes show up sideways or upside down. However, when I view the images locally on the iPhone, they are oriented correctly.
For example, when I select an image like this to upload: http://sharefa.st/view/WBe2QNSK8r8z
It will turn out like this: http://sharefa.st/view/EWdW1Z4G8r8z
I am using the local path to transfer the file, so I don't understand why the image would rotate "randomly".
Here is my upload function:
function uploadPhoto() {
  var options = new FileUploadOptions();
  options.fileKey = 'file';
  options.fileName = imgURI.substr(imgURI.lastIndexOf('/') + 1);
  options.mimeType = 'image/jpeg';

  var params = new Object();
  if (logged_in == true) {
    params.unique_id = app_unique_id;
    params.secret_key = user_secret_key;
  }
  options.params = params;

  loadingStart();

  var ft = new FileTransfer();
  ft.upload(imgURI, 'http://' + remote_server + '/API/upload', uploadDetails, fail, options);
}
imgURI value looks like this:
file://localhost/var/mobile/Applications/<snip>/tmp/photo_015.jpg
Any insight is appreciated.
Thanks to Humanoidism for pointing out that the issue was in fact with the iPhone and the way it stores images, I was able to figure out a solution.
To upload photos in the correct orientation, you must add the correctOrientation setting to the options object passed to getPicture() and set it to true.
Here are two examples:
function capturePhoto() {
  navigator.camera.getPicture(onPhotoDataSuccess, onFail, { quality: 30, correctOrientation: true });
}

function getPhoto(source) {
  navigator.camera.getPicture(onPhotoURISuccess, onFail, {
    quality: 30,
    destinationType: destinationType.FILE_URI,
    sourceType: source,
    correctOrientation: true
  });
}
The problem is not PhoneGap but the iPhone. The iPhone was designed to be used as a widescreen (landscape) camera: turn the phone sideways when taking pictures or capturing video if you intend to view them on a desktop. Your phone will display them correctly, because it "knows" how you took them, but the computer you're viewing them on doesn't.
What you could do to prevent this is to rotate the picture before the upload. This is not a recommended fix, but at least people on desktop computers will be able to see it properly. When viewing the images on an iPhone, though, they'll be rotated; a check for mobile devices to decide whether or not to rotate the image could come in handy, but still, not recommended.
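If you do decide to rotate client-side, a rough sketch with a canvas might look like this (a fixed 90-degree turn; detecting the actual EXIF orientation and wiring the result back into FileTransfer are left out):

function rotate90(img) {
  var canvas = document.createElement('canvas');
  canvas.width = img.height;                 // swap dimensions for a 90 degree turn
  canvas.height = img.width;
  var ctx = canvas.getContext('2d');
  ctx.translate(canvas.width / 2, canvas.height / 2);
  ctx.rotate(Math.PI / 2);
  ctx.drawImage(img, -img.width / 2, -img.height / 2);
  return canvas.toDataURL('image/jpeg');     // rotated pixels; you'd upload this (or write it to a file) instead
}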
It seems I have not clearly communicated my problem. I need to send a file (using AJAX) and I need to get the upload progress of the file using the Nginx HttpUploadProgressModule. I need a good solution to this problem. I have tried the jquery.uploadprogress plugin, but I am finding myself having to rewrite much of it to get it to work in all browsers and to send the file using AJAX.
All I need is the code to do this, and it needs to work in all major browsers (Chrome, Safari, Firefox, and IE). It would be even better if I could get a solution that handles multiple file uploads.
I am using the jquery.uploadprogress plugin to get the upload progress of a file from the NginxHttpUploadProgressModule. This is inside an iframe for a Facebook application. It works in Firefox, but it fails in Chrome/Safari.
When I open the console I get this.
Uncaught ReferenceError: progressFrame is not defined
jquery.uploadprogress.js:80
Any idea how I would fix that?
I would like to also send the file using AJAX when it is completed. How would I implement that?
EDIT:
I need this soon and it is important so I am going to put a 100 point bounty on this question. The first person to answer it will receive the 100 points.
EDIT 2:
Jake33 helped me solve the first problem. The first person to leave a response explaining how to send the file with AJAX too will receive the 100 points.
Uploading files is actually possible with AJAX these days. Yes, AJAX, not some crappy AJAX wannabes like SWF or Java.
This example might help you out: https://webblocks.nl/tests/ajax/file-drag-drop.html
(It also includes the drag/drop interface but that's easily ignored.)
Basically what it comes down to is this:
<input id="files" type="file" />
<script>
document.getElementById('files').addEventListener('change', function(e) {
var file = this.files[0];
var xhr = new XMLHttpRequest();
(xhr.upload || xhr).addEventListener('progress', function(e) {
var done = e.position || e.loaded
var total = e.totalSize || e.total;
console.log('xhr progress: ' + Math.round(done/total*100) + '%');
});
xhr.addEventListener('load', function(e) {
console.log('xhr upload complete', e, this.responseText);
});
xhr.open('post', '/URL-HERE', true);
xhr.send(file);
});
</script>
(demo: http://jsfiddle.net/rudiedirkx/jzxmro8r/)
So basically what it comes down to is this =)
xhr.send(file);
Where file is of type Blob: http://www.w3.org/TR/FileAPI/
Another (better IMO) way is to use FormData. This allows you to 1) name a file, like in a form and 2) send other stuff (files too), like in a form.
var fd = new FormData;
fd.append('photo1', file);
fd.append('photo2', file2);
fd.append('other_data', 'foo bar');
xhr.send(fd);
FormData makes the server code cleaner and more backward compatible (since the request now has the exact same format as normal forms).
None of it is experimental, just very modern. Chrome 8+ and Firefox 4+ know what to do, but I don't know about any others.
This is how I handled the request (1 image per request) in PHP:
if ( isset($_FILES['file']) ) {
  $filename = basename($_FILES['file']['name']);
  $error = true;

  // Only upload if on my home win dev machine
  if ( isset($_SERVER['WINDIR']) ) {
    $path = 'uploads/' . $filename;
    $error = !move_uploaded_file($_FILES['file']['tmp_name'], $path);
  }

  $rsp = array(
    'error' => $error, // Used in JS
    'filename' => $filename,
    'filepath' => '/tests/uploads/' . $filename, // Web accessible
  );

  echo json_encode($rsp);
  exit;
}
Here are some options for using AJAX to upload files:
AjaxFileUpload - Requires a form element on the page, but uploads the file without reloading the page. See the Demo.
List of JQuery Plug-ins Tagged With "File"
Uploadify - A Flash-based method of uploading files.
How can I upload files asynchronously?
Ten Examples of AJAX File Upload - This was posted this year.
UPDATE: Here is a JQuery plug-in for Multiple File Uploading.
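If you'd rather avoid a plugin, here's a rough sketch of multiple-file upload with progress using plain XHR and FormData, assuming a multiple file input and an /upload endpoint of your own (both placeholders):

<input id="files" type="file" multiple />
<script>
document.getElementById('files').addEventListener('change', function() {
  var fd = new FormData();
  for (var i = 0; i < this.files.length; i++) {
    fd.append('files[]', this.files[i]);           // all selected files in one request
  }
  var xhr = new XMLHttpRequest();
  xhr.upload.addEventListener('progress', function(e) {
    if (e.lengthComputable) {
      console.log('uploaded ' + Math.round(e.loaded / e.total * 100) + '%');
    }
  });
  xhr.open('POST', '/upload', true);               // placeholder endpoint
  xhr.send(fd);
});
</script>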
I have a protocol (like http) whose scheme is handled by a third-party app registered in Mac OS X.
I.e., x-someapp://someaction or something like that.
How can I open this URL with Google Chrome?
By default, Chrome starts a Google search instead of launching the app and passing URL handling to it...
Safari launches the registered app, which is the right thing.
Firefox and Opera ask what to do... and I can launch the app from there as well.
But Chrome... doesn't ask.
I even tried to write an HTML page with JavaScript inside to send an XMLHttpRequest:
function _httpExecuteCallback()
{
  if (httpRequestCallbackFunction != null) {
    if (httpRequest.readyState == 4) {
      if (httpRequest.status == 200) {
        httpRequestCallbackFunction();
        httpRequestCallbackFunction = null;
      }
    }
  }
}

function _httpGet(url, callbackFunction)
{
  httpRequest = false;
  httpRequestCallbackFunction = callbackFunction;
  httpRequest = new XMLHttpRequest();
  httpRequest.onreadystatechange = _httpExecuteCallback;
  httpRequest.open('GET', url, true);
  httpRequest.send(null);
}

_httpGet('x-someapp://test', function(){})
That produced no results either...
The currently accepted solution has a problem with Chrome over SSL (https). Watching the console log, Chrome blocks the request because it thinks the custom URL protocol is not secure:
[blocked] The page at reports blah blah ran insecure content from customproto//blah blah
Here is a solution (this took me a few days to research):
<input type='button' value='Test Custom Url' onclick='exec()'>

<script>
function submitRequest(buttonId) {
  var d = (window.parent) ? window.parent.document : window.document;
  if (d.getElementById(buttonId) == null || d.getElementById(buttonId) == undefined) return;
  if (d.getElementById(buttonId).dispatchEvent) {
    var e = d.createEvent("MouseEvents");
    e.initEvent("click", true, true);
    d.getElementById(buttonId).dispatchEvent(e);
  }
  else {
    d.getElementById(buttonId).click();
  }
}

function exec() {
  var d = (window.parent) ? window.parent.document : window.document;
  var f = d.getElementById('customUrlLink');
  if (f) { f.parentNode.removeChild(f); }

  var a = d.createElement('a');
  a.href = 'mycustomproto://arg1';
  a.innerHTML = "Link";
  a.setAttribute('id', 'customUrlLink');
  a.setAttribute("style", "display:none;");
  d.body.appendChild(a);

  submitRequest("customUrlLink");
}
</script>
This code will not work for IE. I've found that using this technique, IE limits the argument of the custom protocol to fewer than 1000 characters, whereas using the iframe technique IE will allow 2083 characters.
The only way to overcome the URL limit in JavaScript is to chunk the data and call the protocol multiple times. If anyone wants to take a stab at that, please let me know how it goes; I would like to use it.
To handle long URLs in the receiving app, pass a token into the app and have it fetch the actual data from the server with a GET request.
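A rough sketch of that token idea; the /stage endpoint and the token format are made up for illustration:

function launchWithToken(service, payload) {
  var token = Date.now().toString(36) + Math.random().toString(36).slice(2);
  // Stage the large payload on the server first...
  var xhr = new XMLHttpRequest();
  xhr.open('POST', '/stage?token=' + token, true);   // placeholder endpoint
  xhr.onload = function() {
    // ...then hand the app only the short token via the custom protocol
    var w = (window.parent) ? window.parent : window;
    w.location.assign(service + ':' + token);
  };
  xhr.send(payload);
}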
So for right now I am using one function for Chrome/FF and another function for IE.
These links helped me develop this solution:
https://superuser.com/questions/655405/custom-protocol-handler-not-working-in-chrome-on-ssl-page
Simulating a click in jQuery/JavaScript on a link
(wish I had known this a few days ago....hope this helps someone)
==================================================
Update: (8hr later)
==================================================
Jake posted a great solution for chrome: https://superuser.com/questions/655405/custom-protocol-handler-not-working-in-chrome-on-ssl-page
This works in chrome only:
window.location.assign("customprotocol://");
It will fail in an iframe so this is working:
var w = (window.parent)?window.parent:window
w.location.assign(service + '://' + data)
==================================================
Update: (weeks later)
==================================================
All of the examples of opening the custom protocol, including my own, have a "://" in the URL, and this is what causes the SSL warnings.
It turns out the solution is to change "://" to ":"
so do this:
src="x-myproto:query" .....
and the SSL warnings will go away.
==================================================
Follow: (after months of production use)
==================================================
This has been working well for Chrome. Detect the browser, and if it's Chrome, do this:
var w = (window.parent)?window.parent:window
w.location.assign('myproto://xyzabcdefetc')
For IE and other browsers I do something slightly different.
Note that browsers do impose a limit on how much data you can put in a custom URL protocol. As long as your string is under 800 characters it seems to work in all browsers; that appears to be the magic number.
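Put together, the dispatch might look something like this; the user-agent check is simplistic and the fallback simply reuses the hidden-iframe trick from the other answers, so treat it as a sketch rather than the exact code used in production:

function openCustomProto(service, data) {
  var isChromeOrFirefox = /Chrome|Firefox/.test(navigator.userAgent);
  var w = (window.parent) ? window.parent : window;
  if (isChromeOrFirefox) {
    w.location.assign(service + ':' + data);   // ":" rather than "://" avoids the SSL warning
  } else {
    // Fall back to a hidden iframe for IE and others
    var frame = w.document.createElement('iframe');
    frame.style.display = 'none';
    frame.src = service + ':' + data;
    w.document.body.appendChild(frame);
  }
}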
It looks like it's Google's location bar parsing that is getting in the way.
The browser, however, does seem to handle custom URL schemes properly. Try this in your locationbar:
javascript:document.location = 'myscheme://whatever'
Any link on your page that uses the custom scheme should also do the right thing.
I found the solution that works with Chrome.
I use the iframe approach.
Example (with jQuery):
$("body").append('<span id="__protoProxy"></span>');
function queryWord(aWord)
{
var protoProxy = document.getElementById('__protoProxy');
if (protoProxy)
{
var word = aWord.replace('"','\"');
protoProxy.innerHTML = '<div style="display:none;"><iframe src="x-myproto://query?' + word + '"></iframe></div>';
}
}
queryWord('hello');
Here's a solution that also includes a redirect to the App Store / Play Store if the user doesn't have the app. It uses a setTimeout for this. It also makes use of an iframe to support more browsers, so it works on Chrome and any other mobile browser. We use this at my company, Branch. Just modify the two links below to correspond to your URI and App Store link.
<!DOCTYPE html>
<html>
  <body>
    <script type="text/javascript">
      window.onload = function() {
        // Deep link to your app goes here
        document.getElementById("l").src = "my_app://somepath";

        setTimeout(function() {
          // Link to the App Store should go here -- only fires if deep link fails
          window.location = "https://itunes.apple.com/us/app/myapp/id123456789?ls=1&mt=8";
        }, 500);
      };
    </script>

    <iframe id="l" width="1" height="1" style="visibility:hidden"></iframe>
  </body>
</html>
Again, this should work on any browser, thanks to the iframe.
If Chrome does not recognize the URL scheme, it defaults to a search.
This is what I see in Safari (screenshot): http://img62.imageshack.us/img62/6792/clipboard02oh.jpg
and in Firefox (screenshot): http://img138.imageshack.us/img138/9986/clipboard04xk.jpg
I believe the reason Chrome defaults to search is that there are special Google searches that use the colon.
E.g:
define: dictionary
filetype:pdf google chromium
This is one of the annoyances I have with Firefox: I have to jump to the search box rather than the address bar to execute these kinds of searches. Since Chrome does not have a separate search box like Firefox, IE, and Safari do, this functionality is required.
Ajax requests won't get you around this.
Some weeks later ....
Looks like window.location.replace('myscheme://whatever') has full cross-browser support; it works with Chrome, Firefox, Safari, Edge, and Opera. See https://developer.mozilla.org/en-US/docs/Web/API/Location/replace