Can't Record Stream Using MediaRecorder In Server? - ffmpeg

First, I am trying to make a WebRTC peer connection from the browser to the server using the SFU model.
Here is the POST request that creates the WebRTC peer connection from the browser to the server (SFU):
app.post("/broadcast", async ({ body }, res) => {
  const peer = new webrtc.RTCPeerConnection({
    iceServers: [
      {
        urls: "stun:stun.stunprotocol.org",
      },
    ],
  });
  peer.ontrack = (e) => handleTrackEvent(e, peer); // <-- Important
  const desc = new webrtc.RTCSessionDescription(body.sdp);
  await peer.setRemoteDescription(desc);
  const answer = await peer.createAnswer();
  await peer.setLocalDescription(answer);
  const payload = {
    sdp: peer.localDescription,
  };
  res.json(payload);
});
In the handleTrackEvent function, I get the stream that I want to start recording and save to the server's local storage.
function handleTrackEvent(e, peer) {
  console.log(e.streams);
  senderStream = e.streams[0];
  var recorder = new MediaStreamRecorder(e.streams);
  recorder.recorderType = MediaRecorderWrapper;
  recorder.mimeType = "video/webm";
  recorder.ondataavailable = (blob) => {
    console.log(blob);
  };
  recorder.start(5 * 1000); // <-- Error generator
}
But when I try to start the recording and get the blob in 5-second intervals, it gives me "MediaRecorder Not Found":
Passing following params over MediaRecorder API. { mimeType: 'video/webm' }
/Users/tecbackup/webrtc-peer/node_modules/msr/MediaStreamRecorder.js:672
mediaRecorder = new MediaRecorder(mediaStream);
^
ReferenceError: MediaRecorder is not defined
I am very new to WebRTC and I need suggestions on how to save the live stream from the browser to the server. In the future, once I have the blobs, I will append them sequentially to an mp4 file on the server. Then, at runtime, I will run ffmpeg on that mp4 file to get 240p, 360p, and 720p ts files for HLS streaming.

Related

Puppeteer - how to iterate through queryObjects to collect the url of a WebSocket object?

I am using Puppeteer in a Node.js module. I retrieve the WebSocket object prototype with queryObjects, and I need to extract the url property of each instance.
// Get a handle to the WebSocket object prototype
const prototypeHandle = await page.evaluateHandle(() => WebSocket.prototype);
// Query all WebSocket instances into a JSHandle object
const jsHandle = await page.queryObjects(prototypeHandle);
// Count the number of WebSocket objects in the heap
// const count = await page.evaluate(sockets => sockets.length, jsHandle); // returns the expected amount (x2)
// How to iterate through jsHandle to collect the url of each WebSocket?
await jsHandle.dispose();
await prototypeHandle.dispose();
You do not get any response because a WebSocket is not a plain JSON object that can be stringified and handed back to you when you evaluate using page.evaluate.
To get the URL of each connected WebSocket in the page, you can map over the collected WebSocket instances and extract the url from them:
const browser = await puppeteer.launch();
const page = (await browser.pages())[0];
// Create a dummy WebSocket connection for testing purposes
await page.evaluate(() => new WebSocket('wss://echo.websocket.org/'));
const wsPrototypeHandle = await page.evaluateHandle(
  () => WebSocket.prototype
);
const wsInstances = await page.queryObjects(wsPrototypeHandle);
const wsUrls = await page.evaluate(
  (sockets) => sockets.map((ws) => ws['url']), // <-- simply access the property here
  wsInstances
);
console.log(wsUrls);
which results in the following:
[ 'wss://echo.websocket.org/' ]
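The serialization limitation mentioned above can be reproduced in plain Node: `url` is defined as a getter on `WebSocket.prototype`, and prototype getters are not own enumerable properties, so they are invisible to JSON-style serialization (roughly what page.evaluate uses to hand values back). A minimal sketch with a stand-in class:

```javascript
// Stand-in for WebSocket: `url` is a getter on the prototype,
// so it is not an own enumerable property of the instance.
class FakeSocket {
  get url() {
    return "wss://echo.websocket.org/";
  }
}

const socket = new FakeSocket();
console.log(JSON.stringify(socket)); // "{}" — the getter is skipped
console.log(socket.url);             // accessing the property directly works
```

This is why mapping over the instances inside page.evaluate and reading `.url` explicitly, as in the answer above, is required.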

How to DRM Content in chromecast

We are trying to play DRM-protected MPD content from Chrome on a Chromecast.
Our receiver app code is as follow:
const context = cast.framework.CastReceiverContext.getInstance();
const playbackConfig = new cast.framework.PlaybackConfig();
playbackConfig.licenseUrl = 'http://widevine/yourLicenseServer';
playbackConfig.protectionSystem = cast.framework.ContentProtection.WIDEVINE;
playbackConfig.licenseRequestHandler = requestInfo => {
  requestInfo.withCredentials = true;
};
context.start({playbackConfig: playbackConfig});

// Update the playback config licenseUrl according to the value provided in the load request.
context.getPlayerManager().setMediaPlaybackInfoHandler((loadRequest, playbackConfig) => {
  if (loadRequest.media.customData && loadRequest.media.customData.licenseUrl) {
    playbackConfig.licenseUrl = loadRequest.media.customData.licenseUrl;
  }
  return playbackConfig;
});
I can't find the correct way to pass the custom data for DRM from the client application. Please help.
I think you are asking how to send the license URL in the custom data from the sender client (the device that is casting) to the receiver (the device which receives the cast request and actually fetches and plays the stream).
The custom data is a JSON object, and you just need to put the license URL into it.
There are two common ways of passing this custom data:
- include it in the MediaInfo object using the MediaInfo.Builder.setCustomData method
- include it in the MediaLoadOptions data
As an example, taking a MediaInfo example from the Google documentation and adding custom data:
List<MediaTrack> tracks = new ArrayList<>();
tracks.add(englishSubtitle);
tracks.add(frenchSubtitle);
tracks.add(frenchAudio);
MediaInfo mediaInfo = new MediaInfo.Builder(url)
    .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
    .setContentType(getContentType())
    .setMetadata(getMetadata())
    .setMediaTracks(tracks)
    .setCustomData(yourCustomData) // <-- This is the custom data
    .build();
'yourCustomData' above is a JSON object which you create and add your data to; in your case, your license server URL:
JSONObject yourCustomData = new JSONObject();
try {
  yourCustomData.put("licenseUrl", "https://yourLicenseServerUrl.com");
} catch (JSONException e) {
  // Add any error handling you want here
  e.printStackTrace();
}
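On the receiver side, the override logic in the setMediaPlaybackInfoHandler from the question reduces to a small pure function. Sketching it in plain JavaScript (the names and URLs are illustrative) makes the contract explicit: the sender must write the same customData key the receiver reads.

```javascript
// Illustrative sketch of the receiver-side override: take licenseUrl from
// the load request's customData when present, otherwise keep the default.
function resolveLicenseUrl(loadRequest, defaultUrl) {
  const customData = loadRequest.media && loadRequest.media.customData;
  return (customData && customData.licenseUrl) || defaultUrl;
}

// A sender that sets customData = { licenseUrl: ... } will override the default:
const loadRequest = {
  media: { customData: { licenseUrl: "https://yourLicenseServerUrl.com" } },
};
console.log(resolveLicenseUrl(loadRequest, "http://widevine/yourLicenseServer"));
```

Note that the key is case-sensitive: if the receiver reads `customData.licenseUrl`, the sender must not write `licenseURL`.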
For a JavaScript sender, follow this code:
new ChromecastJS().cast({
  content: "xxxxxx",
  contentType: "application/dash+xml",
  poster: "xxxxxxx",
  title: "xxxxx",
  description: "xxxxx",
  duration: "xxxx",
  volume: 0.5,
  muted: false,
  paused: false,
  time: "Xxxxx",
  customData: {
    drmHeaders: {
      customdata: "xxxxxx",
    },
    drmLaUrl: "xxxx",
    drmKsName: "com.widevine.alpha",
  },
});

Azure blob storage is giving "request was aborted" for almost all my PUT requests

I'm trying to use Azure Blob Storage to store blobs, but the majority of the time I get "request was aborted" responses back from the blob store. I'm using the Node.js v10 SDK in an Express API, and I suspect it has to do with how I'm configuring the service.
I'm using Azurite to mimic the Azure storage service.
In my app.js I run my function configureBlobStore() on startup:
export const configureBlobStore = async containerName => {
  const sharedKeyCredential = new SharedKeyCredential(
    process.env.AZURE_STORAGE_ACCOUNT_NAME,
    process.env.AZURE_STORAGE_ACCOUNT_ACCESS_KEY,
  );
  const pipeline = StorageURL.newPipeline(sharedKeyCredential);
  serviceUrl = new ServiceURL('http://blob:10000/devstoreaccount1', pipeline);
  if (!containerUrl) {
    containerUrl = ContainerURL.fromServiceURL(serviceUrl, containerName);
    try {
      await containerUrl.create(aborter);
    } catch (err) {
      console.log(err);
      console.log('Container already existed, skipping creation');
    }
  }
  return containerUrl;
};
and then, to save blobs, I run my function saveToBlobStore():
const saveToBlobStore = async file => {
  let { originalname: blobName } = file;
  const blockBlobUrl = BlockBlobURL.fromContainerURL(containerUrl, blobName);
  const uploadOptions = { bufferSize: 4 * 1024 * 1024, maxBuffers: 20 };
  const stream = intoStream(file.buffer);
  try {
    await uploadStreamToBlockBlob(
      aborter,
      stream,
      blockBlobUrl,
      uploadOptions.bufferSize,
      uploadOptions.maxBuffers,
    );
    return blockBlobUrl.url;
  } catch (err) {
    console.error('Error saving blob', err);
    return err;
  }
};
Sometimes it works (often after I take my containers down and run a volume prune), but usually only the first uploaded file succeeds and none after it. Does anyone know why this might be happening?
Found the problem: I created an Aborter once, cached it, and used it for all of my calls, but you need to create a new one before each call to the blob store.
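The v10 SDK's Aborter behaves like the standard AbortController in this respect: once it has been aborted (for example, because its timeout fired during the first upload), its signal stays aborted forever, so every later call that reuses it fails immediately with "request was aborted". A minimal sketch of the failure mode with Node's built-in AbortController (the Azure Aborter class itself is not reproduced here):

```javascript
// A shared, cached controller: after the first abort (or timeout),
// its signal is permanently in the aborted state.
const shared = new AbortController();
shared.abort(); // e.g. a timeout fired during the first upload
console.log(shared.signal.aborted); // true — every later call reusing it is aborted

// The fix: create a fresh controller per call; with the v10 Azure SDK,
// that means a fresh Aborter (e.g. Aborter.timeout(...)) for each request.
const fresh = new AbortController();
console.log(fresh.signal.aborted); // false
```

This is why only the first upload succeeded: once the cached Aborter expired, every subsequent request was born aborted.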

Parse Server - How to delete image file from the server using cloud code

How can I delete an image's file from the server using Parse Cloud Code? I am using back4app.com.
After deleting an Image row, I get the image's URLs, then call a function to delete each image by its URL:
Parse.Cloud.afterDelete("Image", function(request) {
  // Get the URLs
  var imageUrl = request.object.get("image").url();
  var thumbUrl = request.object.get("thumb").url();
  if (imageUrl != null) {
    // Delete
    deleteFile(imageUrl);
  }
  if (thumbUrl != null) {
    // Delete
    deleteFile(thumbUrl);
  }
});
Delete the image file from the server:
function deleteFile(url) {
  Parse.Cloud.httpRequest({
    url: url.substring(url.lastIndexOf("/") + 1),
    method: 'DELETE',
    headers: {
      'X-Parse-Application-Id': 'xxx',
      'X-Parse-Master-Key': 'xxx'
    }
  }).then(function(httpResponse) {
    console.log(httpResponse.text);
  }, function(httpResponse) {
    console.error('Request failed with response code ' + httpResponse.status);
  });
}
For security reasons, it is not possible to delete the image directly from Back4App using DELETE from the SDK or REST API. I believe you can follow the guide below:
https://help.back4app.com/hc/en-us/articles/360002327652-How-to-delete-files-completely-
After struggling with this for a while, it seems to be possible through a cloud function, as mentioned here. One needs to use the master key in the cloud code:
Parse.Cloud.define('deleteGalleryPicture', async (request) => {
  const { image_id } = request.params;
  const Gallery = Parse.Object.extend('Gallery');
  const query = new Parse.Query(Gallery);
  try {
    const image = await query.get(image_id);
    const picture = image.get('picture');
    await picture.destroy({ useMasterKey: true });
    await image.destroy();
    return 'Image removed.';
  } catch (error) {
    console.log(error);
    throw new Error('Error deleting image');
  }
});
For me it was confusing at first, since I could still open the link to the file even after deleting the referencing object in the dashboard, but then I found out that the dashboard does not call the Parse.Cloud.beforeDelete() trigger for some reason.
Trying to download the data from the URL after deleting the file through the cloud code function returns 0 kB of data, which confirms that the file was deleted.

Node Express sending image files as API response

I Googled this but couldn't find an answer, yet it must be a common problem. This is the same question as Node request (read image stream - pipe back to response), which is unanswered.
How do I send an image file as an Express .send() response? I need to map RESTful URLs to images, but how do I send the binary file with the right headers? E.g.,
<img src='/report/378334e22/e33423222' />
Calls...
app.get('/report/:chart_id/:user_id', function (req, res) {
  // Authenticate user_id, get the chart_id obfuscated URL
  // Send the image binary with correct headers
});
There is an API in Express for this: res.sendFile.
app.get('/report/:chart_id/:user_id', function (req, res) {
  res.sendFile(filepath); // filepath resolved from chart_id/user_id
});
http://expressjs.com/en/api.html#res.sendFile
A proper solution with streams and error handling is below:
const fs = require('fs')
const stream = require('stream')

app.get('/report/:chart_id/:user_id', (req, res) => {
  const r = fs.createReadStream('path to file') // or any other way to get a readable stream
  const ps = new stream.PassThrough() // <-- this is the trick for stream error handling
  stream.pipeline(
    r,
    ps, // <-- this is the trick for stream error handling
    (err) => {
      if (err) {
        console.log(err) // No such file or any other kind of error
        return res.sendStatus(400);
      }
    }
  )
  ps.pipe(res) // <-- this is the trick for stream error handling
})
With Node older than 10, you will need to use pump instead of stream.pipeline.
