Having spent the last couple of days reading the manuals, watching videos, and reading through the forum, I can honestly say I have no idea what I'm doing.
My basic requirement is to display a video on a web page so that it works in all major browsers.
Easy in Chrome, not so easy in anything else it seems.
For example, Firefox will say 'Error loading player: No playable sources available', or, if I add more sources, it simply prompts me to select a video player on my local machine and proceeds to download the file.
Here is where I'm at right now; of course, it does not work.
jwplayer("playerOMLNJWgiRjbu").setup({
  "sources": [
    { "file": "rtmp://video.newnrg.com:1935/vod/mp4:sample.mp4" },
    { "file": "http://video.newnrg.com:1935/vod/mp4:sample.mp4/manifest.mpd" },
    { "file": "http://video.newnrg.com:1935/vod/mp4:sample.mp4/playlist.m3u8" },
    { "file": "http://video.newnrg.com:1935/vod/mp4:sample.mp4/manifest.f4m" },
    { "file": "rtsp://video.newnrg.com:1935/vod/sample.mp4" }
  ],
  "rtmp": {
    "bufferlength": 5
  },
  "primary": "flash",
  "modes": [
    { "type": "flash", "src": "/app/src/jwplayer/jwplayer.flash.swf" }
  ]
});
HTTP tests:
rtmp://video.newnrg.com:1935/vod/mp4:sample.mp4 (200)
http://video.newnrg.com:1935/vod/mp4:sample.mp4/manifest.mpd (200)
http://video.newnrg.com:1935/vod/mp4.../playlist.m3u8 (200)
http://video.newnrg.com:1935/vod/mp4...4/manifest.f4m (200)
rtsp://video.newnrg.com:1935/vod/sample.mp4 (200)
In Firefox, when rtsp://video.newnrg.com:1935/vod/sample.mp4 is used, it asks for a locally installed player to play the video file, yet pasting this URL into the browser address bar works.
I've looked at the Wowza homepage to try to work out how this library is meant to be used.
Wowza config:
(on) MPEG-DASH
(on) Apple HLS
(on) Adobe RTMP
(on) Adobe HDS
(on) Microsoft Smooth Streaming
(on) RTSP/RTP
From my perspective I've certainly struggled to get to this point, and I have the following questions.
Which is the preferred client-side player with a decent JavaScript API?
I like the JWPlayer API, so can it be used? And what is best practice, in easy-to-follow steps 1, 2, 3?
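For what it's worth, one commonly suggested approach is to list an HLS source first (it has the broadest HTML5 support) and keep RTMP only as a Flash fallback. The helper below just sorts candidate URLs by such a preference; the ranking itself is my assumption for illustration, not official JW Player guidance:

```javascript
// Order candidate stream URLs by a preferred-protocol ranking.
// The ranking (HLS > DASH > RTMP > HDS > RTSP) is an assumption
// for illustration, not official JW Player guidance.
function orderSources(urls) {
  const rank = (url) => {
    if (url.endsWith(".m3u8")) return 0;      // HLS
    if (url.endsWith(".mpd")) return 1;       // MPEG-DASH
    if (url.startsWith("rtmp://")) return 2;  // RTMP (Flash)
    if (url.endsWith(".f4m")) return 3;       // HDS
    return 4;                                 // RTSP and anything else
  };
  return [...urls].sort((a, b) => rank(a) - rank(b));
}

const ordered = orderSources([
  "rtmp://video.newnrg.com:1935/vod/mp4:sample.mp4",
  "http://video.newnrg.com:1935/vod/mp4:sample.mp4/manifest.f4m",
  "http://video.newnrg.com:1935/vod/mp4:sample.mp4/playlist.m3u8"
]);
console.log(ordered[0]); // the HLS playlist comes first
```

The ordered URLs can then be mapped into the `sources` array of the setup call.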
Related
Hey, I wonder if someone can point me in the right direction.
I am building a Mac audio app that needs to perform the following actions:
Select an audio input device
Show a live audio graph of the device input (think GarageBand/Logic)
Capture the audio for replay
Output the sound, or make the app selectable as an input device for another app, similar to how plugins work for GarageBand, Logic, etc.
I have looked into using the following so far:
AudioKit
AVFoundation / AVCaptureDevice
AudioKit
This framework looks amazing and has sample apps for most of the things I want to do. However, it seems it will only accept the audio input that the Mac has selected in System Settings. This is a non-starter for me, as I want the user to be able to choose the device in-app (like GarageBand or Neural DSP plugins do).
Using AudioEngine I can get the available input devices, but everything I have found suggests they can't be changed in-app. Here's the code to display them:
struct InputDevicePicker: View {
    @State var device: Device
    var engine: AudioEngine

    var body: some View {
        Picker("Input: \(device.deviceID)", selection: $device) {
            ForEach(getDevices(), id: \.self) {
                Text($0.name)
            }
        }
        .pickerStyle(MenuPickerStyle())
        .onChange(of: device, perform: setInputDevice)
    }

    func getDevices() -> [Device] {
        AudioEngine.inputDevices.compactMap { $0 }
    }

    func setInputDevice(to device: Device) {
        // set the input device on the AudioEngine
    }
}
Alternatively
AVFoundation
This has a nice API for listing devices and setting the input, but when it comes to working with the data the delegate provides, I don't have the first clue how I would handle it in terms of creating an audio graph and saving the data for replay. Here's the delegate method for reference:
extension Recorder: AVCaptureAudioDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        print("Audio data received")
        // Needs to save audio here
        // Needs to play through speakers or another audio output
        // Needs to show audio graph
    }
}
It would be great if someone with experience with either framework could advise whether this is possible, and where I can look for examples and guidance.
Any help or pointers would be appreciated.
Thanks if you got this far!
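For what it's worth, the device-selection part of the AVFoundation route can be sketched as below. This is only a sketch assuming macOS and an AVCaptureSession pipeline; it does not address the graphing or replay parts:

```swift
import AVFoundation

// Enumerate the audio input devices available on the Mac, so the user
// can pick one in-app instead of relying on the system default.
func audioInputDevices() -> [AVCaptureDevice] {
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.builtInMicrophone, .externalUnknown],
        mediaType: .audio,
        position: .unspecified)
    return discovery.devices
}

// Build a capture session around the chosen device; sample buffers then
// arrive via AVCaptureAudioDataOutputSampleBufferDelegate as above.
func makeCaptureSession(for device: AVCaptureDevice) throws -> AVCaptureSession {
    let session = AVCaptureSession()
    let input = try AVCaptureDeviceInput(device: device)
    if session.canAddInput(input) {
        session.addInput(input)
    }
    return session
}
```

Feeding the chosen device into a session like this is what makes the picker meaningful, since the delegate callbacks then carry that device's audio.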
In a video.js player, I want to display information about the currently playing video in a videojs-overlay whenever the user is active (moving the mouse over the video) and hide the information when the user is inactive (not moving the mouse over the video).
I set videojs-overlay to listen to the useractive and userinactive events like this:
player.overlay({
content: 'Default overlay content',
debug: true,
overlays: [{
content: 'The user is active!',
start: 'useractive',
end: 'userinactive'
}]
});
Unfortunately, the overlay is not triggered at first; it only starts working after the video has been playing for about a minute.
Is there a problem with my setup, or might this be a bug in videojs or videojs-overlay? What can I do to debug this?
Video.JS already keeps track of the user active state using CSS classes. An example of this can be found in the videojs-dock plugin. It uses the vjs-user-inactive and vjs-user-active CSS classes to control showing or hiding a dock or tray over the video that can be used to display information such as a title or description for the video. You may be able to use this as inspiration for your overlay.
Please let me know if you have any additional questions.
Disclaimer: I am employed by Brightcove.
I am trying to create a simple app in pebble.js using the cloud IDE
I have my resources loaded (47 of them), each with a colour and a black-and-white image, all following the correct naming conventions. Here is a snippet from the appinfo.json:
{
"file": "images/34_Unlockables.png",
"name": "IMAGES_34_UNLOCKABLES_PNG",
"type": "png"
},
{
"file": "images/45_Leaderboards.png",
"name": "IMAGES_45_LEADERBOARDS_PNG",
"type": "png"
}
When I reference the image by its name, the app pulls in the wrong image entirely. The same happens if I try the path as well:
card.banner('images/45_Leaderboards.png');
card.banner('IMAGES_45_LEADERBOARDS_PNG');
both result in the wrong image (and it is always black and white).
Has anyone else hit a similar issue?
I retrieve photos from the home feed with this Graph API request: https://graph.facebook.com/me?fields=home.filter(photos).
Then, via the object ID, I retrieve the images array, which gives me different sizes, like this:
"images": [
  {
    "height": 780,
    "width": 1240,
    "source": "https://fbcdn-sphotos-g-a.akamaihd.net/hphotos-ak-prn2/281028_10151592859979158_562775226_o.jpg"
  },
  {
    "height": 81,
    "width": 130,
    "source": "https://fbcdn-photos-g-a.akamaihd.net/hphotos-ak-prn2/s75x225/969715_10151592859979158_562775226_s.jpg"
  }
]
Is it possible to retrieve the originally posted image?
The image retrieved has to have the same checksum as the image posted.
There is no guarantee the checksum will be the same, so don't rely on that. Why? Because if you upload a .gif or .png image, Facebook converts it to a JPEG. So even though you upload a 1920x1280 .png image, what you get back is a modified 1920x1280 .jpeg image. I have no idea whether Facebook's database keeps the original image, but it's not something you can access through the Facebook API.
I don't think you can get the exact same image (matching checksum) that you uploaded to their servers. I would think that Facebook converts uploads into certain formats so that the photo experience is consistent across the whole site.
That said, they could be keeping a copy of the original photo that was uploaded. But as far as getting access to it via the API or any other way, my best guess is looking at 'source', according to the Facebook documentation on the Photo object:
The source image of the photo - currently this can have a maximum width or height of 720px, increasing to 960px on 1st March 2012
You can fetch it via the API by:
/<photo id>/?fields=source
Sorry it's not an exact answer, but I hope it helps.
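Since the original file itself isn't retrievable, a practical fallback is to pick the largest rendition present in the images array. A small helper sketch (the URLs below are placeholders, not real CDN paths):

```javascript
// Pick the largest available rendition from a Graph API "images" array.
// Returns null when the array is missing or empty.
function largestImage(images) {
  if (!images || images.length === 0) return null;
  return images.reduce((best, img) =>
    img.width * img.height > best.width * best.height ? img : best
  );
}

const images = [
  { height: 780, width: 1240, source: "https://example.com/photo_o.jpg" },
  { height: 81,  width: 130,  source: "https://example.com/photo_s.jpg" }
];
console.log(largestImage(images).source); // the 1240x780 rendition's URL
```

This at least guarantees you always end up with the biggest version Facebook exposes, even if it isn't byte-identical to the upload.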
I use SWFObject 2.2 to play sounds for an AJAX-based game I have made. I used to use SWFObject 1, and everything worked fine, but when I updated my game, I updated to 2.2. Now, when users try to listen to music on YouTube or Pandora in another tab in Firefox while playing the game, they can't unless they have that tab selected.
What is interesting is that the video doesn't stop playing; just the sound stops working. I run the following JavaScript to stop the sound effect in my Flash file, and it seems to stop the sound on YouTube or Pandora at the exact same time:
$('myflashid').doStop();
The following is the actionscript used for my flash file:
import flash.external.ExternalInterface;

snd = new Sound();
snd.attachSound("MySound1");

ExternalInterface.addCallback("doPlay", this, doPlay);
ExternalInterface.addCallback("doStop", this, doStop);

function doPlay() {
    snd.start();
}

function doStop() {
    snd.stop();
}
I am not sure why this fixes it, but if I set the volume to 0 instead of calling snd.stop(), and then set the volume back to 100 when I start it again, it seems to work fine.
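In the same ActionScript 2 style as above, that workaround amounts to muting rather than stopping. A sketch (Sound.setVolume is the standard AS2 call):

```actionscript
// Mute instead of stopping, so audio playing in other tabs is unaffected.
function doStop() {
    snd.setVolume(0);   // silence our sound without tearing playback down
}

function doPlay() {
    snd.setVolume(100); // restore volume before starting
    snd.start();
}
```

One plausible explanation is that stopping the Sound touches the shared audio channel that the plugin process uses for every tab, while changing the volume only affects this movie's own mix.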