I am currently using the vimeo-threejs-player library to show a Vimeo video in my WebGL app.
The code is basically:
vimeoPlayer = new Vimeo.Player(vimeo_video_id, { autoplay: false, loop: false });
vimeoPlayer.load();
vimeoPlayer.on('videoLoad', function(videoTexture) {
  // code to set the 3D object's texture from the video
});
This works fine with regular uploaded video files, but now I am trying to visualize a Vimeo live stream with the same method, and the library throws the following error when vimeoPlayer.load() is called:
vimeo-threejs-player.min.js?v=15:10 Uncaught (in promise) Error: TypeError: Cannot read property 'link' of undefined
at vimeo-threejs-player.min.js?v=15:10
And it never gets to the videoLoad callback function.
It seems that the /vimeo/api service is returning an unexpected result for this library.
Once the live event is over and it is stored as a regular video, it works fine, just like any other uploaded video file.
On the other hand, vimeo-depth-player states that vimeo-threejs-player is used to play a live stream video (the demo is not currently working), and the code in that example is essentially the same as mine.
Do you have any idea why it might be failing?
Related
I'm developing a 3D streaming radio using Three.js. I send music to my clients over a PeerConnection and attach a THREE.AudioAnalyser() to display 3D bars that move according to the frequencies.
Sound works great on all platforms, but THREE.AudioAnalyser() with a stream input source only works on Chrome; on Safari it doesn't work at all :frowning:
var listener = new THREE.AudioListener();
var audio = new THREE.Audio( listener );
audio.setMediaStreamSource( stream );
audioAnalyser = new THREE.AudioAnalyser( audio, 128 );

function loop() {
  // should log a typed array of frequency bins
  console.log( audioAnalyser.getFrequencyData() );
  requestAnimationFrame( loop );
}
loop();
The console.log() in the loop() function should print an array of integers: on Chrome all is good, but Safari logs [0,0,0,0,0,0,0,0].
What could be causing this issue? It seems to work everywhere but Safari, and it only seems to fail when the source is a stream.
Not 100% sure, but you might want to connect the output of the AnalyserNode to the destination node. You may want to stick a GainNode with a gain of 0 in between, just in case you don't really want the audio from the AnalyserNode to be played out.
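If it helps, a rough sketch of that wiring (untested; audioAnalyser.analyser is the underlying Web Audio AnalyserNode and listener.context the shared AudioContext, going by the three.js source):

var ctx = listener.context;
var silence = ctx.createGain();
silence.gain.value = 0; // mute the tap so the stream isn't played twice
audioAnalyser.analyser.connect( silence ); // analyser -> muted gain
silence.connect( ctx.destination ); // keep the graph pulled by the destination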
In a video.js player, I want to display information about the currently playing video as a videojs-overlay whenever the user is active (moving the mouse over the video) and hide it when the user is inactive (not moving the mouse over the video).
I set videojs-overlay to listen to the useractive and userinactive events like this:
player.overlay({
  content: 'Default overlay content',
  debug: true,
  overlays: [{
    content: 'The user is active!',
    start: 'useractive',
    end: 'userinactive'
  }]
});
Unfortunately, the overlay is not triggered at first; it only starts working after the video has been playing for about a minute.
Is there a problem with my setup, or might this be a bug in videojs or videojs-overlay? What can I do to debug this?
Video.JS already keeps track of the user active state using CSS classes. An example of this can be found in the videojs-dock plugin. It uses the vjs-user-inactive and vjs-user-active CSS classes to control showing or hiding a dock or tray over the video that can be used to display information such as a title or description for the video. You may be able to use this as inspiration for your overlay.
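If you want to sanity-check the events themselves, Video.JS also fires matching useractive and userinactive events on the player, so a quick sketch like this (with a hypothetical #video-info element of your own) can confirm they fire when you expect:

// Sketch: drive a hypothetical overlay element from the same user-activity
// events that toggle the vjs-user-active / vjs-user-inactive classes.
var info = document.getElementById('video-info');
player.on('useractive', function() {
  info.style.opacity = '1';
});
player.on('userinactive', function() {
  info.style.opacity = '0';
});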
Please let me know if you have any additional questions.
Disclaimer: I am employed by Brightcove.
The following is how I load a video (in the actual code, the variables are member variables of the player class). I do not want the video to play right away, which is why I use prepareMedia(). When the application is ready to play the video, I call player.play().
However, my player view (I add an EmbeddedMediaPlayerComponent to a JPanel that is set as the ContentPane of a JFrame) still shows the old video after running the following code with a new "videoPath" value. The player view shows the new video only after I call player.play().
EmbeddedMediaPlayerComponent mediaPlayerComponent = new EmbeddedMediaPlayerComponent();
MediaPlayer player = mediaPlayerComponent.getMediaPlayer();
player.prepareMedia(videoPath);
Is there any way I can get the player to show the new video image (or at least remove the old one) without starting playback? I tried calling methods such as repaint() on mediaPlayerComponent and stop() on player, in the overridden MediaPlayerEventAdapter methods such as mediaFreed(), but nothing I have tried so far works.
It is a feature of VLC/LibVLC that the final frame of the video is displayed when the video ends, so you have to find a workaround.
A good solution is to use a CardLayout with two views, one for the media player component (or the Canvas used for the video surface) and another view simply with a blank (black) JPanel.
The idea then is to listen for the video starting/stopping/finishing and show the appropriate view in your card layout.
If you add a MediaPlayerEventListener and implement the playing, stopped, finished and error events you should cover all cases.
For example: in the "playing" event you switch your card layout to show the video view, in the "stopped", "finished" and "error" events you switch your card layout to show the blank view.
The view doesn't have to be black of course, you can do whatever you want like show an image.
Also note that the media player events will NOT be delivered on the Swing Event Dispatch Thread, so you will need to use SwingUtilities#invokeLater to properly switch your view.
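A rough sketch of that approach against the vlcj 3.x API (mediaPlayerComponent and player are the objects from your question; the "VIDEO" and "BLANK" card names are arbitrary):

import java.awt.CardLayout;
import javax.swing.JPanel;
import javax.swing.SwingUtilities;
import uk.co.caprica.vlcj.player.MediaPlayer;
import uk.co.caprica.vlcj.player.MediaPlayerEventAdapter;

// Two cards: one for the video, one blank.
final CardLayout cards = new CardLayout();
final JPanel root = new JPanel(cards);
root.add(mediaPlayerComponent, "VIDEO");
root.add(new JPanel(), "BLANK"); // style it, or show an image instead

player.addMediaPlayerEventListener(new MediaPlayerEventAdapter() {
    @Override
    public void playing(MediaPlayer mediaPlayer) {
        showCard("VIDEO");
    }

    @Override
    public void stopped(MediaPlayer mediaPlayer) {
        showCard("BLANK");
    }

    @Override
    public void finished(MediaPlayer mediaPlayer) {
        showCard("BLANK");
    }

    @Override
    public void error(MediaPlayer mediaPlayer) {
        showCard("BLANK");
    }

    // Events arrive on a native thread, so hop onto the Swing EDT
    // before touching any components.
    private void showCard(final String name) {
        SwingUtilities.invokeLater(new Runnable() {
            @Override
            public void run() {
                cards.show(root, name);
            }
        });
    }
});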
I'm having trouble converting my app from watchOS 1 to watchOS 2. I'm programmatically creating a sequence of images on the iPhone for the Watch to play.
I put them in a zip file (using SSZipArchive) on iOS and use transferFile() to send it over to the watch, where I unzip it in the Watch Extension delegate into the container shared between the Watch Extension and the Watch App, so that the Watch App can play the sequence later:
func session(session: WCSession, didReceiveFile file: WCSessionFile) {
    imagesURL = NSFileManager.defaultManager().containerURLForSecurityApplicationGroupIdentifier("group.com.xxxx.images")
    SSZipArchive.unzipFileAtPath(file.fileURL.path, toDestination: imagesURL!.path)
}
I've checked that the shared group is set up correctly, and I can see the image files in the shared directory (imagesURL!.path).
But when I get ready to play the sequence with:
image.setImageNamed("myImages") // myImages0.png, myImages1.png, myImages2.png, etc.
I get the error: Unable to find image named "myImages" on watch
Am I putting the images in the right place?
Am I referring to them correctly in setImageNamed?
Am I missing something else?
The correct answer is to use the animatedImageNamed:duration: method of UIImage if you have a series of image files already created, then set that animation using the setImage: method.
So the correction to my original code is a simple one-line change:
image.setImage( UIImage.animatedImageNamed("myImages", duration: 3) ) // myImages0.png, myImages1.png, myImages2.png, etc.
(assuming the duration is 3 seconds)
Why setImageNamed: fails
According to the WKInterfaceImage documentation, setImageNamed loads the image from the watch app bundle.
Whenever possible, place image resources in an asset catalog in your Watch app bundle (not in your WatchKit extension’s bundle). Placing them in the Watch app bundle lets you use the setImageNamed: method to load the animated image at runtime, which simplifies the loading process.
To load an animated image sequence from images in your Watch app bundle, you must name your image resources appropriately and use the setImageNamed: method of this class.
This is why setImageNamed: can't find the images, as they are not a static resource bundled with the watch app.
How to load a dynamic animated image
Since your animation images are dynamic, this is handled by the watch app extension.
For animations you generate dynamically, use the animatedImageWithImages:duration: method of UIImage to assemble your animation in your WatchKit extension, and then set that animation using the setImage: method.
You need to first use animatedImageWithImages:duration: to assemble the dynamic animated image you transferred, then set that animation using setImage:
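Building the images array from the unzipped frames might look something like this (a sketch; imagesURL and the myImagesN.png naming come from the question):

// Collect the unzipped frames from the shared container until one is missing.
var images = [UIImage]()
var index = 0
while true {
    let framePath = imagesURL!.URLByAppendingPathComponent("myImages\(index).png").path!
    guard let frame = UIImage(contentsOfFile: framePath) else { break }
    images.append(frame)
    index += 1
}

Then assemble and set the animation: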
let animatedImages = UIImage.animatedImageWithImages(images, duration: 3)
image.setImage(animatedImages)
Update:
As Scotty pointed out, the better solution is to use animatedImageNamed:duration: to avoid needing to create the array of images!
I use SWFObject 2.2 to play sounds for an AJAX-based game I made. I used to use SWFObject 1 and everything worked fine, but when I updated my game, I upgraded to 2.2. Now, when users try to listen to music on YouTube or Pandora in another tab in Firefox while playing the game, they can't unless they have that tab selected.
What is interesting is that the video doesn't stop playing; just the sound stops working. I run the following JavaScript to stop the sound effect in my Flash file, and it seems to stop the sound on YouTube or Pandora at the exact same time:
$('myflashid').doStop();
The following is the actionscript used for my flash file:
import flash.external.ExternalInterface;
snd=new Sound();
snd.attachSound("MySound1");
ExternalInterface.addCallback( "doPlay", this, doPlay );
ExternalInterface.addCallback( "doStop", this, doStop );
function doPlay() {
snd.start();
}
function doStop() {
snd.stop();
}
I am not sure why this fixes it, but if I set the volume to 0 instead of calling snd.stop(), and set it back to 100 when I start the sound again, everything seems to work fine.
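In code, the workaround looks something like this (a sketch of what's described above; setVolume() is the standard ActionScript 2 Sound method):

// Mute instead of stopping, so the sound channel stays open and
// audio in other tabs isn't cut off.
function doPlay() {
    snd.setVolume(100); // restore volume before (re)starting
    snd.start();
}

function doStop() {
    snd.setVolume(0); // mute instead of snd.stop()
}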