I am tracking events in Google Analytics, but I have a question about the audio tag.
I can track when the audio is played, but I don't know how to track how long it has been played.
Did the user listen to all of it? If not, how far did they get?
Currently I use onclick="ga('send', 'event', 'Audio', 'play', 'Lecture');"
But what would I use to know how long the audio has been listened to?
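I guess I would need to hook the audio element's own events and report a value, along these lines (untested; the element id "lecture" and the "progress" action are just placeholders), but I'm not sure this is the right approach:

var audio = document.getElementById('lecture'); // placeholder id

audio.addEventListener('play', function () {
  ga('send', 'event', 'Audio', 'play', 'Lecture');
});

// Report how far the listener got, as a whole-number percentage,
// whenever playback pauses or finishes.
function reportProgress() {
  if (!audio.duration) { return; } // duration not known yet
  var percent = Math.round((audio.currentTime / audio.duration) * 100);
  ga('send', 'event', 'Audio', 'progress', 'Lecture', percent);
}
audio.addEventListener('pause', reportProgress);
audio.addEventListener('ended', reportProgress);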
Thanks for your help!
Related
In a video.js player, I want to display information about the currently playing video as a videojs-overlay whenever the user is active (moving the mouse over the video) and hide the information when the user is inactive (not moving the mouse over the video).
I set videojs-overlay to listen to the useractive and userinactive events like this:
player.overlay({
  content: 'Default overlay content',
  debug: true,
  overlays: [{
    content: 'The user is active!',
    start: 'useractive',
    end: 'userinactive'
  }]
});
Unfortunately, the overlay is not triggered at first; it only starts working after the video has been playing for about a minute.
Is there a problem with my setup, or might this be a bug in videojs or videojs-overlay? What can I do to debug this?
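So far, the only debugging I have tried is logging the activity events directly, to see when they actually fire (the 'my-video' id is a placeholder for my player):

var player = videojs('my-video'); // placeholder id

['useractive', 'userinactive'].forEach(function (name) {
  player.on(name, function () {
    console.log(name + ' at ' + player.currentTime().toFixed(1) + 's');
  });
});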
Video.js already keeps track of the user active state using CSS classes. An example of this can be found in the videojs-dock plugin. It uses the vjs-user-inactive and vjs-user-active CSS classes to control showing or hiding a dock or tray over the video, which can be used to display information such as a title or description for the video. You may be able to use this as inspiration for your overlay.
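As a rough, untested sketch of that approach (the my-info-overlay class and the element's contents are placeholders; the vjs-user-active and vjs-user-inactive classes are added by Video.js itself):

var player = videojs('my-video'); // placeholder id

// Add your own element inside the player...
var info = document.createElement('div');
info.className = 'my-info-overlay';
info.textContent = 'Title and description go here';
player.el().appendChild(info);

// ...and let the classes Video.js toggles on the player drive its visibility.
var style = document.createElement('style');
style.textContent =
  '.vjs-user-active   .my-info-overlay { opacity: 1; transition: opacity 0.3s; }\n' +
  '.vjs-user-inactive .my-info-overlay { opacity: 0; transition: opacity 0.3s; }';
document.head.appendChild(style);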
Please let me know if you have any additional questions.
Disclaimer: I am employed by Brightcove.
If you look at SoundCloud, you'll notice that when you play a song it plays in the main content of the page as well as in a 'footer' player. I'm trying to achieve something similar with jPlayer or SoundManager. The main content of my pages is loaded via AJAX while the footer stays consistent, to support a continuous player across the site.
My question is: how do you play music from the persistent footer player while animating the main content player and having seek functions on both?
With jPlayer, I assume you would need to listen for $.jPlayer.event.play, $.jPlayer.event.pause, $.jPlayer.event.ended and $.jPlayer.event.seeked in your main player and update the secondary interface accordingly.
You may need to create the secondary interface yourself (play/pause buttons, seek bar, etc.) and add event handlers for its controls that pass the corresponding commands to the main player.
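Something along these lines, for example (untested; #main-player, #footer-play and #footer-seek are placeholder selectors for your own markup):

var $main = $('#main-player');

// Mirror play/pause/ended state on the footer play button.
$main.bind(
  $.jPlayer.event.play + ' ' +
  $.jPlayer.event.pause + ' ' +
  $.jPlayer.event.ended,
  function (event) {
    $('#footer-play').toggleClass('playing', event.type === $.jPlayer.event.play);
  }
);

// Keep the footer seek bar in sync while the track plays.
$main.bind($.jPlayer.event.timeupdate, function (event) {
  $('#footer-seek').val(event.jPlayer.status.currentPercentAbsolute);
});

// The footer controls simply drive the one real player.
$('#footer-play').click(function () {
  $main.jPlayer($(this).hasClass('playing') ? 'pause' : 'play');
});
$('#footer-seek').change(function () {
  $main.jPlayer('playHead', parseFloat(this.value)); // percentage from 0 to 100
});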
We are developing a music player app for OS X Lion (10.7) that applies different audio effects to a selected music file.
We have used the Audio Unit and AUGraph APIs to achieve this.
However, after connecting all the audio unit nodes, when we call AUGraphStart(mGraph) the graph takes around one second to invoke the first I/O callback.
Because of this there is a slight delay at the beginning of playback.
How can we avoid this delay? Could anyone provide any input to help us solve this issue?
One solution is to start the audio graph running before displaying any UI that the user could use to start playback. Since the audio units will then be running, you could fill any audio output buffers with silence before the appropriate UI event. If the buffers are small/short, the latency from any UI event till an output buffer is filled may be small enough to be below normal human perception.
I am using MediaElement for video playback in my app. I have added controls for play, pause, rewind and forward. In the forward button's event handler, I am trying to skip the video clip forward by 5 seconds. The code I have used to do that is given below.
if (myMediaElement.CanSeek)
{
    myMediaElement.Position = TimeSpan.FromSeconds(2);
    myMediaElement.Play();
}
But the video clip does not skip forward; instead, playback stops. Can anyone please tell me what is going wrong?
You need to start playing the stream before you can set the position.
Move the CanSeek check and the setting of the position to after the MediaOpened event has been raised.
See the remarks in MSDN http://msdn.microsoft.com/en-us/library/system.windows.controls.mediaelement.position(v=VS.95).aspx for confirmation.
You can use the value converter sample here, with Slider adjustments, to get the positions:
http://diggthedrazen.com/2011/07/08/using-an-ivalueconverter-to-create-a-player-with-a-seek-bar-on-windows-phone/
I'm muting the volume in QTKit like this:
if ([muteButton state] == NSOnState) {
    [mMovieVolumeSlider setFloatValue:0.1];
    [testMovie setVolume:0.1];
}
The problem is that the volume attenuation is sudden and abrupt. How can I implement a fade effect for the volume attenuation?
Also, my app plays .pls audio stream files, which I have embedded in the bundle. When selecting a stream within the app, a short delay is common before the stream begins to play. I want to display some sort of status message ("Buffering" or "Connecting") during this short delay prior to connecting. When the stream begins, the status message would end. Any ideas on how to approach this?
Thanks for the help.
-paul.
I'll just outline my answers to your two questions as suggestions:
1. What you want to accomplish sounds like a good fit for NSAnimation (either through subclassing or delegation; animation:valueForProgress: will probably be your friend here).
2. Open the QTMovie asynchronously and listen for the QTMovieLoadStateDidChangeNotification.
HTH
Daniel