SoundManager2 loads but does not play - playback

THE SETUP
I installed SoundManager2 on my website. Links to MP3s are generated automatically.
THE PROBLEM
Using the 360 UI, I can see the sound loading because the loading animation plays, like in the top left of this demo (http://www.schillmania.com/projects/soundmanager2/demo/360-player/), but the sound doesn't play and the seconds counter doesn't move.
HOW I LOAD THE PLAYER
<link rel="stylesheet" type="text/css" href="/css/360player.css" />
<link rel="stylesheet" type="text/css" href="/css/flashblock.css" />
<!-- special IE-only canvas fix -->
<!--[if IE]><script type="text/javascript" src="/js/excanvas.compiled.js"></script><![endif]-->
<!-- Apache-licensed animation library -->
<script type="text/javascript" src="/js/berniecode-animator.js"></script>
<script type="text/javascript" src="/js/soundmanager2.js"></script>
<script type="text/javascript" src="/js/360player.js"></script>
<script type="text/javascript">
soundManager.setup({
  // path to directory containing SM2 SWF
  url: '/swf/',
  debugMode: true
});
</script>
THE DEBUG CODE
-- SoundManager 2: HTML5 support tests (/^(probably|maybe)$/i): mp3: true (preferring flash), mp4: true (preferring flash), ogg: true, wav: true -- soundmanager2.js:1155
-- SoundManager 2 V2.97a.20121104 (AS3/Flash 9) + HTML5 audio, high performance mode, normal polling, wmode: transparent, flashBlock mode -- soundmanager2.js:1155
soundManager::createMovie(): Trying to load /swf/soundmanager2_flash9_debug.swf soundmanager2.js:1155
soundManager::initMovie(): Waiting for ExternalInterface call from Flash... soundmanager2.js:1157
soundManager::externalInterfaceOK() (~2 ms) soundmanager2.js:1157
soundManager::init() soundmanager2.js:1157
soundManager: Attempting JS to Flash call... soundmanager2.js:1157
(Flash): SM2 SWF V2.97a.20121104 (AS3/Flash 9) soundmanager2.js:1157
Flash security sandbox type: remote soundmanager2.js:1157
(Flash): JS to/from Flash OK soundmanager2.js:1157
(Flash): Enabling polling, 10 ms interval soundmanager2.js:1157
-- SoundManager 2 loaded (OK) -- soundmanager2.js:1155
soundManager: Firing 1 onready() item soundmanager2.js:1157
threeSixtyPlayer.init() soundmanager2.js:1157
threeSixtyPlayer.init(): Found 5 relevant items. soundmanager2.js:1157
handleClick() soundmanager2.js:1157
soundManager.createSound(): ui360Sound0 (http://clients.activemd.net/external/x.mp3) soundmanager2.js:1155
SMSound() merged options: {
id: ui360Sound0,
url: http://clients.activemd.net/external/x.mp3,
onplay: { pl.removeClass(this._360data.oUIBox,this._360data.className); t... },
onstop: { pl.removeClass(this._360data.oUIBox,this._360data.className); t... },
onpause: { pl.removeClass(this._360data.oUIBox,this._360data.className); t... },
onresume: { pl.removeClass(this._360data.oUIBox,this._360data.className); t... },
onfinish: { var nextLink; pl.removeClass(this._360data.oUIBox,this._360data... },
onbufferchange: { if (this.isBuffering) { pl.addClass(this._360data.oUIBox,pl.css... },
whileloading: { if (this.paused) { self.updatePlaying.apply(this); } }... },
whileplaying: { self.updatePlaying.apply(this); this._360data.fps++; }... },
useWaveformData: 0,
useEQData: 0,
usePeakData: 0,
autoLoad: false,
autoPlay: false,
loops: 1,
multiShot: true,
multiShotEvents: false,
pan: 0,
stream: true,
usePolicyFile: false,
volume: 100,
isMovieStar: false,
bufferTime: 3
} soundmanager2.js:1157
(Flash): SoundManager2_SMSound_AS3: Got duration: 0, autoPlay: false soundmanager2.js:1157
SMSound.play(): Attempting to load "ui360Sound0" soundmanager2.js:1155
SMSound.load(): http://clients.activemd.net/external/x.mp3 soundmanager2.js:1155
SMSound.play(): "ui360Sound0" is starting to play soundmanager2.js:1157
fanOut: ui360Sound0: http://clients.activemd.net/external/x.mp3 soundmanager2.js:1157
SMSound._onbufferchange(): 1 soundmanager2.js:1157
(Flash): start (ui360Sound0): 0 soundmanager2.js:1157
SMSound._onbufferchange(): 0 soundmanager2.js:1157
SMSound._onload(): "ui360Sound0" loaded.
THE WEIRDNESS
The links are clickable in Chrome's Developer Tools view, and the MP3 opens in a separate browser window and plays normally (Chrome's default behavior when you click an MP3 link).
ONE POSSIBLE THEORY
The sounds are being loaded, but never told "OK, go ahead and play yourself."
ANOTHER POSSIBLE THEORY
jQuery is somehow screwing with it.
Please help me determine why my sounds are loading but not playing.
Thanks.
Rick

I just had a similar issue with autoplay. It worked fine in Safari, but in Chrome, Firefox, and IE the sounds just traced that they were playing and nothing actually happened. I ultimately put in a setTimeout to call soundManager.play(id), but I'm not happy that I had to do that.
I notice other developers use click() events to trigger the song. I don't know whether the minor lag of routing it through a click event is what it needs to make it play, but setting the timeout to somewhere between 200 and 500+ milliseconds seemed to be the ticket. I also figured the buffering or loading event might be able to trigger a play when the autoplay fails. Calling soundManager.play(id) right after soundManager.createSound() does absolutely nothing on my end.
I added an onbufferchange event and checked the playState. If the sound wasn't playing, I told it to play. That got rid of the need for a setTimeout.
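For reference, here is a minimal sketch of that onbufferchange fallback; the sound id, URL, and the autoPlay setting are placeholders rather than anything from the original posts:
// Sketch of the onbufferchange fallback described above (id and URL are placeholders).
soundManager.onready(function() {
  soundManager.createSound({
    id: 'mySound',            // placeholder id
    url: '/audio/track.mp3',  // placeholder URL
    autoLoad: true,
    autoPlay: true,           // the autoplay that sometimes fails silently
    onbufferchange: function() {
      // If buffering has finished but the sound never actually started, start it.
      if (!this.isBuffering && this.playState === 0) {
        this.play();
      }
    }
  });
});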

Related

How to wait for an element to be visible?

Is it possible to wait until an element is visible?
cy.get('[data-test=submitIsVisible]').should('be.visible');
This should error if the submit button is not visible. I want to wait until the submit button is visible.
The primary use case is visual testing, i.e. taking a screenshot of the page.
You can wait for the element to be visible like so:
// Give this element 10 seconds to appear
cy.get('[data-test=submitIsVisible]', { timeout: 10000 }).should('be.visible');
According to Cypress's Documentation:
DOM based commands will automatically retry and wait for their corresponding elements to exist before failing.
Cypress offers you many robust ways to query the DOM, all wrapped with retry-and-timeout logic.
Another way to wait for an element’s presence in the DOM is through timeouts. Cypress commands have a default timeout of 4 seconds, however, most Cypress commands have customizable timeout options. Timeouts can be configured globally or on a per-command basis. Check the customizable timeout options list here.
In some cases, your DOM element will not be actionable. Cypress gives you a powerful {force:true} option you can pass to most action commands.
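For instance, reusing the selector from the question (illustrative only):
// Force the click even if Cypress considers the element not actionable
cy.get('[data-test=submitIsVisible]').click({ force: true });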
Caveat:
As Anthony Cregan pointed out, the .should('be.visible') assertion checks whether an element is visible on the page, not necessarily in the viewport. This means that this assertion will return true even if the element is not within the visible area of the screen when the test is run.
Further recommended readings:
Retry-ability.
Interacting with elements.
Updated for Cypress v12
If you want to see exactly how Cypress waits for something to become visible, follow this example.
Using this code, you can check out how the delay and the timeout can affect the passing or failing of the .should('be.visible') assertion.
Steps
Add a simple page to a VSCode project containing Cypress v12.1.0
Call it index.html
<html>
  <body>
    <h2>API fetched data</h2>
    <span>will become visible here</span>
  </body>
  <script>
    fetch('https://jsonplaceholder.typicode.com/posts/1')
      .then(response => response.json())
      .then(data => document.querySelector('span').innerText = data.title)
  </script>
</html>
Right-click index.html and choose "Open with Live Server" to activate the page.
Add this test to see how Cypress waits for the API data
describe('test the waiting of API data', () => {
  const timings = [
    { delay: 0, timeout: 4000 },     // default, passes
    { delay: 2000, timeout: 4000 },  // passes
    { delay: 4000, timeout: 4000 },  // flaky
    { delay: 5000, timeout: 4000 },  // fails
    { delay: 5000, timeout: 10000 }, // passes
  ]
  timings.forEach(timing => {
    const { delay, timeout } = timing;
    it(`delayed API by ${delay} ms, command timeout is ${timeout} ms`, () => {
      cy.intercept('https://jsonplaceholder.typicode.com/posts/1', (req) => {
        req.continue((res) => res.setDelay(delay))
      })
      cy.visit('http://127.0.0.1:5500/index.html')
      cy.contains('sunt aut facere', { timeout })
        .should('be.visible')
    })
  })
})
Result
This shows that the longer the delay in receiving the data, the bigger the timeout needed on the visibility assertion.
You can also do it by adding the setting below to your cypress.config.js file:
e2e: {
  defaultCommandTimeout: 25000,
}
Set defaultCommandTimeout according to your requirements.
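For context, a complete cypress.config.js along these lines might look like the sketch below (assuming Cypress 10+ with defineConfig):
// cypress.config.js (sketch)
const { defineConfig } = require('cypress');

module.exports = defineConfig({
  e2e: {
    // Raises the timeout used by most commands, including the implicit
    // retries behind .should('be.visible')
    defaultCommandTimeout: 25000,
  },
});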
Try element.should('have.length.greaterThan', 0).and('be.visible')

OpenSeadragon background request's Accept header

In a Linked Data context, I am pointing my OpenSeadragon client to a resource that responds with a 303 redirection. If OSD's request has an "Accept: image/*" header (with virtually anything for '*'), the redirection points to a JPEG and everything is fine. This is how it works in Google Chrome.
However, in Firefox the request somehow seems to have an "Accept: */*" header. At least that's how I interpret the output of Firefox's developer tools' network panel. This ultimately leads the resource to redirect to an HTML page, which, of course, OSD cannot render as an image.
How could this happen?
The way I embed OSD in my webpage is like this:
var Imageviewer = OpenSeadragon({
  debugMode: false,
  id: "contentDiv",
  prefixUrl: "resources/img/SeaDragonImages/",
  showNavigator: true,
  autoHideControls: false,
  autoResize: true,
  springStiffness: 10,
  preserveViewport: true,
  // ajaxHeaders: {'Accept: image/jpeg'},
  tileSources: null // sources are (re-)loaded when pageNumbers are clicked
});
$(function(){
  $(".pageNo").click(function(event){
    event.preventDefault();
    Imageviewer.open({
      type: 'legacy-image-pyramid',
      levels: [{ url: $(this).attr('href'), <span data-template="app:scaleImg"/> }]
    });
  });
});
Note that if I enable the ajaxHeaders: {'Accept: image/jpeg'} line that is presently commented out, I do get an Accept header, but it contains all sorts of text/html, application/xhtml and the like.
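For reference, ajaxHeaders is normally passed as an object of header name/value pairs, and it only applies when tiles are loaded via AJAX. A sketch, assuming an OpenSeadragon version that supports loadTilesWithAjax (the header value here is illustrative):
var Imageviewer = OpenSeadragon({
  id: "contentDiv",
  prefixUrl: "resources/img/SeaDragonImages/",
  loadTilesWithAjax: true,                  // tiles must be fetched via AJAX for headers to apply
  ajaxHeaders: { 'Accept': 'image/jpeg' },  // key/value pair, not a single 'Accept: image/jpeg' string
  tileSources: null
});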

Flowplayer Wowza video not playing

I am very new to Flowplayer and am using Wowza in order to have secure streaming. Below is the code I am using, but the video is not playing at all. I am pretty sure all files are loading properly, without any 404 or 403 errors.
Here is the code:
<html>
  <head>
    <title>Wow! This is video</title>
    <script src="js/flowplayer-3.1.4.min.js"></script>
  </head>
  <body>
    <a href="videos/MyVideo.mp4"
       style="display:block;width:425px;height:300px;"
       id="wowza" class="player">
      <!-- splash image inside the container -->
      <img src="./flow_eye.jpg"
           alt="Search engine friendly content" /></a>
    <script language="JavaScript">
      flowplayer("wowza", "swf/flowplayer-3.1.5.swf", {
        log: { level: 'debug', filter: 'org.flowplayer.rtmp.,org.flowplayer.securestreaming.' },
        clip: {
          url: 'mp4:videos/MyVideo.mp4',
          // use RTMP streaming
          provider: 'rtmp',
          // with a secured connection
          connectionProvider: 'secure'
        },
        plugins: {
          // set up the RTMP streaming plugin
          rtmp: {
            url: "swf/flowplayer.rtmp-3.2.13.swf",
            // The net connection URL with HDDN looks like this
            netConnectionUrl: 'rtmpte://d.securevod.flowplayervod.netdna-cdn.com:1935/securevod.flowplayervod'
          },
          // set up the secure streaming plugin
          secure: {
            url: "swf/flowplayer.securestreaming-3.2.9.swf",
            // the token value (shared secret).
            token: 'bky9p52t'
          }
        }
      });
    </script>
  </body>
</html>
Please test it from your end and tell me what still needs to be added to the code above.
You are using the wrong script.
From what I see, you are trying to set up a stream using RTMP (Real Time Messaging Protocol), but you just want to play a video.
In the case of Flowplayer, you don't need to reinvent the wheel.
Just use any sample script, such as "Embed Videos in your Web Pages with Flowplayer".
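For comparison, a plain progressive-download embed without the RTMP and secure-streaming plugins can be as small as the sketch below; the file names and SWF version are placeholders:
<a href="videos/MyVideo.mp4"
   style="display:block;width:425px;height:300px;"
   id="player"></a>
<script>
  // Minimal Flowplayer 3 embed: container id plus player SWF, no extra plugins.
  flowplayer("player", "swf/flowplayer-3.2.18.swf");
</script>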

Fineuploader taking too long to upload files in IE9

Hi, I am using Fine Uploader version 3.3.0.
I am facing a problem with Fine Uploader in IE9, since Fine Uploader does not support sizeLimit in IE9.
I am checking the file size on the server side with a simple content-length check: if (this.Request.Files[0].ContentLength > 5242880).
But it takes 1-2 minutes to get this response, and even a 1.4 MB file takes too long to upload.
Can someone please let me know what is causing this? The following is the Fine Uploader code I am using:
$('#restricted-fine-uploader').fineUploader({
  request: {
    endpoint: '/apm/api/job/UploadDocument/?category=' + JobDocuments.category + '&mode=' + JobDocuments.forceupload + '&jobid=' + job_manager_details.jobId
  },
  autoUpload: true,
  text: {
    uploadButton: 'Upload File'
  },
  multiple: false,
  validation: {
    allowedExtensions: ['doc', 'docx', 'xls', 'xlsx', 'pdf'],
    sizeLimit: 5242880,
    itemLimit: 1
  },
  showMessage: function (message) {
    // Using Twitter Bootstrap's classes and jQuery selector and method
    $('#restricted-fine-uploader').append('<div class="alert alert-error">' + message + '</div>');
  }
}).bind('submit', function (event, id, fileName) {
  $('#displaymessage').hide();
  $('li.qq-upload-fail').hide();
  job_manager_details.isuploading = 1;
  // fileCount++;
}).bind('complete', function (event, id, fileName, responseJSON) {
  $('li.qq-upload-fail').hide();
  $('#displaymessage').hide();
  job_manager_details.isuploading = 0;
  if (responseJSON.success) {
    // fileCount--;
    ShowJobDocuments();
    // if (fileCount == 0 && !$('div.alert-error').html()) {
    $('#jobDocumentDialog').dialog("close");
    // }
  }
})
I just had the same issue and found one more clue.
The VM (WinXP/IE8) was incredibly slow while its network was NAT'd, but it became very fast as soon as it was switched to bridged networking.
The speed of the upload should not be influenced by Fine Uploader in any noticeable way. All Fine Uploader does for non File API browsers, such as IE9 and older, is submit a <form> containing the file and related parameters. If you are noticing slow upload times, most likely something in your environment is the cause of the issue. You haven't provided any additional information about your environment, so I can't offer any advice on that front.
As you may already know, file size checking is not possible client-side in IE9 and earlier due to lack of File API support.
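If you need to detect that at runtime, a simple feature check (a sketch, not part of Fine Uploader's API) shows whether the File API is available:
// IE9 and older lack the File API, so file.size is not available client-side.
var supportsFileApi = !!(window.File && window.FileReader && window.FileList);
if (!supportsFileApi) {
  // Rely on the server-side ContentLength check described above.
}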

Loading local content through XHR in a Chrome packaged app

I'm trying to load a web app that I've built using Backbone, and it pulls in JSON and HTML template files that are stored locally. With Chrome packaged apps, is it possible to load these files using some sort of GET/AJAX request?
Currently I'm getting this...
OPTIONS chrome-extension://fibpcbellfjkmapljkjdlpgencmekhco/templates/templates.html Cannot make any requests from null. jquery.min.js:2
XMLHttpRequest cannot load chrome-extension://fibpcbellfjkmapljkjdlpgencmekhco/templates/templates.html. Cannot make any requests from null.
I can't find any real information on how to do this, so any help would be great. Thanks!
Yes, it's totally possible, and it's easy. Here's a working sample. Try starting with this, confirm that it works, and then add back in your own code. If you hit a roadblock and come up with a more specific question than whether XHRs work in packaged apps, you might want to ask a new question.
manifest.json:
{
  "name": "SO 15977151 for EggCup",
  "description": "Demonstrates local XHR",
  "manifest_version" : 2,
  "version" : "0.1",
  "app" : {
    "background" : {
      "scripts" : ["background.js"]
    }
  },
  "permissions" : []
}
background.js:
chrome.app.runtime.onLaunched.addListener(function() {
  chrome.app.window.create("window.html",
    { bounds: { width: 600, height: 400 }});
});
window.html:
<html>
  <body>
    <div>The content is "<span id="content"></span>"</div>
    <script src="main.js"></script>
  </body>
</html>
main.js:
function requestListener() {
  document.querySelector("#content").innerHTML = this.responseText;
}

onload = function() {
  var request = new XMLHttpRequest();
  request.onload = requestListener;
  request.open("GET", "content.txt", true);
  request.send();
};
content.txt:
Hello, world!
You are making a request from a sandboxed page, and sandboxed pages have a null origin.
I have posted this issue question on the Google Group.
Unless Chrome decides to change the sandbox policy, it appears the only workaround is to make XHR requests from a non-sandboxed page and use Chrome's message passing API to hand the result to your sandboxed page.
I don't know why it has to be so difficult.
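In outline, that workaround might look something like the sketch below; the iframe id and message shape are made up for illustration:
// In the non-sandboxed page: perform the XHR, then hand the result
// to the sandboxed iframe via postMessage.
var iframe = document.getElementById('sandboxed-frame'); // hypothetical id
var request = new XMLHttpRequest();
request.onload = function() {
  iframe.contentWindow.postMessage({ templates: this.responseText }, '*');
};
request.open('GET', 'templates/templates.html', true);
request.send();

// In the sandboxed page: receive the data instead of fetching it directly.
window.addEventListener('message', function(event) {
  // event.data.templates now holds the HTML fetched by the parent page.
});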
EDIT:
The answer from the Chrome Team was to change the CORS header to *.
I believe your problem is on the server side, rather than the client side. The server needs to send the following header for jQuery to deal with the response:
Access-Control-Allow-Origin: *
The problem with this, however, is that any page can now load that content. Once you know the ID of your extension, you can change that header to something like:
Access-Control-Allow-Origin: chrome-extension://gmelhokjkebpmoejhcelmnopijabmobf/
A short test of something like the following showed these to work:
<h1>Content Below</h1>
<div id="loadme"></div>
<script src="jquery-1.9.1.min.js"></script>
<script src="app.js"></script>
// app.js
$(document).ready(function() {
  $.get('http://localhost:8080/content.php', function(data) {
    $('#loadme').html(data);
  });
});
This would fail with the following message if I didn't add the Access-Control-Allow-Origin header:
XMLHttpRequest cannot load http://localhost:8080/newhope/deleteme.php.
Origin chrome-extension://gmelhokjkebpmoejhcelgkfeijabmobf is not allowed by
Access-Control-Allow-Origin.
Once I added the Access-Control-Allow-Origin header on the php response, it worked fine.
Again, setting this to * may be a security risk as any browser page anywhere is allowed to load it inline.
