Windows Phone: leave background music playing (windows-phone-7)

My Windows Phone app was rejected because playing a sound from the browser control in my app stops any background music that is already playing (rejection issue 6.5.1).
How do I change my application so that it does not stop background music?
My JavaScript is something like this:
var mySound = new Audio(soundpath);
mySound.play();
I could not find anything in Microsoft.Xna.Framework.Media that controls how the app handles browser sound, but maybe I missed something?

What you are looking for is the XNA SoundEffect API.
Here is what I use, and it works great for all my sound effects.
(You cannot use music with a SoundEffect object, as it will not pass certification. You have to use MediaElement for music.)
I can't remember exactly which ones you need, but I have all of these using directives:
using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Audio;
using Microsoft.Xna.Framework.Media;
and this function
private static SoundEffectInstance LoadSoundEffect(string fileName)
{
    // Load the .wav from the app package and hand back a playable instance.
    var stream = TitleContainer.OpenStream("Sounds/" + fileName);
    var effect = SoundEffect.FromStream(stream);
    FrameworkDispatcher.Update();
    return effect.CreateInstance();
}
If you use a SoundEffectInstance, you can start, pause, stop the sound effect, etc. without all the problems.
SoundEffect works as well, but I prefer the SoundEffectInstance.
Edit: if you use a SoundEffect, it will not affect the music being played.
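A minimal usage sketch, assuming the helper above and a file Sounds/click.wav (a placeholder name) added to the project with Build Action set to Content:
// "click.wav" is a placeholder; a SoundEffectInstance plays on top of any
// background music instead of stopping it.
SoundEffectInstance click = LoadSoundEffect("click.wav");
click.Play();
// ...and later, if needed:
click.Pause();
click.Stop();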

I'm assuming that you are building a native Windows Phone app (.XAP) and then using a browser control to display your app content. If that is the case, there are checks you need to perform when the app is first opened. Below is a description of how to resolve the issue.
You will need to add this piece of code to your app (I suggest you run this check every time your app loads/opens). Add the corresponding using directive for Microsoft.Xna.Framework.Media as well.
if (Microsoft.Xna.Framework.Media.MediaPlayer.State == MediaState.Playing)
{
    // Ask the user before interrupting their background music (requirement 6.5.1).
    MessageBoxResult Choice = MessageBox.Show(
        "Media is currently playing, do you want to stop it?",
        "Stop Player", MessageBoxButton.OKCancel);

    if (Choice == MessageBoxResult.OK)
        mediaElement1.Play(); // the user agreed, so it is OK to take over audio playback
}
Whichever way the user decides, you must also honour that choice in your JavaScript. You will fail certification again if the user chooses Cancel and you play sound anyway.
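One hedged way to honour the choice inside the browser control is to forward it to the page with WebBrowser.InvokeScript. Here webBrowser1 is assumed to be your WebBrowser control (with IsScriptEnabled set to true), and setSoundAllowed is a hypothetical function you would define in the hosted page's JavaScript:
// Pass the user's decision from the snippet above into the page;
// "setSoundAllowed" is a hypothetical JavaScript function in your page.
bool allowSound = (Choice == MessageBoxResult.OK);
webBrowser1.InvokeScript("setSoundAllowed", allowSound.ToString());
On the page side, setSoundAllowed would simply set a flag that your sound-playing code checks before calling play().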
There is more information on the topic here.

Related

How do I detect when an embedded YouTube video is playing?

UPDATE (July 2020): I was able to get a mostly working version of this. See my answer to the related question here.
In my Angular NativeScript iOS app, I have a webview set up to play YouTube videos. Following a method similar to this question here, I can get the YouTube video to load and play automatically, using the YouTube IFrame API. But how do I detect when the video actually starts playing (after it is loaded)?
The IFrame API has events for this purpose, like "onStateChange". But, because my YouTube code is "stuck" inside the webview, I currently have no way to tell when events are fired from that webview.
In NativeScript, there is a nativescript-webview-interface plugin for this purpose, but I can't get it to work. I have put my code below. If that is the way to go, what is the correct code to get it working?
(I don't want to use the nativescript-youtube plugin because that brings in YouTube's quotas, which are regularly shrinking.)
Code I Have Tried:
To get the YouTube player to activate, I have put all of the relevant YouTube code inside the webview. That works to play the video, but not yet to get the event of when the player starts playing. To do what I want, I either need some way of inserting that code into my app WITHOUT trapping it in the webview, or some way of communicating with the code inside the webview.
To try to communicate with the webview, I have tried the nativescript-webview-interface plugin:
$ tns plugin add nativescript-webview-interface
html:
<web-view src="{{youtubeCode}}" #webView ></web-view>
ts:
import { ElementRef, OnInit, ViewChild } from "@angular/core";
import { WebView, LoadEventData } from "tns-core-modules/ui/web-view";
let webViewInterfaceModule = require('nativescript-webview-interface');
...
export class ... implements OnInit {
    @ViewChild('webView') webView: ElementRef;
    private oWebViewInterface: any;
    public youtubeCode = [code that youtube provides in its IFrame API]

    ngOnInit(): void {
        this.setupWebViewInterface();
    }

    setupWebViewInterface() {
        let webView: WebView = this.webView.nativeElement;
        this.oWebViewInterface = new webViewInterfaceModule.WebViewInterface(webView, '~/www/index.html');
        this.youtubeListen();
    }

    youtubeListen() {
        // 'onStateChange' is the event provided by the YouTube IFrame API
        this.oWebViewInterface.on('onStateChange', (eventData) => {
            console.log('event data = ' + eventData); // ***this is the key part I want to work
        });
    }
RESULT: ERROR TypeError: undefined is not an object (evaluating 'this.webView.ios.constructor')
This is a problem in the plugin's index.ios.js file. If I comment out the offending line, the errors go away, but then nothing happens.
..

How can I test the background scan and launch the application in background with iBeacon-Android?

I am using the pro library, but I have only found documentation for the free library; I cannot find any documentation for the pro version.
Also, I don't know how to implement the background mode, even using the pro sample.
Here are the steps:
Build the pro sample project.
Start the iBeacon source (using an iPad) and confirm it can be detected.
Start the application, then press the home button to put it in the background.
Turn off the iBeacon source.
Turn on the iBeacon source again.
However, even after more than 5 minutes, the application does not launch.
So, can anyone verify the steps I did?
How can I test the background mode more easily?
Also, does the BootstrapNotifier only work the first time after the device reboots?
After that, even if I put the application in the background, will it not launch when it detects an iBeacon?
Your testing method sounds fine. I think the issue is that the reference app for the pro library only auto launches the app on the first detection after boot. After that, it sends a notification instead, and tapping on that notification launches the app.
This is purely for demonstration purposes. You can change it to auto launch on every detection if you wish. Simply alter the haveDetectedIBeaconsSinceBoot logic in this code:
@Override
public void didEnterRegion(Region arg0) {
    // In this example, this class sends a notification to the user whenever an iBeacon
    // matching a Region (defined above) is first seen.
    Log.d(TAG, "did enter region.");
    if (!haveDetectedIBeaconsSinceBoot) {
        Log.d(TAG, "auto launching MainActivity");
        // The very first time since boot that we detect an iBeacon, we launch the
        // MainActivity
        Intent intent = new Intent(this, MainActivity.class);
        intent.setFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
        // Important: make sure to add android:launchMode="singleInstance" in the manifest
        // to keep multiple copies of this activity from getting created if the user has
        // already manually launched the app.
        this.startActivity(intent);
        haveDetectedIBeaconsSinceBoot = true;
    } else {
        // If we have already seen iBeacons and launched the MainActivity before, we simply
        // send a notification to the user on subsequent detections.
        Log.d(TAG, "Sending notification.");
        sendNotification();
    }
}
The javadoc link was missing from the main documentation page when you posted this question. That is fixed now.

How to reuse BackgroundAudioPlayer after the Close method has been called

I’m using MediaElement for playing videos and BackgroundAudioPlayer for playing audio.
Here is a case.
I’m playing remote audio via BackgroundAudioPlayer.
Then I want to play a video, and before MediaElement begins playing it I call BackgroundAudioPlayer.Close, as suggested in the BackgroundAudioPlayer best practices:
MediaElement and the BackgroundAudioPlayer
Care must be taken when mixing BackgroundAudioPlayer and MediaElement for audio playback.
1. Close() must be called before switching to MediaElement playback.
2. There is only one media queue. Your application cannot pause background audio, play something with MediaElement then resume the background audio stream.
But after the video has played, I want to play audio again.
// Play audio result
BackgroundAudioPlayer.Instance.Track = new AudioTrack(
    new Uri(audioSearchResult.Url, UriKind.Absolute), audioSearchResult.Title,
    null, null, null, AudioPlayer.TrackStateBuffering, EnabledPlayerControls.All);
BackgroundAudioPlayer.Instance.Play();
I’m getting an InvalidOperationException on the first line of that code, saying “The background audio resources are no longer available”.
So how can I reuse BackgroundAudioPlayer in my app after using MediaElement?
EDIT:
If I use MediaPlayerLauncher instead of MediaElement, it works the second time audio is played, because the app is tombstoned when MediaPlayerLauncher launches. But is it possible to mix MediaElement and BackgroundAudioPlayer in one app?!
Seems to be another nightmare from MS :(
It looks like, to be able to continue using the background audio player after you have used the MediaElement to play video, you need to call BackgroundAudioPlayer.Instance.Close() again after the video ends and before using any other BackgroundAudioPlayer methods.
Your example should look like this:
// Play audio result
BackgroundAudioPlayer.Instance.Close();
BackgroundAudioPlayer.Instance.Track = new AudioTrack(
    new Uri(audioSearchResult.Url, UriKind.Absolute), audioSearchResult.Title,
    null, null, null, AudioPlayer.TrackStateBuffering, EnabledPlayerControls.All);
BackgroundAudioPlayer.Instance.Play();
You must call BackgroundAudioPlayer.Instance.Close() BEFORE you start playing the media element. I've tried this in both the WP7.1 and WP8 emulators with a simple background audio agent (not streaming). Without this call I consistently see InvalidOperationExceptions; with it, things behave much better.
For instance:
private void ButtonPlayMediaElement(object sender, RoutedEventArgs e)
{
    BackgroundAudioPlayer.Instance.Close();
    mediaElement.Source = new Uri("http://wpdevpodcast.episodes.s3.amazonaws.com/Episode_093_Were_All_Stickmen.mp3", UriKind.Absolute);
    mediaElement.Play();
}
Also:
You are adding a track from your UI; you should really do this in GetNextTrack in your background audio agent.
If you want to use both audio and video content in your application, do not mix MediaElement with BackgroundAudioPlayer!
Use MediaPlayerLauncher with BackgroundAudioPlayer, and of course do not forget to call BackgroundAudioPlayer.Instance.Close() before MediaPlayerLauncher.Show().
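A hedged sketch of that pattern (the video URL is a placeholder; MediaPlayerLauncher lives in Microsoft.Phone.Tasks):
// Release the background audio resources, then hand the video off to the
// built-in player; the app is deactivated while MediaPlayerLauncher runs.
BackgroundAudioPlayer.Instance.Close();

var launcher = new Microsoft.Phone.Tasks.MediaPlayerLauncher
{
    Media = new Uri("http://example.com/video.mp4", UriKind.Absolute),
    Controls = Microsoft.Phone.Tasks.MediaPlaybackControls.All
};
launcher.Show();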

Extracting metadata from a streaming audio file (MediaElement)

I just started developing for the Windows Phone (7.1/8) platform and am still not really familiar with it.
My plan is an internet radio app that streams audio from a server. I used a MediaElement and set its Source property to the streaming URL.
It works and the app starts playing the music, but I can't read any metadata about the song coming from the server, such as the artist name/title or any string I could use to identify the song itself.
I've been searching around and also tried the MediaReached event, but it never gets fired.
So, any idea what I should do?
And my code-behind sample:
protected override void OnNavigatedTo(NavigationEventArgs e)
{
    MyMedia.Source = new Uri("MyURL");
}
I hate to tell you this, but it is because MediaElement.Attributes is not supported on Windows Phone 7. All the limitations can be found here: http://msdn.microsoft.com/en-us/library/ff426928(VS.95).aspx
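Not part of the original answer, but one workaround sometimes used for SHOUTcast/Icecast stations is to request the stream yourself with the Icy-MetaData header and parse the in-band StreamTitle blocks. A rough sketch, assuming the station actually supports ICY metadata, with a placeholder URL (needs System.Net, System.IO and System.Text; exact header behaviour can vary with the phone's HTTP stack):
var request = (HttpWebRequest)WebRequest.Create("http://example.com/stream");
request.Headers["Icy-MetaData"] = "1"; // ask the server to inject metadata

request.BeginGetResponse(ar =>
{
    var response = (HttpWebResponse)request.EndGetResponse(ar);

    // "icy-metaint" says how many audio bytes come between metadata blocks.
    // Servers that do not support ICY metadata will not send this header.
    int metaInt = int.Parse(response.Headers["icy-metaint"]);

    using (var stream = response.GetResponseStream())
    {
        // Skip the first run of audio bytes.
        var audio = new byte[metaInt];
        int read = 0;
        while (read < metaInt)
            read += stream.Read(audio, read, metaInt - read);

        // One length byte, then length * 16 bytes of metadata text,
        // typically of the form: StreamTitle='Artist - Title';
        int metaLength = stream.ReadByte() * 16;
        var meta = new byte[metaLength];
        read = 0;
        while (read < metaLength)
            read += stream.Read(meta, read, metaLength - read);

        string text = Encoding.UTF8.GetString(meta, 0, metaLength);
        System.Diagnostics.Debug.WriteLine(text);
    }
}, null);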

How can I start QuickTime and have it start playing a URL?

I'm using MonoMac, but I understand Cocoa and Objective-C well enough, so if you can answer in those languages, please do.
I have a URL on my web server which returns an mp4. I'd like my MonoMac application to launch QuickTime and start playing that URL.
I tried these methods:
Process.Start("/Applications/QuickTime Player.app/Contents/MacOS/QuickTime Player", url);
but when the url is something like http://webhost/1/blah.mp4, QuickTime says "The document blah.mp4 could not be opened. The file doesn't exist." I know the file exists and everything is correct. If I use this method:
var cfurl = MonoMac.CoreFoundation.CFUrl.FromUrlString(url, null);
LSOpenCFURLRef(cfurl.Handle, (IntPtr)null);
The stream is opened in Safari and the QuickTime plugin starts playing it.
I've also tried NSWorkspace OpenUrls and OpenFile
NSWorkspace.SharedWorkspace.OpenUrls(
    new[] { NSUrl.FromString(url) },
    "com.apple.quicktimeplayer",
    NSWorkspaceLaunchOptions.Async,
    new NSAppleEventDescriptor(),
    new[] { "" });
but this launches in Safari.
NSWorkspace.SharedWorkspace.OpenFile(url, "QuickTimePlayer");
but this does nothing.
So I try NSTask
MonoMac.Foundation.NSTask.LaunchFromPath("/Applications/QuickTime Player.app/Contents/MacOS/QuickTime Player",
new string[] {url});
But this gives the same "... could not be found..." as my very first attempt above.
Finally, if I start QuickTime Player and use open URL and paste the url into the textbox and click Open, the stream plays without error.
How can my cocoa app send a URL to QuickTime Player?
Considering the URL is a remote URL, you can use Scripting Bridge in Cocoa applications to ask QuickTime Player to open a remote URL:
id qtApp = [SBApplication applicationWithBundleIdentifier:@"com.apple.QuickTimePlayerX"];
[qtApp activate];
if ([qtApp isRunning]) {
    // note that the parameter to openURL: must be the string representation of a URL
    [qtApp openURL:@"http://movies.apple.com/media/us/ipad/2011/tours/apple-ipad2-feature-us-20110302_r848-9cie.mov?width=848&height=480"];
}
You’ll need to link the Scripting Bridge framework to your application.
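If you would rather stay in MonoMac and avoid Scripting Bridge, the OpenUrls overload from the question may also work once the bundle identifier matches QuickTime Player X (com.apple.QuickTimePlayerX, as in the Scripting Bridge snippet above) rather than com.apple.quicktimeplayer. A hedged sketch:
// Same call as in the question, but targeting QuickTime Player X's bundle id.
NSWorkspace.SharedWorkspace.OpenUrls(
    new[] { NSUrl.FromString(url) },
    "com.apple.QuickTimePlayerX",
    NSWorkspaceLaunchOptions.Async,
    new NSAppleEventDescriptor(),
    new[] { "" });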
