Chromecast receiver using sendLoadComplete() doesn't update status - chromecast

I'm writing a Chromecast receiver to play different kinds of content (including embedded Flash videos). I'd like to use my own JS library to create the player canvas rather than rely on the HTML video element.
I'm currently blocked because I can't get media to load using a custom behaviour:
Receiver:
Nothing fancy in the HTML; I just load my library into the #mediaWrapper div, then create a MediaManager from it.
var node = $( "#mediaWrapper" )[0];
var phiEngine = new phi.media.Player( node );
window.mediaManager = new cast.receiver.MediaManager( phiEngine );
window.castReceiverManager = cast.receiver.CastReceiverManager.getInstance();

/* Override Load method */
window.mediaManager['origOnLoad'] = window.mediaManager.onLoad;
window.mediaManager.onLoad = function (event) {
    console.log('### Application Load ', event);
    /* Custom code (load lib, set metadata, create canvas ...) */
    window.mediaManager.sendLoadComplete(); // Doesn't seem to do anything
    // window.mediaManager['origOnLoad'](event);
    // -> Fails with 'Load metadata error' since the url is not a video stream
    // -> ex: youtube url
};

/* Will never be called */
window.mediaManager['origOnMetadataLoaded'] = window.mediaManager.onMetadataLoaded;
window.mediaManager.onMetadataLoaded = function (event) {
    ...
};
Sender:
I use my own Android application to cast to the device. I can't use the Companion library because this will be a Titanium module.
private void createMediaPlayer() {
    // Create a Remote Media Player
    mRemoteMediaPlayer = new RemoteMediaPlayer();
    mRemoteMediaPlayer.setOnStatusUpdatedListener(
            new RemoteMediaPlayer.OnStatusUpdatedListener() {
                @Override
                public void onStatusUpdated() {
                    Log.e(TAG, "onStatusUpdated");
                }
            });
    mRemoteMediaPlayer.setOnMetadataUpdatedListener(
            new RemoteMediaPlayer.OnMetadataUpdatedListener() {
                @Override
                public void onMetadataUpdated() {
                    Log.e(TAG, "onMetadataUpdated");
                }
            });
    try {
        Cast.CastApi.setMessageReceivedCallbacks(mApiClient,
                mRemoteMediaPlayer.getNamespace(), mRemoteMediaPlayer);
    } catch (IOException e) {
        Log.e(TAG, "Exception while creating media channel", e);
    }
    mRemoteMediaPlayer
            .requestStatus(mApiClient)
            .setResultCallback(
                    new ResultCallback<RemoteMediaPlayer.MediaChannelResult>() {
                        @Override
                        public void onResult(MediaChannelResult result) {
                            Log.e(TAG, "Request status: " + result.toString());
                            if (!result.getStatus().isSuccess()) {
                                Log.e(TAG, "Failed to request status.");
                            }
                        }
                    });
}
private void loadMedia(MediaInfo mediaInfo, Boolean autoplay) {
    try {
        mRemoteMediaPlayer.load(mApiClient, mediaInfo, autoplay)
                .setResultCallback(new ResultCallback<RemoteMediaPlayer.MediaChannelResult>() {
                    @Override
                    public void onResult(MediaChannelResult result) {
                        Log.e(TAG, "loadMedia ResultCallback reached");
                        if (result.getStatus().isSuccess()) {
                            Log.e(TAG, "Media loaded successfully");
                        } else {
                            Log.e(TAG, "Error loading Media : " + result.getStatus().getStatusCode());
                        }
                    }
                });
    } catch (Exception e) {
        Log.e(TAG, "Problem opening media during loading", e);
    }
}
Expected behaviour:
I basically call createMediaPlayer() once, then call loadMedia(...). The first call to loadMedia shows nothing in the log: neither success nor failure. Subsequent calls fail with errorCode 4.
I do get the load event on the receiver side. But, back on the sender side, I can't manage to end the load phase and get a media session created.
I was expecting sendLoadComplete() to do so, but I might be wrong. How can I get the media status to update and the loadMedia ResultCallback to be reached?
My goal is to use RemoteMediaPlayer.play(), pause(), ... but for now I'm stuck with 'No current media session' because the media isn't loaded yet.
Also, I'd really like to be able to log any message the sender receives, before it is processed. Is that possible?
I hope I did not forget any information. Thanks for your help!
Edit: I solved this by using a custom message channel, since it seems I can't use RemoteMediaPlayer the way I want to.
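For future readers, here is a minimal sketch of what such a custom message channel can look like on the sender side, written against the same GoogleApiClient-based Cast API as the code above. The namespace string and the JSON payload are placeholders for illustration; the receiver would have to register a message bus on the same namespace and interpret the payload itself.

// Hypothetical custom namespace; it must start with "urn:x-cast:" and match the receiver side.
private static final String CUSTOM_NAMESPACE = "urn:x-cast:com.example.phi";

private void setupCustomChannel() throws IOException {
    // Log every raw message arriving on this namespace before acting on it.
    Cast.CastApi.setMessageReceivedCallbacks(mApiClient, CUSTOM_NAMESPACE,
            new Cast.MessageReceivedCallback() {
                @Override
                public void onMessageReceived(CastDevice castDevice, String namespace, String message) {
                    Log.d(TAG, "Message on " + namespace + ": " + message);
                }
            });
}

private void sendCustomLoad(String url) {
    // Illustrative payload; the receiver decides how to handle the "load" command.
    String payload = "{\"command\":\"load\",\"url\":\"" + url + "\"}";
    Cast.CastApi.sendMessage(mApiClient, CUSTOM_NAMESPACE, payload)
            .setResultCallback(new ResultCallback<Status>() {
                @Override
                public void onResult(Status status) {
                    if (!status.isSuccess()) {
                        Log.e(TAG, "Sending custom message failed: " + status.getStatusCode());
                    }
                }
            });
}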

I believe the error code 4 you are receiving is bogus; see https://plus.google.com/u/0/+JimRenkel2014/posts/aY5RP7X3QhA . As noted there, I created a Chromecast issue for this (https://code.google.com/p/google-cast-sdk/issues/detail?id=305&thanks=305&ts=1403833532). Additional support for this issue will help it get fixed faster! :-)

.sendLoadComplete(true)
Passing true as the boolean argument helped me receive the loaded event on the sender. It might help you as well.

Related

AEM 6.3 Set up PageEvent Handler/Listener

I'm currently working on setting up an event handler for page creations and deletions in AEM, to then call one of our vendors' APIs.
I've been basing my work on a module we already have that listens to replication events.
So far I was able to reproduce that behavior in my module and trigger the code upon replications. However, I only need the calls to the API on page publications and deletions.
I've been trying to find out how to differentiate between replications, page deletions and activations.
So far, it seems that AEM handles CRX replications and page publications as the same type of event (type = ACTIVATION).
If I delete a page, it does set the type to "DELETE", so I can work with that to call the API, but for page publications I'm lost since, as I mentioned, AEM seems to handle CRX replications and page publications as the same type.
After some research, I found the PageEvent API and tried to set up a Page Event Listener, but it is not getting triggered upon publications or deletions of pages, so I'm not sure whether what I'm trying to do is possible, or whether my component is located in the wrong part of the project to listen for Page Events.
Thanks beforehand.
The code below works fine to detect the page deletion event:
@Component(
        service = {
                EventHandler.class,
                JobConsumer.class
        },
        immediate = true,
        configurationPolicy = ConfigurationPolicy.OPTIONAL,
        property = {
                "event.topics=" + PageEvent.EVENT_TOPIC,
                JobConsumer.PROPERTY_TOPICS + "=" + "aem/custom/event"
        }
)
public class CustomEventHandler implements EventHandler, JobConsumer {

    private static final Logger logger = LoggerFactory.getLogger(CustomEventHandler.class);

    // Sling JobManager used below to offload the work to an async job
    @Reference
    private JobManager jobManager;

    @Override
    public void handleEvent(Event event) {
        PageEvent pageEvent = PageEvent.fromEvent(event);
        Map<String, Object> properties = new HashMap<>();
        properties.put("pageEvent", pageEvent);
        jobManager.addJob("aem/custom/event", properties);
    }

    @Override
    public JobResult process(Job job) {
        PageEvent pageEvent = (PageEvent) job.getProperty("pageEvent");
        try {
            if (pageEvent != null && pageEvent.isLocal()) {
                Iterator<PageModification> modificationsIterator = pageEvent.getModifications();
                while (modificationsIterator.hasNext()) {
                    PageModification modification = modificationsIterator.next();
                    if (PageModification.ModificationType.DELETED.equals(modification.getType())) {
                        // Your logic
                    }
                }
            }
        } catch (Exception e) {
            logger.error("Error : ", e);
        }
        return JobResult.OK;
    }
}
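The deletion case is covered above, but the question also asks about publications. One option for that (an untested sketch, not part of the answer above) is to keep listening to replication events and filter on the action type; the topic constant and the types below come from the standard com.day.cq.replication API. Note that this fires for any replication, so if only pages matter you still need to check the path (for example, resolve it and verify it is a cq:Page).

@Component(
        service = EventHandler.class,
        immediate = true,
        property = {
                EventConstants.EVENT_TOPIC + "=" + ReplicationAction.EVENT_TOPIC
        }
)
public class PagePublicationHandler implements EventHandler {

    private static final Logger logger = LoggerFactory.getLogger(PagePublicationHandler.class);

    @Override
    public void handleEvent(Event event) {
        ReplicationAction action = ReplicationAction.fromEvent(event);
        if (action == null) {
            return;
        }
        if (ReplicationActionType.ACTIVATE.equals(action.getType())) {
            // Content was published (page publications arrive here too)
            logger.info("Activated: {}", action.getPath());
            // call the vendor API here, after checking the path is a page
        } else if (ReplicationActionType.DELETE.equals(action.getType())) {
            // Content was deleted via replication
            logger.info("Deleted: {}", action.getPath());
        }
    }
}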

Rendering the Google Recaptcha in Android Studio 3

I am using Android Studio 3.
I am following this article to learn how to use Google reCAPTCHA in Android Studio.
Installed the package using this: implementation 'com.google.android.gms:play-services-safetynet:12.0.1'
API keys are also registered.
I saw there is an onClick event handler, but where is the rendering of the reCAPTCHA mentioned?
Update 1
When I wrote the button click code as mentioned in the link, I got a compilation error: inconvertible types; cannot cast anonymous android.view.View.OnClickListener to java.util.concurrent.Executor
Code, as asked for in the comments:
btn_Login.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(final View view) {
        SafetyNet.getClient(this).verifyWithRecaptcha("")
                .addOnSuccessListener((Executor) this,
                        new OnSuccessListener<SafetyNetApi.RecaptchaTokenResponse>() {
                            @Override
                            public void onSuccess(SafetyNetApi.RecaptchaTokenResponse response) {
                                // Indicates communication with reCAPTCHA service was
                                // successful.
                                String userResponseToken = response.getTokenResult();
                                if (!userResponseToken.isEmpty()) {
                                    // Validate the user response token using the
                                    // reCAPTCHA siteverify API.
                                }
                            }
                        })
                .addOnFailureListener((Executor) this, new OnFailureListener() {
                    @Override
                    public void onFailure(@NonNull Exception e) {
                        if (e instanceof ApiException) {
                            // An error occurred when communicating with the
                            // reCAPTCHA service. Refer to the status code to
                            // handle the error appropriately.
                            ApiException apiException = (ApiException) e;
                            int statusCode = apiException.getStatusCode();
                        } else {
                        }
                    }
                });
    }
});
I used the code below and everything works fine now.
Make sure to implement Executor in the activity.
btn_Login.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(final View view) {
        SafetyNet.getClient(MyActivity.this).verifyWithRecaptcha("")
                .addOnSuccessListener((Activity) MyActivity.this,
                        new OnSuccessListener<SafetyNetApi.RecaptchaTokenResponse>() {
                            @Override
                            public void onSuccess(SafetyNetApi.RecaptchaTokenResponse response) {
                                // Indicates communication with reCAPTCHA service was
                                // successful.
                                String userResponseToken = response.getTokenResult();
                                if (!userResponseToken.isEmpty()) {
                                    // Validate the user response token using the
                                    // reCAPTCHA siteverify API.
                                }
                            }
                        })
                .addOnFailureListener((Activity) MyActivity.this, new OnFailureListener() {
                    @Override
                    public void onFailure(@NonNull Exception e) {
                        if (e instanceof ApiException) {
                            // An error occurred when communicating with the
                            // reCAPTCHA service. Refer to the status code to
                            // handle the error appropriately.
                            ApiException apiException = (ApiException) e;
                            int statusCode = apiException.getStatusCode();
                        } else {
                        }
                    }
                });
    }
});
According to the article, in your button click handler you must call SafetyNet.getClient(this).verifyWithRecaptcha(...) to show the reCAPTCHA and handle success or error. By passing this, you give the SDK a handle to your current view, which should be shown again after the reCAPTCHA is solved. Most probably the rendering is done by the SDK itself, given that it's part of the OS, and most probably it will be full-screen in a separate top-level view, blocking access to your app until the riddle is solved.
You should try to implement it in your app as described in the article and see how it goes. Then you can ask a more specific question.
EDIT: You combined two techniques in your code: copy-pasting the code from Google and wrapping it in an anonymous class. So the problem you asked about in the comment is that (Executor) this in the addOnSuccessListener call now refers not to your Activity (as it did in the original tutorial) but to the anonymous new View.OnClickListener() instance that you created. You can refer to this answer to see how it can be implemented without interfering with the already complex reCAPTCHA code.
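Since the accepted fix relies on "implement Executor in the activity" without showing what that looks like, here is a minimal sketch, assuming a hypothetical activity named MyActivity that posts listener callbacks to the main thread (the name and base class are placeholders, not from the tutorial):

// Sketch only: MyActivity is a placeholder name.
public class MyActivity extends AppCompatActivity implements Executor {

    private final Handler mainHandler = new Handler(Looper.getMainLooper());

    // java.util.concurrent.Executor implementation: run Task callbacks on the main thread.
    @Override
    public void execute(Runnable command) {
        mainHandler.post(command);
    }

    // ... button click handler with SafetyNet.getClient(...).verifyWithRecaptcha(...) as above ...
}

With that in place, (Executor) MyActivity.this is a valid cast even from inside the click listener; the original error only occurs because this inside the anonymous OnClickListener refers to the listener, not the activity.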

How to start video during audio call in sinch

Using the Sinch SDK:
1) I have made a video call. I have a button in my GUI. I want to turn off video by clicking the button, so the call behaves like an audio call.
I am starting the video call as:
Call call = getSinchServiceInterface().callUserVideo("user1");
String callId = call.getCallId();
Intent callScreen = new Intent(this, RunningVideoCallActivity.class);
callScreen.putExtra(SinchService.CALL_ID, callId);
startActivity(callScreen);
2) I have made an audio call. I have a button in my GUI. I want to start video by clicking the button, so the call behaves like a video call.
I am starting the audio call as:
Call call = getSinchServiceInterface().callUser("user1");
String callId = call.getCallId();
Intent callScreen = new Intent(this, RunningAudioCallActivity.class);
callScreen.putExtra(SinchService.CALL_ID, callId);
startActivity(callScreen);
3) How do I mute a call in Sinch?
4) How do I hold a call in Sinch? Please help.
You can't start video in an audio call. What you can do is always start a video call and pause the video at the beginning to make it look like an audio call. We don't have hold functionality.
To mute, use mute and unmute on the audio controller.
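For reference, a minimal sketch of a mute toggle, assuming getSinchServiceInterface() exposes the Sinch AudioController the same way as in the answer below (mute() and unmute() are the AudioController calls; mMuted is just an illustrative flag, not part of the SDK):

// Illustrative mute toggle.
private boolean mMuted = false;

private void toggleMute() {
    if (getSinchServiceInterface() == null || getSinchServiceInterface().getAudioController() == null) {
        return;
    }
    if (mMuted) {
        getSinchServiceInterface().getAudioController().unmute();
    } else {
        getSinchServiceInterface().getAudioController().mute();
    }
    mMuted = !mMuted;
}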
Hope this comes in useful for future readers.
You really can't switch between callUserVideo and callUser during an ongoing call.
But there is an alternate way to achieve the functionality. This is what the Sinch support team says:
you can pause video and do voice only, so all calls are video and you can pause / resume the video track
So what this means is: you have to always start the call with callUserVideo if you want to toggle between audio and video.
So, for toggling between audio and video, you need to do something like the following in the page where you are handling the incoming call client.
// function to be called when you want to toggle to video call
private void resumeVideo() {
    if (mVideoCall != null) {
        mVideoCall.resumeVideo();
    }
}

// enable speaker
// add remote and local video views
private void resumeVideoCallback() {
    mAudioCallToggle.setText("Switch to AudioCall");
    if (getSinchServiceInterface() != null && getSinchServiceInterface().getAudioController() != null) {
        getSinchServiceInterface().getAudioController().enableSpeaker();
    }
    addLocalView();
    addRemoteView();
}

// function to be called when you want to toggle to audio call
private void pauseVideo() {
    if (mVideoCall != null) {
        mVideoCall.pauseVideo();
    }
}

// disable speaker
// remove remote and local video views
private void pauseVideoCallback() {
    mAudioCallToggle.setText("Switch to VideoCall");
    if (getSinchServiceInterface() != null && getSinchServiceInterface().getAudioController() != null) {
        getSinchServiceInterface().getAudioController().disableSpeaker();
    }
    removeVideoViews();
}
And in your video call listener, implement it like this:
// ... other implementations ...
@Override
public void onVideoTrackAdded(Call call) {
    Log.d(TAG, "Video track added");
    addRemoteView();
}

@Override
public void onVideoTrackPaused(Call call) {
    pauseVideoCallback();
}

@Override
public void onVideoTrackResumed(Call call) {
    resumeVideoCallback();
}
And finally, to toggle between audio and video, do something like this:
new OnClickListener() {
    @Override
    public void onClick(View v) {
        if (mAudioCallToggle.getTag().equals("Audio")) {
            mAudioCallToggle.setTag("Video");
            pauseVideo();
        } else {
            mAudioCallToggle.setTag("Audio");
            resumeVideo();
        }
    }
}

Not able to receive notification on android device using Parse.com push notification service

I am not able to receive notifications on an Android device. I see the following error message on app startup:
Could not find method android.database.Cursor.getNotificationUri, referenced from method com.parse.ParseSQLiteCursor.getNotificationUri
Any pointers?
If your code is similar to the old Parse sample code, like this:
// Specify an Activity to handle all pushes by default.
PushService.setDefaultPushCallback(this, SavedVideoListActivity.class);
ParseInstallation.getCurrentInstallation().saveInBackground();
try the code from the new sample instead (https://parse.com/tutorials/android-push-notifications):
ParsePush.subscribeInBackground("", new SaveCallback() {
    @Override
    public void done(ParseException e) {
        if (e == null) {
            Log.d("com.parse.push", "successfully subscribed to the broadcast channel.");
        } else {
            Log.e("com.parse.push", "failed to subscribe for push", e);
        }
    }
});
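For context, a minimal sketch of where these calls typically live with the classic Parse 1.x Android SDK, assuming a custom Application subclass; the class name and keys are placeholders:

public class MyApplication extends Application {
    @Override
    public void onCreate() {
        super.onCreate();
        // Placeholder keys; use your own application ID and client key.
        Parse.initialize(this, "YOUR_APP_ID", "YOUR_CLIENT_KEY");
        // Keep saving the installation so the device is registered for push.
        ParseInstallation.getCurrentInstallation().saveInBackground();
        // Subscribe to the broadcast channel ("" is the default channel).
        ParsePush.subscribeInBackground("", new SaveCallback() {
            @Override
            public void done(ParseException e) {
                if (e != null) {
                    Log.e("com.parse.push", "failed to subscribe for push", e);
                }
            }
        });
    }
}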

Xamarin "Sorry, this video cannot be played"

How can I handle this error?
Please help me out of this situation.
private void previewVideo(){
    try{
        var path = Android.Net.Uri.Parse(App._file.AbsolutePath);
        preview.SetVideoURI (path);
        preview.Start ();
    }
    catch(Exception e){
        e.GetBaseException ();
    }
}
You're lucky that I was following your previous question. Please try to make your questions as detailed as possible, so it's easier for us to analyze and possibly replicate the error.
To be able to set an error listener on the VideoView, the VideoView needs an object that implements the Android.Media.MediaPlayer.IOnErrorListener interface.
You can accomplish that by letting your Activity implement the previously mentioned interface and setting the Activity as the error listener for the VideoView:
public class MainActivity : Activity, Android.Media.MediaPlayer.IOnErrorListener
{
    ...
    protected override void OnCreate(Bundle bundle)
    {
        ...
        preview = FindViewById<VideoView> (Resource.Id.SampleVideoView);
        preview.SetOnErrorListener(this); // <- Set the error listener
        ...
    }
    ...
    // The implementation of MediaPlayer.IOnErrorListener
    public bool OnError(MediaPlayer player, MediaError error, int extra)
    {
        // Do something here because an error happened
        return true; // true = the error was handled here
    }
    ...
}
By doing this, when an error occurs in the VideoView, it will call the public OnError method.
From the Android docs for OnErrorListener you can see what the OnError method should return:
Returns:
True if the method handled the error, false if it didn't. Returning false, or not having an OnErrorListener at all, will cause the OnCompletionListener to be called.
