I'm pretty new to Appcelerator and was wondering: what are the listeners for the view life cycle?
For example, if I wanted to detect the iOS viewWillAppear and viewDidDisappear methods, or Android's OnResume and OnPause methods, how would I do this the "Appcelerator" way?
I've searched around on the web, but was only able to find info in the Titanium documentation about application states such as Active, Suspended, etc. I need a controller- or window-specific listener to react to.
Thanks!
Titanium abstracts those events for you - so you don't have to write them separately for iOS and Android.
Check out the Titanium.App documentation (http://docs.appcelerator.com/platform/latest/#!/api/Titanium.App). You can see what events are available at the app level (of course, each Titanium component has its own events, but those are at the app level).
If I understand your question, the relevant events for you are paused and resumed - fired when the app goes into the background and comes back to the foreground.
Related
I've found that whilst there are a lot of tutorials on Xamarin Android, there does not seem to be a great deal on how to dispose of resources - more particularly, on when they should be disposed of.
For example, in the OnCreate handler of an activity, I am making several Rx subscriptions, each of which returns an IDisposable. I have tried to dispose of those in various other handlers (e.g. OnDestroy), but those handlers never get invoked. But the subscriptions seem to pile up because OnCreate runs every time the activity is navigated to.
In addition to those subscriptions, there's all the UI controls (TextViews, Buttons etc.) which I am assigning to class-level variables (fields). And those also implement IDisposable.
For all I know, I've got memory leaks all over the place.
Is there any guidance on this anywhere?
@SushiHangover is correct (thanks, Sushi). OnPause and OnResume were the events I was after. I also had a bit of a challenge in that when I clicked my custom "Back to Start" button, I needed to go right back to the start screen (skipping the intermediate screen along the way).
The way to do that is to use the ClearTop ActivityFlag (Android.Content.ActivityFlags.ClearTop) when starting the Home screen activity. A raw Android version of this can be seen here: https://stackoverflow.com/a/5794572/540156
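In Xamarin.Android terms that looks roughly like this (a sketch; HomeActivity is a hypothetical name for your start-screen activity):

using Android.App;
using Android.Content;

public static class Navigation
{
    // ClearTop pops every activity above HomeActivity off the back stack
    // instead of stacking a new instance on top.
    public static void BackToStart(Activity current)
    {
        var intent = new Intent(current, typeof(HomeActivity));
        intent.AddFlags(ActivityFlags.ClearTop);
        current.StartActivity(intent);
    }
}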
When you do that, you can clean things up in the activities that get popped off the back stack as they get popped (in the OnDestroy handler, from recollection).
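As for the Rx subscriptions from the question, one pattern that fits this is to collect them in a CompositeDisposable and dispose of it in OnDestroy. A minimal sketch, assuming System.Reactive (the interval observable is just a stand-in):

using System;
using System.Reactive.Disposables;
using System.Reactive.Linq;
using Android.App;
using Android.OS;

[Activity]
public class IntermediateActivity : Activity
{
    // One container for every subscription made in OnCreate.
    CompositeDisposable _subscriptions;

    protected override void OnCreate(Bundle savedInstanceState)
    {
        base.OnCreate(savedInstanceState);

        _subscriptions = new CompositeDisposable
        {
            // Stand-in observable; each Subscribe returns an IDisposable.
            Observable.Interval(TimeSpan.FromSeconds(1))
                      .Subscribe(tick => { /* update UI, etc. */ })
        };
    }

    protected override void OnDestroy()
    {
        // Runs when the activity is popped off the back stack (e.g. via ClearTop),
        // so subscriptions created in OnCreate don't pile up.
        _subscriptions?.Dispose();
        base.OnDestroy();
    }
}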
I'm working on inactivity detection.
I have successfully done so in iOS by subclassing UIApplication and overriding SendEvent as outlined here.
I know I could implement this separately for both iOS and Android, but I'd rather have a cross-platform Forms approach by intercepting all touch events and resetting my timers. I'd rather not have to add a touch event handler to all my pages either.
I was unable to find a cross-platform approach. I was able to accomplish this by leaving the timer-related logic in the core Forms project and implementing the touch event handlers separately for iOS and Android. I handled the iOS touch events as outlined in the link in the OP, but for Android I took the approach of subclassing the Activity due to the presence of the OnUserInteraction() method.
Initially I thought I would have to force Xamarin to use my subclassed Activity for all pages, but I was mistaken. AdamMeaney over on the Xamarin forums was able to help with a solution for the Android side of things with regard to subclassing an Android activity. As it turns out, Xamarin only uses one Activity, which inherits from Xamarin.Forms.Platform.Android.FormsAppCompatActivity. I used the MainActivity provided by Xamarin in the Droid project. From there, overriding OnUserInteraction() proved to be quite simple:
public override void OnUserInteraction()
{
    // Android calls this for every user interaction with the activity,
    // which makes it a natural place to reset the inactivity timer.
    base.OnUserInteraction();
    // Do other stuff
}
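For reference, the iOS half (the SendEvent approach linked in the OP) comes out looking something like this in Xamarin.iOS - TouchWatchingApplication and InactivityTimer are hypothetical names:

using UIKit;

public class TouchWatchingApplication : UIApplication
{
    // Every touch in the app funnels through SendEvent, making it the one
    // place to reset the shared inactivity timer.
    public override void SendEvent(UIEvent uiEvent)
    {
        base.SendEvent(uiEvent);
        if (uiEvent.Type == UIEventType.Touches)
            InactivityTimer.Reset(); // hypothetical shared timer in the Forms core
    }
}

// In Main.cs, make iOS use the subclass as the principal class:
// UIApplication.Main(args, typeof(TouchWatchingApplication), typeof(AppDelegate));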
It would seem to me that all you really have to do on the platform side is get notified whenever a new touch event occurs. Unless I am missing something, it seems you can do all of the timer stuff in the core Forms PCL project and call that code from the platform-specific code that runs when a touch is detected.
So if on Android (I did not verify, but I would assume so) there is a similar way to handle any touch, device-wide, then it would seem that all you have to do is implement that event handler, as you did for iOS, and call into your Forms core code to handle the timer(s).
To clarify: on the platform side, just handle the touch events globally and then call into code in the Forms core, thus only having to implement the timer functionality once. Or so it would seem, unless I am missing something.
If you want to make a feature request for Xamarin.Forms, please do so at Xamarin's UserVoice page: xamarin.uservoice.com
I suspect Forms would just have to do as outlined above: handle device-wide touches on each platform and then have a virtual method in the Forms core code that is called whenever a touch occurs.
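The shared piece in the Forms core might then be as small as this (a sketch; InactivityTimer, the one-minute timeout, and the polling interval are all made up) - both platform handlers just call Reset():

using System;
using Xamarin.Forms;

public static class InactivityTimer
{
    static readonly TimeSpan Timeout = TimeSpan.FromMinutes(1); // made-up timeout
    static DateTime _lastTouch = DateTime.UtcNow;
    static bool _running;

    // Raised once no touch has arrived for the full timeout.
    public static event EventHandler Expired;

    // Called from SendEvent on iOS and OnUserInteraction on Android.
    public static void Reset()
    {
        _lastTouch = DateTime.UtcNow;
        if (_running) return;

        _running = true;
        Device.StartTimer(TimeSpan.FromSeconds(1), () =>
        {
            if (DateTime.UtcNow - _lastTouch < Timeout)
                return true;                        // keep polling

            _running = false;
            Expired?.Invoke(null, EventArgs.Empty);
            return false;                           // stop the timer
        });
    }
}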
I'm looking for a way to embed another application into my own view.
The business reason is that the company has many small Electron apps (basically a small portable web program with a self-contained browser) that the company wants to embed inside an OS X program. These Electron apps would ideally integrate and display inside a subview seamlessly, so they look like little web frames inside our larger program.
I think programmatically it would be easiest to open another program as a subview, but I'll take whatever I can get. Maybe even capturing its NSWindow somehow. (Electron source is available, so it is easily discoverable.) Maybe a way to dock the other program inside mine, or (getting more desperate) finding its view and sending commands to constrain its size and location on top of mine.
So far, everything I've found says it is not really possible. I can take the more desperate course: launch a process, find its view, and position it over a spot in my display, then send messages to move the other window whenever my window is moved or its content is scrolled. But that isn't really integrated - the menu stays separate, etc. - so I cannot truly incorporate it.
Any ideas or helpful implementation details?
EDIT 1: Thanks for those responses. How about if we could have the Electron apps expose their NSWindow somehow? Could that be leveraged? I'm thinking the application could send messages and (somehow, I'm not sure exactly how) set the parent window to be inside this one. In the Windows API this is much easier, since you can call SetParent on anything, even windows in different processes. But Cocoa seems more difficult.
This isn't really a thing you can do in Mac OS X. Applications are not "composable" in the way you're hoping for - while it is possible to share a view with a subprocess under certain very specific circumstances (e.g., Safari or Chrome tab renderers), this requires the subapplication to be written in a very specific way to permit it. It's not something that would be feasible in the situation you're describing.
If you have access to the source of these Electron apps, consider combining them into a single overarching Electron application. Alternatively, if it's not possible for these applications to coexist within a single Electron app, you may want to consider using something like Chromium Embedded Framework to build your wrapper application; note, however, that this may require you to implement parts of the Electron framework yourself.
You cannot do that. Cocoa requires you to have only one NSApplication instance per UI app, so you will have to fork/exec out a new process for each application and launch them that way.
If you can recompile the source code, then you can create a custom subclass of NSApplication and use that custom class in all the applications, or you can create an NSThread for the other applications without an NSApplication instance and go from there.
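A rough sketch of the launch-a-separate-process route (shown in C# here; the app name is hypothetical, and macOS's open command does the heavy lifting):

using System.Diagnostics;

public static class ElectronLauncher
{
    // Launches a bundled Electron app as its own process; it gets its own
    // NSApplication instance, menu bar, and windows, separate from ours.
    public static Process Launch(string appName)   // e.g. "MySmallElectronApp" (hypothetical)
    {
        return Process.Start(new ProcessStartInfo
        {
            FileName = "/usr/bin/open",
            Arguments = $"-a \"{appName}\"",
            UseShellExecute = false
        });
    }
}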
Apologies for an 'open' question, but can anyone provide pointers on how to 'dock' my app to the Android Wear watch face?
Essentially, I want users of the application to be able to swipe left to right (or vice versa) from the edge of the screen to open the application, rather than having to tap the watch face and then scroll through the list of applications.
I've seen this implemented in another Wear app, but don't know the right terminology to produce meaningful results in Google. Is it a wallpaper service, a specific view type, a touch listener service, etc.?
Many thanks.
You can't receive touch events inside the WatchFaceService; touch delivery is disabled.
I can't say for sure how the app you saw implemented the desired behavior, but it probably did so by inserting views directly into the WindowManager from a Service.
Check out this YouTube video: https://www.youtube.com/watch?v=S3vHjxonOeg
I don't know how well the Standout library does its job, but it should give you enough examples to figure out for yourself how to add views to the WindowManager.
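I haven't verified this on Wear, but the Service-plus-WindowManager idea looks roughly like this (sketched in C#/Xamarin.Android; EdgeSwipeService and the strip width are made up, and the SYSTEM_ALERT_WINDOW permission is required):

using Android.App;
using Android.Content;
using Android.Graphics;
using Android.OS;
using Android.Runtime;
using Android.Views;

[Service]
public class EdgeSwipeService : Service
{
    View _overlay;

    public override void OnCreate()
    {
        base.OnCreate();

        // A thin, invisible strip along the screen edge that can still catch
        // touches while the watch face is in the foreground.
        _overlay = new View(this);

        var layoutParams = new WindowManagerLayoutParams(
            20,                                  // made-up strip width in px
            ViewGroup.LayoutParams.MatchParent,
            WindowManagerTypes.SystemAlert,      // overlay window type
            WindowManagerFlags.NotFocusable,
            Format.Translucent)
        {
            Gravity = GravityFlags.Left
        };

        var windowManager = GetSystemService(WindowService).JavaCast<IWindowManager>();
        windowManager.AddView(_overlay, layoutParams);

        _overlay.Touch += (s, e) => { /* detect the edge swipe, then launch the app */ };
    }

    public override IBinder OnBind(Intent intent) => null;

    public override void OnDestroy()
    {
        // Overlay views must be removed explicitly when the service dies.
        GetSystemService(WindowService).JavaCast<IWindowManager>().RemoveView(_overlay);
        base.OnDestroy();
    }
}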
I'm new to Cocoa app dev, and I'm searching for a way to create a window like the Tweetie main window, with a left toolbar and a panel that points to the selected icon.
Like this screenshot: http://i.stack.imgur.com/qvxWu.jpg
Could anyone help me?
It's likely that a lot of the Tweetie UI is implemented using custom controls. You'll want to look into subclassing NSView and how to handle drawing and mouse events. There's nothing built into the Cocoa framework for this.
The NSView documentation has info on view programming, drawing, and event handling. If you're new to Cocoa, you may want to start off with something built in, though, as this will be a lot of work (and requires a pretty good understanding of how the framework works).
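To make the "subclass NSView" advice concrete, the skeleton looks something like this (sketched in C#/Xamarin.Mac; the drawing and hit-testing are placeholders):

using AppKit;
using CoreGraphics;

// A custom sidebar view: you own all of the drawing and the mouse handling.
public class SidebarView : NSView
{
    public SidebarView(CGRect frame) : base(frame) { }

    public override void DrawRect(CGRect dirtyRect)
    {
        // Placeholder drawing; a real sidebar would render its icons here.
        NSColor.DarkGray.SetFill();
        NSBezierPath.FillRect(dirtyRect);
    }

    public override void MouseDown(NSEvent theEvent)
    {
        // Hit-test the click to find the selected icon, then tell the
        // window controller to swap the panel on the right.
        var point = ConvertPointFromView(theEvent.LocationInWindow, null);
        // hypothetical: SelectIconAt(point);
    }
}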