I would like to write some tests for the GUI of my Cocoa program.
Is there any good GUI testing framework for Cocoa apps? The only thing I found is Squish, which, at €2,400, is well beyond my budget…
Any ideas? How do you test your Cocoa GUIs?
It depends on what you mean by "testing Cocoa GUIs."
If you want tools like the old Virtual User tool included with MPW, then those are few & far between; you'll be looking at tools like Squish and Eggplant.
If you want to write unit tests for your application's human interface, I suggest you follow a "trust, but verify" approach: trust that, as long as you make the right connections (according to your framework), your users will be able to interact properly with your application. That means you can do the majority of your testing by verifying that your model and controller code is hooked up to your views correctly.
On my weblog, I've written a couple of examples of how to do this specifically with Cocoa, one for testing user interfaces built with target-action, and one for testing user interfaces built with Cocoa bindings. (Remember, of course, that the two technologies aren't exclusive: If you want to do drag & drop in a table view managed via Cocoa bindings, you'd also have a data source and probably a delegate hooked up via target-action.)
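For example, here is a minimal sketch (using XCTest; the controller, outlet, action, and nib names are all hypothetical) of verifying a target-action connection without simulating a click:

```swift
import XCTest
import AppKit

// Minimal sketch: assumes a PreferencesWindowController with a `saveButton`
// outlet and a `save(_:)` action (all hypothetical names).
final class PreferencesWiringTests: XCTestCase {
    func testSaveButtonIsWiredToTheController() {
        let controller = PreferencesWindowController(windowNibName: "Preferences")
        _ = controller.window  // force the nib to load so the outlets are connected

        // Trust the framework to deliver the click; verify only the wiring.
        XCTAssertTrue(controller.saveButton.target === controller)
        XCTAssertEqual(controller.saveButton.action,
                       #selector(PreferencesWindowController.save(_:)))
    }
}
```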
The thing I don't write unit tests for — generally — is the positioning or type of controls in their superview. Sometimes that is important to get and keep correct, however; in that case, I can just query the appropriate properties of the controls and verify them using the standard assertions.
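When it does matter, the same style of direct assertion works (again using the hypothetical controller and outlets from the sketch above):

```swift
// Sketch: query the control's properties and assert on them directly,
// rather than simulating any user interaction. The outlets are assumed
// to be typed loosely (e.g. as NSView) so the type check is meaningful.
func testStatusFieldSitsInsideTheHeader() {
    let controller = PreferencesWindowController(windowNibName: "Preferences")
    _ = controller.window

    XCTAssertTrue(controller.statusField.superview === controller.headerView)
    XCTAssertTrue(controller.statusField is NSTextField)
    XCTAssertEqual(controller.statusField.frame.origin.x, 20, accuracy: 0.5)
}
```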
What I virtually never do is write code to "simulate events." The closest I've ever come to that is constructing a fake drag info object and passing that to an outline view data source to ensure it will deal with drags correctly.
I would suggest you take a look at Google Toolbox for Mac (GTM). It has, among other nice goodies, a very nice set of state and rendering test additions for NSView and CALayer. In your unit tests you assert that the view/layer state or rendered image matches a template saved (by name) in the test bundle. If the template doesn't exist, or the current output doesn't match it, a new encoded state or rendered TIFF is produced for review. GTM provides categories on NSView and CALayer to do the state encoding and rendering; you can of course override these on your own NSView or CALayer subclasses to encode the relevant state (using the NSCoder protocol) or rendering.
It also lets you (easily) send key events programmatically and run the run loop from within unit tests, and it supports unit testing on both OS X and the iPhone.
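GTM's assertions are Objective-C macros, but the underlying render-and-compare idea boils down to something like this hand-rolled sketch (this is not GTM's actual API, and the view and template names are hypothetical):

```swift
import AppKit
import XCTest

final class ViewRenderingTests: XCTestCase {
    func testBadgeViewRendering() throws {
        // BadgeView is a hypothetical custom view under test.
        let view = BadgeView(frame: NSRect(x: 0, y: 0, width: 64, height: 64))
        let rep = view.bitmapImageRepForCachingDisplay(in: view.bounds)!
        view.cacheDisplay(in: view.bounds, to: rep)
        let rendered = rep.tiffRepresentation!

        let templateURL = Bundle(for: type(of: self))
            .url(forResource: "BadgeView", withExtension: "tiff")
        if let templateURL = templateURL, let template = try? Data(contentsOf: templateURL) {
            // A template exists: the freshly rendered image must match it.
            // (Byte-for-byte comparison is crude; GTM is considerably smarter.)
            XCTAssertEqual(rendered, template, "BadgeView rendering changed")
        } else {
            // No template yet: write a candidate out for review and fail loudly.
            let out = FileManager.default.temporaryDirectory
                .appendingPathComponent("BadgeView.tiff")
            try rendered.write(to: out)
            XCTFail("No template; wrote candidate to \(out.path) for review")
        }
    }
}
```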
I created an open source Python package that uses the Apple Accessibility API (among others) to provide a classic GUI automation library, giving you visibility into, and interaction with, Cocoa GUIs: PyATOM home page
You might check out and consider Eggplant by TestPlant (formerly Redstone Software) at http://www.testplant.com/.
Here is an article that Apple featured on them last year.
The latest CocoaCast podcast has an interview with Ian Dees, the author of "Scripted GUI Testing with Ruby". You can find out more at CocoaCast.
I'm looking for a way to embed another application into my own view.
The business reason is that the company has many small Electron apps (basically a small portable web program with a self-contained browser) that the company wants to embed inside an OS X program. These Electron apps would ideally integrate and display inside a subview seamlessly, so they look like little web frames inside our larger program.
I think programmatically it would be easiest to open another program as a subview, but I'll take whatever I can get. Maybe even capturing its NSWindow somehow. (The Electron source is available, so it is easily discoverable.) Maybe a way to dock the other program inside mine, or (getting more desperate) finding its view and sending commands to constrain its size and location on top of mine.
So far, everything I've found says this isn't really possible. I could take the more desperate course: launch the process, find its window, position it over a spot in my display, and send messages to move it whenever my window is moved or its content is scrolled. But that isn't real integration; the menu stays separate, and so on, so I still can't truly incorporate it.
Any ideas or helpful implementation details?
EDIT 1: Thanks for those responses. What if the Electron apps could expose their NSWindow somehow? Could that be leveraged? I'm thinking the application could send messages (somehow, I'm not sure exactly how) to set its parent window to one inside mine. In the Windows API this is much easier, since you can call SetParent on anything, even windows belonging to other processes, but Cocoa seems more difficult.
This isn't really something you can do in Mac OS X. Applications are not "composable" in the way you're hoping for - while it is possible to share a view with a subprocess under certain very specific circumstances (e.g., Safari or Chrome tab renderers), this requires the subapplication to be written in a very specific way to permit it. It's not something that would be feasible in the situation you're describing.
If you have access to the source of these Electron apps, consider combining them into a single overarching Electron application. Alternatively, if it's not possible for these applications to coexist within a single Electron app, you may want to consider using something like Chromium Embedded Framework to build your wrapper application; note, however, that this may require you to implement parts of the Electron framework yourself.
You cannot do that. Cocoa requires you to have only one NSApplication instance per UI app, so you will have to fork/exec a new process for each application and launch them that way.
If you can recompile the source code, you can create a custom subclass of NSApplication and use that class in all of the applications, or you can run the other applications on an NSThread without an NSApplication instance and go from there.
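For example, a minimal sketch of launching one of the helper apps as its own process from the host application (the bundle identifier is hypothetical):

```swift
import AppKit

// Hypothetical bundle identifier for one of the Electron helper apps.
let helperID = "com.example.electron-helper"

// Launch the helper as a separate process rather than trying to host
// its NSWindow inside our own application.
if let appURL = NSWorkspace.shared.urlForApplication(withBundleIdentifier: helperID) {
    let config = NSWorkspace.OpenConfiguration()
    config.activates = false  // keep our own app frontmost
    NSWorkspace.shared.openApplication(at: appURL, configuration: config) { runningApp, error in
        if let error = error {
            print("Failed to launch helper: \(error)")
        } else {
            print("Launched \(runningApp?.localizedName ?? "helper") in its own process")
        }
    }
}
```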
I was playing with the Xamarin.Mobile MediaPicker API, which uses MediaRecorder with MonoDroid, to make a plugin that records video.
Android must preview the video inside a VideoView; for privacy reasons, the same restriction applies to WP7 and iOS too.
So I need to get the VideoView (or the Rectangle on WP7) from my custom view and call setPreviewDisplay with that VideoView in my plugin (or initialize MediaPicker with it).
What is the best way to implement my portable plugin, which requires a UI element?
Thanks in advance for your help.
What is the best way to implement my portable plugin, which requires a UI element?
I guess my first question is "do you need a portable plugin?"
What is the interface that you actually need at the ViewModel layer or lower?
My guess is that the cross-platform interface that the ViewModel will see might contain just:
some control commands (things like start/stop)
some summary information - e.g. video length
a file access layer - this may be as little as a file path?
If that's the case, then I'd probably implement most of the logic within Controls/Views/UIViews in the UIs, and would then bind the relevant commands and values to those ViewModel properties.
So I wouldn't personally implement this as a plugin at all!
I've previously done a couple of apps which use video views - one for video capture (Android only), one for bar code scanning.
I found that the basic available samples worked quite well. However, once I started trying to extend them, they quickly became fragile, hard to get working, and quite frustrating to develop!
I would genuinely recommend starting your current development as UI/View code. After you've got it working, you might find a nice way to split the control and interface out into a plugin - but I suspect that this won't be where most of your time is spent.
e.g. for my next QR code app, I plan to use the separate UI controls in https://github.com/Redth/ZxingSharp.Mobile - at the ViewModel level, I can hopefully just expose some sort of Command which acts on the decoded QR strings.
Can someone please guide me on which (JavaScript) touch framework I should use to make a tablet app? I am new to this area, and I am looking for something which allows me to play with my own UI design comfortably.
I went through Sencha, as I heard it's apt for a tablet app environment, but (sorry if this sounds odd) I am not able to make out whether I can use my own UI design to make an app in Sencha. Does any other (stable) framework allow a custom UI design?
There aren't any major differences between handsets and tablets, except for the screen size. For example, what you would show in a handset in one long scrolling screen, would be shown in a split-screen on a tablet (I am concentrating on the user-experience here).
Split-screen support is still rare in the JavaScript frameworks, since WebKit browsers didn't fully support scrolling only parts of a page (i.e. an iframe or overflow:scroll divs); this support is only now starting to materialize with iOS 5 (Android has had it since 2.2, but it never worked right).
There have been other JavaScript solutions (like iScroll), but being client code they don't always bring the full "experience" to the user.
The jQuery Mobile docs have a version under testing; you can try that on a tablet/handset to see the differences.
Regarding your "own UI design", if you mean colors/icons/buttons that's possible on any framework. Where the problems start is when you want to create custom layouts, and each framework provides partial support depending on what exactly you want to achieve.
In general, I'd say Sencha totally separates you from the HTML design - you build everything using JSON controls and it has extensive events/rendering code (of course you can write your own controls) - whereas frameworks like jQuery Mobile work directly on the HTML (you specify data-* attributes for the details) and render it almost as-is (OK, it does add wrapping layers, but in general it's still pure HTML).
As always, "it depends" on what you want to achieve and what you are ready to give up... ;-)
Sencha Touch (our framework) is particularly well suited to tablet apps because it has an implementation of multiple scrollable areas that works on older iOS and RIM devices, not just iOS 5. But the intention with Sencha Touch is that you create your app using the built-in UI components (carousels, momentum lists, tabs, etc.) or, if you have unique UI elements, that you extend an existing component or build a custom one. If you're expecting to be able to slap some of your own HTML into innerHTML or even a Touch xTemplate, then you will be setting yourself up for failure. The good news is that there are tutorials on building your own components, and there are plenty of apps whose source you can look at to guide your development. Lots of people have built apps with custom UIs.
You need some level of JavaScript experience to use Sencha Touch, so if you're coming from a non-JavaScript web design background, you'll have to get down the JavaScript learning curve first.
As the title suggests...
Is it possible to add custom Data Detectors to Cocoa apps?
If so, a gentle nudge in the right direction would be great.
Note: To be clear, I want to add new detectors to current apps. I am not writing a new app.
Thank you
W
It's not even possible to work with data detectors programmatically on anything but iOS 4: NSDataDetector is only available on iOS 4 and above, and even there it only finds the built-in types; it doesn't let you build a custom detector.
If they existed on OS X and were a plug-in class like Spotlight importers, that'd be a nice feature. Perhaps filing a request at bugreport.apple.com would help it along?
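For reference, consuming the built-in detectors (NSDataDetector did later appear on OS X 10.7 as well) looks roughly like this; note that you can only pick from the stock checking types, not register new ones:

```swift
import Foundation

// A minimal sketch: the set of checking types is fixed by the system.
let text = "Lunch with Anna on 12 June at https://example.com/menu, call 555-0123."
let types: NSTextCheckingResult.CheckingType = [.date, .link, .phoneNumber]

let detector = try! NSDataDetector(types: types.rawValue)
let range = NSRange(text.startIndex..<text.endIndex, in: text)

detector.enumerateMatches(in: text, options: [], range: range) { match, _, _ in
    guard let match = match else { return }
    switch match.resultType {
    case .date:        print("Date:", match.date ?? "")
    case .link:        print("Link:", match.url ?? "")
    case .phoneNumber: print("Phone:", match.phoneNumber ?? "")
    default:           break
    }
}
```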
Later update
I think the reason this hasn't been opened up with an API is because they're only meant to find common data (contact info, dates, URLs) for which there is only one (or just a few) uses. That is, contact info can be stored or used in "the" system-designated app. URLs can be auto-highlighted so they're linkable (clicks invoke the system-designated handler - Safari, an app registered to a protocol, etc.). But there's only one direction to funnel those actions and the endpoint is always a major "convenience app" meant to manage this common information (contacts, calendar, browser, email app, phone app...)
On the other hand, consider app-specific information. Data formatted a certain way for use with one app or platform might mean something else entirely to another application. In fact, this is rather common. So what happens when a string like %%SOMESTRING%% is detected? To one app, it might be a placeholder token. To another, it might be a user name. To another still, it might be interpreted as %%USERNAME followed by %%. Suddenly the simple system-wide UI for handling basic data types has to account for multiple actions and/or multiple "data detector plugins" claiming all or part of a format.
I'm not sure we'll ever see custom data detector APIs on iOS or Mac for this reason alone.
While custom data detectors aren't available at the OS level, there is a mechanism that will get you almost there. One possibility is to create a Workflow in Automator and save it in the Services menu.
It can be configured to be active when text is highlighted. You'd either go to the current app's main menu and select the Workflow under "Services", or else right click on the text and go to the "Services" menu from there. Not as easy as clicking on the text as you would a URL, but pretty close.
Create a workflow in Automator on Mac
We need to automate GUI testing of an application developed with the Win32 API. The developers have built this application with custom-painted controls: they have controls that look like grids, buttons, etc., but they are not the standard Windows controls.
What are custom-painted controls, and how can we test them?
Test it just like any other UI: not at all. Move all code out of the callbacks into the application layer, where your unit tests can execute it just like any other method.
Rationale: There is little point in testing whether "button.activate()" works; you want to know whether your code behind the button callback works.
If you need to know whether the correct dialogs, etc., are opened, see my blog: Testing the Impossible: User Dialogs
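The principle is toolkit-agnostic; here is a rough sketch of the separation (Swift is used purely for illustration, and every name is hypothetical):

```swift
import XCTest

// Application layer: plain logic, unit-testable without any UI at all.
struct OrderCalculator {
    func total(quantity: Int, unitPrice: Double, taxRate: Double) -> Double {
        Double(quantity) * unitPrice * (1 + taxRate)
    }
}

// UI layer: the "button callback" only gathers input and forwards it on,
// so there is nothing left in it worth driving through the GUI.
final class OrderForm {
    let calculator = OrderCalculator()
    var quantityInput = 0
    var priceInput = 0.0
    var totalLabelText = ""

    func calculateButtonClicked() {
        let total = calculator.total(quantity: quantityInput,
                                     unitPrice: priceInput,
                                     taxRate: 0.2)
        totalLabelText = String(format: "%.2f", total)
    }
}

// The unit test exercises the application layer directly.
final class OrderCalculatorTests: XCTestCase {
    func testTotalIncludesTax() {
        let calc = OrderCalculator()
        XCTAssertEqual(calc.total(quantity: 2, unitPrice: 10, taxRate: 0.2),
                       24.0, accuracy: 0.001)
    }
}
```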
Have the developers added support for accessibility using IAccessible? If they have, you can use Active Accessibility to navigate through the controls and test them that way.
If they haven't, open a bug that says their controls can't be used by disabled people (who need a screen reader or other accessibility aid).
Once they fix that bug, you can use whatever mechanism they added to their controls to allow them to be used by screen readers and other accessibility aids to test their UI.