I am developing a testing algorithm for our iOS apps using Appium. To fully implement this algorithm I need to identify whether I have moved onto a different screen or am still on the same screen after performing some action. I need to know: what makes every screen unique/different from the others in terms of Appium?
Going through the pageSource of every screen, I found that most screens have an xpath attribute in the window element. Can I use the value of the window element's xpath to mark the screen as unique from the others, or do I need to do a straightforward string comparison between screens' pageSources to tell them apart? Or is there a better solution?
I'm not sure xpath would be the best solution for this. Normally the UIAWindow remains the same, and developers may use different containers within this UIAWindow to render different screens.
So to verify different screens, you might need to figure out what this container is and check whether the container's properties change when you move to a new screen (i.e. a new container).
If your app uses a different header for every new screen, then you can use this header to tell whether the screen has changed. Example: in WhatsApp, you would see a different person's name at the top of each chat. In this case, the person's name can be treated as the header.
If this doesn't work, then you can verify some of the other controls, or, say, the list of all UIAStaticText elements on the screen. When the screen changes, the entire list of UIAStaticText elements is likely to change as well, so this can indicate a screen change.
For our automation suite at work I've implemented a series of screen-check steps. Every time we switch screens, I do a find_element command for an element that is unique to that screen. That way, if a button or option takes me to the wrong screen, my test fails as expected. If it does find the element we're expecting, the check adds minimal time to the test suite.
Anish Pillai made a good suggestion of using the header text if there is any. Otherwise, a particular tab, menu text, resource_id, or whatever else is unique about the page would suffice. All you need to do is a find_element call with a failure message if it fails.
I'm currently prototyping a potential app where the user would be managing a primary list. What I would like is for an alternate list to be able to slide out, so both lists would appear side by side. This would allow the user to transfer items between the two via drag and drop. Since the main workflow would be on the primary list most of the time, it seems cleaner if the alternate list stays hidden most of the time.
To me, it seems like NSDrawer would be a good fit. However, it's deprecated, and I'm assuming it's impossible to build an app whose window has an 'L' shape or inverted 'T' shape.
So I was thinking of potential alternatives which I could try out.
Some I thought of were:
Expanding the window width to accommodate a split view, and shrinking it back when done (see the sketch after this list).
Creating a new NSWindow but finding a way to always pin it to the side, so that if either window were moved, the other would move with it.
Just using another NSWindow and not caring about the pinning. In theory I could always open it next to the other window and let the user worry about managing any movement.
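To make the first option concrete, here's the rough shape of what I'm imagining (the method name and the 300-point width are just placeholders I made up, not existing code):

import Cocoa

extension NSWindowController {
    // Grow the window to the right to reveal the alternate list, or shrink it back.
    // `alternateWidth` is an arbitrary width for the slide-out list.
    func setAlternateListVisible(_ visible: Bool, alternateWidth: CGFloat = 300) {
        guard let window = window else { return }
        var frame = window.frame
        frame.size.width += visible ? alternateWidth : -alternateWidth
        window.setFrame(frame, display: true, animate: true)
    }
}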
Has anyone had any success in replicating an NSDrawer type of alternative?
I am using Xcode 8.3.3, Swift, and XCTest. I am wondering what the best approach to scrolling is when you plan to run your tests against multiple simulators and have a list displayed. Since the screen size may change depending on the simulator being used, the element you want to select in order to scroll up may or may not be displayed.
If I have a list with some number of elements, how do I best decide which element to use to scroll the list up so that the next set of elements is displayed on the screen, and so that the tests will run on multiple simulators with different screen sizes?
When we do "po XCUIApplication()" we see all the elements in the list, so in order to know which one is the last one displayed on screen, we would have to look through each element and do a check like isDisplayed or something to find the last element currently displayed ... but I was hoping there is a better approach?
If the element you want to use is displayed when you do po XCUIApplication() then you should just be able to tap() it without having to scroll it into view. The framework will handle the scrolling automatically if it can find the element.
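If it can't, or if you need to scroll explicitly anyway, one common pattern is to swipe until the target element reports itself as hittable. A rough sketch (the helper name, the cell label "Row 42", and the attempt limit are all placeholders, not from any existing suite):

import XCTest

func scrollTo(_ element: XCUIElement, in app: XCUIApplication, maxSwipes: Int = 10) {
    // Swipe up until the element is on screen and hittable, or until we give up.
    var swipes = 0
    while !element.isHittable && swipes < maxSwipes {
        app.swipeUp()
        swipes += 1
    }
    XCTAssertTrue(element.isHittable, "Could not scroll the element into view")
}

// Usage inside a test:
// let app = XCUIApplication()
// let cell = app.tables.cells.staticTexts["Row 42"]
// scrollTo(cell, in: app)
// cell.tap()

Because the loop stops as soon as the element is hittable, the same test should work across simulators with different screen sizes without hard-coding which row to swipe on.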
Here is an article I wrote discussing how to use SBTUITestTunnel for scrolling:
https://rlukasik.medium.com/using-sbtuitesttunnel-for-scrolling-in-xcuitests-2e166440ca73
For the mouse I'm using:
from Quartz.CoreGraphics import CGEventCreate, CGEventGetLocation  # PyObjC
ourEvent = CGEventCreate(None)
currentpos = CGEventGetLocation(ourEvent)  # current mouse location in screen coordinates
What can I use for the caret?
First the bad news.
Not every app is Cocoa-based, and those that are neither Cocoa nor Carbon nor a straight mix of the two—i.e., those based on wxWidgets, Qt, or some other cross-platform framework—typically reimplement the entire GUI stack on top of raw event and drawing primitives.
That means that there is typically no way to get this information from those applications (unless they're scriptable and expose it that way).
The good news is, Cocoa apps and some Carbon apps may expose this via Accessibility.
The user will need to have assistive devices turned on in System Preferences. Once that condition is met, you can use the Accessibility framework to get the frontmost application, get its focused window, get its focused view, and get its selection ranges.
A text view with an insertion point has exactly one selection range, and that range is empty (length=0). The location is where the insertion point is.
Of course, those are character indexes, not on-screen bounds.
That's where parameterized attributes come in. There's one for converting ranges to bounds. That's the one you want.
Theoretically (I haven't tried this), you should be able to convert the empty range of the insertion point to an empty or nearly-empty rectangle whose location is somewhere within the vertical line of the insertion point.
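As a rough sketch of that chain (system-wide element, then focused element, then selected text range, then bounds), assuming the accessibility permission mentioned above has been granted; I'm using the singular kAXSelectedTextRangeAttribute for brevity, and the function name is just a placeholder:

import ApplicationServices

func caretScreenBounds() -> CGRect? {
    // System-wide accessibility element -> currently focused UI element.
    let systemWide = AXUIElementCreateSystemWide()
    var focused: CFTypeRef?
    guard AXUIElementCopyAttributeValue(systemWide,
                                        kAXFocusedUIElementAttribute as CFString,
                                        &focused) == .success,
          let element = focused
    else { return nil }

    // The focused element's selected text range (empty when it is just an insertion point).
    var selectedRange: CFTypeRef?
    guard AXUIElementCopyAttributeValue(element as! AXUIElement,
                                        kAXSelectedTextRangeAttribute as CFString,
                                        &selectedRange) == .success,
          let range = selectedRange
    else { return nil }

    // Parameterized attribute: convert the character range to on-screen bounds.
    var boundsValue: CFTypeRef?
    guard AXUIElementCopyParameterizedAttributeValue(element as! AXUIElement,
                                                     kAXBoundsForRangeParameterizedAttribute as CFString,
                                                     range,
                                                     &boundsValue) == .success
    else { return nil }

    var rect = CGRect.zero
    AXValueGetValue(boundsValue as! AXValue, .cgRect, &rect)
    return rect
}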
Make sure you test this with text views that are in scroll views, particularly when the insertion point is scrolled partially or completely out of view.
You'll want to use the Accessibility Inspector to see for yourself where your application will need to look, and to test individual applications and investigate reported failures.
You can get it from the Developer Downloads page, in the “Accessibility Tools” disk image.
If you want to focus a window, forging a mouse event to click on it is a bad idea—anything can happen if you click on the wrong thing. Send the window a kAXRaiseAction action instead.
If you want to set a text view's insertion point (and are looking to find where you need to forge a mouse event to click to set it in the desired position), again, that's a bad way to do it. Set the view's kAXSelectedTextRangesAttribute attribute instead. Again, an insertion point is a single empty range.
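A minimal sketch of those two calls, assuming you already hold AXUIElement references for the window and the text view (the function name and parameters are mine, and I use the singular kAXSelectedTextRangeAttribute variant):

import ApplicationServices

// `window` and `textView` are AXUIElement references you have already obtained
// (for example, by walking down from the focused application as described above).
func raiseWindowAndMoveCaret(window: AXUIElement, textView: AXUIElement, to location: Int) {
    // Bring the window to the front without forging a mouse click.
    AXUIElementPerformAction(window, kAXRaiseAction as CFString)

    // An insertion point is a single empty range: length 0 at the desired location.
    var range = CFRange(location: location, length: 0)
    if let rangeValue = AXValueCreate(.cfRange, &range) {
        AXUIElementSetAttributeValue(textView,
                                     kAXSelectedTextRangeAttribute as CFString,
                                     rangeValue)
    }
}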
Did you try something like this?
// Location of the current event (e.g. a mouse event) in window coordinates
NSPoint p = [[NSApp currentEvent] locationInWindow];
CGFloat x = p.x;
CGFloat y = p.y;
NSLog(@"%f %f", x, y);
I need to implement a threaded view of sorts in an old VB6 app. It should look similar to this:
So, it's like a TreeView of sorts, but there are buttons on the right (for each row) that can be pressed. The view does not need to collapse; it always stays expanded. The users should be able to respond to each node (via the comment button on the far right). And, of course, users should be able to scroll through the entries.
What are some of the ways I could implement this? I am open to 3rd party controls, paid or not.
VSFlexGrid has an outline mode. You can set the indent per row via the RowOutlineLevel property. It supports word wrap, images, etc. within its cells/columns, so you should be able to get pretty close to what you want. It also supports owner-drawn cells, which lets you fully customize the cell painting (for example, to get those rounded corners).
I'm sure there are other controls out there as well...
I have just spent the last few hours trying to find the flag to use in Terminal to launch an app with the colored outlines around the various view elements that show how they are nested. I know that Matt Gemmell covered it during the Cocoa Face Off session of NSConference 2009 (at about the 13-minute mark in the video). Unfortunately, I can't actually read what he types, and he doesn't speak the exact command. I know it has to be in the Apple docs somewhere, but the search system is currently not being of any use. It looks like he just adds -showAllViews YES to the end of the command to open TextEdit, but that command has no effect in 10.6.6. I have also tried every other capitalization I can think of, as well as using view instead of views. Every command opens TextEdit just fine but doesn't show the colored outlines.
Use -NSShowAllDrawing and -NSShowAllDrawingColor:
/Applications/TextEdit.app/Contents/MacOS/TextEdit -NSShowAllDrawing 200 -NSShowAllDrawingColor cycle
-NSShowAllDrawing sets the delay between drawing commands (allowing you enough time to see the drawing update)
-NSShowAllDrawingColor sets the fill colour for regions with pending drawing operations (see the class methods on NSColor for valid values, or pass "cycle" to loop through all available colours).