Track the movement of the user's eyeballs on iPhone - Xcode

I am developing an application where I need to track the movement of the user's eyeballs, i.e. whether the user is looking at the top or the bottom of the iPhone. I have used FaceDetection in my earlier projects, but that only detected the eyes, not their movement.
Is there any API or framework that can detect the motion of the eyeballs?
Any help would be great.
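For what it's worth, one way this could be approached is ARKit's face tracking, which exposes an estimated gaze point per frame. Below is a minimal sketch, assuming a device with a TrueDepth camera and an OS version that supports ARFaceTrackingConfiguration; the class name and the simple y-axis check are illustrative assumptions, not an established recipe.

```swift
import UIKit
import ARKit

// Minimal sketch (not a drop-in solution): ARKit face tracking exposes
// per-eye transforms and an estimated gaze point on TrueDepth devices.
final class EyeTrackingViewController: UIViewController, ARSCNViewDelegate {
    private let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
        sceneView.delegate = self
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        guard ARFaceTrackingConfiguration.isSupported else { return }
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    // Called each time the tracked face anchor is updated.
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let face = anchor as? ARFaceAnchor else { return }
        // lookAtPoint is the estimated gaze target in the face's coordinate
        // space; a positive y roughly means "looking up", negative "down".
        let gaze = face.lookAtPoint
        print(gaze.y > 0 ? "Looking toward the top" : "Looking toward the bottom")
    }
}
```

ARFaceAnchor also exposes leftEyeTransform and rightEyeTransform if per-eye orientation is needed rather than a single gaze point.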

Related

Which screen size should I start making a wireframe from?

Browsing Behance, Dribbble, or similar sites, we usually see wireframes only for desktop or only for mobile, so I'm still in doubt about something: when we need to make a wireframe (considering the "mobile-first" approach), where should I start? Desktop, then tablet, then mobile, or the reverse?
"Mobile first" is a design philosophy of designing for a small screen and touch interactions first and then adding layout changes that need to happen as the display gets larger...
Start with the size/device your audience is most likely to use for interaction. Not all interactions are best on a small screen...
So a designer who knows most users will be on a phone might go "mobile first". In that case, you should start with mobile wireframes and then show the developers how the layout should change on a bigger screen by making additional wireframes (if that's your team's process).
Behance and Dribbble are bad examples of process since they're usually only showing you the end result, not the work that goes into getting there.

Object floating and moving with user movement

I'm trying to create a simple Tango application to visualize my 3D models. The problem is that when I hold my Tango device and move around, I can see that my model also moves with me.
It looks like my physical movement is not reflected in the Tango app at the correct scale - for example, when I step 3 feet, I only move about 2.5 feet in the app. However, I have found that other Tango applications on the same device work perfectly - their 3D objects are stable and stay in the same spot without moving.
Please advise. Thank you.

Docking wear app to watch face

Apologies for an 'open' question, but can anyone provide pointers on how to 'dock' my app to the Android Wear watch face?
Essentially, I want users of the application to be able to swipe left to right (or vice versa) from the edge of the screen to open the application, rather than having to scroll through the list of applications after tapping the watch face.
I've seen this implemented in another Wear app, but I don't know the right terminology to produce meaningful results in Google. Is it a wallpaper service, a specific view type, a touch listener service, etc.?
Many thanks.
You can't receive touch events inside the WatchFaceService; touch delivery is disabled there.
I can't say for sure how the app you saw implemented the desired behavior, but it probably did so by inserting views directly into the WindowManager from a Service.
Check out this YouTube video: https://www.youtube.com/watch?v=S3vHjxonOeg
I don't know how well the Standout library does its job, but it should give you enough examples to figure out for yourself how to add views to the WindowManager.

Generate and post Multitouch-Events in OS X to control the mac using an external camera

I am currently working on a research project for my university. The goal is to control a Mac using the Microsoft Kinect camera. Another student is writing the driver for the Kinect (which will be mounted somewhere on the ceiling or on the wall behind the Mac), and it outputs the position of all fingers on the Mac's screen.
It is my responsibility to take those finger positions and react to them. The goal is to use a single finger to control the mouse, and to react to multiple fingers in the very same way as if they were on the trackpad.
I thought this was going to be easy and straightforward, but it's not. It is actually very easy to control the mouse cursor with one finger (using CGEvent), but unfortunately there is no public API for creating and posting multitouch gestures to the system.
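(As an aside, the single-finger part mentioned above might look roughly like the sketch below, which posts a mouse-moved CGEvent; the assumption that the Kinect driver delivers positions normalized to 0...1 is mine, not part of the original setup.)

```swift
import CoreGraphics

// Rough sketch of the single-finger case: map a finger position
// (assumed here to be normalized to 0...1 by the Kinect driver)
// to screen coordinates and post a mouse-moved event.
func moveCursor(toNormalizedX x: CGFloat, normalizedY y: CGFloat) {
    let screen = CGDisplayBounds(CGMainDisplayID())
    let point = CGPoint(x: screen.width * x, y: screen.height * y)
    if let event = CGEvent(mouseEventSource: nil,
                           mouseType: .mouseMoved,
                           mouseCursorPosition: point,
                           mouseButton: .left) {
        event.post(tap: .cghidEventTap)
    }
}
```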
I've done a lot of research, including catching all CGEvents using an event tap at the lowest possible position and trying to disassemble them, but no real progress so far.
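(A listen-only event tap at the HID level, the lowest location available to user space, might look roughly like the following; the event mask here is just an example.)

```swift
import CoreGraphics

// Sketch of a listen-only event tap at the HID level.
let mask = CGEventMask(1 << CGEventType.mouseMoved.rawValue) |
           CGEventMask(1 << CGEventType.scrollWheel.rawValue)

if let tap = CGEvent.tapCreate(
    tap: .cghidEventTap,          // lowest user-space tap location
    place: .headInsertEventTap,
    options: .listenOnly,
    eventsOfInterest: mask,
    callback: { _, type, event, _ in
        print("Saw event of type \(type.rawValue)")
        return Unmanaged.passUnretained(event)
    },
    userInfo: nil
) {
    let source = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0)
    CFRunLoopAddSource(CFRunLoopGetCurrent(), source, .commonModes)
    CGEvent.tapEnable(tap: tap, enable: true)
    CFRunLoopRun()   // blocks; fine for a quick experiment
}
```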
Then I stumbled over this and realized that even the lowest position for an event tap is not deep enough:
Extending Functionality of Magic Mouse: Do I Need a kext?
If I understand it correctly, the built-in trackpad (and the Magic Mouse and the Magic Trackpad) communicates via a kernel extension (kext) with the private MultitouchSupport framework, which generates the incoming data and posts it to the OS in some way.
So I would need to use private APIs from MultitouchSupport.framework to do the very same thing the trackpad does, right?
Or would I need to write a kext myself?
And if I need to use the MultitouchSupport framework:
How can I disassemble it to get at the private APIs? (I know about class-dump, but that only works on Objective-C frameworks, which this one is not.)
Many thanks for any response!
NexD.
"The goal is to use one single finger to control the mouse and react on multiple fingers the very same way" here if I understand what you are trying to do is you try to track fingers from Kinect. But the thing is Kinect captures only major body joints. But you can do this with other third party libraries I guess. Here is a sample project I saw. But its for windows. Just try to get the big picture there http://channel9.msdn.com/coding4fun/kinect/Finger-Tracking-with-Kinect-SDK-and-the-Kinect-for-XBox-360-Device

Any possibility to get a notification if another application receives a scroll event?

I'm developing an application in Cocoa which allows users to draw on any given window in OS X. The drawings move along with the corresponding window when it is dragged around the screen. To complete this tie between the drawings and the windows (and their contents) beneath them, I'd like to catch scroll events from the window in order to adjust the positioning/visibility of the drawings.
An example:
The user opens Safari and browses the web. On a specific website s/he draws a circle around a link and takes handwritten notes (all of this is considered a drawing, input with a pen tablet). Afterwards s/he moves the window, and the drawings move with it so that they remain on top of the link on the website. Then s/he begins to scroll the website, and the location of the link changes (it moves up until it's outside of the viewport).
Now I'd like to catch that event and also move the layer with the drawings to keep them on top of the link. When the link is no longer visible, I'd turn off the visibility of the drawing and turn it back on when scrolling brings the link back into the viewport.
I know this is quite a tricky assignment, and being able to intercept such events from another application might well be considered an OS security flaw, but maybe someone out there is a good enough coder to give me a hint... :)
The Cocoa Accessibility classes may be helpful, but so far I haven't found the solution.
Thanks for your help.
Oh, and if that's not tricky to you, maybe you can tell me how to get notified when Safari switches Tabs ;)
kkthxbai
I'm not sure if you can monitor scroll events. However, it's a lot easier if you just monitor the position of the link with the Accessibility API.
Just hold a reference to that link and constantly poll it for its position; if the position changes, you know what to do (a rough sketch of this polling approach follows below).
You could also try using AXObserverAddNotification, but as far as I am aware, there is no notification you can monitor for position changes.
If you haven't discovered it already, the Accessibility Inspector can help you a lot with identifying things that you can get using the Accessibility API and pfiddlesoft's UI Browser lets you register for notifications.
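To illustrate the polling idea, here is a rough sketch using the Accessibility C API from Swift. It assumes the app has been granted Accessibility permission and that `linkElement` was already obtained elsewhere (for example, by walking Safari's AX hierarchy); the timer interval and callback are placeholders.

```swift
import Foundation
import ApplicationServices

// Read the current on-screen position of an accessibility element.
func position(of element: AXUIElement) -> CGPoint? {
    var value: CFTypeRef?
    guard AXUIElementCopyAttributeValue(element,
                                        kAXPositionAttribute as CFString,
                                        &value) == .success,
          let axValue = value,
          CFGetTypeID(axValue) == AXValueGetTypeID()
    else { return nil }
    var point = CGPoint.zero
    AXValueGetValue(axValue as! AXValue, .cgPoint, &point)
    return point
}

// Poll the element once per second and report position changes.
func startPolling(_ linkElement: AXUIElement, onMove: @escaping (CGPoint) -> Void) {
    var lastPoint: CGPoint?
    Timer.scheduledTimer(withTimeInterval: 1.0, repeats: true) { _ in
        guard let current = position(of: linkElement) else { return }
        if current != lastPoint {
            lastPoint = current
            onMove(current)   // reposition or hide the drawing layer here
        }
    }
}
```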
