How to animate a BasicEditField in BlackBerry? - animation

In my J2ME BlackBerry application, I want to open an edit field for entering text when the user clicks a search icon. I have a search icon added at the top right corner; when the user clicks it, I want a BasicEditField to open, animating from right to left. The animation should be like the default search on the BlackBerry main screen. How do I do this? Is it possible?

You can do almost anything with the BlackBerry UI if you understand it and are prepared to put the time in. But what you see when you press the magnifying glass is something that someone has spent a lot of time building, and there is no API for doing anything like that (at least not one I have found). So if you are not experienced with BlackBerry UI work, as I suspect is the case here, I would suggest that replicating what the BB engineers have done with the animation from the search icon on the Home screen to the search screen is too difficult to justify.
A lot of these sorts of things are, in my opinion, just gloss. They do not make the application any easier to use; they just make it look flashy. Personally, I would spend your time on making sure your application works well, rather than on making it look flashy.
I do recommend using the screen transition animations as a way of moving from one screen to another. These are fairly easy to use (a minimal sketch follows below) and, when used correctly, provide a good visual clue to your user regarding the flow through your application. I also suggest you spend some time making sure your assets (icons etc.) look good on all the various screen sizes of the BB devices that you are developing for.
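For example, here is a rough sketch of that approach, assuming OS 5.0 or later (which introduced the TransitionContext API); the SearchScreen class and openSearch() handler are hypothetical names, and the direction constant may need adjusting if the slide moves the wrong way on your device:

import net.rim.device.api.ui.Screen;
import net.rim.device.api.ui.TransitionContext;
import net.rim.device.api.ui.Ui;
import net.rim.device.api.ui.UiApplication;
import net.rim.device.api.ui.UiEngineInstance;
import net.rim.device.api.ui.component.BasicEditField;
import net.rim.device.api.ui.container.MainScreen;

// Hypothetical screen that holds the search field.
class SearchScreen extends MainScreen {
    SearchScreen() {
        add(new BasicEditField("Search: ", ""));
    }
}

// Hypothetical handler, called when the user clicks the search icon.
void openSearch() {
    Screen searchScreen = new SearchScreen();

    // Configure a slide transition; DIRECTION_LEFT should push the new
    // screen in from the right edge of the display.
    TransitionContext context = new TransitionContext(TransitionContext.TRANSITION_SLIDE);
    context.setIntAttribute(TransitionContext.ATTR_DURATION, 300); // milliseconds
    context.setIntAttribute(TransitionContext.ATTR_DIRECTION, TransitionContext.DIRECTION_LEFT);
    context.setIntAttribute(TransitionContext.ATTR_STYLE, TransitionContext.STYLE_PUSH);

    // Register the transition for pushes of this screen, then push it.
    UiEngineInstance engine = Ui.getUiEngineInstance();
    engine.setTransition(null, searchScreen, UiEngineInstance.TRIGGER_PUSH, context);
    UiApplication.getUiApplication().pushScreen(searchScreen);
}

This gives you the system-provided slide animation between screens rather than trying to animate the field itself inside the existing screen, which is much closer to what the platform supports out of the box.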

Related

Which screen size should I start making a wireframe from?

Usually, when browsing Behance, Dribbble or other sites like those, we see wireframes only for desktop or only for mobile, so I'm still in doubt about something: when we need to make a wireframe (and considering "mobile-first"), where should I start from? Desktop, then tablet, then mobile, or the inverse?
"Mobile first" is a design philosophy of designing for a small screen and touch interactions first and then adding layout changes that need to happen as the display gets larger...
Start with the size/device your audience is most likely to use for interaction. Not all interactions are best on a small screen...
So a designer who knows most users will be on a phone might go "mobile first". In that case, you should start with mobile wireframes and then show the developers how the layout should change on a bigger screen by making additional wireframes (if that's your team's process).
Behance and Dribbble are bad examples of process since they're usually only showing you the end result, not the work that goes into getting there.

To hide or not to hide the system tray in a Windows Phone 7 application?

I am debating whether or not to hide the system tray in a Windows Phone 7 application. I've not found any general suggestions on this issue - the official Windows Phone design guidelines don't address it at all - except for Jeff Wilcox's blog post, in which he says he personally likes to see the system tray in applications. I'd like some general advice on this issue from other Windows Phone developers.
Some reasons for showing the system tray are
Doesn't take up that much space
Users may want to see it at times
Reasons for hiding it are
You can't control its background: unless you're using PhoneBackgroundBrush as the background, the top row will stand out
Lots of widely used / official apps already hide it: all games, as well as the official Facebook and Twitter apps.
I'd appreciate all advice on this.
Transparency and colors are now possible with Mango by setting its BackgroundColor and Opacity:
shell:SystemTray.IsVisible="True"
shell:SystemTray.BackgroundColor="Transparent"
shell:SystemTray.Opacity="0"
I would say show it by default, unless it really gets in the way in a manner you can't work around, especially if it's an app and not a game.
I would say it depends on the application/game you are writing.
If the app needs a network connection, or if the user will be in the application for a long period of time, show it so they can keep an eye on the network and battery.
If you need the extra screen space (for a game?) and rarely need the network, hide it.
Or... I guess you could leave it up to the user and give them a setting to programmatically hide/show it.
I think it's best to use the tray in applications that are tools or utilities. Typically these users would prefer more info rather than less when they're using applications on the phone (battery, network, time).
Also, including the tray tends to make the application look more native on the phone (according to me and others I've asked), which is a big plus because the user perceives your app as if it were built into the phone OS.
But if the application is a game, a media app, or something similar, I recommend you take it off, especially on a panorama, because it takes away from the intended design. These types of applications also focus on the content, and seeing multiple little icons at the top can take away from the experience.
To me, the space it occupies really isn't THAT much, so that shouldn't be too much of an issue; it's rather about the purpose of the app, as stated above.
I do like the suggestions that have been given as far as giving the user the choice. Check out this code:
bool showTray;
// Ask the user what they want (e.g. via a settings page):
// showTray = true or false;
SystemTray.IsVisible = showTray;
I've been reluctantly hiding, at least on any view where I have a background image; otherwise it looks too strange to me. I've been considering a setting in my app that would allow the user to choose, and persisting that to isolated storage.
I'm also considering having the top of the screen in the phone background brush color and fading/blending it into another color or background image. I'm not sure how well that would turn out, but as long as it isn't a cheesy-looking gradient effect, perhaps.
I'm hoping MS will eventually add support for transparency in the system tray or otherwise help address this issue. As a user I wish I could force the system tray to always be visible across all apps, but as a developer I realize the visual effect often doesn't look good.
Perhaps the community could come up with a new UX metaphor where, say, double-tapping in the system tray area toggles whether it is visible, and a single tap starts to animate / pop / hint at the system tray...

Any possibility to get a notification if another application receives a scroll event?

I'm developing an application in Cocoa which allows users to draw on any given window in OS X. The drawings move along with the corresponding window when it is dragged around the screen. To complete this tie between the drawings and the windows (and their contents) beneath, I'd like to catch scrolling events from the window in order to react to the positioning/visibility of the drawings.
An example:
The user opens Safari and browses the web. On a specific website s/he draws a circle around a link and takes handwritten notes (this is all considered a drawing, input with a pen tablet). Afterwards s/he moves the window, and the drawings are moved as well so that they remain on top of the link on the website. Then s/he begins to scroll the website and the location of the link changes (it moves up until it's outside of the viewport).
Now I'd like to catch that event and also move the layer with the drawings to keep them on top of the link. When the link is no longer visible, I'd turn off the visibility of the drawing and turn it back on when scrolling brings the link back into the viewport.
I know this is a quite tricky assignment and being able to intercept such events from another application might as well be considered an OS security flaw but maybe someone out there is good enough a coder to give me a hint... :)
The Cocoa Accessibility classes may be helpful, but so far I haven't found a solution.
Thanks for your help.
Oh, and if that's not tricky to you, maybe you can tell me how to get notified when Safari switches Tabs ;)
kkthxbai
I'm not sure if you can monitor scroll events. However, it's a lot easier if you just monitor the position of the link with the Accessibility API.
Just hold a reference to that link and constantly poll it for its position, if the position changes, you know what to do.
You could also try using AXObserverAddNotification, but as far as I am aware, there is no notification you can monitor for position changes.
If you haven't discovered it already, the Accessibility Inspector can help you a lot with identifying things that you can get using the Accessibility API and pfiddlesoft's UI Browser lets you register for notifications.

Window docking advice for Mac

I'm from a Windows programming background when it comes to writing tools, but have been programming using Carbon and Cocoa for the past year. I have introduced myself to the Mac by, I admit it, hiding from UI programming. I've basically been wrapping my OpenGL code in a view, then staying in my comfort zone using my platform-agnostic OpenGL C++ code as usual.
However, now I want to start porting one of my more sophisticated applications to Mac OS.
Typically I use the standard Visual Studio dockable MDI approach, which is excellent, but very Windows-like. From using a Mac primarily now for a while, I don't tend to see this sort of method used for Mac UIs. Even Xcode doesn't support the idea of drag and drop/dockable views, unfortunately. I see docked views with splitter panels, but that's about it.
The closest thing I've seen to the Visual Studio approach is Photoshop CS4, which is pretty nice.
So what is the general consensus on this? Is there a more Mac-like way of achieving the same thing that I haven't seen? If not, I'm happy to write a window manager in Cocoa myself, so that I can finally delve in and learn what looks like an excellent API.
Note, I don't want to use Qt or any other cross-platform libraries. The whole point is that I want to make the Mac app look like a Mac app and leave the Windows app looking like a Windows app. I always find that cross-platform libraries tend to lose this effect, and when I see a native Mac UI, with fancy Cocoa transitions and animations, I always smile. It's also a good excuse for me to learn Cocoa.
That being said, if there is an Open Source Cocoa library to do this, I'd love to know about it! I'd love to see how someone else achieves this, and would help smooth the Cocoa learning curve.
Cheers,
Shane
UPDATE: I forgot to mention a critical point. I support plugins, which can have their own UI to display various plugin specific information. I don't know which plugins will be loaded and I don't know where their UI will live, if I don't support docking. I'd love to hear people's thoughts on this, specifically: How do I support a plugin view architecture, if the UI can't change? Where do I put the plugin views?
Coming from a Windows background, you feel the need to have docking windows, but is it really essential to the app? Apple's philosophy (in my opinion) is that the designer knows better than the user how things should look and work. For example, iTunes is a pretty sophisticated app, but it doesn't let you change the UI around, change the skin, etc., because Apple wants to keep it consistent. They offer the full view, the mini player, and a handful of different viewing options, but they don't let you pull the source list off into a separate window, or dock it in other positions. They think it should be on the left, so there it stays...
You said you "want to make a Mac app look like a Mac app", and as you pointed out, Mac apps don't tend to have docking windows. Therefore, implementing your own docking windows is probably a step in the wrong direction ;)
+1 to Ken's answer.
From a user perspective, unless it's integral to the app like it is in Adobe CS or Eclipse, I want everything as concise as possible and all the different options and displays out of my way so I can focus on the document.
I think you will find with Mac users that those who have the "user skill" to make use of rearranging panels will in most cases opt for hotkey bindings instead, and those who don't have that level of "skill" you're just going to confuse.
I would recommend keeping it as simple as possible.
One thing that's common among many Mac apps is the ability to hide all the chrome and focus on your content. That's the point behind the "tic tac" toolbar control in the top right corner of many windows. A serious weakness of many docking UIs is that they expect you to have the window take up most of the screen, because the docked panels can obscure content. Even if docked panels are collapsable, the space left by them is often just wasted and filled with white space. So, if you build a docking panel into your interface, you should expect it to be visible most of the time. For example, iTunes' source list is clearly designed to be visible all the time, but you can double-click a playlist to open it in a new window.
To get used to the range of Mac controls, I'd suggest you try doing some serious work with some apps that don't have a cross-platform UI; for example, the iWork apps, Interface Builder or Preview. Take note of where controls appear and why—in toolbars, in bottom bars, in inspectors, in source lists/sidebars, in panels such as IB's Library or the Font and Color panels, in contextual HUDs. Don't forget the menu bar either. Get an idea of the feel of controls—their responsiveness, modality, sizing, grouping and consistency. Try to develop some taste—not everything is perfect; just try iCal if you want to have something to make fun of.
Note that there's no "one size fits all" for controls, which can be an issue with docking UIs. It's important to think about workflow: how commonly used the control would be, whether you can replace it with direct manipulation, whether a visible indication of its state is necessary, whether it's operable from the keyboard and mouse where appropriate, and so forth. Figure out how the control's placement and behavior lets the user work more efficiently.
As a simple example of good versus bad control placement and behavior in otherwise-decent applications, compare image masking in OmniGraffle and Keynote. In OmniGraffle, this uses the Image inspector, where you have to first click on an unlabeled button ("Natural size") in order to enable the appropriate controls, then adjust size and position in a low-fidelity fashion with an image thumbnail or by typing percentages into fields. Trying to resize the frame directly behaves in a bizarre and counterintuitive fashion.
In Keynote, masking starts with a sensibly named menu item or toolbar item and uses a HUD which pops up the instant you click on a masked image and allows for direct manipulation, including a sensible display of the extent of the image you're masking. While you're dragging a masked image around, it even follows the guides. Advanced users can ignore the HUD entirely, just double-clicking the image to toggle mask editing and using the handles for sizing. It should be easy to see that, with a few caveats (e.g. the state of "Edit Mask" mode should be visible in the HUD rather than just from the image; the outer border of the image you're masking should be used more effectively), Keynote is substantially better at this, in part because it doesn't use an inspector.
That said, if you do have a huge number of options and the standard tabbed inspector layout doesn't work for you, check out the Omni Group's OmniInspector framework. Try to use it for good, and hopefully you'll figure out how to obsess over UI as much as you do over graphics now :-)
(running in slow motion, reaching out in panic) Nnnnnoooooooo!!!!!
:-) Seriously, as I mentioned in reply to Ken's excellent answer, trying to force a "Windowsism" on an OS X UI is definitely a bad idea. In my opinion, the biggest problem with Windows UI is third-party developers inventing new and inconsistent ways of presenting UI, rather than being consistent and following established conventions. To a Mac user, that's the sign of a terrible application. It's that way for a reason.
I encourage you to rethink your app's UI implementation from the ground up with the Mac OS in mind. If you've done your job well, the architecture and model (sans platform-specific implementation) should translate cleanly to any platform.
In terms of UI, you've been using a Mac for a year, so you should have a pretty good idea of "the norm". If you have doubts, it's best to post a question specifically detailing what you need to present and your thoughts on how you might do it (or asking how if you have no idea).
Just don't whack your app with the ugly stick by forcing it to behave as if it were running in Windows when it's clearly not. That's the kiss of death for an app to Mac users.

Windows Mobile: controlling the scroll bar with a finger

I have a question about Windows Mobile development.
I created a mobile form in a Windows Mobile 6.0 test project, but that example form is slightly taller than the normal Pocket PC form. I know everybody says you can press the scrollbar to get to the bottom, or any other part, of the form.
But I need to use a finger for easy navigation around the form, iPhone style :)
Is it possible? How can I do this?
Windows Mobile 6.5 adds gesture support, that is supposed to allow such functionality for finger control. Of course, your code has to take advantage of it.
You can also write your own, which isn't difficult, but it is still cumbersome.
My answer could be classified as subjective. I try to not show the scroll bar when possible, for just that reason. On most devices that have a touch screen, you can scroll using your finger (and I'm a somewhat large guy -- 6'3" with farmer-kid hands).
But if you are displaying a grid, that isn't always possible. The results can go off the screen very easily. Oh well, grab a pen and hit the scrollbar.
Other screen elements can help too: a tab control, for instance. Separate your controls into groups and put each group on a separate tab. I also do a lot of wizards with LARGE next and previous buttons.
But in all of this, if it is designed to be stylus free, just pray the user doesn't have to type anything using the screen soft keyboard. That just doesn't work with a finger.