cocoa - App architecture - xcode

I have an app that lives off making HTTP GET calls and parsing the JSON responses. I have about 5 or 6 different views that wait for those responses to present the information. The content is mostly text and images.
Problems:
- If I make the requests only when the user enters the view, they have to wait too long, and that's not a good user experience.
- If I make all the requests in the first view controller, the app becomes slower, and sometimes it takes too long to respond to user taps.
Questions:
- What is the best way to implement the app so that I get both a good user experience and good performance?

I think it depends on the amount of data you are downloading. If a view needs a lot of data, it might be best to download it only when that view is about to be shown, so your app doesn't use bandwidth unnecessarily.
If you do want to load the data up front, it might be best to do so on a separate thread so the UI thread isn't blocked.
You could also cache the data and use this cache while the newest data is downloading in the background. This creates a seamless user experience.
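A rough sketch of that cache-then-refresh idea (shown in JavaScript for brevity, since the same pattern carries over to Cocoa with NSCache/NSURLSession; the URL and render callback are illustrative, not part of the question):
var cache = {}; // simple in-memory cache, keyed by URL

function loadAndRender(url, render) {
  // 1. Render immediately from the cache if we already have data.
  if (cache[url]) {
    render(cache[url]);
  }
  // 2. Fetch fresh data in the background and re-render when it arrives.
  fetch(url)
    .then(function (response) { return response.json(); })
    .then(function (data) {
      cache[url] = data;
      render(data);
    });
}

// Call this when a view is about to appear.
loadAndRender('/api/feed', function (data) { /* update the view's text and images */ });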

Related

UI/UX Design in Real Time Applications

Lately I've been developing applications with MeteorJS, and I notice that most developers/designers don't convey to the user that there is no need to refresh the browser.
In Meteor, refreshing the browser is completely pointless; data changes are pushed directly to the client.
What UI/UX design practices match this kind of application behaviour?
I thought about adding a refresh button to the UI in the hope that the user won't refresh the browser, but that is bad design; it's not honest design. With this approach I'm lying to the user.
I've been dealing with this lately and there are a few different ways I've handled it, depending on the scenario.
Sometimes it doesn't really matter that much for UX that the user refreshes. Generally, for data that changes often, you can use methods like those described in this link about designing for realtime (provided by larsbuur's answer) to make your user understand that they don't need to refresh. The general idea is to use unobtrusive but visible and clear cues that the data has changed.
If there is not much UI state that can't be recovered from the route, refreshing the browser should work pretty much how the user expects. Generally, you don't want to try to make browser features work in unexpected ways. If you want to persist UI state across refreshes, you can try something like the Meteor package u2622:persistent-session.
You could also remind the user that they don't need to refresh after they do. For example, using the web storage API:
Meteor.startup(function() {
  // sessionStorage survives both manual refreshes and hot code pushes in this tab;
  // Session only survives hot code pushes, so the two can be told apart.
  var reload = !!sessionStorage.getItem('reload');
  var manualReload = reload && !Session.get('hotReload');
  if (manualReload) {
    // Tell the user they don't need to manually reload the browser
  }
  sessionStorage.setItem('reload', true);
  Session.set('hotReload', true);
});
In certain cases, it is more important that the user understands they shouldn't refresh. For example, in online games you often have a pre-match lobby system where game settings can be adjusted, and the lobby leader is the player who has been in the lobby the longest. If the lobby leader refreshes, they will lose their position.
It is possible to make this work by having a reconnect time window during which the user can reclaim their position; otherwise they lose their spot in the lobby after the timeout. The mizzao:user-status package is helpful with this. This is, in my opinion, the most friendly UX option, and you can even display a spinner icon next to the username to indicate to other users that the user may be reconnecting. However, my attempts at implementing this have proven to be a bit buggy and unpredictable when it comes to data integrity/server reliability.
For now, I've chosen to simply use the beforeunload event like StackOverflow does when you are typing. It's not very customizable and not as friendly as transparent reload/reconnect, but sometimes it's the best compromise when you simply want to warn the user why they might not want to refresh right now.
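A minimal sketch of that beforeunload warning (the userIsInLobby flag and the message text are placeholders; most modern browsers show their own generic wording rather than a custom string):
window.addEventListener('beforeunload', function (event) {
  if (userIsInLobby) { // hypothetical flag: only warn while leaving would actually cost something
    event.preventDefault();
    event.returnValue = 'If you refresh, you may lose your place in the lobby.';
    return event.returnValue;
  }
});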

Implement real-time updating notification feature

I'd like to implement, for various sections of my app, a visual indicator of the items whose status is pending, similar to Facebook's / Google Plus's unread notification indicator... I have written an API to fetch the count to be displayed, but I am stuck on updating it every time an item gets added or deleted. I could only think of two approaches, neither of which I am satisfied with: the first is making an API call for the count whenever a POST or DELETE operation is performed; the second is refreshing the page after some time span...
I think there should be a much better way of doing this from the server side. Any suggestions, or any gem that does this?
Even in Gmail it is refreshed on client request: the server calculates the number of new items, and the client initiates the request (probably with AJAX). This requires an almost negligible amount of data and processing time, so you can probably get away with it. Various cache gems can even store the refreshed part of the page if no data has changed since the last request, which also means it is only recalculated when something has changed.
UPDATE:
You can solve the problem in basically two ways: server-side push, or a client-side query. Push is problematic for various reasons and rarely used in web environments, at least as far as I know. Most pages (if not all) use a timed query to refresh this kind of information. You can check this with the right tool, like Firebug for Firefox, where you can see the individual requests initiated towards the server.
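For illustration, such a timed client-side query might look roughly like this (the /notifications/count endpoint and the element id are assumptions, not something from the question):
setInterval(function () {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/notifications/count');
  xhr.onload = function () {
    // Update the badge with whatever count the server reports.
    document.getElementById('pending-count').textContent = xhr.responseText;
  };
  xhr.send();
}, 30000); // poll every 30 seconds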
When you fire a request through AJAX, the server replies. Normally it generates a page fragment to replace the old content with the new, but a cache mechanism can intervene, and if nothing has changed you may get the previously stored cache fragment. See some tutorials here, for various gems; one of them may fit your needs.
If you would prefer a complete solution, check out Faye (tutorial here). I haven't used it, but it may be worth a try; it seems simple enough.

Designing an application around HMVC and AJAX [Kohana 3.2]

I am currently designing an application that will have a few different pages, and each page will have components that update through AJAX. The layout is similar to the new Twitter design where 'Home', 'Discover', and 'Connect' are separate pages, but interacting within the page (such as clicking 'Followers' or 'Following') uses AJAX.
Since the design requires an initial page load with several components (in the context of Twitter: tweets, followers, following), each of which can be updated individually through AJAX, I thought it'd be best to have a default controller for serving pages, and other controllers with actions that, rather than serving full pages, strictly handle querying the database and returning JSON objects. This way, on initial page load several HMVC requests can be made to gather the data for each component, and AJAX calls can also be made to update each component individually.
My idea is to have a Controller_Default that handles serving pages. In the context of Twitter, Controller_Default would contain:
action_home()
action_connect()
action_discover()
I would then have other Controllers that don't deal with serving full pages, but rather components of pages. For instance, in the context of Twitter Controller_Tweet may have:
action_get()
which returns a JSON object containing tweets for a specific user. action_home() could then make several HMVC requests to get the data for the several different components of the page (i.e. make requests to 'tweet/get', 'followers/get', 'following/get'). While on the page, however, AJAX calls could be made to the function-specific controllers (i.e. 'tweet/get') to update the content.
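For illustration, the client-side update for one component might look roughly like this (assuming 'tweet/get' returns a JSON array; the element id and field names are just placeholders):
function refreshTweets(userId) {
  // Ask the function-specific controller for JSON rather than a full page.
  fetch('/tweet/get?user_id=' + userId)
    .then(function (response) { return response.json(); })
    .then(function (tweets) {
      var list = document.getElementById('tweets');
      list.innerHTML = '';
      tweets.forEach(function (tweet) {
        var item = document.createElement('li');
        item.textContent = tweet.text; // placeholder field name
        list.appendChild(item);
      });
    });
}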
My question: is this a good design? Does it make sense to have the pages served through a default controller, with page components served (in JSON format) through other function specific controllers?
If there is any confusion regarding the question please feel free to ask for clarification!
One of the strengths of the HMVC pattern is that employing this type of layered application doesn't lock you into a workflow that might be difficult to change later on.
From what you've indicated above, this would be perfectly acceptable as a way of serving content to a client; the default controller wraps sub-requests, which avoids multiple AJAX calls from the client to achieve the same goal.
Two suggestions I might make:
Ensure that your Twitter back-end requests are abstracted out and managed in a library to make the application DRY'er and easier to maintain.
Consider whether the default controller is making only the absolutely necessary calls on each request. Employ caching to avoid pulling infrequently changed data on every request (e.g., followers might only be updated every 30 seconds). This of course depends entirely on your application requirements, but if you get heavily loaded you could quickly find your Twitter API request limit being reached.
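A rough sketch of that kind of time-based caching (shown in JavaScript to keep the example short; in Kohana you would typically reach for its cache library instead, and the 30-second TTL and fetchFollowers function are illustrative):
var followerCache = { data: null, fetchedAt: 0 };
var TTL_MS = 30 * 1000; // refresh followers at most every 30 seconds

function getFollowers(fetchFollowers) {
  var now = Date.now();
  if (followerCache.data && now - followerCache.fetchedAt < TTL_MS) {
    return Promise.resolve(followerCache.data); // serve the cached copy
  }
  return fetchFollowers().then(function (followers) {
    followerCache = { data: followers, fetchedAt: now };
    return followers;
  });
}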
One final observation: if you do find the server is experiencing high load and Twitter API requests are taking a long time to return, consider provisioning another server and installing a copy of your application. You can then "point" sub-requests from the default gateway application to your second application server, which should help improve response times if the two servers are connected by a high-speed link.

Why do update streams require the user to manually 'load more content'?

Looking at a lot of web applications (websites/services/whatever) that have a 'streaming' component (typically a 'social' app) - think Facebook's 'Wall', Twitter's 'Feed', LinkedIn's 'News Feed'.
They have a pretty similar characteristic: a notice of new items is added to the page (automatically, presumably via a background AJAX call), but the new HTML representing the newest feed items isn't loaded into the page until the user clicks this update link.
I guess I'm curious whether this design decision is made for any of the following reasons, and if so, could anyone who has worked on one of these types of apps explain the reasoning they found for doing it this way:
- User experience: updates for a large number of 'Facebook Friends' or 'Pages' or 'Tweets' would move too quickly for anyone to absorb and read with any real intent, so the page isn't refreshed automatically.
- Client-side performance: fetching a simple 'count' of updates requires less bandwidth (less load time) and less JS running to update the page for anyone who has the site open, and thus a lighter-weight feel on the client side.
- Server-side performance: fewer requests coming into the server to gather more information about recent updates (less outgoing bandwidth, more free cycles for grabbing information for those who do request it by clicking the link). While I'm sure the owners of these websites aren't 'short on resources', if everyone who had Twitter or Facebook open in the browser got a full update fetched from the server every time one was created, I'm sure it would be a much more significant drag on resources.
- They are actually trying to save resources (it takes a cup of coffee to perform a Google search (haha)), and sending a few bytes of data to the page representing the count of new updates is a lot lighter a load for applications that are being used simultaneously in hundreds of thousands of browser windows (not to mention API requests).
I have a few more questions depending on the answer to this first question as well...so I'll probably add those here or ask another question!!
Thanks
P.S. This question got trolled off of the 'Web Applications' site -- so I brought my questions here, where they're not too 'broad' or 'off-topic' (-8
Until the recent UI changes to Facebook, they did auto-load new content. It was extremely frustrating from a user perspective: you'd be reading through the list of your friends' posts and all of a sudden everything would shift, and you'd have no idea where the post you were just reading went.
I'd imagine this is the main reason.

specific limitations of AJAX?

I'm still pretty new to AJAX and JavaScript, but I'm getting there slowly.
I have a web-based application that relies heavily on MySQL; there are individual user accounts that are accessed, and the UI is populated with user-specific data.
I'm working on getting rid of a tabbed navigation bar that currently loads new pages because all that changes from page to page is information within one box.
The thing is, that box needs to reload info from the database, etc.
I have had great help from users here showing that I need to query the database within the PHP page that AJAX is calling.
OK - so pardon the lengthy intro - what I'm wondering is: are there any specific limitations to what AJAX can call that I need to know about? For example, someone mentioned that it's best not to call script files, and that I should remove scripts from the PHP page being called and keep those in the 'parent' page. Any other things like this I need to keep in mind?
To clarify: I'm not looking to discuss the merits/drawbacks of the technology. I'm wondering about specific coding implementations that I need to be aware of (for example, I didn't realize until yesterday that even if I had established a MySQL connection on the page, I would need to re-establish that connection in the called page as well... makes perfect sense now).
XMLHttpRequest, which powers AJAX, has a number of limitations. I recommend brushing up on the same-origin policy. This is a pivotal rule because it limits where AJAX calls can be made.
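A minimal illustration of what the same-origin policy allows and blocks (the URLs are hypothetical):
// Same-origin request: allowed, since the relative path resolves to the page's own origin.
var sameOrigin = new XMLHttpRequest();
sameOrigin.open('GET', '/user_data.php');
sameOrigin.onload = function () { console.log(sameOrigin.responseText); };
sameOrigin.send();

// Cross-origin request: the browser blocks reading the response unless
// other-site.example explicitly allows it with CORS headers.
var crossOrigin = new XMLHttpRequest();
crossOrigin.open('GET', 'https://other-site.example/api/data');
crossOrigin.send();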
First, you can't have Javascript embedded in the HTTP response to an AJAX call. That's a security issue.
You don't mention how dynamic the database is, but if the data to be displayed in the tabs doesn't have to be real-time, why not cache it server-side?
I find that, like any other technique, AJAX works best under tightly controlled conditions. It wouldn't make much sense for updating nearly the whole page, unless you find that the user experience is improved with an on-page 'loader'. Without going into workarounds, the disadvantages include losing the browser back button / history, issues such as the one you mentioned, embedded resources and other rich content suffering as well, and simply having an extra layer of complexity to deal with in your app. Don't treat it as magic sauce for your app - make sure every use delivers specific results that benefit your client / audience.
IMHO, it's best to put your client-side JavaScript in a separate file and then import it - a neater container. One thing I've faced before is getting XML back that contains code to run, such as more JavaScript; it's worth checking early on whether this is likely and avoiding it, rather than having to resort to evals.

Resources