Designers to Developers: What should I know?

This is a broad question, so let me narrow it a bit. I am a graphic designer entering the world of web design. I'm not totally green in this field, but I know enough to know that I have a lot to learn. From friends and from posts on this site I realize there is often a harmful disconnect between design and development.
I'm getting ready to place a client log-in/password "portal" on my website. Nothing fancy, just enough to provide some peace of mind for my clients and a space for secure download of imagery. I am only handling the look and feel of this one, nothing more. What potential pitfalls should I know about, on my end, to avoid making the development end hairy?
And of course any other nuggets of wisdom are appreciated too. Thanks!

Perhaps the worst mistake that many designers make when working with developers is to assume that developers aren't creative, and that we couldn't possibly have any good ideas or input into the design. The fallacy of this is obvious, because what we do all day, every day, is create things. It's taken for granted that designers can raise bugs against developers when our code doesn't reproduce the design exactly, yet many designers get very touchy when we raise suggestions about how their design could be improved, even in minor ways. Sometimes the suggestions may not be suitable, but occasionally you might be able to improve your design.
In addition, I have frequently found that designers underestimate what developers are capable of achieving, so they will sometimes suggest a simpler alternative than they really want. By opening up the dialog and offering a couple of options, say a minimum one and an ideal one, you might be surprised to find that you can have elements of the ideal one, all of it, or even something better as you discuss what can actually be achieved (sometimes what seems hard to a designer seems easy to the developer, and vice versa, because they are different disciplines). Of course, the converse is also true and you might be aiming too high, so you need to find that out as well.
In summary - you're absolutely right that any disconnect between design and dev is detrimental both to morale and the final product. So talk to the devs as soon as you have initial designs, and keep a good two-way dialog open.

I am a web developer, so I'm answering this from my viewpoint. There is really no serious pitfall as long as the developer and the designer understand each other. One tries to make the website look as attractive as possible, while the other tries to make the built website match the design as closely as possible.
So when I'm asked to do the impossible (like replacing the browser's default scrollbar with an animated image of a cat), I'll just tell the designer that it can't be done, the reasons why, and suggest possible alternatives (Flash?). After that, I expect the designer to understand and cooperate with me to choose the best alternative, not suddenly turn grumpy.
A little basic knowledge of the developer's work would help, too.

Some ideas that may smooth the process:
Talk to the developers directly and ask if they have any specific requirements. Different platforms have different needs and requirements. Communication is important.
Get the basics of good HTML and CSS down. There are many references but you can try A List Apart as a starting point.

Related

The future of Web Design & Front-end Development?

I am a little bit confused about the future of my job (front end web developer & designer).
I always learn the latest technologies to do my work, but I am anxious about losing my job in the future, because every day a new "website builder / create your own website" site or application comes out.
While the thinking part (user experience) is getting harder, the design part is getting simpler (flat design, material design) day by day.
Do you think "Web Design & Front-end Development" will transform into another job field?
Or what should we learn to be able to do our job in the future?
You're seeing something that is common across all of software development: whatever you learn today will not be the hottest thing in the future. You've also identified an important point here: the thinking part is getting harder.
The thinking part is what sets apart a good developer from a great developer. A good developer can learn the current hotness and deploy it. A great developer can take a difficult problem and come up with solutions that meet the need, are scalable, and are well-written and well-documented so that someone else can come along in the future and update them to solve future needs.
Being able to show how you handled a difficult problem, the trade-offs you made, what you delivered, how you would improve on it, and what you might do differently if you had known then what you know now is how you set yourself apart from your fellow web developers. Another way to set yourself apart is to know the differences between the various technologies you could use and be able to articulate why you might choose one over another; it's very rare that a new technology perfectly replaces an existing one. Doing so shows a knowledge of technologies, the flexibility to learn new ones, and the ability to weigh the strengths and weaknesses of each and determine how they affect what you need to develop.
It's quite possible that web development will become something else, or that you will decide to do something else. Development has changed significantly, and it seems likely that it will continue to do so. Likewise, as you continue in your career, you might find that you want to pursue something else (perhaps a different type of development, perhaps something technical but not development, perhaps something else entirely). The transferable skills that you can learn as a web developer are problem-solving, flexibility and adaptability, working in teams, and articulating how you have approached a problem and why you chose a particular method of handling it.

Should a developer be a designer?

I have been developing websites for quite some time, but I am not so good at designing them. My boss is recommending that I take some lessons in it.
But I really want to stick to development rather than design.
You don't need to be a designer. But I would highly recommend you understand the process and some of the techniques used. Having that knowledge will assist in both working with designers and providing better back ends.
I'd do the course, but make it clear to my boss that it's not what I want to do as a main job.
Answer yourself these questions:
What is your objective, the dream: developer or designer?
What are you best at?
Will you be able to do justice to the design requirements?
Is it common for a developer to also be a designer?
Will you be able to concentrate on both, given the ever-changing trends and technologies?
Having said that, I have seen people who have both skills, but even they are rarely equally strong in both areas.
Developer as well as designer:
Chris Coyier of css-tricks.com
Pekka
It depends on what you want to be in the future. Designing and programming are two different skills, and obviously websites require both. As a developer, if you have some basic knowledge of design, it will help you, and also the designer, to make the website much easier to maintain. But personally, I do not think you have to dive deep into design.
A good developer knows a lot about design but doesn't have to be good at designing things.
I've seen too many developers build out a given design and make many mistakes because they don't see the little intricacies that are enormously important for a well-designed website.
One particular design aspect I find many developers (even good ones) are not especially strong at is understanding color harmony. Finding the right combination of colors on a page seems easy, but it often isn't. The course may be helpful in that regard.
I started off as a developer and then progressed into being a developer/designer.
You start to understand design aspects, UX aspects, and the like.
So I believe a good developer should also have a good understanding of design.
The bottom line is your boss thinks you'd benefit from a bit of immersion in design, and you probably will.
It doesn't sound like he wants you to become a designer, just get a feel for it. He's not asking for a career change.
There are always benefits in learning something new, and if your boss is backing you to take some time to do it, go for it.
As a developer you should know something about usability and software ergonomics. You should know the basic structure of a website. And you should be able to implement a given design.
I think it is not the job of a developer to create a design.
Try to answer: "Why does your boss want you to improve your design skills?"
Maybe your team is too expensive and your boss is considering letting the designer go, and he is wondering whether that is possible.
Or maybe your designer complains to your boss that developers constantly ask him to rework insignificant details, interrupting his main tasks, so your boss wants to delegate small design decisions to the developers.
If that's the case, I think there is nothing wrong with improving your design skills, as long as your boss doesn't want to turn you into a designer.
I also agree with everyone who says that developer and designer are two different roles.
Well, if developing is the field you are comfortable with, stick with it.
But learning is never bad. Try to gain the knowledge first; after taking the classes, you can answer this question yourself.
Wow, I'm actually in the exact opposite of your situation. I'm a designer just crossing the line of web development. But in my case, it was my own decision and it wasn't imposed by anyone.
It's always a plus if you have web development skills on top of design skills. I guess it holds true if you're a web developer and have design skills as well.
It never hurts to learn the basics, as others have mentioned, but keep in mind to stick with what you're good at and master it. It's better to be a master of something than a jack of all trades. With so much competition out there, you really have to excel at your craft.
Learn both, but master one, I'd say. I personally see myself as a developer foremost, but I do know a thing or two about design - and, more specifically, implementing it (think CSS and the like).
However, I freely admit that I am not good at producing a design that looks good. A functional one, maybe, but not a good-looking one. You could say something like that to your boss: yes, you are capable of learning to design, but you will never be as good as a real designer. Likewise, a designer learning to program will never be as good as a dedicated developer.

How to make and apply standards for UI development?

I work in a small and young team of developers and we have problems that we are not sure how to solve.
On previous projects, every developer worked on tasks that were based on use cases. So, once the system architecture was set, each team member worked on the user interface and business logic of the tasks assigned to him.
This kind of organization gave us problems with the UI. Each developer had his own ideas about how the UI should look, where buttons should go, and so on, and even though we had one CSS designer, a lot of refactoring had to be done to make the website look consistent.
How do you deal with this issue?
Do you split tasks by layer rather than by whole use case?
Do you use some technical solution to achieve this, or is it just a written standard that every developer needs to follow?
Thanks
Everyone has their own style and it would be difficult and a waste of time to define a standard that would get everyone to draw the UI in a consistent manner. Instead, elect your best UI designer to do what he does best and design the UI for the whole system. Funneling all UI changes through the designer would be difficult so just let your developers "mess it up" as they implement new use cases and just have your designer clean it up before the release. It shouldn't be hard for him/her to rearrange the existing forms and bring some consistency back to the UI.
I've found this 12 Standard Screen Patterns article very useful.
A solution might be to create sketches of all the screens of your application, have them reviewed by an ergonomics expert to correct the biggest mistakes, and only then give them to your developers.
This way, they would know what the screens they are developing should look like -- there will still be a few differences in the end, but they should not be big ones, and they should be easier to fix.
And it would mean that each developer doesn't have to imagine what the perfect screen would look like: each screen would be coherent with the others.
Adopt the tried and tested MVC pattern and let the view be decoupled from the business logic. Then ask a UI designer to produce sketches and work to them. UIs are best done top-down, in my experience: the user gets an overall view before being presented with all the details, and defining and capturing this hierarchy makes for good UIs. Coding of business logic is done, as you mentioned, on a use-case basis, mostly bottom-up, and this is where the code falls out of sync with the UI.
Designate one person (preferably someone with graphic design experience, even if they're not really a programmer) and give them the authority to make cosmetic changes to all forms, pages and controls at any time, and have them be responsible for the overall look and feel of the application.
As far as metrics go, keep track of how much time this one person has to spend "fixing" each programmer's work, and make sure the programmers are aware of these numbers. The idea is to encourage them to make their stuff look like it should from the beginning, but also not to do weird things based on what they think stuff should look like. I've had to spend more time undoing my coworkers' bizarre design choices than anything else.
Don't be afraid to have outside sources review the design work of each programmer. It's very common for programmers to 1) produce horrible-looking UIs, and 2) believe the UIs look fantastic. You should do what the Army does with boot camp: break them down completely right from the start, so that you can build them back up again the right way.
Part of the problem with creating your own written standard is that while well meaning, there could be mistakes or better ways to do things than what's been standardized. For example, where I work, the standardized cancel button does nothing when you click on it (it's been wired to Reset).
Instead, I recommend choosing existing standards, such as The Macintosh Human Interface Guidelines or Windows User Experience Interaction Guidelines. Even if the standard is wrong, it's rarely profitable to deviate from widely established conventions.
Then pick up some good books for the developers, such as "Designing Interfaces: Patterns for Effective Interaction Design". Good user interface design is partially a matter of good taste, and while not every developer will be interested in the subject, it's in your best interest to help them improve.
Next, empower your QA team to file bugs when the interface for one product is inconsistent with another. The developer can then either standardize or justify the deviation if he has a reason. We do this; it works pretty well.
Lastly, go over your existing products and get a consensus on how their interfaces should be unified. Bring in (and keep) a usability expert if you can. I've seen good ones do amazing work.
There really is no clear solution for how to deal with UI problems. There are however several approaches one can take to combat the problem of having things become too complicated:
Use cases are usually cross-disciplinary in nature, so the responsibility for getting a use case done should be split between the people who can implement it properly. Programmers and designers need to cooperate.
Everyone on the team needs to keep separation of concerns in mind, i.e. things that can be separated should be kept that way, preferably from as early as possible. There are many ways to do this, e.g. applying the MVC pattern in your project (which is a very broad way to put it). Presentation and logic should be separate so that changes in one layer do not affect the other (see the sketch after this list).
Someone needs to be responsible for the overall UI design so it is consistent throughout the application, preferably someone who is both a graphic designer and has some insight into usability. UI design needs to be planned along with the use cases and revised constantly as development goes on. A consistent UI is very important, and developers need to be on board with it.
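As a rough sketch of that separation (in TypeScript, with invented names, since the question doesn't specify a stack), the point is simply that the logic layer never knows how its results will be displayed:

```typescript
// Minimal sketch of keeping presentation separate from business logic.
// Invoice, outstandingTotal and renderOutstandingTotal are hypothetical
// names used only to illustrate the layering.

interface Invoice {
  id: string;
  amountCents: number;
  paid: boolean;
}

// "Model" layer: pure logic, no knowledge of markup or styling,
// so it can be unit-tested and reused however the UI changes.
function outstandingTotal(invoices: Invoice[]): number {
  return invoices
    .filter((inv) => !inv.paid)
    .reduce((sum, inv) => sum + inv.amountCents, 0);
}

// "View" layer: the only place that decides how the number is shown.
// A dedicated UI owner can change this freely without touching the
// business rule above.
function renderOutstandingTotal(invoices: Invoice[]): string {
  const dollars = (outstandingTotal(invoices) / 100).toFixed(2);
  return `<p class="total">Outstanding: $${dollars}</p>`;
}

const sample: Invoice[] = [
  { id: "A-1", amountCents: 12500, paid: false },
  { id: "A-2", amountCents: 4300, paid: true },
];
console.log(renderOutstandingTotal(sample));
// -> <p class="total">Outstanding: $125.00</p>
```

With that split, whoever owns the look and feel can keep the rendering consistent across developers without ever touching the business rules.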

How to write "good" user interface texts?

Many applications are let down by the quality of the 'writing' in their user interfaces: typically, poor spelling, grammar, inconsistent tone, and worse yet, "humour" are the usual offenders.
Are there good resources that can help developers to write UI messages that give a professional and positive impression to your customers, even when your code's going to hell in a handcart?
Thanks, all — Some great resources here, so I will CW this question. I'm accepting Adam Sill's answer because it's the one that (as a developer of desktop apps) I found most pertinent.
Since XP, I've been a fan of the Windows UX Guidelines sections that cover how to properly structure text (how to ask questions, how to make assertions in dialogs, etc).
http://msdn.microsoft.com/en-us/library/aa974176.aspx
http://msdn.microsoft.com/en-us/library/aa974175.aspx
Read The Elements of Style. Then re-read it.
Also, anytime you are working with a program or website make a conscious effort to notice how they choose to do their writing. Imitate those you like.
The resources found at Writing for the Web might be useful to you.
The best tool for this is called "primary education". Many developers seem to have missed this, and I don't know how to fix that problem.
Also, this may be a British thing, but I think you mean "humor" and "going to Hell in a handbasket". :)
This book has a lot of good advice:
GUI Bloopers 2.0
Short version:
Be consistent throughout your application or app suite. Don't call the same feature two different names, even if they're in different dialogs, etc. Develop a product lexicon that everyone references.
Use the same terms that people who use your software use (e.g. users don't refer to themselves as "users").
Don't call two different things by the same name.
Put all of the messages displayed to the user in a central place (i.e. a resources file of some kind); a small sketch of this appears after the list. This makes it easy to review all of the messages for spelling, tone, consistency, and whatever else you want to check.
Usability test your software to see if the messages make sense and people can use your software easily. If they can't, change the resources file and test it again.
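To make the "central place" point concrete, here is a minimal sketch (assuming a TypeScript codebase; the file name and message keys are made up for illustration):

```typescript
// messages.ts - a single place for every piece of user-facing text.
export const messages = {
  saveSucceeded: "Your changes have been saved.",
  saveFailed: "Your changes could not be saved. Please try again.",
  confirmDelete: (itemName: string) =>
    `Are you sure you want to delete "${itemName}"? This cannot be undone.`,
} as const;

// Elsewhere in the application, code references these entries instead of
// hard-coding strings, so spelling, tone, and terminology can all be
// reviewed (or handed to a copywriter) in one place.
export function deletePrompt(itemName: string): string {
  return messages.confirmDelete(itemName);
}

console.log(deletePrompt("Q3 report"));
// -> Are you sure you want to delete "Q3 report"? This cannot be undone.
```

The same file is also a natural unit to hand to a proofreader or to extract for translation later.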
I would suggest showing your UI to as many people as you can--preferably people who read a lot (just because reading does wonders for your grammar and vocabulary).
Getting something out that people can examine, however, is awesome--even if it's just a demo of the GUI.
If you work at a company, get to know your QA and Tech Support people. They are usually really wonderful once they understand what you are trying to do--they will review your UI, give you input on text and usability as well as possibly new requirements nobody in engineering would come up with.
If you work on your own, try to find a potential customer or two to review your UI. Ask them to pay attention to the text...
The more eyes, the better. You might even ask your parents, wife or other family. What can it hurt?
Get your application's texts proofread by someone who does just that for a living. Then have the UI walked through by someone who does usability for a living. Neither of these two people should have been involved in the development.
It's the only way to make sure.

Microsoft User Interfaces: are they still user friendly?

I find that most of Microsoft's new programs are very hard to use.
Microsoft Office 2007 (Word especially) I find hard to use.
Microsoft IIS 7.0 is a PAIN; I never remember which icon to click on, and things are just too cluttered and hard to find.
As programmers, we have to design according to what people are used to, so what exactly is MS telling us to do?
we have to design according to what people are used to
Well, that's a slight misconception. You're not wrong that people familiar with something will appreciate the interface remaining familiar, but not all change is bad. You have to weigh the power of the change against the harm it does to veteran users.
Let's take Office 2007 as an example.
The ribbon interface is a huge departure from the interface Office has used for as long as I can remember but there is sound logic behind it.
User functions are grouped by activity and it's very easy to change which set of functions you're looking at.
They're also contextual, so some things only show up when you're working on a table or an image, etc.
These both help keep the clutter down - something really quite useful as these apps grow in feature-sets. Rather than spending hours choosing and customising a set of toolbars, you have access to everything through the tabs.
And Microsoft did this all the right way. They tested the interface on lots and lots of real people. They listened to see what worked and what they should fix or drop. They also kept some legacy keyboard shortcuts for seasoned pros.
The redesign effort was targeted at making life easier for beginner- and intermediate-level users. Mission accomplished. The problem you're having is overcoming your own familiarity, and I can't be more helpful than to say: it'll take time, but you'll manage it in the end.
Look, I'm just a simple caveman, scared by your post-modern architectures and vroom vroom machines go honk. I'm used to the simple life of the paleolithic era; charcoal cave paintings and bone-based technology. I can't make heads or tails of your fancy ribbon UIs and pointy-clicky icons. That's why I'm never upgrading from DOS. The old ways were always the best, and learning new ones bad like fire.
Well, Microsoft has to balance this. On one side, users scream for new features and change-for-change's sake in a lot of MS software. On the other, lack of backwards compatibility (including subjective UI compatibility) is a deal breaker. Really no way to win there.
That said, I don't think we need to design according to what people are used to; neither does Microsoft. Change will never happen if we just do what has always been done before. IIS is not developed for programmers; it's developed for IT people. And the new interface serves them well. Likewise, Office is designed for office drones, not programmers, and the new Office is very discoverable for that particular group.
I think they take a while to get used to, but I do like them. (Although I will fully admit I am a Mac person and I like the Mac UI a lot better.)
The biggest thing I've seen about the UI that is difficult is the fact that it is so much different from previous versions (I'm talking about the current version of Office). That seems to be where most of the rub is.
The rule I was taught about UI design is that things need to be familiar to the user (that is really what makes an interface "intuitive"). MS broke that rule, but from a business perspective they are allowed a little leeway, simply because they control so much of the market. Ultimately, they know that a radical change won't cost them much market share, because for most people and businesses there isn't a viable alternative. (I know there is OpenOffice, but migrating a mid-sized to large office to it will cost as much money as, or more than, just continuing to use the same product.)
Do we have to design according to what people are used to? Yes, we kind of do. Does this mean we have to make it look like what MS is doing now? Not necessarily. What we have to do is create a design the users can relate to. They have to be able to make a jump of logic from what they already know to using the products we create. If not, they most likely won't use the application unless they are absolutely forced to.
User interface and user experience are totally separate concepts. (Simon Guest; User Interface Blog.)
Microsoft did quite a bit of research in the raw usability of Office 2007, and found that while there is a learning curve for people like yourself, or me, who are experts in the tool, newer users and non-experts experienced much greater discoverability of more advanced features, and wound up using more of the application's features and power. Yes, there is a learning curve if you knew Office 2003 inside-out (which, frankly, few of us really did).
Now I'm not making apologies -- Microsoft's UIs haven't always been easy to use, and sometimes they fail miserably. (Personally I think not standardizing all of their office products on the Ribbon is a classic example -- there's a large context switch in my brain when I open Project or Visio, compared to when I open Word.)
As for what developers are "supposed" to do: Bear in mind that the ribbon isn't ideal for every scenario. If you're using it as a glorified, prettified toolbar, it's being used incorrectly. It's designed to help you organize literally hundreds (if not thousands) of commands in a way that makes them discoverable to your end user. It's supposed to reinforce the traditional experience of discovering the abilities of your application in a safe way (see any edition of About Face), when the depth of your application is too great to function within menus.
Aside from that, bear in mind that we should generally be making the most appropriate UI for our own audience, as Microsoft is attempting to do for its own audience. Again, we may find these things more difficult to use, as we are used to doing things a set way -- but it's the right thing (typically) for Microsoft to do. Remember that we programmers are not the target users of most UI. (How many of us turn off visual themes, for example? Now how many normal end users? BTW, I don't fall in that camp; I'm one of the few who actually finds Vista moderately attractive.)
Again, at the end of the day, what Microsoft does matters only to the extent that it becomes what your users expect, and then only if you can't educate them that "your way" is better. In any event, if usability is truly critical for you and your users, it's time to invest in usability testing and ensure that your application really is as usable as you think it is. And start reading usability sites. (You don't have to agree with them all, but understand them.) Here are some samples:
AskTog (Bruce Tognazzini, inactive but the archives are a treasure trove)
UseIt (Jakob Nielsen)
jnd.org (Don Norman)
Office User Interface Blog (Jensen Harris)
Microsoft Windows User Experience Interaction Guidelines (The holy word on Windows)
It's interesting because there was a lot of talk about the usability testing that went into the design of the Ribbon controls, but along with almost everyone else I know I find them very difficult to use. I keep losing controls that I need and not being able to get them back until I've cycled through another three or four document views looking for them. I instinctively move my mouse to menus that no longer exist.
I wonder if they would be easier to someone not accustomed to the earlier office products- maybe this is who they did their usability testing with. I don't think the design of the new interfaces is bad as such, but it is different enough that for those of us who don't spend our whole time staring at Office but have been using the product for a long time it makes life difficult. I guess most real power-users would be doing most tasks from keystrokes anyway which presumably haven't changed too much.
The business problem is really that they need an incentive for people to upgrade, so they keep adding new features (who do you know that uses all the features of Word?), and then they need to find ways to present those features without making the application impossibly cluttered, which was certainly happening in the previous version of Office.
I'm not sure what we take from this as developers- maybe it's that we should design for usability from the start or find ways to make the transition between old and new functionality as easy as possible for our existing users.
Microsoft IIS 7.0 is a PAIN
I'm relieved to hear that others have found the new IIS UI a challenge. I stumbled into it without being forewarned, and was completely discombobulated. There is so much clicking around. You have to memorize where the feature is, or click and click. I don't know of a way to see all of the IIS settings at once (not that you could before, either, but at least you could stay in the single tabbed dialog).
I think it is really hard to adapt to an entirely new UI when you are so familiar with the old one. I am similarly disoriented by the ribbon menus. More clicking around to find the features. And not everything is in the ribbon. Some is in menus accessible from other entry points, such as file properties.
For new users who never saw the old UIs, it probably isn't so much of a problem.
I guess what I really dislike is having to spend the time learning the new UI, at the least convenient time. There is an immediate loss of productivity when you have to learn the new UI. You can't just drop into IIS, configure the website, and be on your way. The first few times, it's going to take a lot longer. Maybe with growing familiarity, we will come to like the new UIs better.
I wish they had given us old fuddy-duddies the option to show the menus.
I had a meeting with one of the Microsoft Office guys last year when I brought up the same points. His point was that the number of features had grown so much that a new method of displaying them was required. I was not entirely convinced and found it amusing that Microsoft are so touchy about the problem that he had a very nice, well-prepared PowerPoint presentation to give to try and explain it.
MS is trying to give users more power, making what others may see as very advanced functions simpler to use and more powerful than the previous ones. I remember going from IIS 3.0 to 4.0, where suddenly there were all these new buttons to click and things were different, but it was kind of better. I also remember going from Windows 3.11 to 95, which had its own shock of updated ways of doing things.
Did you ever try watching a movie on VHS and then on DVD, or go from cassette to CD? Remember how the DVD suddenly had all these new features like chapters, no need to rewind, and bonus features you could jump straight to without fast-forwarding to find them? Similarly, a CD organized things so much better than a cassette. Another example is TVs: there used to be very few options, with two dials, power and volume combined in one control, and a few other knobs. Now you have TVs where you can store favorites and set closed-captioning options, sound settings, and color styles, which could scare some people who remember the old days when you had to physically pull a knob to turn the machine on.
I find that most of Microsoft's new programs are very hard to use.
If you feel that way, do yourself a favor and switch to a Mac. I did, and I won't go back to Windows. So much time is wasted achieving little things on Windows.
And Apple has style guides for GUIs. You don't have to stick to them, but as far as I can tell most developers do.
To prevent a Mac vs. Windows flame war, I would like to point out that this is entirely my opinion. Dear Windows users, please do not feel attacked by it.
