Levels of continuous integration

In a recent interview, I was asked about the level of Continuous Integration practiced in our company. When I started describing what we did, the interviewer interjected and asked me for the level number - it sounded something like a CMM level to me. When I told him that I was not aware of any levels being followed in my current company, he seemed displeased with my answer.
I am not able to find any such information online.
Can someone please shed some light on this?

I am afraid the interviewer was not competent at what he was asking. Such interviewers would ask you
Why are manhole covers round?
They expect you to say something along the lines of safety, but they will completely disregard other valid answers, such as the ones described here.
There are no strictly defined levels as such. A few bloggers have merely shared their personal views on the topic. Does it really matter if you are on Level A, Stage 2, or Phase III of somebody's private classification? No, it does not. Do you need to know such a classification to do CI properly? No, you do not.
The actual number is irrelevant; what is relevant is the description of the CI process.

I agree that it was a bad interview question but some people have been talking about a maturity model for Continuous Delivery (with 5 levels). For example, take a look at
http://www.infoq.com/articles/Continuous-Delivery-Maturity-Model

Presently, maturity levels for Continuous Integration in the industry are largely company-specific. Some companies define a maximum of 6 levels, while others, like mine, follow 5 levels.
The Continuous Integration concept was envisaged at ThoughtWorks, and if you go through this document on their website you'll see that they too follow a model of 5 levels.
P.S. It was indeed a vague interview question.

It sounds like your interviewer was asking about CMMI (Capability Maturity Model Integration).
The levels here have very specific meanings, and are used to indicate the level of process maturity within an organization or team. I've never actually worked anywhere that has cared about CMMI, but if you are applying to a company where this is used, or where their clients use this, it may be important to them.
It sounds to me like the interviewer wanted to know whether you knew about this at all and could explain the different levels. However, as mentioned previously, knowing about Continuous Integration and why it is good for your project (and when it can be bad) is the important piece to pull out, not whether you've memorized the Wikipedia page on CMMI.


How to prove to colleagues that use-cases are important? [closed]

... and how to prove to management that use-cases can be informal and still useful?
Hi folks,
I came into the middle of a project and found out that there are no use-cases, user-stories, requirements, nor anything resembling a specification. Since the deadlines are short, the current dev team doesn't want to spend time on such things. I wanted to join the project, but by digging deeper I found out that the current development adds features just for their "wow effect" and chooses what to add based on how easy the underlying technology makes it. I was surprised that they have managed to go so far (more than 4 months) without requirements, but this is where we are now. I believe that the path they have chosen is the surest way to kill a product that has good marketing value.
Am I right, and what would you do in similar circumstances to convince the dev team/management to write use-cases/requirements before moving forward? Thanks in advance, kh.
P.S. Two copies of Cockburn's book are on the bookshelf...
You should give your colleagues the use-case spiel :D Tell them that use-cases are useful as they're:
A way of capturing business processes in a manner which is reasonably comprehensible by all stakeholders. This helps to bridge the gap between programmers, clients and users.
Traceable units of functionality. Use-cases are formed (ideally) in the analysis phase, referenced in the design phase, and can be used as sources for test cases later on.
Quick and easy to write up and useful, even if informal.
If you need more ammunition, you might want to read Use cases - Yesterday, today and tomorrow by none other than Ivar Jacobson.
If your colleagues still can't see the potential usefulness of use cases as a business analysis tool, then they're probably beyond help :P You should remind them that they're developing software to meet other people's needs and solve their problems in the long term, not to ostentatiously impress them in the short term with petty gimmicks. And so a little bit of direction and specification helps. Even if the use-cases themselves don't prove to be that useful, the simple act of coming up with them will force your colleagues to consider the actual underlying purpose of the software.
Ask questions, of both sides. Of development, ask them if they are certain that all of the ways in which they have considered using the application are all of the ways in which the end-users will want to use it; if they say they have, ask for proof. Of management, ask if they've ever used software that does everything they want, but still ends up being hard to use (they will have). These questions will seed the concept that what will be delivered might not be what is desired, on both sides; use that seed of an idea, then, to open up discussions (not documents, not at the start) on how the software will be used, and in what way any differences can be resolved. They'll get around to use-case documents eventually.
I am a product manager by profession, and my first reaction to your post is that ideas can come from anywhere, and if the dev team has decent ideas they should be incorporated into the product.
Having said that, a product can not develop a soul (a simple message) through a string of disconnected ideas that do not serve the ultimate purpose: solving the needs of a target user. And, ultimately it boils down to making the case that time is better spent on requirements/use cases that make sense for the product, while the opportunity cost of not having a clear strategy/end goal will lead to too many chefs and a jaded product message.
The ultimate way to make this message hit home is to involve other stake holders and have development demonstrate their work. Eventually, there will be disagreement and a more formalized (less cowboy) approach will lead to a more refined and simple product.
One of the problems you mention is the tight schedule and the scope creep induced by the devs themselves. Explain to them that by using use cases you can buy time by dropping features that would otherwise end up on the "never used" pile. With use cases you can find out which features customers need and will pay for, and by removing unimportant features from the scope you gain the time to implement the ones that matter. Apart from defining the scope, use cases also help to identify all the stakeholders, which helps you focus even better while defining the scope and keeps you from forgetting about things which are not so apparent but are a must if the product is to be usable. The third important thing about use cases is that they let you start thinking, before development, about corner cases that might matter to the customer, so you can work out the ideal solution with the customer instead of letting the coder decide on his/her own under deadline pressure.
Just show them.
Example is not the best way of educating people, it is the only one.
Lead by example focusing on extensions and exceptions. In other words emphasize the failure scenarios because everyone knows how the system should work. The real value of written Use Cases is identifying what should happen when something goes wrong.
That noted, consider you may have to live without written use cases. And, for the environment you describe, a major win is any sort of requirements documentation. Screen comps and/or prototyping are often easier to introduce.

Too hard a project? What do you do? [closed]

What do you do when you get assigned a project that is just way too hard to do:
Say it's a mammoth project and your boss thinks you alone can handle it
You have the knowledge to do some things, but other things are a little beyond your expertise at this point in time
Your boss probably thinks it's something that can be done by one person in probably one month
SO users, I would love realistic answers here. This is a real world situation and I am trying to figure out my response to my boss tomorrow on how to approach him delicately.
I just wanted to add an update to my note here. The app in question that my boss is targeting is a "NING-like" web app. My hesitation is mostly about being the only person assigned to such a complicated app with such a short time period.
This is a situation that everyone has to deal with on a regular basis simply due to the nature of work. (Typically, if you know everything you need to know to complete a job, you've already completed the job and don't need to do it again. :) )
Be honest with your boss about your anxiety. Your manager needs to understand your assessment of the project's risk profile. Odds are good that you'll be doing it anyway. That's OK! This is your chance to shine! :)
Break the problem down into tasks you understand and tasks you don't understand, then start tackling issues one at a time. I, personally, like to alternate between easy tasks and hard tasks. Completing easy tasks helps me feel like I'm making real progress on a gut level, which is important for my personal motivation. Completing hard tasks addresses potential problem areas earlier in the schedule. This mitigates the tail-end risk of the project by evaluating unknowns earlier, rather than letting them fester and explode when you've got 2 days left and no more planning/wiggle room. It also helps your stress level because you know you've gotten the ball rolling on the project's scary bits. Remember-- your unknown areas are where you don't understand the problem domain, so that's where the real risk of schedule/budget slips lie. You need to mitigate those risks early and often. Get the ball rolling with colleagues that you can consult to learn how to do these things.
The one month goal is probably a target. I don't believe it's reasonable to expect person A to realistically estimate person B's scheduled completion of a task in the general case. To track your progress against the target, set up milestones, none longer than 16 hours/2 days, and track your completion rate against them. This goes hand-in-hand with your list of easy/hard tasks.
The simple fact is that, sometimes, you'll just get dumped in over your head. In that case, you may have to make the best of an overwhelming situation. My very first task at my first job out of college was to design a reliable, transaction-oriented, peer-to-peer n-way server synchronization system for high-volume, high-rate data. I told my boss up front that I did not have the expertise for this, and at the time I didn't have enough experience to understand that I needed to push back on the requirements. (In retrospect, given the political environment, I don't know if pushing back on the requirements would really have helped anyway). That was simply a case of a poorly managed project that took about 18 months to ultimately collapse under its own weight. I still leveraged the opportunity to learn a lot and take some knowledge about the way my particular organization worked, though, and that can be very valuable no matter what. :)
Good luck! :)
Edit after question update
Ok, if I understand your update correctly, we're definitely in #4 territory here. There's nothing realistic about creating a competitor for Ning in one man-month. I assumed in my prior answer that you were dealing with someone who had a base understanding of software development. Based on that:
Ask your boss to clarify the requirements more. Perhaps (cross your fingers!) you simply misunderstood what you were being asked to do, or the scope of the project. Always assume competence until absolutely proven otherwise for social reasons. Maybe you were only being asked to come up with an overall design and some very simple proof of concept?
If your boss is truly this out of touch with reality, put together a sensible, 15-minute back-of-the-envelope estimate with him/her on a whiteboard or a shared piece of paper. It shouldn't be hard at all to blow all kinds of holes in this one month to completion. Perhaps your boss thinks you'll be able to reuse some internal code that you're not aware of? This will bring any faulty assumptions your manager is making re: project scope to light.
If your boss is absolutely unreasonable (this doesn't happen often, but it occasionally does-- perhaps the company needs a killer app by the end of the month to sell to avoid going under), prep your resume for an intra- or extra-organizational move (depending on how big the place you work is). Unrealistic expectations on that order can be a sign of organizational desperation or malfunction, and your position may simply not exist 3 months from now.
Don't panic. You may have misinterpreted the goal your boss has. It sounds like he was not very clear if he said only "Ning-like."
Research Ning. What are all the things Ning can do? On Ning's Resources link, they list at least 21 major social network features.
Write up a high-level statement of the goal for this project. Include all the features Ning lists. Also include an objective for how many users this app should serve. Don't try to think yet about how to achieve these objectives, how many programmers it'll take, or how long it'll take. Just list them. Keep this write-up to one or two pages.
Present the list to your boss. Ask him, "does this sound like what you had in mind?" Ask a few direct questions to ensure he has looked at your write-up:
"Who are the target users for this application?"
"How many new users per month do you expect to sign up?"
"What level of uptime do we need to support?"
"What's our budget for hosting this service?"
"Do you need this application to support international users?"
"What is the end-user license agreement (EULA) for this application?"
It may become clear at this point that your boss has more modest goals than you assumed. Perhaps he does not intend to duplicate all the capabilities and scale of Ning. So then it becomes a task of getting your boss to articulate more clearly what subset of Ning features or capacity he needs.
Install Drupal, Joomla, or Wordpress, download some plugins, and design a custom site for your boss. That'll probably give him 99% of what he wants, and it's the only way you'll be able to do it in one month.
Don't start by saying "No" or "It can't be done" or "It's too hard" or any of the other things you said in your post. Most managers in a company do not even begin to understand the effort level involved in a programming project and need a little education with their software planning estimates.
I would suggest a conversation which includes the following steps.
Estimates: review the effort level you believe is required for this project to be a success. Make sure that you have thought out tasks in enough detail so that you can answer questions.
Education: if your boss doesn't understand why something will take a certain amount of time, explain as clearly as you can (good analogies tend to help, bad ones can be devastating).
Alternatives: if you believe there is some middle ground, or some set of sub-features that would fulfill the project's needs, discuss these alternatives. Managers hate it when an employee just says something is hard or difficult; they want workable options.
Alignment: are you sure that you and your boss are on the same page about this project? Perhaps you see it as a piece of mission critical software and your boss sees it as a minor enhancement to your existing tools. Be sure that you both have the same expectations; otherwise, you may be planning more complex software than what is being requested.
The most important thing I ever learned in software was how to "push back."
It doesn't always mean saying no. What it means is providing your best estimate of the impact of new work. Whether you're saying "yes" or "no", you say, "we can do that, but it will require (x, y and z resources). I think it will take (n days for me, n*a for person B) to understand problem b, but I know how to handle (c, d and e). I've never had to solve problem b before, so I don't know if my estimate for that is realistic."
The difference between "yes" and "no" is whether the cost equation is acceptable.
Any good manager will respect your analysis, question some of your assumptions, expect a round of rethinking, and then, either accept the risks, find additional resources, or abandon the project.
If they say "I see what you're saying, but you're going to have to accomplish the impossible anyway," start looking for another job.
Would your boss not understand the truth? Just talk to him about the requirements of the project, and mention what can and can't be done.
Here's how I would plan it out:
Don't panic or react immediately - tell your boss that you would like to review the request and will get back to him shortly with questions and concerns
Go through the spec (or, if there is no spec, the email - or write down the request somewhere) and create a work breakdown structure for each delivery. This should be done to a level where each item is understandable (User Login, Message Entry, etc.)
For each item, estimate the amount of work plus a +/- % range based on your knowledge, questions, risks, etc. (see the sketch after this list)
Create a list, as you're going through the spec, of any major/important questions (how many people is this targeted for? does this include the ability for users to IM? etc.)
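As a rough illustration of steps 2-3 above, here is a minimal sketch (Python, with hypothetical item names and made-up numbers purely for illustration) of how per-item estimates with +/- ranges roll up into best/expected/worst-case totals you can bring to the meeting:

    # Illustrative only: roll per-item estimates with a +/- uncertainty
    # fraction up into best/expected/worst-case totals.
    work_items = [
        # (item, estimated days, uncertainty as a +/- fraction)
        ("User login",     3.0, 0.25),
        ("Message entry",  5.0, 0.50),
        ("Friends/groups", 8.0, 0.75),  # unfamiliar area -> wide range
    ]

    expected = sum(days for _, days, _ in work_items)
    best     = sum(days * (1 - u) for _, days, u in work_items)
    worst    = sum(days * (1 + u) for _, days, u in work_items)

    print("Expected: %.1f days (range %.1f - %.1f days)" % (expected, best, worst))

The wide ranges on unfamiliar items are the point: they make the risk visible instead of hiding it inside a single number.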
You now have a rough timeline, risk assessment, and list of questions to review with your boss. He'll see that you put some effort into it, it may open his eyes to the complexity, and it will give him confidence that you are not giving a knee-jerk reaction. He might demand you do it in his timeframe anyway... then look for another job; you have at least a month.
What you say is that your perception of the task scope and complexity differs greatly from the perception your boss has. Great.
Most likely you're both wrong: you have misunderstood the requirements and the boss underestimated the task or fell into the trap of wishful thinking.
It's best to go through the requirements with your boss once again, work out together what deliverables are required, and try to estimate the amount of time and resources needed to deliver them. If there are blind spots in the implementation that you feel you lack the skills or experience for, make this clear and work on the assumption that you'll have to spend cash to source these externally (that will at least give you an idea of the market price).
I am sure that the longer you and your boss spend discussing and researching the project, the more detailed the discussions will become and the better the idea of what is feasible will emerge.
The worst thing you can do is to keep quiet. Any good boss relies on developers to give some assessment of a project: either an affirmative or more questions.
You don't have to say "no" - it's not your job to decide whether to go ahead - but you do have to ask good questions.
It really depends on the relationship you have with your boss. If you can, I would just be open and honest with them. Tell them a few things are beyond your level of expertise and you would have to do some research, lengthening the project time. And stress the point that you don't believe you can get it done in a month and you're requesting a team to help.
It's possible your boss doesn't really understand the full scope of the project. If you can break it down into a list of tasks or sections to show how much work really has to go into it, they might see where you're coming from.
In the end, if your boss still wants you to go through with it, just keep stressing that you will do your best but you cannot make any promises about the deadline.
You need to be realistic with your boss. You will come out much better having over-delivered on the project rather than under-delivering on an aggressive timeline.
You have to be honest and tell the boss that there's a problem. However, you need to show exactly how much of a problem it is, so that you don't sound like an incompetent person waiting for a pink slip.
You need to carefully analyze what is to be done, break it into small parts, and see which of them you can do and which you can't. It's normal for a project to have parts that look possible but are hard to do - every normal boss understands that.
This way you show that the problem is not imaginary and that it's not just your desire to get a nice salary for a trivial job.
The truth, of course, is always the right answer; your boss will find it out eventually, so it's better to fail early.
But with that said, is it something you just don't want to be involved in? Make sure you explain to your boss that you don't want to commit to something you're sure to fail at, but let him know that it could be a learning experience, and offer to be involved at some level, even if it's just to look at the solution after it's done.
Create a realistic schedule and present it to your boss. Ask your boss for his input regarding the schedule. Maintain a positive attitude and let him know that you are both working toward the same end goal. Tell him that this is your best professional estimation of the amount of effort needed to meet all the requirements. Point out where the complexities are if challenged. Be firm and clear and above all else give him an opportunity to speak his concerns. Demonstrate good listening skills and address each of the issues he presents in a language that he feels comfortable with. I wish you all possible success in your project.
If estimates are not there yet, then your first task is to do a realistic project estimation. The second task is to check what technologies are required for the project and whether the knowledge is already available. If not, then include an estimate for training and gaining that knowledge. I understand the boss is the boss, but you do your part and the rest is up to him. If the boss appreciates others' opinions, he will understand; if he is the "I am always right" type, then do whatever you can (work as best you can and also look for a new job).
It's a bit to the side of all the good advice I've seen here, but I'll say it anyway: most managers are actually quite smart. The senior managers I've met have all been very smart. The problem is that, as Eric Raymond puts it, they are "differently optimised". So they may need some education. If you assume that they will be reasonable once they know all the facts, you will almost always be correct.
Of course you do occasionally get people who behave unreasonably, or think that saying "make it so" like Captain Picard is Leadership. But they are rare, and do not last long.
Go fight club on him? Get free monies and airfare!

Information/knowledge flow within the team [closed]

I want to avoid the situations where my developers do not share common knowledge (solutions to problems they encountered, cool tips, common mistakes, shortcuts for achieving a particular goal, configuration issues, partial requirements, etc.) with each other. I'm talking about situations where such lack of communication is accidental (a result of misunderstanding or improper management) - not about situations where developers deliberately keep knowledge to themselves.
I believe that the following techniques are extremely useful to improve the information flow within the developers team:
XP pair programming - due to the knowledge exchange within the pair (and the regular mixing of pairs).
stand-up meetings - due to the opportunity to tell the others what you're working on and what problems you've encountered.
trainings/presentations/coaching prepared by the lead developers for the rest of the team/department.
"web 2.0 tools" - techie blogs for the company/department, a dedicated twitter account for the team leader, wikis and the like.
Any further ideas? What techniques do you use (or did you) in your company? How would you encourage developers to share the knowledge between themselves?
Trust.
You are allowed to 'seem stupid', but please ask if you don't know, or don't fully understand what I'm saying. And please tell me if I'm wrong (I didn't realize it because I'm equally stupid.)
I worked at one company where every Friday we had lunch meetings for developers. Management would provide food while developers had to share their knowledge; present some tool or technique one learned recently, or give a demo of a project you are working on, etc.
It wasn't restricted to the technologies being used by the team at the time; developers were encouraged to learn new technologies and give a demo to the team.
And at my current job we have monthly IT group meeting, where sometimes developers from different teams demo off the projects they've been working on.
An internal twitter-esque utility. Maybe a wiki if you can get it to work; I personally find it a little too much. But twitter is different: "just added an extension method to escape a like clause in a rowfilter" and stuff like that.
Some people may find it a little overbearing, but a common location for utilities means you know where to look, and string.CountOccurrences isn't scattered throughout the codebase.
I'll add a few more
Hire the right people - This is essential if you want to create a great dynamic (asocial people require a lot more effort)
Pre-mortems and post-mortems. We use the wiki for this: create a page for each of your projects, split it into sections for recurring themes (both good and bad). At the end of each milestone, have the team meet for a post-mortem. At the end of the project (or after a fixed length of time), have the project coordinator compile this into something easy to read for posterity (and put it on your wiki).
Daily stand-ups are a must! You already said it, but I find it so helpful!
If you have multiple teams in the company, organize conferences about their greatest achievements. If possible do it on a regular basis, even across departments; you would be surprised how interested artists can be in programmers' work.
Lunch is a good time to share. In our company we have the president's breakfast, project leads' lunch, and end-of-project supper. I love them all; mix and match for greater results.
Offsite meetings with the whole company are great; we do it at least once a year (in the morning we present what's coming up in the future; the afternoon is for activities to learn about the projects).
Wikis are great, but beware of information that becomes false over time (this is a recurring problem with any written information).
A few more things to my mind:
Patterns & Practices meetings - These don't have to be every week, but there should be some time devoted to letting the team discuss outstanding questions and reach consensus on things that may save a lot of people headaches.
Culture factor - Does the workplace provide enough socializing to help the team gel, or could some team-building exercises, e.g. an obstacle course or cooking together, be useful in getting some dynamics established? Is there humility among developers, so that big egos don't become a problem? Another factor here is to think about how you'd answer this: would you go to a local pub and have a drink with your fellow teammates? If yes, then you have some good signs here; if not, then there may be some investigating to do.
Retrospective follow-up - How are ideas presented during retrospectives considered and implemented? How are meetings handled in general?
Demos within the team - If some story got finished and involved some big code points, then perhaps there should be a little demonstration for the team, so that others can see what was done and the knowledge gets spread around. This dovetails with my first point as something that helps to further communication.
I'm a big proponent of working in pairs. It is a good way to transfer knowledge and keep communication lines open. Try mixing up the pairs for each project as well.
I've tried many approaches, and am a big fan of working in pairs on projects, as well as doing regular discussions or meetings with the team.
However, I've also found that the single best thing I can do is foster a culture of constant communication between the developers. I try to have all of my developers communicate with each other as they work - not even necessarily waiting until a weekly or monthly meeting.
For me, this is a little trickier as most of my developers are not in the same location, so we have a single XMPP chat room setup, and all of us are always logged in when we're working on the project. Some of the developers (including myself) will login during our off hours, as well.
I do the same with the people in my office - we tend to be a fairly quiet bunch, but I'm very open to having people interrupt each other with questions, or grab a chair and sit down to brainstorm at any time.
Part of why this works, though, is I try not to restrict the communication to the work at hand, or any specific project. My feeling is that people are going to talk about other, non-work related things, whether or not I foster that. I'd rather have the "water cooler" talk in an official channel, though, than outside.
This makes everybody feel more at ease to ask the questions that "seem obvious". Also, people ask questions continually, since they're right there, and used to talking to everybody. It's easy to ignore if needed, but also much easier to just throw out a general question and see if anybody has ideas without feeling like a pain, etc.
My experience is that the time lost due to interruption is much smaller than the time saved due to having a group that is always eager to help solve a problem at hand.
If you have a small enough team, using SVN commit comments adequately and exploiting them with a tool that generates an RSS feed (like Trac, for instance) can be an easy and efficient way to promote communication (see the sketch below).
There are several requirements for this to work, which are quite easy to attain:
- commit frequently (that is good in itself, as it allows everybody to benefit from each programmer's local changes, and to identify problems early);
- use verbose comments (which is good too, as it makes it easier to trace what was changed in case anything breaks);
- ensure everybody actually reads the feed (better yet, stays subscribed to it through an RSS reader).
Of course, there is no way to "reply" to such comments, but if someone really needs to reply, it's probably between that person and the committer, so mail is usually enough.
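If Trac isn't an option, a home-grown feed is not much work either. Below is a minimal sketch of my own (not a Trac feature) that turns recent SVN commit messages into a crude RSS feed; it assumes the svn command-line client is installed and REPO_URL is a placeholder for your repository:

    import subprocess
    import xml.etree.ElementTree as ET
    from xml.sax.saxutils import escape

    REPO_URL = "http://svn.example.com/repo/trunk"  # hypothetical placeholder

    def recent_commits(limit=20):
        # 'svn log --xml' returns structured log entries we can parse.
        xml_log = subprocess.run(
            ["svn", "log", "--xml", "-l", str(limit), REPO_URL],
            capture_output=True, text=True, check=True).stdout
        for entry in ET.fromstring(xml_log).findall("logentry"):
            yield (entry.get("revision"),
                   entry.findtext("author", default="?"),
                   entry.findtext("msg", default="").strip())

    def to_rss(commits):
        items = "\n".join(
            "  <item><title>r{0} by {1}</title><description>{2}</description></item>"
            .format(rev, escape(author), escape(msg))
            for rev, author, msg in commits)
        return '<rss version="2.0"><channel>\n' + items + '\n</channel></rss>'

    print(to_rss(recent_commits()))

Something like this can be regenerated by a cron job and served from the intranet; the point is just to make the commit stream easy to skim.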
Another useful tool is to ask each developer, say once a week, to write up a list of 10 or so bullet-point recommendations for fellow coders on a topic he/she is really familiar with.
Time.
Official
Getting out of your dusty office to clear your mind, really taking the time to go to a lecture or training, it all helps to spread knowledge.
It's also easy to budget: N developers go to a meeting for T hours.
Unofficial
"On the job" training... The things you need for your specific job can only be taught by someone who knows the job.
In the current climate, under the current pressure (must ship now), no one takes the time to fully explain something. Only when people are relaxed are they ready to share information, and people are relaxed when they have enough time.
Apart from that, you need to bump into some specific linker error before you really start thinking about it. Without the time to think, ask, and read, you won't be able to get the knowledge. You can't postpone it to an official linker training.
This is way harder to budget: developer Mary asked developer Sophie about dynamic linkage for an hour and a half; the day after, she went back with some questions. Experienced developers will spend more time distributing knowledge, while younger ones will need more time learning.
no walls - Have all of your developers in one large, non-walled room - where everyone can see and talk with each other.
common goals - ensuring your team has a good understanding of goals INCLUDING the goal of self-improvement
rewarding - rewarding even something as small as communication reinforces what you are looking to accomplish
Socialization and common goals always encourage the exchange of information.

How to react when the client's response is negative on delivery? [closed]

I am a junior programmer. Since my supervisor told me to sit in with the client, I joined the meeting. I saw the client's unsatisfied face despite the successful (from my programmer's perspective) delivery of the project!
Client: You could have included this!
Us: Was not in the specification!
Client: Common Sense!
As a programmer, how do you respond in this situation?
What you should do to avoid this situation:
Explicitly spec out what will be included and what will not be included.
The problem probably comes down to the unspecified parts of the spec:
The client thinks that unspecified stuff should be in, i.e. it was implied.
The developer thinks that unspecified stuff should not be in.
For future specs, you should have a catch-all statement that explicitly states that anything not specified in the document can be done after the original specification is delivered, at an additional cost.
What you should do in the current situation:
Other than learning from your experiences, you should come to some compromise with the client.
Example: I will do this feature that you feel is common sense, but for all future additions/changes it will have to be spec'ed out explicitly.
I.e. you will have to do a little more work, but it is worth it in return for the catch all explicitly spec'ed agreement your client will enter into.
Bad spec?
Was it necessarily a bad spec? No.
It is impossible to mention everything your clients may expect, so it is critical to have the catch-all statement mentioned above stated clearly and explicitly in your spec/contract.
Other ways to reduce the problem:
Involve the client early, show them early prototypes. Even if they don't demand it.
Try not to sell the client an end product, but more of a service for working on his product.
Consider an agile development model or something similar so that tasks are well defined, small, paid for, and indisputable.
This would be one of many reasons why I switched to an Agile development philosophy. The only way, in my opinion, to successfully avoid this scenario is to either be omniscient or involve the customer heavily and release early/release often to get feedback as soon as possible. That way you can develop the software the customer really wants, not the software the customer tells you they want.
Client: You could have included this!
Us: Was not in the specification!
Client: Common Sense!!
Us: We do not attempt to go beyond what the client has specified - we follow the specification. It's as important to NOT implement features not specified as it is to implement features specified. We will never second guess our customers, who value the fact that they can completely depend on us to correctly and completely implement the specification on time and under budget.
As others very rightly point out, the situation is almost always more complex than the simple exchange I've described above.
However, the above is valid if the implementer has a specification with the customer's signature on it which essentially implements an agreement that says "once the software provably implements all the features in the spec then it is considered complete", and anything additional is outside the specification and therefore outside the contract.
The contract itself may have some input here as well - if you don't have a signed contract than it doesn't matter what's in the spec - everything so far has been done on a handshake, and the entire deal (including payment) can go down the toilet based on any dissatisfaction on either side.
But if you have a contract and a specification, and the customer has seen and signed both, then they have no wriggle room to ask you to go further.
Now, as to the question of whether you should implement it:
AWESOME! You delivered a product and they only had one complaint. Implement the feature, call it a 'freebie' (make sure they understand you're working outside the spec and contract, and explicitly send them a bill for the work with the discount shown in dollars) and have them sign off on the project as a whole.
It will explicitly demonstrate that the project is ended, that you went above and beyond the call of duty, and that any further 'surprises' are outside the contract/spec, which gives you a nice layer of protection beyond what you already (ostensibly) have.
If it's a UI issue, then you're in murkier water.
Does the spec adequately describe the UI? Does it have mockups? I wouldn't fault a customer for this complaint about the UI if the spec did not very closely describe the layout, usage, and include mockups.
Either way, I think you can understand the customer's position - if they haven't played with UI mockups, then they're going to be disappointed with the result regardless - there's no way, psychologically speaking, that you and your customer could have possibly had the same idea in mind (nevermind the fact that common sense isn't!).
Quite frankly if this is the first time the customer has thought about checking out the UI before the work is finished, then it's at least partially your fault for not explaining good UI design processes to them. This is a key feature for their app, and it's very tightly coupled to what they've imagined - no one can be satisfied in such a situation unless they've 'grown' their internal representation over time to match what the reality is.
This disconnect is solved only through frequent user and customer testing, which is obviously missing. This is a problem regarding client education and communication, not whether the specification was met or not.
-Adam
Expect last minute changes of scope - they always happen, so be ready.
Review progress frequently with client - to minimize surprises.
Contract: Functional Spec, plus Time & Materials with an initial cap (so the client feels in control).
Then when changes come along, re-negotiate the cap if necessary.
Never say they can't have what they want. They can get that answer for free!
Always give them a little more than they asked for, so they know you've got a positive attitude.
Relate to the client as being on the same team with them. Don't accept being legalistically painted as an adversary.
They may think of contractors as not loyal, compared to employees. Show them you're as dedicated to their success as their employees are, and you'll go the extra mile.
Classic case...
There's no definite answer to this one, but it all comes down to communication. There should have been preventive measures in place (like weekly reviews or something similar).
For sure, you can't redo the whole thing for free.
Two ways: either you tell them to ** off, or you deal with it.
If you choose to deal:
First, empathize, respect the client.
Have a look at what can easily be changed.
Have a look at the contracts.
Maybe create a new agreement.
Don't do too much.
Make them see the progress and the work it takes.
Find workarounds for the missing features (maybe using other great features, or available tools.)
Use your common sense; it is so common, it's not even funny.
This is one of the many drawbacks of a fixed bid arrangement. Any time business needs or priorities change, or there is even a simple misunderstanding, it results in anything from an awkward situation like this to calling lawyers in. If you have an arrangement where you get paid for development time, you can always react to any change and get paid for whatever time it takes to make that change. Also, having a by-the-hour arrangement does not preclude having a plan or making an estimate.
Once you are in a fixed bid pickle, though, your options are:
1) Do it at an additional cost.
2) Do it free.
3) Don't do it.
Option 3 is the worst, and Option 1 is the best. If you have a good, trusting relationship and decent communication with the client, it's usually easy to arrive at Option 1. If the relationship is bad, then you've got bigger problems. At that point, just try to avoid lawyers.
A final point - any project that has something known as "The Delivery Date" inevitably runs into the problem described. Projects with said date usually involve retreating to a cave for several months to develop in hiding followed by an unleashing of the product all at once in front of the stakeholders. This is abrupt and leaves plenty of time for client expectations and the actual product to drift apart. If, instead, you show intermediate versions of the product and gather feedback every few weeks, two things happen. First, you get better feedback, minimize misunderstandings, and make a better product. Second, there is no single point in time on which a massive amount of expectation is laid. The potential difference between what the client is imagining and what actually exists is much smaller. No surprises.
Good luck.
"how do you react?"
Question 1 - do you want to continue this relationship with this customer? Seriously. If they are going to claim that unspecified features are "common sense," this may not be a good relationship to maintain or enhance.
If you want to disengage, then that's easy. Ask for them to highlight each part of the specification that you failed to comply with and play that game. Get specific test criteria for each missing feature. Pull Teeth. Be confrontational in determining what's missing. Don't ask why. Just ask for all the details up front. It's slow and unpleasant. But you don't want them anyway.
If you want to engage, well, you're going to have to change the relationship. Currently, you have a Passive Aggressive Customer. They won't say what they want, but they will say what they don't want.
This may be a habit with them; this may be how they win concessions. Or this just may be sloppy specification on their part.
If you want the relationship, your reaction has two parts.
Short-term. Get something they're happy with. They have to identify specific changes. You have to score each change with a "cost to do" and "fit with specification".
Some things are cheap and a good fit. Do those.
Some things are cheap to do, but a bad fit with the specification. Think twice about enabling a bad specification to lead to rework. In a sense, you purchased the specification from them; you may need to raise your standards, also.
The expensive things which (sadly) fit the specification are a problem. You're in trouble with these, and pretty much have to do them.
The expensive things which don't fit well with the specification are lessons learned for everyone. Detail a plan for these, including specification rewrites and approvals.
Long-term. Make sure that you're not PA'd again. Review early and often, use Agile techniques. Communicate more, prototype more, release more.
Well, it was not successfully delivered. Somewhere along the line there was a miscommunication. Without knowing the specifics, I would suggest this is not a developer-injected problem, and it is probably not to be blamed on the customer - the requirements-gathering task was insufficient. This is a classic example of what happens when the software side does not have domain experts or the requirements discovery process doesn't do all that it could...
If it were me, I would correct the problem and figure out how to avoid similar issues in the future.
How you handle this can very well determine the future of this contract/business with the client. Taking responsibility and correcting the issue is a huge opportunity for your company.
EDIT:
This is a good time to evaluate how this happened to help correct it. Some companies choose to totally revamp everything they do which is a mistake I think. So is ignoring it. Blaming people for the problem is also a mistake.
It is a good time to walk through how this happened, what the process is, and maybe how it could have been caught. I would not make huge rule changes or process changes - but coming up with guidelines for future work is a great thing. Your company had a clear lesson about a shortcoming. Losing the opportunity to correct this problem and to correct your process would be a waste of a good chance.
ZiG, I've had to deal with this problem on several occasions at my current place of Employment. My group (3 developers) tries to approach things in an Agile manner. We're used to getting mid-stream and even last-second requests (which we then treat on a case-by-case basis).
However, we make it clear that resources (particularly time) are limited and if it's not in the spec we can't make promises. If it's judged important and it can't fit into the current release, we generally plan a followup release. If it isn't important, it goes on a list.
One thing I've found is that you can get users to agree to Spec S at Time T. However at Time T + N, getting them to remember they agreed to Spec S, or getting them to acknowledge that they did so (with the documentation you've been keeping, I hope!) can be trickier than it should be.
Speaking to the OP's subject and question:
If you are an employed programmer, then I would hope that other resources are in the meeting with you. Possibly "higher ups" in the organization.
If this is the case, then your job is to answer DIRECT questions, and to keep your emotions in check. Yes, you may feel injured because they don't love your code, but showing any emotion with bosses present is not a good thing. Rather, try and look neutral and let the others handle the session.
Now, if they "hang you out to dry", then I would recommend the following questions:
a) "OK. I see. Why exactly to you feel this is common sense to include this feature? I'd like to discover why we didn't include it." (force them to explain their thought process. Common sense to one person is rarely common sense to anyone else.)
b) "Well, I'm sure we could include that in the next release. I'll leave it up to XXX (the bosses) to come to a mutually agreeable approach" (i.e. don't talk cost or freebies with bosses present. EVER.)
Again, this assumes you are a programmer WORKING for a company that delivered the product. Now, if are more than that - i.e. you ARE one of the higher ups, then many of the suggestions here are excellent.
However, if you are the higher-up or are a consultant programmer, then first and foremost:
a) Apologize for the process that did not catch this requirement. Promise to work with the client to prevent this from recurring.
Then on to the other strategies. It really doesn't matter if you charge for the fix or not - the apology is the most important action to the client. Again, it bears repeating - you are not apologizing for the missed feature. You are apologizing for the faulty design process that let it slip. Clients are usually pretty accommodating when you start this way and then seek a solution.
Cheers,
-Richard
Use Scrum-like approaches to avoid this deathtrap: involve the client in the dev process early, frequently, and in informal, restricted committees -> risk reduction and improved agility.
In terms of your literal question, how to react, the best way is to ignore your ego ("what?! After I worked so hard on this and met the spec?!") and instead focus on some active listening and working to consensus.
Client: You could have included this!
Us: Was not in the specification!
Client: Common Sense!!
Us: I understand that you're not happy that we didn't go beyond the bounds of the specification. Seeing how you feel about this, how can we make you happy? Let's see if there's a process we can create together that will help everyone.
Essentially, you don't want to turn this into a "you said/I said" death match. The only way to resolve those involves lawyers, and then nobody wins. If you can agree that the spec or the process was at fault, work together to fix those.
This approach actually just worked for me: wait for the guy who doesn't like your software to leave and be replaced by the guy who does like it.
Obviously you can't really rely on this, but if you're sure that you did a good job and that your software really will satisfy the business needs of the people who hired you, it does pay to wait it out. Sometimes the client's initial reaction will not be their final one, especially if you can quickly incorporate their concerns.
Don't try to make the client feel like it is their fault. It might be their fault, but making them feel that way will not produce constructive results, and could just annoy them.
Instead, you should realize that clients only complain about software they use, in most cases because they like it. Nobody complains about software nobody uses. It is inevitable that a client will complain about the software you deliver, even if you deliver exactly what they ask for. So don't sweat it. Software is never done.
Total failure on the part of the person in charge of requirements collection, no doubt about it. Additional failure on the part of project management for not iterating on the deliverable and not having check-in meetings with the client.
However, you have a signed-off spec, and what you've delivered matches the spec. So, your company has two choices: write off the cost in the name of business development and make the change for free, or charge them for the change request.
If it ain't in the spec, it ain't in the spec. As a developer with no specific domain knowledge, 'common sense' is an irrelevant concept. Different industries work in different ways and one approach might be quite appropriate for a particular domain but completely unacceptable in the other.
Writing good specs is an art-form. IMO, you can either take an agile 'analyst/programmer' approach where you make small iterations or write and maintain a detailed, unambiguous specification. Both are highly skilled tasks, and are still iterative. You still have to evolve the specification.
Either way is not as easy as it sounds and both require the ability to establish a good working relationship with the client.
You cannot know what your customer is thinking. This situation occurs often with clients who don't have any experience with programming projects. What I suggest is to simply show him that "common sense" isn't a very accurate answer in engineering (or programming, if you prefer).
Show him other examples from life that demonstrate you cannot build something that isn't written down. Example: when building a new house, the builder needs a plan with every detail... he won't put in optional electrical outlets just because it's more "common sense" to have some extra in the living room...
I had this once. And luckily it wasn't me that created the design because that proved to be the problem.
It is of vital importance that the communication between your company and the client is as good as possible. Be sure you understand each other. Ask questions and let them ask questions. Do not leave anything open in the design - that will be the problem point at delivery. And have regular meetings during the project (preferably with a prerelease).
Unfortunately a lot of developers are bad at communication, and a lot of clients are not aware of their own needs. But if you can minimize the gap, you have found yourself a happy (and returning) customer.
This is why I/the teams I worked with always used a prototype-style approach, that means:
after collecting the requirements, you show the client an early and basic release of the software
the client says "you could have included this"/"it's common sense"
you change your design to reflect the client's desiderata
iterate from point 1 till the official release
You have to start early on: tell the customer, early and often, that the spec/use-cases/user-stories are a contract which defines what will be delivered. In an agile environment there are plenty of chances for the customer to notice some "common sense" feature they want and ask for it, which is one of the advantages of an agile approach; but if you start accepting "common sense" additions at the end, you are setting yourself up for endless extensions, probably at your expense.
Some customers expect this; the more and better you tell them they can't, the easier the eventual arguments will be.
As a junior guy, I realize you can't do this -- yet -- but one of the hard-but-necessary lessons is that sometimes you have to fire a customer.
You learn - everything is learning and nothing is personal.
We are experts in our area; we know better than the customer what he needs. And next time, for the next customer, we will suggest all the useful features in advance, make him happy, and make him pay more money - because we are the experts and we know better.

How do you test the usability of your user interfaces [closed]

How do you test the usability of the user interfaces of your applications - be they web or desktop? Do you just throw it all together and then tweak it based on user experience once the application is live? Or do you pass it to a specific usability team for testing prior to release?
We are a small software house, but I am interested in the best practices of how to measure usability.
Any help appreciated.
I like Paul Buchheit's answer on this from Startup School. The short version of what he said: listen to your users. Listening does not mean obeying your users. Take in the data, filter out all the bad advice, and iteratively clean up the site. Lather, rinse, repeat.
If you are a small shop you probably don't have a team of QA or Usability people or whatever to go through the site. Your users are going to be the ones that actually use the site though. Their feedback can be invaluable.
If something is too hard for one of your users to use or too complex to understand why they should use it, then it might be the same way for 1000 other users. Find a simpler way of accomplishing the same thing.
Once you have gathered all of this feedback and have a list of things to do, do the simplest ones first. That way you have forward moving usability progress.
What I like to do is give someone an install package, ask them to perform a number of tasks related to how the application works, and watch.
Hardest part is to keep your mouth shut.
Some of the best advice on usability testing is available on Jakob Nielsen's Website http://www.useit.com. He advocates what Will mentioned - ask users to perform various tasks on your website or web application and then sit back to see what they do.
Do not interrupt the users by asking questions or guiding them. Just observe them and document their flow. You can also get hardware and software to do eye-tracking and understand what captures the attention of the users.
However, usability should not start from the testing phase. You must have some general idea of what users generally like and do not like when you do development. There are many websites and books outlining generally accepted usability standards and principles.
Normally, we test the usability of new interfaces by asking a small selection of users to try out a beta version.
We give a small amount of instruction as to what the new features/screens are supposed to do and let them dive straight into it. It's very interesting to see where they are looking and clicking. We never demo the new features - we only talk about what it does.
If the UI changes are minimal then they go live and we gather feedback from real users. It's only when we are making big changes that we go through usability tests on beta.
When developing new screens it usually helps a hell of a lot to sit a colleague in front of the UI and ask them what it does. Which areas do they click on? Where do they look first? What sections draw their attention? etc.
I agree with Adam; using a very computer illiterate person is very helpful. However, what I've run into before with that is the program I want them to try out just isn't "up their alley" as far as something they would ever want to do.
A good way to start is with a paper prototype. Have specific tasks that you want your "user" to perform and have them do it. For more on paper prototyping, start here.
I frequently take any new interface I'm working on to one of our technical support people. They've heard every complaint about interfaces that you could ever imagine, so if anyone is going to think up potential problems, they will.
Also, and I'm not kidding about this, I often take the least computer-literate person I know (your mother is often a good choice... but they have to have used a computer before, otherwise it's going to be pointless) and let them loose on the interface with no instruction. If they can't figure out where things are intuitively, then your GUI likely needs work. Remember, don't make them think! (Yes, I know this is for web design, but it applies.)
There are many ways to test the usability of a system. Please check any available literature you can find. I just want to stress that usability testing is not as hard as you or anyone might think. In a famous paper called "A mathematical model of the finding of usability problems" (INTERACT '93 and CHI '93), J. Nielsen and T. K. Landauer showed that only five users are enough to find most problems in a small system.
If you have no way to read this paper, try this article in the author's website:
http://www.useit.com/alertbox/20000319.html
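For the curious, the model behind that claim is simple enough to play with. Here is a quick sketch of it, using the commonly cited average of L ≈ 31% for the share of problems a single test user uncovers (measure your own L if you can):

    # Nielsen/Landauer model: problems found by n users = N * (1 - (1 - L)**n)
    def problems_found(n_users, total_problems=100, L=0.31):
        """Expected number of usability problems found by n_users testers."""
        return total_problems * (1 - (1 - L) ** n_users)

    for n in range(1, 9):
        print("%d users: ~%.0f%% of problems found" % (n, problems_found(n)))

With these default numbers, five users already uncover roughly 85% of the problems, which is why the paper argues for several small tests rather than one big one.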
It's been a while since this question was last active, but here goes anyway.
From experience:
Always use objectively measurable criteria to decide whether usability is better or not (time to accomplish carefully selected tasks, inactive time, KLM-type metrics); here a key-mouse logger can be a precious ally (see the sketch after this list)
Never go too far ahead before consulting and measuring again with your client (do not lock yourself away with the paper prototype and emerge only with the finished product... that just never works)
read, read, read, try, evolve
Keep things simple and always remember the task at hand (why the user needs the interface)
test, test and test again...
Always get to the bottom of the user's requests. Although the check box the user requests in this particular place may be the best thing to add, it almost always hides a more fundamental flaw
the system user (the one using it... as opposed to the one paying for it) is your best ally; keep him/her on your side
Never be afraid of refactoring your design and evolving your system. Evolve your metrics and measurements too; however, be careful in doing so not to break measurement continuity, as it is the best token of objective progress in a VERY subjective world.
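Here is the kind of "objectively measurable" comparison the first point above is talking about: a small sketch comparing task-completion times for two UI versions. The numbers are made-up placeholders standing in for whatever your key-mouse logger actually records:

    from statistics import mean, median

    # Seconds to complete the same task, one value per test session (placeholders).
    old_ui_times = [48.2, 55.1, 61.0, 44.7, 90.3, 52.9]
    new_ui_times = [31.5, 40.2, 38.8, 29.9, 47.1, 35.4]

    def summarize(label, times):
        print("%s: mean %.1fs, median %.1fs, worst %.1fs over %d sessions"
              % (label, mean(times), median(times), max(times), len(times)))

    summarize("old UI", old_ui_times)
    summarize("new UI", new_ui_times)

Even a crude summary like this keeps the "is it better?" discussion anchored to numbers instead of opinions.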
recommended reading (other than previously proposed):
Handbook of Usability Testing by Jeff Rubin. A bit extreme, but we toyed around with an agile version of his approach and found that if we spent 30 minutes a week with users we would get a LOT of useful feedback while not getting swamped with too much info.
keep a close watch on the Shneiderman and Nielsen of this world, and others that may arise
As usability inspection goes, there are several viable methods. They require different amounts of resources in terms of people, analysis, and equipment.
The most common, and easiest to perform is called
Heuristic Evaluation
You basically walk through each screen to check if it conforms to the heuristics set by you, or your customer.
Check this article by Nielsen
Cognitive walkthrough
This method requires you to ask the user to complete steps in the application. You prepare the steps for the user to complete. Issues that arise during this walkthrough are taken into consideration when finishing the application.
Check this paper for details.
Think Aloud Analysis
I have used this method mostly in the early stages of prototyping. I let the user talk freely about the system while it is being used, and ask questions about use, design, etc. You can get a really nice view of the general feelings about the system, and of what features are lacking.
Check this paper for details.
Interaction analysis
This is a trickier one. I have only used the data-gathering techniques it proposes. This technique takes into account context, activities, body language, etc. Interaction analysis is commonly focused on research, not so much on commercial evaluations.
This link takes you to the article.
Keep in mind that these methods take practice to perfect. I would start with HE, continue to CW and THA. And only use Interaction Analysis if you have lots of resources and time.
There are a number of methods to test or evaluate the usability of an application, broken down into qualitative and quantitative methods and based on when you are planning to test.
Further, methods are categorized based on whether users are involved or experts do the testing.
To name a few methods,
Expert Reviews - user interface or usability experts rate the usability of an interface based on decided heuristics and principles
Formative usability testing - task flows are taken and users are provided with tasks to be completed. Qualitative feedback is collected based on what the users feel the pain points are during the testing. This form of testing is done during design to provide feedback into the design of the application.
Summative usability testing - task flows are taken and users are provided with tasks to be completed. The application's performance on efficiency, effectiveness, and satisfaction is measured based on users' completion of tasks.
The important difference is whether you engage a user or an expert to tell you about the usability, and also when you do the evaluation - at the end of the project or during the design phases.
I'm a strong believer in what I call 3-martini usability testing. When designing a system, imagine that the person who will be using it has just had 3 martinis.
Before handing over the system to colleagues (other programmers, quality assurance, tech support) or usability testers, an informal test with a couple of friends and a bottle of vodka (outside of work, of course) can often prove instructive.
