What strategies have you employed to improve web application performance? [closed]

Any personal experience in overcoming web application performance hurdles?
Any recommended strategies for improving the performance of a data-driven web application?
My development team works on a web application (JSP reports, HTML, JavaScript) that uses an Oracle database (PL/SQL). The key functionality the application delivers is in reporting, where a user can get PDFs of reports at a high level and drill down to lower levels of supporting details.
As the number of supporting detail records has grown into the millions, the performance of the system has significantly degraded. Based on our current analysis of the metrics, the bottleneck seems to be in the logic hitting the DB and the DB performance. Changing the DB model and re-doing some of the server side logic is currently being explored.
Partitioning, indexing, explain plans, and gathering statistics are things that have been done on the DB side to try to improve performance. While they've helped, they haven't solved the issue satisfactorily. The toughest part of analyzing performance data is that the database and web servers are remotely administered by a different part of the IT organization, so the developers don't have regular, full access to see what's going on (especially in the production environment, which is not mirrored exactly in any other development/testing environment).

While my answer may not contain any concrete steps to help, this is always where I start.
The first thing I would do is throw away all of your assumptions about what the trouble is and put metrics in place everywhere you can. Let the metrics guide you rather than your intuition. I've chased many, many, many white rabbits going on a hunch... they let me down more times than they've been right.

Have you checked this out?
Best practices for making web pages fast from Yahoo!'s Exceptional Performance team
If you really are having trouble at the backend, this won't help. But we used their advice to great effect to make our site faster, and there is still more to do.
Also use the YSlow add-on for Firebug. You may be surprised when you see where the actual time is being taken up.

Have you considered building your data ahead of time? In other words, are there groups of data that are requested again and again? If so, have them ready before the user asks. I'm not exactly talking about caching, but I think that is part of the equation.
It might be worth it to take a step back from the code and examine the usage patterns of the system. For example, if you are showing people monthly inventory or sales information, do they only look at it at the end of the month? If so, just build the data on the last day and store it. If they look at it daily, try building each previous day's results, storing them, and avoiding the recalculation. Ultimately I am pushing you toward a dynamic programming solution: if you already know an answer, don't solve for it again.
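For illustration only, here is a minimal Python sketch of that idea, assuming a relational store; the table, column, and function names are made up, not from the original question. A once-a-day job rolls the detail rows up into a small summary table, and the report reads the summary instead of re-scanning millions of detail records.

    # Minimal sketch of "build the answer before it's asked for".
    # Table and column names are illustrative placeholders.
    import datetime
    import sqlite3

    def precompute_daily_summary(conn: sqlite3.Connection, day: datetime.date) -> None:
        """Run once per day (e.g. from cron) after that day's data has loaded."""
        conn.execute("DELETE FROM daily_summary WHERE day = ?", (day.isoformat(),))
        conn.execute(
            """
            INSERT INTO daily_summary (day, region, total_amount, record_count)
            SELECT date(created_at), region, SUM(amount), COUNT(*)
            FROM detail_records
            WHERE date(created_at) = ?
            GROUP BY date(created_at), region
            """,
            (day.isoformat(),),
        )
        conn.commit()

    def report_for_day(conn: sqlite3.Connection, day: datetime.date):
        """The report now reads the small summary table, not millions of detail rows."""
        cur = conn.execute(
            "SELECT region, total_amount, record_count FROM daily_summary WHERE day = ?",
            (day.isoformat(),),
        )
        return cur.fetchall()

The same shape works with any scheduler and any database; the point is simply that the expensive aggregation happens once, off the request path.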

As Webjedi says, metrics are your friend.
Also look at your stack and see where there are opportunities for caching - then employ it mercilessly wherever possible!
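As one tiny, hedged illustration of what caching at a single layer of the stack can look like (the functions here are stand-ins, not anything from the question), Python's functools.lru_cache memoizes a slow lookup so repeated requests with the same parameters skip the expensive call entirely:

    import time
    from functools import lru_cache

    def run_expensive_query(customer_id: int, month: str) -> tuple:
        # Stand-in for a slow database call.
        time.sleep(2)
        return (customer_id, month, 42.0)

    @lru_cache(maxsize=1024)
    def report_summary(customer_id: int, month: str) -> tuple:
        # Repeated calls with the same arguments return the cached result
        # instead of re-running the slow query.
        return run_expensive_query(customer_id, month)

The hard part, as ever, is invalidation: a real cache needs a story for what happens when the underlying data changes.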

As I said in another question:
Use a profiler. Yes, they cost money, and using them can occasionally be a bit awkward, but they provide you with a great deal of real evidence rather than guesswork.
Human beings are universally bad at guessing where performance bottlenecks are. It just seems to be something our brains aren't built to do very well. It may seem obvious, you may have great ideas about what the problem is, but the real world often turns out to be doing something different. And optimising the wrong part of the code means, at best, lots of work for minimal benefit. More often it makes things slower, and sometimes it breaks things entirely. So before you make any changes for the sake of optimisation, you should always have real evidence from a profiler or other accurate tool.

Not all profilers cost (extra) money. For .NET, I'm successfully using an old build of NProf (currently abandoned, but it still works for me) for profiling my ASP.NET applications. For SQL Server, the query profiler is part of the package. There's also the CLR Profiler from MS, but I've never been able to get it to work successfully.
That being said, profilers are definitely the way to go. That way you can see where your program is spending most of its time, and not focus on things that you think are slow. Plus it means you don't have to write anything in your code to actually record the metrics.
As I hinted at the beginning, there are different types of profilers. The three I find most useful are: application profilers, which let you see which functions you actually spend most of your time in; SQL profilers, which let you see how long your queries take to run; and memory profilers, which help show you what types of objects are using up your memory. All three of these are really useful, and although you won't use them every day, the times you do use them will save you a lot of headache.
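The tools named above are .NET and SQL Server specific; purely as a language-neutral sketch of the same idea, this is roughly what an application-profiler run looks like with Python's built-in cProfile (the function being profiled is a placeholder):

    import cProfile
    import pstats

    def slow_report():
        # Placeholder for the code path you suspect (or don't!) is slow.
        return sum(i * i for i in range(1_000_000))

    profiler = cProfile.Profile()
    profiler.enable()
    slow_report()
    profiler.disable()

    # Let the numbers, not intuition, say where the time went.
    pstats.Stats(profiler).sort_stats("cumulative").print_stats(10)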

Related

When is it too late to optimize for performance?

I know that you shouldn't optimize too early, and you should instead aim for maintainability. My question is, at what point is it too late?
I'm working on a website, similar to yahoo answers, and my database structure is exactly what I feel it should be. Table for users, questions, answers, question_comments, answer_comments, etc.
My question is, IF the site were to grow, how would this architecture scale? I'm thinking of putting both questions and answers in a single table (posts), separating them by type, and then putting both question_comments and answer_comments in the same table (comments). I believe this is similar to Stack Overflow's DB schema.
I know what you guys are going to say: "Don't worry about it until it becomes an actual problem." But wouldn't it be a little too late to worry about it then?
Thanks
The reason why it's a bad practice to optimize early is you don't know where your bottlenecks will be until your website sees a significant amount of traffic. How your users access and interact with your site is an unknown at this point.
It's almost always best to start with a 'good' architecture (normalized database, MVC architecture, DRY, well-written frontend code, etc) and go from there. It will be much easier to scale a clean, organized architecture than one that was prematurely optimized.
At best right now you can do some load testing via ab or another load testing tool to see where your current bottlenecks are. It certainly won't find all of them, but it will find some.
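If ab isn't to hand, even a rough stand-in can surface an obvious bottleneck. Here is a hedged, standard-library-only Python sketch (the URL and request counts are placeholders, not anything from the question) that fires concurrent requests and prints latency percentiles:

    import time
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    URL = "http://localhost:8000/"   # placeholder target
    REQUESTS = 100
    CONCURRENCY = 10

    def timed_get(_):
        start = time.perf_counter()
        with urllib.request.urlopen(URL, timeout=10) as resp:
            resp.read()
        return time.perf_counter() - start

    with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
        latencies = sorted(pool.map(timed_get, range(REQUESTS)))

    print(f"median: {latencies[len(latencies) // 2]:.3f}s  "
          f"p95: {latencies[int(len(latencies) * 0.95)]:.3f}s")

It won't replace a real load-testing tool, but it's enough to see whether response times degrade as concurrency rises.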
If you're really worried about this (and you shouldn't be yet), install Nagios or Munin on your server to monitor performance. Use a third party tool to measure page load time daily. Once you start seeing issues then you can profile and tune.
You absolutely should optimize if a fast service is a fundamental requirement of the application.
If sub-second responses are not a requirement, then you can write clean code and optimize later.
A good example of this was JavaScript before the current generation of browsers: people who wrote nice, clean, extensible JS for their pages got terrible performance and had to start from scratch.
One huge table is generally harder to maintain. People usually cut their tables into partitions and even their databases into shards.
I don't see how putting all comments into the same table would save you a join. Really, putting questions and answers into the same table won't save you a join either; you'll just be joining the same table to itself.
If you want to save on joins, I'd expect you to use a document-oriented NoSQL database, such as MongoDB. There you can store a question with all related answers and comments in a single 'record', fetchable with one operation.
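For what that shape looks like in practice, here is a hedged sketch (the connection string, database, collection, and field names are invented for illustration): one MongoDB document holds the question plus its answers and comments, so a single find_one() returns the whole thread.

    from pymongo import MongoClient

    client = MongoClient("mongodb://localhost:27017")   # placeholder connection
    posts = client["qa_site"]["posts"]

    posts.insert_one({
        "type": "question",
        "title": "How should I model my schema?",
        "body": "...",
        "comments": [{"author": "alice", "text": "Good question."}],
        "answers": [
            {
                "author": "bob",
                "body": "Start normalized.",
                "comments": [{"author": "carol", "text": "Agreed."}],
            }
        ],
    })

    # One round trip, no joins: the question plus all answers and comments.
    doc = posts.find_one({"type": "question", "title": "How should I model my schema?"})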
Databases need to be designed with performance in mind, not patched up after you have a problem later. Premature optimization doesn't mean skipping it at design time; it means not getting ridiculously excessive about it. There are known performance killers for every database backend, and it is foolish to design around one of those when a different technique will be faster and take the same amount of time to code if you are familiar with it. So before designing any database, read up on performance tuning and you will never write database code the same way again.

How can I avoid people using my code for evil? [closed]

I'm not sure if this is quite the right place, but it seems like a decent place to ask.
My current job involves manual analysis of large data sets (at several levels, each more refined and done by increasingly experienced analysts). About a year ago, I started developing some utilities to track analyst performance by comparing results at earlier levels to final levels. At first, this worked quite well - we used it in-shop as a simple indicator to help focus training efforts and do a better job overall.
Recently though, the results have been taken out of context and used in a way I never intended. It seems management (one person in particular) has started using the results of these tools to directly affect EPRs (enlisted performance reports; it's an Air Force thing, but I assume something similar exists in other areas) and similar paperwork. The problem isn't who is using these results, but how. I've made it clear to everyone that the results are, quite simply, error-prone.
There are numerous unavoidable obstacles to generating this data, which I have worked to minimize with some nifty heuristics and such. Taken in the proper context, they're a useful tool. Out of context however, as they are now being used, they do more harm than good.
The manager(s) in question are taking the results as literal indicators of whether an analyst is performing well or poorly. The results are being averaged and individual scores are being ranked as above (good) or below (bad) average. This is being done with no regard for inherent margins of error and sample bias, with no regard for any sort of proper interpretation. I know of at least one person whose performance rating was marked down for an 'accuracy percentage' less than one percentage point below average (when the typical margin of error from the calculation method alone is around two to three percent).
I'm in the process of writing a formal report on the errors present in the system ("Beginner's Guide to Meaningful Statistical Analysis" included), but all signs point to this having no effect.
Short of deliberately breaking the tools (a route I'd prefer avoiding but am strongly considering under the circumstances), I'm wondering if anyone here has effectively dealt with similar situations before? Any insight into how to approach this would be greatly appreciated.
Update:
Thanks for the responses - plenty of good ideas all around.
If anyone is curious, I'm moving in the direction of 'refine, educate, and take control of interpretation'. I've started rebuilding my tools to try to negate or track error better and to automatically generate any numbers and graphs they could want, with documentation included throughout (while tucking away, as obscure references, the raw data they currently seem so eager to import into their 'magical' Excel sheets).
In particular, I'm hopeful that visual representations of error and properly created ranking systems (taking into account error, standard deviations, etc.) will help the situation.
Either modify the output to include error information (so if the error is +/-5%, don't output 22%, output 17%-27%), or educate those it is being used against about the error, so that they can defend themselves when it is used against them.
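A trivial Python sketch of that first suggestion, so the margin of error always travels with the number (the values are just the ones from the example above):

    def format_score(estimate: float, margin: float) -> str:
        # Report a range instead of a false-precision point estimate.
        low, high = estimate - margin, estimate + margin
        return f"{low:.0f}%-{high:.0f}%"

    print(format_score(22, 5))   # "17%-27%" rather than a misleading "22%"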
Well, you seem to have run afoul of the Law of Unintended Consequences in the context of human behavior.
Unfortunately, once the cat is out of the bag, it's pretty hard to put back in. You have a few options (which are not mutually exclusive, by the way) to consider, including:
Alter the reports so that their data can no longer be abused in the way you describe.
Work with management to help them understand why their use of your data is improper or misleading.
Work with those whose performance is being measured to pressure management to rethink their policy on the matter.
Work with management/analysts to come up with a viable means to measure performance in a way that is fair to everyone.
Break the reports in a manner that makes them unusable for any purpose.
Clearly there is a desire on the part of management to get analytics on performance of analysts. Likely there is a real need for this ... and your reports happened to fill a void in the available information. The best option for everyone would be to find a way to effectively and fairly fill this need. There are many possible ways to achieve this - from dropping dense rankings in favor of performance tiers to using time-over-time variance to refine performance measurements.
Now, it's entirely possible that the existing reports you've provided simply cannot be applied in a fair and accurate manner to address this problem. In which case, you should work with your management team to make sure they understand why this is the case - and either redefine the way performance is measured or take the time to develop an appropriate and fair methodology.
One of the strongest means to convince management that their (ab)use of the data in your report is unwise is to remind them of the concept of perverse incentives. It's entirely possible that over time, analysts will modify their behavior in a way that results in higher rankings in performance reports at the cost of real performance or quality of results that are not otherwise captured or expressed. You seem to have a good understanding of your domain - so I would hope that you could provide specific and dramatic examples of such consequences to help make your case.
All you can do is to try and educate the managers as to why what they're doing is incorrect.
Beyond that, you can't stop idiots from being idiotic, and you'll just go mad trying.
I definitely wouldn't "break" code that people are relying on, even if it's not a specific deliverable. That will only cause them to complain about you, a move which may affect your own EPR :-)
I really think the key here is good communication with your managers.
Besides, I like PatrickV's idea. You could also try some other ways to engineer your tool around the problem so that it'll seem silly/be hard to use it as performance measurement - change the name of the statistics to mean something other than "how good programmer X is", make it hard to get data per-person, show error statistics.
You can also try to display the data in another way (this may actually make your managers think you are trying to help them). Show a graph: a few pixels' difference in position is harder to fixate on than a numeric result (my guess is your managers are using Excel and coloring everything below average red). Draw the error margin so it doesn't make sense to obsess over fractions of a percentage point.
Give the result as a range, with low and high bounds that take your error information into account; that makes it harder to compare.
Edit: Oh yeah, and read about "social interfaces". You can start with Spolsky's Not Just Usability and Building Communities with Software.
I would echo paxdiablo's advice, as a first step:
Work on the report on the inherent errors. In fact, make it the introduction to every copy generated.
When you refer to the measurement errors, indicate they are the lower limit of the errors (unless there actually aren't any).
Try to educate the manager(s) in the error of his/her ways.
If possible, discuss the issue with your manager, and perhaps with the offending managers' management; depending on how familiar you are with them, you would probably limit it to expressing some concerns and giving a heads-up.
Consult your HR department, or whoever is in charge of fairness in the performance reviews.
Good luck.
The problem is that the code is not yours, it belongs to your company. They really can do whatever they want with it.
I hate to say this, but if you have an issue with the ethics of your company you will have to leave that company.
One thing you could do is implement the comparison yourself. If he really wants to check whether somebody is performing significantly worse than the rest, it should be tested formally as well.
Now, choosing the right test is a bit tricky without knowing the data and the structure, so I can't really advise you on that one. Just take into account that if you do pairwise comparisons, or compare multiple scores against an average, you run into the multiple-testing problem. A classic way of correcting for it is the Bonferroni correction. If you implement that one, you can be sure that at a certain point no one will stand out any more; the Bonferroni correction is very conservative. Another option is the Dunn-Sidak correction, which is supposed to be less conservative.
The correct implementation would be an ANOVA (if the assumptions are met and the data are suitable, of course) with a post-hoc comparison like a Tukey Honest Significant Difference test. That way at least the uncertainty in the results is taken into account.
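As a hedged sketch of what those tests look like in code (Python with SciPy and statsmodels; the analyst names and scores below are invented placeholders, not real data):

    import numpy as np
    from scipy.stats import f_oneway
    from statsmodels.stats.multicomp import pairwise_tukeyhsd

    analyst_scores = {
        "alice": [0.91, 0.88, 0.93, 0.90],
        "bob":   [0.89, 0.92, 0.90, 0.91],
        "carol": [0.84, 0.86, 0.85, 0.87],
    }

    # One-way ANOVA: is there any difference between analysts at all?
    f_stat, p_value = f_oneway(*analyst_scores.values())
    print(f"ANOVA p = {p_value:.3f}")

    # Tukey HSD post-hoc comparison: which pairs differ, with the
    # multiple-comparison problem accounted for.
    values = np.concatenate([np.array(v) for v in analyst_scores.values()])
    labels = np.repeat(list(analyst_scores.keys()),
                       [len(v) for v in analyst_scores.values()])
    print(pairwise_tukeyhsd(values, labels, alpha=0.05))

A plain Bonferroni correction (dividing the significance level by the number of comparisons) is the simpler, more conservative alternative mentioned above.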
If you don't have a clue on which test to use, describe your data in detail on stats.stackexchange.com and ask for help on which test to use.
Cheers
I just wanted to elaborate on the perverse incentives answer from LBushkin. I can easily see your problem extending to where analysts will avoid difficult topics for fear of reducing their score. Or maybe they will give the same answer as earlier stages to avoid hurting a friend's score, even if that is not correct. An interesting question is what happens if the later answer is incorrect: you have no ground truth, just successive analytic opinions. In that case I assume the first answer is marked as "incorrect", right?
Maybe presenting some of these extensions to the manager will help.

New project panic [closed]

I have a question which is not strictly-speaking programming related, but nevertheless caused by being an analyst and a programmer at the same time.
It's about starting new projects which seem to be unfeasible because they have an unknown domain, lack specifications, and/or require technology I am not familiar with. I get a sort of panic when I approach such a project, and then relax as my understanding of the domain and technology grows.
Is this something you experience? How do you cope with it?
The best way that I know of to try to contain and control the human factors in a project is to have a clear idea of your own processes.
Start with some Domain Driven Design, work with the users and help them to understand their domain and the business processes that surround the domain. Often developers are far better at abstraction than the managers/business people so we can often help them to understand their own domain.
Build up a set of acceptance criteria; these form your tests, which in turn form your spec.
Once you have an idea of the above, you know much more about feasibility and how long it will take (and even whether the technology that has been specified is the right one).
As for approaching new technologies, start small, build a proof of concept and make your mistakes there rather than on production code. There is a huge amount of best practice on the web and places like StackOverflow are good places to start.
I would suggest working in an agile fashion: get the project owners to prioritise the work that needs to be done, work out what is needed for the next two-week sprint, and deliver it (which may mean stubbing out a lot of functionality). They'll tell you when it is wrong, and it may influence their own decision making.
Don't view the entire project as a nasty whole, break it down into deliverable sections and one step at a time.
Calm down.
If the project is initially infeasible (even if only in your own mind) then start with a feasibility study. This is a sub-project with which you will define the project (or at least the next sub-project).
You've already defined several major tasks within the feasibility study: learn about the domain, write some specifications, learn enough about the new technologies.
As for me, no I never panic about this sort of situation, I love starting with a blank sheet of paper, and experience has taught me how to start filling it in real quick.
So, take a few deep calming breaths and jump in.
Yep, I get this feeling all the time. But I always think of technologies as tools. Once you get the hang of them, the rest is easy.
Whenever I don't feel like that is when disaster lurks! It's like eating an elephant: just do it one bite at a time. Do some part you do understand, and that gives you a handle on the next bit.
unfeasible,
unknown domain,
lack specifications,
require technology which I am not familiar with
I think that's how we start our life too. As long as you are confident that you can pull it off, just stick to it and you will see that things are working in your favor provided:
You understand the importance of being a self-starter
You take responsibility for who you are
You ask the right questions at the right time
All the best!!!
Often the trouble with these infeasible projects is that the client is on a limited budget and will go bust before you complete your feasibility study. In this case it might be worth taking a step back from the technology and looking at the economics. Maybe subcontracting to someone with the required knowledge will ease the pain.

To be a lazy developer or not to be a lazy developer? [closed]

Am I a lazy developer?
Is it being lazy to use automated tools, such as code generators and such?
Now, I could, if I had to, create all the data layers and entities I needed, but I choose to use CodeSmith to generate my data layers and entities.
I also use Resharper and I would say it fights with MSDeploy as to which gets installed first after Visual Studio.
Again if I had to, I could code without it, but prefer not to.
Both these tools from my point of view are no brainers as they improve output massively.
But is this lazy?
I'm sure there are purists out there who would say everything should be written by you so you know what everything is doing, but if you can read through the code and see what is happening, is that OK?
So am I being lazy or am I just using all the cards in my hand?
In programmers, laziness is a virtue, so don't worry.
It's only lazy if you use a tool to produce code and use it as-is without verifying that the code meets your needs and abides by your standards.
You don't need to reinvent the wheel n times; that has been done often enough. Briefly, I'd say that using tools like the ones you mentioned (within reason) is absolutely no problem...
For you? No, you're not being lazy.
For the guy that doesn't understand what code generators are doing and how they do it? Yes, it's being lazy.
That's the important distinction: You have to know what you gain and know what you're missing by using a code generator. If you don't, it's only a matter of time before you come across a case where you have to be able to produce those classes and not know how.
Both these tools from my point of view are no brainers as they improve output massively.
This means you're not being lazy, you are using the appropriate tools to enable you to concentrate on the important aspects of the job.
It's not being lazy - it's being smart. There's nothing wrong with using every tool at your disposal...as long as it makes you more productive. Using tools for the sake of using tools is a bad idea.
However, if you don't know what your tool is doing under the hood, you should learn about it so if you don't have the tool available for some reason, you can get the job done.
I think that's the wrong question. Laziness is a virtue. I've seen too many programmers who do things the hard way rather than sitting back and thinking for a few minutes to come up with an easier way. I've had so many times that I've said to a junior programmer something to the effect of, "Yes, I respect your diligence in working through lunch and staying late to write the code to do X, but if you'd taken a few minutes to check the documentation you might have seen that there is already a function in the library that does that". Or similar stories.
I'm not familiar with the specific tools you describe, but to me, the question always is: does this tool actually save me any work? I've tried plenty of "code generators" that basically just create code stubs. So gee, thanks, you wrote the "function x(int, float)"; now all I have to do is fill in the actual parameter names and write the code. What did that save me? I've also seen plenty of code generators that write really awful code. So now I have to try to add the "custom" code to this jumbled mess. Wouldn't it have been easier to just write the whole thing cleanly the first time? I've seen plenty of productivity tools where I found it takes me more time to set up the parameters to run the tool than I actually saved by using it. (Like the old joke that it's been proven that jogging regularly really does make you live longer: for every 60 minutes you spend jogging, it adds 30 minutes to your life.) Some tools may produce code or data structures or whatever that is difficult to maintain, so you save an hour today but it costs you ten hours in maintenance over the life of the project. Etc.
My conclusion isn't that you shouldn't use productivity tools, but rather that you should make sure they really are increasing your productivity, and not just giving an illusion of doing so. If in your case you find these tools really do help you, then using them is not "cheating", it's simply smart.
As everyone else already pointed out there's nothing wrong in your use of code generators.
Still, I can see downsides and reasons to avoid it in certain particular situations.
choice of language. Sometimes the very fact that you need a code generator to get your coding started could imply you're using the wrong language for the task. Most of the time the language cannot really be chosen, so code generators remain the best way to go.
code redundancy. Depending on the actual generator used, generated code could be redundant. If that happens, and generation is done once, isn't automated, and the generated code goes into the main repository, maintenance problems could arise in the long run. Not really a problem with code generation itself, but with the way it should and should not be used.
adding development platform requirements. We have to concede that many programmers out there work on bread-toasters doubling as PCs. It's really a bad (and sad) reality of cheap business practices meeting sharp minds (sharp minds go to waste in the process). It could become a concern if our project (which could have a port in store for the future, possibly at an external facility) needs a hefty, RAM-hogging, insufficiently cross-platform IDE handy to compile every little modification.
So, no definitive answer on code-generating laziness and programming: it depends. Then again, using the wrong tools for the job is bad for your health (and business), so... don't.
You're using all the cards in your hand. Why reinvent the wheel when there are tools available to make your job easier. Bear in mind these tools DON'T do your job, they only assist.
What you create is down to you, so using the tools is not lazy... it's just intelligent.
I'd say you're more efficient rather than lazy.
Programming is primarily a thinking exercise not a typing one. So long as you understand what the tools are doing you're shifting the balance away from typing to thinking. Doing more of what your job is about? Doesn't sound like lazy to me!
I'm sure there are purists out there that would say everything should be written by you so you know what everything is doing
This might have been a viable point of view during the early days of programming. But nowadays, this is simply not feasible (or even preferable). After all, you've already obscured a certain level of understanding just by using a high-level language.
That said, I've found it to be a great learning exercise to write some of these things by hand occasionally. Not only do you get to learn more, but they teach you how helpful these tools really are (or aren't). Note that I'd only do this on a personal project though. I wouldn't do this for any project someone was paying me for (unless I were working for a masochist or something).
Ask yourself why there are so many ORM and other code-generation tools around. I'd say go for it with the proviso that you leave it maintainable for the next guy/gal.
Programming is about being lazy, about automating repetitive tasks. If you can't do that inside your language, using code generators and similar things is a useful workaround.
It depends on what you're writing, of course. I am surprised nobody's brought this up. If you're writing device drivers, operating systems, protocols, or server software (web servers, TCP-driven servers, etc.), you should probably do it by hand.
But with what I do and probably what a lot of us do is implement business processes in code for web pages or web services. And in those areas, if you can improve on your code with code generators, go for it.
Yes, you are being a lazy developer; be honest with yourself, if you took the time to do it the hard way you could call yourself less lazy than you are now.
The point is, being lazy isn't inefficient at all.
Lazy people take time to look at the problems from different direction before acting upon it, this avoids unnecessary errors which saves you valuable time.
So you're being lazy, but that's OK. People don't hire hyperactive coders who crank out 10 applications a day but leave a trail of bugs in their path. Bug-fixing costs time, and time is money.
Conclusion:
Laziness = profit
Go for it.
I think the best developers are also the laziest. Basically, everything you do should be focused on getting the end result with the least amount of work. This often produces the best result and also keeps developers from being distracted by fun things to include in a project. A lazy developer would, for example, never add an easter egg to his code, simply because that would be more code, which could introduce more bugs that need to be fixed later on. Adding code is bad, since you'd also add more bugs that you need to resolve later. Still, you will need to add code, else you won't get paid. So, as a lazy developer you would of course choose the most optimized, best-tested code that would almost never fail, and you'd work in a way that reduces the chance of errors to a minimum.
Do keep in mind that lazy developers should focus on avoiding work in the future, not on avoiding work right now! So stop reading here and get back to work! ;-)
Laziness is a trait that most good programmers have. Unless they work for Adobe, in which case they are often lazy in a bad way.

Is the UI a valid indicator of internal quality? [closed]

When looking at the myriad types of software written at our company, I instantly jump to conclusions about the quality of the entire product based on the UI. If I find misspellings, weird tab orders, fields not lined up, or odd colors, I assume that the entire application is of poor quality.
I'm assuming that if the programmer doesn't care enough to make the outside look good, that they don't care enough at all. I am NOT assuming if the UI looks good that the application does what it should, although I am not immediately down on it -- it gets more leeway when it's being evaluated.
Is this a valid decision to make? For commercial software as well?
It may or may not be. But that's not really relevant. To your end user, crappy UI = bad code.
I think it's a good indicator of the care that a developer has for their work - basically a sense of professional pride.
It's a given that most devs don't make fantastic UI designers, but there are a basic set of rules that should be followed when developing professional software and these apply as much to the UI as they do to the internals.
So, basically I agree with you.
IF the application was written by one developer, it's not an unfair assumption that a slovenly UI is indicative of the underlying code quality.
However, if it was written by a team of 5 or 7 or 13, there will likely be a wide range of quality under the hood (it just might be that the newbie was given the UI).
Also, if the app is 5+ years into its lifecycle, with maintenance being performed by FBN contractors or interns or whoever is handy, you may find a lot of good code under the hood that's slowly rotting because of indifferent management and undisciplined developers who just throw a "patch" at it, compile it, check it back in, and throw it over the wall to production.
A crappy UI can be indicative of a lot of things, none of them good, some worse than others.
In my opinion it is a valid decision. And you are right when you say that good looking software is not necessarily good software internally.
But definitely, if the programmers don't care about the usability of the program, most likely they won't care about its functionality.
If the UI is riddled with typos and inconsistencies, it is probably fair to say that the QA process and project management were a bit lacking. That doesn't necessarily mean the codebase is riddled with bugs.
In a commercial product, it most likely means that less people will buy it, so whilst sales are not really a quality metric, they're pretty important in the overall scheme of things.
People are more likely to buy things that look good, behave as they expect them to, and "Don't make them think".
Many programmers suck at UI design, and that's not their fault, it does not mean they suck at coding. They're just generally more interested in the internal beauty of what they make, otherwise, they'd be liberal arts majors instead.
It really depends. I know of software developers who are excellent at just about all aspects of design and implementation but have lousy UI skills. Many times the UI is an afterthought as a nod to the user. In the cases of scientific software or other software where the processing is central or key, it might not be a good idea to judge the quality of the rest of the code by the UI. However, overall - it might be a good indicator that the software company has not done its job well.
It all really depends on each case, but if the UI is not usable or a pain in the neck, then the underlying code is harder to use and not worth the time I suppose.
The opposite is also not true - flashy, beautiful UIs do not mean that the underlying code is good at all. Anyone can wrap a piece of junk with a nice UI.
I'd agree with the masses here. A poor UI means that the product development team dropped the ball. That said, I consider myself a good coder: great at math, but dyslexic and with attention deficit disorder. Give me a set of earphones and some code and I'm on my way. Don't, however, expect me to mock up a great GUI. Lining things up, that I do.
Now ADD to that the fact that as the "programmer", even when I see things in the GUI that bug the crap out of me (as a person who uses it), I don't get to fix them. Hell, when I do fix them I get QA asking me for the design document and the approval from on high. After a while I stopped caring about the GUI.
I write solid code that works. It's fast, clean, and small; it's where I get to have an impact. The GUI is beyond my pay grade. :(
In my experience it's usually the other way around. You get good quality UIs by having people who spend "huge" amounts of time focusing on widget behavior and look and feel instead of the domain model or automated tests.
Some of the best quality systems I've worked with had auto-generated UIs, that were rather unpleasant to use.
As much as I really want to say, "Yes, absolutely," it's not always a valid conclusion. The programmer or QA team may have an excellent understanding of the application but a terrible grasp of the language or presentation.
Some people simply focus on what they consider to be important—and get it fairly close to perfect—and all but ignore what they consider "fluff" or "window dressing."
But I do have a very strong tendency to pre-judge the overall quality of the software based on first impressions.
No, the UI is not indicative of the internal code. Many times we come across things that are shiny and look cool but serve no purpose. Think of it as seeing a Ferrari parked at the store. It looks awesome and you wonder what it would be like to get behind the wheel, only to find out it's a body kit slapped on a 1980 late-model Acura with 500k miles on it.
A personal example: at my current employer, we have stellar code in our software (and I say this subjectively since I was not there for 99% of its creation). But when you look at our UI, it can seem a bit old and rusty. That, and many times the UI developers don't even touch much of the internal code.
"I'm assuming that if the programmer doesn't care enough to make the outside look good, that they don't care enough at all" - I don't believe this to be true. I think most programmers look for functionality as opposed to shininess as they tend to be creatures of logic, not artists.
Take Linux as an example: stellar internal code, but the UI was lacking for a long time, which is why the mainstream hasn't used it extensively, as opposed to Windows or Mac.
Short version: !UI.Equals(InternalQuality)
