I have a question which is not strictly speaking programming-related, but which nevertheless comes from being an analyst and a programmer at the same time.
It's about starting new projects which seem infeasible because they have an unknown domain, lack specifications, and/or require technology I am not familiar with. I get some sort of panic when I approach such a project, and then relax as my understanding of the domain and the technology grows.
Is this something you experience? How do you cope with it?
The best way that I know of to try to contain and control the human factors in a project is to have a clear idea of your own processes.
Start with some Domain Driven Design: work with the users and help them to understand their domain and the business processes that surround it. Developers are often far better at abstraction than managers and business people, so we can frequently help them understand their own domain.
Build up a set of acceptance criteria; these become your tests, which in turn form your spec.
Once you have an idea of the above, you know much more about feasibility and how long it will take (and even whether the technology that has been specified is the right one).
As for approaching new technologies: start small, build a proof of concept, and make your mistakes there rather than in production code. There is a huge amount of best practice on the web, and sites like StackOverflow are a good place to start.
I would suggest working in an agile fashion: get the project owners to prioritise the work that needs to be done, work out what is needed for the next two-week sprint, and deliver it (which may mean stubbing out a lot of functionality). They'll tell you when it is wrong, and it may influence their own decision making.
Don't view the entire project as one nasty whole; break it down into deliverable sections and take it one step at a time.
Calm down.
If the project is initially infeasible (even if only in your own mind) then start with a feasibility study. This is a sub-project with which you will define the project (or at least the next sub-project).
You've already defined several major tasks within the feasibility study: learn about the domain, write some specifications, learn enough about the new technologies.
As for me, no, I never panic in this sort of situation. I love starting with a blank sheet of paper, and experience has taught me how to start filling it in real quick.
So, take a few deep calming breaths and jump in.
Yep, I get this feeling all the time. But I always think of technologies as tools: once you learn how to handle them, the rest is easy.
Whenever I don't feel like that is when disaster lurks! It's like eating an elephant: just do it one bite at a time. Do some part you do understand, and that gives you a handle on the next bit.
"unfeasible, unknown domain, lack specifications, require technology which I am not familiar with"
I think that's how we start our lives too. As long as you are confident that you can pull it off, just stick with it and you will see that things work in your favor, provided:
You understand the importance of being a self-starter
You take responsibility for who you are
You ask the right questions at the right time
All the best!!!
Often the trouble with these infeasible projects is that the client is on a limited budget and will go bust before you complete your feasibility study. In this case it might be worth taking a step back from the technology and looking at the economics. Maybe sub-contracting to someone with the required knowledge will ease the pain.
So there are a lot of posts around here about the best ways to teach kids to program. I'm interested in the next step: teaching kids how to debug code that doesn't do what they want, or that doesn't work 100% of the time (I believe these are separate problems, but that could be subjective).
I ask from the point of view of a game developer who already has a working game (ROBLOX) where kids can code up a ton of crazy stuff in our embedded scripting language, which happens to be Lua.
What we are seeing is that as these scripts become more complicated they suffer from edge cases that the kids didn't consider, ultimately limiting the scope of what they can do. Part of the solution is education and part of the solution is better debugging tools. Thus I ask a two-part question:
What high quality, freely available sources of information exist on the internet that we can send aspiring script developers to with any expectation that they would get something valuable out of it? Maybe there aren't any and we need to write some?
What debugging tools do you think would be most useful to kids? I want to hit the payoff vs. complexity sweet spot.
Our target demographic here is motivated kids, mostly 12-15 years old.
IMHO: Never mind tools. Talk them through it. Teach problem-solving skills. And just as importantly, teach testing.
Well, for the debugging part, my guess would be a few things:
Avoid bugs in the first place by teaching them good programming practice
Test each part, e.g. with unit testing (lunit)
Use print() liberally to see what happens (see the sketch after this list)
You might be interested in debugger.lua or RemDebug
Use a decent editor with syntax highlighting, bracket matching, ...
For the general information:
Learning Lua on the Lua-users wiki
The Lua reference manual
Programming in Lua
That's the way I learned using Lua anyway :).
Of course, an early start always helps. In the early years, brains aren't yet wired to one particular language the way they are in adulthood. http://blog.quib.ly/2012/10/30/can-kids-beat-adults-at-coding/
I don't know about the "sources of information" part. It looks a bit too generic to me. I learned about edge cases with painful experience, and don't know any other means. I'm not sure it is a kind of knowledge that can be taught formally. It's more like an intuitive thing to me. Kind of like swimming: in order to learn, you have to get wet.
But regarding payoff-vs-complexity part, I'd say that nothing beats the good old console + print duet. It might not be as fancy as other debugging means, but its complexity asymptotically approaches 0. And it's something they will be able to use in nearly any environment and any language they encounter in the future (unless something really big happens).
If you have an iPad, there's now a nice app that lets you write programs/games/simulations and run them directly on your iPad. The language is Lua.
http://twolivesleft.com/Codea/
I would use NetBeans after stripping it down a bit. It has some very nice code hinting and comprehensible error checking.
Kids may have restricted access to tools like debuggers, since an individual may not be registered as a programmer or (game) software developer at the state or national level. Lua can be run in debug or trace mode, but there is also something to be gained from reading through the script with pen and paper: using test input values, note each variable and its contents, record logic jumps and any expected return values separately, and assess the output values produced at the relevant points. This is sometimes called dry running, and it is normally done before the first full test in the development process. It can help in coping with complex logic, with stack contents written bottom-to-top or left-to-right on the paper.
I'm a member of a software development team, working on a small project.
We think that we can release a beta-quality product after 2 or 3 months of continuous work.
Since this is our first team project, I decided to ask: which software development methodology would you suggest for a small project with a small number of developers (fewer than 10)?
There are two approaches to software development:
Write down what you are going to do, do it, then agree that you have done it.
Start developing stuff, agree that what you have done is good, repeat until finished.
Both have their adherents and both pop up repeatedly under a variety of names. Each new generation of software developers (ie about every 2 years, this is a fast changing industry and software developers have the lifespan of a mayfly) rejects the previous generation's approach, re-discovers the approach used by the generation before last, renames it something funky and declares it to be the ONE TRUE WAY.
The choice between the approaches ought to depend on the culture of (a) the customer organisation and (b) to a lesser extent, the culture of the supplier organisation (ie your software developer team).
So, if you work for a buttoned-down conservative enterprise approach 1 is indicated. If you look down and see that you are wearing surf shorts and came to work this morning on your skateboard, go with approach 2.
And, in case you have read this far, the most serious bit is the paragraph before the one before this final one, i.e. the one starting 'The choice ...'. This is a cultural/organisational issue rather than a technical one. Both approaches have been used on many, many successful projects, and neither has a monopoly on unsuccessful projects.
This really does depend on what you are intending to build. If the project is something you want to build upon and release at regular intervals, something like Agile/Scrum would be well suited.
But it really does depend on the project itself when it comes to determining release iterations and the like.
I think you need to start with the Joel Test and try to implement most of the list:
http://en.wikipedia.org/wiki/The_Joel_Test
And for product development, use KISS (Keep It Simple, Stupid) for the first release.
Another really good start is the book Getting Real, available free from 37signals:
http://gettingreal.37signals.com/toc.php
This really does depend on your customer.
If the customer can accept fixed time, fixed resources, fixed quality (100% working code), and slightly variable scope, I recommend choosing an agile methodology.
If the customer cannot accept the above, i.e. the pre-condition for using an agile methodology is not present, I recommend choosing any methodology you like.
The important thing is that you do have a methodology, learn what is working as you go, and use the knowledge to adapt the methodology.
Don't do waterfall; it never worked and will never work. Thinking waterfall is a viable methodology is like thinking banging your head against the wall is good, because even the sturdiest wall MUST crumble at some point.
I'd go with a reasonable agile methodology, like Scrum (XP is a bit harsh). Also, introduce things like TDD, DDD, DBC and you should be fine.
I won't suggest this as THE best answer without having a better idea of the context and circumstances, but I am personally becoming a fan of the Lean/Kanban approach. In general I find a lot of the agile/scrum methods can be fairly developer-focused, and sometimes almost anti-manager, which is occasionally appropriate but not always. The lean approaches tend to address the entire value stream rather than just the development itself.
You can read more about it at: http://www.limitedwipsociety.org/
This may be a hopelessly vague question, but I am interested to hear whatever logical thought processes people go through when learning a new concept or trying to get their brain around code they might never have seen before.
Basically, what general steps does one take to break down problems, and what does it take to "get it"? If you were to diagram a flowchart of how your mental process works when you look at code or try to solve a problem, what might it look like?
What common references, tips, and mental assumptions do you find useful in problem solving?
How is this different between different domains? For example in what ways is a web programmer's thought process similar or different from a traditional desktop app developer's process?
I'm a big believer that no matter what type of application you're looking at for the first time, be it a web app, a desktop app, a device driver, or whatever else, there are three steps a developer usually follows in order to understand how it works:
Get the big picture :
What kind of app is this (web, desktop, ...)?
How is it layered (standalone, client-server, n-tier, ...)?
What is the app's purpose? What is it supposed to do?
Who is the app made for?
See how it works :
What language(s) is (are) used?
How is the code structured?
How is the data structured?
Understand (or at least try to) the way the app has been thought through:
Has it been thought through at all?
Is the app clearly optimized? (For performance? For readability?)
Is the app finished? Or is there room for evolutions?
Are there signs of multiple releases?
etc...
The 1st and 2nd steps are purely technical, while the 3rd MUST be as untechnical as possible... it's more about psychology and understanding how the app has been built. It obviously requires experience, but as long as you think hard enough and don't waste your brain's time with technical details, you'll eventually get it.
This whole process shouldn't require the use of a keyboard. You're only supposed to read, think, and take notes on a paper (I'm not kidding: pen and paper!).
Ho ho, good luck with this one. It's a great question and I'm sure you'll get a ton of answers. Although I have to say I cannot give a satisfactory answer to this - the last thing I would describe my thought processes as is a flow chart - I don't think there is any golden formula for this.
The only tip in problem solving I can recommend is discussing it with somebody else. In those times when you hit a brick wall, going through it with a colleague is invaluable. Quite often, as well, they will actually not even add much to the discussion - in the process of getting all your thoughts out in the open, the solution can become clear.
People are notoriously bad at examining their own thought processes, but I'll give it a whirl. I test very high for visuo-spatial ability in IQ tests, medium-to-high for verbal skills, and moderate for mathematical skills (which explains my A-level Maths grade, I suppose), and when I start to design software, I think in terms of shapes and the connections between them. When it comes to describing these thoughts to others (or clarifying them for myself), I use simple block diagrams or the object diagrams taken from Jacobson's Objectory method, NOT the over-complex stuff that UML suggests. I sometimes write textual descriptions of complex things, mostly as reminders to myself, but never use numbers or maths.
Of course this is just me - I've worked with maths whizzes who were just as good or even better programmers than myself.
I don't think... I process.
This is actually less flip than it sounds. I always break down tasks into their components and then break those down further, and that doesn't just go for writing software! Much like @Mark Pim, I go through things sequentially.
My wife gets really annoyed when I make dinner because I take so long to get started.
Divide & Conquer
I start by trying to grasp the entire problem as it is, then look for patterns I can recognize, and do the same for them in a kind of recursive process, until I have a broken-down solution I can implement and follow more easily.
This is one of the rare times I would answer with "it just works." I learn things by steamrolling through them. I don't have gimmicks or devices to help me. It took me some time to learn PHP, but after that JavaScript was much easier. Once you tackle one thing, the next items become cumulatively easier.
Personally, I conduct an internal dialogue with myself 'OK so we need to loop over this list of integers.' 'But we can break when we find the value we want.' 'OK, will the list definitely be initialised when we start?'
I'd be interested to see if any psychological research had been done on problem solving techniques.
Similar to Jonathan Sampson - it kind of just works.
When I'm attacking a real problem, I try to think of the most logical way of getting through it.
Then, when everything goes wrong (as it usually does), I have to make hundreds of sidesteps to get things done. Just keep focusing on that end goal, that logical path, and you'll get there.
Eventually it decides to work for me, and I end up with a finished product that is usually nothing like I planned it to be. As long as the customers are happy, I am!
Personally, I see code in my head pictorially rather than textually (like Neil Butterworth); it's a bit hard to describe, since (quoting STIV) "there's no common frame of reference."
My main skill is identifying similarities between models or systems I already know about and the task at hand. The connections between some of these can seem quite abstract; the key is to spot the connections. This leads to the abstraction of common patterns and approaches which are widely applicable. Related to this, the most important thing I learnt about algorithms was that the problem is never 'come up with a smart algorithm to solve X'. It's 'model problem X such that it can be solved by existing smart algorithm Y'.
When looking at the myriad types of software written at our company, I instantly jump to conclusions about the quality of the entire product based on the UI. If I find misspellings, weird tab orders, misaligned fields, or odd colors, I assume that the entire application is of poor quality.
I'm assuming that if the programmer doesn't care enough to make the outside look good, they don't care enough at all. I am NOT assuming that if the UI looks good the application does what it should, although I am not immediately down on it; it gets more leeway when it's being evaluated.
Is this a valid decision to make? For commercial software as well?
It may or may not be. But that's not really relevant. To your end user, crappy UI = bad code.
I think it's a good indicator of the care that a developer has for their work - basically a sense of professional pride.
It's a given that most devs don't make fantastic UI designers, but there is a basic set of rules that should be followed when developing professional software, and these apply as much to the UI as they do to the internals.
So, basically I agree with you.
If the application was written by one developer, it's not an unfair assumption that a slovenly UI is indicative of the underlying code quality.
However, if it was written by a team of 5 or 7 or 13, there will likely be a wide range of quality under the hood (it just might be that the newbie was given the UI).
Also, if the app is 5+ years into its lifecycle, with maintenance performed by FBN contractors or interns or whoever is handy, you may find a lot of good code under the hood that's slowly rotting because of indifferent management and undisciplined developers who just throw a "patch" at it, compile it, check it back in, and throw it over the wall to production.
A crappy UI can be indicative of a lot of things, none of them good, some worse than others.
In my opinion it is a valid decision. And you are right when you say that good looking software is not necessarily good software internally.
But definitely, if the programmers don't care about the usability of the program, most likely they won't care about its functionality.
If the UI is riddled with typos and inconsistencies, it is probably fair to say that the QA process and project management were a bit lacking. That doesn't really imply that the codebase is riddled with bugs, though.
In a commercial product, it most likely means that less people will buy it, so whilst sales are not really a quality metric, they're pretty important in the overall scheme of things.
People are more likely to buy things that look good, behave as they expect them to, and "Don't make them think".
Many programmers suck at UI design, and that's not their fault; it doesn't mean they suck at coding. They're just generally more interested in the internal beauty of what they make, otherwise, they'd be liberal arts majors instead.
It really depends. I know of software developers who are excellent at just about all aspects of design and implementation but have lousy UI skills. Many times the UI is an afterthought as a nod to the user. In the cases of scientific software or other software where the processing is central or key, it might not be a good idea to judge the quality of the rest of the code by the UI. However, overall - it might be a good indicator that the software company has not done its job well.
It all really depends on each case, but if the UI is unusable or a pain in the neck, then the application is harder to use and probably not worth the time, I suppose.
The opposite is also not true - flashy, beautiful UIs do not mean that the underlying code is good at all. Anyone can wrap a piece of junk with a nice UI.
I'd agree with the masses here. A poor UI means that the product development team dropped the ball. That said, I consider myself a good coder: great at math, but dyslexic and with attention deficit disorder. Give me a set of earphones and some code and I'm on my way. Don't, however, expect me to mock up a great GUI. Lining things up, that I do.
Now ADD to that the fact that, as the "programmer", even when I see things in the GUI that bug the crap out of me (as a person who uses it), I don't get to fix them. Hell, when I do fix them I get QA asking me for the design document and the approval from on high. After a while I stopped caring about the GUI.
I write solid code that works. It's fast, clean, and small; it's where I get to have an impact. The GUI is beyond my pay grade. :(
In my experience it's usually the other way around. You get good-quality UIs by having people who spend "huge" amounts of time focusing on widget behavior and look & feel instead of the domain model or automated tests.
Some of the best quality systems I've worked with had auto-generated UIs, that were rather unpleasant to use.
As much as I really want to say, "Yes, absolutely," it's not always a valid conclusion. The programmer or QA team may have an excellent understanding of the application but a terrible grasp of the language or presentation.
Some people simply focus on what they consider to be important—and get it fairly close to perfect—and all but ignore what they consider "fluff" or "window dressing."
But I do have a very strong tendency to pre-judge the overall quality of the software based on first impressions.
No, the UI is not indicative of the internal code. Many times we come across things that are shiny and look cool but serve no purpose. Think of it as seeing a Ferrari parked at the store. It looks awesome and you wonder what it would be like to get behind the wheel, only to find out it's a body kit slapped on a 1980 late-model Acura with 500k miles on it.
A personal example: at my current employer, we have stellar code in our software (and I say this subjectively, since I was not there for 99% of its creation). But when you look at our UI, it can seem a bit old and rusty. And many times the UI developers don't even touch much of the internal code.
"I'm assuming that if the programmer doesn't care enough to make the outside look good, that they don't care enough at all" - I don't believe this to be true. I think most programmers look for functionality as opposed to shininess as they tend to be creatures of logic, not artists.
Take Linux as an example: stellar internal code, but the UI was lacking for a long time, which is why no one in the mainstream used it extensively, as opposed to Windows or Mac.
Short version: !UI.Equals(InternalQuality)
You've just written a pile of code to deliver some important feature under pressure. You've cut a few corners and mashed some code into over-bloated classes with names like SerialIndirectionShutoffManager.
You tell your boss you're going to need a week to clean this stuff up.
"Clean what up?"
"My code - its a pigsty!"
"You mean there's some more bug fixing?"
"Not really, its more like.."
"You're gonna make it run faster?"
"Perhaps, buts thats not.."
"Then you should have written it properly when you had the chance. Now I'm glad you're here, yeah, I'm gonna have to go ahead and ask you to come in this weekend.. "
I've read Martin Fowler's book, but I'm not sure I agree with his advice on this matter:
Encourage regular code reviews, so refactoring work is encouraged as a natural part of the development process.
Just don't tell; you're the developer, and it's part of your duty.
Both these methods squirm out of the need to communicate with your manager.
What do you tell your boss?
It's important to include refactoring time in your original estimates. Going to your boss after you've delivered the product and then telling him that you're not actually done is lying about being done. You didn't actually make the deliverable deadline. It's like a surgeon doing surgery and then not making sure he put everything back the way it was supposed to be.
It is important to include all the parts of development (e.g. refactoring, usability research, testing, QA, revisions) in your original schedules. Ultimately this isn't so much a management problem as a programmer problem.
If, however, you've inherited a mess, then you will have to explain to the boss that the last set of programmers, in a rush to get the project out the door, cut corners and that it's been limping along. You can band-aid the problem for a while (as they likely did), but each band-aid just delays the problem and ultimately makes it that much more expensive to fix.
Be honest with your boss and understand that a project isn't done until it's done.
Speak in a language he can understand.
Refactoring is paying design debt.
Ask your boss why he pays the company credit card bill every month vs not paying it until there is a collections notice. Tell him refactoring is like making your monthly payment.
Just do it and schedule it into your normal process. Estimate refactoring time into starting a new change or into finishing a change (ideal).
I always refactor while I'm initially exploring new code (extracting methods, etc).
Lie. Tell him it's research into a new technology. Then tell him you decided the cost didn't justify the benefits. He'll think you did a great job.
lol @ the people downmodding / marking this offensive.
Really, if it's a penny-pinching boss who doesn't understand good software from cheap software, what he doesn't know will ultimately make him happier. If it were me, I would leave the company and go someplace where they respect their developers' ability to write good code. But then again, this is why I'm in a senior position.
Tell him 80% of the cost of a software project comes in the maintenance phase of the lifecycle. Any refactoring done now to alleviate future problems (and have some examples ready) will net substantial cost benefits later, when the need arises to maintain that code.
This is assuming you are refactoring for a reason and not for programmer vanity.
Refactoring is something you should do all the time... so you shouldn't have to justify it.
Cleaning up big messes / redesign may include refactoring in order to get things under control; however, that's not "refactoring".
Refactoring should be a matter of moments... or, if you have no tool support, minutes.
In one of Robert Glass's recent books (I'll have to look up the reference) he mentioned a study on the cost of well-maintained code. What it found is that well-maintained code was edited more often than poorly maintained code. That sounds counter-intuitive, but when they dug deeper they discovered the reason:
Well-maintained code has more features added to it in the same time frame than poorly maintained code.
Does your boss like features? Sure, they all do. The more you improve the maintainability of the code, the more features you will be able to deliver with that limited budget.
I like the answer given in "Refactoring" by Martin Fowler. Tell your boss that you are going to develop software the fastest way that you know how. It happens that in most cases the fastest way to develop software is to refactor as you go.
The other thing to tell your boss is you are reducing the cost to make future improvements.
Less money now for me to refactor...
or more money later to fix whatever goes wrong and for me to refactor.
Sometimes it's just time to get a new job. There are certain people who just want you to "get it done". If you are ever in one of those situations, and I've been there, then just leave.
But yeah, all that other stuff about future costs and such is a good idea. I just think that most bosses lie to themselves because they want what they want when they want it, and they are just not able to see what's going to happen in the future.
So, good luck with your boss. Hopefully he or she is reasonable.
Don't... just go get a new job in a place that's more in sync with you.
I think you should just start working on it without telling your boss. This is truly how I've done my best work: I just don't tell my boss what I'm doing and slowly replace bad/legacy code when I have time.
It has actually saved my ass on more than one occasion.
If your boss doesn't understand the need to refactor or clean up code, then you have to wonder if he has enough engineering knowledge to be an engineering manager.
It's rare to find a boss who will give you time to refactor...just do it as you go along.
In my opinion, the simplest case to make for refactoring is fixing overly complex code. Measure the McCabe cyclomatic complexity of the source code in question (Source Monitor is an excellent tool for this). Source code with high cyclomatic complexity correlates strongly with defects and bad fixes. In simple terms, complex code is harder to fix and more likely to be fixed badly. To a manager, this means the quality of the product will likely be worse, the bugs harder to fix, and the schedule for the project ultimately worse. By refactoring out the complexity, you improve the transparency of the code, reduce the likelihood of obscure and difficult bugs, and make it easier to maintain (e.g. a maintenance programmer can take on a larger maintenance scope because of it).
Additionally, you can make the case (if it isn't a dead product in its maintenance cycle) that decreasing complexity makes the application easier to extend when new requirements are added.
The boss has to trust the dev to make correct technical decisions (including when to refactor).
Establish that trust or replace the boss or replace the dev.
Another good analogy is the maintenance of a tidy building site. The only catch here is that a programmer does not represent a construction worker, and a manager does not represent a foreman. If that were the case, his counter of "do it right first time" would still apply, since a competent and conscientious construction worker is responsible for maintaining good order on their workspace as they go.
Really the code itself represents the labourers, and the development process is the foreman. The mess is generated by various trades going about their business around one another (i.e. by different code features interacting, where each feature does its job well, but the seams between them are disorganised) and is cleaned up by the foreman taking a firm hand and keeping an eye on where disorder is setting in, and acting to get it cleaned up (i.e. the software process demanding refactoring).
What I did recently was explain to my business counterpart that the refactoring process helps us develop new features faster and decreases the probability of new bugs, because the code gains a better order and structure; it is even possible to make some speed improvements, because you can inspect the code more easily than before.
When the business guys get that, if they are smart, they will encourage you to do constant refactoring.
You can explain it with a building metaphor: if you don't refactor, you will end up with a crappy building with a bad core, so you will have problems with the pipes, windows, and doors.