I am currently trying to make the transition from a technical PM to a Developer.
Obviously this depends very much on one's current level of knowledge and experience, but are there some key things that a PM (who also codes regularly) might have missed by not working strictly as a developer?
Also, would a course like this help me head in the right direction?
http://www3.imperial.ac.uk/computing/teaching/postgraduate/msc-computing-science/description
Considering that I ideally want to work on audio/video/3D, I feel this course could be a good leg up.
As a technical PM you have the advantage of knowing the terminology etc., so that is at least a head start. As for making the switch, check out information on areas such as:
computing fundamentals - low-level concepts of computer hardware, networks and protocols.
algorithms - for an understanding of sorting, graphs, networks, trees, etc. (see the sketch after this list).
architecture and design - web application architecture, messaging architecture, UML, use cases, documentation.
programming languages - OO, scripting and AI (at least to get a feel for the types and applications).
business end of programming - software estimation.
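As promised above, here is a small self-contained sketch of the kind of thing the "algorithms" item covers: breadth-first search over a graph. The graph and names are my own toy example, not from any particular course; it's the sort of fundamentals exercise CS courses and interviews lean on.

```python
from collections import deque

def bfs_path(graph, start, goal):
    """Shortest path by edge count in an unweighted graph, or None."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for neighbour in graph.get(node, []):
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append(path + [neighbour])
    return None

# Toy graph as an adjacency list.
graph = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": ["e"]}
print(bfs_path(graph, "a", "e"))  # ['a', 'b', 'd', 'e']
```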
This is a broad spectrum of areas that you would need at least some exposure to for the transition. In fact, it might even be useful if your current employer allowed you to work as a developer on a small part of a project. Coming from the technical PM role, you'd certainly gain respect from the developers on a project, and you could even enlighten them.
If you have a passion for working in an area, seriously consider the amount of creative freedom, in your experience, developers have as compared to PMs. Make sure that's acceptable to you.
Nothing is worse than having passion in an area, but little or no influence.
As far as technical abilities go, the only thing to do is to code. Any classes primarily will act as ways to ensure that you do so, and do so in ways that will teach you. But at the end of the day, it's going to boil down to time spent writing software.
If you really want to become a great developer, learn at least one language radically different from the languages you know. If you're a Java/C++/C# kind of guy, learn something that will really torque your brain like Haskell, Erlang, or Scheme. To just learn really good OO techniques, learn, read, and write some Smalltalk.
The best thing to do is to spend ten years or so programming during every waking moment. That's what worked for me!
First of all, start practicing typing all day! Then get ready to work on the minute details a developer works on every day: code shortcuts, coding styles, commenting, etc.
Have any commercial video games ever used Prolog? With its rules-based logic model, it seems like it would have some place in the industry.
PS: as odd as this question is, it still meets all the criteria for a question on SO.
Not a commercial game, but I was in a game jam just this last weekend, and we wrote the entire game (a small MMO) in Prolog. It's probably only a fantasy, but we discussed expanding the game into a game engine. That game engine would be rule based.
I guess I should add that I've worked on Prolog systems that were near real-time.
I work in the game industry and I doubt it very much. I have seen only one guy use Prolog, and that was for a build-bot rule to auto-merge git branches into subproducts and across versions, and not at a game company.
That said, it could make sense for some fuzzy AI, but everything related to AI in the business is, in practice, far from the research papers. Real game developers and producers hate unpredictability, basically for business reasons; today's games are merely interactive movies.
Everything is on rails, scripted and controlled. Artists are very uncomfortable with algorithmic rules, and game designers are artists. In my programmer's opinion, games with sophisticated AI must have benefited from a high-ranking programmer in the company pushing for it.
Or the game really required it, for example Hitman. However, if you watch some of their talks (they have presentations at GDC, CEDEC...), they say most of their work is empirical, and by that I tend to think they mean typical imperative programming.
Thirdly, you also get the problem of maintenance, and of people knowing the language, who are... few. Most computer science graduates will have heard of it and followed some tutorial at school/university, but quickly forgotten about it later. And you see, in game companies a good percentage of programmers are self-taught, some even dropouts! That leaves little room for Prolog, I can tell you that.
Lastly, you need to think about a technical point: performance. Prolog's underlying execution machine is something of a danger to real time, because its solver backtracks over a tree of rule branches, pruning with heuristics, and can run for God knows how long. Most games make scant use of multithreading because of platform limitations, or because of synchronization problems with the game data, which has to be in sync on a per-frame basis for lots of things.
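To make that frame-budget concern concrete, here is a minimal Python sketch of my own (not from any real engine): a depth-first backtracking search, the kind of computation a Prolog-style engine performs, cut off by a per-frame deadline so a pathological rule set cannot stall the frame. All names and numbers are illustrative assumptions.

```python
import time

FRAME_BUDGET_S = 0.004  # assume 4 ms of a 16 ms frame reserved for AI

def search(state, goal, expand, deadline):
    """Depth-first backtracking; returns a path, None, or 'TIMEOUT'."""
    if time.monotonic() > deadline:
        return "TIMEOUT"          # out of budget: give up for this frame
    if state == goal:
        return [state]
    for nxt in expand(state):
        result = search(nxt, goal, expand, deadline)
        if result == "TIMEOUT":
            return "TIMEOUT"
        if result is not None:
            return [state] + result
    return None                   # branch exhausted: backtrack

# Toy rule base: each state expands to a couple of successors.
expand = lambda n: [n + 1, n * 2] if n < 100 else []
deadline = time.monotonic() + FRAME_BUDGET_S
print(search(1, 7, expand, deadline))  # [1, 2, 3, 4, 5, 6, 7] if in budget
```

A general resolution engine gives no such guarantee out of the box, which is exactly the risk the answer describes.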
I'm in my first development job out of college and have been handed a (solo) project that is completely outside the range of my skills/experience both in terms of the technologies being used and the sheer scope of the thing.
I've spent the last 6 months or so basically retraining myself from scratch and then starting on the work, and although I did very well at college and think I'm on track for delivery, I've had zero feedback on what I've been doing, and I'm suddenly starting to feel very much out of my depth.
My direct supervisor, while a nice guy and I think a competent coder, doesn't have the best communication skills and basically told me to "read a book" when I asked him for a bit of guidance, which is not really what I was hoping for!
Am I just being unrealistic about the amount of support I can expect as a junior developer? It seems to me that ignoring the issue and ploughing ahead runs the risk of a failed project, which is to no-one's benefit. I could take my request for guidance a step higher to the head of development, but I don't want to sound like I'm saying I can't do the job, nor do I want to make my supervisor look bad.
Can anyone suggest a good approach for saying "help!" without making myself or my supervisor look bad?
This is a great question, and I think a fairly common situation. Basically, I think what you're asking for is guidance on how to communicate with your boss, and the other people in your organization.
This might be a good time to look into the scrum framework, and take from it what seems applicable to your environment.
In particular, you mention that you might be in over your head. Or, there is an (implicit) expectation that you'll need to finish this project "tomorrow," when you really don't know how long it will take.
I suggest starting with a list. Write down everything you need to do. Include non-coding activities, like "research technology X for doing Y," and give each task a basic time estimate like "1" for short, "2" for medium, "3" for long. Then put the things in an order that you think makes sense.
Then meet with your boss, once a week, for like 20 minutes, to discuss what you did, and what you're going to do next week. Out of this discussion, you'll both see what's going on, and adjust expectations (and the list) accordingly. When conflicts of expectation come up, talk it out.
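As a concrete (made-up) illustration of that list-plus-estimates idea, the tracking can be as simple as a few lines of Python; the tasks, sizes, and weekly capacity below are invented for the example.

```python
# Backlog for the weekly check-in: (task, size),
# where size is 1 = short, 2 = medium, 3 = long, in priority order.
backlog = [
    ("research technology X for doing Y", 2),
    ("prototype the data import", 3),
    ("write integration tests", 1),
]

def plan_week(backlog, capacity=4):
    """Take tasks from the top of the list until capacity is used up."""
    planned, used = [], 0
    for task, size in backlog:
        if used + size > capacity:
            break
        planned.append(task)
        used += size
    return planned

print("This week:", plan_week(backlog))
```

Even something this crude gives the weekly meeting a shared artifact to adjust, which is the real point.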
Regarding the amount of support to expect as a junior developer, this really depends on your organization, and your supervisor's opinion. As software engineering is still a relatively young profession, there isn't much in the way of industry-standard mentoring programs.
I suggest trying the list + meeting thing for a couple months, and observe how your opinion of the support situation changes. Then, go to a large conference as soon as possible; spend the money if you need to. You'll see who is struggling with similar situations, and also who is not, and you'll create your own, more-informed model of "how the industry is supposed to work."
Regarding a good approach to communicating, I (seriously) suggest The Seven Principles for Making Marriage Work by John Gottman, which has a lot of examples of what works and doesn't work when communicating with people.
In a big corp, they often ask developers to fill in a matrix of what skills they have, at what level. It's generally seen as a bit of a pain, but is it actually useful, or is it just another way for bureaucrats to try and reduce developers to a bunch of numbers on a spreadsheet?
Skills matrices are only partially helpful; they are good at giving you a general picture of your current "experience".
However, these matrices do not include the most important aspect: the ability to learn.
That is the most important skill in IT, in my view, and everyone learns at different speeds.
E.g. throw someone into a new technology stack: how long before he or she is productive?
Since IT/software development is a very wide field, I regard skill sheets as quite useful. I used to be a Linux expert and my skill sheet reflected that. Then I shifted into iOS/Mac development, and my now-employer asked me to fill out a skill sheet tuned to Mac... and I immediately noticed that I was a novice in that field back then ;-) Vice versa, they were able to see whether I could fit into the company, and where (in which team).
So of course they can be harmful if you lack the skills, but I think they make choices for employers easier (and I regard a big skill sheet in my CV as the most important part of the CV, even more so than the list of projects done).
The usefulness is totally dependent on what is being assessed. I work in an insurance company and this was done for all staff here. There was no category that I fit into and all the criteria were irrelevant.
I can see the benefit of assessing relevant criteria, it can identify weaknesses and target training, but those criteria need to be defined by someone who knows what you might not know.
Most of all, don't berate the bureaucrat for simplifying a complex object into a manageable set of information. As a programmer, that's what you should be doing every day.
I think it is appropriate in big corporations, but for small and specialized consultancies I would do a personal interview.
In big corporations, if you don't fit in one place you may fit in another... in small teams I'd rather do a personal assessment.
Is there a tool or a formula for calculating the man-hours required for a certain project? Either by specifying the details or, even better, by feeding in the sources and having it calculate a measure of how many man-hours were put into the project.
Edit:
I often hear about big projects, with components built in parallel by numerous groups, that took a couple of thousand man-hours to complete but were finished in just x days, probably as an argument for the teams' efficiency. So I think it should be possible to at least estimate these measures. I am convinced that effort has gone into making such estimates automatic, and even though they might not reflect the actual time invested in the project, I'd at least like to know what the "state of the art" is in this kind of endeavour.
There is a whole science to this called Function Point Analysis.
Read through this introductory article.
Or try the Wikipedia article for more references and external links to follow up.
This technique is based on looking at the functions which are to be implemented in the software, and assigning a point count to them. Then you plan on how many points can be achieved per day to figure out a schedule.
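As a rough illustration of that arithmetic, here is a tiny sketch of my own; the point values and daily velocity are invented, and real Function Point Analysis derives the counts from calibrated weights (inputs, outputs, files, and so on).

```python
# Hypothetical function-point tally: (function, points).
functions = [
    ("customer search", 4),
    ("invoice entry", 6),
    ("monthly report", 7),
]

total_points = sum(points for _, points in functions)
points_per_day = 2.5  # assumed velocity, measured from past projects

schedule_days = total_points / points_per_day
print(f"{total_points} points at {points_per_day}/day "
      f"= {schedule_days:.1f} days")  # 17 points -> 6.8 days
```

The hard part, of course, is calibrating the weights and the velocity, which is what the literature linked above is about.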
There are also techniques that lean more on psychology. These involve asking people for best, worst, and most-likely estimates of each individual task in a project, cutting those estimates in half, and padding the end of the project with an unspecified buffer that can be used for late-running tasks, but only if needed. This works by giving the developers a short timetable for results while promising management/customers a longer one. It's called Critical Chain Project Management and has been used with success in defense projects; a rough sketch of the arithmetic follows the links below.
Introduction to Critical Chain
Wikipedia article
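To make the cut-in-half-plus-buffer idea concrete, here is a small sketch of my own. The PERT-style (best + 4×likely + worst)/6 weighting is one common way to blend three-point estimates, not something spelled out in the links above, and every number below is invented.

```python
# Hypothetical task estimates in days: (task, best, likely, worst).
tasks = [
    ("design", 2, 4, 8),
    ("implementation", 5, 10, 20),
    ("testing", 3, 6, 12),
]

# Blend each three-point estimate, schedule the task at half that
# figure (an aggressive target), and pool the removed safety into
# one shared buffer at the end of the project.
expected = [(b + 4 * m + w) / 6 for _, b, m, w in tasks]
aggressive = [e / 2 for e in expected]
buffer = sum(expected) - sum(aggressive)

print(f"Aggressive schedule: {sum(aggressive):.1f} days")  # ~10.8
print(f"Shared project buffer: {buffer:.1f} days")         # ~10.8
print(f"Promised externally: {sum(aggressive) + buffer:.1f} days")
```

Developers work to the aggressive dates, and only the project as a whole consumes the buffer; that pooling is what distinguishes CCPM from padding every task individually.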
Estimating the man-hours for a new project is more about experience than formulae. When I started programming there was a notional "10 lines per hour of debugged code" that was the yardstick, but that figure varies hugely with the type of code, the language, and the experience/skill of the software engineer.
I suggest searching the internet for articles on estimating software development projects; one example is "How to estimate a software project in man-hours?". It's by no means definitive, but it does demonstrate the complexities involved.
As for looking at some code and estimating, you might as well stick a wet finger in the air and guess. Only the programmers involved would know and even then I suspect the answer wouldn't be accurate. At the end of the day it's an estimate, not a quote or a fact, and as such often open to wild variations.
Unfortunately, in general the answer is no: there is no ready-to-use formula to calculate the man-hours for a software project.
However, software project estimation is a huge problem, and there are a lot of ways to deal with it.
Many solutions are described in Steve McConnell's book Software Estimation: Demystifying the Black Art.
Steve's company also offers some resources and tools (some of them free) that help with estimating software projects.
We have a number of functional deliverables planned for 2010, but we also have a technology agenda (architectural refactorings, consolidation, upgrading a platform). Any suggestions on the best way to include these in a roadmap to help the business understand why they are important?
One option is just saying "trust us, this is the right thing to do to keep everything healthy", but I would like some better visualization if possible.
Being a bit cynical about it, I would say phrase everything in terms of money. If you can't re-write your technical agenda in terms of money made or money saved, then why are you doing it at all?
Also, there is an article on "technical debt in financial terms" that I found very useful at:
http://forums.construx.com/blogs/stevemcc/archive/2007/11/01/technical-debt-2.aspx
One of the more interesting points, to me, is "One of the important implications of technical debt is that it must be serviced, i.e., once you incur a debt there will be interest charges."
There is a brief follow up at
http://forums.construx.com/blogs/stevemcc/archive/2007/12/12/technical-debt-decision-making.aspx
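As a made-up illustration of that interest metaphor (the numbers are mine, not McConnell's), a quick-and-dirty shortcut can be framed like a loan:

```python
# A shortcut saves 10 days now (the principal) but adds 1 extra day
# of maintenance per month (the interest). Numbers are illustrative.
principal_saved_days = 10
interest_days_per_month = 1

for month in (6, 12, 24):
    interest_paid = interest_days_per_month * month
    net = principal_saved_days - interest_paid
    print(f"after {month:2d} months: net saving {net:+d} days")
# after  6 months: +4 days; after 12 months: -2; after 24: -14.
# The debt only pays off if it is repaid (the code cleaned up)
# before the interest outweighs the original saving.
```

Framed this way, "service the debt or pay it off" becomes a money argument the business can weigh like any other.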
Show how support time and the number of problems will go up (and the time between failures will go down) if you don't make the change.
Each technology has its time limits: end of support from the manufacturer, and a regular life cycle.
Example: if you use MFC, you can show that programming a simple task in MFC is 3 times slower than in WinForms, so after x months the benefit of not upgrading will be lost (a break-even sketch follows below).
With equipment it is even easier, as the older the equipment gets, the more malfunctions there are, and that is easy to show (usually after the 3-year coverage ends everything starts to break, and I think it's planned that way; it didn't use to be, but these days it is).
With infrastructure, again: if you run Oracle 7.6, show how much more time (money) you spend on administration and how much less would be spent on 11g.
Etc., etc.
Eventually managers want to see ROI... TCO... and so on, so you need to give them that.
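As promised above, here is how the MFC-versus-WinForms break-even argument can be put into numbers. This is my own sketch; the one-off upgrade cost, task rate, and per-task costs are invented placeholders to be replaced with your own measurements.

```python
# Hypothetical break-even calculation for an MFC -> WinForms upgrade.
upgrade_cost_days = 40   # one-off cost of porting and retraining
tasks_per_month = 5      # typical feature tasks per month
task_cost_old = 3.0      # days per task on the old stack
task_cost_new = 1.0      # days per task after the upgrade (3x faster)

saving_per_month = tasks_per_month * (task_cost_old - task_cost_new)
break_even_months = upgrade_cost_days / saving_per_month
print(f"Upgrade pays for itself after {break_even_months:.1f} months")
# 40 / (5 * 2.0) = 4.0 months; every month after that is pure saving.
```

A one-line result like "pays for itself in 4 months" is exactly the ROI framing the answer says managers want.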