When is memory profiling required? - performance

Is it required to do memory/performance profiling on all applications? If not, when should it be done?

Profile an application when performance does not meet requirements, or when there's a good business reason to focus on optimizing performance (in our case we run dozens of web servers, so every so often we dedicate part of a sprint to performance tuning because there's a real impact to our operations budget by doing so).
It can also be worth profiling applications now and then, even if these conditions are not met, to help you understand what matters and what does not for performance. Usually, though, there is a large backlog of software to write, so I would not spend too much time profiling without a specific business reason. It's easy to spend a LOT of time optimizing things because it's an interesting engineering challenge rather than because it makes the software genuinely better for its users.
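When you do decide to profile, a quick pass with an off-the-shelf profiler is usually enough to show whether anything is worth optimizing at all. As a hedged illustration (Python's standard-library cProfile here, not tied to any particular stack in this thread; handle_request is a made-up stand-in for real application code):

    import cProfile
    import io
    import pstats

    def handle_request(n=20000):
        # Hypothetical workload standing in for real application code.
        total = 0
        for i in range(n):
            total += sum(str(i).count(d) for d in "0123456789")
        return total

    profiler = cProfile.Profile()
    profiler.enable()
    handle_request()
    profiler.disable()

    # Print the ten most expensive calls by cumulative time.
    stream = io.StringIO()
    pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(10)
    print(stream.getvalue())

Run against a real code path, a report like this usually makes it obvious whether the time is in your own code or in I/O and framework layers you cannot change.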

Related

How to use miniprofiler to help us crush loading speeds?

As a developer and constant user of MiniProfiler, I use Stack Overflow as the benchmark for my .NET sites. That is because the entire Stack network is just blazingly fast.
I know MiniProfiler is used on Stack Exchange. There is a whole developer view that can be enabled on the Stack sites, but can we enable the stats to see how fast it really is?
I might be a bit over-obsessive here - but I am looking to improve performance in milliseconds, and the only viable benchmark is a large and complex site like Stack Exchange.
I know it might be a security issue to see live data but I just really want a benchmark (screenshot / guidelines) to see how far I can optimize my .NET MVC web application.
My actual IIS and MVC performance is fantastic and I think I am more concerned about server replies and client side stuff. So can I (and should I) put more effort into smashing down this response time?
This site is hosted in an Azure Cloudapp and uses an Azure DB - I know about 60~180 ms goes to connection times that are out of my control.
How can I improve times between Paint, Load and Complete?
I find that I answer my own questions on Stack Exchange more often nowadays. Not sure what that means. But here is something interesting I found while dealing with other Q&As (and it answered this question):
Yes, you should avoid the obvious beginner mistakes of string
concatenation, the stuff every programmer learns their first year on
the job. But after that, you should be more worried about the
maintainability and readability of your code than its performance. And
that is perhaps the most tragic thing about letting yourself get
sucked into micro-optimization theater -- it distracts you from your
real goal: writing better code.
Posted by Jeff Atwood
There is no real performance problem or serious delays. It's just an obsession that won't lead to much satisfaction.
The dude's got a point. As long as my code is readable and it runs fast - what the heck more do I want?
PERFECTION! - Waste of time, lol#me!

Tackling Scalability

I have recently made the shift to the Yii Framework. It really is good to be working with a framework; it takes care of a lot of mundane work. I believe the framework will give me the platform to build a great site, but a senior developer keeps asking me these questions:
How many concurrent users can the site handle?
Is the site scalable?
What is the performance level of the website?
If you were to answer those questions about your project, how would you do it?
And how would you tackle scalability issues once and for all?
P.S. Any references I could read up on would be greatly welcome.
Scalability is not a hole that one can plug. It is a very broad and generic topic in itself. One of the best approaches I've seen is that of YouTube:
while (true) {
    identify_and_fix_bottlenecks();
    drink();
    sleep();
    notice_new_bottleneck();
}
Having said that, the database is usually the bottleneck in most web applications, and the choice of web framework doesn't matter much. Things like the number of concurrent users and performance levels will be sufficiently high for most frameworks.
While this answer is quite late, I hope it helps you in your future projects.
You should not look at scalability as a band-aid or a one-time fix.
As the usage of your application changes, your scalability requirements will change and evolve. Also, there is no silver bullet for addressing scalability. It is a mix of various approaches such as caching, replication, distribution, performance tuning, hardware upgrades, etc. You should choose from those based on the context of "what you want to scale" and where you will get the maximum bang for the buck!
Check out this link, which has some good information about scalability and how not to fall into the traps of "sought after" scalability mantras:
http://sevenoct.wordpress.com/2013/07/04/scaling-applications/
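Of the approaches listed above, caching is often the cheapest first step when the database is the bottleneck. As a minimal, hedged sketch (Python purely for illustration, not tied to Yii or any framework in this thread; fetch_user_from_db and the 60-second TTL are made up for the example), here is a read-through cache with a time-to-live:

    import time

    _cache = {}              # key -> (expires_at, value)
    CACHE_TTL_SECONDS = 60   # assumed TTL; tune for your data's freshness needs

    def fetch_user_from_db(user_id):
        # Hypothetical expensive query standing in for a real database call.
        time.sleep(0.05)
        return {"id": user_id, "name": f"user-{user_id}"}

    def get_user(user_id):
        """Read-through cache: serve from memory if fresh, otherwise hit the DB."""
        now = time.time()
        entry = _cache.get(user_id)
        if entry and entry[0] > now:
            return entry[1]
        value = fetch_user_from_db(user_id)
        _cache[user_id] = (now + CACHE_TTL_SECONDS, value)
        return value

In a real deployment the in-process dictionary would usually be replaced by a shared store such as Memcached or Redis so that every web server sees the same cache.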

Redgate Visual Studio add-in

I realise that this may be subjective (and would appreciate not being voted down on this one XD), but I would like some advice from other developers out there who have used RedGate's .NET productivity add-ins - ANTS Performance Profiler Pro, ANTS Memory Profiler, and Exception Hunter. It's quite pricey, and basically, does anyone recommend it? And do the ANTS products do what they say they can (respectively)?:
Identify bottlenecks and ensure code is performing optimally
Zero in fast on common causes of memory leaks
Anticipating your input on this. Many thanks!
I have evaluated the ANTS Performance Profiler, and it's a great tool in my opinion, well worth the price. If you ever discover (and solve) a single annoying performance blocker with its help, it has more than paid for itself - at least for professional devs (it's rather pricey for single home/hobby devs, I agree).
I have both the RedGate performance and memory profilers, and both are good. I used the trial of Exception Hunter when it first came out, but didn't see a need for it so I don't have a licence for that.
ANTS Performance Profiler - this is very good and I have used it many times to identify bottlenecks in code. The user interface is intuitive and easily shows slow/inefficient areas to focus on.
ANTS Memory Profiler - I've had less success with this as I find it harder to use. I also have a licence for the SciTech Memory Profiler, which I find a better tool for memory profiling, allowing you to see more detailed information and drill down into it more easily.
My biggest niggle with the RedGate tools (and this applies to all of their tools) is that they do not work through authenticating proxies and there is no way to configure them to (this doesn't stop them from running though).
If cost is an issue, Eqatec make a free performance profiler. I've never used it though, so cannot comment on how good it is.
If you are looking to solve a specific memory/performance issue, the cost of these tools will pay for itself in saved time. If you are just curious about your application then it would be a harder cost to justify.
Good tools cost more money than lousy ones. From everything I've heard, seen and personally observed, RedGate produces good tools. Using lousy tools takes more of your time. How much that time is worth to you or your employer is something we cannot judge from the information you provided. In the Western world, a good tool pays for itself in only a few hours. That's an ROI that's hard to beat.
Do make sure you adjust that ROI by the amount of time you'll need to learn how to use the tool. You'll get a quick insight into that from spending an hour on the trial version.

An upgradable approach to designing a web application system

Many people have online startups in their head that may potentially attract millions, but most of the time you will only have a minimal budget (time and resources) to start with, so you want to have it delivered within a year's time. Shortly after launch, you are bound to perform one or a series of upgrades that may include: refactoring code onto a newer foundation, adding hierarchies to the software architecture, or restructuring databases. This cycle of upgrades/refactoring continues as:
New features become available in the latest version of the language(s)/framework(s) you use.
Availability of new components/frameworks/plugins that may potentially improve the product.
Requirements change direction; the existing product wasn't designed to cope with the new needs.
With the above as a prerequisite, I want to take this discussion seriously and identify the essence of an upgradable solution for a web application. In the discussion you may talk about any stage of development (initial, early upgrade, incremental upgrades) and cover one or more of the following:
Choice of language(s) for a web application.
Decision for using a framework or not? (Consider the overhead)
Choice of DBMS and its design
Choice of hardware(s) and setups?
Strategy for constant changes in requirements (which can be natural for a web application)
Strategy/decision toward total redesign
Our company's web solution is on its fourth major generation, having evolved considerably over the past 8 years. The most recent generation introduced a broad variety of constructs to help with exactly this task as it was becoming unwieldy to update the previous generation based on new customer demands. Thus, I spent quite a bit of time in 2009 thinking about exactly this problem.
The single most valuable thing you can do is to employ an Agile approach to building software. In particular, you should maintain an environment in which a new build can be (and is) created daily. While daily builds are only one aspect of Agile, this is the practice that is most important in addressing your question. While this isn't the same thing as upgradeability, per se, it nonetheless introduces a discipline into the process that helps reduce the chance that your code base will become unwieldy (or that you'll become an Architect Astronaut).
As far as frameworks and languages go, there are two primary requirements: that the framework be long-lived and stable and that the environment support a Separation of Concerns. ASP.NET has worked well for me in this regard: it has evolved in a rational manner and without discontinuities that invalidate older code. I use a separate Business Logic Layer to manage SoC but ASP.NET does now support MVC development as well. In contrast, I came to dislike PHP after a few months working with it because it just seemed to encourage messy practices that would endanger future upgrades.
With respect to DBMS selection, any modern RDBMS (SQL Server, MySQL, Oracle) would serve you well. Here is the key, though: you will need to maintain DDL scripts for managing upgrades. It is just a fact of life. So, how do you make this a tractable process? The single most valuable tool from any third-party developer is my copy of SQL Compare from Red Gate. This process used to be a complete nightmare and a significant drag on my ability to evolve my code until I found this tool. So, the generic recommendation is to use a database for which a tool exists to compare database structures. SQL Server is just very fortunate in this regard.
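To make the "maintain DDL scripts" advice concrete, here is a hedged sketch of the core idea behind most migration runners: apply numbered SQL files in order and record which ones have already run. The migrations/ directory, the schema_migrations table, and the use of SQLite (only so the sketch is self-contained) are assumptions for illustration; SQL Compare and framework migration tools do this, and much more, for you.

    import sqlite3
    from pathlib import Path

    def apply_migrations(db_path="app.db", migrations_dir="migrations"):
        """Apply numbered .sql files (001_create_users.sql, ...) exactly once each."""
        conn = sqlite3.connect(db_path)
        conn.execute("CREATE TABLE IF NOT EXISTS schema_migrations (name TEXT PRIMARY KEY)")
        applied = {row[0] for row in conn.execute("SELECT name FROM schema_migrations")}

        for script in sorted(Path(migrations_dir).glob("*.sql")):
            if script.name in applied:
                continue
            conn.executescript(script.read_text())   # run the DDL in the file
            conn.execute("INSERT INTO schema_migrations (name) VALUES (?)", (script.name,))
            conn.commit()
        conn.close()

The point is the bookkeeping, not the particular database: every upgrade path is a script checked into source control, and the runner knows which scripts each environment has already seen.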
Hardware is almost a don't-care. You can always move to new hardware as long as your development process includes a reasonable release build process.
Strategy for constant changes in requirements: again, see Agile. I'd encourage you not to even think of them as "requirements" any more - in the traditional sense of a large document filled with specifications. Agile changes that in important ways. I don't keep a requirements document either, except when working on contract for an external, paying customer, so that I can be assured of appropriate billing and prevent feature creep. At this point, our internal process is so rapid and fluid that the reports from our feature request/bug management software (FogBugz if you want to know) serve as our documentation when documenting a new release for marketing.
The strategy/decision for total redesign is: don't. If you put a reasonable degree of thought into the process you'll be using, choose mainstream tools, and enforce a Separation of Concerns then nothing short of a complete abandonment of HTTP and RDBMSs should cause a total redesign.
If you are Agile enough that anything can change, you are unlikely to ever be in a position where everything must change.
To get the ball rolling, I'd have thought a language/framework that supports the concept of dependency injection (or Inversion of Control, as it seems to be called these days) would be high on the list.
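For anyone unfamiliar with the term, dependency injection just means handing an object its collaborators from the outside instead of letting it construct them itself, which is what makes swapping implementations during an upgrade cheap. A minimal sketch follows (Python, with invented names, purely for illustration):

    class SmtpMailer:
        def send(self, to, body):
            print(f"SMTP mail to {to}: {body}")

    class FakeMailer:
        """Test double; records messages instead of sending them."""
        def __init__(self):
            self.sent = []
        def send(self, to, body):
            self.sent.append((to, body))

    class SignupService:
        def __init__(self, mailer):
            # The mailer is injected; SignupService never constructs one itself.
            self.mailer = mailer

        def register(self, email):
            self.mailer.send(email, "Welcome aboard!")

    # Production wiring vs. test wiring differ only at the composition point.
    SignupService(SmtpMailer()).register("user@example.com")
    SignupService(FakeMailer()).register("user@example.com")

Because the wiring lives in one place, replacing a component later touches the composition point rather than every class that uses it.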
You will find that RDBMS technology is not easily scalable. All vendors will tell you otherwise, yet when you try multiple servers and load balancing, the inherent limitations show up. Everything else can be beefed up with "bigger iron" and perhaps more efficient code, but databases cannot be split and distributed easily.
Web applications will hopefully drive the innovation in database technologies and help us break out of the archaic Relational Model mind-set. It is long overdue.
I recommend paying a lot of attention to this weak link right from the start.

The pros and cons of "Shadow IT" in software development [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Recently we’ve seen the emergence of so-called “Shadow IT” within many organisations. If you’re not already familiar with the term, it refers to those who manage to dodge the usual IT governance by means such as using thumb drives to share files or “unapproved” software products to achieve business tasks. Shadow IT can emerge from within technology groups but in many cases is sourced from non-tech areas such as the marketing or sales department.
What I’m really interested in is examples you have of Shadow IT within software development. Products like Excel and Access are often the culprits as their commonality means they’re easily accessible to the broader organisation. In many cases this is driven by someone who has just enough knowledge to make the software perform a business function but not quite enough to be aware of all the usual considerations required when building software for an enterprise.
What sort of cases of Shadow IT have you witnessed in the software development space? What processes have you seen unofficially addressed by this practice and just how important have these tools become? An example would be the use of a single Access database on a folder share becoming common practice for tracking promotions across the marketing department. Remember this cuts both ways; it can be extremely risky (lack of security, disaster recovery, etc) but it can result in innovation from a totally unexpected source.
Why does IT assume they should own and control all technology in the business?
The very fact that we have a name for technology that IT does not control (Shadow IT) suggests that we'd like IT to have control over all technology in an organization.
The only real reason I can think of for IT to have control is security (even then, I'd be very wary of trusting the most sensitive data to IT). Most other reasons given against business-user-developed solutions are completely false. Take the reasons above: "software produced may not be well designed...", "the software may not be well supported...". Who are we kidding here? IT's track record on these fronts is simply not good enough to claim the high ground.
Savvy business users solve their own information problems - they have been doing so since long before IT existed. Anyone remember triplicate forms? Fax machines? Photocopiers? These things didn't need IT departments to govern them and they worked very well. If IT cannot solve the problem, or IT's track record has been sufficiently poor that business users have lost faith in IT, then business users will solve their own problems, using whatever means are available to them. Access, Excel, and shared drives are frequently used very successfully by business users. If IT is to stay relevant to an organization, it needs to support its business users' needs and deliver technology that people actually want to use, not just technology people use because they have to.
I have seen an organization where a multimillion dollar portal implementation promised to solve many business technology and information sharing problems. Years later, still not in production, business users gave up, and in despair developed their own solutions by outsourcing the development of a data centric web application. Guess what? It worked brilliantly and other departments are now bypassing IT and doing the same, on their own departmental budgets.
IT is a support organization for business users. This may offend some who believe IT's place to be somewhere alongside executive management in terms of its importance to the business, but IT has to deliver what the business needs; otherwise it's just justifying its own existence.
The advantage is that users get exactly what they want and need, when they want and need it. Getting a request through a largish IT shop is a trying experience for a user. IT rarely has the business knowledge to let them give the business owners exactly what they are asking for, and when requests are denied or requirements amended, an explanation in plain English (or whatever language) is rarely forthcoming.
The disadvantages outweigh the benefits. Societe Generale lost billions due in part to "Shadow IT". It can cause support nightmares when an Access application, for example, becomes essential and outgrows the capabilities of the person who created it, or that person leaves. Even a poorly written Crystal Report can become so popular and widely used that it starts to drag down the database it is accessing when reporting time comes around. And if the person who wrote that report did not fully understand relational databases, it could produce bad data in some situations; data that causes bad business decisions to be made. Using a commercial (outsourced) application guarantees that the users will not get exactly what they want; there will always be compromises, and no explanation of why they were made.
The previous poster was right. Shadow IT exists because IT does not do its job well enough. There is not enough business knowledge, not enough responsiveness, and especially not enough communication. These things are why "Shadow IT" exists. The business owners paid for the machines, the admins, the dbas, and the programmers. It frustrates them when IT loses sight of that.
At the end of the day, the primary driver for most businesses is results, i.e. making money. If the business sees that it can achieve the desired outputs necessary for the operation without spending thousands on software but through "shadow IT", then I can only see it being encouraged. I feel that it is part of our job as developers to point out the pitfalls of operating in this fashion.
The pros of "shadow IT" could be
cost - less expensive
whilst the people writing the software may not be software experts, they are likely to be domain experts and have an intrinsic knowledge of how a piece of software should function.
depending on how the IT is organized, "shadow IT" may be able to respond faster to changes and business needs than the core IT can.
And the cons
software produced may not be well designed to be extensible, handle errors correctly, and all the other aspects that come from experience in software development.
the software may not be well supported or, due to the way in which it has been produced, there may be no support at all.
Over time, the average person is becoming more IT savvy. Younger marketeers and finance people know that Excel and Access make them vastly more efficient. Working without them would make them feel handicapped.
I expect this trend to continue, with corporate IT becoming more of an enabling organization: one where you make data available, help users troubleshoot their workflow, and limit them to a specific compartment for security.
What was called software development 10 years ago, will be everyman's tool 10 years from now!
There is no such thing. There are dinosaurs, and there are people who need to get work done.
If something like 'Shadow IT' happens, it is because 'Official IT' is not doing its job.
Software developers have hundreds of little and not so little applications they need to get their work done. The IT governance organisation should learn how to handle tens of updates a day, and switch to releasing daily (and patching a few times a day). Development has learned how to do that, they are next.
Sometimes I use Amazon EC2 and/or RDS when my company's resources are not enough or would take too long to provision. I pay for this out of my own pocket but get to achieve my goals faster. All this without having to spend painful hours in meetings, trying to convince superiors or the SAs that I really do need to do some thing or other.
In my mind, EC2 is the ultimate shadow IT. It's super easy to get going and provides me with the ultimate control.
Well, I suppose these things are everywhere. Not a big deal if it doesn't threaten the company's operation in any way.
Ya, it's a big problem where I work. Architects and DBAs try to make a centralized system, but these little "Shadow IT" departments make small apps that have their own security or duplicated data... Personally, if I were the head of IT I would fire anyone who started such a project without IT support. Kinda harsh, but it's important to keep the system healthy.
Most software developers have "unapproved" software on their computers. Just expect it. I'm not sure how much I have, but I'm sure I have dozens, if not hundreds of utilities that corp. IT has never even heard of on my work laptop.

Resources