The point is that every request needs some processing. I want to know whether my resources are enough, whether I have to upgrade them, or whether I should test my code and optimize it.
My resources: 4 CPUs and 8 GB of RAM.
Any pointers or test tools would be appreciated.
If you are serving a small static page, you will be able to serve hundreds of thousands of requests per second on that hardware. If you have a very "heavy" page that requires lots of server-side processing, it might only support a few concurrent users.
I would recommend the following:
Get a load testing tool; for a comparison, see the Open Source Load Testing Tools: Which One Should You Use? article.
Set it up to replicate real users as closely as possible (headers, cookies, cache, AJAX requests, etc.); virtual users should follow the main application use cases.
Start with 1-2 virtual users and double-check that everything works as expected.
Gradually increase the load to the anticipated number of users and observe the system behaviour and the main metrics, such as response time and the number of transactions per second (a minimal sketch follows this list). If you are happy with the results, that's it. If not, identify the bottleneck (which is not necessarily your code) and fix it. Repeat. Use profiling tools to inspect your code and identify the parts that consume the most resources. Refactor. Repeat.
You can also consider increasing the load until your application starts breaking, to see what fails first, whether the system recovers when the load returns to normal, and so on.
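As a minimal sketch of the "gradually increase the load" step (a real tool such as JMeter, Gatling, or Locust gives far better realism and reporting), the following fires batches of concurrent requests at a hypothetical URL and prints rough response-time figures; the target URL and user counts are assumptions, not part of the original answer.

import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

URL = "http://example.com/"  # hypothetical target page

def hit(_):
    start = time.time()
    with urlopen(URL, timeout=10) as resp:
        resp.read()
    return time.time() - start

for users in (1, 2, 5, 10, 20):  # gradually increase concurrency
    with ThreadPoolExecutor(max_workers=users) as pool:
        timings = list(pool.map(hit, range(users * 5)))
    print("%3d users: avg %.3fs, max %.3fs over %d requests"
          % (users, sum(timings) / len(timings), max(timings), len(timings)))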
Let's say I have published this project containing a function:
int sum(int a, int b) {
return a + b;
}
I have written the tests:
assert(1==1);
assert(2==2);
All the tests pass and so I claim that my project is 100% good and tests have 100% coverage.
My client is smart and knows about fault injection. So he injects a fault in my project and makes it:
int sum(int a, int b) {
return -1;
}
My tests still pass and so he says my tests are useless. Great.
But is there any other use of software fault injection?
I read that it is used to assess the robustness of software. How? Can you use this example to show how that is possible?
Also, I was reading some papers that showed how faults can be injected into SOAP messages passed around on the web (basically meaning that messages being sent from one machine to another were injected with faults and the results were observed). How is this going to be useful? Obviously this will go wrong, so what can you conclude from it?
Please explain with simple examples.
With regard to injecting faults into SOAP messages (or other data sent over the wire): that could be viewed as a form of "stress testing", to determine how robust your application is in the face of network problems, data corruption, malicious clients, and so on.
As you said, "obviously this will go wrong". But the purpose of stress testing isn't to see if your application works as intended (under normal conditions). When you feed the application garbage data/massive volumes of data, deliberately corrupt config files, or suddenly disconnect hardware which is being used, etc., it's expected that your program will not "work". But you want to make sure it will not do something bad like crashing, destroying valuable data, revealing confidential data to unauthorized persons, and so on. If it is an application which services multiple users at the same time, you also want to make sure that illegal input from one user cannot result in a loss of service to other users.
If you have never stress tested your application, the results will probably surprise you!
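As a hedged sketch of the "inject faults into messages" idea discussed above: flip a few bytes in an XML payload and check that the receiving code fails in a controlled way instead of crashing or silently accepting garbage. The message format and the process_order handler are invented for illustration.

import random
import xml.etree.ElementTree as ET

def process_order(raw):
    """Stand-in service endpoint: parse the message or reject it cleanly."""
    try:
        root = ET.fromstring(raw)
        return "accepted: " + root.findtext("item", default="?")
    except ET.ParseError:
        return "rejected: malformed message"  # controlled failure, no crash, no data loss

good = b"<order><item>song.mp3</item></order>"
corrupted = bytearray(good)
for _ in range(3):  # the injected fault: flip a few random bytes in transit
    corrupted[random.randrange(len(corrupted))] ^= 0xFF

print(process_order(good))              # the normal case still works
print(process_order(bytes(corrupted)))  # the faulty case must not take the service down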
The controller is responsible for responding to user actions. In the case of a web application, a user action is (generally) a page request. The controller will determine what request is being made by the user and respond appropriately by triggering the model to manipulate the data appropriately and passing the model into the view.
Source: Controller - J1.5:Developing a MVC Component/Introduction
So I was wondering: how many simultaneous HTTP requests (JSON calls, XML calls, plain HTTP calls) can a single controller handle before it starts hurting the application? I could use multiple controllers, but honestly, how many requests can a single controller in Joomla handle? In other words, will Joomla's performance be affected if one controller handles all the requests, in contrast to breaking the logic into multiple controllers?
By thinking in terms of Joomla, you are just going to confuse the answer a lot, because you introduce a lot of extra factors. You could ask the same question about any PHP file, like so:
Simpler question:
I have a file called script.php; how many HTTP requests can call this file at the same time?
The answer: however many your server can support. Making two files (script1.php and script2.php) won't necessarily improve performance at all. It will likely have some effect, though, because every PHP script that is called is loaded into memory, and your server only has so much memory.
The second variable would likely be processing power. So the less that the controller has to process, the less load each call would place on the server. (So for example, if you were performing a calculation on a set of data but needed to display it in three different places on the page, only calculate it once and then save it in a variable that can be used for each display.)
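The parenthetical point, as a tiny sketch (the data set and the three "displays" are hypothetical): do the expensive work once, keep the result in a variable, and reuse it wherever it is shown.

sales = [120, 340, 560, 90]  # hypothetical data set

total = sum(sales)  # expensive calculation done a single time

header = "Total sales: %d" % total   # reuse the stored value...
sidebar = "(%d units)" % total       # ...instead of recomputing it
footer = "Grand total: %d" % total   # for each place it appears on the page
print(header, sidebar, footer)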
In all of this, though, there is no magic number for the number of requests you could handle. Even if you ran tests and told us your controller could handle 72 simultaneous connections, that is a useless number.
What you actually want to know:
So, the test you actually should run on your server is the difference between one controller and multiple controllers. This comparison takes into account the hardware you run the test on and helps you optimize the code.
And honestly, on that note, having worked with Joomla a lot, I'm not sure there will be enough of a difference to matter. There are probably far worse bottlenecks in your code, and you would do best to focus on standard optimization practices: PHP code optimization
As one final note, I do think it is valuable to have multiple controllers, but that is more so I can remember where the different functions are and what they do than because of any inherent speed issue.
Generally, the controller is instantiated once for each request, so each controller instance handles exactly one request. How many requests can be served simultaneously depends on the resources available (and, of course, consumed per thread) and will vary from environment to environment.
It depends on the server you have. Read the article below:
HTTP Server Performance Tuning
http://www.w3.org/Protocols/rfc2616/rfc2616-sec8.html
I have an audio site where users can upload their music files, but the problem is that I can't go for expensive hosting, since I am not monetizing this service. I am searching for some shortcuts for storing the MP3 files to cut some hosting costs.
What would be the best approach technically? Any hosting suggestion would be helpful.
I need to save server space as much as possible.
In most cases, the size of your database will count against your overall hosting space as well. Furthermore, inserting huge BLOBs into your database isn't going to help its performance.
The typical pattern to follow when doing something like this is to save the MP3 (or any binary file) on the server in a particular directory, and save the path to the file in the database.
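A minimal sketch of that pattern, assuming an uploads/ directory and a SQLite table invented for illustration: the binary data goes to disk, and the database row only stores the path and some metadata.

import sqlite3
from pathlib import Path

uploads = Path("uploads")
uploads.mkdir(exist_ok=True)

db = sqlite3.connect("site.db")
db.execute("CREATE TABLE IF NOT EXISTS tracks (id INTEGER PRIMARY KEY, title TEXT, path TEXT)")

def save_track(title, mp3_bytes):
    path = uploads / ("%s.mp3" % title)
    path.write_bytes(mp3_bytes)  # the binary file lives on disk
    db.execute("INSERT INTO tracks (title, path) VALUES (?, ?)", (title, str(path)))
    db.commit()
    return str(path)  # the database only points at the file

print(save_track("demo-song", b"ID3..."))  # fake MP3 payload for illustration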
The least expensive way, outside of using the original hosting environment, would probably be to utilize Amazon AWS S3 reduced redundancy storage, which starts at $0.093 per GB per month. Pretty darn cheap.
But in answer to your original question, inserting stuff in the database probably won't save server space, and if your host is worth its salt, they will pick up on a huge database that keeps growing and growing, even if they claim "unlimited databases" or similar.
Just consider that storing files in a database (as BLOBs) is usually bad practice because it slows down queries, makes the database big, and degrades database performance. A database is meant to store "searchable" information, not to serve as a general data store. Although a database can do it, it's not designed for that.
Take a look at a cloud storage service/provider instead, such as ADrive's personal plan (http://www.adrive.com/personal_basic), which lets you store 50 GB for free (I'm not sure if it's a trial) and also has Remote File Transfer functionality that allows you to transfer files from external websites.
I have never tried this service, but give it a try; it's free and might solve your problem.
How do you manage a massive (60+ page) design (HTML/CSS) project? What is your workflow? How do you set milestones?
Step 1. Simplify. Find a way to simplify what they're asking for. Often, this won't be apparent until you decompose and prioritize.
Step 2. Decompose. Inside every large project is a series of smaller projects waiting to get out. Break the big job into "sprints" that will build something you can release in a reasonable amount of time. 2-3 weeks per sprint (or less) is a good target.
Step 3. Prioritize. They want something first. Find out what that thing is and build that.
Step 4. Review and see if you can simplify further. Once you've decomposed and prioritized, you may see further opportunities to remove duplication, useless non-features, junk, fluff, bad ideas, and the like.
I recommend creating a work breakdown structure (WBS) to make sure you capture all of the tasks/deliverables required for your project. Here are some basic tasks:
- develop a site map
- develop wireframes and mockups - and get client approval
- develop the main page and unique sub-pages (assuming most sub-pages are similar in design and functionality, but different in content)
- inventory content needs
- build out primary page and 2-3 sub pages for final review/approval
- complete the implementation of the site and add content to the sub-pages
What concerns, processes, and questions do you take into account when deciding when and how to cache? Is it always a no-win situation?
This presupposes you are stuck with a code base that has been optimized.
I have been working with DotNetNuke most recently for web applications and there are a number of things that I consider each time I implement caching solutions.
Do all users need to see cached content?
How often does each bit of content change?
Can I cache the entire page?
Do I need a manual way to purge the cache?
Can I use a single cache mechanism for the entire site, or do I need multiple solutions?
What impacts occur if information is somehow out of date?
I would look at each feature of your website/application and decide for each feature:
Should it be cached?
How long should it be cached for?
When should the cache be expunged?
I would personally go against caching whole pages in favour of caching sections of the website/application.
First off, if your code is optimized as you said, you will only see noticeable performance benefits when the site is being hammered with a lot of requests.
However, it is faster to pull resources from RAM than from disk, so your web server will be able to handle more requests if you have a caching strategy in place.
As for knowing when you're going to need caching, consider that even low end modern web servers can handle hundreds of requests per second, so unless you expect a decent amount of traffic, caching is probably something you can just skip.
Also, if you are pulling content from your database (for example, StackOverflow probably does this) caching can be very helpful because database operations are relatively expensive and can be a huge bottleneck in high-volume situations.
As for a scenario when it's not appropriate to cache or when caching becomes difficult... If you try to cache a dynamic page that, say, displays the current date and time, you will constantly see an old date/time unless you get a little more involved with your caching strategy. So that's something to think about.
What language are you using? With ASP.NET you get very easy caching just by adding an attribute above the method, and the value is cached depending on the time.
If you want more control over the cache, you can use a popular system like Memcached and control it by time or by event.
Yahoo, for example, "versions" their JavaScript, so your browser downloads code-1.2.3.js, and when a new version appears they reference that version. By doing this they can make their JavaScript code cacheable for a very long time.
As for the general answer, I think it depends on your data and on how often it changes. For example, images don't change very often, but HTML pages do. The "About us" page doesn't change too often, but the news section does.
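As a small sketch of the versioning idea (the file name and contents are made up): derive the version from a hash of the file's content, so the URL changes whenever the code changes and the old URL can be cached essentially forever.

import hashlib

def versioned_name(filename, content):
    digest = hashlib.md5(content).hexdigest()[:8]  # short fingerprint of the file contents
    stem, dot, ext = filename.rpartition(".")
    return "%s-%s%s%s" % (stem, digest, dot, ext)

print(versioned_name("code.js", b"function f() { return 1; }"))
# prints something like code-ab12cd34.js; reference that name in your HTML
# and serve it with a far-future expiry, since its contents can never change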
You can cache by time. This is useful for data that changes fast; you can set the time to 30 seconds or 1 minute. Of course, this requires some traffic: the more traffic you have, the more you can play with the time, because if you only get one visit every hour, that visit will populate the cache rather than use it.
You can cache by event: if your data changes, you update the cache. This is very useful if the data needs to reach the user accurately and quickly.
You can cache static content that you know won't change often. If you have a "top 10 of the day" that refreshes once a day, then you can keep it all in the cache and update it every day.
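A minimal sketch covering both the time-based and the event-based cases (load_top10 stands in for whatever expensive query produces the data): entries expire after a TTL, and can also be thrown out explicitly when the underlying data changes.

import time

_cache = {}  # key -> (value, expiry timestamp)

def cached(key, loader, ttl=60):
    entry = _cache.get(key)
    if entry is not None and entry[1] > time.time():
        return entry[0]  # still fresh: serve from the cache
    value = loader()     # stale or missing: rebuild and remember it
    _cache[key] = (value, time.time() + ttl)
    return value

def invalidate(key):
    _cache.pop(key, None)  # cache by event: call this when the data changes

def load_top10():
    return ["song %d" % i for i in range(1, 11)]  # hypothetical expensive query

print(cached("top10", load_top10, ttl=86400))  # the daily "top 10" example
invalidate("top10")                            # e.g. an editor just changed the list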
Where available, look out for whole-object memory caching. In ASP.NET, this is a built-in feature where you can just plant your business logic objects in the IIS Application and access them from there.
This means you can store everything you need to generate a page in memory (persisting writes to database) and generate a page without ANY database IO.
You still need to use the page-building logic to generate the page, but you save a lot of time in getting the data.
Other techniques involve localised output caching, where you capture the output before sending it and save it to a file. This is great for static sections (like navigation on certain pages, or text bodies), which can then be included when they're requested. Most implementations purge cached objects like this when a write happens or after a certain period of time.
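A rough sketch of that output-caching technique, with the fragment renderer and cache directory invented for illustration: render a section once, save the HTML to a file, and serve the file until it is purged or too old.

import time
from pathlib import Path

CACHE_DIR = Path("cache")
CACHE_DIR.mkdir(exist_ok=True)

def render_navigation():
    return "<ul><li>Home</li><li>About</li></ul>"  # stands in for real page-building logic

def cached_fragment(name, renderer, max_age=300):
    path = CACHE_DIR / (name + ".html")
    if path.exists() and time.time() - path.stat().st_mtime < max_age:
        return path.read_text()  # fresh enough: serve the captured output
    html = renderer()            # otherwise re-render, save, and serve
    path.write_text(html)
    return html

print(cached_fragment("navigation", render_navigation))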
Then there's the least "accurate": whole page caching. It's the highest performer but it's pretty useless unless you have very simple pages.
What kind of caching? Server side caching? Client side caching?
Client-side caching is a no-brainer with certain things, like static HTML, SWFs, and images. Figure out how often the assets are likely to change, and set up "Expires" headers as appropriate. (2 days? 2 weeks? 2 months?)
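As a sketch of what setting those headers amounts to (the two-week lifetime is an arbitrary example, and your web server or framework would normally emit these for you):

from datetime import datetime, timedelta, timezone
from email.utils import format_datetime

max_age = 14 * 24 * 3600  # two weeks, in seconds
expires = datetime.now(timezone.utc) + timedelta(seconds=max_age)

headers = {
    "Cache-Control": "public, max-age=%d" % max_age,   # modern clients honour this
    "Expires": format_datetime(expires, usegmt=True),  # older HTTP/1.0-style clients use this
}
for name, value in headers.items():
    print("%s: %s" % (name, value))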
Dynamic pages, by definition, are a little harder to cache. There have been some explorations into caching certain chunks using JavaScript (and degrading to iframes if JS is not available). This, however, might be a little more difficult to retrofit into an existing site.
DB and application level caching may, or may not work, depending on your situation. That really depends on where your bottlenecks are. Figuring out where your application spends the most time on page-rendering is probably priority 1, then you can start looking at where and how to cache.