Simulate website loading time

I want to see how much faster a website could load on a different server, without having to copy the entire site over first. I have an idea of how to do that, but I'm not sure whether I'm overlooking something, and I'm not sure of the best way to do what I'm trying to do.
If I use a website like Pingdom, I can see the number of files that are loaded for a given URL, and the size of each of those files.
I want to create a page in PHP that can simulate the loading of these files. I assume I can do this by calling a file on my server that can generate files with a size I specify. So, for example, I could call this file more than once, with a file size appended in the query string, and the browser would download those files.
Maybe I can generate images on the fly that will be the file size I specify?
Would this idea work, or is there something I'm overlooking?
If this is possible, how would I generate these files?
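Here is a sketch of the kind of script I have in mind; the file name dummy.php and the size parameter are just placeholders, and I haven't verified the caching behavior:

```php
<?php
// dummy.php?size=150000 -- returns a response body of exactly "size"
// bytes, so the browser has to transfer that much data.

// Build a payload of exactly $bytes bytes.
function make_payload(int $bytes): string
{
    return str_repeat('0', $bytes);
}

if (isset($_GET['size'])) {
    $bytes = max(0, (int) $_GET['size']);

    // Turn off output compression so the transferred size matches the
    // requested size, and serve it as a generic binary blob.
    ini_set('zlib.output_compression', 'Off');
    header('Content-Type: application/octet-stream');
    header('Content-Length: ' . $bytes);
    echo make_payload($bytes);
}
```

I would then reference it from the test page, e.g. as an image or script source, once per file reported by Pingdom. One thing I'm unsure about is browser caching, so I'd probably also append a cache-busting parameter to each request.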

I'm not sure if it's exactly what you are looking for, but Google Chrome has an option to simulate a connection speed. Open the developer tools, go to the Network tab, and to the right of "Disable cache" you will see a "No throttling" drop-down where you can select the connection speed you want to simulate.

Related

Website with a very slow load time on every page

http://www.puppykisses.org/
I made a WordPress page for a client, and for some reason it is taking over a minute to load. The only thing I can think of that might be the problem is the number of photos he inserted into the slider at the top of the home page. It looks like all those pictures need to load before anything else appears. But then I click on Contact, or any other page with no real images to speak of, and the problem is still there. Just wondering if anyone can point me in the right direction to fix this. Thanks!
As @David said, it's the initial request (the source for the page) that is giving you issues. That means it is unlikely to be a hosting problem and most likely an issue with your code. I would go through any plugins you have installed and disable them one by one, and slowly start commenting out your own custom dynamic code bit by bit, until you find what is taking so incredibly long. Then rewrite or excise that code from the site.
Start With the Basics
Keep the number of WordPress plugins you use to a minimum
Get a Proper Hosting Provider
Remove Unnecessary Code From WordPress Header -> http://goo.gl/yfRcF
Use Firebug's Network tab to check the loading speed of each file
Check these suggestions on how to improve website speed -> http://goo.gl/FtiX3
Install WP Super Cache plugin -> http://wordpress.org/extend/plugins/wp-super-cache/
If you use a gallery, serve image thumbnails rather than loading the full-size images
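For the header cleanup, a typical functions.php snippet looks like this (these hooks are standard WordPress defaults, but verify them against your install; the function_exists guard just keeps the sketch inert outside WordPress):

```php
<?php
// In the theme's functions.php: remove tags WordPress adds to <head>
// by default. Each remove_action call drops one default header tag.
if (function_exists('remove_action')) {
    remove_action('wp_head', 'wp_generator');         // WP version meta tag
    remove_action('wp_head', 'rsd_link');             // Really Simple Discovery link
    remove_action('wp_head', 'wlwmanifest_link');     // Windows Live Writer manifest
    remove_action('wp_head', 'wp_shortlink_wp_head'); // shortlink header
}
```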

Download dynamically created Flash

Is there a way to download dynamically created Flash (SWF) files? I'd like to fetch them and convert them to PNG. (This can be done, for example, with swfrender.)
For example, when pointing my web browser to http://flashserver.company.com/statsgraph.jsp?data=2012-04 (URL completely made up), a JSP script on that server creates a statistics graph. But this is done on the fly, so there's no SWF file that I could download (for example, with wget).
I could take a screenshot of what my browser displays, but I'd prefer to do this automatically with a server-side (shell or PHP) script, because there are a few dozen graphs to download, once per month. BTW, what I intend to do is fully legal. :-)
Rendering SWFs server-side is complicated and unlikely to be worth it. The SWF probably holds no data itself anyway; it most likely just displays data loaded from somewhere else.
You need to use Firebug or something similar to dig up the SWF's data source and save that data set instead of the SWF. Then you can create your own image graph from that data set, using jpGraph or something similar.
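For example, once Firebug shows you the data endpoint, a small PHP script run from cron could fetch and parse it. The endpoint URL and the label,value CSV format below are pure assumptions; adapt them to whatever the SWF actually loads:

```php
<?php
// Sketch: fetch the graph's underlying data instead of the SWF.
// Use Firebug's Net panel to find the real data endpoint first.

// Parse a simple "label,value" CSV body into an array of floats.
function parse_graph_data(string $csv): array
{
    $values = [];
    foreach (array_filter(array_map('trim', explode("\n", $csv))) as $line) {
        [$label, $value] = explode(',', $line);
        $values[$label] = (float) $value;
    }
    return $values;
}

// $body = file_get_contents('http://flashserver.company.com/statsdata.jsp?data=2012-04');
// $data = parse_graph_data($body);
// ...then feed $data to jpGraph (e.g. a LinePlot) to render your own PNG.
```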

How to detect back button -vs- GoBack() in WP7 app

Maybe I'm over-thinking this, but here's what I'm trying to accomplish.
I have two MVVM projects (assemblies) in my WP7 app. One page in the main project will call another page in the second project. The second page will allow the user to browse through a list of files on the web and select one to be downloaded to Isolated Storage. The files are rather small.
For a little background: I want two assemblies because this file-selection feature is not used often in the app and I want the Main assembly to be as small as possible to decrease startup time. I also want to be able to re-use this file-selection/download component in other apps.
The simple thing I'm trying to figure out is this: when the user selects the file and it is downloaded, I will execute a GoBack() to return to the calling page. On the calling page, I need to know whether the user actually downloaded a file or cancelled out of the operation by simply hitting the back button. I thought the obvious thing might be to just check for the existence of the file in isolated storage, but that feels like a bit of a kludge to me.
I also thought about the Messenger, but I'm not sure how that would work across two assemblies.
Any advice would be appreciated.
Thanks
It is tough to know without seeing the code, but I would suggest passing a value back to the page indicating whether the file was downloaded successfully. Navigate with the value as follows (pass true or false depending on download success):
NavigationService.Navigate(new Uri("/Page.xaml?download=true", UriKind.Relative));
Then read the value on the destination page:
string download;
if (NavigationContext.QueryString.TryGetValue("download", out download))
{
    // download is "true" if a file was saved, "false" otherwise
}

File Downloading

We are working on our converter site, and we want users to be able to download their converted files. But audio and some other files just play in the browser. How can we make all video/audio formats downloadable, instead of playing whenever the browser supports the format? Thanks.
IMO you should check the MIME types. I think if you serve the file as a generic binary type, it will be downloaded rather than played by the browser.
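For example, in PHP something like this should do it. The script name, query parameter, and converted/ directory are assumptions; adapt them to your site:

```php
<?php
// download.php?file=song.mp3 -- forces a "save as" instead of inline playback.

// Headers that trigger a download: a generic binary MIME type plus a
// Content-Disposition of "attachment".
function download_headers(string $name, int $size): array
{
    return [
        'Content-Type: application/octet-stream',
        'Content-Disposition: attachment; filename="' . $name . '"',
        'Content-Length: ' . $size,
    ];
}

$name = basename($_GET['file'] ?? '');     // drop any path components
$path = __DIR__ . '/converted/' . $name;   // assumed output directory

if ($name !== '' && is_file($path)) {
    foreach (download_headers($name, filesize($path)) as $header) {
        header($header);
    }
    readfile($path);
}
```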
Another approach is to put the file(s) into an archive server-side and attach the archive to the response. That way you avoid playback in the browser and potentially decrease download time.
Another possibility is, as many sites do, to just show the link and instruct the user to right-click and select "Download". I'd go with the other options (like setting the MIME type, as proposed), but it is still a possibility.

A way to prevent 3rd-party elements to be loaded on Safari?

Basically, I'm looking for RequestPolicy for Safari. GlimmerBlocker, Privoxy, BFilter, etc. all work well, but none of them supports a "block 3rd-party elements" feature.
I use GlimmerBlocker, and to (barely) imitate that feature, I mainly use this filter to strip scripts from script-flooded websites:
replace(/<(script|noscript|iframe)([\s\S]*?)<\/(script|noscript|iframe)>/img, "")
However, I'm tired of creating filters for each website over and over, and whitelisting would be just as tedious.
If anybody had an idea to solve this, that would be so great. Thanks.
I made this proof-of-concept Safari extension to block external resources (images, objects, and scripts, but NOT link elements, such as stylesheet links) until allowed. It has a bare minimum number of features, but if you are interested, I might develop it further.
I say "external" rather than "third-party" because I don't know how to tell reliably whether a resource is third-party or not. This extension simply blocks all resources that come from a different host than the web page's, so it blocks too many resources by default.
You can right-click a blocked image and use a context menu command to whitelist the image's host. If the blocked image doesn't have a specified width and height, it will be invisible, so you won't be able to right-click it. (To remedy this, I will need to add code to make the empty image visible as a box.)
The whitelist command does not show up for blocked plugin objects (such as Flash objects) or scripts. I will have to add code to deal with that.
You can also whitelist the current site itself, meaning that all external resources will be allowed on that site. Again, this is done with a context menu command.
As yet, there is no way to remove items from either whitelist. This can be added.
Download the extension from here.
You can extract the source files from the extension package using this command:
xar -xf PartyPooper.safariextz
You are welcome to do whatever you like with the source.
