How to parallelize downloads across hostnames on WordPress?

I'm getting the message "Parallelize downloads across hostnames" when checking my WordPress site on GTmetrix (https://gtmetrix.com). The details are here: https://gtmetrix.com/parallelize-downloads-across-hostnames.html
How do I fix that?

Details
Web browsers put a limit on the number of concurrent connections they will make to a host. When there are many resources to download, a backlog of resources waiting to be downloaded forms. The browser opens as many simultaneous connections to the server as it allows in order to download these resources, then queues the rest and waits for the outstanding requests to finish.
The time spent waiting for a connection to free up is referred to as blocking, and reducing this blocking time can result in a faster-loading page. The waterfall diagram in GTmetrix's documentation shows a page which loads 45 resources from the same host. Notice how long the resources are blocked (the brown segments) before they are downloaded (the purple segments) while they wait for a free connection.
So here is a hack to implement it on WordPress.
In order to work properly, all subdomains/hostnames MUST have the same structure/path. Ex:
example.com/wp-content/uploads/2015/11/myimage.jpg
media1.example.com/wp-content/uploads/2015/11/myimage.jpg
media2.example.com/wp-content/uploads/2015/11/myimage.jpg
Add to functions.php
function parallelize_hostnames($url, $id) {
    // pick a subdomain for this attachment and swap it into the URL
    $hostname = par_get_hostname($url);
    $url = str_replace(parse_url(get_bloginfo('url'), PHP_URL_HOST), $hostname, $url);
    return $url;
}
function par_get_hostname($name) {
    // add your subdomains below, as many as you want
    $subdomains = array('media1.example.com', 'media2.example.com');
    // hash the filename so the same file always maps to the same subdomain
    $host = abs(crc32(basename($name)) % count($subdomains));
    $hostname = $subdomains[$host];
    return $hostname;
}
add_filter('wp_get_attachment_url', 'parallelize_hostnames', 10, 2);
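The filter above only rewrites attachment (media) URLs. If you also want stylesheets and scripts spread across the same subdomains, one option (a sketch, not part of the original hack; it reuses the parallelize_hostnames() function defined above) is to hook WordPress's style_loader_src and script_loader_src filters too, since they pass the URL as their first argument just like wp_get_attachment_url:
// Assumes the subdomains serve exactly the same files as the main domain.
add_filter('style_loader_src', 'parallelize_hostnames', 10, 2);
add_filter('script_loader_src', 'parallelize_hostnames', 10, 2);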

This is mainly due to HTTP/1.1, under which browsers typically open around 6 concurrent connections per hostname.
If you are running over HTTPS with a provider that supports HTTP/2, this warning can usually be safely ignored now. With HTTP/2 multiple resources can now be loaded in parallel over a single connection.
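If you're not sure whether your host negotiates HTTP/2, one quick way to check (a minimal sketch; https://example.com is a placeholder for your own site) is to ask cURL from PHP which protocol version was actually used:
<?php
// Requires a recent PHP (CURLINFO_HTTP_VERSION) and a libcurl built with HTTP/2 support.
$ch = curl_init('https://example.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HTTP_VERSION, CURL_HTTP_VERSION_2_0); // offer HTTP/2 to the server
curl_exec($ch);
$version = curl_getinfo($ch, CURLINFO_HTTP_VERSION);
curl_close($ch);

echo $version === CURL_HTTP_VERSION_2_0
    ? "Server negotiated HTTP/2 - the GTmetrix warning can be ignored\n"
    : "Server is still on HTTP/1.x\n";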
--
However, if you need to fix it, you can follow the below steps:
Create additional subdomains such as:
domain.com
static1.domain.com
static2.domain.com
Then add the following code to your WordPress theme's functions.php file, replacing the $subdomains values with your own subdomains.
All subdomains/hostnames MUST have the same structure/path.
function parallelize_hostnames($url, $id) {
    $hostname = par_get_hostname($url); // call supplemental function
    $url = str_replace(parse_url(get_bloginfo('url'), PHP_URL_HOST), $hostname, $url);
    return $url;
}
function par_get_hostname($name) {
    $subdomains = array('static1.domain.com', 'static2.domain.com');
    $host = abs(crc32(basename($name)) % count($subdomains));
    $hostname = $subdomains[$host];
    return $hostname;
}
add_filter('wp_get_attachment_url', 'parallelize_hostnames', 10, 2);
Read more about the parallelize downloads across hostnames warning and why you probably don't need to worry about this anymore.

Related

How to ping a server using Ajax in Laravel every 5 minutes?

I have an HTML table full of server IP addresses, and I want to ping them every 5 minutes to check whether each server is alive (and eventually highlight table rows depending on whether the server is dead or alive).
Currently I'm using Ajax with a 5 minute interval which calls a method in my controller:
var checkSims = function() {
    $.ajax({
        type: "GET",
        url: '/checkSimStatus',
        success: function(msg) {
            onlineSims = msg['online'];
            offlineSims = msg['offline'];
            console.log(onlineSims);
            console.log(offlineSims);
        },
        error: function() {
            console.log('false');
        }
    });
}
var interval = 1000 * 60 * 5; // every 5 minutes
setInterval(checkSims, interval);
However, this is not asynchronous, and while the controller method is pinging the IPs the webserver cannot serve other requests.
I've read about Laravel's queue system but I'm not sure this would suit me as I need one specific page to trigger the job, and would need to use JS to highlight table rows.
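One way to keep the pinging out of the web request entirely (a sketch under assumptions, not the asker's code: the command name, IP list, and cache key are made up) is a scheduled Artisan command that pings the servers every five minutes and caches the result; the /checkSimStatus controller then just returns the cached status instead of pinging anything itself:
<?php
// app/Console/Commands/CheckSimStatus.php (hypothetical command)
namespace App\Console\Commands;

use Illuminate\Console\Command;
use Illuminate\Support\Facades\Cache;

class CheckSimStatus extends Command
{
    protected $signature = 'sims:check';
    protected $description = 'Ping the servers and cache their status';

    public function handle()
    {
        $ips = ['10.0.0.1', '10.0.0.2']; // assumed list; load from your DB in practice
        $status = ['online' => [], 'offline' => []];

        foreach ($ips as $ip) {
            // -c 1: one echo request, -W 2: wait at most 2 seconds for a reply
            exec('ping -c 1 -W 2 ' . escapeshellarg($ip), $out, $code);
            $status[$code === 0 ? 'online' : 'offline'][] = $ip;
        }

        Cache::put('sim_status', $status, now()->addMinutes(10));
    }
}

// In app/Console/Kernel.php:
//     $schedule->command('sims:check')->everyFiveMinutes();
// The Ajax endpoint then becomes a cheap Cache::get('sim_status') lookup.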
@f7n If you have done it with Ajax, how will it work when the page with the HTML table of IP addresses is not open in a browser?
I think you should use a cron job on the server. If you use a Linux VPS or similar, you can also write a simple bash shell script and run it as a daemon. For example, create a PHP script that parses (grabs) the page with the HTML table of IP addresses and pings each server, then call it from a loop like the one below.
#!/bin/bash
echo "Press [CTRL+C] to stop.."
while true
do
    php parse_and_ping.php
    sleep 300
done
sleep 300 means it will run every 5 minutes. Just save it as a .sh file (run_shell.sh) and run it in a terminal or as a daemon on the Linux server.
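For completeness, here is a rough sketch of what parse_and_ping.php could look like; the page URL, the output file, and the regex-based IP extraction are assumptions, not part of the original answer:
<?php
// Fetch the page that renders the HTML table, pull out anything that
// looks like an IPv4 address, ping it, and store the results.
$html = file_get_contents('https://example.com/servers'); // assumed URL of the table page
preg_match_all('/\b(?:\d{1,3}\.){3}\d{1,3}\b/', $html, $matches);

$results = array();
foreach (array_unique($matches[0]) as $ip) {
    // -c 1: send one echo request, -W 2: wait at most 2 seconds for a reply
    exec('ping -c 1 -W 2 ' . escapeshellarg($ip), $output, $exitCode);
    $results[$ip] = ($exitCode === 0) ? 'online' : 'offline';
}

// Persist the results somewhere the web application can read them,
// e.g. a JSON file or a database table.
file_put_contents(__DIR__ . '/sim_status.json', json_encode($results));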

CPU is utilizing 100% resource and therefore Queue failed

My code is like below.
for ($i = 0; $i <= 100; $i++) {
    $objUser = [
        "UserName"     => $request["UserName"] . $i,
        "EmailAddress" => $request["EmailAddress"] . $i,
        "RoleID"       => RoleEnum::ProjectManager,
        "Password"     => $request["Password"],
    ];
    $RegisterResponse = $this->Register->Register($objUser);
    $Data = $RegisterResponse["Data"];

    // queue an activation email for each newly registered user
    $job = (new AccountActivationJob($Data));
    dispatch($job);
}
The code above creates 100 users, and each time a job is queued to send an email notification. I am using the default database queue.
I have a shared hosting account on GoDaddy. For some reason the CPU usage reaches 100% (screenshot omitted), and after about 5 minutes the loop stops partway through.
My problem is that it cannot finish creating all 100 users. I am doing this to test a sample queue implementation in which multiple users request registration. Am I doing anything wrong?
As stated above, GoDaddy shared hosting has a lot of resource limitations. From what I have heard, you can only send around 100 emails an hour, and not all at once: if it detects that you are sending a lot of emails, your process is blocked.
Instead, queue the messages so they are sent one every 20 or 30 seconds. That keeps resource usage within the limits and your emails still reach the customers without any problem.
You can use the sleep function, or delay the queued jobs, for this.
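A minimal sketch of the second option, assuming AccountActivationJob uses Laravel's Queueable trait (the 20-second spacing is an arbitrary choice, not a documented GoDaddy limit):
for ($i = 0; $i <= 100; $i++) {
    // ... create the user exactly as in the question ...

    // push the i-th activation email i * 20 seconds into the future so the
    // worker sends roughly three emails per minute instead of a burst of 100
    $job = (new AccountActivationJob($Data))->delay(now()->addSeconds($i * 20));
    dispatch($job);
}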
GoDaddy does have a limit on the resources you can use. If you go over it, it will kill your processes on SSH. The limits are available here.
Try running the PHP process with a different nice value. That's what I do when I need to run an artisan command that uses a lot of resources.
I did some digging and found that I should move to a VPS instead of shared hosting. GoDaddy's reasonably cheap VPS plans are here: https://in.godaddy.com/hosting/vps-hosting

Bash script that checks website every 10 seconds

The following script checks a site's content every 10 seconds to see whether anything has changed. It's for a very time-sensitive application: if something on the site changes, I have only seconds to do something else. The script then starts a new download-and-compare cycle and waits for the next change. The "do something else" part has yet to be scripted and is not relevant to the question.
The question: will it be a problem for a public website to have a script downloading a single page every 10-15 seconds? If so, is there any other way to monitor a site unattended?
#!/bin/bash
Domain="example.com"
Ocontent=$(curl -L "$Domain")
Ncontent="$Ocontent"
until [ "$Ocontent" != "$Ncontent" ]; do
    Ocontent=$(curl -L "$Domain")
    #CONTENT CHANGED TRUE
    #if [ "$Ocontent" == "$Ncontent" ]; then
    #    Ocontent=$(curl -L "$Domain")
    #fi
    echo "$Ocontent"
    sleep 10
done
The problems you're going to run into:
If the site notices and has a problem with it, you may end up on a banned IP list. Using an IP pool or another distributed resource can mitigate this.
Hitting a website precisely every x seconds is unlikely; network latency will cause a great deal of variance.
If you get a network partition, your code should know how to cope. (What if your connection goes down? What should happen?)
Note that getting the immediate response is only part of downloading a webpage. There may be changes to referenced files, such as CSS, JavaScript, or images, that are not apparent from the original HTTP response alone.

Classic ASP cache busting (& yet still satisfying PageSpeed score)

Scenario:
I am working with IIS and ASP, and we need to cache the site (to make Google Page Speed, and my boss, happy). We currently have IIS caching everything (asp/JS/CSS) for a period of 1 week.
Problem:
After updating the HTML content on the ASP pages, my boss sees the old version of the page until he does a (force) refresh.
Question:
How can I (force) update the server cache after I make a change to the ASP HTML content?
I would like my peers and managers to see the latest changes without making them do a forced browser refresh.
Are you configured to use the "If-Modified-Since" HTTP Header?
This explanation on Scott Hanselman's blog gives you an idea of what you are looking for: Forcing an update of a cached JavaScript file in IIS
This page also provides a useful primer for the "If-Modified-Since" HTTP Header
Let's see if we can make the boss happy. Like you, I have a few people who find F5 or Ctrl+F5 annoying.
Quick review: to be sure the output cache on your IIS server updates on change, set it to "Cache until Change".
You mentioned that you clear it every week, but if things don't change, why bother?
Next, let's set the client browser caching defaults. Put the following in all of your page headers; it lets the page expire after 30 minutes, using GMT time.
Master header:
Dim dtmExp
Response.Buffer = True
Response.CharSet = "UTF-8"
dtmExp = DateAdd("n", 30, Now())
Response.ExpiresAbsolute = dtmExp
Response.Expires = 30  ' Response.Expires takes a number of minutes, not a date
There are several options and methods for triggering our header change: you can use sessions, cookies, DB updates, etc. In this example I'm using sessions; feel free to change things around to fit your application better.
PageEdit.asp
Session("EditedPageFullURL") = "/yourpage.asp"
In a common functions page add the following.
Function EditorsReload(eChk, erURL)
    If IsNumeric(eChk) Then
        Session("Editing") = eChk
    End If
    If Len(erURL) = 0 Then
        Exit Function
    End If
    If Session("Editing") <> "" Then
        If Session("Editing") = 1 Then
            ' Only the page that was just edited gets the no-cache headers
            If (LCase(erURL) = LCase(Request.ServerVariables("SCRIPT_NAME"))) Then
                Session("Editing") = ""
                Session("EditedPageFullURL") = ""
                Response.Expires = -1
                Response.ExpiresAbsolute = Now() - 1
                Response.AddHeader "pragma", "no-store"
                Response.AddHeader "cache-control", "no-store, no-cache, must-revalidate"
            End If
        End If
    End If
End Function
Place the following in your page just below any headers you might have.
Call EditorsReload(1,Session("EditedPageFullURL"))
You can wrap it in a Session("AUTH") check if your site has login and member sessions set up.
Other than that, it will fire only when Session("EditedPageFullURL") is non-empty.
This sends no-cache headers to the boss's browser, forcing it to refresh its local copy.
It is a one-time deal, so any further page refreshes use the standard headers.
There are many ways of doing this so be creative!

Extremely slow WordPress user import on XAMPP

I'm posting this question here because I'm not sure it's a WordPress issue.
I'm running XAMPP on my local system, with 512MB max headroom and a 2.5-hour PHP timeout. I'm importing about 11,000 records into the WordPress wp_users and wp_usermeta tables via a custom script. The only unknown quantities (performance-wise) on the WordPress end are the wp_insert_user and update_user_meta calls. Otherwise it's a straight CSV import.
The process to import 11,000 users and create 180,000 usermeta entries took over 2 hours to complete. It was importing about 120 records a minute. That seems awfully slow.
Are there known performance issues importing user data into WordPress? A quick Google search was unproductive (for me).
Are there settings I should be tweaking beyond the timeout in XAMPP? Is its MySQL implementation notoriously slow?
I've read something about antivirus software dramatically slowing down XAMPP. Is this a myth?
Yes, there are a few issues with local vs. hosted setups. One important thing to remember is PHP's max_execution_time; you may need to reset the timer once in a while during the data upload.
I suppose you have a loop that takes the data row by row from a CSV file, for example, and uses an SQL query to insert it into the WP database. I usually put this simple snippet into my loop so that the PHP execution-time limit keeps getting reset:
$counter = 1;
if (($handle = fopen("some-file.csv", "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
        // ... your insert query (or wp_insert_user / update_user_meta call) goes here ...

        // every 20 rows, reset the execution timer (0 = no limit) so the script isn't killed
        if ($counter == 20) {
            set_time_limit(0);
            $counter = 0;
        }
        $counter = $counter + 1;
    } // end of the loop
    fclose($handle);
}
Also, BTW, 512MB of headroom is not much if the database is big. Count how many resources your OS and all running apps are taking. I have a WordPress database of over 2GB, and my MySQL needs a lot of RAM to run fast (it also depends on the queries you are using).
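On the WordPress side, a common speed-up for bulk imports (a sketch, not from the original answers) is to suspend object-cache additions and invalidation and wrap the inserts in database transactions; wp_insert_user and update_user_meta come from the question, while $rows, the field names, the 'imported' meta key, and the batch size of 500 are assumptions standing in for the CSV data:
<?php
// Sketch of a faster bulk-import loop. wp_suspend_cache_addition(),
// wp_suspend_cache_invalidation() and $wpdb are standard WordPress APIs.
global $wpdb;

wp_suspend_cache_addition( true );      // don't fill the object cache with 11,000 users
wp_suspend_cache_invalidation( true );  // skip per-row cache invalidation

$wpdb->query( 'START TRANSACTION' );
foreach ( $rows as $i => $row ) {
    $user_id = wp_insert_user( array(
        'user_login' => $row['login'],
        'user_email' => $row['email'],
        'user_pass'  => $row['password'],
    ) );
    if ( is_wp_error( $user_id ) ) {
        continue; // skip rows that fail validation
    }
    update_user_meta( $user_id, 'imported', 1 );

    if ( $i % 500 === 0 ) {             // commit in batches of 500 rows
        $wpdb->query( 'COMMIT' );
        $wpdb->query( 'START TRANSACTION' );
    }
}
$wpdb->query( 'COMMIT' );

wp_suspend_cache_addition( false );
wp_suspend_cache_invalidation( false );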
