I've got a website with a product catalog that uses AJAX pagination. I want to replace text on all product cards. The first product cards (2 rows) are already rendered in the page, but the next product cards are loaded via AJAX as you scroll down.
I can't change the website's source code, so the question is: is it possible to make the replacement via, for example, Nginx (something like 'sub_filter')? Are there other options?
sub_filter_types *;       # apply the filter to all MIME types, so AJAX-loaded fragments are rewritten too
sub_filter_once off;      # replace every occurrence, not only the first match
sub_filter "foo" "bar";   # the text to find and its replacement
If you are using gzip on the backend, sub_filter cannot process the compressed response, so you have to disable gzip or add:
proxy_set_header Accept-Encoding "";
Okay, I do not know how to explain this to you. It may be just my internet, or maybe my site is slower, or they really have a technique for doing this.
If you visit Facebook, Reddit, YouTube, or Twitter, and you click on links or perform any actions on those websites, the URL changes but the browser tab doesn't show any loading circle.
How do they do that?
I am pretty sure my website is fast enough, and at times it loads even faster than the bigger sites, but mine shows the loading circle in the browser tab.
Okay, so I found the answer. Here is the technique for changing the URL without reloading the page:
Updating address bar with new URL without hash or reloading the page
How do I modify the URL without reloading the page?
I am still trying to figure out, though, how to swap the actual page content without reloading the entire page. I am guessing they load it via AJAX or something similar upon URL change. I'll update this once I figure it out.
Edit: I am currently working on this feature for my site. The technique is to use AJAX to load the content based on the URL. I'll update this thread as I roll the feature out on my site.
Edit 2: Damn, you will probably face the same problem I had trying to detect the URL change without using onhashchange. If so, here you go:
How to detect URL change in JavaScript
This literally took me 4 hours just to figure that one out.....
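For reference, here is a minimal sketch of that detection trick (the 'locationchange' event name is my own invention): wrap history.pushState so it fires a custom event, and listen for popstate to catch the browser's Back and Forward buttons.

(function () {
  var originalPushState = history.pushState;
  history.pushState = function () {
    var result = originalPushState.apply(this, arguments);
    window.dispatchEvent(new Event('locationchange')); // programmatic URL change
    return result;
  };
  window.addEventListener('popstate', function () {
    window.dispatchEvent(new Event('locationchange')); // Back/Forward navigation
  });
})();

window.addEventListener('locationchange', function () {
  console.log('URL is now ' + location.href);
});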
Edit 3: I have now integrated this feature on my site. You can check it at
Grandweb
It is quite simple, but there is a lot of work in appending the content once it is retrieved via AJAX. So here is the process:
I am using pushState() to change the URL without reloading the page.
var url = $(this).attr('href');
// strip the site origin so we are left with a relative path, e.g. "/write"
var new_url = url.replace('https://grandweb.net/', '/');
window.history.pushState("object or string", "Title", new_url);
(Update: using 'mouseup' was a bad idea; I changed my mind. See below.)
I then have to trigger the first function using 'mouseup' to retrieve the content via AJAX, and then listen for onpopstate() for the subsequent ones, because some mouse buttons, such as Mouse 4 and Mouse 5, are bound to the browser's Back and Forward buttons and do not trigger 'mouseup'.
$(window).on('mouseup', function (evt) {
  get_content();
});
window.onpopstate = function (event) {
  get_content();
};
The first one is responsible for triggering the function on the initial interaction, because onpopstate fires only after the browser's history has been populated via the History API.
Using mouseup was a bad idea; basically, don't use it unless you really want to detect mouse actions from anywhere on the document.
Instead, I use the anchor tags/links to trigger the first function for retrieving content.
example:
<a class="dynamic_btn" href="website.com/post">Home</a>
then
$(document).on('click', '.dynamic_btn', function (e) {
  e.preventDefault();
  get_content();
});
Using onhashchange is possible IF you have hashes in your URL. I do not use hashes in my URLs, so onhashchange is basically useless in my use case, unless I'm missing something.
After retrieving the content, I append it by creating DOM elements inside existing containers on the page.
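Roughly, my get_content() boils down to something like this (the fragment endpoint and container ID here are placeholders, not my actual code):

function get_content() {
  // ask the server for just the HTML fragment matching the current path
  $.get('/fragments' + window.location.pathname, function (html) {
    // swap only the container's contents; the rest of the page stays put
    $('#main-container').html(html);
  });
}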
This is much easier to do if you are planning to change a few elements or containers on your pages. If you plan on doing this to change a full page layout, good luck. It's doable, but it's a real pain in the *ss.
Upon observing Facebook, I learned that they do not implement this technique on all of their links/features. That makes sense, because it is harder to maintain, especially since most of the work is done client side. It is very nice though, because the page doesn't reload.
I have implemented it on a few 'essential' functions of my website, such as viewing posts and returning to the homepage. I could implement it across the whole site, but I am still deciding on that. That is all; thank you very much for reading, internet stranger.
I have a site that has, let's say, paged lists of products.
The pagination is AJAX-based, and it degrades 100% without JavaScript.
With JavaScript turned on or off, this URL will show the same content (page 4 of my product list); the same applies to any filter and ordering options:
~/products/list/4 { 4 = page number }
When Googlebot lands on page 1 of my products, it won't be able to page through them, because the pagination is AJAX-based. So if I turn off the AJAX pagination and fall back to "server side" pagination when the user agent == googlebot, it can index all my URLs, which will have the same content as the AJAX-enabled pages.
I have read about using #!, but my site already has the same functionality: my URLs are the same with or without JS enabled.
Hope that makes sense.
Please do not show different content when the visitor is Googlebot, because that may be considered black-hat SEO (cloaking). The best way to define which version you prefer is to use the canonical tag.
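For example (the domain here is illustrative), each paged URL can declare the preferred version in its <head>:

<link rel="canonical" href="http://example.com/products/list/4" />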
If you do not know it already, please also read this: Making AJAX Applications Crawlable
I have searched the net for a solution but can't seem to get anywhere.
My page (PHP) loads with one URL (let's say www.mysite.com).
On the page, several searches on music (albums) can be done, and the tracks are shown (without refreshing the page). The info comes from a database.
So the URL stays the same.
During this search process, the Facebook meta tags (description, url, title) also stay the same, because I never reload the page; I only load content into divs.
I would like to be able to 'like' an album and link back to it. So I have created a function to load the album via the URL: www.mysite.com?album=12345
I can show a popup with this URL to share it.
So, if you go to this URL, the content is automatically loaded based on the URL parameter.
And at this spot (where you can see the URL with the parameter ?album=12345) I would like to show the 'like' button as well. (I generated the URL, so I use this in the code:)
echo '<div style="overflow:visible" class="fb-like" data-href="http://mysite.com/?album='.$albumid.'" data-send="false" data-width="300" data-show-faces="false">?</div>';
It works so far... (after I added the parse code to enable the button)
However, the like button picks up the default meta tags: description, title, etc.
They are not specific to the album or artist, so the button is not unique.
Note: if I remove the meta[property=og:url] tag from the header, I can make the button link back to the right URL with the ?album parameter. Otherwise it goes back to the default root of the site, mysite.com. (Removing it does make the lint tool report an error about the missing meta tag.)
I have tried to add something like this into the same function:
$("meta[property=og\\:url]").attr("content", "http://mysite.com/?album=<?php echo $albumid; ?>");
$("meta[property=og\\:title]").attr("content", "<?php echo $artistname; ?>");
$("meta[property=og\\:description]").attr("content", "<?php echo $albumname; ?>");
I did this so the meta tags would be changed, just to let the like button show the right description, etc. However, this doesn't work.
I understand that Facebook scrapes the page (I used the lint tool, etc.), but it will never execute JavaScript, so the meta tags will stay at their defaults (from when the page is first loaded).
What can I do to make a unique like button, with its own description (album name etc.), without making an HTML page for each one of them (there are millions of albums in the database...)?
I hope it makes sense.
I can't seem to figure this one out, help please :-)
Based on the comments below I used the following solution:
You should create the right FB meta tags when the URL (with the params ?album=12345) is opened.
That's enough for the like button to do its job.
Your logic is fine, up to the point where you're setting the meta tags using jQuery.
They should be set using PHP. As you can imagine, the scraper won't execute the jQuery, but if it's fed meta tags already customized by PHP, it will use them (as provided).
Just have the og: tags prepared server-side, depending on the album ID requested, and it should work. It might not work right away; I remember there used to be occasional caching issues with the scraper.
In short, index.php?album=123 will send a different set of og:tags to the scraper than say index.php?album=321. Just set them up server-side.
<meta property="og:title" content="<?php echo htmlspecialchars($artistTitle); ?>"/>
<meta property="og:description" content="<?php echo htmlspecialchars($albumname); ?>"/>
What can I do to make a unique like button, with its own description (album name etc.), without making an HTML page for each one of them (there are millions of albums in the database...)?
You can't, because Open Graph objects are URLs (that is, they are represented/identified by their URL).
One URL == one Open Graph object.
But where’s the problem in having one URL for each album? Since it all works using parameters, it’s not like you have to create a page for each album URL manually …
I'm still not sure how it works (but that's not the point :D). As far as I noticed, the whole content (almost :D) is in an iframe, and the chat window is outside the iframe. Requests are probably made via AJAX, and URLs change like this: const_part_of_url#something - so only the URL anchors (or whatever they're called) change.
Two things are bothering me:
What about Googlebot? Is it able to index those pages correctly (not Gmail, but say some web page using similar "technology"), first because of the iframe, and second because only the anchors change in the URLs?
Is it possible to change some other part of the URL, not only the anchor?
The thing is, I have an mp3 search engine where you can also listen to the mp3s, and this kind of floating, "not-reloading" player with a playlist would be kinda cool :D But I'm very concerned about proper page indexing and other SEO blah blah... so I don't really know if it's worth trying :D
Cheers
You can detect robots and not feed them user-eyes-only content...
Edit: you can also load it on demand (JavaScript)... bots won't load it.
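A minimal sketch of the on-demand idea (the element IDs and fragment URL are made up; the assumption is that a bot that doesn't execute JavaScript will never fetch the fragment):

$('#show-player').on('click', function () {
  // inject the player only after a real user interaction
  $('#player-container').load('/player-fragment.html');
});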
I'll explain:
I have a picture gallery; the first page is display.php.
Users can flip through pictures using arrows; when you click an arrow, it sends an AJAX request to retrieve the next picture from the DB. Now I want the URL to change according to the picture displayed.
So if the first picture is:
www.mydomain.com/display.php?picture=Paris at night
If I flip to the next one, the URL would be
www.mydomain.com/display.php?picture=The Big Ben
How do I do this?
The trick here is URIs with an anchor fragment.
The part before the '#' points to a resource on the internet; the part after it normally designates an anchor on the page.
The browser does not refresh if the resource is the same, but moves to the anchor's position when one is present.
This way you can keep the convenience of browser history from a usability point of view, while replacing certain parts of the page with AJAX for a fast and responsive user interface.
Using a plugin like jQuery History (as suggested by others) is really easy: you decorate certain elements with a rel attribute, and the plugin takes care of the rest.
Also kinda related to this topic is something called 'hijax', and it's something I really like.
This means generating HTML just like you would in the old days before AJAX. Then you hijack certain behaviors, like link clicks, and request the content with AJAX, replacing only the necessary parts. In combination with the technique above, this allows for really SEO-friendly and accessible webpages; a rough sketch follows below.
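A rough sketch of hijax (the class name, container ID and URLs are illustrative): the links are real URLs that work without JavaScript; with JavaScript, the click is turned into a hash change and the content is fetched with AJAX.

// Plain links keep working without JS; with JS we hijack the click.
$(document).on('click', 'a.hijax', function (e) {
  e.preventDefault();
  window.location.hash = $(this).attr('href'); // e.g. "#/display.php?picture=Big-Ben"
});

// React to hash changes (including Back/Forward) by loading the fragment.
$(window).on('hashchange', function () {
  var path = window.location.hash.slice(1); // drop the leading '#'
  if (path) {
    $('#content').load(path); // replace only the content area
  }
}).trigger('hashchange'); // also handle a deep-linked initial hash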
You can use the jQuery history plugin for example.
Note that changing the search part of the URL (the query string) will make the browser load the changed URL, i.e. trigger a full reload.
See also: Stack Overflow, "JavaScript: changing the GET parameter without redirecting".
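A quick illustration of the difference (the query string is from the question): assigning to location.search navigates, while history.pushState rewrites the URL in place.

location.search = '?picture=Big-Ben';            // navigates: full page reload
history.pushState(null, '', '?picture=Big-Ben'); // URL updated, no reload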
Do you really want to use AJAX here?
A traditional web request would work like this...
User navigates to display.php
User clicks "next" and location is updated to "display.php?picture=Big-Ben"
Big Ben is shown to the user, along with a link to "display.php?picture=Parliament"
User clicks "next" and the location is updated to "display.php?picture=Parliament"
And so on.
With AJAX, you essentially replace that GET with a "behind the scenes" GET that replaces just a portion of your page. You would do this to make things faster... for example...
User navigates to display.php
User clicks "next" and the next image location is obtained using an AJAX request
The image (and image description) is changed to the next image
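In code, that flow could look something like this (the endpoint, JSON shape and element IDs are assumptions); the history.pushState call at the end keeps the address bar in sync with the displayed picture, which is what the question asks for:

var currentPicture = 'Paris at night'; // whatever display.php rendered first

$('#next').on('click', function (e) {
  e.preventDefault();
  // ask the server which picture comes after the current one
  $.getJSON('next_picture.php', { current: currentPicture }, function (data) {
    $('#picture').attr('src', data.src); // swap just the image
    $('#caption').text(data.name);       // and its description
    currentPicture = data.name;
    // keep the address bar in sync without reloading
    history.pushState({ picture: data.name }, '',
                      'display.php?picture=' + encodeURIComponent(data.name));
  });
});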
What you are suggesting is that you retrieve the "next" URL using AJAX and then also perform a GET on the whole page. You would be much better off sending the "next" image link along with each page and not using AJAX at all.
This best describes everything, I think: http://ajaxpatterns.org/Unique_URLs