I know that search engines base part of their ranking calculation on how many other sites link to a specific site, so I was wondering about the following situation:
http://siteA/page.aspx contains an iFrame.
This iFrame points to http://siteB/script.aspx?url=http://siteA/page.aspx.
http://siteB/script.aspx generates a list of 1 or more links based on the supplied URL.
http://siteA/page.aspx therefore displays a list of links.
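In markup terms, the setup described above would look roughly like this (only the URLs already listed are used; nothing else is assumed):

```html
<!-- http://siteA/page.aspx -->
<iframe src="http://siteB/script.aspx?url=http://siteA/page.aspx"></iframe>
<!-- script.aspx on siteB returns an HTML document containing the generated links,
     which the browser renders inside the iframe on siteA's page -->
```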
1) Where would Google etc. consider the links to be based?
2) If it would consider the links to be based at siteB, is there another technique I could use to make search engines read the links as belonging to siteA? For example, AJAX.
Thanks in advance,
Regards,
Richard
First of all, if you're looking to build a link farm, Google will block you faster than you can blink. I would seriously reconsider using iframes to serve links; in the situation you've described, there's no reason to load another page inside your first one.
To answer your question, an iframe loads one page inside another, so naturally Google will view them as separate entities.
OK, I think I may have the answer - http://www.highrankings.com/forum/index.php?showtopic=44155&st=0&gopid=312999&#entry312999
Good morning,
A customer of ours asked us whether it is possible to change the image that Google shows next to his site in Google search results.
After some research, we tried several different techniques, each followed by re-indexing the page so that we could see the results immediately.
We tried structured data (both with ld+json and with microdata) and also the "og:image" and "og:title" attributes in the "meta" tags, but none of these tests changed the image displayed next to the site in Google's results.
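For reference, this is roughly the kind of markup we tested; the image URL, names and the Organization type here are placeholders for illustration, not our real values:

```html
<!-- Open Graph tags in the <head> -->
<meta property="og:title" content="Example Company" />
<meta property="og:image" content="https://www.example.com/logo.png" />

<!-- JSON-LD structured data, also in the <head> -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Company",
  "url": "https://www.example.com/",
  "logo": "https://www.example.com/logo.png"
}
</script>
```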
We expected that one of these methods would change the image, but nothing happened.
Therefore, we are wondering whether it is possible to change that image, or whether Google simply chooses what it considers the best image based on its own search parameters.
Thank you for your valuable help,
Best regards
Since this is related to AJAX technology, I thought this was the best place to ask.
I am displaying 5 articles at a time on my website, and when the user clicks 'Next' I load the next 5 articles using AJAX without reloading the entire page. The result is that the user always stays on the same page.
One of my friends told me that website ranking depends on the number of page views, and this approach obviously reduces my page views.
Should I not use AJAX then?
(This might be a stupid question but I seriously have no idea about ranking and SEO so please help)
If you load your content dynamically, Google will not see the entire page, only the part that is loaded initially. So, if Google ranking is important to you, it's better not to use an infinite loader.
Actually, it is not a good idea to handle page navigation purely with AJAX. Consider the scenario:
display 5 articles first, then load the next 5 when the Next button is clicked, and so on. Done this way, the page will not be search-engine friendly:
the search engine can't locate all of your content and will crawl only the initial articles.
But with some effort you can make AJAX navigation search-engine friendly; see the example here.
Loading page content purely dynamically is currently not a good idea for an SEO-friendly web page, but consider other AJAX page-navigation schemes that can keep the page dynamic as well as search-engine friendly.
Some suggested AJAX navigation schemes are listed below; a rough sketch of the general idea follows the links:
http://nickjohnson.com/b/how-to-make-ajax-search-engine-friendly-seo
http://ajax.rswebanalytics.com/
http://www.symatix.co.uk/articles/ajax/search-engine-friendly-ajax-navigation
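As a rough sketch of the general idea behind those links (the URL pattern, query parameter and element IDs are made up for the example): keep real, crawlable links to each page of articles, and let JavaScript intercept them, so regular visitors get the AJAX behaviour while crawlers still find ordinary pages to follow.

```html
<!-- Real links that work without JavaScript; a crawler can simply follow them -->
<div id="articles"> <!-- first 5 articles rendered server-side --> </div>
<a id="next-link" href="/articles?page=2">Next</a>

<script>
// Hijack the link for normal visitors: fetch the next batch via AJAX instead of navigating
document.getElementById('next-link').addEventListener('click', function (e) {
  e.preventDefault();
  var link = this;
  var xhr = new XMLHttpRequest();
  // hypothetical parameter telling the server to return only the article markup
  xhr.open('GET', link.href + '&fragment=1');
  xhr.onload = function () {
    document.getElementById('articles').innerHTML += xhr.responseText;
    // point the link at the following page so it stays correct for both users and crawlers
    link.href = link.href.replace(/page=(\d+)/, function (m, n) {
      return 'page=' + (parseInt(n, 10) + 1);
    });
  };
  xhr.send();
});
</script>
```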
I am working on a Bootstrap-based responsive website. The dropdown menus in the main website navigation are opened with a click rather than a hover. There is no index content for each section, only specific page links in the dropdown.
Is there any SEO penalty for having content located at:
www.mysite.com/books/moby-dick
when
www.mysite.com/books
results in a 404 error?
I could generate index pages with links to all children if I had to, but I'd rather avoid creating any content that isn't meant to be viewed directly.
I would like to organize the pages by "folder" using mod_rewrite which I have a pretty good handle on at this point.
The way I understand it, search engines place no relevance on one page's URL in relation to another page's URL. Are you looking for documentation? Which search engine do you want documentation for? I'm not even sure that type of information is "documented", but if you think about it, the order of words in a URL only has meaning to us as humans. An engine doesn't place importance on links further up or down the URL hierarchy; it wouldn't make sense.
I don't even think your moby-dick page would have a positive or negative impact on the domain's home page. Google, at least, treats every URL as a unique page; hence the "PageRank" algorithm, not a site-hierarchy algorithm.
I'm wondering what some best practices are for decreasing the page load time of single-page websites, in a way that won't hurt SEO.
I'm leaning toward an AJAX solution with "hijax" linking, but I'm wondering about best practices in terms of the load order for a page. So, for instance, say I have a simple website with home, about, pictures of my cat, contact, etc., and I'm planning to have it all show up on the homepage via vertical scrolling, allotting one "screen" worth of content per item.
I'm building this in WordPress, so my main idea would be to first load the first "screen", i.e. the hero section of the homepage, as part of home.php, so the user doesn't have to wait for the whole thing (and for SEO). Then, once that has finished loading, load the next four screens via AJAX in the background. So I'm wondering what the best strategy might be to go about that. Someone provided this answer elsewhere:
"Build a standard 5 page site using php with proper separation of header, footer, content. Then use javascript to redirect to a single (separate) page with all content include()ed on the page."
In WordPress I'd take this to mean: create a separate page with a loop that grabs the other four "screens" as posts, and then load that page after home.php has loaded. Does anyone see any issues with this approach, or, as the question asks, have any better or best practices to accomplish this? I'd appreciate them. Thanks.
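For the front-end part, here is a minimal sketch of what I have in mind; the /remaining-screens/ URL and the after-hero element are hypothetical names, standing in for whatever WordPress page renders the other four screens and wherever their markup gets injected:

```html
<script>
// Once the hero screen has rendered, fetch the remaining screens in the background
window.addEventListener('load', function () {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/remaining-screens/'); // hypothetical page whose loop outputs the other four "screens"
  xhr.onload = function () {
    if (xhr.status === 200) {
      // append the returned markup below the hero section
      document.getElementById('after-hero').innerHTML = xhr.responseText;
    }
  };
  xhr.send();
});
</script>
```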
There are several things you can do:
Improve the performance of your back-end code, if there is any.
Pagination: split the page into smaller pages.
Caching.
Decrease the size of the content, decrease the size of background images, and compress JS content.
Compress content.
Most of the time the right optimization will depend on your situation. To start with, one of the above should do it for you.
Your question is tagged with "wordpress", so I am assuming that you use WordPress.
If so, what I would suggest as a logical starting point is to use one of the WordPress caching plugins. I use Quick Cache for my website and it makes a significant difference.
But you shouldn't stop with the plugin. Consider the quality of the theme you are using: poorly designed themes may make inefficient database calls and slow your website down.
Delaying and loading parts of the page with AJAX shouldn't be your first optimization; try all the other options first.
I have a web page loaded up in the browser (i.e. its DOM and element positioning are both accessible to me) and I want to find the block element (or a sorted list of such elements) that most likely contains the main content (as in a continuous block of text). The goal is to exclude things like menus, headers, footers and such.
This is my personal favorite: VIPS: a Vision-based Page Segmentation Algorithm
First, if you need to parse a web page, I would use HTMLAgilityPack to transform it into XML. It will speed everything up and will enable you, using a simple XPath, to go directly to the BODY.
After that, you can iterate over all the divs (you can get all the DIV elements as a list from the Agility Pack) and extract whatever you want.
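Since the question says the page is already loaded in the browser, the same "walk every block and measure it" idea can be sketched directly against the DOM. The scoring heuristic below (length of the text a block contributes directly, ignoring text that belongs to nested blocks) is my own simplification, not part of the Agility Pack approach:

```javascript
// Rank block-level elements by how much text they contribute directly,
// so that menus, headers and footers (little text per block) fall to the bottom.
function rankBlocksByText() {
  var candidates = document.querySelectorAll('div, article, section, td');
  var scored = [];
  candidates.forEach(function (el) {
    var directText = '';
    el.childNodes.forEach(function (node) {
      if (node.nodeType === Node.TEXT_NODE) {
        directText += node.textContent;
      } else if (node.nodeType === Node.ELEMENT_NODE &&
                 /^(P|SPAN|A|EM|STRONG|B|I)$/.test(node.tagName)) {
        // treat inline-ish children as part of this block's own content
        directText += node.textContent;
      }
    });
    var length = directText.replace(/\s+/g, ' ').trim().length;
    if (length > 0) scored.push({ element: el, textLength: length });
  });
  // sorted list, densest block first
  return scored.sort(function (a, b) { return b.textLength - a.textLength; });
}

console.log(rankBlocksByText().slice(0, 5)); // top candidates for the main content block
```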
There's a simple technique to do this, based on analysing how "noisy" the HTML is, i.e., the ratio of markup to displayed text throughout an HTML page. The Easy Way to Extract Useful Text from Arbitrary HTML describes this technique, giving some Python code to illustrate it.
Cf. also the HTML::ContentExtractor Perl module, which implements the same idea. If you want to use this approach, it would make sense to clean the HTML first, e.g. with BeautifulSoup.
I would recommend Vit Baisa's thesis on Web Content Cleaning; I think he has some code too, but I can't find a link to it. There is also a discussion of the very same problem on the LingPipe natural language processing blog.