I have a site made with Spring MVC and JSTL. It has a product catalog on it. Can anyone give me some recommendations for making it SEO friendly? I understand that Google's spiders search HTML pages. However, how do I help them index what's in the database (MySQL), like the products and some content? Should I read/do something extra?
Thank you!
Open an account with Google Webmasters: http://www.google.com/webmasters
Make your URIs SEO friendly (see: Best way to create SEO friendly URI string).
Write a service that creates a sitemap.xml file for your site. The sitemap should include all the pages that are not obvious to the indexing spider bots, such as the product pages generated from your database.
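For example, a minimal sitemap.xml covering catalog pages could look like the one below; the example.com URLs, slugs, and dates are placeholders, and in practice your service would emit one <url> entry per product row queried from MySQL:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per product row from the database -->
  <url>
    <loc>http://www.example.com/products/blue-widget</loc>
    <lastmod>2013-01-15</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://www.example.com/products/red-widget</loc>
    <lastmod>2013-01-20</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>

Submitting the generated file through the Google Webmasters account above lets the spiders discover database-driven pages they might not reach by following links.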
Why does the URL need to change when we are working with AJAX? For example, when Google Translate starts translating a phrase or word, the URL changes, even though all the information is sent and received using AJAX.
The URL is not required to change. Google does this for convenience, so that you can share a link to the translation with another person.
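For illustration, a page can update the address bar after an AJAX response using the HTML5 History API, without reloading; the /translate path and text parameter below are made-up placeholders, not Google's actual URL scheme:

// After an AJAX request succeeds, reflect the new state in the URL
// without reloading the page, so the link can be shared.
function onTranslationLoaded(phrase) {
  var url = '/translate?text=' + encodeURIComponent(phrase);
  history.pushState({ phrase: phrase }, '', url);
}

// Restore the state when the user navigates back/forward.
window.addEventListener('popstate', function (event) {
  if (event.state && event.state.phrase) {
    // re-run the AJAX translation for event.state.phrase
  }
});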
I must complete the comparison table below. I've searched and filled in a few entries, but I'm not sure about my answers.
Can you help me check them?
Thanks so much.
comparison table
I have been working with Laravel for the last 5 years, so I can give my perspective on that. In my view, Laravel is easy to learn and fast to develop with.
It has artisan commands to create your basic structure very quickly.
Routing - Laravel has now categorized routes according to their area, like web routes, API routes and console routes, so you can easily differentiate your routes according to their use.
Database - As you said, Eloquent is powerful enough, and you can use the DB facade as well.
HTML, JSON & Image Rendering - Laravel uses the Blade template engine, so its helpers come in very handy, e.g. @yield or @include; now there is even @slot (see the sketch after this list).
Helpers are available for assets as well.
Login - Just enter one command and a basic login structure is ready for you, from registration to forgotten passwords. The make:auth command is there; it even ships with migrations for the basic database tables, and you can override it as well. See: Authentication Quickstart.
ACL - You are free to write your own middleware. See: Laravel Middleware.
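A minimal sketch of how those Blade directives fit together; the view names (layouts/app, partials/nav, home) are made up for illustration:

{{-- resources/views/layouts/app.blade.php --}}
<html>
  <body>
    @include('partials.nav')   {{-- pull in a shared partial --}}
    @yield('content')          {{-- filled in by each child view --}}
  </body>
</html>

{{-- resources/views/home.blade.php --}}
@extends('layouts.app')

@section('content')
  <p>Page content goes here.</p>
@endsection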
I'm writing a blog application in Sinatra, and I want to collect some visit statistics.
As of now, I'm only thinking of getting more or less reliable visit statistics per user (that is, page visits grouped by users). Maybe later I'll want to get some client-related information (e.g., user agent).
How do I do that?
While you can use Sinatra to do this, the technology has already been implemented in other ways. I think the easiest solution is to put a piece of JavaScript on the frontend that records this information for you. The most popular tool for doing this is Google Analytics. It will give you far more information than you could easily capture yourself (screen size, device, etc.), and in a very clean format.
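For reference, Google's standard gtag.js embed looks like this; GA_MEASUREMENT_ID is a placeholder for your own property ID:

<!-- Global site tag (gtag.js) - paste into each page's <head> -->
<script async src="https://www.googletagmanager.com/gtag/js?id=GA_MEASUREMENT_ID"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'GA_MEASUREMENT_ID');
</script>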
My idea to do it:
Use Rack sessions to determine the visitor ID;
Store the hits in a database table;
Write a Thor task to unload it into something human-readable.
I'd appreciate any critique of this idea and/or any other ideas for doing it.
I'm about to embark on building a music-oriented website for a friend's band, and I want to build something like this template. It uses AJAX and deep linking.
My worry is that this site will not be crawlable by Google. Is there anything I can do, or code I can adjust, to make it crawlable?
Many thanks in advance!
That template doesn't look crawlable to me. Googlebot will never find your content. If I go to the page for the template, view the source, and search for "Gigs schedule with filter", I can't find it in the page source. That is because that particular content is loaded with AJAX and is not part of the page source.
That template does not use Google's crawlable AJAX scheme, which puts #! in the URL (https://developers.google.com/webmasters/ajax-crawling/). Googlebot will not index the content on your site if you use that template.
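For reference, the scheme maps each #! URL to an _escaped_fragment_ URL that your server must answer with a static HTML snapshot of the AJAX-rendered page; example.com is a placeholder:

What the user sees and shares:
    http://example.com/#!/gigs

What Googlebot requests instead:
    http://example.com/?_escaped_fragment_=/gigs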
Furthermore, there appear to be some URL issues. I see these two very similar URLs: http://radykal.de/themeforest/stylico/features.html and http://radykal.de/themeforest/stylico/?page=features.html. As a user, if I visit that second URL, I get the content, but I don't see the navigation. It seems likely that if Googlebot were to find the content, it would index that second URL and use it as the landing page for your visitors. Missing navigation in that case would not be a good user experience, as users would not be able to navigate your site.
I am using SEOmoz to evaluate my store from an SEO perspective. It is throwing Duplicate Content errors for the customer login page. A typical URL will look like this:
www.site.com/customer/account/login/referer/aHR0cDovL3d3dy5tbW1zcGVjaW9zYS5jb20vcmV2aWV3L3Byb2R1Y3QvbGlzdC9pZC8xOTYvY2F0ZWdvcnkvNC8jcmV2aWV3LWZvcm0%2C/
I have configured the header links, including login, to use rel="nofollow", but perhaps I need to add the canonical tag to the login page. How would I go about doing this in XML?
You should really just add that to the disallow list in your robots.txt file. That's the easiest method:
User-agent: *
Disallow: /customer/
More information on RogerBot (SEOmoz's crawler) is available in their documentation.
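As for the canonical tag the question asks about: it is an HTML <link> element placed in the page's <head>, not XML. A minimal sketch, using the login URL from the question:

<!-- in the <head> of the login page: every referer-suffixed variant
     points search engines at the one clean URL -->
<link rel="canonical" href="http://www.site.com/customer/account/login/" />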