Google Maps not loading when my page is accessed over https [closed] - laravel

I am trying to include a map from Google Maps on my website. However, Google Maps won't load when my page is accessed over https, but it does work perfectly when my page is accessed over plain http. I'm using Laravel and don't have much experience with this framework yet.
this is the http-link and this is the https-link.

When you navigate to your website via the https url, the Google map doesn't show up because it fails to load its script via
<script src="http://maps.googleapis.com/maps/api/js ..."></script>
This is because your website is loaded over https and the script over http, which is known as the mixed-content problem.
You can solve this by also loading the Google Maps script over https:
<script src="https://maps.googleapis.com/maps/api/js ..."></script>
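To make the fix concrete, here is a minimal sketch of embedding a map with the script loaded over https. The YOUR_API_KEY placeholder, the initMap callback name, and the coordinates are illustrative, not taken from the question:
<!-- Map container; give it an explicit height or the map renders 0px tall -->
<div id="map" style="height: 400px;"></div>
<script>
  // Called by the Maps script once it has loaded (see the callback parameter below)
  function initMap() {
    new google.maps.Map(document.getElementById('map'), {
      center: { lat: 50.85, lng: 4.35 }, // illustrative coordinates
      zoom: 8
    });
  }
</script>
<!-- Note the https:// scheme, so the script also loads on pages served over https -->
<script async defer src="https://maps.googleapis.com/maps/api/js?key=YOUR_API_KEY&callback=initMap"></script>
A protocol-relative URL (//maps.googleapis.com/...) would also avoid the mixed-content block, but requesting https explicitly is the simpler choice.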

Related

Server returns 403 error on live shared hosting, works locally [closed]

When I create a post with the Summernote editor in my Laravel application, it works fine as long as the image upload function is not involved.
When I choose an image and update the data, it does not work.
The images are stored at Contabo (block storage).
Updating a post with an uploaded image in Summernote fails on the live server, although it works locally.
The server returns an error like this.
Thank you all, and sorry for my bad English.
It works locally but not on the live server when storing to the database with the Laravel framework.

PageSpeed problem when typing in a URL: "The referrer https://www.googleapis.com/ does not match the referrer restrictions configured on your API key." [closed]

Does anyone know what this means? I went to https://developers.google.com/speed/pagespeed/insights/?url=https%3A%2F%2Ftreepi.me%2F and I am given this error. It was working yesterday and I have no idea what is going on. It's just a WordPress blog with Google Site Kit enabled. I've removed and reinstalled Site Kit, but I still have the same issue.
The referrer https://www.googleapis.com/ does not match the referrer
restrictions configured on your API key. Please use the API Console to
update your key restrictions.
This is normally caused by some sort of firewall issue or a browser extension blocking the referrer on requests.
The last couple of times I have seen this error, the person was using Brave Browser with 'shields up'.
I ran it fine, so try a different browser or incognito mode and see if that helps.

No Captcha Recaptcha is Not Displaying [closed]

I'm upgrading to the NoCaptcha reCaptcha as described in Google's documentation.
I'm using the Automatic Render method.
<head>
...
<script src="https://www.google.com/recaptcha/api.js" async defer></script>
...
</head>
<body>
...
<div class='g-recaptcha' data-sitekey='PUBLIC_KEY'></div>
...
</body>
However, it displays like this:
Where the Widget should live, it looks like this:
It successfully pings https://www.google.com/recaptcha/api.js and downloads recaptcha__en.js. If I use the deferred render from the example, it gets the same result. There are no errors reported in the console and no errors in the network responses.
The iframe does have code generated in it; it just doesn't display. How can I make the widget display properly?
It took me a few days of playing with CSP to figure it out, but it turned out one of the stylesheets was hiding the iframe.
I found this rule in my code, which was no longer being used:
iframe {
    height: 0px;
    display: none;
}
While the code would write to the frame and download the right JavaScript, there was nowhere for the captcha to render because the iframe was never displayed.
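If the rule cannot simply be deleted, one way to keep it without breaking reCAPTCHA is to scope it so it no longer matches the widget's iframe. A minimal sketch, assuming the frames you actually want hidden can be marked with a class of your own (hidden-frame is an illustrative name):
<style>
  /* Hide only frames you explicitly mark, instead of every iframe on the page */
  iframe.hidden-frame {
    height: 0px;
    display: none;
  }
</style>
<!-- The reCAPTCHA iframe is injected by Google without that class,
     so the widget is no longer hidden by the rule above -->
<div class='g-recaptcha' data-sitekey='PUBLIC_KEY'></div>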

Ajax / Deep linking and Google indexing / SEO - Is it a bad idea? [closed]

I'm about to embark on building a music-oriented website for a friend's band, and I want to build something like this template. It uses AJAX and deep linking.
My worry is that this site will not be crawlable by Google. Is there anything I can do, or code I can adjust, to make it crawlable?
Many thanks in advance!
That template doesn't look crawlable to me. Googlebot will never find your content. If I go to the page for the template, view source and search for "Gigs schedule with filter", I can't find it, because that particular content is loaded with AJAX and is not part of the page source.
That template does not use Google's crawlable AJAX standard with #! in the URL (https://developers.google.com/webmasters/ajax-crawling/). Googlebot will not index the content on your site if you use that template.
Furthermore, there appear to be some URL issues. I see these two very similar URLs: http://radykal.de/themeforest/stylico/features.html and http://radykal.de/themeforest/stylico/?page=features.html. As a user, if I visit that second URL, I get the content but don't see the navigation. It seems likely that if Googlebot were to find the content, it would index that second URL and use it as the landing page for your visitors. Missing navigation in that case would not be a good user experience, as users would not be able to navigate your site.
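For reference, the crawlable-AJAX scheme mentioned above worked roughly as sketched below; example.com and /gigs are made-up names, and Google has since deprecated the scheme in favour of ordinary crawlable URLs:
<!-- Hash-bang URL that visitors see:
       http://example.com/#!/gigs
     URL that Googlebot requests instead, expecting a static HTML snapshot:
       http://example.com/?_escaped_fragment_=/gigs -->
<!-- Pages that use pushState instead of #! opt in with this tag in <head>: -->
<meta name="fragment" content="!">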

W3C validation for complete site [closed]

I am working on a project where I have to validate the complete site, around 150 pages, with the W3C Markup Validation service. Is there a way to run W3C markup validation on an entire website?
The W3C doesn't offer this on w3.org.
http://validator.w3.org/docs/help.html#faq-batchvalidation
But you can use this tool and check "Validate entire site" (w3.org also refers to this site):
http://www.htmlhelp.com/tools/validator/
However, batch validation is limited to 100 URLs per run, and you will get this message when you reach that limit:
Batch validation is limited to 100 URLs at one time. The remaining URLs were not checked.
There is also a limit on the number of errors displayed for each URL.
The WDG offers two free solutions:
Validate entire site (select 'validate entire site')
Validate multiple URLs (batch)
You can run the validator yourself. As of 2018, the W3C uses v.Nu for its validator; the code is at https://github.com/validator/validator/releases/latest and usage instructions are at https://validator.github.io/validator/#usage
For example, the following command will run it on all HTML files under the public_html directory:
java -jar vnu.jar --skip-non-html public_html
I use this tool, Bulk W3C HTML Validator, to validate my entire website:
http://www.bulkseotools.com/bulk-w3c-validator.php
It uses the W3C validator engine, and you can check 500 URLs at once.
I've used http://sitevalidator.com; I think it would be helpful to you.
I made this Java app (Windows installer) in my spare time because I needed it at work:
https://gsoft.no/validator. It's free.
It uses either https://validator.w3.org/ or v.Nu running locally to validate an entire site.
It crawls a website and produces a report with validator links to every page that has warnings or errors. Because it crawls, all pages to be validated must be linked.
By running v.Nu locally, you can validate an internal site (e.g. an intranet) that is not available online and therefore cannot be checked by online validators (unless you post the entire content of each page).
