Full text search using the Google Sites API gives 500 Internal Server Error

I tried incorporating full text search using the Google Sites API (via a service account), but to no avail. I also tried it in the Google OAuth playground, and I get a 500 Internal Server Error in both cases.
I am using https://sites.google.com/feeds/content/domainName/siteName?q=test as the URL.
However, I do get a 200 OK when I request https://sites.google.com/feeds/content/domainName/siteName (without a query string) in the Google OAuth playground.
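For reference, a minimal sketch of the failing request as issued from code (Java 11+ HttpClient; ACCESS_TOKEN, domainName, and siteName are placeholders, and the GData-Version header is an assumption based on the Sites feed documentation):

import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class SitesSearch {
    public static void main(String[] args) throws Exception {
        // URL-encode the full-text query before appending it as ?q=
        String query = URLEncoder.encode("test", StandardCharsets.UTF_8);
        URI uri = URI.create(
            "https://sites.google.com/feeds/content/domainName/siteName?q=" + query);
        HttpRequest request = HttpRequest.newBuilder(uri)
            .header("Authorization", "Bearer ACCESS_TOKEN") // placeholder token
            .header("GData-Version", "1.4")                 // Sites API GData version
            .GET()
            .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode()); // 500 with ?q=, 200 without
    }
}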
Please let me know if I am doing anything wrong.
Thanks

Related

Google Translate works well on localhost but throws `429 Too Many Requests` error on the live server

I am using this Laravel Google Translate package to translate the website based on the user's locale. I configured everything and the translation works well on localhost. However, when I upload the website to the server, Google Translate throws a 429 Too Many Requests error:
[2020-09-17 08:04:03] production.ERROR: Client error: `GET https://translate.google.com/translate_a/single?client=webapp&hl=en&dt=t&dt=bd&dt=at&dt=ex&dt=ld&dt=md&dt=qca&dt=rw&dt=rm&dt=ss&sl=auto&tl=en&q=Canl%C4%B1+Bahisler&ie=UTF-8&oe=UTF-8&multires=1&otf=0&pc=1&trs=1&ssel=0&tsel=0&kc=1&tk=40965.430971` resulted in a `429 Too Many Requests` response:
I searched for the cause of the error and found a post in Google Groups about exceeded quota, but I don't think that applies to my case, since I am only translating a few texts, which can't reach that limit. Does someone have an idea how to solve this?
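For what it's worth, a generic mitigation for intermittent 429s (independent of this package) is to retry with exponential backoff; a minimal sketch with a placeholder URL, though if Google is rate limiting the server's IP outright this won't be enough:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class BackoffClient {
    // Retry a GET up to 5 times, doubling the wait after each 429 response.
    static HttpResponse<String> getWithBackoff(String url) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(URI.create(url)).GET().build();
        long delayMillis = 1000;
        for (int attempt = 0; attempt < 5; attempt++) {
            HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
            if (response.statusCode() != 429) {
                return response;
            }
            Thread.sleep(delayMillis);
            delayMillis *= 2; // exponential backoff: 1s, 2s, 4s, 8s
        }
        throw new IllegalStateException("Still rate limited after retries");
    }
}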

Fetch as Google says AJAX is blocked

I am a newbie with Parse and I have a problem. I want to use Parse classes for dynamic content such as blog posts. Everything works as expected and there is no problem; but when I try Fetch as Google in Google Webmaster Tools, it says the AJAX is blocked, so Google will not index this content.
When I follow the link, this is what I see (screenshot of the class link result).
So the Google crawler tries to get the AJAX content but runs into a ConnectionFailed (aka 100) error. (I tested this by showing whatever the Parse query error callback returns in a label on the page, so I can see what Google renders.)
Am I doing something wrong, or is this expected behaviour?
Does anyone know how to solve this?
Btw: I am hosting this website on Heroku with a custom domain over HTTPS (with Cloudflare DNS and free SSL).
I also deployed to Parse Cloud Hosting; unfortunately the result is the same :(
This is the full result of the Fetch as Google (screenshot of the full page result).
The page at https://api.parse.com/1/classes/GameScore is asking for authentication, and it returns a 401 Unauthorized status code for unauthorized requests. That's already a problem.
Besides that, the page at https://api.parse.com/robots.txt is currently showing
User-Agent: *
Disallow: /
Googlebot can't access that page because it's disallowed for crawling in the first place, but even if it could access it, it would run into an authentication gate which it wouldn't be able to pass.
If the content from that URL (https://api.parse.com/1/classes/GameScore) is essential for the page where it's referenced/used, you would have to work with Parse to allow crawlers to access those URLs.
If it's not essential, then you can safely ignore that warning.
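For illustration, a minimal sketch (Java 11+ HttpClient) that checks both blockers described above, the robots.txt disallow and the 401, by printing the status code of each URL:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class CrawlCheck {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        for (String url : new String[] {
                "https://api.parse.com/robots.txt",
                "https://api.parse.com/1/classes/GameScore"}) {
            HttpResponse<String> response = client.send(
                HttpRequest.newBuilder(URI.create(url)).GET().build(),
                HttpResponse.BodyHandlers.ofString());
            // Expect "Disallow: /" in the first body and 401 on the second URL.
            System.out.println(url + " -> " + response.statusCode());
        }
    }
}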

Google Maps Places service is giving REQUEST_DENIED

I am using the Google Places API for place suggestions.
https://maps.googleapis.com/maps/api/place/textsearch/json?query=ari&sensor=false&key=your_api_key
I have a valid API key, and this URL works fine when I execute it from the browser.
The API returns "OK" as the status along with place suggestions, but when I execute the same URL via cURL or file_get_contents, it returns "REQUEST_DENIED" as the status and hence no place suggestions.
Why is it behaving like this? Is there a setting I am missing?
Any suggestion would be a great help.
Thanks
Did you ever get an answer to this? As far as I am aware, this is due to "cross-site scripting" security limits. You can't go from the Places API directly to Google from client-side code, even though you can in a browser's address bar. You have to make the call back to your server and have the server send the call to Google, then return those results back to your page/website.
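A minimal sketch of that server-side hop (Java 11+ HttpClient; YOUR_API_KEY is a placeholder, and the endpoint is the one from the question):

import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class PlacesProxy {
    // Server-side relay: the browser calls this server, the server calls Google.
    static String textSearch(String query) throws Exception {
        String url = "https://maps.googleapis.com/maps/api/place/textsearch/json"
            + "?query=" + URLEncoder.encode(query, StandardCharsets.UTF_8)
            + "&sensor=false&key=YOUR_API_KEY"; // placeholder key
        HttpResponse<String> response = HttpClient.newHttpClient().send(
            HttpRequest.newBuilder(URI.create(url)).GET().build(),
            HttpResponse.BodyHandlers.ofString());
        return response.body(); // JSON with "status" and results, relayed to the page
    }
}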

How to get the document IDs of all documents accessible by a user in Google Drive?

I tried using the Google Documents List API endpoint "https://docs.google.com/feeds/default/private/full?max-results=100&showfolders=true" to fetch the list of all files and documents accessible to a user.
But I got the error "Invalid request URI".
Can anyone explain what I am missing?
Try appending &v=3 to the URL (it already has a query string), or alternatively add the HTTP header GData-Version: 3. That should fix it.
This is because this endpoint only works with the latest version of the Documents List API, and by default the API uses v1, so you get the same error as if you were requesting v=1.
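A minimal sketch of the header variant (Java 11+ HttpClient; ACCESS_TOKEN is a placeholder for a valid OAuth token):

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class DocsListV3 {
    public static void main(String[] args) throws Exception {
        URI uri = URI.create(
            "https://docs.google.com/feeds/default/private/full"
            + "?max-results=100&showfolders=true"); // or append &v=3 instead of the header
        HttpRequest request = HttpRequest.newBuilder(uri)
            .header("GData-Version", "3")               // request v3 of the API
            .header("Authorization", "Bearer ACCESS_TOKEN") // placeholder token
            .GET()
            .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode());
    }
}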

SSL Certificate Match Error while calling Google Places API

I have been using the Google Places API in my application. The client code is written in Java. It was working fine until a couple of weeks ago, when it suddenly started throwing the following exception while making API calls:
javax.net.ssl.SSLException: hostname in certificate didn't match: <maps.googleapis.com/209.85.175.95> != <*.googleapis.com> OR <googleapis.com> OR <*.googleapis.com>
I am using the following URL for the API call: https://maps.googleapis.com/maps/api/place/search/json?
I also tried different Google API keys generated from the Google API Console.
Can someone please point out what I am missing here?
Many thanks
I used the latest Google API client (version 1.13.2), which supports better SSL options:
HttpTransport transport = new ApacheHttpTransport.Builder().doNotValidateCertificate().build(); // note: build() returns the transport itself, and certificate validation is disabled (testing only)
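For context, a rough sketch of how such a transport can then be used via the google-http-client request factory (query parameters omitted, as in the question):

import com.google.api.client.http.GenericUrl;
import com.google.api.client.http.HttpRequestFactory;
import com.google.api.client.http.HttpResponse;
import com.google.api.client.http.HttpTransport;
import com.google.api.client.http.apache.ApacheHttpTransport;

public class PlacesSslCheck {
    public static void main(String[] args) throws Exception {
        // Testing only: this transport skips SSL certificate validation.
        HttpTransport transport =
            new ApacheHttpTransport.Builder().doNotValidateCertificate().build();
        HttpRequestFactory factory = transport.createRequestFactory();
        HttpResponse response = factory
            .buildGetRequest(new GenericUrl(
                "https://maps.googleapis.com/maps/api/place/search/json?"))
            .execute();
        System.out.println(response.getStatusCode());
    }
}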
