GSA Serving Logs API - google-search-appliance

I am using GSA (Google Search Appliance) 7.2 and want to access the Serving Logs from Java code.
The Serving Logs are available in the admin console under Reports -> Serving Logs, but I cannot find any web service exposed by the GSA for retrieving these logs (maybe I am missing something).
I want the serving logs in order to analyze detailed query information such as 'Number of documents scored for this query' and 'Total time spent in backend'.
Thank you

You'll need to screen-scrape the admin console and hit the download link from code, as the Serving Logs are not exposed through the Admin API.
GSA Admin API Documentation v7.2
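A minimal sketch of that approach, assuming Java 11+ and that the appliance's admin console accepts a form login and serves the Serving Logs export over HTTPS; the URLs, paths and form fields below are illustrative placeholders, not documented GSA endpoints:

import java.net.CookieManager;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.file.Path;

public class ServingLogsDownloader {
    public static void main(String[] args) throws Exception {
        // Keep the admin-console session cookie between the login and the download.
        HttpClient client = HttpClient.newBuilder()
                .cookieHandler(new CookieManager())
                .build();

        // 1) Log in to the admin console (hypothetical URL and form fields).
        HttpRequest login = HttpRequest.newBuilder(
                        URI.create("https://gsa.example.com:8443/EnterpriseController"))
                .header("Content-Type", "application/x-www-form-urlencoded")
                .POST(HttpRequest.BodyPublishers.ofString(
                        "actionType=authenticateUser&userName=admin&password=secret"))
                .build();
        client.send(login, HttpResponse.BodyHandlers.discarding());

        // 2) Hit the Serving Logs download link (placeholder path and parameters)
        //    and save the response body to a local file for analysis.
        HttpRequest download = HttpRequest.newBuilder(
                        URI.create("https://gsa.example.com:8443/EnterpriseController"
                                + "?actionType=servingLogs&action=download"))
                .GET()
                .build();
        client.send(download, HttpResponse.BodyHandlers.ofFile(Path.of("serving_logs.txt")));
    }
}

The actual login action and download URL have to be taken from the admin console pages themselves (e.g. by inspecting the form and the download link in the browser).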

Related

How to search for web and image results from the Google Custom Search JSON API?

I'm implementing Google's Custom Search JSON API in my application, which should search for both web and image results. I can query for web and images independently.
For web: https://www.googleapis.com/customsearch/v1?key=${key}&start=${start}&cx=${cx}&q=${searchVal}
For images: https://www.googleapis.com/customsearch/v1?key=${key}&searchType=image&start=${start}&cx=${cx}&q=${searchVal}
This works perfectly. However, as you can see, I have to make two separate requests to get the web results and the images for a single query value. This is an issue because it makes it easier to exceed Google's search query quota. So I need a way to make a single request that returns both the web and image results for a particular search value. Any help or recommendation would be appreciated.
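For reference, a sketch of what those two calls look like from code (Java here; MY_KEY, MY_CX and the query value are placeholders). As described above, the web and image searches remain two separate requests:

import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class CustomSearchClient {
    private static final String BASE = "https://www.googleapis.com/customsearch/v1";
    private static final HttpClient CLIENT = HttpClient.newHttpClient();

    // One call per result type: searchType=image switches the same endpoint to image results.
    static String search(String key, String cx, String query, boolean images) throws Exception {
        String url = BASE + "?key=" + key + "&cx=" + cx
                + "&q=" + URLEncoder.encode(query, StandardCharsets.UTF_8)
                + (images ? "&searchType=image" : "");
        HttpRequest request = HttpRequest.newBuilder(URI.create(url)).GET().build();
        return CLIENT.send(request, HttpResponse.BodyHandlers.ofString()).body();
    }

    public static void main(String[] args) throws Exception {
        String web = search("MY_KEY", "MY_CX", "kittens", false);   // web results
        String img = search("MY_KEY", "MY_CX", "kittens", true);    // image results
        System.out.println(web.length() + " / " + img.length());
    }
}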

How to get a list of page URLs with bad Core Web Vitals metrics from Google Search Console via API or another Google API?

There is a report https://search.google.com/search-console/core-web-vitals/drilldown which contains only some sample pages with problematic CLS (not a complete list of pages is displayed). How can I get a complete list of problem pages via API?
I tried to use https://developers.google.com/webmaster-tools/v1/searchanalytics/query?apix_params=%7B%22siteUrl%22%3A%22https%3A%2F%2Fwww.example.com%22%2C%22resource%22%3A%7B%22startDate%22%3A%222022-03-09%22%2C%22endDate%22%3A%222022-03-14%22%2C%22dimensions%22%3A%5B%22PAGE%22%5D%7D%7D, but that API does not return Core Web Vitals metrics.
As you noted, Search Console only provides a limited list of sample URLs and the API doesn't include Core Web Vitals data.
There are a couple of ways to figure out which pages have poor CWV:
* Measure CWV on your website directly (see web-vitals.js) and use an analytics service to aggregate the data by page.
* Get a list of every URL on the website and query each one using the CrUX API; a tool can crawl your sitemap automatically and write the CWV results to a spreadsheet (see the sketch below).
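A rough sketch of the second option in Java, assuming you already have a flat list of URLs and a CrUX API key (MY_CRUX_API_KEY and the URLs are placeholders; sitemap crawling, JSON parsing and spreadsheet output are left out):

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.List;

public class CruxChecker {
    private static final String ENDPOINT =
            "https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=";

    public static void main(String[] args) throws Exception {
        String apiKey = "MY_CRUX_API_KEY";                 // placeholder API key
        List<String> urls = List.of(                       // in practice, crawl your sitemap
                "https://www.example.com/",
                "https://www.example.com/some-page");

        HttpClient client = HttpClient.newHttpClient();
        for (String url : urls) {
            // Query CrUX field data for one URL on one form factor.
            String body = "{\"url\":\"" + url + "\",\"formFactor\":\"PHONE\"}";
            HttpRequest request = HttpRequest.newBuilder(URI.create(ENDPOINT + apiKey))
                    .header("Content-Type", "application/json")
                    .POST(HttpRequest.BodyPublishers.ofString(body))
                    .build();
            HttpResponse<String> response =
                    client.send(request, HttpResponse.BodyHandlers.ofString());
            // The JSON response carries the field metrics (LCP, CLS, ...) per URL;
            // parse it and write the values to your spreadsheet of choice.
            System.out.println(url + " -> " + response.statusCode());
        }
    }
}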

Google API Calendar PHP SSL error

I have written code to fetch Google Calendar events using the Google API PHP client package. It works perfectly fine, except that once it fetches one user's details, those details are cached on the server (CentOS). To my knowledge, this has something to do with the Google API client's caching: I have to wait about 15 minutes before another user's details can be fetched, otherwise the previous user's details are shown.

Sending request to Google not using Contacts API

I was checking the API reports for the Contacts, Calendar and Tasks APIs. I was surprised to see that the number of requests for the Contacts API is 0 for the last 28 days, even though we sync thousands of contacts with Google every day. Please refer to the attached screenshot.
From the stats it seems that the requests we are making to Google are NOT using the Contacts API.
Overview of our application's google integration:
* Our application is built on Ruby on Rails.
* We are using the 'google-contacts' gem (https://github.com/varunlalan/google-contacts) for syncing contacts.
* We authenticate users using the 'omniauth-google-oauth2' gem (https://github.com/zquestz/omniauth-google-oauth2).
* OAuth 2 scopes include "userinfo.email, userinfo.profile, https://www.google.com/m8/feeds/".
Any reason why it is not making use of the Contacts API, or why the requests are not showing up in the reports?
Any help or inputs would be highly appreciated.
Thanks.
Just wanted to add that we're facing the same issue.
We've been using the Contacts API heavily (we even got a 503, apparently from exceeding the maximum requests per second), and yet the dashboard and the reports show 0% usage, which makes it a bit difficult to plan ahead!
After further investigation we have also seen that the per-user quota is fixed at 10 requests/user/second, regardless of any changes made in the API console.
Also, the quotaUser parameter, which is meant to let developers manage their quota more effectively, is ignored.

Google Not Indexing AJAX URLs

I have submitted a sitemap for my AJAX web application to Google via their Webmaster Tools. The submitted URLs are of the form:
http://www.mysite.com/#!myscreen;id=object-id
http://www.mysite.com/#!myotherscreen;id=another-id
However, even though more than a week has passed since the sitemap submission, Google has not indexed the URLs. Google reports that the sitemap has been processed, that 60 URLs have been detected and that no errors occurred, but none of the URLs have been indexed.
I have already implemented the AJAX crawlability contract on the server side, so that requests containing an _escaped_fragment_ parameter are answered with an HTML snapshot.
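For context, the (now deprecated) scheme rewrites a #! URL such as http://www.mysite.com/#!myscreen;id=object-id into a crawler request carrying an _escaped_fragment_ query parameter. A minimal sketch of such a server-side handler (Java servlet; renderSnapshot is a hypothetical helper, e.g. a headless-browser render) might look like this:

import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Googlebot rewrites http://www.mysite.com/#!myscreen;id=object-id into
// http://www.mysite.com/?_escaped_fragment_=myscreen;id=object-id
public class SnapshotServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        String fragment = req.getParameter("_escaped_fragment_");
        if (fragment != null) {
            // Crawler request: serve a pre-rendered HTML snapshot of this AJAX state.
            resp.setContentType("text/html");
            resp.getWriter().write(renderSnapshot(fragment));
        } else {
            // Normal users get the regular AJAX application shell.
            req.getRequestDispatcher("/index.html").forward(req, resp);
        }
    }

    // Hypothetical placeholder for whatever produces the snapshot HTML.
    private String renderSnapshot(String fragment) {
        return "<html><body>Snapshot for state: " + fragment + "</body></html>";
    }
}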
Any help/info regarding why Google is not indexing the URLs would be greatly appreciated.
See GWT SE friendly application
Suggestions include following the guide at http://code.google.com/web/ajaxcrawling/.
Nowadays you don't need to do anything specific for Google anymore; the AJAX crawling scheme has been deprecated by Google.
Just make sure that your website is easy to use for your users, and Google will be able to crawl it properly.
If you want to go the extra mile, however, you can check that article:
* https://moz.com/blog/optimizing-angularjs-single-page-applications-googlebot-crawlers
