How to get a book cover from an ISBN using the Google Books API? - google-books

Is there a simple way to get a book cover in JSON format from an ISBN using the Google Books API?

You can use the isbn: query, like this:
https://www.googleapis.com/books/v1/volumes?q=isbn:0771595158
This will return a proper JSON response containing either the book information or an error description if the ISBN is not found.
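If you want to pull the cover URL out of that JSON from Ruby, here is a minimal sketch. The trimmed sample response below is illustrative, but the field names follow the documented items → volumeInfo → imageLinks shape of a volumes response:

```ruby
require 'json'

# Extract the thumbnail cover URL from a Google Books volumes response.
# Returns nil when the ISBN was not found (no "items" array).
def cover_url(volumes_json)
  data = JSON.parse(volumes_json)
  item = (data['items'] || []).first
  return nil unless item
  item.dig('volumeInfo', 'imageLinks', 'thumbnail')
end

# Trimmed stand-in for a real API response.
sample = <<~JSON
  {
    "totalItems": 1,
    "items": [
      {
        "volumeInfo": {
          "title": "The Wars",
          "imageLinks": {
            "thumbnail": "http://books.google.com/books/content?id=abc123&zoom=1"
          }
        }
      }
    ]
  }
JSON

puts cover_url(sample)
```

Note that not every volume has an imageLinks entry, so guard for nil in real code.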

If you are looking for a Ruby (Rails, Sinatra, or console) answer on how to get a cover image, or for that matter any detail available through the Google Books API, I think the GoogleBooks gem is a good place to start.
For example, for the same scenario:
Install the gem
gem install googlebooks
Use the gem
require 'googlebooks'
GoogleBooks.search('isbn:9781443411080').first.image_link
The call to first is needed because search returns a collection of books.

Related

Using the Pinterest Marketing API to fetch all entities?

Is there an edge that would allow fetching all the campaigns, adgroups, etc., of a given advertiser? In researching this I see these:
GET /ads/v3/campaigns/{campaign}/
GET /ads/v3/adgroups/{adgroup}/
...
which return only single entities. I also found that there is an async report that returns entities in batch:
advertisers/<advertiser ID>/entities/batch/
Is this the best way to accomplish this? If so, the doc says it only supports campaigns right now; is that still the case?
Thanks
Sorry to waste your time, folks; I've since found more and better documentation and discovered that there is this:
/ads/v3/advertisers/{advertiser}/campaigns/
Here is a link to the doc for your own reference:
https://developers.pinterest.com/docs/redoc/#operation/ads_v3_get_advertiser_campaigns_handler_GET
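For reference, a hypothetical Ruby sketch of calling that endpoint. The api.pinterest.com host and bearer-token auth header are assumptions on my part, so check the docs linked above for the exact base URL and auth scheme:

```ruby
require 'net/http'
require 'json'
require 'uri'

# Build the campaigns-listing URL for an advertiser.
# Host is an assumption; the path matches the endpoint above.
def campaigns_url(advertiser_id)
  "https://api.pinterest.com/ads/v3/advertisers/#{advertiser_id}/campaigns/"
end

# Fetch and parse the campaign list (auth scheme assumed to be a bearer token).
def fetch_campaigns(advertiser_id, token)
  uri = URI(campaigns_url(advertiser_id))
  req = Net::HTTP::Get.new(uri)
  req['Authorization'] = "Bearer #{token}"
  res = Net::HTTP.start(uri.host, uri.port, use_ssl: true) { |http| http.request(req) }
  JSON.parse(res.body)
end
```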

How do I get a count of the Shopify products in a collection using the Ruby API

I want to get the count of products in each collection in the shop as part of a Shopify App that I'm building.
I know that for a single collection Product.all(params: {collection_id: 29238895}).count will show me the count in the Shopify console, but I'm not certain how it is implemented.
The API document describes a call that counts all products belonging to a certain collection, GET /admin/products/count.json?collection_id=841564295, but I have been unable to come up with a Ruby expression that runs this.
Is there a more complete document on the Ruby API?
If you want to know exactly what is going on with the API, may I suggest the simple command: bundle open shopify_api
That will load the entire API into your text editor, allowing you to quickly determine the answer to your question. The /lib/resources directory is especially rich, but do not forget to check the base class as well. In fact, I think the count option is declared right in the base itself. Nothing beats a few minutes of examining the code.
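For the record, a hedged sketch of what that looks like from Ruby. The exact signature of the count helper may vary between shopify_api versions, so verify it against the source you open:

```ruby
# With an activated shopify_api session, the ActiveResource-style count
# helper in the gem's base class should cover this (signature may vary
# by gem version):
#
#   require 'shopify_api'
#   ShopifyAPI::Product.count(collection_id: 29238895)
#
# Under the hood that maps onto the REST call from the API docs:
def product_count_path(collection_id)
  "/admin/products/count.json?collection_id=#{collection_id}"
end

puts product_count_path(29238895)
```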

Searching a twitter list for certain tags or words

I am learning ruby at the moment and using twitter as a platform to help me build my first prototype in Sinatra. I'm using the Twitter gem and have managed to get a private list of mine and display all the tweets related to the users in that list.
However, I now want to search through the list for a certain set of keywords and, if found, display the tweet.
Does anyone know if there is any way within the Twitter gem to do this? Or how I would go about doing this in Rails in an efficient way?
The only way I can figure out is to iterate through each tweet returned, get the text related to that tweet, and search for the keywords; if found, display that tweet. This seems stupidly inefficient to me, and wouldn't it use up unnecessary API requests?
This is what I have so far if this is of any help to anyone.
require 'sinatra'
require 'rubygems'
require 'twitter'

client = Twitter::REST::Client.new do |config|
  config.consumer_key        = 'xxxx'
  config.consumer_secret     = 'xxx'
  config.access_token        = 'xx'
  config.access_token_secret = 'xx'
end

get '/' do
  #tweet = client.list_timeline(1231123123123, { :include_rts => 0 })
  erb :index
end
Many thanks in advance
Matt
You are correct about this: iterate through each tweet returned, get the text related to that tweet and search for the keywords, if found display that tweet.
You wrote: "this to me is stupidly inefficient". You are correct. It's inefficient because you have to retrieve all the tweets, rather than just the tweets that contain the keywords that you want.
The Twitter gem's search does not do what you want, because Twitter search is slightly unpredictable: it optimizes for relevancy, not thoroughness.
What you're looking for, I think, is Twitter "Streams". When you ask for a Twitter Stream, you get all the tweets from a user (or site, or globally) in real time. This is more sophisticated to set up, but it gives you everything.
https://dev.twitter.com/streaming/overview
Then you search the tweets within Rails.
If you want a simple search, you may want to look at using Ruby's select method and Regexp class.
If you want powerful search capabilities, you may want to look at various search gems and search engines such as sunspot_solr and Lucene.
If you're building a real-world business application with more advanced needs for scaling and searching, you may want to read about Twitter Firehose partners and text engines such as discovertext. These partners and engines provide real-time search APIs, caching, and many more features.
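The select-plus-Regexp approach mentioned above can be sketched like this; plain strings stand in for the Twitter gem's tweet objects, on which you would call text instead:

```ruby
# Filter an already-fetched timeline for tweets mentioning any keyword.
# Matching is whole-word and case-insensitive.
def matching_tweets(tweets, keywords)
  pattern = Regexp.union(keywords.map { |k| /\b#{Regexp.escape(k)}\b/i })
  tweets.select { |tweet| tweet =~ pattern }
end

timeline = [
  'Ruby 2.0 released today',
  'Lovely weather in London',
  'Learning Sinatra and loving it'
]

puts matching_tweets(timeline, %w[ruby sinatra])
```

This still fetches the whole timeline first, so it does not reduce API usage; it only makes the in-app filtering tidy.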
Consider using the Twitter gem's search method.

Ruby DataMapper: How can I do pagination?

I am going to retrieve a list of objects.
get "/todoitems/?" do
  debugger
  todo = Todolist.all
  todo.to_json
end
Is there an example that can retrieve results page by page?
Many thanks.
Here's a gem, dm-pagination, which provides pagination support for DataMapper.
I also found dm-paginator
This should point you in the right direction:
http://ruby.railstutorial.org/chapters/updating-showing-and-deleting-users#sec-pagination
Essentially, there's a gem called will_paginate that takes care of presenting results in a page-by-page structure. There's an example included in the link.
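If you would rather not add a gem, DataMapper's own :limit and :offset query options are enough for basic paging. A small sketch, with the route and model names taken from the question:

```ruby
PER_PAGE = 10

# Translate a 1-based page number into DataMapper query options.
# Non-numeric or missing pages fall back to page 1.
def page_options(page, per_page = PER_PAGE)
  page = [page.to_i, 1].max
  { offset: (page - 1) * per_page, limit: per_page }
end

# In the Sinatra route from the question this would become:
#   get '/todoitems/?' do
#     Todolist.all(page_options(params[:page])).to_json
#   end

puts page_options(3).inspect
```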

What is the correct way to get google search results?

I want to get all the search results for a particular keyword search on Google. I've seen suggestions of scraping, but this seems like a bad idea. I've seen gems (I plan on using Ruby) that do scraping and use the API. I've also seen suggestions of using the API.
Does anyone know the best way to do this right now? The API is no longer supported, and I've seen people report that they get unusable data back. Do the gems help solve this or not?
Thanks in advance.
I also go for the scrape option; it's quicker than asking Google for a key, and you are not limited to 100 search queries per day. Google's TOS is an issue though, as Richard points out.
Here's an example I've done that works for me. It's also useful if you want to connect through a proxy:
require 'rubygems'
require 'mechanize'

agent = Mechanize.new
agent.set_proxy '78.186.178.153', 8080

page = agent.get('http://www.google.com/')
google_form = page.form('f')
google_form.q = 'new york city council'
page = agent.submit(google_form, google_form.buttons.first)

page.links.each do |link|
  if link.href.to_s =~ /url.q/
    str = link.href.to_s
    str_list = str.split(%r{=|&})
    url = str_list[1]
    puts url
  end
end
According to http://code.google.com/apis/websearch/ , the Search API has been deprecated -- but there's a replacement, the Custom Search API. Will that do what you want?
If so, a quick Web search turned up https://github.com/alexreisner/google_custom_search , among other gems.
Use the Google Custom Search API:
http://code.google.com/apis/customsearch/v1/overview.html
The Custom Search API most likely is not what you're looking for. I'm pretty sure you have to set up a Custom Search engine which you use the API to query, and this can only search over a user-specified set of domains (i.e. you can't perform general web search).
If you need to perform a general Google search, then scraping is currently the only way to go. It's quite easy to write ruby code to perform Google searches and scrape the search results URLs (I did this myself for a summer research project), but it does violate Google's TOS, so be warned.
You will eventually get 503 errors if you run a scraper on a Google search results page. A more scalable (and legal) approach is to use Google's Custom Search API.
The API provides 100 search queries per day for free. If you need more, you may sign up for billing in the Google Developers Console. Additional requests cost $5 per 1000 queries, up to 10k queries per day.
The example below gets Google search results in JSON format:
require 'open-uri'
require 'httparty'
require 'pp'

def get_google_search_results(search_phrase)
  # assign API key and search engine ID (both required by the API)
  api_key = "Your api key here"
  cx = "Your search engine ID here"
  # encode the search phrase
  search_phrase_encoded = URI::encode(search_phrase)
  # get the API response (num is capped at 10 results per request)
  response = HTTParty.get("https://www.googleapis.com/customsearch/v1?q=#{search_phrase_encoded}&key=#{api_key}&cx=#{cx}&num=10")
  # pretty-print the API response
  pp response
  # return the URL of the first search result
  response["items"][0]["link"]
end
get_google_search_results("Top Movies in Theatres")
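One caveat: the Custom Search API caps num at 10 results per request and never serves beyond the first 100 results, so larger result sets are collected by paging with the 1-based start parameter. A small helper for computing the offsets:

```ruby
# Each request returns at most 10 results, so page through with
# start=1, 11, 21, ... The API never serves beyond the first 100 results.
def start_offsets(wanted, per_page = 10)
  wanted = [wanted, 100].min
  (1..wanted).step(per_page).to_a
end

puts start_offsets(35).inspect
```

Each value goes into the request URL as &start=..., one request per offset.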
You can also use our API. We take care of the hard parts of scraping and parsing Google search results. We have Ruby bindings that are as simple as:
query = GoogleSearchResults.new q: "coffee"
hash_results = query.get_hash
Repository: https://github.com/serpapi/google-search-results-ruby