Problems concatenating array elements within a loop in Ruby

I'm very new to Ruby and am having some problems concatenating strings within a loop.
Here is what I have so far:
# search request
search = ["testOne", "testTwo"]
# Create base url for requests in loop
base_url = "http://example.com/"
# create an empty response array for loop below
response = []
search.each do |element|
response = "#{base_url}#{element}"
end
I'd like response[0] to hold "http://example.com/testOne". However, after the loop executes, response[0] only holds the first letter ("h") of my base_url string; response itself holds "http://example.com/testTwo".
I'm thinking this is a simple mistake, but can't find any helpful resources.

Use the Array#<< method
# search request
search = ["testOne", "testTwo"]
# Create base url for requests in loop
base_url = "http://example.com/"
# create an empty response array for loop below
response = []
search.each do |element|
response << "#{base_url}#{element}"
end
response # => ["http://example.com/testOne", "http://example.com/testTwo"]
response = "#{base_url}#{element}" means you are assigning a new string object to the local variable response on each iteration. After the last iteration, response holds the string object "http://example.com/testTwo". Now response[0] calls the method String#[], and the character at index 0 of the string "http://example.com/testTwo" is "h", so your response[0] returning 'h' is expected given your code.
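A quick demonstration of that String#[] behavior (using the same URL string as above):

```ruby
url = "http://example.com/testTwo"

# String#[] with an Integer returns the character at that index
first_char = url[0]   # "h"

# With a start index and a length it returns a substring
scheme = url[0, 7]    # "http://"
```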
The same code can be written more concisely:
# search request
search = ["testOne", "testTwo"]
# Create base url for requests in loop
base_url = "http://example.com/"
response = search.map {|element| base_url+element }
response # => ["http://example.com/testOne", "http://example.com/testTwo"]
or
response = search.map(&base_url.method(:+))
response # => ["http://example.com/testOne", "http://example.com/testTwo"]
or, as Michael Kohl pointed out:
response = search.map { |s| "#{base_url}#{s}" }
response # => ["http://example.com/testOne", "http://example.com/testTwo"]

Updating the same hash in each iteration of a loop in Ruby

Updated with getEntitlementList() method
I am new to Ruby and am trying to update a hash with data retrieved in steps.
Short story: I got my other problem solved, and it was about this:
[1]: split and loop in Ruby
It has 2 problems:
Even though I am increasing the offset stepwise by 250 (offset=0, offset=250, offset=500, up to offset=3000), it returns the same response each time, not different ones.
On every new offset, I want to update the same hash so that at the end I have a hash of all items.
API Doc says:
1. limit >> Integer that specifies the maximum number of records to return in a single API call. If not specified, a default limit will be used. Maximum of 250 records per page.
2. offset >> Integer that specifies the offset of the first result from the beginning of the collection. offset is record based, not page based, and the index starts at 0. For example, offset=0 and limit=20 will return records 0-19, while offset=1 and limit=20 will return records 1-20. Between 0 and the last record index.
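The offset/limit paging described in that doc can be illustrated with a plain array slice (the `records` data here is a made-up stand-in, not the real API):

```ruby
records = (0...30).to_a   # stand-in for the API's record collection
limit = 20

# offset=0, limit=20 returns records 0-19
page_one = records[0, limit]

# offset=1, limit=20 returns records 1-20
page_two = records[1, limit]
```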
def getEntitlementList(url, header, source, limit, offset)
  # This method pulls the list of existing entitlements for a given source.
  # The list is used to create a hash map of displayableName vs id.
  # Returns a hash map.
  uri = URI.parse("#{url}/cc/api/entitlement/list?limit=#{limit}&offset=#{offset}&CISApplicationId=#{source}")
  puts uri
  http = Net::HTTP.new(uri.host, uri.port)
  http.use_ssl = true
  request = Net::HTTP::Get.new(uri.request_uri, header)
  # puts "request : #{request}"
  response = http.request(request)
  # puts "entitlements hash map : #{response}"
  # puts "#{JSON.parse(response.body)}"
  case response
  when Net::HTTPSuccess
    responseBody = JSON.parse(response.body)
    # puts responseBody
  when Net::HTTPUnauthorized
    puts "#{source} Entitlements Hash Map creation is a Failure!"
    puts "Error: Unauthorized."
  when Net::HTTPServerError
    puts "#{source} Entitlements Hash Map creation is a Failure!"
    puts "Error: Server Error."
  else
    puts "#{source} Entitlements Hash Map creation is a Failure!"
    puts "Error: #{response.code}"
  end
  return responseBody
end
What I want to achieve: I have the code below:
limit = 250
entitlementHash = Hash.new
(0..3000).step(250) do |offset|
  responseBody = getEntitlementList(url, header, source, limit, offset)
  count = responseBody['count']
  entitlementList = responseBody['items']
  entitlementList.each { |a| entitlementHash["#{a['displayableName'].upcase()}"] = a['id'] }
  puts "updated entitlementHash size: #{entitlementHash.size}"
end
return entitlementHash
My API has 3000 data items, but each call pulls only 250.
So every time it pulls data, I would like to update entitlementHash with the newer data,
so that after the last run entitlementHash holds all 3000 items.
Currently it only contains the last 250 items.
I tried the following:
entitlementHash.update
entitlementHash.store
entitlementHash.merge
but none of them worked.
Any help would be greatly appreciated.
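For reference, here is a minimal sketch of accumulating paginated results into a single hash, with a stubbed `fetch_page` (the stub and its data are invented for illustration, not your API):

```ruby
# Stub that returns 2 items per "page"; replace with the real API call
def fetch_page(offset)
  all_items = [
    { 'displayableName' => 'alpha', 'id' => 1 },
    { 'displayableName' => 'beta',  'id' => 2 },
    { 'displayableName' => 'gamma', 'id' => 3 },
    { 'displayableName' => 'delta', 'id' => 4 }
  ]
  { 'items' => all_items[offset, 2] || [] }
end

entitlement_hash = {}
(0..3).step(2) do |offset|
  page = fetch_page(offset)
  # Assigning into the same hash each iteration already accumulates;
  # no explicit merge/update call is needed
  page['items'].each do |item|
    entitlement_hash[item['displayableName'].upcase] = item['id']
  end
end
```

Note that `hash[key] = value` inside the loop already accumulates across pages, so if every page returns the same 250 items, the more likely culprit is the offset parameter not actually changing the request sent to the API.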

Why doesn't my web-crawling method find all the links?

I'm trying to create a simple web-crawler, so I wrote this:
(The get_links method takes a parent link from which we will search.)
require 'nokogiri'
require 'open-uri'
def get_links(link)
  link = "http://#{link}"
  doc = Nokogiri::HTML(open(link))
  links = doc.css('a')
  hrefs = links.map { |link| link.attribute('href').to_s }.uniq.delete_if { |href| href.empty? }
  array = hrefs.select { |i| i[0] == "/" }
  host = URI.parse(link).host
  links_list = array.map { |a| "#{host}#{a}" }
end
(The search_links method takes an array from get_links and searches within it.)
def search_links(urls)
  urls = get_links(link)
  urls.uniq.each do |url|
    begin
      links = get_links(url)
      compare = urls & links
      urls << links - compare
      urls.flatten!
    rescue OpenURI::HTTPError
      warn "Skipping invalid link #{url}"
    end
  end
  return urls
end
This method finds most of the links on the website, but not all.
What did I do wrong? Which algorithm should I use?
Some comments about your code:
def get_links(link)
  link = "http://#{link}"
  # You're assuming the protocol is always http.
  # This isn't the only protocol used on the web.
  doc = Nokogiri::HTML(open(link))
  links = doc.css('a')
  hrefs = links.map { |link| link.attribute('href').to_s }.uniq.delete_if { |href| href.empty? }
  # You can write these two lines more compactly as
  # hrefs = doc.xpath('//a/@href').map(&:to_s).uniq.delete_if(&:empty?)
  array = hrefs.select { |i| i[0] == "/" }
  # I guess you want to handle URLs that are relative to the host.
  # However, URLs relative to the protocol (starting with '//')
  # will also be selected by this condition.
  host = URI.parse(link).host
  links_list = array.map { |a| "#{host}#{a}" }
  # The value assigned to links_list will implicitly be returned.
  # (The assignment itself is futile; the right-hand part alone would
  # suffice.) Because this builds on `array`, all absolute URLs will be
  # missing from the return value.
end
Explanation for
hrefs = doc.xpath('//a/@href').map(&:to_s).uniq.delete_if(&:empty?)
.xpath('//a/@href') uses the attribute syntax of XPath to get directly at the href attributes of a elements
.map(&:to_s) is an abbreviated notation for .map { |item| item.to_s }
.delete_if(&:empty?) uses the same abbreviated notation
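A small illustration of that `&:symbol` shorthand (sample data unrelated to the crawler):

```ruby
# &:to_s calls to_s on each element,
# exactly like .map { |item| item.to_s }
words = [:foo, :bar].map(&:to_s)

# &:empty? works the same way with delete_if
cleaned = ["a", "", "b"].delete_if(&:empty?)
```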
And comments about the second function:
def search_links(urls)
  urls = get_links(link)
  urls.uniq.each do |url|
    begin
      links = get_links(url)
      compare = urls & links
      urls << links - compare
      urls.flatten!
      # How about using a Set instead of an Array, and
      # thus have the collection provide uniqueness of
      # its items, so that you don't have to?
    rescue OpenURI::HTTPError
      warn "Skipping invalid link #{url}"
    end
  end
  return urls
  # This function isn't recursive; it just calls `get_links` on two
  # 'levels'. Thus you search only two levels deep and return findings
  # from the first and second level combined. (Without the "zeroth"
  # level, the URL passed into `search_links`. Unless of course it
  # also occurred on the first or second level.)
  #
  # Is this what you intended?
end
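A minimal sketch of the Set idea suggested in the comments, with a stubbed `get_links` in place of real HTTP fetching (the link graph is invented for illustration):

```ruby
require 'set'

# Stubbed link graph; the real crawler would fetch and parse pages here
LINKS = {
  'a.example' => ['b.example', 'c.example'],
  'b.example' => ['a.example', 'd.example']
}.freeze

def get_links(url)
  LINKS.fetch(url, [])
end

def search_links(start_url)
  seen = Set.new([start_url])
  queue = [start_url]
  until queue.empty?
    url = queue.shift
    get_links(url).each do |link|
      # Set#add? returns nil if the element was already present,
      # so the uniqueness bookkeeping is handled for us
      queue << link if seen.add?(link)
    end
  end
  seen.to_a
end
```

With this shape, `search_links('a.example')` visits each page exactly once, however deep the link graph goes.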
You should probably be using mechanize:
require 'mechanize'

agent = Mechanize.new
page = agent.get(url)
links = page.search('a[href]').map { |a| page.uri.merge(a[:href]).to_s }
# if you want to remove links with a different host
links.reject! { |l| URI.parse(l).host != page.uri.host }
Otherwise you'll have trouble converting relative urls to absolute properly.
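The relative-to-absolute conversion that `page.uri.merge` performs is standard RFC 3986 resolution, which the stdlib `URI` class can demonstrate on its own (example URLs are made up):

```ruby
require 'uri'

base = URI.parse('http://example.com/dir/page.html')

# Host-relative path: replaces the whole path
abs1 = base.merge('/about').to_s
# Protocol-relative URL: keeps the base scheme
abs2 = base.merge('//other.example/x').to_s
# Document-relative path: resolves against the current directory
abs3 = base.merge('img/logo.png').to_s
```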

Error: undefined local variable or method 'foo' for main:Object (NameError)

I am trying to find the needle in the haystack. I have already received the dictionary with its two keys and values.
ruby haystack.rb
{"haystack"=>["D0zVh", "F1PFc", "j1WMn", "Ebz3k", "SE7gZ", "kOa7j", "0vCJb", "px18q", "NJSyl", "nRsOK", "T7t8F", "2jvwZ", "5414s", "q5z8U", "TI2Zm", "v4Bn9", "5dRcM", "M84vp", "8nQ0o", "OxEKw"], "needle"=>"v4Bn9"}
The first value, needle, is the string. The second value, haystack, is an array of strings.
The next step is to tell the API where the needle is in the haystack array.
I need to post my result to "api/validateneedle", using the key token for my token, and the key needle for the integer representing where the needle is in the haystack array.
When I run this file, I get the following error:
haystack.rb:59:in `<main>': undefined local variable or method `token' for main:Object (NameError)
Can anyone tell me why I'm receiving this error message? I really appreciate any help/feedback!
token_info = { :token => "SVilLuY0OU" }

require 'net/http'
require 'json'

http = Net::HTTP.new("challenge.code2040.org")
# Sending json in the body of an http request
# Creating a request that will use the post http method
body = token_info.to_json
request = Net::HTTP::Post.new("/api/haystack")
# Setting the request body to be our json
request.body = body
response = http.request(request)
# Printing the body of the response
response_hash = JSON.parse(response.body)
puts response_hash["result"]

def getIndex(response)
  needle = response["result"]["needle"]
  haystack = response["result"]["haystack"]
  i = 0
  while i < haystack.length
    # if we find it, break the loop and return i
    if haystack[i] == needle
      return i.to_s
    end
    i += 1
  end
  return "not found"
end

def sendIndex(token)
  response = getItems(token)
  index = getIndex(response)
  params = { 'token' => token, 'needle' => index }
  request = Net::HTTP::Post.new("/api/validateneedle")
end

sendIndex(token)
The error message is self-explanatory: on line 59, you are passing the argument token to the method sendIndex, but token isn't defined, neither as a method nor as a local variable.
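One way to fix it, under the assumption that the token you want is the one stored in `token_info`, is to pass that value explicitly (a simplified sketch, not the full script):

```ruby
token_info = { :token => "SVilLuY0OU" }

def sendIndex(token)
  # ... build and post the request using token ...
  token  # returned here only so the sketch is self-contained
end

# Pass the value from the hash instead of a bare, undefined `token`
result = sendIndex(token_info[:token])
```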

Extract return string parameter values

I have a request that looks like this:
url = "https://api-3t.sandbox.paypal.com/nvp?METHOD=DoExpressCheckoutPayment&TOKEN=#{transaction.token}&PAYERID=#{transaction.payer_id}&PAYMENTREQUEST_n_PAYMENTACTION=sale"
url = CGI.escape(url)
uri = URI(url)
res = Net::HTTP.get_response(uri)
and res.body looks like this:
TOKEN=EC%2d7UM71457T34680821&TIMESTAMP=2013%2d11%2d03T21%3a19%3a11Z&CORRELATIONID=3b73c396244ff&ACK=Success&VERSION=98&BUILD=8334781
How can I get the TOKEN and ACK values from the string? I'm not sure params works here.
Any ideas?
The body is URI-encoded, just like GET (or some POST) params. You could unpack it manually, by doing something like this:
require 'uri'

# body takes the place of res.body for this example
body = 'TOKEN=EC%2d7UM71457T34680821&TIMESTAMP=2013%2d11%2d03' +
       'T21%3a19%3a11Z&CORRELATIONID=3b73c396244ff&AC' +
       'K=Success&VERSION=98&BUILD=8334781'

# First split into key/value pairs, and use inject to start building a hash
results = body.split('&').inject({}) do |hash, kv|
  # Split each key/value pair
  k, v = kv.split('=').map do |uri_encoded_value|
    # Decode - innermost loop, because it may contain encoded '&' or '='
    URI.decode(uri_encoded_value)
  end
  # Add to the hash we are building with inject
  hash[k] = v
  hash
end
=> {"TOKEN"=>"EC-7UM71457T34680821", "TIMESTAMP"=>"2013-11-03T21:19:11Z",
"CORRELATIONID"=>"3b73c396244ff", "ACK"=>"Success", "VERSION"=>"98",
"BUILD"=>"8334781"}
Actually, though, URI can do nearly all of this for you (and deal with variations in the format better than the above) with the decode_www_form class method:
params = {}
URI.decode_www_form(body).each do |k, v|
  params[k] = v
end
params
=> {"TOKEN"=>"EC-7UM71457T34680821", "TIMESTAMP"=>"2013-11-03T21:19:11Z",
"CORRELATIONID"=>"3b73c396244ff", "ACK"=>"Success", "VERSION"=>"98",
"BUILD"=>"8334781"}
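Since `decode_www_form` returns an array of key/value pairs, the hash-building loop can also be collapsed with `to_h` (a shortened `body` string is used here for brevity):

```ruby
require 'uri'

body = 'TOKEN=EC%2d7UM71457T34680821&ACK=Success&VERSION=98'

# decode_www_form returns [[key, value], ...]; to_h folds it into a hash,
# decoding percent-escapes (%2d becomes "-") along the way
params = URI.decode_www_form(body).to_h
```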

Retrieving full request string using Ruby curl

I intend to send a request like the following:
c = Curl::Easy.http_post("https://example.com", json_string) do |curl|
  curl.headers['Accept'] = 'application/json'
  curl.headers['Content-Type'] = 'application/json'
  curl.headers['Api-Version'] = '2.2'
end
I want to log the exact http request that is being made. Is there a way to get the actual request that was made (base path, query parameters, headers and body)?
The on_debug handler has helped me before. In your example you could try:
curl.on_debug do |type, data|
  puts type, data
end
You can reach the solution in different ways:
Inside your block you can put:
curl.verbose = true # that prints a detailed output of the connection
Or outside the block:
c.url # return the url with queries
c.total_time # retrieve the total time for the prev transfer (name resolving, TCP,...)
c.header_str # return the response header
c.headers # return your call header
c.body_str # return the body of the response
Remember to call c.perform (if not yet performed) before calling these methods.
Many more options can be found here: http://curb.rubyforge.org/classes/Curl/Easy.html#M000001