For loop inside a <<-eos heredoc in Ruby

I'm a rookie in Ruby. I'm trying to write a JSON file with Ruby so I can later import it into a MongoDB collection. I need the document to keep proper indentation so I can fill it in comfortably.
At the moment I'm doing it this way, but I'm sure it isn't the recommended approach:
out_file = File.new('file.json', "w+")
str = <<-eos
{
"key1": #{#value1},
"key2" : #{#value2},
"key3" : {
"subkey_3_1" : {
"key" : #{#value},
"questions" : #{#invalid_questions}
},
"subkey_3_2" : {
"key" : #{value},
"array_key" : [
for i in 1..50
# Here, 50 key-value hashes like this must be created.
{},
{},
{},
...
end
]
}
}
}
eos
out_file.puts(str)
out_file.close
This is the final structure that I want. Thanks, and sorry for not explaining it right from the start.
How can I define it in Ruby?

str = <<-eos
"key" : [
#{ (1..50).map { |i| "...something content..." }.join("\n") }
]
eos
However, why do you want a string here at all? I don't know exactly what you are trying to do, but there must be a better way of doing it.
UPDATE:
Yep, as mentioned by @ArupRakshit, you need to build the hash first and call to_json on it. If you don't have this method, install the activesupport gem and require 'active_support/core_ext' (no need to do this in a Rails app). Do not build the JSON response manually.
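For example, here is a minimal sketch of that approach, assuming the instance variables from the question (@value1, @value2, @value, @invalid_questions) are in scope; JSON.pretty_generate from the standard json library keeps the indentation so the file stays easy to edit by hand:
require 'json'

# Sketch only: the instance variables are assumed to exist, as in the question.
data = {
  key1: @value1,
  key2: @value2,
  key3: {
    subkey_3_1: {
      key: @value,
      questions: @invalid_questions
    },
    subkey_3_2: {
      key: @value,
      # 50 placeholder hashes, as in the question; fill in real content here
      array_key: Array.new(50) { {} }
    }
  }
}

File.write('file.json', JSON.pretty_generate(data))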

Related

Ruby: POST-requested JSON to a structured array or hash for CSV

{
"TOTAL" : "520"
,
"PROD_101379" : {
"IMG" : "1406301107587209.jpg",
"NAME" : "hello sunny",
"LINK" : "/product/productDetail.do?seq=101379",
"SEQ" : "101379",
"PRICE" : "18000",
"MILEAGE" : "2"
}
,
"PROD_101378" : {
"IMG" : "",
"NAME" : "special gift",
"LINK" : "/product/productDetail.do?seq=101378",
"SEQ" : "101378",
"PRICE" : "3000",
"MILEAGE" : "2"
}
,
"PROD_101376" : {
"IMG" : "1405020190326241.jpg",
"NAME" : "it radiant",
"LINK" : "/product/productDetail.do?seq=101376",
"SEQ" : "101376",
"PRICE" : "45000",
"MILEAGE" : "2"
}
,
"PAGER" : "<div class="paging"><a href="javascript:pageForm('0');
}
I am learning how to program in Ruby, using the Nokogiri gem to parse data from the internet.
The data above was received after a POST request using Net::HTTP and then parsed with the JSON gem:
JSON.parse(x.body)
I am trying to turn that data into CSV like so:
IMG | Name | Link | SEQ | Price | Mileage |
for each PROD_xxxxxx entry.
I've read the gem documentation and looked at other questions here, and realized that, much as with Nokogiri parsing HTML (or rather Arrays and Hashes), I should be able to parse the JSON format.
I found that I can get the value of something by:
json_parsed["TOTAL"]
which would give me "520".
I can use the same approach to get at the "PROD_xxxxxx" nested data, but that only gives me the nested data for one specific key.
I would like to be able to loop through them, so I've tried something like
json_parsed["PROD*"].each do |each|
but it looks like this is not the correct syntax.
If I can loop through each one and get "IMG", "NAME", "LINK", etc., I am planning to use CSV like this:
CSV.open(fname, "w") do |csv|
  Json_parsed[each-PROD].each do |each|
    name = each["NAME"]
    img = each["IMG"]
    csv << [name, img]
  end
end
However, if there is a better way than the above approach to turn the JSON data into CSV (maybe this is possible since JSON is structured data, like CSV?), I would appreciate the suggestion.
Thanks,
The easiest way is to iterate over the values of the PROD_* keys:
h = '{ ... }' # the JSON string from the question
require 'json'
hash = (JSON.parse h).select { |k,v| k =~ /PROD_/ } # select PROD_* only
require 'csv'
CSV.open('filename', 'w') do |csv|
  csv << ['IMG', 'Name', 'Link', 'SEQ', 'Price', 'Mileage']
  hash.each { |k, v|
    csv << v.values
  }
end
Hope it helps.
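Note that csv << v.values relies on the JSON keys arriving in the same order as the header row. A slightly more defensive variant (just a sketch, reusing the hash built above) pulls the fields out by name:
fields = %w[IMG NAME LINK SEQ PRICE MILEAGE]

CSV.open('filename', 'w') do |csv|
  csv << ['IMG', 'Name', 'Link', 'SEQ', 'Price', 'Mileage']
  hash.each_value do |v|
    csv << fields.map { |f| v[f] }
  end
end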

Objectify Ruby Hashes from/to JSON API

I just released a Ruby gem for consuming a JSON-over-HTTP API:
https://github.com/solyaris/blomming_api
My naive Ruby code just converts the complex/nested JSON data structures returned by the API endpoints (json_data) into Ruby hashes (hash_data) in a flat one-to-one translation (JSON to Ruby hash and vice versa). That's fine, but...
I would like a more "high level" programming interface.
Maybe instantiating a Resource class for every endpoint, but I'm not sure what a smart implementation would look like.
Let me explain with some abstract code.
Let's say I have complex/nested JSON received from an API,
usually an array of hashes, recursively nested as below (an imaginary example):
json_data = '[{
"commute": {
"minutes": 0,
"startTime": "Wed May 06 22:14:12 EDT 2014",
"locations": [
{
"latitude": "40.4220061",
"longitude": "40.4220061"
},
{
"latitude": "40.4989909",
"longitude": "40.48989805"
},
{
"latitude": "40.4111169",
"longitude": "40.42222869"
}
]
}
},
{
"commute": {
"minutes": 2,
"startTime": "Wed May 28 20:14:12 EDT 2014",
"locations": [
{
"latitude": "43.4220063",
"longitude": "43.4220063"
}
]
}
}]'
At the moment, what I do when I receive similar JSON from an API is just:
# from JSON to hash
hash_data = JSON.load json_data
# and to assign values:
coords = hash_data.first["commute"]["locations"].last
coords["longitude"] = "40.00" # was "40.4111169"
coords["latitude"] = "41.00" # was "40.42222869"
That's OK, but the syntax is awful and confusing.
Instead, I would probably enjoy something like:
# create object Resource from hash
res = Resource.create( hash_data )
# ... some processing
# assign a "nested" variables: longitude, latitude of object: res
coords = res.first.commute.locations.last
coords.longitude = "40.00" # was "40.4111169"
coords.latitude = "41.00" # was "40.42222869"
# ... some processing
# convert modified object: res into an hash again:
modified_hash = res.save
# and probably at least I'll recover to to JSON:
modified_json = JSON.dump modified_hash
I read some interesting posts:
http://pullmonkey.com/2008/01/06/convert-a-ruby-hash-into-a-class-object/
http://www.goodercode.com/wp/convert-your-hash-keys-to-object-properties-in-ruby/
and, copying Kerry Wilson's code, I sketched the implementation below:
class Resource
  def self.create(hash)
    new(hash)
  end

  def initialize(hash)
    hash.to_obj
  end

  def save
    # or to_hash()
    # todo! HELP! (see later)
  end
end
class ::Hash
  # add keys to hash
  def to_obj
    self.each do |k, v|
      v.to_obj if v.kind_of? Hash
      v.to_obj if v.kind_of? Array
      k = k.gsub(/\.|\s|-|\/|\'/, '_').downcase.to_sym
      ## create and initialize an instance variable for this key/value pair
      self.instance_variable_set("@#{k}", v)
      ## create the getter that returns the instance variable
      self.class.send(:define_method, k, proc { self.instance_variable_get("@#{k}") })
      ## create the setter that sets the instance variable
      self.class.send(:define_method, "#{k}=", proc { |v| self.instance_variable_set("@#{k}", v) })
    end
    return self
  end
end

class ::Array
  def to_obj
    self.map { |v| v.to_obj }
  end
end
#------------------------------------------------------------
BTW, I studied the ActiveResource project a bit (it was part of Rails, if I understood correctly).
ARes could be great for my purposes, but the problem is that ARes makes rather "strict" assumptions of a fully REST API.
In my case the server API is not completely RESTful in the way ARes would expect,
so all in all I would have to do a lot of work to subclass/modify ARes behaviour,
and for the moment I have discarded the idea of using ActiveResource.
QUESTIONS:
Could someone help me implement the save() method in the code above (I'm really bad with recursive methods... :-( )?
Is there a gem that already does the hash_to_object() and object_to_hash() translation sketched above?
What do you think about this "automatic" objectifying of an "arbitrary" hash coming from JSON over HTTP APIs?
I mean: I see the big pro that I do not need to hard-wire data structures on the client side, which keeps things flexible with respect to possible server-side variations.
But on the other hand, this automatic objectifying could have the con of opening the door to security issues, like malicious JSON injection (over a possibly untrusted network...).
What do you think about all this? Any suggestion is welcome!
Sorry for the long post and my Ruby metaprogramming hazards :-)
giorgio
UPDATE 2: I'm still interested in reading opinions about question 3:
pros/cons of creating a Resource class for every received JSON;
pros/cons of static (predefined attributes) versus automatic/dynamic nested objects.
UPDATE 1: a long reply to Simone:
Thanks, you are right, Mash has a sweet .to_hash() method:
require 'json'
require 'hashie'
json_data = '{
"commute": {
"minutes": 0,
"startTime": "Wed May 06 22:14:12 EDT 2014",
"locations": [
{
"latitude": "40.4220061",
"longitude": "40.4220061"
},
{
"latitude": "40.4989909",
"longitude": "40.48989805"
},
{
"latitude": "40.4111169",
"longitude": "40.42222869"
}
]
}
}'
# transform into a hash
hash = JSON.load json_data
puts hash
res = Hashie::Mash.new hash
# assign a "nested" variables: longitude, latitude of object: res
coords = res.commute.locations.last
coords.longitude = "40.00" # was "40.4111169"
coords.latitude = "41.00" # was "40.42222869"
puts; puts "longitude: #{res.commute.locations.last.longitude}"
puts "latitude: #{res.commute.locations.last.latitude}"
modified_hash = res.to_hash
puts; puts modified_hash
This feature is provided by a few gems. One of the best known is Hashie, specifically the Hashie::Mash class.
Mash is an extended Hash that gives simple pseudo-object functionality that can be built from hashes and easily extended. It is designed to be used in RESTful API libraries to provide easy object-like access to JSON and XML parsed hashes.
Mash also supports multi-level objects.
Depending on your needs and level of nesting, you may get away with an OpenStruct.
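For example (a small sketch, not from the answer itself), the standard json library can build nested OpenStructs directly through its object_class option, using the json_data string from the question:
require 'json'
require 'ostruct'

# object_class tells the parser to build OpenStructs instead of plain hashes;
# arrays stay arrays, so the question's chained access works as-is.
res = JSON.parse(json_data, object_class: OpenStruct)

coords = res.first.commute.locations.last
coords.longitude = "40.00"
puts coords.longitude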
I was working with a simple test stub. Hashie would have worked well, but was a bigger tool than I needed (and added a dependency).

Passing BSON directly to MongoDB using the mongo-ruby-driver?

Is there a way to pass a BSON object directly into the .find() in the mongo-ruby-driver?
At the moment I have a basic Sinatra app that takes URL-encoded JSON and parses it into the .find(), but I would ideally like to give it straight BSON:
require 'sinatra'
require 'mongo'
require 'json'
include Mongo
db = MongoClient.new().db('test')
get '/' do
  if request[:query]
    query = JSON.parse(CGI::unescape(request[:query]))
    db.collection('test_collection').find(query).to_a.to_json
  end
end
So essentially I'd have something along the lines of BSON.parse(url-encoded-query) and be able to pass that into a .find(), returning the result.
Example URL: http://localhost:4567/?query=%7B%20%22name%22%20%3A%20%22john%20doe%22%20%7D
Current query: { "name" : "john doe" }
BSON query that I'd like to work: { name: /.*john.*/, interests: [ 'fishing', 'golf' ] }
The following test script demonstrates how to use the $elemMatch operator as a projection. Please note that the Collection#find method takes arbitrary documents for both the "selector" formal parameter and for the "opts" :fields option.
MongoDB documents are mapped to/from Ruby Hash objects, and these documents can fully incorporate MongoDB operators.
elemmatch_projection.rb
#!/usr/bin/env ruby
# Ruby translation of example from http://docs.mongodb.org/manual/reference/projection/elemMatch/
require 'mongo'
coll = Mongo::MongoClient.new['test']['students']
coll.remove
coll.insert({
  zipcode: 63109,
  dependents: [
    { name: "john", school: 102, age: 10 },
    { name: "jess", school: 102, age: 11 },
    { name: "jeff", school: 108, age: 15 }
  ]
})
p coll.find( { zipcode: 63109 }, :fields => { dependents: { '$elemMatch' => { school: 102 } } } ).to_a
ruby elemmatch_projection.rb
[{"_id"=>BSON::ObjectId('50eab29929daeb05ae000001'), "dependents"=>[{"name"=>"john", "school"=>102, "age"=>10}]}]
This is another answer because the question has been significantly clarified by the OP.
Hope that this helps you understand how to use MongoDB documents and operators in Ruby.
Your issue has more to do with parsing JSON or Ruby (Regexp) than with BSON. Your original question causes confusion by jumping directly to BSON. With the current Ruby driver, BSON is not directly exposed to the application writer, but mapped as naturally as possible from and to Ruby objects.
JSON is strictly limited and safe for parsing. Adding parsing for Regexp moves beyond this.
You can do what you want unsafely using Kernel#eval. This will parse your Regexp, but it will also parse exec, system, backticks, etc. For a public application with arbitrary user input, you will have to do something safer.
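As one illustration of "something safer" (my own sketch, not part of the original answer; safe_query is a hypothetical helper), accept strict JSON only and build the Regexp server-side from an escaped substring instead of eval-ing client input:
require 'json'
require 'cgi'

# Hypothetical helper: whitelist the accepted fields and construct the Regexp
# ourselves rather than letting the client send arbitrary Ruby.
def safe_query(raw)
  params = JSON.parse(CGI.unescape(raw)) # strict JSON, no eval
  query = {}
  query['name'] = /#{Regexp.escape(params['name'])}/ if params['name']
  query['interests'] = { '$in' => Array(params['interests']) } if params['interests']
  query
end

# safe_query('%7B%22name%22%3A%22john%22%7D') #=> {"name"=>/john/}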
Also note the differences between the following lines, which highlight the semantics of both Ruby and MongoDB:
{ interests: [ 'fishing', 'golf' ] }
The above matches interests only if they are exactly [ 'fishing', 'golf' ]: no more, no less, no other order.
{ interests: { '$in' => [ 'fishing', 'golf' ] } }
The above matches interests if interests have either 'fishing' or 'golf', any order, any position, any extras. Note that the string key '$in' requires the original => syntax.
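For instance (a small sketch using the same legacy driver API as the scripts in this answer, and assuming a test_collection document like the one inserted in the test below), the two selectors behave like this when passed to find:
require 'mongo'

coll = Mongo::MongoClient.new['test']['test_collection']

# exact array match: interests must be precisely [ 'fishing', 'golf' ]
p coll.find({ 'interests' => [ 'fishing', 'golf' ] }).to_a

# $in: matches documents whose interests contain 'fishing' or 'golf',
# in any order, at any position, with any extras
p coll.find({ 'interests' => { '$in' => [ 'fishing', 'golf' ] } }).to_a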
Hope that this helps your understanding, and please feel free to followup with clarifying questions.
The following is a working example.
myapp.rb
require 'sinatra'
require 'mongo'
require 'json'
include Mongo
db = MongoClient.new().db('test')
get '/' do
  if request[:query]
    query = eval CGI::unescape(request[:query])
    docs = db.collection('test_collection').find(query).to_a.to_json
    "docs=#{docs}"
  end
end
myapp_test.rb
require 'myapp'
require 'test/unit'
require 'rack/test'
require 'open-uri'
ENV['RACK_ENV'] = 'test'
class MyAppTest < Test::Unit::TestCase
  include Rack::Test::Methods

  def setup
    @db ||= Mongo::MongoClient.new['test']
    @coll ||= @db['test_collection']
    @coll.remove
    @coll.insert({name: 'john doe', interests: [ 'fishing', 'golf' ]})
  end

  def app
    Sinatra::Application
  end

  def query_test(query)
    uri = "http://localhost:4567/?query=#{URI::encode(query)}"
    puts "uri=#{uri}"
    get uri
    puts last_response.body
    assert_match(/^docs=/, last_response.body)
  end

  def test_john_doe
    query_test("{ name: 'john doe'}")
  end

  def test_regexp
    query_test("{ name: /.*john.*/, interests: [ 'fishing', 'golf' ]}")
  end
end
ruby -I. myapp_test.rb
Run options:
# Running tests:
uri=http://localhost:4567/?query=%7B%20name:%20/.*john.*/,%20interests:%20[%20'fishing',%20'golf'%20]%7D
docs=[{"_id":{"$oid": "50e9e60029daeb0be1000001"},"name":"john doe","interests":["fishing","golf"]}]
.uri=http://localhost:4567/?query=%7B%20name:%20'john%20doe'%7D
docs=[{"_id":{"$oid": "50e9e60129daeb0be1000002"},"name":"john doe","interests":["fishing","golf"]}]
.
Finished tests in 0.065822s, 30.3850 tests/s, 60.7700 assertions/s.
2 tests, 4 assertions, 0 failures, 0 errors, 0 skips

Extracting values from a hash

Hello, I'm using HTTParty to fetch a remote JSON file from which I need to extract the URLs to use in one of my tests.
The JSON format goes something like:
"manifest" : {
"header" : {
"generated" : "xxxxxxxxxxxxxxx",
"name" : "xxxxxxxxxxx",
"version" : "1.0.0"
},
"files" : [ {
"file" : "blimp.zip",
"url" : "http://www.xxx.xx/restaurants_blimp.zip",
"checksum" : "ee98c9455b8d7ba6556f53256f95"
}, {
"file" : "yard.zip",
"url" : "www.xxx.xx/yard.zip",
"checksum" : "e66aa3d123f804f34afc622b5"
}
In irb I can get all the sub-hashes, for example ['manifest']['files'], and I can get a URL if I specify which one, e.g. puts file['manifest']['files'][1]['url'] <-- this works in irb, but since I need to get ALL the URLs I use .each, which gives me a "can't convert to String" error or similar.
#!/usr/bin/env ruby
require 'httparty'
HOST=ARGV[0]
ID=ARGV[1]
VERSION=ARGV[2]
class MyApi
  include HTTParty
end

file = MyApi.get("http://#{HOST}/v1/dc/manifest/#{ID}/#{VERSION}")
file.each do |item|
  puts item['manifest']['files']['url']
end
This is not working, but in irb I can do:
puts item['manifest']['files'][2]['url'] <-- and this gives me the URL, but with .each it just complains about "can't convert to String" or similar.
Try the following:
#!/usr/bin/env ruby
require 'httparty'
(HOST, ID, VERSION) = ARGV
class MyApi
  include HTTParty
  format :json
end
response = MyApi.get("http://#{HOST}/v1/dc/manifest/#{ID}/#{VERSION}")
puts response.inspect
The addition of the format :json tells HTTParty to parse the response as JSON. Then you'll get a hash you can iterate over properly.
Try:
file['manifest']['files'].each do |item|
  puts item['url']
end
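If you want the URLs collected into an array rather than printed (a tiny addition on top of the same parsed response):
urls = file['manifest']['files'].map { |item| item['url'] }
# => ["http://www.xxx.xx/restaurants_blimp.zip", "www.xxx.xx/yard.zip"]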

How to dump strings in YAML using literal scalar style?

I have a big string of formatted data (e.g. JSON) that I want to dump to YAML using Psych in Ruby while preserving the formatting.
Basically, I want for JSON to appear in YAML using literal style:
---
json: |
{
"page": 1,
"results": [
"item", "another"
],
"total_pages": 0
}
However, when I use YAML.dump it doesn't use literal style. I get something like this:
---
json: ! "{\n \"page\": 1,\n \"results\": [\n \"item\", \"another\"\n ],\n \"total_pages\":
0\n}\n"
How can I tell Psych to dump scalars in the style I want?
Solution:
Big thanks to Aaron Patterson for his solution that I'm expanding on here: https://gist.github.com/2023978
Although a bit verbose, that gist is a working way of tagging certain strings in Ruby so that they are output using literal style in YAML.
require 'psych'
# Construct an AST
visitor = Psych::Visitors::YAMLTree.new({})
visitor << DATA.read
ast = visitor.tree
# Find all scalars and modify their formatting
ast.grep(Psych::Nodes::Scalar).each do |node|
  node.plain = false
  node.quoted = true
  node.style = Psych::Nodes::Scalar::LITERAL
end

begin
  # Call the `yaml` method on the ast to convert to yaml
  puts ast.yaml
rescue
  # The `yaml` method was introduced in later versions, so fall back to
  # constructing a visitor
  Psych::Visitors::Emitter.new($stdout).accept ast
end
__END__
{
"page": 1,
"results": [
"item", "another"
],
"total_pages": 0
}
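To get output shaped like the json: | example above, one option (a sketch building on the same gist; the visitor constructor differs slightly across Psych versions, hence the fallback) is to push a hash instead of the bare string and force literal style only on multi-line scalars, so the key stays plain:
require 'psych'
require 'json'

json = JSON.pretty_generate({ 'page' => 1, 'results' => ['item', 'another'], 'total_pages' => 0 }) + "\n"

visitor = begin
  Psych::Visitors::YAMLTree.create({}) # newer Psych
rescue NoMethodError
  Psych::Visitors::YAMLTree.new({})    # older Psych, as in the gist
end
visitor << { 'json' => json }
ast = visitor.tree

# Only multi-line scalars get literal style; the short 'json' key stays plain
ast.grep(Psych::Nodes::Scalar).each do |node|
  next unless node.value.include?("\n")
  node.plain = false
  node.quoted = true
  node.style = Psych::Nodes::Scalar::LITERAL
end

puts ast.yaml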

Resources