Ruby w/ Postgres & Sinatra - Query won't order right with parameter?

So I set a variable in my main ruby file that's handling all my post and get requests and then use ERB templates to actually show the pages. I pass the database handler itself into the erb templates, and then run a query in the template to get all (for this example) grants.
In my main ruby file:
grants_main_order = "id_num"
get '/grants' do
erb :grants, :locals => {:db=>db, :order=>grants_main_order, :message=>params[:message]}
end
In the erb template:
db = locals[:db]
getGrants = db.exec("SELECT * FROM grants ORDER BY $1", [locals[:order]])
This produces some very random ordering; however, if I replace the $1 with id_num, it works as it should.
Is this a typing issue? How can I fix this? Using string replacement with #{locals[:order]} also gives funky results.

Parameters are there to put constant values into the query. It's possible and legal, but not meaningful, to use them in an ORDER BY clause.
Say you want to issue this query:
SELECT first_name, last_name
FROM people
ORDER BY first_name
If you put "first_name" in a string and pass it in as a parameter, you instead get:
SELECT first_name, last_name
FROM people
ORDER BY 'first_name'
The difference is huge. That last ORDER BY clause really tells the database not to care about the column values for each row, and just sort as if all rows were identical, since every row yields the same constant string. The sort order will be effectively random.
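The usual workaround is to keep the user-supplied value out of the SQL entirely and whitelist the column name in Ruby; a minimal sketch against the question's code (the allowed list here is made up):
# Hypothetical whitelist: only interpolate names we know are real columns.
ALLOWED_ORDERS = %w[id_num name date_added].freeze

order = ALLOWED_ORDERS.include?(locals[:order]) ? locals[:order] : 'id_num'
getGrants = db.exec("SELECT * FROM grants ORDER BY #{order}")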

I would recommend using DataMapper (http://datamapper.org/) for Sinatra. It's a very slick ORM and handles the parameterized queries you are trying to build quite well.

Have you inspected what locals[:order] is? Maybe there's something funky in there.
p locals[:order]

Related

Calling PostGIS functions within Ruby

I need to execute the PostGIS function st_intersection within an SQL SELECT clause in Ruby. At the moment I am doing it as a raw SQL query:
sql_query = "SELECT id, ST_ASEWKT(ST_INTERSECTION(geometry, ?)) FROM trips WHERE status='active';"
intersections = Trip.execute_sql(sql_query, self[:geometry].to_s)
This way has the disadvantage that I receive the result as text and I need to parse the objects out of the strings. Much nicer would be the use of the ActiveRecord interface to make queries. However, I could not find any solution yet to run PostGIS functions (e.g. st_intersection) within ActiveRecord.
An earlier version of the activerecord-postgis-adapter's README showed a nice example using the gem squeel:
my_polygon = get_my_polygon # Obtain the polygon as an RGeo geometry
MySpatialTable.where{st_intersects(lonlat, my_polygon)}.first
As this is not part of the current README anymore, I am wondering whether this is not recommended or if there are any better alternatives.
There are two problems to solve here.
The first is using an SQL function within a .select clause. Ordinarily this is pretty easy—you just use AS to give the result a name. Here's an example from the ActiveRecord Rails Guide:
Order.select("date(created_at) as ordered_date, sum(price) as total_price").group("date(created_at)")
The resulting Order objects would have ordered_date and total_price attributes.
This brings us to the second problem, which is that Rails doesn't give us an easy way to parameterize a select (i.e. use a ? placeholder), so (as far as I can tell) you'll need to do it yourself with sanitize_sql_array:
sql_for_select_intersection = sanitize_sql_array([
  "ST_ASEWKT(ST_INTERSECTION(geometry, ?)) AS intersection",
  geometry,
])
This will return a sanitized SQL fragment like ST_ASEWKT(ST_INTERSECTION(geometry, '...')), which you can then use to specify a field in select:
Trip.where(status: "active").select(:id, sql_for_select_intersection)
The resulting query will return Trip objects with id and intersection attributes.
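To tie the two steps together, here's a rough sketch of how this could be packaged as a model method (the method name active_intersections_with is made up; sanitize_sql_array is protected, which is why it's called from inside the class):
class Trip < ActiveRecord::Base
  # Hypothetical helper wrapping the two steps above.
  def self.active_intersections_with(geometry)
    fragment = sanitize_sql_array([
      "ST_ASEWKT(ST_INTERSECTION(geometry, ?)) AS intersection",
      geometry,
    ])
    where(status: "active").select(:id, fragment)
  end
end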

Build sql query based on input param

I am building a SQL query and created an execution method to run it as follows:
module Helper
  module Action
    BASE_SQL_QUERY = 'SELECT a,b,c FROM SOME_TABLE'
    SELECT_QUERY = BASE_SQL_QUERY

    def self.execute(action:, db_client:, data:)
      db_client.prepare(Helper::Action::SELECT_QUERY).execute
    end
  end
end
It is working fine, but I am going to have an exact same module where the only difference will be the query: instead of SELECT a,b,c it'll be SELECT count(*), and everything else will be the same.
In the execute method, the action argument says what I want to do: for action == 'read' do SELECT a,b,c, and for action == 'count' do SELECT count(*).
What I am trying to do is have only one Action module and build the SQL based on the action value. Is it possible to do that? I tried a bunch of ways, like creating a method, passing action to it, and building the SQL there, but I get a "dynamic constant assignment" error because a constant cannot be assigned inside a method.
Is it possible to build the SQL based on the action value?
Sure it's possible; any existing ORM does exactly that, among other things: it provides an interface for building SQL queries using the language's own syntax.
Besides large and scary full-featured ORMs, there are some gems that do just the job you're looking for, namely query building. You could check how they are implemented and borrow some ideas (for example, https://github.com/izniburak/qruby)
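For the concrete case in the question, though, a lookup table of queries is enough; a minimal sketch reusing the question's names (the frozen hash sidesteps the "dynamic constant assignment" error, since nothing is assigned to a constant inside a method):
module Helper
  module Action
    BASE_SQL_QUERY = 'SELECT a,b,c FROM SOME_TABLE'
    # One constant holding all queries, keyed by action value.
    QUERIES = {
      'read'  => BASE_SQL_QUERY,
      'count' => 'SELECT count(*) FROM SOME_TABLE'
    }.freeze

    def self.execute(action:, db_client:, data:)
      db_client.prepare(QUERIES.fetch(action)).execute # KeyError on unknown action
    end
  end
end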

Parsing large txt files in ruby taking a lot of time?

Below is the code to download a txt file (approx 9000 lines) from the internet and populate the database. I have tried a lot, but it takes a lot of time, more than 7 minutes. I am using Win 7 64-bit and Ruby 1.9.3. Is there a way to do it faster?
require 'open-uri'
require 'dbi'

dbh = DBI.connect("DBI:Mysql:mfmodel:localhost", "root", "")
#file = open('http://www.amfiindia.com/spages/NAV0.txt')
file = File.open('test.txt', 'r')
lines = file.lines
2.times { lines.next }
curSubType = ''
curType = ''
curCompName = ''
lines.each do |line|
  line.strip!
  if line[-1] == ')'
    curType, curSubType = line.split('(')
    curSubType.chop!
  elsif line[-4..-1] == 'Fund'
    curCompName = line.split(" Mutual Fund")[0]
  elsif line == ''
    next
  else
    sCode, isin_div, isin_re, sName, nav, rePrice, salePrice, date = line.split(';')
    sCode = Integer(sCode)
    sth = dbh.prepare "call mfmodel.populate(?,?,?,?,?,?,?)"
    sth.execute curCompName, curSubType, curType, sCode, isin_div, isin_re, sName
  end
end
dbh.do "commit"
dbh.disconnect
file.close
106799;-;-;HDFC ARBITRAGE FUND RETAIL PLAN DIVIDEND OPTION;10.352;10.3;10.352;29-Jun-2012
This is the format of the data to be inserted in the table. Now there are 8000 such lines, so how can I do an insert by combining all of them and call the procedure just once? Also, does MySQL support arrays and iteration to do such a thing inside the routine? Please give your suggestions. Thanks.
EDIT
I have to make insertions into the tables depending on whether the rows already exist or not, and I also need to make conditional comparisons before inserting into the table. I definitely can't write plain SQL statements for these, so I wrote SQL stored procedures. Now I have a list @the_data; how do I pass that to the procedure and then iterate through it all on the MySQL side? Any ideas?
insert into mfmodel.company_masters (company_name) values
#{@the_data.map {|str| "('#{str[0]}')"}.join(',')}
This makes 100 insertions, but 35 of them are redundant, so I need to search the table for existing entries before doing an insertion.
Any ideas? Thanks.
From your comment, it looks like you are spending all your time executing DB queries. On a recent Ruby project, I also had to optimize some slow code which was importing data from CSV files into the database. I got about a 500x performance increase by importing all the data with a single bulk INSERT query, rather than one query for each row of the CSV file. I accumulated all the data in an array, and then built a single SQL query using string interpolation and Array#join.
From your comments, it seems that you may not know how to build and execute dynamic SQL for a bulk INSERT. First get your data in a nested array, with the fields to be inserted in a known order. Just for an example, imagine we have data like this:
some_data = [['106799', 'HDFC FUND'], ['112933', 'SOME OTHER FUND']]
You seem to be using Rails and MySQL, so the dynamic SQL will have to use MySQL syntax. To build and execute the INSERT, you can do something like:
ActiveRecord::Base.connection.execute(<<SQL)
INSERT INTO some_table (a_column, another_column)
VALUES #{some_data.map { |num,str| "(#{num},'#{str}')" }.join(',')};
SQL
You said that you need to insert data into 2 different tables. That's not a problem; just accumulate the data for each table in a different array, and execute 2 dynamic queries, perhaps inside a transaction. 2 queries will be much faster than 9000.
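In outline (the two *_sql variables are hypothetical, each built the same way as the INSERT above):
ActiveRecord::Base.transaction do
  ActiveRecord::Base.connection.execute(company_masters_insert_sql)
  ActiveRecord::Base.connection.execute(schemes_insert_sql)
end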
Again, you said in the comments that you may need to update some records rather than inserting. That was also the case in the "CSV import" case which I mentioned above. The solution is only slightly more complicated:
# sometimes code speaks more eloquently than prose
require 'set'

already_imported = Set.new
MyModel.select("unique_column_which_also_appears_in_imported_files").each do |x|
  already_imported << x.unique_column_which_also_appears_in_imported_files
end

to_insert, to_update = [], []
imported_data.each do |row|
  # for the following line, don't let different data types
  # (like String vs. Numeric) get ya
  # if you need to convert the imported data to match correctly against what's
  # already in the DB, do it!
  if already_imported.include? row[index_of_unique_column]
    to_update << row
  else
    to_insert << row
  end
end
Then you must build a dynamic INSERT and a dynamic UPDATE for each table involved. Google for UPDATE syntax if you need it, and go wild with all your favorite string processing functions!
Going back to the sample code above, note the difference between numeric and string fields. If it is possible that the strings may contain single quotes, you will have to make sure that all the single quotes are escaped. The behavior of String#gsub may surprise you when you try to do this: it assigns a special meaning to \'. The best way I have found so far to escape single quotes is: string.gsub("'") { "\\'" }. Perhaps other posters know a better way.
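To make the surprise concrete:
"it's".gsub("'", "\\'")    #=> "itss"   (in a replacement string, \' means "the post-match")
"it's".gsub("'") { "\\'" } #=> "it\\'s" (the block's result is taken literally: backslash, quote)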
If you are inserting dates, make sure they are converted to MySQL's date syntax.
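For example, for the dates in this file:
require 'date'
Date.parse('29-Jun-2012').strftime('%Y-%m-%d') #=> "2012-06-29" (MySQL's DATE format)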
Yes, I know that "roll-your-own" SQL sanitization is very iffy. There may even be security bugs with the above approach; if so, I hope my better-informed peers will set me straight. But the performance gains are just too great to ignore. Again, if this can be done using a prepared query with placeholders, and you know how, please post!
Looking at your code, it looks like you are inserting the data using a stored procedure (mfmodel.populate). Even if you do want to use a stored procedure for this, why do you have dbh.prepare in the loop? You should be able to move that line outside of lines.each.
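A sketch of just that change, everything else as in the question:
sth = dbh.prepare "call mfmodel.populate(?,?,?,?,?,?,?)" # prepare once
lines.each do |line|
  # ... same parsing as before ...
  sth.execute curCompName, curSubType, curType, sCode, isin_div, isin_re, sName
end
sth.finish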
You might want to try exporting the data as csv and loading it with 'load data infile... replace'. It seems cleaner/easier than trying to construct bulk insert queries.
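Roughly like this (the file path, table, and column names are made up for illustration):
dbh.do <<'SQL'
LOAD DATA LOCAL INFILE '/tmp/nav_data.csv'
REPLACE INTO TABLE mfmodel.nav_data
FIELDS TERMINATED BY ';'
LINES TERMINATED BY '\n'
(scheme_code, isin_div, isin_re, scheme_name, nav, re_price, sale_price, nav_date)
SQL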

SQLITE3 strings in where clauses seem confused

I'm wondering if anyone has any clarification on the difference between the following statements using sqlite3 gem with ruby 1.9.x:
#db.execute("INSERT INTO table(a,b,c) VALUES (?,?,?)",
some_int, other_int, some_string)
and
#db.execute("INSERT INTO table(a,b,c) VALUES (#{some_int}," +
  "#{other_int}, #{some_string})")
My problem is: When I use the first method for insertion, I can't query for the "c" column using the following statement:
SELECT * FROM table WHERE c='some magic value'
I can use this:
"SELECT * FROM table WHERE c=?", "some magic value"
but what I really want to use is
"SELECT * FROM table WHERE c IN ('#{options.join("','")}')"
And this doesn't work with rows inserted the first way.
Does anyone know what the difference is at the database level that is preventing the IN from working properly?
I figured this out quite a while ago, but forgot to come back and point it out, in case someone finds this question at another time.
The difference turns out to be blobs. Apparently when you use the first form above (the substitution method using (?,?)), SQLite3 enters the data as blobs. However, if you construct an ordinary SQL statement, the value is inserted as a regular string, and the two aren't equivalent.
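As far as I can tell, what decides TEXT vs. BLOB in the sqlite3 gem is the bound object itself: a SQLite3::Blob (and, I believe, a String with binary/ASCII-8BIT encoding) is bound as a BLOB, everything else as TEXT. You can check what actually landed in the table with typeof():
require 'sqlite3'
db = SQLite3::Database.new(':memory:')
db.execute('CREATE TABLE t(c)')
db.execute('INSERT INTO t(c) VALUES (?)', ['magic'])                    # bound as TEXT
db.execute('INSERT INTO t(c) VALUES (?)', [SQLite3::Blob.new('magic')]) # bound as BLOB
p db.execute('SELECT typeof(c) FROM t')              #=> [["text"], ["blob"]]
p db.execute("SELECT * FROM t WHERE c IN ('magic')") # matches only the TEXT row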
Inserting with a raw query did not work for me, but raw queries do work for fetching data. If you use SQLite in a mobile app it may not work there, but the same raw query works when you run it in SQLite Browser.

Stop Activerecord from loading Blob column

How can I tell Activerecord to not load blob columns unless explicitly asked for? There are some pretty large blobs in my legacy DB that must be excluded for 'normal' Objects.
I just ran into this using Rails 3.
Fortunately it wasn't that difficult to solve. I set a default_scope that removed the particular columns I didn't want from the result. For example, the model I had contained an XML text field that could be quite long and wasn't used in most views.
default_scope select((column_names - ['data']).map { |column_name| "`#{table_name}`.`#{column_name}`"})
You'll see from the solution that I had to map the columns to fully qualified versions so I could continue to use the model through relationships without ambiguities in attributes. Later, where you do want the field, just tack on another .select(:data) to have it included.
fd's answer is mostly right, but ActiveRecord doesn't currently accept an array as a :select argument, so you'll need to join the desired columns into a comma-delimited string, like so:
desired_columns = (MyModel.column_names - ['column_to_exclude']).join(', ')
MyModel.find(id, :select => desired_columns)
I believe you can ask AR to load specific columns in your invocation to find:
MyModel.find(id, :select => 'every, attribute, except, the, blobs')
However, this would need to be updated as you add columns, so it's not ideal. I don't think there is any way to specifically exclude one column in rails (nor in a single SQL select).
I guess you could write it like this:
MyModel.find(id, :select => (MyModel.column_names - ['column_to_exclude']).join(', '))
Test these out before you take my word for it though. :)
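On newer Rails (4+), where default_scope requires a block, the same idea looks roughly like this ('data' stands in for your blob column; the usual default_scope caveats apply):
class MyModel < ActiveRecord::Base
  # Exclude the blob column by default.
  default_scope { select(column_names - ['data']) }
end

MyModel.find(id)          # no blob loaded
MyModel.unscoped.find(id) # blob included when you really want it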
A clean approach requiring NO CHANGES to the way you code elsewhere in your app, i.e. no messing with :select options.
For whatever reason you need or choose to store blobs in databases.
Yet, you do not wish to mix blob columns in the same table as your
regular attributes. BinaryColumnTable helps you store ALL blobs in
a separate table, managed transparently by an ActiveRecord model.
Optionally, it helps you record the content-type of the blob.
http://github.com/choonkeat/binary_column_table
Usage is simple
Member.create(:name => "Michael", :photo => IO.read("avatar.png"))
#=> creates a record in "members" table, saving "Michael" into the "name" column
#=> creates a record in "binary_columns" table, saving "avatar.png" binary into "content" column
m = Member.last #=> only columns in "members" table are fetched (no blobs)
m.name #=> "Michael"
m.photo #=> binary content of the "avatar.png" file
