Threading sqlite connections in Ruby

I've been trying to get my ruby script threaded since yesterday. I've since opted for SQLite to save data, with the parallel gem to manage concurrency.
I've built a quick script for testing, but I'm having trouble getting the threading working; the database is locked. I've added db.close to the end, which doesn't help, and I've tried adding sleep until db.closed?, but that just sleeps indefinitely. What am I doing wrong?
The error is "database is locked (SQLite3::BusyException)".
Here's my code:
require 'sqlite3'
require 'pry'
require 'parallel'
STDOUT.sync = true
db = SQLite3::Database.new "test.db"
arr = [1,2,3,4,5,6,7,8,9,10]
rows = db.execute <<-SQL
  create table test_table (
    original string,
    conversion string
  );
SQL

def test(num)
  db = SQLite3::Database.new "test.db"
  puts "the num: #{num}"
  sleep 4
  { num => num + 10 }.each do |pair|
    db.execute "insert into test_table values (?, ?)", pair
  end
  db.close
end
Parallel.each( -> { arr.pop || Parallel::Stop}, in_processes: 3) { |number| test(number) }

SQLite is threadsafe by default (it uses its "serialized" mode), and the ruby wrapper supports this. However, it's not safe across processes, which makes a certain amount of sense: the adapter or engine presumably has to negotiate some in-process state to prevent lock contention, and separate worker processes can't share that, so one of them trips over the lock and raises SQLite3::BusyException.
To fix your example, change in_processes to in_threads.
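As a minimal sketch, only the last line of the script changes (the busy_timeout note is an optional extra, not part of the original fix):
# Same producer lambda as before, but the workers are threads instead of forked processes.
Parallel.each(-> { arr.pop || Parallel::Stop }, in_threads: 3) { |number| test(number) }

# If you genuinely need separate processes, one common workaround is a busy timeout,
# so a writer waits for the lock instead of raising SQLite3::BusyException, e.g. inside test(num):
#   db.busy_timeout = 1000  # milliseconds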

Related

Parsing sqlite3 query responses in ruby

I'm trying to read and parse an sqlite3 query in Ruby using the sqlite3 gem. This db already exists on my machine.
I'm opening the db with
db = SQLite3::Database.new "/path to/database.sqlite"
Then I'm executing my particular query with
db.execute( "SELECT * FROM `ZSFNOTE` WHERE `ZTRASHED` LIKE '0'" ) do |row|
Now, based on my (limited) experience, I was hoping that this could be parsed like a JSON response, where I could call something like row["ZTITLE"]. However, those headers aren't available in my response; I can only get at what I'm looking for by guessing an integer index, like row[19].
I know I'm not even scratching the surface of the sqlite3 gem, but couldn't find the answer to this in the docs. Any help would be much appreciated.
You can use #execute2 to get the headers.
require 'sqlite3'
db = SQLite3::Database.new(':memory:')
db.execute 'CREATE TABLE "examples" ("header" varchar(20), "value" integer(8))'
db.execute 'INSERT INTO examples(header, value) VALUES("example", 1)'
db.execute2('select * from examples')
# => [["header", "value"], ["example", 1]]
You can map the headers to the columns like so:
headers, *rows = db.execute2('select * from examples')
rows.map! do |row|
  row.each_with_index.with_object({}) do |(col, i), o|
    o[headers[i]] = col
  end
end

rows.each do |row|
  p row['header']
end
# => "example"
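If you only need rows keyed by column name, the sqlite3 gem can also hand back hashes directly; a short sketch reusing the in-memory example above:
db.results_as_hash = true   # subsequent #execute calls yield Hash rows
db.execute('select * from examples') do |row|
  p row['header']           # => "example"
end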

postgres avoiding extra quotes inside a string

I have following select query which I will be passing to the database to get results back,
sql = "select * from movies where title = #{movie_title};"
movie_title contains a value that can sometimes include single quotes and other characters that need escaping. I have come across dollar-quoted strings, which work well inside an INSERT statement, but SELECT is not behaving the same way: if I use $$#{movie_title}$$ like this, it just doesn't get converted to the value of movie_title. Is there any solution for this?
I am using postgres 9.5.0 and I am programming using ruby.
Bad idea. Don't do that, as you are making your code vulnerable to SQL injection attacks, and also making your life harder. Read more about prepared SQL statements, SQL injection etc.
In short, unless you are using some ORM, you should do something like:
#!/usr/bin/ruby
require 'pg'
if ARGV.length != 1 then
  puts "Usage: prepared_statement.rb rowId"
  exit
end

rowId = ARGV[0]

begin
  con = PG.connect :dbname => 'testdb', :user => 'janbodnar'

  con.prepare 'stm1', "SELECT * FROM Cars WHERE Id=$1"
  rs = con.exec_prepared 'stm1', [rowId]

  puts rs.values
rescue PG::Error => e
  puts e.message
ensure
  rs.clear if rs
  con.close if con
end
(an example taken from http://zetcode.com/db/postgresqlruby/)
Edit: You don't need to use prepared statements, you can also use your DB lib's methods which provide proper parameter binding:
require 'pg'
conn = PG::Connection.open(:dbname => 'test')
res = conn.exec_params('SELECT $1 AS a, $2 AS b, $3 AS c', [1, 2, nil])
Take a look at the docs for PG::Connection#exec_params.
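Applied to your movies query, a minimal sketch (assuming conn is an open PG::Connection and movie_title holds the raw title):
res = conn.exec_params('SELECT * FROM movies WHERE title = $1', [movie_title])
res.each do |row|
  puts row['title']   # quotes inside movie_title are handled by the driver, not by you
end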

How to access various databases with Ruby?

I am building a module in Ruby to read metadata from source tables in various databases.
I wrote a small program to test with PostgreSQL:
#!/usr/bin/ruby
require 'pg'
begin
  puts "start"
  puts 'Version of libpg: ' + PG.library_version.to_s

  con = PG.connect(host: 'localhost', dbname: 'rdv_app_dev', user: 'rdv_app', password: 'rdv_app')
  puts con.server_version

  pst = con.exec "SELECT * FROM users"
  pst.each do |row|
    puts "%s %s " % [ row['id'], row['email'] ]
  end

  puts 'There are %d columns ' % pst.nfields
  puts 'The column names are:'
  pst.fields.each do |f|
    puts pst.fnumber(f).to_s + ' ' + f + ' ' + pst.ftype(pst.fnumber(f)).to_s
  end
rescue PG::Error => e
  puts e.message
ensure
  pst.clear if pst
  con.close if con
  puts "stop"
end
It works fine, but it uses functions that are specific to Postgres. I need to have it working for any database without re-coding it for each one of them.
I read about Ruby-DBI, but it looks to be out of date, since it has not evolved in 7 years.
Is there a generic solution for accessing a database with Ruby?
ActiveRecord is by far the most popular (see RubyGems stats). DataMapper is very similar, but more lightweight, and it makes changing the database quicker. I'm not familiar with Sequel.
These gems introduce their own syntax for communicating with the database that is intended to abstract away the database-specific implementation details. For example, a query like User.where(verified: true).includes(:posts).order(created_at: :desc) will list users ordered by most recent creation date and include their posts (performing a join behind the scenes). This Ruby syntax compiles to db-specific code based on the adapter and configuration you've specified.
Look into Sinatra and DataMapper; there are many tutorials.
Also look into Rails and how to configure Rails to use MySQL or Postgres instead of (its default) SQLite. You will find that the ORM (ActiveRecord) code doesn't change regardless of which you use.
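To make the "same code, different database" point concrete, here is a hedged sketch using ActiveRecord outside Rails (the connection values are made up; only that configuration changes between databases):
require 'active_record'

# Only this configuration changes between databases.
ActiveRecord::Base.establish_connection(
  adapter:  'postgresql',   # or 'sqlite3', 'mysql2', ...
  host:     'localhost',
  database: 'rdv_app_dev',
  username: 'rdv_app',
  password: 'rdv_app'
)

class User < ActiveRecord::Base; end

User.all.each do |user|
  puts "#{user.id} #{user.email}"
end

# Column metadata without vendor-specific calls:
User.columns.each { |c| puts "#{c.name} #{c.sql_type}" }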

Preparing and executing SQLite Statements in Ruby

I have been trying to puts some executed statements after I prepare them. The purpose of this is to sanitize my data inputs, which I have never done before. I followed the steps here, but I am not getting the result I want.
Here's what I have:
require 'sqlite3'
$db = SQLite3::Database.open "congress_poll_results.db"
def rep_pull(state)
  pull = $db.prepare("SELECT name, location FROM congress_members WHERE location = ?")
  pull.bind_param 1, state
  puts pull.execute
end

rep_pull("MN")
# => #<SQLite3::ResultSet:0x2e69e00>
What I am expecting is a list of reps in MN, but instead I just get that "SQLite3::ResultSet:0x2e69e00" thing.
What am I missing here? Thanks very much.
Try this. SQLite3::Statement#execute yields the ResultSet to the block (not individual rows), so iterate over it:
def rep_pull(state)
  pull = $db.prepare("SELECT name, location FROM congress_members WHERE location = ?")
  pull.bind_param 1, state
  pull.execute do |result|
    result.each do |row|
      p row
    end
  end
end

Get inserted ID from Sequel prepare

I have a prepared insert statement in Sequel (using Oracle).
prepared_statement = DB[:table_name].prepare(:insert, :name, :value=>:$value)
When I call it the row gets added just fine.
prepared_statement.call :value=>'Some value'
I have a trigger and a sequence set up so the ID will be auto generated. I would like to get back the row (or the id) I just added, but I can't see how. I can't use insert because value is a CLOB and may be greater than 4000 characters.
In JRuby, using the JDBC adapter, you can override the insert and pass in the RETURNING clause. The tricky part is that you don't always know what the primary key is at this level, so you may have to use ROWID or request all of the columns back.
You end up with something that looks similar to this:
module Sequel
  module JDBC
    class Database
      def execute_insert_with_returning(conn, sql, opts = {})
        columns = opts[:key_columns] || ["ROWID"]
        q = "{ call #{sql} returning #{columns.join(',')} into #{columns.collect {|_| '?'}.join(',')} }"
        stmt = conn.prepare_call(q)
        raise "Unable to prepare call for insert" if stmt.nil?
        begin
          columns.each_with_index do |_, index|
            stmt.registerOutParameter(index + 1, JavaSQL::Types::VARCHAR)
          end
          return nil if 0 == stmt.executeQuery
          values = (1..columns.count).inject({}) do |memo, index|
            key = columns[index - 1].downcase.to_sym rescue nil
            memo[key] = stmt.get_string(index) unless key.nil?
            memo
          end
          values
        ensure
          stmt.close
        end
      end # #execute_insert_with_returning

      alias execute_without_specialized_insert execute

      def execute(sql, opts = {}, &block)
        if opts[:type] == :insert
          synchronize(opts[:server]) do |conn|
            execute_insert_with_returning conn, sql, opts
          end
        else
          execute_without_specialized_insert sql, opts, &block
        end
      end # #execute
    end # Database
  end # JDBC
end # Sequel
I've done something pretty much like this and it works pretty well. I think we had to override Sequel::Model as well so that it passes the primary key in as opts[:key_columns], but I may be remembering incorrectly.
This is a bit of a load-bearing kludge that gets the job done. It would be more elegant to specialize it to the Oracle JDBC adapter and to make sure all of the error-handling code from the original execute is preserved. Given the time, I'd love to produce something better and give it back to the Sequel project.
The way to get the populated sequence values is through the RETURNING clause of the INSERT
statement, as I discuss in this response to a similar question regarding CodeIgniter.
I'm not sure whether the base version of RoR supports that syntax, but it appears to be possible to extend ActiveRecord to handle it. Find out more.
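For illustration only (the table and column names below are taken from the question, :new_id is a made-up bind name, and how you declare it as an out bind depends on your driver), the statement shape is:
# Illustration of the RETURNING ... INTO shape; this string is not executed here.
sql = <<-SQL
  INSERT INTO table_name (value)
  VALUES (:value)
  RETURNING id INTO :new_id
SQL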
Sequel's Oracle adapter doesn't have native prepared statement support, so it falls back to issuing a regular query. If you can use JRuby, the jdbc adapter has native prepared statement support, so it should just work there. If you can't use JRuby, you'll have to work on adding native prepared statement support to the Oracle adapter. I don't have access to an Oracle installation, so I can't test any support, but I'll be happy to provide advice if you run into problems.
