I am building a module in Ruby to read metadata from source tables in various databases.
I wrote a small program to test with PostgreSQL:
#!/usr/bin/ruby
require 'pg'

begin
  puts "start"
  puts 'Version of libpg: ' + PG.library_version.to_s

  con = PG.connect(host: 'localhost', dbname: 'rdv_app_dev', user: 'rdv_app', password: 'rdv_app')
  puts con.server_version

  pst = con.exec "SELECT * FROM users"
  pst.each do |row|
    puts "%s %s " % [row['id'], row['email']]
  end

  puts 'There are %d columns ' % pst.nfields
  puts 'The column names are:'
  pst.fields.each do |f|
    puts pst.fnumber(f).to_s + ' ' + f + ' ' + pst.ftype(pst.fnumber(f)).to_s
  end
rescue PG::Error => e
  puts e.message
ensure
  pst.clear if pst
  con.close if con
  puts "stop"
end
It works fine, but it uses functions that are specific to Postgres, and I need it to work against any database without re-coding it for each one.
I read about Ruby-DBI, but it looks out of date: it has not been updated for about seven years.
Is there a generic solution for accessing a database from Ruby?
ActiveRecord is by far the most popular (see RubyGems stats). DataMapper is very similar, but more lightweight, and it makes switching databases quicker. I'm not familiar with Sequel.
These gems introduce their own syntax for communicating with the database, intended to abstract away the database-specific implementation details. For example, a query like User.where(verified: true).includes(:posts).order(created_at: :desc) will list users ordered by most recent creation date and include their posts (performing a join behind the scenes). This Ruby syntax compiles to database-specific SQL based on the adapter and configuration you've specified.
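As a rough, self-contained sketch of that example (the User and Post models, the verified column, and the has_many :posts association are assumptions for illustration, not something taken from your schema):

require 'active_record'

# Hypothetical models, for illustration only.
class User < ActiveRecord::Base
  has_many :posts
end

class Post < ActiveRecord::Base
  belongs_to :user
end

# The same Ruby works regardless of which database adapter is configured.
recent_verified = User.where(verified: true).includes(:posts).order(created_at: :desc)
recent_verified.each { |user| puts "#{user.id}: #{user.posts.size} posts" }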
Look into Sinatra and DataMapper; there are many tutorials.
Also look into Rails and how to configure Rails to use MySQL or Postgres instead of (its default) SQLite. You will find that the ORM (ActiveRecord) code doesn't change regardless of which you use.
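To make that concrete, here is a minimal sketch (connection parameters are placeholders reusing the ones from your script) in which only establish_connection changes when you switch databases, while the model code stays identical:

require 'active_record'

# Postgres:
ActiveRecord::Base.establish_connection(
  adapter:  'postgresql',
  host:     'localhost',
  database: 'rdv_app_dev',
  username: 'rdv_app',
  password: 'rdv_app'
)

# For MySQL only the adapter (and driver gem) changes, e.g. adapter: 'mysql2'.

class User < ActiveRecord::Base; end

# Adapter-agnostic column metadata, close to what your pg script printed.
User.columns.each { |c| puts "#{c.name} #{c.type}" }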
Related
RAISE NOTICE is typically used to debug PL/pgSQL scripts in Postgres (link).
The docs say there is some kind of support for printing notices when using the pg gem, but there's no info on how to use this proc, what it yields, possible (probable?) caveats, etc.
Does anyone have a working code example for production and/or development? Ideally, I'm looking for a solution that allows PG notices to be printed out in development when Sequel logging is enabled.
When I do:
DB = Sequel.connect(
  ENV['DATABASE_URL'],
  notice_receiver: lambda{ |x| binding.pry }
)
the notice_receiver lambda never gets called when I execute a function that raises a notice. For example:
[1] pry(#<Psql::CalculateMasterBalancesTest>)> DB.select{ |o| Sequel.function(:emit_notice) }.first
I, [2017-05-17T16:51:56.746003 #23139] INFO -- : (0.000335s) SELECT emit_notice() LIMIT 1
=> {:emit_notice=>""}
where emit_notice is:
CREATE OR REPLACE FUNCTION emit_notice()
RETURNS VOID AS $$
BEGIN
  RAISE NOTICE 'NOTICE ME!!!';
END;
$$ LANGUAGE plpgsql;
and it works from PgAdmin:
NOTICE: NOTICE ME!!!
Total query runtime: 21 ms.
1 row retrieved.
UPDATE
Alejandro C gave a good working example, and it seems that notices don't get passed to the notice_receiver hook, while warnings do. For example:
Sequel.connect(DB.opts.merge(:notice_receiver=>proc{|r| puts r.result_error_message})){ |db|
  db.do("BEGIN\nRAISE NOTICE 'foo';\nEND;")
}
prints nothing, and:
Sequel.connect(DB.opts.merge(:notice_receiver=>proc{|r| puts r.result_error_message})){ |db|
  db.do("BEGIN\nRAISE WARNING 'foo';\nEND;")
}
Prints
WARNING: foo
Since Sequel just calls set_notice_receiver from PG, I guess I should file a bug report with PG.
EDIT 2
Yet when I try things just with the PG gem I get
conn = PG.connect( :dbname => 'db_test', user: 'test', password: 'test', host: '127.0.0.1' )
conn.set_notice_receiver{|r| puts r.result_error_message }
conn.exec("SELECT emit_notice()")
NOTICE: NOTICE ME!!!
=> #<PG::Result:0x0000000405ac18 status=PGRES_TUPLES_OK ntuples=1 nfields=1 cmd_tuples=1>
So at this point I'm a bit confused...
EDIT 3
Posted an issue on GitHub...
EDIT 4
Ah, apparently there's another option you need to use: client_min_messages needs to be set to :notice, like so:
DB = Sequel.connect(
  ENV['DATABASE_URL'],
  notice_receiver: proc{|r| puts r.result_error_message},
  client_min_messages: :notice
)
and this works
You pass in your own proc, which receives the notice as a PG::Result from which you can read the message string. To have it trigger on notices, and not just on warnings and above, use client_min_messages. For example:
a = nil
Sequel.connect(
  DB.opts.merge(
    notice_receiver: proc{|r| a = r.result_error_message},
    client_min_messages: :notice)) { |db|
  db.do("BEGIN\nRAISE WARNING 'foo';\nEND;")
}
a == "WARNING: foo\n" # true
I've been trying to get my Ruby script threaded since yesterday. I've since opted for SQLite to save data, with the parallel gem to manage concurrency.
I've built a quick script for testing, but I'm having trouble getting the threading working; the database is locked. I've added db.close to the end, which doesn't help, and I've tried adding sleep until db.closed?, but that just sleeps indefinitely. What am I doing wrong?
The error is "database is locked (SQLite3::BusyException)".
Here's my code:
require 'sqlite3'
require 'pry'
require 'parallel'

STDOUT.sync = true

db = SQLite3::Database.new "test.db"
arr = [1,2,3,4,5,6,7,8,9,10]

rows = db.execute <<-SQL
  create table test_table (
    original string,
    conversion string
  );
SQL

def test(num)
  db = SQLite3::Database.new "test.db"
  puts "the num: #{num}"
  sleep 4
  { num => num + 10 }.each do |pair|
    db.execute "insert into test_table values (?, ?)", pair
  end
  db.close
end

Parallel.each( -> { arr.pop || Parallel::Stop }, in_processes: 3) { |number| test(number) }
SQLite is threadsafe by default (it runs in its "serialized" mode), and the Ruby wrapper apparently supports this to whatever extent it needs to. However, it's not safe across processes, which makes a certain sense, since the adapter or engine has to coordinate some in-process state to prevent locks.
To fix your example, change in_processes to in_threads, as in the sketch below.
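For instance, a minimal change to your test script, assuming everything else stays the same:

# Threads share one process, so SQLite's serialized mode can coordinate them.
Parallel.each( -> { arr.pop || Parallel::Stop }, in_threads: 3) { |number| test(number) }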
I have following select query which I will be passing to the database to get results back,
sql = "select * from movies where title = #{movie_title};"
movie_title contains a value that can sometimes include single quotes and other characters that need escaping. I came across dollar-quoted strings, which work well inside an INSERT statement, but SELECT does not behave the same way: if I write $$#{movie_title}$$, it just doesn't get converted to the value inside movie_title. Is there any solution for this?
I am using Postgres 9.5.0 and programming in Ruby.
Bad idea. Don't do that: you are making your code vulnerable to SQL injection attacks, and you are also making your life harder. Read up on prepared SQL statements, SQL injection, etc.
In short, unless you are using some ORM, you should do something like this:
#!/usr/bin/ruby
require 'pg'

if ARGV.length != 1 then
  puts "Usage: prepared_statement.rb rowId"
  exit
end

rowId = ARGV[0]

begin
  con = PG.connect :dbname => 'testdb', :user => 'janbodnar'

  con.prepare 'stm1', "SELECT * FROM Cars WHERE Id=$1"
  rs = con.exec_prepared 'stm1', [rowId]

  puts rs.values
rescue PG::Error => e
  puts e.message
ensure
  rs.clear if rs
  con.close if con
end
(an example taken from http://zetcode.com/db/postgresqlruby/)
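Applied to your movies query, the same pattern would look roughly like this (assuming con is an open connection as above and movie_title holds the user-supplied title):

con.prepare 'movies_by_title', 'SELECT * FROM movies WHERE title = $1'
rs = con.exec_prepared 'movies_by_title', [movie_title]
puts rs.values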
Edit: You don't need to use prepared statements; you can also use your DB library's methods that provide proper parameter binding:
require 'pg'
conn = PG::Connection.open(:dbname => 'test')
res = conn.exec_params('SELECT $1 AS a, $2 AS b, $3 AS c', [1, 2, nil])
Take a look at the docs for PG::Connection#exec_params.
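For your case that would be something along the lines of (conn and movie_title as before):

res = conn.exec_params('SELECT * FROM movies WHERE title = $1', [movie_title])
res.each { |row| puts row.inspect }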
My simple Sqlite3 database is as follows:
CREATE TABLE balances(
  balance zilch
);
My Ruby is as follows:
require('active_record')
ActiveRecord::Base.establish_connection(:database => "testbalance.db", :adapter => "sqlite3")
class Balance < ActiveRecord::Base
end
x = Balance.new
x.balance = 50
x.save
When I exit, come back, and enter the same Ruby again, balance is nil at first (before I run x.balance = 50). Why is this? Why isn't my DB saving?
If you enter the same code, then you're creating a new object again. No wonder its balance is nil.
To check that your object is saved, you can (for example) check Balance.count before and after record creation.
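For instance, a quick sanity check in the same session (these are standard Active Record calls, nothing project-specific, assuming the save itself succeeds with your schema):

puts Balance.count                    # rows already persisted from previous runs
x = Balance.new
x.balance = 50
puts x.save                           # true if the INSERT succeeded
puts Balance.count                    # should be one higher now
puts Balance.pluck(:balance).inspect  # every stored balance value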
This is an old, demo-style way of using Active Record and not very useful for production, but it will get you started. The code below connects without requiring the sqlite3 gem: Active Record loads it for you when you use the :adapter hash entry. The gem still has to be installed, of course, but you don't need to require it in your own code; try it without that require to see, and if you're still in doubt, uninstall the gem just for fun. There are more Active Record namespaces and methods you should try, especially ones that check whether the database already exists, so you can bypass creating one.
Here's some sample code from the book Metaprogramming Ruby.
#---
# Excerpted from "Metaprogramming Ruby",
# published by The Pragmatic Bookshelf.
# Copyrights apply to this code. It may not be used to create training material,
# courses, books, articles, and the like. Contact us if you are in doubt.
# We make no guarantees that this code is fit for any purpose.
# Visit http://www.pragmaticprogrammer.com/titles/ppmetr2 for more book information.
#---
# Create a new database each time
File.delete 'dbfile.sqlite3' if File.exist? 'dbfile.sqlite3'

require 'active_record'

ActiveRecord::Base.establish_connection :adapter => "sqlite3",
                                        :database => "dbfile.sqlite3"

# Initialize the database schema
ActiveRecord::Base.connection.create_table :ducks do |t|
  t.string :name
end

class Duck < ActiveRecord::Base
  validate do
    errors.add(:base, "Illegal duck name.") unless name[0] == 'D'
  end
end

my_duck = Duck.new
my_duck.name = "Donald"
my_duck.valid? # => true
my_duck.save!

require_relative '../test/assertions'
assert my_duck.valid?

bad_duck = Duck.new(:name => "Ronald")
assert !bad_duck.valid?

duck_from_database = Duck.first
duck_from_database.name # => "Donald"
assert_equals "Donald", duck_from_database.name
duck_from_database.delete

File.delete 'dbfile.sqlite3' if File.exist? 'dbfile.sqlite3'
This code deletes the db file after usage, and that's not very good persistence either, but you get the idea; it's just for testing assertions. You could try something like it to be sure your balances are actually being saved as you change them.
Do you want the rest of the code? https://pragprog.com/book/ppmetr/metaprogramming-ruby
Am I training you, or am I "the like"? Moderators, please delete this if I'm wrong here; I don't want to set a bad example.
I have a prepared insert statement in Sequel (using Oracle).
prepared_statement = DB[:table_name].prepare(:insert, :name, :value=>:$value)
When I call it the row gets added just fine.
prepared_statement.call :value=>'Some value'
I have a trigger and a sequence set up so the ID will be auto generated. I would like to get back the row (or the id) I just added, but I can't see how. I can't use insert because value is a CLOB and may be greater than 4000 characters.
In JRuby, using the JDBC adapter, you can override the insert and pass in a RETURNING clause. The tricky part is that you don't always know what the primary key is at this level, so you may have to use ROWID or request all of the columns back.
You end up with something that looks similar to this:
module Sequel
  module JDBC
    class Database
      def execute_insert_with_returning(conn, sql, opts = {})
        columns = opts[:key_columns] || ["ROWID"]
        q = "{ call #{sql} returning #{columns.join(',')} into #{columns.collect {|_| '?'}.join(',')} }"
        stmt = conn.prepare_call(q)
        raise "Unable to prepare call for insert" if stmt.nil?
        begin
          columns.each_with_index do |_, index|
            stmt.registerOutParameter(index+1, JavaSQL::Types::VARCHAR)
          end
          return nil if 0 == stmt.executeQuery
          values = (1..columns.count).inject({}) do |memo, index|
            key = columns[index-1].downcase.to_sym rescue nil
            memo[key] = stmt.get_string(index) unless key.nil?
            memo
          end
          values
        ensure
          stmt.close
        end
      end # #execute_insert_with_returning

      alias execute_without_specialized_insert execute

      def execute(sql, opts={}, &block)
        if opts[:type] == :insert
          synchronize(opts[:server]) do |conn|
            execute_insert_with_returning conn, sql, opts
          end
        else
          execute_without_specialized_insert sql, opts, &block
        end
      end # #execute
    end # Database
  end # JDBC
end # Sequel
I've done something pretty much like this and it works pretty well. I think we had to override Sequel::Model as well so it passes the primary key in as opts[:key_columns], but I may be remembering incorrectly.
This is a bit of a load-bearing kludge that gets the job done. It would be more elegant to specialize it to the Oracle JDBC adapter and to make sure all of the error handling from the original execute is preserved. Given the time, I'd love to come up with something better and give it back to the Sequel project.
The way to get the populated sequence values is through the RETURNING clause of the INSERT
statement, as I discuss in this response to a similar question regarding CodeIgniter.
I'm not sure whether the base version of RoR supports that syntax, but it appears to be possible to extend ActiveRecord to handle it. Find out more.
Sequel's Oracle adapter doesn't have native prepared statement support, so it falls back to issuing a regular query. If you can use JRuby, the jdbc adapter has native prepared statement support, so it should just work there. If you can't use JRuby, you'll have to work on adding native prepared statement support to the Oracle adapter. I don't have access to an Oracle installation, so I can't test any support, but I'll be happy to provide advice if you run into problems.
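For example, under JRuby a connection through the jdbc adapter would look roughly like this (the URL, credentials, and table are placeholders based on your example); the prepared statement from your question stays the same and should then run as a real server-side prepared statement:

# JRuby only; requires the Oracle JDBC driver (ojdbc jar) on the classpath.
DB = Sequel.connect('jdbc:oracle:thin:someuser/somepassword@dbhost:1521:dbsid')

prepared_statement = DB[:table_name].prepare(:insert, :name, :value => :$value)
prepared_statement.call :value => 'Some value'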