I have multiple .sql files, each of which may contain several ALTER TABLE statements, view definitions, or PL/pgSQL function definitions. I want to apply a complete SQL file to a PostgreSQL DB, something similar to:
psql -h "HostName" -p "Port" -U "UserName" -d "DatabaseName" -f "SQL file location"
I have tried PG.connect with exec, but that way I need to load the file and execute each statement separately. Is there a way to execute an entire SQL file from Ruby?
The usual way to communicate with a PostgreSQL database from Ruby is the pg gem, which provides Ruby bindings to PostgreSQL (a C extension built on libpq).
Example:
This code assumes you’re accessing the database through TCP/IP on port 5432 of
your local machine:
# Gemfile: gem 'pg'
require 'pg'

# Open a connection, yield it to the block, and always close it afterwards.
def with_db
  db = PG.connect(
    host: 'localhost',
    port: 5432,
    dbname: 'dbname',
    user: 'user',
    password: 'password'
  )
  begin
    yield db
  ensure
    db.close
  end
end

# PG::Connection#exec accepts a string containing multiple statements,
# so the whole file can be sent in a single call.
sql = File.read('path/to/your_sql.sql')

with_db do |db|
  begin
    db.exec(sql)
  rescue PG::Error => e
    warn "Executing the file failed: #{e.message}"
  end
end
For more information, see the pg gem's documentation.
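If you'd rather keep the exact psql semantics from the question, another option is to shell out to psql itself. A minimal sketch; psql_args is a made-up helper name, the connection values are placeholders, and ON_ERROR_STOP makes psql abort on the first failing statement:

```ruby
# Build the argument vector for psql. psql_args is a hypothetical helper.
def psql_args(host:, port:, user:, dbname:, file:)
  ["psql",
   "-h", host,
   "-p", port.to_s,
   "-U", user,
   "-d", dbname,
   "-v", "ON_ERROR_STOP=1", # abort on the first failing statement
   "-f", file]
end

# Usage: pass the password via the environment, not the command line.
# system({ "PGPASSWORD" => "secret" },
#        *psql_args(host: "localhost", port: 5432, user: "postgres",
#                   dbname: "mydb", file: "path/to/your_sql.sql"))
```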
I am running a simple login automation test using Capybara; no database is used. I am trying to use the parallel_tests gem to invoke multiple test sessions at the same time (Chrome browser) using the following command:
parallel_rspec -n 2
Two browsers are invoked, but only the first one is launched with the correct URL; the second one is simply blank. Also, the login and password values are concatenated twice (a data collision) in the first browser's input fields.
Framework: non-Rails Ruby with Capybara.
config/database.yml
test: &TEST
  adapter: postgresql
  host: localhost
  database: test_db<%= ENV['TEST_ENV_NUMBER'] %>
  encoding: utf8
  username: postgres
  password: postgres
spec_file - 1
describe '' do
  it '' do
    ENV['TEST_ENV_NUMBER'] = ''
  end
end
spec_file - 2
describe '' do
  it '' do
    ENV['TEST_ENV_NUMBER'] = '2'
  end
end
Both spec files are in the same folder. There is very little guidance for non-Rails setups; any help would be much appreciated.
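In a non-Rails suite nothing renders the ERB tag in database.yml for you, so it has to be evaluated by hand when loading the config. A minimal sketch assuming only the YAML above; test_db_config is a made-up helper, and parallel_tests sets TEST_ENV_NUMBER itself per process (empty for the first, "2", "3", ... for the rest):

```ruby
require 'erb'
require 'yaml'

# Render the ERB tags in a database.yml string and return the "test" section.
# test_db_config is a hypothetical helper name.
def test_db_config(yaml_text)
  YAML.load(ERB.new(yaml_text).result)["test"]
end

yaml = <<~YML
  test: &TEST
    adapter: postgresql
    host: localhost
    database: test_db<%= ENV['TEST_ENV_NUMBER'] %>
YML

ENV['TEST_ENV_NUMBER'] = '2'
test_db_config(yaml)["database"]  # => "test_db2"
```

Each per-process database (test_db, test_db2, ...) must exist before the run.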
My Oracle DB is only accessible via a jumpoff server and is load balanced. As a result I run the following background tunnel command in bash:
ssh ${jumpoffUser}@${jumpoffIp} -L1521:ont-db01-vip:1521 -L1522:ont-db02-vip:1521 -fN
before I run my commands on the DB using sqlplus, like so:
sqlplus #{@sqlUsername}/#{@sqlPassword}@'#{@sqlUrl}' @scripts/populateASDB.sql
This all works fine.
Now I want to port this procedure to Ruby. Looking through the Ruby documentation I could not find how to put the tunnel in the background (which would be my preference), but I found documentation on local port forwarding, which I thought would emulate the above tunnel and the subsequent sqlplus command.
Here is my code:
Net::SSH.start( @jumpoffIp, @jumpoffUser ) do |session|
  session.forward.local( 1521, 'ont-db01-vip', 1521 )
  session.forward.local( 1522, 'ont-db02-vip', 1521 )
  puts "About to populateDB"
  res = %x[sqlplus #{@sqlUsername}/#{@sqlPassword}@'#{@sqlUrl}' @scripts/populateASDB.sql > output.txt]
  puts "populateDb output #{res}"
  session.loop
end
When I run the above I see the line "About to populateDB", but it hangs on the actual sqlplus command. Is there something wrong with my port-forwarding code? Or how do I translate the following:
ssh ${jumpoffUser}@${jumpoffIp} -L1521:ont-db01-vip:1521 -L1522:ont-db02-vip:1521 -fN
into Ruby code?
Try the net-ssh-gateway gem: https://github.com/net-ssh/net-ssh-gateway/

require 'net/ssh/gateway'

gateway = Net::SSH::Gateway.new(@jumpoffIp, @jumpoffUser)
gateway.open('ont-db01-vip', 1521, 1521)
gateway.open('ont-db02-vip', 1521, 1522)

res = %x[sqlplus #{@sqlUsername}/#{@sqlPassword}@'#{@sqlUrl}' @scripts/populateASDB.sql > output.txt]
puts "populateDb output #{res}"

gateway.shutdown!
You have two problems:
1) You need to use session.loop { true } so that the session actually keeps looping.
2) You don't start looping the session until your sqlplus command is done, but sqlplus needs the session looping (the forwarding up) while it runs.
So I suggest creating a background thread with Thread.new, then killing the thread once sqlplus is done.
Thanks to David's answer, I came up with the following:
Net::SSH.start(ip_addr, 'user') do |session|
  session.forward.local( 9090, 'localhost', 9090 )
  # Run the event loop in a background thread so the SSH callbacks
  # (and hence the forwarding) are serviced while the commands run.
  t = Thread.new do
    session.loop { true }
  end
  commands.each do |command|
    command.call(9090)
  end
  t.kill
end
I have a seeds.rb file with data to seed. Not all the data gets seeded: rake db:seed ends with the message "Killed" in the terminal. The same file, however, works in the development environment.
Here's the part I want seeded:
xls_utility = Roo::Spreadsheet.open('/path/to/data.xlsx')

utilities = []
xls_utility.each do |row|
  utility = Utility.new
  if row[0] != "State" # skip the header row
    ["state_code", "value"].each_with_index do |attribute, index|
      utility[attribute] = row.flatten[index]
    end
    utilities << utility
  end
end
Utility.import utilities
Since I needed to seed data into some tables and rake db:seed didn't seem to work, exporting and importing the table in MySQL was an alternative.
Export from local:
mysqldump -p --user=username dbname utilities > utilities.sql
Import on production:
mysql -u username -p -D dbname < utilities.sql
However, I still want to know why Rails seeding won't work. Thanks.
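"Killed" in the terminal usually means the operating system's out-of-memory killer stopped the process, and the loop above builds every record in memory before a single import. One mitigation is to import in fixed-size slices so memory stays bounded. A sketch; import_in_batches is a made-up helper, and the Utility.import call is only indicated in a comment:

```ruby
# Yield rows to the importer in fixed-size batches instead of all at once.
# import_in_batches is a hypothetical helper; returns the number of rows seen.
def import_in_batches(rows, batch_size: 1_000)
  imported = 0
  rows.each_slice(batch_size) do |batch|
    # In seeds.rb this would be something like: Utility.import(batch)
    yield batch if block_given?
    imported += batch.size
  end
  imported
end
```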
Is there a MySQL query command to upload/insert all the queries in a .sql file (generated from mysqldump) on a local server to a mysql database on a remote server?
I'd like to do this entirely with MySQL queries from within an application and avoid issuing command-line mysql commands, because I think there would be a bit more overhead in parsing the output that way.
I'm looking for something like, e.g. in Perl:
use DBI;

my $hostname = "remote_server_address";
my $dsn      = "DBI:mysql:dbname:$hostname";
my $user     = "user";
my $password = "password";

my $dbh = DBI->connect($dsn, $user, $password);

my $myquery = "SPECIAL_INSERT_QUERYCOMMAND my_local_mysql_query_file.sql";
my $execute = $dbh->prepare($myquery);
$execute->execute;
Update: an additional requirement: is there any flow control and resilience whereby connection issues between the local and remote server are handled, so that the entire set of queries gets transferred? And is there a limit to the size of the file to be transferred? (I hope not.)
I'm not sure why you couldn't use the command-line tool mysql:
cat my_local_mysql_query_file.sql | mysql -uuser -ppassword dbname
or on windows
mysql -uuser -ppassword dbname < my_local_mysql_query_file.sql
This is pretty weird. I have my public key added on the host machine. I can simply run
ssh -p <port> -l <username> hostt.com
which opens the remote shell. I can even run my Capistrano deployment scripts against the same machine. But when I try to connect with the following simple Ruby script,
require 'rubygems'
require 'net/ssh'

Net::SSH.start("hostt.com",
               :port => <port>,
               :username => <username>) do |session|
  puts session.pwd
end
it refuses immediately with the following exception:
`initialize': Connection refused - connect(2) (Errno::ECONNREFUSED)
Is there anything I'm missing here?
Appreciate your help.
Okay, looking back at the problem a few days later, I got it working quickly with the following tweak:
Net::SSH.start("<host>", "<user>", :port => "<port>") { |ssh|
  puts "logged in"
  puts ssh.exec!("ls -l")
} rescue puts "failed to connect."
So the difference from the previous attempt is the username, which in this case is passed as the second positional argument rather than as an option key.
You probably need to provide the location of your SSH key, or a password to use with the username you pass to Net::SSH.start. For keys, you need to pass the value as an array: :keys => ["path_to_key"]. I'm not sure why the API is set up that way, but it is.
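For example, a sketch; ssh_options is a made-up helper that just builds the options hash Net::SSH.start expects, and the key path is a placeholder:

```ruby
# Build an options hash for Net::SSH.start; :keys must be an Array of paths.
# ssh_options is a hypothetical helper name.
def ssh_options(port:, key_path: nil, password: nil)
  opts = { port: port }
  opts[:keys] = [key_path] if key_path # note: an Array, not a bare String
  opts[:password] = password if password
  opts
end

# Usage (requires the net-ssh gem):
# Net::SSH.start("hostt.com", "<username>",
#                ssh_options(port: <port>, key_path: "~/.ssh/id_rsa")) do |ssh|
#   puts ssh.exec!("ls -l")
# end
```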