rake db:seed won't seed all the data in production - ruby

I have a seeds.rb file with data to seed. Not all the data gets seeded, and rake db:seed ends with the message killed in the terminal. The same file, however, works in the development environment.
Here's the part I want seeded:
xls_utility = Roo::Spreadsheet.open('/path/to/data.xlsx')
utilities = []
xls_utility.each do |row|
  utility = Utility.new
  if row[0] != "State"
    ["state_code", "value"].each_with_index do |attribute, index|
      utility[attribute] = row.flatten[index]
    end
    utilities << utility
  end
end
Utility.import utilities

Since I needed to seed data into some tables and rake db:seed didn't seem to work, exporting and importing the table in MySQL was an alternative.
Export from the local machine:
mysqldump -p --user=username dbname utilities > utilities.sql
Import on the production server:
mysql -u username -p -D dbname < utilities.sql
However, I still want to know why Rails seeding won't work. Thanks.
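A likely explanation for the difference: a bare killed message from the shell usually means the kernel's OOM killer terminated the process, and the loop above builds every Utility object in memory before a single bulk import, which a small production box may not survive. A sketch of a batched variant, assuming Utility.import comes from the activerecord-import gem (the 1,000-row batch size is an arbitrary choice):
xls_utility = Roo::Spreadsheet.open('/path/to/data.xlsx')
batch = []
xls_utility.each do |row|
  next if row[0] == "State" # skip the header row

  utility = Utility.new
  ["state_code", "value"].each_with_index do |attribute, index|
    utility[attribute] = row.flatten[index]
  end
  batch << utility

  if batch.size >= 1000
    Utility.import batch # flush to the DB and release the objects
    batch = []
  end
end
Utility.import batch unless batch.empty? # import the final partial batch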

Related

How to run a SQL file in Ruby

I have multiple .sql files which might consist of multiple ALTER TABLE statements, view definitions, or PL/pgSQL function definitions. I want to apply a complete SQL file to a PostgreSQL DB. It should be something similar to
psql -h "HostName" -p "Port" -U "userName" -d "dbname" -f "Sql file location"
I have tried doing PG.connect.execute, and that way I need to load the file and execute each statement separately. Is there a way I can execute an entire SQL file in Ruby?
There are two APIs for communicating with a PostgreSQL database, and both are available as gems. The pg gem provides a Ruby binding to Postgres.
Example:
This code assumes you're accessing the database through TCP/IP on port 5432 of your local machine:
gem 'pg'
require 'pg'

def with_db
  db = PG.connect(
    dbname: 'dbname',
    user: 'user',
    password: 'password'
  )
  begin
    yield db
  ensure
    db.close
  end
end

sql = File.open('path/to/your_sql.sql', 'rb') { |file| file.read }

with_db do |db|
  begin
    db.exec(sql)
  rescue PG::Error
    # handle the error here
  end
end
For more information, check the pg gem's documentation.
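If the file also relies on psql meta-commands (\copy, \i, and the like) that the server cannot execute, another option is to shell out to psql itself from Ruby, mirroring the command line in the question. A sketch, assuming psql is installed on the host and using placeholder connection values:
require 'open3'

env = { 'PGPASSWORD' => 'password' } # psql reads the password from the environment
cmd = ['psql', '-h', 'HostName', '-p', '5432',
       '-U', 'userName', '-d', 'dbname',
       '-f', 'path/to/your_sql.sql',
       '-v', 'ON_ERROR_STOP=1'] # stop at the first failing statement
stdout, stderr, status = Open3.capture3(env, *cmd)
raise "psql failed: #{stderr}" unless status.success?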

Writing to a file in production with sinatra

I cannot write to a file for the life of me using Sinatra in production.
In my development environment, I can use Logger without a problem and log STDOUT to a file.
It seems like in production, the Logger class is overwritten by the Rack middleware's logger, which makes things more complicated.
I simply want to write to a file like this:
post '/' do
  begin
    $log_file = File.open("/home/ec2-user/www/logs/app.log", "w")
    # ...do stuff...
    $log_file.write "INFO -- #{Time.now} --\n #{notification['Message']}"
    # ...do stuff...
  rescue
    $log_file.write "ERROR -- #{Time.now} --" + "\njob failed"
  ensure
    $log_file.close
  end
end
The file doesn't get created when I receive a POST request to '/'.
However the file DOES get created when I load the app running pry:
pry -r ./app.rb
I am certain the code inside the POST block is effectively running, because new jobs are getting added to the database upon receiving requests.
Any help would be greatly appreciated.
I was finally able to get to the bottom of this.
I changed the nginx user in /etc/nginx/nginx.conf from nginx to ec2-user. (Ideally I would just fix the write permissions for the nginx user, but this solution suits me for now.)
Then I ran ps aux | grep unicorn and saw that the timestamp next to the process name unicorn master -c unicorn.rb -D was 3 days old!!
All this time I was pushing my code to the production server and restarting nginx, but never killed and restarted the unicorn process.
I removed all the code in my POST block and left only the file creation part:
post '/' do
  $log_file = File.open("/home/ec2-user/www/logs/app.log", "a")
  $log_file.write("test log string")
  $log_file.close
end
And the file was successfully written to upon receiving a POST request.
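For the original goal of logging from the route, note that File.open with "w" also truncates the file on every request; "a" (as in the final snippet) appends. A sketch using the stdlib Logger opened once at boot, reusing the path and the notification variable from the question:
require 'logger'

# Created once at load time; the path must be writable by the user unicorn runs as.
LOG = Logger.new("/home/ec2-user/www/logs/app.log")

post '/' do
  begin
    # ...do stuff...
    LOG.info(notification['Message'])
  rescue => e
    LOG.error("job failed: #{e.message}")
  end
end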

Thor Start Jekyll then Open Page in Browser

Hi, I want Thor to start a server (Jekyll / Python / PHP etc.) and then open the page in the browser.
However, starting the server is a blocking task.
Is there a way to create a child process in Thor, or to spawn a new terminal window? I couldn't see one, and Google gave me no reasonable answers.
My code:
##
# Project Thor File
#
# #use thor list
##
class IanWarner < Thor
  ##
  # Open Jekyll Server
  #
  # #use thor ian_warner:openServer
  ##
  desc "openServer", "Start the Jekyll Server"
  def openServer
    system("clear")
    say("\n\t")
    say("Start Server\n\t")
    system("jekyll --server 4000 --auto")
    say("Open Site\n\t")
    system("open http://localhost:4000")
    say("\n")
  end
end
It looks like you are mixing things up. Thor is in general a powerful CLI wrapper; a CLI itself is in general single-threaded.
You have two options: either create different Thor descendants and run them as different threads/processes, forcing the open thread/process to wait until Jekyll is running (preferred), or hack around it with system("jekyll --server 4000 --auto &") (note the ampersand at the end).
The latter will work, but you still need to check that the server has actually started (it may take a significant amount of time). A second ugly hack to achieve this is to rely on sleep:
say("Start Server\n\t")
system("jekyll --server 4000 --auto &")
say("Wait for Server\n\t")
system("sleep 3")
say("Open Site\n\t")
system("open http://localhost:4000")
Update: it's hard to guess what you want to achieve. If you want to leave your Jekyll server running after the script has finished:
desc "openServer", "Start the Jekyll Server"
def openServer
system "clear"
say "\n\t"
say "Starting Server…\n\t"
r, w = IO.pipe
# Jekyll will print it’s running status to STDERR
pid = Process.spawn("jekyll --server 4000 --auto", :err=>w)
w.close
say "Spawned with pid=#{pid}"
rr = ''
while (rr += r.sysread(1024)) do
break if rr.include?('WEBrick::HTTPServer#start')
end
Process.detach(pid) # !!! Leave the jekyll running
say "Open Site\n\t"
system "open http://localhost:4000"
end
If you want to shut down Jekyll after the page is opened, you need to spawn the call to open as well and Process.waitpid for it.
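A sketch of that variant, replacing the Process.detach(pid) line above (detaching and waiting on the same pid are mutually exclusive):
# Open the page, wait for `open` to return, then stop the server.
open_pid = Process.spawn("open http://localhost:4000")
Process.waitpid(open_pid)  # `open` exits once the browser has the URL
Process.kill("TERM", pid)  # pid of the Jekyll process spawned above
Process.waitpid(pid)       # reap the Jekyll process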

Looking for a MySQL query command to insert all the queries in local .sql file to the MySQL database on a remote server

Is there a MySQL query command to upload/insert all the queries in a .sql file (generated from mysqldump) on a local server into a MySQL database on a remote server?
I'd like to do this all with MySQL queries from within an application and avoid issuing command-line mysql commands, because I think that would add overhead in parsing the output.
I'm looking for something like this, e.g. in Perl:
my $hostname = "remote_server_address";
my $dsn = "DBI:mysql:dbname:$hostname";
my $user = "user";
my $password = "password";
my $dbh = DBI->connect($dsn, $user, $password);
my $myquery = "SPECIAL_INSERT_QUERYCOMMAND my_local_mysql_query_file.sql";
my $execute = $dbh->prepare($myquery);
$execute->execute;
Update: an additional requirement: is there any flow control and resilience whereby connection issues between the local and remote server are handled, so that the entire set of queries gets transferred? And would there be a limit to the size of the file to be transferred? (I hope not.)
I'm not sure why you couldn't use the command-line tool mysql:
cat my_local_mysql_query_file.sql | mysql -uuser -ppassword dbname
or on Windows:
mysql -uuser -ppassword dbname < my_local_mysql_query_file.sql
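Since this page is Ruby-centric, an in-app equivalent in Ruby: the mysql2 gem's MULTI_STATEMENTS flag lets a single query call run a whole multi-statement dump. A sketch with placeholder credentials (note it offers no retry logic, and the server's max_allowed_packet caps how large a file you can send in one call):
require 'mysql2'

client = Mysql2::Client.new(
  host: 'remote_server_address',
  username: 'user',
  password: 'password',
  database: 'dbname',
  flags: Mysql2::Client::MULTI_STATEMENTS
)
client.query(File.read('my_local_mysql_query_file.sql'))
client.store_result while client.next_result # drain remaining result sets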

Merb & DataMapper - accessing database connection info?

I'm using Merb and DataMapper with a MySQL db. I want to access the database name, user, and password from a Rake task for my Merb app. I guess I could YAML.load() the database.yml, but that seems ugly. Any ideas?
desc "outputs database connection parameters"
task :db_conn => :merb_env do |t|
puts "Username: #{DataMapper.repository.adapter.uri.user}"
puts "Password: #{DataMapper.repository.adapter.uri.password}"
puts "Database: #{DataMapper.repository.adapter.uri.path.split('/').last}"
end
The interesting part there is the => :merb_env bit: it ensures that the merb_env task has executed before your task does. This simply loads up the Merb environment, at which point you can proceed to inspect its configuration.
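The prerequisite mechanism is plain Rake, so a stripped-down illustration with hypothetical task names shows the ordering guarantee:
task :load_env do
  puts "environment loaded"
end

desc "runs only after its prerequisite"
task :report => :load_env do
  puts "now inspecting configuration"
end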
