Sinatra + Chartkick + Sequel gem, chart not updating - ruby

I'm running a very basic Sinatra server, which simply shows a Chartkick graph of some data I have through the Sequel gem. I'm noticing that the data on the chart doesn't update unless I quit the Sinatra server script and rerun it. I don't really understand how that's possible... the only non-standard option I'm using when reading my database with Sequel is the read-only option. Would that cause this?

It turns out, from reading another post on here:
First, by default, multiple processes can have the same SQLite
database open at the same time, and several read accesses can be
satisfied in parallel.
In the case of writing, a single write to the database locks the database
for a short time, during which nothing, not even reading, can access the
database file at all.
Beginning with version 3.7.0, a new “Write Ahead Logging” (WAL) option
is available, in which reading and writing can proceed concurrently.
By default, WAL is not enabled. To turn WAL on, refer to the SQLite
documentation.
I currently have script A, which maintains a connection to the DB file and writes to it regularly, and script B, my Sinatra server, which reads from that DB file. I worked around the issue by using a block connection (Sequel.connect with a block) in my Sinatra script. I still don't know how to turn WAL on with Sequel, though...
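In case it helps anyone else, here is a rough, untested sketch of both ideas. The database path, table name, column, and route are placeholders, not from the original setup. Sequel passes raw SQL straight through, so the WAL pragma can be sent with Database#run; it only needs to be run once by a connection that can write (i.e. in script A), because the journal mode is stored in the database file itself:

# --- script A (the writer) ---
require "sequel"

DB = Sequel.sqlite("data.db")
DB.run "PRAGMA journal_mode=WAL;"   # plain SQL; the mode is saved in the db file

# --- script B (the Sinatra server) ---
require "sinatra"
require "sequel"
require "json"

get "/readings" do
  # Block form: the connection is opened for this request and closed when the
  # block returns, so every request sees the latest committed data.
  Sequel.connect("sqlite://data.db", readonly: true) do |db|
    db[:readings].order(:recorded_at).all.to_json
  end
end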

Related

AWS RDS database can't read record that was just written to database

I'm seeing an error with some Laravel code that uses an AWS RDS database. The code writes a record to the database and then immediately does a search to load that record using the primary key and gets no results.
If I try it manually afterwards I find the record. If I insert a 1-second sleep in the code it works correctly.
I've tried this using Laravel's separate settings for read and write hosts. I've also tried setting them to the same host and only using one host. The result is always the same. However other environments with the same configuration do not have the error.
Is there an option in RDS that needs to be changed to have the record available immediately after it's written?
The error is due to MySQL master-slave replication lag.
A common mistake is to use a MySQL cluster and then perform a read
immediately after a write.
Since the read occurs on one of the slave/read hosts and the write occurs on the master, the data may not yet have been replicated at the time of the read.
There are a couple of ways to rectify the error:
The read immediately after the write must be performed on the master (not a slave). Even though you've mentioned that you changed it to a single host, people often make mistakes when switching the connection. Refer to this SO post on how to properly switch connections in Laravel.
An easier way may be to use the sticky database option in Laravel. Beware: this may cause performance issues if it isn't limited to the specific use case you need it for. From the docs:
The sticky option is an optional value that can be used to allow the
immediate reading of records that have been written to the database
during the current request cycle.
If the sticky option is enabled and a "write" operation has been
performed against the database during the current request cycle, any
further "read" operations will use the "write" connection.
The most "non-obvious" way is to NOT perform a read immediately after a write. Think about whether this can be avoided depending on your use case.
Other methods: refer to this SO post.

iMessage app storage location - why is chat.db-wal updated instantly but chat.db takes a while?

So I've been playing around with iMessages and thinking of ways to back them up, among other things.
I found their location at ~/Library/Messages.
There are three files
1. chat.db
2. chat.db-wal
3. chat.db-shm
If I run a node script that watches for file changes while sending an iMessage to someone, I see that chat.db-wal changes instantly but chat.db takes a while to update.
I would like to get the messages as soon as possible, but I am not sure I can read the .db-wal file. Anyone know if I can read that file? Or why the .db file seems to take longer to update?
Thanks.
Everything is fine. Your data is there. This is just how SQLite works.
In order to support ACID transactions, where your data is guaranteed to be stored properly in the case of crashes or power-offs, SQLite first writes your data into a "write-ahead log" (the *-wal file). When the database is properly closed, or the write-ahead log gets too full, SQLite will update the database file with the contents of the log.
SQLite, when reading, will consult the write-ahead log first, even if multiple connections are using the same database. Data in the log is still "in the database".
SQLite should apply the log to the database as part of closing the database. If it does not, you can run PRAGMA wal_checkpoint; to manually checkpoint the log file.
Corollary to this: do not delete the -wal file, especially if you have not cleanly closed the database last time you used it.
More information about write-ahead logging in SQLite can be found in the SQLite documentation.
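To make that concrete, here is a minimal Ruby sketch using the sqlite3 gem (any SQLite client behaves the same way; the message table and the COUNT query are only illustrative, since the chat.db schema varies between macOS versions):

require "sqlite3"

# Open the database read-only; SQLite consults chat.db-wal automatically, so
# rows that haven't been checkpointed into chat.db yet are still returned.
db = SQLite3::Database.new(File.expand_path("~/Library/Messages/chat.db"),
                           readonly: true)
puts db.get_first_value("SELECT COUNT(*) FROM message")

# On a connection that is allowed to write, the log can be folded into the
# main file manually:
# db.execute("PRAGMA wal_checkpoint;")

db.close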

Simple Local Database Solution for Ruby?

I'm attempting to write a simple Ruby/Nokogiri scraper to get event information from multiple pages and then output it to a CSV that is attached to an email sent out weekly.
I have completed the scraping components and the CSV component and it's working perfectly. However, I now realize that I need to know when new events are added, which means I need some sort of database. Ideally I would just store this locally.
I've dabbled a bit with the Ruby gem 'sequel', but the data doesn't seem to persist beyond a single run of the program. Do I need to download some database software to work with 'sequel'? Also, I'm not using the Rails framework, just Ruby.
Any and all guidance is deeply appreciated!
I'm guessing you did Sequel.sqlite, as in the first example in the Sequel README, which creates an in-memory SQLite database. To create a database in your filesystem instead of memory, just pass it a path, e.g.:
Sequel.sqlite("./my-database.db")
This is, of course, assuming that you have the sqlite3 gem installed. If the given file doesn't exist, it will be created.
This is covered in the Sequel docs.
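A small sketch of what a file-backed setup for the scraper might look like (the table and column names here are made up for illustration):

require "date"
require "sequel"

DB = Sequel.sqlite("./events.db")     # stored on disk, not in memory

DB.create_table? :events do           # no-op if the table already exists
  primary_key :id
  String :title
  String :url, unique: true
  Date   :starts_on
end

events  = DB[:events]
scraped = { title: "Open Mic Night",
            url: "https://example.com/open-mic",
            starts_on: Date.new(2024, 6, 1) }

if events.where(url: scraped[:url]).empty?
  events.insert(scraped)              # only previously unseen events get inserted
  puts "New event: #{scraped[:title]}"
end

The rows survive between runs, so detecting new events is just a matter of checking whether a matching row already exists before inserting.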

Viewing SCHEMA logs for postgresql in Rails

I'm having a hard time getting the logging to include Postgres schema changes.
I've isolated the line of code that blocks them, but rather than being a configurable option, it seems to be hard-coded that they don't appear. Right now I'm chaining off the method that sets the schema_search_path in order to log them, but the SQL is already being executed through the normal (logged) method, so it'd be nice to take advantage of that. Here's the offending line of code:
https://github.com/rails/rails/blob/v3.2.13/activerecord/lib/active_record/log_subscriber.rb#L27
How would I go about subscribing to those logs too?
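One possible workaround (a sketch, not confirmed anywhere in this thread): subscribe to the same "sql.active_record" notifications the LogSubscriber listens to, but without its SCHEMA filter, and log those statements yourself, e.g. from an initializer:

# config/initializers/log_schema_sql.rb (the file name is just an example)
ActiveSupport::Notifications.subscribe("sql.active_record") do |_name, start, finish, _id, payload|
  if payload[:name] == "SCHEMA"
    ms = ((finish - start) * 1000).round(1)
    Rails.logger.debug "  SCHEMA (#{ms}ms)  #{payload[:sql]}"
  end
end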

Can you connect to a MS Access database from Ruby running on a Mac?

I'm pretty sure the answer is "no" but I thought I'd check.
Background:
I have some legacy data in Access, need to get it into MySQL, which will be the DB server for a Ruby application that uses this legacy data.
Data has to be processed and transformed. Access and MySQL schemas are totally different. I want to write a rake task in Ruby to do the migration.
I'm planning to use the techniques outlined in this blog post: Using Ruby and ADO to Work with Access Databases. But I could use a different technique if it solves the problem.
I'm comfortable working on Unix-like computers, such as Macs. I avoid working in Windows because it fills me with deep existential horror.
Is there a practical way that I can write and run my rake task on my Mac and have it reach across the network to the grunting Mordor that is my Windows box and delicately pluck the data out like a team of commandos rescuing a group of hostages? Or do I have to just write this and run it on Windows?
Why don't you export it from MS-Access into Excel or CSV files and then import it into a separate MySQL database? Then you can rake the new one to your heart's content.
Mac ODBC drivers that can open Access databases are available for about $30.00; http://www.actualtechnologies.com/product_access.php is one. I just run Access inside VMware on my Mac and export to CSV/Excel as CodeSlave mentioned.
ODBC might be handy in case you want to use the Access database for a more direct transfer.
Hope that helps.
I had a similar issue where I wanted to use Ruby with SQL Server. The best solution I found was using JRuby with the Java JDBC drivers. I'm guessing this will work with Access as well, but I don't know anything about Access.
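For reference, the ADO route from the blog post linked in the question looks roughly like this. It's an untested sketch that has to run under a Windows Ruby on the box that has Access installed, and the .mdb path, table, and column names are placeholders; the rows pulled out this way could then be transformed and written to MySQL with Sequel or any other client:

require "win32ole"

connection = WIN32OLE.new("ADODB.Connection")
connection.Open("Provider=Microsoft.Jet.OLEDB.4.0;" \
                "Data Source=C:\\legacy\\old_data.mdb")

recordset = WIN32OLE.new("ADODB.Recordset")
recordset.Open("SELECT * FROM Customers", connection)

until recordset.EOF
  puts recordset.Fields.Item("Name").Value   # column access is by name
  recordset.MoveNext
end

recordset.Close
connection.Close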
