I have a problem migrating my SQLite3 database to PostgreSQL. What do I need to do, and how?
I've been searching the internet, but I can only find guides for migrating from MySQL to PostgreSQL.
Can anyone help me?
I need to convert my SQLite database to PostgreSQL database for Heroku cloud hosting.
You don't want to try to do a binary conversion.
Instead, export the data and then import it, or use the query languages of both databases and move the data with selects and inserts.
I HIGHLY recommend you look at Sequel. It's a great ORM that makes switching between DBMs very easy.
Read through the opening page and you'll get the idea. Follow that by reading through the cheat sheet and the rest of the documentation and you'll quickly see how easy and flexible it is to use.
Read about migrations in Sequel. They're akin to migrations in Rails, and make it very easy to develop a schema and maintain it across various systems.
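For instance, a minimal Sequel migration (the posts table and its columns here are just an illustration, not anything from your schema) looks like this:

# db/migrations/001_create_posts.rb -- hypothetical example
Sequel.migration do
  change do
    create_table(:posts) do
      primary_key :id
      String :title, null: false
      String :body, text: true
      DateTime :created_at
    end
  end
end

You can then run a directory of these against either database with the bundled command-line runner, e.g. sequel -m db/migrations postgres://user:password@localhost/my_db.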
Sequel makes it easy to open and read the SQLite3 database and concurrently open a PostgreSQL database and write to it. For instance, this is a slightly modified version of the first two lines of the "cheat sheet":
SQLITE_DB = Sequel.sqlite('my_blog.db')
PGSQL_DB = Sequel.connect('postgres://user:password@localhost/my_db')
Base all your subsequent interactions with either database on SQLITE_DB and PGSQL_DB and you'll be on your way to porting the data.
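For example, assuming the target tables already exist on the PostgreSQL side (created by hand or by a migration), a copy can be as simple as this sketch (the :posts table name is made up for illustration):

# Copy one table row by row.
SQLITE_DB[:posts].each do |row|
  PGSQL_DB[:posts].insert(row)
end

# Or copy every table in one go (fine for small databases;
# large tables should be copied in batches instead).
SQLITE_DB.tables.each do |table|
  PGSQL_DB[table].multi_insert(SQLITE_DB[table].all)
end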
The author of Sequel is very responsive and is a big fan of PostgreSQL, so the ORM has great integration with all its features.
We are developing a large data migration from Oracle DB (12c) to another system with SSIS. The developers are using a copy of the production database, but, due to the complexity of the data transformation, we have to do things in stages by preprocessing data into intermediate helper tables which are then used further downstream. The problem is that all developers are using the same database and screw each other up by running things simultaneously. Does Oracle DB offer anything in terms of developer sandboxing? We could build a mechanism to handle this (e.g. have a dev ID in the helper tables, then query views that map to the dev), but I'd much rather use built-in functionality. Could I use Oracle Multitenant for this?
We ended up producing a master subset database of select schemas/tables through some fairly elaborate PL/SQL, then made several copies of this master schema so each dev has his/her own sandbox (as Alex suggested). We could have used Oracle Data Masking and Subsetting, but it's too expensive. Another option for creating the subset database would have been to use Jailer. I should note that we didn't have a need to mask any sensitive data.
Note: I would think this is a fairly common problem, so if new tools and solutions arise, please post them here as answers.
As part of my bachelor's thesis I'm building a microservice using Postgres which would replace part of an existing application that uses MongoDB. To change as little as possible on the client side for now, I was wondering if there is an easy way to translate a Mongoid::Criteria to a SQL query (assuming all fields are named the same, of course) without having to write a complete parser myself. Are there any gems out there that might support this?
Any input is highly appreciated.
Maybe you're looking for this: https://github.com/stripe/mosql.
I haven't dug into it, but it seems to do what you need:
"MoSQL imports the contents of your MongoDB database cluster into a PostgreSQL instance, using an oplog tailer to keep the SQL mirror live up-to-date. This lets you run production services against a MongoDB database, and then run offline analytics or reporting using the full power of SQL."
I am working on an application which requires showing a friend list up to the 4th degree. After some research I came across one possible solution: Neo4j.
I didn't get a clear idea from their tutorial: can I connect Neo4j to MySQL, and if not, how should I implement that myself? I am currently using the CodeIgniter framework with MySQL.
Thanks.
Neo4j is a database, and MySQL is a database. So, this question is largely about connecting databases from different vendors together.
At this time, Neo4j and MySQL do not support direct connections to each other. You'd typically accomplish your desired task by exporting your data from MySQL as CSV files (http://www.mysqltutorial.org/mysql-export-table-to-csv/) and importing them into Neo4j (http://jexp.de/blog/2014/06/load-csv-into-neo4j-quickly-and-successfully/).
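Just to illustrate the export half (the table, columns, and credentials here are made up, and any language, including PHP, works just as well), a small script with Ruby's mysql2 gem could produce the CSV for LOAD CSV:

require 'csv'
require 'mysql2'

# Dump a (hypothetical) friendships table to CSV for Neo4j's LOAD CSV WITH HEADERS.
client = Mysql2::Client.new(host: 'localhost', username: 'app', password: 'secret', database: 'myapp')

CSV.open('friendships.csv', 'w') do |csv|
  csv << %w[user_id friend_id]
  client.query('SELECT user_id, friend_id FROM friendships').each do |row|
    csv << [row['user_id'], row['friend_id']]
  end
end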
Michael Hunger, a colleague of mine at Neo4j, recently wrote this auto importer. You might want to check it out to make this process much easier:
https://github.com/jexp/neo4j-rdbms-import
Before going through this data export/import, you may just want to download Neo4j and play with the movie dataset. You can do this in about a minute (https://www.youtube.com/watch?v=om6E-HqtrZ0).
Then, there are standalone PHP drivers for Neo4j:
http://neo4j.com/developer/php/
Josh Adell, the author of Neo4jPHP, has even written a post about how to use CodeIgniter 2 with his library:
http://blog.everymansoftware.com/2011/08/getting-neo4jphp-working-with.html
We are using Liquibase as an evolutionary DB change management tool in our applications, and it works great when we use it on "common" database schemas.
But we also work with GIS applications using the ESRI ArcSDE 9.3 platform over Oracle, and in this case all (or almost all) tables in the schema (both GIS and 'alphanumeric' tables) are managed (create table, grants, etc.) through ArcSDE. So when we want to create a new feature class we now use ArcCatalog, and this way it's not possible to manage the feature classes' changes directly through SQL using Liquibase or another automated refactoring tool.
So if we cannot use Liquibase to manage changes, we at least want to execute the management operations on our feature classes from the command line.
We've started looking for tools that avoid the use of ArcCatalog and let us automate the changes using scripts. We are investigating these possibilities:
Try to capture the SQL that ArcCatalog/ArcSDE executes each time we make a change to a feature class by monitoring the Oracle connection. This produces a set of SQL instructions that is far too complex (it involves indexes, versioning tables, etc.), so we gave up on this approach.
Use the sdelayer and sdetable admin commands installed on the ArcSDE server.
Use the data management tool: a Python-based library to manage the feature classes, but it has to be executed from a machine with the desktop version installed.
These last two options would give us a way to manage feature classes from the command line, but our goal is to find or develop a tool that manages changes the way Liquibase does. With these tools we would still need something that maps each SQL DDL operation to an ArcSDE command, and currently no DB refactoring tool provides this (so far we have checked Liquibase, dbdeploy, and Flyway).
Has anybody solved this problem of evolutionary change management with ArcSDE? Any insight into another way to approach this problem?
I'll take a stab at this, although I'm unfamiliar with one of the products you mention (Liquibase specifically). I have used Oracle and I'm very familiar with ArcGIS (ArcMap & ArcCatalog).
Here's some additional information that may help, along with my interpretation of your question.
My interpretation - "What's a simple way to manage or enable us to automate the management of our tables of GIS data in our Oracle database without having to use ArcCatalog all the time?"
So - I'll throw this concept back in the ring - I know SQL Server has spatial datatypes ("geometry", etc.) and that you can bypass SDE and let ArcGIS connect directly and interpret this data without even installing SDE. I also know Oracle has compatible spatial types. So I would consider migrating the data out of the managed feature classes that ArcCatalog creates and pushing it into Oracle-native geometry-based tables. This way you can treat them like regular tables, cut ESRI out of the solution, and manage them with Liquibase, etc. Hopefully that helps.
I would also consider upgrading to 10.1, or at least 10.0 (I promise I'm not an undercover salesman), although that will require your users to come with you on the client side (http://resources.arcgis.com/en/help/main/10.1/index.html#//002q000000n8000000), because the newer Python APIs are much easier and faster to use (arcpy vs. the GP model) if you do choose to use Python to manage your stuff. (Regardless, neither API is very well developed, intuitive to code in, or fast.)
Good luck.
I'm trying to move snapshots of data from our MongoDB into our Oracle BI data store.
The BI team has asked me to make the data available for ODI, but I haven't been able to find an example of that being done.
Is it possible, and what do I need to implement it?
If there is a more generic way of getting MongoDB data into Oracle then I'm happy to propose that as well.
Versions
MongoDB: 2.0.1
ODI: 11.1.1.5
Oracle: 11.2g
Edit:
This is something that will be queried once a day, maybe twice, but at this stage the BI report granularity is daily.
In ODI, under the Topology tab and Physical Architecture sub-tab, you can see all technologies that are supported out of the box. MongoDB is not one of them. There are also no Knowledge Modules available for importing/exporting from/to MongoDB.
ODI supports implementing your own technologies and your own Knowledge Modules.
This manual will get you started with developing your own Knowledge Module, and in one of the other manuals I'm sure you can find an explanation of how to implement your own technologies. (Ctrl-F for "Data integrator")
If you're lucky, you might find someone else who has already implemented it. Your best places to look would be The Oracle Technology Network Forum, or a forum related to MongoDB.
Instead of creating a direct link, you could also take an easier workaround. Export the data from MongoDB to a format that ODI supports and that MongoDB can extract to - CSV or XML maybe? Then load the data through ODI into the Oracle database. I think that will be the best option, unless you have to do this frequently.
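As a rough sketch of that workaround (the collection and fields are made up; mongoexport --csv or any other tool would do the same job, and note that the current mongo Ruby driver may not talk to a server as old as 2.0.1), a small script could write a flat file that ODI then loads with its File technology:

require 'csv'
require 'mongo'

# Hypothetical database, collection, and fields -- adjust to your schema.
client = Mongo::Client.new('mongodb://localhost:27017/reporting')

CSV.open('daily_snapshot.csv', 'w') do |csv|
  csv << %w[id customer_id amount created_at]
  client[:orders].find.each do |doc|
    csv << [doc['_id'], doc['customer_id'], doc['amount'], doc['created_at']]
  end
end

Schedule that once a day (which matches your granularity) and point an ODI interface at the resulting file.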
Look at the blog post below for an option:
https://blogs.oracle.com/dataintegration/entry/odi_mongodb_and_a_java
Cheers
David