I would like to know if I can dump an ArangoDB database with Go.
I know how to use the ArangoDB Go driver, but there is no example that shows how to dump the database.
I am also aware of how to dump the database on the command line.
The best way is to use the arangodump and arangorestore tools provided by ArangoDB to dump and restore a database.
If needed, call those executables from Go; a minimal sketch using os/exec follows the links below.
ArangoDump: https://www.arangodb.com/docs/stable/programs-arangodump.html
ArangoRestore: https://www.arangodb.com/docs/stable/programs-arangorestore.html
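As an illustration, a minimal sketch of shelling out to arangodump with os/exec might look like this (the endpoint, credentials, database name, and output directory are placeholders to adjust for your setup):

    package main

    import (
        "fmt"
        "os/exec"
    )

    func main() {
        // Run arangodump as an external process; adjust the flags for your setup.
        cmd := exec.Command("arangodump",
            "--server.endpoint", "tcp://127.0.0.1:8529",
            "--server.username", "root",
            "--server.password", "",
            "--server.database", "mydb",       // placeholder database name
            "--output-directory", "dump",      // placeholder output directory
        )
        out, err := cmd.CombinedOutput()
        if err != nil {
            fmt.Printf("arangodump failed: %v\n%s", err, out)
            return
        }
        fmt.Printf("dump finished:\n%s", out)
    }

arangorestore can be invoked the same way, pointing --input-directory at the dump.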
Related
I am currently setting up SONAR with the in-memory db for an evaluation. Should we wish to use the tool, I would like to then migrate the analysis results onto an Oracle db to use going forward. Is this possible?
No tool is provided to do such a migration, and I advise you not to try to do so.
However, be aware that you can replay the history of your analyses: check out old versions of your code and launch an analysis on each one, using the "sonar.projectDate" parameter to set the date.
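For example, assuming you run the standalone scanner, each replayed analysis could be launched with the date of that revision (the property expects the yyyy-MM-dd format):

    sonar-scanner -Dsonar.projectDate=2014-06-30

Maven users would pass the same property to mvn sonar:sonar.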
I am a little lost with the current documentation I have.
I am trying to access an Oracle server within a network, using a Debian box. Ideally, I need to set up a cron job that runs my scripts through SQL*Plus so they execute periodically.
My scripts are ready, but I am not sure how to do the installation part, or what to install in order to get access to SQL*Plus. Do I need the full-fledged Oracle client? Oracle XE? Will SquirrelSQL work?
Thanks a lot!
You just need the Oracle client, not the full database install; you can download it here. From the sound of it, you don't even need the full client.
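As a rough sketch, assuming you install the Instant Client (Basic plus the SQL*Plus package) under /opt/oracle/instantclient, a wrapper script could look like this (paths, credentials, and the connect string are placeholders):

    #!/bin/sh
    # run_report.sh - run a SQL script through SQL*Plus from the Instant Client
    export LD_LIBRARY_PATH=/opt/oracle/instantclient
    /opt/oracle/instantclient/sqlplus -S myuser/mypassword@//dbhost:1521/MYSERVICE @/home/me/scripts/report.sql

with a crontab entry such as the following to run it every night at 02:00:

    0 2 * * * /home/me/run_report.sh >> /home/me/report.log 2>&1

Make sure the SQL script ends with EXIT so the SQL*Plus session closes when it finishes.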
The Oracle Data Pump export utility expects a DIRECTORY parameter (a directory object listed in DBA_DIRECTORIES) that exists on the DB server. Is it possible to map this directory to the local machine, or is there any other way to export multiple tables from an Oracle database to the local machine?
If you use Data Pump, there is no direct way to store a dump file on your local machine; that is simply how Data Pump is designed.
However, there is one possible way to achieve what you want. The workaround has two steps:
Run expdp as usual, which creates a dump file on the server (see the example below)
Use the ocp tool to transfer the dump file from the database server to your local machine (and back, if you want).
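For step 1, a plain schema export might look like this (the credentials, connect identifier, schema, and file names are placeholders; DATA_PUMP_DIR is the server-side directory object):

    expdp scott/tiger@orcl schemas=scott directory=DATA_PUMP_DIR dumpfile=remote_file_name.dmp logfile=expdp_scott.log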
ocp stands for "Oracle Copy" and was written exactly for the purpose of copying dump files back and forth between a database server and a client. It is available here: https://github.com/maxsatula/ocp/releases/download/v0.1/ocp-0.1.tar.gz That is a source distribution, so once it is downloaded and unpacked, run ./configure && make
(Hopefully you do not have Windows on the client side, because I have never tried to compile it there.)
It is a simple command-line tool with simple syntax. For example, this command will pull a file for you:
ocp <connection_string> DATA_PUMP_DIR:remote_file_name.dmp local_file_name.dmp
The tool uses a database connection and a minimum set of database privileges.
Update:
Finally, I was able to adjust the source code and build the ocp tool for 32-bit Windows:
https://github.com/maxsatula/ocp/releases/download/v0.1/ocp-0.1-win32.zip
Compiled/tested with 32-bit Instant Client 11.2.0.4 available here: http://www.oracle.com/technetwork/topics/winsoft-085727.html
instantclient-basiclite-nt-11.2.0.4.0.zip (20,258,449 bytes)
I believe it will work with a full Oracle Client installation too (just make sure it is 32-bit); however, I did not check that myself.
Unfortunately, the Windows build of ocp does not have a fancy progress meter during file transfer. That piece of code had too much *nix-specific stuff, so I had to cut it out.
Also, since it uses the popt and zlib libraries, which are compiled as part of the GnuWin project and available in 32-bit only, ocp for Windows is 32-bit only too. Hopefully, the lack of a 64-bit version is not mission-critical for you.
Update 2:
Warning! Make sure you always use a DEDICATED server connection when downloading files from the server; otherwise (with a SHARED server) the downloaded copy of the file will be corrupted with no error messages!
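One way to guarantee a dedicated connection is to use a TNS alias that requests it explicitly; a sketch of such a tnsnames.ora entry (host and service name are placeholders):

    MYDB_DEDICATED =
      (DESCRIPTION =
        (ADDRESS = (PROTOCOL = TCP)(HOST = dbhost)(PORT = 1521))
        (CONNECT_DATA = (SERVICE_NAME = orcl)(SERVER = DEDICATED))
      )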
With a bit of a hack you can get data pump to do what you want, but you need to have a database on your local machine.
What you need to do is create a database link on your local database that points to the remote database.
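Creating the link on the local database might look like this sketch (the link name, credentials, and TNS alias are placeholders):

    CREATE DATABASE LINK remote_link
      CONNECT TO remote_user IDENTIFIED BY remote_password
      USING 'REMOTE_TNS_ALIAS';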
Then, in the Data Pump options, log in to the local database as the database link owner and set the 'network_link' option to the name of the database link you created. That way it should export from the remote database through the local database and create the file on your local instance. For example:
expdp directory=<local_dir_object> network_link=<dblinkname on local instance> dumpfile=.. logfile=.. tables/schema=...
No, Data Pump sucks that way, but Oracle can get faster throughput using the same server the DB sits on, so that's the trade-off. There are other enhancements too, but I still think this is a big disadvantage of Data Pump. Use the old exp/imp or third-party tools for this purpose.
You should ask yourself: "Why do I want to keep data outside the database, the most secure place for my data, where backup, restore, and recovery are in place?"
If you are going to move data from database A to database B, make sure both databases have access to a common file area that they can reach through their directory objects, and use Data Pump.
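As a sketch, the shared area would be exposed to each database through a directory object, roughly like this (the path, directory name, and grantee are placeholders):

    CREATE OR REPLACE DIRECTORY dump_dir AS '/shared/dumps';
    GRANT READ, WRITE ON DIRECTORY dump_dir TO app_user;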
If you still want to export data to the client side, you can use the good old exp and imp tools.
I have an MS Access .mdb file from which I need to import data into my MySQL instance. I am on a Mac; are there any free/OSS tools that allow me to do that? If not, is there a free/OSS JDBC driver that I can use to extract the data I need?
Thanks.
Have a look at Jackcess. Note, however, that it doesn't support Access 97 databases, only 2000 and later.
For Access 97, the only thing I'm aware of is mdbtools, but that's a C library, so you'll have to write some JNI glue code if you want to use it from Java; also, it's not maintained anymore, to the best of my knowledge.
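If doing this from Java is not a hard requirement, mdbtools also ships command-line utilities that can dump an MDB directly; for example (the file and table names are placeholders):

    mdb-schema database.mdb mysql > schema.sql
    mdb-export database.mdb SomeTable > SomeTable.csv

mdb-schema prints DDL for the chosen backend and mdb-export writes a table as CSV, which you can then load into MySQL.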
UCanAccess is a pure-Java JDBC driver that can read from and write to Access 2000 and newer databases. (Access 97 files are supported read-only.) It will work on any machine that runs Java.
For more details see
Manipulating an Access database from Java without ODBC
Is your Access MDB on the Mac?
Does Access on the Mac have the option of using linked tables?
If so, you can create a linked table from the Access MDB to MySQL. Then you could treat the MySQL tables as if they were part of MS Access.
EDIT: See if this helps.
You could export the MDB file using something like this. It won't help if you need to do it from within your app, but if you are OK with exporting the data and then using it, this should help.
I convert the following way:
Download ACCDB MDB Explorer
http://accdb-mdb-explorer.en.softonic.com/mac
Open the MDB file
Export as SQL
Import into MySQL using MySQL Workbench.
Hope it helps.
I have tried SQLite in Java, but the speed is slow due to the JDBC driver. Then I tried HSQLDB and found the speed good, but I cannot find a good management tool for HSQLDB, such as phpMyAdmin for MySQL or SQLite Manager for SQLite.
I'd like to use such a tool to prepare test data for unit tests, or to browse the data after running small experiments.
Is there any good tool?
Here are a couple of other suggestions you might check out:
Squirrel SQL http://squirrel-sql.sourceforge.net/
Execute Query http://executequery.org/
Razor SQL (paid) http://www.razorsql.com/
Razor has the best feature set, but it is paid. The others are good at different things and worth checking out.
This only makes sense if you are running HSQLDB in server mode. If you are running in in-memory or file mode, then you either can't access the DB from another process, or doing so would lock it.
In server mode you can use any universal client; the JDBC driver is hsqldb.jar itself.
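For example, a server can be started from the command line roughly like this (the database path and alias are placeholders; on HSQLDB 1.8 the main class is org.hsqldb.Server instead):

    java -cp hsqldb.jar org.hsqldb.server.Server --database.0 file:mydb --dbname.0 xdb

A client would then connect with the JDBC URL jdbc:hsqldb:hsql://localhost/xdb, using the same hsqldb.jar as the driver.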
Actually, HSQLDB comes with its own management tool (which is not great). See http://hsqldb.org/doc/guide/apf.html
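If the tool meant here is the Swing DatabaseManager that ships inside hsqldb.jar, it can be launched with:

    java -cp hsqldb.jar org.hsqldb.util.DatabaseManagerSwing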
I've used Squirrel SQL. It's a universal client for any JDBC database.
See: http://squirrel-sql.sourceforge.net/