I am working on a library management system and am trying to use an embedded database. Can Derby store thousands, and possibly millions, of records?
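For what it's worth, as I understand the Derby documentation its table size is bounded by disk space rather than by a row count, so millions of rows are feasible if they are loaded sensibly. A rough sketch of a batched load (the table name and schema are made up for illustration, and derby.jar must be on the classpath at runtime):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class DerbyBulkLoad {

    // Embedded-Derby JDBC URL; ";create=true" creates the database on first use.
    static String embeddedUrl(String dbName) {
        return "jdbc:derby:" + dbName + ";create=true";
    }

    // Batched, periodically committed inserts are what keep very large loads fast.
    // Table and columns are hypothetical; requires derby.jar on the classpath.
    static void loadBooks(int rows, int batchSize) throws SQLException {
        try (Connection conn = DriverManager.getConnection(embeddedUrl("libraryDB"))) {
            conn.setAutoCommit(false);
            conn.createStatement().execute(
                "CREATE TABLE BOOKS (ID INT PRIMARY KEY, TITLE VARCHAR(200))");
            try (PreparedStatement ps =
                     conn.prepareStatement("INSERT INTO BOOKS (ID, TITLE) VALUES (?, ?)")) {
                for (int i = 1; i <= rows; i++) {
                    ps.setInt(1, i);
                    ps.setString(2, "Title " + i);
                    ps.addBatch();
                    if (i % batchSize == 0) { ps.executeBatch(); conn.commit(); }
                }
                ps.executeBatch();
                conn.commit();
            }
        }
    }

    public static void main(String[] args) {
        System.out.println(embeddedUrl("libraryDB"));
    }
}
```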
I have a big table in Teradata with around 10 million rows. I need to migrate this table from Teradata to MariaDB on a daily basis. I read about FastExport. Can it be used here programmatically, in a Java application?
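If I remember correctly, the Teradata JDBC driver exposes FastExport through a connection parameter (TYPE=FASTEXPORT), so no external utility is needed; check your driver version's documentation for the exact syntax and restrictions. A sketch of the idea, with made-up table and column names, streaming from Teradata into batched MariaDB inserts:

```java
import java.sql.*;

public class TeradataToMariaDb {

    // Teradata JDBC URL requesting the FastExport protocol for large result sets.
    static String teradataUrl(String host, String database) {
        return "jdbc:teradata://" + host + "/DATABASE=" + database + ",TYPE=FASTEXPORT";
    }

    // Streams the source rows and batch-inserts them into MariaDB.
    // Both vendor JDBC drivers must be on the classpath at runtime.
    static void migrate(Connection td, Connection maria, int batchSize) throws SQLException {
        maria.setAutoCommit(false);
        try (Statement s = td.createStatement();
             ResultSet rs = s.executeQuery("SELECT id, payload FROM big_table");
             PreparedStatement ins = maria.prepareStatement(
                 "INSERT INTO big_table (id, payload) VALUES (?, ?)")) {
            int n = 0;
            while (rs.next()) {
                ins.setLong(1, rs.getLong(1));
                ins.setString(2, rs.getString(2));
                ins.addBatch();
                if (++n % batchSize == 0) { ins.executeBatch(); maria.commit(); }
            }
            ins.executeBatch();
            maria.commit();
        }
    }

    public static void main(String[] args) {
        System.out.println(teradataUrl("tdhost", "sales"));
    }
}
```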
I've been tasked with figuring out a way to get data from Oracle and store it in a SQLite database. The back story is that we currently use SQLite for local storage on a mobile application, and we currently populate that data via a file download; because the amount of data is large, it can take up to 5 minutes to populate the database. An easy solution for us would be to build the table on the server and download it via HTTP. The data is currently stored in an Oracle database on the server. My question is: is it possible to create a DBLink from Oracle to SQLite to insert the data into the SQLite database on the server? If this is not possible, are there any other solutions that would achieve this?
Thanks
We have two divisions in our company: one uses E1 on Oracle 11g, the other uses SAP on Oracle 11g.
We also have a SQL Server system we use to warehouse information once a night from both systems, to run our report server against.
The question I have is: for pooled tables in SAP, such as A016, how would I get that information out of SAP?
Currently we have SSIS packages set up with linked servers to the two Oracle servers, which pull the data we need; I just don't have the knowledge of SAP to find the pooled tables.
If I can't pull the pooled tables because they don't physically exist, is there a tool I can use in SAP to find out what tables a pooled table is getting its information from? That way I can rebuild that table in SQL using an open query and some fun joins.
Thanks
You have to access those tables using the application server. They can't be accessed directly from the database.
You'll probably want to write an ABAP program to extract the data you need and go from there.
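If a full ABAP program is overkill, another common route is calling the standard RFC_READ_TABLE function module from outside SAP (e.g. via the SAP JCo library), since it goes through the application server and so can read pooled tables. It returns each row as one fixed-width string plus a FIELDS table giving each column's OFFSET and LENGTH. A sketch of the client-side slicing (the JCo call itself is omitted since it needs the SAP jars; the example offsets/lengths and values are made up, not the real A016 layout):

```java
import java.util.ArrayList;
import java.util.List;

public class PooledTableRows {

    // Slices one fixed-width RFC_READ_TABLE row (the WA field of its DATA table)
    // back into column values, using the OFFSET/LENGTH pairs reported in the
    // FIELDS table for whatever table was queried (e.g. A016).
    static List<String> sliceRow(String wa, int[] offsets, int[] lengths) {
        List<String> cols = new ArrayList<>();
        for (int i = 0; i < offsets.length; i++) {
            int from = offsets[i];
            int to = Math.min(from + lengths[i], wa.length());
            cols.add(from >= wa.length() ? "" : wa.substring(from, to).trim());
        }
        return cols;
    }

    public static void main(String[] args) {
        // Hypothetical 3-column row: widths 3, 2 and 4 characters.
        List<String> cols = sliceRow("100PRPB00", new int[]{0, 3, 5}, new int[]{3, 2, 4});
        System.out.println(cols); // [100, PR, PB00]
    }
}
```

Note that RFC_READ_TABLE has limits (row width, data types), so for anything complex a small custom ABAP extractor is still the safer bet.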
Is there any product which can be queried using JDBC (normal SQL) that checks whether all the tables in the query are cached, uses the cache if so, and otherwise falls back to the back-end database?
I am aware of two products: Oracle In Memory Database (IMDB) Cache, and VMware SQLFire.
I'm not familiar with either of them, so I want to know: is it possible to query the IMDB Cache on non-cached tables, so that it falls back to the underlying database?
Are there any other products which support this feature?
With 11g you can use the JDBC OCI Client Result Cache:
Client result cache feature enables client-side caching of SQL query
result sets in client memory. In this way, OCI applications can use
client memory to take advantage of the client result cache to improve
response times of repetitive queries.
Note that the CACHE clause doesn't mean what the name implies:
For data that is accessed frequently, this clause indicates that the
blocks retrieved for this table are placed at the most recently used
end of the least recently used (LRU) list in the buffer cache when a
full table scan is performed. This attribute is useful for small
lookup tables.
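To make the client result cache concrete: as I understand the 11g docs, you connect through the OCI driver (`jdbc:oracle:oci:@...`), the DBA sets the CLIENT_RESULT_CACHE_SIZE initialization parameter, and queries are marked cacheable with the RESULT_CACHE hint (or a table annotation). A sketch, with made-up table and column names:

```java
import java.sql.*;

public class CachedLookup {

    // The RESULT_CACHE hint marks the result set as cacheable; with the OCI
    // driver and CLIENT_RESULT_CACHE_SIZE configured, repeated executions can
    // be answered from client memory instead of a round trip.
    static String cachedQuery(String table) {
        return "SELECT /*+ RESULT_CACHE */ code, description FROM " + table;
    }

    // Repeated executions of the same statement are where the cache pays off:
    // the first run populates it, later runs can be served locally.
    static void warmAndReuse(Connection conn) throws SQLException {
        try (PreparedStatement ps = conn.prepareStatement(cachedQuery("lookup_codes"))) {
            for (int i = 0; i < 3; i++) {
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) { /* consume rows */ }
                }
            }
        }
    }

    public static void main(String[] args) {
        System.out.println(cachedQuery("lookup_codes"));
    }
}
```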
Oracle In Memory Database (IMDB) Cache does support the feature that you ask about.
If the SQL statement that you use refers to IMDB Cache tables, then the cache will be used. If the SQL statements that you use refer to non-cached tables, then the Oracle database will be accessed.
IMDB Cache uses SQL or PL/SQL to do read and/or write caching against Oracle databases.
You can use JDBC [or ODBC, OCI, ODP.Net, Node.js, Python, Go, Ruby, etc.] to talk to either an Oracle database or the IMDB Cache.
IMDB Cache also works with object-relational mapping technologies such as Hibernate [e.g. JPA] for data access.
IMDB Cache uses the Oracle TimesTen In-Memory Database and is now called 'Oracle Application Tier Database Cache'.
I am a product manager for Oracle TimesTen.
I have 2 databases, Oracle and SQLite, and I want to create exact copies of some of the Oracle tables in SQLite in one of my applications. Most of these tables contain more than 10,000 rows, so copying each table by going through each row programmatically is not efficient. Also, the table structure may change in the future, so I want to achieve this in a generic way without hard-coding the SQL statements. Is there any way to do this?
P.S. - This application is being developed using the Qt framework. All the queries and databases are represented by QtSql module objects.
Can't help with the Qt framework, but for large amounts of data it is usually better to use bulk-copy operations.
Export data from Oracle
http://download.oracle.com/docs/cd/B25329_01/doc/admin.102/b25107/impexp.htm#BCEGAFAB
Import data into SQLite
http://www.sqlite.org/cvstrac/wiki?p=ImportingFiles
IHTH
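If a programmatic route is still wanted, the "generic, no hard-coded SQL" requirement can be met by deriving the column list from the result set's metadata at runtime; the same pattern exists in QtSql (QSqlRecord) as in plain JDBC. A sketch in JDBC terms (the destination table is assumed to already exist with the same name; a SQLite JDBC driver such as Xerial's would be needed on that side):

```java
import java.sql.*;
import java.util.StringJoiner;

public class GenericTableCopy {

    // Builds a parameterized INSERT from a column list, so nothing is
    // hard-coded per table; columns come from ResultSetMetaData at runtime.
    static String insertSql(String table, String[] columns) {
        StringJoiner cols = new StringJoiner(", ", "(", ")");
        StringJoiner marks = new StringJoiner(", ", "(", ")");
        for (String c : columns) { cols.add(c); marks.add("?"); }
        return "INSERT INTO " + table + " " + cols + " VALUES " + marks;
    }

    // Streams SELECT * from the source and batch-inserts into the destination.
    // The connections might be Oracle and SQLite; only the JDBC URLs differ.
    static void copyTable(Connection src, Connection dst, String table, int batchSize)
            throws SQLException {
        try (Statement s = src.createStatement();
             ResultSet rs = s.executeQuery("SELECT * FROM " + table)) {
            ResultSetMetaData md = rs.getMetaData();
            String[] columns = new String[md.getColumnCount()];
            for (int i = 0; i < columns.length; i++) columns[i] = md.getColumnName(i + 1);
            dst.setAutoCommit(false);
            try (PreparedStatement ins = dst.prepareStatement(insertSql(table, columns))) {
                int n = 0;
                while (rs.next()) {
                    for (int i = 1; i <= columns.length; i++) ins.setObject(i, rs.getObject(i));
                    ins.addBatch();
                    if (++n % batchSize == 0) { ins.executeBatch(); dst.commit(); }
                }
                ins.executeBatch();
                dst.commit();
            }
        }
    }

    public static void main(String[] args) {
        System.out.println(insertSql("EMP", new String[]{"ID", "NAME"}));
        // INSERT INTO EMP (ID, NAME) VALUES (?, ?)
    }
}
```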
What you probably really want to use is the Oracle Database Mobile Server, which can automatically synchronize a SQLite and an Oracle Database.
The recent release of the Oracle Database Mobile Server (formerly called Oracle Database Lite Mobile Server) supports synchronization between an Oracle database and a SQLite or Berkeley DB database running on the client. It supports both synchronous and asynchronous data exchange, as well as secure communications between client and server. You can configure the Mobile Server to synchronize based on several options, without the need to modify the application that is accessing the database.
You can also find an excellent discussion forum for questions from developers and implementers using the Mobile Server.