Airpal currently uses the Presto client to connect to PrestoDB. However, as I understand it, it could also use JDBC for this connectivity. Is there any code available for this purpose? Even code for connecting to some other database might be helpful to me. The Presto client's object model looks quite different from other models such as JDBC's.
Airpal uses Presto client connectivity and also uses the client's objects (mostly for schema and data, such as Column and QueryResults) internally in its various modules.
One way to provide JDBC connectivity is to move its lowest layer of DB connectivity (the executeWith invocations of com.airbnb.airpal.core.execution.QueryClient: there is one for data and about six for metadata) to JDBC query execution. The JDBC results (mostly data and schema) can then be converted to Presto client API equivalent objects, and the rest of the logic in Airpal would follow.
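For illustration, a minimal sketch of that lowest layer over plain JDBC, assuming the Presto JDBC driver and placeholder connection details; the conversion into com.facebook.presto.client.Column / QueryResults equivalents is left as a hypothetical adapter, since the Column constructor varies across presto-client versions:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.Statement;
import java.util.ArrayList;
import java.util.List;

// Runs a query over JDBC and captures schema and rows in a shape that a
// Presto-client-style adapter could consume. Host, catalog, schema, user,
// and the query are placeholders.
public class JdbcQueryExecutor {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:presto://presto-host:8080/hive/default", "airpal", null);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT * FROM sample_table")) {

            ResultSetMetaData md = rs.getMetaData();
            List<String[]> columns = new ArrayList<>();
            for (int i = 1; i <= md.getColumnCount(); i++) {
                // name + type name is essentially what the Presto client's Column carries
                columns.add(new String[] {md.getColumnName(i), md.getColumnTypeName(i)});
            }

            List<List<Object>> rows = new ArrayList<>();
            while (rs.next()) {
                List<Object> row = new ArrayList<>();
                for (int i = 1; i <= md.getColumnCount(); i++) {
                    row.add(rs.getObject(i));
                }
                rows.add(row);
            }
            // columns and rows could now be wrapped into Presto-client-equivalent
            // objects (Column, QueryResults) by a small adapter layer.
        }
    }
}
```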
Another approach is to rewrite Airpal with native JDBC support by moving over to JDBC objects for internal use and communication as well. That looks like a much bigger change.
I am planning to add support for dynamically choosing between Presto client and JDBC connectivity. I will use com.airbnb.airpal.presto.QueryRunner to hold either a Presto client session or a JDBC connection, as appropriate.
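For illustration, a minimal sketch of such a dual-mode runner; the class shape, field names, and Mode enum are hypothetical, not Airpal's actual API:

```java
import com.facebook.presto.client.ClientSession;
import java.sql.Connection;
import java.sql.DriverManager;

// Holds either a Presto client session or a JDBC URL and hands out the
// right kind of handle depending on the configured mode.
public class DualModeQueryRunner {
    public enum Mode { PRESTO_CLIENT, JDBC }

    private final ClientSession prestoSession; // used when mode == PRESTO_CLIENT
    private final String jdbcUrl;              // used when mode == JDBC
    private final Mode mode;

    public DualModeQueryRunner(ClientSession prestoSession, String jdbcUrl, Mode mode) {
        this.prestoSession = prestoSession;
        this.jdbcUrl = jdbcUrl;
        this.mode = mode;
    }

    public ClientSession getPrestoSession() {
        if (mode != Mode.PRESTO_CLIENT) {
            throw new IllegalStateException("runner is in JDBC mode");
        }
        return prestoSession;
    }

    public Connection openJdbcConnection() throws Exception {
        if (mode != Mode.JDBC) {
            throw new IllegalStateException("runner is in Presto client mode");
        }
        return DriverManager.getConnection(jdbcUrl);
    }
}
```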
I know how to write a Kafka consumer and insert/update each record into an Oracle database, but I want to leverage the Kafka Connect API and the JDBC Sink Connector for this purpose. Apart from the property file, in my search I couldn't find a complete, executable example with detailed steps to configure and write the relevant Java code to consume a Kafka topic with JSON messages and insert/update (merge) into a table in an Oracle database using the Kafka Connect API with the JDBC Sink Connector. Can someone point me to, or demonstrate, an example including configuration and dependencies? Are there any disadvantages with this approach? Do we anticipate any potential issues when the table data grows to millions of rows?
Thanks in advance.
There won't be an example for your specific use case because the JDBC connector is meant to be generic.
Here is one configuration example with an Oracle database
All you need is
A topic of some format
key.converter and value.converter to be set to deserialize that topic
Your JDBC string and database schema (tables, projection fields, etc.)
Any other JDBC-sink-specific options
All this goes in a Java properties / JSON file, not Java source code
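For illustration, a minimal sketch of such a properties file, assuming a hypothetical topic named orders whose JSON values embed their schema and carry an id field, with placeholder connection details:

```
name=oracle-jdbc-sink
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
tasks.max=1
topics=orders
connection.url=jdbc:oracle:thin:@//db-host:1521/ORCLPDB1
connection.user=connect_user
connection.password=connect_password
# JSON with embedded schemas; plain JSON needs some other schema source
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=true
# merge semantics: upsert keyed on the record value's id field
insert.mode=upsert
pk.mode=record_value
pk.fields=id
auto.create=true
```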
If you have a specific issue creating this configuration, please comment.
Do we anticipate any potential issues when table data increases to millions?
Well, those issues would be related to the database server, not to Kafka Connect. For example, the disk filling up or increased load while accepting continuous writes.
Are there any disadvantages with this approach?
You'd have to handle de-duplication or record expiration (e.g. for GDPR) separately, if you wanted that.
I have a GemFire cache (v8.2.1) from which I want to access data using a third-party tool that can only connect through a JDBC driver. Does anyone know how I can connect to the GemFire cache to access data over JDBC? I don't need to write to the cache, just read from it.
I came across GemFire XD on the internet, but I can see it is marked as "End of Availability".
Is there any other way to retrieve persisted objects or fire OQL queries through something that mimics a JDBC driver, so that a tool that accepts only JDBC drivers can be used?
Please help.
Thanks
Apache Calcite has a Geode adapter that enables you to read data from GemFire over JDBC. There is also a video explaining this.
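For illustration, a minimal sketch of the JDBC side of that adapter, assuming a Geode locator on localhost:10334 and a region named Zips (locator host/port, region, and the model-file contents are placeholders; check the Calcite Geode adapter docs for the exact model schema):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Reads GemFire/Geode data over JDBC through Apache Calcite's Geode adapter.
public class GeodeJdbcExample {
    public static void main(String[] args) throws Exception {
        // geode-model.json (sketch):
        // {
        //   "version": "1.0",
        //   "defaultSchema": "geode",
        //   "schemas": [{
        //     "name": "geode",
        //     "type": "custom",
        //     "factory": "org.apache.calcite.adapter.geode.rel.GeodeSchemaFactory",
        //     "operand": {
        //       "locatorHost": "localhost",
        //       "locatorPort": "10334",
        //       "regions": "Zips",
        //       "pdxSerializablePackagePath": ".*"
        //     }
        //   }]
        // }
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:calcite:model=geode-model.json");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT * FROM \"Zips\" LIMIT 10")) {
            while (rs.next()) {
                System.out.println(rs.getObject(1));
            }
        }
    }
}
```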
I'm writing an application that has to communicate across three different platforms. Two expose their DB via a REST API (no JDBC driver) and one is a native JDBC connection (e.g. Derby, MySQL, Oracle).
My problem is that I have no way of assuring any ACID properties when updating data, given that all three should be updated at the same time.
I've tried reading up on Spring XA, but it seems that both 2PC and 1PC require some form of transactional backend. Given that two of my three destinations are REST APIs, I don't have transactions, just a save/update operation.
Are there techniques I can use to ensure that the three sources stay synchronized and that I don't run into inconsistent states if a write fails (e.g. a REST endpoint is unavailable)?
A transaction example would be:
Read from DB
Write to REST-1 endpoint
Update DB
Write to REST-2 endpoint
Is there some form of XA I could employ to wrap everything in such a way that I can be assured of consistency?
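For illustration, a compensation-based (saga-style) sketch of that exact sequence, since XA cannot span the REST endpoints; every type and method here is a hypothetical placeholder, and each step needs a manual "undo":

```java
// A saga-style wrapper around the four steps above: if a later step fails,
// earlier steps are compensated by hand. All interfaces and method names
// are hypothetical placeholders.
public class SagaSketch {

    interface RestEndpoint {
        void write(String record) throws Exception;
        void compensate(String record) throws Exception; // undo a prior write
    }

    interface Database {
        String read() throws Exception;
        void update(String record) throws Exception;
        void revert(String record) throws Exception;     // undo a prior update
    }

    void run(Database db, RestEndpoint rest1, RestEndpoint rest2) throws Exception {
        String record = db.read();          // 1. Read from DB
        rest1.write(record);                // 2. Write to REST-1 endpoint
        boolean dbUpdated = false;
        try {
            db.update(record);              // 3. Update DB
            dbUpdated = true;
            rest2.write(record);            // 4. Write to REST-2 endpoint
        } catch (Exception e) {
            if (dbUpdated) {
                db.revert(record);          // compensate step 3
            }
            rest1.compensate(record);       // compensate step 2
            throw e;                        // surface the failure after undoing
        }
    }
}
```

Note that a compensation call can itself fail (e.g. the endpoint is still down), so this narrows the inconsistency window rather than eliminating it.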
Can someone help me understand how to expose the SYS schema objects of a JBoss Teiid Virtual Database when connected via an ODBC-JDBC bridge?
The client is connecting to the ODBC side of the bridge, and the JDBC side of it is connecting to the Virtual Database (VDB) running on the JBoss SOA server.
With the current settings, only the tables and columns modeled through JBoss Studio's Teiid Designer are exposed, but not the SYS schema and its underlying objects. The client app is the MicroStrategy BI application.
You can traverse all metadata from all of the underlying data sources using the native JDBC Java API (DatabaseMetaData).
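For illustration, a minimal sketch of that traversal, assuming Teiid's standard JDBC URL format (VDB name, host, port, and credentials are placeholders):

```java
import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;
import java.sql.ResultSet;

// Walks a Teiid VDB's metadata through the standard JDBC DatabaseMetaData API.
public class TeiidMetadataExample {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:teiid:MyVDB@mm://localhost:31000", "user", "password")) {
            DatabaseMetaData md = conn.getMetaData();
            try (ResultSet tables = md.getTables(null, null, "%",
                    new String[] {"TABLE", "VIEW"})) {
                while (tables.next()) {
                    System.out.println(tables.getString("TABLE_SCHEM")
                            + "." + tables.getString("TABLE_NAME"));
                }
            }
        }
    }
}
```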
I am new to Teiid and had a similar question.
When you create the VDB with the JBoss designer, you can specify which models will be exposed to client applications. As a good practice, only view models are exposed and source models are not. As a result, querying the system tables of the VDB will only show you the metadata within the view models, which will be a subset of the metadata in the underlying data sources.
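For example, that visible subset can be inspected directly through Teiid's SYS schema, assuming the standard SYS.Tables system table and placeholder connection details:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Queries Teiid's SYS.Tables; only models marked visible in the VDB appear.
public class TeiidSysTablesExample {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:teiid:MyVDB@mm://localhost:31000", "user", "password");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(
                 "SELECT SchemaName, Name FROM SYS.Tables")) {
            while (rs.next()) {
                System.out.println(rs.getString(1) + "." + rs.getString(2));
            }
        }
    }
}
```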
Hope this helps.
Is there a way to force encryption of network traffic (that is, result-set data) using the Oracle thin client and JDBC?
I understand that this can be done by setting up a java.util.Properties object and passing that to DriverManager.getConnection(String, Properties), but is there a way to specify this in the JDBC URL?
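For context, a minimal sketch of that Properties approach, assuming Oracle's native network-encryption connection properties for the thin driver (URL and credentials are placeholders):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

// Requests network encryption via connection properties on the Oracle thin driver.
public class OracleEncryptedConnection {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.setProperty("user", "scott");
        props.setProperty("password", "tiger");
        // Oracle native network encryption for the thin driver
        props.setProperty("oracle.net.encryption_client", "REQUIRED");
        props.setProperty("oracle.net.encryption_types_client", "( AES256 )");
        try (Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@//db-host:1521/ORCL", props)) {
            System.out.println("connected; encryption negotiated per properties");
        }
    }
}
```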
I'm using a third party tool written in Java, which handles creating its own connections, so creating and passing the Properties object won't work for me.
Thanks.
Have a look at the Oracle JDBC documentation. There is a chapter about client-side security features that talks about using system properties to configure the thin driver for SSL.
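For illustration, a minimal sketch of that system-property route for SSL (TCPS), assuming the standard JSSE properties the docs chapter describes; keystore paths and passwords are placeholders, the tool's JDBC URL must use the TCPS protocol, and the same properties can be passed on the command line with -D flags instead:

```java
// Sets JVM-wide SSL properties before the third-party tool opens its
// connections; the thin driver's SSL transport reads these JSSE settings.
public class SecureToolLauncher {
    public static void main(String[] args) {
        System.setProperty("javax.net.ssl.trustStore", "/path/to/truststore.jks");
        System.setProperty("javax.net.ssl.trustStorePassword", "changeit");
        // Then launch the tool's entry point in-process, e.g.:
        // com.vendor.Tool.main(args); // hypothetical class name
    }
}
```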