Data transfer and decoding in Odoo - odoo-8

I want to transfer and decode binary data (base64) from a PostgreSQL database over the network.
I have tried retrieving some data through the web services, which works, but I also want to get attachments from the 'ir_attachment' table through the web services.
How can I do it?
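A minimal sketch using Odoo 8's XML-RPC endpoints (`/xmlrpc/2/common` and `/xmlrpc/2/object`): attachments live in the `ir.attachment` model (the 'ir_attachment' table), and their `datas` field holds the base64-encoded content. The URL, database, credentials, and attachment ID below are placeholders.

```python
import base64
import xmlrpc.client

# Placeholder connection details; replace with your own server, DB, and credentials.
URL, DB, USER, PWD = "http://localhost:8069", "mydb", "admin", "admin"

common = xmlrpc.client.ServerProxy(URL + "/xmlrpc/2/common")
uid = common.authenticate(DB, USER, PWD, {})
models = xmlrpc.client.ServerProxy(URL + "/xmlrpc/2/object")

# Read the base64-encoded 'datas' field of one attachment and decode it
# back to raw bytes. The ID 42 is a placeholder.
att = models.execute_kw(
    DB, uid, PWD,
    "ir.attachment", "read",
    [[42]],
    {"fields": ["name", "datas"]},
)[0]

with open(att["name"], "wb") as f:
    f.write(base64.b64decode(att["datas"]))
```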

Related

How to load BLOB files from Oracle to Snowflake and download them from there

I have BLOB files (PDFs) in Oracle and I'm trying to migrate them all to Snowflake.
My goal is to be able to download the BLOB files (which would then be VARBINARY) directly from Snowflake, instead of just getting the hex code.
I understand I'd need an Amazon S3 bucket or some other blob storage, but how could I access the PDF files from Snowflake, given that it is a column-based relational database?
How would I do it?
Thank you
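One way to avoid the hex representation is to fetch the VARBINARY column with the Snowflake Python connector, which returns binary columns as Python bytes that can be written straight back out as files. A sketch, assuming a hypothetical DOCS table with DOC_ID and PDF_DATA columns and placeholder credentials:

```python
import snowflake.connector

# Placeholder account/credentials and object names.
conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="my_password",
    warehouse="my_wh", database="my_db", schema="public",
)

cur = conn.cursor()
# DOCS is a hypothetical table holding the migrated BLOBs as VARBINARY.
cur.execute("SELECT doc_id, pdf_data FROM docs")
for doc_id, pdf_bytes in cur:
    # Binary columns arrive as bytes, so each row becomes a PDF file on disk.
    with open(f"{doc_id}.pdf", "wb") as f:
        f.write(pdf_bytes)

cur.close()
conn.close()
```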

Copying JSON data from CosmosDB to Snowflake through ADF

Hello,
I am trying to copy data from Cosmos DB to Snowflake through Azure Data Factory, but I get the error: "Direct copying data to Snowflake is only supported when source dataset is DelimitedText, Parquet, JSON with Azure Blob Storage or Amazon S3 linked service, for other dataset or linked service, please enable staging". Would that imply that I need to create a linked service with blob storage? What URL and SAS token should I give? Do I need to move everything to Blob and then move forward with staging?
Any help is appreciated. Thank you very much.
Try it with a Data Flow activity instead of a Copy activity.
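If you stay with the staged Copy activity instead, the staging linked service needs the blob container URL plus a SAS token with read/write/list permissions. A sketch of generating one with the azure-storage-blob Python SDK; the account name, key, and container name are placeholders:

```python
from datetime import datetime, timedelta
from azure.storage.blob import ContainerSasPermissions, generate_container_sas

# Placeholder storage account, key, and staging container.
sas = generate_container_sas(
    account_name="mystagingaccount",
    container_name="adf-staging",
    account_key="<account-key>",
    permission=ContainerSasPermissions(read=True, write=True, delete=True, list=True),
    expiry=datetime.utcnow() + timedelta(hours=24),
)

# The staging linked service points at the container URL plus this token.
print(f"https://mystagingaccount.blob.core.windows.net/adf-staging?{sas}")
```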

Apache Kafka for an existing get request with Oracle DB

I'm trying to learn about streaming services and have been reading the Kafka docs:
https://kafka.apache.org/quickstart
https://kafka.apache.org/24/documentation/streams/quickstart
As a simple example, I'm attempting to refactor a Spring web service GET request which accepts an ID parameter and returns a list of attributes associated with that ID. The DB backend is Oracle.
What is the approach for loading a single Oracle DB table so that it can be served by Kafka? The docs above don't cover this. Do I need to replicate the Oracle DB to a NoSQL DB such as MongoDB? (Why do we require Apache Kafka with NoSQL databases?)
Kafka is an event streaming platform. It is not a database. Instead of thinking about "loading a single Oracle DB table which can be served by Kafka", you need to think in terms of what events you are looking for that will trigger processing.
Change Data Capture (CDC) products like Oracle GoldenGate (there are other products too) detect changes to rows and send a message into Kafka each time a row changes.
Alternatively, you could configure a Kafka JDBC Source Connector to execute a query and pull data into Kafka, as sketched below.
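For the second option, a JDBC source connector is registered through the Kafka Connect REST API. A hedged sketch, assuming Confluent's kafka-connect-jdbc plugin is installed on the Connect worker; the worker URL, Oracle connection string, table, and column names are all placeholders:

```python
import requests

# Placeholder connector definition: polls the ITEMS table for rows with a
# higher ITEM_ID than last seen and publishes them to topic 'oracle-ITEMS'.
connector = {
    "name": "oracle-items-source",
    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "connection.url": "jdbc:oracle:thin:@//db-host:1521/ORCLPDB1",
        "connection.user": "app_user",
        "connection.password": "app_password",
        "table.whitelist": "ITEMS",
        "mode": "incrementing",
        "incrementing.column.name": "ITEM_ID",
        "topic.prefix": "oracle-",
        "poll.interval.ms": "5000",
    },
}

# Placeholder Connect worker address.
resp = requests.post("http://localhost:8083/connectors", json=connector)
resp.raise_for_status()
```

The Spring GET endpoint could then be backed by a Kafka Streams state store or a downstream materialized view instead of querying Oracle directly.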

Creating Oracle DBLink to insert data into a SQLite Database

I've been tasked with figuring out a way to get data from Oracle and store it in a SQLite database. The back story: we currently use SQLite for local storage in a mobile application, and we populate that data via a file download; because the amount of data is large, it can take up to 5 minutes to populate the database. An easy solution for us would be to build the table on the server and download it via HTTP. The data is currently stored in an Oracle database on the server. My question is: is it possible to create a DBLink from Oracle to SQLite to insert the data into the SQLite database on the server? If this is not possible, are there any other solutions that would achieve this?
Thanks
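There is no standard Oracle-to-SQLite DBLink, so a common workaround is a server-side job that reads from Oracle and builds the SQLite file, which the app then downloads over HTTP. A minimal sketch with cx_Oracle and the stdlib sqlite3 module; the connection details, table, and columns are placeholders:

```python
import sqlite3
import cx_Oracle  # the classic Oracle driver; python-oracledb is its successor

# Placeholder Oracle credentials/DSN and output file.
ora = cx_Oracle.connect("app_user", "app_password", "db-host:1521/ORCLPDB1")
lite = sqlite3.connect("mobile_data.sqlite")
lite.execute("CREATE TABLE IF NOT EXISTS items (item_id INTEGER, name TEXT)")

src = ora.cursor()
src.execute("SELECT item_id, name FROM items")
while True:
    rows = src.fetchmany(5000)  # stream in batches to bound memory use
    if not rows:
        break
    lite.executemany("INSERT INTO items VALUES (?, ?)", rows)

lite.commit()
lite.close()
ora.close()
```

The finished .sqlite file can be regenerated on a schedule and served as a static download, which avoids rebuilding the database on the device.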

Transfer data from an ORACLE View to greenplum DB table

I have an Oracle view containing a very large amount of data, and I want to migrate this data into a table in a Greenplum database. Is there any way I can write a query in PostgreSQL to fetch that Oracle view's data?
If a query in PostgreSQL is not possible, please suggest a way to access the Oracle view from a Linux server, so that I can create a data file from the view on my Linux server and load that file into a Greenplum table via gpfdist.
NOTE: the Oracle view belongs to a third party; I only have access to view the data (I have all the connection info) and I can reach the view via SQL Developer.
NOTE: Exporting the data from SQL Developer to my local machine is not feasible here, as the data is very large.
Thanks,
Sunny
The last time I used Greenplum (3 years ago) I don't think there were any untrusted languages like plperlu, so fetching directly from Oracle from within Greenplum might not be possible. If the data has a primary key, are you able to fetch it in batches, compress them, and ship them to Greenplum?
Do you have a Greenplum support contract? If so, you could also ask them if you haven't already: https://sso.emc.com/sso/login.htm
I recall that gpfdist can be configured to fetch from remote servers with a bit of fiddling, so if you are able to copy the Oracle data out to disk, you could fetch it using gpfdist without any intermediate steps.
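For the copy-to-disk step, a small script on the Linux server can spool the view into a compressed delimited file that gpfdist then serves to an external table. A sketch, assuming placeholder connection details, a hypothetical view name, and a directory that gpfdist is configured to serve:

```python
import csv
import gzip
import cx_Oracle

# Placeholder third-party connection info and view name.
conn = cx_Oracle.connect("view_user", "view_password", "ora-host:1521/ORCLPDB1")
cur = conn.cursor()
cur.execute("SELECT * FROM third_party_view")

# Spool the view into a gzipped pipe-delimited file under the gpfdist root.
with gzip.open("/data/gpfdist/third_party_view.csv.gz", "wt", newline="") as f:
    writer = csv.writer(f, delimiter="|")
    while True:
        rows = cur.fetchmany(10000)  # batched fetches keep memory bounded
        if not rows:
            break
        writer.writerows(rows)

conn.close()
```

From there, a readable external table pointing at the gpfdist URL loads the file into the target Greenplum table with a single INSERT ... SELECT.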
