Is there a way to SFTP mainframe files to Azure File Storage or Blob?
Is there any other way of doing this apart from AzCopy?
This is for copying transaction files from the mainframe to Azure Blob/File storage.
Current versions of z/OS ship with sftp. Dovetailed Technologies has their own implementation, which some shops use.
Whether or not you are allowed to exfiltrate potentially sensitive data is a local decision.
Further information of a more generic nature, which may or may not be helpful, is here.
Blob Storage now supports the SSH File Transfer Protocol (SFTP). The detailed article at the location below describes how to set up certificates on the mainframe and how to enable SFTP on Azure Blob Storage for direct mainframe-to-Azure-blob file transfer using SFTP.
https://techcommunity.microsoft.com/t5/modernization-best-practices-and/mainframe-files-transfer-to-azure-data-platform-using-sftp/ba-p/3302194
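As a rough client-side illustration of such a transfer, here is a minimal Python sketch using paramiko. The storage account, container, local user, key path, and file names are all hypothetical placeholders; the actual key setup on the mainframe side is covered in the article above.

```python
# Minimal sketch: upload a transaction file over SFTP to a Blob Storage
# account with the SFTP endpoint enabled. Host, user, and key paths are
# hypothetical placeholders; substitute your own values.
import paramiko

HOST = "mystorageacct.blob.core.windows.net"   # hypothetical storage account
USER = "mystorageacct.mycontainer.sftpuser"    # local user scoped to a container
KEY_PATH = "/u/batch/keys/id_rsa"              # private key registered with the local user

key = paramiko.RSAKey.from_private_key_file(KEY_PATH)
transport = paramiko.Transport((HOST, 22))
transport.connect(username=USER, pkey=key)

sftp = paramiko.SFTPClient.from_transport(transport)
# Each uploaded file lands as a block blob in the target container.
sftp.put("/u/batch/out/TRANS.DAILY.TXT", "incoming/TRANS.DAILY.TXT")
sftp.close()
transport.close()
```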
My problem:
I need a data pipeline created from my organization's Oracle DB (Oracle Cloud Infrastructure) to an AWS S3 bucket. Ideally, I would love for there to be some mechanism for Oracle to push new data to an S3 bucket as it enters the database (in whatever format).
Question:
Is this possible with Oracle natively, specifically Oracle Cloud Infrastructure?
Or is there a better solution you have seen?
Note:
I have seen that AWS has the DataSync product, which seems like it could help with this problem, but I am not sure whether it is suitable for this specific case.
An S3 bucket is object storage; it can only hold complete files. You cannot open and update an existing file like you would in a normal file system, even just to add new rows. You will need to construct your whole file outside of Oracle and then push it to S3 with some other mechanism.
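To make that constraint concrete, here is a minimal boto3 sketch (bucket and key names are hypothetical): even "adding a row" later means re-uploading the complete object.

```python
# Minimal sketch of the "whole file" constraint: S3 objects are written
# in full, so appending a row means rewriting the entire object.
# Bucket and key names here are hypothetical.
import boto3

s3 = boto3.client("s3")
BUCKET, KEY = "my-export-bucket", "exports/orders.csv"

# First upload: the complete file.
s3.put_object(Bucket=BUCKET, Key=KEY, Body=b"id,amount\n1,100\n")

# "Adding a row" later still means putting the whole object again.
s3.put_object(Bucket=BUCKET, Key=KEY, Body=b"id,amount\n1,100\n2,250\n")
```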
You may want to consider the following steps:
Export your data from Oracle Cloud into Oracle Object Storage (similar to S3) using Oracle Cloud's integration with its object storage. (https://blogs.oracle.com/datawarehousing/the-simplest-guide-to-exporting-data-from-autonomous-database-directly-to-object-storage)
THEN:
Let the customer access the Oracle Object Store as they normally would access S3, using Oracle's Amazon S3 Compatibility API. (https://docs.oracle.com/en-us/iaas/Content/Object/Tasks/s3compatibleapi.htm)
OR:
Use an externally driven script to download the data - either from Oracle Object Store or directly from the database - to a server, then push the file up to Amazon S3 (a minimal sketch follows this list). The server could be local, or hosted in either Oracle OCI or AWS, as long as it has access to both object stores. (https://blogs.oracle.com/linux/using-rclone-to-copy-data-in-and-out-of-oracle-cloud-object-storage)
OR:
You may be able to use AWS DataSync to move data directly from Oracle Object Storage to S3, depending on networking configuration requirements. (https://aws.amazon.com/blogs/aws/aws-datasync-adds-support-for-on-premises-object-storage/)
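Here is a minimal sketch of the externally driven script from the third option, assuming python-oracledb and boto3, with hypothetical connection details, query, and bucket name:

```python
# Sketch of the externally driven script: pull rows from the database,
# build a complete file locally, then push it to S3 in one shot.
# Connection details, query, and bucket name are all hypothetical.
import csv
import boto3
import oracledb

conn = oracledb.connect(user="app_user", password="***",
                        dsn="db.example.com/ORCLPDB1")
with conn.cursor() as cur, open("orders.csv", "w", newline="") as f:
    cur.execute("SELECT order_id, amount, created_at FROM orders")
    writer = csv.writer(f)
    writer.writerow([d[0] for d in cur.description])  # header row
    writer.writerows(cur)                             # cursor yields row tuples

boto3.client("s3").upload_file("orders.csv", "my-export-bucket",
                               "exports/orders.csv")
```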
I have been searching around and haven't found an answer to this question:
Is there any way to import an Oracle .dmp file stored on a local machine into RDS Oracle?
If yes, how do I do it?
If not, why isn't it possible, given that other databases offer the flexibility to do these kinds of imports in more than one way?
You can't do that directly. When you import data with Oracle Data Pump, you must transfer the dump file that contains the data from the source database to the target database. You can transfer the dump file using an Amazon S3 bucket or by using a database link between the two databases.
If your local machine contains a database and you have a network connection between your on-premises database and your Oracle RDS instance, then you can use NETWORK_LINK, although I don't recommend it. It is much better to transfer the file using an S3 bucket.
https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Oracle.Procedural.Importing.html
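As a rough sketch of the S3 route, assuming python-oracledb and boto3 with a hypothetical bucket, credentials, and DSN (the rdsadmin.rdsadmin_s3_tasks.download_from_s3 call is the one documented in the AWS guide linked above):

```python
# Sketch: upload the local dump file to S3, then ask the RDS instance to
# pull it into DATA_PUMP_DIR before running the Data Pump import.
# Bucket name, credentials, and DSN are hypothetical placeholders.
import boto3
import oracledb

boto3.client("s3").upload_file("export.dmp", "my-dump-bucket", "export.dmp")

conn = oracledb.connect(user="admin", password="***",
                        dsn="mydb.xxxx.us-east-1.rds.amazonaws.com/ORCL")
with conn.cursor() as cur:
    cur.execute("""
        SELECT rdsadmin.rdsadmin_s3_tasks.download_from_s3(
                   p_bucket_name    => 'my-dump-bucket',
                   p_directory_name => 'DATA_PUMP_DIR')
        FROM dual""")
    task_id = cur.fetchone()[0]
    print("RDS download task:", task_id)  # then run the Data Pump import
```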
I want to explore Oracle Data Integrator. I am not able to understand what the 'Use Credential File' option on the Data Server in Oracle Data Integrator does. If anyone can explain, that would be helpful. I also want to improve the performance of my Oracle Data Integrator script; any ideas on that would be welcome too.
OK, now I think I understand: you run ODI in the cloud.
You will need a credential file in order to connect to your database.
The way you obtain that credential file is:
Credential files are downloaded from the ADW console to the ODI host in the Oracle Cloud Infrastructure (OCI).
Note: When ODI is deployed from the Marketplace, client credential folders are downloaded from autonomous databases that exist in the OCI compartment containing ODI.
If ADW is in a different compartment than ODI, follow the steps below.
Download the Credentials
Connect to the ODI host using VNC. Refer to the deployment blog above for details.
Launch Firefox from the Applications > Favorites list.
Follow the steps in Downloading Autonomous Data Warehouse Credentials to obtain the client credentials compressed folder containing the wallet and network configuration files used by ODI to make the connections.
The entire way of connecting is described here.
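ODI wires the credential file up through its data server configuration, but as a rough illustration of what the unzipped wallet folder actually enables, here is how a python-oracledb client would use the same files (paths, password, and service name are hypothetical):

```python
# Illustration only: what the unzipped credential (wallet) folder enables.
# A client points at the wallet directory and uses a service name from its
# tnsnames.ora. Paths, password, and service name are hypothetical; ODI
# itself configures this through its data server settings.
import oracledb

conn = oracledb.connect(
    user="ADMIN",
    password="***",
    dsn="myadw_high",                      # entry from the wallet's tnsnames.ora
    config_dir="/home/odi/wallet_myadw",   # unzipped credentials folder
    wallet_location="/home/odi/wallet_myadw",
    wallet_password="***",                 # set when the wallet was downloaded
)
print(conn.version)
```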
I have close to 6 TB of data to be migrated. This data resides in the form of files on Oracle block storage. What is the fastest way to migrate this data over to Azure Files?
I am the PM on Azure Files. You have a few options:
Use AzCopy (a minimal sketch follows this list)
Mount the Azure Files share and copy the data using traditional tools like robocopy or gio
Disk rental
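As a minimal sketch of the AzCopy option, here is how you might drive it from Python; the source directory, share URL, and SAS token are hypothetical placeholders:

```python
# Sketch of the AzCopy option: invoke azcopy from Python to copy a local
# directory tree into an Azure Files share. The source path, share URL,
# and SAS token are hypothetical placeholders.
import subprocess

SRC = "/data/export"                       # local directory holding the files
DST = ("https://mystorageacct.file.core.windows.net/myshare"
       "?<SAS-token>")                     # Azure Files share URL + SAS

subprocess.run(
    ["azcopy", "copy", SRC, DST, "--recursive"],
    check=True,                            # raise if azcopy exits non-zero
)
```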
Feel free to reach out to me at rena dot shah at microsoft dot com.
Thank You
Rena Shah
It is recommended that we store document information in blob storage. In our case the blob storage is related to the SQL Azure data. Is there a facility to back up the blob storage in sync with the SQL Azure data? What I don't want is a point-in-time restore of the SQL Azure data, only to find we don't have a matching snapshot of the blob data at that time :(
Does anyone know what is available?
Interesting issue you have to solve. But there is no automated way to keep blob and Azure SQL Database data in sync. You have to manage this yourself. And it is not just about blob snapshots: what if an updated DB record refers to a new blob? What happens to the old one? These are all business rules to apply at the application level. And you have to ask yourself to what degree you want that backup of blobs.
Here is an interesting blog post on Azure SQL and Storage backup. But again, there is no service that will keep data in sync between SQL DB and Azure Storage for you.
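There is no built-in coordination, but one sketch of the manual approach is to snapshot each blob at backup time and record the snapshot ID alongside the database backup metadata, so a point-in-time restore can locate the matching blob version. The connection string, container, and blob names below are hypothetical:

```python
# Sketch of manual coordination: take a blob snapshot at the same time you
# take (or mark) your database backup, and record the snapshot ID with the
# DB so a point-in-time restore can find the matching blob version.
# Connection string, container, and blob names are hypothetical.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<connection-string>")
blob = service.get_blob_client(container="documents", blob="contract-42.pdf")

snapshot = blob.create_snapshot()          # returns a dict with a 'snapshot' timestamp
snapshot_id = snapshot["snapshot"]
print("Store this with the DB backup metadata:", snapshot_id)
```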