I have close to 6 TB of data to be migrated. The data resides as files in Oracle Block Storage. What is the fastest way to migrate this data over to Azure Files?
I am the PM on Azure Files. You have a few options:
Use AzCopy.
Mount the Azure Files share and copy the data with traditional tools like robocopy or gio (command sketches below).
Disk rental, i.e. shipping physical disks (see the Azure Import/Export service).
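For illustration, here are hedged sketches of the first two options; the storage account, share name, SAS token, paths and drive letter are all placeholders:

    # AzCopy v10: copy a directory tree straight into the file share
    azcopy copy "/mnt/source/*" "https://<account>.file.core.windows.net/<share>?<SAS>" --recursive

    # Or mount the share on Windows and mirror with robocopy
    # (/MIR mirrors the tree, /Z is restartable mode, /MT:32 uses 32 threads)
    net use Z: \\<account>.file.core.windows.net\<share> /u:AZURE\<account> <storage-account-key>
    robocopy D:\source Z:\ /MIR /Z /MT:32

For 6 TB, the throughput of your network link is what decides between these and the disk-shipping route.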
Feel free to reach out to me at rena dot shah at microsoft dot com.
Thank You
Rena Shah
I have an ETL project running on-premises using ODI.
This ETL is quite customized, with Python scripts performing tasks such as downloading/uploading files to SharePoint, reading and parsing Excel files, etc.
If I wanted to migrate this to OCI, what would be the best service(s) for it? Is it a simple migration, or a redo using a new approach?
I see that we have Data Integration workspaces, but they don't seem able to handle this type of customization.
I'm not sure about Data Flows. I am definitely not familiar with the whole Oracle stack, so I need some help :)
Thank you
I have the following scenario (a rather common one, but I am not entirely sure where to start).
Data arrives in a blob storage container (our raw zone). The files get dropped in the raw zone every day (by someone sitting somewhere). Each day, as the new files come in, the old files are overwritten, but the number of records grows.
For example, a customer file from yesterday may have 100 records, while today's file might have 150 (yesterday's 100 plus 50 new ones).
Now, what is the best way to do an incremental load (other solutions welcome) that moves only the latest records into Azure Table Storage?
I have worked with watermarks when loading data from or into SQL, but I don't have much experience with Azure Table Storage. I would appreciate a lead.
Thanks in advance.
You can use ADF to do an incremental load into Azure Table Storage using watermarks. Refer to the links below (a sketch of the watermark pattern follows them); you may need to tweak the implementation a little based on your requirements.
Incrementally load data from Azure SQL Database to Azure Blob storage using the Azure portal
Copy data to and from Azure Table storage using Azure Data Factory or Synapse Analytics
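Outside ADF, the watermark idea itself is small enough to sketch. Below is a minimal, hypothetical Go illustration (the file names and record layout are my assumptions, not from the links above): it remembers how many records were already copied and only processes the tail of today's full file. This works for your scenario because the file is overwritten daily and records only ever append.

    package main

    import (
        "bufio"
        "fmt"
        "log"
        "os"
        "strconv"
        "strings"
    )

    // loadWatermark returns the number of records processed so far;
    // a missing control file means nothing has been loaded yet.
    func loadWatermark(path string) int {
        b, err := os.ReadFile(path)
        if err != nil {
            return 0
        }
        n, _ := strconv.Atoi(strings.TrimSpace(string(b)))
        return n
    }

    func main() {
        const watermarkFile = "watermark.txt" // hypothetical control file
        const dataFile = "customers.csv"      // today's full drop from the raw zone

        done := loadWatermark(watermarkFile)

        f, err := os.Open(dataFile)
        if err != nil {
            log.Fatal(err)
        }
        defer f.Close()

        count := 0
        scanner := bufio.NewScanner(f)
        for scanner.Scan() {
            count++
            if count <= done {
                continue // already landed in Table Storage on a previous run
            }
            // A new record: this is where the upsert into Azure Table Storage
            // (or the hand-off to the ADF copy activity) would go.
            fmt.Println("new record:", scanner.Text())
        }
        if err := scanner.Err(); err != nil {
            log.Fatal(err)
        }

        // Persist the new watermark for tomorrow's run.
        if err := os.WriteFile(watermarkFile, []byte(strconv.Itoa(count)), 0o644); err != nil {
            log.Fatal(err)
        }
    }

In ADF the same bookkeeping is done with a Lookup activity against a watermark table instead of a local file.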
As the title suggests, I'm attempting to open an SDF (MSSQL CE) database in Go. Is this possible?
The Go libraries I've found do not appear to support MSSQL CE database connections.
I hope this is not a duplicate, but I cannot find info online.
I'm afraid it would hardly be possible to work with these files directly, as they are merely an on-disk storage format; a file extension such as .sdf means nothing by itself, and you still need an engine that understands the format.
OTOH, one direct way to approach this problem would be to go through the OLE DB layer.
You can also try https://github.com/denisenkom/go-mssqldb to connect to an MSSQL Express instance with a connection string like:
sqlserver://sa@localhost/SQLExpress?database=master&connection+timeout=30
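For completeness, a minimal sketch of using that driver through database/sql (the credentials, instance name and query are placeholders, and this assumes the .sdf data has first been migrated into the Express instance):

    package main

    import (
        "database/sql"
        "fmt"
        "log"

        _ "github.com/denisenkom/go-mssqldb" // registers the "sqlserver" driver
    )

    func main() {
        // Placeholder credentials; connection+timeout is in seconds.
        dsn := "sqlserver://sa:yourpassword@localhost/SQLExpress?database=master&connection+timeout=30"

        db, err := sql.Open("sqlserver", dsn)
        if err != nil {
            log.Fatal(err)
        }
        defer db.Close()

        // sql.Open is lazy; Ping verifies the connection actually works.
        if err := db.Ping(); err != nil {
            log.Fatal(err)
        }

        var version string
        if err := db.QueryRow("SELECT @@VERSION").Scan(&version); err != nil {
            log.Fatal(err)
        }
        fmt.Println(version)
    }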
I'm looking for the best way to upload files to an Azure SQL Database.
We have to use Azure Data Factory, as at this moment we are not allowed to use Azure VMs with SSIS.
Each day we upload 1.5 GB of XML files.
Currently we upload them to Blob storage and load them into the DB with a Copy activity.
But this takes up to 2.5 hours.
What would be a better/faster approach?
Any suggestions?
You can use the bcp utility to import your data into a SQL Server instance, as explained in this document. From there, you can use Azure SQL Data Sync to synchronize your Azure SQL Database with your SQL Server database. This may give you a faster execution time.
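As an illustration only (server, database, table, credentials and the delimiter are placeholders), a bcp import of a delimited file looks like:

    bcp dbo.StagingTable in data.csv -S yourserver -d YourDatabase -U youruser -P yourpassword -c -t ","

Here -c selects character mode and -t sets the field terminator; for 1.5 GB of XML you would likely first need to shred the XML into a flat, delimited form.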
In the end it dropped to 35 minutes,
just by using several storage accounts (5) and splitting the data over the 5 accounts,
with 5 ADF pipelines all uploading into the same staging table.
Some were huge files, but we had over 100,000 small files of 2 KB up to 100 KB.
This worked out fine for us.
We noticed that the DTUs never hit their limit, so we figured the DB was not the bottleneck; after first splitting into 2 accounts we saw DTU usage rise a bit more, so we continued down that path.
I need to automate a selective table / user object backup that I currently do via PL/SQL Developer.
The way I currently do it is via Tools/Export Tables and Tools/Export User Objects: manually select the tables/objects, set the options, choose a destination, and export. I do this from a Windows laptop, and the database sits on a SUSE Linux server; both are on the same LAN. The DB runs 24/7 and cannot be shut down. Also, my Oracle programming skills are currently very basic, as I only do maintenance on this solution. I would like to keep running the backup process from the Windows laptop, but I would also consider a server-side script solution and then retrieving the .sql files from the server.
Thanks in advance
I wouldn't really call it a backup, but look at exp/imp and expdp/impdp (Data Pump) in the Utilities manual.
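For example, a selective Data Pump export of just a few tables can be scripted like this (credentials, the TNS alias and the table list are placeholders; DATA_PUMP_DIR is the default server-side directory object):

    expdp scott/password@orcl directory=DATA_PUMP_DIR dumpfile=scott_tables.dmp logfile=scott_tables.log tables=EMP,DEPT

Note that the dump file is written on the database server even when expdp is launched from the laptop, which fits the "server-side script, then retrieve the files" approach.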
As Gary implies, exp/imp really isn't a backup solution. If this database is important to you or to others, figure out how to use RMAN, which is usually configured to run in a mode that doesn't require the database to be shut down. Although it executes on the database host and, for non-tape destinations, must write its files to a filesystem attached to that host, it can be launched remotely.
RMAN is aimed at restoring/recovering the entire database, so if all you're looking for is the ability to recover isolated objects, it may not be for you.
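For reference, a minimal online RMAN backup session looks roughly like this (connection details are placeholders, and the database must be running in ARCHIVELOG mode for hot backups):

    rman target sys/password@orcl
    RMAN> BACKUP DATABASE PLUS ARCHIVELOG;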