How can I ensure that a file has been transferred completely onto the destination remote server using the Oracle DBMS_FILE_TRANSFER.PUT_FILE() utility?

Below is my Groovy code, which transfers and imports an Oracle .dmp file from my local machine to a remote (RDS) instance. The problem is that my impdp command executes before DBMS_FILE_TRANSFER has transferred the file to the destination server, which throws exceptions.
I want to check in my code whether DBMS_FILE_TRANSFER has transferred the file completely, so that I can proceed to the impdp command.
Is there any way to check the status of the transferred file before importing it?
sql.call("""BEGIN
DBMS_FILE_TRANSFER.PUT_FILE(
source_directory_object => 'DATA_PUMP_DIR',
source_file_name => '"""+schema+""".dmp',
destination_directory_object => 'DATA_PUMP_DIR',
destination_file_name => '"""+schema+""".dmp',
destination_database => 'to_rds'
);
END;""")
println("Dump file has been transfered to destination server successfully...")
DropUser(rDbUrl,rUser,rPassword,driver,schema)
def impCmp = """cmd /c impdp """+rUser+"""/"""+rPassword+"""#db-m3-medium.coplvukvijdo.us-east-1.rds.amazonaws.com:1521/ORCL DUMPFILE="""+schema+""".dmp DIRECTORY=DATA_PUMP_DIR full=y"""
proc=impCmp.execute()
proc.waitFor()
println("Dump file has been imported on to destination server successfully...")

Related

Bulk import PDF files into an Oracle table

I have multiple folders on my disk and each folder has PDF files (4 files in each folder). How can I insert the files in each folder into Oracle table rows? The folder name will make up the primary key (being a unique social svc #). I have used the code as-is from this link, but I get the following error:
ORA-22285: non-existent directory or file for FILEOPEN operation
ORA-06512: at SYS.DBMS_LOB, line 805
I've also granted all permissions on the directory to my user with this command:
grant all on directory blob_dir to testuser
Please tell me what I am doing wrong.
If you are going to use the BLOB data type, you can upload the data from external files using SQL*Loader. If you are going to use BFILE instead, you just need to copy the files into the Oracle server file system and grant access to them via a DIRECTORY object with the READ privilege. BFILE provides read-only access to external files via SQL.
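A minimal sketch of the BFILE route (an illustration added here, not from the original answer; the directory path, table name, and file names are made up):

-- Directory object that points at the folder holding the PDFs (run as a DBA).
CREATE DIRECTORY blob_dir AS '/data/pdfs/12345';
GRANT READ ON DIRECTORY blob_dir TO testuser;

-- Table keyed by the folder name (the unique social svc #), with a BFILE locator.
CREATE TABLE pdf_docs (
    svc_no   VARCHAR2(20) PRIMARY KEY,
    pdf_file BFILE
);

-- A BFILE only stores a pointer; the PDF itself stays on the server file system.
INSERT INTO pdf_docs (svc_no, pdf_file)
VALUES ('12345', BFILENAME('BLOB_DIR', 'form1.pdf'));

-- ORA-22285 means the locator points at a directory or file the database cannot
-- find; this returns 1 only when the file can actually be opened on the server.
SELECT DBMS_LOB.FILEEXISTS(pdf_file) FROM pdf_docs WHERE svc_no = '12345';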

Why do I get ORA-39001: invalid argument value when I try to impdp in Oracle 12c?

When I run this command in Oracle 12c SE2:
impdp system/Oracle_1@pdborcl directory=DATA_PUMP_DIR dumpfile=mydb.dmp nologfile=Y
I get this:
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-39088: directory name DATA_PUMP_DIR is invalid
We used to import this into 11g all the time.
How can I solve these errors?
From the 12c documentation:
Be aware of the following requirements when using Data Pump to move data into a CDB:
...
The default Data Pump directory object, DATA_PUMP_DIR, does not work with PDBs. You must define an explicit directory object within the PDB that you are exporting or importing.
You will need to define your own directory object in your PDB, which your user (system here) has read/write privileges against.
create directory my_data_pump_dir as 'C:\app\OracleHomeUser1\admin\orcl\dpdump';
grant read, write on directory my_data_pump_dir to system;
It can be the same operating system directory that DATA_PUMP_DIR points to, you just need a separate directory object. But I've used the path you said you'd prefer, from a comment on a previous question.
Then the import is modified to have:
... DIRECTORY=my_data_pump_dir DUMPFILE=mydb.dmp
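As a quick sanity check (not part of the original answer), you can confirm that the new directory object is visible inside the PDB before re-running the import:

-- Run while connected to the PDB (pdborcl here) as a privileged user.
SELECT directory_name, directory_path
FROM   dba_directories
WHERE  directory_name = 'MY_DATA_PUMP_DIR';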

How to pass values into SQL*Loader (Oracle)

I have to prepare a few scripts for importing data into an Oracle database, but I will have to run them against different databases.
For each table to be imported I have a data and control file:
table1.dat
table1.ctl
table2.dat
table2.ctl
etc..
For each table I have prepared a separate .bat file that runs sqlldr:
table1.bat:
sqlldr login/password@database control=table1.ctl log=table1.log
It is an easy and simple solution as long as I don't have to run it on different databases and change login credentials.
What I would like to do is have one file with the login and password that runs the loading scripts for each table.
Have you got any suggestions on how it could be done?
Regards
Pawel
I hope I understood your question.
In your .bat file you can connect to any database, but your sqlldr login decides which database the import runs against.
I would call a start.sql from the .bat file where I do something like this:
-- database 1
host sqlldr login/password@database1 control=table1.ctl log=table1_db1.log
host sqlldr login/password@database1 control=table2.ctl log=table2_db1.log
-- database 2
host sqlldr login/password@database2 control=table1.ctl log=table1_db2.log
host sqlldr login/password@database2 control=table2.ctl log=table2_db2.log
Another option is to call import_db1.sql from your start file and write the code concerning database 1 there, etc.
start.sql
@@import_db1.sql
@@import_db2.sql
import_db1.sql
-- database 1
host sqlldr login/password@database1 control=table1.ctl log=table1_db1.log data=csvfile.csv
host sqlldr login/password@database1 control=table2.ctl log=table2_db1.log data=csvfile.csv
etc.
Your issue isn't very clear; however, it sounds like you just want to source a username/password per server. In that case, for bash, you can do:
. /dir/to/file/.sql_password_file
where sql_password_file has the entry:
SQLLDRLOGON='user/pass'
then in your script you can do
sqlldr userid=$SQLLDRLOGON control=table1.ctl log=table1.log
I would look into changing your script to use a loop too, e.g.:
for load in table1 table2
do
    loads="control=${load}.ctl bad=${load}.bad log=${load}.log"
    sqlldr $SQLLDRLOGON $loads
    # etc...
done

Copy contents of a remote location to a local file in a Ruby script

I want to copy the contents of a file at a remote location to a local file after an SSH connection has been made.
begin
  ssh = Net::SSH.start("localhost", "user")
  logger.info "conn successful!"
  results = ssh.exec!("ruby somefile \"#{arguments}\"")
  # code to copy the contents of a.txt in remote location to local file
  # IO.copy_stream(localfile, remotefile)
rescue
  logger.info "error - cannot connect to host"
end
I tried using IO.copy_stream but that doesn't work. How do I go about this?
Use Net::SCP (which requires Net::SSH) to transfer files:
require 'net/scp'

Net::SCP.download!("remote.host.com", "username",
                   "/remote/path", "/local/path",
                   :password => password)
More info here: https://rubygems.org/gems/net-scp

FTP: copy, check integrity and delete

I am looking for a way to connect to a remote server with ftp or lftp and perform the following steps:
Copy files from FTP server to my local machine.
Check if the downloaded files are fine (i.e. md5checksum).
If the download was fine then delete the downloaded files from the FTP server.
This routine will be executed each day from my local machine. What would be the best option to do this? Is there a tool that abstracts all three steps?
I am running Linux on both client and server machines.
Update: Additionally, I also have a text file that contains the association between the files on the FTP server and their MD5 sums. They were computed on the FTP server side.
First, make sure your remote server supports checksum calculation at all. Many do not. I believe there is not even a standard FTP command to calculate the checksum of a remote file. There have been many proposals, and there are many proprietary solutions.
The latest proposal is:
https://datatracker.ietf.org/doc/html/draft-bryan-ftpext-hash-02
So even if your server supports checksum calculation, you have to find a client that supports the same command.
Some of the commands that can be used to calculate checksum are: XSHA1, XSHA256, XSHA512, XMD5, MD5, XCRC and HASH.
You can test that with WinSCP, which supports all the previously mentioned commands. Test its checksum calculation function or the checksum scripting command. If they work, enable logging and check what command and syntax WinSCP uses against your server.
Neither ftp (the Windows or the *nix version) nor lftp supports checksum calculation, let alone automatic verification of a downloaded file.
I'm not even aware of any other client that can automatically verify a downloaded file.
You can definitely script it with the help of some feature-rich client.
I wrote this answer before the OP specified that he/she is on Linux. I'm keeping the Windows solution in case it helps someone else.
On Windows, you could script it with PowerShell using the WinSCP .NET assembly.
param (
    $sessionUrl = "ftp://username:password@example.com/",
    [Parameter(Mandatory)]
    $localPath,
    [Parameter(Mandatory)]
    $remotePath,
    [Switch]
    $pause = $False
)

try
{
    # Load WinSCP .NET assembly
    Add-Type -Path (Join-Path $PSScriptRoot "WinSCPnet.dll")

    # Setup session options
    $sessionOptions = New-Object WinSCP.SessionOptions
    $sessionOptions.ParseUrl($sessionUrl)

    $session = New-Object WinSCP.Session

    try
    {
        # Connect
        $session.Open($sessionOptions)

        Write-Host "Downloading $remotePath to $localPath..."
        $session.GetFiles($remotePath, $localPath).Check()

        # Calculate remote file checksum
        $buf = $session.CalculateFileChecksum("sha-1", $remotePath)
        $remoteChecksum = [BitConverter]::ToString($buf)
        Write-Host "Remote file checksum: $remoteChecksum"

        # Calculate local file checksum
        $sha1 = [System.Security.Cryptography.SHA1]::Create()
        $localStream = [System.IO.File]::OpenRead($localPath)
        $localChecksum = [BitConverter]::ToString($sha1.ComputeHash($localStream))
        Write-Host "Downloaded file checksum: $localChecksum"

        # Compare checksums
        if ($localChecksum -eq $remoteChecksum)
        {
            Write-Host "Match, deleting remote file"
            $session.RemoveFiles($remotePath).Check()
            $result = 0
        }
        else
        {
            Write-Host "Does NOT match"
            $result = 1
        }
    }
    finally
    {
        # Disconnect, clean up
        $session.Dispose()
    }
}
catch [Exception]
{
    Write-Host "Error: $($_.Exception.Message)"
    $result = 1
}

# Pause if -pause switch was used
if ($pause)
{
    Write-Host "Press any key to exit..."
    [System.Console]::ReadKey() | Out-Null
}

exit $result
You can run it like:
powershell -file checksum.ps1 -remotePath ./file.dat -localPath C:\path\file.dat
This is partially based on the WinSCP example Verifying checksum of a remote file against a local file over SFTP/FTP protocol.
(I'm the author of WinSCP)
The question was later edited to say that the OP has a text file with the checksums. That makes it a completely different question. Just download the file, calculate the local checksum, and compare it to the checksum you have in the text file. If they match, delete the remote file.
That's a long shot, but if the server supports PHP, you can take advantage of that.
Save the following as a PHP file (say, check.php), in the same folder as your name_of_file.txt file:
<?php
echo md5_file('name_of_file.txt');
?>
Then visit the page check.php, and you should get the MD5 hash of your file.
Related questions:
Calculate file checksum in FTP server using Apache FtpClient
How to perform checksums during a SFTP file transfer for data integrity?
https://serverfault.com/q/98597/401691
