How can I read a file that resides on the application server using external tables? I can't place the file on the DB server, as we are restricted from accessing the DB server.
According to the documentation, the data file for an external table must be visible to the database server.
You access an external file via a DIRECTORY object in the database, which points to a directory on the db server's file system.
Now, you can mount a drive from the application server so that it appears in the db server's file system, but someone with access to the db server will probably have to do that.
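As a minimal sketch of what that looks like once the mount is in place (the path, file name, and column layout here are hypothetical), run on the database side:

-- Run as a privileged user; the path must exist on the DATABASE server
-- (or be a mount/share visible to it), not on the application server.
CREATE OR REPLACE DIRECTORY ext_data_dir AS '/u01/app/ext_data';
GRANT READ, WRITE ON DIRECTORY ext_data_dir TO app_user;

-- Hypothetical external table over a comma-separated file in that directory.
CREATE TABLE app_user.emp_ext (
  emp_id   NUMBER,
  emp_name VARCHAR2(100)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY ext_data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('emp.csv')
);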
I am creating a web app using Oracle APEX, and I need to draw data from an Excel or CSV file on a remote server. Is the remote server option of Oracle APEX capable of that? Is it possible, and how? Thanks in advance.
If you are talking about the "Data Loading" page type, then yes - the wizard creates several pages, one of which lets you browse for the source file. If the remote server's directory is mapped on your computer, you'll see it while browsing for the file and can use it as the data source.
I have a specific need for Data Pump, and I am having a hard time finding a solution.
Currently, I have an exp/imp program that exports tables (selectively, based on queries) from one database and imports that same data into another database. The program and the dump files reside on a common server that can access both the source and destination databases. This is a totally automated process. It works well, albeit slowly.
For various reasons, I must migrate this program to use Data Pump. The biggest change now is the location of the dmp files. I also have very limited access to the database servers themselves, but I can run Data Pump.
The process will be run from the same common server, but the exported files will now reside on the database server of the source database. No issue there; I can create dmp files using expdp.
My issue is how to get that same data into the destination database. When I run impdp, it expects a data_pump_dir in the destination area (not the source area). Again, this is automated, and I don't have the luxury of transferring dmp files using scp or ftp or anything like that.
What can I use to overcome this problem with Data Pump?
There is no reason you cannot create a directory object on BOTH databases:
CREATE DIRECTORY mydumpdir AS '/whatever/the/path/is';
Then pass DIRECTORY=mydumpdir to both expdp and impdp.
Make sure the Oracle schemas/users have read/write privileges on the directory object, AND that the oracle process account has OS-level rights to read and write that location. The server running expdp should also have write access, as it may be trying to write log files to the location, or you might be using it to do file cleanup.
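For example (the credentials, schema name, and connect identifiers below are hypothetical, and each database's mydumpdir must point at a path that actually exists on that server - e.g. a shared mount visible to both), the run from the common server might look like:

# Export on the source database; the dump file lands in the source's mydumpdir path
expdp system@source_db SCHEMAS=app_owner DIRECTORY=mydumpdir DUMPFILE=mytables.dmp LOGFILE=exp.log

# Import on the destination database, reading from its own mydumpdir path
impdp system@dest_db DIRECTORY=mydumpdir DUMPFILE=mytables.dmp LOGFILE=imp.log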
I am trying out the BFILE functionality in Oracle. My plan is that all the files should be stored on a file server whose IP is 192.165.1.10.
Based on this, I created a directory in my local PC's database like this:
create directory TEST_DIR as '\\192.165.1.10\c\ATTACH_FILES\STUDENT';
The directory is created. My doubt is: since my db system and file server are in different locations, should I grant any other privileges in Oracle?
Please give your opinion, as BFILE is not working properly for me.
Note, my database server and file server are both Windows.
"My doubt is being my db system and file server in different locations "
That's a very good doubt to have. The database can only access OS directories on its local server, plus directories which have been shared with that server. So you will need to share your file server directory using System Tools > Shared Folders > Shares.
As the database server is Windows, you will need to map the shared directory if it isn't mapped already. The mapping must be owned by the OS user that owns the Oracle database, or the mapping's owner must grant permissions to the Oracle OS user (or its group), so that requires sysadmin access. You may also have to bounce the database.
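As a sketch of the database-side pieces (TEST_DIR is from the question; the grantee schema and file name below are hypothetical, and the OS-level share and mapping described above are assumed to be in place):

GRANT READ ON DIRECTORY test_dir TO student_app;

-- Quick smoke test: can the database actually see a file through the BFILE?
DECLARE
  f BFILE := BFILENAME('TEST_DIR', 'photo1.jpg');
BEGIN
  IF DBMS_LOB.FILEEXISTS(f) = 1 THEN
    DBMS_OUTPUT.PUT_LINE('file is visible to the database');
  ELSE
    DBMS_OUTPUT.PUT_LINE('file not found - check the share and the mapping');
  END IF;
END;
/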
I have this very specific requirement.
My database server is running on some Linux server X, where I have written a stored procedure that reads a file from a DIRECTORY and creates an XML table based on the content of that XML file.
Now, the file in question can come from any machine, i.e. it is uploaded by a user in a browser, and then we need to process it with the stored procedure.
Is there a way I can access a file on my local machine from the database server without mount/ftp? I mean, is there any utility in Oracle which can access the client's file system to read the file content?
is there any utility in Oracle which can access the client's file system to read the file content?
No, there is not. A PL/SQL program cannot reach your client PC. You have to upload the file to the server; then you can use UTL_FILE to read it.
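Once the file has been uploaded to a directory the database can see, a minimal UTL_FILE sketch (XML_DIR and upload.xml are hypothetical names) looks like this:

DECLARE
  fh   UTL_FILE.FILE_TYPE;
  line VARCHAR2(32767);
BEGIN
  -- XML_DIR is a DIRECTORY object on the server; 'upload.xml' the uploaded file
  fh := UTL_FILE.FOPEN('XML_DIR', 'upload.xml', 'r');
  LOOP
    UTL_FILE.GET_LINE(fh, line);   -- raises NO_DATA_FOUND at end of file
    DBMS_OUTPUT.PUT_LINE(line);    -- replace with your XML processing
  END LOOP;
EXCEPTION
  WHEN NO_DATA_FOUND THEN
    UTL_FILE.FCLOSE(fh);
END;
/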
I create a MonetDB database using mserver5 (in actual fact, I use R and MonetDB.R for this part).
Retrospectively (once the db has been created) I would like to do the following:
set a remote connection to the db
set a passphrase for the remote connection (apparently necessary)
Please note that, from the manuals, I think I can do the above on a new dbfarm created with monetdbd.
My problem is to do the above retrospectively on an existing db.
To start, I tried to use monetdbd, pointing it at the db folder (created by mserver5), with
monetdbd get all myFolderCreatedWithmserver5
But I get
unable to read properties from myFolderCreatedWithmserver5: no such file or directory
You are most probably pointing monetdbd at the wrong folder. You can identify the "dbfarm" folder by its contents:
.merovingian_lock
merovingian.log
.merovingian_properties
one folder for each database
If you point it at a folder above this one, or at any individual database folder, you will get the above error message.
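Once you point monetdbd at the real dbfarm folder, a sketch of the retrospective setup might look like this (the path and passphrase are placeholders; control and passphrase are the property names documented in the monetdbd manual, and properties are normally changed while monetdbd is stopped):

# Verify you have the right folder: this should now list the properties
monetdbd get all /path/to/dbfarm

# Allow remote control connections and set the required passphrase
monetdbd set control=yes /path/to/dbfarm
monetdbd set passphrase=mysecret /path/to/dbfarm

monetdbd start /path/to/dbfarm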