Sqoop accessing specific Oracle Schema - oracle

I have an Oracle database that has a number of schemas in it, a master schema and a bunch of children schemas. My master schema has privileges so that it can create/destroy/access tables in any of the children.
My question is: I'm doing a list-tables in Sqoop on the master schema, and I'm seeing all the children's tables included in the results.
Is there a way to distinguish which schema those tables belong to? I have some names that overlap and it's impossible to tell which table goes where at the moment.

I believe that this issue is being solved by SQOOP-741 [1].
Links:
1: https://issues.apache.org/jira/browse/SQOOP-741

Sounds like you are selecting from ALL_TABLES. If you are logged in as the "master" schema user, you want to select from USER_TABLES instead. If you are selecting from ALL_TABLES, you need to filter by the OWNER column.
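A minimal illustration of the difference, assuming you are connected as the master user ('MASTER' is a placeholder for the actual schema name):

-- Tables owned by the connected (master) user only
SELECT table_name FROM user_tables;

-- Tables visible to the user, with the owning schema to disambiguate duplicate names
SELECT owner, table_name FROM all_tables WHERE owner = 'MASTER';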

Related

Oracle user DB export command's scope (User/Schema level)?

I'm a total novice in terms of Oracle DB knowledge, and I'm trying to understand the IMPDP command and its scope.
Issue: Suppose there are 500 tables in a particular DB; many of them (60-70% or more) come through with zero records when we import the data into a fresh Oracle DB (we get the data from a vendor who has the DB). The doubt is: how can most of the tables in a DB have zero records (why were they created in the first place, then)? We're also assuming that maybe the vendor generates the .DMP files as a specific user who has no access to those tables, hence the zero counts. When we asked the vendor, they said that's not how Oracle works; they provided a user export dump and said, "A schema is a collection of database objects owned by a specific user. Those objects include tables, indexes, views, functions, stored procedures, etc."
When asked about the zero-records issue, they said they're pulling the data correctly and have no idea why so many tables are empty. The SO community has great experts in Oracle DB; can anyone shed some light on:
1) What might be the issue?
2) Is our assumption correct (i.e., that the user doesn't have access to the tables which got zero records)?
3) What's the right way forward?
4) Anything else you want to add.
The vendor is correct: the utility used to generate the export, EXPDP (the complement to IMPDP), can create a full dump of all of the database objects of a specific user. However, the parameters used to generate the export can vary greatly, and it's absolutely possible for an export to not include table data if the EXPDP command/parameters used to create the export are specified that way. For example, let's imagine that someone wants to export a specific schema using the following command:
expdp [USER]@[DATABASE] schemas=test directory=DATA_PUMP_DIR dumpfile=test.dmp logfile=test.log query=TEST.TABLE:'"WHERE row_date>sysdate"'
While the export is being generated, the rows in that specific table are evaluated against the WHERE condition. Unless rows have a date in the future, none of the rows dated up to SYSDATE will be exported. If a condition like that is applied to the entire export, you'll end up with tables of 0 rows in the dump file.
That is just an example; it might also be the case that the tables really have 0 rows. This is possible for a lot of reasons: perhaps it is an older schema with tables that have previously been truncated. Perhaps that particular database isn't used often, and the tables within the schema are empty because rows were never added to them. Maybe a developer or another DBA created a bunch of unnecessary tables and they simply were never dropped. There is a plethora of potential reasons for a schema to have empty tables, and it doesn't mean there is something wrong with the database or the export file being generated. Applications and their technical requirements change all the time, and it's possible that the schema simply wasn't updated when those tables were no longer needed.
The first thing I would recommend is:
Ask the vendor to provide record counts of each table in that schema from their end for validation purposes. This will tell you whether the tables are empty in the source database; if they are empty there, they will be empty in your export. This is very simple and can be achieved with a query like select owner, table_name, num_rows, sample_size, last_analyzed from all_tables where owner = '[SCHEMA]'; provided that their table statistics are up to date.
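If their statistics might be stale, exact counts can be gathered with a little dynamic SQL instead; a minimal sketch, where 'TEST' is a placeholder for the schema name:

SET SERVEROUTPUT ON
DECLARE
  cnt NUMBER;
BEGIN
  FOR t IN (SELECT owner, table_name FROM all_tables WHERE owner = 'TEST') LOOP
    -- Count the rows of each table; quoted identifiers guard against unusual names
    EXECUTE IMMEDIATE 'SELECT COUNT(*) FROM "' || t.owner || '"."' || t.table_name || '"' INTO cnt;
    DBMS_OUTPUT.PUT_LINE(t.table_name || ': ' || cnt);
  END LOOP;
END;
/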
If this is a big concern for you, you can always ask them to exclude those tables in the export with a command like:
expdp [USER]@[DATABASE] schemas=test exclude=TABLE:"IN ('Table1', 'Table2')" directory=DATA_PUMP_DIR dumpfile=test.dmp logfile=test.log
Or simply exclude them during your import with a command like:
impdp [USER]@[DATABASE] schemas=test exclude=TABLE:"IN ('Table1', 'Table2')" directory=DATA_PUMP_DIR dumpfile=test.dmp logfile=test.log
Either way should work, but be careful and ensure that there will be no issues from a constraint/child-record perspective. You can also exclude the constraints themselves; there are many ways to work around it.
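For instance, a sketch (not the vendor's actual command): Data Pump's EXCLUDE parameter accepts the REF_CONSTRAINT object type, which skips referential constraints during the import:

impdp [USER]@[DATABASE] schemas=test exclude=REF_CONSTRAINT directory=DATA_PUMP_DIR dumpfile=test.dmp logfile=test.log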
If there are inconsistencies between the counts and the rows imported, I would recommend asking the vendor for the specific EXPDP command or parameter file that was used to generate the export. This will tell you whether the empty tables are being caused by a clause in the export command.
It's impossible to know if your assumption is correct without knowing more about the database the export is coming from or seeing the commands being used to generate the export. I would ask the vendor to verify record counts before assuming that it's a permission issue. Empty tables are created all the time.

oracle synchronize 2 tables

I have the following scenario and need to solve it in ORACLE:
Table A is on a DB-server
Table B is on a different server
Table A will be populated with data.
Whenever something is inserted into Table A, I want to copy it to Table B.
Table B has nearly identical columns, but sometimes I just want to take the content of two columns from Table A, concatenate it, and save it to Table B.
I am not very familiar with Oracle, but after researching on Google, some say that you can do it with triggers or views; how would you do it?
So in general, there is a table which will be populated, and its content should be copied to a different table.
This is the solution I came up with so far:
create public database link other_db
  connect to user identified by pw
  using 'tns-entry';
CREATE TRIGGER modify_remote_my_table
AFTER INSERT ON my_table
BEGIN INSERT INTO ....?
END;
/
How can I select the latest row that was inserted?
If these two tables are in databases on two different servers, then you will need a database link (db-link) created in Table A's schema so that it can access (read/write) Table B's data over the link.
Step 1: Create a database link in the Table A server's DB pointing to the Table B server's DB.
Step 2: Create a trigger on Table A which inserts the data into Table B over the database link. You can customize (e.g. concatenate the values) inside the trigger before inserting into Table B, as sketched below.
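A minimal sketch of such a trigger, reusing the other_db link from the question; the column names (id, col1, col2, combined_col) and the remote table name are hypothetical. Note that a FOR EACH ROW trigger also answers the "latest inserted row" question: the :NEW pseudo-record holds exactly the row being inserted, so there is no need to select it back.

CREATE OR REPLACE TRIGGER modify_remote_my_table
AFTER INSERT ON my_table
FOR EACH ROW
BEGIN
  -- :NEW carries the column values of the row that was just inserted
  INSERT INTO my_table_b@other_db (id, combined_col)
  VALUES (:NEW.id, :NEW.col1 || ' ' || :NEW.col2);
END;
/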
This link should help you
http://searchoracle.techtarget.com/tip/How-to-create-a-database-link-in-Oracle
Yes, you can do this with triggers, but there may be a few disadvantages:
What if database B is not available? -> Exception handling in your trigger.
What if database B was not available for 2 hours? You inserted data into database A which is now missing in database B. -> Do crazy things like temporarily inserting it into a cache table in database A.
Performance. The performance of inserting a lot of data will be ugly: each time you insert data, Oracle starts the PL/SQL engine to insert the data into the remote database.
Maybe you could think about using MViews (Materialized Views) to replicate the data via database link. Later you can build your queries so that they access tables from database B and add the required data from database A by joining the MViews.
You can also use fast refresh to replicate the data in (almost) real time.
From the perspective of an Oracle database admin, this would make a lot more sense than the trigger approach.
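A minimal sketch of the MView approach, assuming a link named src_db pointing at the database that owns table_a, and assuming table_a has a primary key (all names are placeholders):

-- On the source database: a materialized view log enables fast (incremental) refresh
CREATE MATERIALIZED VIEW LOG ON table_a;

-- On the destination database: pull changes over the link roughly every minute
CREATE MATERIALIZED VIEW table_a_mv
  REFRESH FAST START WITH SYSDATE NEXT SYSDATE + 1/1440
  AS SELECT * FROM table_a@src_db;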
Try this code:
https://gist.github.com/anonymous/e3051239ba401e416565cdd912e0de8c
It uses ORA_ROWSCN to sync tables across two different Oracle databases. Database links are considered rather insecure, Oracle's own replication options come with license costs these days, and some of the other options are deprecated as well.

How to use Oracle Dblinks that honor referential integrity constraints?

I have a need to move data between two identical Oracle databases. I have figured out how to use dbLinks to achieve most of it. Here is my confusion.
Let's say I have Table A, which refers to Table B, present in DB1 and with the same structure in DB2. Is there any way for me to create a db link that moves Table A's data between DB1 and DB2 and automatically copies the relevant data in Table B to satisfy the referential constraints (without me having to spell it out)?
A simple approach would be to duplicate the foreign key and check constraints of DB2.TableB on the destination table DB1.TableA, so the destination enforces the same referential integrity.
A little more work is to create a materialized view in DB1 along the lines of
Create Materialized View TableA as Select * from TableB@DB2.link;
Refresh as required... You cannot do an ON COMMIT (truly real-time) fast refresh against a remote table, but very few applications require true real-time synchronization.
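When the view needs to be brought up to date, an on-demand refresh can be run manually or from a scheduled job; a minimal sketch using the materialized view from the example above:

-- 'C' requests a complete refresh; schedule via DBMS_SCHEDULER as needed
EXEC DBMS_MVIEW.REFRESH('TABLEA', 'C');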

Oracle - Create two tables with the same name in different schemas in the same tablespace. Is it valid? Does it produce any errors?

I understand that we can have tables with the same name as long as they are in different schemas. Let's say both schemas' tables are placed in the same tablespace; is that still valid? As far as I know it is. Is my understanding correct, or is there some caveat in this scenario, or any other issues?
Yes, it's perfectly valid.
The namespaces in an Oracle database are schemas; tablespaces are solely related to the physical storage of segments. You can't have two tables with the same name in the same schema, but you can have as many tables with the same name as you like in the same tablespace.
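For illustration, a minimal sketch (the schema and tablespace names are hypothetical, and each schema owner needs quota on the tablespace):

-- Same table name, different schemas, one tablespace: perfectly valid
CREATE TABLE schema_a.customers (id NUMBER) TABLESPACE users;
CREATE TABLE schema_b.customers (id NUMBER) TABLESPACE users;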

SSIS breaks Oracle Privileges

I granted a user privileges on one schema in Oracle. When accessing the Oracle database using SSIS, I see all the tables and schemas, but when I use SQL*Plus it shows me only one schema.
What is the problem here?
What query are you running to see tables in SQL*Plus? If you are querying USER_TABLES, you will only see the tables that the current user owns. If you are querying ALL_TABLES, you will see all the tables that you have permission to query, regardless of owner. If you are querying DBA_TABLES, you will see all the tables in the database (though you need additional privileges to query the DBA% objects).
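Side by side, the three scopes look like this:

-- Only tables owned by the connected user
SELECT table_name FROM user_tables;
-- All tables the connected user has permission to query, with their owners
SELECT owner, table_name FROM all_tables;
-- Every table in the database (requires SELECT ANY DICTIONARY or SELECT_CATALOG_ROLE)
SELECT owner, table_name FROM dba_tables;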
There is another question on how to get a list of all the tables in a database that goes into more detail about this.
