How do I add the PeopleSoft identifier to my Autonomous Database (shared) - Oracle

I was told that some settings in Autonomous Database are specific to PeopleSoft and that I need to set some kind of identifier. How do I do this?

Once you provision your ADB-S instance, log an SR detailing the following information:
PRODUCT: Autonomous Database on Shared Infrastructure
ABSTRACT/SUMMARY: PSFT on ADBS: please set PeopleSoft DB identifier
In the SR, please include:
Region (Data Center location)
Tenancy name and OCID
Autonomous DB name and OCID
Request to set the init.ora parameter: _unnest_subquery=false
Cloud Ops will then make the change for your environment(s).
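Once Cloud Ops confirms the change, a minimal sanity check, assuming your ADB user is allowed to query V$SPPARAMETER (underscore parameters generally appear there only after they have been explicitly set):
-- Returns one row with value FALSE once the change has been applied:
SELECT name, value
  FROM v$spparameter
 WHERE name = '_unnest_subquery';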
Reference:
MOS Note: 2765965.1 - Tech Update – PeopleSoft Extends Support for Oracle Autonomous Database

Related

In Oracle, how do I determine what audit settings are present for individual schemas?

I manage an Oracle database as the DBA in my current role, but the database infrastructure, including audit settings, was defined by my predecessor, whom I cannot contact.
I am trying to identify how my predecessor set up Oracle auditing. I suspect they configured auditing at a schema level with different configurations for different schemas. Is this possible?
If so, how do I see the different configurations for each schema?
Thanks
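A starting point, assuming the predecessor used traditional (pre-unified) auditing: the DBA_*_AUDIT_OPTS views show the statement-, privilege-, and object-level audit options currently in effect, per user and per schema object.
-- Statement-level audit options, recorded per user:
SELECT user_name, audit_option, success, failure
  FROM dba_stmt_audit_opts
 ORDER BY user_name, audit_option;

-- Privilege-level audit options, recorded per user:
SELECT user_name, privilege, success, failure
  FROM dba_priv_audit_opts
 ORDER BY user_name, privilege;

-- Object-level audit options (one column per audited action):
SELECT owner, object_name, object_type, sel, ins, upd, del
  FROM dba_obj_audit_opts
 ORDER BY owner, object_name;
On 12c and later, if unified auditing is enabled, query AUDIT_UNIFIED_ENABLED_POLICIES instead to see which audit policies are enabled for which users.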

Oracle Cloud Infrastructure. "No Active State" tenancy when creating autonomous database

I am new to OCI and created my account on the Oracle Cloud Free Tier.
I want to create an autonomous database, but I always get this error:
Operation failed because the OCI tenancy with OCID ocid1.tenancy.oc1..aaaaaaaav6vqssmwmak4toidqdnvwtj6tk2urynptsevcwikmaqmar3ebk4a is not in the Active state.
When I check my tenancy details, the tenancy appears as Active.
Can you help me to create my autonomous database?
From the screenshot the account is Active; however, since the CSI number is blank, I wanted to check whether this is a new account created today.
If yes, the account might not have fully provisioned yet. Please contact Oracle Cloud support via chat; I've added a screenshot for your reference.
Hope this helps. If it is an old account, please ignore my response.

AWS DMS - incremental migration from Oracle to Redshift

I am new to the AWS DMS service. Plans are to migrate on-prem Oracle to Redshift. Before going into the production environment, I am currently trying out a test Oracle RDS instance in AWS, which is a small subset of the actual database, as the source. So far I have been successful with the bulk load and incremental migration from RDS to Redshift.
When it comes to on-prem Oracle, particularly for the incremental load:
1) As per the document http://docs.aws.amazon.com/dms/latest/sbs/CHAP_On-PremOracle2Aurora.Steps.ConfigureOracle.html, the on-prem database needs to be enabled with supplemental logging. Plans are to use the following two commands.
ALTER DATABASE ADD SUPPLEMENTAL LOG DATA;
ALTER DATABASE ADD SUPPLEMENTAL LOG DATA (PRIMARY KEY) COLUMNS;
The production database has multiple logging locations. Are there any log settings other than the above two that I should be looking into for DMS to pick up multiple log locations?
2) In the same link given, point 4 says 'Create or configure a database account to be used by AWS DMS.'
Where should I create this user: on-prem Oracle or AWS?
How do I configure DMS to use this user?
You need to read this documentation:
https://docs.aws.amazon.com/dms/latest/userguide/CHAP_Source.Oracle.html
For your second question: you need to create a user in the source Oracle database; the section 'Working with a Self-Managed Oracle Database as a Source for AWS DMS' tells you all of the grants you need to give.
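A minimal sketch of that user, created on the on-prem (source) Oracle database; the user name and password are hypothetical, and this is only a subset of the grants, so the 'Working with a Self-Managed Oracle Database as a Source for AWS DMS' section remains the authoritative list:
-- Run as SYS (or a suitably privileged DBA) on the source database:
CREATE USER dms_user IDENTIFIED BY "Str0ng#Passw0rd";
GRANT CREATE SESSION TO dms_user;
GRANT SELECT ANY TABLE TO dms_user;
-- Views DMS reads to locate and mine the redo/archive logs:
GRANT SELECT ON sys.v_$database TO dms_user;
GRANT SELECT ON sys.v_$log TO dms_user;
GRANT SELECT ON sys.v_$logfile TO dms_user;
GRANT SELECT ON sys.v_$archived_log TO dms_user;
-- For change data capture via LogMiner (Oracle 12c and later):
GRANT LOGMINING TO dms_user;
GRANT EXECUTE ON sys.dbms_logmnr TO dms_user;
You then point DMS at this user by entering its credentials (plus host, port, and SID or service name) in the source endpoint definition.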
For your first question, if you look at the SQL Server documentation:
https://docs.aws.amazon.com/dms/latest/userguide/CHAP_Source.SQLServer.html
it specifies the limitation: 'SQL Server backup to multiple disks isn't supported. If the backup is defined to write the database backup to multiple files over different disks, AWS DMS can't read the data and the AWS DMS task fails.'
I can't see a similar stipulation in the Oracle documentation (the first link), so I would hazard a guess that, in the case of Oracle, DMS is able to determine and cope with multiple logging locations from configuration values inside the database.
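On the multiple-logging-locations point, two sanity checks you can run on the source database yourself; the first confirms supplemental logging took effect, the second lists the archive log destinations that are actually in play:
-- Should return YES / YES after the two ALTER DATABASE commands above:
SELECT supplemental_log_data_min, supplemental_log_data_pk
  FROM v$database;

-- The valid archive log destinations configured on the database:
SELECT dest_id, dest_name, destination, status
  FROM v$archive_dest
 WHERE status = 'VALID';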

Accessing SAP Pool Table A016 from Sql Developer

We have two divisions in our company: one uses E1 on Oracle 11g, the other uses SAP on Oracle 11g.
We also have a SQL Server system we use to warehouse data from both systems once a night, which our report server runs against.
The question I have is: for pooled tables in SAP, such as A016, how do I get that information out of SAP?
Currently we have SSIS packages set up with linked servers to the two Oracle servers, which pull the data we need; I just don't have the SAP knowledge to find the pooled tables.
If I can't pull the pooled tables because they don't physically exist, is there a tool I can use in SAP to find out what tables the pooled table is getting its information from? That way I can rebuild that table in SQL Server using an OPENQUERY and some fun joins.
Thanks
You have to access those tables using the application server. They can't be accessed directly from the database.
You'll probably want to write an ABAP program to extract the data you need and go from there.
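That said, if you only want to find which physical table stores a pooled table's rows, the SAP data dictionary table DD02L is transparent, so you can read it over your existing linked server. A sketch, assuming the common SAPSR3 schema (older systems use SAPR3 or SAP<SID>):
-- TABCLASS gives the table category ('POOL' for pooled tables) and
-- SQLTAB names the physical table pool (for A016 this should be KAPOL):
SELECT tabname, tabclass, sqltab
  FROM sapsr3.dd02l
 WHERE tabname = 'A016';
Note that the rows inside the pool are stored in a compressed VARDATA column, so you still can't decode them with plain SQL; that's why the extraction has to go through the application server, as above.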

Change Oracle Schema at runtime when using SubSonic

In my project, I am using an Oracle database and SubSonic for the DAL. I have a problem with SubSonic and Oracle schemas:
When developing, I used a DEV schema in the Oracle database and generated the DAL using SubSonic.
After release, the customer used a new schema, TEST, in his Oracle database and changed the connection string in app.config to connect to Oracle. The error 'Table or view does not exist' appeared; I looked into it and saw that the schema of the tables is still DEV.
I do not want to re-generate the DAL after every schema change and release to the customer. Please help me.
Firstly, your schema should not be DEV. DEV is a user or role.
Your schema name should be related to the data content (e.g. ACCOUNTS or SALES).
Secondly, consider whether you or the customer is going to decide the schema name. Say you have a product called FLINTSTONE. You may decide that the schema name should be FLINTSTONE. However, your customer may want to run two instances of your product (e.g. one for local sales, the other for international) in the same database, so they want FS_LOCAL and FS_INTER as the schema names. Is that option a feature of your product?
Next, decide if your application should connect as the schema owner. There are good security reasons for NOT doing that. For example, the schema owner has privileges to drop tables, which is generally something the application doesn't do and thus, on the principle of least privilege, is something your application shouldn't have privileges to do.
Generally I would recommend a config parameter in the application for the schema name; after connecting to the database, the app should issue an "ALTER SESSION SET CURRENT_SCHEMA = <whatever was in the config file>". The application database user would need the appropriate insert/update/delete/select/execute privileges on the objects in the application schema. If the application can't do that, you can use a LOGON trigger in the database instead, as sketched below.
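A minimal sketch of both options, with hypothetical names (application user APP_USER, schema owner FLINTSTONE):
-- Issued by the application right after it connects:
ALTER SESSION SET CURRENT_SCHEMA = FLINTSTONE;

-- Or, if the application can't issue it, a logon trigger does the same:
CREATE OR REPLACE TRIGGER app_user_logon
  AFTER LOGON ON app_user.SCHEMA
BEGIN
  EXECUTE IMMEDIATE 'ALTER SESSION SET CURRENT_SCHEMA = FLINTSTONE';
END;
/
CURRENT_SCHEMA only changes how unqualified names are resolved; it grants no privileges, so APP_USER still needs the grants mentioned above.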
Gary is correct that you shouldn't use DEV as a schema on your own machine. When using Oracle, we typically set up the schema with the name the client is going to use for theirs. That alone, however, does not fix your issue. What you need to do is create a global alias in Oracle that maps, say, DEV to CLIENTSCHEMA. You should still rename the schema on your machine, but this will allow your schema name to differ from your client's.
