I have a PL/SQL program which runs a query against an AS400 database through Transparent Gateway. Sometimes the AS400 does not respond to the query (possibly because of network problems) and the PL/SQL program hangs.
Is there any way to set a timeout on the Oracle query so that an exception is raised once a certain amount of time has passed?
Have you tried setting the HS_FDS_CONNECT_PROPERTIES parameter in the AS400 Transparent Gateway initialisation file?
For a timeout of 2 minutes:
HS_FDS_CONNECT_PROPERTIES="timeout='120'"
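If the remote side does abort the call, the error surfaces on the Oracle side and can be trapped in PL/SQL. A minimal sketch, assuming a database link named as400_link and that the gateway reports the failure as an ORA-28500 heterogeneous services error (the exact code can vary by gateway and driver, so check what your configuration actually raises):
DECLARE
  e_gateway_error EXCEPTION;
  PRAGMA EXCEPTION_INIT(e_gateway_error, -28500);  -- assumed HS/gateway error code
  v_count NUMBER;
BEGIN
  SELECT COUNT(*) INTO v_count FROM some_as400_table@as400_link;  -- hypothetical table and link
EXCEPTION
  WHEN e_gateway_error THEN
    RAISE_APPLICATION_ERROR(-20001, 'AS400 query failed or timed out: ' || SQLERRM);
END;
/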
Another more general option for setting a query timeout is to create a profile and assign it to the user running your query.
A resource profile can be used to set limits on all sorts of usage in any particular session - one resource limit available is connection time.
For example, you could create a profile as400_tg_profile and assign it a maximum connection time of 2 minutes:
create profile as400_tg_profile limit connect_time 2;
... then you could assign this profile to the user running the query:
alter user as400_tg_user profile as400_tg_profile;
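Note that kernel resource limits such as CONNECT_TIME are only enforced while the RESOURCE_LIMIT initialisation parameter is TRUE, so the full setup looks roughly like this (same illustrative names as above):
ALTER SYSTEM SET RESOURCE_LIMIT = TRUE;                 -- enable enforcement of profile resource limits
CREATE PROFILE as400_tg_profile LIMIT CONNECT_TIME 2;   -- minutes
ALTER USER as400_tg_user PROFILE as400_tg_profile;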
There are lots of options on creating a profile and there are many ways to assign a profile to a particular user so you should read through the documentation.
You could also look into Oracle Resource Manager, creating resource consumer groups and resource plans, if you need to assign particular resource limits dynamically - this gives you fine-grained control of resources for individual sessions.
The Oracle documentation is really good on this - for starters, give this a read:
http://www.oracle.com/technology/products/manageability/database/pdf/twp03/twp_oracle%20database%2010g%20resource%20manager.pdf
For more detail:
http://download.oracle.com/docs/cd/B19306_01/server.102/b14231/dbrm.htm#ADMIN027
This is one of those bits of functionality that's easier to use in Enterprise Manager, but a quick PL/SQL example is given in:
http://www.dba-oracle.com/job_scheduling/resource_manager.htm
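As a rough idea of what that PL/SQL looks like, here is a hedged sketch (assuming 11g or later, where the special CANCEL_SQL switch group is available; all plan, group, and user names are illustrative) that cancels any call in the gateway group running longer than 120 seconds:
BEGIN
  DBMS_RESOURCE_MANAGER.CREATE_PENDING_AREA();
  DBMS_RESOURCE_MANAGER.CREATE_CONSUMER_GROUP('AS400_TG_GROUP', 'Sessions querying the AS400 gateway');
  DBMS_RESOURCE_MANAGER.CREATE_PLAN('AS400_TG_PLAN', 'Cancel long-running gateway calls');
  DBMS_RESOURCE_MANAGER.CREATE_PLAN_DIRECTIVE(
    plan             => 'AS400_TG_PLAN',
    group_or_subplan => 'AS400_TG_GROUP',
    comment          => 'Cancel any call running longer than 120 seconds',
    switch_group     => 'CANCEL_SQL',
    switch_time      => 120,
    switch_for_call  => TRUE);
  DBMS_RESOURCE_MANAGER.CREATE_PLAN_DIRECTIVE(
    plan             => 'AS400_TG_PLAN',
    group_or_subplan => 'OTHER_GROUPS',
    comment          => 'Mandatory catch-all directive');
  DBMS_RESOURCE_MANAGER.VALIDATE_PENDING_AREA();
  DBMS_RESOURCE_MANAGER.SUBMIT_PENDING_AREA();
END;
/
-- Map the user to the group, allow the switch, and activate the plan:
BEGIN
  DBMS_RESOURCE_MANAGER.CREATE_PENDING_AREA();
  DBMS_RESOURCE_MANAGER.SET_CONSUMER_GROUP_MAPPING(
    attribute      => DBMS_RESOURCE_MANAGER.ORACLE_USER,
    value          => 'AS400_TG_USER',
    consumer_group => 'AS400_TG_GROUP');
  DBMS_RESOURCE_MANAGER.SUBMIT_PENDING_AREA();
  DBMS_RESOURCE_MANAGER_PRIVS.GRANT_SWITCH_CONSUMER_GROUP('AS400_TG_USER', 'AS400_TG_GROUP', FALSE);
END;
/
ALTER SYSTEM SET RESOURCE_MANAGER_PLAN = 'AS400_TG_PLAN';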
I would like to create an Oracle DB user and disable him exactly 8 hours later.
I don't care whether the user just gets locked or all of his roles are revoked; I simply want
to prevent him from doing any activity on the DB exactly 8 hours after his DB user was created.
Does Oracle provide such an option out of the box?
If not, I might go with the following solution:
create a table where all newly created DB users are stored (with DB user creation time)
create a trigger on Create user, so I save DB username and his creation time in my table
create a function / job that checks my table every 5 minutes for any user older than 8 hours and locks him
My proposed solution is very nasty so I really hope there's a better solution for my issue.
How about creating a profile,
which is a set of limits on database resources? If you assign the profile to a user, then that user cannot exceed these limits.
Especially check the following parameters:
CONNECT_TIME: Specify the total elapsed time limit for a session, expressed in minutes
PASSWORD_LIFE_TIME: Specify the number of days the same password can be used for authentication. If you also set a value for PASSWORD_GRACE_TIME, then the password expires if it is not changed within the grace period, and further connections are rejected. If you omit this clause, then the default is 180 days
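A minimal sketch of that approach; the profile and user names are illustrative, fractional days for the password parameters should be accepted but verify on your version, and note that an expired password blocks new logins rather than killing sessions that are already open:
CREATE PROFILE temp_user_profile LIMIT
  PASSWORD_LIFE_TIME  8/24   -- the password expires 8 hours after it is set
  PASSWORD_GRACE_TIME 0;     -- no grace period; if 0 is rejected, use a small fraction such as 1/1440

CREATE USER temp_user IDENTIFIED BY "ChangeMe_1" PROFILE temp_user_profile;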
I am trying to use a ClickHouse settings profile.
I added this profile to users.xml:
<max_result_rows>3000</max_result_rows>
and I created a user:
CREATE USER web IDENTIFIED WITH no_password SETTINGS PROFILE 'web'
and gave him SELECT permission on a table I created.
When I run a SELECT I get 5000 rows, not the 3000-row limit. Why is that?
What am I doing wrong?
You have to use LIMIT in your query as well.
https://clickhouse.tech/docs/en/operations/settings/query-complexity/#result-overflow-mode
Using ‘break’ is similar to using LIMIT. Break interrupts execution only at the block level. This means that the number of returned rows is greater than max_result_rows, is a multiple of max_block_size, and depends on max_threads.
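A minimal sketch of both options (the table name is illustrative); the SETTINGS clause applies the limit per query, and with the default result_overflow_mode = 'throw' the query fails instead of silently returning extra rows:
-- cap the result explicitly
SELECT * FROM my_table LIMIT 3000;

-- or apply the complexity limit per query
SELECT * FROM my_table
SETTINGS max_result_rows = 3000, result_overflow_mode = 'break';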
I have a simple web app UI which stores certain dataset parameters (for simplicity, assume they are all data tables in a single Redshift database, but the schema/table name can vary, and the Redshift instance is in AWS). Tableau is installed on an EC2 instance in the same AWS account.
I am trying to determine an automated way of passing 'parameters' as a data source (i.e. within the connection string inside Tableau on EC2/AWS) rather than manually creating data source connections and inputting the various customer requests.
The flow for the user would be say 50 users select various parameters on the UI (for simplicity suppose the parameters are stored as a JSON file in AWS) -> parameters are sent to Tableau and data sources created -> connection is established within Tableau without the customer 'seeing' anything in the back end -> customer is able to play with the data in Tableau and create tables and charts accordingly.
How may I do this, at least through a batch job or a CloudFormation setup? A "hacky" solution is fine.
Bonus: if the above is doable in real-time across multiple users that would be awesome.
** I am open to using other dashboard UI tools which solve this problem e.g. QuickSight **
After installing Tableau on EC2, I am having trouble finding an article or documentation on how to pass parameters into the connection string itself, or even how to parameterise it manually.
An example could be customer1 selects "public_schema.dataset_currentdata" and "public_schema.dataset_yesterday" and another customer selects "other_schema.dataset_currentdata", all of which exist in a single database.
3 data sources should be generated (one for each above), but only the data sources selected should be open to the customer that selected them, i.e. customer2 should only see the connection for other_schema.dataset_currentdata.
One hack I was thinking of is to spin up a CloudFormation template with Tableau installed for a customer when they make a request, create the connection accordingly, and when they are done just delete the CloudFormation stack. I am mainly unsure how I would get the connection established, though, i.e. pass in the parameters. I am also not sure spinning up 50 EC2s is wise. :D
An issue I have seen so far is that creating a manual extract limits the number of rows, so I think I need a live connection per customer request; hence I am trying to get around this issue.
You can do this with a combination of a basic embed and applying filters. This would load the Tableau workbook. Then you would apply a filter based on whatever values your user selects from the JSON.
The final missing part is that you would use a parameter instead of a filter and pass those values to the database via Initial SQL.
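As a rough illustration of the Initial SQL idea on the Redshift connection, a hedged sketch: the admin.customer_datasets mapping table and its columns are hypothetical (something the web app would maintain), and [TableauServerUser] is the built-in Initial SQL parameter Tableau substitutes with the signed-in user:
-- Hypothetical Initial SQL, run each time the connection is opened.
CREATE TEMP TABLE allowed_datasets AS
SELECT schema_name, table_name
FROM   admin.customer_datasets                  -- hypothetical mapping maintained by the web app
WHERE  customer_login = [TableauServerUser];
The workbook's custom SQL or views can then join against allowed_datasets so each customer only sees the datasets they picked in the UI.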
I want to prevent users from logging in to Oracle BI 12c with the same "username" more than once.
I have also checked many documents and seen parameters like "Max Session Limit", but they did not solve my problem.
Thanks for your guidance on any solution.
Just as a wrap-up. OBIEE is an analytical platform and you have to think about connections in a different way. As cdb_dba said:
1.) take a step back
2.) think about what you want to do
3.) learn and comprehend how the tool works and does things
4.) decide on how you implement and control things by matching #2 and #3
You can configure this using Database Resource Manager, or by creating a customized profile for the group of users you want to limit sessions for.
Oracle's documentation on profiles can be found at the following link. You want to define the SESSIONS_PER_USER parameter as 1. https://docs.oracle.com/database/121/SQLRF/statements_6012.htm#SQLRF01310
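A minimal sketch (the user name is illustrative; remember that profile resource limits are only enforced while the RESOURCE_LIMIT parameter is TRUE):
ALTER SYSTEM SET RESOURCE_LIMIT = TRUE;
CREATE PROFILE one_session_profile LIMIT SESSIONS_PER_USER 1;
ALTER USER obiee_connection_user PROFILE one_session_profile;   -- hypothetical OBIEE connection user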
Edit based on the additional Requirements:
After giving it some thought, I'm not sure you could do something like this at the profile level. You'll probably have to do something like creating a logon trigger that checks the v$session view, which has SCHEMANAME, OSUSER, and MACHINE columns. Since your users share the same schema, you may be able to create a trigger that throws an error like "ERROR: Only one connection per user/machine" based on either the MACHINE or the OSUSER column in v$session. This is less than ideal for a number of reasons, and your developers will probably hate you, but if you absolutely need to do something like this, it is possible.
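A hedged sketch of that trigger idea; the shared schema name is illustrative, the trigger owner needs SELECT privilege on V_$SESSION, and you should test it carefully before relying on it:
CREATE OR REPLACE TRIGGER one_session_per_osuser
AFTER LOGON ON DATABASE
DECLARE
  v_sessions PLS_INTEGER;
BEGIN
  IF SYS_CONTEXT('USERENV', 'SESSION_USER') = 'SHARED_APP_USER' THEN   -- hypothetical shared schema
    SELECT COUNT(*)
    INTO   v_sessions
    FROM   v$session
    WHERE  username = 'SHARED_APP_USER'
    AND    osuser   = SYS_CONTEXT('USERENV', 'OS_USER');
    -- the current session is already counted, so more than one row means a duplicate login
    IF v_sessions > 1 THEN
      RAISE_APPLICATION_ERROR(-20001, 'ERROR: Only one connection per user/machine');
    END IF;
  END IF;
END;
/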
Is there a way in PowerCenter 9.1 to get the number of inserts, deletes and updates after an execution of a session? I can see the data on the log but I would like to see it in a more ordered fashion in a table.
The only way I know requires building the mapping appropriately. You need to have 3 separate instances of the target and use a router to redirect the rows to either TARGET_insert or TARGET_update or TARGET_delete. Workflow Monitor will then show a separate row for the inserted, updated and deleted rows.
There are a few ways:
1. You can use $tgtsuccessrows / $TgtFailedRows and assign it to a workflow variable.
2. An Expression transformation can be used with a variable port to keep track of inserts/updates/deletes.
3. You can even query OPB_SESSLOG in a second stream to get the row count inside the same session (see the sketch below).
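For option 3, something along these lines could serve as a starting point. This is only a sketch: it assumes the repository MX view REP_SESS_TBL_LOG is available, and the view and column names vary between repository versions, so check your own repository before using it:
SELECT subject_area,
       session_name,
       table_name,
       successful_rows,                     -- assumed column for rows written to the target
       failed_rows
FROM   rep_sess_tbl_log                     -- assumed MX view in the PowerCenter repository
WHERE  session_name = 'S_M_MY_SESSION'      -- hypothetical session name
ORDER  BY end_time DESC;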
Not sure if PowerCenter 9.1 offers a solution to this problem.
You can design your mapping to populate an audit table to track the number of inserts/updates/deletes.
You can download a sample implementation from Informatica Marketplace block titled "PC Mapping : Custom Audit Table"
https://community.informatica.com/solutions/mapping_custom_audit_table
There are multiple ways. For example, you can create an Assignment task and attach it just after your session. Once the session completes its run, the Assignment task passes the session statistics (such as $session.status and $session.rowcount) to workflow variables defined at the workflow level. Then create a worklet containing a mapping, pass the session statistics captured at the workflow level to the worklet and from the worklet to the mapping. Once the statistics are available at the mapping level, read them (using a SQL or Expression transformation) and write them to the AUDIT table. Attach the combination of Assignment task and worklet after each session, and it will capture the statistics of each session after the session completes its run.