I am trying to use a ClickHouse settings profile. I added this profile to users.xml:
<max_result_rows>3000</max_result_rows>
and I created a user:
CREATE USER web IDENTIFIED WITH no_password SETTINGS PROFILE 'web'
and granted it SELECT permission on a table I created.
When I run a SELECT, I get 5000 rows instead of the 3000-row limit. Why is that? What am I doing wrong?
You have to use LIMIT in your query as well.
https://clickhouse.tech/docs/en/operations/settings/query-complexity/#result-overflow-mode
Using ‘break’ is similar to using LIMIT: break interrupts execution only at the block level. This means that the number of returned rows can be greater than max_result_rows; it is a multiple of max_block_size and depends on max_threads.
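To illustrate, here is a hedged sketch of the two behaviours described above. The setting names (max_result_rows, result_overflow_mode, max_block_size) are real ClickHouse settings; the table name my_table is made up:

```sql
-- With result_overflow_mode = 'break', ClickHouse stops at block boundaries,
-- so the result set can exceed max_result_rows (rounded up toward max_block_size).
SELECT *
FROM my_table   -- hypothetical table
SETTINGS max_result_rows = 3000,
         result_overflow_mode = 'break';

-- To get exactly 3000 rows, add LIMIT to the query as well:
SELECT *
FROM my_table
LIMIT 3000;
```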
Use case:
We have an application where multiple users can be logged in simultaneously on a single database, or multiple users on multiple databases. The relationship can be 1:M, M:M, or M:1. The issue is that we have very rigorous business logic which authenticates these users before letting them log in. Each user has a token of its own, and generates its own session accordingly. I cannot fake users, as the app under test will not let them log in.
I can put up a load test using some authentic users that are already present in a single database, generate load using HTTP threads / VM users from different machines, and make the session count go up periodically.
How do I handle this specific condition: testing for 5x, i.e. 150K concurrent users and 250K sessions/min? I cannot have that many databases present to give me a window of 150K concurrent users. Please advise.
If the app allows concurrent logins, one option is to have X pairs of credentials in a CSV file and a CSV Data Set Config set up to read them. By default, each thread reads the next line on its next iteration, so if the app won't kick out the previously logged-in user, this can be a viable solution.
Another option is to log in once and save the token/session into a JMeter property. JMeter properties are global and can be read by multiple virtual users; this can be done using e.g. the __setProperty() function.
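A hedged sketch of that property approach (the property name authToken and the extractor variable token are made-up names; __setProperty and __P are the standard JMeter functions):

```
# In the thread that performs the login, after a Post-Processor has
# extracted the token into the variable ${token}, publish it globally:
${__setProperty(authToken, ${token},)}

# In any other thread group, read the shared value back:
${__P(authToken,)}
```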
The best solution would be to generate as many test users in the system as you need, because each JMeter thread should represent a real user using a real browser as closely as possible.
Generating sessions is not an issue, but you cannot have data for 150K concurrent users. Plus there are open questions, such as: do we target multiple databases or a single one, since our target app has a provision for both? I need an answer from someone who has executed a use case like this one. I could set user IDs, passwords, and other required information in a CSV file and later read the data from it for each user, but the question is how many users; I cannot create 150K users and then set each one up for each iteration. Is there a way around it?
In the documentation for Autonomous Database, the parameters that can be modified are listed here:
https://docs.oracle.com/en/cloud/paas/autonomous-database/adbsa/appendix-restrictions-database-initialization-parameters.html#GUID-7CF648C1-0822-4602-8ED1-6F5719D6779E
However, I cannot find a list of the parameters which cannot be modified. For example, what is the limit on the number of sessions allowed on an ADB instance? Does it depend on the number of OCPUs, and does it increase with autoscaling?
The number of sessions is 300 per OCPU, and it is the base OCPU count that matters: enabling autoscaling does not increase the number of sessions. For example, an instance provisioned with 4 OCPUs allows 1,200 sessions, even while autoscaling bursts it to more OCPUs.
The following query from v$parameter will list the names and default values for each parameter:
select name, value from v$parameter order by name;
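To check the session-related limits specifically, you can filter the same view ('sessions' and 'processes' are real parameter names, though visibility of v$parameter depends on your privileges):

```sql
select name, value
from   v$parameter
where  name in ('sessions', 'processes');
```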
For sessions, you can find more details here:
Predefined Database Service Names for Autonomous Transaction Processing
300 sessions per OCPU. One can scale up to a maximum of 128 OCPUs in an ADB instance as of today.
I want to prevent users from logging in to Oracle BI 12c with the same username more than once.
I also checked many documents and saw a parameter like "Max Session Limit", but it did not solve my problem.
Thanks for your guidance toward any solution.
Just as a wrap-up: OBIEE is an analytical platform, so you have to think about connections in a different way. As cdb_dba said:
1.) take a step back
2.) think about what you want to do
3.) learn and comprehend how the tool works and does things
4.) decide on how you implement and control things by matching #2 and #3
You can configure this using Database Resource Manager, or by creating a customized profile for the group of users you want to limit sessions for.
Oracle's documentation on profiles can be found at the following link. You want to define the SESSIONS_PER_USER parameter as 1. https://docs.oracle.com/database/121/SQLRF/statements_6012.htm#SQLRF01310
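A minimal sketch of that approach (the profile and user names are made up; SESSIONS_PER_USER and RESOURCE_LIMIT are the documented Oracle parameters):

```sql
-- Limit each user to a single concurrent session
CREATE PROFILE one_session_profile LIMIT
  SESSIONS_PER_USER 1;

-- Assign the profile to the user you want to restrict (hypothetical name)
ALTER USER obiee_user PROFILE one_session_profile;

-- Profile resource limits are only enforced when RESOURCE_LIMIT is enabled
ALTER SYSTEM SET RESOURCE_LIMIT = TRUE;
```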
Edit based on the additional Requirements:
After giving it some thought, I'm not sure you can do something like this at the profile level. You'll probably have to create a logon trigger based on the v$session view, which has SCHEMANAME, OSUSER, and MACHINE columns. Since your users share the same schema, you may be able to create a trigger that throws an error like "ERROR: Only one connection per user/machine" based on either the MACHINE or the OSUSER column. This is less than ideal for a number of reasons, and your developers will probably hate you, but if you absolutely need to do something like this, it is possible.
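A hedged sketch of that trigger idea, under a few assumptions: the trigger name is made up, the trigger owner needs SELECT privilege on v_$session, and counting in v$session is racy under truly concurrent logins, so treat this as a starting point rather than a robust enforcement mechanism:

```sql
CREATE OR REPLACE TRIGGER limit_machine_logon
AFTER LOGON ON DATABASE
DECLARE
  l_count NUMBER;
BEGIN
  -- Count sessions from the same OS user and client machine.
  -- The new session is already registered in v$session at this point,
  -- so a count greater than 1 means a duplicate connection.
  SELECT COUNT(*)
  INTO   l_count
  FROM   v$session
  WHERE  osuser  = SYS_CONTEXT('USERENV', 'OS_USER')
  AND    machine = SYS_CONTEXT('USERENV', 'HOST');

  IF l_count > 1 THEN
    RAISE_APPLICATION_ERROR(-20001, 'Only one connection per user/machine');
  END IF;
END;
/
```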
Is there a way in PowerCenter 9.1 to get the number of inserts, deletes, and updates after a session execution? I can see the data in the log, but I would like to see it in a more ordered fashion, in a table.
The only way I know requires building the mapping appropriately. You need three separate instances of the target and a Router transformation that redirects rows to TARGET_insert, TARGET_update, or TARGET_delete. Workflow Monitor will then show a separate row count for the inserted, updated, and deleted rows.
There are a few ways:
1. You can use $TgtSuccessRows / $TgtFailedRows and assign them to workflow variables.
2. An Expression transformation can be used with a variable port to keep track of inserts/updates/deletes.
3. You can even query OPB_SESSLOG in a second stream to get the row count inside the same session.
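For option 3, a hedged sketch against the repository (REP_SESS_LOG is the documented repository view; the column names below are assumptions that vary by version, so check your repository schema before relying on them):

```sql
-- Row counts for runs of a given session, as recorded in the
-- PowerCenter repository view (column names assumed)
SELECT subject_area,
       session_name,
       successful_rows,
       failed_rows
FROM   rep_sess_log
WHERE  session_name = 's_my_session'   -- hypothetical session name
ORDER  BY session_timestamp DESC;
```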
I'm not sure if PowerCenter 9.1 offers a built-in solution to this problem, but you can design your mapping to populate an audit table that tracks the number of inserts/updates/deletes.
You can download a sample implementation from the Informatica Marketplace block titled "PC Mapping : Custom Audit Table":
https://community.informatica.com/solutions/mapping_custom_audit_table
There are multiple ways. For example, you can create an Assignment task and attach it just after your session. Once the session completes its run, the Assignment task passes the session statistics (e.g. $session.status, $session.rowcount) to workflow variables defined at the workflow level. Then create a worklet containing a mapping, pass the session stats captured at the workflow level to the worklet, and from the worklet to the mapping. Once the stats are available at the mapping level, read them (using a SQL or Expression transformation) and write them to the audit table. Attach this combination of Assignment task and worklet after each session, and it will capture each session's stats after it completes its run.
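An audit table for the approaches above could look something like this (a made-up minimal schema, Oracle syntax assumed; adjust names and types to your environment):

```sql
CREATE TABLE wf_audit_log (
  workflow_name   VARCHAR2(100),
  session_name    VARCHAR2(100),
  session_status  VARCHAR2(30),
  row_count       NUMBER,
  load_date       DATE DEFAULT SYSDATE   -- when the stats were captured
);
```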
I have a PL/SQL program which queries an AS400 database through Transparent Gateway. Sometimes the AS400 does not respond to the query (maybe network problems) and the PL/SQL program hangs.
Is there any method to set a timeout on the Oracle query, so that an exception is raised when a certain amount of time passes?
Have you tried setting the HS_FDS_CONNECT_PROPERTIES parameter in the AS400 Transparent Gateway initialisation file?
For a timeout of 2 minutes:
HS_FDS_CONNECT_PROPERTIES="timeout='120'"
Another more general option for setting a query timeout is to create a profile and assign it to the user running your query.
A resource profile can be used to set limits on all sorts of usage in any particular session - one resource limit available is connection time.
For example, you could create a profile as400_tg_profile and assign it a maximum connection time of 2 minutes:
create profile as400_tg_profile limit connect_time 2;
... then you could assign this profile to the user running the query:
alter user as400_tg_user profile as400_tg_profile;
There are lots of options on creating a profile and there are many ways to assign a profile to a particular user so you should read through the documentation.
You could also look into Oracle Resource Manager, creating resource consumer groups and resource plans, if you need to dynamically assign particular resource limits; this gives you fine-grained control of resources for individual sessions.
The Oracle documentation is really good on this - for starters, give this a read:
http://www.oracle.com/technology/products/manageability/database/pdf/twp03/twp_oracle%20database%2010g%20resource%20manager.pdf
For more detail:
http://download.oracle.com/docs/cd/B19306_01/server.102/b14231/dbrm.htm#ADMIN027
This is one of those bits of functionality that's easier to use in Enterprise Manager, but a quick PL/SQL example is given in:
http://www.dba-oracle.com/job_scheduling/resource_manager.htm