By default, concurrent manager jobs run under the APPS user. Is it possible to configure them to run as another user? I'd like to be able to isolate some of my stored procedures.
I'm not aware of any standard way, but you could code your concurrent request to handle it. For example: make a copy of fnd_concurrent_requests, then in the "real" concurrent program insert into your custom_fnd_concurrent_requests with "select * from fnd_concurrent_requests". Create a dbms_job (or something similar) to poll your custom table and have it call your "real" program. Make sure you call fnd_global.initialize with the user, responsibility, and application from fnd_concurrent_requests, and direct output to your own files using utl_file rather than fnd_file.
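A rough PL/SQL sketch of that polling approach (the custom table, the processed_flag column, and my_real_program are made-up names for illustration; fnd_global.apps_initialize is the commonly used wrapper for setting the user/resp/appl context, and the utl_file output handling is omitted for brevity). Note that a dbms_job runs with the privileges of the schema that submitted it, which is what gives you the isolation:

create or replace procedure poll_custom_requests is
begin
  for r in (select request_id, requested_by, responsibility_id,
                   responsibility_application_id
              from custom_fnd_concurrent_requests
             where processed_flag = 'N') loop
    -- take on the user/resp/appl captured from fnd_concurrent_requests
    fnd_global.apps_initialize(r.requested_by,
                               r.responsibility_id,
                               r.responsibility_application_id);
    my_real_program(r.request_id);  -- your "real" program
    update custom_fnd_concurrent_requests
       set processed_flag = 'Y'
     where request_id = r.request_id;
  end loop;
  commit;
end;
/

-- submit from your custom schema so the job runs as that database user;
-- this polls once a minute
declare
  l_job binary_integer;
begin
  dbms_job.submit(l_job, 'poll_custom_requests;', sysdate, 'sysdate + 1/1440');
  commit;
end;
/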
I want to use dynamic databases at runtime without affecting config/database.php, because of concurrent users.
I have a main database with a table that contains references to several other databases. At runtime I need to not only connect to those databases but possibly also run migrations on them.
I am aware that this is possible by having a second connection entry under connections in config/database.php, but I have a feeling that if two users hit the server at the same time, physical config file changes may create a conflict.
I also read (and experimented) that you can edit the second connection at runtime using the code below:
\Config::set('database.connections.mysql2.database', 'somedynamicdb'); // repoint mysql2
DB::purge('mysql2'); // drop the cached connection so the new database is used
But I fear that if it persists the change across different users, it may conflict for concurrent users. And if it does not persist the change, then it won't work for migrations.
I want to understand/know two things specifically:
What is the scope of the Config::set() call above? Does it persist across different users' requests to the server?
If I call migrations using Artisan::call('migrate') with a --database=connectionname option, right after changing the database name in connectionname, will that use the dynamically set database or the value in the physical config file?
UPDATE
Also worth noting that a call to Artisan::call('migrate') with --database=connectionname will make the new connection persist for the rest of your application call.
See here for details:
https://github.com/laravel/framework/issues/28253
Config::set will only apply for the request for which it was set, won't apply to any other requests, and will not persist beyond the request. If you're not processing a request (e.g. a CLI command) then it won't affect anything beyond the current PHP process.
As for Item #2, if you're invoking from the command line, you can just do DB_CONNECTION=connectionname php artisan migrate. If you need to invoke the artisan command from code, using Config::set is still the right way to go.
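To make that concrete, a minimal sketch (assuming a mysql2 connection is already declared in config/database.php; somedynamicdb is a placeholder name):

use Illuminate\Support\Facades\Artisan;
use Illuminate\Support\Facades\Config;
use Illuminate\Support\Facades\DB;

// repoint the connection for this request/process only
Config::set('database.connections.mysql2.database', 'somedynamicdb');
DB::purge('mysql2'); // drop any cached connection so the new name is picked up

Artisan::call('migrate', [
    '--database' => 'mysql2', // uses the runtime-modified config
    '--force'    => true,     // skip the confirmation prompt in production
]);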
We use connections created on the fly here all the time and it works very well. We set this up in a middleware that we include after authentication, so it is only valid for the current user's request, based on their login information.
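A hedged sketch of that kind of middleware (the db_name attribute on the user is an assumption; adapt it to wherever you store the per-user database name):

use Closure;
use Illuminate\Support\Facades\Config;
use Illuminate\Support\Facades\DB;

class SetTenantDatabase
{
    public function handle($request, Closure $next)
    {
        // point the mysql2 connection at the authenticated user's database
        Config::set('database.connections.mysql2.database', $request->user()->db_name);
        DB::purge('mysql2'); // forget any connection cached earlier in this process

        return $next($request);
    }
}

Since Config::set only lives for the current request, two concurrent users each get their own connection without touching the physical config file.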
I have a side project I'm working on currently that requires me to copy a .csv file from a remote FTP server and save it locally. I figured I would use DBMS_SCHEDULER.GET_FILE, but I do not have permission. When I asked my manager, he said that I won't be able to get privileges to do this and should look for other ways.
After researching for a couple of days I keep coming back to DBMS_SCHEDULER. Am I out of luck, or are my searching skills terrible?
Thanks
I'm not certain you'd want to use DBMS_SCHEDULER for this; from what I understand from the documentation (I've never used it myself), the FTP site would have to be completely open to all. There is a parameter, destination_permissions, but it's only "Reserved for future use", i.e. there's currently no way of specifying any permissions.
If I'm right about this, then I agree with your manager, though not necessarily for the same reasons (it sounds like you'll never get permission to use DBMS_SCHEDULER, which I hope is incorrect).
There are other methods of doing this:
UTL_TCP; this is simply a method of interacting over the TCP/IP protocol. Oracle Base has an article which includes an FTP package based on UTL_TCP and instructions on how to use it. This also requires the UTL_FILE package, which can write OS files.
UTL_HTTP; I'm 99% certain it's possible to connect to an FTP server using this; it's certainly possible to connect to an SFTP/any server. It'll require a little more work, but it would be worth it in the longer run. It would also require the use of UTL_FILE.
A Java stored procedure to FTP directly; this is probably the best approach. Create one using one of the many Java FTP libraries (a rough sketch follows this list).
A Java stored procedure to call OS commands; this is the easiest method but the least extensible. Oracle released a white paper on calling OS commands from within PL/SQL back in 2008, but there's plenty of other material out there (including Oracle Base again).
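As a hedged illustration of the Java stored procedure route (the host, credentials and paths are placeholders; this leans on the JDK's built-in ftp:// URL handler rather than a separate FTP library, and depending on your database version the embedded JVM may require older Java syntax; you would also need the relevant java.io/java.net permissions granted via dbms_java.grant_permission):

// Load into the database with loadjava or CREATE JAVA, then expose it with
// a call spec such as:
//   create procedure fetch_file(p_url varchar2, p_path varchar2)
//   as language java name 'FileFetcher.fetch(java.lang.String, java.lang.String)';
import java.io.FileOutputStream;
import java.io.InputStream;
import java.net.URL;

public class FileFetcher {
    public static void fetch(String ftpUrl, String localPath) throws Exception {
        // e.g. ftpUrl = "ftp://user:password@remotehost/dir/data.csv"
        InputStream in = new URL(ftpUrl).openStream();
        FileOutputStream out = new FileOutputStream(localPath);
        try {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) > 0) {
                out.write(buf, 0, n);
            }
        } finally {
            in.close();
            out.close();
        }
    }
}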
Lastly, you could question whether this is actually what you want to do...
What scheduler do you use? Does it have event-driven scheduling? If so, there's no need to FTP from within Oracle; use UTL_FILE to write a file to the OS and then use OS commands from there.
Was the file originally in a database? If that's the case, you don't need to extract it: you could use DBMS_FILE_TRANSFER to collect it straight from the remote database, or even create a JDBC connection or (more simply) a database link to SELECT the data directly.
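For the database-to-database case, a minimal DBMS_FILE_TRANSFER sketch (the directory object names and the database link are placeholders; also worth checking the documented restriction that the transferred file's size must be a multiple of 512 bytes):

begin
  dbms_file_transfer.get_file(
    source_directory_object      => 'SRC_DIR',   -- directory object on the remote db
    source_file_name             => 'data.csv',
    source_database              => 'REMOTE_DB', -- database link to the source
    destination_directory_object => 'DEST_DIR',  -- local directory object
    destination_file_name        => 'data.csv');
end;
/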
I need to do a load test on an application which is built on the ZK framework.
I recorded a script which performs the below actions:
a. User Login
b. Select Role
c. Open and Create Record
d. Log out.
When I run the script with multiple users, say 10, the script creates 10 records in the application.
But after some random duration, say 4-5 hours later, the same script does not create any records, even though all requests are shown as passed. The script also records COMET requests (AJAX push).
I am not able to figure out the reason.
This is a common ZK-plus-JMeter pitfall: ZK assigns fresh desktop and component ids for every new session, so a recorded script that replays hardcoded ids eventually stops matching anything on the server, even though the HTTP requests themselves still come back as passed. Read these, which explain how the ids work:
http://books.zkoss.org/index.php?title=Small_Talks/2012/January/Execute_a_Loading_or_Performance_Test_on_ZK_using_JMeter
http://blog.zkoss.org/index.php/2013/08/06/zk-jmeter-plugin/
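The usual fix, as those links describe, is to stop replaying the recorded ids and extract them at runtime instead. A hedged sketch of the idea (the exact expression depends on how the desktop id appears in your ZK version's responses, so treat the pattern below as a placeholder to adapt):

Add a Regular Expression Extractor to the sampler that loads the page:
  Reference Name:      dtid
  Regular Expression:  dt:'([^']+)'   (hypothetical; inspect the response to find the desktop id)
  Template:            $1$

Then replace the hardcoded desktop id in every subsequent /zkau request with ${dtid}, and do the same for any recorded component uuids.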
I have a web service that calls some stored procedures on an AS400 via JTOpen.
What I would like is for the connections used to call the stored procedures to be opened in a specific subsystem under a specific user, instead of the default QUSRWRK/QUSER.
I think I may be able to clone the QUSRWRK subsystem to make it start with a specific user, but what I cannot figure out is the mechanism to open the connection in the specific subsystem.
I guess there should be a property at connection level to say subsystem=MySubsystem.
But unfortunately I haven't found that property.
Any hint would be appreciated.
Flavio
Let the system take care of which subsystem the database server job is started in.
You should just focus on the application (which is what IBM i excels in).
If need be, you can tweak subsystem parameters for QUSRWRK to improve performance by allocating memory, etc.
The system uses a pool of prestarted jobs as described in the FAQ: When I do WRKACTJOB, why is the host server job running under QUSER instead of the profile specified on the AS400 object?
To improve performance, the host server jobs are prestarted jobs running under QUSER. When the Toolbox connects to a host server job in order to perform an API call, run a command, etc., a request is sent from the Toolbox to an available prestarted job. This request includes the user profile specified on the AS400 object that represents the connection. The host server job receives the request and swaps to the specified user profile before it runs the request.

The host server itself originally runs under the QUSER profile, so output from the WRKACTJOB command will show the job as being owned by QUSER. However, the job is in fact running under the profile specified on the request. To determine what profile is being used for any given host server job, you can do one of three things:
1. Display the job log for that job and find the message indicating which user profile is used as a result of the swap.
2. Work with the job and display job status attributes to view the current user profile.
3. Use Navigator for i to view all of the server jobs, which will list the current user of each job. You can also use Navigator for i to look at the server jobs being used by a particular user.
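To illustrate the profile swap, here is a minimal JTOpen sketch (the system name, credentials and procedure are placeholders). WRKACTJOB will show the server job under QUSER, but everything run on behalf of this connection executes under MYUSER:

import java.sql.Connection;
import java.sql.DriverManager;

public class As400Call {
    public static void main(String[] args) throws Exception {
        DriverManager.registerDriver(new com.ibm.as400.access.AS400JDBCDriver());
        // the prestarted QUSRWRK job swaps to MYUSER before serving requests
        Connection con = DriverManager.getConnection(
                "jdbc:as400://mysystem", "MYUSER", "mypassword");
        con.prepareCall("call MYLIB.MYPROC()").execute();
        con.close();
    }
}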
What is the best way to save data in session variables in a classic ASP web site?
I am maintaining a classic ASP web site and want to allow my users to demo all functionality of the site; this means allowing them to delete records.
The closest examples I have seen so far are the demos of the Telerik controls, where they save the dataset in session variables on first load and allow the user to manipulate the data.
How can I achieve the same in ASP with an MS Access backend?
If you want to persist the state over multiple pages (e.g. to demo your complete application) then it's a bit tricky.
I would suggest copying the MDB file for each session and using the copied version. This would ensure that every session uses its own data.
Create a version of your Access DB which will be used as a fresh template for each user.
On session start, copy the template and name it after the user's session ID.
Use that individual MDB for the session (a sketch follows below).
Note: The only drawback I can see here is that you need to remove the unused MDB files, as they can pile up after a while. You could do that with a scheduled task, or even on session start before you create a new one.
I am not sure what you can use to check whether a copy is still in use, but the file's creation date may help, or the LDB lock file (if it does not exist, the database is not open).
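A minimal classic ASP sketch of those steps (the paths and template name are placeholders):

<%
' Copy the template MDB for this session if it isn't there yet
Dim fso, template, userDb
Set fso = Server.CreateObject("Scripting.FileSystemObject")
template = Server.MapPath("/data/template.mdb")
userDb = Server.MapPath("/tempdb/demo_" & Session.SessionID & ".mdb")
If Not fso.FileExists(userDb) Then fso.CopyFile template, userDb
' Keep the per-session connection string for the rest of the demo
Session("connStr") = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" & userDb
%>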
You can store a connection or even an object in a session variable, as long as you remember what kind of variable you stored when it comes time to retrieve it. I have never stored a dataset in a session variable, but I have stored a lot of arrays in session variables, so you can use the ADO GetRows method to load a complete result set into a session variable.
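For example (a hedged sketch; the Orders table and the Session("connStr") connection string are placeholders):

<%
Dim conn, rs
Set conn = Server.CreateObject("ADODB.Connection")
conn.Open Session("connStr")
Set rs = conn.Execute("SELECT * FROM Orders")
' GetRows returns a 2-D variant array indexed (field, row)
Session("orders") = rs.GetRows()
rs.Close
conn.Close
%>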
How big is the Access database? If your database is small enough (relative to the server capacity, expected number of users, and so forth) then I like the idea of using a fresh copy of the database for each user that runs the demo.
With this approach, you simplify your possible code paths. Otherwise this "are we in demo mode or not?" logic will permeate a heck of a lot of your code.
I'd do it like this...
When the user begins the demo, make a copy of the Access DB for that user to use. If your db is foo.mdb, copy it to /tempdb/foo_1234567890.mdb where 1234567890 is the user's session ID.
Alter the user's connection string to point to the fresh database copy. From this point on, your app can operate like "normal" with no further modifications.
Have a scheduled task that deletes all files in /tempdb with last-modified times more than __ hours in the past. If you don't have the ability to schedule tasks on the server (perhaps you're in a shared hosting environment, etc.) then you could do this at the same time as step #1.
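A hedged sketch of that cleanup (the /tempdb path and the 4-hour threshold are arbitrary example values; run it as a scheduled task or at the top of step #1):

<%
Dim fso, f
Set fso = Server.CreateObject("Scripting.FileSystemObject")
For Each f In fso.GetFolder(Server.MapPath("/tempdb")).Files
    ' delete copies that haven't been touched for a few hours
    If DateDiff("h", f.DateLastModified, Now) > 4 Then f.Delete
Next
%>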