Functions used to connect to Oracle DB - oracle

We are trying to connect to an Oracle DB from LoadRunner. Could you please help us with the details? We want the script to get data from the Oracle DB before starting execution. In our testing we use different applications and protocols: Web HTTP/HTML, Tuxedo, and CRM applications.

You are headed down a path which will cause you harm. Build your data files ahead of time and include them with the scripts. I could tell you how to solve your problem technically, but doing so would be the technical equivalent of instructing you how to use a gun to rob a bank, ultimately a bad set of advice. Your question also exposes a lack of critical skills in client-side architecture and communications. This gap will not serve you well as a performance tester.
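In that spirit, here is a minimal sketch of the recommended approach: pull the test data out of Oracle ahead of time and ship the resulting file with the script as an ordinary parameter file, so nothing touches the database at execution time. This is Java/JDBC; the connection URL, credentials, and ACCOUNTS table are hypothetical placeholders, and it assumes the Oracle JDBC driver is on the classpath.

```java
// Sketch: pre-building a LoadRunner parameter file from Oracle ahead of a test run.
// The URL, credentials, and ACCOUNTS table/columns are hypothetical placeholders.
import java.io.PrintWriter;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ExportParamFile {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:oracle:thin:@dbhost:1521/ORCLPDB1"; // hypothetical host/service
        try (Connection con = DriverManager.getConnection(url, "scott", "tiger");
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT account_id, account_name FROM accounts");
             PrintWriter out = new PrintWriter("accounts.dat")) {
            out.println("account_id,account_name"); // header row for the parameter file
            while (rs.next()) {
                out.println(rs.getString(1) + "," + rs.getString(2));
            }
        }
        // Attach accounts.dat to the script as a parameter file; the script then reads
        // its test data without opening a database connection at run time.
    }
}
```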

Related

POSTing XML from Oracle to WebAPIs

Our company is in the process of creating an ASP.NET service to accept XML data sent from ERP systems such as Oracle. We have no experience (at all) with Oracle, so please excuse the simplicity of this question.
I see online that Oracle has a tool called JDeveloper that can hook up to WCF Services that use a DataContract/WSDL to send/receive data with relative ease.
Can anyone advise about the situation regarding WebAPIs, where no WSDL or DataContracts exist? Is it simple to craft a POST in Oracle to send to a WebAPI, or is the former option better/easier to work with?
Thanks in advance.
It's simple enough to call a web service directly from Oracle:
There is good support for XML/XSLT/XQuery to construct requests and parse responses (XML DB).
Oracle has an API for HTTP/HTTPS requests (the UTL_HTTP package).
So if you decide to call a web service from Oracle, it's possible and relatively simple for both SOAP and REST web services.
You can find example code in this answer on StackOverflow.
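For readers who want the shape of the request without wading into PL/SQL, here is a rough Java equivalent of the POST the UTL_HTTP call would produce. This is a sketch, not the Oracle-side implementation; the endpoint URL and XML payload are hypothetical.

```java
// Sketch: the shape of an XML POST to a WebAPI endpoint, written in Java for
// illustration (the linked answer shows the PL/SQL/UTL_HTTP version).
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class PostXml {
    public static void main(String[] args) throws Exception {
        String xml = "<order><id>42</id></order>"; // hypothetical payload
        HttpURLConnection con = (HttpURLConnection)
                new URL("https://example.com/api/orders").openConnection();
        con.setRequestMethod("POST");
        con.setRequestProperty("Content-Type", "application/xml");
        con.setDoOutput(true);
        try (OutputStream os = con.getOutputStream()) {
            os.write(xml.getBytes("UTF-8"));
        }
        System.out.println("HTTP status: " + con.getResponseCode());
    }
}
```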
Update: answer to a comment
To make it clear, the example above doesn't work at a "database query level" because it's implemented in PL/SQL. The Oracle Database engine natively incorporates support for two different languages:
traditional SQL, an implementation of the ANSI SQL standard (traditionally with some incompatibilities and extensions);
PL/SQL, a procedural programming language tightly integrated with traditional SQL.
These two languages are really different. There are even common questions about performance being affected by context switching between the SQL and PL/SQL engines, mostly caused by improper procedure design.
As a procedural language, PL/SQL can access a rich set of APIs provided by Oracle as built-in packages. Among others, a number of packages relate directly to network communication protocols and standards: UTL_TCP, UTL_URL, UTL_SMTP, UTL_MAIL, UTL_INADDR, UTL_HTTP, HTP, HTF, DBMS_LDAP.
It needs to be said that there is also a set of APIs provided to support publishing PL/SQL code on the web: the OWA_xxxx packages support access through mod_plsql, and Oracle XML DB supports publishing SOAP web services.
If you need to unload data from Oracle to a web service on a schedule, look at the DBMS_SCHEDULER and DBMS_JOB packages to start unload procedures periodically.
Many of these system packages are implemented in Java, and it's possible to write your own Java extensions callable from PL/SQL.
P.S. There is a UTL_DBWS package dedicated to calling SOAP services from Oracle Database, but it seems to produce more problems than it solves, and I can't find a reference to it in the 11g documentation (10g only).
P.P.S. Some statements may be slightly inaccurate or contain exaggerations, but this should be enough to understand the overall picture.

Best way to get database information to a program (Windows and Mac)

I'm using Delphi at the moment and have a client program that connects to another program (a server) which has the MySQL database on it and sends the data back to the client. I have a web server that hosts the server program and the database, but my question is: can I just go straight from the client program I have made (Windows now, Mac in the future) to the MySQL database on the web server, or do I really need the server program? If so, what do I need to do to connect my client program to the MySQL database over the internet?
You should be able to access the MySQL database directly as long as you've created a user/pw combo for the database that allows remote access (security discussion aside). You'll then want to search for a compatible MySQL library to ease the communication between your program and MySQL. At the far technical end you might have to read/write directly to the MySQL socket, but that's possible as well.
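Delphi specifics aside, the direct-connection approach looks much the same in any client language. A minimal sketch in Java/JDBC (host, credentials, schema, and table are hypothetical; MySQL Connector/J is assumed on the classpath, and the MySQL account must permit remote connections):

```java
// Sketch: a client connecting straight to a remote MySQL server, no middleware.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class DirectMySql {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:mysql://webserver.example.com:3306/appdb"; // hypothetical
        try (Connection con = DriverManager.getConnection(url, "appuser", "secret");
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT id, name FROM customers")) {
            while (rs.next()) {
                System.out.println(rs.getInt("id") + " " + rs.getString("name"));
            }
        }
    }
}
```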
Depends on whether your client programs will continue to be native applications or whether you plan to migrate to browser based clients.
If they're native applications you can obtain library components for the languages they're written in which will be able to communicate directly with the MySQL database. There are plenty of options for Delphi; I'm not familiar with what options might be available for native Mac development (but, of course, Embarcadero is in the process of rolling out a Delphi that can generate Mac applications).
If, however, you're planning on making your clients browser-based, ajax solutions want to talk to a web server rather than a database server. In that case, you will need to maintain your middleware. For a discussion of whether it's possible or desirable to have a browser based application communicate directly with a database server see this question.
I would use SOAP/XML for this, and leave the SQL out of the client entirely.
This is a typical use case where REST (for example using JSON encoded database records) can be helpful. It is easy to implement a Delphi client using lkJSON or SuperObject to put the database records from the HTTP response into a TClientDataSet.
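As a rough illustration of that middleware idea, here is a hedged sketch of a REST endpoint serving records as JSON, using only the JDK's built-in HTTP server. A real service would build the JSON from an actual database query and likely use a proper web framework; the hard-coded records below are stand-ins.

```java
// Sketch: a minimal REST middleware endpoint returning database records as JSON.
import com.sun.net.httpserver.HttpServer;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

public class RecordsService {
    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        server.createContext("/records", exchange -> {
            // In a real service this JSON would be built from a database query.
            String json = "[{\"id\":1,\"name\":\"Alice\"},{\"id\":2,\"name\":\"Bob\"}]";
            byte[] body = json.getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().set("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            exchange.getResponseBody().write(body);
            exchange.close();
        });
        server.start(); // the client (e.g. the Delphi app) GETs http://host:8080/records
    }
}
```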
Yes, it's possible, but is it a good idea?
Here's a basic discussion of 2-tier vs. 3-tier architecture.

Performance monitoring all layers of a system

I use several load testing tools (LoadRunner, JMeter, NeoLoad) to performance test different applications. I'm wondering if it is possible to monitor all layers of an application stack. For example, say I have the following data chain:
Loadbalancer <-x-> Application Server <-x-> RMI <-x-> Java Application <-x-> MQ <-x-> Legacy application <-x-> Database
Where I have marked an x in the chain is where I am interested in monitoring, for example average response times.
Obviously we could simply create a wrapper on all endpoints which would gather the statistics for us, and maybe we could import it into LoadRunner or other load testing tools and sideline them with the tools' built-in performance statistics, but maybe there are tools/applications which already do this?
If not, how should we proceed in order to gather these kinds of statistics?
The standard for this was supposed to be Application Response Measurement (ARM). It was a cross language set of APIs that did just what you were looking for. The issue is that the products that implement this spec all tend to be big, expensive "enterprise" level monitoring tools. Think multi-week installs, consultants, more infrastructure and lots of buzzwords.
Still, if this is a mission critical app with a mission critical budget, this may be what you need. But you may be able to build your own that does just enough without too much effort. A quick search turns up at least one open source ARM implementation if you still want to use that API.
Another option is to simply to have transactions you can run against each tier of the system to check general responsiveness. For example you can have a static web page on the LB, a no-op tx on the app server, a "hello" servlet on the Java app, put a message directly on the queue, etc. During a performance / load test, these could be hit directly by the load testing tool or you could write a wrapper servlet / application call that does this as a single HTTP (RMI?) call. Running these a few times a minute won't add too much load to the system, but it should help you pinpoint which tier is slower. The nice thing about this approach is that it also works in production, just watch out for security issues.
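A hedged sketch of such per-tier probes, assuming Java 11's HttpClient and hypothetical health-check URLs on each tier:

```java
// Sketch: per-tier "canary" probes fired during a load test. Each URL points at a
// trivial transaction on one tier (static page on the LB, no-op on the app server,
// hello servlet on the Java app); all URLs here are hypothetical.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class TierProbe {
    public static void main(String[] args) throws Exception {
        String[] tiers = {
            "http://lb.example.com/static.html", // load balancer
            "http://app.example.com/noop",       // application server
            "http://java.example.com/hello"      // Java application
        };
        HttpClient client = HttpClient.newHttpClient();
        for (String url : tiers) {
            long start = System.nanoTime();
            HttpRequest req = HttpRequest.newBuilder(URI.create(url)).GET().build();
            client.send(req, HttpResponse.BodyHandlers.discarding());
            long ms = (System.nanoTime() - start) / 1_000_000;
            System.out.println(url + " -> " + ms + " ms"); // compare tiers to find the slow hop
        }
    }
}
```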
For a single-user kind of test, where you know you have a problem (e.g. this tx is "slow"), I have also had pretty good luck with network tracing. It's very tedious, but when you aren't sure which tier is slow, starting up a network trace on a few machines and running a single tx usually gives a good idea of what the system is doing.
I have handled this decomposition a number of ways in the past. The first is at a very low level, using protocol analyzer dump data to find the time points where a conversation leaves tier X and enters tier Y. The second method is through log examination for the various tiers. Something that can make your examination quite useful in this case is a common log server for all of your components (syslog, Rsyslog, etc.) and a nice log parsing tool, such as the freely available Microsoft Logparser. The third method is utilization of the audit trail for an application stored in the database. You may find this when working on enterprise service bus style applications, which have a consumer/producer model and a bus to pass information rather than a direct connection. The audit trails I have seen are typically stored in a database and allow the tracking of an individual transaction through the entire application infrastructure. Your load balancer, as a network device, may be out of the hunt on this one.
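As a toy illustration of the log-examination route, here is a hedged Java sketch that correlates a transaction id across two tiers' logs and reports the gap between their timestamps. The one-line log format (ISO timestamp, then a tx= token) is invented for the example; real logs need their own parsing.

```java
// Sketch: correlating one transaction across two tiers' logs by a shared id.
// Assumes hypothetical log lines like: 2024-05-01T10:00:00.123Z tx=abc123 ...
// Both hosts must be synchronized to a common time server (see the note below).
import java.nio.file.Files;
import java.nio.file.Paths;
import java.time.Instant;
import java.util.HashMap;
import java.util.Map;

public class TxTimeline {
    public static void main(String[] args) throws Exception {
        Map<String, Instant> appTimes = new HashMap<>();
        for (String line : Files.readAllLines(Paths.get("appserver.log"))) {
            String[] f = line.split(" ");
            appTimes.put(f[1], Instant.parse(f[0])); // tx id -> app-tier timestamp
        }
        for (String line : Files.readAllLines(Paths.get("legacy.log"))) {
            String[] f = line.split(" ");
            Instant appTs = appTimes.get(f[1]);
            if (appTs != null) {
                long gapMs = Instant.parse(f[0]).toEpochMilli() - appTs.toEpochMilli();
                System.out.println(f[1] + " app->legacy: " + gapMs + " ms");
            }
        }
    }
}
```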
Note, if you go the protocol analyzer or log route, then be sure and synchronize all of your source information devices to a common time server. Having one of your collectors (analyzer, app log) off on a time stamp basis can really be a hair pulling experience when you get into the analysis phase.
As to how you move from your collected data into LoadRunner, that part is very mechanical. The Analysis program supports an interface to import external datapoints. The format is very specific and is documented in both help and the online docs. This import process works very well, as I often have to use it for collection of statistics from hosts which I do not have direct monitoring access to, but which need to be included as a part of the monitored test infrastructure.
James Pulley
Moderator (YahooGroups LoadRunner, Advanced-Loadrunner; GoogleGroups lr-LoadRunner; Linkedin LoadRunner, LoadRunnerByTheHour; SQAForums LoadRunner, WinRunner)

Web Hosting, Web Scaling

I have a simple web application to conduct online exams for college students. All questions are multiple choice. Around 5000 users will be taking the exam. My backend is MySQL and the front end is PHP. I want to know the hardware configuration for the servers required to host this application and have it work seamlessly for the required number of users.
I am also looking at cloud solutions. If I choose Amazon EC2 instances, can somebody advise me on what type of EC2 instance I should choose for this application?
It is impossible to tell the exact specs of the servers that will be required to run your setup, because there are too many variables. However, it is definitely a good question: when I was a student at university, a professor tried to do this without testing, and on the exam date the system got overloaded and the exam had to be cancelled!
Start with testing what you already have. You can use something like the ab tool or JMeter. It will simulate the requested load for you automatically, so you can check how your actual server performs, and act accordingly.
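To get a feel for what those tools do, here is a hedged Java sketch of a crude load generator that fires concurrent GETs at one page and prints latencies. It is no substitute for ab or JMeter; the URL and user count are hypothetical.

```java
// Sketch: a crude stand-in for ab/JMeter, useful as a first smoke test.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class MiniLoadTest {
    public static void main(String[] args) throws Exception {
        int users = 100; // scale toward 5000 gradually
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest req = HttpRequest.newBuilder(
                URI.create("http://exam.example.com/exam.php")).GET().build();
        ExecutorService pool = Executors.newFixedThreadPool(users);
        for (int i = 0; i < users; i++) {
            pool.submit(() -> {
                long start = System.nanoTime();
                try {
                    client.send(req, HttpResponse.BodyHandlers.discarding());
                    System.out.println((System.nanoTime() - start) / 1_000_000 + " ms");
                } catch (Exception e) {
                    System.out.println("failed: " + e.getMessage());
                }
            });
        }
        pool.shutdown();
    }
}
```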
Application design is also important. For example, you can cache all the questions at the web layer to avoid database queries (see the sketch after this list). Make the client app heavy so that the server payload is minimal (a JSON response), reducing download time and load on the server.
Request multiple questions at once and batch user responses together to decrease AJAX calls.
Make use of a NoSQL solution to avoid RDBMS constraint overhead.
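A minimal sketch of the question-caching idea from the list above, written in Java for illustration even though the site itself is PHP; QuestionDao is a hypothetical stand-in for the real data-access code.

```java
// Sketch: caching exam questions at the web layer so the database is hit once per
// question set, not once per student.
import java.util.List;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

public class QuestionCache {
    private final ConcurrentMap<Integer, List<String>> cache = new ConcurrentHashMap<>();
    private final QuestionDao dao;

    public QuestionCache(QuestionDao dao) {
        this.dao = dao;
    }

    // First caller loads from the database; the other ~4999 students get the cached copy.
    public List<String> questionsForExam(int examId) {
        return cache.computeIfAbsent(examId, dao::loadQuestions);
    }

    // Hypothetical data-access interface standing in for the real MySQL queries.
    public interface QuestionDao {
        List<String> loadQuestions(int examId);
    }
}
```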

Performance testing application for bottlenecks using production data

I have been tasked with looking for a performance testing solution for one of our Java applications running on a Weblogic server. The requirement is to record production requests (both GET and POST including POST data) and then run these requests in a performance test environment with a copy of the production database.
The reasons for using production requests instead of a test script are:
It is a large application with no existing test scripts, so it would be a large amount of work to write scripts to cover the entire application.
Some performance issues only appear when users do a number of actions in a particular order.
To test using actual user interaction with the system, not an estimate of how the users may interact with the system. We all know that users will do things we have not thought of.
I want to be able to fix performance issues and rerun the requests against the fixed code before releasing to production.
I have looked at using JMeter's Access Log Sampler with server access logs; however, the access logs do not contain POST data, and the Access Log Sampler only looks at the request URL, so it cannot simulate users submitting form data.
I have also looked at using the JMeter HTTP Proxy Server; however, this can record the actions of only one user and requires the user to configure their browser to use the proxy. The same limitation exists with Tsung and The Grinder.
I have looked at using Wireshark and tcpreplay, but recording at the packet level is excessive and will not give any useful reports at a request level.
Is there a better way to analyze production performance considering I need to be able to test fixes before releasing to production?
That is going to be a hard ask. I work with Visual Studio Test Edition to load test my applications, and we are only able to "estimate" the users' activity on the site.
It is possible to look at the logs and gather information on the likelihood of certain paths through your app. You can then look at the production database for the likely values entered in any POST requests. From that you will have to build load tests that approximate the usage patterns of your production site.
With any current tools I don't think it is possible to record and play back actual user interaction.
It is possible to alter your web app so that it records and logs every request and POST against session and datetime (a minimal filter along these lines is sketched after this answer). This custom logging could then be used to generate load test requests against a test website. This would mean serious code changes to your existing site and would likely have performance impacts.
That said, I have worked with web apps that do this level of logging and the ability to analyse the exact series of page posts/requests that caused an error is quite valuable to a developer.
So in summary: It is possible, but I have not heard of any off the shelf tools that do it.
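For concreteness, here is a hedged sketch of the custom logging described above, as a servlet filter against the older javax.servlet API. It captures form parameters rather than raw request bodies (so raw XML/JSON posts would need a request wrapper), and a real implementation would write to a dedicated replay log rather than stdout.

```java
// Sketch: logging every request with session id, timestamp, method, URI, and
// form parameters, for later replay against a test environment.
import javax.servlet.*;
import javax.servlet.http.HttpServletRequest;
import java.io.IOException;
import java.time.Instant;
import java.util.Map;

public class RequestLogFilter implements Filter {
    @Override public void init(FilterConfig cfg) {}
    @Override public void destroy() {}

    @Override
    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        HttpServletRequest http = (HttpServletRequest) req;
        StringBuilder line = new StringBuilder()
                .append(Instant.now()).append(' ')
                .append(http.getSession(true).getId()).append(' ')
                .append(http.getMethod()).append(' ')
                .append(http.getRequestURI());
        for (Map.Entry<String, String[]> p : http.getParameterMap().entrySet()) {
            line.append(' ').append(p.getKey()).append('=').append(p.getValue()[0]);
        }
        System.out.println(line); // a real app would write to a dedicated replay log
        chain.doFilter(req, res);
    }
}
```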
Please check out this whitepaper by Impetus Technologies on this page: http://www.impetus.com/plabs/sandstorm.html
Honestly, I'm not sure the task you're being asked to do is even possible, let alone a good idea. Depending on how complex the application's backend is, and how perfectly you can recreate the state (i.e. all the way down to external SOA services or the time/clock), it may not be possible to make those GET and POST requests reproduce the same behavior.
That said, performance testing against production data is always great, but it usually requires application-specific knowledge that will stress said data. Simply repeating HTTP GETs and POSTs will almost certainly not yield useful results.
Good luck!
I would suggest the following to get the production requests and simulate an accurate workload:
1) Use Coremetrics: Coremetrics provides solutions with which you can learn the application's usage patterns. This helps in coming up with an accurate workload model. That model can then be converted into test scripts and executed against a masked copy of the production database, giving you accurate results about the application's performance in real time.
2) Another option would be creating a small utility using AOP (aspect-oriented programming) so that it can trace all the requests and the corresponding method calls. This helps in identifying the production usage pattern and, in turn, accurate simulation of the workload. AOP frameworks such as AspectJ can be used; this would not require any changes in code, since the instrumentation can be done on the fly. The other benefit is that it can be enabled only for a specific time window and then turned off.
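A minimal annotation-style AspectJ sketch of option 2; the com.example.app base package is hypothetical, and the weaving configuration (aop.xml or a load-time weaving agent) is omitted.

```java
// Sketch: timing every public method in the application's packages via AspectJ.
import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;

@Aspect
public class TraceAspect {
    @Around("execution(public * com.example.app..*(..))")
    public Object trace(ProceedingJoinPoint pjp) throws Throwable {
        long start = System.nanoTime();
        try {
            return pjp.proceed(); // run the intercepted method
        } finally {
            long ms = (System.nanoTime() - start) / 1_000_000;
            System.out.println(pjp.getSignature().toShortString() + " took " + ms + " ms");
        }
    }
}
```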
Regards,
batterywalam
