What is your experience regarding the scalability of Oracle Forms? What's the maximum number of application users you would use Oracle Forms for: 100, 1000, 10000, 50000?
I know that this question lacks much of the detail needed for a well-founded answer. However, I am interested in the gut feeling of seasoned Forms developers.
Thanks.
You may find this Oracle white paper useful: Forms Capacity Planning Guide.
One thing to consider is that Forms is a "stateful" system, so connected users will actually be maintaining Oracle sessions. Contrast this with a "stateless" system like Oracle Application Express (APEX). I believe (but don't have evidence to prove it) that APEX will scale better than Forms (i.e. with less hardware).
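To illustrate the difference: a connected Forms user typically holds a dedicated database session for as long as they are logged in, even while idle, whereas an APEX user only borrows a session from a shared pool for the duration of each page request.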
I am currently involved in an APEX project that will have 2000 concurrent users. The original plan was to use Oracle Forms; we changed not because Forms couldn't scale to 2000 users (it could), but for other reasons.
Personal opinion: At this point, we tend to use Forms for complex UI apps with lots of validation and pretty intense usage.
If you can meet the business needs easily in a pure-web tool like ApEx (or any of hundreds of others), I wouldn't use forms.
So you'll probably need to assume that many of those Forms users are going to be keeping their connections pretty active.
And complex Forms use a lot of memory. We're running the app server on 32-bit Windows (not my choice) and running into memory limits with about 50 active connections.
Forms is pretty good on concurrency, so with reasonable coding you're not going to hit any major database limits. And app server processing and IO won't be your constraint. It's really just a matter of how many active users you're dealing with at one time, what their memory footprint is, and how big or how many app servers you're willing to deal with.
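Some illustrative arithmetic on that last point (the per-session figure is a guess for illustration, not a measured number): if each active Forms session costs on the order of 30 MB in the app server's runtime processes, then 50 sessions is about 1.5 GB, already close to the ~2 GB user address space of a 32-bit Windows process; the same footprint for 2,000 active users would be roughly 60 GB, i.e. several 64-bit app servers' worth of memory.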
(Background: Forms Developer since version 2.3 (with a bit of 2.0), still using it for some projects and a lot of legacy)
In simple terms, Oracle Forms scales. My evidence? Oracle E-Business Suite uses it. If one of Oracle's premier products couldn't scale, it would have been moved off this platform a long time ago.
I agree with Tony's and Jim's comments that Forms is more expensive to scale because of its use of persistent connections.
I know this is an old question and maybe things have changed by now.
But we have been running a huge project in Oracle Forms for over 2 years now with about 3000 concurrent users. And we are still expanding the project, so this number will rise in the future.
We have only 2 AIX servers running as application servers, and 1 DB server, also on AIX.
Just make the necessary configurations, do a lot of performance tuning on your applications, and play with the parameters of the DB and application server.
This all works fine with our setup. So the claim that 2000 concurrent users is not possible in Forms is wrong. You just need to invest some time in tuning, but which application with 3000 concurrent users won't need tuning?
Are there any performance benchmarks between the managed and unmanaged Oracle ODP.Net drivers?
(i.e. is there any advantage to moving to the managed driver other than architectural/deployment simplicity?)
I would like to share some results. I think the small performance penalty is worth it compared to the ease of deployment.
Note: seg means seconds. Sorry about that.
Of course it is a simple test, and there are several topics that are not covered, like connection pooling, stability, reliability and so on...
It is important to mention that the scenarios were executed 100 times, so the time figures are the average of those 100 executions.
Bullets from the quick start video:
Fewer files (1 or 2 dlls at most)
Smaller footprint (10 MB compared to 200 MB)
Easier side by side deployment
Same assembly for 32 and 64 bit (except for the second MTS assembly).
Code Access Security
I'm not sure about performance but I doubt it will be much different either way. My guess is that the two drivers communicate in an identical way over "Oracle Net." While there might be minor differences in the in-memory client-side operations done to prepare a command and process the results, this overhead typically represents only a fraction of the time relative to the entire transaction. Most of the cost/time is spent on the server in physical IO and transferring the data back to the client.
This simply isn't the same as moving from the OLE DB provider or the System.Data.OracleClient driver. This is another release from the same RDBMS company - they're going to exploit all the same performance tricks that their other client used. I wish I could post a study, but I'd guess such a thing doesn't exist because in the end it would be unremarkable. A case of no news is good news - if the new provider were somehow worse you would be reading about it.
Simplicity is enough reason to switch to this IMO. The vast majority of developers and administrators do not fully understand the provider and its relationship to the unmanaged client. Confusion about oracle home preference, version mismatch, upgrades, etc comes up constantly. To eliminate these questions would be a welcome change.
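To make the "same API, different plumbing" point concrete, here's a minimal sketch; the connection string, credentials and query are placeholders of mine, not from the posts above. In most codebases, switching drivers is just the using directive plus the assembly reference.

```csharp
using System;
// Unmanaged ODP.NET (requires a full Oracle client installation):
//   using Oracle.DataAccess.Client;
// Managed ODP.NET (pure .NET, a single assembly to deploy):
using Oracle.ManagedDataAccess.Client;

class OdpSmokeTest
{
    static void Main()
    {
        // Hypothetical EZConnect-style connection string -- adjust host,
        // port, service name and credentials for your environment.
        const string cs =
            "User Id=scott;Password=tiger;Data Source=dbhost:1521/ORCL";

        using (var con = new OracleConnection(cs))
        using (var cmd = new OracleCommand("SELECT COUNT(*) FROM dual", con))
        {
            con.Open();
            // The ADO.NET surface is identical in both drivers, so most code
            // only changes the namespace and the referenced assembly
            // (Oracle.ManagedDataAccess.dll vs Oracle.DataAccess.dll).
            Console.WriteLine("Rows in dual: {0}", cmd.ExecuteScalar());
        }
    }
}
```

Provider-specific features (such as the custom types mentioned further down) are the main exception to this one-line switch.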
Here is a gotcha for all you folks. It took me a couple of weeks to figure out why the Oracle managed driver would not connect using EF6. If your database requires certain data integrity algorithms, then you MUST use the unmanaged driver!
The list is buried deep in the Oracle documentation. Thanks, Oracle!
The easier deployment and bitness independence are really nice benefits, but you should evaluate your typical driver usage thoroughly first. I faced an almost 50% performance handicap when using the new managed driver in 64-bit processes. Other people are reporting memory leaks etc. on the Oracle forum: https://forums.oracle.com/community/developer/english/oracle_database/windows_and_.net/odp.net . It looks like a typically buggy Oracle product that needs some more months/years to settle down :/
Keep in mind that Custom Types are not supported yet. This could be a reason not to switch to the managed driver.
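For example (an illustration of what "custom types" covers, not a claim from the posts above): code that maps an Oracle object type to a .NET class via the unmanaged driver's IOracleCustomType interface and OracleObjectMappingAttribute (in Oracle.DataAccess.Types) had no managed-driver equivalent at the time these answers were written.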
See this Oracle doc for the differences between the managed and unmanaged version:
http://docs.oracle.com/cd/E16655_01/win.121/e17732/intro004.htm
Thanks for all the questions and responses posted on here. This site usually shows up whenever I search for information on Google, and in many cases the answers are relevant to the issues I need solved.
I want to preface my question by stating that I've been programming (.NET, XML, T-SQL, AJAX, etc) for less than 2 years, and I still have a lot to learn; so, pardon my ignorance.
Here's my situation (and question): I'm building a social web application, which I expect will attract a lot of traffic in a short time; as a result, I'm thinking about scalability up front.
What basic information do I need to have in order not to be overwhelmed? It's currently a one-man affair, and here is the hosting specification that I plan to start with: 2 GB RAM, a 600 GB HDD, 1000 GB bandwidth, and a 2.13 GHz dual-core processor.
I've read about web farms, but I've never had an opportunity to use them, so I'm not entirely sure how to phrase this question: how can one split the same application across multiple physical servers? How do you make all the files act as one entity? And since every .NET application requires a web.config, how is it handled across these multiple servers?
I've built smaller projects before, but this is the first big project I'm building, and to be frank, I'm a little intimidated. So, I would like to ensure I know what I'm getting into before starting.
Thank you.
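A note on the web.config part of this question, since it trips many people up (the setting below is standard ASP.NET, but the exact values are yours to generate): in a load-balanced farm, each server hosts a complete copy of the application, web.config included; the files aren't split, they're kept identical by your deployment process. The one thing you usually must coordinate is the <machineKey> element in system.web, e.g. <machineKey validationKey="..." decryptionKey="..." validation="HMACSHA256" />, so that view state and authentication cookies issued by one server validate on the others.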
Based on your background I assume you are developing in a .NET environment? If so, I highly recommend you take a look at Windows Azure. Developing your app against Azure will allow you to deploy it in Microsoft's cloud platform. Once deployed, you can shrink and grow your resources according to demand without having to deal with the relative hassle of setting up multiple servers in multiple locations and managing it all. This allows you to pay for a "little bit" of server up front, and if your app gets popular you can easily pay for "web farm"-like power and geographic diversity. It also gives you a decent framework for developing an app that will scale relatively well. That's an 18,000-foot overview. If you can put some more details in your question, I'm sure you will get more detailed responses. Best of luck!
Your "social web application" will not have any users if it isn't working and deployed. Don't worry about scaling much until the site actually does something useful and has a few hundred users (or at least a few dozen!). Get it working, find people around you who can help when the going gets tough, and keep at it. Otherwise your concerns about needing to scale will never be warranted.
I see this claim made in a rant here: http://discuss.joelonsoftware.com/default.asp?joel.3.456646.47 , as well as in various other rants that can be found on Google using "oracle sucks". OK, if something as low-key as Drupal doesn't have an easy-to-use visual IDE, I can understand why; but if this is really true of something as big-money as Oracle, why don't we see an entire ecosystem of user-friendly visual tools for basic DBA work on Oracle? I mean, people who work on Oracle work for companies with big budgets, so surely they could afford a license for a fancy "sit tight and enjoy the ride Oracle admin studio" of some sort to help developers do some stuff by themselves without pestering the DBA? Or do these tools really exist and do a good job, whereas the people doing the rants are simply unaware of them?
Quest Software has a variety of tools for database admin: primarily TOAD, but also Spotlight, and a backup monitoring tool currently in beta.
Part of the issue is that Oracle runs on a variety of platforms, such as Solaris, Linux and Windows. The larger (and therefore more complex) installs have been on more exotic hardware. A 'full stack' admin tool would really have to be native to the database platform, and that just hasn't been practical. That's one reason why the OEM stuff is built as a web-app, and why SQL*Plus, the standard client, has stuck as a command line tool. As has RMAN, the backup/recovery manager.
Another issue is that there is a lot of baggage in Oracle. Rather than a simple "Database = File" or "Table = File" model, Oracle needed to cope with data volumes too big for single files. So they have a concept of a tablespace which maps database objects to data files. That's not so much an issue with modern filesystems.
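For example (names and sizes are illustrative): CREATE TABLESPACE app_data DATAFILE '/u01/oradata/app_data01.dbf' SIZE 500M; creates the logical container and maps it to a physical file, and tables are then placed in the tablespace rather than in a file directly.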
Finally, Oracle is a high-end product. You use it in situations where the cheaper alternatives can't cut it. So it is often applied in more complex environments which would require more admin anyway. In that way, it is more a case that with Oracle, you can admin your way out of situations which would be impossible with a competitor product.
There are tools for Oracle, both built-in and third-party.
I think that the tools for SQL Server are a lot easier to use, and third-party tools for SQL Server (e.g. Red Gate) are also extremely easy to use and powerful (compared to Toad, which has a byzantine and complex user interface).
Oracle is a multi-platform database, and it dates from the first generation of RDBMS implementations (one of the first products that competed to replace older systems), so it has a lot of layers at install time which can be very challenging to deal with. PL/SQL is also more difficult to develop in than SQL Server, MySQL or DB2 in many ways.
From the point of view of small development shops without dedicated development DBA (or a production DBA who actually understands development) resources, Oracle is less productive than SQL Server or MySQL.
For DBA management and monitoring there's Oracle Enterprise Manager Grid Control. Not an IDE, purely an enterprise-wide administration tool for all of the databases in an organization. Everything from backups to performance monitoring, job creation, alerts, and so forth.
When I was a grasshopper, Master Po told me: 'A fool with a tool is still a fool'. As others have pointed out, Oracle is a high-end product. You really have to read the documentation; once you understand the basic concepts of Oracle, there are a lot of tools available. Almost all tasks are command-line based, and a lot of different GUI applications are available to assist you. Oracle's main tools are Enterprise Manager and SQL Developer. Server side you have a few tools you can use: Database Configuration Assistant, Network Configuration Assistant, Migration Assistant, etc. Choose the one you like for a specific task. Bottom line: it's not a point-and-click application.
If you're deploying Oracle in a large corporate environment, there is an ecosystem of user-friendly tools to administer the database. But most of those tools are relatively painful to install: they need their own database, for example, and install components on the database server along with the central repository. It makes perfect sense to invest in this sort of heavyweight infrastructure when you're spending 6 or 7 figures on Oracle database licenses and you need to handle things like continuous monitoring and alerting.
On the other hand, most of the folks complaining about Oracle usability are trying to install and run Oracle in a much different environment. If you're a developer, for example, who wants to run Oracle on your local laptop so that you have the full stack installed, you're not going to need or want one of these heavyweight tools. Those folks end up with whatever tools Oracle installs by default. Traditionally, those tools have been somewhat less than ideal. Oracle is getting better about that by shipping a lightweight Enterprise Manager web client with the database that is very useful for these types of installs. But it can still be a bit of a fight to ensure that the Enterprise Manager web client works perfectly on a developer's Windows laptop install, which leads a non-trivial number of developers to conclude that "Oracle sucks".
I use an app called PL/SQL developer, and it works pretty well, IMO.
www.enterprise-elements.com is one such tool
You have noticed that you are pointing to a four-year-old rant, right? By a supposed DBA who didn't even know enough to turn off unneeded services in order to shorten the load time?
I'm sorry, but if the complaint is "why can't this industrial-strength DB be managed as easily as this lightweight, feature-poor freeware?" then I think it is a self-answering question.
To answer the rest, yes there are tools out there. To specifically answer your " I mean, people who work on Oracle work for companies with big budgets, so surely they could afford a license for a fancy "sit tight and enjoy the ride Oracle admin studio" of some sort to help developers do some stuff by themselves without pestering the DBA? " , this is more often a factor of a DBA choosing to lock down privileges - not a function of the database itself. A tool is no use to a developer if their user account is not granted the rights to do what they want.
Rants like that one? Looks like someone tasked with running an app they had no interest in actually learning much about. No wonder they got frustrated. Yes, sometimes Oracle causes frustration of its own, but many of these rants are from people who probably picked a database platform far above their needs, and are disinclined to really learn how to manage it.
I was looking for an ETL tool and on Google found a lot about Pentaho Kettle.
I also need a data analyzer to run on a star schema so that business users can play around and generate any kind of report or matrix. Again, Pentaho Analyzer is looking good.
The other part of the application will be developed in Java, and the application should be database-agnostic.
Is Pentaho good enough, or are there other tools I should check?
Pentaho seems to be pretty solid, offering the whole suite of BI tools, with improved integration reportedly on the way. But... the chances are that companies wanting to go the open source route for their BI solution are also most likely to end up using open source database technology... and in that sense "database agnostic" can easily be a double-edged sword. For instance, you can develop a cube in Microsoft's Analysis Services in the comfortable knowledge that whatever MDX/XMLA your cube sends to the database will be interpreted consistently, holding very little in the way of nasty surprises.
Compare that to the Pentaho stack, which will typically end up interacting with PostgreSQL or MySQL. I can't vouch for how PostgreSQL performs in the OLAP realm, but I do know from experience that MySQL - for all its undoubted strengths - has "issues" with the types of SQL that typically crop up all over the place in an OLAP solution (you can't get far in a cube without using GROUP BY or COUNT DISTINCT). So part of what you save in licence costs will almost certainly be spent solving issues arising from the fact that Pentaho doesn't always know which database it is talking to - robbing Peter to (at least partially) pay Paul, so to speak.
Unfortunately, more info is needed. For example:
will you need to exchange data with well-known apps (Oracle Financials, Remedy, etc)? If so, you can save a ton of time & money with an ETL solution that has support for that interface already built-in.
what database products (and versions) and file types do you need to talk to?
do you need to support querying of web-services?
do you need near real-time trickling of data?
do you need rule-level auditing & counts for accounting for every single row?
do you need delta processing?
what kinds of machines do you need this to run on? linux? windows? mainframe?
what kind of version control, testing and build processes will this tool have to comply with?
what kind of performance & scalability do you need?
do you mind if the database ends up driving the transformations?
do you need this to run in userspace?
do you need to run parts of it on various networks disconnected from the rest? (not uncommon for extract processes)
how many interfaces and of what complexity do you need to support?
You can spend a lot of time deploying and learning an ETL tool - only to discover that it really doesn't meet your needs very well. You're best off taking a couple of hours to figure that out first.
I've used Talend before with some success. You create your transformations by chaining operations together in a graphical designer. There were definitely some WTFs, and it was difficult to deal with multi-line records, but it worked well otherwise.
Talend also generates Java and you can access the ETL processes remotely. The tool is also free, although they provide enterprise training and support.
There are lots of choices. Look at BIRT, Talend and Pentaho, if you want free tools. If you want much more robustness, look at Tableau and BIRT Analytics.
On the software development projects that you have worked on, what has been the approximate cost (expressed as a percentage of total system cost) of system integration? System integration includes integrating with other software, databases, etc.
33.3% because system integration is usually associated with a fair amount of risk that is not as prevalent in other phases of the projects (coding, documentation, etc).
This is a very difficult value to estimate, especially when you are integrating with a system you are not familiar with. The best you can do is track your or your team's past performance on similar projects and use those values to estimate how you will perform on new projects.
Generally, system integration will take longer if:
It uses a protocol, database engine, operating system, etc. that you or your team have not yet worked with.
Vendor or community support is lacking or unresponsive.
Official system documentation is not detailed enough or is out of date.
The system does not have large global market share. Such a system will not have a wide user base and a big footprint in online programming Q&A sites such as this one. This may include new, less popular, or highly domain-bound systems.
Between 0 and 99%. I have built systems with no integration at all and systems that were basically just integration of other systems. The nice thing about integration can be that it is easy to estimate. But only when the interface is fully understood. Then it is just a duplication of functionality.
There are some complicating factors, though. They can make it very expensive to impossible:
is the system you have to integrate with well understood (do the programmers who developed it still work there?)
is the system you have to integrate with well-refactored (and has automated unit and acceptance tests)?
single or multiple platform?
are domain experts available?
It depends on the integrated system's importance and other factors.
I've worked in systems with integration in a bunch of web services that were the application's core. If the web services were down, our system was simply useless.
I would list the following variables when trying to evaluate the cost:
How many systems do you integrate and how frequently are they changed?
Do you have documentation to these systems?
Is it a third party component/service that you have no control of?
If you have control over the integrated system, does it use a lot of "legacy" code, like COBOL? (Just an example; at least where I work, COBOL programmers are expensive.)
Are your employees experienced with the integrated system and with the application itself?
In case of failure of the integrated service, what is the impact on your application?
How much is an employee's hourly rate in these scenarios? How many hours would they need to work on these integrated systems? How much money do you have for your project? I can't say it's going to cost X% in your case without knowing these details, especially the last one.
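To make that concrete with made-up numbers: if two developers at $80/hour each spend 300 hours on integration work, that is 2 x 300 x $80 = $48,000; on a $200,000 project that is 24% of the total, while on a $1,000,000 project the same work is under 5%. The percentage falls out of your specifics rather than being a constant.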