IBM RAD 7 and WebSphere 6.1 are slow and unresponsive

How can I improve performance when developing locally with WebSphere and RAD? I am working on a single web application of moderate size (1000? classes), and it is nearly impossible to run the app locally on a Windows box. The WebSphere 6.1 server uses the default configuration. RAD 7 is configured with a max heap of 1024 MB. I have thought about increasing the heap of the server; at present its min and max are 128 MB and 300 MB.
As for the unresponsiveness, it can sometimes take minutes to load a page, if the page loads at all. I have also disabled "Build Automatically" and "Publish Automatically". Should those be turned back on?

I'm not sure about RAD 7, but from past experience I'd suggest giving MyEclipse Blue a try.
Since that might not be an option, here are some of the other usual culprits you can check:
How much RAM does your machine have? It's good to give WebSphere 1 GB of RAM, but if your computer only has 1 GB of real RAM, it's going to swap itself to death. If your boss won't pay for it, go buy some RAM with your own money; 2 GB costs less than $80 at the moment. I suggest getting at least 4 GB. Yes, 32-bit Windows can only use about 3.5 GB even when 4 GB are installed, but that extra half GB costs $20 or less. Even thinking about this for more than five minutes costs more than simply buying it.
Next, make sure you are using the correct Java GC options; there should be some information about this in the docs. Also make sure the process uses the "jvm.dll" from the "server" directory, not the "client" one. Process Explorer will help you check.
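If you want to double-check from inside the JVM rather than with Process Explorer, a small throwaway class along these lines (just a sketch of my own, not anything that ships with RAD or WebSphere) prints which VM flavour and Java home the process is actually using:
    // VmCheck.java - a tiny diagnostic; run it with the same java launcher and options the server uses.
    public class VmCheck {
        public static void main(String[] args) {
            System.out.println("java.vm.name   = " + System.getProperty("java.vm.name"));   // e.g. Client VM vs. Server VM
            System.out.println("java.vm.vendor = " + System.getProperty("java.vm.vendor"));
            System.out.println("java.home      = " + System.getProperty("java.home"));      // which JRE the jvm.dll came from
            System.out.println("max heap (MB)  = " + Runtime.getRuntime().maxMemory() / (1024 * 1024));
        }
    }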
Since I'm not using RAD I'm not 100% sure about "Build Automatically" and "Publish Automatically", but since RAD 7 is based on Eclipse, these options compile your code in the background as you work. That greatly reduces the time between saving your last change and the moment the app server can start to load the new code.
When all else fails, run WebSphere in a profiler and see where it spends all its time.

Aaron had great advice.
I would also suggest using JConsole to see what is going on, to help you determine whether you need more memory, a larger heap size, etc. My experience with running WebSphere and RAD locally is that it will be slow, but then I was on an old machine that needed more memory. :)
http://java.sun.com/j2se/1.5.0/docs/guide/management/jconsole.html
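If the JConsole GUI is too heavy for an already struggling machine, the same heap figures are available programmatically through the standard java.lang.management API. The following is only an illustrative sketch, not anything RAD- or WebSphere-specific; run it standalone, or put the same calls in a servlet if you want to watch the WebSphere JVM itself:
    import java.lang.management.ManagementFactory;
    import java.lang.management.MemoryMXBean;
    import java.lang.management.MemoryUsage;
    // HeapLog.java - prints the used/committed/max heap numbers that JConsole would show.
    public class HeapLog {
        public static void main(String[] args) throws InterruptedException {
            MemoryMXBean memory = ManagementFactory.getMemoryMXBean();
            for (int i = 0; i < 10; i++) {
                MemoryUsage heap = memory.getHeapMemoryUsage();
                System.out.println("heap used=" + toMb(heap.getUsed())
                        + " committed=" + toMb(heap.getCommitted())
                        + " max=" + toMb(heap.getMax()));
                Thread.sleep(5000); // sample every five seconds
            }
        }
        private static String toMb(long bytes) {
            return (bytes / (1024 * 1024)) + "MB";
        }
    }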

Berlin,
RAD 7 saps your PC! When I was using it to develop portlets, I followed this optimization guide and it made the IDE significantly quicker to develop portlets in. Obviously it is aimed at portlet development, but it might help you.
Following the advice given in the answers to this question will also help.

I definitely agree with Aaron Digulla. You will see a major performance improvement with 4GB RAM installed on your development machine. I developed an Eclipse/RAD plugin with some buddies of mine and we were able to measure how much time we saved by upgrading from 2GB to 4GB.
The plugin is available here: http://lopb.org/
After gathering some hard numbers on how much time we spent waiting for publishing and loading the app on our 2GB development machines, we were able to convince management to upgrade the rest of the developers on the team.
Anyway, you should really consider upgrading to 4 GB if you want to run RAD 7 and WebSphere 6 on the same development machine. Each one needs something like -Xms512m -Xmx1024m as JVM args to run well, and that means you will swap to disk far too much if you only have 2 GB of RAM or less. HTH
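As a rough sanity check, you can compare the RAM actually installed in the box with that combined heap budget. The sketch below is only an illustration: the 1024 + 1024 figure assumes -Xmx1024m for each process, and it relies on the com.sun.management extension, so it works on Sun/Oracle JDKs but may not on other vendors' VMs:
    import java.lang.management.ManagementFactory;
    // RamCheck.java - compares installed physical RAM with the planned heap budget for RAD + WAS.
    public class RamCheck {
        public static void main(String[] args) {
            com.sun.management.OperatingSystemMXBean os =
                    (com.sun.management.OperatingSystemMXBean) ManagementFactory.getOperatingSystemMXBean();
            long physicalMb = os.getTotalPhysicalMemorySize() / (1024 * 1024);
            long heapBudgetMb = 1024 + 1024; // assumed max heaps for RAD and WebSphere
            System.out.println("physical RAM: " + physicalMb + " MB, heap budget: " + heapBudgetMb + " MB");
            if (physicalMb < heapBudgetMb + 1024) { // keep roughly 1 GB spare for the OS and everything else
                System.out.println("Expect heavy swapping - add RAM or shrink the heaps.");
            }
        }
    }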

Make sure you're running WAS in development mode for your development and testing.
The option is under the server settings in the admin console.
Karl

Hehe, we had the same problem with RAD 6 and WebSphere 6.
The way we sped things up was to move to Eclipse and JBoss.
We developed on Eclipse and JBoss, and the first round of testing was on WebSphere. We had some issues with the differences, but we would never have completed the project were it not for the switch (far fewer issues than developing on RAD/WAS).
But to help you in the meantime...
You most likely want "Build Automatically" and "Publish Automatically" off. That way you can make a bunch of changes and then tell RAD to compile and deploy while you go and get a coffee.
There is a "run in development mode" option in WebSphere (I know there was for 6.0), so track that down and turn it on (it's in the WAS admin console somewhere).
I found WAS's hot code replacement to work fairly well. I found that at the beginning of the day I'd deploy to WAS and then not have to redeploy until at least lunchtime (as long as I was debugging). I would make changes and they would be fed to the server without my having to redeploy.
Chances are, even after running the profiler, you'll find there's not much you can do.
Turn off all validations (in RAD); they tend to take forever.
Depending on what you're doing with Java EE, investigate the possibility of developing on another IDE/server combo; maybe you can do the bulk of your work there and then use RAD/WAS for some final testing. If you're using vanilla EJBs or web services this is feasible.
That max heap does sound a bit small to me. The suggestion to fire up JConsole is a good one, because it will tell you how much heap is being used, though I'm not sure whether it will work on the IBM VM (the one RAD uses). You can also turn on the memory usage monitor in RAD; it tells you how much memory is in use, so you can tell whether the server is hitting the max.

JConsole will not work without specifically enabling it via a JVM command-line switch (on a Java 5 VM that typically means starting the server with -Dcom.sun.management.jmxremote, or the com.sun.management.jmxremote.port property for remote connections).
The suggestions from Michael Wiles sound reasonable, but please update your RAD to the latest fix pack available first.
You can also contact support.

Related

NetBeans issues a low memory message quite frequently

While developing Java EE applications, the NetBeans IDE (currently 8.0.2, but the issue is not limited to this version) issues a low-memory warning quite frequently.
For example, while modifying and saving some Java classes (especially JPA entity classes), memory is soon exhausted after repeating this modify-and-save cycle for merely 10 to 15 entity classes. The IDE issues a low-memory warning and the whole system is severely disturbed.
That leaves only one alternative, which is to restart the system, and for a couple of years I have indeed been wasting more time restarting the system than actually developing Java EE applications :). I have not yet seen anything about this issue on the internet.
The auto-deploy ("Deploy on Save") option in the IDE is permanently turned off. I never use this facility, as it causes the heap/PermGen space to fill up quite quickly.
This may or may not happen in small toy applications. I am only talking about Java EE applications running on a Java EE-compliant application server.
If there is no way to get around this problem, then this IDE cannot be used for developing real Java EE applications, because dealing with the problem takes more time than the actual development. Otherwise, the solution may be to lean towards other IDEs like Eclipse, IntelliJ IDEA, JDeveloper etc., if they fare better.
I am currently using,
GlassFish Server 4.1/Java EE 7
JDK 8u45
NetBeans 8.0.2
Is there a solution somewhere in the world?
NetBeans tends to be slower than other IDEs on Windows, but this goes far beyond merely "slow".
The picture is taken from a simple test web application (thus not an enterprise application). In enterprise applications, changing even a single character in a single entity class is not affordable; one is likely to run into a big problem if it is attempted. Fixing the errors shown in the link may take hours, or probably a whole day or so.
Again, I am not talking about anything other than large-scale applications involving Java EE or other platforms like Spring.
Long story short: I cannot help wondering how it is feasible to use this IDE in industry. It is no longer usable in this state (no offence intended; honestly, I have been patient for almost three years, and I am merely wasting time using this IDE).
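One thing worth checking before blaming the heap: on JDK 8 there is no PermGen any more (class metadata lives in Metaspace), so it helps to know which memory pool is actually filling up when the warning appears. The following is only an illustrative sketch, not tied to NetBeans; it lists every pool of whatever JVM it runs in, so run it standalone or drop the same calls into the deployed application to see the GlassFish figures:
    import java.lang.management.ManagementFactory;
    import java.lang.management.MemoryPoolMXBean;
    import java.lang.management.MemoryUsage;
    // PoolDump.java - lists each memory pool (heap generations, Metaspace, code cache, ...)
    // with its current usage, so you can see which pool is close to its limit.
    public class PoolDump {
        public static void main(String[] args) {
            for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
                MemoryUsage usage = pool.getUsage();
                long maxMb = usage.getMax() < 0 ? -1 : usage.getMax() / (1024 * 1024); // -1 means no limit configured
                System.out.println(pool.getName() + ": used=" + usage.getUsed() / (1024 * 1024)
                        + "MB, max=" + maxMb + "MB");
            }
        }
    }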

Environment requirements for developing a GWT GXT application

I have to maintain a web application built using GWT 2, GXT 2.2, RPC calls, Hibernate, Spring and MySQL.
In order to debug the application (server and client side), compile, and work comfortably, what are the minimum requirements (on a Windows XP system)?
Recommendation from the GWT team:
Source: https://vaadin.com/blog/-/blogs/the-future-of-gwt-report-2012
You can upvote GWT dev requests for compile-time improvements: https://vaadin.com/gwt/report-2012/wishlist
Some other tips: How to improve GWT hosted mode / compilation times?
Eclipse will work on nearly any hardware; it might simply take a bit longer to compile. I develop on a three-year-old laptop and it is fast enough.
I recommend using Eclipse with the Google Plugin for Eclipse, creating your GWT project with it, and then configuring GXT for the project.
Well, I've used -localWorkers 6 as a compiler argument and -Xmx512m for the VM, and I've cleaned out the old .class files, but I'm still stuck at 6 minutes (my project is 9.5 MB). The only thing I think could still improve compile time is a sufficiently fast CPU.
The single biggest performance improvement I've made across all my projects is moving my toolkit VM onto a solid-state drive.
For best results, upgrade your laptop's system disk to an SSD, but if downsizing is out of the question, put a 60 GB SSD into an external case, hook it up to an eSATA port, and blaze through your compiles. USB 3.0 is a suitable substitute for eSATA, but USB cannot sustain the same peak throughput.
My problem has been resolved since December :D. After reformatting my old PC, compilation takes 2 minutes, compiling just for ie8 and no other configuration. Your replies will still be useful to me in the future ;)

Performance problem - growing memory and CPU usage

I have a Windows application that I wrote.
I installed it on a virtual server (VMware) running Windows Server 2008, and for some reason the application keeps getting bigger and bigger. I used perfmon to see whether there is a memory leak, but as far as I understand, there isn't:
Here is the process in Task Manager:
There are also two processes that use a lot of memory and CPU, but they are steady and not growing like SimeserManager.exe.
The memory growth slows down browsing of the sites that this server hosts.
Before this week I ran my application on a physical server with Windows Server 2003 and there was no problem with browsing. I can't reproduce the situation on the physical machine since I no longer have it, but I don't believe there was a memory problem when using the physical server.
The application is written in C# .NET using Visual Studio 2010.
What can be the problem?
Where can I get some clues?
UPDATE
I got the ANTS Memory Profiler and tried to track down the problem. I created memory snapshots and here are the results:
Now I'm really lost.
I tried the standard filters but didn't manage to find a clue to the problem. In the image you can see there is an increase in private bytes. Does that mean there is a memory leak?
Can anyone give me some clues on how to continue?
Thanks!
We don't have enough information here to really debug your application. However, there are tools you can use to identify and solve this issue in your application. I would suggest you use the ANTS Memory Profiler from RedGate to help you look for your problem. Here is a link to it:
http://www.red-gate.com/products/dotnet-development/ants-memory-profiler/
It isn't free but it is cheap and extremely effective. Get the 14-day free trial and run it on your application. I would go as far as to say that if it doesn't find the issue, the issue probably isn't with your application.
As for the other processes that are taking a lot of memory, this is normal. SQL Server tries to get as much memory as possible. Running other applications on the same box as a SQL server may cause you performance issues if you aren't careful. Here is a good article on how SQL Server uses memory:
http://sqlblog.com/blogs/jonathan_kehayias/archive/2009/08/24/troubleshooting-the-sql-server-memory-leak-or-understanding-sql-server-memory-usage.aspx
As for IIS memory usage (the other process that was using lots of memory), there could be multiple reasons for this. I would suggest you read through this forum to get a better idea of what it could be (if it truly is an issue):
http://forums.iis.net/t/1150494.aspx

Drupal development: performance

As the single user/developer on a Drupal website, I'm experiencing serious performance problems. Several issues occur:
Usually I develop Drupal on our company dev server, but now I'm at a client's office. The IT guys installed a VM with WAMP on the server they usually use for .NET development. On the first day of development (installing Drupal and the required modules and configuring them), httpd.exe would max out the CPU and loading any page would take minutes. The IT guys just scratch their heads.
I then installed WAMP on the local machine they gave me: some 299.99 Win XP Dell piece of sh*t, nevertheless a P4 2.8 GHz with 2 GB of RAM. The fan blows so loudly the entire office gives me dirty looks. Again httpd.exe maxes out, and again any page (especially the admin ones) takes minutes to load.
In Firefox, the Views UI is completely unworkable. A lot of it is loaded with AJAX, and it again takes minutes for the various HTML elements that are dynamically inserted into the UI to appear; try to imagine that.
Chrome seems to handle the JS a bit better but it still takes way too long to complete any kind of action.
The devel_themer module adds tons of markup to the page, which leads to "Allowed memory size of X exhausted" errors (memory_limit = 128M).
Now I'm at the theming stage, where I need to do a LOT of page refreshes. I NEED Firebug, which requires Firefox, which in turn eats up CPU and RAM. What usually takes seconds now takes minutes, and by the time whatever action completes, I've forgotten what I was doing. I'm basically reading news stories between every page reload.
Now, I know Drupal is resource intensive, but it being impossible to develop on a typical Dell / Win XP machine is a bit much, no? At home I work on an iMac and everything runs silky smooth.
I can't imagine I'm the only one with this problem, since what I'm doing is basically Drupal 101 (no custom modules so far...). Unless someone can offer a solution, I'm concluding that you basically cannot develop a typical Drupal site on a normal home desktop computer.
What gives?
So you have abandoned the VM. Check your php.ini file for the memory limit, increase it and see if there is a performance boost; it's usually set to a default of 16M.
HTH
I'd suggest you either spend some time actually tuning your XP system, because the default WAMP config is definitely suboptimal, or consider an alternative like Zend Server Community Edition (ZCE). Although not completely free as in speech, it is free as in beer, and it simply builds on top of a better default config for Apache and MySQL.
Although less convenient than WAMP or ZCE since it is not bundled, a manual install of Apache 2.2 is also usually a good choice.
Also note that, the way devel_themer works, it is constantly creating files in your temp directory, meaning that unless that directory is cleaned regularly, files will accumulate and directory listings will become exceedingly slow. Only a cron.php run will cause Drupal to clean up those files, and only with an up-to-date version of devel. See my patch adding this cleanup at http://drupal.org/node/303443
Finally, you mention Firebug, and you might be using the Drupal for Firebug module, which has known performance issues, apparently related to infinite recursion in some cases, although recent versions are supposed to fix the problem. See for instance http://drupal.org/node/303443
A couple of things I've run into that could potentially help:
Unless you actually need it, turn off Locale. It causes a ton of extra queries (at least the last time I looked into it; this may have changed), so if you're not using it, don't put the unnecessary load on your DB.
Just like on a regular development machine, make sure MySQL is properly tuned and configured. This goes for any setup: local, development or production. Three quarters of the time the database is the bottleneck, so start there.
If you've got the devel module installed and enabled, it has a query log you can tell it to output at the bottom of the page; this should help with the previous point.

Terrible DotNetNuke performance

I'm involved with a project using DotNetNuke version 05.01.04 Community Edition. We are building our new Intranet using it, but performance is terrible.
We have five people adding pages and content to it, and every 15-30 seconds they experience a pause of 10 seconds or longer before the system continues and the next screen loads.
The server is Windows 2003, 3.8 GHz, with 1 GB of RAM. I'm told by our server admin that CPU and memory don't appear to be the bottleneck.
We currently have 350 pages in the system and we plan to add 1000. So we need to resolve this performance problem so that we can enter the content and go live.
I just can't see where the bottleneck is. Is there a good way to determine the bottleneck when using DotNetNuke?
Modules installed:
Publish:Engage (not currently in use)
Page Blaster (doesn't appear to provide caching when users are logged in using Integrated Authentication)
SimpleGallery
XMod
Content Manager
IIS setup:
Application recycling completely disabled (apart from a 2 am recycle)
New findings: 18th March 2010
The main bottleneck turned out to be a bug in version 5.1.4 that caused 1300 database round trips on an average page, due to broken in-memory caching of database data. We've upgraded to 5.2.4, which has resolved this bottleneck.
Now the next biggest bottleneck is the navigation. We've used both DDR:Menu and DDN:Nav, but both have a major impact on performance.
Is there a navigation interface out there that doesn't drain performance so badly?
I think you need to start investigating this using performance profiling tools. For the DNN application itself I'd grab something like JetBrains dotTrace or Red Gate's ANTS Performance Profiler.
For the database, SQL Server Profiler would be the first choice, or a tool such as Red Gate's SQL Response.
Without profiling the application, you're just grasping at straws.
And, as Tim pointed out in his comment, install Firebug in Firefox with the YSlow add-on to see which resources are taking longest to serve to the browser.
Mitchel Sellers has some good tutorials and checklists to go through with regard to performance in DNN. Start with Explaining High Performance DotNetNuke Configuration and Management (which points to some of his earlier articles).
I have several years of DNN development and maintenance experience. When I have this kind of problem, I start with a database clean-up. The next thing is to look for missing indexes and/or rebuild all the indexes periodically (schedule a SQL job for that), but the major performance gain usually comes from cleaning up the tables.
Other good considerations would be disabling trace, setting debug mode to false, and turning off DNN features you don't use (the scheduler is the first one to turn off).
Edit: consider keep-alive as well.
Hope this helps
Is your database on that server? If so, just throw in some more RAM, or get a faster disk array...
Have you considered creating this batch of pages directly through T-SQL? It's not hard to do and may save you a lot of time.
