IBM WebSphere Application Server 7 publishing issue - websphere

My application is a combination of Spring/Hibernate/JPA. Recently my development environment was migrated to RAD 7 with WAS 7; previously I was using version 6 of both RAD and WAS.
The problem is:
When I make a Java change, the server publishes for a long time; sometimes it takes up to 10 minutes for a single-line change to take effect. Even JSP changes alone take a long time to publish!
This was not the case in WAS 6, where publishing Java changes was not even a concern: changes took effect almost immediately, since the publish process finished within a few seconds.
This publishing process keeps running over and over as I make changes to my code, and I have to wait (for long stretches during work hours) until it completes before I can verify/test my changes at runtime. This is horrible!
Is there a way to make WAS 7 publish JSP/Java changes in a few seconds, as WAS 6 did? Is there a fix pack or refresh pack for this?
Can someone help me with this?
Thanks in advance.

This problem can be overcome if you control when to publish rather than letting the server publish automatically: you can finish all your changes and then publish once.
To do that:
In the Servers view, double-click the server you are working on and, under Publishing, select the option "Never publish automatically".
Also, enabling the option "Run server with resources within the workspace" reduces the time spent copying files from your workspace to the server during publishing.

Related

SSIS Pre Validation taking a long time - only on server

I have a SQL Server 2017 SSIS package that takes about 5 seconds to run in Visual Studio 2019, and a similar amount of time to execute after being deployed to my local database server on my development computer. However, when I deploy it to another server and run it there, it takes about 26 seconds. Looking at the execution reports, almost all of the extra time is spent in the pre-validation phase of one step.
The two log entries are the first two messages in the log, and the first one is for the pre-validation of the whole package. All the rest of the entries look similar to those I see on my development server.
One other note: I had previously deployed this package to this server without seeing this issue. I then added two tasks: one to pull XML into a result set, and another to email the results of the package. Although one of them loads an external DLL to do the emailing, neither of these two tasks takes more than a second to validate or execute.
Anybody have an idea of why I would see a 20 second delay on the package pre-validate - but only on another server - and how I might be able to get rid of it?
Further note:
I re-deployed an earlier version without the latest changes, and the 20 seconds went away. Then, step by step, I added the functionality back. The 20 seconds did not come back.
Just to validate this, I re-built the current version (the one that originally had the problem) and deployed it... and it is now back to taking 5 to 6 seconds to execute!
It could be the re-build, or it could be that the server had just been rebooted. I don't know!
I will leave this question open for a day or two to see if the problem comes back.

Does Visual Studio Publish to Azure Website Cause Whole Site to Recycle?

We've recently launched a new website in Azure (i.e. Azure Websites) and as is typical with new launches we've had to deploy a few tweaks to fix minor issues shortly after launch.
We want to use Slots in the long run, but this is not possible at the moment, hence we are deploying to the live site. It's a fairly busy site with a good amount of traffic, and we obviously want to keep downtime to a minimum.
We are using Visual Studio to publish file changes to Azure, but have noticed that even if we publish a relatively insignificant single file, the whole site goes down and struggles to come back up. I was assuming that publishing a single file would literally just replace that file on the file system, but it behaves more like it recycles the application pool (or the Azure equivalent) for the site. The files I've been publishing have been Razor views, which would not typically cause a recycle.
Does anyone know what actually happens under the hood of VS Publish and if there is a way to avoid this happening?
Thanks.
I just tried this using a basically clean new MVC app (https://github.com/KuduApps/Dev14_Net46_Mvc5), and I did not see this behavior. The Index.cshtml view has a hit count based on a static, which tells us whether the app or the page got restarted (or whether that specific page got recompiled).
The test is then to publish the app, make a change to some other view (about.cshtml), and publish again. When doing this and hitting Index.cshtml, the count keeps going up, and there is minimal slowdown.
If you see it getting restarted after a view change, I suggest using Kudu Console to look at the files in site\wwwroot before/after the publish, and check what has a newer timestamp (e.g. check web.config, bin folder, ...).
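The static-counter probe used in this answer is easy to reproduce in any stack. As a hedged illustration (the handler, port, and function names below are mine, not from the MVC sample above), here is a minimal Python version where the hit count lives in a module-level variable, so it resets to zero whenever the process is recycled:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

hit_count = 0  # "static" state: survives across requests, resets when the process restarts

def next_hit() -> int:
    """Increment and return the per-process hit count."""
    global hit_count
    hit_count += 1
    return hit_count

class CounterHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # A count that drops back to 1 right after a publish means the process was recycled.
        body = f"hits since last restart: {next_hit()}".encode()
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# To try it: HTTPServer(("localhost", 8000), CounterHandler).serve_forever()
```

Hitting the page after each deploy then tells you immediately whether the process survived: a count that keeps climbing means no recycle happened.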

Rational Application Developer - WAS server synchronize

I'm a newbie to the RAD IDE, and currently I restart the server every time I make a change.
I'm wondering what is usually the most efficient way to deploy an EAR file to WAS after code changes.
1) If WAS auto-synchronize is on, does it mean that on every Ctrl+S I make, it will redeploy immediately?
2) If auto-synchronize is off, will changing resources such as JSP or XHTML pages be reflected without a restart? What about changing Java code?
As much as I would like to try these right now, I do not have a license of RAD at home.
Thanks in advance for all the help.
In general you should try to avoid restarting the server as much as possible, as it takes time. It is better to republish, or to remove and then re-add the application to the server.
1) If WAS auto-synchronize is on, does it mean that on every Ctrl+S I make, it will redeploy immediately?
Not immediately; it depends on the publishing interval setting under Server settings > Publishing (I don't remember the default; it is about 10-15 seconds).
2) If auto-synchronize is off, will changing resources such as JSP or XHTML pages be reflected without a restart? What about changing Java code?
Restart has nothing to do with that; only republishing does.
The behavior depends on the other publishing options. In the publishing settings for WAS in the server settings you have the following options:
Run server with resources within the workspace
Run server with resources on Server
If you run from the workspace, changes will be detected and the application will reflect them (if you change Java code, the application will be restarted).
If you run with the on-Server setting, changes will not be reflected until you republish; a restart is not required.
A restart is only required if you change some settings on the server, e.g. datasource settings, security settings, etc.
As for synchronizing: if you have a large project and are making lots of changes, it is usually better to temporarily remove the application from the server to avoid constant republishing, or to disable automatic publishing.

Suffering when using IBM WID (WebSphere Integration Developer), any advice?

I joined a company that uses WID as its development tool. I am new to WID, and I find it painful to use for one reason:
Every time I change or create a JSP file, Java file, or configuration file, if I want to see the outcome I need to restart the WID server or republish the resources, and that takes a whole lot of time. I feel that 50% of my working time is spent waiting for the server to start.
Does anyone have any tips to reduce the waiting time?
I found a few by myself:
We can edit the CSS file in the deployed folder and refresh the page to see the changes right away.
The same goes for JSP files.
What about Java files and XML config files? Do I need to restart the WID server every time I make a change?
I would really appreciate your help, because with your tips I may not need to work overtime as often. :-)
WID is primarily an integration tool built on top of RAD. It is predominantly used for building SCA components and assembling them.
Try this for a start:
Turn off auto publish
Turn off auto build
These two guidelines apply to RAD too.
By doing this you should save a good amount of time.
HTH
Manglu

How to speed up the Eclipse project 'refresh'

I have a fairly large PHP codebase (10k files) that I work with using Eclipse 3.4/PDT 2 on a windows machine, while the files are hosted on a Debian fileserver. I connect via a mapped drive on windows.
Despite having a 1 Gbit Ethernet connection, doing an Eclipse project refresh is quite slow, up to 5 minutes, and I am blocked from working while this happens.
This normally wouldn't be such a problem, since Eclipse theoretically shouldn't have to do a full refresh very often. However, I also use the Subclipse plugin, which triggers a full refresh each time it completes a switch/update.
My hunch is that the slowest part of the process is Eclipse checking the 10k files one by one for changes over Samba.
There is a large number of files in the codebase that I never need to access from Eclipse, so I don't need it to check them at all. However, I can't figure out how to prevent it from doing so. I have tried marking them 'derived'; this prevents them from being included in the build process etc., but it doesn't seem to speed up the refresh at all. Eclipse still seems to check their changed status.
I've also removed the unneeded folders from PDT's 'build path'. This does speed up the 'building workspace' process, but again it doesn't speed up the actual refresh that precedes building (which is what takes the most time).
Thanks all for your suggestions. Basically, JW was on the right track. Work locally.
To that end, I discovered a plugin called FileSync:
http://andrei.gmxhome.de/filesync/
This automatically copies the changed files to the network share. Works fantastically. I can now do a complete update/switch/refresh from within Eclipse in a couple of seconds.
Do you have to store the files on a share? Maybe you can set up some sort of automatic mirroring, so you work with the files locally, and they get automatically copied to the share. I'm in a similar situation, and I'd hate to give up the speed of editing files on my own machine.
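The mirroring idea in this answer boils down to a one-way, timestamp-based copy from the local working tree to the share. As a rough sketch only (the `mirror` helper and its paths are my own illustration, not the API of the FileSync plugin mentioned earlier):

```python
import shutil
from pathlib import Path

def mirror(src_root: Path, dst_root: Path) -> list[Path]:
    """One-way sync: copy files that are newer in src_root, or missing from dst_root."""
    copied = []
    for src in src_root.rglob("*"):
        if not src.is_file():
            continue
        dst = dst_root / src.relative_to(src_root)
        if not dst.exists() or src.stat().st_mtime > dst.stat().st_mtime:
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst)  # copy2 preserves mtimes, so unchanged files are skipped next pass
            copied.append(dst)
    return copied
```

Run periodically (or from a file-watcher), something like this keeps the share current while all editing, building, and refreshing happen against fast local disks.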
Given that it's under Subversion, why not keep the files locally and use a post-commit hook to update to the latest version on the dev server after every commit? (Or put a specific string in the commit log, e.g. '##DEPLOY##', when you want to update dev, and only run the update when the post-commit hook sees this string.)
Apart from the refresh speed-up, the advantage of this technique is that you can have broken files that you are working on in Eclipse while the dev server stays OK (albeit with an older version of the code).
The disadvantage is that you have to do a commit to push your saved files onto the dev server.
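A minimal version of such a hook can be sketched in Python (Subversion hooks can be any executable). Only the '##DEPLOY##' marker idea comes from the answer above; the working-copy path and helper names are illustrative assumptions:

```python
#!/usr/bin/env python3
"""Sketch of a Subversion post-commit hook: deploy only when the log message asks for it."""
import subprocess
import sys

DEPLOY_MARKER = "##DEPLOY##"        # marker string suggested in the answer
DEV_WORKING_COPY = "/var/www/dev"   # assumed checkout path on the dev server

def message_requests_deploy(message: str) -> bool:
    """Return True if the commit message contains the deploy marker."""
    return DEPLOY_MARKER in message

def main() -> None:
    # Subversion invokes post-commit hooks as: post-commit REPOS-PATH REVISION
    repo_path, revision = sys.argv[1], sys.argv[2]
    log = subprocess.run(
        ["svnlook", "log", "-r", revision, repo_path],
        capture_output=True, text=True, check=True,
    ).stdout
    if message_requests_deploy(log):
        subprocess.run(["svn", "update", DEV_WORKING_COPY], check=True)

if __name__ == "__main__" and len(sys.argv) >= 3:
    main()
```

Installed as `hooks/post-commit` in the repository, this updates the dev checkout only for commits that opt in, so routine commits stay fast.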
I solved this problem by changing the "File Transfer Buffer Size" at:
Window -> Preferences -> Remote Systems -> Files
and changing the "File transfer buffer size" Download (KB) and Upload (KB) values to something high. I set them to 1000 KB; by default they are 40 KB.
Use the offline folders feature in Windows by right-clicking the share and selecting "Make available offline".
It can save a lot of time and round-trip delay in the file-sharing protocol.
Using svn:externals with the revision flag for the non-changing files might prevent Subclipse from refreshing them on update. Then again, it might not. Since you'd have to make some changes to the structure of your Subversion repository to get it working, I would suggest you do some simple testing before doing it for real.
