Grails 2.4.4: PooledConnection has already been closed (java-8)

I have been trying to upgrade my project from Grails 2.1.2 to Grails 2.4.4. This project calls another module (upgraded to Java 8) that uses ibatis for database connections. While the module works fine as a standalone, it gives me a "Connection Closed" exception when accessed from the Grails project.
This application had been working fine on Grails 2.1.2 with Java 6. However, the upgrade seems to be breaking something.
Exception:
Caused by: java.sql.SQLException: PooledConnection has already been closed.
at org.apache.tomcat.jdbc.pool.DisposableConnectionFacade.invoke(DisposableConnectionFacade.java:86)
at com.sun.proxy.$Proxy35.prepareStatement(Unknown Source)
at sun.reflect.GeneratedMethodAccessor354.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.springsource.loaded.ri.ReflectiveInterceptor.jlrMethodInvoke(ReflectiveInterceptor.java:1270)
at org.springframework.jdbc.datasource.LazyConnectionDataSourceProxy$LazyConnectionInvocationHandler.invoke(LazyConnectionDataSourceProxy.java:376)
at com.sun.proxy.$Proxy37.prepareStatement(Unknown Source)
at sun.reflect.GeneratedMethodAccessor354.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.springsource.loaded.ri.ReflectiveInterceptor.jlrMethodInvoke(ReflectiveInterceptor.java:1270)
at org.springframework.jdbc.datasource.TransactionAwareDataSourceProxy$TransactionAwareInvocationHandler.invoke(TransactionAwareDataSourceProxy.java:240)
at com.sun.proxy.$Proxy37.prepareStatement(Unknown Source)
at org.apache.ibatis.executor.statement.PreparedStatementHandler.instantiateStatement(PreparedStatementHandler.java:87)
at org.apache.ibatis.executor.statement.BaseStatementHandler.prepare(BaseStatementHandler.java:88)
at org.apache.ibatis.executor.statement.RoutingStatementHandler.prepare(RoutingStatementHandler.java:59)
at org.apache.ibatis.executor.SimpleExecutor.prepareStatement(SimpleExecutor.java:85)
at org.apache.ibatis.executor.SimpleExecutor.doQuery(SimpleExecutor.java:62)
at org.apache.ibatis.executor.BaseExecutor.queryFromDatabase(BaseExecutor.java:324)
at org.apache.ibatis.executor.BaseExecutor.query(BaseExecutor.java:156)
at org.apache.ibatis.executor.BaseExecutor.query(BaseExecutor.java:136)
at org.apache.ibatis.session.defaults.DefaultSqlSession.selectList(DefaultSqlSession.java:148)
at org.apache.ibatis.session.defaults.DefaultSqlSession.selectList(DefaultSqlSession.java:141)
at org.apache.ibatis.session.defaults.DefaultSqlSession.selectOne(DefaultSqlSession.java:77)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.springsource.loaded.ri.ReflectiveInterceptor.jlrMethodInvoke(ReflectiveInterceptor.java:1270)
at org.mybatis.spring.SqlSessionTemplate$SqlSessionInterceptor.invoke(SqlSessionTemplate.java:434)
I have searched for other similar questions online, but I don't seem to be facing any of the issues mentioned. I do not see any abandoned connections in my logs. Also, from the DB side I can see that there are two connections still alive (minimum idle connections is set to 2).
DataSource.groovy:
dataSource {
    pooled = true
    driverClassName = "oracle.jdbc.driver.OracleDriver"
    username = xxx
    password = yyy
    dialect = 'org.hibernate.dialect.Oracle10gDialect'
    dbCreate = "none"
    properties {
        maxActive = 15
        maxIdle = 5
        minIdle = 2
        initialSize = 8
        minEvictableIdleTimeMillis = 60000
        timeBetweenEvictionRunsMillis = 60000
        maxWait = 10000
        testOnBorrow = true
        validationQuery = "select 1 from dual"
    }
}
EDIT 1:
After some more debugging on my side, I see that the issue appeared after we upgraded our Tomcat plugin to version 7.0.55 (earlier we were using 2.1.2). This new plugin uses jdbc-pool, through a JdbcInterceptor, to create the DB connections. These connections are then passed on to the second module (which uses ibatis).
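A jdbc-pool setting that looks directly related, though we have not verified it as a fix: the pool's useDisposableConnectionFacade property controls the DisposableConnectionFacade wrapper at the top of the stack trace, which invalidates a connection once close() has been called on it. Disabling it would sit in the existing properties block of DataSource.groovy:

properties {
    // ... existing pool settings ...
    // Disable the facade that rejects a connection after close();
    // this is the class throwing "PooledConnection has already been closed".
    useDisposableConnectionFacade = false
}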
EDIT 2: SOLVED
We created a different datasource around the above-mentioned datasource config, backed by c3p0, and placed it in resources.groovy. This was injected into the ibatis module, and this time around there was no "PooledConnection has already been closed" error. It seems like an issue with jdbc-pool, but we would like to know if there is some other workaround.
New config in resources.groovy:
dataSource_new(ComboPooledDataSource) { bean ->
    idleConnectionTestPeriod = 1 * 60 * 60
    testConnectionOnCheckin = true
    bean.destroyMethod = 'close'
    user = xxx
    password = yyy
    driverClass = <same as in datasource>
    jdbcUrl = zzz
}
BuildConfig.groovy: compile('com.mchange:c3p0:0.9.5.1')
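For reference, a sketch of how such a bean can be handed to a mybatis-spring setup from resources.groovy (the sqlSessionFactory bean name is illustrative; the SqlSessionTemplate in the stack trace indicates mybatis-spring is in use):

sqlSessionFactory(org.mybatis.spring.SqlSessionFactoryBean) {
    // Point mybatis-spring at the c3p0-backed datasource instead of the
    // Tomcat jdbc-pool datasource that Grails wires up by default.
    dataSource = ref('dataSource_new')
}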

Related

org.apache.thrift.transport.TTransportException: Cannot open without port?

I am quite new to Hive. I have a Java app which accesses Hive with Kerberos authentication, like below:
try {
    System.setProperty("java.security.krb5.conf", "/haManage/krb5.conf");
    StringBuilder sBuilder = new StringBuilder();
    sBuilder.append("jdbc:hive2://ha-cluster/default");
    sBuilder.append(";zk.quorum=").append("x.x.x.x,x.x.x.x"); // ip list
    sBuilder.append(";zk.port=").append("24002");
    if (isSecureVer) {
        sBuilder.append(";user.principal=")
                .append("hadoop#HADOOP.COM")
                .append(";user.keytab=")
                .append("/home/hdclient/gyj/user.keytab")
                .append(";sasl.qop=auth-conf;auth=KERBEROS;principal=hive/" +
                        "hadoop.hadoop.com#HADOOP.COM;zk.principal=zookeeper/hadoop.hadoop.com");
    }
    url = sBuilder.toString();
    logger.info(url);
    Class.forName("org.apache.hive.jdbc.HiveDriver");
    connToHive = DriverManager.getConnection(url, "", "");
} catch (Exception e) {
    logger.error("Error occurs", e);
}
But an exception occurs, shown below:
Caused by: org.apache.thrift.transport.TTransportException: Cannot open without port.
at org.apache.thrift.transport.TSocket.open(TSocket.java:172) ~[hive-exec-0.14.0.jar:0.14.0]
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:248) ~[hive-exec-0.14.0.jar:0.14.0]
at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37) ~[hive-exec-0.14.0.jar:0.14.0]
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52) ~[hive-exec-0.14.0.jar:0.14.0]
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49) ~[hive-exec-0.14.0.jar:0.14.0]
at java.security.AccessController.doPrivileged(Native Method) ~[na:1.7.0_45]
at javax.security.auth.Subject.doAs(Subject.java:415) ~[na:1.7.0_45]
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1656) ~[hadoop-common-2.6.4.jar:na]
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49) ~[hive-exec-0.14.0.jar:0.14.0]
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:190) ~[hive-jdbc-1.1.0.jar:1.1.0]
... 6 common frames omitted
Any help would be appreciated.
While you have the ZooKeeper port specified as a query-string parameter (needed for Kerberos auth), you also need the Hive port after the hostname part of the URL. The normal port used by Hive is 10000, so your URL might start like this:
sBuilder.append("jdbc:hive2://ha-cluster:10000/default");

Cannot execute SQL from inside HIVE UDF via jdbc driver

This is my first post! I've been searching for solutions to this to no avail for weeks (on and off)...
I have a Java Hive UDF in which I want to run SQL against a Hive table to map data in memory for later use. I have the connection information correct, but it cannot find the HiveDriver. I have tried adding the Hive JDBC driver jar in Hive and exporting it into a runnable jar.
Error below, relevant code after.
hive> add jar /tmp/xhUDFs.jar;
Added [/tmp/xhUDFs.jar] to class path
Added resources: [/tmp/xhUDFs.jar]
hive> CREATE TEMPORARY FUNCTION getRefdata as 'xhUDFs.GetRefdata';
OK
Time taken: 0.042 seconds
hive> select '06001',getRefdata('CBSAname','06001');
java.lang.ClassNotFoundException:
org.apache.hadoop.hive.jdbc.HiveDriver
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at xhUDFs.GetRefdata.getRefmap(GetRefdata.java:53)
at xhUDFs.GetRefdata.evaluate(GetRefdata.java:35)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.hadoop.hive.ql.exec.FunctionRegistry.invoke(FunctionRegistry.java:954)
at org.apache.hadoop.hive.ql.udf.generic.GenericUDFBridge.evaluate(GenericUDFBridge.java:182)
at org.apache.hadoop.hive.ql.udf.generic.GenericUDF.initializeAndFoldConstants(GenericUDF.java:168)
at org.apache.hadoop.hive.ql.plan.ExprNodeGenericFuncDesc.newInstance(ExprNodeGenericFuncDesc.java:233)
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getXpathOrFuncExprNodeDesc(TypeCheckProcFactory.java:1093)
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.process(TypeCheckProcFactory.java:1312)
at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:90)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatchAndReturn(DefaultGraphWalker.java:95)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:79)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:133)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:110)
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory.genExprNode(TypeCheckProcFactory.java:209)
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory.genExprNode(TypeCheckProcFactory.java:153)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genAllExprNodeDesc(SemanticAnalyzer.java:10499)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:10455)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:3822)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:3601)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPostGroupByBodyPlan(SemanticAnalyzer.java:8943)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:8898)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:9743)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:9636)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genOPTree(SemanticAnalyzer.java:10109)
at org.apache.hadoop.hive.ql.parse.CalcitePlanner.genOPTree(CalcitePlanner.java:329)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:10120)
at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:211)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:227)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:454)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:314)
at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1164)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1212)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1101)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1091)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:216)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:168)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:379)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:739)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:684)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:624)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136) java.lang.ClassNotFoundException:
org.apache.hadoop.hive.jdbc.HiveDriver
Now the code snippet:
try {
    Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
    Connection wConn = DriverManager.getConnection("jdbc:hive2://172.20.2.40:10000/refdata", "", "");
    Statement sql = wConn.createStatement();
    ResultSet wResult = sql.executeQuery("select distinct zipcode5,cbsa_name,msa_name,pmsa_name,region,csaname,cbsa_div_name from refdata.zip_code_business_all_16_1_2016_orc where primaryrecord='P'");
    while (wResult.next()) {
        censusMap.put(wResult.getString("zipcode5"), wResult.getString("cbsa_name"));
    }
} catch (SQLException | ClassNotFoundException e) {
    // TODO Auto-generated catch block
    e.printStackTrace();
}
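One detail worth checking (an observation added here, not a confirmed fix from this thread): the connection URL uses the HiveServer2 scheme jdbc:hive2://, but the class being loaded is the old HiveServer1 driver, which is exactly the class in the ClassNotFoundException. The HiveServer2 driver class, the same one used in the Kerberos question above, would be loaded like this, assuming the hive-jdbc jar and its dependencies are on the UDF's runtime classpath:

import java.sql.Connection;
import java.sql.DriverManager;

// HiveServer2 driver class, matching the jdbc:hive2:// URL;
// org.apache.hadoop.hive.jdbc.HiveDriver is the old HiveServer1 driver.
Class.forName("org.apache.hive.jdbc.HiveDriver");
Connection wConn = DriverManager.getConnection(
        "jdbc:hive2://172.20.2.40:10000/refdata", "", "");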

ActiveMQ connect, unknown data type: 49

I'm using ActiveMQ to send files from one application to another, and I've encountered this error when the consumer attempts to connect to the ActiveMQ server.
Here is the connect method:
public void connect() {
    connectionFactory = new ActiveMQConnectionFactory(
            "tcp://" + Configuration.getInstance().getServerAddress() +
            ":61616?jms.blobTransferPolicy.defaultUploadUrl=" + "http://" +
            Configuration.getInstance().getServerAddress() + ":8161/fileserver/" +
            "&connectionTimeout=0&soTimeout=0&soWriteTimeout=0" +
            "&useInactivityMonitor=false");
    try {
        connection = connectionFactory.createConnection();
        session = (ActiveMQSession) connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
        destination = session.createQueue(Configuration.getInstance().getQueueName());
        consumer = session.createConsumer(destination);
        connection.start();
        // System.out.println("Consumer connected");
    } catch (JMSException e) {
        logger.error("PACS", e);
        e.printStackTrace();
    }
}
This is exactly the same method I use to connect the producer to ActiveMQ, and in that case it works perfectly. On the consumer side I get the following error:
javax.jms.JMSException: Unknown data type: 49
at org.apache.activemq.util.JMSExceptionSupport.create(JMSExceptionSupport.java:54)
at org.apache.activemq.ActiveMQConnection.syncSendPacket(ActiveMQConnection.java:1417)
at org.apache.activemq.ActiveMQConnection.ensureConnectionInfoSent(ActiveMQConnection.java:1522)
at org.apache.activemq.ActiveMQConnection.createSession(ActiveMQConnection.java:328)
at com.kratossrl.pacs.consumer.CloudPacsConsumer.connect(CloudPacsConsumer.java:74)
at com.kratossrl.pacs.consumer.CloudPacsConsumer.start(CloudPacsConsumer.java:91)
at com.kratossrl.pacs.consumer.CloudPacsConsumer.run(CloudPacsConsumer.java:135)
at com.kratossrl.pacs.consumer.CloudPacsConsumer.init(CloudPacsConsumer.java:57)
at com.kratossrl.pacs.consumer.CloudPacsConsumer$1.call(CloudPacsConsumer.java:39)
at com.kratossrl.pacs.consumer.CloudPacsConsumer$1.call(CloudPacsConsumer.java:1)
at javafx.concurrent.Task$TaskCallable.call(Unknown Source)
at java.util.concurrent.FutureTask.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
Caused by: java.io.IOException: Unknown data type: 49
at org.apache.activemq.openwire.OpenWireFormat.doUnmarshal(OpenWireFormat.java:348)
at org.apache.activemq.openwire.OpenWireFormat.unmarshal(OpenWireFormat.java:268)
at org.apache.activemq.transport.tcp.TcpTransport.readCommand(TcpTransport.java:221)
at org.apache.activemq.transport.tcp.TcpTransport.doRun(TcpTransport.java:213)
at org.apache.activemq.transport.tcp.TcpTransport.run(TcpTransport.java:196)
... 1 more
I have searched the web for this error, but I have not found anything about the '49' value.
Has anyone encountered this situation, or knows the cause/solution of this problem?
Thanks in advance, and sorry for my imperfect English.
Double-check your ActiveMQ versions. It almost appears that the client is more recent than the server, so that when they negotiate the protocol, the client sends something the server does not understand and the exception is thrown.
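A quick way to compare the two sides (a sketch, not from the original answer): the standard JMS ConnectionMetaData reports the provider name and version, so running this on the side that does connect (the producer here) shows which client library is actually on the classpath; the broker's version is visible in its own logs and web console.

import javax.jms.Connection;
import javax.jms.ConnectionMetaData;
import javax.jms.JMSException;

public final class VersionCheck {
    // Print the JMS provider name/version of an already-open connection;
    // run on the producer side, where the connection succeeds.
    public static void print(Connection connection) throws JMSException {
        ConnectionMetaData meta = connection.getMetaData();
        System.out.println(meta.getJMSProviderName() + " " + meta.getProviderVersion());
    }
}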

CannotGetMongoDbConnectionException: Failed to authenticate to database

In the replica set's mongo shell, use products; db.auth('worker', 'a*******6'); works fine, but in spring-data-mongodb I encountered the following problem:
Exception in thread "main" org.springframework.data.mongodb.CannotGetMongoDbConnectionException: Failed to authenticate to database [products], username = [worker], password = [a*******6]
at org.springframework.data.mongodb.core.ReflectiveDbInvoker.authenticate(ReflectiveDbInvoker.java:83)
at org.springframework.data.mongodb.core.MongoDbUtils.doGetDB(MongoDbUtils.java:127)
at org.springframework.data.mongodb.core.MongoDbUtils.getDB(MongoDbUtils.java:94)
at org.springframework.data.mongodb.core.SimpleMongoDbFactory.getDb(SimpleMongoDbFactory.java:197)
at org.springframework.data.mongodb.core.SimpleMongoDbFactory.getDb(SimpleMongoDbFactory.java:185)
at org.springframework.data.mongodb.core.MongoTemplate.getDb(MongoTemplate.java:1595)
at org.springframework.data.mongodb.core.MongoTemplate.execute(MongoTemplate.java:441)
at org.springframework.data.mongodb.core.MongoTemplate.doCreateCollection(MongoTemplate.java:1612)
at org.springframework.data.mongodb.core.MongoTemplate.createCollection(MongoTemplate.java:492)
at com.qixin.appmarket.service.test.TestMongodb.main(TestMongodb.java:24)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:120)
Try updating spring-data-mongodb to the latest version. I was getting that authentication error while using version 1.7.2.RELEASE, but it stopped after updating to 1.8.0.RELEASE.
In MongoDB, you can use:
use products
db.createUser(
  {
    user: "accountUser",
    pwd: "password",
    roles: [ "readWrite", "dbAdmin" ]
  }
)
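For completeness, a minimal Java wiring sketch that authenticates through the SimpleMongoDbFactory seen in the stack trace (spring-data-mongodb 1.x API; the host and credentials below are placeholders, not taken from the question):

import com.mongodb.MongoClient;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;

public class MongoWiring {
    public static MongoTemplate template() throws Exception {
        // Placeholder host/port and credentials; use your replica set's.
        MongoClient client = new MongoClient("localhost", 27017);
        // Authenticates against the "products" database on first use.
        SimpleMongoDbFactory factory = new SimpleMongoDbFactory(
                client, "products", new UserCredentials("worker", "password"));
        return new MongoTemplate(factory);
    }
}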

Building a JMX client in a servlet installed on the Deployment Manager

I'm building a monitoring application as a servlet running on my WebSphere 7 ND deployment manager. The tool uses JMX to query the deployment manager for various data. Global security is enabled on the dmgr.
However, I'm having problems getting this to work. My first attempt was to use the WebSphere client code:
String sslProps = "file:" + base + "/properties/ssl.client.props";
System.setProperty("com.ibm.SSL.ConfigURL", sslProps);
String soapProps = "file:" + base + "/properties/soap.client.props";
System.setProperty("com.ibm.SOAP.ConfigURL", soapProps);
Properties connectProps = new Properties();
connectProps.setProperty(AdminClient.CONNECTOR_TYPE, AdminClient.CONNECTOR_TYPE_SOAP);
connectProps.setProperty(AdminClient.CONNECTOR_HOST, dmgrHost);
connectProps.setProperty(AdminClient.CONNECTOR_PORT, soapPort);
connectProps.setProperty(AdminClient.CONNECTOR_SECURITY_ENABLED, "true");
AdminClient adminClient = AdminClientFactory.createAdminClient(connectProps);
This results in the following exception:
Caused by: com.ibm.websphere.management.exception.ConnectorNotAvailableException: ADMC0016E: The system cannot create a SOAP connector to connect to host ssunlab10.apaceng.net at port 13903.
at com.ibm.ws.management.connector.soap.SOAPConnectorClient.getUrl(SOAPConnectorClient.java:1306)
at com.ibm.ws.management.connector.soap.SOAPConnectorClient.access$300(SOAPConnectorClient.java:128)
at com.ibm.ws.management.connector.soap.SOAPConnectorClient$4.run(SOAPConnectorClient.java:370)
at com.ibm.ws.security.util.AccessController.doPrivileged(AccessController.java:118)
at com.ibm.ws.management.connector.soap.SOAPConnectorClient.reconnect(SOAPConnectorClient.java:363)
... 22 more
Caused by: java.net.ConnectException: Connection refused
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.PlainSocketImpl.doConnect(PlainSocketImpl.java:333)
at java.net.PlainSocketImpl.connectToAddress(PlainSocketImpl.java:195)
at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:182)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:366)
at java.net.Socket.connect(Socket.java:519)
at java.net.Socket.connect(Socket.java:469)
at java.net.Socket.<init>(Socket.java:366)
at java.net.Socket.<init>(Socket.java:209)
at com.ibm.ws.management.connector.soap.SOAPConnectorClient.getUrl(SOAPConnectorClient.java:1286)
... 26 more
So I then tried to do it via RMI, adding sas.client.props to the environment and setting the connector type in the code to CONNECTOR_TYPE_RMI. This time, though, I got a NameNotFoundException out of CORBA:
Caused by: javax.naming.NameNotFoundException: Context: , name: JMXConnector: First component in name JMXConnector not found. [Root exception is org.omg.CosNaming.NamingContextPackage.NotFound: IDL:omg.org/CosNaming/NamingContext/NotFound:1.0]
To see if it was an IBM issue, I tried using the standard JMX connector as well with the same result (substitute AdminClient for JMXConnector in the above error)
JMXServiceURL url = new JMXServiceURL("service:jmx:rmi:///jndi/JMXConnector");
Hashtable h = new Hashtable();
String providerUrl = "corbaloc:iiop:" + dmgrHost + ":" + rmiPort + "/WsnAdminNameService";
h.put(Context.PROVIDER_URL, providerUrl);
// Specify the user ID and password for the server if security is enabled on server.
String[] credentials = new String[] { "***", "***" };
h.put("jmx.remote.credentials", credentials);
// Establish the JMX connection.
JMXConnector jmxc = JMXConnectorFactory.connect(url, h);
// Get the MBean server connection instance.
mbsc = jmxc.getMBeanServerConnection();
At this point, in desperation, I wrote a wsadmin script to run both the RMI and SOAP methods. To my amazement, this works fine. So my question is: why does the code not work in a servlet installed on the dmgr?
Regards,
Trevor
For the SOAP error, the ConnectException looks like the wrong SOAP host/port was used for the dmgr. I would double-check the server logs for the SOAP port. For the RMI error (NameNotFoundException), it looks like you're trying to use JMXConnectorFactory, which isn't supported by WAS.
If your application is installed on the dmgr, it's probably easiest to just use AdminServiceFactory.getAdminService to get an in-process reference to the AdminService rather than trying to open a new connection to the same process:
http://publib.boulder.ibm.com/infocenter/wasinfo/fep/topic/com.ibm.websphere.javadoc.doc/web/apidocs/com/ibm/websphere/management/AdminServiceFactory.html
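If the servlet runs in the dmgr JVM, a minimal sketch of that in-process approach looks like the following (the MBean query pattern is illustrative; check the WAS 7 Javadoc linked above for the exact AdminService methods):

import java.util.Set;
import javax.management.ObjectName;
import com.ibm.websphere.management.AdminService;
import com.ibm.websphere.management.AdminServiceFactory;

public class DmgrLocalQuery {
    public Set queryServers() throws Exception {
        // Runs inside the dmgr JVM, so no connector type, host/port,
        // or ssl/soap client properties are required.
        AdminService adminService = AdminServiceFactory.getAdminService();
        // Illustrative MBean query; adjust the pattern to what you monitor.
        return adminService.queryNames(
                new ObjectName("WebSphere:type=Server,*"), null);
    }
}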
