I've deployed an app on Heroku and I'm trying to hook up the Amazon RDS add-on. I created an Amazon RDS instance and uploaded my MySQL database there. Then I followed all the steps described in the Heroku documentation on how to connect to Amazon RDS, including the authorization process:
https://devcenter.heroku.com//articles/amazon_rds
I have also set the database URL on the Amazon RDS add-on. I am able to connect to my Amazon instance from a MySQL management tool, so the credentials and the host address are correct.
When I run my app I get the following exception:
org.postgresql.util.PSQLException: Connection refused. Check that the hostname and port are correct and that the postmaster is accepting TCP/IP connections.
org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:138)
org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:66)
org.postgresql.jdbc2.AbstractJdbc2Connection.<init>(AbstractJdbc2Connection.java:125)
org.postgresql.jdbc3.AbstractJdbc3Connection.<init>(AbstractJdbc3Connection.java:30)
org.postgresql.jdbc3g.AbstractJdbc3gConnection.<init>(AbstractJdbc3gConnection.java:22)
org.postgresql.jdbc4.AbstractJdbc4Connection.<init>(AbstractJdbc4Connection.java:32)
org.postgresql.jdbc4.Jdbc4Connection.<init>(Jdbc4Connection.java:24)
org.postgresql.Driver.makeConnection(Driver.java:393)
org.postgresql.Driver.connect(Driver.java:267)
org.apache.commons.dbcp.DriverConnectionFactory.createConnection(DriverConnectionFactory.java:38)
org.apache.commons.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:582)
org.apache.commons.dbcp.BasicDataSource.validateConnectionFactory(BasicDataSource.java:1556)
org.apache.commons.dbcp.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:1545)
org.apache.commons.dbcp.BasicDataSource.createDataSource(BasicDataSource.java:1388)
org.apache.commons.dbcp.BasicDataSource.getConnection(BasicDataSource.java:1044)
org.hibernate.ejb.connection.InjectedDataSourceConnectionProvider.getConnection(InjectedDataSourceConnectionProvider.java:71)
org.hibernate.jdbc.ConnectionManager.openConnection(ConnectionManager.java:446)
org.hibernate.jdbc.ConnectionManager.getConnection(ConnectionManager.java:167)
org.hibernate.jdbc.JDBCContext.connection(JDBCContext.java:160)
org.hibernate.transaction.JDBCTransaction.begin(JDBCTransaction.java:81)
org.hibernate.impl.SessionImpl.beginTransaction(SessionImpl.java:1473)
org.hibernate.ejb.TransactionImpl.begin(TransactionImpl.java:60)
org.springframework.orm.jpa.DefaultJpaDialect.beginTransaction(DefaultJpaDialect.java:70)
org.springframework.orm.jpa.vendor.HibernateJpaDialect.beginTransaction(HibernateJpaDialect.java:59)
org.springframework.orm.jpa.JpaTransactionManager.doBegin(JpaTransactionManager.java:377)
org.springframework.transaction.support.AbstractPlatformTransactionManager.getTransaction(AbstractPlatformTransactionManager.java:371)
org.springframework.transaction.interceptor.TransactionAspectSupport.createTransactionIfNecessary(TransactionAspectSupport.java:335)
org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:105)
org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:202)
$Proxy19.listPeople(Unknown Source)
com.example.controller.PersonController.listPeople(PersonController.java:27)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:616)
org.springframework.web.method.support.InvocableHandlerMethod.invoke(InvocableHandlerMethod.java:213)
org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:126)
org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:96)
org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:617)
org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:578)
org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:80)
org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:923)
org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:852)
org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:882)
org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:778)
javax.servlet.http.HttpServlet.service(HttpServlet.java:621)
javax.servlet.http.HttpServlet.service(HttpServlet.java:722)
root cause
java.net.ConnectException: Connection timed out
java.net.PlainSocketImpl.socketConnect(Native Method)
java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:327)
java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:193)
java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:180)
java.net.SocksSocketImpl.connect(SocksSocketImpl.java:384)
java.net.Socket.connect(Socket.java:546)
java.net.Socket.connect(Socket.java:495)
java.net.Socket.<init>(Socket.java:392)
java.net.Socket.<init>(Socket.java:206)
org.postgresql.core.PGStream.<init>(PGStream.java:62)
org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:76)
org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:66)
org.postgresql.jdbc2.AbstractJdbc2Connection.<init>(AbstractJdbc2Connection.java:125)
org.postgresql.jdbc3.AbstractJdbc3Connection.<init>(AbstractJdbc3Connection.java:30)
org.postgresql.jdbc3g.AbstractJdbc3gConnection.<init>(AbstractJdbc3gConnection.java:22)
org.postgresql.jdbc4.AbstractJdbc4Connection.<init>(AbstractJdbc4Connection.java:32)
org.postgresql.jdbc4.Jdbc4Connection.<init>(Jdbc4Connection.java:24)
org.postgresql.Driver.makeConnection(Driver.java:393)
org.postgresql.Driver.connect(Driver.java:267)
org.apache.commons.dbcp.DriverConnectionFactory.createConnection(DriverConnectionFactory.java:38)
org.apache.commons.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:582)
org.apache.commons.dbcp.BasicDataSource.validateConnectionFactory(BasicDataSource.java:1556)
org.apache.commons.dbcp.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:1545)
org.apache.commons.dbcp.BasicDataSource.createDataSource(BasicDataSource.java:1388)
org.apache.commons.dbcp.BasicDataSource.getConnection(BasicDataSource.java:1044)
org.hibernate.ejb.connection.InjectedDataSourceConnectionProvider.getConnection(InjectedDataSourceConnectionProvider.java:71)
org.hibernate.jdbc.ConnectionManager.openConnection(ConnectionManager.java:446)
org.hibernate.jdbc.ConnectionManager.getConnection(ConnectionManager.java:167)
org.hibernate.jdbc.JDBCContext.connection(JDBCContext.java:160)
org.hibernate.transaction.JDBCTransaction.begin(JDBCTransaction.java:81)
org.hibernate.impl.SessionImpl.beginTransaction(SessionImpl.java:1473)
org.hibernate.ejb.TransactionImpl.begin(TransactionImpl.java:60)
org.springframework.orm.jpa.DefaultJpaDialect.beginTransaction(DefaultJpaDialect.java:70)
org.springframework.orm.jpa.vendor.HibernateJpaDialect.beginTransaction(HibernateJpaDialect.java:59)
org.springframework.orm.jpa.JpaTransactionManager.doBegin(JpaTransactionManager.java:377)
org.springframework.transaction.support.AbstractPlatformTransactionManager.getTransaction(AbstractPlatformTransactionManager.java:371)
org.springframework.transaction.interceptor.TransactionAspectSupport.createTransactionIfNecessary(TransactionAspectSupport.java:335)
org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:105)
org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:202)
$Proxy19.listPeople(Unknown Source)
com.example.controller.PersonController.listPeople(PersonController.java:27)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:616)
org.springframework.web.method.support.InvocableHandlerMethod.invoke(InvocableHandlerMethod.java:213)
org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:126)
org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:96)
org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:617)
org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:578)
org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:80)
org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:923)
org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:852)
org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:882)
org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:778)
javax.servlet.http.HttpServlet.service(HttpServlet.java:621)
javax.servlet.http.HttpServlet.service(HttpServlet.java:722)
Does anybody have an idea why this is happening? I have double-checked that the credentials and the host address are correct.
Well, I finally figured out what was wrong. I initially created the Amazon RDS instance in Ireland (since I reside in Europe). However, Heroku is not able to connect to Ireland instances (apparently Heroku itself runs on EC2 instances in the US). So I created another instance in US East (N. Virginia) and it worked! I'm not sure whether it also works with the other US regions, US West (Oregon) and US West (N. California).
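For anyone hitting the same thing: once the instance exists in us-east-1, pointing the Heroku app at it is just a config change. A minimal sketch, assuming the standard Heroku CLI (the hostname, credentials and database name below are placeholders, not real values):

heroku config:set DATABASE_URL="mysql://user:password@mydb.xxxxxxxxxxxx.us-east-1.rds.amazonaws.com:3306/mydbname"   # placeholder host/credentials
heroku config    # verify the value the app will actually read at boot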
Related
I have the Apache JMeter tool on my PC and I am trying to stress-test my app deployed on GKE.
Sending some requests concurrently or sequentially, I am noticing weird performance.
For example, if I make 100 requests to one of the endpoints, most of the requests are successful, but 3 or 4 out of the 100 may return the following message:
org.apache.http.conn.HttpHostConnectException: Connect to 35.246.108.130:31225 [/35.246.108.130] failed: Connection refused: connect
at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:156)
at org.apache.jmeter.protocol.http.sampler.HTTPHC4Impl$JMeterDefaultHttpClientConnectionOperator.connect(HTTPHC4Impl.java:404)
at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:376)
at org.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:393)
at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:236)
at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:186)
at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89)
at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:83)
at org.apache.jmeter.protocol.http.sampler.HTTPHC4Impl.executeRequest(HTTPHC4Impl.java:935)
at org.apache.jmeter.protocol.http.sampler.HTTPHC4Impl.sample(HTTPHC4Impl.java:646)
at org.apache.jmeter.protocol.http.sampler.HTTPSamplerProxy.sample(HTTPSamplerProxy.java:66)
at org.apache.jmeter.protocol.http.sampler.HTTPSamplerBase.sample(HTTPSamplerBase.java:1296)
at org.apache.jmeter.protocol.http.sampler.HTTPSamplerBase.sample(HTTPSamplerBase.java:1285)
at org.apache.jmeter.threads.JMeterThread.doSampling(JMeterThread.java:638)
at org.apache.jmeter.threads.JMeterThread.executeSamplePackage(JMeterThread.java:558)
at org.apache.jmeter.threads.JMeterThread.processSampler(JMeterThread.java:489)
at org.apache.jmeter.threads.JMeterThread.run(JMeterThread.java:256)
at java.lang.Thread.run(Unknown Source)
Caused by: java.net.ConnectException: Connection refused: connect
at java.net.DualStackPlainSocketImpl.connect0(Native Method)
at java.net.DualStackPlainSocketImpl.socketConnect(Unknown Source)
at java.net.AbstractPlainSocketImpl.doConnect(Unknown Source)
at java.net.AbstractPlainSocketImpl.connectToAddress(Unknown Source)
at java.net.AbstractPlainSocketImpl.connect(Unknown Source)
at java.net.PlainSocketImpl.connect(Unknown Source)
at java.net.SocksSocketImpl.connect(Unknown Source)
at java.net.Socket.connect(Unknown Source)
at org.apache.http.conn.socket.PlainConnectionSocketFactory.connectSocket(PlainConnectionSocketFactory.java:75)
at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:142)
... 19 more
Also, the number of requests returning errors may differ (smaller or larger) when the tests are run from a different host machine.
Any ideas about what may be causing this problem?
Knowing that you are performing a stress test, a possible explanation could be that your application crashed at that particular instant and restarted. A Connection Refused error is usually thrown when there is no application listening for incoming requests.
More details in this Stack Overflow thread.
Also check whether any APM software is running that collects application logs; it will give you more hints about what happened at that moment.
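If the app runs on GKE, a quick way to test the crash-and-restart hypothesis (assuming you have kubectl access to the cluster; <pod-name> below is a placeholder for your pod's name) is to look at restart counts and the previous container's logs:

kubectl get pods                      # the RESTARTS column shows whether containers have been restarting
kubectl describe pod <pod-name>       # "Last State" reveals crash exit codes or OOMKilled
kubectl logs <pod-name> --previous    # logs from the container instance that died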
We are facing issues with JMeter remote testing. The master is stuck at:
Waiting for possible Shutdown/StopTestNow/Heapdump message on port 4445
While the client is throwing:
Connection refused to host exception
We are running the master like this:
.\jmeter -Djava.rmi.server.hostname=10.19.120.43 -n -t .\Test.jmx -R 10.75.225.188
But on the slave side, the test status notification is sent back to some other IP address:
2020-06-11 15:54:01,788 INFO o.a.j.e.RemoteJMeterEngineImpl: Creating JMeter engine on host 10.75.225.188 base '.'
2020-06-11 15:54:01,788 INFO o.a.j.e.RemoteJMeterEngineImpl: Remote client host: 10.19.120.43
2020-06-11 15:54:01,788 INFO o.a.j.s.FileServer: Set new base='.'
2020-06-11 15:54:01,793 INFO o.a.j.e.RemoteJMeterEngineImpl: Cleaning previously set properties: {sample_variables=ulp_buffer_fill,ulp_lag_time,ulp_play_time,ulp_lag_ratio,ulp_dwn_time,ulp_hits,ulp_avg_chunk_time,ulp_avg_manifest_time}
2020-06-11 15:54:01,794 INFO o.a.j.e.StandardJMeterEngine: Applying properties {sample_variables=ulp_buffer_fill,ulp_lag_time,ulp_play_time,ulp_lag_ratio,ulp_dwn_time,ulp_hits,ulp_avg_chunk_time,ulp_avg_manifest_time}
2020-06-11 15:54:01,795 INFO o.a.j.e.RemoteJMeterEngineImpl: Running test
2020-06-11 15:54:01,797 INFO o.a.j.e.StandardJMeterEngine: Running the test!
2020-06-11 15:54:01,797 INFO o.a.j.s.SampleEvent: List of sample_variables: [ulp_buffer_fill, ulp_lag_time, ulp_play_time, ulp_lag_ratio, ulp_dwn_time, ulp_hits, ulp_avg_chunk_time, ulp_avg_manifest_time]
2020-06-11 15:54:22,801 ERROR o.a.j.s.RemoteListenerWrapper: testStarted(host) on 10.75.225.188
java.rmi.ConnectException: Connection refused to host: 10.0.75.1; nested exception is:
java.net.ConnectException: Connection timed out: connect
at sun.rmi.transport.tcp.TCPEndpoint.newSocket(Unknown Source) ~[?:1.8.0_231]
at sun.rmi.transport.tcp.TCPChannel.createConnection(Unknown Source) ~[?:1.8.0_231]
at sun.rmi.transport.tcp.TCPChannel.newConnection(Unknown Source) ~[?:1.8.0_231]
at sun.rmi.server.UnicastRef.invoke(Unknown Source) ~[?:1.8.0_231]
at java.rmi.server.RemoteObjectInvocationHandler.invokeRemoteMethod(Unknown Source) ~[?:1.8.0_231]
at java.rmi.server.RemoteObjectInvocationHandler.invoke(Unknown Source) ~[?:1.8.0_231]
at com.sun.proxy.$Proxy20.testStarted(Unknown Source) ~[?:?]
at org.apache.jmeter.samplers.RemoteListenerWrapper.testStarted(RemoteListenerWrapper.java:79) [ApacheJMeter_core.jar:4.0 r1823414]
at org.apache.jmeter.engine.StandardJMeterEngine.notifyTestListenersOfStart(StandardJMeterEngine.java:217) [ApacheJMeter_core.jar:4.0 r1823414]
at org.apache.jmeter.engine.StandardJMeterEngine.run(StandardJMeterEngine.java:384) [ApacheJMeter_core.jar:4.0 r1823414]
at java.lang.Thread.run(Unknown Source) [?:1.8.0_231]
Caused by: java.net.ConnectException: Connection timed out: connect
at java.net.DualStackPlainSocketImpl.connect0(Native Method) ~[?:1.8.0_231]
at java.net.DualStackPlainSocketImpl.socketConnect(Unknown Source) ~[?:1.8.0_231]
at java.net.AbstractPlainSocketImpl.doConnect(Unknown Source) ~[?:1.8.0_231]
at java.net.AbstractPlainSocketImpl.connectToAddress(Unknown Source) ~[?:1.8.0_231]
at java.net.AbstractPlainSocketImpl.connect(Unknown Source) ~[?:1.8.0_231]
at java.net.PlainSocketImpl.connect(Unknown Source) ~[?:1.8.0_231]
at java.net.SocksSocketImpl.connect(Unknown Source) ~[?:1.8.0_231]
at java.net.Socket.connect(Unknown Source) ~[?:1.8.0_231]
at java.net.Socket.connect(Unknown Source) ~[?:1.8.0_231]
at java.net.Socket.<init>(Unknown Source) ~[?:1.8.0_231]
at java.net.Socket.<init>(Unknown Source) ~[?:1.8.0_231]
at sun.rmi.transport.proxy.RMIDirectSocketFactory.createSocket(Unknown Source) ~[?:1.8.0_231]
at sun.rmi.transport.proxy.RMIMasterSocketFactory.createSocket(Unknown Source) ~[?:1.8.0_231]
... 11 more
FYI: this machine has Docker running on it, and 10.0.75.1 is the address associated with it.
You need to EXPOSE port 1099 (or whatever port you're using as the server_port) in your Dockerfile (or use the host network driver), and open it in the firewall of your 10.75.225.188 machine. This is the minimum requirement for the JMeter master to be able to establish connectivity with the slave machine.
More information: Remote hosts and RMI configuration
You can also refer to the JMeter Distributed Testing with Docker article; it has some sample networking configuration you may want to apply to your system.
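For illustration, a minimal slave-side setup along those lines might look like the following sketch (1099 is JMeter's default server_port; the image name is a placeholder):

# on the slave (10.75.225.188): pin the RMI ports so they can be exposed deterministically
jmeter-server -Djava.rmi.server.hostname=10.75.225.188 -Dserver_port=1099 -Dserver.rmi.localport=1099
# if the slave runs inside Docker, either publish the pinned port ...
docker run -p 1099:1099 <your-jmeter-slave-image>
# ... or avoid the NAT layer entirely with the host network driver
docker run --network host <your-jmeter-slave-image>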
I have a job that gets messages from an on-premises Kafka cluster and sends them to BigQuery using Dataproc. But sometimes I get one of the two exceptions below, which causes my job to fail.
ConnectionRefused:
java.net.ConnectException: Call From <worker-3>/<ip> to <worker-1>:50267 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
at sun.reflect.GeneratedConstructorAccessor44.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:792)
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:732)
at org.apache.hadoop.ipc.Client.call(Client.java:1480)
at org.apache.hadoop.ipc.Client.call(Client.java:1413)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
at com.sun.proxy.$Proxy81.startContainers(Unknown Source)
at org.apache.hadoop.yarn.api.impl.pb.client.ContainerManagementProtocolPBClientImpl.startContainers(ContainerManagementProtocolPBClientImpl.java:96)
at sun.reflect.GeneratedMethodAccessor17.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy82.startContainers(Unknown Source)
at org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl$Container.launch(ContainerLauncherImpl.java:151)
at org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl$EventProcessor.run(ContainerLauncherImpl.java:375)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:615)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:713)
at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:376)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1529)
at org.apache.hadoop.ipc.Client.call(Client.java:1452)
... 15 more
I checked the items listed in http://wiki.apache.org/hadoop/ConnectionRefused.
On the exception, we can see that:
"Call From worker-3/ip to worker-1:50267"
Communication between workers on an unknown port (?) (50267)
And sometimes with:
"Call From master-node/ip to master-node:8020"
Communication on the master node with the NameNode metadata service port (8020).
Another reference, https://serverfault.com/questions/725262/what-causes-the-connection-refused-message, says:
The message 'Connection Refused' has two main causes:
Nothing is listening on the IP:Port you are trying to connect to.
The port is blocked by a firewall.
Shouldn't this be handled by Dataproc?
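A quick way to tell which of those two causes applies (assuming SSH access to the cluster nodes, e.g. via gcloud compute ssh; hostnames below are placeholders) is to probe the port from the node that fails to connect:

nc -zv <master-node> 8020        # if netcat is installed: "succeeded" means something is listening and reachable
sudo ss -tlnp | grep 8020        # run on the target node: is the service actually bound to the port?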
Connection reset by peer:
Failed on local exception: java.io.IOException: Connection reset by peer; Host Details : local host is: "<master-node>/<ip>"; destination host is: "<master-node>":8020;
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:776)
at org.apache.hadoop.ipc.Client.call(Client.java:1479)
at org.apache.hadoop.ipc.Client.call(Client.java:1412)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
at com.sun.proxy.$Proxy15.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:771)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy16.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2108)
at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1317)
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1426)
at gobblin.runtime.mapreduce.MRJobLauncher.<init>(MRJobLauncher.java:166)
at gobblin.runtime.mapreduce.CliMRJobLauncher.<init>(CliMRJobLauncher.java:59)
at gobblin.runtime.mapreduce.CliMRJobLauncher.main(CliMRJobLauncher.java:111)
... 5 more
Caused by: java.io.IOException: Connection reset by peer
at sun.nio.ch.FileDispatcherImpl.read0(Native Method)
at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:39)
at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
at sun.nio.ch.IOUtil.read(IOUtil.java:197)
at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)
at org.apache.hadoop.net.SocketInputStream$Reader.performIO(SocketInputStream.java:57)
at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
at java.io.FilterInputStream.read(FilterInputStream.java:133)
at java.io.FilterInputStream.read(FilterInputStream.java:133)
at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:520)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
at java.io.DataInputStream.readInt(DataInputStream.java:387)
at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1084)
at org.apache.hadoop.ipc.Client$Connection.run(Client.java:979)
My thoughts are that both errors are pretty generic and happened inside the Dataproc cluster, not related to my application. Am I correct? I'm using a retry approach to work around the problem right now, but I would like to understand why these errors occur in a Hadoop-based product like Dataproc and whether there is a way to avoid them. Shouldn't Dataproc take care of all of this for its users?
I am getting the exception below (java.net.DualStackPlainSocketImpl) in JMeter. I am able to access the application manually, so there is no problem on the server side. I suspect it is related to the ports. Before this I had opened more than 15 instances of JMeter to create test data; might that have caused this problem?
What is the difference, in terms of networking, between making the request through JMeter and making it manually? Please let me know if any other information is required.
java.net.ConnectException: Connection timed out: connect
at java.net.DualStackPlainSocketImpl.connect0(Native Method)
at java.net.DualStackPlainSocketImpl.socketConnect(Unknown Source)
at java.net.AbstractPlainSocketImpl.doConnect(Unknown Source)
at java.net.AbstractPlainSocketImpl.connectToAddress(Unknown Source)
at java.net.AbstractPlainSocketImpl.connect(Unknown Source)
at java.net.PlainSocketImpl.connect(Unknown Source)
at java.net.SocksSocketImpl.connect(Unknown Source)
at java.net.Socket.connect(Unknown Source)
at sun.security.ssl.SSLSocketImpl.connect(Unknown Source)
at org.apache.http.conn.ssl.SSLSocketFactory.connectSocket(SSLSocketFactory.java:542)
at org.apache.http.conn.ssl.SSLSocketFactory.connectSocket(SSLSocketFactory.java:414)
at org.apache.jmeter.protocol.http.sampler.LazySchemeSocketFactory.connectSocket(LazySchemeSocketFactory.java:97)
at org.apache.http.impl.conn.DefaultClientConnectionOperator.openConnection(DefaultClientConnectionOperator.java:180)
at org.apache.jmeter.protocol.http.sampler.hc.ManagedClientConnectionImpl.open(ManagedClientConnectionImpl.java:318)
at org.apache.jmeter.protocol.http.sampler.MeasuringConnectionManager$MeasuredConnection.open(MeasuringConnectionManager.java:114)
at org.apache.http.impl.client.DefaultRequestDirector.tryConnect(DefaultRequestDirector.java:610)
at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:445)
at org.apache.http.impl.client.AbstractHttpClient.doExecute(AbstractHttpClient.java:835)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:83)
at org.apache.jmeter.protocol.http.sampler.HTTPHC4Impl.executeRequest(HTTPHC4Impl.java:654)
at org.apache.jmeter.protocol.http.sampler.HTTPHC4Impl.sample(HTTPHC4Impl.java:413)
at org.apache.jmeter.protocol.http.sampler.HTTPSamplerProxy.sample(HTTPSamplerProxy.java:74)
at org.apache.jmeter.protocol.http.sampler.HTTPSamplerBase.sample(HTTPSamplerBase.java:1189)
at org.apache.jmeter.protocol.http.sampler.HTTPSamplerBase.sample(HTTPSamplerBase.java:1178)
at org.apache.jmeter.threads.JMeterThread.executeSamplePackage(JMeterThread.java:491)
at org.apache.jmeter.threads.JMeterThread.processSampler(JMeterThread.java:425)
at org.apache.jmeter.threads.JMeterThread.run(JMeterThread.java:254)
at java.lang.Thread.run(Unknown Source)
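As a side note, one quick check for the port-exhaustion suspicion (the trace above comes from a Windows load generator) is to count client sockets stuck in TIME_WAIT while the test is running:

netstat -ano | findstr TIME_WAIT

A very large number of entries here would lend some support to the theory that many parallel JMeter instances are exhausting the local ephemeral port range; it is only a diagnostic hint, not a confirmation of the cause.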
I am trying to set up Hadoop 2.7.2 on Ubuntu 16.04.
I have set up the Hadoop single-node cluster; I'm now trying Hive with Derby for the metastore.
I believe that I've supplied the appropriate permissions on the localhost port in java.policy, and I have the Hadoop services started.
However, when I try to run Hive with $HIVE_HOME/bin/hive, it fails with the following exception trace.
Any help is appreciated.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/lib/hive/lib/hive-jdbc-2.0.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/hive/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Logging initialized using configuration in jar:file:/usr/lib/hive/lib/hive-common-2.0.0.jar!/hive-log4j2.properties
Exception in thread "main" java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1550)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3080)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3108)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:521)
at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:494)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:709)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:645)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1548)
... 15 more
Caused by: javax.jdo.JDOFatalDataStoreException: Unable to open a test connection to the given database. JDBC url = jdbc:derby://localhost:1527/metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
java.sql.SQLNonTransientConnectionException: java.net.ConnectException : Error connecting to server localhost on port 1527 with message Connection refused.
at org.apache.derby.client.am.SQLExceptionFactory40.getSQLException(Unknown Source)
at org.apache.derby.client.am.SqlException.getSQLException(Unknown Source)
at org.apache.derby.jdbc.ClientDriver.connect(Unknown Source)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:208)
at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:483)
at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:296)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:606)
at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
at org.datanucleus.NucleusContextHelper.createStoreManagerForProperties(NucleusContextHelper.java:133)
at org.datanucleus.PersistenceNucleusContextImpl.initialise(PersistenceNucleusContextImpl.java:420)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:821)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:338)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:217)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
at java.security.AccessController.doPrivileged(Native Method)
at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:397)
at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:426)
at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:320)
at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:287)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:55)
at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:64)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:516)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:481)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:547)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:370)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5749)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:219)
at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:67)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1548)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3080)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3108)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:521)
at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:494)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:709)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:645)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: org.apache.derby.client.am.DisconnectException: java.net.ConnectException : Error connecting to server localhost on port 1527 with message Connection refused.
at org.apache.derby.client.net.NetAgent.<init>(Unknown Source)
at org.apache.derby.client.net.NetConnection.newAgent_(Unknown Source)
at org.apache.derby.client.am.Connection.<init>(Unknown Source)
at org.apache.derby.client.net.NetConnection.<init>(Unknown Source)
at org.apache.derby.client.net.NetConnection40.<init>(Unknown Source)
at org.apache.derby.client.net.ClientJDBCObjectFactoryImpl40.newNetConnection(Unknown Source)
... 66 more
Caused by: java.net.ConnectException: Connection refused
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:589)
at java.net.Socket.connect(Socket.java:538)
at java.net.Socket.<init>(Socket.java:434)
at java.net.Socket.<init>(Socket.java:211)
at javax.net.DefaultSocketFactory.createSocket(SocketFactory.java:271)
at org.apache.derby.client.net.OpenSocketAction.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
... 72 more
------
NestedThrowables:
java.sql.SQLException: Unable to open a test connection to the given database. JDBC url = jdbc:derby://localhost:1527/metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
java.sql.SQLNonTransientConnectionException: java.net.ConnectException : Error connecting to server localhost on port 1527 with message Connection refused.
at org.apache.derby.client.am.SQLExceptionFactory40.getSQLException(Unknown Source)
at org.apache.derby.client.am.SqlException.getSQLException(Unknown Source)
at org.apache.derby.jdbc.ClientDriver.connect(Unknown Source)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:208)
at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:483)
at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:296)
Caused by: java.net.ConnectException: Connection refused
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:589)
at java.net.Socket.connect(Socket.java:538)
at java.net.Socket.<init>(Socket.java:434)
at java.net.Socket.<init>(Socket.java:211)
at javax.net.DefaultSocketFactory.createSocket(SocketFactory.java:271)
at org.apache.derby.client.net.OpenSocketAction.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
... 72 more
Your Ubuntu server is refusing connections on port 1527. Either the Derby server isn't running or a firewall rule is blocking the connections. Run lsof -i TCP:1527 to see whether a process is listening on that port. You can also try telnetting to the port with telnet localhost 1527.
If you do see a process on the port, then the connection is probably being blocked by a firewall rule, or a stale process is holding the port open. You can kill the PID and try restarting Hive.
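For reference, a minimal check-and-start sequence might look like this (the Derby install location is an assumption; adjust the path to wherever Derby lives on your box):

lsof -i TCP:1527                                              # anything listening on the Derby port?
telnet localhost 1527                                         # can the port be reached at all?
$DERBY_HOME/bin/startNetworkServer -h localhost -p 1527 &     # if nothing is listening, start the network server

Once the port is listening, re-running $HIVE_HOME/bin/hive should let the metastore reach jdbc:derby://localhost:1527/metastore_db.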
If the network server won't stay up, debug why it isn't starting by looking at the Derby / Hive server logs. These kinds of conditions are usually caused by permission issues on Linux file paths, logs, etc.
You may also want to research using MySQL for your Hive metastore if you plan on using this for more than a sandbox environment.