My Spring Boot app, deployed as a Docker container on Elastic Beanstalk, is unable to connect to an external RDS instance. It always gets stuck at "com.zaxxer.hikari.HikariDataSource - HikariPool-1 - Starting..." during startup.
Dockerfile
(I added the DB connection details in the ENTRYPOINT for troubleshooting purposes.)
FROM openjdk:8
WORKDIR "/mrbs"
ARG JAR_FILE=target/*jar
COPY ${JAR_FILE} ./app.jar
EXPOSE 8080
#CMD ["java","-jar","./app.jar"]
ENTRYPOINT ["java","-jar","./app.jar","--spring.datasource.url=jdbc:mysql://[security].[security].ap-southeast-1.rds.amazonaws.com:3306/mrbs?serverTimezone=GMT%2B8","--spring.datasource.password=[security]","--spring.datasource.username=[security]","--logging.file.path=/mrbs/logs/"]
Dockerrun.aws.json
{
  "AWSEBDockerrunVersion": 2,
  "containerDefinitions": [
    {
      "name": "api",
      "image": "[security]/mrbs-backend",
      "hostname": "api",
      "essential": true,
      "memory": 128
    }
  ]
}
.travis.yml
language: generic
sudo: required
services:
  - docker
before_install:
install: skip
before_script:
script:
  - mvn -f mrbs-backend clean package
  - docker build -t [security]/mrbs-backend ./mrbs-backend
  - echo "$DOCKER_PASSWORD" | docker login -u "$DOKCER_ID" --password-stdin
  - docker push [security]/mrbs-backend
deploy:
  skip_cleanup: true
  provider: elasticbeanstalk
  region: ap-southeast-1
  #app: mrbs-docker
  app: mrbs
  env: Mrbs-env-3
  bucket_name: elasticbeanstalk-ap-southeast-1-[security]351
  bucket_patch: mrbs
  on:
    branch: master
  access_key_id: $AWS_ACCESS_KEY
  secret_access_key: $AWS_SECRET_KEY
I added the security group mrbs-intra-docker to both RDS and Elastic Beanstalk:
RDS Security Group
(image),
Elastic Beanstalk Security Group
(image),
EC2 Security Group
(image),
When I check the logs from Elastic Beanstalk, they show that the Spring Boot app gets stuck at "com.zaxxer.hikari.HikariDataSource - HikariPool-1 - Starting..." during startup.
. ____ _ __ _ _
/\\ / ___'_ __ _ _(_)_ __ __ _ \ \ \ \
( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \
\\/ ___)| |_)| | | | | || (_| | ) ) ) )
' |____| .__|_| |_|_| |_\__, | / / / /
=========|_|==============|___/=/_/_/_/
:: Spring Boot :: (v2.3.3.RELEASE)
2020-11-01 09:14:16,590 [main] INFO c.j.m.MeetingRoomBookingSystemApplication - Starting MeetingRoomBookingSystemApplication v1.0 on api with PID 1 (/mrbs/app.jar started by root in /mrbs)
2020-11-01 09:14:16,599 [main] DEBUG c.j.m.MeetingRoomBookingSystemApplication - Running with Spring Boot v2.3.3.RELEASE, Spring v5.2.8.RELEASE
2020-11-01 09:14:16,600 [main] INFO c.j.m.MeetingRoomBookingSystemApplication - The following profiles are active: dev
2020-11-01 09:14:19,408 [main] INFO o.s.d.r.c.RepositoryConfigurationDelegate - Bootstrapping Spring Data JPA repositories in DEFERRED mode.
2020-11-01 09:14:19,729 [main] INFO o.s.d.r.c.RepositoryConfigurationDelegate - Finished Spring Data repository scanning in 294ms. Found 7 JPA repository interfaces.
2020-11-01 09:14:21,469 [main] INFO o.s.c.s.PostProcessorRegistrationDelegate$BeanPostProcessorChecker - Bean 'org.springframework.security.access.expression.method.DefaultMethodSecurityExpressionHandler#c03cf28' of type [org.springframework.security.access.expression.method.DefaultMethodSecurityExpressionHandler] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
2020-11-01 09:14:21,513 [main] INFO o.s.c.s.PostProcessorRegistrationDelegate$BeanPostProcessorChecker - Bean 'methodSecurityMetadataSource' of type [org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
2020-11-01 09:14:22,602 [main] INFO o.s.b.w.e.tomcat.TomcatWebServer - Tomcat initialized with port(s): 8080 (http)
2020-11-01 09:14:22,639 [main] INFO o.a.coyote.http11.Http11NioProtocol - Initializing ProtocolHandler ["http-nio-8080"]
2020-11-01 09:14:22,640 [main] INFO o.a.catalina.core.StandardService - Starting service [Tomcat]
2020-11-01 09:14:22,640 [main] INFO o.a.catalina.core.StandardEngine - Starting Servlet engine: [Apache Tomcat/9.0.37]
2020-11-01 09:14:22,826 [main] INFO o.a.c.c.C.[Tomcat].[localhost].[/] - Initializing Spring embedded WebApplicationContext
2020-11-01 09:14:22,827 [main] INFO o.s.b.w.s.c.ServletWebServerApplicationContext - Root WebApplicationContext: initialization completed in 6052 ms
2020-11-01 09:14:24,415 [main] INFO o.s.s.c.ThreadPoolTaskExecutor - Initializing ExecutorService 'applicationTaskExecutor'
2020-11-01 09:14:24,516 [main] INFO com.zaxxer.hikari.HikariDataSource - HikariPool-1 - Starting...
However, if I run the Docker image directly from the EC2 command line, the Spring Boot app starts successfully, which means the EC2 instance is able to connect to RDS using the same image...
[ec2-user@ip-172-31-24-202 ~]$ docker image ls
REPOSITORY TAG IMAGE ID CREATED SIZE
[security]/mrbs-backend latest bd84014e90df 9 minutes ago 570MB
amazon/amazon-ecs-agent latest ebac5fda27cb 8 weeks ago 67MB
[ec2-user@ip-172-31-24-202 ~]$ docker run [security]/mrbs-backend
. ____ _ __ _ _
/\\ / ___'_ __ _ _(_)_ __ __ _ \ \ \ \
( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \
\\/ ___)| |_)| | | | | || (_| | ) ) ) )
' |____| .__|_| |_|_| |_\__, | / / / /
=========|_|==============|___/=/_/_/_/
:: Spring Boot :: (v2.3.3.RELEASE)
2020-11-01 09:36:39,075 [main] INFO c.j.m.MeetingRoomBookingSystemApplication - Starting MeetingRoomBookingSystemApplication v1.0 on 1d93b8a14d12 with PID 1 (/mrbs/app.jar started by root in /mrbs)
2020-11-01 09:36:39,087 [main] DEBUG c.j.m.MeetingRoomBookingSystemApplication - Running with Spring Boot v2.3.3.RELEASE, Spring v5.2.8.RELEASE
2020-11-01 09:36:39,088 [main] INFO c.j.m.MeetingRoomBookingSystemApplication - The following profiles are active: dev
2020-11-01 09:36:44,035 [main] INFO o.s.d.r.c.RepositoryConfigurationDelegate - Bootstrapping Spring Data JPA repositories in DEFERRED mode.
2020-11-01 09:36:44,558 [main] INFO o.s.d.r.c.RepositoryConfigurationDelegate - Finished Spring Data repository scanning in 474ms. Found 7 JPA repository interfaces.
2020-11-01 09:36:47,066 [main] INFO o.s.c.s.PostProcessorRegistrationDelegate$BeanPostProcessorChecker - Bean 'org.springframework.security.access.expression.method.DefaultMethodSecurityExpressionHandler#c03cf28' of type [org.springframework.security.access.expression.method.DefaultMethodSecurityExpressionHandler] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
2020-11-01 09:36:47,107 [main] INFO o.s.c.s.PostProcessorRegistrationDelegate$BeanPostProcessorChecker - Bean 'methodSecurityMetadataSource' of type [org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
2020-11-01 09:36:49,047 [main] INFO o.s.b.w.e.tomcat.TomcatWebServer - Tomcat initialized with port(s): 8080 (http)
2020-11-01 09:36:49,099 [main] INFO o.a.coyote.http11.Http11NioProtocol - Initializing ProtocolHandler ["http-nio-8080"]
2020-11-01 09:36:49,106 [main] INFO o.a.catalina.core.StandardService - Starting service [Tomcat]
2020-11-01 09:36:49,107 [main] INFO o.a.catalina.core.StandardEngine - Starting Servlet engine: [Apache Tomcat/9.0.37]
2020-11-01 09:36:49,546 [main] INFO o.a.c.c.C.[Tomcat].[localhost].[/] - Initializing Spring embedded WebApplicationContext
2020-11-01 09:36:49,549 [main] INFO o.s.b.w.s.c.ServletWebServerApplicationContext - Root WebApplicationContext: initialization completed in 10161 ms
2020-11-01 09:36:51,992 [main] INFO o.s.s.c.ThreadPoolTaskExecutor - Initializing ExecutorService 'applicationTaskExecutor'
2020-11-01 09:36:52,141 [main] INFO com.zaxxer.hikari.HikariDataSource - HikariPool-1 - Starting...
2020-11-01 09:36:54,622 [main] INFO com.zaxxer.hikari.HikariDataSource - HikariPool-1 - Start completed.
filePath=/mrbs/logs/
2020-11-01 09:36:55,296 [main] DEBUG com.jiangwensi.mrbs.AppContext - setApplicationContext is called
2020-11-01 09:36:55,528 [task-1] INFO o.h.jpa.internal.util.LogHelper - HHH000204: Processing PersistenceUnitInfo [name: default]
2020-11-01 09:36:56,161 [main] WARN o.s.b.a.o.j.JpaBaseConfiguration$JpaWebConfiguration - spring.jpa.open-in-view is enabled by default. Therefore, database queries may be performed during view rendering. Explicitly configure spring.jpa.open-in-view to disable this warning
2020-11-01 09:36:58,427 [task-1] INFO org.hibernate.Version - HHH000412: Hibernate ORM core version 5.4.20.Final
2020-11-01 09:36:58,874 [main] DEBUG o.s.s.w.a.e.ExpressionBasedFilterInvocationSecurityMetadataSource - Adding web access control expression 'permitAll', for Ant [pattern='/auth/signUp', POST]
2020-11-01 09:36:58,887 [main] DEBUG o.s.s.w.a.e.ExpressionBasedFilterInvocationSecurityMetadataSource - Adding web access control expression 'permitAll', for Ant [pattern='/auth/verifyEmail', GET]
2020-11-01 09:36:58,888 [main] DEBUG o.s.s.w.a.e.ExpressionBasedFilterInvocationSecurityMetadataSource - Adding web access control expression 'permitAll', for Ant [pattern='/auth/requestResetForgottenPassword', POST]
2020-11-01 09:36:58,888 [main] DEBUG o.s.s.w.a.e.ExpressionBasedFilterInvocationSecurityMetadataSource - Adding web access control expression 'permitAll', for Ant [pattern='/auth/resetForgottenPassword', POST]
2020-11-01 09:36:58,889 [main] DEBUG o.s.s.w.a.e.ExpressionBasedFilterInvocationSecurityMetadataSource - Adding web access control expression 'permitAll', for Ant [pattern='/auth/resetPassword', POST]
2020-11-01 09:36:58,892 [main] DEBUG o.s.s.w.a.e.ExpressionBasedFilterInvocationSecurityMetadataSource - Adding web access control expression 'authenticated', for any request
2020-11-01 09:36:59,011 [main] DEBUG o.s.s.w.a.i.FilterSecurityInterceptor - Validated configuration attributes
2020-11-01 09:36:59,020 [main] DEBUG o.s.s.w.a.i.FilterSecurityInterceptor - Validated configuration attributes
2020-11-01 09:36:59,036 [main] INFO o.s.s.w.DefaultSecurityFilterChain - Creating filter chain: any request, [org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter#31bcf236, org.springframework.security.web.context.SecurityContextPersistenceFilter#4c51cf28, org.springframework.security.web.header.HeaderWriterFilter#289710d9, org.springframework.web.filter.CorsFilter#4b3ed2f0, org.springframework.security.web.authentication.logout.LogoutFilter#3549bca9, com.jiangwensi.mrbs.security.LoginAuthenticationFilter#4fad9bb2, com.jiangwensi.mrbs.security.JwtAuthenticationFilter#517d4a0d, org.springframework.security.web.savedrequest.RequestCacheAwareFilter#5143c662, org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter#71c27ee8, org.springframework.security.web.authentication.AnonymousAuthenticationFilter#7862f56, org.springframework.security.web.session.SessionManagementFilter#3da30852, org.springframework.security.web.access.ExceptionTranslationFilter#4eb386df, org.springframework.security.web.access.intercept.FilterSecurityInterceptor#134d26af]
2020-11-01 09:36:59,324 [main] DEBUG o.s.s.a.i.a.MethodSecurityInterceptor - Validated configuration attributes
2020-11-01 09:36:59,643 [main] INFO o.h.validator.internal.util.Version - HV000001: Hibernate Validator 6.1.5.Final
2020-11-01 09:37:01,174 [task-1] INFO o.h.annotations.common.Version - HCANN000001: Hibernate Commons Annotations {5.1.0.Final}
2020-11-01 09:37:03,566 [task-1] INFO org.hibernate.dialect.Dialect - HHH000400: Using dialect: org.hibernate.dialect.MySQL8Dialect
2020-11-01 09:37:05,670 [main] DEBUG o.s.s.c.a.a.c.AuthenticationConfiguration$EnableGlobalAuthenticationAutowiredConfigurer - Eagerly initializing {webSecurityConfigurator=com.jiangwensi.mrbs.security.WebSecurityConfigurator$$EnhancerBySpringCGLIB$$aa3555ad#1a20270e}
2020-11-01 09:37:05,692 [main] INFO o.a.coyote.http11.Http11NioProtocol - Starting ProtocolHandler ["http-nio-8080"]
2020-11-01 09:37:06,111 [main] INFO o.s.b.w.e.tomcat.TomcatWebServer - Tomcat started on port(s): 8080 (http) with context path ''
2020-11-01 09:37:06,136 [main] INFO o.s.d.r.c.DeferredRepositoryInitializationListener - Triggering deferred initialization of Spring Data repositories…
2020-11-01 09:37:09,726 [task-1] INFO o.h.e.t.j.p.i.JtaPlatformInitiator - HHH000490: Using JtaPlatform implementation: [org.hibernate.engine.transaction.jta.platform.internal.NoJtaPlatform]
2020-11-01 09:37:09,802 [task-1] INFO o.s.o.j.LocalContainerEntityManagerFactoryBean - Initialized JPA EntityManagerFactory for persistence unit 'default'
2020-11-01 09:37:12,036 [main] INFO o.s.d.r.c.DeferredRepositoryInitializationListener - Spring Data repositories initialized!
2020-11-01 09:37:12,065 [main] INFO c.j.m.MeetingRoomBookingSystemApplication - Started MeetingRoomBookingSystemApplication in 36.293 seconds (JVM running for 42.707)
2020-11-01 09:37:12,073 [main] DEBUG c.j.mrbs.InitializeApplicationData - onApplicationEvent is called
Based on the comments: the issue was caused by insufficient memory allocated to the container, and the solution was to increase the memory.
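For illustration, the same Dockerrun.aws.json with a larger memory reservation for the container; 512 MB is an example value, not the exact figure we ended up with:
{
  "AWSEBDockerrunVersion": 2,
  "containerDefinitions": [
    {
      "name": "api",
      "image": "[security]/mrbs-backend",
      "hostname": "api",
      "essential": true,
      "memory": 512
    }
  ]
}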
We are using a stack of Spring Boot, Hibernate, and Liquibase. I have an SQL file with 24,000 inserts. When I converted it into YAML (for versioning purposes), I got a huge YAML file, which I split into 16 YAML files. Insertion from the master file via the Liquibase command line is pretty quick, but with Spring and Hibernate it gets stuck. Up to 5 files is fine; anything more than that doesn't work.
I tried with 4 files at a time from the 16 and that works too, so it is not an issue with malformed YAML files. I also tried the following properties in my application.yml:
spring:
  datasource:
    hikari:
      maximum-pool-size: 100
  properties:
    hibernate:
      jdbc:
        batch_size: 200
Basically, it gets stuck at the changelog lock. This is what the log shows:
2019-10-17 18:10:26.375 INFO 1 --- [ main] com.zaxxer.hikari.HikariDataSource : HikariPool-1 - Starting...
2019-10-17 18:10:26.387 WARN 1 --- [ main] com.zaxxer.hikari.util.DriverDataSource : Registered driver with driverClassName=com.mysql.jdbc.Driver was not found, trying direct instantiation.
2019-10-17 18:10:26.927 INFO 1 --- [ main] com.zaxxer.hikari.HikariDataSource : HikariPool-1 - Start completed.
2019-10-17 18:10:27.760 INFO 1 --- [ main] liquibase.executor.jvm.JdbcExecutor : SELECT COUNT(*) FROM pbr.DATABASECHANGELOGLOCK
2019-10-17 18:10:27.794 INFO 1 --- [ main] liquibase.executor.jvm.JdbcExecutor : SELECT COUNT(*) FROM pbr.DATABASECHANGELOGLOCK
2019-10-17 18:10:27.804 INFO 1 --- [ main] liquibase.executor.jvm.JdbcExecutor : SELECT `LOCKED` FROM pbr.DATABASECHANGELOGLOCK WHERE ID=1
2019-10-17 18:10:27.812 INFO 1 --- [ main] l.lockservice.StandardLockService : Waiting for changelog lock....
2019-10-17 18:10:37.816 INFO 1 --- [ main] liquibase.executor.jvm.JdbcExecutor : SELECT `LOCKED` FROM pbr.DATABASECHANGELOGLOCK WHERE ID=1
2019-10-17 18:10:37.821 INFO 1 --- [ main] l.lockservice.StandardLockService : Waiting for changelog lock..
This did not work either. Please help.
The inserts are like this:
databaseChangeLog:
  - changeSet:
      id: 15706644546-4
      author: pbr-admin
      changes:
        - insert:
            columns:
              - column:
                  name: model_id
                  value: xxxxx
              - column:
                  name: category_id
                  value: ALL_TRANSACTIONS
              - column:
                  name: afpr_indexed
                  valueBoolean: false
              - column:
                  name: score
                  valueNumeric: xxx
The changelog is Liquibase-specific.
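From the log, the application is blocked waiting on the DATABASECHANGELOGLOCK table rather than on the inserts themselves. If an earlier run was killed while holding the lock, the lock row stays set and every later run waits forever. A rough way to check and, only if the lock really is stale, release it by hand (this assumes the pbr schema shown in the log and that no other Liquibase run is active):
SELECT ID, `LOCKED`, LOCKGRANTED, LOCKEDBY FROM pbr.DATABASECHANGELOGLOCK;

-- Only if the LOCKED = 1 row belongs to a run that is no longer alive:
UPDATE pbr.DATABASECHANGELOGLOCK
SET `LOCKED` = 0, LOCKGRANTED = NULL, LOCKEDBY = NULL
WHERE ID = 1;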
I'm a newbie to Apache Camel, and lately I've been trying to make a POST request to an HTTPS REST API.
I have gone through many posts and the documentation, but I still can't get the gist of it.
Please find my code below:
from("timer:aTimer?period=20s")
    .process(ex -> ex.getIn().setBody(
        "{\n" +
        " \"userId\": 777,\n" +
        " \"title\": \"sample\",\n" +
        " \"body\": \"my body\"\n" +
        " }"
    ))
    .setHeader(Exchange.HTTP_METHOD, constant("POST"))
    .setHeader(Exchange.CONTENT_TYPE, constant("application/json"))
    .to("restlet:https://jsonplaceholder.typicode.com/posts")
    .log("${body}");
Whenever I run my application, I get the error below.
Started
INFO DefaultCamelContext - Apache Camel 2.20.1 (CamelContext: camel-1) is starting
INFO ManagedManagementStrategy - JMX is enabled
INFO DefaultTypeConverter - Type converters loaded (core: 192, classpath: 14)
INFO DefaultCamelContext - StreamCaching is not in use. If using streams then its recommended to enable stream caching. See more details at http://camel.apache.org/stream-caching.html
Mar 05, 2018 3:20:45 PM org.restlet.ext.httpclient.HttpClientHelper start
INFO: Starting the Apache HTTP client
INFO DefaultCamelContext - Route: route1 started and consuming from: timer://aTimer?period=20s
INFO DefaultCamelContext - Total 1 routes, of which 1 are started
INFO DefaultCamelContext - Apache Camel 2.20.1 (CamelContext: camel-1) started in 0.879 seconds
INFO DefaultCamelContext - Apache Camel 2.20.1 (CamelContext: camel-1) is shutting down
INFO DefaultShutdownStrategy - Starting to graceful shutdown 1 routes (timeout 300 seconds)
INFO DefaultShutdownStrategy - Waiting as there are still 1 inflight and pending exchanges to complete, timeout in 300 seconds. Inflights per route: [route1 = 1]
INFO DefaultShutdownStrategy - There are 1 inflight exchanges:
InflightExchange: [exchangeId=ID-ubuntu-Latitude-6430U-1520243444162-0-1, fromRouteId=route1, routeId=route1, nodeId=to1, elapsed=0, duration=3018]
INFO DefaultShutdownStrategy - Waiting as there are still 1 inflight and pending exchanges to complete, timeout in 299 seconds. Inflights per route: [route1 = 1]
INFO DefaultShutdownStrategy - There are 1 inflight exchanges:
InflightExchange: [exchangeId=ID-ubuntu-Latitude-6430U-1520243444162-0-1, fromRouteId=route1, routeId=route1, nodeId=to1, elapsed=0, duration=4020]
INFO DefaultShutdownStrategy - Waiting as there are still 1 inflight and pending exchanges to complete, timeout in 298 seconds. Inflights per route: [route1 = 1]
INFO DefaultShutdownStrategy - There are 1 inflight exchanges:
InflightExchange: [exchangeId=ID-ubuntu-Latitude-6430U-1520243444162-0-1, fromRouteId=route1, routeId=route1, nodeId=to1, elapsed=0, duration=5023]
Mar 05, 2018 3:20:51 PM org.restlet.ext.httpclient.internal.HttpMethodCall sendRequest
WARNING: An error occurred during the communication with the remote HTTP server.
javax.net.ssl.SSLException: Unrecognized SSL message, plaintext connection?
at sun.security.ssl.InputRecord.handleUnknownRecord(InputRecord.java:710)
at sun.security.ssl.InputRecord.read(InputRecord.java:527)
at sun.security.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:983)
at sun.security.ssl.SSLSocketImpl.performInitialHandshake(SSLSocketImpl.java:1385)
at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1413)
at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1397)
at org.apache.http.conn.ssl.SSLSocketFactory.createLayeredSocket(SSLSocketFactory.java:573)
at org.apache.http.conn.ssl.SSLSocketFactory.connectSocket(SSLSocketFactory.java:557)
at org.apache.http.conn.ssl.SSLSocketFactory.connectSocket(SSLSocketFactory.java:414)
at org.apache.http.impl.conn.DefaultClientConnectionOperator.openConnection(DefaultClientConnectionOperator.java:180)
at org.apache.http.impl.conn.AbstractPoolEntry.open(AbstractPoolEntry.java:144)
at org.apache.http.impl.conn.AbstractPooledConnAdapter.open(AbstractPooledConnAdapter.java:134)
at org.apache.http.impl.client.DefaultRequestDirector.tryConnect(DefaultRequestDirector.java:610)
at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:445)
at org.apache.http.impl.client.AbstractHttpClient.doExecute(AbstractHttpClient.java:835)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:83)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:108)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:56)
at org.restlet.ext.httpclient.internal.HttpMethodCall.sendRequest(HttpMethodCall.java:339)
at org.restlet.ext.httpclient.internal.HttpMethodCall.sendRequest(HttpMethodCall.java:363)
at org.restlet.engine.adapter.ClientAdapter.commit(ClientAdapter.java:81)
at org.restlet.engine.adapter.HttpClientHelper.handle(HttpClientHelper.java:119)
at org.restlet.Client.handle(Client.java:153)
at org.restlet.Restlet.handle(Restlet.java:342)
at org.restlet.Restlet.handle(Restlet.java:355)
at org.apache.camel.component.restlet.RestletProducer.process(RestletProducer.java:179)
at org.apache.camel.processor.SendProcessor.process(SendProcessor.java:148)
at org.apache.camel.processor.RedeliveryErrorHandler.process(RedeliveryErrorHandler.java:548)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:201)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:138)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:101)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:201)
at org.apache.camel.component.timer.TimerConsumer.sendTimerExchange(TimerConsumer.java:197)
at org.apache.camel.component.timer.TimerConsumer$1.run(TimerConsumer.java:79)
at java.util.TimerThread.mainLoop(Timer.java:555)
at java.util.TimerThread.run(Timer.java:505)
WARN TimerConsumer - Error processing exchange. Exchange[ID-ubuntu-Latitude-6430U-1520243444162-0-1]. Caused by: [org.apache.camel.component.restlet.RestletOperationException - Restlet operation failed invoking https://jsonplaceholder.typicode.com:80/443:posts with statusCode: 1001 /n responseBody:HTTPS/1.1 - Communication Error (1001) - The connector failed to complete the communication with the server]
org.apache.camel.component.restlet.RestletOperationException: Restlet operation failed invoking https://jsonplaceholder.typicode.com:80/443:posts with statusCode: 1001 /n responseBody:HTTPS/1.1 - Communication Error (1001) - The connector failed to complete the communication with the server
at org.apache.camel.component.restlet.RestletProducer.populateRestletProducerException(RestletProducer.java:304)
at org.apache.camel.component.restlet.RestletProducer$1.handle(RestletProducer.java:190)
at org.restlet.engine.adapter.ClientAdapter$1.handle(ClientAdapter.java:90)
at org.restlet.ext.httpclient.internal.HttpMethodCall.sendRequest(HttpMethodCall.java:371)
at org.restlet.engine.adapter.ClientAdapter.commit(ClientAdapter.java:81)
at org.restlet.engine.adapter.HttpClientHelper.handle(HttpClientHelper.java:119)
at org.restlet.Client.handle(Client.java:153)
at org.restlet.Restlet.handle(Restlet.java:342)
at org.restlet.Restlet.handle(Restlet.java:355)
at org.apache.camel.component.restlet.RestletProducer.process(RestletProducer.java:179)
at org.apache.camel.processor.SendProcessor.process(SendProcessor.java:148)
at org.apache.camel.processor.RedeliveryErrorHandler.process(RedeliveryErrorHandler.java:548)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:201)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:138)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:101)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:201)
at org.apache.camel.component.timer.TimerConsumer.sendTimerExchange(TimerConsumer.java:197)
at org.apache.camel.component.timer.TimerConsumer$1.run(TimerConsumer.java:79)
at java.util.TimerThread.mainLoop(Timer.java:555)
at java.util.TimerThread.run(Timer.java:505)
ERROR DefaultErrorHandler - Failed delivery for (MessageId: ID-ubuntu-Latitude-6430U-1520243444162-0-2 on ExchangeId: ID-ubuntu-Latitude-6430U-1520243444162-0-1). Exhausted after delivery attempt: 1 caught: org.apache.camel.component.restlet.RestletOperationException: Restlet operation failed invoking https://jsonplaceholder.typicode.com:80/443:posts with statusCode: 1001 /n responseBody:HTTPS/1.1 - Communication Error (1001) - The connector failed to complete the communication with the server
Message History
---------------------------------------------------------------------------------------------------------------------------------------
RouteId ProcessorId Processor Elapsed (ms)
[route1 ] [route1 ] [timer://aTimer?period=20s ] [ 5321]
[route1 ] [process1 ] [Processor#0x33ae3bf8 ] [ 4]
[route1 ] [setHeader1 ] [setHeader[CamelHttpMethod] ] [ 0]
[route1 ] [setHeader2 ] [setHeader[Content-Type] ] [ 0]
[route1 ] [to1 ] [restlet:https://jsonplaceholder.typicode.com/443:posts ] [ 5308]
Stacktrace
---------------------------------------------------------------------------------------------------------------------------------------
org.apache.camel.component.restlet.RestletOperationException: Restlet operation failed invoking https://jsonplaceholder.typicode.com:80/443:posts with statusCode: 1001 /n responseBody:HTTPS/1.1 - Communication Error (1001) - The connector failed to complete the communication with the server
at org.apache.camel.component.restlet.RestletProducer.populateRestletProducerException(RestletProducer.java:304)
at org.apache.camel.component.restlet.RestletProducer$1.handle(RestletProducer.java:190)
at org.restlet.engine.adapter.ClientAdapter$1.handle(ClientAdapter.java:90)
at org.restlet.ext.httpclient.internal.HttpMethodCall.sendRequest(HttpMethodCall.java:371)
at org.restlet.engine.adapter.ClientAdapter.commit(ClientAdapter.java:81)
at org.restlet.engine.adapter.HttpClientHelper.handle(HttpClientHelper.java:119)
at org.restlet.Client.handle(Client.java:153)
at org.restlet.Restlet.handle(Restlet.java:342)
at org.restlet.Restlet.handle(Restlet.java:355)
at org.apache.camel.component.restlet.RestletProducer.process(RestletProducer.java:179)
at org.apache.camel.processor.SendProcessor.process(SendProcessor.java:148)
at org.apache.camel.processor.RedeliveryErrorHandler.process(RedeliveryErrorHandler.java:548)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:201)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:138)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:101)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:201)
at org.apache.camel.component.timer.TimerConsumer.sendTimerExchange(TimerConsumer.java:197)
at org.apache.camel.component.timer.TimerConsumer$1.run(TimerConsumer.java:79)
at java.util.TimerThread.mainLoop(Timer.java:555)
at java.util.TimerThread.run(Timer.java:505)
Mar 05, 2018 3:20:52 PM org.restlet.ext.httpclient.HttpClientHelper stop
INFO: Stopping the HTTP client
INFO DefaultShutdownStrategy - Route: route1 shutdown complete, was consuming from: timer://aTimer?period=20s
INFO DefaultShutdownStrategy - Graceful shutdown of 1 routes completed in 3 seconds
INFO DefaultCamelContext - Apache Camel 2.20.1 (CamelContext: camel-1) uptime 7.927 seconds
INFO DefaultCamelContext - Apache Camel 2.20.1 (CamelContext: camel-1) is shutdown in 3.048 seconds
Please help. I've also tried using the Apache HTTP4 component, but still no luck.
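For what it's worth, the stack trace shows the client negotiating TLS against "jsonplaceholder.typicode.com:80", which is exactly what "Unrecognized SSL message, plaintext connection?" suggests. Below is a rough sketch of the same route with the HTTPS port spelled out on the Restlet endpoint; whether this alone resolves the handshake error is an assumption on my part, not something the log confirms:
from("timer:aTimer?period=20s")
    .process(ex -> ex.getIn().setBody(
        "{ \"userId\": 777, \"title\": \"sample\", \"body\": \"my body\" }"))
    .setHeader(Exchange.HTTP_METHOD, constant("POST"))
    .setHeader(Exchange.CONTENT_TYPE, constant("application/json"))
    // explicit :443 so the Restlet client does not fall back to port 80
    .to("restlet:https://jsonplaceholder.typicode.com:443/posts")
    .log("${body}");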
I am trying to install Oracle 12c on an x86_64 GNU/Linux machine. This is my first installation. I ran the installer from the database folder with the ./runInstaller -debug command. The output is as follows:
Starting Oracle Universal Installer...
Checking Temp space: must be greater than 500 MB. Actual 14103 MB Passed
Checking swap space: must be greater than 150 MB. Actual 3964 MB Passed
Checking monitor: must be configured to display at least 256 colors. Actual 16777216 Passed
Preparing to launch Oracle Universal Installer from /tmp/OraInstall2015-08-28_09-28-56AM. Please wait ...Archive: ../stage/Components/oracle.jdk/1.6.0.75.0/1/DataFiles/filegroup3.jar
inflating: /tmp/OraInstall2015-08-28_09-28-56AM/jdk/lib/ir.idl
inflating: /tmp/OraInstall2015-08-28_09-28-56AM/jdk/lib/sa-jdi.jar
...................
Archive: ../stage/Components/oracle.jdk/1.6.0.75.0/1/DataFiles/filegroup2.jar
.............
Archive: ../stage/Components/oracle.jdk/1.6.0.75.0/1/DataFiles/filegroup4.jar
.......
Archive: ../stage/Components/oracle.jdk/1.6.0.75.0/1/DataFiles/filegroup1.jar
............
Archive: ../stage/Components/oracle.jdk/1.6.0.75.0/1/DataFiles/filegroup5.jar
........
5 archives were successfully processed.
Archive: ../stage/Components/oracle.swd.oui/12.1.0.2.0/1/DataFiles/filegroup6.jar
...........
Archive: ../stage/Components/oracle.swd.oui/12.1.0.2.0/1/DataFiles/filegroup2.jar
..........
Archive: ../stage/Components/oracle.swd.oui/12.1.0.2.0/1/DataFiles/filegroup4.jar
............
Archive: ../stage/Components/oracle.swd.oui/12.1.0.2.0/1/DataFiles/filegroup7.jar
........
Archive: ../stage/Components/oracle.swd.oui/12.1.0.2.0/1/DataFiles/filegroup1.jar
.............
Archive: ../stage/Components/oracle.swd.oui/12.1.0.2.0/1/DataFiles/filegroup5.jar
....
6 archives were successfully processed.
Archive: ../stage/Components/oracle.swd.oui.core/12.1.0.2.0/1/DataFiles/filegroup3.jar
......
Archive: ../stage/Components/oracle.swd.oui.core/12.1.0.2.0/1/DataFiles/filegroup2.jar
........
Archive: ../stage/Components/oracle.swd.oui.core/12.1.0.2.0/1/DataFiles/filegroup4.jar
.........
Archive: ../stage/Components/oracle.swd.oui.core/12.1.0.2.0/1/DataFiles/filegroup1.jar
..........
Archive: ../stage/Components/oracle.swd.oui.core/12.1.0.2.0/1/DataFiles/filegroup5.jar
.....
5 archives were successfully processed.
Archive: ../stage/Components/oracle.swd.oui.core.min/12.1.0.2.0/1/DataFiles/filegroup2.jar
....
Archive: ../stage/Components/oracle.swd.oui.core.min/12.1.0.2.0/1/DataFiles/filegroup1.jar
......
2 archives were successfully processed.
LD_LIBRARY_PATH environment variable :
-------------------------------------------------------
Total args: 26
Command line argument array elements ...
Arg:0:/tmp/OraInstall2015-08-28_09-28-56AM/jdk/jre/bin/java:
Arg:1:-Doracle.installer.library_loc=/tmp/OraInstall2015-08-28_09-28-56AM/oui/lib/linux64:
Arg:2:-Doracle.installer.oui_loc=/tmp/OraInstall2015-08-28_09-28-56AM/oui:
Arg:3:-Doracle.installer.bootstrap=TRUE:
Arg:4:-Doracle.installer.startup_location=/oracle12c/database/install:
Arg:5:-Doracle.installer.jre_loc=/tmp/OraInstall2015-08-28_09-28-56AM/jdk/jre:
Arg:6:-Doracle.installer.nlsEnabled="TRUE":
Arg:7:-Doracle.installer.prereqConfigLoc= :
Arg:8:-Doracle.installer.unixVersion=2.6.32-279.el6.x86_64:
Arg:9:-Doracle.install.setup.workDir=/oracle12c/database:
Arg:10:-DCVU_OS_SETTINGS=SHELL_NOFILE_SOFT_LIMIT:1024,SHELL_UMASK:0022:
Arg:11:-Xms150m:
Arg:12:-Xmx256m:
Arg:13:-XX:MaxPermSize=128M:
Arg:14:-cp:
Arg:15:/tmp/OraInstall2015-08-28_09-28-56AM::/tmp/OraInstall2015-08-28_09-28-56AM/ext/jlib/emca.jar:/tmp/OraInstall2015-08-28_09-28-56AM/ext/jlib/entityManager_proxy.jar:/tmp/OraInstall2015-08-28_09-28-56AM/ext/jlib/prov_fixup.jar:/tmp/OraInstall2015-08-28_09-28-56AM/ext/jlib/orai18n-utility.jar:/tmp/OraInstall2015-08-28_09-28-56AM/ext/jlib/installcommons_1.0.0b.jar:/tmp/OraInstall2015-08-28_09-28-56AM/ext/jlib/wsclient_extended.jar:/tmp/OraInstall2015-08-28_09-28-56AM/ext/jlib/instdb.jar:/tmp/OraInstall2015-08-28_09-28-56AM/ext/jlib/jsch.jar:/tmp/OraInstall2015-08-28_09-28-56AM/ext/jlib/remoteinterfaces.jar:/tmp/OraInstall2015-08-28_09-28-56AM/ext/jlib/OraPrereqChecks.jar:/tmp/OraInstall2015-08-28_09-28-56AM/ext/jlib/orai18n-mapping.jar:/tmp/OraInstall2015-08-28_09-28-56AM/ext/jlib/instcommon.jar:/tmp/OraInstall2015-08-28_09-28-56AM/ext/jlib/emCoreConsole.jar:/tmp/OraInstall2015-08-28_09-28-56AM/ext/jlib/OraPrereq.jar:/tmp/OraInstall2015-08-28_09-28-56AM/ext/jlib/cvu.jar:/tmp/OraInstall2015-08-28_09-28-56AM/ext/jlib/ssh.jar:/tmp/OraInstall2015-08-28_09-28-56AM/ext/jlib/ojdbc6.jar:/tmp/OraInstall2015-08-28_09-28-56AM/ext/jlib/adf-share-ca.jar:/tmp/OraInstall2015-08-28_09-28-56AM/ext/jlib/jmxspi.jar:/tmp/OraInstall2015-08-28_09-28-56AM/ext/jlib/javax.security.jacc_1.0.0.0_1-1.jar:/tmp/OraInstall2015-08-28_09-28-56AM/oui/jlib/OraInstaller.jar:/tmp/OraInstall2015-08-28_09-28-56AM/oui/jlib/oneclick.jar:/tmp/OraInstall2015-08-28_09-28-56AM/oui/jlib/xmlparserv2.jar:/tmp/OraInstall2015-08-28_09-28-56AM/oui/jlib/share.jar:/tmp/OraInstall2015-08-28_09-28-56AM/oui/jlib/OraInstallerNet.jar:/tmp/OraInstall2015-08-28_09-28-56AM/oui/jlib/emCfg.jar:/tmp/OraInstall2015-08-28_09-28-56AM/oui/jlib/emocmutl.jar:/tmp/OraInstall2015-08-28_09-28-56AM/oui/jlib/OraPrereq.jar:/tmp/OraInstall2015-08-28_09-28-56AM/oui/jlib/jsch.jar:/tmp/OraInstall2015-08-28_09-28-56AM/oui/jlib/ssh.jar:/tmp/OraInstall2015-08-28_09-28-56AM/oui/jlib/remoteinterfaces.jar:/tmp/OraInstall2015-08-28_09-28-56AM/oui/jlib/http_client.jar:/tmp/OraInstall2015-08-28_09-28-56AM/oui/jlib/OraSuiteInstaller.jar:../stage/Components/oracle.swd.opatch/12.1.0.2.0/1/DataFiles/jlib/opatch.jar:../stage/Components/oracle.swd.opatch/12.1.0.2.0/1/DataFiles/jlib/opatchactions.jar:../stage/Components/oracle.swd.opatch/12.1.0.2.0/1/DataFiles/jlib/opatchprereq.jar:../stage/Components/oracle.swd.opatch/12.1.0.2.0/1/DataFiles/jlib/opatchutil.jar:/tmp/OraInstall2015-08-28_09-28-56AM/oui/jlib/OraCheckPoint.jar:/tmp/OraInstall2015-08-28_09-28-56AM/oui/jlib/InstImages.jar:/tmp/OraInstall2015-08-28_09-28-56AM/oui/jlib/InstHelp.jar:/tmp/OraInstall2015-08-28_09-28-56AM/oui/jlib/InstHelp_de.jar:/tmp/OraInstall2015-08-28_09-28-56AM/oui/jlib/InstHelp_es.jar:/tmp/OraInstall2015-08-28_09-28-56AM/oui/jlib/InstHelp_fr.jar:/tmp/OraInstall2015-08-28_09-28-56AM/oui/jlib/InstHelp_it.jar:/tmp/OraInstall2015-08-28_09-28-56AM/oui/jlib/InstHelp_ja.jar:/tmp/OraInstall2015-08-28_09-28-56AM/oui/jlib/InstHelp_ko.jar:/tmp/OraInstall2015-08-28_09-28-56AM/oui/jlib/InstHelp_pt_BR.jar:/tmp/OraInstall2015-08-28_09-28-56AM/oui/jlib/InstHelp_zh_CN.jar:/tmp/OraInstall2015-08-28_09-28-56AM/oui/jlib/InstHelp_zh_TW.jar:/tmp/OraInstall2015-08-28_09-28-56AM/oui/jlib/oracle_ice.jar:/tmp/OraInstall2015-08-28_09-28-56AM/oui/jlib/help-share.jar:/tmp/OraInstall2015-08-28_09-28-56AM/oui/jlib/ohj.jar:/tmp/OraInstall2015-08-28_09-28-56AM/oui/jlib/ewt3.jar:/tmp/OraInstall2015-08-28_09-28-56AM/oui/jlib/ewt3-swingaccess.jar:/tmp/OraInstall2015-08-28_09-28-56AM/oui/jlib/swingaccess.jar::/tmp/OraInstall2015-08-28_09-28
-56AM/oui/jlib/jewt4.jar:/tmp/OraInstall2015-08-28_09-28-56AM/oui/jlib/orai18n-collation.jar:/tmp/OraInstall2015-08-28_09-28-56AM/oui/jlib/orai18n-mapping.jar:/tmp/OraInstall2015-08-28_09-28-56AM/oui/jlib/ojmisc.jar:/tmp/OraInstall2015-08-28_09-28-56AM/oui/jlib/xml.jar:/tmp/OraInstall2015-08-28_09-28-56AM/oui/jlib/srvm.jar:/tmp/OraInstall2015-08-28_09-28-56AM/oui/jlib/srvmasm.jar:
Arg:16:oracle.install.ivw.db.driver.DBInstaller:
Arg:17:-scratchPath:
Arg:18:/tmp/OraInstall2015-08-28_09-28-56AM:
Arg:19:-sourceLoc:
Arg:20:/oracle12c/database/install/../stage/products.xml:
Arg:21:-sourceType:
Arg:22:network:
Arg:23:-timestamp:
Arg:24:2015-08-28_09-28-56AM:
Arg:25:-debug:
-------------------------------------------------------
Initializing Java Virtual Machine from /tmp/OraInstall2015-08-28_09-28-56AM/jdk/jre/bin/java. Please wait...
[oracle#korbsbvmlx22 database]$ [main] [ 2015-08-28 09:29:05.048 IST ] [ClusterVerification.getInstance:426] Method Entry. workDir=/tmp frameworkHome=/oracle12c/database/install/../stage/cvu
[main] [ 2015-08-28 09:29:05.062 IST ] [ParamManager.<init>:668] m_paramInstantiated set to TRUE
[main] [ 2015-08-28 09:29:05.062 IST ] [VerificationUtil.getLocalHost:1312] Hostname retrieved: korbsbvmlx22, returned: korbsbvmlx22
[main] [ 2015-08-28 09:29:05.064 IST ] [VerificationUtil.getDestLoc:3712] ==== CV_DESTLOC(pre-fetched value): '/tmp/'
[main] [ 2015-08-28 09:29:05.065 IST ] [VerificationUtil.getExecutionEnvironment:7586] RDBMS Version is -->12.1.0.2.0
[main] [ 2015-08-28 09:29:05.065 IST ] [VerificationUtil.validateCmdLineExecEnvironment:7602] Entered validateCmdLineExecEnvironment
[main] [ 2015-08-28 09:29:05.105 IST ] [Version.isPre:610] version to be checked 12.1.0.2.0 major version to check against 10
[main] [ 2015-08-28 09:29:05.105 IST ] [Version.isPre:621] isPre.java: Returning FALSE
[main] [ 2015-08-28 09:29:05.106 IST ] [Version.isPre:610] version to be checked 12.1.0.2.0 major version to check against 10
[main] [ 2015-08-28 09:29:05.106 IST ] [Version.isPre:621] isPre.java: Returning FALSE
[main] [ 2015-08-28 09:29:05.107 IST ] [Version.isPre:610] version to be checked 12.1.0.2.0 major version to check against 11
[main] [ 2015-08-28 09:29:05.107 IST ] [Version.isPre:621] isPre.java: Returning FALSE
[main] [ 2015-08-28 09:29:05.107 IST ] [Version.isPre:642] version to be checked 12.1.0.2.0 major version to check against 11 minor version to check against 2
[main] [ 2015-08-28 09:29:05.108 IST ] [Version.isPre:651] isPre: Returning FALSE for major version check
[main] [ 2015-08-28 09:29:05.108 IST ] [UnixSystem.isHAConfigured:2788] olrFileName = /etc/oracle/olr.loc
[main] [ 2015-08-28 09:29:05.109 IST ] [VerificationUtil.isHAConfigured:4181] haConfigured=false
[main] [ 2015-08-28 09:29:05.109 IST ] [VerificationUtil.validateCmdLineExecEnvironment:7639] Exit validateCmdLineExecEnvironment
[main] [ 2015-08-28 09:29:05.116 IST ] [ConfigUtil.importConfig:97] ==== CVU config file: /oracle12c/database/install/../stage/cvu//cv/admin/cvu_config
[main] [ 2015-08-28 09:29:05.117 IST ] [ConfigUtil.importConfig:114] ==== Picked up config variable: cv_raw_check_enabled : TRUE
[main] [ 2015-08-28 09:29:05.118 IST ] [ConfigUtil.importConfig:114] ==== Picked up config variable: cv_sudo_binary_location : /usr/local/bin/sudo
[main] [ 2015-08-28 09:29:05.118 IST ] [ConfigUtil.importConfig:114] ==== Picked up config variable: cv_pbrun_binary_location : /usr/local/bin/pbrun
[main] [ 2015-08-28 09:29:05.119 IST ] [ConfigUtil.importConfig:114] ==== Picked up config variable: cv_assume_cl_version : 12.1
[main] [ 2015-08-28 09:29:05.119 IST ] [ConfigUtil.isDefined:200] ==== Is ORACLE_SRVM_REMOTESHELL defined? : false
[main] [ 2015-08-28 09:29:05.121 IST ] [Library.load:194] library.load
[main] [ 2015-08-28 09:29:05.122 IST ] [sPlatform.isHybrid:66] osName=Linux osArch=amd64 JVM=64 rc=false
[main] [ 2015-08-28 09:29:05.122 IST ] [Library.load:262] Property oracle.installer.library_loc is set to value=/tmp/OraInstall2015-08-28_09-28-56AM/oui/lib/linux64
[main] [ 2015-08-28 09:29:05.123 IST ] [Library.load:264] Loading library /tmp/OraInstall2015-08-28_09-28-56AM/oui/lib/linux64/libsrvm12.so
[main] [ 2015-08-28 09:29:05.124 IST ] [ConfigUtil.getConfiguredValue:182] ==== Fallback to env var 'ORACLE_SRVM_REMOTESHELL'=null
[main] [ 2015-08-28 09:29:05.125 IST ] [ConfigUtil.isDefined:200] ==== Is ORACLE_SRVM_REMOTECOPY defined? : false
[main] [ 2015-08-28 09:29:05.125 IST ] [ConfigUtil.getConfiguredValue:182] ==== Fallback to env var 'ORACLE_SRVM_REMOTECOPY'=null
As the debug messages after "Please wait..." show, no installer window opens; a few more messages are printed to the console, and then it stays like that indefinitely. I have checked the logs in the /tmp directory and they appear to be clean. I have been stuck on this issue for a long time and do not know how to proceed.
Please guide me.
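One possibility the debug output does not rule out (an assumption, not something the log proves): runInstaller launches the graphical Oracle Universal Installer, so it needs a reachable X display, and if DISPLAY is unset or X forwarding is broken, the JVM initializes but no window ever appears. A quick sanity check, plus the silent-install fallback; the response file path is just an example location:
# Is an X display reachable from this shell?
echo $DISPLAY                      # e.g. :0, or localhost:10.0 when forwarding over SSH
xdpyinfo > /dev/null && echo "X display OK"

# Or skip the GUI entirely with a silent install:
./runInstaller -silent -responseFile /oracle12c/database/response/db_install.rsp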
We tried to submit the simple SparkPi example to Spark on YARN. The .bat file is written as follows:
./bin/spark-submit --class org.apache.spark.examples.SparkPi --master yarn-cluster --num-executors 3 --driver-memory 4g --executor-memory 1g --executor-cores 1 .\examples\target\spark-examples_2.10-1.4.0.jar 10
pause
Our HDFS and YARN work well. We are using Hadoop 2.7.0 and Spark 1.4.1. We have only one node, which acts as both NameNode and DataNode.
When we execute it, it fails, and the log says the following:
2015-08-21 11:07:22,044 DEBUG [main] | ===============================================================================
2015-08-21 11:07:22,044 DEBUG [main] | Yarn AM launch context:
2015-08-21 11:07:22,044 DEBUG [main] | user class: org.apache.spark.examples.SparkPi
2015-08-21 11:07:22,044 DEBUG [main] | env:
2015-08-21 11:07:22,044 DEBUG [main] | CLASSPATH -> {{PWD}}<CPS>{{PWD}}/__hadoop_conf__<CPS>{{PWD}}/__spark__.jar<CPS>%HADOOP_HOME%\etc\hadoop<CPS>%HADOOP_HOME%\share\hadoop\common\*<CPS>%HADOOP_HOME%\share\hadoop\common\lib\*<CPS>%HADOOP_HOME%\share\hadoop\mapreduce\*<CPS>%HADOOP_HOME%\share\hadoop\mapreduce\lib\*<CPS>%HADOOP_HOME%\share\hadoop\hdfs\*<CPS>%HADOOP_HOME%\share\hadoop\hdfs\lib\*<CPS>%HADOOP_HOME%\share\hadoop\yarn\*<CPS>%HADOOP_HOME%\share\hadoop\yarn\lib\*<CPS>%HADOOP_MAPRED_HOME%\share\hadoop\mapreduce\*<CPS>%HADOOP_MAPRED_HOME%\share\hadoop\mapreduce\lib\*
2015-08-21 11:07:22,060 DEBUG [main] | SPARK_YARN_CACHE_FILES_FILE_SIZES -> 165181064,1420218
2015-08-21 11:07:22,060 DEBUG [main] | SPARK_YARN_STAGING_DIR -> .sparkStaging/application_1440062075415_0026
2015-08-21 11:07:22,060 DEBUG [main] | SPARK_YARN_CACHE_FILES_VISIBILITIES -> PRIVATE,PRIVATE
2015-08-21 11:07:22,060 DEBUG [main] | SPARK_USER -> msrabi
2015-08-21 11:07:22,060 DEBUG [main] | SPARK_YARN_MODE -> true
2015-08-21 11:07:22,060 DEBUG [main] | SPARK_YARN_CACHE_FILES_TIME_STAMPS -> 1440126441200,1440126441575
2015-08-21 11:07:22,060 DEBUG [main] | SPARK_YARN_CACHE_FILES -> hdfs://msra-sa-44:9000/user/msrabi/.sparkStaging/application_1440062075415_0026/spark-assembly-1.4.0-hadoop2.7.0.jar#__spark__.jar,hdfs://msra-sa-44:9000/user/msrabi/.sparkStaging/application_1440062075415_0026/spark-examples_2.10-1.4.0.jar#__app__.jar
2015-08-21 11:07:22,060 DEBUG [main] | resources:
2015-08-21 11:07:22,060 DEBUG [main] | __app__.jar -> resource { scheme: "hdfs" host: "msra-sa-44" port: 9000 file: "/user/msrabi/.sparkStaging/application_1440062075415_0026/spark-examples_2.10-1.4.0.jar" } size: 1420218 timestamp: 1440126441575 type: FILE visibility: PRIVATE
2015-08-21 11:07:22,060 DEBUG [main] | __spark__.jar -> resource { scheme: "hdfs" host: "msra-sa-44" port: 9000 file: "/user/msrabi/.sparkStaging/application_1440062075415_0026/spark-assembly-1.4.0-hadoop2.7.0.jar" } size: 165181064 timestamp: 1440126441200 type: FILE visibility: PRIVATE
2015-08-21 11:07:22,060 DEBUG [main] | __hadoop_conf__ -> resource { scheme: "hdfs" host: "msra-sa-44" port: 9000 file: "/user/msrabi/.sparkStaging/application_1440062075415_0026/__hadoop_conf__7908628615251032149.zip" } size: 82888 timestamp: 1440126441794 type: ARCHIVE visibility: PRIVATE
2015-08-21 11:07:22,060 DEBUG [main] | command:
2015-08-21 11:07:22,075 DEBUG [main] | {{JAVA_HOME}}/bin/java -server -Xmx4096m -Djava.io.tmpdir={{PWD}}/tmp '-Dspark.app.name=org.apache.spark.examples.SparkPi' '-Dspark.executor.memory=1g' '-Dspark.driver.memory=4g' '-Dspark.master=yarn-cluster' -Dspark.yarn.app.container.log.dir=<LOG_DIR> org.apache.spark.deploy.yarn.ApplicationMaster --class 'org.apache.spark.examples.SparkPi' --jar file:/D:/sp/./examples/target/spark-examples_2.10-1.4.0.jar --arg '10' --executor-memory 1024m --executor-cores 1 --num-executors 3 1> <LOG_DIR>/stdout 2> <LOG_DIR>/stderr
2015-08-21 11:07:22,075 DEBUG [main] | ===============================================================================
...........(omitting some lines)......
2015-08-21 11:07:23,231 INFO [main] | Application report for application_1440062075415_0026 (state: ACCEPTED)
2015-08-21 11:07:23,247 DEBUG [main] |
client token: N/A
diagnostics: N/A
ApplicationMaster host: N/A
ApplicationMaster RPC port: -1
queue: default
start time: 1440126442169
final status: UNDEFINED
tracking URL: http://msra-sa-44:8088/proxy/application_1440062075415_0026/
user: msrabi
2015-08-21 11:07:24,263 TRACE [main] | 1: Call -> MSRA-SA-44/10.190.173.181:8032: getApplicationReport {application_id { id: 26 cluster_timestamp: 1440062075415 }}
2015-08-21 11:07:24,263 DEBUG [IPC Parameter Sending Thread #0] | IPC Client (443384617) connection to MSRA-SA-44/10.190.173.181:8032 from msrabi sending #37
2015-08-21 11:07:24,263 DEBUG [IPC Client (443384617) connection to MSRA-SA-44/10.190.173.181:8032 from msrabi] | IPC Client (443384617) connection to MSRA-SA-44/10.190.173.181:8032 from msrabi got value #37
2015-08-21 11:07:24,263 DEBUG [main] | Call: getApplicationReport took 0ms
2015-08-21 11:07:24,263 TRACE [main] | 1: Response <- MSRA-SA-44/10.190.173.181:8032: getApplicationReport {application_report { applicationId { id: 26 cluster_timestamp: 1440062075415 } user: "msrabi" queue: "default" name: "org.apache.spark.examples.SparkPi" host: "N/A" rpc_port: -1 yarn_application_state: ACCEPTED trackingUrl: "http://msra-sa-44:8088/proxy/application_1440062075415_0026/" diagnostics: "" startTime: 1440126442169 finishTime: 0 final_application_status: APP_UNDEFINED app_resource_Usage { num_used_containers: 1 num_reserved_containers: 0 used_resources { memory: 4608 virtual_cores: 1 } reserved_resources { memory: 0 virtual_cores: 0 } needed_resources { memory: 4608 virtual_cores: 1 } memory_seconds: 0 vcore_seconds: 0 } originalTrackingUrl: "N/A" currentApplicationAttemptId { application_id { id: 26 cluster_timestamp: 1440062075415 } attemptId: 1 } progress: 0.0 applicationType: "SPARK" }}
2015-08-21 11:07:24,263 INFO [main] | Application report for application_1440062075415_0026 (state: ACCEPTED)
.......(omitting some lines where the state are all ACCEPTED and final status are all UNDEFINED).....
2015-08-21 11:07:30,359 INFO [main] | Application report for application_1440062075415_0026 (state: FAILED)
2015-08-21 11:07:30,359 DEBUG [main] |
client token: N/A
diagnostics: Application application_1440062075415_0026 failed 2 times due to AM Container for appattempt_1440062075415_0026_000002 exited with exitCode: 1
For more detailed output, check application tracking page:http://msra-sa-44:8088/cluster/app/application_1440062075415_0026Then, click on links to logs of each attempt.
Diagnostics: Exception from container-launch.
Container id: container_1440062075415_0026_02_000001
Exit code: 1
Stack trace: ExitCodeException exitCode=1:
at org.apache.hadoop.util.Shell.runCommand(Shell.java:545)
at org.apache.hadoop.util.Shell.run(Shell.java:456)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:722)
at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:211)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
at java.util.concurrent.FutureTask.run(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
Shell output: 1 file(s) moved.
We then opened stderr, which says:
Error: Could not find or load main class 'Dspark.app.name=org.apache.spark.examples.SparkPi'
This is strange: it should be a parameter passed to java, yet java seems to have recognized it as the main class. There should be a main class argument in the command section of the log, but there is not.
How can that happen? What should we do to find out what's wrong with it?
Thank you!
We solved this problem.
The root cause is that, when generating the java command line, Spark wraps the parameters in single quotes ('-Dxxxx'). Single quotes work only on Linux; on Windows, the parameters must either not be wrapped at all or be wrapped in double quotes ("-Dxxxx"). The only way to solve this is to edit the Spark source code and re-compile it.
It seems this is currently a known issue in Spark (https://issues.apache.org/jira/browse/SPARK-5754).
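To illustrate the quoting difference by hand (a sketch only, not output generated by Spark): cmd.exe on Windows does not treat single quotes as quoting characters, so a single-quoted -D option reaches java as a literal token that does not start with "-", and java then tries to load it as the main class, which lines up with the error above. Double quotes behave as expected:
REM Broken on Windows: the single quotes are passed through literally,
REM so java treats '-Dspark.app.name=...' as the main class name.
java '-Dspark.app.name=org.apache.spark.examples.SparkPi' org.apache.spark.deploy.yarn.ApplicationMaster ...

REM Works on Windows: the double quotes are stripped by cmd.exe
REM and -Dspark.app.name=... is handled as a JVM system property.
java "-Dspark.app.name=org.apache.spark.examples.SparkPi" org.apache.spark.deploy.yarn.ApplicationMaster ...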