Using Local File in Spring Yarn application - spring-boot

I need to deploy a WAR file in a YARN container. For that I am using Spring Batch to create a YARN application that uses an embedded Jetty server to deploy the WAR file. When I use a custom handler for the requests it works fine and the application gets deployed on YARN, but when I use the WAR file as a handler it does not work and I get an error:
java.io.FileNotFoundException: /root/jettywebapps/webapps/Login.war
but the file is actually present at the given location. I am stuck on how to access the WAR file from the YARN container.
@OnContainerStart
public void publicVoidNoArgsMethod() throws Exception {
    String jettyHome = "/root/jettywebapps";
    Server server = new Server(9090);
    WebAppContext webapp = new WebAppContext();
    webapp.setContextPath("/Login");
    webapp.setWar(jettyHome + "/webapps/Login.war");
    server.setHandler(webapp);
    server.start();
    server.join();
}
Here is the stack trace:
[2015-04-17 06:05:14.972] boot - 26920 INFO [main] --- ContainerLauncherRunner: Running YarnContainer with parameters []
[2015-04-17 06:05:14.972] boot - 26920 INFO [main] --- ContainerLauncherRunner: Container requested that we wait state, setting up latch
[2015-04-17 06:05:14.975] boot - 26920 INFO [main] --- DefaultYarnContainer: Processing 1 @YarnComponent handlers
[2015-04-17 06:05:15.038] boot - 26920 INFO [main] --- Server: jetty-8.0.4.v20111024
[2015-04-17 06:05:15.080] boot - 26920 WARN [main] --- WebInfConfiguration: Web application not found /root/jettywebapps/webapps/Login.war
[2015-04-17 06:05:15.081] boot - 26920 WARN [main] --- WebAppContext: Failed startup of context o.e.j.w.WebAppContext{/Login,null},/root/jettywebapps/webapps/Login.war
java.io.FileNotFoundException: /root/jettywebapps/webapps/Login.war
at org.eclipse.jetty.webapp.WebInfConfiguration.unpack(WebInfConfiguration.java:479)
at org.eclipse.jetty.webapp.WebInfConfiguration.preConfigure(WebInfConfiguration.java:52)
at org.eclipse.jetty.webapp.WebAppContext.preConfigure(WebAppContext.java:416)
at org.eclipse.jetty.webapp.WebAppContext.doStart(WebAppContext.java:452)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:59)
at org.eclipse.jetty.server.handler.HandlerWrapper.doStart(HandlerWrapper.java:89)
at org.eclipse.jetty.server.Server.doStart(Server.java:262)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:59)
at hello.container.HelloPojo.publicVoidNoArgsMethod(HelloPojo.java:40)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.springframework.expression.spel.support.ReflectiveMethodExecutor.execute(ReflectiveMethodExecutor.java:112)
at org.springframework.expression.spel.ast.MethodReference.getValueInternal(MethodReference.java:129)
at org.springframework.expression.spel.ast.MethodReference.access$000(MethodReference.java:49)
at org.springframework.expression.spel.ast.MethodReference$MethodValueRef.getValue(MethodReference.java:342)
at org.springframework.expression.spel.ast.CompoundExpression.getValueInternal(CompoundExpression.java:88)
at org.springframework.expression.spel.ast.SpelNodeImpl.getTypedValue(SpelNodeImpl.java:131)
at org.springframework.expression.spel.standard.SpelExpression.getValue(SpelExpression.java:330)
at org.springframework.yarn.support.AbstractExpressionEvaluator.evaluateExpression(AbstractExpressionEvaluator.java:126)
at org.springframework.yarn.container.ContainerMethodInvokerHelper.processInternal(ContainerMethodInvokerHelper.java:229)
at org.springframework.yarn.container.ContainerMethodInvokerHelper.process(ContainerMethodInvokerHelper.java:115)
at org.springframework.yarn.container.MethodInvokingYarnContainerRuntimeProcessor.process(MethodInvokingYarnContainerRuntimeProcessor.java:51)
at org.springframework.yarn.container.ContainerHandler.handle(ContainerHandler.java:99)
at org.springframework.yarn.container.DefaultYarnContainer.getContainerHandlerResults(DefaultYarnContainer.java:174)
at org.springframework.yarn.container.DefaultYarnContainer.runInternal(DefaultYarnContainer.java:77)
Please help. Thanks.

The container can fetch local files or HDFS files using the following code:
Configuration conf = new Configuration();
FileSystem localFS = FileSystem.get(URI.create("file://localhost"), conf);
OutputStream outATXT = localFS.create(new Path("/home/walterchen/a.txt"));
or
Configuration conf = new Configuration();
FileSystem fs = FileSystem.get(URI.create("hdfs://localhost:9000"), conf);
OutputStream out = fs.create(new Path("/home/a.txt"));
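The same approach could be used to pull the WAR onto the container's local disk before Jetty starts. Below is a minimal sketch, assuming the WAR has been uploaded to HDFS first; the HDFS path `/apps/Login.war` and the `WarFetcher` class name are hypothetical, for illustration only:

```java
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class WarFetcher {
    // Copy the WAR from HDFS to the container's local filesystem, so that
    // Jetty's WebAppContext can resolve it as a plain file path on whichever
    // node YARN happens to schedule the container.
    public static void fetchWar(String fsUri, String remoteWar, String localWar) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(fsUri), conf);
        fs.copyToLocalFile(new Path(remoteWar), new Path(localWar));
    }
}
```

Calling something like `WarFetcher.fetchWar("hdfs://localhost:9000", "/apps/Login.war", "/root/jettywebapps/webapps/Login.war")` at the top of the `@OnContainerStart` method, before `webapp.setWar(...)`, would make the file present locally regardless of which node runs the container.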

Related

Tomcat throws an exception when finishing a Spring Boot application

Sometimes, when Tomcat has been running for a while and I terminate the embedded Tomcat (Ctrl+C), the application throws the following exception:
2019-10-17 10:23:10.704 INFO 20020 --- [ Thread-3] o.s.b.f.support.DisposableBeanAdapter : Invocation of destroy method failed on bean with name 'entityManagerFactory': java.lang.BootstrapMethodError: java.lang.NoClassDefFoundError: org/springframework/orm/hibernate5/SpringBeanContainer$SpringContainedBean
Exception in thread "Thread-3" java.lang.NoClassDefFoundError: org/apache/catalina/Lifecycle$SingleUse
at org.apache.catalina.util.LifecycleBase.stop(LifecycleBase.java:269)
at org.apache.catalina.startup.Tomcat.stop(Tomcat.java:466)
at org.springframework.boot.web.embedded.tomcat.TomcatWebServer.stopTomcat(TomcatWebServer.java:254)
at org.springframework.boot.web.embedded.tomcat.TomcatWebServer.stop(TomcatWebServer.java:309)
at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.stopAndReleaseWebServer(ServletWebServerApplicationContext.java:305)
at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.onClose(ServletWebServerApplicationContext.java:171)
at org.springframework.context.support.AbstractApplicationContext.doClose(AbstractApplicationContext.java:1032)
at org.springframework.context.support.AbstractApplicationContext$1.run(AbstractApplicationContext.java:945)
It seems that the exception is thrown when Tomcat has been running for a long time and my application has not been used.
I checked my application jar file: spring-orm-5.1.8.RELEASE.jar is embedded and the class org/springframework/orm/hibernate5/SpringBeanContainer is there.
Any idea is appreciated.
Try to shut down the application before replacing or renaming the jar.
See: Graceful shutdown fails
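One way to stop the application cleanly before swapping the jar is Spring Boot Actuator's shutdown endpoint. As a hedged suggestion (it assumes the spring-boot-starter-actuator dependency is on the classpath), enabling it looks like this:

```properties
# application.properties -- expose the actuator shutdown endpoint
management.endpoint.shutdown.enabled=true
management.endpoints.web.exposure.include=shutdown
```

A deploy script can then stop the app with `curl -X POST http://localhost:8080/actuator/shutdown` before replacing the jar. Note the endpoint should be secured or firewalled, since it lets anyone who can reach it stop the application.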

Setting SSL keystore at runtime in Apache Felix

I have a service that loads a keystore, but how do I tell Apache Felix to load it at runtime? Jetty 9.4.20 has support for reloading the SSL context with SslContextFactory.reload(), but how do I get the ServerConnector from Felix?
I also tried an approach where I change the bundle properties with a ManagedServiceFactory; however, that fails to start the server. See the example:
public void updateSslContext(ManagedServiceFactory mf) throws Exception {
    Dictionary<String, Object> newProps = new Hashtable<>();
    newProps.put("org.apache.felix.https.keystore", "keystore.jks");
    newProps.put("org.apache.felix.https.keystore.password", "password");
    newProps.put("org.apache.felix.https.keystore.key.password", "1234");
    newProps.put("org.apache.felix.https.enable", true);
    mf.updated("org.apache.felix.http", newProps);
}
It fails with the following error:
[INFO] Started Jetty 9.4.20.v20190813 at port(s) HTTP:8080 HTTPS:8443 on context path / [minThreads=8,maxThreads=200,acceptors=1,selectors=2]
[INFO] Apache Felix Http Whiteboard Service started
...
...
...
ERROR 20191006 21:23:35 bid#4 - Failed to start Connector: ServerConnector@49d09625{HTTP/1.1,[http/1.1]}{0.0.0.0:8080} (java.io.IOException: Failed to bind to 0.0.0.0/0.0.0.0:8080)
info 20191006 21:23:35 bid#4 - org.eclipse.jetty.server.Server :: jetty-9.4.20.v20190813; built: 2019-08-13T21:28:18.144Z; git: 84700530e645e812b336747464d6fbbf370c9a20; jvm 1.8.0_151-b12
info 20191006 21:23:35 bid#4 - org.eclipse.jetty.server.handler.ContextHandler :: Started o.e.j.s.ServletContextHandler@3ea91c64{/,null,AVAILABLE}
ERROR 20191006 21:23:35 bid#4 - Failed to start Connector: ServerConnector@27be6cdc{SSL,[ssl, http/1.1]}{0.0.0.0:8443} (java.io.IOException: Failed to bind to 0.0.0.0/0.0.0.0:8443)
info 20191006 21:23:35 bid#4 - Stopped Jetty.
ERROR 20191006 21:23:35 bid#4 - Jetty stopped (no connectors available)
ERROR 20191006 21:23:35 bid#4 - Exception while initializing Jetty. (java.lang.NullPointerException)
I would prefer to use SslContextFactory.reload() so that the server doesn't restart completely.
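If you can reach the underlying Jetty Server instance (which Felix's HTTP service does not expose through a public API, so this is only a sketch under that assumption), one approach is to walk the connectors and reload each SslContextFactory in place:

```java
import org.eclipse.jetty.server.Connector;
import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.server.SslConnectionFactory;
import org.eclipse.jetty.util.ssl.SslContextFactory;

public class SslReloader {
    // Find each SSL-capable connector on the server and reload its context
    // factory so a new keystore is picked up without restarting Jetty.
    public static void reloadKeystore(Server server, String keystorePath) throws Exception {
        for (Connector connector : server.getConnectors()) {
            SslConnectionFactory ssl = connector.getConnectionFactory(SslConnectionFactory.class);
            if (ssl != null) {
                SslContextFactory factory = ssl.getSslContextFactory();
                // reload() re-creates the SSLContext after the consumer mutates the factory
                factory.reload(f -> f.setKeyStorePath(keystorePath));
            }
        }
    }
}
```

This avoids the full restart that updating the ManagedServiceFactory configuration triggers; the hard part remains obtaining the Server reference from Felix.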

Spring application not terminating on timeout exception

I have created a Spring Boot application to publish messages to a Kafka queue, using Spring Cloud Stream and the Kafka binder as dependencies. The problem is that if the broker is down, my application keeps trying to connect to it for 2 minutes because of the default configuration.
I have reduced that time using the property below, setting it to 1000 ms, and I now get the timeout exception:
spring.kafka.properties.request.timeout.ms=1000
But my Spring application is still running after the exception. I want it to fail if the Kafka broker is not available to connect to. I also tried spring.kafka.admin.fail-fast=true, but the application keeps running.
I have also searched for Spring Cloud Stream and Kafka binder properties that I could set to fail my application if the Kafka broker is not available, but couldn't find anything related to that.
Please help me with this.
Please see the exception log below:
Caused by: java.util.concurrent.ExecutionException: org.apache.kafka.common.errors.TimeoutException: Timed out waiting for a node assignment.
at org.apache.kafka.common.internals.KafkaFutureImpl.wrapAndThrow(KafkaFutureImpl.java:45)
at org.apache.kafka.common.internals.KafkaFutureImpl.access$000(KafkaFutureImpl.java:32)
at org.apache.kafka.common.internals.KafkaFutureImpl$SingleWaiter.await(KafkaFutureImpl.java:104)
at org.apache.kafka.common.internals.KafkaFutureImpl.get(KafkaFutureImpl.java:274)
at org.springframework.cloud.stream.binder.kafka.provisioning.KafkaTopicProvisioner.createTopicAndPartitions(KafkaTopicProvisioner.java:351)
at org.springframework.cloud.stream.binder.kafka.provisioning.KafkaTopicProvisioner.createTopicIfNecessary(KafkaTopicProvisioner.java:325)
at org.springframework.cloud.stream.binder.kafka.provisioning.KafkaTopicProvisioner.createTopic(KafkaTopicProvisioner.java:302)
... 33 common frames omitted
Caused by: org.apache.kafka.common.errors.TimeoutException: Timed out waiting for a node assignment.
2019-05-22 06:06:25 [main] DEBUG o.s.c.s.DefaultLifecycleProcessor - Successfully started bean 'outputBindingLifecycle'
2019-05-22 06:06:25 [main] DEBUG o.s.c.s.DefaultLifecycleProcessor - Starting beans in phase 2147482647
2019-05-22 06:06:25 [main] DEBUG o.s.c.s.binding.BindableProxyFactory - Binding inputs for :interface kafka.stream.RXXXStreams
2019-05-22 06:06:25 [main] DEBUG o.s.c.s.DefaultLifecycleProcessor - Successfully started bean 'inputBindingLifecycle'
2019-05-22 06:06:25 [main] DEBUG o.s.c.s.DefaultLifecycleProcessor - Starting beans in phase 2147483547
2019-05-22 06:06:25 [main] DEBUG o.s.c.s.DefaultLifecycleProcessor - Successfully started bean 'org.springframework.kafka.config.internalKafkaListenerEndpointRegistry'
2019-05-22 06:06:25 [main] DEBUG o.s.b.a.l.ConditionEvaluationReportLoggingListener -
Do you have the spring-boot-web libraries as a dependency? If that's the case, your application will not exit. A full log would also be very helpful.
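There is no built-in binder property for this, but one hedged workaround is to probe the broker yourself at startup with the plain Kafka AdminClient and abort if it does not answer. The class name, bootstrap address, and timeout values below are assumptions for illustration:

```java
import java.util.Properties;
import java.util.concurrent.TimeUnit;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;

public class KafkaStartupProbe {
    // Returns normally if the broker answers a metadata request within the
    // timeout; throws otherwise, so the caller can abort application startup.
    public static void probe(String bootstrapServers, long timeoutSeconds) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        // Keep the client's own timeouts short so close() does not linger
        props.put(AdminClientConfig.REQUEST_TIMEOUT_MS_CONFIG, "2000");
        props.put(AdminClientConfig.DEFAULT_API_TIMEOUT_MS_CONFIG, "2000");
        try (AdminClient admin = AdminClient.create(props)) {
            admin.listTopics().names().get(timeoutSeconds, TimeUnit.SECONDS);
        }
    }
}
```

Calling this from an ApplicationRunner and invoking SpringApplication.exit(...) when it throws would make the application terminate instead of idling after the TimeoutException.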

Spring Boot Project Jar file not reading File placed on classpath

I am connecting my Spring Boot app to Google Cloud SQL. It connects fine when the credential file is placed in src/main/resources, but the issue comes when I try to run the jar file: FileNotFoundException.
:: Spring Boot :: (v2.0.3.RELEASE)
application.properties:
spring.cloud.gcp.credentials.location=classpath:ArpanShoppingApp-863d536d1f93.json
Running the jar file gives an exception:
java -jar CloudSQLConnect-1.0.jar
Exception:
2018-06-22 10:46:38.393 INFO 1172 --- [ main] o.s.c.g.s.a.GcpCloudSqlAutoConfiguration : Default MYSQL JdbcUrl provider. Connecting to jdbc:mysql://google/google_sql?cloudSqlInstance=mindful-highway-207309:asia-south1:shopping-db&socketFactory=com.google.cloud.sql.mysql.SocketFactory&useSSL=false with driver com.mysql.jdbc.Driver
2018-06-22 10:46:38.401 INFO 1172 --- [ main] o.s.c.g.s.a.GcpCloudSqlAutoConfiguration : Error reading Cloud SQL credentials file.
java.io.FileNotFoundException: class path resource [ArpanShoppingApp-863d536d1f93.json] cannot be resolved to absolute file path because it does not reside in the file system: jar:file:/Users/arpan/Documents/workspace-sts-3.8.4.RELEASE/CloudSQLConnect/target/CloudSQLConnect-1.0.jar!/BOOT-INF/classes!/ArpanShoppingApp-863d536d1f93.json
at org.springframework.util.ResourceUtils.getFile(ResourceUtils.java:217) ~[spring-core-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
at org.springframework.core.io.AbstractFileResolvingResource.getFile(AbstractFileResolvingResource.java:133) ~[spring-core-5.0.7.RELEASE.jar!/:5.0.7.RELEASE]
at org.springframework.cloud.gcp.sql.autoconfig.GcpCloudSqlAutoConfiguration.setCredentialsProperty(GcpCloudSqlAutoConfiguration.java:167) [spring-cloud-gcp-starter-sql-1.0.0.M1.jar!/:1.0.0.M1]
at org.springframework.cloud.gcp.sql.autoconfig.GcpCloudSqlAutoConfiguration.defaultJdbcInfoProvider(GcpCloudSqlAutoConfiguration.java:107) [spring-cloud-gcp-starter-sql-1.0.0.M1.jar!/:1.0.0.M1]
at org.springframework.cloud.gcp.sql.autoconfig.GcpCloudSqlAutoConfiguration$$EnhancerBySpringCGLIB$$edf77794.CGLIB$defaultJdbcInfoProvider$1(<generated>) [spring-cloud-gcp-starter-sql-1.0.0.M1.jar!/:1.0.0.M1]
This appears to be a limitation of spring-cloud-gcp: it seems credential files must be on the filesystem and cannot be packaged into a jar. The latest code has a better error message than the M1 version that you're using.
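A common workaround for the "cannot be resolved to absolute file path" class of errors (a generic sketch, not specific to spring-cloud-gcp; the `ResourceToFile` class name is hypothetical) is to copy the classpath resource to a temporary file at startup and point the property at that real path:

```java
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class ResourceToFile {
    // Streams a classpath resource (which may live inside the jar, where it
    // has no File representation) out to a temp file, returning a real
    // filesystem path that File-based APIs can open.
    public static Path extract(String resource) throws Exception {
        try (InputStream in = ResourceToFile.class.getResourceAsStream(resource)) {
            if (in == null) {
                throw new IllegalArgumentException("resource not found: " + resource);
            }
            Path tmp = Files.createTempFile("credentials", ".json");
            Files.copy(in, tmp, StandardCopyOption.REPLACE_EXISTING);
            return tmp;
        }
    }
}
```

The key point is that `getResourceAsStream` works inside a packaged jar, whereas resolving a classpath resource to a `java.io.File` does not.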

Issue with Flume log4j Appender

I'm trying to configure Flume to write Hadoop service logs to a common sink.
Here's what I have added to the HDFS log4j.properties:
# Define the root logger to the system property "hadoop.root.logger".
log4j.rootLogger=${hadoop.root.logger}, flume
#Flume Appender
log4j.appender.flume = org.apache.flume.clients.log4jappender.Log4jAppender
log4j.appender.flume.Hostname = localhost
log4j.appender.flume.Port = 41414
and when I run a sample pi job I get this error:
$ hadoop jar hadoop-mapreduce-examples.jar pi 10 10
log4j:ERROR Could not find value for key log4j.appender.flume.layout
15/11/25 07:23:26 WARN api.NettyAvroRpcClient: Using default maxIOWorkers
log4j:ERROR RPC client creation failed! NettyAvroRpcClient { host: localhost, port: 41414 }: RPC connection error
Exception in thread "main" java.lang.ExceptionInInitializerError
at org.apache.hadoop.util.RunJar.run(RunJar.java:200)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: org.apache.commons.logging.LogConfigurationException: User-specified log class 'org.apache.commons.logging.impl.Log4JLogger' cannot be found or is not useable.
at org.apache.commons.logging.impl.LogFactoryImpl.discoverLogImplementation(LogFactoryImpl.java:804)
at org.apache.commons.logging.impl.LogFactoryImpl.newInstance(LogFactoryImpl.java:541)
at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:292)
at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:269)
at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:657)
at org.apache.hadoop.util.ShutdownHookManager.<clinit>(ShutdownHookManager.java:44)
... 2 more
I have added these jars to the hadoop-hdfs lib:
avro-ipc-1.7.3.jar, flume-ng-log4jappender-1.5.2.2.2.7.1-33.jar, flume-ng-sdk-1.5.2.2.2.7.1-33.jar
and I do have the commons-logging (commons-logging-1.1.3.jar) and log4j (1.2.17) jars present in the HDFS lib. Any pointers to debug this issue?
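The first log4j error ("Could not find value for key log4j.appender.flume.layout") says the appender has no layout configured; one hedged fix is to add one, e.g. a PatternLayout (the conversion pattern below is just an example):

```properties
log4j.appender.flume.layout = org.apache.log4j.PatternLayout
log4j.appender.flume.layout.ConversionPattern = %d{ISO8601} %p %c: %m%n
```

The LogConfigurationException is a separate problem: commons-logging failing to load Log4JLogger typically points to conflicting or missing commons-logging/log4j copies on the classpath, often introduced by the extra jars added to the lib directory.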
