S3 access point as Spark Eventlog Directory - spring-boot

We have a standalone Spring Boot based Spark application where, at the moment, the property spark.eventLog.dir is set to an S3 location:
SparkConf sparkConf = new SparkConf()
        .setMaster("local[*]")
        .setAppName("MyApp")
        .set("spark.hadoop.fs.permissions.umask-mode", "000")
        .set("hive.warehouse.subdir.inherit.perms", "false")
        .set("spark.hadoop.mapreduce.fileoutputcommitter.algorithm.version", "2")
        .set("spark.speculation", "false")
        .set("spark.eventLog.enabled", "true")
        .set("spark.extraListeners", "com.ClassName");
sparkConf.set("spark.eventLog.dir", "s3a://my-bucket-name/eventlog");
This has been working as expected. However, bucket access has now changed to an S3 access point, so the URL has to use the ARN form arn:aws:s3:<bucket-region>:<accountNumber>:accesspoint:<access-point-name>, e.g.:
sparkConf.set("spark.eventLog.dir", "s3a://arn:aws:s3:eu-west-2:1234567890:accesspoint:my-access-point/eventlog");
After this change we get the following stack trace while booting up the app:
java.lang.NullPointerException: null uri host.
at java.base/java.util.Objects.requireNonNull(Objects.java:246)
at org.apache.hadoop.fs.s3native.S3xLoginHelper.buildFSURI(S3xLoginHelper.java:71)
at org.apache.hadoop.fs.s3a.S3AFileSystem.setUri(S3AFileSystem.java:470)
at org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:235)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3303)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:124)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3352)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3320)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:479)
at org.apache.spark.util.Utils$.getHadoopFileSystem(Utils.scala:1866)
at org.apache.spark.scheduler.EventLoggingListener.<init>(EventLoggingListener.scala:71)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:522)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2520)
at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$5(SparkSession.scala:935)
at scala.Option.getOrElse(Option.scala:138)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
Looking at the class S3xLoginHelper, it looks like it fails to create a java.net.URI object because of the : characters in the URL string.
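That parsing behaviour can be reproduced outside Spark. A minimal standalone sketch, using only java.net.URI and the same example ARN as above:

import java.net.URI;

public class UriHostCheck {
    public static void main(String[] args) {
        // The colons in the ARN prevent java.net.URI from extracting a host,
        // which is what S3xLoginHelper's requireNonNull check then trips over.
        URI uri = URI.create("s3a://arn:aws:s3:eu-west-2:1234567890:accesspoint:my-access-point/eventlog");
        System.out.println(uri.getHost());      // prints: null
        System.out.println(uri.getAuthority()); // the whole ARN is parsed as the authority
    }
}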
I have the following relevant Maven dependencies:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>2.4.4</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.12</artifactId>
    <version>2.4.4</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-aws</artifactId>
    <version>3.2.0</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>3.2.0</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>3.2.0</version>
</dependency>
Update:
Also tried adding the following to core-site.xml (and also to hdfs-site.xml), as described in the hadoop-aws documentation:
<property>
    <name>fs.s3a.bucket.my-access-point.accesspoint.arn</name>
    <value>arn:aws:s3:eu-west-2:1234567890:accesspoint:my-access-point</value>
    <description>Configure S3a traffic to use this AccessPoint</description>
</property>
And updated the code to sparkConf.set("spark.eventLog.dir", "s3a://my-access-point/eventlog");
This gives a stack trace with java.io.FileNotFoundException: Bucket my-access-point does not exist, which indicates that those updated properties are not being picked up for spark.eventLog.dir and that my-access-point is being treated as a plain bucket name!
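For completeness, the same per-bucket option can also be passed programmatically through SparkConf using the spark.hadoop. prefix instead of core-site.xml. This is only a sketch under the assumption that the hadoop-aws release on the classpath actually implements fs.s3a.bucket.*.accesspoint.arn; releases that predate access point support silently ignore the option, which would produce exactly this "bucket does not exist" behaviour.

// Sketch: bind the access point per bucket via SparkConf rather than core-site.xml.
// Assumption: the hadoop-aws version in use supports access point ARNs at all;
// otherwise the option is ignored and the name resolves as a plain bucket.
sparkConf.set("spark.hadoop.fs.s3a.bucket.my-access-point.accesspoint.arn",
        "arn:aws:s3:eu-west-2:1234567890:accesspoint:my-access-point");
sparkConf.set("spark.eventLog.dir", "s3a://my-access-point/eventlog");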

Related

Why don't netlib-java native BLAS/LAPACK libraries give a performance improvement?

I am using this piece of code to calculate Spark recommendations:
SparkSession spark = SparkSession
        .builder()
        .appName("SomeAppName")
        .config("spark.master", "local[" + args[2] + "]")
        .config("spark.local.dir", args[4])
        .getOrCreate();
JavaRDD<Rating> ratingsRDD = spark
        .read().textFile(args[0]).javaRDD()
        .map(Rating::parseRating);
Dataset<Row> ratings = spark.createDataFrame(ratingsRDD, Rating.class);
ALS als = new ALS()
        .setMaxIter(Integer.parseInt(args[3]))
        .setRegParam(0.01)
        .setUserCol("userId")
        .setItemCol("movieId")
        .setRatingCol("rating")
        .setImplicitPrefs(true);
ALSModel model = als.fit(ratings);
model.setColdStartStrategy("drop");
Dataset<Row> rowDataset = model.recommendForAllUsers(50);
These are the Maven dependencies that make this piece of code work:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.4.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.4.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-mllib_2.11</artifactId>
    <version>2.4.0</version>
</dependency>
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.11.8</version>
</dependency>
Calculating recommendations with this code takes ~70 sec for my data file. This code produces the following warnings:
WARN BLAS: Failed to load implementation from: com.github.fommil.netlib.NativeSystemBLAS
WARN BLAS: Failed to load implementation from: com.github.fommil.netlib.NativeRefBLAS
WARN LAPACK: Failed to load implementation from: com.github.fommil.netlib.NativeSystemLAPACK
WARN LAPACK: Failed to load implementation from: com.github.fommil.netlib.NativeRefLAPACK
Now I try to enable netlib-java by adding this dependency in Maven:
<dependency>
    <groupId>com.github.fommil.netlib</groupId>
    <artifactId>all</artifactId>
    <version>1.1.2</version>
    <type>pom</type>
</dependency>
To avoid crashing this new environment I had to do this extra trick:
LD_PRELOAD=/usr/lib64/libopenblas.so
Now it works and gives no warnings, but it is slower: it takes ~170 sec on average to perform the same calculation. I am running this on CentOS.
Shouldn't it be faster with native libraries? Is it possible to make it faster?
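As a diagnostic aside, which implementation netlib-java actually loaded can be checked directly (a minimal sketch; it assumes netlib-java is on the classpath, as it is transitively via spark-mllib):

import com.github.fommil.netlib.BLAS;
import com.github.fommil.netlib.LAPACK;

public class BlasCheck {
    public static void main(String[] args) {
        // Prints e.g. NativeSystemBLAS when the native path is used,
        // or F2jBLAS when netlib-java fell back to the pure-Java implementation.
        System.out.println(BLAS.getInstance().getClass().getName());
        System.out.println(LAPACK.getInstance().getClass().getName());
    }
}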
First, you can check your CentOS version; CentOS 6 may not be able to use the native libraries (check this).
As far as I know, the ALS algorithm has been improved since version 2.0; you can check the
Highlights in 2.2.
Judging from the 2.2 source code, the native libraries do not help!

TaskID.<init>(Lorg/apache/hadoop/mapreduce/JobID;Lorg/apache/hadoop/mapreduce/TaskType;I)V

val jobConf = new JobConf(hbaseConf)
jobConf.setOutputFormat(classOf[TableOutputFormat])
jobConf.set(TableOutputFormat.OUTPUT_TABLE, tablename)

val indataRDD = sc.makeRDD(Array("1,jack,15", "2,Lily,16", "3,mike,16"))
val rdd = indataRDD.map(_.split(',')).map { arr =>
  val put = new Put(Bytes.toBytes(arr(0).toInt))
  put.add(Bytes.toBytes("cf"), Bytes.toBytes("name"), Bytes.toBytes(arr(1)))
  put.add(Bytes.toBytes("cf"), Bytes.toBytes("age"), Bytes.toBytes(arr(2).toInt))
  (new ImmutableBytesWritable, put)
}
rdd.saveAsHadoopDataset(jobConf)
When I run Hadoop or Spark jobs, I often encounter this error:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.mapred.TaskID.<init>(Lorg/apache/hadoop/mapreduce/JobID;Lorg/apache/hadoop/mapreduce/TaskType;I)V
at org.apache.spark.SparkHadoopWriter.setIDs(SparkHadoopWriter.scala:158)
at org.apache.spark.SparkHadoopWriter.preSetup(SparkHadoopWriter.scala:60)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1.apply$mcV$sp(PairRDDFunctions.scala:1188)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1.apply(PairRDDFunctions.scala:1161)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1.apply(PairRDDFunctions.scala:1161)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:358)
at org.apache.spark.rdd.PairRDDFunctions.saveAsHadoopDataset(PairRDDFunctions.scala:1161)
at com.iteblog.App$.main(App.scala:62)
at com.iteblog.App.main(App.scala)
At first I thought it was a jar conflict, but I checked carefully and there are no other jars. The Spark and Hadoop versions are:
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.0.1</version>

<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-core</artifactId>
<version>2.6.0-mr1-cdh5.5.0</version>
I found that TaskID and TaskType are both in the hadoop-core jar, but not in the same package. Why can mapred.TaskID refer to mapreduce.TaskType?
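For reference, the failing signature corresponds to a call like the sketch below; it only links when the Hadoop 2.x (MR2) mapreduce classes are on the classpath, which is why adding hadoop-mapreduce-client-core (as in the answer below) resolves it:

import org.apache.hadoop.mapred.TaskID;
import org.apache.hadoop.mapreduce.JobID;
import org.apache.hadoop.mapreduce.TaskType;

public class TaskIdCheck {
    public static void main(String[] args) {
        // The constructor Spark compiled against: mapred.TaskID taking the
        // *mapreduce* JobID and TaskType. The old MR1 hadoop-core jar does not
        // ship this overload, hence the NoSuchMethodError at runtime.
        TaskID taskId = new TaskID(new JobID("jt", 1), TaskType.MAP, 0);
        System.out.println(taskId);
    }
}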
Oh, I have resolved this problem: add the Maven dependency
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-core</artifactId>
    <version>2.6.0-cdh5.5.0</version>
</dependency>
and the error disappears!
I have also faced this issue. It is basically due to a jar conflict.
Add the jar file from Maven spark-core_2.10:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>2.0.2</version>
</dependency>
After changing the jar file, the error is gone.

Bean Validation with JBoss Errai

I want to make a GWT app with the Errai framework, but I am running into some problems with data binding and validation.
My pom.xml:
<dependency>
    <groupId>org.jboss.errai</groupId>
    <artifactId>errai-validation</artifactId>
    <version>${errai.version}</version>
</dependency>
<dependency>
    <groupId>javax.validation</groupId>
    <artifactId>validation-api</artifactId>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>javax.validation</groupId>
    <artifactId>validation-api</artifactId>
    <classifier>sources</classifier>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.hibernate</groupId>
    <artifactId>hibernate-validator</artifactId>
    <version>4.2.0.Final</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.hibernate</groupId>
    <artifactId>hibernate-validator</artifactId>
    <version>4.2.0.Final</version>
    <scope>provided</scope>
    <classifier>sources</classifier>
</dependency>
My app.gwt.xml includes the Errai-Validation and HibernateValidator modules:
<inherits name="org.jboss.errai.validation.Validation" />
<inherits name="org.hibernate.validator.HibernateValidator" />
There are no unresolved dependencies; I have already double-checked this.
When I try to run the application with mvn gwt:run, I get the following error:
java.util.concurrent.ExecutionException: org.jboss.errai.ioc.rebind.ioc.exception.UnsatisfiedDependenciesException: #> org.jboss.errai.ui.nav.client.local.Navigation
- field org.jboss.errai.codegen.meta.MetaField:org.jboss.errai.ui.nav.client.local.Navigation.stateChangeEvent could not be satisfied for type: org.jboss.errai.ioc.client.lifecycle.api.StateChange
Message: can't resolve bean: org.jboss.errai.ioc.client.lifecycle.api.StateChange<java.lang.Object> ( #Default )
at java.util.concurrent.FutureTask.report(FutureTask.java:122)
at java.util.concurrent.FutureTask.get(FutureTask.java:188)
at org.jboss.errai.config.rebind.AsyncGenerators$FutureWrapper.get(AsyncGenerators.java:112)
at org.jboss.errai.config.rebind.AsyncGenerators$FutureWrapper.get(AsyncGenerators.java:86)
at org.jboss.errai.config.rebind.AbstractAsyncGenerator.startAsyncGeneratorsAndWaitFor(AbstractAsyncGenerator.java:100)
at org.jboss.errai.ioc.rebind.ioc.bootstrapper.IOCGenerator.generate(IOCGenerator.java:58)
at com.google.gwt.core.ext.IncrementalGenerator.generateNonIncrementally(IncrementalGenerator.java:40)
at com.google.gwt.dev.javac.StandardGeneratorContext.runGeneratorIncrementally(StandardGeneratorContext.java:657)
at com.google.gwt.dev.cfg.RuleGenerateWith.realize(RuleGenerateWith.java:41)
at com.google.gwt.dev.shell.StandardRebindOracle$Rebinder.rebind(StandardRebindOracle.java:79)
at com.google.gwt.dev.shell.StandardRebindOracle.rebind(StandardRebindOracle.java:276)
at com.google.gwt.dev.shell.ShellModuleSpaceHost.rebind(ShellModuleSpaceHost.java:141)
at com.google.gwt.dev.shell.ModuleSpace.rebind(ModuleSpace.java:595)
at com.google.gwt.dev.shell.ModuleSpace.rebindAndCreate(ModuleSpace.java:465)
at com.google.gwt.dev.shell.GWTBridgeImpl.create(GWTBridgeImpl.java:49)
at com.google.gwt.core.shared.GWT.create(GWT.java:57)
at com.google.gwt.core.client.GWT.create(GWT.java:85)
at org.jboss.errai.ioc.client.Container.bootstrapContainer(Container.java:64)
at org.jboss.errai.ioc.client.Container.onModuleLoad(Container.java:41)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.google.gwt.dev.shell.ModuleSpace.onLoad(ModuleSpace.java:406)
at com.google.gwt.dev.shell.OophmSessionHandler.loadModule(OophmSessionHandler.java:200)
at com.google.gwt.dev.shell.BrowserChannelServer.processConnection(BrowserChannelServer.java:526)
at com.google.gwt.dev.shell.BrowserChannelServer.run(BrowserChannelServer.java:364)
at java.lang.Thread.run(Thread.java:744)
That's why the bootstrap is failing: the application throws an exception in onModuleLoad and does not start.
If I remove the two validation modules I'm able to start the application without any errors.
I'm using the Errai tutorial with version 3.0.1.Final.
Thanks for your help :)
EDIT:
I resolved the error by adding
<inherits name="org.jboss.errai.ui.nav.Navigation" />
to my app.gwt.xml but now I'm running into the next problem with this exception:
java.lang.RuntimeException: Deferred binding failed for 'org.jboss.errai.validation.client.ValidatorFactoryImpl$GwtValidator' (did you forget to inherit a required module?)
at com.google.gwt.dev.shell.GWTBridgeImpl.create(GWTBridgeImpl.java:53)
at com.google.gwt.core.shared.GWT.create(GWT.java:57)
at com.google.gwt.core.client.GWT.create(GWT.java:85)
at org.jboss.errai.validation.client.ValidatorFactoryImpl.createValidator(ValidatorFactoryImpl.java:11)
at com.google.gwt.validation.client.AbstractGwtValidatorFactory.getValidator(AbstractGwtValidatorFactory.java:90)
at org.jboss.errai.validation.client.ValidatorProvider.get(ValidatorProvider.java:37)
at org.jboss.errai.ioc.client.BootstrapperImpl$28.getInstance(BootstrapperImpl.java:432)
at org.jboss.errai.ioc.client.BootstrapperImpl$28.getInstance(BootstrapperImpl.java:1)
at org.jboss.errai.ioc.client.container.IOCDependentBean.getInstance(IOCDependentBean.java:96)
at org.jboss.errai.ioc.client.container.IOCDependentBean.getInstance(IOCDependentBean.java:87)
at org.jboss.errai.ioc.client.container.SyncToAsyncBeanManagerAdapter$1.getInstance(SyncToAsyncBeanManagerAdapter.java:148)
at org.jboss.errai.ui.nav.client.local.spi.GeneratedNavigationGraph$2.produceContent(GeneratedNavigationGraph.java:69)
at org.jboss.errai.ui.nav.client.local.Navigation.maybeShowPage(Navigation.java:304)
at org.jboss.errai.ui.nav.client.local.Navigation.navigate(Navigation.java:249)
at org.jboss.errai.ui.nav.client.local.Navigation.navigate(Navigation.java:230)
at org.jboss.errai.ui.nav.client.local.Navigation.navigate(Navigation.java:225)
at org.jboss.errai.ui.nav.client.local.Navigation.goTo(Navigation.java:191)
at org.jboss.errai.ui.nav.client.local.DefaultNavigationErrorHandler.handleError(DefaultNavigationErrorHandler.java:27)
at org.jboss.errai.ui.nav.client.local.Navigation.goTo(Navigation.java:193)
Is there another module that is missing?
Am I correct that Errai creates the ValidatorFactory and injects the correct instance? So I don't have to create my own ValidatorFactory like here:
GWT Validation Tutorial
Yes, that's correct. You don't have to create your own ValidatorFactory; Errai will do that for you. You can simply @Inject a Validator.
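A minimal sketch of that injection; Customer here is a hypothetical model bean with Bean Validation annotations, standing in for your own class:

import java.util.Set;
import javax.inject.Inject;
import javax.validation.ConstraintViolation;
import javax.validation.Validator;

public class CustomerFormValidator {

    // Errai's IOC container supplies the client-side Validator instance.
    @Inject
    private Validator validator;

    public boolean isValid(Customer customer) {
        // Runs all constraints declared on the (hypothetical) Customer bean.
        Set<ConstraintViolation<Customer>> violations = validator.validate(customer);
        return violations.isEmpty();
    }
}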
I have prepared a version of the Errai tutorial using 3.0.1.Final that shows exactly that (following the instructions from the reference guide). I've put the project on GitHub.
The last error you pasted doesn't contain enough information to investigate why this is failing for you. However, you should see more error information in the devmode console.

java.lang.ClassNotFoundException: Class org.apache.hadoop.hdfs.DistributedFileSystem not found

I am trying to use the Hadoop HDFS Java API to list all files in HDFS.
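For context, the listing code in question would be along these lines (a minimal sketch; the namenode URI and path are assumptions, not taken from the original post):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsLister {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Assumed namenode address; FileSystem.get resolves the hdfs:// scheme to
        // DistributedFileSystem, which is exactly the class that fails to load here.
        conf.set("fs.defaultFS", "hdfs://namenode-host:8020");
        FileSystem fs = FileSystem.get(conf);
        for (FileStatus status : fs.listStatus(new Path("/"))) {
            System.out.println(status.getPath());
        }
    }
}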
I am able to list the files on the remote HDFS by running the code in my local Eclipse.
But I get the exception
java.lang.ClassNotFoundException: Class org.apache.hadoop.hdfs.DistributedFileSystem
org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2290)
org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2303)
org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:87)
org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2342)
org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2324)
org.apache.hadoop.fs.FileSystem.get(FileSystem.java:351)
org.apache.hadoop.fs.FileSystem.get(FileSystem.java:163)
when I execute the code from a web server.
I have added the Maven dependencies below.
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.0.0-cdh4.5.0</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-auth</artifactId>
    <version>2.0.0-cdh4.5.0</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.0.0-cdh4.5.0</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-core</artifactId>
    <version>2.0.0-mr1-cdh4.5.0</version>
</dependency>
I have also embedded the required jars into the exported jar, and Maven has added them to the build path.
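As an editorial aside: a common cause of exactly this exception when everything is merged into one fat jar is that the META-INF/services/org.apache.hadoop.fs.FileSystem entries from hadoop-common and hadoop-hdfs overwrite each other, so the hdfs scheme is never registered. A hedged workaround sketch is to register the implementations explicitly on the Configuration:

import org.apache.hadoop.conf.Configuration;

// Sketch: bypass the ServiceLoader lookup that can break in merged fat jars by
// naming the filesystem implementation classes directly.
Configuration conf = new Configuration();
conf.set("fs.hdfs.impl", org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());
conf.set("fs.file.impl", org.apache.hadoop.fs.LocalFileSystem.class.getName());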
If anyone has encountered this issue before, please share the solution.
I am facing a similar issue with the Apache Hadoop 2.2.0 release. I worked around it by running the job as a separate process:
final Process p = Runtime.getRuntime().exec("java -jar {jarfile} {classfile}");
final Scanner output = new Scanner(p.getErrorStream());
while (output.hasNext()) {
    try {
        System.err.println(output.nextLine());
    } catch (final Exception e) {
        // ignore malformed lines and keep draining the error stream
    }
}
The jar file contains the implementation using the Apache Hadoop 2.2.0 jars. I am still searching for an exact solution, though.
For me, hadoop-hdfs-2.6.0.jar was missing from the Zeppelin server's lib directory. I copied it into the Zeppelin lib folder and my problem was resolved. :)
Also add the dependency for hadoop-hdfs 2.6.0 in pom.xml.
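That dependency would look like the following sketch (standard Hadoop coordinates, version matching the jar above):

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.6.0</version>
</dependency>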

NoClassDefFoundError - javax/el/ExpressionFactory - Tomcat 7 and Spring MVC

I am trying to deploy a little Spring MVC app to Tomcat 7, and I am getting the dreaded exception:
java.lang.NoClassDefFoundError: javax/el/ExpressionFactory
org.apache.jasper.runtime.JspApplicationContextImpl.getExpressionFactory(JspApplicationContextImpl.java:108)
org.apache.jasper.compiler.Validator$ValidateVisitor.<init>(Validator.java:514)
org.apache.jasper.compiler.Validator.validateExDirectives(Validator.java:1795)
org.apache.jasper.compiler.Compiler.generateJava(Compiler.java:217)
org.apache.jasper.compiler.Compiler.compile(Compiler.java:373)
org.apache.jasper.compiler.Compiler.compile(Compiler.java:353)
org.apache.jasper.compiler.Compiler.compile(Compiler.java:340)
org.apache.jasper.JspCompilationContext.compile(JspCompilationContext.java:646)
org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:357)
org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:390)
org.apache.jasper.servlet.JspServlet.service(JspServlet.java:334)
javax.servlet.http.HttpServlet.service(HttpServlet.java:728)
I don't see any of the usual suspects in my deployed WEB-INF/lib (servlet-api.jar, el-api.jar, etc.).
Here is my pom.xml file:
<!-- Servlet -->
<dependency>
    <groupId>javax.servlet</groupId>
    <artifactId>servlet-api</artifactId>
    <version>2.5</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>javax.servlet.jsp</groupId>
    <artifactId>jsp-api</artifactId>
    <version>2.1</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>javax.servlet</groupId>
    <artifactId>jstl</artifactId>
    <version>1.2</version>
    <scope>provided</scope>
</dependency>
I am basically trying to load a simple JSP page. I know Tomcat is choking while compiling the JSP with Jasper because it hits this class-loader issue, but I can't figure out why.
My only guess is that I either need to package a library that I am currently not packaging, or perhaps add a library to tomcat/lib.
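(One way to test that guess is to check whether any EL artifact is being pulled in transitively; the Maven command below is standard, and any javax.el artifact it reports without provided scope ends up in WEB-INF/lib where it can conflict with Tomcat 7's built-in EL classes.)

mvn dependency:tree -Dincludes=javax.el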
Here is the list of jars in my WEB-INF/lib:
activation-1.1.jar
antlr-2.7.6.jar
aopalliance-1.0.jar
aspectjrt-1.6.6.jar
aspectjweaver-1.6.6.jar
commons-collections-3.2.jar
commons-dbcp-1.2.2.jar
commons-fileupload-1.2.1.jar
commons-io-1.4.jar
commons-lang-2.1.jar
commons-logging-1.1.1.jar
commons-pool-1.3.jar
dom4j-1.6.1.jar
ehcache-core-2.3.0.jar
freemarker-2.3.15.jar
groovy-all-1.8.5.jar
guava-12.0.jar
hibernate-commons-annotations-4.0.1.Final.jar
hibernate-core-3.3.1.GA.jar
hibernate-entitymanager-4.1.0.Final.jar
hibernate-jpa-2.0-api-1.0.1.Final.jar
hibernate-validator-4.1.0.Final.jar
hsqldb-1.8.0.7.jar
jacks_2.10-2.1.4.jar
jackson-annotations-2.1.4.jar
jackson-core-2.1.4.jar
jackson-core-asl-1.9.9.jar
jackson-databind-2.1.4.jar
javassist-3.15.0-GA.jar
javax.inject-1.jar
jboss-logging-3.1.0.CR2.jar
jboss-transaction-api_1.1_spec-1.0.0.Final.jar
jcl-over-slf4j-1.6.6.jar
jettison-1.1.jar
joda-time-1.6.2.jar
jsr305-1.3.9.jar
log4j-1.2.17.jar
logback-classic-1.0.7.jar
logback-core-1.0.7.jar
mail-1.4.6-rc1.jar
mysql-connector-java-5.1.18.jar
ognl-3.0.5.jar
scala-compiler-2.10.1.jar
scala-library-2.10.1.jar
scala-reflect-2.10.0.jar
scalap-2.10.1.jar
slf4j-api-1.7.5.jar
slf4j-log4j12-1.6.6.jar
snakeyaml-1.11.jar
spring-aop-3.2.2.RELEASE.jar
spring-batch-admin-manager-1.3.0.BUILD-20130514.094516-16.jar
spring-batch-admin-resources-1.3.0.BUILD-20130514.094516-17.jar
spring-batch-core-2.2.0.BUILD-20130514.030108-700.jar
spring-batch-infrastructure-2.2.0.BUILD-20130514.030108-714.jar
spring-batch-integration-1.3.0.BUILD-20130514.094516-17.jar
spring-beans-3.2.2.RELEASE.jar
spring-context-3.2.2.RELEASE.jar
spring-context-support-3.2.2.RELEASE.jar
spring-core-3.2.2.RELEASE.jar
spring-expression-3.2.2.RELEASE.jar
spring-integration-core-2.2.3.RELEASE.jar
spring-integration-file-2.2.3.RELEASE.jar
spring-integration-groovy-2.2.3.RELEASE.jar
spring-integration-http-2.2.3.RELEASE.jar
spring-integration-jms-2.2.3.RELEASE.jar
spring-integration-jmx-2.2.3.RELEASE.jar
spring-integration-mail-2.2.3.RELEASE.jar
spring-integration-scripting-2.2.3.RELEASE.jar
spring-integration-stream-2.2.3.RELEASE.jar
spring-integration-twitter-2.2.3.RELEASE.jar
spring-integration-ws-2.2.3.RELEASE.jar
spring-integration-xml-2.2.3.RELEASE.jar
spring-jdbc-3.2.2.RELEASE.jar
spring-jms-3.2.2.RELEASE.jar
spring-oxm-1.5.9.jar
spring-oxm-3.2.2.RELEASE.jar
spring-retry-1.0.2.RELEASE.jar
spring-scala-1.0.0.M2.jar
spring-security-config-3.2.0.M1.jar
spring-security-core-3.2.0.M1.jar
spring-security-crypto-3.1.0.RELEASE.jar
spring-security-web-3.2.0.M1.jar
spring-social-core-1.0.1.RELEASE.jar
spring-social-twitter-1.0.1.RELEASE.jar
spring-tx-3.2.2.RELEASE.jar
spring-web-3.2.2.RELEASE.jar
spring-webmvc-3.2.2.RELEASE.jar
spring-ws-core-1.5.9.jar
spring-xml-1.5.9.jar
thymeleaf-2.0.15.jar
thymeleaf-layout-dialect-1.0.5.jar
thymeleaf-spring3-2.0.15.jar
wsdl4j-1.6.1.jar
xml-apis-1.0.b2.jar
xpp3_min-1.1.4c.jar
xstream-1.3.jar
