Does anybody have experience with SimpleScheduledRoutePolicy, or know how to use it to manage a scheduler at runtime, i.e. to change the scheduler timing or to suspend and resume it? I tried using it in a Spring Boot project, but it is not working as documented. Below are my route configuration and a test class I used to test this.
@Component
public class MyRouter extends RouteBuilder {
private static final Logger logger = LoggerFactory.getLogger(MyRouter.class);
@Override
public void configure() throws Exception {
SimpleScheduledRoutePolicy simpleScheduledRoutePolicy = new SimpleScheduledRoutePolicy();
long startTime = System.currentTimeMillis() + 5000L;
simpleScheduledRoutePolicy.setRouteStartDate(new Date(startTime));
simpleScheduledRoutePolicy.setRouteStartRepeatInterval(3000L);
simpleScheduledRoutePolicy.setRouteStartRepeatCount(3);
from("direct:myroute")
.routeId("myroute")
.routePolicy(simpleScheduledRoutePolicy)
.autoStartup(true)
.log(LoggingLevel.INFO, logger, "myroute invoked")
.to("mock:test");
}
}
Test class code
@RunWith(CamelSpringBootRunner.class)
@SpringBootTest
public class MyRouterTest {
@Autowired
private CamelContext camelContext;
@EndpointInject("mock:test")
MockEndpoint resultEndpoint;
@SneakyThrows
@Test
public void myRouteIsScheduledSuccessfully() {
resultEndpoint.expectedMessageCount(2);
Thread.sleep(7000);
resultEndpoint.assertIsSatisfied();
}
}
But I just get the logs below saying the scheduler started; the route is not triggered every 3 seconds as configured in the policy.
I also tried invoking the direct endpoint from the test method, but it still does not work. I am not sure where I am going wrong.
[INFO ] 2020-04-17 15:22:17.928 [main] MyRouterTest - Starting MyRouterTest on PPC11549 with PID 20892 (started by rmr in C:\Data\Telenet\Workspaces\atoms-event-engine)
[DEBUG] 2020-04-17 15:22:17.930 [main] MyRouterTest - Running with Spring Boot v2.2.6.RELEASE, Spring v5.2.5.RELEASE
[INFO ] 2020-04-17 15:22:17.932 [main] MyRouterTest - No active profile set, falling back to default profiles: default
[INFO ] 2020-04-17 15:22:19.634 [main] RepositoryConfigurationDelegate - Bootstrapping Spring Data JDBC repositories in DEFAULT mode.
[INFO ] 2020-04-17 15:22:19.679 [main] RepositoryConfigurationDelegate - Finished Spring Data repository scanning in 36ms. Found 0 JDBC repository interfaces.
[INFO ] 2020-04-17 15:22:20.226 [main] PostProcessorRegistrationDelegate$BeanPostProcessorChecker - Bean 'org.apache.camel.spring.boot.CamelAutoConfiguration' of type [org.apache.camel.spring.boot.CamelAutoConfiguration] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
[INFO ] 2020-04-17 15:22:20.848 [main] HikariDataSource - HikariPool-1 - Starting...
[INFO ] 2020-04-17 15:22:21.437 [main] HikariDataSource - HikariPool-1 - Start completed.
[INFO ] 2020-04-17 15:22:22.602 [main] LRUCacheFactory - Detected and using LURCacheFactory: camel-caffeine-lrucache
[INFO ] 2020-04-17 15:22:24.082 [main] JobRepositoryFactoryBean - No database type set, using meta data indicating: H2
[INFO ] 2020-04-17 15:22:24.120 [main] SimpleJobLauncher - No TaskExecutor has been set, defaulting to synchronous executor.
[INFO ] 2020-04-17 15:22:24.485 [main] ThreadPoolTaskScheduler - Initializing ExecutorService 'taskScheduler'
[INFO ] 2020-04-17 15:22:24.695 [main] SpringBootRoutesCollector - Loading additional Camel XML routes from: classpath:camel/*.xml
[INFO ] 2020-04-17 15:22:24.698 [main] SpringBootRoutesCollector - Loading additional Camel XML rests from: classpath:camel-rest/*.xml
[INFO ] 2020-04-17 15:22:24.727 [main] MyRouterTest - Started MyRouterTest in 7.338 seconds (JVM running for 11.356)
[INFO ] 2020-04-17 15:22:24.729 [main] JobLauncherCommandLineRunner - Running default command line with: []
[INFO ] 2020-04-17 15:22:24.734 [main] CamelAnnotationsHandler - Setting shutdown timeout to [10 SECONDS] on CamelContext with name [camelContext].
[INFO ] 2020-04-17 15:22:24.815 [main] CamelSpringBootExecutionListener - #RunWith(CamelSpringBootRunner.class) before: class com.telenet.atoms.eventengine.camel.MyRouterTest.myRouteIsScheduledSuccessfully
[INFO ] 2020-04-17 15:22:24.817 [main] CamelSpringBootExecutionListener - Initialized CamelSpringBootRunner now ready to start CamelContext
[INFO ] 2020-04-17 15:22:24.818 [main] CamelAnnotationsHandler - Starting CamelContext with name [camelContext].
[INFO ] 2020-04-17 15:22:24.902 [main] DefaultManagementStrategy - JMX is enabled
[INFO ] 2020-04-17 15:22:25.308 [main] AbstractCamelContext - Apache Camel 3.2.0 (CamelContext: camel-1) is starting
[INFO ] 2020-04-17 15:22:25.438 [main] QuartzComponent - Create and initializing scheduler.
[INFO ] 2020-04-17 15:22:25.442 [main] QuartzComponent - Setting org.quartz.scheduler.jmx.export=true to ensure QuartzScheduler(s) will be enlisted in JMX.
[INFO ] 2020-04-17 15:22:25.483 [main] StdSchedulerFactory - Using default implementation for ThreadExecutor
[INFO ] 2020-04-17 15:22:25.487 [main] SimpleThreadPool - Job execution threads will use class loader of thread: main
[INFO ] 2020-04-17 15:22:25.511 [main] SchedulerSignalerImpl - Initialized Scheduler Signaller of type: class org.quartz.core.SchedulerSignalerImpl
[INFO ] 2020-04-17 15:22:25.511 [main] QuartzScheduler - Quartz Scheduler v.2.3.2 created.
[INFO ] 2020-04-17 15:22:25.516 [main] RAMJobStore - RAMJobStore initialized.
[INFO ] 2020-04-17 15:22:25.528 [main] QuartzScheduler - Scheduler meta-data: Quartz Scheduler (v2.3.2) 'DefaultQuartzScheduler-camel-1' with instanceId 'NON_CLUSTERED'
Scheduler class: 'org.quartz.core.QuartzScheduler' - running locally.
NOT STARTED.
Currently in standby mode.
Number of jobs executed: 0
Using thread pool 'org.quartz.simpl.SimpleThreadPool' - with 10 threads.
Using job-store 'org.quartz.simpl.RAMJobStore' - which does not support persistence. and is not clustered.
[INFO ] 2020-04-17 15:22:25.528 [main] StdSchedulerFactory - Quartz scheduler 'DefaultQuartzScheduler-camel-1' initialized from an externally provided properties instance.
[INFO ] 2020-04-17 15:22:25.528 [main] StdSchedulerFactory - Quartz scheduler version: 2.3.2
[INFO ] 2020-04-17 15:22:25.529 [main] AbstractCamelContext - StreamCaching is not in use. If using streams then its recommended to enable stream caching. See more details at http://camel.apache.org/stream-caching.html
[INFO ] 2020-04-17 15:22:25.623 [main] AbstractCamelContext - Route: myroute started and consuming from: direct://myroute
[INFO ] 2020-04-17 15:22:25.625 [main] AbstractCamelContext - Route: orderfault started and consuming from: direct://orderfault
[INFO ] 2020-04-17 15:22:25.637 [main] AbstractCamelContext - Total 2 routes, of which 2 are started
[INFO ] 2020-04-17 15:22:25.638 [main] AbstractCamelContext - Apache Camel 3.2.0 (CamelContext: camel-1) started in 0.329 seconds
[INFO ] 2020-04-17 15:22:25.663 [main] ScheduledRoutePolicy - Scheduled trigger: triggerGroup-myroute.trigger-START-myroute for action: START on route myroute
[INFO ] 2020-04-17 15:22:25.663 [main] ScheduledRoutePolicy - Scheduled trigger: triggerGroup-orderfault.trigger-START-orderfault for action: START on route orderfault
[INFO ] 2020-04-17 15:22:25.664 [main] QuartzComponent - Starting scheduler.
[INFO ] 2020-04-17 15:22:25.665 [main] QuartzScheduler - Scheduler DefaultQuartzScheduler-camel-1_$_NON_CLUSTERED started.
[INFO ] 2020-04-17 15:22:33.101 [main] MockEndpoint - Asserting: mock://test is satisfied
[INFO ] 2020-04-17 15:22:43.134 [main] MyRouterTest - ********************************************************************************
[INFO ] 2020-04-17 15:22:43.135 [main] MyRouterTest - Testing done: myRouteIsScheduledSuccessfully(com.telenet.atoms.eventengine.camel.MyRouterTest)
[INFO ] 2020-04-17 15:22:43.136 [main] MyRouterTest - Took: 17.469 seconds (17469 millis)
[INFO ] 2020-04-17 15:22:43.136 [main] MyRouterTest - ********************************************************************************
[INFO ] 2020-04-17 15:22:43.137 [main] CamelSpringBootExecutionListener - #RunWith(CamelSpringBootRunner.class) after: class com.telenet.atoms.eventengine.camel.MyRouterTest.myRouteIsScheduledSuccessfully
java.lang.AssertionError: mock://test Received message count. Expected: <2> but was: <0>
The SimpleScheduledRoutePolicy works as expected: it starts your Camel route, and that is all it is for, starting and stopping routes.
Judging from your test, I guess that you want a scheduler endpoint that triggers messages at a configured interval. For this you have to use the Camel Timer, Camel Scheduler or Camel Quartz component.
Since your route does not contain any of these, there is simply no scheduler that could fire.
To create a scheduler that (after an initial 5 second delay) fires every 3 seconds, you can for example use this:
from("scheduler://foo?initialDelay=5s&delay=3s")...
Change at runtime (added due to comment)
The Camel Quartz component uses the Quartz scheduler, and Quartz provides JMX access. So if you want to change the scheduler at runtime, this is probably your best option.
To start and stop routes at runtime you should have a look at the Camel Control Bus component.
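A minimal sketch of that idea, using the route id myroute from the question (the direct: trigger endpoints are made up for illustration):

// Send any message to direct:stopMyRoute to stop the scheduled route...
from("direct:stopMyRoute")
    .to("controlbus:route?routeId=myroute&action=stop");

// ...and one to direct:startMyRoute to start it again.
from("direct:startMyRoute")
    .to("controlbus:route?routeId=myroute&action=start");

In Camel 3 the same operations are also available programmatically, e.g. camelContext.getRouteController().stopRoute("myroute").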
I am trying to stream data using the spark-streaming-kafka-0-10_2.11 artifact, version 2.1.1, along with spark-streaming_2.11 version 2.1.1 and kafka_2.11 version 0.10.2.1.
When I start the program, Spark does not stream any data and it does not throw any error either.
Below are my code, dependencies, and response.
Dependencies:
"org.apache.kafka" % "kafka_2.11" % "0.10.2.1",
"org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % "2.1.1",
"org.apache.spark" % "spark-streaming_2.11" % "2.1.1"
Code:
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe
import scala.util.Random

val sparkConf = new SparkConf().setMaster("local[*]").setAppName("kafka-dstream").set("spark.driver.allowMultipleContexts", "true")
val sc = new SparkContext(sparkConf)
val ssc = new StreamingContext(sc, Seconds(5))
val kafkaParams = Map[String, Object](
"bootstrap.servers" -> "ec2-xx-yy-zz-abc.us-west-2.compute.amazonaws.com:9092",
"key.deserializer" -> classOf[StringDeserializer],
"value.deserializer" -> classOf[StringDeserializer],
"group.id" -> ("java-test-consumer-" + random.nextInt()+"-" +
System.currentTimeMillis()),
"auto.offset.reset" -> "latest",
"enable.auto.commit" -> (false: java.lang.Boolean)
)
val topics = Array("test")
val stream = KafkaUtils.createDirectStream[String, String](
ssc,
PreferConsistent,
Subscribe[String, String](topics, kafkaParams)
)
stream.foreachRDD { rdd =>
  rdd.foreachPartition { partition =>
    while (partition.hasNext) {
      System.out.println(">>>>>>>>>>>>>>>>>>>>>>>>>>>" + partition.next().key())
    }
  }
}
ssc.start()
ssc.awaitTermination()
Response (Logs):
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
[info] Loading project definition from D:\Scala_Workspace\LM\project
[info] Set current project to LM (in build file:/D:/Scala_Workspace/LM/)
[info] Running consumer.One
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/06/09 12:43:44 INFO SparkContext: Running Spark version 2.1.1
17/06/09 12:43:44 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/06/09 12:43:44 INFO SecurityManager: Changing view acls to: ee208961
17/06/09 12:43:44 INFO SecurityManager: Changing modify acls to: ee208961
17/06/09 12:43:44 INFO SecurityManager: Changing view acls groups to:
17/06/09 12:43:44 INFO SecurityManager: Changing modify acls groups to:
17/06/09 12:43:44 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(ee208961); groups wi
permissions: Set(ee208961); groups with modify permissions: Set()
17/06/09 12:43:45 INFO Utils: Successfully started service 'sparkDriver' on port 51695.
17/06/09 12:43:45 INFO SparkEnv: Registering MapOutputTracker
17/06/09 12:43:45 INFO SparkEnv: Registering BlockManagerMaster
17/06/09 12:43:45 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
17/06/09 12:43:45 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
17/06/09 12:43:45 INFO DiskBlockManager: Created local directory at C:\Users\EE208961\AppData\Local\Temp\blockmgr-7f3c43f0-d1f5-43fb-b185-a5bf78de4496
17/06/09 12:43:45 INFO MemoryStore: MemoryStore started with capacity 93.3 MB
17/06/09 12:43:45 INFO SparkEnv: Registering OutputCommitCoordinator
17/06/09 12:43:45 INFO Utils: Successfully started service 'SparkUI' on port 4040.
17/06/09 12:43:46 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://10.1.22.70:4040
17/06/09 12:43:46 INFO Executor: Starting executor ID driver on host localhost
17/06/09 12:43:46 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 51706.
17/06/09 12:43:46 INFO NettyBlockTransferService: Server created on 10.1.22.70:51706
17/06/09 12:43:46 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
17/06/09 12:43:46 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 10.1.22.70, 51706, None)
17/06/09 12:43:46 INFO BlockManagerMasterEndpoint: Registering block manager 10.1.22.70:51706 with 93.3 MB RAM, BlockManagerId(driver, 10.1.22.70, 51706,
17/06/09 12:43:46 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 10.1.22.70, 51706, None)
17/06/09 12:43:46 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 10.1.22.70, 51706, None)
17/06/09 12:43:46 WARN KafkaUtils: overriding enable.auto.commit to false for executor
17/06/09 12:43:46 WARN KafkaUtils: overriding auto.offset.reset to none for executor
17/06/09 12:43:46 WARN KafkaUtils: overriding executor group.id to spark-executor-sadad
17/06/09 12:43:46 WARN KafkaUtils: overriding receive.buffer.bytes to 65536 see KAFKA-3135
17/06/09 12:43:47 INFO DirectKafkaInputDStream: Slide time = 5000 ms
17/06/09 12:43:47 INFO DirectKafkaInputDStream: Storage level = Serialized 1x Replicated
17/06/09 12:43:47 INFO DirectKafkaInputDStream: Checkpoint interval = null
17/06/09 12:43:47 INFO DirectKafkaInputDStream: Remember interval = 5000 ms
17/06/09 12:43:47 INFO DirectKafkaInputDStream: Initialized and validated org.apache.spark.streaming.kafka010.DirectKafkaInputDStream#5cda65c2
17/06/09 12:43:47 INFO ForEachDStream: Slide time = 5000 ms
17/06/09 12:43:47 INFO ForEachDStream: Storage level = Serialized 1x Replicated
17/06/09 12:43:47 INFO ForEachDStream: Checkpoint interval = null
17/06/09 12:43:47 INFO ForEachDStream: Remember interval = 5000 ms
17/06/09 12:43:47 INFO ForEachDStream: Initialized and validated org.apache.spark.streaming.dstream.ForEachDStream#1b9c49ce
17/06/09 12:43:47 INFO ConsumerConfig: ConsumerConfig values:
metric.reporters = []
metadata.max.age.ms = 300000
partition.assignment.strategy = [org.apache.kafka.clients.consumer.RangeAssignor]
reconnect.backoff.ms = 50
sasl.kerberos.ticket.renew.window.factor = 0.8
max.partition.fetch.bytes = 1048576
bootstrap.servers = [ec2-54-69-23-196.us-west-2.compute.amazonaws.com:9092]
ssl.keystore.type = JKS
enable.auto.commit = false
sasl.mechanism = GSSAPI
interceptor.classes = null
exclude.internal.topics = true
ssl.truststore.password = null
client.id =
ssl.endpoint.identification.algorithm = null
max.poll.records = 2147483647
check.crcs = true
request.timeout.ms = 40000
heartbeat.interval.ms = 3000
auto.commit.interval.ms = 5000
receive.buffer.bytes = 65536
ssl.truststore.type = JKS
ssl.truststore.location = null
ssl.keystore.password = null
fetch.min.bytes = 1
send.buffer.bytes = 131072
value.deserializer = class org.apache.kafka.common.serialization.StringDeserializer
group.id = sadad
retry.backoff.ms = 100
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
ssl.trustmanager.algorithm = PKIX
ssl.key.password = null
fetch.max.wait.ms = 500
sasl.kerberos.min.time.before.relogin = 60000
connections.max.idle.ms = 540000
session.timeout.ms = 30000
metrics.num.samples = 2
key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer
ssl.protocol = TLS
ssl.provider = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
ssl.keystore.location = null
ssl.cipher.suites = null
security.protocol = PLAINTEXT
ssl.keymanager.algorithm = SunX509
metrics.sample.window.ms = 30000
auto.offset.reset = latest
17/06/09 12:43:47 INFO AppInfoParser: Kafka version : 0.10.2.1
17/06/09 12:43:47 INFO AppInfoParser: Kafka commitId : a7a17cdec9eaa6c5
The code just waits here; it neither throws an error nor streams anything.
What I also observed: if I don't use the "org.apache.kafka" % "kafka_2.11" % "0.10.2.1" dependency, I get
"Exception in thread "main" org.apache.kafka.common.protocol.types.SchemaException: Error reading field 'topic_metadata': Error reading array of size 291941, only 32 bytes available",
so I added the kafka_2.11 (0.10.2.1) dependency.
I am using Spring Boot 1.2.5 and want to do some JUnit 4 testing. I have an Eclipse project that contains the test. During initialization a temporary H2 database is created from a schema.sql and a data.sql. The initialization is fine, but later three of the five tables are empty.
First I had the database created by Spring Boot itself: no code on my side, just the SQL scripts in the resources folder. This runs without any problems, but then I found that in the test three of the five tables are empty. The schema still exists, but the data is gone.
I then changed to manual creation of the H2 database/DataSource bean, and in the same method I checked that all records are present. This made no difference in the test results; I could only show that just after creation the database is populated as expected. It seems that between bean creation and the JUnit test some routine deletes the contents of three specific tables.
package de.unit4.financials;
import javax.sql.DataSource;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseBuilder;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseType;
import org.springframework.test.jdbc.JdbcTestUtils;
@SpringBootApplication
public class JUnitConfig {
Logger logger = LogManager.getLogger();
@Bean
public DataSource getDataSource() {
DataSource dataSource = new EmbeddedDatabaseBuilder().setType(EmbeddedDatabaseType.H2).setName("FanUtilDB")
.addScript("classpath:schema.sql").addScript("classpath:data.sql").build();
logger.info("Datasource "+dataSource);
testDB(new JdbcTemplate(dataSource));
return dataSource;
}
public void testDB(JdbcTemplate jdbcTemplate) {
countTableRows("oas_company", jdbcTemplate);
countTableRows("oas_agm", jdbcTemplate);
countTableRows("oas_agmlist", jdbcTemplate);
countTableRows("com_usr", jdbcTemplate);
countTableRows("com_capab", jdbcTemplate);
}
private void countTableRows(String name, JdbcTemplate jdbcTemplate) {
int anzahl = JdbcTestUtils.countRowsInTable(jdbcTemplate, name);
logger.info(name + " = " + anzahl);
}
}
This is the output:
08:58:16.007 [main] INFO Datasource org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseFactory$EmbeddedDataSourceProxy#60094a13 || de.unit4.financials.JUnitConfig getDataSource 54
08:58:16.197 [main] INFO oas_company = 28 || de.unit4.financials.JUnitConfig countTableRows 74
08:58:16.199 [main] INFO oas_agm = 2 || de.unit4.financials.JUnitConfig countTableRows 74
08:58:16.201 [main] INFO oas_agmlist = 2 || de.unit4.financials.JUnitConfig countTableRows 74
08:58:16.203 [main] INFO com_usr = 52 || de.unit4.financials.JUnitConfig countTableRows 74
08:58:16.205 [main] INFO com_capab = 17 || de.unit4.financials.JUnitConfig countTableRows 74
Later the JUnit test is run and it gives me this result:
08:58:19.099 [main] INFO Datasource org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseFactory$EmbeddedDataSourceProxy#60094a13 || de.unit4.financials.periods.CurrentPeriodDBFactory getCurrentPeriod 63
08:58:19.127 [main] INFO oas_company = 0 || de.unit4.financials.FinancialsUtilApplicationTests countTableRows 40
08:58:19.128 [main] INFO oas_agm = 0 || de.unit4.financials.FinancialsUtilApplicationTests countTableRows 40
08:58:19.128 [main] INFO oas_agmlist = 0 || de.unit4.financials.FinancialsUtilApplicationTests countTableRows 40
08:58:19.129 [main] INFO com_usr = 52 || de.unit4.financials.FinancialsUtilApplicationTests countTableRows 40
08:58:19.130 [main] INFO com_capab = 17 || de.unit4.financials.FinancialsUtilApplicationTests countTableRows 40
The DataSource object seems to be the same, but the record count is different. During the test, the first three are empty.
Here is the test:
package de.unit4.financials;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.SpringApplicationConfiguration;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.test.jdbc.JdbcTestUtils;
@RunWith(SpringJUnit4ClassRunner.class)
@SpringApplicationConfiguration(classes = JUnitConfig.class)
public class FinancialsUtilApplicationTests {
static Logger logger = LogManager.getLogger();
@Autowired
JdbcTemplate jdbcTemplate;
@Test
public void contextLoads() {
}
@Test
public void testDB() {
countTableRows("oas_company");
countTableRows("oas_agm");
countTableRows("oas_agmlist");
countTableRows("com_usr");
countTableRows("com_capab");
}
/**
* @param name
*/
private void countTableRows(String name) {
int anzahl = JdbcTestUtils.countRowsInTable(jdbcTemplate, name);
logger.info(name + " = " + anzahl);
}
}
Here is the full console output:
08:58:12.555 [main] DEBUG o.s.t.c.j.SpringJUnit4ClassRunner - SpringJUnit4ClassRunner constructor called with [class de.unit4.financials.periods.PeriodRangeTest].
08:58:12.581 [main] DEBUG o.s.test.context.BootstrapUtils - Instantiating TestContextBootstrapper from class [org.springframework.test.context.support.DefaultTestContextBootstrapper]
08:58:12.607 [main] DEBUG o.s.t.c.s.DefaultTestContextBootstrapper - Found explicit ContextLoader class [org.springframework.boot.test.SpringApplicationContextLoader] for context configuration attributes [ContextConfigurationAttributes#32e6e9c3 declaringClass = 'de.unit4.financials.periods.PeriodRangeTest', classes = '{class de.unit4.financials.JUnitConfig}', locations = '{}', inheritLocations = true, initializers = '{}', inheritInitializers = true, name = [null], contextLoaderClass = 'org.springframework.boot.test.SpringApplicationContextLoader']
08:58:12.618 [main] DEBUG o.s.t.c.support.ActiveProfilesUtils - Could not find an 'annotation declaring class' for annotation type [org.springframework.test.context.ActiveProfiles] and class [de.unit4.financials.periods.PeriodRangeTest]
08:58:12.627 [main] DEBUG o.s.t.c.s.DefaultTestContextBootstrapper - #TestExecutionListeners is not present for class [de.unit4.financials.periods.PeriodRangeTest]: using defaults.
08:58:12.644 [main] INFO o.s.t.c.s.DefaultTestContextBootstrapper - Loaded default TestExecutionListener class names from location [META-INF/spring.factories]: [org.springframework.test.context.web.ServletTestExecutionListener, org.springframework.test.context.support.DependencyInjectionTestExecutionListener, org.springframework.test.context.support.DirtiesContextTestExecutionListener, org.springframework.test.context.transaction.TransactionalTestExecutionListener, org.springframework.test.context.jdbc.SqlScriptsTestExecutionListener]
08:58:12.672 [main] INFO o.s.t.c.s.DefaultTestContextBootstrapper - Could not instantiate TestExecutionListener [org.springframework.test.context.web.ServletTestExecutionListener]. Specify custom listener classes or make the default listener classes (and their required dependencies) available. Offending class: [javax/servlet/ServletContext]
08:58:12.688 [main] INFO o.s.t.c.s.DefaultTestContextBootstrapper - Using TestExecutionListeners: [org.springframework.test.context.support.DependencyInjectionTestExecutionListener#73ad2d6, org.springframework.test.context.support.DirtiesContextTestExecutionListener#7085bdee, org.springframework.test.context.transaction.TransactionalTestExecutionListener#1ce92674, org.springframework.test.context.jdbc.SqlScriptsTestExecutionListener#5700d6b1]
08:58:12.692 [main] DEBUG o.s.t.annotation.ProfileValueUtils - Retrieved #ProfileValueSourceConfiguration [null] for test class [de.unit4.financials.periods.PeriodRangeTest]
08:58:12.694 [main] DEBUG o.s.t.annotation.ProfileValueUtils - Retrieved ProfileValueSource type [class org.springframework.test.annotation.SystemProfileValueSource] for class [de.unit4.financials.periods.PeriodRangeTest]
08:58:12.718 [main] DEBUG o.s.t.c.j.SpringJUnit4ClassRunner - SpringJUnit4ClassRunner constructor called with [class de.unit4.financials.periods.CurrentPeriodDBFactoryTest].
08:58:12.718 [main] DEBUG o.s.test.context.BootstrapUtils - Instantiating TestContextBootstrapper from class [org.springframework.test.context.support.DefaultTestContextBootstrapper]
08:58:12.720 [main] DEBUG o.s.t.c.s.DefaultTestContextBootstrapper - Found explicit ContextLoader class [org.springframework.boot.test.SpringApplicationContextLoader] for context configuration attributes [ContextConfigurationAttributes#4566e5bd declaringClass = 'de.unit4.financials.periods.CurrentPeriodDBFactoryTest', classes = '{class de.unit4.financials.JUnitConfig}', locations = '{}', inheritLocations = true, initializers = '{}', inheritInitializers = true, name = [null], contextLoaderClass = 'org.springframework.boot.test.SpringApplicationContextLoader']
08:58:12.721 [main] DEBUG o.s.t.c.support.ActiveProfilesUtils - Could not find an 'annotation declaring class' for annotation type [org.springframework.test.context.ActiveProfiles] and class [de.unit4.financials.periods.CurrentPeriodDBFactoryTest]
08:58:12.722 [main] DEBUG o.s.t.c.s.DefaultTestContextBootstrapper - #TestExecutionListeners is not present for class [de.unit4.financials.periods.CurrentPeriodDBFactoryTest]: using defaults.
08:58:12.727 [main] INFO o.s.t.c.s.DefaultTestContextBootstrapper - Loaded default TestExecutionListener class names from location [META-INF/spring.factories]: [org.springframework.test.context.web.ServletTestExecutionListener, org.springframework.test.context.support.DependencyInjectionTestExecutionListener, org.springframework.test.context.support.DirtiesContextTestExecutionListener, org.springframework.test.context.transaction.TransactionalTestExecutionListener, org.springframework.test.context.jdbc.SqlScriptsTestExecutionListener]
08:58:12.729 [main] INFO o.s.t.c.s.DefaultTestContextBootstrapper - Could not instantiate TestExecutionListener [org.springframework.test.context.web.ServletTestExecutionListener]. Specify custom listener classes or make the default listener classes (and their required dependencies) available. Offending class: [javax/servlet/ServletContext]
08:58:12.729 [main] INFO o.s.t.c.s.DefaultTestContextBootstrapper - Using TestExecutionListeners: [org.springframework.test.context.support.DependencyInjectionTestExecutionListener#2d928643, org.springframework.test.context.support.DirtiesContextTestExecutionListener#5025a98f, org.springframework.test.context.transaction.TransactionalTestExecutionListener#49993335, org.springframework.test.context.jdbc.SqlScriptsTestExecutionListener#20322d26]
08:58:12.729 [main] DEBUG o.s.t.annotation.ProfileValueUtils - Retrieved #ProfileValueSourceConfiguration [null] for test class [de.unit4.financials.periods.CurrentPeriodDBFactoryTest]
08:58:12.730 [main] DEBUG o.s.t.annotation.ProfileValueUtils - Retrieved ProfileValueSource type [class org.springframework.test.annotation.SystemProfileValueSource] for class [de.unit4.financials.periods.CurrentPeriodDBFactoryTest]
08:58:12.733 [main] DEBUG o.s.t.c.j.SpringJUnit4ClassRunner - SpringJUnit4ClassRunner constructor called with [class de.unit4.financials.periods.PeriodTest].
08:58:12.734 [main] DEBUG o.s.test.context.BootstrapUtils - Instantiating TestContextBootstrapper from class [org.springframework.test.context.support.DefaultTestContextBootstrapper]
08:58:12.736 [main] DEBUG o.s.t.c.s.DefaultTestContextBootstrapper - Found explicit ContextLoader class [org.springframework.boot.test.SpringApplicationContextLoader] for context configuration attributes [ContextConfigurationAttributes#64bf3bbf declaringClass = 'de.unit4.financials.periods.PeriodTest', classes = '{class de.unit4.financials.JUnitConfig}', locations = '{}', inheritLocations = true, initializers = '{}', inheritInitializers = true, name = [null], contextLoaderClass = 'org.springframework.boot.test.SpringApplicationContextLoader']
08:58:12.737 [main] DEBUG o.s.t.c.support.ActiveProfilesUtils - Could not find an 'annotation declaring class' for annotation type [org.springframework.test.context.ActiveProfiles] and class [de.unit4.financials.periods.PeriodTest]
08:58:12.738 [main] DEBUG o.s.t.c.s.DefaultTestContextBootstrapper - #TestExecutionListeners is not present for class [de.unit4.financials.periods.PeriodTest]: using defaults.
08:58:12.743 [main] INFO o.s.t.c.s.DefaultTestContextBootstrapper - Loaded default TestExecutionListener class names from location [META-INF/spring.factories]: [org.springframework.test.context.web.ServletTestExecutionListener, org.springframework.test.context.support.DependencyInjectionTestExecutionListener, org.springframework.test.context.support.DirtiesContextTestExecutionListener, org.springframework.test.context.transaction.TransactionalTestExecutionListener, org.springframework.test.context.jdbc.SqlScriptsTestExecutionListener]
08:58:12.744 [main] INFO o.s.t.c.s.DefaultTestContextBootstrapper - Could not instantiate TestExecutionListener [org.springframework.test.context.web.ServletTestExecutionListener]. Specify custom listener classes or make the default listener classes (and their required dependencies) available. Offending class: [javax/servlet/ServletContext]
08:58:12.745 [main] INFO o.s.t.c.s.DefaultTestContextBootstrapper - Using TestExecutionListeners: [org.springframework.test.context.support.DependencyInjectionTestExecutionListener#544fe44c, org.springframework.test.context.support.DirtiesContextTestExecutionListener#31610302, org.springframework.test.context.transaction.TransactionalTestExecutionListener#71318ec4, org.springframework.test.context.jdbc.SqlScriptsTestExecutionListener#21213b92]
08:58:12.745 [main] DEBUG o.s.t.annotation.ProfileValueUtils - Retrieved #ProfileValueSourceConfiguration [null] for test class [de.unit4.financials.periods.PeriodTest]
08:58:12.745 [main] DEBUG o.s.t.annotation.ProfileValueUtils - Retrieved ProfileValueSource type [class org.springframework.test.annotation.SystemProfileValueSource] for class [de.unit4.financials.periods.PeriodTest]
08:58:12.752 [main] DEBUG o.s.t.c.j.SpringJUnit4ClassRunner - SpringJUnit4ClassRunner constructor called with [class de.unit4.financials.FinancialsUtilApplicationTests].
08:58:12.753 [main] DEBUG o.s.test.context.BootstrapUtils - Instantiating TestContextBootstrapper from class [org.springframework.test.context.support.DefaultTestContextBootstrapper]
08:58:12.762 [main] DEBUG o.s.t.c.s.DefaultTestContextBootstrapper - Found explicit ContextLoader class [org.springframework.boot.test.SpringApplicationContextLoader] for context configuration attributes [ContextConfigurationAttributes#3108bc declaringClass = 'de.unit4.financials.FinancialsUtilApplicationTests', classes = '{class de.unit4.financials.JUnitConfig}', locations = '{}', inheritLocations = true, initializers = '{}', inheritInitializers = true, name = [null], contextLoaderClass = 'org.springframework.boot.test.SpringApplicationContextLoader']
08:58:12.762 [main] DEBUG o.s.t.c.support.ActiveProfilesUtils - Could not find an 'annotation declaring class' for annotation type [org.springframework.test.context.ActiveProfiles] and class [de.unit4.financials.FinancialsUtilApplicationTests]
08:58:12.763 [main] DEBUG o.s.t.c.s.DefaultTestContextBootstrapper - #TestExecutionListeners is not present for class [de.unit4.financials.FinancialsUtilApplicationTests]: using defaults.
08:58:12.769 [main] INFO o.s.t.c.s.DefaultTestContextBootstrapper - Loaded default TestExecutionListener class names from location [META-INF/spring.factories]: [org.springframework.test.context.web.ServletTestExecutionListener, org.springframework.test.context.support.DependencyInjectionTestExecutionListener, org.springframework.test.context.support.DirtiesContextTestExecutionListener, org.springframework.test.context.transaction.TransactionalTestExecutionListener, org.springframework.test.context.jdbc.SqlScriptsTestExecutionListener]
08:58:12.770 [main] INFO o.s.t.c.s.DefaultTestContextBootstrapper - Could not instantiate TestExecutionListener [org.springframework.test.context.web.ServletTestExecutionListener]. Specify custom listener classes or make the default listener classes (and their required dependencies) available. Offending class: [javax/servlet/ServletContext]
08:58:12.770 [main] INFO o.s.t.c.s.DefaultTestContextBootstrapper - Using TestExecutionListeners: [org.springframework.test.context.support.DependencyInjectionTestExecutionListener#335eadca, org.springframework.test.context.support.DirtiesContextTestExecutionListener#210366b4, org.springframework.test.context.transaction.TransactionalTestExecutionListener#eec5a4a, org.springframework.test.context.jdbc.SqlScriptsTestExecutionListener#2b2948e2]
08:58:12.770 [main] DEBUG o.s.t.annotation.ProfileValueUtils - Retrieved #ProfileValueSourceConfiguration [null] for test class [de.unit4.financials.FinancialsUtilApplicationTests]
08:58:12.770 [main] DEBUG o.s.t.annotation.ProfileValueUtils - Retrieved ProfileValueSource type [class org.springframework.test.annotation.SystemProfileValueSource] for class [de.unit4.financials.FinancialsUtilApplicationTests]
08:58:12.786 [main] DEBUG o.s.t.annotation.ProfileValueUtils - Retrieved #ProfileValueSourceConfiguration [null] for test class [de.unit4.financials.periods.PeriodRangeTest]
08:58:12.787 [main] DEBUG o.s.t.annotation.ProfileValueUtils - Retrieved ProfileValueSource type [class org.springframework.test.annotation.SystemProfileValueSource] for class [de.unit4.financials.periods.PeriodRangeTest]
08:58:12.788 [main] DEBUG o.s.t.annotation.ProfileValueUtils - Retrieved #ProfileValueSourceConfiguration [null] for test class [de.unit4.financials.periods.PeriodRangeTest]
08:58:12.789 [main] DEBUG o.s.t.annotation.ProfileValueUtils - Retrieved ProfileValueSource type [class org.springframework.test.annotation.SystemProfileValueSource] for class [de.unit4.financials.periods.PeriodRangeTest]
08:58:12.790 [main] DEBUG o.s.t.annotation.ProfileValueUtils - Retrieved #ProfileValueSourceConfiguration [null] for test class [de.unit4.financials.periods.PeriodRangeTest]
08:58:12.790 [main] DEBUG o.s.t.annotation.ProfileValueUtils - Retrieved ProfileValueSource type [class org.springframework.test.annotation.SystemProfileValueSource] for class [de.unit4.financials.periods.PeriodRangeTest]
08:58:12.793 [main] DEBUG o.s.t.annotation.ProfileValueUtils - Retrieved #ProfileValueSourceConfiguration [null] for test class [de.unit4.financials.periods.PeriodRangeTest]
08:58:12.793 [main] DEBUG o.s.t.annotation.ProfileValueUtils - Retrieved ProfileValueSource type [class org.springframework.test.annotation.SystemProfileValueSource] for class [de.unit4.financials.periods.PeriodRangeTest]
08:58:13.324 [main] DEBUG o.s.t.c.s.DependencyInjectionTestExecutionListener - Performing dependency injection for test context [[DefaultTestContext#6193932a testClass = PeriodRangeTest, testInstance = de.unit4.financials.periods.PeriodRangeTest#647fd8ce, testMethod = [null], testException = [null], mergedContextConfiguration = [MergedContextConfiguration#159f197 testClass = PeriodRangeTest, locations = '{}', classes = '{class de.unit4.financials.JUnitConfig}', contextInitializerClasses = '[]', activeProfiles = '{}', propertySourceLocations = '{}', propertySourceProperties = '{}', contextLoader = 'org.springframework.boot.test.SpringApplicationContextLoader', parent = [null]]]].
08:58:13.395 [main] DEBUG o.s.core.env.StandardEnvironment - Adding [systemProperties] PropertySource with lowest search precedence
08:58:13.398 [main] DEBUG o.s.core.env.StandardEnvironment - Adding [systemEnvironment] PropertySource with lowest search precedence
08:58:13.398 [main] DEBUG o.s.core.env.StandardEnvironment - Initialized StandardEnvironment with PropertySources [systemProperties,systemEnvironment]
08:58:13.401 [main] DEBUG o.s.core.env.StandardEnvironment - Adding [integrationTest] PropertySource with search precedence immediately lower than [systemEnvironment]
. ____ _ __ _ _
/\\ / ___'_ __ _ _(_)_ __ __ _ \ \ \ \
( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \
\\/ ___)| |_)| | | | | || (_| | ) ) ) )
' |____| .__|_| |_|_| |_\__, | / / / /
=========|_|==============|___/=/_/_/_/
:: Spring Boot :: (v1.2.5.RELEASE)
2015-07-20 08:58:13.766 INFO 8552 --- [ main] d.u.financials.periods.PeriodRangeTest : Starting PeriodRangeTest on gsender with PID 8552 (C:\Users\gsender\Documents\workspace-libs\FinancialsUtility\target\test-classes started by GSender in C:\Users\gsender\Documents\workspace-libs\FinancialsUtility)
2015-07-20 08:58:13.812 INFO 8552 --- [ main] s.c.a.AnnotationConfigApplicationContext : Refreshing org.springframework.context.annotation.AnnotationConfigApplicationContext#524d6d96: startup date [Mon Jul 20 08:58:13 CEST 2015]; root of context hierarchy
2015-07-20 08:58:15.572 INFO 8552 --- [ main] o.s.j.d.e.EmbeddedDatabaseFactory : Creating embedded database 'FanUtilDB'
2015-07-20 08:58:15.823 INFO 8552 --- [ main] o.s.jdbc.datasource.init.ScriptUtils : Executing SQL script from class path resource [schema.sql]
2015-07-20 08:58:15.851 INFO 8552 --- [ main] o.s.jdbc.datasource.init.ScriptUtils : Executed SQL script from class path resource [schema.sql] in 28 ms.
2015-07-20 08:58:15.851 INFO 8552 --- [ main] o.s.jdbc.datasource.init.ScriptUtils : Executing SQL script from class path resource [data.sql]
2015-07-20 08:58:15.990 INFO 8552 --- [ main] o.s.jdbc.datasource.init.ScriptUtils : Executed SQL script from class path resource [data.sql] in 139 ms.
08:58:16.007 [main] INFO Datasource org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseFactory$EmbeddedDataSourceProxy#60094a13 || de.unit4.financials.JUnitConfig getDataSource 54
08:58:16.197 [main] INFO oas_company = 28 || de.unit4.financials.JUnitConfig countTableRows 74
08:58:16.199 [main] INFO oas_agm = 2 || de.unit4.financials.JUnitConfig countTableRows 74
08:58:16.201 [main] INFO oas_agmlist = 2 || de.unit4.financials.JUnitConfig countTableRows 74
08:58:16.203 [main] INFO com_usr = 52 || de.unit4.financials.JUnitConfig countTableRows 74
08:58:16.205 [main] INFO com_capab = 17 || de.unit4.financials.JUnitConfig countTableRows 74
2015-07-20 08:58:16.271 INFO 8552 --- [ main] o.s.jdbc.datasource.init.ScriptUtils : Executing SQL script from URL [file:/C:/Users/gsender/Documents/workspace-libs/FinancialsUtility/target/test-classes/schema.sql]
2015-07-20 08:58:16.285 INFO 8552 --- [ main] o.s.jdbc.datasource.init.ScriptUtils : Executed SQL script from URL [file:/C:/Users/gsender/Documents/workspace-libs/FinancialsUtility/target/test-classes/schema.sql] in 14 ms.
2015-07-20 08:58:16.289 INFO 8552 --- [ main] o.s.jdbc.datasource.init.ScriptUtils : Executing SQL script from URL [file:/C:/Users/gsender/Documents/workspace-libs/FinancialsUtility/target/test-classes/data.sql]
2015-07-20 08:58:16.391 INFO 8552 --- [ main] o.s.jdbc.datasource.init.ScriptUtils : Executed SQL script from URL [file:/C:/Users/gsender/Documents/workspace-libs/FinancialsUtility/target/test-classes/data.sql] in 101 ms.
2015-07-20 08:58:16.555 INFO 8552 --- [ main] j.LocalContainerEntityManagerFactoryBean : Building JPA container EntityManagerFactory for persistence unit 'default'
2015-07-20 08:58:16.590 INFO 8552 --- [ main] o.hibernate.jpa.internal.util.LogHelper : HHH000204: Processing PersistenceUnitInfo [
name: default
...]
2015-07-20 08:58:16.682 INFO 8552 --- [ main] org.hibernate.Version : HHH000412: Hibernate Core {4.3.10.Final}
2015-07-20 08:58:16.684 INFO 8552 --- [ main] org.hibernate.cfg.Environment : HHH000206: hibernate.properties not found
2015-07-20 08:58:16.686 INFO 8552 --- [ main] org.hibernate.cfg.Environment : HHH000021: Bytecode provider name : javassist
2015-07-20 08:58:16.979 INFO 8552 --- [ main] o.hibernate.annotations.common.Version : HCANN000001: Hibernate Commons Annotations {4.0.5.Final}
2015-07-20 08:58:17.136 INFO 8552 --- [ main] org.hibernate.dialect.Dialect : HHH000400: Using dialect: org.hibernate.dialect.H2Dialect
2015-07-20 08:58:17.424 INFO 8552 --- [ main] o.h.h.i.ast.ASTQueryTranslatorFactory : HHH000397: Using ASTQueryTranslatorFactory
2015-07-20 08:58:18.153 INFO 8552 --- [ main] org.hibernate.tool.hbm2ddl.SchemaExport : HHH000227: Running hbm2ddl schema export
2015-07-20 08:58:18.178 INFO 8552 --- [ main] org.hibernate.tool.hbm2ddl.SchemaExport : HHH000230: Schema export complete
2015-07-20 08:58:19.063 INFO 8552 --- [ main] d.u.financials.periods.PeriodRangeTest : Started PeriodRangeTest in 5.659 seconds (JVM running for 7.356)
08:58:19.099 [main] INFO Datasource org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseFactory$EmbeddedDataSourceProxy#60094a13 || de.unit4.financials.periods.CurrentPeriodDBFactory getCurrentPeriod 63
08:58:19.127 [main] INFO oas_company = 0 || de.unit4.financials.FinancialsUtilApplicationTests countTableRows 40
08:58:19.128 [main] INFO oas_agm = 0 || de.unit4.financials.FinancialsUtilApplicationTests countTableRows 40
08:58:19.128 [main] INFO oas_agmlist = 0 || de.unit4.financials.FinancialsUtilApplicationTests countTableRows 40
08:58:19.129 [main] INFO com_usr = 52 || de.unit4.financials.FinancialsUtilApplicationTests countTableRows 40
08:58:19.130 [main] INFO com_capab = 17 || de.unit4.financials.FinancialsUtilApplicationTests countTableRows 40
2015-07-20 08:58:19.134 INFO 8552 --- [ Thread-1] s.c.a.AnnotationConfigApplicationContext : Closing org.springframework.context.annotation.AnnotationConfigApplicationContext#524d6d96: startup date [Mon Jul 20 08:58:13 CEST 2015]; root of context hierarchy
2015-07-20 08:58:19.386 INFO 8552 --- [ Thread-1] j.LocalContainerEntityManagerFactoryBean : Closing JPA EntityManagerFactory for persistence unit 'default'
2015-07-20 08:58:19.387 INFO 8552 --- [ Thread-1] org.hibernate.tool.hbm2ddl.SchemaExport : HHH000227: Running hbm2ddl schema export
2015-07-20 08:58:19.398 INFO 8552 --- [ Thread-1] org.hibernate.tool.hbm2ddl.SchemaExport : HHH000230: Schema export complete
For table creation I use (example):
drop table IF EXISTS oas_company;
CREATE TABLE IF NOT EXISTS oas_company (
code varchar(12) NOT NULL,
code_cs int NOT NULL,
For data inserts I use:
delete from oas_agm;
INSERT INTO oas_agm(code, tstamp, name, sname, adddate, deldate, moddate, usrname)
VALUES('U4SW-JUNIT-1', 1, 'Account Test Entwicklung', 'debit', '2015-07-15 00:00:00.0', NULL, '2015-07-15 15:31:39.0', 'INSTALL');
Thanks for any help on this confusing result.
I found the solution: Hibernate recreates the tables after the initialization done by Spring Boot. I added spring.jpa.hibernate.ddl-auto=none to application.properties and the problem was gone.
This is the first time this has happened to me in a project, and it is also unclear why only three out of five tables were recreated. At first I assumed it was due to an "old" database file, but after switching from H2 to HSQL the problem stayed. A Hibernate error ("unable to delete...") showed me the path to the solution.
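For reference, a minimal application.properties entry for the fix. Without it, Spring Boot defaults spring.jpa.hibernate.ddl-auto to create-drop for embedded databases, so Hibernate drops and recreates every table that is mapped as a JPA entity; assuming the three affected tables are exactly the mapped ones, that would also explain why only they were emptied.

# Stop Hibernate from exporting (dropping and recreating) the schema on startup.
spring.jpa.hibernate.ddl-auto=none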
I have a Spring Boot application that listens to a RabbitMQ queue. The problem is that when I run my application it hangs at a particular step during Hibernate initialization and takes around 10 minutes to continue.
Below is where it hangs:
INFO [] org.apache.catalina.core.ContainerBase.[Tomcat].[localhost].[/] - Initializing Spring embedded WebApplicationContext
INFO [] org.springframework.web.context.ContextLoader - Root WebApplicationContext: initialization completed in 3338 ms
INFO [] org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor - Initializing ExecutorService 'metricsExecutor'
INFO [] org.springframework.boot.context.embedded.ServletRegistrationBean - Mapping servlet: 'dispatcherServlet' to [/]
INFO [] org.springframework.boot.context.embedded.FilterRegistrationBean - Mapping filter: 'metricFilter' to: [/*]
INFO [] org.springframework.boot.context.embedded.FilterRegistrationBean - Mapping filter: 'characterEncodingFilter' to: [/*]
INFO [] org.springframework.boot.context.embedded.FilterRegistrationBean - Mapping filter: 'webRequestLoggingFilter' to: [/*]
INFO [] org.springframework.boot.context.embedded.FilterRegistrationBean - Mapping filter: 'hiddenHttpMethodFilter' to: [/*]
INFO [] org.springframework.boot.context.embedded.FilterRegistrationBean - Mapping filter: 'applicationContextIdFilter' to: [/*]
[main] INFO [] org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean - Building JPA container EntityManagerFactory for persistence unit 'default'
[main] INFO [] org.hibernate.jpa.internal.util.LogHelper - HHH000204: Processing PersistenceUnitInfo [
name: default
...]
[main] INFO [] org.hibernate.Version - HHH000412: Hibernate Core {4.3.7.Final}
[main] INFO [] org.hibernate.cfg.Environment - HHH000205: Loaded properties from resource hibernate.properties: {hibernate.dialect=org.hibernate.dialect.PostgreSQLDialect, hibernate.show_sql=true, hibernate.bytecode.use_reflection_optimizer=false, hibernate.format_sql=true, hibernate.ejb.naming_strategy=org.hibernate.cfg.DefaultNamingStrategy}
[main] INFO [] org.hibernate.cfg.Environment - HHH000021: Bytecode provider name : javassist
[main] INFO [] org.hibernate.annotations.common.Version - HCANN000001: Hibernate Commons Annotations {4.0.5.Final}
Below is the timing info:
2015-07-08 09:31:16,714 [main] INFO [] org.hibernate.Version - HHH000412: Hibernate Core {4.3.7.Final}
2015-07-08 09:31:16,717 [main] INFO [] org.hibernate.cfg.Environment - HHH000205: Loaded properties from resource hibernate.properties: {hibernate.dialect=org.hibernate.dialect.PostgreSQLDialect, hibernate.show_sql=true, hibernate.bytecode.use_reflection_optimizer=false, hibernate.format_sql=true, hibernate.ejb.naming_strategy=org.hibernate.cfg.DefaultNamingStrategy}
2015-07-08 09:31:16,717 [main] INFO [] org.hibernate.cfg.Environment - HHH000021: Bytecode provider name : javassist
2015-07-08 09:31:16,895 [main] INFO [] org.hibernate.annotations.common.Version - HCANN000001: Hibernate Commons Annotations {4.0.5.Final}
At the line above it hangs for 8 minutes, and only then does it produce the log below and start waiting for messages.
15-07-14 15:36:22,917 [main] INFO [] com.test.myApp.reporting.service.Application
- Starting Application on hyd-rupakular-m.local with PID 654 (/Users/myUser/code/myRepo/target/classes started by rupakulr in /Users/myuser/myRepo/xyzabc)
2015-07-14 15:36:22,966 [main] INFO [] org.springframework.boot.context.embedded.AnnotationConfigEmbeddedWebApplicationContext - Refreshing org.springframework.boot.context.embedded.AnnotationConfigEmbeddedWebApplicationContext#5fa0d903: startup date [Tue Jul 14 15:36:22 IST 2015]; root of context hierarchy
2015-07-14 15:36:24,023 [main] INFO [] org.springframework.beans.factory.xml.XmlBeanDefinitionReader - Loading XML bean definitions from class path resource [spring-integration-context.xml]
2015-07-14 15:36:24,332 [main] INFO [] org.springframework.beans.factory.config.PropertiesFactoryBean - Loading properties file from URL [jar:file:/Users/rupakulr/.m2/repository/org/springframework/integration/spring-integration-core/4.1.2.RELEASE/spring-integration-core-4.1.2.RELEASE.jar!/META-INF/spring.integration.default.properties]
2015-07-14 15:45:09,646 [main] INFO [] org.springframework.boot.actuate.endpoint.mvc.EndpointHandlerMapping - Mapped "{[/manage/env],methods=[GET],params=[],headers=[],consumes=[],produces=[],custom=[]}" onto public java.lang.Object org.springframework.boot.actuate.endpoint.mvc.EndpointMvcAdapter.invoke()
2015-07-14 15:45:09,646 [main] INFO [] org.springframework.boot.actuate.endpoint.mvc.EndpointHandlerMapping - Mapped "{[/manage/metrics/{name:.*}],methods=[GET],params=[],headers=[],consumes=[],produces=[],custom=[]}" onto public java.lang.Object org.springframework.boot.actuate.endpoint.mvc.MetricsMvcEndpoint.value(java.lang.String)
2015-07-14 15:45:09,646 [main] INFO [] org.springframework.boot.actuate.endpoint.mvc.EndpointHandlerMapping - Mapped "{[/manage/metrics],methods=[GET],params=[],headers=[],consumes=[],produces=[],custom=[]}" onto public java.lang.Object org.springframework.boot.actuate.endpoint.mvc.EndpointMvcAdapter.invoke()
2015-07-14 15:45:09,647 [main] INFO [] org.springframework.boot.actuate.endpoint.mvc.EndpointHandlerMapping - Mapped "{[/manage/mappings],methods=[GET],params=[],headers=[],consumes=[],produces=[],custom=[]}" onto public java.lang.Object org.springframework.boot.actuate.endpoint.mvc.EndpointMvcAdapter.invoke()
2015-07-14 15:45:09,647 [main] INFO [] org.springframework.boot.actuate.endpoint.mvc.EndpointHandlerMapping - Mapped "{[/manage/trace],methods=[GET],params=[],headers=[],consumes=[],produces=[],custom=[]}" onto public java.lang.Object org.springframework.boot.actuate.endpoint.mvc.EndpointMvcAdapter.invoke()
2015-07-14 15:45:09,647 [main] INFO [] org.springframework.boot.actuate.endpoint.mvc.EndpointHandlerMapping - Mapped "{[/manage/shutdown],methods=[POST],params=[],headers=[],consumes=[],produces=[],custom=[]}" onto public java.lang.Object org.springframework.boot.actuate.endpoint.mvc.ShutdownMvcEndpoint.invoke()
2015-07-14 15:45:09,647 [main] INFO [] org.springframework.boot.actuate.endpoint.mvc.EndpointHandlerMapping - Mapped "{[/manage/beans],methods=[GET],params=[],headers=[],consumes=[],produces=[],custom=[]}" onto public java.lang.Object org.springframework.boot.actuate.endpoint.mvc.EndpointMvcAdapter.invoke()
2015-07-14 15:45:09,700 [main] INFO [] org.springframework.jmx.export.annotation.AnnotationMBeanExporter - Registering beans for JMX exposure on startup
2015-07-14 15:45:09,708 [main] INFO [] org.springframework.jmx.export.annotation.AnnotationMBeanExporter - Bean with name 'org.springframework.integration.channel.interceptor.WireTap#0' has been autodetected for JMX exposure
2015-07-14 15:45:09,708 [main] INFO [] org.springframework.jmx.export.annotation.AnnotationMBeanExporter - Bean with name 'org.springframework.integration.channel.interceptor.WireTap#1' has been autodetected for JMX exposure
2015-07-14 15:45:09,709 [main] INFO [] org.springframework.jmx.export.annotation.AnnotationMBeanExporter - Bean with name 'org.springframework.integration.config.RouterFactoryBean#0' has been autodetected for JMX exposure
2015-07-14 15:45:09,712 [main] INFO [] org.springframework.jmx.export.annotation.AnnotationMBeanExporter - Located managed bean 'org.springframework.integration.channel.interceptor.WireTap#0': registering with JMX server as MBean [org.springframework.integration.channel.interceptor:name=org.springframework.integration.channel.interceptor.WireTap#0,type=WireTap]
2015-07-14 15:45:09,726 [main] INFO [] org.springframework.jmx.export.annotation.AnnotationMBeanExporter - Located managed bean 'org.springframework.integration.channel.interceptor.WireTap#1': registering with JMX server as MBean [org.springframework.integration.channel.interceptor:name=org.springframework.integration.channel.interceptor.WireTap#1,type=WireTap]
2015-07-14 15:45:09,730 [main] INFO [] org.springframework.jmx.export.annotation.AnnotationMBeanExporter - Located managed bean 'org.springframework.integration.config.RouterFactoryBean#0': registering with JMX server as MBean [org.springframework.integration.router:name=org.springframework.integration.config.RouterFactoryBean#0,type=HeaderValueRouter]
2015-07-14 15:45:09,745 [main] INFO [] org.springframework.boot.actuate.endpoint.jmx.EndpointMBeanExporter - Registering beans for JMX exposure on startup
2015-07-14 15:45:09,749 [main] INFO [] org.springframework.context.support.DefaultLifecycleProcessor - Starting beans in phase -2147483648
2015-07-14 15:45:09,750 [main] INFO [] org.springframework.context.support.DefaultLifecycleProcessor - Starting beans in phase 0
2015-07-14 15:45:09,751 [main] INFO [] org.springframework.integration.endpoint.EventDrivenConsumer - Adding {router} as a subscriber to the 'reporting-dealer-compliance-dealer-list-channel' channel
2015-07-14 15:45:09,751 [main] INFO [] org.springframework.integration.channel.DirectChannel - Channel 'application:8204.reporting-dealer-compliance-dealer-list-channel' has 1 subscriber(s).
2015-07-14 15:45:09,751 [main] INFO [] org.springframework.integration.endpoint.EventDrivenConsumer - started org.springframework.integration.config.ConsumerEndpointFactoryBean#0
2015-07-14 15:45:09,751 [main] INFO [] org.springframework.integration.endpoint.EventDrivenConsumer - Adding {object-to-json-transformer} as a subscriber to the 'reporting-dealer-compliance-dealer-compliance-json-channel' channel
2015-07-14 15:45:09,751 [main] INFO [] org.springframework.integration.channel.DirectChannel - Channel 'application:8204.reporting-dealer-compliance-dealer-compliance-json-channel' has 1 subscriber(s).
2015-07-14 15:45:09,751 [main] INFO [] org.springframework.integration.endpoint.EventDrivenConsumer - started org.springframework.integration.config.ConsumerEndpointFactoryBean#1
2015-07-14 15:45:09,751 [main] INFO [] org.springframework.integration.endpoint.EventDrivenConsumer - Adding {logging-channel-adapter:logging-channel.adapter} as a subscriber to the 'logging-channel' channel
2015-07-14 15:45:09,751 [main] INFO [] org.springframework.integration.channel.DirectChannel - Channel 'application:8204.logging-channel' has 1 subscriber(s).
2015-07-14 15:45:09,751 [main] INFO [] org.springframework.integration.endpoint.EventDrivenConsumer - started logging-channel.adapter
2015-07-14 15:45:09,751 [main] INFO [] org.springframework.integration.endpoint.EventDrivenConsumer - Adding {stream:outbound-channel-adapter(character):std-out-channel.adapter} as a subscriber to the 'std-out-channel' channel
2015-07-14 15:45:09,751 [main] INFO [] org.springframework.integration.channel.DirectChannel - Channel 'application:8204.std-out-channel' has 1 subscriber(s).
2015-07-14 15:45:09,751 [main] INFO [] org.springframework.integration.endpoint.EventDrivenConsumer - started std-out-channel.adapter
2015-07-14 15:45:10,968 [main] INFO [] org.springframework.integration.amqp.inbound.AmqpInboundGateway - started org.springframework.integration.amqp.inbound.AmqpInboundGateway#0
2015-07-14 15:45:10,968 [main] INFO [] org.springframework.integration.endpoint.EventDrivenConsumer - Adding {amqp:outbound-channel-adapter:invalidMessageChannelAdapter} as a subscriber to the 'invalid-message-channel' channel
2015-07-14 15:45:10,968 [main] INFO [] org.springframework.integration.channel.DirectChannel - Channel 'application:8204.invalid-message-channel' has 2 subscriber(s).
2015-07-14 15:45:10,968 [main] INFO [] org.springframework.integration.endpoint.EventDrivenConsumer - started invalidMessageChannelAdapter
2015-07-14 15:45:10,968 [main] INFO [] org.springframework.integration.endpoint.EventDrivenConsumer - Adding {service-activator} as a subscriber to the 'failed-channel' channel
2015-07-14 15:45:10,968 [main] INFO [] org.springframework.integration.channel.DirectChannel - Channel 'application:8204.failed-channel' has 1 subscriber(s).
2015-07-14 15:45:10,969 [main] INFO [] org.springframework.integration.endpoint.EventDrivenConsumer - started org.springframework.integration.config.ConsumerEndpointFactoryBean#2
2015-07-14 15:45:10,969 [main] INFO [] org.springframework.integration.endpoint.EventDrivenConsumer - Adding {amqp:outbound-channel-adapter:failedMessageChannelAdapter} as a subscriber to the 'failed-channel' channel
2015-07-14 15:45:10,969 [main] INFO [] org.springframework.integration.channel.DirectChannel - Channel 'application:8204.failed-channel' has 2 subscriber(s).
2015-07-14 15:45:10,969 [main] INFO [] org.springframework.integration.endpoint.EventDrivenConsumer - started failedMessageChannelAdapter
2015-07-14 15:45:10,969 [main] INFO [] org.springframework.integration.handler.MessageHandlerChain - started org.springframework.integration.handler.MessageHandlerChain#0
2015-07-14 15:45:10,969 [main] INFO [] org.springframework.integration.endpoint.EventDrivenConsumer - Adding {chain} as a subscriber to the 'reporting-dealer-compliance-inbound-channel' channel
2015-07-14 15:45:10,969 [main] INFO [] org.springframework.integration.channel.DirectChannel - Channel 'application:8204.reporting-dealer-compliance-inbound-channel' has 1 subscriber(s).
2015-07-14 15:45:10,969 [main] INFO [] org.springframework.integration.endpoint.EventDrivenConsumer - started org.springframework.integration.config.ConsumerEndpointFactoryBean#3
2015-07-14 15:45:10,969 [main] INFO [] org.springframework.integration.handler.MessageHandlerChain - started org.springframework.integration.handler.MessageHandlerChain#1
2015-07-14 15:45:10,969 [main] INFO [] org.springframework.integration.endpoint.EventDrivenConsumer - Adding {chain} as a subscriber to the 'prepare-csv' channel
2015-07-14 15:45:10,969 [main] INFO [] org.springframework.integration.channel.DirectChannel - Channel 'application:8204.prepare-csv' has 1 subscriber(s).
2015-07-14 15:45:10,969 [main] INFO [] org.springframework.integration.endpoint.EventDrivenConsumer - started org.springframework.integration.config.ConsumerEndpointFactoryBean#4
2015-07-14 15:45:10,971 [main] INFO [] org.springframework.boot.actuate.endpoint.jmx.EndpointMBeanExporter - Located managed bean 'requestMappingEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=requestMappingEndpoint]
2015-07-14 15:45:10,979 [main] INFO [] org.springframework.boot.actuate.endpoint.jmx.EndpointMBeanExporter - Located managed bean 'environmentEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=environmentEndpoint]
2015-07-14 15:45:10,986 [main] INFO [] org.springframework.boot.actuate.endpoint.jmx.EndpointMBeanExporter - Located managed bean 'healthEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=healthEndpoint]
2015-07-14 15:45:10,992 [main] INFO [] org.springframework.boot.actuate.endpoint.jmx.EndpointMBeanExporter - Located managed bean 'beansEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=beansEndpoint]
2015-07-14 15:45:10,997 [main] INFO [] org.springframework.boot.actuate.endpoint.jmx.EndpointMBeanExporter - Located managed bean 'infoEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=infoEndpoint]
2015-07-14 15:45:11,003 [main] INFO [] org.springframework.boot.actuate.endpoint.jmx.EndpointMBeanExporter - Located managed bean 'metricsEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=metricsEndpoint]
2015-07-14 15:45:11,009 [main] INFO [] org.springframework.boot.actuate.endpoint.jmx.EndpointMBeanExporter - Located managed bean 'traceEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=traceEndpoint]
2015-07-14 15:45:11,014 [main] INFO [] org.springframework.boot.actuate.endpoint.jmx.EndpointMBeanExporter - Located managed bean 'dumpEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=dumpEndpoint]
2015-07-14 15:45:11,020 [main] INFO [] org.springframework.boot.actuate.endpoint.jmx.EndpointMBeanExporter - Located managed bean 'autoConfigurationAuditEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=autoConfigurationAuditEndpoint]
2015-07-14 15:45:11,026 [main] INFO [] org.springframework.boot.actuate.endpoint.jmx.EndpointMBeanExporter - Located managed bean 'shutdownEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=shutdownEndpoint]
2015-07-14 15:45:11,032 [main] INFO [] org.springframework.boot.actuate.endpoint.jmx.EndpointMBeanExporter - Located managed bean 'configurationPropertiesReportEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=configurationPropertiesReportEndpoint]
2015-07-14 15:45:11,037 [main] INFO [] org.springframework.integration.endpoint.EventDrivenConsumer - Adding {logging-channel-adapter:_org.springframework.integration.errorLogger} as a subscriber to the 'errorChannel' channel
2015-07-14 15:45:11,037 [main] INFO [] org.springframework.integration.channel.PublishSubscribeChannel - Channel 'application:8204.errorChannel' has 1 subscriber(s).
2015-07-14 15:45:11,037 [main] INFO [] org.springframework.integration.endpoint.EventDrivenConsumer - started _org.springframework.integration.errorLogger
2015-07-14 15:45:11,037 [main] INFO [] org.springframework.context.support.DefaultLifecycleProcessor - Starting beans in phase 2147483647
2015-07-14 15:45:11,089 [main] INFO [] org.apache.coyote.http11.Http11NioProtocol - Initializing ProtocolHandler ["http-nio-8204"]
2015-07-14 15:45:11,095 [main] INFO [] org.apache.coyote.http11.Http11NioProtocol - Starting ProtocolHandler ["http-nio-8204"]
2015-07-14 15:45:11,100 [main] INFO [] org.apache.tomcat.util.net.NioSelectorPool - Using a shared selector for servlet write/read
2015-07-14 15:45:11,112 [main] INFO [] org.springframework.boot.context.embedded.tomcat.TomcatEmbeddedServletContainer - Tomcat started on port(s): 8204 (http)
2015-07-14 15:45:11,115 [main] INFO [] com.test.myApp.reporting.service.Application - Started Application in 528.444 seconds (JVM running for 529.101)
We are experiencing a lot of problems developing our app: every time we make a change, we have to wait 8 minutes before we can test it.
If you are running your application on Linux, try adding this property to the Java command line:
-Djava.security.egd=file:/dev/./urandom
E.g.:
/usr/bin/java -Xmx75m -Djava.security.egd=file:/dev/./urandom -jar app.jar
I had a similar problem (my application took around 3 minutes to start, not 8), and this solved it for me.
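If you want to confirm that blocking entropy reads are really what is stalling startup before changing the flag, a minimal timing check such as the following can help (our illustration, not from the original answers; requires Java 8+ for getInstanceStrong):
import java.security.SecureRandom;

// Illustrative check (not from the original answer): time how long it takes to
// pull seed bytes from the strong, typically blocking, SecureRandom source.
// On a VM starved of entropy this can stall for a long time, which is the
// delay the -Djava.security.egd flag works around.
public class EntropyCheck {
    public static void main(String[] args) throws Exception {
        long start = System.nanoTime();
        SecureRandom strong = SecureRandom.getInstanceStrong(); // usually /dev/random on Linux
        strong.generateSeed(16);
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;
        System.out.println("Strong SecureRandom seeded in " + elapsedMs + " ms");
    }
}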
I think I faced a similar issue. In my case the startup time was proportional to the database size, and the root cause was that Hibernate (we use hibernate-core-5.0.2.Final) loads the full metadata from the database just to obtain the Dialect value.
We now use the following property to disable this behavior:
spring.jpa.properties.hibernate.temp.use_jdbc_metadata_defaults=false
But make sure you specify the correct 'spring.jpa.database-platform' value.
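For example, in a Spring Boot application.properties the two settings might look like this (the PostgreSQL dialect below is only an illustration; use the dialect class that matches your database):
spring.jpa.database-platform=org.hibernate.dialect.PostgreSQL9Dialect
spring.jpa.properties.hibernate.temp.use_jdbc_metadata_defaults=false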
Here is the relevant part of JdbcEnvironmentInitiator.java:
@Override
public JdbcEnvironment initiateService(Map configurationValues, ServiceRegistryImplementor registry) {
final DialectFactory dialectFactory = registry.getService( DialectFactory.class );
// 'hibernate.temp.use_jdbc_metadata_defaults' is a temporary magic value.
// The need for it is intended to be alleviated with future development, thus it is
// not defined as an Environment constant...
//
// it is used to control whether we should consult the JDBC metadata to determine
// certain Settings default values; it is useful to *not* do this when the database
// may not be available (mainly in tools usage).
boolean useJdbcMetadata = ConfigurationHelper.getBoolean(
"hibernate.temp.use_jdbc_metadata_defaults",
configurationValues,
true
);
if ( useJdbcMetadata ) {
final JdbcConnectionAccess jdbcConnectionAccess = buildJdbcConnectionAccess( configurationValues, registry );
try {
final Connection connection = jdbcConnectionAccess.obtainConnection();
try {
final DatabaseMetaData dbmd = connection.getMetaData();
if ( log.isDebugEnabled() ) {
log.debugf(
"Database ->\n"
+ " name : %s\n"
+ " version : %s\n"
+ " major : %s\n"
+ " minor : %s",
dbmd.getDatabaseProductName(),
dbmd.getDatabaseProductVersion(),
dbmd.getDatabaseMajorVersion(),
dbmd.getDatabaseMinorVersion()
);
log.debugf(
"Driver ->\n"
+ " name : %s\n"
+ " version : %s\n"
+ " major : %s\n"
+ " minor : %s",
dbmd.getDriverName(),
dbmd.getDriverVersion(),
dbmd.getDriverMajorVersion(),
dbmd.getDriverMinorVersion()
);
log.debugf( "JDBC version : %s.%s", dbmd.getJDBCMajorVersion(), dbmd.getJDBCMinorVersion() );
}
Dialect dialect = dialectFactory.buildDialect(
configurationValues,
new DialectResolutionInfoSource() {
@Override
public DialectResolutionInfo getDialectResolutionInfo() {
try {
return new DatabaseMetaDataDialectResolutionInfoAdapter( connection.getMetaData() );
}
catch ( SQLException sqlException ) {
throw new HibernateException(
"Unable to access java.sql.DatabaseMetaData to determine appropriate Dialect to use",
sqlException
);
}
}
}
);
return new JdbcEnvironmentImpl(
registry,
dialect,
dbmd
);
}
catch (SQLException e) {
log.unableToObtainConnectionMetadata( e.getMessage() );
}
finally {
try {
jdbcConnectionAccess.releaseConnection( connection );
}
catch (SQLException ignore) {
}
}
}
catch (Exception e) {
log.unableToObtainConnectionToQueryMetadata( e.getMessage() );
}
}
// if we get here, either we were asked to not use JDBC metadata or accessing the JDBC metadata failed.
return new JdbcEnvironmentImpl( registry, dialectFactory.buildDialect( configurationValues, null ) );
}
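Note the fallback on the last line: when 'hibernate.temp.use_jdbc_metadata_defaults' is false (or the connection fails), the dialect is built with a null DialectResolutionInfoSource, so Hibernate has nothing to detect the database from and relies entirely on the explicitly configured dialect. That is why 'spring.jpa.database-platform' must be set correctly when you disable the metadata lookup.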
I am writing a mapper class that should read files from an HDFS location and create a record (using a custom class) for each file. The code for the mapper class:
package com.nayan.bigdata.hadoop;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.log4j.Logger;
/**
* @file : FileToRecordMapper.java
* @author : nayan
* @version : 1.0.0
* @date : 27-Aug-2013 12:13:44 PM
* @desc : Mapper class to read files and convert them into records.
*/
public class FileToRecordMapper extends
Mapper<LongWritable, Text, Text, RecordWritable> {
private static Logger logger = Logger.getLogger(FileToRecordMapper.class);
List<Path> allPaths;
FileSystem fs;
@Override
protected void cleanup(Context context)
throws IOException, InterruptedException {
logger.info("Inside cleanup method.");
}
@Override
protected void map(LongWritable key, Text value,
Context context)
throws IOException, InterruptedException {
logger.info("Starting map method of FileToRecordMapper class.");
for(Path path : allPaths) {
FSDataInputStream in = this.fs.open(path);
Text filePath = new Text(path.getName());
Text directoryPath = new Text(path.getParent().getName());
Text filename = new Text(path.getName().substring(path.getName().lastIndexOf('/') + 1,
path.getName().length()));
byte[] b = new byte[1024];
StringBuilder contentBuilder = new StringBuilder();
while ((in.read(b)) > 0) {
contentBuilder.append(new String(b, "UTF-8"));
}
Text fileContent = new Text(contentBuilder.toString());
in.close();
RecordWritable record = new RecordWritable(filePath, filename,
fileContent, new LongWritable(System.currentTimeMillis()));
logger.info("Record Created : " + record);
context.write(directoryPath, record);
logger.info("map method of FileToRecordMapper class completed.");
}
}
@Override
public void run(Context context)
throws IOException, InterruptedException {
logger.info("Inside run method.");
}
@Override
protected void setup(Context context)
throws IOException, InterruptedException {
logger.info("Inside setup method.");
try {
logger.info("Starting configure method of FileToRecordMapper class.");
fs = FileSystem.get(context.getConfiguration());
Path path = new Path(context.getConfiguration().get("mapred.input.dir"));
allPaths = getAllPaths(path);
} catch (IOException e) {
logger.error("Error while fetching paths.", e);
}
logger.info("Paths : " + ((null != allPaths) ? allPaths : "null"));
logger.info("configure method of FileToRecordMapper class completed.");
super.setup(context);
}
private List<Path> getAllPaths(Path path) throws IOException {
ArrayList<Path> paths = new ArrayList<Path>();
getAllPaths(path, paths);
return paths;
}
private void getAllPaths(Path path, List<Path> paths) throws IOException{
try {
if (!this.fs.isFile(path)) {
for (FileStatus s : fs.listStatus(path)) {
getAllPaths(s.getPath(), paths);
}
} else {
paths.add(path);
}
} catch (IOException e) {
logger.error("File System Exception.", e);
throw e;
}
}
}
The record class:
package com.nayan.bigdata.hadoop;
import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.Writable;
/**
* @file : RecordWritable.java
* @author : nayan
* @version : 1.0.0
* @date : 21-Aug-2013 1:53:12 PM
* @desc : Class to create a record in Accumulo
*/
public class RecordWritable implements Writable {
private Text filePath;
private Text fileName;
private Text fileContent;
private LongWritable timeStamp;
public RecordWritable() {
this.filePath = new Text();
this.fileName = new Text();
this.fileContent = new Text();
this.timeStamp = new LongWritable(System.currentTimeMillis());
}
/**
* @param filePath
* @param fileName
* @param fileContent
* @param timeStamp
*/
public RecordWritable(Text filePath, Text fileName, Text fileContent,
LongWritable timeStamp) {
this.filePath = filePath;
this.fileName = fileName;
this.fileContent = fileContent;
this.timeStamp = timeStamp;
}
public Text getFilePath() {
return filePath;
}
public void setFilePath(Text filePath) {
this.filePath = filePath;
}
public Text getFileName() {
return fileName;
}
public void setFileName(Text fileName) {
this.fileName = fileName;
}
public Text getFileContent() {
return fileContent;
}
public void setFileContent(Text fileContent) {
this.fileContent = fileContent;
}
public LongWritable getTimeStamp() {
return timeStamp;
}
public void setTimeStamp(LongWritable timeStamp) {
this.timeStamp = timeStamp;
}
@Override
public int hashCode() {
return this.filePath.getLength() + this.fileName.getLength() + this.fileContent.getLength();
}
@Override
public boolean equals(Object obj) {
if(obj instanceof RecordWritable) {
RecordWritable otherRecord = (RecordWritable) obj;
return this.filePath.equals(otherRecord.filePath) && this.fileName.equals(otherRecord.fileName);
}
return false;
}
@Override
public String toString() {
StringBuilder recordDesc = new StringBuilder("Record Details ::\t");
recordDesc.append("File Path + ").append(this.filePath).append("\t");
recordDesc.append("File Name + ").append(this.fileName).append("\t");
recordDesc.append("File Content Length + ").append(this.fileContent.getLength()).append("\t");
recordDesc.append("File TimeStamp + ").append(this.timeStamp).append("\t");
return recordDesc.toString();
}
@Override
public void readFields(DataInput din) throws IOException {
filePath.readFields(din);
fileName.readFields(din);
fileContent.readFields(din);
timeStamp.readFields(din);
}
@Override
public void write(DataOutput dout) throws IOException {
filePath.write(dout);
fileName.write(dout);
fileContent.write(dout);
timeStamp.write(dout);
}
}
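A side note on RecordWritable: Hadoop instantiates Writable values through the no-argument constructor and then calls readFields, so the fields must be read back in exactly the order write emits them; the implementation above keeps the two methods in matching order (filePath, fileName, fileContent, timeStamp).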
The job runner class:
package com.nayan.bigdata.hadoop;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;
import org.apache.log4j.Logger;
/**
* @file : HadoopJobRunner.java
* @author : nayan
* @version : 1.0.0
* @date : 22-Aug-2013 12:45:15 PM
* @desc : Class to run Hadoop MR job.
*/
public class HadoopJobRunner extends Configured implements Tool {
private static Logger logger = Logger.getLogger(HadoopJobRunner.class);
/**
* @param args
* @throws Exception
*/
public static void main(String[] args) throws Exception {
int res = ToolRunner.run(new Configuration(), new HadoopJobRunner(), args);
System.exit(res);
}
@Override
public int run(String[] arg0) throws Exception {
logger.info("Initiating Hadoop Job.");
Configuration conf = new Configuration(true);
conf.setStrings("mapred.output.dir", arg0[1]);
conf.setStrings("mapred.input.dir", arg0[0]);
Job mrJob = new Job(conf, "FileRecordsJob");
mrJob.setJarByClass(HadoopJobRunner.class);
mrJob.setMapOutputKeyClass(Text.class);
mrJob.setMapOutputValueClass(RecordWritable.class);
mrJob.setMapperClass(FileToRecordMapper.class);
mrJob.setReducerClass(FileRecordsReducer.class);
mrJob.setOutputKeyClass(Text.class);
mrJob.setOutputValueClass(RecordWritable.class);
logger.info("MapRed Job Configuration : " + mrJob.getConfiguration().toString());
logger.info("Input Path : " + mrJob.getConfiguration().get("mapred.input.dir"));
return mrJob.waitForCompletion(true) ? 0 : 1;
}
}
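For comparison, the usual way to wire job paths with the new MapReduce API is through the FileInputFormat/FileOutputFormat helpers rather than raw configuration keys. A minimal sketch of run() written that way (our illustration, not the code the question actually ran; it reuses the question's FileToRecordMapper, FileRecordsReducer and RecordWritable names):
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

@Override
public int run(String[] arg0) throws Exception {
    Job mrJob = new Job(getConf(), "FileRecordsJob");
    mrJob.setJarByClass(HadoopJobRunner.class);
    // Let the job own the paths instead of writing raw keys into the Configuration.
    FileInputFormat.addInputPath(mrJob, new Path(arg0[0]));
    FileOutputFormat.setOutputPath(mrJob, new Path(arg0[1]));
    mrJob.setMapperClass(FileToRecordMapper.class);
    mrJob.setReducerClass(FileRecordsReducer.class);
    mrJob.setMapOutputKeyClass(Text.class);
    mrJob.setMapOutputValueClass(RecordWritable.class);
    mrJob.setOutputKeyClass(Text.class);
    mrJob.setOutputValueClass(RecordWritable.class);
    return mrJob.waitForCompletion(true) ? 0 : 1;
}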
The POM file for the project:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.nayan.bigdata</groupId>
<artifactId>BigDataOperations</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>jar</packaging>
<name>BigDataOperations</name>
<properties>
<hadoop.version>0.20.2</hadoop.version>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
</properties>
<dependencies>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-core</artifactId>
<version>${hadoop.version}</version>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.17</version>
</dependency>
<dependency>
<groupId>org.hamcrest</groupId>
<artifactId>hamcrest-all</artifactId>
<version>1.3</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.11</version>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<pluginManagement>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<configuration>
<archive>
<manifest>
<mainClass>com.nayan.bigdata.hadoop.HadoopJobRunner</mainClass>
</manifest>
</archive>
</configuration>
</plugin>
</plugins>
</pluginManagement>
</build>
</project>
When I run the jar, I get the following output on the console:
[root@koversevm tmp]# hadoop jar BigDataOperations-1.0-SNAPSHOT.jar /usr/hadoop/sample /usr/hadoop/jobout
13/08/28 18:33:57 INFO hadoop.HadoopJobRunner: Initiating Hadoop Job.
13/08/28 18:33:57 INFO hadoop.HadoopJobRunner: Setting the input/output path.
13/08/28 18:33:57 INFO hadoop.HadoopJobRunner: MapRed Job Configuration : Configuration: core-default.xml, core-site.xml, mapred-default.xml, mapred-site.xml
13/08/28 18:33:57 INFO hadoop.HadoopJobRunner: Input Path : null
13/08/28 18:33:58 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
13/08/28 18:33:58 INFO input.FileInputFormat: Total input paths to process : 8
13/08/28 18:33:58 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
13/08/28 18:33:58 WARN snappy.LoadSnappy: Snappy native library not loaded
13/08/28 18:33:58 INFO mapred.JobClient: Running job: job_201308281800_0008
13/08/28 18:33:59 INFO mapred.JobClient: map 0% reduce 0%
13/08/28 18:34:06 INFO mapred.JobClient: map 25% reduce 0%
13/08/28 18:34:13 INFO mapred.JobClient: map 50% reduce 0%
13/08/28 18:34:17 INFO mapred.JobClient: map 75% reduce 0%
13/08/28 18:34:23 INFO mapred.JobClient: map 100% reduce 0%
13/08/28 18:34:24 INFO mapred.JobClient: map 100% reduce 33%
13/08/28 18:34:26 INFO mapred.JobClient: map 100% reduce 100%
13/08/28 18:34:27 INFO mapred.JobClient: Job complete: job_201308281800_0008
13/08/28 18:34:27 INFO mapred.JobClient: Counters: 25
13/08/28 18:34:27 INFO mapred.JobClient: Job Counters
13/08/28 18:34:27 INFO mapred.JobClient: Launched reduce tasks=1
13/08/28 18:34:27 INFO mapred.JobClient: SLOTS_MILLIS_MAPS=44066
13/08/28 18:34:27 INFO mapred.JobClient: Total time spent by all reduces waiting after reserving slots (ms)=0
13/08/28 18:34:27 INFO mapred.JobClient: Total time spent by all maps waiting after reserving slots (ms)=0
13/08/28 18:34:27 INFO mapred.JobClient: Launched map tasks=8
13/08/28 18:34:27 INFO mapred.JobClient: Data-local map tasks=8
13/08/28 18:34:27 INFO mapred.JobClient: SLOTS_MILLIS_REDUCES=19034
13/08/28 18:34:27 INFO mapred.JobClient: FileSystemCounters
13/08/28 18:34:27 INFO mapred.JobClient: FILE_BYTES_READ=6
13/08/28 18:34:27 INFO mapred.JobClient: HDFS_BYTES_READ=1011
13/08/28 18:34:27 INFO mapred.JobClient: FILE_BYTES_WRITTEN=549207
13/08/28 18:34:27 INFO mapred.JobClient: Map-Reduce Framework
13/08/28 18:34:27 INFO mapred.JobClient: Map input records=0
13/08/28 18:34:27 INFO mapred.JobClient: Reduce shuffle bytes=48
13/08/28 18:34:27 INFO mapred.JobClient: Spilled Records=0
13/08/28 18:34:27 INFO mapred.JobClient: Map output bytes=0
13/08/28 18:34:27 INFO mapred.JobClient: CPU time spent (ms)=3030
13/08/28 18:34:27 INFO mapred.JobClient: Total committed heap usage (bytes)=1473413120
13/08/28 18:34:27 INFO mapred.JobClient: Combine input records=0
13/08/28 18:34:27 INFO mapred.JobClient: SPLIT_RAW_BYTES=1011
13/08/28 18:34:27 INFO mapred.JobClient: Reduce input records=0
13/08/28 18:34:27 INFO mapred.JobClient: Reduce input groups=0
13/08/28 18:34:27 INFO mapred.JobClient: Combine output records=0
13/08/28 18:34:27 INFO mapred.JobClient: Physical memory (bytes) snapshot=1607675904
13/08/28 18:34:27 INFO mapred.JobClient: Reduce output records=0
13/08/28 18:34:27 INFO mapred.JobClient: Virtual memory (bytes) snapshot=23948111872
13/08/28 18:34:27 INFO mapred.JobClient: Map output records=0
But when I look into the logs, I find the following exception:
Task Logs: 'attempt_201308281800_0008_m_000000_0'
stdout logs
2013-08-28 18:34:01 DEBUG Child:82 - Child starting
2013-08-28 18:34:02 DEBUG Groups:136 - Creating new Groups object
2013-08-28 18:34:02 DEBUG Groups:59 - Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping; cacheTimeout=300000
2013-08-28 18:34:02 DEBUG UserGroupInformation:193 - hadoop login
2013-08-28 18:34:02 DEBUG UserGroupInformation:142 - hadoop login commit
2013-08-28 18:34:02 DEBUG UserGroupInformation:172 - using local user:UnixPrincipal: mapred
2013-08-28 18:34:02 DEBUG UserGroupInformation:664 - UGI loginUser:mapred (auth:SIMPLE)
2013-08-28 18:34:02 DEBUG FileSystem:1598 - Creating filesystem for file:///var/lib/hadoop-0.20/cache/mapred/mapred/local/taskTracker/root/jobcache/job_201308281800_0008/jobToken
2013-08-28 18:34:02 DEBUG TokenCache:182 - Task: Loaded jobTokenFile from: /var/lib/hadoop-0.20/cache/mapred/mapred/local/taskTracker/root/jobcache/job_201308281800_0008/jobToken; num of sec keys = 0 Number of tokens 1
2013-08-28 18:34:02 DEBUG Child:106 - loading token. # keys =0; from file=/var/lib/hadoop-0.20/cache/mapred/mapred/local/taskTracker/root/jobcache/job_201308281800_0008/jobToken
2013-08-28 18:34:02 DEBUG UserGroupInformation:1300 - PriviledgedAction as:job_201308281800_0008 (auth:SIMPLE) from:org.apache.hadoop.mapred.Child.main(Child.java:121)
2013-08-28 18:34:02 DEBUG Client:256 - The ping interval is60000ms.
2013-08-28 18:34:02 DEBUG Client:299 - Use SIMPLE authentication for protocol TaskUmbilicalProtocol
2013-08-28 18:34:02 DEBUG Client:569 - Connecting to /127.0.0.1:50925
2013-08-28 18:34:02 DEBUG Client:762 - IPC Client (47) connection to /127.0.0.1:50925 from job_201308281800_0008: starting, having connections 1
2013-08-28 18:34:02 DEBUG Client:808 - IPC Client (47) connection to /127.0.0.1:50925 from job_201308281800_0008 sending #0
2013-08-28 18:34:02 DEBUG Client:861 - IPC Client (47) connection to /127.0.0.1:50925 from job_201308281800_0008 got value #0
2013-08-28 18:34:02 DEBUG RPC:230 - Call: getProtocolVersion 98
2013-08-28 18:34:02 DEBUG Client:808 - IPC Client (47) connection to /127.0.0.1:50925 from job_201308281800_0008 sending #1
2013-08-28 18:34:02 DEBUG Client:861 - IPC Client (47) connection to /127.0.0.1:50925 from job_201308281800_0008 got value #1
2013-08-28 18:34:02 DEBUG SortedRanges:347 - currentIndex 0 0:0
2013-08-28 18:34:02 DEBUG Counters:177 - Creating group org.apache.hadoop.mapred.Task$Counter with bundle
2013-08-28 18:34:02 DEBUG Counters:314 - Adding SPILLED_RECORDS
2013-08-28 18:34:02 DEBUG Counters:177 - Creating group org.apache.hadoop.mapred.Task$Counter with bundle
2013-08-28 18:34:02 DEBUG SortedRanges:347 - currentIndex 0 0:0
2013-08-28 18:34:02 DEBUG SortedRanges:347 - currentIndex 1 0:0
2013-08-28 18:34:02 DEBUG RPC:230 - Call: getTask 208
2013-08-28 18:34:03 DEBUG TaskRunner:653 - mapred.local.dir for child : /var/lib/hadoop-0.20/cache/mapred/mapred/local/taskTracker/root/jobcache/job_201308281800_0008/attempt_201308281800_0008_m_000000_0
2013-08-28 18:34:03 DEBUG NativeCodeLoader:40 - Trying to load the custom-built native-hadoop library...
2013-08-28 18:34:03 DEBUG NativeCodeLoader:47 - Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
2013-08-28 18:34:03 DEBUG NativeCodeLoader:48 - java.library.path=/usr/java/jdk1.6.0_45/jre/lib/amd64/server:/usr/java/jdk1.6.0_45/jre/lib/amd64:/usr/java/jdk1.6.0_45/jre/../lib/amd64:/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib:/var/lib/hadoop-0.20/cache/mapred/mapred/local/taskTracker/root/jobcache/job_201308281800_0008/attempt_201308281800_0008_m_000000_0/work
2013-08-28 18:34:03 WARN NativeCodeLoader:52 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2013-08-28 18:34:03 DEBUG TaskRunner:709 - Fully deleting contents of /var/lib/hadoop-0.20/cache/mapred/mapred/local/taskTracker/root/jobcache/job_201308281800_0008/attempt_201308281800_0008_m_000000_0/work
2013-08-28 18:34:03 INFO JvmMetrics:71 - Initializing JVM Metrics with processName=MAP, sessionId=
2013-08-28 18:34:03 DEBUG Child:251 - Creating remote user to execute task: root
2013-08-28 18:34:03 DEBUG UserGroupInformation:1300 - PriviledgedAction as:root (auth:SIMPLE) from:org.apache.hadoop.mapred.Child.main(Child.java:260)
2013-08-28 18:34:03 DEBUG FileSystem:1598 - Creating filesystem for hdfs://localhost:8020
2013-08-28 18:34:04 DEBUG Client:256 - The ping interval is60000ms.
2013-08-28 18:34:04 DEBUG Client:299 - Use SIMPLE authentication for protocol ClientProtocol
2013-08-28 18:34:04 DEBUG Client:569 - Connecting to localhost/127.0.0.1:8020
2013-08-28 18:34:04 DEBUG Client:808 - IPC Client (47) connection to localhost/127.0.0.1:8020 from root sending #2
2013-08-28 18:34:04 DEBUG Client:762 - IPC Client (47) connection to localhost/127.0.0.1:8020 from root: starting, having connections 2
2013-08-28 18:34:04 DEBUG Client:861 - IPC Client (47) connection to localhost/127.0.0.1:8020 from root got value #2
2013-08-28 18:34:04 DEBUG RPC:230 - Call: getProtocolVersion 18
2013-08-28 18:34:04 DEBUG DFSClient:274 - Short circuit read is false
2013-08-28 18:34:04 DEBUG DFSClient:280 - Connect to datanode via hostname is false
2013-08-28 18:34:04 DEBUG Task:516 - using new api for output committer
2013-08-28 18:34:04 INFO ProcessTree:65 - setsid exited with exit code 0
2013-08-28 18:34:04 INFO Task:539 - Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@79ee2c2c
2013-08-28 18:34:04 DEBUG ProcfsBasedProcessTree:238 - [ 16890 ]
2013-08-28 18:34:04 DEBUG Client:808 - IPC Client (47) connection to localhost/127.0.0.1:8020 from root sending #3
2013-08-28 18:34:04 DEBUG Client:861 - IPC Client (47) connection to localhost/127.0.0.1:8020 from root got value #3
2013-08-28 18:34:04 DEBUG RPC:230 - Call: getBlockLocations 12
2013-08-28 18:34:04 DEBUG DFSClient:2595 - Connecting to /127.0.0.1:50010
2013-08-28 18:34:04 DEBUG FSInputChecker:1653 - DFSClient readChunk got seqno 0 offsetInBlock 0 lastPacketInBlock false packetLen 520
2013-08-28 18:34:04 DEBUG Counters:314 - Adding SPLIT_RAW_BYTES
2013-08-28 18:34:04 DEBUG DFSClient:2529 - Client couldn't reuse - didnt send code
2013-08-28 18:34:04 INFO MapTask:613 - Processing split: hdfs://localhost:8020/usr/hadoop/sample/2012MTCReportFINAL.pdf:0+1419623
2013-08-28 18:34:04 DEBUG Counters:314 - Adding MAP_INPUT_RECORDS
2013-08-28 18:34:04 DEBUG FileSystem:1598 - Creating filesystem for file:///
2013-08-28 18:34:04 INFO MapTask:803 - io.sort.mb = 100
2013-08-28 18:34:05 INFO MapTask:815 - data buffer = 79691776/99614720
2013-08-28 18:34:05 INFO MapTask:816 - record buffer = 262144/327680
2013-08-28 18:34:05 DEBUG Counters:314 - Adding MAP_OUTPUT_BYTES
2013-08-28 18:34:05 DEBUG Counters:314 - Adding MAP_OUTPUT_RECORDS
2013-08-28 18:34:05 DEBUG Counters:314 - Adding COMBINE_INPUT_RECORDS
2013-08-28 18:34:05 DEBUG Counters:314 - Adding COMBINE_OUTPUT_RECORDS
2013-08-28 18:34:05 WARN LoadSnappy:46 - Snappy native library not loaded
2013-08-28 18:34:05 DEBUG Client:808 - IPC Client (47) connection to localhost/127.0.0.1:8020 from root sending #4
2013-08-28 18:34:05 DEBUG Client:861 - IPC Client (47) connection to localhost/127.0.0.1:8020 from root got value #4
2013-08-28 18:34:05 DEBUG RPC:230 - Call: getBlockLocations 4
2013-08-28 18:34:05 INFO FileToRecordMapper:65 - Inside run method.
2013-08-28 18:34:05 INFO MapTask:1142 - Starting flush of map output
2013-08-28 18:34:05 INFO Task:830 - Task:attempt_201308281800_0008_m_000000_0 is done. And is in the process of commiting
2013-08-28 18:34:05 DEBUG Counters:177 - Creating group FileSystemCounters with nothing
2013-08-28 18:34:05 DEBUG Counters:314 - Adding FILE_BYTES_WRITTEN
2013-08-28 18:34:05 DEBUG Counters:314 - Adding HDFS_BYTES_READ
2013-08-28 18:34:05 DEBUG Counters:314 - Adding COMMITTED_HEAP_BYTES
2013-08-28 18:34:05 DEBUG ProcfsBasedProcessTree:238 - [ 16890 ]
2013-08-28 18:34:05 DEBUG Counters:314 - Adding CPU_MILLISECONDS
2013-08-28 18:34:05 DEBUG Counters:314 - Adding PHYSICAL_MEMORY_BYTES
2013-08-28 18:34:05 DEBUG Counters:314 - Adding VIRTUAL_MEMORY_BYTES
2013-08-28 18:34:05 DEBUG Client:808 - IPC Client (47) connection to localhost/127.0.0.1:8020 from root sending #5
2013-08-28 18:34:05 DEBUG Client:861 - IPC Client (47) connection to localhost/127.0.0.1:8020 from root got value #5
2013-08-28 18:34:05 DEBUG RPC:230 - Call: getFileInfo 2
2013-08-28 18:34:05 DEBUG Task:658 - attempt_201308281800_0008_m_000000_0 Progress/ping thread exiting since it got interrupted
2013-08-28 18:34:05 DEBUG Client:808 - IPC Client (47) connection to /127.0.0.1:50925 from job_201308281800_0008 sending #6
2013-08-28 18:34:05 DEBUG Client:861 - IPC Client (47) connection to /127.0.0.1:50925 from job_201308281800_0008 got value #6
2013-08-28 18:34:05 DEBUG RPC:230 - Call: statusUpdate 3
2013-08-28 18:34:05 DEBUG Client:808 - IPC Client (47) connection to /127.0.0.1:50925 from job_201308281800_0008 sending #7
2013-08-28 18:34:05 DEBUG Client:861 - IPC Client (47) connection to /127.0.0.1:50925 from job_201308281800_0008 got value #7
2013-08-28 18:34:05 DEBUG RPC:230 - Call: done 1
2013-08-28 18:34:05 INFO Task:942 - Task 'attempt_201308281800_0008_m_000000_0' done.
2013-08-28 18:34:05 INFO TaskLogsTruncater:69 - Initializing logs' truncater with mapRetainSize=-1 and reduceRetainSize=-1
2013-08-28 18:34:05 DEBUG TaskLogsTruncater:174 - Truncation is not needed for /usr/lib/hadoop-0.20/logs/userlogs/job_201308281800_0008/attempt_201308281800_0008_m_000000_0/stdout
2013-08-28 18:34:05 DEBUG TaskLogsTruncater:174 - Truncation is not needed for /usr/lib/hadoop-0.20/logs/userlogs/job_201308281800_0008/attempt_201308281800_0008_m_000000_0/stderr
2013-08-28 18:34:05 DEBUG TaskLogsTruncater:202 - Cannot open /usr/lib/hadoop-0.20/logs/userlogs/job_201308281800_0008/attempt_201308281800_0008_m_000000_0/syslog for reading. Continuing with other log files
java.io.FileNotFoundException: /usr/lib/hadoop-0.20/logs/userlogs/job_201308281800_0008/attempt_201308281800_0008_m_000000_0/syslog (No such file or directory)
at java.io.FileInputStream.open(Native Method)
at java.io.FileInputStream.<init>(FileInputStream.java:120)
at org.apache.hadoop.mapred.TaskLogsTruncater.truncateLogs(TaskLogsTruncater.java:199)
at org.apache.hadoop.mapred.Child$4.run(Child.java:271)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1278)
at org.apache.hadoop.mapred.Child.main(Child.java:260)
2013-08-28 18:34:05 DEBUG TaskLogsTruncater:202 - Cannot open /usr/lib/hadoop-0.20/logs/userlogs/job_201308281800_0008/attempt_201308281800_0008_m_000000_0/profile.out for reading. Continuing with other log files
java.io.FileNotFoundException: /usr/lib/hadoop-0.20/logs/userlogs/job_201308281800_0008/attempt_201308281800_0008_m_000000_0/profile.out (No such file or directory)
at java.io.FileInputStream.open(Native Method)
at java.io.FileInputStream.<init>(FileInputStream.java:120)
at org.apache.hadoop.mapred.TaskLogsTruncater.truncateLogs(TaskLogsTruncater.java:199)
at org.apache.hadoop.mapred.Child$4.run(Child.java:271)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1278)
at org.apache.hadoop.mapred.Child.main(Child.java:260)
2013-08-28 18:34:05 DEBUG TaskLogsTruncater:202 - Cannot open /usr/lib/hadoop-0.20/logs/userlogs/job_201308281800_0008/attempt_201308281800_0008_m_000000_0/debugout for reading. Continuing with other log files
java.io.FileNotFoundException: /usr/lib/hadoop-0.20/logs/userlogs/job_201308281800_0008/attempt_201308281800_0008_m_000000_0/debugout (No such file or directory)
at java.io.FileInputStream.open(Native Method)
at java.io.FileInputStream.<init>(FileInputStream.java:120)
at org.apache.hadoop.mapred.TaskLogsTruncater.truncateLogs(TaskLogsTruncater.java:199)
at org.apache.hadoop.mapred.Child$4.run(Child.java:271)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1278)
at org.apache.hadoop.mapred.Child.main(Child.java:260)
I have checked the permissions, and everything works fine for the sample WordCount program. I am new to Hadoop; I googled but could not find anything substantial. I am using hadoop-0.20.2-cdh3u6 on a single-node setup.
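One detail stands out when the task logs are read alongside the counters: the only mapper log line that appears is "Inside run method.", and every map counter is zero. The run(Context) override in FileToRecordMapper above never invokes map(). For reference, the default implementation in org.apache.hadoop.mapreduce.Mapper looks roughly like this (paraphrased from the Hadoop source, not code from the question):
public void run(Context context) throws IOException, InterruptedException {
    setup(context);
    // This loop is what actually drives map(); an override that omits it
    // means no input records are ever processed.
    while (context.nextKeyValue()) {
        map(context.getCurrentKey(), context.getCurrentValue(), context);
    }
    cleanup(context);
}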