Hi, I have implemented an OAuth2 service in Spring Boot. The auth server and resource servers are Eureka clients and have successfully registered with the Eureka server. The auth-server URI in the resource server is configured this way:
security:
  oauth2:
    resource:
      token-info-uri: http://auth-server/oauth/check_token
Here, auth-server in the URI is the application name of the auth server. I expect the resource server to contact the auth server via token-info-uri after the URL is resolved from the Eureka server. But with this setup I get an error: nested exception is java.net.UnknownHostException: auth-server
I modified my configuration to this:
security:
  oauth2:
    resource:
      prefer-token-info: false
      service-id: auth-server
      token-info-uri: http://${security.oauth2.resource.service-id}/oauth/check_token
      loadBalanced: true
but now I get this response:
{
  "error": "invalid_token",
  "error_description": "e2f95093-085c-4b59-90a5-c89fb5d1eccb"
}
When I debug, I see this log:
asset-mgmt-v1.1_1 | 2019-11-16 17:52:08.048 INFO 1 --- [nio-6001-exec-2] o.s.web.servlet.DispatcherServlet : Completed initialization in 21 ms
asset-mgmt-v1.1_1 | 2019-11-16 17:52:08.086 DEBUG 1 --- [nio-6001-exec-2] o.s.b.a.s.o.r.UserInfoTokenServices : Getting user info from: null
asset-mgmt-v1.1_1 | 2019-11-16 17:52:08.109 DEBUG 1 --- [nio-6001-exec-2] org.springframework.web.HttpLogging : HTTP GET
asset-mgmt-v1.1_1 | 2019-11-16 17:52:08.117 DEBUG 1 --- [nio-6001-exec-2] org.springframework.web.HttpLogging : Accept=[application/json, application/*+json]
asset-mgmt-v1.1_1 | 2019-11-16 17:52:08.119 WARN 1 --- [nio-6001-exec-2] o.s.b.a.s.o.r.UserInfoTokenServices : Could not fetch user details: class java.lang.IllegalStateException, Request URI does not contain a valid hostname:
asset-mgmt-v1.1_1 | 2019-11-16 17:52:08.120 DEBUG 1 --- [nio-6001-exec-2] o.s.b.a.s.o.r.UserInfoTokenServices : userinfo returned error: Could not fetch user details
asset-mgmt-v1.1_1 | 2019-11-16 17:52:08.127 DEBUG 1 --- [nio-6001-exec-2] o.s.b.a.audit.listener.AuditListener : AuditEvent [timestamp=2019-11-16T17:52:08.125Z, principal=access-token, type=AUTHENTICATION_FAILURE, data={type=org.springframework.security.authentication.BadCredentialsException, message=e2f95093-085c-4b59-90a5-c89fb5d1eccb}]
Basically, what I see in the log is that the URI is not being resolved (it comes through as null).
You cannot use just the service name in the property file. It should be:
security:
  oauth2:
    resource:
      service-id: {service ID as registered with the Eureka server}
      token-info-uri: http://${security.oauth2.resource.service-id}/oauth/check_token
      loadBalanced: true
      prefer-token-info: false
P.S. I typed this by hand, so make sure the YAML indentation is correct.
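For reference, the usual way to make a service-name URI resolvable through Eureka is to give the resource server a load-balanced RestTemplate and wire it into RemoteTokenServices. A minimal sketch, assuming a Ribbon/Eureka-enabled classpath; the class name, client id, and secret below are placeholders I made up, not values from the original setup:

```java
import org.springframework.cloud.client.loadbalancer.LoadBalanced;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.security.oauth2.provider.token.RemoteTokenServices;
import org.springframework.web.client.RestTemplate;

@Configuration
public class CheckTokenConfig {

    // The load-balancer interceptor resolves "auth-server" in a URI
    // to a concrete instance registered with Eureka.
    @Bean
    @LoadBalanced
    public RestTemplate loadBalancedRestTemplate() {
        return new RestTemplate();
    }

    @Bean
    public RemoteTokenServices tokenServices(RestTemplate restTemplate) {
        RemoteTokenServices services = new RemoteTokenServices();
        services.setRestTemplate(restTemplate);
        services.setCheckTokenEndpointUrl("http://auth-server/oauth/check_token");
        services.setClientId("resource-client");     // placeholder client id
        services.setClientSecret("resource-secret"); // placeholder secret
        return services;
    }
}
```

The key point is that token resolution goes through a RestTemplate that the load balancer intercepts; a plain RestTemplate would try to resolve auth-server via DNS and fail with UnknownHostException, which matches the first error above.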
In the error log presented by Darshu, there is the following message: "asset-mgmt-v1.1_1 | 2019-11-16 17:52:08.120 DEBUG 1 --- [nio-6001-exec-2] o.s.b.a.s.o.r.UserInfoTokenServices : userinfo returned error: Could not fetch user details".
prefer-token-info must be set to true so that token-info-uri is preferred over user-info-uri:
security:
  oauth2:
    resource:
      prefer-token-info: true
See more in https://docs.spring.io/spring-security-oauth2-boot/docs/2.0.0.RC2/reference/htmlsingle/#boot-features-security-oauth2-resource-server
I am running spark-submit in YARN client mode. YARN has been set up on the HDP sandbox with Kerberos enabled; the sandbox runs in a Docker container on a Mac host.
When spark-submit is run from within the sandbox's Docker container, it runs successfully, but when it is run from the host machine it fails immediately after the ACCEPTED state with this error:
19/07/28 00:41:21 INFO yarn.Client: Application report for application_1564298049378_0008 (state: ACCEPTED)
19/07/28 00:41:22 INFO yarn.Client: Application report for application_1564298049378_0008 (state: ACCEPTED)
19/07/28 00:41:23 INFO yarn.Client: Application report for application_1564298049378_0008 (state: FAILED)
19/07/28 00:41:23 INFO yarn.Client:
client token: N/A
diagnostics: Application application_1564298049378_0008 failed 2 times due to AM Container for appattempt_1564298049378_0008_000002 exited with exitCode: -1000
Failing this attempt.Diagnostics: (Client.java:1558)
... 37 more
Caused by: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
I could not find any more information about the failure. Any help will be greatly appreciated.
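Since the same job works from inside the sandbox, one thing worth checking before running spark-submit on the host is whether the host-side client actually holds a valid Kerberos ticket and can reach HDFS with it. A quick sanity check (the principal below is a placeholder, substitute your own realm):

```shell
# Show current Kerberos tickets; an empty cache or an expired TGT
# would produce "Client cannot authenticate via:[TOKEN, KERBEROS]"
klist

# Obtain a fresh ticket for the submitting user (placeholder principal)
kinit santosh@EXAMPLE.COM

# Verify the host can talk to HDFS using that ticket
hdfs dfs -ls /
```

Note that even with a valid ticket on the host, the exit code -1000 in the log means resource localization failed on the NodeManager side, so the container inside Docker also has to be able to authenticate against the HDFS address the delegation token was issued for (192.168.50.1:8020 in the log above).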
Here is the resourcemanager log:
2019-07-28 22:39:04,654 INFO resourcemanager.ClientRMService (ClientRMService.java:getNewApplicationId(341)) - Allocated new applicationId: 20
2019-07-28 22:39:10,982 INFO capacity.CapacityScheduler (CapacityScheduler.java:checkAndGetApplicationPriority(2526)) - Application 'application_1564332457320_0020' is submitted without priority hence considering default queue/cluster priority: 0
2019-07-28 22:39:10,982 INFO capacity.CapacityScheduler (CapacityScheduler.java:checkAndGetApplicationPriority(2547)) - Priority '0' is acceptable in queue : santosh for application: application_1564332457320_0020
2019-07-28 22:39:10,983 WARN rmapp.RMAppImpl (RMAppImpl.java:(473)) - The specific max attempts: 0 for application: 20 is invalid, because it is out of the range [1, 2]. Use the global max attempts instead.
2019-07-28 22:39:10,983 INFO collector.TimelineCollectorManager (TimelineCollectorManager.java:putIfAbsent(142)) - the collector for application_1564332457320_0020 was added
2019-07-28 22:39:10,984 INFO resourcemanager.ClientRMService (ClientRMService.java:submitApplication(648)) - Application with id 20 submitted by user santosh
2019-07-28 22:39:10,984 INFO security.DelegationTokenRenewer (DelegationTokenRenewer.java:handleAppSubmitEvent(458)) - application_1564332457320_0020 found existing hdfs token Kind: HDFS_DELEGATION_TOKEN, Service: 192.168.50.1:8020, Ident: (token for santosh: HDFS_DELEGATION_TOKEN owner=santosh#XXX.XX, renewer=yarn, realUser=, issueDate=1564353550169, maxDate=1564958350169, sequenceNumber=125, masterKeyId=20)
2019-07-28 22:39:11,011 INFO security.DelegationTokenRenewer (DelegationTokenRenewer.java:renewToken(635)) - Renewed delegation-token= [Kind: HDFS_DELEGATION_TOKEN, Service: 192.168.50.1:8020, Ident: (token for santosh: HDFS_DELEGATION_TOKEN owner=santosh#XXX.XX, renewer=yarn, realUser=, issueDate=1564353550169, maxDate=1564958350169, sequenceNumber=125, masterKeyId=20);exp=1564439951007; apps=[application_1564332457320_0020]]
2019-07-28 22:39:11,011 INFO security.DelegationTokenRenewer (DelegationTokenRenewer.java:setTimerForTokenRenewal(613)) - Renew Kind: HDFS_DELEGATION_TOKEN, Service: 192.168.50.1:8020, Ident: (token for santosh: HDFS_DELEGATION_TOKEN owner=santosh#XXX.XX, renewer=yarn, realUser=, issueDate=1564353550169, maxDate=1564958350169, sequenceNumber=125, masterKeyId=20);exp=1564439951007; apps=[application_1564332457320_0020] in 86399996 ms, appId = [application_1564332457320_0020]
2019-07-28 22:39:11,011 INFO rmapp.RMAppImpl (RMAppImpl.java:transition(1259)) - Storing application with id application_1564332457320_0020
2019-07-28 22:39:11,012 INFO rmapp.RMAppImpl (RMAppImpl.java:handle(912)) - application_1564332457320_0020 State change from NEW to NEW_SAVING on event = START
2019-07-28 22:39:11,012 INFO recovery.RMStateStore (RMStateStore.java:transition(222)) - Storing info for app: application_1564332457320_0020
2019-07-28 22:39:11,022 INFO rmapp.RMAppImpl (RMAppImpl.java:handle(912)) - application_1564332457320_0020 State change from NEW_SAVING to SUBMITTED on event = APP_NEW_SAVED
2019-07-28 22:39:11,022 INFO capacity.ParentQueue (ParentQueue.java:addApplication(494)) - Application added - appId: application_1564332457320_0020 user: santosh leaf-queue of parent: root #applications: 1
2019-07-28 22:39:11,023 INFO capacity.CapacityScheduler (CapacityScheduler.java:addApplication(990)) - Accepted application application_1564332457320_0020 from user: santosh, in queue: santosh
2019-07-28 22:39:11,023 INFO rmapp.RMAppImpl (RMAppImpl.java:handle(912)) - application_1564332457320_0020 State change from SUBMITTED to ACCEPTED on event = APP_ACCEPTED
2019-07-28 22:39:11,023 INFO resourcemanager.ApplicationMasterService (ApplicationMasterService.java:registerAppAttempt(479)) - Registering app attempt : appattempt_1564332457320_0020_000001
2019-07-28 22:39:11,024 INFO attempt.RMAppAttemptImpl (RMAppAttemptImpl.java:handle(925)) - appattempt_1564332457320_0020_000001 State change from NEW to SUBMITTED on event = START
2019-07-28 22:39:11,024 INFO capacity.LeafQueue (LeafQueue.java:activateApplications(911)) - Application application_1564332457320_0020 from user: santosh activated in queue: santosh
2019-07-28 22:39:11,025 INFO capacity.LeafQueue (LeafQueue.java:addApplicationAttempt(941)) - Application added - appId: application_1564332457320_0020 user: santosh, leaf-queue: santosh #user-pending-applications: 0 #user-active-applications: 1 #queue-pending-applications: 0 #queue-active-applications: 1
2019-07-28 22:39:11,025 INFO capacity.CapacityScheduler (CapacityScheduler.java:addApplicationAttempt(1036)) - Added Application Attempt appattempt_1564332457320_0020_000001 to scheduler from user santosh in queue santosh
2019-07-28 22:39:11,028 INFO attempt.RMAppAttemptImpl (RMAppAttemptImpl.java:handle(925)) - appattempt_1564332457320_0020_000001 State change from SUBMITTED to SCHEDULED on event = ATTEMPT_ADDED
2019-07-28 22:39:11,033 INFO allocator.AbstractContainerAllocator (AbstractContainerAllocator.java:getCSAssignmentFromAllocateResult(129)) - assignedContainer application attempt=appattempt_1564332457320_0020_000001 container=null queue=santosh clusterResource= type=OFF_SWITCH requestedPartition=
2019-07-28 22:39:11,034 INFO rmcontainer.RMContainerImpl (RMContainerImpl.java:handle(490)) - container_e20_1564332457320_0020_01_000001 Container Transitioned from NEW to ALLOCATED
2019-07-28 22:39:11,035 INFO fica.FiCaSchedulerNode (FiCaSchedulerNode.java:allocateContainer(169)) - Assigned container container_e20_1564332457320_0020_01_000001 of capacity on host sandbox-hdp.hortonworks.com:45454, which has 1 containers, used and available after allocation
2019-07-28 22:39:11,038 INFO security.NMTokenSecretManagerInRM (NMTokenSecretManagerInRM.java:createAndGetNMToken(200)) - Sending NMToken for nodeId : sandbox-hdp.hortonworks.com:45454 for container : container_e20_1564332457320_0020_01_000001
2019-07-28 22:39:11,043 INFO rmcontainer.RMContainerImpl (RMContainerImpl.java:handle(490)) - container_e20_1564332457320_0020_01_000001 Container Transitioned from ALLOCATED to ACQUIRED
2019-07-28 22:39:11,043 INFO security.NMTokenSecretManagerInRM (NMTokenSecretManagerInRM.java:clearNodeSetForAttempt(146)) - Clear node set for appattempt_1564332457320_0020_000001
2019-07-28 22:39:11,044 INFO capacity.ParentQueue (ParentQueue.java:apply(1332)) - assignedContainer queue=root usedCapacity=0.25 absoluteUsedCapacity=0.25 used= cluster=
2019-07-28 22:39:11,044 INFO capacity.CapacityScheduler (CapacityScheduler.java:tryCommit(2890)) - Allocation proposal accepted
2019-07-28 22:39:11,044 INFO attempt.RMAppAttemptImpl (RMAppAttemptImpl.java:storeAttempt(2213)) - Storing attempt: AppId: application_1564332457320_0020 AttemptId: appattempt_1564332457320_0020_000001 MasterContainer: Container: [ContainerId: container_e20_1564332457320_0020_01_000001, AllocationRequestId: -1, Version: 0, NodeId: sandbox-hdp.hortonworks.com:45454, NodeHttpAddress: sandbox-hdp.hortonworks.com:8042, Resource: , Priority: 0, Token: Token { kind: ContainerToken, service: 172.18.0.3:45454 }, ExecutionType: GUARANTEED, ]
2019-07-28 22:39:11,051 INFO attempt.RMAppAttemptImpl (RMAppAttemptImpl.java:handle(925)) - appattempt_1564332457320_0020_000001 State change from SCHEDULED to ALLOCATED_SAVING on event = CONTAINER_ALLOCATED
2019-07-28 22:39:11,057 INFO attempt.RMAppAttemptImpl (RMAppAttemptImpl.java:handle(925)) - appattempt_1564332457320_0020_000001 State change from ALLOCATED_SAVING to ALLOCATED on event = ATTEMPT_NEW_SAVED
2019-07-28 22:39:11,060 INFO amlauncher.AMLauncher (AMLauncher.java:run(307)) - Launching masterappattempt_1564332457320_0020_000001
2019-07-28 22:39:11,068 INFO amlauncher.AMLauncher (AMLauncher.java:launch(109)) - Setting up container Container: [ContainerId: container_e20_1564332457320_0020_01_000001, AllocationRequestId: -1, Version: 0, NodeId: sandbox-hdp.hortonworks.com:45454, NodeHttpAddress: sandbox-hdp.hortonworks.com:8042, Resource: , Priority: 0, Token: Token { kind: ContainerToken, service: 172.18.0.3:45454 }, ExecutionType: GUARANTEED, ] for AM appattempt_1564332457320_0020_000001
2019-07-28 22:39:11,069 INFO security.AMRMTokenSecretManager (AMRMTokenSecretManager.java:createAndGetAMRMToken(195)) - Create AMRMToken for ApplicationAttempt: appattempt_1564332457320_0020_000001
2019-07-28 22:39:11,069 INFO security.AMRMTokenSecretManager (AMRMTokenSecretManager.java:createPassword(307)) - Creating password for appattempt_1564332457320_0020_000001
2019-07-28 22:39:11,265 INFO amlauncher.AMLauncher (AMLauncher.java:launch(130)) - Done launching container Container: [ContainerId: container_e20_1564332457320_0020_01_000001, AllocationRequestId: -1, Version: 0, NodeId: sandbox-hdp.hortonworks.com:45454, NodeHttpAddress: sandbox-hdp.hortonworks.com:8042, Resource: , Priority: 0, Token: Token { kind: ContainerToken, service: 172.18.0.3:45454 }, ExecutionType: GUARANTEED, ] for AM appattempt_1564332457320_0020_000001
2019-07-28 22:39:11,265 INFO attempt.RMAppAttemptImpl (RMAppAttemptImpl.java:handle(925)) - appattempt_1564332457320_0020_000001 State change from ALLOCATED to LAUNCHED on event = LAUNCHED
2019-07-28 22:39:11,852 INFO resourcemanager.ResourceTrackerService (ResourceTrackerService.java:updateAppCollectorsMap(713)) - Update collector information for application application_1564332457320_0020 with new address: sandbox-hdp.hortonworks.com:35197 timestamp: 1564332457320, 36
2019-07-28 22:39:11,854 INFO rmcontainer.RMContainerImpl (RMContainerImpl.java:handle(490)) - container_e20_1564332457320_0020_01_000001 Container Transitioned from ACQUIRED to RUNNING
2019-07-28 22:39:12,833 INFO provider.BaseAuditHandler (BaseAuditHandler.java:logStatus(312)) - Audit Status Log: name=yarn.async.batch.hdfs, interval=01:11.979 minutes, events=162, succcessCount=162, totalEvents=17347, totalSuccessCount=17347
2019-07-28 22:39:12,834 INFO destination.HDFSAuditDestination (HDFSAuditDestination.java:logJSON(179)) - Flushing HDFS audit. Event Size:1
2019-07-28 22:39:12,857 INFO resourcemanager.ResourceTrackerService (ResourceTrackerService.java:updateAppCollectorsMap(713)) - Update collector information for application application_1564332457320_0020 with new address: sandbox-hdp.hortonworks.com:35197 timestamp: 1564332457320, 37
2019-07-28 22:39:14,054 INFO rmcontainer.RMContainerImpl (RMContainerImpl.java:handle(490)) - container_e20_1564332457320_0020_01_000001 Container Transitioned from RUNNING to COMPLETED
2019-07-28 22:39:14,055 INFO attempt.RMAppAttemptImpl (RMAppAttemptImpl.java:rememberTargetTransitionsAndStoreState(1412)) - Updating application attempt appattempt_1564332457320_0020_000001 with final state: FAILED, and exit status: -1000
2019-07-28 22:39:14,055 INFO attempt.RMAppAttemptImpl (RMAppAttemptImpl.java:handle(925)) - appattempt_1564332457320_0020_000001 State change from LAUNCHED to FINAL_SAVING on event = CONTAINER_FINISHED
2019-07-28 22:39:14,066 INFO resourcemanager.ApplicationMasterService (ApplicationMasterService.java:unregisterAttempt(496)) - Unregistering app attempt : appattempt_1564332457320_0020_000001
2019-07-28 22:39:14,066 INFO security.AMRMTokenSecretManager (AMRMTokenSecretManager.java:applicationMasterFinished(124)) - Application finished, removing password for appattempt_1564332457320_0020_000001
2019-07-28 22:39:14,066 INFO attempt.RMAppAttemptImpl (RMAppAttemptImpl.java:handle(925)) - appattempt_1564332457320_0020_000001 State change from FINAL_SAVING to FAILED on event = ATTEMPT_UPDATE_SAVED
2019-07-28 22:39:14,067 INFO rmapp.RMAppImpl (RMAppImpl.java:transition(1538)) - The number of failed attempts is 1. The max attempts is 2
2019-07-28 22:39:14,067 INFO resourcemanager.ApplicationMasterService (ApplicationMasterService.java:registerAppAttempt(479)) - Registering app attempt : appattempt_1564332457320_0020_000002
2019-07-28 22:39:14,067 INFO attempt.RMAppAttemptImpl (RMAppAttemptImpl.java:handle(925)) - appattempt_1564332457320_0020_000002 State change from NEW to SUBMITTED on event = START
2019-07-28 22:39:14,067 INFO capacity.CapacityScheduler (CapacityScheduler.java:doneApplicationAttempt(1085)) - Application Attempt appattempt_1564332457320_0020_000001 is done. finalState=FAILED
2019-07-28 22:39:14,067 INFO scheduler.AppSchedulingInfo (AppSchedulingInfo.java:clearRequests(159)) - Application application_1564332457320_0020 requests cleared
2019-07-28 22:39:14,067 INFO capacity.LeafQueue (LeafQueue.java:removeApplicationAttempt(1003)) - Application removed - appId: application_1564332457320_0020 user: santosh queue: santosh #user-pending-applications: 0 #user-active-applications: 0 #queue-pending-applications: 0 #queue-active-applications: 0
2019-07-28 22:39:14,068 INFO capacity.LeafQueue (LeafQueue.java:activateApplications(911)) - Application application_1564332457320_0020 from user: santosh activated in queue: santosh
2019-07-28 22:39:14,068 INFO capacity.LeafQueue (LeafQueue.java:addApplicationAttempt(941)) - Application added - appId: application_1564332457320_0020 user: santosh, leaf-queue: santosh #user-pending-applications: 0 #user-active-applications: 1 #queue-pending-applications: 0 #queue-active-applications: 1
2019-07-28 22:39:14,068 INFO capacity.CapacityScheduler (CapacityScheduler.java:addApplicationAttempt(1036)) - Added Application Attempt appattempt_1564332457320_0020_000002 to scheduler from user santosh in queue santosh
2019-07-28 22:39:14,068 INFO attempt.RMAppAttemptImpl (RMAppAttemptImpl.java:handle(925)) - appattempt_1564332457320_0020_000002 State change from SUBMITTED to SCHEDULED on event = ATTEMPT_ADDED
2019-07-28 22:39:14,074 INFO allocator.AbstractContainerAllocator (AbstractContainerAllocator.java:getCSAssignmentFromAllocateResult(129)) - assignedContainer application attempt=appattempt_1564332457320_0020_000002 container=null queue=santosh clusterResource= type=OFF_SWITCH requestedPartition=
2019-07-28 22:39:14,074 INFO rmcontainer.RMContainerImpl (RMContainerImpl.java:handle(490)) - container_e20_1564332457320_0020_02_000001 Container Transitioned from NEW to ALLOCATED
2019-07-28 22:39:14,075 INFO fica.FiCaSchedulerNode (FiCaSchedulerNode.java:allocateContainer(169)) - Assigned container container_e20_1564332457320_0020_02_000001 of capacity on host sandbox-hdp.hortonworks.com:45454, which has 1 containers, used and available after allocation
2019-07-28 22:39:14,075 INFO security.NMTokenSecretManagerInRM (NMTokenSecretManagerInRM.java:createAndGetNMToken(200)) - Sending NMToken for nodeId : sandbox-hdp.hortonworks.com:45454 for container : container_e20_1564332457320_0020_02_000001
2019-07-28 22:39:14,076 INFO rmcontainer.RMContainerImpl (RMContainerImpl.java:handle(490)) - container_e20_1564332457320_0020_02_000001 Container Transitioned from ALLOCATED to ACQUIRED
2019-07-28 22:39:14,076 INFO security.NMTokenSecretManagerInRM (NMTokenSecretManagerInRM.java:clearNodeSetForAttempt(146)) - Clear node set for appattempt_1564332457320_0020_000002
2019-07-28 22:39:14,076 INFO capacity.ParentQueue (ParentQueue.java:apply(1332)) - assignedContainer queue=root usedCapacity=0.25 absoluteUsedCapacity=0.25 used= cluster=
2019-07-28 22:39:14,076 INFO capacity.CapacityScheduler (CapacityScheduler.java:tryCommit(2890)) - Allocation proposal accepted
2019-07-28 22:39:14,076 INFO attempt.RMAppAttemptImpl (RMAppAttemptImpl.java:storeAttempt(2213)) - Storing attempt: AppId: application_1564332457320_0020 AttemptId: appattempt_1564332457320_0020_000002 MasterContainer: Container: [ContainerId: container_e20_1564332457320_0020_02_000001, AllocationRequestId: -1, Version: 0, NodeId: sandbox-hdp.hortonworks.com:45454, NodeHttpAddress: sandbox-hdp.hortonworks.com:8042, Resource: , Priority: 0, Token: Token { kind: ContainerToken, service: 172.18.0.3:45454 }, ExecutionType: GUARANTEED, ]
2019-07-28 22:39:14,077 INFO attempt.RMAppAttemptImpl (RMAppAttemptImpl.java:handle(925)) - appattempt_1564332457320_0020_000002 State change from SCHEDULED to ALLOCATED_SAVING on event = CONTAINER_ALLOCATED
2019-07-28 22:39:14,088 INFO attempt.RMAppAttemptImpl (RMAppAttemptImpl.java:handle(925)) - appattempt_1564332457320_0020_000002 State change from ALLOCATED_SAVING to ALLOCATED on event = ATTEMPT_NEW_SAVED
2019-07-28 22:39:14,089 INFO amlauncher.AMLauncher (AMLauncher.java:run(307)) - Launching masterappattempt_1564332457320_0020_000002
2019-07-28 22:39:14,091 INFO amlauncher.AMLauncher (AMLauncher.java:launch(109)) - Setting up container Container: [ContainerId: container_e20_1564332457320_0020_02_000001, AllocationRequestId: -1, Version: 0, NodeId: sandbox-hdp.hortonworks.com:45454, NodeHttpAddress: sandbox-hdp.hortonworks.com:8042, Resource: , Priority: 0, Token: Token { kind: ContainerToken, service: 172.18.0.3:45454 }, ExecutionType: GUARANTEED, ] for AM appattempt_1564332457320_0020_000002
2019-07-28 22:39:14,092 INFO security.AMRMTokenSecretManager (AMRMTokenSecretManager.java:createAndGetAMRMToken(195)) - Create AMRMToken for ApplicationAttempt: appattempt_1564332457320_0020_000002
2019-07-28 22:39:14,092 INFO security.AMRMTokenSecretManager (AMRMTokenSecretManager.java:createPassword(307)) - Creating password for appattempt_1564332457320_0020_000002
2019-07-28 22:39:14,110 INFO amlauncher.AMLauncher (AMLauncher.java:launch(130)) - Done launching container Container: [ContainerId: container_e20_1564332457320_0020_02_000001, AllocationRequestId: -1, Version: 0, NodeId: sandbox-hdp.hortonworks.com:45454, NodeHttpAddress: sandbox-hdp.hortonworks.com:8042, Resource: , Priority: 0, Token: Token { kind: ContainerToken, service: 172.18.0.3:45454 }, ExecutionType: GUARANTEED, ] for AM appattempt_1564332457320_0020_000002
2019-07-28 22:39:14,110 INFO attempt.RMAppAttemptImpl (RMAppAttemptImpl.java:handle(925)) - appattempt_1564332457320_0020_000002 State change from ALLOCATED to LAUNCHED on event = LAUNCHED
2019-07-28 22:39:15,056 INFO rmcontainer.RMContainerImpl (RMContainerImpl.java:handle(490)) - container_e20_1564332457320_0020_02_000001 Container Transitioned from ACQUIRED to RUNNING
2019-07-28 22:39:16,752 INFO rmcontainer.RMContainerImpl (RMContainerImpl.java:handle(490)) - container_e20_1564332457320_0020_02_000001 Container Transitioned from RUNNING to COMPLETED
2019-07-28 22:39:16,755 INFO attempt.RMAppAttemptImpl (RMAppAttemptImpl.java:rememberTargetTransitionsAndStoreState(1412)) - Updating application attempt appattempt_1564332457320_0020_000002 with final state: FAILED, and exit status: -1000
2019-07-28 22:39:16,755 INFO attempt.RMAppAttemptImpl (RMAppAttemptImpl.java:handle(925)) - appattempt_1564332457320_0020_000002 State change from LAUNCHED to FINAL_SAVING on event = CONTAINER_FINISHED
2019-07-28 22:39:16,899 INFO resourcemanager.ApplicationMasterService (ApplicationMasterService.java:unregisterAttempt(496)) - Unregistering app attempt : appattempt_1564332457320_0020_000002
2019-07-28 22:39:16,900 INFO security.AMRMTokenSecretManager (AMRMTokenSecretManager.java:applicationMasterFinished(124)) - Application finished, removing password for appattempt_1564332457320_0020_000002
2019-07-28 22:39:16,900 INFO attempt.RMAppAttemptImpl (RMAppAttemptImpl.java:handle(925)) - appattempt_1564332457320_0020_000002 State change from FINAL_SAVING to FAILED on event = ATTEMPT_UPDATE_SAVED
2019-07-28 22:39:16,900 INFO rmapp.RMAppImpl (RMAppImpl.java:transition(1538)) - The number of failed attempts is 2. The max attempts is 2
2019-07-28 22:39:16,900 INFO rmapp.RMAppImpl (RMAppImpl.java:rememberTargetTransitionsAndStoreState(1278)) - Updating application application_1564332457320_0020 with final state: FAILED
2019-07-28 22:39:16,900 INFO rmapp.RMAppImpl (RMAppImpl.java:handle(912)) - application_1564332457320_0020 State change from ACCEPTED to FINAL_SAVING on event = ATTEMPT_FAILED
2019-07-28 22:39:16,900 INFO recovery.RMStateStore (RMStateStore.java:transition(260)) - Updating info for app: application_1564332457320_0020
2019-07-28 22:39:16,900 INFO capacity.CapacityScheduler (CapacityScheduler.java:doneApplicationAttempt(1085)) - Application Attempt appattempt_1564332457320_0020_000002 is done. finalState=FAILED
2019-07-28 22:39:16,901 INFO scheduler.AppSchedulingInfo (AppSchedulingInfo.java:clearRequests(159)) - Application application_1564332457320_0020 requests cleared
2019-07-28 22:39:16,901 INFO capacity.LeafQueue (LeafQueue.java:removeApplicationAttempt(1003)) - Application removed - appId: application_1564332457320_0020 user: santosh queue: santosh #user-pending-applications: 0 #user-active-applications: 0 #queue-pending-applications: 0 #queue-active-applications: 0
2019-07-28 22:39:16,916 INFO rmapp.RMAppImpl (RMAppImpl.java:transition(1197)) - Application application_1564332457320_0020 failed 2 times due to AM Container for appattempt_1564332457320_0020_000002 exited with exitCode: -1000
Failing this attempt.Diagnostics: (Client.java:1558)
at org.apache.hadoop.ipc.Client.call(Client.java:1389)
... 37 more
Caused by: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
at org.apache.hadoop.security.SaslRpcClient.selectSaslClient(SaslRpcClient.java:173)
at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:390)
at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:614)
at org.apache.hadoop.ipc.Client$Connection.access$2300(Client.java:410)
at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:800)
at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:796)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:796)
... 40 more
Caused by: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
I can't create an instance of Gremlin Server with HBase and Elasticsearch.
When I run the shell script bin/gremlin-server.sh config/gremlin.yaml, I get this exception:
Exception in thread "main" java.lang.IllegalStateException: java.lang.NoSuchMethodException: org.janusgraph.graphdb.tinkerpop.plugin.JanusGraphGremlinPlugin.build()
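This NoSuchMethodException usually points to a mismatch between the TinkerPop version on the classpath and the plugin declaration style in the server YAML. In TinkerPop 3.3+ distributions, plugins are declared under scriptEngines rather than as an older top-level plugin list; a sketch of that section (not a complete gremlin.yaml, and worth checking against the TinkerPop version actually bundled with your JanusGraph):

```
scriptEngines:
  gremlin-groovy:
    plugins:
      org.janusgraph.graphdb.tinkerpop.plugin.JanusGraphGremlinPlugin: {}
      org.apache.tinkerpop.gremlin.server.jsr223.GremlinServerGremlinPlugin: {}
```

If the declaration style already matches, the remaining suspect is mixed jar versions in lib/, since the server looks up the plugin's build() method reflectively and fails when the class comes from an incompatible release.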
Gremlin Server logs:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/user/janusgraph/lib/slf4j-log4j12-1.7.12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/user/janusgraph/lib/logback-classic-1.1.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
0 [main] INFO org.apache.tinkerpop.gremlin.server.GremlinServer -
\,,,/
(o o)
-----oOOo-(3)-oOOo-----
135 [main] INFO org.apache.tinkerpop.gremlin.server.GremlinServer - Configuring Gremlin Server from config/gremlin.yaml
211 [main] INFO org.apache.tinkerpop.gremlin.server.util.MetricManager - Configured Metrics Slf4jReporter configured with interval=180000ms and loggerName=org.apache.tinkerpop.gremlin.server.Settings$Slf4jReporterMetrics
557 [main] INFO org.janusgraph.diskstorage.hbase.HBaseCompatLoader - Instantiated HBase compatibility layer supporting runtime HBase version 1.2.6: org.janusgraph.diskstorage.hbase.HBaseCompat1_0
835 [main] INFO org.janusgraph.diskstorage.hbase.HBaseStoreManager - HBase configuration: setting zookeeper.znode.parent=/hbase-unsecure
836 [main] INFO org.janusgraph.diskstorage.hbase.HBaseStoreManager - Copied host list from root.storage.hostname to hbase.zookeeper.quorum: main.local,data1.local,data2.local
836 [main] INFO org.janusgraph.diskstorage.hbase.HBaseStoreManager - Copied Zookeeper Port from root.storage.port to hbase.zookeeper.property.clientPort: 2181
866 [main] WARN org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
1214 [main] INFO org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper - Process identifier=hconnection-0x1e44b638 connecting to ZooKeeper ensemble=main.local:2181,data1.local:2181,data2.local:2181
1220 [main] INFO org.apache.hadoop.hbase.shaded.org.apache.zookeeper.ZooKeeper - Client environment:zookeeper.version=3.4.6-1569965, built on 02/20/2014 09:09 GMT
1220 [main] INFO org.apache.hadoop.hbase.shaded.org.apache.zookeeper.ZooKeeper - Client environment:host.name=main.local
1220 [main] INFO org.apache.hadoop.hbase.shaded.org.apache.zookeeper.ZooKeeper - Client environment:java.version=1.8.0_212
1220 [main] INFO org.apache.hadoop.hbase.shaded.org.apache.zookeeper.ZooKeeper - Client environment:java.vendor=Oracle Corporation
1220 [main] INFO org.apache.hadoop.hbase.shaded.org.apache.zookeeper.ZooKeeper - Client environment:java.home=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.212.b04-0.el7_6.x86_64/jre
1221 [main] INFO org.apache.hadoop.hbase.shaded.org.apache.zookeeper.ZooKeeper - Client environment:java.class.path=/home/user/janusgraph/conf/gremlin-server:/home/user/janusgraph/lib/slf4j-log4j12-
// (the classpath continues with many JanusGraph dependency jars, omitted here)
1256 [main] INFO org.apache.hadoop.hbase.shaded.org.apache.zookeeper.ZooKeeper - Client environment:java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
1256 [main] INFO org.apache.hadoop.hbase.shaded.org.apache.zookeeper.ZooKeeper - Client environment:java.io.tmpdir=/tmp
1256 [main] INFO org.apache.hadoop.hbase.shaded.org.apache.zookeeper.ZooKeeper - Client environment:java.compiler=<NA>
1256 [main] INFO org.apache.hadoop.hbase.shaded.org.apache.zookeeper.ZooKeeper - Client environment:os.name=Linux
1256 [main] INFO org.apache.hadoop.hbase.shaded.org.apache.zookeeper.ZooKeeper - Client environment:os.arch=amd64
1256 [main] INFO org.apache.hadoop.hbase.shaded.org.apache.zookeeper.ZooKeeper - Client environment:os.version=3.10.0-862.el7.x86_64
1256 [main] INFO org.apache.hadoop.hbase.shaded.org.apache.zookeeper.ZooKeeper - Client environment:user.name=user
1257 [main] INFO org.apache.hadoop.hbase.shaded.org.apache.zookeeper.ZooKeeper - Client environment:user.home=/home/user
1257 [main] INFO org.apache.hadoop.hbase.shaded.org.apache.zookeeper.ZooKeeper - Client environment:user.dir=/home/user/janusgraph
1257 [main] INFO org.apache.hadoop.hbase.shaded.org.apache.zookeeper.ZooKeeper - Initiating client connection, connectString=main.local:2181,data1.local:2181,data2.local:2181 sessionTimeout=90000 watcher=hconnection-0x1e44b6380x0, quorum=main.local:2181,data1.local:2181,data2.local:2181, baseZNode=/hbase-unsecure
1274 [main-SendThread(data2.local:2181)] INFO org.apache.hadoop.hbase.shaded.org.apache.zookeeper.ClientCnxn - Opening socket connection to server data2.local/xxx.xxx.xxx.xxx:2181. Will not attempt to authenticate using SASL (unknown error)
1394 [main-SendThread(data2.local:2181)] INFO org.apache.hadoop.hbase.shaded.org.apache.zookeeper.ClientCnxn - Socket connection established to data2.local/xxx.xxx.xxx.xxx, initiating session
1537 [main-SendThread(data2.local:2181)] INFO org.apache.hadoop.hbase.shaded.org.apache.zookeeper.ClientCnxn - Session establishment complete on server data2.local/xxx.xxx.xxx.xxx:2181, sessionid = 0x26b266353e50014, negotiated timeout = 60000
3996 [main] INFO org.janusgraph.core.util.ReflectiveConfigOptionLoader - Loaded and initialized config classes: 13 OK out of 13 attempts in PT0.631S
4103 [main] INFO org.reflections.Reflections - Reflections took 60 ms to scan 2 urls, producing 0 keys and 0 values
4400 [main] WARN org.janusgraph.graphdb.configuration.GraphDatabaseConfiguration - Local setting cache.db-cache-time=180000 (Type: GLOBAL_OFFLINE) is overridden by globally managed value (10000). Use the ManagementSystem interface instead of the local configuration to control this setting.
4453 [main] WARN org.janusgraph.graphdb.configuration.GraphDatabaseConfiguration - Local setting cache.db-cache-clean-wait=20 (Type: GLOBAL_OFFLINE) is overridden by globally managed value (50). Use the ManagementSystem interface instead of the local configuration to control this setting.
4473 [main] INFO org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation - Closing master protocol: MasterService
4474 [main] INFO org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation - Closing zookeeper sessionid=0x26b266353e50014
4485 [main] INFO org.apache.hadoop.hbase.shaded.org.apache.zookeeper.ZooKeeper - Session: 0x26b266353e50014 closed
4485 [main-EventThread] INFO org.apache.hadoop.hbase.shaded.org.apache.zookeeper.ClientCnxn - EventThread shut down
4500 [main] INFO org.janusgraph.graphdb.configuration.GraphDatabaseConfiguration - Generated unique-instance-id=c0a8873843641-main-local1
4530 [main] INFO org.janusgraph.diskstorage.hbase.HBaseStoreManager - HBase configuration: setting zookeeper.znode.parent=/hbase-unsecure
4530 [main] INFO org.janusgraph.diskstorage.hbase.HBaseStoreManager - Copied host list from root.storage.hostname to hbase.zookeeper.quorum: main.local,data1.local,data2.local
4531 [main] INFO org.janusgraph.diskstorage.hbase.HBaseStoreManager - Copied Zookeeper Port from root.storage.port to hbase.zookeeper.property.clientPort: 2181
4532 [main] INFO org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper - Process identifier=hconnection-0x5bb3d42d connecting to ZooKeeper ensemble=main.local:2181,data1.local:2181,data2.local:2181
4532 [main] INFO org.apache.hadoop.hbase.shaded.org.apache.zookeeper.ZooKeeper - Initiating client connection, connectString=main.local:2181,data1.local:2181,data2.local:2181 sessionTimeout=90000 watcher=hconnection-0x5bb3d42d0x0, quorum=main.local:2181,data1.local:2181,data2.local:2181, baseZNode=/hbase-unsecure
4534 [main-SendThread(main.local:2181)] INFO org.apache.hadoop.hbase.shaded.org.apache.zookeeper.ClientCnxn - Opening socket connection to server main.local/xxx.xxx.xxx.xxx:2181. Will not attempt to authenticate using SASL (unknown error)
4534 [main-SendThread(main.local:2181)] INFO org.apache.hadoop.hbase.shaded.org.apache.zookeeper.ClientCnxn - Socket connection established to main.local/xxx.xxx.xxx.xxx:2181, initiating session
4611 [main-SendThread(main.local:2181)] INFO org.apache.hadoop.hbase.shaded.org.apache.zookeeper.ClientCnxn - Session establishment complete on server main.local/xxx.xxx.xxx.xxx:2181, sessionid = 0x36b266353fd0021, negotiated timeout = 60000
4616 [main] INFO org.janusgraph.diskstorage.Backend - Configuring index [search]
5781 [main] INFO org.janusgraph.diskstorage.Backend - Initiated backend operations thread pool of size 16
6322 [main] INFO org.janusgraph.diskstorage.Backend - Configuring total store cache size: 186687592
7555 [main] INFO org.janusgraph.graphdb.database.IndexSerializer - Hashing index keys
7925 [main] INFO org.janusgraph.diskstorage.log.kcvs.KCVSLog - Loaded unidentified ReadMarker start time 2019-06-13T09:54:08.929Z into org.janusgraph.diskstorage.log.kcvs.KCVSLog$MessagePuller#656d10a4
7927 [main] INFO org.apache.tinkerpop.gremlin.server.GremlinServer - Graph [graph] was successfully configured via [config/db.properties].
7927 [main] INFO org.apache.tinkerpop.gremlin.server.util.ServerGremlinExecutor - Initialized Gremlin thread pool. Threads in pool named with pattern gremlin-*
Exception in thread "main" java.lang.IllegalStateException: java.lang.NoSuchMethodException: org.janusgraph.graphdb.tinkerpop.plugin.JanusGraphGremlinPlugin.build()
at org.apache.tinkerpop.gremlin.groovy.engine.GremlinExecutor.initializeGremlinScriptEngineManager(GremlinExecutor.java:522)
at org.apache.tinkerpop.gremlin.groovy.engine.GremlinExecutor.<init>(GremlinExecutor.java:126)
at org.apache.tinkerpop.gremlin.groovy.engine.GremlinExecutor.<init>(GremlinExecutor.java:83)
at org.apache.tinkerpop.gremlin.groovy.engine.GremlinExecutor$Builder.create(GremlinExecutor.java:813)
at org.apache.tinkerpop.gremlin.server.util.ServerGremlinExecutor.<init>(ServerGremlinExecutor.java:169)
at org.apache.tinkerpop.gremlin.server.util.ServerGremlinExecutor.<init>(ServerGremlinExecutor.java:89)
at org.apache.tinkerpop.gremlin.server.GremlinServer.<init>(GremlinServer.java:110)
at org.apache.tinkerpop.gremlin.server.GremlinServer.main(GremlinServer.java:363)
Caused by: java.lang.NoSuchMethodException: org.janusgraph.graphdb.tinkerpop.plugin.JanusGraphGremlinPlugin.build()
at java.lang.Class.getMethod(Class.java:1786)
at org.apache.tinkerpop.gremlin.groovy.engine.GremlinExecutor.initializeGremlinScriptEngineManager(GremlinExecutor.java:492)
... 7 more
Graph configuration:
storage.backend=hbase
storage.hostname=main.local,data1.local,data2.local
storage.port=2181
storage.hbase.ext.zookeeper.znode.parent=/hbase-unsecure
cache.db-cache=true
cache.db-cache-clean-wait=20
cache.db-cache-time=180000
cache.db-cache-size=0.5
index.search.backend=elasticsearch
index.search.hostname=xxx.xxx.xxx.xxx
index.search.port=9200
index.search.elasticsearch.client-only=false
gremlin.graph=org.janusgraph.core.JanusGraphFactory
host=0.0.0.0
Gremlin-server configuration:
host: localhost
port: 8182
channelizer: org.apache.tinkerpop.gremlin.server.channel.HttpChannelizer
graphs: { graph: config/db.properties }
scriptEngines: {
gremlin-groovy: {
plugins: {
org.janusgraph.graphdb.tinkerpop.plugin.JanusGraphGremlinPlugin: {},
org.apache.tinkerpop.gremlin.server.jsr223.GremlinServerGremlinPlugin: {},
org.apache.tinkerpop.gremlin.tinkergraph.jsr223.TinkerGraphGremlinPlugin: {},
org.apache.tinkerpop.gremlin.jsr223.ImportGremlinPlugin: { classImports: [java.lang.Math], methodImports: [java.lang.Math#*] },
org.apache.tinkerpop.gremlin.jsr223.ScriptFileGremlinPlugin: { files: [scripts/janusgraph.groovy] }
}
}
}
serializers:
- { className: org.apache.tinkerpop.gremlin.driver.ser.GryoMessageSerializerV3d0, config: { ioRegistries: [org.janusgraph.graphdb.tinkerpop.JanusGraphIoRegistry] } }
- { className: org.apache.tinkerpop.gremlin.driver.ser.GryoMessageSerializerV3d0, config: { serializeResultToString: true } }
- { className: org.apache.tinkerpop.gremlin.driver.ser.GraphSONMessageSerializerV3d0, config: { ioRegistries: [org.janusgraph.graphdb.tinkerpop.JanusGraphIoRegistry] } }
metrics: {
slf4jReporter: {enabled: true, interval: 180000}
}
What do I need to do to make the server start without this error?
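For reference, the failing call in the stack trace is a reflective `getMethod("build")` lookup inside `GremlinExecutor.initializeGremlinScriptEngineManager`. A minimal stand-alone sketch of that failure mode, using a hypothetical stand-in plugin class rather than the real JanusGraph one:

```java
import java.lang.reflect.Method;

public class PluginLookup {
    // Hypothetical stand-in for a plugin class that exposes a static instance()
    // factory but no build() method.
    static class NewStylePlugin {
        public static NewStylePlugin instance() { return new NewStylePlugin(); }
    }

    public static void main(String[] args) {
        // GremlinExecutor (see the stack trace above) resolves each configured
        // plugin class and reflectively looks up build(); when that method is
        // absent, Class.getMethod throws the same NoSuchMethodException
        // reported in the log.
        try {
            Method build = NewStylePlugin.class.getMethod("build");
            System.out.println("found " + build);
        } catch (NoSuchMethodException e) {
            System.out.println("NoSuchMethodException: " + e.getMessage());
        }
    }
}
```

This only illustrates the mechanism; it suggests the plugin class on the classpath does not expose the factory method this version of Gremlin Server expects, which typically points at mismatched library versions.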
I have put together a crawler in a test environment that was running just fine with 2 small sites, including successfully indexing to Solr, so the integration between Nutch and Solr seems to be fine.
The only change I have made is adding another site to seed.txt and another line in regex-urlfilters.txt, using the exact same syntax as the other sites.
Now when I run the crawler it runs fine for a while and then crashes with a 'Job failed!' error and little helpful information.
This is the console output. It is useful to note that this is the 3rd segment created in the crawl, so it had already successfully indexed 2 segments before the error. Could there be something in the new site that is causing corruption?
Indexing 20151030150906 to index
/opt/apache-nutch-1.10/bin/nutch index -Dsolr.server.url=http://localhost:8983/solr/TestCrawlCore TestCrawl//crawldb -linkdb TestCrawl//linkdb TestCrawl//segments/20151030150906
Indexer: starting at 2015-10-30 15:14:00
Indexer: deleting gone documents: false
Indexer: URL filtering: false
Indexer: URL normalizing: false
Active IndexWriters :
SOLRIndexWriter
solr.server.url : URL of the SOLR instance (mandatory)
solr.commit.size : buffer size when sending to SOLR (default 1000)
solr.mapping.file : name of the mapping file for fields (default solrindex-mapping.xml)
solr.auth : use authentication (default false)
solr.auth.username : username for authentication
solr.auth.password : password for authentication
Indexer: java.io.IOException: Job failed!
at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1357)
at org.apache.nutch.indexer.IndexingJob.index(IndexingJob.java:113)
at org.apache.nutch.indexer.IndexingJob.run(IndexingJob.java:177)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.nutch.indexer.IndexingJob.main(IndexingJob.java:187)
Error running:
/opt/apache-nutch-1.10/bin/nutch index -Dsolr.server.url=http://localhost:8983/solr/TestCrawlCore TestCrawl//crawldb -linkdb TestCrawl//linkdb TestCrawl//segments/20151030150906
Failed with exit value 255.
This is the relevant data from hadoop.log:
2015-10-30 15:14:00,854 INFO indexer.IndexingJob - Indexer: starting at 2015-10-30 15:14:00
2015-10-30 15:14:00,909 INFO indexer.IndexingJob - Indexer: deleting gone documents: false
2015-10-30 15:14:00,909 INFO indexer.IndexingJob - Indexer: URL filtering: false
2015-10-30 15:14:00,910 INFO indexer.IndexingJob - Indexer: URL normalizing: false
2015-10-30 15:14:01,113 INFO indexer.IndexWriters - Adding org.apache.nutch.indexwriter.solr.SolrIndexWriter
2015-10-30 15:14:01,113 INFO indexer.IndexingJob - Active IndexWriters :
SOLRIndexWriter
solr.server.url : URL of the SOLR instance (mandatory)
solr.commit.size : buffer size when sending to SOLR (default 1000)
solr.mapping.file : name of the mapping file for fields (default solrindex-mapping.xml)
solr.auth : use authentication (default false)
solr.auth.username : username for authentication
solr.auth.password : password for authentication
2015-10-30 15:14:01,118 INFO indexer.IndexerMapReduce - IndexerMapReduce: crawldb: TestCrawl/crawldb
2015-10-30 15:14:01,118 INFO indexer.IndexerMapReduce - IndexerMapReduce: linkdb: TestCrawl/linkdb
2015-10-30 15:14:01,119 INFO indexer.IndexerMapReduce - IndexerMapReduces: adding segment: TestCrawl/segments/20151030150906
2015-10-30 15:14:01,264 WARN util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2015-10-30 15:14:01,722 INFO anchor.AnchorIndexingFilter - Anchor deduplication is: off
2015-10-30 15:14:02,253 INFO indexer.IndexWriters - Adding org.apache.nutch.indexwriter.solr.SolrIndexWriter
2015-10-30 15:14:02,271 INFO solr.SolrMappingReader - source: content dest: content
2015-10-30 15:14:02,271 INFO solr.SolrMappingReader - source: title dest: title
2015-10-30 15:14:02,271 INFO solr.SolrMappingReader - source: host dest: host
2015-10-30 15:14:02,271 INFO solr.SolrMappingReader - source: segment dest: segment
2015-10-30 15:14:02,271 INFO solr.SolrMappingReader - source: boost dest: boost
2015-10-30 15:14:02,271 INFO solr.SolrMappingReader - source: digest dest: digest
2015-10-30 15:14:02,271 INFO solr.SolrMappingReader - source: tstamp dest: tstamp
2015-10-30 15:14:02,370 INFO solr.SolrIndexWriter - Indexing 38 documents
2015-10-30 15:14:02,487 INFO solr.SolrIndexWriter - Indexing 38 documents
2015-10-30 15:14:02,524 WARN mapred.LocalJobRunner - job_local593696138_0001
org.apache.solr.common.SolrException: Bad Request
Bad Request
request: http://localhost:8983/solr/TestCrawlCore/update?wt=javabin&version=2
at org.apache.solr.client.solrj.impl.CommonsHttpSolrServer.request(CommonsHttpSolrServer.java:430)
at org.apache.solr.client.solrj.impl.CommonsHttpSolrServer.request(CommonsHttpSolrServer.java:244)
at org.apache.solr.client.solrj.request.AbstractUpdateRequest.process(AbstractUpdateRequest.java:105)
at org.apache.nutch.indexwriter.solr.SolrIndexWriter.close(SolrIndexWriter.java:153)
at org.apache.nutch.indexer.IndexWriters.close(IndexWriters.java:115)
at org.apache.nutch.indexer.IndexerOutputFormat$1.close(IndexerOutputFormat.java:44)
at org.apache.hadoop.mapred.ReduceTask$OldTrackingRecordWriter.close(ReduceTask.java:467)
at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:535)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:421)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:398)
2015-10-30 15:14:03,508 ERROR indexer.IndexingJob - Indexer: java.io.IOException: Job failed!
at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1357)
at org.apache.nutch.indexer.IndexingJob.index(IndexingJob.java:113)
at org.apache.nutch.indexer.IndexingJob.run(IndexingJob.java:177)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.nutch.indexer.IndexingJob.main(IndexingJob.java:187)
I'm just figuring this stuff out, so I don't know the next step in troubleshooting this problem. Any help would be appreciated; I'm happy to include more information if there is something specific that would be helpful.
This turned out to be a mismatch between the Nutch and Solr schemas.
Thanks to TMBT (see comments above) I found an additional error in the Solr logs complaining about an unidentified field: "anchor".
All I had to do was copy the anchor field declaration from the Nutch schema into the Solr schema and restart the Solr service. It is now running fine.
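For anyone hitting the same mismatch: the fix is to make sure every field Nutch sends exists in the Solr schema. A hypothetical anchor declaration (modeled on Nutch's bundled schema.xml; the exact type name and attributes may differ in your setup) looks like:

```xml
<!-- In Solr's schema.xml: declare the field Nutch sends so updates are not
     rejected with "Bad Request". multiValued is needed because a page can
     have many inbound anchor texts. -->
<field name="anchor" type="string" stored="true" indexed="true" multiValued="true"/>
```

After editing the schema, restart Solr (or reload the core) so the change takes effect.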
(Projektassistent v1.4, installed with V-Modell-XT-1.4-online-installer.jar) Exporting a project model to HTML produces the log output below, or more specifically the last error message:
2013-12-18 11:23:37,009 [Thread-5] ERROR com.foursoft.fourever.export - Exception caught from V-Modell exporter.
de.tuc.in.sse.weit.export.steuerung.exception.ExportException: Unerwarteter Fehler im JooConverter (OpenOffice)!
net.sf.jooreports.openoffice.connection.OpenOfficeException: conversion failed; com.sun.star.lang.DisposedException: java.io.IOException: com.sun.star.io.IOException: java.net.SocketException: Connection reset
at de.tuc.in.sse.weit.export.steuerung.handler.JooConverterHandler.convert(JooConverterHandler.java:99)
...
at com.foursoft.projektassistent.view.util.SwingWorkerVariant$2.run(SwingWorkerVariant.java:66)
at java.lang.Thread.run(Unknown Source)
2013-12-18 11:23:37,010 [Thread-5] ERROR com.foursoft.projektassistent.projekt - Error while exporting.
com.foursoft.fourever.export.exception.ExportException: Unerwarteter Fehler im JooConverter (OpenOffice)!
net.sf.jooreports.openoffice.connection.OpenOfficeException: conversion failed; com.sun.star.lang.DisposedException: java.io.IOException: com.sun.star.io.IOException: java.net.SocketException: Connection reset
at edu.tum.cs.vmodell.export.impl.VModellExporterImpl.export(VModellExporterImpl.java:179)
at com.foursoft.fourever.vmodell.exporter.ExporterBackupFacade.export(ExporterBackupFacade.java:59)
at com.foursoft.fourever.vmodell.exporter.ExporterMergeFacade.export(ExporterMergeFacade.java:77)
at com.foursoft.projektassistent.projekt.impl.VMProjektManagerImpl.exportVModel(VMProjektManagerImpl.java:2540)
at com.foursoft.projektassistent.view.impl.TailorView$1.doNonUILogic(TailorView.java:254)
at com.foursoft.projektassistent.view.util.SwingWorkerVariant.construct(SwingWorkerVariant.java:108)
at com.foursoft.projektassistent.view.util.SwingWorkerVariant$2.run(SwingWorkerVariant.java:66)
at java.lang.Thread.run(Unknown Source)
Caused by: de.tuc.in.sse.weit.export.steuerung.exception.ExportException: Unerwarteter Fehler im JooConverter (OpenOffice)!
net.sf.jooreports.openoffice.connection.OpenOfficeException: conversion failed; com.sun.star.lang.DisposedException: java.io.IOException: com.sun.star.io.IOException: java.net.SocketException: Connection reset
at de.tuc.in.sse.weit.export.steuerung.handler.JooConverterHandler.convert(JooConverterHandler.java:99)
at de.tuc.in.sse.weit.export.steuerung.handler.HTMLHandler.convert(HTMLHandler.java:98)
at de.tuc.in.sse.weit.export.steuerung.impl.ODTReportGenerator.export(ODTReportGenerator.java:600)
at edu.tum.cs.vmodell.export.impl.VModellExporterImpl.export(VModellExporterImpl.java:173)
... 7 more
The cause is not clear ("Unerwarteter Fehler im JooConverter (OpenOffice)!" translates to "Unexpected error in the JooConverter (OpenOffice)!"), but since the *.odt file is nevertheless created in the export directory, it can simply be opened and saved as an HTML file via Save as ... instead.
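If the manual Save as ... step becomes tedious, the same conversion can be scripted. A sketch, assuming a LibreOffice/OpenOffice soffice binary on the PATH and an export file named project-export.odt (both names are placeholders for your actual setup):

```shell
# Convert the already-exported ODT to HTML headlessly, bypassing the failing
# JooConverter step. --headless / --convert-to are LibreOffice-style flags;
# older OpenOffice builds use single-dash variants (-headless).
soffice --headless --convert-to html --outdir ./export project-export.odt
```

This is only a workaround; it does not address why the JooConverter socket connection is being reset.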
And here are some earlier log entries that could potentially be useful for finding the root cause:
2013-12-18 11:22:26,192 [Thread-3] ERROR com.foursoft.fourever.xmlfileio - No complex type Zusatzthema
2013-12-18 11:22:26,192 [Thread-3] ERROR com.foursoft.fourever.xmlfileio - No complex type Zusatzthema
2013-12-18 11:23:00,484 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.TableHandler - Table configuration: Invalid table header cell style!
2013-12-18 11:23:01,765 [Thread-5] WARN com.foursoft.fourever.openoffice - OpenOffice already running....
2013-12-18 11:23:02,754 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.template.TemplateTransformer - Could not open referenced template "MakeLinkTo": No template found for name "MakeLinkTo"!
2013-12-18 11:23:02,769 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.template.TemplateTransformer - Could not open referenced template "MakeLinkTo": No template found for name "MakeLinkTo"!
2013-12-18 11:23:20,604 [Thread-5] WARN com.foursoft.fourever.openoffice - OpenOffice already running....
2013-12-18 11:23:20,674 [Thread-5] ERROR com.foursoft.fourever.draw - Draw error occurred.
com.sun.star.lang.DisposedException: java_remote_bridge com.sun.star.lib.uno.bridges.java_remote.java_remote_bridge#393ec620 is disposed
at com.sun.star.lib.uno.bridges.java_remote.java_remote_bridge.checkDisposed(java_remote_bridge.java:702)
...
at com.foursoft.projektassistent.view.util.SwingWorkerVariant$2.run(SwingWorkerVariant.java:66)
at java.lang.Thread.run(Unknown Source)
2013-12-18 11:23:20,680 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.ImageHandler - Could not open image images/PTV_df42106fce7993f.gif!
2013-12-18 11:23:20,690 [Thread-5] WARN com.foursoft.fourever.openoffice - OpenOffice already running....
2013-12-18 11:23:20,691 [Thread-5] ERROR com.foursoft.fourever.draw - Draw error occurred.
de.tuc.in.sse.weit.export.openoffice.exception.OpenOfficeException: OpenOffice document not initialized.
at de.tuc.in.sse.weit.export.openoffice.documents.ODTReportDocument.<init>(ODTReportDocument.java:53)
...
at com.foursoft.projektassistent.view.util.SwingWorkerVariant$2.run(SwingWorkerVariant.java:66)
at java.lang.Thread.run(Unknown Source)
2013-12-18 11:23:20,703 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Could not find a snippet image for the snippet with id set:
2013-12-18 11:23:20,703 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: d46811643407f31-dc2f116434d0752-1393a1164341bca2
2013-12-18 11:23:20,704 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Could not find a snippet image for the snippet with id set:
2013-12-18 11:23:20,704 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: d46811643407f31-dc2f116434d0752-1393a1164341bca2
2013-12-18 11:23:20,704 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: d46811643407f31-dc2f116434d0752-6b811643421c56
2013-12-18 11:23:20,704 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Could not find a snippet image for the snippet with id set:
2013-12-18 11:23:20,704 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: d46811643407f31-dc2f116434d0752-6b811643421c56
2013-12-18 11:23:20,704 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: d46811643407f31-10a1511759fd9725-16eb911759ff2d84
2013-12-18 11:23:20,704 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Could not find a snippet image for the snippet with id set:
2013-12-18 11:23:20,704 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: d46811643407f31-10a1511759fd9725-16eb911759ff2d84
2013-12-18 11:23:20,704 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: d46811643407f31-11af9116435066e8-d32f117c547ee58
2013-12-18 11:23:20,705 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Could not find a snippet image for the snippet with id set:
2013-12-18 11:23:20,705 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: d46811643407f31-11af9116435066e8-d32f117c547ee58
2013-12-18 11:23:20,705 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: d46811643407f31-11af9116435066e8-f710117c54823f8
2013-12-18 11:23:20,705 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Could not find a snippet image for the snippet with id set:
2013-12-18 11:23:20,705 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: d46811643407f31-11af9116435066e8-d32f117c547ee58
2013-12-18 11:23:20,705 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: d46811643407f31-11af9116435066e8-f710117c54823f8
2013-12-18 11:23:20,706 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Could not find a snippet image for the snippet with id set:
2013-12-18 11:23:20,707 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: fb18116435f01eb-19a411643651eae
2013-12-18 11:23:20,707 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: fb18116435f01eb-12e1511643667610
2013-12-18 11:23:20,707 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: fb18116435f01eb-11fa2116436549b5
2013-12-18 11:23:20,707 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Could not find a snippet image for the snippet with id set:
2013-12-18 11:23:20,707 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: fb18116435f01eb-11fa2116436549b5
2013-12-18 11:23:20,707 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: fb18116435f01eb-59ca116436809b0
2013-12-18 11:23:20,707 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: fb18116435f01eb-15be1164380d71d-395d116436569c0
2013-12-18 11:23:20,708 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Could not find a snippet image for the snippet with id set:
2013-12-18 11:23:20,708 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: fb18116435f01eb-15be1164380d71d-395d116436569c0
2013-12-18 11:23:20,708 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: fb18116435f01eb-15be1164380d71d-1195711a977992e2
2013-12-18 11:23:20,708 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: fb18116435f01eb-15be1164380d71d-1011c11a977d2f81
2013-12-18 11:23:20,708 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: fb18116435f01eb-15be1164380d71d-e45611643658caa
2013-12-18 11:23:20,708 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Could not find a snippet image for the snippet with id set:
2013-12-18 11:23:20,708 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: fb18116435f01eb-15be1164380d71d-e45611643658caa
2013-12-18 11:23:20,709 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: fb18116435f01eb-15be1164380d71d-395d116436569c0
2013-12-18 11:23:20,709 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Could not find a snippet image for the snippet with id set:
2013-12-18 11:23:20,709 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: fb18116435f01eb-15be1164380d71d-e45611643658caa
2013-12-18 11:23:20,709 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: fb18116435f01eb-d675116437dfb3e
2013-12-18 11:23:20,709 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: fb18116435f01eb-28871164365afd2
2013-12-18 11:23:20,709 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Could not find a snippet image for the snippet with id set:
2013-12-18 11:23:20,709 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: fb18116435f01eb-28871164365afd2
2013-12-18 11:23:20,709 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: fb18116435f01eb-11fa2116436549b5
2013-12-18 11:23:20,709 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Could not find a snippet image for the snippet with id set:
2013-12-18 11:23:20,710 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: fb18116435f01eb-28871164365afd2
2013-12-18 11:23:20,710 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: fb18116435f01eb-79ac116437f3890
2013-12-18 11:23:20,710 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: fb18116435f01eb-1dd1164365ce47
2013-12-18 11:23:20,710 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Could not find a snippet image for the snippet with id set:
2013-12-18 11:23:20,710 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: d46811643407f31-11af9116435066e8-f710117c54823f8
2013-12-18 11:23:20,710 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: d46811643407f31-11af9116435066e8-63ab117c5488b42
2013-12-18 11:23:20,711 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Could not find a snippet image for the snippet with id set:
2013-12-18 11:23:20,711 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: 15bc311643601d9c-9e811176339c5eb
2013-12-18 11:23:20,711 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: 15bc311643601d9c-15212117633cecca
2013-12-18 11:23:20,711 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: 15bc311643601d9c-107e0117633ad729
2013-12-18 11:23:20,711 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Could not find a snippet image for the snippet with id set:
2013-12-18 11:23:20,711 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: 15bc311643601d9c-9e811176339c5eb
2013-12-18 11:23:20,711 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: 15bc311643601d9c-15212117633cecca
2013-12-18 11:23:20,711 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: 15bc311643601d9c-a291117633bff32-395d116436569c0
2013-12-18 11:23:20,711 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Could not find a snippet image for the snippet with id set:
2013-12-18 11:23:20,711 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: 15bc311643601d9c-107e0117633ad729
2013-12-18 11:23:20,711 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: 15bc311643601d9c-e3e117633f85dd
2013-12-18 11:23:20,711 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: 15bc311643601d9c-161e9117634165fd
2013-12-18 11:23:20,711 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: 15bc311643601d9c-11121117633b3993
2013-12-18 11:23:20,712 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Could not find a snippet image for the snippet with id set:
2013-12-18 11:23:20,712 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: 15bc311643601d9c-a291117633bff32-395d116436569c0
2013-12-18 11:23:20,712 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: 15bc311643601d9c-a291117633bff32-1195711a977992e2
2013-12-18 11:23:20,712 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: 15bc311643601d9c-a291117633bff32-1011c11a977d2f81
2013-12-18 11:23:20,712 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: 15bc311643601d9c-a291117633bff32-e45611643658caa
2013-12-18 11:23:20,713 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Could not find a snippet image for the snippet with id set:
2013-12-18 11:23:20,713 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: 15bc311643601d9c-a291117633bff32-e45611643658caa
2013-12-18 11:23:20,713 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: 15bc311643601d9c-a291117633bff32-395d116436569c0
2013-12-18 11:23:20,713 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Could not find a snippet image for the snippet with id set:
2013-12-18 11:23:20,713 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: 15bc311643601d9c-a291117633bff32-e45611643658caa
2013-12-18 11:23:20,713 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: 15bc311643601d9c-161e9117634165fd
2013-12-18 11:23:20,713 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: 15bc311643601d9c-11121117633b3993
2013-12-18 11:23:20,714 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Could not find a snippet image for the snippet with id set:
2013-12-18 11:23:20,714 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: 15bc311643601d9c-11121117633b3993
2013-12-18 11:23:20,714 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: 15bc311643601d9c-9e811176339c5eb
2013-12-18 11:23:20,714 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Could not find a snippet image for the snippet with id set:
2013-12-18 11:23:20,714 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: 15bc311643601d9c-11121117633b3993
2013-12-18 11:23:20,714 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: 15bc311643601d9c-e9061176344244b
2013-12-18 11:23:20,714 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: 15bc311643601d9c-e4a5117633b8bb8
2013-12-18 11:23:20,714 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Could not find a snippet image for the snippet with id set:
2013-12-18 11:23:20,714 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: d46811643407f31-11af9116435066e8-f710117c54823f8
2013-12-18 11:23:20,714 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: d46811643407f31-11af9116435066e8-63ab117c5488b42
2013-12-18 11:23:20,714 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Could not find a snippet image for the snippet with id set:
2013-12-18 11:23:20,714 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: d46811643407f31-11af9116435066e8-63ab117c5488b42
2013-12-18 11:23:20,714 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: d46811643407f31-11af9116435066e8-d32f117c547ee58
2013-12-18 11:23:20,714 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Could not find a snippet image for the snippet with id set:
2013-12-18 11:23:20,714 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: d46811643407f31-11af9116435066e8-63ab117c5488b42
2013-12-18 11:23:20,714 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: d46811643407f31-10a1511759fd9725-16eb911759ff2d84
2013-12-18 11:23:20,715 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Could not find a snippet image for the snippet with id set:
2013-12-18 11:23:20,715 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: d46811643407f31-11af9116435066e8-63ab117c5488b42
2013-12-18 11:23:20,715 [Thread-5] WARN de.tuc.in.sse.weit.export.ootrans.transform.snippets.impl.Snippet - Id: d46811643407f31-1c1a116435138ce-13cc01196fc504a7