Elasticsearch NoNodeAvailableException

I am getting the following error from Elasticsearch:
HTTP Status 500 - Request processing failed; nested exception is org.elasticsearch.client.transport.NoNodeAvailableException: None of the configured nodes are available: []
description: The server encountered an internal error that prevented it from fulfilling this request. (Apache Tomcat/7.0.64 error report)
exception:
org.springframework.web.util.NestedServletException: Request processing failed; nested exception is org.elasticsearch.client.transport.NoNodeAvailableException: None of the configured nodes are available: []
org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:943)
org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:822)
javax.servlet.http.HttpServlet.service(HttpServlet.java:624)
org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:807)
javax.servlet.http.HttpServlet.service(HttpServlet.java:731)
org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
myapp.filter.SimpleCORSFilter.doFilter(SimpleCORSFilter.java:22)
root cause:
org.elasticsearch.client.transport.NoNodeAvailableException: None of the configured nodes are available: []
org.elasticsearch.client.transport.TransportClientNodesService.ensureNodesAreAvailable(TransportClientNodesService.java:305)
org.elasticsearch.client.transport.TransportClientNodesService.execute(TransportClientNodesService.java:200)
org.elasticsearch.client.transport.support.InternalTransportIndicesAdminClient.execute(InternalTransportIndicesAdminClient.java:86)
org.elasticsearch.client.support.AbstractIndicesAdminClient.exists(AbstractIndicesAdminClient.java:178)
org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsRequestBuilder.doExecute(IndicesExistsRequestBuilder.java:53)
org.elasticsearch.action.ActionRequestBuilder.execute(ActionRequestBuilder.java:91)
org.elasticsearch.action.ActionRequestBuilder.execute(ActionRequestBuilder.java:65)
myapp.dao.CommonDao.ValidateTenant(CommonDao.java:7)
myapp.dao.CliffDomainDao.viewCliffDomains(CliffDomainDao.java:55)
myapp.service.CliffDomainService.getall(CliffDomainService.java:53)
myapp.controller.CliffDomainController.getall(CliffDomainController.java:79)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:497)
org.springframework.web.method.support.InvocableHandlerMethod.invoke(InvocableHandlerMethod.java:214)
org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:132)
org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:104)
org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandleMethod(RequestMappingHandlerAdapter.java:748)
org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:689)
org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:83)
org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:945)
org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:876)
org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:931)
org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:822)
javax.servlet.http.HttpServlet.service(HttpServlet.java:624)
org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:807)
javax.servlet.http.HttpServlet.service(HttpServlet.java:731)
org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
myapp.filter.SimpleCORSFilter.doFilter(SimpleCORSFilter.java:22)
note: The full stack trace of the root cause is available in the Apache Tomcat/7.0.64 logs.
I am running Elasticsearch 1.7.2 on Ubuntu.
Changes I have made in elasticsearch.yml:
################################### Cluster ###################################
# Cluster name identifies your cluster for auto-discovery. If you're running
# multiple clusters on the same network, make sure you're using unique names.
#
cluster.name: cliffservice
# Set the address other nodes will use to communicate with this node. If not
# set, it is automatically derived. It must point to an actual IP address.
#
#network.publish_host: 192.168.0.1
network.publish_host: 10.100.10.231
# Set both 'bind_host' and 'publish_host':
#
#network.host: 192.168.0.1
Connection code:
import org.elasticsearch.client.transport.TransportClient;
import org.elasticsearch.common.settings.ImmutableSettings;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.transport.InetSocketTransportAddress;

public TransportClient getClient() {
    // cluster.name must match the cluster.name set in elasticsearch.yml
    Settings settings = ImmutableSettings.settingsBuilder().put("cluster.name", "cliffservice").build();
    TransportClient client = new TransportClient(settings);
    // this.port must be the transport port (9300 by default), not the HTTP port (9200)
    client = client.addTransportAddress(new InetSocketTransportAddress(this.host, this.port));
    this.esclient = client;
    return client;
}
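As an aside on lifecycle (a hedged sketch, not from the original post): the 1.x TransportClient is thread-safe and holds thread pools and sockets, so it is normally built once, shared, and closed on shutdown rather than recreated per request.

import org.elasticsearch.client.transport.TransportClient;
import org.elasticsearch.common.settings.ImmutableSettings;
import org.elasticsearch.common.transport.InetSocketTransportAddress;

public class EsClientHolder {
    // One shared client for the whole application; host and port taken
    // from the configuration shown above.
    private static final TransportClient CLIENT = new TransportClient(
            ImmutableSettings.settingsBuilder().put("cluster.name", "cliffservice").build())
            .addTransportAddress(new InetSocketTransportAddress("10.100.10.231", 9300));

    public static TransportClient client() {
        return CLIENT;
    }

    // Call once on application shutdown to release sockets and thread pools.
    public static void shutdown() {
        CLIENT.close();
    }
}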
Getting data from the client:
SearchResponse response = Connection.getEsclient().prepareSearch("testindex")
        .setTypes("testtype").execute().actionGet();
if (response.getHits().getHits().length > 0) {
    for (SearchHit hit : response.getHits().getHits()) {
        CliffDomain cliffDomain = new CliffDomain();
        cliffDomain.setCliffDomainId(hit.getId());
        cliffDomain.setCliffDomainName((String) hit.sourceAsMap().get("clifftDomainName"));
        searchResponse.add(cliffDomain);
    }
}
What am I doing wrong?

Your client and service are in different clusters: the client is in assetservice and the server is in cliffservice. Also, make sure that the client can connect to port 9300 of 10.100.10.231.
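A quick way to verify that second point (a hedged sketch, not part of the original answer): probe the transport port with a plain socket. 9300 is the default transport port used by the TransportClient; 9200 serves only HTTP.

import java.net.InetSocketAddress;
import java.net.Socket;

public class TransportPortProbe {
    public static void main(String[] args) throws Exception {
        try (Socket socket = new Socket()) {
            // Fails fast if a firewall or bind address blocks the transport port.
            socket.connect(new InetSocketAddress("10.100.10.231", 9300), 3000);
            System.out.println("Transport port 9300 is reachable");
        }
    }
}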

Finally I figured out the problem. I was using elasticsearch-1.7.1 in my Java application. On my local computer Elasticsearch 1.7.1 is installed, but on the server the Elasticsearch version was 1.7.2. That was the problem. I downgraded the server's Elasticsearch to 1.7.1 and now it is working fine.
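For anyone debugging the same thing, a hedged sketch of a sanity check: with the 1.x TransportClient, connectedNodes() is empty in exactly the state that throws NoNodeAvailableException, and logging each node's version surfaces client/server mismatches like the 1.7.1-vs-1.7.2 one above.

import org.elasticsearch.client.transport.TransportClient;
import org.elasticsearch.cluster.node.DiscoveryNode;

public class ClusterSanityCheck {
    public static void check(TransportClient client) {
        if (client.connectedNodes().isEmpty()) {
            // This is the state that produces NoNodeAvailableException.
            System.err.println("No nodes connected - check cluster.name, host, port and versions");
        }
        for (DiscoveryNode node : client.connectedNodes()) {
            System.out.println(node.getHostAddress() + " runs " + node.getVersion());
        }
    }
}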

Related

Bulk Request failed in ElasticSinkConnector

I got the following error while creating the Elasticsearch sink connector.
CREATE SOURCE CONNECTOR testdemosinkconnector WITH(
"type.name"= '_doc',
"input.data.format"= 'AVRO',
"connector.class"= 'io.confluent.connect.elasticsearch.ElasticsearchSinkConnector',
"tasks.max"= '1',
"transforms"= 'Dealership',
"topics"= 'es.contact.model',
"transforms.Dealership.type"= 'io.confluent.connect.transforms.ExtractTopic$Value',
"transforms.Dealership.field"= 'indexTopicName',
"transforms.Dealership.skip.missing.or.null"= 'true',
"connection.url"= 'https://elasticsearchdemo.es.us-central1.gcp.cloud.es.io:9243',
"connection.username"= 'elastic',
"connection.password"= 'BUgBxOBg3dv4jp4Z3W7p4tHC',
"key.ignore"= 'true',
"value.converter"= 'io.confluent.connect.avro.AvroConverter',
"value.converter.schemas.enable"= 'true',
"value.converter.schema.registry.url"= 'http://localhost:8081',
"bulk.size.bytes"= '-1',
"behavior.on.null.values"= 'IGNORE',enter code here
"behavior.on.malformed.documents"= 'IGNORE',
"max.retries"= '5',
"retry.backoff.ms"= '5000'
);
The error is:
FAILED | org.apache.kafka.connect.errors.ConnectException: Exiting WorkerSinkTask due to unrecoverable exception.
at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:618)
at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:334)
at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:235)
at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:204)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:200)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:255)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: org.apache.kafka.connect.errors.ConnectException: Bulk request failed
at io.confluent.connect.elasticsearch.ElasticsearchClient$1.afterBulk(ElasticsearchClient.java:397)
at org.elasticsearch.action.bulk.BulkRequestHandler$1.onFailure(BulkRequestHandler.java:70)
at org.elasticsearch.action.ActionListener$5.onFailure(ActionListener.java:258)
at org.elasticsearch.action.bulk.Retry$RetryHandler.onFailure(Retry.java:126)
at io.confluent.connect.elasticsearch.ElasticsearchClient.lambda$null$1(ElasticsearchClient.java:174)
... 5 more
Caused by: org.apache.kafka.connect.errors.ConnectException: Failed to execute bulk request due to 'java.io.IOException: Unable to parse response body for Response{requestLine=POST /_bulk?timeout=1m HTTP/1.1, host=https://elasticsearchdemo.es.us-central1.gcp.cloud.es.io:9243, response=HTTP/1.1 200 OK}' after 6 attempt(s)
at io.confluent.connect.elasticsearch.RetryUtil.callWithRetries(RetryUtil.java:165)
at io.confluent.connect.elasticsearch.RetryUtil.callWithRetries(RetryUtil.java:119)
at io.confluent.connect.elasticsearch.ElasticsearchClient.callWithRetries(ElasticsearchClient.java:425)
at io.confluent.connect.elasticsearch.ElasticsearchClient.lambda$null$1(ElasticsearchClient.java:168)
... 5 more
Caused by: java.io.IOException: Unable to parse response body for Response{requestLine=POST /_bulk?timeout=1m HTTP/1.1, host=https://elasticsearchdemo.es.us-central1.gcp.cloud.es.io:9243, response=HTTP/1.1 200 OK}
at org.elasticsearch.client.RestHighLevelClient.internalPerformRequest(RestHighLevelClient.java:1632)
at org.elasticsearch.client.RestHighLevelClient.performRequest(RestHighLevelClient.java:1583)
at org.elasticsearch.client.RestHighLevelClient.performRequestAndParseEntity(RestHighLevelClient.java:1553)
at org.elasticsearch.client.RestHighLevelClient.bulk(RestHighLevelClient.java:533)
at io.confluent.connect.elasticsearch.ElasticsearchClient.lambda$null$0(ElasticsearchClient.java:170)
at io.confluent.connect.elasticsearch.RetryUtil.callWithRetries(RetryUtil.java:158)
... 8 more
Caused by: java.lang.NullPointerException
at java.base/java.util.Objects.requireNonNull(Objects.java:221)
at org.elasticsearch.action.DocWriteResponse.<init>(DocWriteResponse.java:127)
at org.elasticsearch.action.index.IndexResponse.<init>(IndexResponse.java:54)
at org.elasticsearch.action.index.IndexResponse.<init>(IndexResponse.java:39)
at org.elasticsearch.action.index.IndexResponse$Builder.build(IndexResponse.java:107)
at org.elasticsearch.action.index.IndexResponse$Builder.build(IndexResponse.java:104)
at org.elasticsearch.action.bulk.BulkItemResponse.fromXContent(BulkItemResponse.java:159)
at org.elasticsearch.action.bulk.BulkResponse.fromXContent(BulkResponse.java:196)
at org.elasticsearch.client.RestHighLevelClient.parseEntity(RestHighLevelClient.java:1892)
at org.elasticsearch.client.RestHighLevelClient.lambda$performRequestAndParseEntity$8(RestHighLevelClient.java:1554)
at org.elasticsearch.client.RestHighLevelClient.internalPerformRequest(RestHighLevelClient.java:1630)
... 13 more
Please help me to resolve this error.
Elasticsearch Sink Connector version: 11.1.10
Elasticsearch version: 8.2.2
Elasticsearch version 8 is not supported by Confluent Elasticsearch Sink Connector version 11.1.10, which is most likely why it can't parse the Elasticsearch response properly.
As of version 11.0.0, the connector uses the Elasticsearch High Level REST Client (version 7.0.1), which means only Elasticsearch 7.x is supported.
https://docs.confluent.io/kafka-connect-elasticsearch/current/overview.html
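One way to confirm the mismatch (a hedged sketch, not from the linked docs): read the version straight from the cluster root endpoint with plain JDK HTTP. The URL and username come from the connector config above; the password is a placeholder.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.Base64;

public class EsVersionCheck {
    public static void main(String[] args) throws Exception {
        URL url = new URL("https://elasticsearchdemo.es.us-central1.gcp.cloud.es.io:9243/");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        String auth = Base64.getEncoder().encodeToString("elastic:<password>".getBytes());
        conn.setRequestProperty("Authorization", "Basic " + auth);
        // The response body contains "version":{"number":"8.2.2",...} -
        // anything 8.x is outside what connector 11.1.10 supports.
        try (BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream()))) {
            in.lines().forEach(System.out::println);
        }
    }
}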

Debezium content based router setup unable to find DebeziumException

I'm trying to set up a Debezium source connector in Docker which uses content-based routing, and I'm following the official doc.
I have all the following dependencies in /kafka/connect/debezium-connector-mysql:
antlr4-runtime-4.7.2.jar
debezium-connector-mysql-1.0.3.Final.jar
debezium-core-1.0.3.Final.jar
debezium-ddl-parser-1.0.3.Final.jar
debezium-scripting-1.7.2.Final.jar
groovy-3.0.9.jar
groovy-json-3.0.9.jar
groovy-jsr223-3.0.9.jar
mysql-binlog-connector-java-0.19.1.jar
mysql-connector-java-8.0.16.jar
At the moment I'm trying to add the connector with the following transforms:
"transforms": "route",
"transforms.route.type": "io.debezium.transforms.ContentBasedRouter",
"transforms.route.language": "jsr223.groovy",
"transforms.route.topic.expression": "value.after.name == 'x' ? 'topic1' : null",
"transforms.route.null.handling.mode": "drop"
I get HTTP status 500 and this in the log:
javax.servlet.ServletException: org.glassfish.jersey.server.ContainerException: java.lang.NoClassDefFoundError: io/debezium/DebeziumException
at org.glassfish.jersey.servlet.WebComponent.serviceImpl(WebComponent.java:408)
at org.glassfish.jersey.servlet.WebComponent.service(WebComponent.java:346)
at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:365)
at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:318)
at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:205)
at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:852)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:544)
at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1581)
at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1307)
at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:482)
at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1549)
at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1204)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:221)
at org.eclipse.jetty.server.handler.StatisticsHandler.handle(StatisticsHandler.java:173)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
at org.eclipse.jetty.server.Server.handle(Server.java:494)
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:374)
at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:268)
at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:782)
at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:918)
at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: org.glassfish.jersey.server.ContainerException: java.lang.NoClassDefFoundError: io/debezium/DebeziumException
at org.glassfish.jersey.servlet.internal.ResponseWriter.rethrow(ResponseWriter.java:254)
at org.glassfish.jersey.servlet.internal.ResponseWriter.failure(ResponseWriter.java:236)
at org.glassfish.jersey.server.ServerRuntime$Responder.process(ServerRuntime.java:436)
at org.glassfish.jersey.server.ServerRuntime$1.run(ServerRuntime.java:261)
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:248)
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:244)
at org.glassfish.jersey.internal.Errors.process(Errors.java:292)
at org.glassfish.jersey.internal.Errors.process(Errors.java:274)
at org.glassfish.jersey.internal.Errors.process(Errors.java:244)
at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:265)
at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:232)
at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:679)
at org.glassfish.jersey.servlet.WebComponent.serviceImpl(WebComponent.java:392)
... 28 more
Caused by: java.lang.NoClassDefFoundError: io/debezium/DebeziumException
at java.base/java.lang.Class.forName0(Native Method)
at java.base/java.lang.Class.forName(Class.java:398)
at org.apache.kafka.common.config.ConfigDef.parseType(ConfigDef.java:719)
at org.apache.kafka.connect.runtime.ConnectorConfig.enrich(ConnectorConfig.java:308)
at org.apache.kafka.connect.runtime.AbstractHerder.validateConnectorConfig(AbstractHerder.java:302)
at org.apache.kafka.connect.runtime.distributed.DistributedHerder$6.call(DistributedHerder.java:745)
at org.apache.kafka.connect.runtime.distributed.DistributedHerder$6.call(DistributedHerder.java:742)
at org.apache.kafka.connect.runtime.distributed.DistributedHerder.tick(DistributedHerder.java:342)
at org.apache.kafka.connect.runtime.distributed.DistributedHerder.run(DistributedHerder.java:282)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
... 1 more
Caused by: java.lang.ClassNotFoundException: io.debezium.DebeziumException
at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:471)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:588)
at org.apache.kafka.connect.runtime.isolation.PluginClassLoader.loadClass(PluginClassLoader.java:104)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
... 14 more
I tried adding a connector without the transform (everything else unchanged) and it worked.
What am I doing wrong?
OK, so I solved it by starting from a clean environment, because I had probably been mixing too many versions. I made sure I was using Debezium 1.8 for everything, and maybe the weirdest thing is that I had to put all the Debezium jars in /kafka/libs on the Connect container, because for some reason it didn't work in /kafka/connect/debezium-connector-mysql.
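A hedged sketch of the check that would have caught this: the NoClassDefFoundError above means debezium-core was not visible to the classloader that loaded the SMT, and the following one-liner reproduces that when run against the same classpath.

public class DebeziumClasspathCheck {
    public static void main(String[] args) throws Exception {
        // Throws ClassNotFoundException when debezium-core is missing from
        // the classpath, mirroring the Connect worker's failure above.
        Class.forName("io.debezium.DebeziumException");
        System.out.println("io.debezium.DebeziumException is on the classpath");
    }
}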

Nutch with solr on https

Good morning,
I come to you because I have a problem with Nutch (1.14) and Solr (7.2): it works fine until I put SSL in place.
With Solr over HTTP, once the crawl is finished I execute this command:
bin/nutch index -Dsolr.server.url=http://127.0.0.1:8983/solr/CORENAME crawltest/crawldb/ -linkdb crawltest/linkdb/ crawltest/segments/* -filter -normalize -deleteGone
and it works very well.
However, once SSL is activated and the Solr server is on HTTPS, it is impossible to send the data to Solr.
I have added the following properties in nutch-site.xml:
<property>
  <name>solr.auth</name>
  <value>true</value>
</property>
<property>
  <name>solr.auth.username</name>
  <value>xxxx</value>
</property>
<property>
  <name>solr.auth.password</name>
  <value>xxxx</value>
</property>
<property>
  <name>solr.server.type</name>
  <value>https</value>
</property>
<property>
  <name>solr.server.url</name>
  <value>https://127.0.0.1:8983/solr/CORENAME</value>
</property>
But when I execute the previous command I get an error of this type:
Caused by: org.apache.solr.client.solrj.SolrServerException: IOException occured when talking to server at: https://127.0.0.1:8983/solr/CORENAME
and
caused by: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
and
Caused by: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
Did you succeed in sending data to an HTTPS Solr?
Thanks
EDIT
To fix these errors I followed the SSL procedure at https://lucene.apache.org/solr/guide/7_0/enabling-ssl.html and at the end executed this:
keytool -import -file /path/to/solr/solr-ssl.pem -alias solr_cert -keystore /path/to/java-cacert (jre/lib/security/cacerts)
The default password is changeit.
This advances things a little: once the certificate is imported into cacerts, I no longer get this error.
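A quick handshake test for this (a hedged sketch, not from the original post): if the keytool import worked, this prints a response code instead of throwing the PKIX SunCertPathBuilderException. Note that hostname verification can still fail if the certificate's CN does not cover 127.0.0.1.

import java.net.URL;
import javax.net.ssl.HttpsURLConnection;

public class SolrTlsCheck {
    public static void main(String[] args) throws Exception {
        HttpsURLConnection conn = (HttpsURLConnection)
                new URL("https://127.0.0.1:8983/solr/").openConnection();
        // Succeeds only when the JVM trusts the Solr certificate.
        System.out.println("HTTP " + conn.getResponseCode());
    }
}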
Still in the same context, after activating SSL and authentication on the Solr server, I use Nutch to crawl the URLs and send the data to Solr. Since the implementation of SSL I can no longer send data to Solr.
When I execute this:
bin/nutch index -Dsolr.server.url=https://localhost:8983/solr/CORE -Dsolr.auth=true -Dsolr.auth.username='solr' -Dsolr.auth.password='xxxx' crawltest/crawldb/ -linkdb crawltest/linkdb/ crawltest/segments/* -filter -normalize -deleteGone
I get the following two errors:
java.lang.Exception: org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at https://localhost:8983/solr/CORE: Expected mime type application/octet-stream but got text/html. <html>
<head>
<meta http-equiv="Content-Type" content="text/html;charset=utf-8"/>
<title>Error 401 Unauthorized</title>
</head>
<body><h2>HTTP ERROR 401</h2>
<p>Problem accessing /solr/CORE/update. Reason:
<pre> Unauthorized</pre></p>
</body>
</html>
at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:529)
Caused by: org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at https://localhost:8983/solr/CORE: Expected mime type application/octet-stream but got text/html. <html>
<head>
<meta http-equiv="Content-Type" content="text/html;charset=utf-8"/>
<title>Error 401 Unauthorized</title>
</head>
<body><h2>HTTP ERROR 401</h2>
<p>Problem accessing /solr/CORE/update. Reason:
<pre> Unauthorized</pre></p>
</body>
</html>
EDIT:
The first error is due to an authentication error.
After filling in the right values, I get a new error that I don't understand. Do you have any ideas?
2018-06-20 09:47:18,116 INFO regex.RegexURLNormalizer - can't find rules for scope 'indexer', using default
2018-06-20 09:47:19,151 INFO indexer.IndexWriters - Adding org.apache.nutch.indexwriter.solr.SolrIndexWriter
2018-06-20 09:47:19,194 INFO solr.SolrMappingReader - source: content dest: content
2018-06-20 09:47:19,194 INFO solr.SolrMappingReader - source: title dest: title
2018-06-20 09:47:19,194 INFO solr.SolrMappingReader - source: host dest: host
2018-06-20 09:47:19,194 INFO solr.SolrMappingReader - source: segment dest: segment
2018-06-20 09:47:19,194 INFO solr.SolrMappingReader - source: boost dest: boost
2018-06-20 09:47:19,195 INFO solr.SolrMappingReader - source: digest dest: digest
2018-06-20 09:47:19,195 INFO solr.SolrMappingReader - source: tstamp dest: tstamp
2018-06-20 09:47:19,525 INFO solr.SolrIndexWriter - Indexing 250/250 documents
2018-06-20 09:47:19,525 INFO solr.SolrIndexWriter - Deleting 0 documents
2018-06-20 09:47:19,808 INFO solr.SolrIndexWriter - Indexing 250/250 documents
2018-06-20 09:47:19,809 INFO solr.SolrIndexWriter - Deleting 0 documents
2018-06-20 09:47:19,951 WARN mapred.LocalJobRunner - job_local146539832_0001
java.lang.Exception: java.io.IOException
at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:529)
Caused by: java.io.IOException
at org.apache.nutch.indexwriter.solr.SolrIndexWriter.makeIOException(SolrIndexWriter.java:234)
at org.apache.nutch.indexwriter.solr.SolrIndexWriter.push(SolrIndexWriter.java:213)
at org.apache.nutch.indexwriter.solr.SolrIndexWriter.write(SolrIndexWriter.java:174)
at org.apache.nutch.indexer.IndexWriters.write(IndexWriters.java:87)
at org.apache.nutch.indexer.IndexerOutputFormat$1.write(IndexerOutputFormat.java:50)
at org.apache.nutch.indexer.IndexerOutputFormat$1.write(IndexerOutputFormat.java:41)
at org.apache.hadoop.mapred.ReduceTask$OldTrackingRecordWriter.write(ReduceTask.java:493)
at org.apache.hadoop.mapred.ReduceTask$3.collect(ReduceTask.java:422)
at org.apache.nutch.indexer.IndexerMapReduce.reduce(IndexerMapReduce.java:369)
at org.apache.nutch.indexer.IndexerMapReduce.reduce(IndexerMapReduce.java:57)
at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:444)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)
at org.apache.hadoop.mapred.LocalJobRunner$Job$ReduceTaskRunnable.run(LocalJobRunner.java:319)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.solr.client.solrj.SolrServerException: IOException occured when talking to server at: https://localhost:8983/solr/ESRF-EXTERNAL
at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:589)
at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:240)
at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:229)
at org.apache.solr.client.solrj.SolrClient.request(SolrClient.java:1219)
at org.apache.nutch.indexwriter.solr.SolrIndexWriter.push(SolrIndexWriter.java:210)
... 16 more
Caused by: java.net.SocketException: Broken pipe (Write failed)
at java.net.SocketOutputStream.socketWrite0(Native Method)
at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:111)
at java.net.SocketOutputStream.write(SocketOutputStream.java:155)
at sun.security.ssl.OutputRecord.writeBuffer(OutputRecord.java:431)
at sun.security.ssl.OutputRecord.write(OutputRecord.java:417)
at sun.security.ssl.SSLSocketImpl.writeRecordInternal(SSLSocketImpl.java:886)
at sun.security.ssl.SSLSocketImpl.writeRecord(SSLSocketImpl.java:857)
at sun.security.ssl.AppOutputStream.write(AppOutputStream.java:123)
at org.apache.http.impl.io.AbstractSessionOutputBuffer.write(AbstractSessionOutputBuffer.java:181)
at org.apache.http.impl.io.ContentLengthOutputStream.write(ContentLengthOutputStream.java:115)
at org.apache.http.entity.InputStreamEntity.writeTo(InputStreamEntity.java:146)
at org.apache.http.entity.HttpEntityWrapper.writeTo(HttpEntityWrapper.java:96)
at org.apache.http.impl.client.EntityEnclosingRequestWrapper$EntityWrapper.writeTo(EntityEnclosingRequestWrapper.java:112)
at org.apache.http.impl.entity.EntitySerializer.serialize(EntitySerializer.java:117)
at org.apache.http.impl.AbstractHttpClientConnection.sendRequestEntity(AbstractHttpClientConnection.java:265)
at org.apache.http.impl.conn.ManagedClientConnectionImpl.sendRequestEntity(ManagedClientConnectionImpl.java:203)
at org.apache.http.protocol.HttpRequestExecutor.doSendRequest(HttpRequestExecutor.java:237)
at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:122)
at org.apache.http.impl.client.DefaultRequestDirector.tryExecute(DefaultRequestDirector.java:685)
at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:487)
at org.apache.http.impl.client.AbstractHttpClient.doExecute(AbstractHttpClient.java:882)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:107)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:55)
at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:481)
... 20 more
2018-06-20 09:47:20,873 ERROR indexer.IndexingJob - Indexer: java.io.IOException: Job failed!
at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:873)
at org.apache.nutch.indexer.IndexingJob.index(IndexingJob.java:147)
at org.apache.nutch.indexer.IndexingJob.run(IndexingJob.java:230)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.nutch.indexer.IndexingJob.main(IndexingJob.java:239)
(This was followed by a thread dump from the Solr admin UI listing per-thread cpuTime/userTime for the Jetty qtp worker threads, the searcherExecutor, connection evictors, and JVM housekeeping threads.)
EDIT2
To test, I disabled authentication to see whether the problem came from HTTPS. Without authentication it works!
I tried changing the file and including the configuration in jetty-https.xml rather than jetty.xml.
I have two accounts configured like this:
<security-constraint>
  <web-resource-collection>
    <web-resource-name>Solr authenticated application</web-resource-name>
    <url-pattern>/</url-pattern>
  </web-resource-collection>
  <auth-constraint>
    <role-name>admin</role-name>
  </auth-constraint>
</security-constraint>
<login-config>
  <auth-method>BASIC</auth-method>
  <realm-name>Test Realm</realm-name>
</login-config>
security.json:
{
  "authentication":{
    "blockUnknown": true,
    "class":"solr.BasicAuthPlugin",
    "credentials":{"solr":"xxxx"}
  },
  "authorization":{
    "class":"solr.RuleBasedAuthorizationPlugin",
    "permissions":[{"name":"security-edit","role":"admin"}],
    "user-role":{"solr":"admin"}
  }
}
When I execute the following command:
bin/nutch index -Dsolr.server.url=https://localhost:8983/solr/MYCORE -Dsolr.auth=true -Dsolr.auth.username='admin' -Dsolr.auth.password='xxxx' crawltest/crawldb/ -linkdb crawltest/linkdb/ crawltest/segments/* -filter -normalize -deleteGone
I get this error:
java.lang.Exception: java.io.IOException
at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:529)
Caused by: java.io.IOException
at org.apache.nutch.indexwriter.solr.SolrIndexWriter.makeIOException(SolrIndexWriter.java:234)
at org.apache.nutch.indexwriter.solr.SolrIndexWriter.push(SolrIndexWriter.java:213)
at org.apache.nutch.indexwriter.solr.SolrIndexWriter.write(SolrIndexWriter.java:174)
at org.apache.nutch.indexer.IndexWriters.write(IndexWriters.java:87)
at org.apache.nutch.indexer.IndexerOutputFormat$1.write(IndexerOutputFormat.java:50)
at org.apache.nutch.indexer.IndexerOutputFormat$1.write(IndexerOutputFormat.java:41)
at org.apache.hadoop.mapred.ReduceTask$OldTrackingRecordWriter.write(ReduceTask.java:493)
at org.apache.hadoop.mapred.ReduceTask$3.collect(ReduceTask.java:422)
at org.apache.nutch.indexer.IndexerMapReduce.reduce(IndexerMapReduce.java:369)
at org.apache.nutch.indexer.IndexerMapReduce.reduce(IndexerMapReduce.java:57)
at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:444)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)
at org.apache.hadoop.mapred.LocalJobRunner$Job$ReduceTaskRunnable.run(LocalJobRunner.java:319)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.solr.client.solrj.SolrServerException: IOException occured when talking to server at: https://localhost:8983/solr/MYCORE
at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:589)
at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:240)
at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:229)
at org.apache.solr.client.solrj.SolrClient.request(SolrClient.java:1219)
at org.apache.nutch.indexwriter.solr.SolrIndexWriter.push(SolrIndexWriter.java:210)
... 16 more
Caused by: java.net.SocketException: Broken pipe (Write failed)
at java.net.SocketOutputStream.socketWrite0(Native Method)
at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:111)
at java.net.SocketOutputStream.write(SocketOutputStream.java:155)
at sun.security.ssl.OutputRecord.writeBuffer(OutputRecord.java:431)
at sun.security.ssl.OutputRecord.write(OutputRecord.java:417)
at sun.security.ssl.SSLSocketImpl.writeRecordInternal(SSLSocketImpl.java:886)
at sun.security.ssl.SSLSocketImpl.writeRecord(SSLSocketImpl.java:857)
at sun.security.ssl.AppOutputStream.write(AppOutputStream.java:123)
at org.apache.http.impl.io.AbstractSessionOutputBuffer.write(AbstractSessionOutputBuffer.java:181)
at org.apache.http.impl.io.ContentLengthOutputStream.write(ContentLengthOutputStream.java:115)
at org.apache.http.entity.InputStreamEntity.writeTo(InputStreamEntity.java:146)
at org.apache.http.entity.HttpEntityWrapper.writeTo(HttpEntityWrapper.java:96)
at org.apache.http.impl.client.EntityEnclosingRequestWrapper$EntityWrapper.writeTo(EntityEnclosingRequestWrapper.java:112)
at org.apache.http.impl.entity.EntitySerializer.serialize(EntitySerializer.java:117)
at org.apache.http.impl.AbstractHttpClientConnection.sendRequestEntity(AbstractHttpClientConnection.java:265)
at org.apache.http.impl.conn.ManagedClientConnectionImpl.sendRequestEntity(ManagedClientConnectionImpl.java:203)
at org.apache.http.protocol.HttpRequestExecutor.doSendRequest(HttpRequestExecutor.java:237)
at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:122)
at org.apache.http.impl.client.DefaultRequestDirector.tryExecute(DefaultRequestDirector.java:685)
at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:487)
at org.apache.http.impl.client.AbstractHttpClient.doExecute(AbstractHttpClient.java:882)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:107)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:55)
at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:481)
... 20 more
2018-06-25 09:38:41,870 ERROR indexer.IndexingJob - Indexer: java.io.IOException: Job failed!
at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:873)
at org.apache.nutch.indexer.IndexingJob.index(IndexingJob.java:147)
at org.apache.nutch.indexer.IndexingJob.run(IndexingJob.java:230)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.nutch.indexer.IndexingJob.main(IndexingJob.java:239)
And when I execute this:
bin/nutch index -Dsolr.server.url=https://localhost:8983/solr/MYCORE -Dsolr.auth=true -Dsolr.auth.username='solr' -Dsolr.auth.password='xxxxx' crawltest/crawldb/ -linkdb crawltest/linkdb/ crawltest/segments/* -filter -normalize -deleteGone
I get this error:
java.lang.Exception: org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at https://localhost:8983/solr/MYCORE: Expected mime type application/octet-stream but got text/html. <html>
<head>
<meta http-equiv="Content-Type" content="text/html;charset=utf-8"/>
<title>Error 401 Unauthorized</title>
</head>
<body><h2>HTTP ERROR 401</h2>
<p>Problem accessing /solr/MYCORE/update. Reason:
<pre> Unauthorized</pre></p>
</body>
</html>
at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:529)
Caused by: org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at https://localhost:8983/solr/MYCORE: Expected mime type application/octet-stream but got text/html. <html>
<head>
<meta http-equiv="Content-Type" content="text/html;charset=utf-8"/>
<title>Error 401 Unauthorized</title>
</head>
<body><h2>HTTP ERROR 401</h2>
<p>Problem accessing /solr/MYCORE/update. Reason:
<pre> Unauthorized</pre></p>
</body>
</html>
at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:544)
at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:240)
at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:229)
at org.apache.solr.client.solrj.SolrClient.request(SolrClient.java:1219)
at org.apache.nutch.indexwriter.solr.SolrIndexWriter.push(SolrIndexWriter.java:210)
at org.apache.nutch.indexwriter.solr.SolrIndexWriter.write(SolrIndexWriter.java:174)
at org.apache.nutch.indexer.IndexWriters.write(IndexWriters.java:87)
at org.apache.nutch.indexer.IndexerOutputFormat$1.write(IndexerOutputFormat.java:50)
at org.apache.nutch.indexer.IndexerOutputFormat$1.write(IndexerOutputFormat.java:41)
at org.apache.hadoop.mapred.ReduceTask$OldTrackingRecordWriter.write(ReduceTask.java:493)
at org.apache.hadoop.mapred.ReduceTask$3.collect(ReduceTask.java:422)
at org.apache.nutch.indexer.IndexerMapReduce.reduce(IndexerMapReduce.java:369)
at org.apache.nutch.indexer.IndexerMapReduce.reduce(IndexerMapReduce.java:57)
at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:444)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)
at org.apache.hadoop.mapred.LocalJobRunner$Job$ReduceTaskRunnable.run(LocalJobRunner.java:319)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
2018-06-25 09:45:20,106 ERROR indexer.IndexingJob - Indexer: java.io.IOException: Job failed!
at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:873)
at org.apache.nutch.indexer.IndexingJob.index(IndexingJob.java:147)
at org.apache.nutch.indexer.IndexingJob.run(IndexingJob.java:230)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.nutch.indexer.IndexingJob.main(IndexingJob.java:239)
Or now this:
java.lang.Exception: org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at https://127.0.0.1:8983/solr/MYCORE: Expected mime type application/octet-stream but got text/html. <html>
<head>
<meta http-equiv="Content-Type" content="text/html;charset=ISO-8859-1"/>
<title>Error 503 </title>
</head>
<body>
<h2>HTTP ERROR: 503</h2>
<p>Problem accessing /solr/MYCORE/update. Reason:
<pre> Service Unavailable</pre></p>
<hr />
</body>
</html>
at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:529)
Caused by: org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at https://127.0.0.1:8983/solr/MYCORE: Expected mime type application/octet-stream but got text/html. <html>
<head>
<meta http-equiv="Content-Type" content="text/html;charset=ISO-8859-1"/>
<title>Error 503 </title>
</head>
<body>
<h2>HTTP ERROR: 503</h2>
<p>Problem accessing /solr/MYCORE/update. Reason:
<pre> Service Unavailable</pre></p>
<hr />
</body>
</html>
Solr logs:
2018-06-25 14:18:44.352 INFO (main) [ ] o.e.j.s.Server jetty-9.3.20.v20170531
2018-06-25 14:18:44.597 WARN (main) [ ] o.e.j.w.WebAppContext Failed startup of context o.e.j.w.WebAppContext@5891e32e{/solr,file:///app/solr-7.2.1/server/solr-webapp/webapp/,UNAVAILABLE}{/app/solr-7.2.1/server/solr-webapp/webapp}
java.lang.IllegalStateException: No LoginService for org.eclipse.jetty.security.authentication.BasicAuthenticator@64c87930 in org.eclipse.jetty.security.ConstraintSecurityHandler@400cff1a
at org.eclipse.jetty.security.authentication.LoginAuthenticator.setConfiguration(LoginAuthenticator.java:76)
at org.eclipse.jetty.security.SecurityHandler.doStart(SecurityHandler.java:354)
at org.eclipse.jetty.security.ConstraintSecurityHandler.doStart(ConstraintSecurityHandler.java:448)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:131)
at org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:105)
at org.eclipse.jetty.server.handler.AbstractHandler.doStart(AbstractHandler.java:61)
at org.eclipse.jetty.server.handler.ScopedHandler.doStart(ScopedHandler.java:120)
at org.eclipse.jetty.server.session.SessionHandler.doStart(SessionHandler.java:116)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:131)
at org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:105)
at org.eclipse.jetty.server.handler.AbstractHandler.doStart(AbstractHandler.java:61)
at org.eclipse.jetty.server.handler.ScopedHandler.doStart(ScopedHandler.java:120)
at org.eclipse.jetty.server.handler.ContextHandler.startContext(ContextHandler.java:809)
at org.eclipse.jetty.servlet.ServletContextHandler.startContext(ServletContextHandler.java:345)
at org.eclipse.jetty.webapp.WebAppContext.startWebapp(WebAppContext.java:1406)
at org.eclipse.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1368)
at org.eclipse.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:778)
at org.eclipse.jetty.servlet.ServletContextHandler.doStart(ServletContextHandler.java:262)
at org.eclipse.jetty.webapp.WebAppContext.doStart(WebAppContext.java:522)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.deploy.bindings.StandardStarter.processBinding(StandardStarter.java:41)
at org.eclipse.jetty.deploy.AppLifeCycle.runBindings(AppLifeCycle.java:188)
at org.eclipse.jetty.deploy.DeploymentManager.requestAppGoal(DeploymentManager.java:499)
at org.eclipse.jetty.deploy.DeploymentManager.addApp(DeploymentManager.java:147)
at org.eclipse.jetty.deploy.providers.ScanningAppProvider.fileAdded(ScanningAppProvider.java:180)
at org.eclipse.jetty.deploy.providers.WebAppProvider.fileAdded(WebAppProvider.java:458)
at org.eclipse.jetty.deploy.providers.ScanningAppProvider$1.fileAdded(ScanningAppProvider.java:64)
at org.eclipse.jetty.util.Scanner.reportAddition(Scanner.java:610)
at org.eclipse.jetty.util.Scanner.reportDifferences(Scanner.java:529)
at org.eclipse.jetty.util.Scanner.scan(Scanner.java:392)
at org.eclipse.jetty.util.Scanner.doStart(Scanner.java:313)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.deploy.providers.ScanningAppProvider.doStart(ScanningAppProvider.java:150)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.deploy.DeploymentManager.startAppProvider(DeploymentManager.java:561)
at org.eclipse.jetty.deploy.DeploymentManager.doStart(DeploymentManager.java:236)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:131)
at org.eclipse.jetty.server.Server.start(Server.java:422)
at org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:113)
at org.eclipse.jetty.server.handler.AbstractHandler.doStart(AbstractHandler.java:61)
at org.eclipse.jetty.server.Server.doStart(Server.java:389)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.xml.XmlConfiguration$1.run(XmlConfiguration.java:1520)
at java.security.AccessController.doPrivileged(Native Method)
at org.eclipse.jetty.xml.XmlConfiguration.main(XmlConfiguration.java:1442)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.eclipse.jetty.start.Main.invokeMain(Main.java:215)
at org.eclipse.jetty.start.Main.start(Main.java:458)
at org.eclipse.jetty.start.Main.main(Main.java:76)
2018-06-25 14:18:44.745 INFO (main) [ ] o.e.j.s.Server Started @799ms
I solved "No LoginService" by moving security.json to /var/solr/data/ (SOLR_HOME).
EDIT3:
Now I only get an error message "No allow" when I want to send Nutch data to Solr. Also, I can no longer connect to the admin interface; I get the same error. I think it comes from the security.json file:
{
  "authentication":{
    "class":"solr.BasicAuthPlugin",
    "credentials":{"solr":"xxxxxx"}
  },
  "authorization":{
    "class":"solr.RuleBasedAuthorizationPlugin",
    "permissions":[{"name":"security-edit","role":"adminRole"},{"name":"collection-admin-edit","role":"adminRole"},{"name":"update","role":"adminRole"},{"name":"all","role":"adminRole"},{"name":"core-admin-edit","role":"adminRole"},{"name":"read","role":"adminRole"},{"name":"config-edit","role":"adminRole"},{"name":"core-admin-read","role":"adminRole"},{"name":"core-admin-read","role":"adminRole"}],
    "user-role":{"solr":"adminRole"}
  }
}
What did I do wrong? Thanks.
I added a new answer because the previous one was too long.
SOLVED API AUTHENTICATION, BUT NOT WITH NUTCH:
To make authentication work via the API, I deleted the configuration made in jetty-https.xml and webdefault.xml, and I deleted the realm.properties file as well as the basic auth options in solr.in.sh.
I now work only with the security.json file in SOLR_HOME.
In fact the biggest problem was that I was not testing the connection with an encrypted (hashed) password, and without it connecting is impossible. On the other hand, I still have the problem via Nutch, which is not allowed.
Here is the security.json file:
{
  "authentication":{
    "class":"solr.BasicAuthPlugin",
    "credentials":{
      "solr":"hzMjhfgN4b9X8KR0QgLB2Um3cUzqDzJygtEBL/O7g5E= CkP7HyXjYvqKNF3F4hBjnVvKGQOkLc/ta4FaNIkqgII="
    }
  },
  "authorization":{
    "class":"solr.RuleBasedAuthorizationPlugin",
    "permissions":[
      {
        "name":"security-edit",
        "role":"adminRole"
      },
      {
        "name":"collection-admin-edit",
        "role":"adminRole"
      },
      {
        "name":"update",
        "role":"adminRole"
      },
      {
        "name":"config-edit",
        "role":"adminRole"
      },
      {
        "name":"core-admin-edit",
        "role":"adminRole"
      },
      {
        "name":"core-admin-read",
        "role":"adminRole"
      },
      {
        "name":"schema-edit",
        "role":"adminRole"
      },
      {
        "name":"all",
        "role":"adminRole"
      }
    ],
    "user-role":{
      "solr":"adminRole"
    }
  }
}
The hashed password corresponds to the value "test".
To check the file's syntax I suggest http://json.parser.online.fr/
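An offline alternative (a hedged sketch, assuming Jackson is on the classpath): any strict JSON parser catches the kind of missing comma or brace shown earlier.

import java.io.File;
import com.fasterxml.jackson.databind.ObjectMapper;

public class SecurityJsonCheck {
    public static void main(String[] args) throws Exception {
        // readTree throws a JsonProcessingException on any syntax error.
        new ObjectMapper().readTree(new File("security.json"));
        System.out.println("security.json parses as valid JSON");
    }
}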
What could I have missed to be able to update Solr with Nutch?
SOLVED
Add the /dataimport path to the update role:
{
"name":"update",
"path":"/dataimport",
"role":"adminRole"
},
But now, although I can index Nutch data to Solr, I have a new error when crawling...
Thu Jun 28 09:21:03 CEST 2018 : Iteration 2 of 5
Generating a new segment
/app/nutch-external/bin/nutch generate -D mapreduce.job.reduces=2 -D mapred.child.java.opts=-Xmx1000m -D mapreduce.reduce.speculative=false -D mapreduce.map.speculative=false -D mapreduce.map.output.compress=true crawl//crawldb crawl//segments -topN 50000 -numFetchers 1 -noFilter
Generator: starting at 2018-06-28 09:21:04
Generator: Selecting best-scoring urls due for fetch.
Generator: filtering: false
Generator: normalizing: true
Generator: topN: 50000
Generator: 0 records selected for fetching, exiting ...
Generate returned 1 (no new segments created)
Escaping loop: no more URLs to fetch now
And during the first iteration I get these errors:
Authorization challenge processed
No form element found with 'id' = adminRole, trying 'name'.
No form element found with 'id' = adminRole, trying 'name'.
No form element found with 'name' = adminRole
No form element found with 'name' = adminRole
Supported authentication schemes in the order of preference: [ntlm, digest, basic]
Supported authentication schemes in the order of preference: [ntlm, digest, basic]
Challenge for ntlm authentication scheme not available
Challenge for ntlm authentication scheme not available
Challenge for digest authentication scheme not available
basic authentication scheme selected
Using authentication scheme: basic
Authorization challenge processed
No form element found with 'id' = adminRole, trying 'name'.
No form element found with 'name' = adminRole
Failed to get protocol output
java.lang.RuntimeException: java.lang.IllegalArgumentException: No form exists: adminRole
at org.apache.nutch.protocol.httpclient.Http.resolveCredentials(Http.java:506)
at org.apache.nutch.protocol.httpclient.Http.getResponse(Http.java:183)
at org.apache.nutch.protocol.http.api.HttpBase.getProtocolOutput(HttpBase.java:276)
at org.apache.nutch.fetcher.FetcherThread.run(FetcherThread.java:342)
Caused by: java.lang.IllegalArgumentException: No form exists: adminRole
at org.apache.nutch.protocol.httpclient.HttpFormAuthentication.getLoginFormParams(HttpFormAuthentication.java:219)
at org.apache.nutch.protocol.httpclient.HttpFormAuthentication.login(HttpFormAuthentication.java:95)
at org.apache.nutch.protocol.httpclient.Http.resolveCredentials(Http.java:504)
... 3 more
Challenge for digest authentication scheme not available
basic authentication scheme selected
Using authentication scheme: basic
Authorization challenge processed
No form element found with 'id' = adminRole, trying 'name'.
It's still the security.json file. Do you have any ideas? Thanks.
SOLVED
I solved this by configuring httpclient-auth.xml in /nutch/conf/:
<auth-configuration>
  <credentials username="solr" password="xxxxx">
    <authscope host="localhost" port="8983"/>
  </credentials>
</auth-configuration>
Thanks for your help

Spring Integration RSS Feed - 403 Error

I am trying to use Spring Integration to poll RSS feeds, and I'm using an SO user feed URL as an example. However, when the application runs I get the following error:
Disconnected from the target VM, address: '127.0.0.1:58603', transport: 'socket'
2016-03-04 19:34:36.252 ERROR 2345 --- [ask-scheduler-4] o.s.integration.handler.LoggingHandler : org.springframework.messaging.MessagingException: Failed to retrieve feed at url 'http://stackoverflow.com/feeds/user/813852'; nested exception is com.rometools.fetcher.FetcherException: Authentication required for that resource. HTTP Response code was:403
at org.springframework.integration.feed.inbound.FeedEntryMessageSource.getFeed(FeedEntryMessageSource.java:216)
at org.springframework.integration.feed.inbound.FeedEntryMessageSource.populateEntryList(FeedEntryMessageSource.java:182)
at org.springframework.integration.feed.inbound.FeedEntryMessageSource.doReceive(FeedEntryMessageSource.java:157)
at org.springframework.integration.feed.inbound.FeedEntryMessageSource.receive(FeedEntryMessageSource.java:122)
at org.springframework.integration.endpoint.SourcePollingChannelAdapter.receiveMessage(SourcePollingChannelAdapter.java:175)
at org.springframework.integration.endpoint.AbstractPollingEndpoint.doPoll(AbstractPollingEndpoint.java:224)
at org.springframework.integration.endpoint.AbstractPollingEndpoint.access$000(AbstractPollingEndpoint.java:57)
at org.springframework.integration.endpoint.AbstractPollingEndpoint$1.call(AbstractPollingEndpoint.java:176)
at org.springframework.integration.endpoint.AbstractPollingEndpoint$1.call(AbstractPollingEndpoint.java:173)
at org.springframework.integration.endpoint.AbstractPollingEndpoint$Poller$1.run(AbstractPollingEndpoint.java:330)
at org.springframework.integration.util.ErrorHandlingTaskExecutor$1.run(ErrorHandlingTaskExecutor.java:55)
at org.springframework.core.task.SyncTaskExecutor.execute(SyncTaskExecutor.java:50)
at org.springframework.integration.util.ErrorHandlingTaskExecutor.execute(ErrorHandlingTaskExecutor.java:51)
at org.springframework.integration.endpoint.AbstractPollingEndpoint$Poller.run(AbstractPollingEndpoint.java:324)
at org.springframework.scheduling.support.DelegatingErrorHandlingRunnable.run(DelegatingErrorHandlingRunnable.java:54)
at org.springframework.scheduling.concurrent.ReschedulingRunnable.run(ReschedulingRunnable.java:81)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: com.rometools.fetcher.FetcherException: Authentication required for that resource. HTTP Response code was:403
at com.rometools.fetcher.impl.AbstractFeedFetcher.throwAuthenticationError(AbstractFeedFetcher.java:183)
at com.rometools.fetcher.impl.AbstractFeedFetcher.handleErrorCodes(AbstractFeedFetcher.java:170)
at com.rometools.fetcher.impl.HttpURLFeedFetcher.retrieveAndCacheFeed(HttpURLFeedFetcher.java:186)
at com.rometools.fetcher.impl.HttpURLFeedFetcher.retrieveFeed(HttpURLFeedFetcher.java:140)
at com.rometools.fetcher.impl.HttpURLFeedFetcher.retrieveFeed(HttpURLFeedFetcher.java:99)
at org.springframework.integration.feed.inbound.FeedEntryMessageSource.getFeed(FeedEntryMessageSource.java:204)
... 22 more
The URL loads fine in the browser (even in incognito mode), so the error message is probably misleading. I also downloaded an RSS reader and the feed worked fine.
This is how I configured the feed:
<int-feed:inbound-channel-adapter id="stackOverflow"
channel="log"
url="http://stackoverflow.com/feeds/user/813852">
<int:poller fixed-rate="10000" max-messages-per-poll="100" />
</int-feed:inbound-channel-adapter>
<int:logging-channel-adapter id="log" level="DEBUG" />
I also tried another RSS feed URL and it worked. Any ideas what could be wrong?
I just ran a Wireshark trace and it looks like Stack Overflow doesn't like the Java user agent:
User-Agent: Java/1.8.0_66\r\n
...
<p>The owner of this website (stackoverflow.com) has banned your access based
on your browser's signature (27e797571e1923de-ua21).</p>\n
Setting a custom user agent gets around this restriction:
import com.rometools.fetcher.impl.HttpURLFeedFetcher;

public class CustomFeedFetcher extends HttpURLFeedFetcher {
    @Override
    public void setUserAgent(String s) {
        // Ignore the supplied agent and always report a custom one.
        super.setUserAgent("AgentName");
    }
}
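A hedged variant without the subclass: rome-fetcher's AbstractFeedFetcher already exposes setUserAgent, so constructing the fetcher yourself and setting the agent directly also works; "my-rss-reader/1.0" is a placeholder string.

import java.net.URL;
import com.rometools.fetcher.impl.HttpURLFeedFetcher;
import com.rometools.rome.feed.synd.SyndFeed;

public class FeedUserAgentDemo {
    public static void main(String[] args) throws Exception {
        HttpURLFeedFetcher fetcher = new HttpURLFeedFetcher();
        // Any non-default agent avoids the Java/1.8.0_66 signature ban.
        fetcher.setUserAgent("my-rss-reader/1.0");
        SyndFeed feed = fetcher.retrieveFeed(new URL("http://stackoverflow.com/feeds/user/813852"));
        System.out.println(feed.getTitle());
    }
}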

Using JMeter with service integration bus (SI bus) in Websphere

I'm trying to use JMS Point-to-Point in JMeter to send messages to an SI bus queue in WebSphere.
I'm fairly certain I have the following correct:
QueueConnection Factory
JNDI name Request queue
JNDI Properties, Initial Context Factory set to com.ibm.websphere.naming.WsnInitialContextFactory
I have guessed the URL:
iiop://ipaddressofWebSphereServer:2809
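For context, at thread start the sampler builds a JNDI InitialContext from these properties and looks up the connection factory (this is the InitialContext.lookup frame in the traces below). A rough standalone equivalent, with a hypothetical JNDI name jms/MyQCF standing in for the real one:

import java.util.Hashtable;
import javax.jms.QueueConnectionFactory;
import javax.naming.Context;
import javax.naming.InitialContext;

public class SiBusLookupCheck {
    public static void main(String[] args) throws Exception {
        Hashtable<String, String> env = new Hashtable<>();
        env.put(Context.INITIAL_CONTEXT_FACTORY,
                "com.ibm.websphere.naming.WsnInitialContextFactory");
        // The host here must actually resolve; leaving a placeholder like
        // "ipaddressofWebSphereServer" in place fails with the same
        // UnknownHostException shown in the traces below.
        env.put(Context.PROVIDER_URL, "iiop://ipaddressofWebSphereServer:2809");
        Context ctx = new InitialContext(env);
        // "jms/MyQCF" is a made-up name; substitute the factory's real JNDI name.
        QueueConnectionFactory qcf = (QueueConnectionFactory) ctx.lookup("jms/MyQCF");
        System.out.println("Looked up: " + qcf);
    }
}

(Running this needs the WebSphere thin client jars on the classpath.)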
When I run the sampler I get the following error in the jmeter.log:
2013/03/14 09:09:01 ERROR - jmeter.protocol.jms.sampler.JMSSampler: Error getting WsnNameService properties javax.naming.NamingException: Error getting WsnNameService properties [Root exception is org.omg.CORBA.TRANSIENT: initial and forwarded IOR inaccessible vmcid: IBM minor code: E07 completed: No]
at com.ibm.ws.naming.util.WsnInitCtxFactory.mergeWsnNSProperties(WsnInitCtxFactory.java:1444)
at com.ibm.ws.naming.util.WsnInitCtxFactory.getRootContextFromServer(WsnInitCtxFactory.java:951)
at com.ibm.ws.naming.util.WsnInitCtxFactory.getRootJndiContext(WsnInitCtxFactory.java:866)
at com.ibm.ws.naming.util.WsnInitCtxFactory.getInitialContextInternal(WsnInitCtxFactory.java:546)
at com.ibm.ws.naming.util.WsnInitCtx.getContext(WsnInitCtx.java:123)
at com.ibm.ws.naming.util.WsnInitCtx.getContextIfNull(WsnInitCtx.java:798)
at com.ibm.ws.naming.util.WsnInitCtx.lookup(WsnInitCtx.java:164)
at com.ibm.ws.naming.util.WsnInitCtx.lookup(WsnInitCtx.java:179)
at javax.naming.InitialContext.lookup(InitialContext.java:436)
at org.apache.jmeter.protocol.jms.sampler.JMSSampler.threadStarted(JMSSampler.java:303)
at org.apache.jmeter.threads.JMeterThread$ThreadListenerTraverser.addNode(JMeterThread.java:597)
at org.apache.jorphan.collections.HashTree.traverseInto(HashTree.java:1001)
at org.apache.jorphan.collections.HashTree.traverseInto(HashTree.java:1002)
at org.apache.jorphan.collections.HashTree.traverse(HashTree.java:986)
at org.apache.jmeter.threads.JMeterThread.threadStarted(JMeterThread.java:566)
at org.apache.jmeter.threads.JMeterThread.initRun(JMeterThread.java:554)
at org.apache.jmeter.threads.JMeterThread.run(JMeterThread.java:252)
at java.lang.Thread.run(Thread.java:736)
Caused by: org.omg.CORBA.TRANSIENT: initial and forwarded IOR inaccessible vmcid: IBM minor code: E07 completed: No
at com.ibm.rmi.corba.ClientDelegate.createRequest(ClientDelegate.java:1270)
at com.ibm.CORBA.iiop.ClientDelegate.createRequest(ClientDelegate.java:1330)
at com.ibm.rmi.corba.ClientDelegate.createRequest(ClientDelegate.java:1158)
at com.ibm.CORBA.iiop.ClientDelegate.createRequest(ClientDelegate.java:1296)
at com.ibm.rmi.corba.ClientDelegate.request(ClientDelegate.java:1877)
at com.ibm.CORBA.iiop.ClientDelegate.request(ClientDelegate.java:1252)
at org.omg.CORBA.portable.ObjectImpl._request(ObjectImpl.java:458)
at com.ibm.WsnBootstrap._WsnNameServiceStub.getProperties(_WsnNameServiceStub.java:38)
at com.ibm.ws.naming.util.WsnInitCtxFactory.mergeWsnNSProperties(WsnInitCtxFactory.java:1441)
... 17 more
Caused by: java.net.UnknownHostException: ipaddressofWebSphereServer
at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:207)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:377)
at java.net.Socket.connect(Socket.java:539)
at java.net.Socket.connect(Socket.java:488)
at java.net.Socket.<init>(Socket.java:385)
at java.net.Socket.<init>(Socket.java:199)
at com.ibm.ws.orbimpl.transport.WSTCPTransportConnection.createSocket(WSTCPTransportConnection.java:270)
at com.ibm.CORBA.transport.TransportConnectionBase.connect(TransportConnectionBase.java:354)
at com.ibm.ws.orbimpl.transport.WSTransport.getConnection(WSTransport.java:436)
at com.ibm.CORBA.transport.TransportBase.getConnection(TransportBase.java:187)
at com.ibm.rmi.iiop.TransportManager.get(TransportManager.java:89)
at com.ibm.rmi.iiop.GIOPImpl.getConnection(GIOPImpl.java:130)
at com.ibm.rmi.iiop.GIOPImpl.locate(GIOPImpl.java:219)
at com.ibm.rmi.corba.ClientDelegate.locate(ClientDelegate.java:1974)
at com.ibm.rmi.corba.ClientDelegate._createRequest(ClientDelegate.java:1999)
at com.ibm.rmi.corba.ClientDelegate.createRequest(ClientDelegate.java:1180)
at com.ibm.rmi.corba.ClientDelegate.createRequest(ClientDelegate.java:1266)
... 25 more
2013/03/14 09:09:01 WARN - jmeter.protocol.jms.sampler.JMSSampler: Session may not be null while creating message java.lang.IllegalStateException: Session may not be null while creating message
at org.apache.jmeter.protocol.jms.sampler.JMSSampler.createMessage(JMSSampler.java:179)
at org.apache.jmeter.protocol.jms.sampler.JMSSampler.sample(JMSSampler.java:140)
at org.apache.jmeter.threads.JMeterThread.process_sampler(JMeterThread.java:428)
at org.apache.jmeter.threads.JMeterThread.run(JMeterThread.java:256)
at java.lang.Thread.run(Thread.java:736)
And I get this error in the orbtrc log:
09:09:01.878 com.ibm.ws.orbimpl.transport.WSTCPTransportConnection connect:403 Thread Group 1-1 ORBRas[default] java.net.UnknownHostException: ipaddressofWebSphereServer
at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:207)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:377)
at java.net.Socket.connect(Socket.java:539)
at java.net.Socket.connect(Socket.java:488)
at java.net.Socket.<init>(Socket.java:385)
at java.net.Socket.<init>(Socket.java:199)
at com.ibm.ws.orbimpl.transport.WSTCPTransportConnection.createSocket(WSTCPTransportConnection.java:270)
at com.ibm.CORBA.transport.TransportConnectionBase.connect(TransportConnectionBase.java:354)
at com.ibm.ws.orbimpl.transport.WSTransport.getConnection(WSTransport.java:436)
at com.ibm.CORBA.transport.TransportBase.getConnection(TransportBase.java:187)
at com.ibm.rmi.iiop.TransportManager.get(TransportManager.java:89)
at com.ibm.rmi.iiop.GIOPImpl.getConnection(GIOPImpl.java:130)
at com.ibm.rmi.iiop.GIOPImpl.locate(GIOPImpl.java:219)
at com.ibm.rmi.corba.ClientDelegate.locate(ClientDelegate.java:1974)
at com.ibm.rmi.corba.ClientDelegate._createRequest(ClientDelegate.java:1999)
at com.ibm.rmi.corba.ClientDelegate.createRequest(ClientDelegate.java:1180)
at com.ibm.rmi.corba.ClientDelegate.createRequest(ClientDelegate.java:1266)
at com.ibm.CORBA.iiop.ClientDelegate.createRequest(ClientDelegate.java:1330)
at com.ibm.rmi.corba.ClientDelegate.createRequest(ClientDelegate.java:1158)
at com.ibm.CORBA.iiop.ClientDelegate.createRequest(ClientDelegate.java:1296)
at com.ibm.rmi.corba.ClientDelegate.request(ClientDelegate.java:1877)
at com.ibm.CORBA.iiop.ClientDelegate.request(ClientDelegate.java:1252)
at org.omg.CORBA.portable.ObjectImpl._request(ObjectImpl.java:458)
at com.ibm.WsnBootstrap._WsnNameServiceStub.getProperties(_WsnNameServiceStub.java:38)
at com.ibm.ws.naming.util.WsnInitCtxFactory.mergeWsnNSProperties(WsnInitCtxFactory.java:1441)
at com.ibm.ws.naming.util.WsnInitCtxFactory.getRootContextFromServer(WsnInitCtxFactory.java:951)
at com.ibm.ws.naming.util.WsnInitCtxFactory.getRootJndiContext(WsnInitCtxFactory.java:866)
at com.ibm.ws.naming.util.WsnInitCtxFactory.getInitialContextInternal(WsnInitCtxFactory.java:546)
at com.ibm.ws.naming.util.WsnInitCtx.getContext(WsnInitCtx.java:123)
at com.ibm.ws.naming.util.WsnInitCtx.getContextIfNull(WsnInitCtx.java:798)
at com.ibm.ws.naming.util.WsnInitCtx.lookup(WsnInitCtx.java:164)
at com.ibm.ws.naming.util.WsnInitCtx.lookup(WsnInitCtx.java:179)
at javax.naming.InitialContext.lookup(InitialContext.java:436)
at org.apache.jmeter.protocol.jms.sampler.JMSSampler.threadStarted(JMSSampler.java:303)
at org.apache.jmeter.threads.JMeterThread$ThreadListenerTraverser.addNode(JMeterThread.java:597)
at org.apache.jorphan.collections.HashTree.traverseInto(HashTree.java:1001)
at org.apache.jorphan.collections.HashTree.traverseInto(HashTree.java:1002)
at org.apache.jorphan.collections.HashTree.traverse(HashTree.java:986)
at org.apache.jmeter.threads.JMeterThread.threadStarted(JMeterThread.java:566)
at org.apache.jmeter.threads.JMeterThread.initRun(JMeterThread.java:554)
at org.apache.jmeter.threads.JMeterThread.run(JMeterThread.java:252)
at java.lang.Thread.run(Thread.java:736)
