Nutch with Solr over HTTPS

Good morning,
I come to you because I have a problem with Nutch (1.14) and Solr (7.2): everything works fine until I put SSL in place.
With Solr over HTTP, once the crawl is finished I execute this command:
bin/nutch index -Dsolr.server.url=http://127.0.0.1:8983/solr/CORENAME crawltest/crawldb/ -linkdb crawltest/linkdb/ crawltest/segments/* -filter -normalize -deleteGone
And it works very well.
However, once SSL is activated and the Solr server is served over HTTPS, it is impossible to send the data to Solr.
I have added the following properties in nutch-site.xml:
<property>
  <name>solr.auth</name>
  <value>true</value>
</property>
<property>
  <name>solr.auth.username</name>
  <value>xxxx</value>
</property>
<property>
  <name>solr.auth.password</name>
  <value>xxxx</value>
</property>
<property>
  <name>solr.server.type</name>
  <value>https</value>
</property>
<property>
  <name>solr.server.url</name>
  <value>https://127.0.0.1:8983/solr/CORENAME</value>
</property>
But when I execute the previous command I get errors of this type:
Caused by: org.apache.solr.client.solrj.SolrServerException: IOException occured when talking to server at: https://127.0.0.1:8983/solr/CORENAME
&
caused by: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
&
Caused by: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
Has anyone succeeded in sending data to Solr over HTTPS?
Thanks
EDIT
To fix these errors I followed the SSL procedure at https://lucene.apache.org/solr/guide/7_0/enabling-ssl.html
and at the end executed this:
keytool -import -file /path/to/solr/solr-ssl.pem -alias solr_cert -keystore /path/to/java-cacert (jre/lib/security/cacerts)
(the default password is changeit)
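As a sanity check, the import can be verified with keytool, using the same alias and the default keystore password as above:
keytool -list -keystore /path/to/java-cacert -alias solr_cert -storepass changeit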

That got me a little further: once the certificate is imported into cacerts, I no longer get this error.
Still in the same context: after activating SSL and authentication on the Solr server, I use Nutch to crawl the URLs and send the data to Solr. Since the switch to SSL I can no longer send data to Solr.
When I execute this:
bin/nutch index -Dsolr.server.url=https://localhost:8983/solr/CORE -Dsolr.auth=true -Dsolr.auth.username='solr' -Dsolr.auth.password='xxxx' crawltest/crawldb/ -linkdb crawltest/linkdb/ crawltest/segments/* -filter -normalize -deleteGone
I get the following errors:
java.lang.Exception: org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at https://localhost:8983/solr/CORE: Expected mime type application/octet-stream but got text/html. <html>
<head>
<meta http-equiv="Content-Type" content="text/html;charset=utf-8"/>
<title>Error 401 Unauthorized</title>
</head>
<body><h2>HTTP ERROR 401</h2>
<p>Problem accessing /solr/CORE/update. Reason:
<pre> Unauthorized</pre></p>
</body>
</html>
at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:529)
Caused by: org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at https://localhost:8983/solr/CORE: Expected mime type application/octet-stream but got text/html. <html>
<head>
<meta http-equiv="Content-Type" content="text/html;charset=utf-8"/>
<title>Error 401 Unauthorized</title>
</head>
<body><h2>HTTP ERROR 401</h2>
<p>Problem accessing /solr/CORE/update. Reason:
<pre> Unauthorized</pre></p>
</body>
</html>
EDIT:
The first error was due to an authentication error.
After filling in the right values, I get a new error that I don't understand. Do you have any ideas?
2018-06-20 09:47:18,116 INFO regex.RegexURLNormalizer - can't find rules for scope 'indexer', using default
2018-06-20 09:47:19,151 INFO indexer.IndexWriters - Adding org.apache.nutch.indexwriter.solr.SolrIndexWriter
2018-06-20 09:47:19,194 INFO solr.SolrMappingReader - source: content dest: content
2018-06-20 09:47:19,194 INFO solr.SolrMappingReader - source: title dest: title
2018-06-20 09:47:19,194 INFO solr.SolrMappingReader - source: host dest: host
2018-06-20 09:47:19,194 INFO solr.SolrMappingReader - source: segment dest: segment
2018-06-20 09:47:19,194 INFO solr.SolrMappingReader - source: boost dest: boost
2018-06-20 09:47:19,195 INFO solr.SolrMappingReader - source: digest dest: digest
2018-06-20 09:47:19,195 INFO solr.SolrMappingReader - source: tstamp dest: tstamp
2018-06-20 09:47:19,525 INFO solr.SolrIndexWriter - Indexing 250/250 documents
2018-06-20 09:47:19,525 INFO solr.SolrIndexWriter - Deleting 0 documents
2018-06-20 09:47:19,808 INFO solr.SolrIndexWriter - Indexing 250/250 documents
2018-06-20 09:47:19,809 INFO solr.SolrIndexWriter - Deleting 0 documents
2018-06-20 09:47:19,951 WARN mapred.LocalJobRunner - job_local146539832_0001
java.lang.Exception: java.io.IOException
at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:529)
Caused by: java.io.IOException
at org.apache.nutch.indexwriter.solr.SolrIndexWriter.makeIOException(SolrIndexWriter.java:234)
at org.apache.nutch.indexwriter.solr.SolrIndexWriter.push(SolrIndexWriter.java:213)
at org.apache.nutch.indexwriter.solr.SolrIndexWriter.write(SolrIndexWriter.java:174)
at org.apache.nutch.indexer.IndexWriters.write(IndexWriters.java:87)
at org.apache.nutch.indexer.IndexerOutputFormat$1.write(IndexerOutputFormat.java:50)
at org.apache.nutch.indexer.IndexerOutputFormat$1.write(IndexerOutputFormat.java:41)
at org.apache.hadoop.mapred.ReduceTask$OldTrackingRecordWriter.write(ReduceTask.java:493)
at org.apache.hadoop.mapred.ReduceTask$3.collect(ReduceTask.java:422)
at org.apache.nutch.indexer.IndexerMapReduce.reduce(IndexerMapReduce.java:369)
at org.apache.nutch.indexer.IndexerMapReduce.reduce(IndexerMapReduce.java:57)
at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:444)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)
at org.apache.hadoop.mapred.LocalJobRunner$Job$ReduceTaskRunnable.run(LocalJobRunner.java:319)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.solr.client.solrj.SolrServerException: IOException occured when talking to server at: https://localhost:8983/solr/ESRF-EXTERNAL
at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:589)
at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:240)
at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:229)
at org.apache.solr.client.solrj.SolrClient.request(SolrClient.java:1219)
at org.apache.nutch.indexwriter.solr.SolrIndexWriter.push(SolrIndexWriter.java:210)
... 16 more
Caused by: java.net.SocketException: Broken pipe (Write failed)
at java.net.SocketOutputStream.socketWrite0(Native Method)
at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:111)
at java.net.SocketOutputStream.write(SocketOutputStream.java:155)
at sun.security.ssl.OutputRecord.writeBuffer(OutputRecord.java:431)
at sun.security.ssl.OutputRecord.write(OutputRecord.java:417)
at sun.security.ssl.SSLSocketImpl.writeRecordInternal(SSLSocketImpl.java:886)
at sun.security.ssl.SSLSocketImpl.writeRecord(SSLSocketImpl.java:857)
at sun.security.ssl.AppOutputStream.write(AppOutputStream.java:123)
at org.apache.http.impl.io.AbstractSessionOutputBuffer.write(AbstractSessionOutputBuffer.java:181)
at org.apache.http.impl.io.ContentLengthOutputStream.write(ContentLengthOutputStream.java:115)
at org.apache.http.entity.InputStreamEntity.writeTo(InputStreamEntity.java:146)
at org.apache.http.entity.HttpEntityWrapper.writeTo(HttpEntityWrapper.java:96)
at org.apache.http.impl.client.EntityEnclosingRequestWrapper$EntityWrapper.writeTo(EntityEnclosingRequestWrapper.java:112)
at org.apache.http.impl.entity.EntitySerializer.serialize(EntitySerializer.java:117)
at org.apache.http.impl.AbstractHttpClientConnection.sendRequestEntity(AbstractHttpClientConnection.java:265)
at org.apache.http.impl.conn.ManagedClientConnectionImpl.sendRequestEntity(ManagedClientConnectionImpl.java:203)
at org.apache.http.protocol.HttpRequestExecutor.doSendRequest(HttpRequestExecutor.java:237)
at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:122)
at org.apache.http.impl.client.DefaultRequestDirector.tryExecute(DefaultRequestDirector.java:685)
at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:487)
at org.apache.http.impl.client.AbstractHttpClient.doExecute(AbstractHttpClient.java:882)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:107)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:55)
at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:481)
... 20 more
2018-06-20 09:47:20,873 ERROR indexer.IndexingJob - Indexer: java.io.IOException: Job failed!
at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:873)
at org.apache.nutch.indexer.IndexingJob.index(IndexingJob.java:147)
at org.apache.nutch.indexer.IndexingJob.run(IndexingJob.java:230)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.nutch.indexer.IndexingJob.main(IndexingJob.java:239)
Stacktrace (thread dump: name, lock, cpuTime / userTime):
process reaper (37) java.util.concurrent.SynchronousQueue$TransferStack#24197386 - 1.8587ms / 0.0000ms
process reaper (36) java.util.concurrent.SynchronousQueue$TransferStack#24197386 - 1.2672ms / 0.0000ms
Scheduler-201556483 (31) java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject#6202c0c1 - 1.1534ms / 0.0000ms
searcherExecutor-7-thread-1 (30) java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject#663d5859 - 63.2030ms / 50.0000ms
DestroyJavaVM (27) - 1164.4748ms / 1040.0000ms
Thread-12 (25) java.lang.Object#233fcafa - 0.1211ms / 0.0000ms
Connection evictor (23) - 0.9319ms / 0.0000ms
Connection evictor (22) - 2.0995ms / 0.0000ms
org.eclipse.jetty.server.session.HashSessionManager#1a052a00Timer (21) java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject#6626f9cf - 4.2127ms / 0.0000ms
qtp2012232625-20 (20) java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject#12d62902 - 56.7955ms / 50.0000ms
qtp2012232625-19 (19) - 47.6864ms / 40.0000ms
qtp2012232625-18 (18) java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject#12d62902 - 79.3320ms / 70.0000ms
qtp2012232625-17 (17) java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject#12d62902 - 100.9593ms / 90.0000ms
qtp2012232625-16-acceptor-0#2d033cc4-ServerConnector#23c4c714{SSL,[ssl, http/1.1]}{0.0.0.0:8983} (16) - 4.5898ms / 0.0000ms
qtp2012232625-15 (15) java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject#12d62902 - 73.3096ms / 60.0000ms
qtp2012232625-14 (14) java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject#12d62902 - 18.7950ms / 10.0000ms
qtp2012232625-13 (13) - 79.7804ms / 70.0000ms
qtp2012232625-12 (12) - 70.2385ms / 60.0000ms
qtp2012232625-11 (11) java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject#12d62902 - 22.1012ms / 10.0000ms
ShutdownMonitor (10) - 0.3055ms / 0.0000ms
Signal Dispatcher (5) - 0.0873ms / 0.0000ms
Finalizer (3) java.lang.ref.ReferenceQueue$Lock#1e254491 - 8.2575ms / 0.0000ms
Reference Handler (2) java.lang.ref.Reference$Lock#431035b5 - 6.3846ms / 0.0000ms
EDIT2
To test, I disabled authentication to check whether the problem came from HTTPS itself. Without authentication it works!
I tried changing the file and including it in jetty-https.xml rather than jetty.xml.
I have two accounts configured like this:
<security-constraint>
<web-resource-collection>
<web-resource-name>Solr authenticated application</web-resource-name>
<url-pattern>/</url-pattern>
</web-resource-collection>
<auth-constraint>
<role-name>admin</role-name>
</auth-constraint>
</security-constraint>
<login-config>
<auth-method>BASIC</auth-method>
<realm-name>Test Realm</realm-name>
</login-config>
security.json
{
  "authentication":{
    "blockUnknown": true,
    "class":"solr.BasicAuthPlugin",
    "credentials":{"solr":"xxxx"}
  },
  "authorization":{
    "class":"solr.RuleBasedAuthorizationPlugin",
    "permissions":[{"name":"security-edit", "role":"admin"}],
    "user-role":{"solr":"admin"}
  }
}
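To rule Nutch out, it can help to first test HTTPS and Basic Auth directly against Solr from the command line; the core name and credentials below are placeholders, and -k accepts the self-signed certificate:
curl -k -u solr:xxxx "https://localhost:8983/solr/MYCORE/select?q=*:*"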
When I execute the following command:
bin/nutch index -Dsolr.server.url=https://localhost:8983/solr/MYCORE -Dsolr.auth=true -Dsolr.auth.username='admin' -Dsolr.auth.password='xxxx' crawltest/crawldb/ -linkdb crawltest/linkdb/ crawltest/segments/* -filter -normalize -deleteGone
I get this error:
java.lang.Exception: java.io.IOException
at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:529)
Caused by: java.io.IOException
at org.apache.nutch.indexwriter.solr.SolrIndexWriter.makeIOException(SolrIndexWriter.java:234)
at org.apache.nutch.indexwriter.solr.SolrIndexWriter.push(SolrIndexWriter.java:213)
at org.apache.nutch.indexwriter.solr.SolrIndexWriter.write(SolrIndexWriter.java:174)
at org.apache.nutch.indexer.IndexWriters.write(IndexWriters.java:87)
at org.apache.nutch.indexer.IndexerOutputFormat$1.write(IndexerOutputFormat.java:50)
at org.apache.nutch.indexer.IndexerOutputFormat$1.write(IndexerOutputFormat.java:41)
at org.apache.hadoop.mapred.ReduceTask$OldTrackingRecordWriter.write(ReduceTask.java:493)
at org.apache.hadoop.mapred.ReduceTask$3.collect(ReduceTask.java:422)
at org.apache.nutch.indexer.IndexerMapReduce.reduce(IndexerMapReduce.java:369)
at org.apache.nutch.indexer.IndexerMapReduce.reduce(IndexerMapReduce.java:57)
at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:444)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)
at org.apache.hadoop.mapred.LocalJobRunner$Job$ReduceTaskRunnable.run(LocalJobRunner.java:319)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.solr.client.solrj.SolrServerException: IOException occured when talking to server at: https://localhost:8983/solr/MYCORE
at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:589)
at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:240)
at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:229)
at org.apache.solr.client.solrj.SolrClient.request(SolrClient.java:1219)
at org.apache.nutch.indexwriter.solr.SolrIndexWriter.push(SolrIndexWriter.java:210)
... 16 more
Caused by: java.net.SocketException: Broken pipe (Write failed)
at java.net.SocketOutputStream.socketWrite0(Native Method)
at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:111)
at java.net.SocketOutputStream.write(SocketOutputStream.java:155)
at sun.security.ssl.OutputRecord.writeBuffer(OutputRecord.java:431)
at sun.security.ssl.OutputRecord.write(OutputRecord.java:417)
at sun.security.ssl.SSLSocketImpl.writeRecordInternal(SSLSocketImpl.java:886)
at sun.security.ssl.SSLSocketImpl.writeRecord(SSLSocketImpl.java:857)
at sun.security.ssl.AppOutputStream.write(AppOutputStream.java:123)
at org.apache.http.impl.io.AbstractSessionOutputBuffer.write(AbstractSessionOutputBuffer.java:181)
at org.apache.http.impl.io.ContentLengthOutputStream.write(ContentLengthOutputStream.java:115)
at org.apache.http.entity.InputStreamEntity.writeTo(InputStreamEntity.java:146)
at org.apache.http.entity.HttpEntityWrapper.writeTo(HttpEntityWrapper.java:96)
at org.apache.http.impl.client.EntityEnclosingRequestWrapper$EntityWrapper.writeTo(EntityEnclosingRequestWrapper.java:112)
at org.apache.http.impl.entity.EntitySerializer.serialize(EntitySerializer.java:117)
at org.apache.http.impl.AbstractHttpClientConnection.sendRequestEntity(AbstractHttpClientConnection.java:265)
at org.apache.http.impl.conn.ManagedClientConnectionImpl.sendRequestEntity(ManagedClientConnectionImpl.java:203)
at org.apache.http.protocol.HttpRequestExecutor.doSendRequest(HttpRequestExecutor.java:237)
at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:122)
at org.apache.http.impl.client.DefaultRequestDirector.tryExecute(DefaultRequestDirector.java:685)
at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:487)
at org.apache.http.impl.client.AbstractHttpClient.doExecute(AbstractHttpClient.java:882)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:107)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:55)
at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:481)
... 20 more
2018-06-25 09:38:41,870 ERROR indexer.IndexingJob - Indexer: java.io.IOException: Job failed!
at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:873)
at org.apache.nutch.indexer.IndexingJob.index(IndexingJob.java:147)
at org.apache.nutch.indexer.IndexingJob.run(IndexingJob.java:230)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.nutch.indexer.IndexingJob.main(IndexingJob.java:239)
And when I execute this:
bin/nutch index -Dsolr.server.url=https://localhost:8983/solr/MYCORE -Dsolr.auth=true -Dsolr.auth.username='solr' -Dsolr.auth.password='xxxxx' crawltest/crawldb/ -linkdb crawltest/linkdb/ crawltest/segments/* -filter -normalize -deleteGone
I get this error:
java.lang.Exception: org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at https://localhost:8983/solr/MYCORE: Expected mime type application/octet-stream but got text/html. <html>
<head>
<meta http-equiv="Content-Type" content="text/html;charset=utf-8"/>
<title>Error 401 Unauthorized</title>
</head>
<body><h2>HTTP ERROR 401</h2>
<p>Problem accessing /solr/MYCORE/update. Reason:
<pre> Unauthorized</pre></p>
</body>
</html>
at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:529)
Caused by: org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at https://localhost:8983/solr/MYCORE: Expected mime type application/octet-stream but got text/html. <html>
<head>
<meta http-equiv="Content-Type" content="text/html;charset=utf-8"/>
<title>Error 401 Unauthorized</title>
</head>
<body><h2>HTTP ERROR 401</h2>
<p>Problem accessing /solr/MYCORE/update. Reason:
<pre> Unauthorized</pre></p>
</body>
</html>
at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:544)
at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:240)
at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:229)
at org.apache.solr.client.solrj.SolrClient.request(SolrClient.java:1219)
at org.apache.nutch.indexwriter.solr.SolrIndexWriter.push(SolrIndexWriter.java:210)
at org.apache.nutch.indexwriter.solr.SolrIndexWriter.write(SolrIndexWriter.java:174)
at org.apache.nutch.indexer.IndexWriters.write(IndexWriters.java:87)
at org.apache.nutch.indexer.IndexerOutputFormat$1.write(IndexerOutputFormat.java:50)
at org.apache.nutch.indexer.IndexerOutputFormat$1.write(IndexerOutputFormat.java:41)
at org.apache.hadoop.mapred.ReduceTask$OldTrackingRecordWriter.write(ReduceTask.java:493)
at org.apache.hadoop.mapred.ReduceTask$3.collect(ReduceTask.java:422)
at org.apache.nutch.indexer.IndexerMapReduce.reduce(IndexerMapReduce.java:369)
at org.apache.nutch.indexer.IndexerMapReduce.reduce(IndexerMapReduce.java:57)
at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:444)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)
at org.apache.hadoop.mapred.LocalJobRunner$Job$ReduceTaskRunnable.run(LocalJobRunner.java:319)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
2018-06-25 09:45:20,106 ERROR indexer.IndexingJob - Indexer: java.io.IOException: Job failed!
at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:873)
at org.apache.nutch.indexer.IndexingJob.index(IndexingJob.java:147)
at org.apache.nutch.indexer.IndexingJob.run(IndexingJob.java:230)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.nutch.indexer.IndexingJob.main(IndexingJob.java:239)
or now this:
java.lang.Exception: org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at https://127.0.0.1:8983/solr/MYCORE: Expected mime type application/octet-stream but got text/html. <html>
<head>
<meta http-equiv="Content-Type" content="text/html;charset=ISO-8859-1"/>
<title>Error 503 </title>
</head>
<body>
<h2>HTTP ERROR: 503</h2>
<p>Problem accessing /solr/MYCORE/update. Reason:
<pre> Service Unavailable</pre></p>
<hr />
</body>
</html>
at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:529)
Caused by: org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at https://127.0.0.1:8983/solr/MYCORE: Expected mime type application/octet-stream but got text/html. <html>
<head>
<meta http-equiv="Content-Type" content="text/html;charset=ISO-8859-1"/>
<title>Error 503 </title>
</head>
<body>
<h2>HTTP ERROR: 503</h2>
<p>Problem accessing /solr/MYCORE/update. Reason:
<pre> Service Unavailable</pre></p>
<hr />
</body>
</html>
Solr logs:
2018-06-25 14:18:44.352 INFO (main) [ ] o.e.j.s.Server jetty-9.3.20.v20170531
2018-06-25 14:18:44.597 WARN (main) [ ] o.e.j.w.WebAppContext Failed startup of context o.e.j.w.WebAppContext#5891e32e{/solr,file:///app/solr-7.2.1/server/solr-webapp/webapp/,UNAVAILABLE}{/app/solr-7.2.1/server/solr-webapp/webapp}
java.lang.IllegalStateException: No LoginService for org.eclipse.jetty.security.authentication.BasicAuthenticator#64c87930 in org.eclipse.jetty.security.ConstraintSecurityHandler#400cff1a
at org.eclipse.jetty.security.authentication.LoginAuthenticator.setConfiguration(LoginAuthenticator.java:76)
at org.eclipse.jetty.security.SecurityHandler.doStart(SecurityHandler.java:354)
at org.eclipse.jetty.security.ConstraintSecurityHandler.doStart(ConstraintSecurityHandler.java:448)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:131)
at org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:105)
at org.eclipse.jetty.server.handler.AbstractHandler.doStart(AbstractHandler.java:61)
at org.eclipse.jetty.server.handler.ScopedHandler.doStart(ScopedHandler.java:120)
at org.eclipse.jetty.server.session.SessionHandler.doStart(SessionHandler.java:116)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:131)
at org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:105)
at org.eclipse.jetty.server.handler.AbstractHandler.doStart(AbstractHandler.java:61)
at org.eclipse.jetty.server.handler.ScopedHandler.doStart(ScopedHandler.java:120)
at org.eclipse.jetty.server.handler.ContextHandler.startContext(ContextHandler.java:809)
at org.eclipse.jetty.servlet.ServletContextHandler.startContext(ServletContextHandler.java:345)
at org.eclipse.jetty.webapp.WebAppContext.startWebapp(WebAppContext.java:1406)
at org.eclipse.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1368)
at org.eclipse.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:778)
at org.eclipse.jetty.servlet.ServletContextHandler.doStart(ServletContextHandler.java:262)
at org.eclipse.jetty.webapp.WebAppContext.doStart(WebAppContext.java:522)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.deploy.bindings.StandardStarter.processBinding(StandardStarter.java:41)
at org.eclipse.jetty.deploy.AppLifeCycle.runBindings(AppLifeCycle.java:188)
at org.eclipse.jetty.deploy.DeploymentManager.requestAppGoal(DeploymentManager.java:499)
at org.eclipse.jetty.deploy.DeploymentManager.addApp(DeploymentManager.java:147)
at org.eclipse.jetty.deploy.providers.ScanningAppProvider.fileAdded(ScanningAppProvider.java:180)
at org.eclipse.jetty.deploy.providers.WebAppProvider.fileAdded(WebAppProvider.java:458)
at org.eclipse.jetty.deploy.providers.ScanningAppProvider$1.fileAdded(ScanningAppProvider.java:64)
at org.eclipse.jetty.util.Scanner.reportAddition(Scanner.java:610)
at org.eclipse.jetty.util.Scanner.reportDifferences(Scanner.java:529)
at org.eclipse.jetty.util.Scanner.scan(Scanner.java:392)
at org.eclipse.jetty.util.Scanner.doStart(Scanner.java:313)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.deploy.providers.ScanningAppProvider.doStart(ScanningAppProvider.java:150)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.deploy.DeploymentManager.startAppProvider(DeploymentManager.java:561)
at org.eclipse.jetty.deploy.DeploymentManager.doStart(DeploymentManager.java:236)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:131)
at org.eclipse.jetty.server.Server.start(Server.java:422)
at org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:113)
at org.eclipse.jetty.server.handler.AbstractHandler.doStart(AbstractHandler.java:61)
at org.eclipse.jetty.server.Server.doStart(Server.java:389)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.xml.XmlConfiguration$1.run(XmlConfiguration.java:1520)
at java.security.AccessController.doPrivileged(Native Method)
at org.eclipse.jetty.xml.XmlConfiguration.main(XmlConfiguration.java:1442)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.eclipse.jetty.start.Main.invokeMain(Main.java:215)
at org.eclipse.jetty.start.Main.start(Main.java:458)
at org.eclipse.jetty.start.Main.main(Main.java:76)
2018-06-25 14:18:44.745 INFO (main) [ ] o.e.j.s.Server Started #799ms
I solved the "No LoginService" error by moving security.json to /var/solr/data/ (SOLR_HOME).
EDIT3:
Now I only get an error message "No allow" when I want to send Nutch data to Solr. I also can no longer connect to the admin interface; I get the same error. I think it comes from the security.json file:
{
"authentication":{
"class":"solr.BasicAuthPlugin",
"credentials":{"solr":"xxxxxx"}
},
"authorization":{
"class":"solr.RuleBasedAuthorizationPlugin"
"permissions":[{"name":"security-edit","role":"adminRole"},{"name":"collection-admin-edit","role":"adminRole"},{"name":"update","role":"adminRole"},{"name":"all","role":"adminRole"},{"name":"core-admin-edit","role":"adminRole"},{"name":"read","role":"adminRole"},{"name":"config-edit","role":"adminRole"},{"name":"core-admin-read","role":"adminRole"},{"name":"core-admin-read","role":"adminRole"}]
"user-role":{"solr":"adminRole"}
}}
What did I do wrong? Thanks.

I added a new answer because the previous one was too long.
SOLVED API AUTHENTICATION BUT NOT WITH NUTCH:
So, to make authentication work via the API, I deleted the configuration made in jetty-https.xml and webdefault.xml, and I deleted the realm.properties file as well as the Basic Auth options in solr.in.sh.
I now only work with the security.json file in SOLR_HOME.
In fact, the biggest problem was that I did not use an encrypted (hashed) password; without it, it is impossible to connect. On the other hand, I still have the problem via Nutch, which is not allowed.
Here is the security.json file
`{
  "authentication":{
    "class":"solr.BasicAuthPlugin",
    "credentials":{
      "solr":"hzMjhfgN4b9X8KR0QgLB2Um3cUzqDzJygtEBL/O7g5E= CkP7HyXjYvqKNF3F4hBjnVvKGQOkLc/ta4FaNIkqgII="
    }
  },
  "authorization":{
    "class":"solr.RuleBasedAuthorizationPlugin",
    "permissions":[
      {"name":"security-edit", "role":"adminRole"},
      {"name":"collection-admin-edit", "role":"adminRole"},
      {"name":"update", "role":"adminRole"},
      {"name":"config-edit", "role":"adminRole"},
      {"name":"core-admin-edit", "role":"adminRole"},
      {"name":"core-admin-read", "role":"adminRole"},
      {"name":"schema-edit", "role":"adminRole"},
      {"name":"all", "role":"adminRole"}
    ],
    "user-role":{
      "solr":"adminRole"
    }
  }
}
`
The hashed password corresponds to the value "test".
To validate the JSON I suggest http://json.parser.online.fr/
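Rather than computing the salted hash by hand, the password can also be changed through Solr's Authentication API while authenticated as an existing user, and Solr then stores the hash itself; the credentials below are placeholders:
curl -k --user solr:test https://localhost:8983/solr/admin/authentication -H 'Content-type:application/json' -d '{"set-user": {"solr":"test"}}'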
What could I have missed to be able to update Solr with Nutch?
SOLVED
Add the /dataimport path to the update permission:
{
"name":"update",
"path":"/dataimport",
"role":"adminRole"
},
But now, although I can index from Nutch into Solr, I have a new error when I'm crawling...
`Thu Jun 28 09:21:03 CEST 2018 : Iteration 2 of 5
Generating a new segment
/app/nutch-external/bin/nutch generate -D mapreduce.job.reduces=2 -D mapred.child.java.opts=-Xmx1000m -D mapreduce.reduce.speculative=false -D mapreduce.map.speculative=false -D mapreduce.map.output.compress=true crawl//crawldb crawl//segments -topN 50000 -numFetchers 1 -noFilter
Generator: starting at 2018-06-28 09:21:04
Generator: Selecting best-scoring urls due for fetch.
Generator: filtering: false
Generator: normalizing: true
Generator: topN: 50000
Generator: 0 records selected for fetching, exiting ...
Generate returned 1 (no new segments created)
Escaping loop: no more URLs to fetch now
`
And during the first iteration I get these errors:
`Authorization challenge processed
No form element found with 'id' = adminRole, trying 'name'.
No form element found with 'id' = adminRole, trying 'name'.
No form element found with 'name' = adminRole
No form element found with 'name' = adminRole
Supported authentication schemes in the order of preference: [ntlm, digest, basic]
Supported authentication schemes in the order of preference: [ntlm, digest, basic]
Challenge for ntlm authentication scheme not available
Challenge for ntlm authentication scheme not available
Challenge for digest authentication scheme not available
basic authentication scheme selected
Using authentication scheme: basic
Authorization challenge processed
No form element found with 'id' = adminRole, trying 'name'.
No form element found with 'name' = adminRole
Failed to get protocol output
java.lang.RuntimeException: java.lang.IllegalArgumentException: No form exists: adminRole
at org.apache.nutch.protocol.httpclient.Http.resolveCredentials(Http.java:506)
at org.apache.nutch.protocol.httpclient.Http.getResponse(Http.java:183)
at org.apache.nutch.protocol.http.api.HttpBase.getProtocolOutput(HttpBase.java:276)
at org.apache.nutch.fetcher.FetcherThread.run(FetcherThread.java:342)
Caused by: java.lang.IllegalArgumentException: No form exists: adminRole
at org.apache.nutch.protocol.httpclient.HttpFormAuthentication.getLoginFormParams(HttpFormAuthentication.java:219)
at org.apache.nutch.protocol.httpclient.HttpFormAuthentication.login(HttpFormAuthentication.java:95)
at org.apache.nutch.protocol.httpclient.Http.resolveCredentials(Http.java:504)
... 3 more
Challenge for digest authentication scheme not available
basic authentication scheme selected
Using authentication scheme: basic
Authorization challenge processed
No form element found with 'id' = adminRole, trying 'name'.
`
It still seems to be related to the security.json file. Do you have any ideas? Thanks.
SOLVED
I solved this by configuring httpclient-auth.xml in /nutch/conf/:
<auth-configuration>
  <credentials username="solr" password="xxxxx">
    <authscope host="localhost" port="8983"/>
  </credentials>
</auth-configuration>
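Note that httpclient-auth.xml is read by the protocol-httpclient plugin, so plugin.includes in nutch-site.xml has to list protocol-httpclient (instead of protocol-http) for these credentials to be picked up; the plugin list below is only an example and should match your own setup:
<property>
  <name>plugin.includes</name>
  <value>protocol-httpclient|urlfilter-regex|parse-(html|tika)|index-(basic|anchor)|indexer-solr|scoring-opic|urlnormalizer-(pass|regex|basic)</value>
</property>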
Thanks for your help

Related

Bulk Request failed in ElasticSinkConnector

I got the following error while creating the Elasticsearch sink connector.
CREATE SOURCE CONNECTOR testdemosinkconnector WITH(
"type.name"= '_doc',
"input.data.format"= 'AVRO',
"connector.class"= 'io.confluent.connect.elasticsearch.ElasticsearchSinkConnector',
"tasks.max"= '1',
"transforms"= 'Dealership',
"topics"= 'es.contact.model',
"transforms.Dealership.type"= 'io.confluent.connect.transforms.ExtractTopic$Value',
"transforms.Dealership.field"= 'indexTopicName',
"transforms.Dealership.skip.missing.or.null"= 'true',
"connection.url"= 'https://elasticsearchdemo.es.us-central1.gcp.cloud.es.io:9243',
"connection.username"= 'elastic',
"connection.password"= 'BUgBxOBg3dv4jp4Z3W7p4tHC',
"key.ignore"= 'true',
"value.converter"= 'io.confluent.connect.avro.AvroConverter',
"value.converter.schemas.enable"= 'true',
"value.converter.schema.registry.url"= 'http://localhost:8081',
"bulk.size.bytes"= '-1',
"behavior.on.null.values"= 'IGNORE',enter code here
"behavior.on.malformed.documents"= 'IGNORE',
"max.retries"= '5',
"retry.backoff.ms"= '5000'
);
The error is:
FAILED | org.apache.kafka.connect.errors.ConnectException: Exiting WorkerSinkTask due to unrecoverable exception.
at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:618)
at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:334)
at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:235)
at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:204)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:200)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:255)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: org.apache.kafka.connect.errors.ConnectException: Bulk request failed
at io.confluent.connect.elasticsearch.ElasticsearchClient$1.afterBulk(ElasticsearchClient.java:397)
at org.elasticsearch.action.bulk.BulkRequestHandler$1.onFailure(BulkRequestHandler.java:70)
at org.elasticsearch.action.ActionListener$5.onFailure(ActionListener.java:258)
at org.elasticsearch.action.bulk.Retry$RetryHandler.onFailure(Retry.java:126)
at io.confluent.connect.elasticsearch.ElasticsearchClient.lambda$null$1(ElasticsearchClient.java:174)
... 5 more
Caused by: org.apache.kafka.connect.errors.ConnectException: Failed to execute bulk request due to 'java.io.IOException: Unable to parse response body for Response{requestLine=POST /_bulk?timeout=1m HTTP/1.1, host=https://elasticsearchdemo.es.us-central1.gcp.cloud.es.io:9243, response=HTTP/1.1 200 OK}' after 6 attempt(s)
at io.confluent.connect.elasticsearch.RetryUtil.callWithRetries(RetryUtil.java:165)
at io.confluent.connect.elasticsearch.RetryUtil.callWithRetries(RetryUtil.java:119)
at io.confluent.connect.elasticsearch.ElasticsearchClient.callWithRetries(ElasticsearchClient.java:425)
at io.confluent.connect.elasticsearch.ElasticsearchClient.lambda$null$1(ElasticsearchClient.java:168)
... 5 more
Caused by: java.io.IOException: Unable to parse response body for Response{requestLine=POST /_bulk?timeout=1m HTTP/1.1, host=https://elasticsearchdemo.es.us-central1.gcp.cloud.es.io:9243, response=HTTP/1.1 200 OK}
at org.elasticsearch.client.RestHighLevelClient.internalPerformRequest(RestHighLevelClient.java:1632)
at org.elasticsearch.client.RestHighLevelClient.performRequest(RestHighLevelClient.java:1583)
at org.elasticsearch.client.RestHighLevelClient.performRequestAndParseEntity(RestHighLevelClient.java:1553)
at org.elasticsearch.client.RestHighLevelClient.bulk(RestHighLevelClient.java:533)
at io.confluent.connect.elasticsearch.ElasticsearchClient.lambda$null$0(ElasticsearchClient.java:170)
at io.confluent.connect.elasticsearch.RetryUtil.callWithRetries(RetryUtil.java:158)
... 8 more
Caused by: java.lang.NullPointerException
at java.base/java.util.Objects.requireNonNull(Objects.java:221)
at org.elasticsearch.action.DocWriteResponse.<init>(DocWriteResponse.java:127)
at org.elasticsearch.action.index.IndexResponse.<init>(IndexResponse.java:54)
at org.elasticsearch.action.index.IndexResponse.<init>(IndexResponse.java:39)
at org.elasticsearch.action.index.IndexResponse$Builder.build(IndexResponse.java:107)
at org.elasticsearch.action.index.IndexResponse$Builder.build(IndexResponse.java:104)
at org.elasticsearch.action.bulk.BulkItemResponse.fromXContent(BulkItemResponse.java:159)
at org.elasticsearch.action.bulk.BulkResponse.fromXContent(BulkResponse.java:196)
at org.elasticsearch.client.RestHighLevelClient.parseEntity(RestHighLevelClient.java:1892)
at org.elasticsearch.client.RestHighLevelClient.lambda$performRequestAndParseEntity$8(RestHighLevelClient.java:1554)
at org.elasticsearch.client.RestHighLevelClient.internalPerformRequest(RestHighLevelClient.java:1630)
... 13 more
Please help me to resolve this error.
Elasticsearch Sink Connector version: 11.1.10
Elasticsearch version: 8.2.2
Elasticsearch 8 is not supported by version 11.1.10 of the Confluent Elasticsearch Sink Connector, which is most likely why it can't parse the Elasticsearch response properly.
As of version 11.0.0, the connector uses the Elasticsearch High Level REST Client (version 7.0.1), which means only Elasticsearch 7.x is supported.
https://docs.confluent.io/kafka-connect-elasticsearch/current/overview.html

SonarQube local installation not starting

I tried to install SonarQube with the zip method as described here: https://docs.sonarqube.org/latest/setup/get-started-2-minutes/
Then I tried to execute this on Windows:
C:\sonarqube\bin\windows-x86-64\StartSonar.bat
Unfortunately the server doesn't start. I checked the logs (below) and see an error. Does anyone have an idea how I can fix it? Thanks!
--> Wrapper Started as Console
Launching a JVM...
Wrapper (Version 3.2.3) http://wrapper.tanukisoftware.org
Copyright 1999-2006 Tanuki Software, Inc. All Rights Reserved.
2022.02.03 13:28:24 INFO app[][o.s.a.AppFileSystem] Cleaning or creating temp directory C:\KBData\sonarqube-kbecom\temp
2022.02.03 13:28:24 INFO app[][o.s.a.es.EsSettings] Elasticsearch listening on [HTTP: 127.0.0.1:9001, TCP: 127.0.0.1:3235]
2022.02.03 13:28:24 INFO app[][o.s.a.ProcessLauncherImpl] Launch process[[key='es', ipcIndex=1, logFilenamePrefix=es]] from [C:\KBData\sonarqube-kbecom\elasticsearch]: C:\Program Files\Java\jdk-11.0.9\bin\java -XX:+UseG1GC -Djava.io.tmpdir=C:\KBData\sonarqube-kbecom\temp -XX:ErrorFile=../logs/es_hs_err_pid%p.log -Des.networkaddress.cache.ttl=60 -Des.networkaddress.cache.negative.ttl=10 -XX:+AlwaysPreTouch -Xss1m -Djava.awt.headless=true -Dfile.encoding=UTF-8 -Djna.nosys=true -Djna.tmpdir=C:\KBData\sonarqube-kbecom\temp -XX:-OmitStackTraceInFastThrow -Dio.netty.noUnsafe=true -Dio.netty.noKeySetOptimization=true -Dio.netty.recycler.maxCapacityPerThread=0 -Dio.netty.allocator.numDirectArenas=0 -Dlog4j.shutdownHookEnabled=false -Dlog4j2.disable.jmx=true -Dlog4j2.formatMsgNoLookups=true -Djava.locale.providers=COMPAT -Dcom.redhat.fips=false -Xmx512m -Xms512m -XX:MaxDirectMemorySize=256m -XX:+HeapDumpOnOutOfMemoryError -Delasticsearch -Des.path.home=C:\KBData\sonarqube-kbecom\elasticsearch -Des.path.conf=C:\KBData\sonarqube-kbecom\temp\conf\es -cp lib/* org.elasticsearch.bootstrap.Elasticsearch
2022.02.03 13:28:24 INFO app[][o.s.a.SchedulerImpl] Waiting for Elasticsearch to be up and running
2022.02.03 13:28:24 ERROR app[][o.s.a.p.EsManagedProcess] Failed to check status
org.elasticsearch.ElasticsearchException: java.util.concurrent.ExecutionException: org.elasticsearch.client.ResponseException: method [GET], host [http://127.0.0.1:9001], URI [/], status line [HTTP/1.1 404 Not Found]
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN""http://www.w3.org/TR/html4/strict.dtd">
<HTML><HEAD><TITLE>Not Found</TITLE>
<META HTTP-EQUIV="Content-Type" Content="text/html; charset=us-ascii"></HEAD>
<BODY><h2>Not Found</h2>
<hr><p>HTTP Error 404. The requested resource is not found.</p>
</BODY></HTML>
at org.elasticsearch.client.RestHighLevelClient.performClientRequest(RestHighLevelClient.java:2695)
at org.elasticsearch.client.RestHighLevelClient.internalPerformRequest(RestHighLevelClient.java:2171)
at org.elasticsearch.client.RestHighLevelClient.performRequest(RestHighLevelClient.java:2137)
at org.elasticsearch.client.RestHighLevelClient.performRequestAndParseEntity(RestHighLevelClient.java:2105)
at org.elasticsearch.client.ClusterClient.health(ClusterClient.java:151)
at org.sonar.application.es.EsConnectorImpl.getClusterHealthStatus(EsConnectorImpl.java:64)
at org.sonar.application.process.EsManagedProcess.checkStatus(EsManagedProcess.java:97)
at org.sonar.application.process.EsManagedProcess.checkOperational(EsManagedProcess.java:82)
at org.sonar.application.process.EsManagedProcess.isOperational(EsManagedProcess.java:67)
at org.sonar.application.process.ManagedProcessHandler.refreshState(ManagedProcessHandler.java:220)
at org.sonar.application.process.ManagedProcessHandler$EventWatcher.run(ManagedProcessHandler.java:285)
Caused by: java.util.concurrent.ExecutionException: org.elasticsearch.client.ResponseException: method [GET], host [http://127.0.0.1:9001], URI [/], status line [HTTP/1.1 404 Not Found]
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN""http://www.w3.org/TR/html4/strict.dtd">
<HTML><HEAD><TITLE>Not Found</TITLE>
<META HTTP-EQUIV="Content-Type" Content="text/html; charset=us-ascii"></HEAD>
<BODY><h2>Not Found</h2>
<hr><p>HTTP Error 404. The requested resource is not found.</p>
</BODY></HTML>
at org.elasticsearch.common.util.concurrent.BaseFuture$Sync.getValue(BaseFuture.java:257)
at org.elasticsearch.common.util.concurrent.BaseFuture$Sync.get(BaseFuture.java:244)
at org.elasticsearch.common.util.concurrent.BaseFuture.get(BaseFuture.java:75)
at org.elasticsearch.client.RestHighLevelClient.performClientRequest(RestHighLevelClient.java:2692)
... 10 common frames omitted
Caused by: org.elasticsearch.client.ResponseException: method [GET], host [http://127.0.0.1:9001], URI [/], status line [HTTP/1.1 404 Not Found]
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN""http://www.w3.org/TR/html4/strict.dtd">
<HTML><HEAD><TITLE>Not Found</TITLE>
<META HTTP-EQUIV="Content-Type" Content="text/html; charset=us-ascii"></HEAD>
<BODY><h2>Not Found</h2>
<hr><p>HTTP Error 404. The requested resource is not found.</p>
</BODY></HTML>
at org.elasticsearch.client.RestClient.convertResponse(RestClient.java:331)
at org.elasticsearch.client.RestClient.access$1800(RestClient.java:106)
at org.elasticsearch.client.RestClient$1.completed(RestClient.java:381)
at org.elasticsearch.client.RestClient$1.completed(RestClient.java:377)
at org.apache.http.concurrent.BasicFuture.completed(BasicFuture.java:122)
at org.apache.http.impl.nio.client.DefaultClientExchangeHandlerImpl.responseCompleted(DefaultClientExchangeHandlerImpl.java:181)
at org.apache.http.nio.protocol.HttpAsyncRequestExecutor.processResponse(HttpAsyncRequestExecutor.java:448)
at org.apache.http.nio.protocol.HttpAsyncRequestExecutor.inputReady(HttpAsyncRequestExecutor.java:338)
at org.apache.http.impl.nio.DefaultNHttpClientConnection.consumeInput(DefaultNHttpClientConnection.java:265)
at org.apache.http.impl.nio.client.InternalIODispatch.onInputReady(InternalIODispatch.java:81)
at org.apache.http.impl.nio.client.InternalIODispatch.onInputReady(InternalIODispatch.java:39)
at org.apache.http.impl.nio.reactor.AbstractIODispatch.inputReady(AbstractIODispatch.java:114)
at org.apache.http.impl.nio.reactor.BaseIOReactor.readable(BaseIOReactor.java:162)
at org.apache.http.impl.nio.reactor.AbstractIOReactor.processEvent(AbstractIOReactor.java:337)
at org.apache.http.impl.nio.reactor.AbstractIOReactor.processEvents(AbstractIOReactor.java:315)
at org.apache.http.impl.nio.reactor.AbstractIOReactor.execute(AbstractIOReactor.java:276)
at org.apache.http.impl.nio.reactor.BaseIOReactor.execute(BaseIOReactor.java:104)
at org.apache.http.impl.nio.reactor.AbstractMultiworkerIOReactor$Worker.run(AbstractMultiworkerIOReactor.java:591)
at java.base/java.lang.Thread.run(Thread.java:834)
2022.02.03 13:28:25 ERROR app[][o.s.a.p.EsManagedProcess] Failed to check status
org.elasticsearch.ElasticsearchException: java.util.concurrent.ExecutionException: org.elasticsearch.client.ResponseException: method [GET], host [http://127.0.0.1:9001], URI [/], status line [HTTP/1.1 404 Not Found]
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN""http://www.w3.org/TR/html4/strict.dtd">
<HTML><HEAD><TITLE>Not Found</TITLE>
<META HTTP-EQUIV="Content-Type" Content="text/html; charset=us-ascii"></HEAD>
<BODY><h2>Not Found</h2>
<hr><p>HTTP Error 404. The requested resource is not found.</p>
</BODY></HTML>
at org.elasticsearch.client.RestHighLevelClient.performClientRequest(RestHighLevelClient.java:2695)
at org.elasticsearch.client.RestHighLevelClient.internalPerformRequest(RestHighLevelClient.java:2171)
at org.elasticsearch.client.RestHighLevelClient.performRequest(RestHighLevelClient.java:2137)
at org.elasticsearch.client.RestHighLevelClient.performRequestAndParseEntity(RestHighLevelClient.java:2105)
at org.elasticsearch.client.ClusterClient.health(ClusterClient.java:151)
at org.sonar.application.es.EsConnectorImpl.getClusterHealthStatus(EsConnectorImpl.java:64)
at org.sonar.application.process.EsManagedProcess.checkStatus(EsManagedProcess.java:97)
at org.sonar.application.process.EsManagedProcess.checkOperational(EsManagedProcess.java:82)
at org.sonar.application.process.EsManagedProcess.isOperational(EsManagedProcess.java:67)
at org.sonar.application.process.ManagedProcessHandler.refreshState(ManagedProcessHandler.java:220)
at org.sonar.application.process.ManagedProcessHandler$EventWatcher.run(ManagedProcessHandler.java:285)
Caused by: java.util.concurrent.ExecutionException: org.elasticsearch.client.ResponseException: method [GET], host [http://127.0.0.1:9001], URI [/], status line [HTTP/1.1 404 Not Found]
I have now changed both sonar.search.port and sonar.web.port. Now it's working.
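For reference, both properties are set in conf/sonar.properties; the port numbers below are only example values:
sonar.web.port=9100
sonar.search.port=9101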

Spring Integration RSS Feed - 403 Error

I am trying to use Spring Integration to poll RSS feeds, and I'm using a Stack Overflow user feed URL as an example. However, when the application runs I get the following error:
Disconnected from the target VM, address: '127.0.0.1:58603', transport: 'socket'
2016-03-04 19:34:36.252 ERROR 2345 --- [ask-scheduler-4] o.s.integration.handler.LoggingHandler : org.springframework.messaging.MessagingException: Failed to retrieve feed at url 'http://stackoverflow.com/feeds/user/813852'; nested exception is com.rometools.fetcher.FetcherException: Authentication required for that resource. HTTP Response code was:403
at org.springframework.integration.feed.inbound.FeedEntryMessageSource.getFeed(FeedEntryMessageSource.java:216)
at org.springframework.integration.feed.inbound.FeedEntryMessageSource.populateEntryList(FeedEntryMessageSource.java:182)
at org.springframework.integration.feed.inbound.FeedEntryMessageSource.doReceive(FeedEntryMessageSource.java:157)
at org.springframework.integration.feed.inbound.FeedEntryMessageSource.receive(FeedEntryMessageSource.java:122)
at org.springframework.integration.endpoint.SourcePollingChannelAdapter.receiveMessage(SourcePollingChannelAdapter.java:175)
at org.springframework.integration.endpoint.AbstractPollingEndpoint.doPoll(AbstractPollingEndpoint.java:224)
at org.springframework.integration.endpoint.AbstractPollingEndpoint.access$000(AbstractPollingEndpoint.java:57)
at org.springframework.integration.endpoint.AbstractPollingEndpoint$1.call(AbstractPollingEndpoint.java:176)
at org.springframework.integration.endpoint.AbstractPollingEndpoint$1.call(AbstractPollingEndpoint.java:173)
at org.springframework.integration.endpoint.AbstractPollingEndpoint$Poller$1.run(AbstractPollingEndpoint.java:330)
at org.springframework.integration.util.ErrorHandlingTaskExecutor$1.run(ErrorHandlingTaskExecutor.java:55)
at org.springframework.core.task.SyncTaskExecutor.execute(SyncTaskExecutor.java:50)
at org.springframework.integration.util.ErrorHandlingTaskExecutor.execute(ErrorHandlingTaskExecutor.java:51)
at org.springframework.integration.endpoint.AbstractPollingEndpoint$Poller.run(AbstractPollingEndpoint.java:324)
at org.springframework.scheduling.support.DelegatingErrorHandlingRunnable.run(DelegatingErrorHandlingRunnable.java:54)
at org.springframework.scheduling.concurrent.ReschedulingRunnable.run(ReschedulingRunnable.java:81)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: com.rometools.fetcher.FetcherException: Authentication required for that resource. HTTP Response code was:403
at com.rometools.fetcher.impl.AbstractFeedFetcher.throwAuthenticationError(AbstractFeedFetcher.java:183)
at com.rometools.fetcher.impl.AbstractFeedFetcher.handleErrorCodes(AbstractFeedFetcher.java:170)
at com.rometools.fetcher.impl.HttpURLFeedFetcher.retrieveAndCacheFeed(HttpURLFeedFetcher.java:186)
at com.rometools.fetcher.impl.HttpURLFeedFetcher.retrieveFeed(HttpURLFeedFetcher.java:140)
at com.rometools.fetcher.impl.HttpURLFeedFetcher.retrieveFeed(HttpURLFeedFetcher.java:99)
at org.springframework.integration.feed.inbound.FeedEntryMessageSource.getFeed(FeedEntryMessageSource.java:204)
... 22 more
The URL loads fine in the browser (even in incognito mode), so the error message is probably misleading. I downloaded an RSS reader and the feed worked fine there too.
This is how I configured the feed:
<int-feed:inbound-channel-adapter id="stackOverflow"
channel="log"
url="http://stackoverflow.com/feeds/user/813852">
<int:poller fixed-rate="10000" max-messages-per-poll="100" />
</int-feed:inbound-channel-adapter>
<int:logging-channel-adapter id="log" level="DEBUG" />
I also tried another RSS feed URL and it worked. Any ideas what could be wrong?
I just ran a Wireshark trace and it looks like Stack Overflow doesn't like the Java user agent:
User-Agent: Java/1.8.0_66\r\n
...
<p>The owner of this website (stackoverflow.com) has banned your access based
on your browser's signature (27e797571e1923de-ua21).</p>\n
Setting a custom user agent gets around this restriction:
public class CustomFeedFetcher extends HttpURLFeedFetcher {
    @Override
    public void setUserAgent(String s) {
        // ignore the requested value and always send a fixed agent string
        super.setUserAgent("AgentName");
    }
}

Elasticsearch NoNodeAvailableException

I am getting the following error from Elasticsearch.
`HTTP Status 500 - Request processing failed; nested exception is org.elasticsearch.client.transport.NoNodeAvailableException: None of the configured nodes are available: []
description: The server encountered an internal error that prevented it from fulfilling this request.
exception: org.springframework.web.util.NestedServletException: Request processing failed; nested exception is org.elasticsearch.client.transport.NoNodeAvailableException: None of the configured nodes are available: []
org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:943)
org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:822)
javax.servlet.http.HttpServlet.service(HttpServlet.java:624)
org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:807)
javax.servlet.http.HttpServlet.service(HttpServlet.java:731)
org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
myapp.filter.SimpleCORSFilter.doFilter(SimpleCORSFilter.java:22)
root cause: org.elasticsearch.client.transport.NoNodeAvailableException: None of the configured nodes are available: []
org.elasticsearch.client.transport.TransportClientNodesService.ensureNodesAreAvailable(TransportClientNodesService.java:305)
org.elasticsearch.client.transport.TransportClientNodesService.execute(TransportClientNodesService.java:200)
org.elasticsearch.client.transport.support.InternalTransportIndicesAdminClient.execute(InternalTransportIndicesAdminClient.java:86)
org.elasticsearch.client.support.AbstractIndicesAdminClient.exists(AbstractIndicesAdminClient.java:178)
org.elasticsearch.action.admin.indices.exists.indices.IndicesExistsRequestBuilder.doExecute(IndicesExistsRequestBuilder.java:53)
org.elasticsearch.action.ActionRequestBuilder.execute(ActionRequestBuilder.java:91)
org.elasticsearch.action.ActionRequestBuilder.execute(ActionRequestBuilder.java:65)
myapp.dao.CommonDao.ValidateTenant(CommonDao.java:7)
myapp.dao.CliffDomainDao.viewCliffDomains(CliffDomainDao.java:55)
myapp.service.CliffDomainService.getall(CliffDomainService.java:53)
myapp.controller.CliffDomainController.getall(CliffDomainController.java:79)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:497)
org.springframework.web.method.support.InvocableHandlerMethod.invoke(InvocableHandlerMethod.java:214)
org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:132)
org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:104)
org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandleMethod(RequestMappingHandlerAdapter.java:748)
org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:689)
org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:83)
org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:945)
org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:876)
org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:931)
org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:822)
javax.servlet.http.HttpServlet.service(HttpServlet.java:624)
org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:807)
javax.servlet.http.HttpServlet.service(HttpServlet.java:731)
org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
myapp.filter.SimpleCORSFilter.doFilter(SimpleCORSFilter.java:22)
note: The full stack trace of the root cause is available in the Apache Tomcat/7.0.64 logs.`
I am running Elasticsearch 1.7.2 on Ubuntu.
These are the changes I have made in elasticsearch.yml:
################################### Cluster ###################################
# Cluster name identifies your cluster for auto-discovery. If you're running
# multiple clusters on the same network, make sure you're using unique names.
#
cluster.name: cliffservice
# Set the address other nodes will use to communicate with this node. If not
# set, it is automatically derived. It must point to an actual IP address.
#
#network.publish_host: 192.168.0.1
network.publish_host: 10.100.10.231
# Set both 'bind_host' and 'publish_host':
#
#network.host: 192.168.0.1
Connection code
public TransportClient getClient() {
    Settings settings = ImmutableSettings.settingsBuilder().put("cluster.name", "cliffservice").build();
    TransportClient client = new TransportClient(settings);
    client = client.addTransportAddress(new InetSocketTransportAddress(this.host, this.port));
    this.esclient = client;
    return client;
}
Code to get data from the Elasticsearch client:
SearchResponse response = Connection.getEsclient().prepareSearch("testindex")
        .setTypes("testtype").execute().actionGet();
if (response.getHits().getHits().length > 0) {
    for (SearchHit hit : response.getHits().getHits()) {
        CliffDomain cliffDomain = new CliffDomain();
        cliffDomain.setCliffDomainId(hit.getId());
        cliffDomain.setCliffDomainName((String) hit.sourceAsMap().get("clifftDomainName"));
        searchResponse.add(cliffDomain);
    }
}
What am I doing wrong?
Your client and server are in different clusters: the client is in assetservice and the server is in cliffservice. Also, make sure that the client can connect to port 9300 on 10.100.10.231.
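A minimal sketch that checks both points, assuming the Elasticsearch 1.x transport client API (the class name and the 3-second timeout are mine; host, port and cluster name come from the question):
import java.net.InetSocketAddress;
import java.net.Socket;

import org.elasticsearch.client.transport.TransportClient;
import org.elasticsearch.common.settings.ImmutableSettings;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.transport.InetSocketTransportAddress;

public class EsConnectionCheck {
    public static void main(String[] args) throws Exception {
        String host = "10.100.10.231";   // publish_host from elasticsearch.yml
        int transportPort = 9300;        // default transport port

        // 1. Plain TCP check: is the transport port reachable at all?
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, transportPort), 3000);
            System.out.println("Transport port reachable");
        }

        // 2. The client's cluster.name must match cluster.name in elasticsearch.yml.
        Settings settings = ImmutableSettings.settingsBuilder()
                .put("cluster.name", "cliffservice")
                .build();
        TransportClient client = new TransportClient(settings)
                .addTransportAddress(new InetSocketTransportAddress(host, transportPort));
        System.out.println("Connected nodes: " + client.connectedNodes());
        client.close();
    }
}
An empty connected-nodes list despite a reachable port usually points at a cluster-name or version mismatch.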
Finally I figured out the problem. I was using elasticsearch-1.7.1 in my Java application: Elasticsearch 1.7.1 was installed on my local computer, but the server was running Elasticsearch 1.7.2. That was the problem. I downgraded the server's Elasticsearch to 1.7.1 and now it works fine.
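For reference, a rough way to spot such a mismatch from the client side, assuming the 1.x NodesInfo admin API (the class name is a placeholder; host, port and cluster name are from the question), is to print the client jar's version next to the version of each node:
import org.elasticsearch.Version;
import org.elasticsearch.action.admin.cluster.node.info.NodeInfo;
import org.elasticsearch.action.admin.cluster.node.info.NodesInfoResponse;
import org.elasticsearch.client.transport.TransportClient;
import org.elasticsearch.common.settings.ImmutableSettings;
import org.elasticsearch.common.transport.InetSocketTransportAddress;

public class EsVersionCheck {
    public static void main(String[] args) {
        // Version compiled into the elasticsearch jar on the classpath.
        System.out.println("Client jar version: " + Version.CURRENT);

        TransportClient client = new TransportClient(
                ImmutableSettings.settingsBuilder().put("cluster.name", "cliffservice").build());
        client.addTransportAddress(new InetSocketTransportAddress("10.100.10.231", 9300));

        // Version reported by each connected node (throws NoNodeAvailableException if
        // nothing connected; the server version can also be read from http://10.100.10.231:9200/).
        NodesInfoResponse info = client.admin().cluster().prepareNodesInfo().execute().actionGet();
        for (NodeInfo node : info.getNodes()) {
            System.out.println(node.getNode().getName() + " runs " + node.getVersion());
        }
        client.close();
    }
}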

Error while sending mails using EWSJavaAPI

I am trying to send mails from my Java web application using the EWS Java API, but I am getting an error when sending them. We have an Exchange 2013 server installed.
Here is my code
ExchangeService service = new ExchangeService();
ExchangeCredentials credentials = new WebCredentials("username", "password");
service.setCredentials(credentials);
try {
    service.setUrl(new URI("host/ews/Exchange.asmx"));
    service.setTraceEnabled(true);
    EmailMessage msg = new EmailMessage(service);
    msg.setSubject("Hello world!");
    msg.setBody(MessageBody
            .getMessageBodyFromText("Sent using the EWS Managed API."));
    msg.getToRecipients().add("emailAddress");
    msg.send();
} catch (URISyntaxException e) {
    e.printStackTrace();
} catch (Exception e) {
    // Most EWS API calls, including msg.send(), declare "throws Exception".
    e.printStackTrace();
}
And here is the trace report and error log
<Trace Tag="EwsRequestHttpHeaders" Tid="1" Time="2014-08-14 05:58:04Z">
POST /ews/Exchange.asmx HTTP/1.1
Content-type : text/xml; charset=utf-8
Accept-Encoding : gzip,deflate
Keep-Alive : 300
User-Agent : ExchangeServicesClient/0.0.0.0
Connection : Keep-Alive
Accept : text/xml
</Trace>
<Trace Tag="EwsRequest" Tid="1" Time="2014-08-14 05:58:04Z">
<?xml version="1.0" encoding="utf-8"?><soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:m="http://schemas.microsoft.com/exchange/services/2006/messages" xmlns:t="http://schemas.microsoft.com/exchange/services/2006/types"><soap:Header><t:RequestServerVersion Version="Exchange2010_SP1"></t:RequestServerVersion></soap:Header><soap:Body><m:CreateItem MessageDisposition="SendOnly"><m:Items><t:Message><t:Subject>Hello world!</t:Subject><t:Body BodyType="HTML">Sent using the EWS Managed API.</t:Body><t:ToRecipients><t:Mailbox><t:EmailAddress>emailAddress</t:EmailAddress></t:Mailbox></t:ToRecipients></t:Message></m:Items></m:CreateItem></soap:Body></soap:Envelope>
Exception in thread "main" microsoft.exchange.webservices.data.ServiceRequestException: The request failed. sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
at microsoft.exchange.webservices.data.ServiceRequestBase.validateAndEmitRequest(ServiceRequestBase.java:719)
at microsoft.exchange.webservices.data.SimpleServiceRequestBase.internalExecute(SimpleServiceRequestBase.java:36)
at microsoft.exchange.webservices.data.MultiResponseServiceRequest.execute(MultiResponseServiceRequest.java:140)
at microsoft.exchange.webservices.data.ExchangeService.internalCreateItems(ExchangeService.java:461)
at microsoft.exchange.webservices.data.ExchangeService.createItem(ExchangeService.java:530)
at microsoft.exchange.webservices.data.Item.internalCreate(Item.java:214)
at microsoft.exchange.webservices.data.EmailMessage.internalSend(EmailMessage.java:125)
at microsoft.exchange.webservices.data.EmailMessage.send(EmailMessage.java:253)
at SendEmail.main(SendEmail.java:28)
Caused by: javax.net.ssl.SSLHandshakeException: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
at com.sun.net.ssl.internal.ssl.Alerts.getSSLException(Alerts.java:174)
at com.sun.net.ssl.internal.ssl.SSLSocketImpl.fatal(SSLSocketImpl.java:1747)
at com.sun.net.ssl.internal.ssl.Handshaker.fatalSE(Handshaker.java:241)
at com.sun.net.ssl.internal.ssl.Handshaker.fatalSE(Handshaker.java:235)
at com.sun.net.ssl.internal.ssl.ClientHandshaker.serverCertificate(ClientHandshaker.java:1209)
at com.sun.net.ssl.internal.ssl.ClientHandshaker.processMessage(ClientHandshaker.java:135)
at com.sun.net.ssl.internal.ssl.Handshaker.processLoop(Handshaker.java:593)
at com.sun.net.ssl.internal.ssl.Handshaker.process_record(Handshaker.java:529)
at com.sun.net.ssl.internal.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:943)
at com.sun.net.ssl.internal.ssl.SSLSocketImpl.performInitialHandshake(SSLSocketImpl.java:1188)
at com.sun.net.ssl.internal.ssl.SSLSocketImpl.writeRecord(SSLSocketImpl.java:654)
at com.sun.net.ssl.internal.ssl.AppOutputStream.write(AppOutputStream.java:100)
at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:65)
at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:123)
at org.apache.commons.httpclient.methods.EntityEnclosingMethod.writeRequestBody(EntityEnclosingMethod.java:506)
at org.apache.commons.httpclient.HttpMethodBase.writeRequest(HttpMethodBase.java:2114)
at org.apache.commons.httpclient.HttpMethodBase.execute(HttpMethodBase.java:1096)
at org.apache.commons.httpclient.HttpMethodDirector.executeWithRetry(HttpMethodDirector.java:398)
at org.apache.commons.httpclient.HttpMethodDirector.executeMethod(HttpMethodDirector.java:171)
at org.apache.commons.httpclient.HttpClient.executeMethod(HttpClient.java:397)
at org.apache.commons.httpclient.HttpClient.executeMethod(HttpClient.java:323)
at microsoft.exchange.webservices.data.HttpClientWebRequest.executeRequest(HttpClientWebRequest.java:300)
at microsoft.exchange.webservices.data.ServiceRequestBase.emit(ServiceRequestBase.java:326)
at microsoft.exchange.webservices.data.ServiceRequestBase.validateAndEmitRequest(ServiceRequestBase.java:714)
... 8 more
Caused by: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
at sun.security.validator.PKIXValidator.doBuild(PKIXValidator.java:323)
at sun.security.validator.PKIXValidator.engineValidate(PKIXValidator.java:217)
at sun.security.validator.Validator.validate(Validator.java:218)
at com.sun.net.ssl.internal.ssl.X509TrustManagerImpl.validate(X509TrustManagerImpl.java:126)
at com.sun.net.ssl.internal.ssl.X509TrustManagerImpl.checkServerTrusted(X509TrustManagerImpl.java:209)
at microsoft.exchange.webservices.data.EwsX509TrustManager.checkServerTrusted(EwsX509TrustManager.java:62)
at com.sun.net.ssl.internal.ssl.ClientHandshaker.serverCertificate(ClientHandshaker.java:1201)
... 27 more
Caused by: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
at sun.security.provider.certpath.SunCertPathBuilder.engineBuild(SunCertPathBuilder.java:174)
at java.security.cert.CertPathBuilder.build(CertPathBuilder.java:238)
at sun.security.validator.PKIXValidator.doBuild(PKIXValidator.java:318)
... 33 more
I got this problem solved: I added the certificate of my Exchange server to Java's truststore. This site helped: https://blogs.oracle.com/gc/entry/unable_to_find_valid_certification
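For anyone hitting the same PKIX error, the import typically looks something like this (the hostname and alias are placeholders, the cacerts path depends on your JDK layout, and the default keystore password is changeit):
openssl s_client -connect exchange.example.com:443 -showcerts </dev/null | openssl x509 -outform PEM > exchange.pem
keytool -import -trustcacerts -alias exchange_cert -file exchange.pem -keystore $JAVA_HOME/jre/lib/security/cacerts -storepass changeit
Restart the application afterwards so the JVM reloads the truststore.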

Resources