Spring Boot and Apache Camel XPath

Apache Camel XPath fails when parsing an XML file with content.
Please find the route below:
fromF("file://%s?recursive=true", inputDir)
.routeId("PollFiles")
.log("*** file found ${header.CamelFileName}")
.toF("file://%s?recursive=true",
archiveDir)
.log("*** file found ${body}")
//.convertBodyTo(String.class)
.choice().when()
.xpath("//Available[Class='package']"). log("*** found ${body}")
.end();
Error
org.apache.camel.TypeConversionException: Error during type conversion from type: java.lang.String to the required type: org.w3c.dom.Document with value [Body is instance of java.io.InputStream] due java.io.FileNotFoundException: /Users/solution//X1.DTD (No such file or directory)
Would appreciate your assistance.

It's not XPath related; your error says:
java.io.FileNotFoundException: /Users/solution//X1.DTD (No such file or directory)
That means the X1.DTD file referenced by your XML's DOCTYPE does not exist at that path, so the conversion to org.w3c.dom.Document fails while the parser tries to resolve the DTD.
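If the DTD is not actually needed for the routing decision, a common workaround is to parse the XML yourself with external-DTD resolution disabled and route on the resulting Document. A minimal sketch, assuming the standard Xerces feature flag (the helper class name is hypothetical):

import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import java.io.InputStream;

public final class DtdSafeParser {
    // Parse XML without attempting to fetch external DTDs such as X1.DTD,
    // so a missing DTD file no longer breaks the Stream-to-Document conversion.
    public static Document parse(InputStream xml) throws Exception {
        DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
        dbf.setValidating(false);
        dbf.setFeature("http://apache.org/xml/features/nonvalidating/load-external-dtd", false);
        return dbf.newDocumentBuilder().parse(xml);
    }
}

You could invoke this from a processor before the .choice() so the XPath evaluates an already-parsed Document instead of triggering the failing type conversion.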

Related

Spring Boot 2.2.2.RELEASE - Could not locate PropertySource: Could not extract response

With Spring Boot 2.2.2.RELEASE I am seeing the error below:
27 Jan 2020;21:45:43.870 [main] WARN o.s.c.c.c.ConfigServicePropertySourceLocator - Could not locate PropertySource: Could not extract response: no suitable HttpMessageConverter found for response type [class org.springframework.cloud.config.environment.Environment] and content type [text/html;charset=UTF-8]
Using the versions below:
implementation 'org.springframework.boot:spring-boot-starter-batch:2.2.2.RELEASE'
implementation 'org.springframework.batch:spring-batch-integration:2.2.2.RELEASE'
implementation 'org.springframework.cloud:spring-cloud-starter-config:2.2.0.RELEASE'
implementation 'org.springframework.cloud:spring-cloud-starter-security:2.1.5.RELEASE'
Surprisingly, I don't see this problem when running the application in the IDE, but I do see this error when running the built artifact.

Create Kafka Connect without Confluent

I recently started with Kafka and I am trying to create a Kafka Connect connector for Oracle, but I can't get it working. The information I found is about Confluent, but that doesn't work on Windows... How can I configure one or create it with Java?
For my test I use a standalone connection:
cmd .\windows\connect-standalone.bat .\config\connect-standalone.properties .\config\connect-bbdd.properties
with connect-bbdd.properties containing:
name=jdbc-conector
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
tasks.max=1
connection.url=jdbc:oracle:thin#localhost:xe
connection.user=user
connection.password=pwd
mode=bulk
topic.prefix=test
table.whitelist=mytable
Error:
WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig)
WARN The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig)
WARN The configuration 'offset.storage.file.filename' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig)
WARN The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig)
Jul 21, 2019 10:36:13 PM org.glassfish.jersey.internal.Errors logErrors
WARNING: The following warnings have been detected: WARNING: The (sub)resource method createConnector in
org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource contains empty path annotation.
WARNING: The (sub)resource method listConnectors in org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource contains
empty path annotation.
WARNING: The (sub)resource method listConnectorPlugins in org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource
contains empty path annotation.
WARNING: The (sub)resource method serverInfo in org.apache.kafka.connect.runtime.rest.resources.RootResource contains empty path annotation.
[2019-07-21 22:36:13,886] ERROR Failed to create job for ..\config\connect-bbdd.properties (org.apache.kafka.connect.cli.ConnectStandalone)
[2019-07-21 22:36:13,888] ERROR Stopping after connector error (org.apache.kafka.connect.cli.ConnectStandalone)
Caused by: org.apache.kafka.connect.runtime.rest.errors.BadRequestException: Connector configuration
is invalid and contains the following 2 error(s):
Invalid value java.sql.SQLException: No suitable driver found for jdbc:oracle:thin#localhost:xe
for configuration Couldn't open connection to jdbc:oracle:thin#localhost:xe
You can also find the above list of errors at the endpoint `/{connectorType}/config/validate`
at org.apache.kafka.connect.util.ConvertingFutureCallback.result(ConvertingFutureCallback.java:79)
at org.apache.kafka.connect.util.ConvertingFutureCallback.get(ConvertingFutureCallback.java:66)
at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:118)
...and other errors from "any class loader (org.reflections.Reflections)"
The confluent CLI command doesn't work natively on Windows, no.
But connect-distributed and connect-standalone are not Confluent-only scripts; both should work and will load the JDBC connector bundled with Confluent Platform, if that is what you downloaded on Windows.
Otherwise, if you have only Apache Kafka, you will need to download the JDBC connector separately and set it up yourself via the plugin.path property mentioned in the Connect config files.
This error that you get:
No suitable driver found for jdbc:oracle:thin#localhost:xe
for configuration Couldn't open connection to jdbc:oracle:thin#localhost:xe
is because you've not made the Oracle JDBC driver available. See https://www.confluent.io/blog/kafka-connect-deep-dive-jdbc-source-connector#jdbc-drivers.
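For the Apache Kafka-only route, a minimal sketch of the worker config change, assuming you unpacked the JDBC connector under C:\kafka\plugins (a hypothetical path) and dropped the Oracle ojdbc driver jar into the same folder as the connector's own jars:

# connect-standalone.properties
# plugin.path points at the directory holding the unpacked connector;
# the Oracle JDBC driver jar must live alongside the connector's jars
# so Connect's plugin classloader can find it.
plugin.path=C:/kafka/plugins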

Need help configuring ActiveMQ URL with multiple options in JBoss EAP standalone-full.xml

In my standalone-full.xml for JBoss EAP 7.0.x, I have an ActiveMQ resource adaptor where I put the ActiveMQ connection URL. My ActiveMQ connection URL has multiple options, and according to ActiveMQ syntax, the & is used to join the options. For instance:
failover:(tcp://localhost:61616)?startupMaxReconnectAttempts=15&jms.useCompression=true
When I started the JBoss server, it threw the following exception:
11:13:19,593 ERROR [org.jboss.as.server] (Controller Boot Thread) WFLYSRV0055: Caught exception during boot: org.jboss.as.controller.persistence.ConfigurationPersistenceException: WFLYCTL0085: Failed to parse configuration
at org.jboss.as.controller.persistence.XmlConfigurationPersister.load(XmlConfigurationPersister.java:131)
at org.jboss.as.server.ServerService.boot(ServerService.java:362)
at org.jboss.as.controller.AbstractControllerService$1.run(AbstractControllerService.java:301)
at java.lang.Thread.run(Thread.java:745)
Caused by: javax.xml.stream.XMLStreamException: com.ctc.wstx.exc.WstxUnexpectedCharException: Unexpected character '=' (code 61); expected a semi-colon after the reference for entity 'jms.useCompression'
at [row,col {unknown-source}]: [407,107]
at org.jboss.as.connector.subsystems.resourceadapters.ResourceAdapterSubsystemParser.readElement(ResourceAdapterSubsystemParser.java:461)
at org.jboss.as.connector.subsystems.resourceadapters.ResourceAdapterSubsystemParser.readElement(ResourceAdapterSubsystemParser.java:123)
at org.jboss.staxmapper.XMLMapperImpl.processNested(XMLMapperImpl.java:110)
at org.jboss.staxmapper.XMLExtendedStreamReaderImpl.handleAny(XMLExtendedStreamReaderImpl.java:69)
at org.jboss.as.server.parsing.StandaloneXml_4.parseServerProfile(StandaloneXml_4.java:546)
at org.jboss.as.server.parsing.StandaloneXml_4.readServerElement(StandaloneXml_4.java:242)
at org.jboss.as.server.parsing.StandaloneXml_4.readElement(StandaloneXml_4.java:141)
at org.jboss.as.server.parsing.StandaloneXml.readElement(StandaloneXml.java:103)
at org.jboss.as.server.parsing.StandaloneXml.readElement(StandaloneXml.java:49)
at org.jboss.staxmapper.XMLMapperImpl.processNested(XMLMapperImpl.java:110)
at org.jboss.staxmapper.XMLMapperImpl.parseDocument(XMLMapperImpl.java:69)
at org.jboss.as.controller.persistence.XmlConfigurationPersister.load(XmlConfigurationPersister.java:123)
... 3 more
Caused by: com.ctc.wstx.exc.WstxUnexpectedCharException: Unexpected character '=' (code 61); expected a semi-colon after the reference for entity 'jms.useCompression'
at [row,col {unknown-source}]: [407,107]
at com.ctc.wstx.sr.StreamScanner.throwUnexpectedChar(StreamScanner.java:647)
at com.ctc.wstx.sr.StreamScanner.parseEntityName(StreamScanner.java:2066)
at com.ctc.wstx.sr.StreamScanner.fullyResolveEntity(StreamScanner.java:1525)
at com.ctc.wstx.sr.BasicStreamReader.readTextSecondary(BasicStreamReader.java:4701)
at com.ctc.wstx.sr.BasicStreamReader.readCoalescedText(BasicStreamReader.java:4146)
at com.ctc.wstx.sr.BasicStreamReader.getElementText(BasicStreamReader.java:683)
at org.jboss.staxmapper.XMLExtendedStreamReaderImpl.getElementText(XMLExtendedStreamReaderImpl.java:144)
at org.jboss.as.connector.util.AbstractParser.rawElementText(AbstractParser.java:61)
at org.jboss.as.connector.subsystems.resourceadapters.CommonIronJacamarParser.parseConfigProperties(CommonIronJacamarParser.java:121)
at org.jboss.as.connector.subsystems.resourceadapters.ResourceAdapterParser.parseResourceAdapter(ResourceAdapterParser.java:311)
at org.jboss.as.connector.subsystems.resourceadapters.ResourceAdapterParser.parseResourceAdapters(ResourceAdapterParser.java:138)
at org.jboss.as.connector.subsystems.resourceadapters.ResourceAdapterParser.parse(ResourceAdapterParser.java:104)
at org.jboss.as.connector.subsystems.resourceadapters.ResourceAdapterSubsystemParser.readElement(ResourceAdapterSubsystemParser.java:452)
... 14 more
11:13:19,595 FATAL [org.jboss.as.server] (Controller Boot Thread) WFLYSRV0056: Server boot has failed in an unrecoverable manner; exiting. See previous messages for details.
Has anyone encountered and resolved this issue?
You need to use &amp;amp; instead of a bare '&amp;' in the XML, like below:
failover:(tcp://localhost:61616)?startupMaxReconnectAttempts=15&amp;amp;jms.useCompression=true
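So in standalone-full.xml the resource adapter property would look something like the following (the property name ServerUrl is an assumption based on the ActiveMQ resource adapter; use whatever property name your RA defines):

<config-property name="ServerUrl">
    failover:(tcp://localhost:61616)?startupMaxReconnectAttempts=15&amp;amp;jms.useCompression=true
</config-property>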

Log4j2 encoding issue

When I run Elasticsearch on Windows 10 with the system language set to English, everything works fine. But if I change the system language to Turkish, I get error messages such as:
2018-07-26 14:42:39,485 main ERROR Unable to locate plugin type for IfFileName
2018-07-26 14:42:39,633 main ERROR Unable to locate plugin for IfAccumulatedFileSize
2018-07-26 14:42:39,634 main ERROR Unable to locate plugin for IfFileName
2018-07-26 14:42:39,637 main ERROR Unable to invoke factory method in class org.apache.logging.log4j.core.appender.rolling.action.DeleteAction for element Delete: java.lang.NullPointerException java.lang.NullPointerException
at org.apache.logging.log4j.core.config.plugins.visitors.PluginElementVisitor.findNamedNode(PluginElementVisitor.java:103)
at org.apache.logging.log4j.core.config.plugins.visitors.PluginElementVisitor.visit(PluginElementVisitor.java:87)
at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.generateParameters(PluginBuilder.java:248)
at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.build(PluginBuilder.java:135)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createPluginObject(AbstractConfiguration.java:958)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:898)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:890)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:890)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:890)
at org.apache.logging.log4j.core.config.AbstractConfiguration.doConfigure(AbstractConfiguration.java:513)
at org.apache.logging.log4j.core.config.AbstractConfiguration.initialize(AbstractConfiguration.java:237)
at org.apache.logging.log4j.core.config.AbstractConfiguration.start(AbstractConfiguration.java:249)
at org.apache.logging.log4j.core.LoggerContext.setConfiguration(LoggerContext.java:545)
at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:261)
at org.elasticsearch.common.logging.LogConfigurator.configure(LogConfigurator.java:163)
at org.elasticsearch.common.logging.LogConfigurator.configure(LogConfigurator.java:119)
at org.elasticsearch.bootstrap.Bootstrap.init(Bootstrap.java:291)
at org.elasticsearch.bootstrap.Elasticsearch.init(Elasticsearch.java:121)
at org.elasticsearch.bootstrap.Elasticsearch.execute(Elasticsearch.java:112)
at org.elasticsearch.cli.EnvironmentAwareCommand.execute(EnvironmentAwareCommand.java:86)
at org.elasticsearch.cli.Command.mainWithoutErrorHandling(Command.java:124)
at org.elasticsearch.cli.Command.main(Command.java:90)
at org.elasticsearch.bootstrap.Elasticsearch.main(Elasticsearch.java:92)
at org.elasticsearch.bootstrap.Elasticsearch.main(Elasticsearch.java:85)
2018-07-26 14:42:39,645 main ERROR Null object returned for Delete in DefaultRolloverStrategy.
So it seems like a charset problem. The file is encoded as UTF-8; I checked it with Notepad++. Elasticsearch has the JVM option -Dfile.encoding=UTF-8. I double-checked the log4j2.properties file, and IfFileName has no space after it.
And if I change IfFileName to ıfFileName (where ı is a Turkish character, a lowercase dotless I), the error becomes:
2018-07-26 14:54:25,819 main ERROR Unable to locate plugin type for ıfFileName
Does anyone have an idea about how to fix this?
Adding the -Duser.language=en JVM parameter fixed the problem.
I had the same problem but didn't know where to add the -Duser.language=en. However, I found out it goes in sonar.properties: find the line with sonar.search.javaAdditionalOpts=, remove the # at the beginning, set it to sonar.search.javaAdditionalOpts=-Duser.language=en, and save the file.
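In other words, the relevant line in sonar.properties ends up as:

# Extra JVM options passed to SonarQube's embedded Elasticsearch process
sonar.search.javaAdditionalOpts=-Duser.language=en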
This is a bug in Log4j2, which calls String#toLowerCase() without a locale parameter: in the Turkish locale, IfFileName is lowercased to ıffilename (with a dotless ı). I have reported this as GH issue #1281.
Until this is fixed, you can write plugin types in all-lowercase (English) letters, e.g. iffilename instead of IfFileName.
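A minimal Java snippet illustrating the locale behavior behind this bug:

import java.util.Locale;

public class TurkishLowercaseDemo {
    public static void main(String[] args) {
        String plugin = "IfFileName";
        // Default-locale lowercasing: under tr-TR, capital 'I' becomes the
        // dotless 'ı', so the name no longer matches Log4j2's registry key.
        System.out.println(plugin.toLowerCase(new Locale("tr", "TR"))); // ıffilename
        // Locale-independent lowercasing avoids the mismatch.
        System.out.println(plugin.toLowerCase(Locale.ROOT)); // iffilename
    }
}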

Error When Loading Data from HDFS and Writing to HBase using Pig

How do I load the output data of a MapReduce program, which is in HDFS, into HBase?
I tried running the following Pig commands to load the data from HDFS into HBase:
A = LOAD 'hdfs://b**/user/user1/development/hbase/output/part-00000' USING PigStorage('\t') AS (strdata1:chararray, strdata2:chararray);
STORE A INTO 'hbase://mydata' USING org.apache.pig.backend.hadoop.hbase.HBaseStorage('mycf:strdata2');
where hdfs://b**/user/user1/development/hbase/output/part-00000 is the MapReduce output, mydata is the HBase table name created, and mycf is the column family name.
I am getting the following error:
ERROR 2017: Internal error creating job configuration.
org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobCreationException: ERROR 2017: Internal error creating job configuration.
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.getJob(JobControlCompiler.java:673)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.compile(JobControlCompiler.java:256)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher.launchPig(MapReduceLauncher.java:147)
at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.execute(HExecutionEngine.java:378)
at org.apache.pig.PigServer.executeCompiledLogicalPlan(PigServer.java:1198)
at org.apache.pig.PigServer.execute(PigServer.java:1190)
at org.apache.pig.PigServer.access$100(PigServer.java:128)
at org.apache.pig.PigServer$Graph.execute(PigServer.java:1517)
at org.apache.pig.PigServer.executeBatchEx(PigServer.java:362)
at org.apache.pig.PigServer.executeBatch(PigServer.java:329)
at org.apache.pig.tools.grunt.GruntParser.executeBatch(GruntParser.java:112)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:169)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:141)
at org.apache.pig.tools.grunt.Grunt.exec(Grunt.java:90)
at org.apache.pig.Main.run(Main.java:406)
at org.apache.pig.Main.main(Main.java:107)
Caused by: java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: hbase://mydata_logs
at org.apache.hadoop.fs.Path.initialize(Path.java:148)
at org.apache.hadoop.fs.Path.<init>(Path.java:71)
at org.apache.hadoop.fs.Path.<init>(Path.java:45)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.getJob(JobControlCompiler.java:476)
... 15 more
Caused by: java.net.URISyntaxException: Relative path in absolute URI: hbase://mydata_logs
at java.net.URI.checkPath(URI.java:1787)
at java.net.URI.<init>(URI.java:735)
at org.apache.hadoop.fs.Path.initialize(Path.java:145)
Just remove the hbase:// scheme from the data source string, like this:
STORE A INTO 'mydata' USING org.apache.pig.backend.hadoop.hbase.HBaseStorage('mycf:strdata2');
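For completeness, the corrected script then reads as follows; note that HBaseStorage treats the first field (strdata1 here) as the HBase row key:

A = LOAD 'hdfs://b**/user/user1/development/hbase/output/part-00000' USING PigStorage('\t') AS (strdata1:chararray, strdata2:chararray);
STORE A INTO 'mydata' USING org.apache.pig.backend.hadoop.hbase.HBaseStorage('mycf:strdata2');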
