Spring multipart file-size-threshold binding error

In Spring Boot 2.1.3, binding 'spring.servlet.multipart.file-size-threshold' to org.springframework.util.unit.DataSize fails.
Property:
spring.servlet.multipart.file-size-threshold=2KB
Exception:
Description:
Failed to bind properties under
'spring.servlet.multipart.file-size-threshold' to
org.springframework.util.unit.DataSize:
Property: spring.servlet.multipart.file-size-threshold
Value: 2KB
Origin: class path resource [application.properties]:24:46
Reason: failed to convert java.lang.String to org.springframework.util.unit.DataSize
Action:
Update your application's configuration

You have a trailing space in the property's value: you have configured a size of '2KB ' (with a trailing space) and it should be '2KB'.
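A minimal standalone sketch of why the trailing space matters: DataSize.parse rejects any value that does not match its pattern exactly, which is what surfaces as the binding failure above.

import org.springframework.util.unit.DataSize;

public class DataSizeCheck {
    public static void main(String[] args) {
        // "2KB" matches the DataSize pattern: 2 * 1024 = 2048 bytes
        System.out.println(DataSize.parse("2KB").toBytes());
        // A trailing space breaks the pattern match and throws IllegalArgumentException,
        // the same conversion failure reported by the binder
        System.out.println(DataSize.parse("2KB ").toBytes());
    }
}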

Related

Spring application.yaml and boolean values through env file

In my Spring Boot project, I need to pass the SHOW_SQL boolean value from an environment file into application.yaml.
Here is application.yaml:
spring:
  ...
  show-sql: ${SHOW_SQL}
  ...
server:
  port: 8080
Here is my env file (used to start the spring app)
SHOW_SQL=true
During mvn test I receive this error:
Failed to bind properties under 'spring.jpa.show-sql' to boolean:
Property: spring.jpa.show-sql
Value: "${SHOW_SQL}"
Origin: class path resource [application.yaml] - 13:15
Reason: failed to convert java.lang.String to boolean (caused by java.lang.IllegalArgumentException: Invalid boolean value '${SHOW_SQL}')
Action:
Update your application's configuration
Is there a way to use boolean values from a .env file and pass them to application.yaml?
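One detail worth noting, as a sketch rather than a confirmed fix: Spring placeholders accept a default value after a colon, so the binding no longer fails when SHOW_SQL is not set in the environment (for example during mvn test, where a .env file is typically not loaded). This assumes show-sql sits under spring.jpa, as the error message indicates.

spring:
  jpa:
    # Falls back to false when the SHOW_SQL environment variable is absent
    show-sql: ${SHOW_SQL:false}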

Avro serializer declaration in Spring properties

I am trying to use an Avro serializer for value serialization in a Kafka producer. This is the entry I added in my application.properties:
spring.kafka.producer.value-serializer=io.confluent.kafka.serializers.KafkaAvroSerializer.class
I am getting the exception below. Please help me understand what to change here.
Error starting ApplicationContext. To display the conditions report re-run your application with 'debug' enabled.
2021-06-22 00:25:15.926 ERROR 25080 --- [ main] o.s.b.d.LoggingFailureAnalysisReporter :
***************************
APPLICATION FAILED TO START
***************************
Description:
Failed to bind properties under 'spring.kafka.producer.value-serializer' to java.lang.Class<?>:
Property: spring.kafka.producer.value-serializer
Value: io.confluent.kafka.serializers.KafkaAvroSerializer.class
Origin: class path resource [application.properties] - 22:40
Reason: No converter found capable of converting from type [java.lang.String] to type [java.lang.Class<?>]
Action:
Update your application's configuration
I tried with this option also:
spring.kafka.producer.value-serializer=io.confluent.kafka.serializers.KafkaAvroSerializer
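The second form (the fully qualified class name without the .class suffix) is what the Class<?> binding expects. A minimal properties sketch, assuming a Schema Registry at http://localhost:8081 (hypothetical URL) and that the io.confluent:kafka-avro-serializer dependency is on the classpath:

spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=io.confluent.kafka.serializers.KafkaAvroSerializer
# KafkaAvroSerializer needs to know where the Schema Registry lives (hypothetical URL)
spring.kafka.producer.properties.schema.registry.url=http://localhost:8081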

jOOQ and Spring Boot: upgraded jOOQ via starter, failed to bind 'spring.jooq.sql-dialect' to org.jooq.SQLDialect

So I upgraded spring-boot-starter-parent to 2.2.8.RELEASE, which results in jOOQ 3.12.4. Previously I had 3.11.5.
I am getting the following error:
Failed to bind properties under 'spring.jooq.sql-dialect' to org.jooq.SQLDialect:
Property: spring.jooq.sqldialect
Value: MYSQL_5_7
Origin: "spring.jooq.SQLDialect" from property source "applicationConfig: [classpath:/config/application.yaml]"
Reason: failed to convert java.lang.String to org.jooq.SQLDialect
Here is what my application.yaml was before
spring:
  jooq:
    sql-dialect: mysql_5_7
If you read the whole error message you will see this:
Failed to bind properties under 'spring.jooq.sql-dialect' to org.jooq.SQLDialect:
Property: spring.jooq.sql-dialect
Value: MYSQL_5_7
Origin: class path resource [application.properties]:2:25
Reason: failed to convert java.lang.String to org.jooq.SQLDialect
Action:
Update your application's configuration. The following values are valid:
CUBRID
DEFAULT
DERBY
FIREBIRD
H2
HSQLDB
MARIADB
MYSQL
POSTGRES
SQL99
SQLITE
MYSQL_5_7 is not a value supported by the jOOQ open source edition. It's only available in the pro version:
MYSQL_5_7
@Pro
public static final SQLDialect MYSQL_5_7
The MySQL 5.7 dialect.
This dialect is available in commercial jOOQ distributions, only.
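So with the open-source distribution the dialect has to be one of the values listed above. A minimal application.yaml sketch using the generic MySQL dialect (Spring Boot binds the enum case-insensitively):

spring:
  jooq:
    # MYSQL_5_7 requires the commercial jOOQ distribution; the generic dialect is the open-source option
    sql-dialect: mysql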

Create Kafka connect without confluent

I recently started with Kafka and I am trying to create a Kafka Connect connector to connect to Oracle, but I can't get it to work. The information I found is about Confluent, but that doesn't work on Windows... How can I configure one, or create it with Java?
For my test I use a standalone connection:
cmd .\windows\connect-standalone.bat .\config\connect-standalone.properties .\config\connect-bbdd.properties ->
name=jdbc-conector
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
tasks.max=1
connection.url=jdbc:oracle:thin#localhost:xe
connection.user: user
connection.password: pwd
mode = bulk
topic.prefix=test
table.whitelist: mytable
Error:
WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig)
WARN The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig)
WARN The configuration 'offset.storage.file.filename' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig)
WARN The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig)
Jul 21, 2019 10:36:13 PM org.glassfish.jersey.internal.Errors logErrors
WARNING: The following warnings have been detected: WARNING: The (sub)resource method createConnector in
org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource contains empty path annotation.
WARNING: The (sub)resource method listConnectors in org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource contains
empty path annotation.
WARNING: The (sub)resource method listConnectorPlugins in org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource
contains empty path annotation.
WARNING: The (sub)resource method serverInfo in org.apache.kafka.connect.runtime.rest.resources.RootResource contains empty path annotation.
[2019-07-21 22:36:13,886] ERROR Failed to create job for ..\config\connect-bbdd.properties (org.apache.kafka.connect.cli.ConnectStandalone)
[2019-07-21 22:36:13,888] ERROR Stopping after connector error (org.apache.kafka.connect.cli.ConnectStandalone)
Caused by: org.apache.kafka.connect.runtime.rest.errors.BadRequestException: Connector configuration
is invalid and contains the following 2 error(s):
Invalid value java.sql.SQLException: No suitable driver found for jdbc:oracle:thin#localhost:xe
for configuration Couldn't open connection to jdbc:oracle:thin#localhost:xe
You can also find the above list of errors at the endpoint `/{connectorType}/config/validate`
at org.apache.kafka.connect.util.ConvertingFutureCallback.result(ConvertingFutureCallback.java:79)
at org.apache.kafka.connect.util.ConvertingFutureCallback.get(ConvertingFutureCallback.java:66)
at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:118)
...and other errors from "any class loader (org.reflections.Reflections)"
The confluent command doesn't work natively on Windows, no.
But connect-distributed and connect-standalone are not Confluent-only; they should both work and load the JDBC connectors provided with Confluent Platform if you did download it on Windows.
Otherwise, if you have only Apache Kafka, you will need to download the JDBC connector separately and set it up yourself via the plugin.path property mentioned in the Connect config files, as sketched below.
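A minimal sketch of the relevant worker config line, assuming the connector was unpacked to C:\kafka\plugins (a hypothetical path):

# connect-standalone.properties (relevant line)
# Directory that holds the kafka-connect-jdbc connector jars (hypothetical path)
plugin.path=C:/kafka/plugins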
This error that you get:
No suitable driver found for jdbc:oracle:thin#localhost:xe
for configuration Couldn't open connection to jdbc:oracle:thin#localhost:xe
is because you've not made the Oracle JDBC driver available. See https://www.confluent.io/blog/kafka-connect-deep-dive-jdbc-source-connector#jdbc-drivers.
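In practice that means putting the Oracle driver jar (for example ojdbc8.jar) where the JDBC connector can load it, typically in the same directory as the kafka-connect-jdbc jar, and using the standard thin URL syntax. A sketch of the connector config, with hypothetical host/port/SID values:

# connect-bbdd.properties (sketch)
name=jdbc-conector
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
tasks.max=1
# standard Oracle thin syntax: jdbc:oracle:thin:@<host>:<port>:<SID>
connection.url=jdbc:oracle:thin:@localhost:1521:xe
connection.user=user
connection.password=pwd
mode=bulk
topic.prefix=test
table.whitelist=mytable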

Spring Boot, Apache Camel and Apache Camel XPath

Apache Camel XPath fails when parsing an XML file.
Please find the route below
fromF("file://%s?recursive=true", inputDir)
.routeId("PollFiles")
.log("*** file found ${header.CamelFileName}")
.toF("file://%s?recursive=true",
archiveDir)
.log("*** file found ${body}")
//.convertBodyTo(String.class)
.choice().when()
.xpath("//Available[Class='package']"). log("*** found ${body}")
.end();
Error
org.apache.camel.TypeConversionException: Error during type conversion from type: java.lang.String to the required type: org.w3c.dom.Document with value [Body is instance of java.io.InputStream] due java.io.FileNotFoundException: /Users/solution//X1.DTD (No such file or directory)
Would appreciate your assistance.
It's not XPath related. Your error says:
java.io.FileNotFoundException: /Users/solution//X1.DTD (No such file or directory)
that means the file supplied to your method does not exist.
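Here the missing file is the X1.DTD referenced by the XML's DOCTYPE, which the DOM parser tries to resolve when Camel converts the body to org.w3c.dom.Document. A minimal standalone sketch to confirm this, assuming a sample file X1.xml (hypothetical name) that declares that DOCTYPE:

import java.io.File;
import javax.xml.parsers.DocumentBuilderFactory;

public class DtdCheck {
    public static void main(String[] args) throws Exception {
        DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
        // With this feature left on (the default), the parser tries to load the external
        // DTD named in the DOCTYPE and fails with the same FileNotFoundException
        factory.setFeature("http://apache.org/xml/features/nonvalidating/load-external-dtd", false);
        factory.newDocumentBuilder().parse(new File("X1.xml"));
        System.out.println("Parsed without resolving the external DTD");
    }
}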
