How to resolve ClassCastException in MultiResourceItemReader Spring Batch

I'm reading multiple files from an S3 bucket using MultiResourceItemReader. I'm getting a ClassCastException before the myReader() method is executed; something seems wrong with the MultiResourceItemReader setup, but I'm not sure what.
Please find my code below:
@Bean
public MultiResourceItemReader<String> multiResourceReader() {
    String bucket = "mybucket";
    String key = "/myfiles";
    List<InputStream> resourceList = s3Client.getFiles(bucket, key);
    List<InputStreamResource> inputStreamResourceList = new ArrayList<>();
    for (InputStream s : resourceList) {
        inputStreamResourceList.add(new InputStreamResource(s));
    }
    Resource[] resources = inputStreamResourceList.toArray(new InputStreamResource[inputStreamResourceList.size()]);
    //InputStreamResource[] resources = inputStreamResourceList.toArray(new InputStreamResource[inputStreamResourceList.size()]);
    // I'm getting all the stream content - I verified my stream is not null
    for (int i = 0; i < resources.length; i++) {
        try {
            InputStream s = resources[i].getInputStream();
            String result = IOUtils.toString(s, StandardCharsets.UTF_8);
            System.out.println(result);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
    MultiResourceItemReader<String> resourceItemReader = new MultiResourceItemReader<>();
    resourceItemReader.setResources(resources);
    resourceItemReader.setDelegate(myReader());
    resourceItemReader.setDelegate((ResourceAwareItemReaderItemStream<? extends String>) new CustomComparator());
    return resourceItemReader;
}
Exception:
Caused by: java.lang.ClassCastException: class CustomComparator cannot be cast to class org.springframework.batch.item.file.ResourceAwareItemReaderItemStream (CustomComparator and org.springframework.batch.item.file.ResourceAwareItemReaderItemStream are in unnamed module of loader org.springframework.boot.loader.LaunchedURLClassLoader #cc285f4)
at org.springframework.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:244)
at org.springframework.context.annotation.ConfigurationClassEnhancer$BeanMethodInterceptor.intercept(ConfigurationClassEnhancer.java:331)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:154)
... 65 common frames omitted
Can someone please help me resolve this issue? Thanks in advance.

The ClassCastException in your stack trace comes from passing the CustomComparator to setDelegate(), which expects a ResourceAwareItemReaderItemStream; the comparator has to go to setComparator() instead (see the corrected snippet below).
As for the reason you see the NullPointerException: it is due to the default comparator used by the MultiResourceItemReader to sort the resources after loading them.
The default compare behavior calls the getFilename() method of each resource.
Refer - https://github.com/spring-projects/spring-batch/blob/115c3022147692155d45e23cdd5cef84895bf9f5/spring-batch-infrastructure/src/main/java/org/springframework/batch/item/file/MultiResourceItemReader.java#L82
But InputStreamResource just inherits the getFilename() method from its parent AbstractResource, which simply returns null.
https://github.com/spring-projects/spring-framework/blob/316e84f04f3dbec3ea5ab8563cc920fb21f49749/spring-core/src/main/java/org/springframework/core/io/AbstractResource.java#L220
The solution is to provide a custom comparator for the MultiResourceItemReader. Here is a simple example, assuming you do not want to sort the resources in any specific way before processing:
public class CustomComparator implements Comparator<Resource> {
    @Override
    public int compare(Resource r1, Resource r2) {
        // no filename is available for an InputStreamResource, so use an arbitrary
        // but stable ordering; replace this if the processing order matters
        return Long.compare(r1.hashCode(), r2.hashCode());
    }
}
MultiResourceItemReader<String> resourceItemReader = new MultiResourceItemReader<>();
resourceItemReader.setResources(resources);
resourceItemReader.setDelegate(myReader());
//UPDATED with correction - set custom Comparator
resourceItemReader.setComparator(new CustomComparator());
Refer to this answer for how a Comparator is used by the Spring Batch MultiResourceItemReader:
File processing order with Spring Batch
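For completeness, a minimal sketch of the corrected bean, assuming myReader() returns a ResourceAwareItemReaderItemStream<String> (for example a FlatFileItemReader<String>). The key change is that the comparator goes to setComparator() and only the actual item reader goes to setDelegate():
@Bean
public MultiResourceItemReader<String> multiResourceReader() {
    List<InputStream> streams = s3Client.getFiles("mybucket", "/myfiles");
    List<InputStreamResource> resourceList = new ArrayList<>();
    for (InputStream s : streams) {
        resourceList.add(new InputStreamResource(s));
    }
    Resource[] resources = resourceList.toArray(new Resource[0]);

    MultiResourceItemReader<String> resourceItemReader = new MultiResourceItemReader<>();
    resourceItemReader.setResources(resources);
    // the delegate must be a ResourceAwareItemReaderItemStream, e.g. a FlatFileItemReader
    resourceItemReader.setDelegate(myReader());
    // the comparator only controls the order in which the resources are processed
    resourceItemReader.setComparator(new CustomComparator());
    return resourceItemReader;
}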

Related

JAXBElement: providing codec (/converter?) for class java.lang.Class

I have been evaluating spring-data-mongodb for a project. In summary, my aim is:
Using existing XML schema files to generate Java classes. This is achieved using JAXB xjc.
The root class is TSDProductDataType and is further modeled as below.
The thing to note here is that ExtensionType contains protected List<Object> any; allowing it to store objects of any class. In my case, it is amongst the classes named TSDModule_Name_HereModuleType and can be browsed here.
Use spring-data-mongodb as the persistence store.
This is achieved using a simple ProductDataRepository:
@RepositoryRestResource(collectionResourceRel = "product", path = "product")
public interface ProductDataRepository extends MongoRepository<TSDProductDataType, String> {
    TSDProductDataType queryByGtin(@Param("gtin") String gtin);
}
The unmarshalled TSDProductDataType, however, contains JAXBElement which spring-data-mongodb doesn't seem to handle by itself and throws a CodecConfigurationException org.bson.codecs.configuration.CodecConfigurationException: Can't find a codec for class java.lang.Class.
Here is the faulty statement:
TSDProductDataType tsdProductDataType = jaxbElement.getValue();
repository.save(tsdProductDataType);
I tried playing around with Converters for spring-data-mongodb as explained here, however, it seems I am missing something since the exception is about "Codecs" and not "Converters".
Any help is appreciated.
EDIT:
Adding converters for JAXBElement
Note: Works with version 1.5.6.RELEASE of org.springframework.boot::spring-boot-starter-parent. With version 2.0.0.M3, hell breaks loose
It seems that I missed something while trying to add converter earlier. So, I added it like below for testing:
@Component
@ReadingConverter
public class JAXBElementReadConverter implements Converter<DBObject, JAXBElement> {

    //@Autowired
    //MongoConverter converter;

    @Override
    public JAXBElement convert(DBObject dbObject) {
        Class declaredType, scope;
        QName name = qNameFromString((String) dbObject.get("name"));
        Object rawValue = dbObject.get("value");
        try {
            declaredType = Class.forName((String) dbObject.get("declaredType"));
        } catch (ClassNotFoundException e) {
            if (rawValue.getClass().isArray()) declaredType = List.class;
            else declaredType = LinkedHashMap.class;
        }
        try {
            scope = Class.forName((String) dbObject.get("scope"));
        } catch (ClassNotFoundException e) {
            scope = JAXBElement.GlobalScope.class;
        }
        //Object value = rawValue instanceof DBObject ? converter.read(declaredType, (DBObject) rawValue) : rawValue;
        Object value = "TODO";
        return new JAXBElement(name, declaredType, scope, value);
    }

    QName qNameFromString(String s) {
        String[] parts = s.split("[{}]");
        if (parts.length > 2) return new QName(parts[1], parts[2], parts[0]);
        if (parts.length == 1) return new QName(parts[0]);
        return new QName("undef");
    }
}
@Component
@WritingConverter
public class JAXBElementWriteConverter implements Converter<JAXBElement, DBObject> {

    //@Autowired
    //MongoConverter converter;

    @Override
    public DBObject convert(JAXBElement jaxbElement) {
        DBObject dbObject = new BasicDBObject();
        dbObject.put("name", qNameToString(jaxbElement.getName()));
        dbObject.put("declaredType", jaxbElement.getDeclaredType().getName());
        dbObject.put("scope", jaxbElement.getScope().getCanonicalName());
        //dbObject.put("value", converter.convertToMongoType(jaxbElement.getValue()));
        dbObject.put("value", "TODO");
        dbObject.put("_class", JAXBElement.class.getName());
        return dbObject;
    }

    public String qNameToString(QName name) {
        if (name.getNamespaceURI() == XMLConstants.NULL_NS_URI) return name.getLocalPart();
        return name.getPrefix() + '{' + name.getNamespaceURI() + '}' + name.getLocalPart();
    }
}
@SpringBootApplication
public class TsdApplication {

    public static void main(String[] args) {
        SpringApplication.run(TsdApplication.class, args);
    }

    @Bean
    public CustomConversions customConversions() {
        return new CustomConversions(Arrays.asList(
                new JAXBElementReadConverter(),
                new JAXBElementWriteConverter()
        ));
    }
}
So far so good. However, how do I instantiate MongoConverter converter;?
MongoConverter is an interface so I guess I need an instantiable class adhering to this interface. Any suggestions?
I understand the desire for the convenience of mapping an existing domain object straight to the database layer with no boilerplate, but even if you weren't having the JAXB class-structure issue, I would still recommend against using it verbatim. Unless this is a simple one-off project, you will almost certainly hit a point where your domain model needs to change while your persisted data must remain in its existing state. If you just persist the domain objects directly, you have no mechanism to convert between a newer domain schema and an older persisted data schema. Versioning the persisted data schema would be wise too.
The link you posted for writing the custom converters is one way to achieve this and fits in nicely with the Spring ecosystem. That method should also solve the issue you are experiencing (the underlying messy JAXB data structure not converting cleanly).
Are you unable to get that method working? Ensure you are loading the converters into the Spring context, either with @Component plus auto-class scanning or manually via some Configuration class.
EDIT to address your EDIT:
Add the following to each of your converters:
private final MongoConverter converter;

public JAXBElement____Converter(MongoConverter converter) {
    this.converter = converter;
}
Try changing your bean definition to:
@Bean
public CustomConversions customConversions(@Lazy MongoConverter converter) {
    return new CustomConversions(Arrays.asList(
            new JAXBElementReadConverter(converter),
            new JAXBElementWriteConverter(converter)
    ));
}
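For illustration, here is what the write converter might look like once the MongoConverter is injected through the constructor. The converter.convertToMongoType(...) line is exactly the one the question had commented out, so treat this as a sketch rather than a tested solution:
import javax.xml.XMLConstants;
import javax.xml.bind.JAXBElement;
import javax.xml.namespace.QName;

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.WritingConverter;
import org.springframework.data.mongodb.core.convert.MongoConverter;

@WritingConverter
public class JAXBElementWriteConverter implements Converter<JAXBElement, DBObject> {

    private final MongoConverter converter;

    public JAXBElementWriteConverter(MongoConverter converter) {
        this.converter = converter;
    }

    @Override
    public DBObject convert(JAXBElement jaxbElement) {
        DBObject dbObject = new BasicDBObject();
        dbObject.put("name", qNameToString(jaxbElement.getName()));
        dbObject.put("declaredType", jaxbElement.getDeclaredType().getName());
        dbObject.put("scope", jaxbElement.getScope().getCanonicalName());
        // the injected MongoConverter now maps the nested value instead of the "TODO" placeholder
        dbObject.put("value", converter.convertToMongoType(jaxbElement.getValue()));
        dbObject.put("_class", JAXBElement.class.getName());
        return dbObject;
    }

    String qNameToString(QName name) {
        if (XMLConstants.NULL_NS_URI.equals(name.getNamespaceURI())) return name.getLocalPart();
        return name.getPrefix() + '{' + name.getNamespaceURI() + '}' + name.getLocalPart();
    }
}
The read converter would mirror this, using converter.read(declaredType, (DBObject) rawValue) for nested documents, as in the line commented out in the question.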

How to pull sub, sub objects with Spring WS and JAXB

I'm attempting to pull data from a SOAP service that I have no control over. The hierarchy is ProductOrder -> ShipTo -> Item, where there are one or more ShipTos per order and one or more Items per ShipTo.
Their API uses a mock SQL-like query language. I'm getting stack traces like the following when trying to pull data including the items. If I exclude item, I'm able to pull the ProductOrders along with the ShipTo objects, but items is always an empty list.
java.lang.IllegalStateException: Failed to execute CommandLineRunner
    at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:779) ~[spring-boot-1.5.0.RELEASE.jar:1.5.0.RELEASE]
    at org.springframework.boot.SpringApplication.callRunners(SpringApplication.java:760) ~[spring-boot-1.5.0.RELEASE.jar:1.5.0.RELEASE]
    at org.springframework.boot.SpringApplication.afterRefresh(SpringApplication.java:747) ~[spring-boot-1.5.0.RELEASE.jar:1.5.0.RELEASE]
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:315) ~[spring-boot-1.5.0.RELEASE.jar:1.5.0.RELEASE]
    at edu.umich.oud.giftformatter.convioexport.Application.main(Application.java:39) ~[classes/:na]
Caused by: org.springframework.oxm.UncategorizedMappingException: Unknown JAXB exception; nested exception is javax.xml.bind.JAXBException: Field order for ShipTo.Item.ItemId does not match the schema definition for record type ProductOrder
    at org.springframework.oxm.jaxb.Jaxb2Marshaller.convertJaxbException(Jaxb2Marshaller.java:915) ~[spring-oxm-4.3.6.RELEASE.jar:4.3.6.RELEASE]
    at edu.umich.oud.giftformatter.convioexport.CustJaxbUnMarshaller.unmarshal(CustJaxbUnMarshaller.java:37) ~[classes/:na]
    at org.springframework.ws.support.MarshallingUtils.unmarshal(MarshallingUtils.java:62) ~[spring-ws-core-2.4.0.RELEASE.jar:2.4.0.RELEASE]
    at org.springframework.ws.client.core.WebServiceTemplate$3.extractData(WebServiceTemplate.java:413) ~[spring-ws-core-2.4.0.RELEASE.jar:2.4.0.RELEASE]
    at org.springframework.ws.client.core.WebServiceTemplate.doSendAndReceive(WebServiceTemplate.java:619) ~[spring-ws-core-2.4.0.RELEASE.jar:2.4.0.RELEASE]
    at org.springframework.ws.client.core.WebServiceTemplate.sendAndReceive(WebServiceTemplate.java:555) ~[spring-ws-core-2.4.0.RELEASE.jar:2.4.0.RELEASE]
    at org.springframework.ws.client.core.WebServiceTemplate.marshalSendAndReceive(WebServiceTemplate.java:390) ~[spring-ws-core-2.4.0.RELEASE.jar:2.4.0.RELEASE]
    at org.springframework.ws.client.core.WebServiceTemplate.marshalSendAndReceive(WebServiceTemplate.java:383) ~[spring-ws-core-2.4.0.RELEASE.jar:2.4.0.RELEASE]
    at edu.umich.oud.giftformatter.convioexport.services.ConvioClient.queryInternal(ConvioClient.java:159) ~[classes/:na]
    at edu.umich.oud.giftformatter.convioexport.services.ConvioClient.query(ConvioClient.java:134) ~[classes/:na]
    at edu.umich.oud.giftformatter.convioexport.services.ProductOrderService.getProductOrders(ProductOrderService.java:87) ~[classes/:na]
    at edu.umich.oud.giftformatter.convioexport.services.ConvioService.load(ConvioService.java:82) ~[classes/:na]
    at edu.umich.oud.giftformatter.convioexport.Application.lambda$runner$0(Application.java:72) ~[classes/:na]
    at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:776) ~[spring-boot-1.5.0.RELEASE.jar:1.5.0.RELEASE]
    ... 4 common frames omitted
Caused by: javax.xml.bind.JAXBException: Field order for ShipTo.Item.ItemId does not match the schema definition for record type ProductOrder
    ... 17 common frames omitted
The product order service contains a method like so:
public List<ProductOrderObj> getProductOrders(final Date startDate, final Date endDate) {
    final String query = String.format("SELECT siteId,orderId,transactionId,purchaseAmount,taxDeductibleValue,\n" +
            "shippingCharge,additionalDonation,discountAmount,discountCode,\n" +
            "creationDate,createdBy,modifyDate,lastChangeBy,storeId,payment,\n" +
            "purchaser,interactionSource,shipTo,\n" +
            "receiptNumber,shipTo.item FROM ProductOrder where creationDate > %s and creationDate < %s",
            convertDate(startDate), convertDate(endDate));
    log.info("query is " + query);
    final Session session = convioClient.startSession();
    final ArrayList<ProductOrderObj> events = new ArrayList<>();
    for (int page = 1; page < 100; page++) {
        final List<? extends RecordObj> items = convioClient.query(session, page, ConvioConfiguration.MAX_DOWNLOADS_PER_REQUEST, query);
        if (items.size() < ConvioConfiguration.MAX_DOWNLOADS_PER_REQUEST) {
            events.addAll((List<ProductOrderObj>) items);
            break;
        }
        events.addAll((List<ProductOrderObj>) items);
    }
    return events;
}
This in turn calls the convioClient.query method, which effectively does this:
private List<? extends RecordObj> queryInternal(final Session session, final int page,
                                                final int pageSize, final String q) {
    // setup query
    final Query query = new Query();
    query.setPage(BigInteger.valueOf(page));
    query.setPageSize(BigInteger.valueOf(pageSize));
    query.setQueryString(q);
    log.trace(q);
    // perform query
    try {
        final Object obj = getWebServiceTemplate().marshalSendAndReceive(query,
                new SoapActionExecutionIdCallback(session));
        final QueryResponse response = (QueryResponse) obj;
        if (response != null) {
            log.debug("Response was a " + response.getClass().getName());
            return response.getRecord();
        }
    } catch (final Exception e) {
        log.error(e.getMessage());
        throw e;
    }
    throw new NullPointerException("response was null");
}
There turned out to be two issues causing this not to work:
A bad field definition for the child object in the query: shipTo.items vs shipTo.Items.
DTD validation needed to be disabled in the marshaller/unmarshaller.
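The question doesn't show the custom CustJaxbUnMarshaller, so purely as an illustration of the second point: with a plain JAXB Unmarshaller, validation can be relaxed by detaching any schema and installing a permissive ValidationEventHandler. A minimal sketch, assuming the unmarshaller is built by hand somewhere inside CustJaxbUnMarshaller:
import javax.xml.bind.JAXBContext;
import javax.xml.bind.JAXBException;
import javax.xml.bind.Unmarshaller;

private Unmarshaller lenientUnmarshaller() throws JAXBException {
    JAXBContext context = JAXBContext.newInstance(QueryResponse.class);
    Unmarshaller unmarshaller = context.createUnmarshaller();
    // no schema attached, so JAXB does not validate against one
    unmarshaller.setSchema(null);
    // returning true from the event handler tells JAXB to keep unmarshalling on validation events
    unmarshaller.setEventHandler(event -> true);
    return unmarshaller;
}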

ArrayWritable as key in Hadoop MapReduce

I am attempting to create a dynamic MapReduce application that takes its dimensions from an external properties file. The main problem lies in the fact that the key will be composite and may consist of any number of parts, e.g. 3 keys, 4 keys, etc.
My Mapper:
public void map(AvroKey<flumeLogs> key, NullWritable value, Context context) throws IOException, InterruptedException {
    Configuration conf = context.getConfiguration();
    int dimensionCount = Integer.parseInt(conf.get("dimensionCount"));
    String[] dimensions = conf.get("dimensions").split(","); // this gets the dimensions from the run method in main
    Text[] values = new Text[dimensionCount]; // this is supposed to be my composite key
    for (int i = 0; i < dimensionCount; i++) {
        switch (dimensions[i]) {
            case "region": values[i] = new Text("-");
                break;
            case "event": values[i] = new Text("-");
                break;
            case "eventCode": values[i] = new Text("-");
                break;
            case "mobile": values[i] = new Text("-");
        }
    }
    context.write(new StringArrayWritable(values), new IntWritable(1));
}
The values will have good logic later.
My StringArrayWritable:
public class StringArrayWritable extends ArrayWritable {
    public StringArrayWritable() {
        super(Text.class);
    }

    public StringArrayWritable(Text[] values) {
        super(Text.class, values);
        Text[] texts = new Text[values.length];
        for (int i = 0; i < values.length; i++) {
            texts[i] = new Text(values[i]);
        }
        set(texts);
    }

    @Override
    public String toString() {
        StringBuilder sb = new StringBuilder();
        for (String s : super.toStrings()) {
            sb.append(s).append("\t");
        }
        return sb.toString();
    }
}
The error I am getting:
Error: java.io.IOException: Initialization of all the collectors failed. Error in last collector was :class StringArrayWritable
at org.apache.hadoop.mapred.MapTask.createSortingCollector(MapTask.java:414)
at org.apache.hadoop.mapred.MapTask.access$100(MapTask.java:81)
at org.apache.hadoop.mapred.MapTask$NewOutputCollector.<init>(MapTask.java:698)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:770)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.lang.ClassCastException: class StringArrayWritable
at java.lang.Class.asSubclass(Class.java:3165)
at org.apache.hadoop.mapred.JobConf.getOutputKeyComparator(JobConf.java:892)
at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.init(MapTask.java:1005)
at org.apache.hadoop.mapred.MapTask.createSortingCollector(MapTask.java:402)
... 9 more
Any help would be greatly appreciated.
Thanks a lot.
You're trying to use a Writable object as the key. In MapReduce the key must implement the WritableComparable interface, while ArrayWritable only implements the Writable interface.
The difference between the two is that the Comparable part requires you to implement a compareTo method so that MapReduce is able to sort and group the keys correctly.
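A minimal sketch of what that could look like, reusing the question's class name and assuming the key parts should simply be compared element by element (not the poster's actual code):
import org.apache.hadoop.io.ArrayWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.Writable;
import org.apache.hadoop.io.WritableComparable;

public class StringArrayWritable extends ArrayWritable
        implements WritableComparable<StringArrayWritable> {

    public StringArrayWritable() {
        super(Text.class);
    }

    public StringArrayWritable(Text[] values) {
        super(Text.class, values);
    }

    @Override
    public int compareTo(StringArrayWritable other) {
        // compare the key parts one by one so MapReduce can sort and group composite keys
        Writable[] a = get();
        Writable[] b = other.get();
        int common = Math.min(a.length, b.length);
        for (int i = 0; i < common; i++) {
            int cmp = ((Text) a[i]).compareTo((Text) b[i]);
            if (cmp != 0) {
                return cmp;
            }
        }
        // shorter keys sort first when all shared parts are equal
        return Integer.compare(a.length, b.length);
    }
}
If you rely on the default HashPartitioner, also override hashCode() (and equals()) so that equal composite keys land in the same reducer.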

Drive V3 API : java.lang.IllegalArgumentException

This is how I create "AbstractInputStreamContent" from inputstream of file:
final Long length = Long.valueOf(filesData.get(uploadedFileName).get("size")).longValue();
final InputStream fileStream = item.openStream(); // FileItemStream item
AbstractInputStreamContent fileContent = new AbstractInputStreamContent(uploadedFileMimeType) {
    @Override
    public boolean retrySupported() {
        return false;
    }

    @Override
    public long getLength() throws IOException {
        return length;
    }

    @Override
    public InputStream getInputStream() throws IOException {
        return fileStream;
    }
};
And "InputStreamContent" as:
InputStreamContent fileContent = new InputStreamContent(uploadedFileMimeType, item.openStream());
fileContent.setLength(Long.valueOf(filesData.get(uploadedFileName).get("size")).longValue());
To replace the old file with a new file I use (both files are in .docx format):
Drive.Files.Update update = driveService.files().update(fileIdOfFileToReplace,fileMeta,fileContent);
update.set("uploadType", "resumable");
update.getMediaHttpUploader().setDirectUploadEnabled(false);
update.getMediaHttpUploader().setChunkSize(MediaHttpUploader.DEFAULT_CHUNK_SIZE);
File updatedFile = update.execute();
Uploading a new file works fine whether I use InputStreamContent or AbstractInputStreamContent, but update gives a java.lang.IllegalArgumentException with both:
java.lang.IllegalArgumentException
at com.google.api.client.repackaged.com.google.common.base.Preconditions.checkArgument(Preconditions.java:111)
at com.google.api.client.util.Preconditions.checkArgument(Preconditions.java:37)
at com.google.api.client.googleapis.media.MediaHttpUploader.setInitiationRequestMethod(MediaHttpUploader.java:872)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.initializeMediaUpload(AbstractGoogleClientRequest.java:237)
at com.google.api.services.drive.Drive$Files$Update.<init>(Drive.java:3163)
at com.google.api.services.drive.Drive$Files.update(Drive.java:3113)
at com.util.DocumentsUtil.updateFile(DocumentsUtil.java:22)
at com.controllers.collab.documents.Documents.fileUpload(Documents.java:165)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:44)
at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:221)
at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:136)
at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:110)
Solved it by adding an exclusion of google-api-client to each Google API Maven dependency, and then adding the google-api-client-1.22.0-SNAPSHOT dependency from the Sonatype repo. It works fine now.

Camel's BridgePropertyPlaceholderConfigurer is not working when using Java config

I'm using Spring Java config and writing a console application with a few Camel routes. I have several property sources in my app, so I use two PropertyPlaceholderConfigurers:
@Configuration
@Import(CamelConfig.class)
@ComponentScan(basePackageClasses = {App.class})
public class Config {
    final static String ENV = System.getProperty("ENV");

    @Bean
    public static BridgePropertyPlaceholderConfigurer properties() {
        final BridgePropertyPlaceholderConfigurer result = new BridgePropertyPlaceholderConfigurer();
        result.setOrder(0);
        result.setIgnoreUnresolvablePlaceholders(true);
        result.setLocations(new ClassPathResource("a/b/c/environments/base.properties"),
                new ClassPathResource("a/b/c/environments/" + ENV + "/env.properties"));
        return result;
    }

    @Bean
    public static BridgePropertyPlaceholderConfigurer dlqAppProperties() {
        final YamlPropertiesFactoryBean yaml = new YamlPropertiesFactoryBean();
        final BridgePropertyPlaceholderConfigurer result = new BridgePropertyPlaceholderConfigurer();
        yaml.setResources(new ClassPathResource("app.yaml"));
        result.setOrder(1);
        result.setIgnoreUnresolvablePlaceholders(true);
        result.setProperties(yaml.getObject());
        return result;
    }
}
As per this doc I'm using the BridgePropertyPlaceholderConfigurer class to make Spring properties available in Camel. Its config is simple too:
@Configuration
public class CamelConfig extends SingleRouteCamelConfiguration {
    @Override
    protected CamelContext createCamelContext() throws Exception {
        final SpringCamelContext result = new SpringCamelContext(getApplicationContext());
        return result;
    }

    @Override
    protected void setupCamelContext(CamelContext camelContext) throws Exception {
    }

    @Bean
    @Override
    public RouteBuilder route() {
        return (new Routes()).builder();
    }
}
Test route (Scala DSL) is simple too:
class Routes extends RouteBuilder {
  "timer://{{foo}}?period=2s" ==> {
    process((exchange) => {
      exchange.getIn.setBody("test")
    })
    to("log:test")
  }
}
But the context does not start, failing with the following exception:
Exception in thread "main" org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'camelContext' defined in class path resource [a/b/c/config/CamelConfig.class]: Invocation of init method failed; nested exception is org.apache.camel.FailedToCreateRouteException: Failed to create route route1: Route(route1)[[From[timer://{{foo}}?period=2s]] -> [process[... because of Failed to resolve endpoint: timer://{{foo}}?period=2s due to: PropertiesComponent with name properties must be defined in CamelContext to support property placeholders.
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1566)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:539)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:476)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:303)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:230)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:299)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:194)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:755)
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:757)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:480)
at org.springframework.context.annotation.AnnotationConfigApplicationContext.<init>(AnnotationConfigApplicationContext.java:84)
at a.b.c.App.main(App.java:13)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
Caused by: org.apache.camel.FailedToCreateRouteException: Failed to create route route1: Route(route1)[[From[timer://{{foo}}?period=2s]] -> [process[... because of Failed to resolve endpoint: timer://{{foo}}?period=2s due to: PropertiesComponent with name properties must be defined in CamelContext to support property placeholders.
at org.apache.camel.model.RouteDefinition.addRoutes(RouteDefinition.java:182)
at org.apache.camel.impl.DefaultCamelContext.startRoute(DefaultCamelContext.java:770)
at org.apache.camel.impl.DefaultCamelContext.startRouteDefinitions(DefaultCamelContext.java:1914)
at org.apache.camel.impl.DefaultCamelContext.doStartCamel(DefaultCamelContext.java:1670)
at org.apache.camel.impl.DefaultCamelContext.doStart(DefaultCamelContext.java:1544)
at org.apache.camel.spring.SpringCamelContext.doStart(SpringCamelContext.java:179)
at org.apache.camel.support.ServiceSupport.start(ServiceSupport.java:61)
at org.apache.camel.impl.DefaultCamelContext.start(DefaultCamelContext.java:1512)
at org.apache.camel.spring.SpringCamelContext.maybeStart(SpringCamelContext.java:228)
at org.apache.camel.spring.SpringCamelContext.afterPropertiesSet(SpringCamelContext.java:104)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1625)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1562)
... 16 more
Caused by: org.apache.camel.ResolveEndpointFailedException: Failed to resolve endpoint: timer://{{foo}}?period=2s due to: PropertiesComponent with name properties must be defined in CamelContext to support property placeholders.
at org.apache.camel.impl.DefaultCamelContext.getEndpoint(DefaultCamelContext.java:477)
at org.apache.camel.util.CamelContextHelper.getMandatoryEndpoint(CamelContextHelper.java:63)
at org.apache.camel.model.RouteDefinition.resolveEndpoint(RouteDefinition.java:192)
at org.apache.camel.impl.DefaultRouteContext.resolveEndpoint(DefaultRouteContext.java:106)
at org.apache.camel.impl.DefaultRouteContext.resolveEndpoint(DefaultRouteContext.java:112)
at org.apache.camel.model.FromDefinition.resolveEndpoint(FromDefinition.java:72)
at org.apache.camel.impl.DefaultRouteContext.getEndpoint(DefaultRouteContext.java:88)
at org.apache.camel.model.RouteDefinition.addRoutes(RouteDefinition.java:890)
at org.apache.camel.model.RouteDefinition.addRoutes(RouteDefinition.java:177)
... 27 more
Caused by: java.lang.IllegalArgumentException: PropertiesComponent with name properties must be defined in CamelContext to support property placeholders.
at org.apache.camel.impl.DefaultCamelContext.resolvePropertyPlaceholders(DefaultCamelContext.java:1121)
at org.apache.camel.impl.DefaultCamelContext.getEndpoint(DefaultCamelContext.java:475)
... 35 more
Looks like the bridge does not work (but I definitely can use placeholders in Spring). What can be the problem?
Looks like if you want to use BridgePropertyPlaceholderConfigurer, you need to instantiate Camel contexts with CamelContextFactoryBean. It has an initPropertyPlaceholder method:
@Override
protected void initPropertyPlaceholder() throws Exception {
    super.initPropertyPlaceholder();
    Map<String, BridgePropertyPlaceholderConfigurer> beans = applicationContext.getBeansOfType(BridgePropertyPlaceholderConfigurer.class);
    if (beans.size() == 1) {
        // setup properties component that uses this beans
        BridgePropertyPlaceholderConfigurer configurer = beans.values().iterator().next();
        String id = beans.keySet().iterator().next();
        LOG.info("Bridging Camel and Spring property placeholder configurer with id: " + id);
        // get properties component
        PropertiesComponent pc = getContext().getComponent("properties", PropertiesComponent.class);
        // replace existing resolver with us
        configurer.setResolver(pc.getPropertiesResolver());
        configurer.setParser(pc.getPropertiesParser());
        String ref = "ref:" + id;
        // use the bridge to handle the resolve and parsing
        pc.setPropertiesResolver(configurer);
        pc.setPropertiesParser(configurer);
        // and update locations to have our as ref first
        String[] locations = pc.getLocations();
        String[] updatedLocations;
        if (locations != null && locations.length > 0) {
            updatedLocations = new String[locations.length + 1];
            updatedLocations[0] = ref;
            System.arraycopy(locations, 0, updatedLocations, 1, locations.length);
        } else {
            updatedLocations = new String[]{ref};
        }
        pc.setLocations(updatedLocations);
    } else if (beans.size() > 1) {
        LOG.warn("Cannot bridge Camel and Spring property placeholders, as exact only 1 bean of type BridgePropertyPlaceholderConfigurer"
                + " must be defined, was {} beans defined.", beans.size());
    }
}
Well, the problem now is to have two bridges, but that's another story..
I had the same problem. Here's what worked for me (inspired by the initPropertyPlaceholder() method):
import org.apache.camel.component.properties.PropertiesComponent;
import org.apache.camel.spring.javaconfig.CamelConfiguration;
import org.apache.camel.spring.spi.BridgePropertyPlaceholderConfigurer;

@Configuration
@ComponentScan
public class AwesomeConfig extends CamelConfiguration {

    private static final String PROPERTIES_BEAN_NAME = "springProperties";

    @Resource(name = PROPERTIES_BEAN_NAME)
    private BridgePropertyPlaceholderConfigurer springProperties;

    @Bean(PROPERTIES_BEAN_NAME)
    public static BridgePropertyPlaceholderConfigurer springProperties() throws Exception {
        BridgePropertyPlaceholderConfigurer configurer = new BridgePropertyPlaceholderConfigurer();
        configurer.setSystemPropertiesMode(BridgePropertyPlaceholderConfigurer.SYSTEM_PROPERTIES_MODE_OVERRIDE);
        String defaultPropertiesPath = buildProperties().getProperty("properties.path");
        String propertiesPath = System.getProperty(PROPERTY_FILE_SYSTEM_PROPERTY, defaultPropertiesPath);
        configurer.setLocations(new ClassPathResource("META-INF/application.properties"));
        return configurer;
    }

    @Bean
    public PropertiesComponent camelProperties() throws Exception {
        PropertiesComponent camelProperties = new PropertiesComponent();
        springProperties.setParser(camelProperties.getPropertiesParser());
        springProperties.setResolver(camelProperties.getPropertiesResolver());
        camelProperties.setSystemPropertiesMode(springProperties.getSystemPropertiesMode());
        camelProperties.setPropertiesResolver(springProperties);
        camelProperties.setPropertiesParser(springProperties);
        camelProperties.setLocation("ref:" + PROPERTIES_BEAN_NAME);
        return camelProperties;
    }

    @Override
    protected void setupCamelContext(CamelContext camelContext) throws Exception {
        camelContext.addComponent("properties", camelProperties());
    }
}
And here's how I use it:
import org.apache.camel.spring.javaconfig.Main;

public class AwesomeMain extends Main {

    public AwesomeMain() {
        setConfigClass(AwesomeConfig.class);
    }

    public static void main(String... args) throws Exception {
        AwesomeMain main = new AwesomeMain();
        instance = main;
        main.run(args);
    }
}
Try to rename your first BridgePropertyPlaceholderConfigurer bean (method's name in your case).
Look what I have hacked up. I haven't fully tested it but wanted to share; it should work with Spring 5.x. Basically it copies all of the Environment into Camel's properties, so I don't use Camel's "bridge" at all. One thing I am not sure about yet is whether I have to put them into the "initial" or the "override" properties:
@Configuration
public static class CamelConfig extends CamelConfiguration {

    @Autowired
    private ConfigurableEnvironment environment;

    @Bean
    ... some beans ...

    //@Bean -- haven't yet found out if we need it as a bean ...
    private PropertiesComponent camelProperties() throws Exception {
        PropertiesComponent camelProperties = new PropertiesComponent();
        // just brutally copy all the properties from the environment
        HashSet<String> propertyNames = new HashSet<String>(100);
        for (PropertySource ps : environment.getPropertySources()) {
            if (ps instanceof MapPropertySource) {
                MapPropertySource mps = (MapPropertySource) ps;
                propertyNames.addAll(Arrays.asList(mps.getPropertyNames()));
            }
        }
        Properties allProps = new Properties();
        for (String prop : propertyNames) {
            allProps.setProperty(prop, environment.getProperty(prop));
        }
        camelProperties.setInitialProperties(allProps);
        // TODO: check if this is better or worse
        //camelProperties.setOverrideProperties(allProps);
        return camelProperties;
    }

    @Override
    protected void setupCamelContext(CamelContext camelContext) throws Exception {
        ... some configs. ...
        camelContext.addComponent("properties", camelProperties());
    }
}
