Spring Data MongoDB - $eq within $project support

I'm currently writing an aggregation query for MongoDB in my Spring project, in which I'm using the $project operator. Within this operator I would like to compare two fields and return the result as the projected "matches" key. Here's the MongoDB shell equivalent (which works):
{ $project:
    { matches:
        { $eq: ["$lastDate", "$meta.date"] }
    }
}
I've read the Spring Data MongoDB documentation and found some useful info about ProjectionOperation's andExpression method, which uses SpEL. The resulting Java code from my investigation was:
new ProjectionOperation().andExpression("lastDate == meta.date").as("matches")
Unfortunately I'm receiving this exception:
java.lang.IllegalArgumentException: Unsupported Element:
org.springframework.data.mongodb.core.spel.OperatorNode#70c1152a Type: class org.springframework.data.mongodb.core.spel.OperatorNode You probably have a syntax error in your SpEL expression!
As far as I've checked, Spring Data MongoDB handles all arithmetic operators correctly but cannot handle the comparison ones. So my question: is there any other way to create such a query with Spring Data MongoDB? Or am I missing something crucial about SpEL?

I resolved this issue by passing a JSON aggregate command (built with DBObjects in order to preserve the flexibility of the query) to MongoDB, i.e. via:
MongoOperations#executeCommand(DBObject command)
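For illustration, a minimal sketch of what that can look like, assuming the documents live in a hypothetical "events" collection (the collection name is not from the original question):

import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.CommandResult;
import com.mongodb.DBObject;
import org.springframework.data.mongodb.core.MongoOperations;

public class AggregateCommandExample {

    public static CommandResult runMatchesProjection(MongoOperations mongoOps) {
        // Build { $eq: ["$lastDate", "$meta.date"] }
        BasicDBList eqArgs = new BasicDBList();
        eqArgs.add("$lastDate");
        eqArgs.add("$meta.date");

        // Build { $project: { matches: { $eq: [...] } } }
        DBObject project = new BasicDBObject("$project",
                new BasicDBObject("matches", new BasicDBObject("$eq", eqArgs)));

        BasicDBList pipeline = new BasicDBList();
        pipeline.add(project);

        // { aggregate: "events", pipeline: [ ... ] } -- "events" is hypothetical
        DBObject command = new BasicDBObject("aggregate", "events")
                .append("pipeline", pipeline);
        return mongoOps.executeCommand(command);
    }
}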

Related

Exception: org.springframework.cloud.gcp.data.datastore.core.mapping.DatastoreDataException: Unable to convert class PageRequest to Datastore

I have an issue with Spring Data. When I use the @Query annotation with Pageable, for example:
@Query("SELECT * FROM myTable WHERE myTable.t_blob_name > NULL")
Slice<MyTable> findAllWithPageable(Pageable pageable);
I get the following exception (the application runs on Google Cloud Platform):
org.springframework.cloud.gcp.data.datastore.core.mapping.DatastoreDataException: Unable to convert class org.springframework.data.domain.PageRequest to Datastore supported type.
at org.springframework.cloud.gcp.data.datastore.core.convert.DatastoreNativeTypes.wrapValue (DatastoreNativeTypes.java:166)
at org.springframework.cloud.gcp.data.datastore.core.convert.TwoStepsConversions.convertOnWriteSingle (TwoStepsConversions.java:320)
at org.springframework.cloud.gcp.data.datastore.repository.query.GqlDatastoreQuery.bindArgsToGqlQuery (GqlDatastoreQuery.java:233)
at org.springframework.cloud.gcp.data.datastore.repository.query.GqlDatastoreQuery.execute (GqlDatastoreQuery.java:118)
at org.springframework.data.repository.core.support.RepositoryFactorySupport$QueryExecutorMethodInterceptor.doInvoke (RepositoryFactorySupport.java:605)
at org.springframework.data.repository.core.support.RepositoryFactorySupport$QueryExecutorMethodInterceptor.lambda$invoke$3 (RepositoryFactorySupport.java:595)
at org.springframework.data.repository.core.support.RepositoryFactorySupport$QueryExecutorMethodInterceptor.invoke (RepositoryFactorySupport.java:595)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed (ReflectiveMethodInvocation.java:186)
at org.springframework.data.projection.DefaultMethodInvokingMethodInterceptor.invoke (DefaultMethodInvokingMethodInterceptor.java:59)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed (ReflectiveMethodInvocation.java:186)
at org.springframework.aop.interceptor.ExposeInvocationInterceptor.invoke (ExposeInvocationInterceptor.java:93)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed (ReflectiveMethodInvocation.java:186)
at org.springframework.data.repository.core.support.SurroundingTransactionDetectorMethodInterceptor.invoke (SurroundingTransactionDetectorMethodInterceptor.java:61)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed (ReflectiveMethodInvocation.java:186)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke (JdkDynamicAopProxy.java:212)
It uses Google Cloud Datastore's GQL syntax, and the main question is: how do I resolve this problem?
There is a list of supported classes that can be used within GQL; you can find it in the reference documentation (section 158.5.2) or directly in the code on GitHub if you prefer. The Pageable class is not among them.
I noticed that you are not actually using this parameter in the query, so I suggest dropping it and declaring Slice<MyTable> findAllWithPageable() instead. You may find a similar query here.
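A minimal sketch of what the repository might then look like (MyTable and the repository interface name are assumptions based on the question):

import org.springframework.cloud.gcp.data.datastore.repository.DatastoreRepository;
import org.springframework.cloud.gcp.data.datastore.repository.query.Query;
import org.springframework.data.domain.Slice;

public interface MyTableRepository extends DatastoreRepository<MyTable, Long> {

    // Same GQL string as before; only the unused Pageable parameter is dropped.
    @Query("SELECT * FROM myTable WHERE myTable.t_blob_name > NULL")
    Slice<MyTable> findAllWithPageable();
}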
I hope it will help!

XQuery: Using xs:time($arg) vs cast as xs:time

In one of my projects I'm using
XQuery 3.0
Saxon HE 9.8 (transitive because of Camel)
Spring Boot 2.1.0
Apache Camel 2.22.0
I consume an XML message in which the following element occurs:
<mytimeelement></mytimeelement>
As you can see it is empty, so I thought that the following XQuery expression would return an empty sequence:
$transaction/*:flags/*:mytimeelement
Unfortunately this seems not to be the case, because wrapping the XQuery expression from above in an xs:time($arg) call like:
xs:time($transaction/*:flags/*:mytimeelement)
does not return an empty sequence as I would have expected, but instead raises an exception:
Invalid time "" (too short)
The thing is: I want to use xs:time($arg) as validation, so that if a value is present in the element it has to have the correct format, but if the element is empty it doesn't matter. I do the same not only with xs:time but also with xs:date and xs:decimal.
My question now is: why is the expression returning an empty string rather than an empty sequence? Or would I be better off using cast as xs:time instead?

Mongo equivalent of Spring 'Containing'

In my MongoRepository I have a method called:
findByParticipantIdsContaining(String participantId);
What is the mongo operation used for Containing? I thought it'd be something like:
Criteria containsParticipantId = where(participantId).in("participantIds");
but it's not...
EDIT
I think it may be:
Criteria containsParticipantId = where("participantIds").is(participantId);
I'll give it a test.
The Spring Containing keyword behaves like a LIKE operation in MySQL. In MongoDB a LIKE operator doesn't exist; instead you can use the is operator as you suggested, but with a regex as the value.
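A minimal sketch of both forms, assuming an array field participantIds and a hypothetical string field name (only participantIds comes from the original question):

import static org.springframework.data.mongodb.core.query.Criteria.where;

import java.util.regex.Pattern;

import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

public class ContainingExamples {

    // Array membership: an equality match against an array field matches any
    // document whose "participantIds" array contains the given value.
    public static Query byParticipantId(String participantId) {
        Criteria c = where("participantIds").is(participantId);
        return new Query(c);
    }

    // String "Containing": emulate SQL LIKE %fragment% with a regex on a string field.
    public static Query byNameFragment(String fragment) {
        Criteria c = where("name").regex(".*" + Pattern.quote(fragment) + ".*");
        return new Query(c);
    }
}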

Parquet-MR AvroParquetWriter - how to convert data to Parquet (with Specific Mapping)

I'm working on a tool for converting data from a homegrown format to Parquet and JSON (for use in different settings with Spark, Drill and MongoDB), using Avro with Specific Mapping as the stepping stone. I have to support conversion of new data on a regular basis and on client machines, which is why I'm trying to write my own standalone conversion tool with an (Avro|Parquet|JSON) switch instead of using Drill, Spark or other tools as converters, as I probably would if this were a one-time job. I'm basing the whole thing on Avro because this seems like the easiest way to get conversion to Parquet and JSON under one hood.
I used Specific Mapping to profit from static type checking, wrote an IDL, converted that to a schema.avsc, generated classes and set up a sample conversion with the specific constructor, but now I'm stuck configuring the writers. All Avro-to-Parquet conversion examples I could find [0] use AvroParquetWriter with deprecated signatures (mostly: Path file, Schema schema) and Generic Mapping.
AvroParquetWriter has only one non-deprecated constructor, with this signature:
AvroParquetWriter(
    Path file,
    WriteSupport<T> writeSupport,
    CompressionCodecName compressionCodecName,
    int blockSize,
    int pageSize,
    boolean enableDictionary,
    boolean enableValidation,
    WriterVersion writerVersion,
    Configuration conf
)
Most of the parameters are not hard to figure out but WriteSupport<T> writeSupport throws me off. I can't find any further documentation or an example.
Staring at the source of AvroParquetWriter, I see GenericData model pop up a few times, but only one line mentioning SpecificData: GenericData model = SpecificData.get();
So I have a few questions:
1) Does AvroParquetWriter not support Avro Specific Mapping? Or does it, by means of that SpecificData.get() method? The comment "Utilities for generated Java classes and interfaces." over SpecificData.class seems to suggest that, but how exactly should I proceed?
2) What's going on in the AvroParquetWriter constructor, is there an example or some documentation to be found somewhere?
3) More specifically: the signature of the WriteSupport method asks for Schema avroSchema and GenericData model. What does GenericData model refer to? Maybe I can't see the forest for the trees here...
To give an example of what I'm aiming for, my central piece of Avro conversion code currently looks like this:
DatumWriter<MyData> avroDatumWriter = new SpecificDatumWriter<>(MyData.class);
DataFileWriter<MyData> dataFileWriter = new DataFileWriter<>(avroDatumWriter);
dataFileWriter.create(schema, avroOutput);
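(For completeness, the remaining steps of that Avro conversion would be along these lines, where record stands in for a hypothetical MyData instance:)
dataFileWriter.append(record);
dataFileWriter.close();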
The Parquet equivalent currently looks like this:
AvroParquetWriter<SpecificRecord> parquetWriter = new AvroParquetWriter<>(parquetOutput, schema);
but this is no more than a beginning, and it's modeled after the examples I found using the deprecated constructor, so it will have to change anyway.
Thanks,
Thomas
[0] Hadoop: The Definitive Guide (O'Reilly); https://gist.github.com/hammer/76996fb8426a0ada233e; http://www.programcreek.com/java-api-example/index.php?api=parquet.avro.AvroParquetWriter
Try AvroParquetWriter.builder:
MyData obj = ... // should be an Avro object
ParquetWriter<MyData> pw = AvroParquetWriter.<MyData>builder(file)
        .withSchema(obj.getSchema())
        .build();
pw.write(obj);
pw.close();
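Regarding the Specific Mapping question: on recent parquet-avro versions the builder also exposes a withDataModel(...) hook, which is presumably how a specific data model gets wired in. A slightly fuller, hedged sketch (file, MyData and the codec choice are assumptions; verify withDataModel against your parquet-avro version):

import java.io.IOException;

import org.apache.avro.specific.SpecificData;
import org.apache.hadoop.fs.Path;
import org.apache.parquet.avro.AvroParquetWriter;
import org.apache.parquet.hadoop.ParquetWriter;
import org.apache.parquet.hadoop.metadata.CompressionCodecName;

public class ParquetConversion {

    public static void writeOne(Path file, MyData obj) throws IOException {
        try (ParquetWriter<MyData> writer = AvroParquetWriter.<MyData>builder(file)
                .withSchema(obj.getSchema())
                // Assumption: selects Specific Mapping instead of the GenericData default.
                .withDataModel(SpecificData.get())
                .withCompressionCodec(CompressionCodecName.SNAPPY)
                .build()) {
            writer.write(obj);
        }
    }
}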
Thanks.

Calling HSQLDB IDENTITY function with CallableStatement to get output

I'm trying to use CallableStatements to get the value of IDENTITY() in HSQLDB from Java JDBC.
I can prepareCall fine. The issue is with registerOutParameter: I get "parameter index out of range" no matter what index I pass in.
I've tried SQL snippets like "{? = CALL IDENTITY()}" with no luck.
Any clues? Am I completely off track in how to invoke HSQLDB function routines from JDBC?
Instead of using IDENTITY(), use getGeneratedKeys() to retrieve any keys generated by the (insert) statement.
Note that you do need to use one of the Statement.execute... or Connection.prepare... methods that will enable this feature.
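A minimal sketch of that approach (the items table and its columns are hypothetical):

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class GeneratedKeyExample {

    public static long insertAndGetKey(Connection conn, String name) throws SQLException {
        // RETURN_GENERATED_KEYS switches on getGeneratedKeys() for this statement.
        try (PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO items (name) VALUES (?)", Statement.RETURN_GENERATED_KEYS)) {
            ps.setString(1, name);
            ps.executeUpdate();
            try (ResultSet keys = ps.getGeneratedKeys()) {
                if (keys.next()) {
                    return keys.getLong(1);
                }
                throw new SQLException("No generated key returned");
            }
        }
    }
}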
Gah.
http://sourceforge.net/tracker/index.php?func=detail&aid=3530755&group_id=23316&atid=378134
Output parameters for function invocation are not supported. Use executeQuery and grab the ResultSet.
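In code, the workaround looks roughly like this:

import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class IdentityQueryExample {

    public static long lastIdentity(Connection conn) throws SQLException {
        // HSQLDB lets you query a function result directly with CALL.
        try (Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery("CALL IDENTITY()")) {
            rs.next();
            return rs.getLong(1);
        }
    }
}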
