GraphQL mutation implementation in Spring Boot using @GraphQlRepository

I have the following Spring Boot reactive "stack" with GraphQL and MongoDB (in Kotlin):
spring-boot-starter-webflux
spring-boot-starter-graphql
spring-boot-starter-data-mongodb-reactive
A very basic example of a server that exposes a GraphQL API to query customers:
import org.springframework.boot.autoconfigure.SpringBootApplication
import org.springframework.boot.runApplication
import org.springframework.data.annotation.Id
import org.springframework.data.mongodb.core.mapping.Document
import org.springframework.data.mongodb.repository.ReactiveMongoRepository
import org.springframework.graphql.data.GraphQlRepository
@SpringBootApplication
class ServerApplication
fun main(args: Array<String>) {
runApplication<ServerApplication>(*args)
}
@Document
data class Customer(
@Id val id: String? = null,
val name: String?,
)
@GraphQlRepository
interface CustomerRepository : ReactiveMongoRepository<Customer, String>
In combination with the following GraphQL schema file
type Customer {
id: ID
name: String
}
type Query {
customers: [Customer]
customerById(id: ID!): Customer
}
type Mutation {
createCustomer(name: String!): Customer
}
It is already possible to query customers / customerById and retrieve the data accordingly using e.g.:
{
customers { id name }
customerById(id: "...") { name }
}
This is made possible by the @GraphQlRepository annotation, which automatically registers a handler that fetches the data directly from the database.
However, I can't find anything in the documentation about how mutations are implemented, i.e. whether there is a similarly simple automatic solution as for the queries, or whether this has to be implemented manually in a controller with @MutationMapping:
@Controller
class CustomerController(val customerRepository: CustomerRepository) {
@MutationMapping
fun createCustomer(@Argument name: String?): Mono<Customer> {
return customerRepository.save(Customer(name = name))
}
}
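With a controller like that in place, the createCustomer mutation from the schema can be invoked with a request document along these lines (the name value and the selection set are just an example):

```graphql
mutation {
  createCustomer(name: "Jane Doe") {
    id
    name
  }
}
```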

As far as I know, mutations must be implemented through a dedicated @MutationMapping-annotated method in the controller, like you suggest. I have done a couple of exercises, and the only difference from your example is that I used a dedicated input type, both in the schema and in the Java codebase, to define the argument; in your case, a String will do.
The schema:
type Query {
obras: [Obra]
obrasPorArtista(apellidoArtista: String!): [Obra]
}
type Mutation {
addObra(nueva: ObraInput): Obra
}
type Obra {
id: ID
titulo: String
precio: Float
}
input ObraInput {
titulo: String
artista: String
precio: Float
}
The Controller (the service is injected):
@MutationMapping
public Mono<Obra> addObra(@Argument ObraInput nueva){
return obraService.guardarObra(nueva);
}
The ObraInput:
public record ObraInput(String titulo, String artista, double precio) {}
(Obra is an entity with the JPA annotations, columns, etc.)
Hope it helps!

Related

Spring data elasticsearch how to create repository method for keyword field

Let's say I have a mapping like this, and I want to search by the "requestId.keyword" field to fetch exact-match requests. How can I implement this with the Spring Data Elasticsearch repository without using the @Query annotation?
"requestId": {
"type": "text",
"analyzer": "1_to_15_analyzer_without_space",
"search_analyzer": "all_symbols_and_fold_analyzer",
"fields": {
"keyword": {
"type": "keyword"
}
}
}
This is not possible with the mechanism that builds queries by introspecting the method name. The first idea would be to have something like this (I am using a Foo entity here):
SearchHits<Foo> searchByRequestId_Keyword(String keyword);
The analysis of the method name is done in the spring-data-commons module, which only uses the names of the Java properties of an entity (possibly nested). The keyword subfield, however, only exists in Elasticsearch and, if not auto-created, in the @MultiField annotation. Since the code that parses the method name does not use store-specific information, an approach like this will not work; it fails with an error saying that keyword is not a property of text, which is correct for the Java object.
What you can do is to first add a custom repository fragment interface:
public interface FooKeywordRepository {
SearchHits<Foo> searchByRequestIdKeyword(String keyword);
}
and provide an implementation that must be named like the interface with Impl as suffix:
import org.elasticsearch.index.query.QueryBuilders;
import org.springframework.data.elasticsearch.core.ElasticsearchOperations;
import org.springframework.data.elasticsearch.core.SearchHits;
import org.springframework.data.elasticsearch.core.query.Criteria;
import org.springframework.data.elasticsearch.core.query.CriteriaQuery;
import org.springframework.data.elasticsearch.core.query.NativeSearchQueryBuilder;
import org.springframework.data.elasticsearch.core.query.Query;
public class FooKeywordRepositoryImpl implements FooKeywordRepository {
private final ElasticsearchOperations operations;
public FooKeywordRepositoryImpl(ElasticsearchOperations operations) {
this.operations = operations;
}
@Override
public SearchHits<Foo> searchByRequestIdKeyword(String keyword) {
Query query1 = new NativeSearchQueryBuilder()
.withQuery(QueryBuilders.termQuery("requestId.keyword", keyword))
.build();
Query query2 = new CriteriaQuery(Criteria.where("requestId.keyword").is(keyword));
return operations.search(query1, Foo.class); // could be query2 as well
}
}
You get an ElasticsearchOperations injected and use it to execute a query that you build yourself. I have put in two ways to build the query; both work.
Your repository definition to use would then be:
public interface FooRepository extends ElasticsearchRepository<Foo, String>, FooKeywordRepository {
// other custom methods if needed
}

JSON key is missing (using @JsonComponent in Spring Boot with Kotlin)

Thanks for reading this question; this problem has me confused.
I created code that returns JSON data like below.
@RestController
class JsonTestController {
@GetMapping("jsonTest")
fun jsonTest(): ResponseEntity<HaveBoolean> {
val value = BooleanValue(true)
return ResponseEntity.ok(HaveBoolean(value))
}
data class BooleanValue(val value: Boolean)
data class HaveBoolean(
val isAdmin: BooleanValue,
)
}
and the @JsonComponent is below.
@JsonComponent
class BooleanValueJson {
class Serializer : JsonSerializer<JsonTestController.BooleanValue>() {
override fun serialize(value: JsonTestController.BooleanValue, gen: JsonGenerator, serializers: SerializerProvider) {
gen.writeBoolean(value.value)
}
}
class Deserializer : JsonDeserializer<JsonTestController.BooleanValue>() {
override fun deserialize(p: JsonParser, ctxt: DeserializationContext): JsonTestController.BooleanValue =
JsonTestController.BooleanValue(p.valueAsBoolean)
}
}
When I request localhost:8082/jsonTest, I get an empty JSON response ({}).
But when I try another variable name such as hoge, i.e. code like below:
data class HaveBoolean(
val hoge: BooleanValue,
)
then, when I request again, I get the correct JSON ({"hoge": true}).
Can't I use the name isAdmin in a data class?
Do you have any idea why this is happening?
Thanks.
This is a known issue with Jackson in Kotlin. Jackson tries to strip the "is" prefix from the accessor name, but the Kotlin data class implementation does not generate a matching getter without "is", resulting in a mismatch. You can add @get:JsonProperty("isAdmin") to the property and it should work.
data class HaveBoolean(
@get:JsonProperty("isAdmin")
val isAdmin: BooleanValue,
)

spring-data-neo4j v6: No converter found capable of converting from type [MyDTO] to type [org.neo4j.driver.Value]

Situation
I'm migrating a Kotlin Spring Data Neo4j application from spring-data-neo4j version 5.2.0.RELEASE to version 6.0.11.
The original application has several repository interfaces with custom queries that take some DTO as a parameter and use the various DTO fields to construct the query. All queries of this type currently fail with
org.springframework.core.convert.ConverterNotFoundException: No converter found capable of converting from type [MyDTO] to type [org.neo4j.driver.Value]
The reference documentation for spring-data-neo4j v6 only provides examples where the parameters passed to custom query methods of a @Repository interface are of the same type as the @Node class associated with that repository. The documentation does not explicitly state that only parameters of the node class are allowed.
Question
Is there any way to pass an arbitrary DTO (not being a @Node class) to a custom query method in a @Repository interface in spring-data-neo4j v6, like it was possible in v5?
Code samples
Example node entity
@Node
data class MyEntity(
@Id
val attr1: String,
val attr2: String,
val attr3: String
)
Example DTO
data class MyDTO(
val field1: String,
val field2: String
)
Example Repository interface
@Repository
interface MyRepository : PagingAndSortingRepository<MyEntity, String> {
// ConverterNotFoundException is thrown when this method is called
@Query("MATCH (e:MyEntity {attr1: {0}.field1}) " +
"CREATE (e)-[l:LINK]->(n:OtherEntity {attr2: {0}.field2})")
fun doSomethingWithDto(dto: MyDTO)
}
Solutions tried so far
Annotate DTO as if it were a Node entity
Based on the following found in the reference docs https://docs.spring.io/spring-data/neo4j/docs/current/reference/html/#custom-queries.parameters
Mapped entities (everything with a @Node) passed as parameter to a
function that is annotated with a custom query will be turned into a
nested map.
@Node
data class MyDTO(
@Id
val field1: String,
val field2: String
)
Replace {0} with $0 in custom query
Based on the following found in the reference docs https://docs.spring.io/spring-data/neo4j/docs/current/reference/html/#custom-queries.parameters
You do this exactly the same way as in a standard Cypher query issued
in the Neo4j Browser or the Cypher-Shell, with the $ syntax (from
Neo4j 4.0 on upwards, the old {foo} syntax for Cypher parameters has
been removed from the database).
...
[In the given listing] we are referring to the parameter by its name.
You can also use $0 etc. instead.
@Repository
interface MyRepository : PagingAndSortingRepository<MyEntity, String> {
// ConverterNotFoundException is thrown when this method is called
@Query("MATCH (e:MyEntity {attr1: $0.field1}) " +
"CREATE (e)-[l:LINK]->(n:OtherEntity {attr2: $0.field2})")
fun doSomethingWithDto(dto: MyDTO)
}
Details
spring-boot-starter: v2.4.10
spring-data-neo4j: v6.0.12
neo4j-java-driver: v4.1.4
Neo4j server version: v3.5.29
RTFM Custom conversions ...
Found the solution myself. Hopefully someone else may benefit from this as well.
Solution
Create a custom converter
import mypackage.model.*
import com.fasterxml.jackson.core.type.TypeReference
import com.fasterxml.jackson.module.kotlin.jacksonObjectMapper
import org.neo4j.driver.Value
import org.neo4j.driver.Values
import org.springframework.core.convert.TypeDescriptor
import org.springframework.core.convert.converter.GenericConverter
import org.springframework.core.convert.converter.GenericConverter.ConvertiblePair
import java.util.HashSet
class DtoToNeo4jValueConverter : GenericConverter {
override fun getConvertibleTypes(): Set<ConvertiblePair>? {
val convertiblePairs: MutableSet<ConvertiblePair> = HashSet()
convertiblePairs.add(ConvertiblePair(MyDTO::class.java, Value::class.java))
return convertiblePairs
}
override fun convert(source: Any?, sourceType: TypeDescriptor, targetType: TypeDescriptor?): Any? {
return if (MyDTO::class.java.isAssignableFrom(sourceType.type)) {
// generic way of converting an object into a map
val dataclassAsMap = jacksonObjectMapper().convertValue(source as MyDTO, object :
TypeReference<Map<String, Any>>() {})
Values.value(dataclassAsMap)
} else null
}
}
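The jacksonObjectMapper().convertValue(...) call above is just a generic way of turning the DTO into a Map<String, Any?> before wrapping it with Values.value(...). If you'd rather not involve Jackson for that step, a minimal stdlib-only sketch of the same idea could use plain Java reflection (the toMap helper below is hypothetical, not part of the driver or Spring):

```kotlin
data class MyDTO(
    val field1: String,
    val field2: String
)

// Hypothetical helper: read each declared field of the object via
// Java reflection and collect the name/value pairs into a map.
fun toMap(obj: Any): Map<String, Any?> =
    obj.javaClass.declaredFields.associate { field ->
        field.isAccessible = true
        field.name to field.get(obj)
    }

fun main() {
    // The resulting map is what Values.value(...) would then wrap
    println(toMap(MyDTO("a", "b")))
}
```

This only handles flat data classes; nested objects would need recursion, which is exactly the kind of work the Jackson call does for free.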
Register custom converter in config
import org.springframework.context.annotation.Bean
import org.springframework.context.annotation.Configuration
import org.springframework.data.neo4j.core.convert.Neo4jConversions
import org.springframework.core.convert.converter.GenericConverter
import java.util.*
@Configuration
class MyNeo4jConfig {
@Bean
fun neo4jConversions(): Neo4jConversions {
val additionalConverters: Set<GenericConverter?> = Collections.singleton(DtoToNeo4jValueConverter())
return Neo4jConversions(additionalConverters)
}
}
It's ridiculous that the framework would force you to write a custom converter for this. I made a @Transient object in my overridden User class for a limited set of updateable user-profile fields, and I'm encountering the same error. I guess I will just have to break the object up into its component String fields in the method parameters to get around this problem. What a mess.
@Query("MATCH (u:User) WHERE u.username = :#{#username} SET u.firstName = :#{#up.firstName}, u.lastName = :#{#up.lastName}, u.intro = :#{#up.intro} RETURN u")
Mono<User> update(@Param("username") String username, @Param("up") UserProfile up);
No converter found capable of converting from type [...UserProfile] to type [org.neo4j.driver.Value]

Javax Validation: custom enum constraint in Kotlin

I'm trying to create a custom annotation and validator to use with the Javax Validation API, and I'm having trouble accessing the values of an enum.
The objective of the annotation and the validator is to check whether an input value is present among the enum's values.
This is the annotation class
import javax.validation.Constraint
import javax.validation.Payload
import kotlin.reflect.KClass
@kotlin.annotation.Target(
AnnotationTarget.FIELD,
)
@kotlin.annotation.Retention(AnnotationRetention.RUNTIME)
@MustBeDocumented
@Constraint(validatedBy = [ValueOfEnumValidator::class])
annotation class ValueOfEnum(
val enumClass: KClass<Enum<*>>,
val message: String ="",
val groups: Array<KClass<*>> = [],
val payload: Array<KClass<out Payload>> = []
)
This is the validator implementation
import javax.validation.ConstraintValidator
import javax.validation.ConstraintValidatorContext
class ValueOfEnumValidator: ConstraintValidator<ValueOfEnum, CharSequence> {
private val acceptedValues: MutableList<String> = mutableListOf()
override fun initialize(constraintAnnotation: ValueOfEnum) {
super.initialize(constraintAnnotation)
acceptedValues.addAll(constraintAnnotation.enumClass.java
.enumConstants
.map {it.name}
)
}
override fun isValid(value: CharSequence?, context: ConstraintValidatorContext): Boolean {
return if (value == null) {
true
} else acceptedValues.contains(value.toString())
}
}
I'm aiming to use the annotation like this:
@field:ValueOfEnum(enumClass = SortDirectionEnum::class, message = "{variants.sorted.sort.direction.not.valid}")
var sortDirection: String? = null
But my IDE reports the following error on the enumClass parameter:
Type mismatch.
Required:KClass<Enum<*>>
Found:KClass<SortDirectionEnum>
How can I make the annotation generic enough to support different enums, and fix this issue?
You are restricting enumClass to exactly KClass<Enum<*>>; since Enum is an abstract class, nothing can actually be passed there. What you want is to also allow subclasses of Enum, which can be achieved with the out variance modifier:
val enumClass: KClass<out Enum<*>>,
https://kotlinlang.org/docs/generics.html
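To see why the out makes the difference, here is a minimal standalone sketch (the EnumClassHolder class is a hypothetical stand-in for the annotation's enumClass parameter): KClass<SortDirectionEnum> is a subtype of KClass<out Enum<*>>, so concrete enum classes can be passed, and the same enumConstants lookup the validator uses still works:

```kotlin
import kotlin.reflect.KClass

enum class SortDirectionEnum { ASC, DESC }

// Hypothetical stand-in for the annotation's enumClass parameter.
// Thanks to `out`, a KClass of any concrete enum is assignable here;
// with plain KClass<Enum<*>> this would not compile.
class EnumClassHolder(val enumClass: KClass<out Enum<*>>)

// Same logic as the validator's initialize(): collect the constant names
fun acceptedValues(holder: EnumClassHolder): List<String> =
    holder.enumClass.java.enumConstants.map { it.name }

fun main() {
    val holder = EnumClassHolder(SortDirectionEnum::class)
    println(acceptedValues(holder)) // [ASC, DESC]
}
```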

Using a custom ID in a Spring Data REST repository backed by MongoDB

I'm working on a new project using Spring and MongoDB. I'm a novice with both, so bear with me; I couldn't find a definitive answer to this question.
I'm using spring-boot-starter-data-rest and have a repository like this:
@RepositoryRestResource(collectionResourceRel = "widget", path = "widget")
interface WidgetRepository : MongoRepository<Widget, ObjectId> {
fun findByType(@Param("type") type: String): List<Widget>
}
For an entity like this:
data class Widget @JsonCreator constructor(@JsonProperty val type: String) {
@Id
lateinit var id: ObjectId
}
This automatically gives you a CRUD API using the Mongo document ID:
GET /widget/{mongo doc id}
GET /widget/search/findByType?type=
POST /widget
PUT /widget
PATCH /widget
But I don't want to use the Mongo document ID. I want to introduce a secondary identifier and use that everywhere in the API. This is because the "widgets" are items in the physical world that are barcoded, and we don't want to print the Mongo document ID in the barcode.
Obviously we can implement this using Spring REST API tools, eg
@GetMapping("/widget/{barcode}/")
fun getByBarcode(@PathVariable barcode: String): Widget {
return widgetRepository.findByBarcode(barcode)
}
etc. But is there any clever way to get @RepositoryRestResource to build its automagic API for us with a custom ID? Maybe by implementing CrudRepository<Widget, Barcode> in such a way that it wraps a MongoRepository<Widget, ObjectId>? I'm not familiar enough with how Spring works under the hood to know whether anything like this is even possible.
Thanks in advance
I think you are looking for an EntityLookup:
SPI to customize which property of an entity is used as unique identifier and how the entity instance is looked up from the backend.
First (sorry if I make any mistakes, I'm not used to programming in Kotlin), you need to include the barcode property in your entity:
data class Widget @JsonCreator constructor(@JsonProperty val type: String, @JsonProperty val barcode: String) {
@Id
}
Then, modify your repository to define a new method that returns a Widget given its barcode:
@RepositoryRestResource(collectionResourceRel = "widget", path = "widget")
interface WidgetRepository : MongoRepository<Widget, ObjectId> {
fun findByBarcode(@Param("barcode") barcode: String): Optional<Widget>
fun findByType(@Param("type") type: String): List<Widget>
}
Finally, implement a RepositoryRestConfigurer and register the EntityLookup through the EntityLookupRegistrar:
@Component
class RestRepositoryConfigurator : RepositoryRestConfigurer {
override fun configureRepositoryRestConfiguration(config: RepositoryRestConfiguration) {
config.withEntityLookup()
.forRepository(WidgetRepository::class.java)
.withIdMapping(Widget::barcode)
.withLookup(WidgetRepository::findByBarcode)
}
}
Please review section 15.1 of the Spring Data REST documentation if you need more information.
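With that registration in place, the exported resource URIs use the barcode instead of the Mongo document id. For example (the barcode value is hypothetical):

```
GET /widget/0123456789    -> resolved via WidgetRepository.findByBarcode
```

Item links in the generated responses (self links, etc.) are rendered with the barcode as well, so clients never see the ObjectId.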
