DGS Framework: how to provide additional data to the data fetcher?

I have an example @DgsQuery:
@DgsQuery
public List<Sensor> sensors(DgsDataFetchingEnvironment dfe) {
    SensorType sensorType = SensorType.newBuilder()
            .id(UUID.randomUUID().toString())
            .build();
    Sensor sensor = new Sensor();
    sensor.setId(UUID.randomUUID().toString());
    sensor.setName("foo");
    sensor.setSensorType(sensorType);
    return Arrays.asList(sensor);
}
And a data fetcher:
@DgsData(parentType = DgsConstants.SENSOR.TYPE_NAME, field = DgsConstants.SENSOR.SensorType)
public CompletableFuture<SensorType> sensorType(DgsDataFetchingEnvironment dfe) {
    Sensor sensor = dfe.getSource();
    UUID sensorTypeId = UUID.fromString(sensor.getSensorType().getId());
    // fetch sensorType by ID from DB, map to generated class and return
    LOGGER.info("Would now fetch sensorType with ID {} via RPC or whatever you want ;)", sensorTypeId);
    return null;
}
In the data fetcher I need to be able to access some additional data / objects that are initialized in the @DgsQuery method. How can I do this?
Example use case:
In the @DgsQuery method an entity has been loaded from the database, and I need to pass this entity to the data fetcher.
Currently I only have access to the Sensor object via
Sensor sensor = dfe.getSource();
but not to any further data that may already have been loaded.

Found it out: wrap the result in a DataFetcherResult and attach the extra data as local context:
@DgsQuery
public DataFetcherResult<List<Sensor>> sensors(DgsDataFetchingEnvironment dfe) {
    SensorType sensorType = SensorType.newBuilder()
            .id(UUID.randomUUID().toString())
            .build();
    Sensor sensor = new Sensor();
    sensor.setId(UUID.randomUUID().toString());
    sensor.setName("foo");
    sensor.setSensorType(sensorType);
    return DataFetcherResult.<List<Sensor>>newResult()
            .data(Arrays.asList(sensor))
            .localContext("foo")
            .build();
}
and in the data fetcher:
Object localContext = dfe.getLocalContext(); // --> "foo"
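Since localContext accepts any object, the same mechanism can carry a pre-loaded entity instead of a String. A minimal sketch, assuming a hypothetical SensorEntity plus mapper methods (none of these names are from the original code):

@DgsQuery
public DataFetcherResult<List<Sensor>> sensors(DgsDataFetchingEnvironment dfe) {
    SensorEntity entity = sensorRepository.findAny(); // hypothetical: entity already loaded from the DB
    return DataFetcherResult.<List<Sensor>>newResult()
            .data(Arrays.asList(mapToSensor(entity))) // hypothetical mapper
            .localContext(entity)                     // hand the loaded entity to child fetchers
            .build();
}

@DgsData(parentType = DgsConstants.SENSOR.TYPE_NAME, field = DgsConstants.SENSOR.SensorType)
public SensorType sensorType(DgsDataFetchingEnvironment dfe) {
    SensorEntity entity = dfe.getLocalContext(); // the same object, no second DB round trip
    return mapToSensorType(entity);              // hypothetical mapper
}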

Related

Spring WebFlux check user exists

I want to check that the user does not already exist before creating a new one, and emit an error if it does... I found a similar question, but I can't adapt it =(
Spring WebFlux: Emit exception upon null value in Spring Data MongoDB reactive repositories?
public Mono<CustomerDto> createCustomer(Mono<CustomerDto> dtoMono) {
    // How to create the Mono error???
    Mono<Customer> fallback = Mono.error(new DealBoardException("Customer with email: " + dtoMono ???));
    return dtoMono.map(customerConverter::convertDto) // convert from DTO to Document
            .map(document -> {
                customerRepository.findByEmailOrPhone(document.getEmail(), document.getPhone())
            })
            .switchIfEmpty() // How to check that such a customer doesn't exist?
            .map(document -> { // fill in additional information from other services
                var customerRequest = customerConverter.convertDocumentToStripe(document);
                var customerStripe = customerExternalService.createCustomer(customerRequest);
                document.setCustomerId(customerStripe.getId());
                return document;
            })
            .flatMap(customerRepository::save) // save to MongoDB
            .map(customerConverter::convertDocument); // convert from Document to DTO
}
public Mono<User> create(String username, String password) {
    User user = new User();
    user.setUsername(username);
    user.setPassword(encoder.encode(password));
    return userRepo.existsUserByUsername(username)
            .flatMap(exists -> exists ? Mono.error(UserAlreadyExists::new) : userRepo.save(user));
}
Add the following index declaration at the top of your Customer class:
@CompoundIndex(def = "{'email': 1, 'phone': 1}", unique = true)
That will prevent duplicate entries from being inserted into the database.
You can catch your org.springframework.dao.DuplicateKeyException with the following construct:
customerService.save(newCustomer).onErrorMap(...);
Ref: MongoDB Compound Indexes official documentation
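A minimal sketch of that construct (the mapped exception message is illustrative), translating the duplicate-key failure into the asker's domain exception:

customerService.save(newCustomer)
        .onErrorMap(DuplicateKeyException.class,
                e -> new DealBoardException("Customer with this email or phone already exists"));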

How to implement a list of DB update queries in one call with SpringBoot Webflux + R2dbc application

The goal of my Spring Boot WebFlux R2DBC application: the controller accepts a request including a list of DB UPDATE or INSERT details, and responds with a result summary.
I can write a ReactiveCrudRepository-based repository to implement each DB operation, but I don't know how to write the service so that it groups the execution of the list of DB operations and composes a result summary response.
I am new to Java reactive programming. Thanks for any suggestions and help.
Chen
I got the hint from here: https://www.vinsguru.com/spring-webflux-aggregation/ . The ideas are:
From the request, create 3 Monos:
Mono<List<Long>> monoEndDateSet -- DB row IDs of the update operations;
Mono<List<Long>> monoCreateList -- DB row IDs of the newly inserted rows;
Mono<ChangeSupplyResponse> monoRespFilled -- partly fills some known fields;
then use Mono.zip to aggregate the 3 Monos, and map the resulting Tuple3 to the Mono<ChangeSupplyResponse> to return.
Below are the key parts of the code:
public Mono<ChangeSupplyResponse> ChangeSupplies(ChangeSupplyRequest csr) {
    ChangeSupplyResponse resp = ChangeSupplyResponse.builder().build();
    resp.setEventType(csr.getEventType());
    resp.setSupplyOperationId(csr.getSupplyOperationId());
    resp.setTeamMemberId(csr.getTeamMemberId());
    resp.setRequestTimeStamp(csr.getTimestamp());
    resp.setProcessStart(OffsetDateTime.now());
    resp.setUserId(csr.getUserId());
    Mono<List<Long>> monoEndDateSet = getEndDateIdList(csr);
    Mono<List<Long>> monoCreateList = getNewSupplyEntityList(csr);
    Mono<ChangeSupplyResponse> monoRespFilled = Mono.just(resp);
    return Mono.zip(monoRespFilled, monoEndDateSet, monoCreateList).map(this::combine).as(operator::transactional);
}
private ChangeSupplyResponse combine(Tuple3<ChangeSupplyResponse, List<Long>, List<Long>> tuple) {
    ChangeSupplyResponse resp = tuple.getT1().toBuilder().build();
    List<Long> endDateIds = tuple.getT2();
    resp.setEndDatedDemandStreamSupplyIds(endDateIds);
    List<Long> newIds = tuple.getT3();
    resp.setNewCreatedDemandStreamSupplyIds(newIds);
    resp.setSuccess(true);
    Duration span = Duration.between(resp.getProcessStart(), OffsetDateTime.now());
    resp.setProcessDurationMillis(span.toMillis());
    return resp;
}
private Mono<List<Long>> getNewSupplyEntityList(ChangeSupplyRequest csr) {
    Flux<DemandStreamSupplyEntity> fluxNewCreated = Flux.empty();
    for (SrmOperation so : csr.getOperations()) {
        if (so.getType() == SrmOperationType.createSupply) {
            DemandStreamSupplyEntity e = buildEntity(so, csr);
            fluxNewCreated = fluxNewCreated.mergeWith(this.demandStreamSupplyRepository.save(e));
        }
    }
    return fluxNewCreated.map(e -> e.getDemandStreamSupplyId()).collectList();
}
...
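As an aside (not from the original answer), the mutable Flux plus for-loop in getNewSupplyEntityList can be expressed more idiomatically with Flux.fromIterable and filter, using only names already present in the code above:

private Mono<List<Long>> getNewSupplyEntityList(ChangeSupplyRequest csr) {
    return Flux.fromIterable(csr.getOperations())
            .filter(so -> so.getType() == SrmOperationType.createSupply)   // only the INSERT operations
            .map(so -> buildEntity(so, csr))
            .flatMap(this.demandStreamSupplyRepository::save)              // saves run concurrently, like mergeWith
            .map(DemandStreamSupplyEntity::getDemandStreamSupplyId)
            .collectList();
}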

Unit Test JPA Specification's content

We have implemented filtering for a repository by using JPA's Specification as follows:
public Page<CustomerDTO> searchCustomers(SearchDto searchDto) {
    var queryFilters = this.getQueryFilters(searchDto);
    var paginationAndSorting = this.getPaginationAndSorting(searchDto.getPageNumber(),
            searchDto.getPageSize());
    return customerRepository.findAll(queryFilters, paginationAndSorting)
            .map(entity -> {
                CustomerDTO dto = new CustomerDTO();
                copyProperties(entity, dto);
                return dto;
            });
}
And here is the getQueryFilters() method, which uses Specifications:
private Specification<Customer> getQueryFilters(CustomerSearchSpecDTO filteringValues) {
    Specification<Customer> specification = Specification.where(null);
    if (isNotBlank(filteringValues.getLastname())) {
        specification = specification.and(CustomerRepository.hasLastname(filteringValues.getLastname()));
    }
    if (isNotBlank(filteringValues.getTaxId())) {
        specification = specification.and(CustomerRepository.hasTaxId(filteringValues.getTaxId()));
    }
    // several more fields with the same approach
    return specification;
}
Since these query filters are optional, depending on whether each search field is blank, we would like to verify that the returned Specification contains the proper "filters".
For example, if the searchDto input contains only a non-blank taxId, I want to check that the returned Specification contains exactly that "filter / specification".
Note: grabbing a reference to the result of this.getQueryFilters() (which is the Specification) is not a problem; we have already achieved that.
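One way to make that check concrete (a sketch, not from the original thread): invoke toPredicate(...) on the returned Specification with Mockito mocks and verify which attribute paths were resolved. This assumes hasTaxId / hasLastname internally call root.get("taxId") / root.get("lastname"), and uses jakarta.persistence imports (javax on older stacks):

import static org.mockito.Mockito.*;

import jakarta.persistence.criteria.CriteriaBuilder;
import jakarta.persistence.criteria.CriteriaQuery;
import jakarta.persistence.criteria.Root;
import org.junit.jupiter.api.Test;
import org.springframework.data.jpa.domain.Specification;

class QueryFiltersTest {

    @Test
    @SuppressWarnings("unchecked")
    void onlyTaxIdFilterIsApplied() {
        CustomerSearchSpecDTO dto = new CustomerSearchSpecDTO();
        dto.setTaxId("12345"); // lastname left blank on purpose

        Specification<Customer> spec = service.getQueryFilters(dto); // the reference you already obtained

        Root<Customer> root = mock(Root.class);
        CriteriaQuery<?> query = mock(CriteriaQuery.class);
        CriteriaBuilder cb = mock(CriteriaBuilder.class, RETURNS_DEEP_STUBS);

        spec.toPredicate(root, query, cb);

        verify(root).get("taxId");             // the taxId filter was composed in
        verify(root, never()).get("lastname"); // the lastname filter was skipped
    }
}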

Iterate through Flux items and add them in to Mono object

I am working on an API which takes IDs. For each given ID, I want to download the related data from S3 and put it into a new object; let's call it Data:
class Data {
    private List<S3Object> s3Objects;
    //getter-setter
}
public Mono<ResponseEntity<Data>> getData(@RequestParam List<String> tagIds) {
    Data data = new Data();
    Flux<S3Object> s3ObjectFlux = Flux.fromStream(tagIds.stream())
            .parallel()
            .runOn(Schedulers.boundedElastic())
            .flatMap(id -> fetchResources(id))
            .flatMap(idS3Object -> Mono.just(idS3Object))
            .ordered((u1, u2) -> u2.hashCode() - u1.hashCode());
    // how do I add it to the Data object to get a Mono<Data>?
}
You need to collect it into a list and then map it to create a Data object as follows:
public Mono<ResponseEntity<Data>> getData(@RequestParam List<String> tagIds) {
    Flux<S3Object> s3ObjectFlux = Flux.fromStream(tagIds.stream())
            .parallel()
            .runOn(Schedulers.boundedElastic())
            .flatMap(id -> fetchResources(id))
            .flatMap(idS3Object -> Mono.just(idS3Object))
            .ordered((u1, u2) -> u2.hashCode() - u1.hashCode());
    Mono<Data> data = s3ObjectFlux.collectList()
            .map(s3Objects -> new Data(s3Objects));
    return data.map(ResponseEntity::ok); // wrap to match the declared return type
}
Creating a constructor that accepts the S3 objects list is helpful:
class Data {
    private List<S3Object> s3Objects;

    public Data(List<S3Object> s3Objects) {
        this.s3Objects = s3Objects;
    }
    //getter-setter
}
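As an aside (not from the original answer), the pass-through .flatMap(idS3Object -> Mono.just(idS3Object)) is redundant, and if the hash-based ordering is not actually required, the pipeline can be written without parallel rails, since flatMap already subscribes to the inner publishers concurrently. A sketch, assuming fetchResources(id) returns a Mono<S3Object>:

public Mono<ResponseEntity<Data>> getData(@RequestParam List<String> tagIds) {
    return Flux.fromIterable(tagIds)
            .flatMap(id -> fetchResources(id).subscribeOn(Schedulers.boundedElastic())) // concurrent fetches
            .collectList()
            .map(Data::new)
            .map(ResponseEntity::ok);
}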

leftJoin on two GlobalKTables

I am trying to join a stream to two different GlobalKTables, treating them as lookups; more specifically, devices (user agent) and geocoding (IP address).
The issue is with serialization, but I don't get why: it gets stuck on DEFAULT_VALUE_SERDE_CLASS_CONFIG, yet the topic to which I want to write is serialized correctly.
//
// Set up serialization / de-serialization
private static Serde<String> stringSerde = Serdes.String();
private static Serde<PodcastData> podcastSerde = StreamsSerdes.PodCastSerde();
private static Serde<GeoCodedData> geocodedSerde = StreamsSerdes.GeoIPSerde();
private static Serde<DeviceData> deviceSerde = StreamsSerdes.DeviceSerde();
private static Serde<JoinedPodcastGeoDeviceData> podcastGeoDeviceSerde = StreamsSerdes.PodcastGeoDeviceSerde();
private static Serde<JoinedPodCastDeviceData> podcastDeviceSerde = StreamsSerdes.PodcastDeviceDataSerde();
...
GlobalKTable<String, DeviceData> deviceIDTable =
        builder.globalTable(kafkaProperties.getProperty("deviceid-topic"));
GlobalKTable<String, GeoCodedData> geoIPTable =
        builder.globalTable(kafkaProperties.getProperty("geoip-topic"));
//
// Stream from source topic
KStream<String, PodcastData> podcastStream = builder.stream(
        kafkaProperties.getProperty("source-topic"),
        Consumed.with(stringSerde, podcastSerde));
//
podcastStream
        // left join the podcast stream to the device table, looking up the device
        .leftJoin(deviceIDTable,
                // get a DeviceData object from the user agent
                (podcastID, podcastData) -> podcastData.getUser_agent(),
                // join podcast and device and return a JoinedPodCastDeviceData object
                (podcastData, deviceData) -> {
                    JoinedPodCastDeviceData data =
                            JoinedPodCastDeviceData.builder().build();
                    data.setPodcastObject(podcastData);
                    data.setDeviceData(deviceData);
                    return data;
                })
        // left join the podcast stream to the geo table, looking up the geo data
        .leftJoin(geoIPTable,
                // get a Geo object from the IP address
                (podcastID, podcastDeviceData) -> podcastDeviceData.getPodcastObject().getIp_address(),
                // join podcast and geo
                (podcastDeviceData, geoCodedData) -> {
                    JoinedPodcastGeoDeviceData data =
                            JoinedPodcastGeoDeviceData.builder().build();
                    data.setGeoData(geoCodedData);
                    data.setDeviceData(podcastDeviceData.getDeviceData());
                    data.setPodcastData(podcastDeviceData.getPodcastObject());
                    return data;
                })
        //
        .to(kafkaProperties.getProperty("sink-topic"),
                Produced.with(stringSerde, podcastGeoDeviceSerde));
...
...
streamsConfiguration.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, stringSerde.getClass().getName());
streamsConfiguration.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, stringSerde.getClass().getName());
The error:
ERROR java.lang.String cannot be cast to DeviceData
Due to the line
streamsConfiguration.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, stringSerde.getClass().getName());
the application will use the String serde as the default value serde unless you specify one explicitly when building a KTable/KStream/GlobalKTable. Since the expected value type for deviceIDTable is DeviceData, you need to define the value serde on the GlobalKTable, as given below:
GlobalKTable<String, DeviceData> deviceIDTable = builder.globalTable(
        kafkaProperties.getProperty("deviceid-topic"),
        Materialized.<String, DeviceData, KeyValueStore<Bytes, byte[]>>as(DEVICE_STORE)
                .withKeySerde(stringSerde)
                .withValueSerde(deviceSerde));
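The same reasoning applies to the second lookup table, since GeoCodedData is not a String either. A sketch along the same lines, reusing geocodedSerde from the declarations above (GEO_STORE is a hypothetical store name):

GlobalKTable<String, GeoCodedData> geoIPTable = builder.globalTable(
        kafkaProperties.getProperty("geoip-topic"),
        Materialized.<String, GeoCodedData, KeyValueStore<Bytes, byte[]>>as(GEO_STORE)
                .withKeySerde(stringSerde)
                .withValueSerde(geocodedSerde));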
