I'm using spring-data-elasticsearch to do CRUD operations.
I have a custom repository that extends ElasticsearchRepository.
Ultimately ElasticsearchRepository extends CrudRepository, which implies that updating an existing record is possible.
The question is: how do you accomplish this? I haven't found a method called update().
I thought doing the following would work (code stolen from https://github.com/BioMedCentralLtd/spring-data-elasticsearch-sample-application):
//create
Book book = new Book();
book.setId("123455");
book.setName("Spring Data Elasticsearch");
book.setVersion(System.currentTimeMillis());
repository.save(book);
//update
book.setName("THIS IS A COMPLETELY NEW TITLE");
repository.save(book);
However, the second save throws an InvocationTargetException.
Examining it with the debugger shows:
[book][0] [book][123455]: version conflict, current [1447792071681], provided [1447792071681]
The Book object looks like:
#Document(indexName = "book",type = "book" , shards = 1, replicas = 0, indexStoreType = "memory", refreshInterval = "-1")
public class Book {
#Id
private String id;
private String name;
private Long price;
#Version
private Long version;
public Map<Integer, Collection<String>> getBuckets() {
return buckets;
}
public void setBuckets(Map<Integer, Collection<String>> buckets) {
this.buckets = buckets;
}
#Field(type = FieldType.Nested)
private Map<Integer, Collection<String>> buckets = new HashMap();
public Book(){}
public Book(String id, String name,Long version) {
this.id = id;
this.name = name;
this.version = version;
}
getters and setters removed for space
}
My Repository code is even simpler:
import org.springframework.data.elasticsearch.entities.Book;
import org.springframework.data.elasticsearch.repository.ElasticsearchRepository;
public interface BookRepository extends ElasticsearchRepository<Book, String> {
}
Do I have to provide an update method?
EDIT:
Never mind. I changed the update to:
//update
book.setName("THIS IS A COMPLETELY NEW TITLE");
book.setVersion(System.currentTimeMillis());
repository.save(book);
and it updated the record.
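The same rule applies to any read-modify-write flow: because the @Version field maps to Elasticsearch's external versioning, every save of an existing document must carry a strictly greater version than the stored one. A minimal sketch, assuming the Book entity and repository above (findOne is the CrudRepository method of this Spring Data generation; newer versions use findById):

//update an existing record
Book book = repository.findOne("123455");       // load the current document
book.setName("THIS IS A COMPLETELY NEW TITLE"); // apply the change
book.setVersion(System.currentTimeMillis());    // bump the external version
repository.save(book);                          // re-indexes the whole document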
You can use UpdateQuery and ElasticsearchTemplate to update a partial document, e.g.:
final UpdateRequest updateRequest = new UpdateRequest();
updateRequest.index(mainIndexName);
updateRequest.type(typeName);
updateRequest.id(id);
updateRequest.doc(XContentFactory.jsonBuilder().startObject()
        .field("accountType", accountType)
        .endObject());
final UpdateQuery updateQuery = new UpdateQueryBuilder().withId(id)
        .withClass(<DocumentClass>).withUpdateRequest(updateRequest).build();
UpdateResponse updateResponse = elasticSearchTemplate.update(updateQuery);
I updated an indexed document with the following code snippet:
IndexRequest indexRequest = new IndexRequest(INDEX_NAME, INDEX_NAME, docid);
indexRequest.source(fldName, fldValue);
UpdateRequest updateRequest = new UpdateRequest();
updateRequest.index(INDEX_NAME);
updateRequest.type(INDEX_NAME);
updateRequest.id(docid);
updateRequest.doc(indexRequest);
try {
    UpdateResponse res = client.update(updateRequest).get();
    logger.info("update es {}:{}", docid, res.getGetResult());
} catch (Exception e) {
    logger.error("update", e);
    throw e;
}
The second update fails because you're trying to update an entity whose version hasn't changed. The error message you're getting is ES telling you, "hey, you can't save the same version twice!" Try this:
//create
Book book = new Book();
book.setId("123455");
book.setName("Spring Data Elasticsearch");
book.setVersion(System.currentTimeMillis());
repository.save(book);
//update
book.setName("THIS IS A COMPLETELY NEW TITLE");
book.setVersion(System.currentTimeMillis()); // you're saving a new version
repository.save(book);
I think Elasticsearch works like a JSON store:

if (exists) {
    update it; // push the JSON to overwrite it
} else {
    add it;    // new save()
}

It will update the JSON document when the id already exists; otherwise it will add it.
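For example, with the Book entity from the question above (a minimal sketch; the id and version values are arbitrary):

Book book = new Book("42", "a title", 1L);
repository.save(book);           // id "42" does not exist yet -> document is created
book.setName("another title");
book.setVersion(2L);             // the external version must grow before re-saving
repository.save(book);           // id "42" exists -> document is overwritten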
XContentType contentType = org.elasticsearch.client.Requests.INDEX_CONTENT_TYPE;

public XContentBuilder getBuilder(User assign) {
    try {
        XContentBuilder builder = XContentFactory.contentBuilder(contentType);
        builder.startObject();
        // objectMap is assumed to be a Jackson ObjectMapper instance
        Map<String, ?> assignMap = objectMap.convertValue(assign, Map.class);
        builder.field("assignee", assignMap);
        builder.endObject(); // close the object started above
        return builder;
    } catch (IOException e) {
        log.error("custom field index", e);
        return null; // or rethrow, depending on your error handling
    }
}
IndexRequest indexRequest = new IndexRequest();
indexRequest.source(getBuilder(assign));
UpdateQuery updateQuery = new UpdateQueryBuilder()
        .withType(<IndexType>)
        .withIndexName(<IndexName>)
        .withId(String.valueOf(id))
        .withClass(<IndexClass>)
        .withIndexRequest(indexRequest)
        .build();
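The built UpdateQuery is then executed through the template, as in the earlier snippet: elasticsearchTemplate.update(updateQuery);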
In my project I am using the Olingo 2.0.12 jar in the Java code.
During the create-entity service call, is there a way to check which entity the data insert was requested for, and to alter column values / append new column values before the data is persisted?
A code snippet is given below:
public class A extends ODataJPADefaultProcessor {
    @Override
    public ODataResponse createEntity(final PostUriInfo uriParserResultView, final InputStream content,
            final String requestContentType, final String contentType) throws ODataJPAModelException,
            ODataJPARuntimeException, ODataNotFoundException, EdmException, EntityProviderException {
        // Need to check the entity name and alter/add column values here,
        // then delegate to the default processing
        return super.createEntity(uriParserResultView, content, requestContentType, contentType);
    }
}
Yes, one possible way would be to create your own CustomODataJPAProcessor which extends ODataJPADefaultProcessor.
You will have to register it in the JPAServiceFactory by overriding the method:
@Override
public ODataSingleProcessor createCustomODataProcessor(ODataJPAContext oDataJPAContext) {
    return new CustomODataJPAProcessor(this.oDataJPAContext);
}
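For context, this override lives in your own service factory class. A minimal sketch, assuming the factory extends Olingo's ODataJPAServiceFactory; MyServiceFactory is an illustrative name:

public class MyServiceFactory extends ODataJPAServiceFactory {

    @Override
    public ODataSingleProcessor createCustomODataProcessor(ODataJPAContext oDataJPAContext) {
        // hand Olingo the custom processor instead of the default one
        return new CustomODataJPAProcessor(oDataJPAContext);
    }

    // initializeODataJPAContext() and the rest of the factory stay unchanged
}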
Now Olingo will use CustomODataJPAProcessor, which can implement the following code to check the entities and transform them if needed.
Sample code for CustomODataJPAProcessor:
public class CustomODataJPAProcessor extends ODataJPADefaultProcessor {

    Logger LOG = LoggerFactory.getLogger(this.getClass());

    public CustomODataJPAProcessor(ODataJPAContext oDataJPAContext) {
        super(oDataJPAContext);
    }

    @Override
    public ODataResponse createEntity(final PostUriInfo uriParserResultView, final InputStream content,
            final String requestContentType, final String contentType) throws ODataException {
        ODataResponse oDataResponse = null;
        oDataJPAContext.setODataContext(getContext());
        InputStream forwardedInputStream = content;
        try {
            if (uriParserResultView.getTargetEntitySet().getName().equals("Students")) {
                LOG.info("Students Entity Set Executed");
                if (requestContentType.equalsIgnoreCase(ContentType.APPLICATION_JSON.toContentTypeString())) {
                    @SuppressWarnings("deprecation")
                    JsonElement elem = new JsonParser().parse(new InputStreamReader(content));
                    Gson gson = new GsonBuilder().setFieldNamingPolicy(FieldNamingPolicy.UPPER_CAMEL_CASE).create();
                    Student s = gson.fromJson(elem, Student.class);
                    // Change some values
                    s.setStudentID("Test" + s.getStudentID());
                    forwardedInputStream = new ByteArrayInputStream(gson.toJson(s).getBytes());
                }
            }
            Object createdJpaEntity = jpaProcessor.process(uriParserResultView, forwardedInputStream,
                    requestContentType);
            oDataResponse = responseBuilder.build(uriParserResultView, createdJpaEntity, contentType);
        } catch (JsonIOException | JsonSyntaxException e) {
            throw new RuntimeException(e);
        } finally {
            close();
        }
        return oDataResponse;
    }
}
In summary:
Register your custom org.apache.olingo.odata2.service.factory (Code Link)
Create your own CustomODataJPAProcessor (Code Link)
Override createCustomODataProcessor in JPAServiceFactory to use the custom processor (Code Link)
I am working with DynamoDB in Spring Boot 2.1, and I'm facing an issue when I need to use the IN clause during conditional evaluation. Even for items that fulfill the requirements, the query result is empty.
How can I return the matching items from the table using the IN clause?
public class DynamoRepository {
    private final DynamoDBMapper dynamoDBMapper;

    public Optional<List<USER>> query(String id) {
        Map<String, String> ean = new HashMap<>();
        ean.put("#status", "status");
        Map<String, AttributeValue> eav = new HashMap<>();
        eav.put(":id", new AttributeValue().withS(id));
        DynamoDBQueryExpression<USER> queryExpression = new DynamoDBQueryExpression<USER>()
                .withKeyConditionExpression("id = :id")
                .withFilterExpression("#status in (ACTIVE, PENDING)") // this filter is the problem
                .withExpressionAttributeNames(ean)
                .withExpressionAttributeValues(eav);
        List<USER> query = dynamoDBMapper.query(USER.class, queryExpression);
        return query.isEmpty() ? Optional.empty() : Optional.of(query);
    }
}
After a while, my solution was to define the status values as expression attribute values, as in the code below:
public class DynamoRepository {
    private final DynamoDBMapper dynamoDBMapper;

    public Optional<List<USER>> query(String id) {
        Map<String, String> ean = new HashMap<>();
        ean.put("#status", "status");
        Map<String, AttributeValue> eav = new HashMap<>();
        eav.put(":id", new AttributeValue().withS(id));
        eav.put(":active", new AttributeValue().withS("ACTIVE"));
        eav.put(":pending", new AttributeValue().withS("PENDING"));
        DynamoDBQueryExpression<USER> queryExpression = new DynamoDBQueryExpression<USER>()
                .withKeyConditionExpression("id = :id")
                .withFilterExpression("#status in (:active, :pending)")
                .withExpressionAttributeNames(ean)
                .withExpressionAttributeValues(eav);
        List<USER> query = dynamoDBMapper.query(USER.class, queryExpression);
        return query.isEmpty() ? Optional.empty() : Optional.of(query);
    }
}
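This is by design: DynamoDB filter and condition expressions cannot contain literal values, so ACTIVE and PENDING in the original expression were parsed as attribute names rather than strings. Every comparison value has to be supplied through the expression attribute values map.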
I have a Spring application that uses ModelMapper to convert between entity and DTO objects. I have a String in the DTO that represents a ZonedDateTime in the entity. I have written the following snippet in the Spring application configuration:
@Bean
public ModelMapper contactModelMapper() {
    Converter<String, ZonedDateTime> toZonedDateTimeString = new AbstractConverter<String, ZonedDateTime>() {
        @Override
        public ZonedDateTime convert(String source) {
            DateTimeFormatter formatter = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");
            LocalDateTime datel = LocalDateTime.parse(source, formatter);
            ZonedDateTime result = datel.atZone(ZoneId.systemDefault());
            return result;
        }
    };
    Converter<ZonedDateTime, String> toStringZonedDateTime = new AbstractConverter<ZonedDateTime, String>() {
        @Override
        public String convert(ZonedDateTime source) {
            String result = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss").format(source);
            return result;
        }
    };
    PropertyMap<Contact, ContactDTO> contactDTOmap = new PropertyMap<Contact, ContactDTO>() {
        @Override
        protected void configure() {
            map().setTenantId(source.getTenant().getTenantId());
            //if (source.getCreatedDateTime() != null) map().setCreatedDateTime(source.getCreatedDateTime());
            //when(Conditions.isNotNull()).map(source.getCreatedDateTime(), map().getCreatedDateTime());
        }
    };
    /* this is for userDTO to BO.. */
    PropertyMap<ContactDTO, Contact> contactMap = new PropertyMap<ContactDTO, Contact>() {
        @Override
        protected void configure() {
            map().getTenant().setTenantId(source.getTenantId());
        }
    };
    ModelMapper contactModelMapper = new ModelMapper();
    contactModelMapper.addMappings(contactDTOmap);
    contactModelMapper.addMappings(contactMap);
    contactModelMapper.addConverter(toStringZonedDateTime);
    contactModelMapper.addConverter(toZonedDateTimeString);
    return contactModelMapper;
}
As you can see, there are two converters. The one that converts the DTO String to the ZonedDateTime in the entity does not get executed at all; the one for the reverse conversion executes properly.
I would appreciate any help or suggestions.
Thanks
I resolved the error after a lot of reading online and experimenting. It seems the order of the addConverter calls matters: I had added the converter for DTO-to-entity conversion after the converter for entity-to-DTO conversion. As soon as the order was put right, the code started working. Posting this so that it helps someone, as the documentation for ModelMapper is very choppy.
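For reference, a minimal sketch of the corrected registration order, reusing the converters and property maps defined in the bean above (the String-to-ZonedDateTime converter, i.e. the DTO-to-entity direction, is now added first):

ModelMapper contactModelMapper = new ModelMapper();
contactModelMapper.addMappings(contactDTOmap);
contactModelMapper.addMappings(contactMap);
contactModelMapper.addConverter(toZonedDateTimeString); // String -> ZonedDateTime (DTO to entity)
contactModelMapper.addConverter(toStringZonedDateTime); // ZonedDateTime -> String (entity to DTO)
return contactModelMapper;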
I have a class:
@Document
public class MyDocument {
    @Id
    private String id;
    private String title;
    private String description;
    private String tagLine;
    @CreatedDate
    private Date createdDate;
    @LastModifiedDate
    private Date updatedDate;

    public String getId() {
        return id;
    }
    public void setId(String id) {
        this.id = id;
    }
    public String getTitle() {
        return title;
    }
    public void setTitle(String title) {
        this.title = title;
    }
    public String getDescription() {
        return description;
    }
    public void setDescription(String description) {
        this.description = description;
    }
    public String getTagLine() {
        return tagLine;
    }
    public void setTagLine(String tagLine) {
        this.tagLine = tagLine;
    }
}
I have annotated the application with @EnableMongoAuditing, and I have created an interface that extends MongoRepository:

public interface MyDocumentRepository extends MongoRepository<MyDocument, String> {
}

I have created a RestController with GET, POST, and PATCH methods. In POST I'm sending:

{'title':'first'}

The controller's POST method is:
#RequestMapping(value = "/", method = RequestMethod.POST)
public ResponseEntity<?> saveMyDocument(#RequestBody MyDocument myDocument) {
MyDocument doc = myDocumentRepo.save(myDocument);
return new ResponseEntity<MyDocument>(doc, HttpStatus.CREATED);
}
It saves the data in Mongo:

{
    "_id" : ObjectId("56b3451f0364b03f3098f101"),
    "_class" : "com.wiziq.service.course.model.MyDocument",
    "title" : "test"
}
and the PATCH request handler is:

@RequestMapping(value = "/{id}", method = RequestMethod.PATCH)
public ResponseEntity<MyDocument> updateCourse(@PathVariable(value = "id") String id,
        @RequestBody MyDocument myDocument) {
    myDocument.setId(id);
    MyDocument doc = courseService.save(myDocument);
    return ResponseEntity.ok(doc);
}
When I make a PATCH request with the data {"description":"This is test"}, it updates the document BUT it removes the title and createdDate fields from the document. It is doing a full update, which is expected, but I wanted an upsert that only touches the supplied fields. I can do that using mongoTemplate, but there I have to set each property that I want to change.
Is there any generic way so that, when I get a PATCH request, only the non-null properties coming in the request are updated? spring-data-rest seems to do it using @RepositoryRestResource. How can I achieve the same?
I don't want to code like this:

Update update = new Update().set("title", myDocument.getTitle()).set("description", myDocument.getDescription());
Unfortunately that is the behavior of save() in MongoDB; you can verify the same using the shell.
So to update, create an Update object and a Query:

Query query = new Query(Criteria.where("id").is(ID));

Here ID is the id of the document which you want to update. Based on your requirement, set upsert, and then update the document using findAndModify:

mongoTemplate.findAndModify(query, update,
        new FindAndModifyOptions().returnNew(true).upsert(false),
        someclass.class);
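Putting it together for the MyDocument entity from the question, copying only the non-null fields into the Update (a minimal sketch, assuming an injected MongoTemplate; extend the null checks to any further fields):

Query query = new Query(Criteria.where("id").is(id));
Update update = new Update();
if (myDocument.getTitle() != null) {
    update.set("title", myDocument.getTitle());
}
if (myDocument.getDescription() != null) {
    update.set("description", myDocument.getDescription());
}
if (myDocument.getTagLine() != null) {
    update.set("tagLine", myDocument.getTagLine());
}
MyDocument updated = mongoTemplate.findAndModify(query, update,
        new FindAndModifyOptions().returnNew(true).upsert(false),
        MyDocument.class);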
If you have a model like MyModel.class and you need a smooth way to upsert it, there is no clear-cut way to build an Update from the object, but you can use the MongoConverter bean that Spring Data Mongo auto-configuration creates, and then use the replaceOne method of MongoCollection:
@Autowired
private MongoTemplate template;
@Autowired
private MongoConverter mongoConverter;
...
@Override
public void upsertMyModel(MyModel model) {
    Document documentToUpsert = new Document();
    mongoConverter.write(model, documentToUpsert);
    template.getCollection(collectionName).replaceOne(
            Filters.eq("_id", model.getId()),
            documentToUpsert,
            new ReplaceOptions().upsert(true));
}
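Note that replaceOne overwrites the entire stored document, so this approach suits full-object upserts; for the PATCH-style partial update discussed above you would still need an Update that contains only the changed fields.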
An upsert can be done in Spring Data MongoDB using BulkOperations.
Suppose there are two entities, Entity1 and Entity2. Entity1 has a foreignId which is the primary id of Entity2, and both have a title field. To upsert from Entity2 to Entity1, we can do the following:
Query query = new Query(Criteria.where("foreignId").is(entity2.getId()));
Update update = new Update();
update.set("title",entity2.getTitle());
List<Pair<Query, Update>> updates = new ArrayList<Pair<Query, Update>>();
updates.add(Pair.of(query, update););
BulkOperations bulkOps = this.mongoTemplate.bulkOps(BulkMode.UNORDERED, Entity1.class);
bulkOps.upsert(updates);
bulkOps.execute();
I have a requirement to provide functionality which allows the user to search through many different domain elements and see the results as one combined list. So in the UI the user will have to fill in only one text field and then retrieve results.
To visualize, let's assume I have 3 entities in the domain:
#Document(indexName="car")
public class Car {
private int id;
private String type;
}
#Document(indexName="garage")
public class Garage{
private int id;
private String address;
}
#Document(indexName="shop")
public class Shop{
private int id;
private String name;
}
Now I thought I could achieve the requirement like this:
...
@Inject
private ElasticsearchTemplate elasticsearchTemplate;
...
@RequestMapping(value = "/_search/all/{query}",
        method = RequestMethod.GET,
        produces = MediaType.APPLICATION_JSON_VALUE)
@Timed
public List<?> search(@PathVariable String query) {
    SearchQuery searchQuery = new NativeSearchQueryBuilder()
            .withQuery(queryString(query))
            .withIndices("car", "garage", "shop")
            .build();
    // THIS WORKS
    elasticsearchTemplate.queryForIds(searchQuery);
    // THIS THROWS AN ERROR ABOUT WRONG INDEXES
    return elasticsearchTemplate.queryForPage(searchQuery, GlobalSearchDTO.class, new GlobalSearchResultMapper()).getContent();
}
...
class GlobalSearchDTO {
    public Long id;
    public String type;
    public Object obj;
}
...
but when calling the second method, the one responsible for returning the actual documents, the following exception is thrown:
Unable to identify index name. GlobalSearchDTO is not a Document. Make sure the document class is annotated with @Document(indexName="foo")
I've tried passing one of the domain entities as the class argument, but then I retrieve only elements from the corresponding index, not all of them. For instance, calling:

return elasticsearchTemplate.queryForPage(searchQuery, Shop.class, new GlobalSearchResultMapper()).getContent();

results in retrieving elements only from the 'shop' index. It seems that for some reason the dynamically provided indices are not used.
So the question is: is it possible to retrieve data like that? Why is specifying '.withIndices("car", "garage", "shop")' not enough?
Maybe I should consider other solutions, like:
search through the indexes in a loop (one by one), then join the results and order them by score
create a separate GlobalSearch entity with a 'globalsearch' index and duplicate the data there
Thanks in advance!
Krzysztof
I have managed to find a suitable workaround for my problem. It turned out that when using the 'scan' and 'scroll' functionality the dynamically provided indices are used, which means the query works as expected. Code for the solution:
#RequestMapping(value = "/_search/all/{query}",
method = RequestMethod.GET,
produces = MediaType.APPLICATION_JSON_VALUE)
#Timed
public List<?> search(#PathVariable String query) {
SearchQuery searchQuery = new NativeSearchQueryBuilder()
.withQuery(queryString(query))
.withIndices("car", "garage", "shop")
.withPageable(new PageRequest(0,1))
.build();
String scrollId = elasticsearchTemplate.scan(searchQuery, 1000, false);
List<GlobalSearchDTO> sampleEntities = new ArrayList<GlobalSearchDTO>();
boolean hasRecords = true;
while (hasRecords){
Page<GlobalSearchDTO> page = elasticsearchTemplate.scroll(scrollId, 5000L , new ResultMapper());
if(page != null) {
sampleEntities.addAll(page.getContent());
hasRecords = page.hasNext();
}
else{
hasRecords = false;
}
}
return sampleEntities;
}
}
and in the ResultMapper class:
...
for (SearchHit hit : response.getHits()) {
    switch (hit.getIndex()) {
        case "car":    /* map the hit to a Car DTO */    break;
        case "shop":   /* map the hit to a Shop DTO */   break;
        case "garage": /* map the hit to a Garage DTO */ break;
    }
}
...