In our app we synchronize some of our data to Elasticsearch, and part of that data is user records. The app is Grails 5.1 and we use the Elasticsearch Java API Client for the integration.
Indexing works perfectly fine, and an example of the user data looks like this:
Now, we have the following function that is supposed to fetch a list of users by their ids:
PublicUser[] getAllByIds(Long[] ids) {
    MgetRequest request = new MgetRequest.Builder()
        .ids(ids.collect { it.toString() }.toList())
        .index("users")
        .build()

    MgetResponse<PublicUser> response = elasticSearchClientProviderService.getClient().mget(
        request,
        PublicUser.class
    )

    response.docs().collect {
        it.result().source()
    }
}
When the response holds at least one user record, we get a list of PublicUser objects, as expected.
However, if the search result is empty, the function ends up returning a list with a single null element.
Some investigation:
response.docs() holds a single non-existent document (it looks like it is populated from the request data).
As a result, the return from this function is, as mentioned above, a list with one null element.
Another observation:
I expected the response object to have .hits(), with the actual results accessible through response.hits().hits(), but nothing of that sort exists.
The only reason I started looking into docs() directly is this documentation: https://www.elastic.co/guide/en/elasticsearch/reference/master/docs-multi-get.html
Documentation for the Elasticsearch Java API Client is sparse; it mostly refers back to the REST API docs.
What is the correct way to get the list of results from an mget request?
For now, I am solving it in the following way. I would be glad to hear of a better way, though.
PublicUser[] getAllByIds(Long[] ids) {
    MgetRequest request = new MgetRequest.Builder()
        .ids(ids.collect { it.toString() }.toList())
        .index("users")
        .build()

    MgetResponse<PublicUser> response = elasticSearchClientProviderService.getClient().mget(
        request,
        PublicUser.class
    )

    List<PublicUser> users = []
    response.docs().each {
        if (it.result().found()) {
            users.add(it.result().source())
        }
    }
    users
}
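Each element of response.docs() is a union that holds either a per-id failure or a get result, so a slightly more defensive variant can check both before reading the source. Below is a minimal Java sketch, assuming the same "users" index, the same PublicUser class, and a plain co.elastic.clients ElasticsearchClient:
import co.elastic.clients.elasticsearch.ElasticsearchClient;
import co.elastic.clients.elasticsearch.core.MgetRequest;
import co.elastic.clients.elasticsearch.core.MgetResponse;
import java.io.IOException;
import java.util.List;
import java.util.stream.Collectors;

List<PublicUser> getAllByIds(ElasticsearchClient client, List<String> ids) throws IOException {
    MgetRequest request = MgetRequest.of(b -> b.index("users").ids(ids));
    MgetResponse<PublicUser> response = client.mget(request, PublicUser.class);

    return response.docs().stream()
            .filter(item -> item.isResult())      // skip per-id failures
            .map(item -> item.result())
            .filter(result -> result.found())     // skip ids with no matching document
            .map(result -> result.source())
            .collect(Collectors.toList());
}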
So I have an Apache Camel route that reads Data elements from a JPA endpoint, converts them to DataConverted elements, and stores them in a different database via a second JPA endpoint. Both endpoints are Oracle databases.
Now I want to set a flag on the original Data element that it got copied successfully. What is the best way to achieve that?
I tried it like this: saving the ID in the context, then reading it and calling a DAO method in .onCompletion().onCompleteOnly().
from("jpa://Data")
.onCompletion().onCompleteOnly().process(ex -> {
var id = Long.valueOf(getContext().getGlobalOption("id"));
myDao().setFlag(id);
}).end()
.process(ex -> {
Data data = ex.getIn().getBody(Data.class);
DataConverted dataConverted = convertData(data);
ex.getMessage().setBody(data);
var globalOptions = getContext().getGlobalOptions();
globalOptions.put("id", data.getId().toString());
getContext().setGlobalOptions(globalOptions);
})
.to("jpa://DataConverted").end();
However, this seems to trigger a deadlock: the DAO method stalls on the commit of the update. The only explanation I can see is that the Data row is locked by Camel and is still locked in the .onCompletion().onCompleteOnly() part of the route, so it cannot be updated there.
Is there a better way to do it?
Have you tried using the Recipient List EIP, where the first destination is the jpa:DataConverted endpoint and the second destination is the endpoint that sets the flag? This way both get the same message and are executed sequentially.
https://camel.apache.org/components/3.17.x/eips/recipientList-eip.html
from("jpa://Data")
.process(ex -> {
Data data = ex.getIn().getBody(Data.class);
DataConverted dataConverted = convertData(data);
ex.getIn().setBody(data);
})
.recipientList(constant("direct:DataConverted","direct:updateFlag"))
.end();
from("direct:DataConverted")
.to("jpa://DataConverted")
.end();
from("direct:updateFlag")
.process(ex -> {
var id = ((MessageConverted) ex.getIn().getBody()).getId();
myDao().setFlag(id);
})
.end();
Keep in mind that you might want to make the route transactional by adding .transacted().
https://camel.apache.org/components/3.17.x/eips/transactional-client.html
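A minimal sketch of what that would look like, assuming a transaction manager (and, if you want a specific propagation, a transacted policy bean) is already configured in the registry:
from("jpa://Data")
    .transacted()   // or .transacted("PROPAGATION_REQUIRED") to reference a named policy bean
    .process(ex -> {
        Data data = ex.getIn().getBody(Data.class);
        ex.getIn().setBody(convertData(data));
    })
    .recipientList(constant("direct:DataConverted,direct:updateFlag"))
    .end();
With the route transacted, the conversion, the DataConverted insert, and the flag update should all participate in the same transaction, so the flag update is no longer a separate transaction competing for the row lock.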
I have a question regarding a small issue I'm having. I've created a widget that will live on the Service Portal to allow an admin to accept or reject requests.
The data for the widget is pulled from the Approvals (sysapproval_approver) table. In my GlideRecord query, I check for records whose state is requested (e.g. addQuery('state', 'requested')).
To narrow down the search, I tried adding addQuery('sys_id', current.sys_id). When I use this query, my script breaks and I get an error on the Service Portal end.
Here's a sample of the GlideRecord script I've written to Accept.
// Accept Request
if (input && input.action == "acceptApproval") {
    var inRec1 = new GlideRecord('sysapproval_approver');
    inRec1.addQuery('state', 'requested');
    //inRec1.get('sys_id', current.sys_id);
    inRec1.query();
    if (inRec1.next()) {
        inRec1.setValue('state', 'Approved');
        inRec1.setValue('approver', gs.getUserID());
        gs.addInfoMessage("Accept Approval Processed");
        inRec1.update();
    }
}
I've researched the web and tried using $sp.getParameter() as a workaround, with no change.
I would really appreciate any help or insight on what I can do differently to get the script to work and filter the right records.
If I understand your question correctly, you are asking how to get the sysId of the sysapproval_approver record from the client-side in a widget.
Unless you have defined current elsewhere in your server script, current is undefined. Secondly, $sp.getParameter() is used to retrieve URL parameters. So unless you've included the sysId as a URL parameter, that will not get you what you are looking for.
One pattern that I've used is to pass an object to the client after the initial query that gets the list of requests.
When you're ready to send input to the server from the client, you can add relevant information to the input object. See the simplified example below. For the sake of brevity, the code below does not include error handling.
// Client-side function
approveRequest = function(sysId) {
    $scope.server.get({
        action: "acceptApproval",
        sysId: sysId
    })
    .then(function(response) {
        console.log("Request approved");
    });
};

// Server-side
var requestGr = new GlideRecord('sysapproval_approver');
requestGr.addQuery("SOME_QUERY");
requestGr.query(); // Retrieve initial list of requests to display in the template

data.requests = []; // Array of requests added to the data object and passed to the client via the controller
while (requestGr.next()) {
    data.requests.push({
        "number": requestGr.getValue("number"),
        "state" : requestGr.getValue("state"),
        "sysId" : requestGr.getValue("sys_id")
    });
}

if (input && input.action == "acceptApproval") {
    var sysapprovalGr = new GlideRecord('sysapproval_approver');
    if (sysapprovalGr.get(input.sysId)) {
        sysapprovalGr.setValue('state', 'Approved');
        sysapprovalGr.setValue('approver', gs.getUserID());
        sysapprovalGr.update();
        gs.addInfoMessage("Accept Approval Processed");
    }
...
I am trying to get a Reservation object which contains a pointer to Restaurant.
In Parse Cloud Code, I am able to see the Restaurant objects associated with Reservations via query.include('Restaurant') in the log just before response.success. However, the Restaurants revert back to pointers when I receive the response in the client app.
I tried reverting the JS SDK version to 1.4.2 and 1.6.7 as suggested in some answers, but it didn't work for me.
Parse.Cloud.define('getreservationsforuser', function(request, response) {
    var user = request.user
    console.log(user)
    var query = new Parse.Query('Reservations')
    query.equalTo('User', user)
    query.include('Restaurant')
    query.find({
        success : function(results) {
            console.log(JSON.stringify(results))
            response.success(results)
        },
        error : function (error) {
            response.error(error)
        }
    })
})
Response:
..."restaurant":{"__type":"Pointer",
"className":"Restaurants",
"objectId":"kIIYe7Z0tD"},...
You can't directly send pointer objects back from Cloud Code even though you have included them. You need to manually copy the content of the pointer object into a plain JavaScript object, like below:
var restaurant = {}
restaurant["id"] = YOUR_POINTER_OBJECT.id;
restaurant["createdAt"] = YOUR_POINTER_OBJECT.createdAt;
restaurant["custom_field"] = YOUR_POINTER_OBJECT.get("custom_field");
PS: in your code you seem to do nothing other than directly send the response back; the Parse REST API might be a better choice in that case.
It turned out that my code implementation was correct.
EDIT:
This is basically what I want to do, only in Java
Using Elasticsearch, we add documents to an index by passing IndexRequest items to a BulkRequestBuilder.
I would like the documents to be dropped from the index after some time has passed (time to live / TTL).
This can be done either by setting a default for the index or on a per-document basis. Either approach is fine with me.
The code below is an attempt to do it per document. It does not work, and I think that is because TTL is not enabled for the index. Either show me what Java code I need to add to enable TTL so the code below works, or show me different code that enables TTL and sets a default TTL value for the index in Java. I know how to do it from the REST API, but I need to do it from Java code, if at all possible.
logger.debug("Indexing record ({}): {}", id, map);
final IndexRequest indexRequest = new IndexRequest(_indexName, _documentType, id);
final long debug = indexRequest.ttl();
if (_ttl > 0) {
    indexRequest.ttl(_ttl);
    System.out.println("Setting TTL to " + _ttl);
    System.out.println("IndexRequest now has ttl of " + indexRequest.ttl());
}
indexRequest.source(map);
indexRequest.operationThreaded(false);
bulkRequestBuilder.add(indexRequest);
}

// execute and block until done.
BulkResponse response;
try {
    response = bulkRequestBuilder.execute().actionGet();
Later I check in my unit test by polling this method, but the document count never goes down.
public long getDocumentCount() throws Exception {
    Client client = getClient();
    try {
        client.admin().indices().refresh(new RefreshRequest(INDEX_NAME)).actionGet();
        ActionFuture<CountResponse> response = client.count(new CountRequest(INDEX_NAME).types(DOCUMENT_TYPE));
        CountResponse countResponse = response.get();
        return countResponse.getCount();
    } finally {
        client.close();
    }
}
After a LONG day of googling and writing test programs, I came up with a working example of how to use TTL and basic index/object creation from the Java API. Frankly, most of the examples in the docs are trivial, and some Javadoc and end-to-end examples would go a LONG way toward helping those of us who use the non-REST interfaces.
Ah well.
Code here: Adding mapping to a type from Java - how do I do it?
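For context, here is a rough sketch of what enabling _ttl from Java looked like on the old (pre-2.0) client this question targets, reusing the INDEX_NAME and DOCUMENT_TYPE constants from the question; note that _ttl was deprecated in Elasticsearch 2.x and removed in 5.x, so this applies only to those older versions:
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentFactory;

// Enable _ttl with a default value for an existing type (pre-2.0 Elasticsearch only).
XContentBuilder mapping = XContentFactory.jsonBuilder()
    .startObject()
        .startObject(DOCUMENT_TYPE)
            .startObject("_ttl")
                .field("enabled", true)
                .field("default", "60s")   // used when a document does not set its own ttl
            .endObject()
        .endObject()
    .endObject();

client.admin().indices()
    .preparePutMapping(INDEX_NAME)
    .setType(DOCUMENT_TYPE)
    .setSource(mapping)
    .execute()
    .actionGet();
Also worth knowing: expired documents are removed by a periodic purge task (indices.ttl.interval, 60 seconds by default), so a count polled immediately after the TTL elapses can still include them.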
I'm trying to retrieve a document by id, in order to get the folders/collections this document is associated with, but I am getting this error:
com.google.gdata.util.ParseException: [Line 1, Column 266] Invalid root element, expected (namespace uri:local name)
This is the code:
DocsService client = new DocsService("test testnet v1");
URL feedUri = new URL("https://docs.google.com/feeds/default/private/full/" + DOCID + "?oauth_token=" + token);
DocumentListFeed feed = client.getFeed(feedUri, DocumentListFeed.class);
for (DocumentListEntry entry : feed.getEntries()) {
    if (!entry.getParentLinks().isEmpty()) {
        for (Link link : entry.getParentLinks()) {
            System.out.print(link.getTitle() + " " + link.getHref());
        }
    }
}
I do not know if this is the best way, or what the right way is to get a document by its id.
Can you send your request using the OAuth Playground to look at the raw XML data, and share it with us?
There might be an invalid XML element that prevents the parser from parsing it.
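One more thing that might be worth checking (this is an assumption, not something confirmed by the error alone): a feed URL that ends in a single DOCID returns one <entry> as its root element rather than a <feed>, and parsing that as a DocumentListFeed would fail with exactly this kind of "Invalid root element" error. A sketch that fetches the single entry instead, reusing the DOCID and token variables from the question:
DocsService client = new DocsService("test testnet v1");
URL entryUrl = new URL("https://docs.google.com/feeds/default/private/full/" + DOCID + "?oauth_token=" + token);

// getEntry parses a single <entry> root, unlike getFeed which expects a <feed> root
DocumentListEntry entry = client.getEntry(entryUrl, DocumentListEntry.class);
for (Link link : entry.getParentLinks()) {
    System.out.println(link.getTitle() + " " + link.getHref());
}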