DeleteRequest example with Elasticsearch 8.3.0 Java API client - elasticsearch

I need examples of DeleteRequest for the ES 8.3.0 Java API client.
I am looking for a code reference where I delete one particular document by passing the index name and a condition for deleting the document.
I have only found examples for the Java High Level REST Client (deprecated in 7.15.0) and
the Transport Client (deprecated in 7.0.0).

You can use the code below to delete a specific document by id:
DeleteRequest request = DeleteRequest.of(d -> d.index("index_name").id("doc_id"));
DeleteResponse response = esClient.delete(request);
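The response tells you whether anything was actually removed; a minimal check (a sketch, assuming the same esClient as above, with Result imported from co.elastic.clients.elasticsearch._types) looks like:
// result() is Result.Deleted when the document existed, Result.NotFound otherwise
if (response.result() == Result.NotFound) {
    System.out.println("no document with id doc_id in index_name");
}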
If you want to do a delete-by-query, you can use the code below (it deletes every document where country is india):
DeleteByQueryRequest dbyquery = DeleteByQueryRequest.of(fn -> fn
        .index("index_name")
        .query(TermQuery.of(tq -> tq.field("country").value("india"))._toQuery()));
DeleteByQueryResponse dqr = esClient.deleteByQuery(dbyquery);
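Both snippets assume an ElasticsearchClient has already been built; a minimal setup sketch for the 8.x client (host and port here are placeholders) is:
import co.elastic.clients.elasticsearch.ElasticsearchClient;
import co.elastic.clients.json.jackson.JacksonJsonpMapper;
import co.elastic.clients.transport.ElasticsearchTransport;
import co.elastic.clients.transport.rest_client.RestClientTransport;
import org.apache.http.HttpHost;
import org.elasticsearch.client.RestClient;

// low-level REST client -> transport with JSON mapper -> 8.x API client
RestClient restClient = RestClient.builder(new HttpHost("localhost", 9200)).build();
ElasticsearchTransport transport = new RestClientTransport(restClient, new JacksonJsonpMapper());
ElasticsearchClient esClient = new ElasticsearchClient(transport);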
There is no detailed documentation available for the above yet; you can see the open GitHub issue about it here.

Related

Calling a custom API via C# SDK

I have a custom API called "pits_PostProjectLineEntries" which takes in an EntityCollection called Transactions. I am using the option to have the entities in that EntityCollection be expando, since I really don't need to create a table to hold these entities. I am able to call the custom API via Postman with some raw JSON, and it works perfectly. My issue is that when I call it from a C# plugin, I get error messages about the logical name of the entities in the Transactions collection. Is there a way to tell the Xrm SDK that these entities are not tied to a physical table but are just expando?
Here is the code that calls the custom API, for reference:
_transactions = new List<Entity>();
Entity Transaction = new Entity(""); // empty logical name on purpose: these are meant to be expando entities
Transaction["date"] = userReport.GetAttributeValue<DateTime>("pits_date");
Transaction["project"] = userReport.GetAttributeValue<EntityReference>("pits_project");
Transaction["resource"] = userReport.GetAttributeValue<EntityReference>("pits_user");
Transaction["task"] = userReport.GetAttributeValue<EntityReference>("pits_task");
Transaction["worktype"] = response["WorkType"];
Transaction["shifttype"] = response["ShiftType"];
Transaction["quantity"] = RG;
Transaction["chargeable"] = true;
Transaction["tx_type"] = 361200000;
_transactions.Add(Transaction);
EntityCollection Transactions = new EntityCollection(_transactions);
OrganizationRequest request = new OrganizationRequest("pits_PostProjectLineEntries");
request.Parameters.Add("Transactions", Transactions);
var result = _context.OrganizationService.Execute(request);
The error message I get is:
An unexpected error occurred: The entity with a name = '' with namemapping = 'Logical' was not found in the MetadataCache.
I am trying to call this custom API from the C# SDK and pass it an entity that is tied to a DV table.
Sorry if the formatting is bad; I have been using Stack Overflow for a long time, but this is my first time posting.
This is a known bug with Expando entities in the currently shipping SDK assemblies. The Dataverse ServiceClient version (.NET / cross-platform) of the SDK should be working correctly at this time. You can get it here: https://www.nuget.org/packages/Microsoft.PowerPlatform.Dataverse.Client/
The libraries from Microsoft.CrmSdk.CoreAssemblies will have the fix in the next update (at the time of this writing).

The RestHighLevelClient cannot be used with the Elasticsearch 7 BulkProcessor. Which client should be used?

The Elasticsearch 7 documentation
https://www.elastic.co/guide/en/elasticsearch/client/java-api/current/java-docs-bulk-processor.html
mentions the client to be used as follows:
https://www.elastic.co/guide/en/elasticsearch/client/java-api/current/client.html
According to this, the High Level REST Client cannot be used with the BulkProcessor in Elasticsearch 7.
This is different from what they had suggested for Elasticsearch 6:
https://www.elastic.co/guide/en/elasticsearch/client/java-api/6.4/client.html
Could someone mention which client they use with the BulkProcessor in Elasticsearch 7?
RestHighLevelClient can be used with the Elasticsearch 7 BulkProcessor. You should take a look at the source code. The only change is in the RestHighLevelClient bulkAsync method: it now takes an additional RequestOptions parameter. So previously the signature was:
BulkProcessor.builder(restHighLevelClient()::bulkAsync, listener);
The new signature is something as follows:
BulkProcessor.builder(
        (request, bulkListener) -> restHighLevelClient.bulkAsync(request, RequestOptions.DEFAULT, bulkListener),
        bulkProcessorListener);
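For completeness, a fuller sketch of wiring this up in Elasticsearch 7 (a hedged example; the client variable, listener body and tuning values are assumptions, not from the original answer):
import org.elasticsearch.action.bulk.BulkProcessor;
import org.elasticsearch.action.bulk.BulkRequest;
import org.elasticsearch.action.bulk.BulkResponse;
import org.elasticsearch.action.index.IndexRequest;
import org.elasticsearch.client.RequestOptions;
import org.elasticsearch.client.RestHighLevelClient;

// assumes an already configured RestHighLevelClient named client
BulkProcessor.Listener listener = new BulkProcessor.Listener() {
    @Override
    public void beforeBulk(long executionId, BulkRequest request) {
        // e.g. log request.numberOfActions()
    }
    @Override
    public void afterBulk(long executionId, BulkRequest request, BulkResponse response) {
        // e.g. check response.hasFailures()
    }
    @Override
    public void afterBulk(long executionId, BulkRequest request, Throwable failure) {
        // e.g. log the failure
    }
};

BulkProcessor bulkProcessor = BulkProcessor.builder(
        (request, bulkListener) -> client.bulkAsync(request, RequestOptions.DEFAULT, bulkListener),
        listener)
    .setBulkActions(500)      // flush after 500 actions
    .setConcurrentRequests(1) // at most one bulk request in flight
    .build();

bulkProcessor.add(new IndexRequest("index_name").id("1").source("field", "value"));
bulkProcessor.close();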
Hope it helps

Credentials for Google Knowledge Graph

I am trying to use the Google Knowledge Graph API. I already have the API key, and I am using the client library instead of the RESTful API.
kgSearch = Kgsearch::KgsearchService.new
response = kgSearch.search_entities(query: query)
I have tried to instantiate the service as below:
kgSearch = Kgsearch::KgsearchService.new(api: 'klfkdlfkdlm')
It's rejected because the initializer expects no arguments.
Any idea how to add the API key?
I also tried:
response = kgSearch.search_entities(query: query, api: 'fjfkjfl')
Same thing.
Any ideas?
According to the Ruby docs for the Google API Client, key is an instance method where you can assign your API key (http://www.rubydoc.info/github/google/google-api-ruby-client/Google/Apis/KgsearchV1/KgsearchService#key-instance_method).
So I believe you'd do something like the following:
kgSearch = Kgsearch::KgsearchService.new
kgSearch.key = 'your_key_here'
response = kgSearch.search_entities(query: query) # and any other options that are necessary

Does Wearable.DataApi.getDataItems use UriMatcher?

I'm trying to extract all data from Wearable.DataApi that matches wear:/someAttr/*.
The motivation is that I'm using PutDataRequest.createWithAutoAppendedId since I want to avoid overwriting data written on the wearable device.
I would like to match the following URIs:
wear:/someAttr/3/rand1
wear:/someAttr/2/rand2
wear:/someAttr/3/rand6
But not:
wear:/someOtherAttr/3/rand1
Can I use a wildcard to get data from the DataApi?
My current workaround is to call Wearable.DataApi.getDataItems without a URI, which returns all the data but includes unwanted DataItems that I would like to filter out.
Any ideas?
I found the solution in a similar question by @dzeikei.
From the official Android documentation:
int FILTER_PREFIX: Filter type for getDataItems(GoogleApiClient, Uri, int), deleteDataItems(GoogleApiClient, Uri, int): if this filter is set, the given URI will be taken as a path prefix, and the operation will apply to all matching items.
So in order to match my example you would use:
Uri.Builder builder = new Uri.Builder();
builder.scheme(PutDataRequest.WEAR_URI_SCHEME).path("someAttr");
Uri uri = builder.build();
PendingResult<DataItemBuffer> pendingResult = Wearable.DataApi.getDataItems(googleApiClient, uri, DataApi.FILTER_PREFIX);
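To actually read the matching items you then consume the DataItemBuffer; a rough sketch (assuming googleApiClient is already connected, with ResultCallback from com.google.android.gms.common.api) could be:
pendingResult.setResultCallback(new ResultCallback<DataItemBuffer>() {
    @Override
    public void onResult(DataItemBuffer dataItems) {
        for (DataItem item : dataItems) {
            // every item.getUri() here starts with wear:/someAttr/
        }
        dataItems.release(); // buffers must always be released
    }
});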

No signature of method: groovy.lang.MissingMethodException.makeKey()

I've installed titan-0.5.0-hadoop2 with HBase and Elasticsearch support.
I've loaded the graph with:
g = TitanFactory.open('conf/titan-hbase-es.properties')
==>titangraph[hbase:[127.0.0.1]]
and then I loaded the test application:
GraphOfTheGodsFactory.load(g)
Now, when I try to create a new indexed key with:
g.makeKey('userId').dataType(String.class).indexed(Vertex.class).unique().make()
I get this error:
No signature of method: groovy.lang.MissingMethodException.makeKey() is applicable for argument types: () values: []
Possible solutions: every(), any()
Display stack trace? [yN]
Can someone help me with this?
When I list the indexed keys, I see this:
g.getIndexedKeys(Vertex.class)
==>reason
==>age
==>name
==>place
I'm not completely following what you are trying to do. It appears that you loaded the Graph of the Gods into g and then you want to add userId as a new property to the schema. If that's right, then I think your syntax is wrong, given the Titan 0.5 API. The method for managing the schema is very different from previous versions. Changes to the schema are performed through the ManagementSystem interface, which you can get an instance of through:
mgmt = g.getManagementSystem()
The syntax for adding a property then looks something like:
birthDate = mgmt.makePropertyKey('birthDate').dataType(Long.class).cardinality(Cardinality.SINGLE).make()
mgmt.commit()
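Since the original goal was an indexed, unique userId key, a rough equivalent under the 0.5 management API would look something like this (a sketch; the index name byUserId is just an example):
mgmt = g.getManagementSystem()
userId = mgmt.makePropertyKey('userId').dataType(String.class).cardinality(Cardinality.SINGLE).make()
// a composite index backs graph-level lookups on userId; unique() enforces uniqueness
mgmt.buildIndex('byUserId', Vertex.class).addKey(userId).unique().buildCompositeIndex()
mgmt.commit()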
Note that g.getIndexedKeys(Class) is not the appropriate way to get schema information either. You should use the ManagementSystem for that too.
Please see the documentation here for more information.
