Elasticsearch XContentBuilder.bytes removed

I have the following code
TermVectorsResponse resp = request.execute().actionGet();
XContentBuilder builder = XContentFactory.jsonBuilder();
resp.toXContent(builder, null);
Map<String, Object> map = XContentHelper.convertToMap(builder.bytes(), false, XContentType.JSON).v2();
return map;
and my compiler is complaining because it cannot resolve the method bytes() for XContentBuilder. My code worked for Elasticsearch 6.2.2, and I know the compiler error is because the bytes() method was removed in Elasticsearch 7.5. So how am I supposed to create the map variable? I've looked everywhere and I haven't found an answer. I guess I'm supposed to create a JsonXContent and use it instead of builder.bytes(), or retrieve the bytes another way, but I don't know for sure.
Thanks in advance

builder.bytes() was considered a kind of anti-pattern, so you can get a BytesReference from your builder using:
BytesReference.bytes(builder)
example for 7.6:
https://www.elastic.co/guide/en/elasticsearch/client/java-rest/current/java-rest-high-put-stored-script.html
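For example, the snippet from the question could be updated like this for 7.x (a sketch only, assuming the same request and response types as above):
TermVectorsResponse resp = request.execute().actionGet();
XContentBuilder builder = XContentFactory.jsonBuilder();
resp.toXContent(builder, null);
// BytesReference.bytes(builder) replaces the removed builder.bytes()
Map<String, Object> map = XContentHelper.convertToMap(BytesReference.bytes(builder), false, XContentType.JSON).v2();
return map;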

Reactive stream backpressure with spring reactor project

I have researched and read the documents, but they are not very understandable.
What I am trying to achieve is the following functionality:
I am using the Spring Reactor project and its EventBus. My event bus is throwing events to module A.
Module A should receive the event and insert it into a hot stream that holds unique values. Every 250 milliseconds the stream should pull all values and make calculations on them, and so on.
For example:
The eventBus is throwing events with the numbers: 1, 2, 3, 2, 3, 2
The stream should get and hold the unique values -> 1, 2, 3
After 250 milliseconds the stream should print the numbers and empty its values.
Does anyone have an idea how to start? I tried the examples but nothing really works, and I guess I don't understand something. Does anyone have an example?
Thanks
EDIT:
When trying to do the following I always get an exception:
Stream<List<Integer>> s = Streams.wrap(p).buffer(1, TimeUnit.SECONDS);
s.consume(i -> System.out.println(Thread.currentThread() + " data=" + i));
for (int i = 0; i < 10000; i++) {
    p.onNext(i);
}
The exception:
java.lang.IllegalStateException: The environment has not been initialized yet
at reactor.Environment.get(Environment.java:156) ~[reactor-core-2.0.7.RELEASE.jar:?]
at reactor.Environment.timer(Environment.java:184) ~[reactor-core-2.0.7.RELEASE.jar:?]
at reactor.rx.Stream.getTimer(Stream.java:3052) ~[reactor-stream-2.0.7.RELEASE.jar:?]
at reactor.rx.Stream.buffer(Stream.java:2246) ~[reactor-stream-2.0.7.RELEASE.jar:?]
at com.ta.ng.server.controllers.user.UserController.getUsersByOrgId(UserController.java:70) ~[classes/:?]
As you can see, I cannot proceed without getting past this issue.
BY THE WAY: This is happening only when I use buffer(1, TimeUnit.SECONDS). If I use buffer(50), for example, it works. Although this is not the final solution, it's a start.
Well, after reading the docs again, I found what I had missed:
static {
    Environment.initialize();
}
This solved the problem. Thanks.
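Putting it together, a minimal sketch of the working order of operations (assuming reactor-core / reactor-stream 2.0.x as in the stack trace above, and that p is the publisher from the original snippet):
// Initialize the shared Environment once, before any time-based operators are used
static {
    Environment.initialize();
}

// ... later, buffer(1, TimeUnit.SECONDS) can now resolve its timer
Stream<List<Integer>> s = Streams.wrap(p).buffer(1, TimeUnit.SECONDS);
s.consume(i -> System.out.println(Thread.currentThread() + " data=" + i));
for (int i = 0; i < 10000; i++) {
    p.onNext(i);
}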

Java8 streams map - check if all map operations succeeded?

I am trying to map one list to another using streams.
Some elements of the original list fail to map. That is, the mapping function may not be able to find an appropriate new value.
I want to know if any of the mappings has failed. Ideally I would also like to stop the processing once a failure happened.
What I am currently doing is:
The mapping function returns null if there's no mapped value
I filter() to remove nulls from the stream
I collect(), and then
I compare the size of the result to the size of the original list.
For example:
List<String> func(List<String> old, Map<String, String> oldToNew)
{
    List<String> holger = old.stream()
        .map(oldToNew::get)
        .filter(Objects::nonNull)
        .collect(Collectors.toList());
    if (holger.size() < old.size()) {
        // ... appropriate error handling code ...
        throw new IllegalStateException("some lookups failed");
    }
    else {
        return holger;
    }
}
This is not very elegant. Also, everything is processed even when the whole thing should fail.
Suggestions for a better way of doing it?
Or maybe I should ditch streams altogether and use good old loops?
There is no best solution because that heavily depends on the use case. E.g. if lookup failures are expected to be unlikely or the error handling implies throwing an exception anyway, just throwing an exception at the first failed lookup within the mapping function might indeed be a good choice. Then, no follow-up code has to care about error conditions.
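For instance, a minimal sketch of that fail-fast variant (the exception type and message are just examples):
List<String> func(List<String> old, Map<String, String> oldToNew) {
    return old.stream()
        .map(s -> {
            // fail at the first lookup that has no mapped value
            String mapped = oldToNew.get(s);
            if (mapped == null)
                throw new IllegalStateException("lookup failed for: " + s);
            return mapped;
        })
        .collect(Collectors.toList());
}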
Another way of handling it might be:
List<String> func(List<String> old, Map<String, String> oldToNew) {
    Map<Boolean, List<String>> map = old.stream()
        .map(oldToNew::get)
        .collect(Collectors.partitioningBy(Objects::nonNull));
    List<String> failed = map.get(false);
    if (!failed.isEmpty())
        throw new IllegalStateException(failed.size() + " lookups failed");
    return map.get(true);
}
This can still be considered optimized for the successful case, as it collects a mostly meaningless list containing null values for the failures. But it has the advantage of being able to report the number of failures (unlike a throwing map function).
If a detailed error analysis has a high priority, you may use a solution like this:
List<String> func(List<String> old, Map<String, String> oldToNew) {
    Map<Boolean, List<String>> map = old.stream()
        .map(s -> new AbstractMap.SimpleImmutableEntry<>(s, oldToNew.get(s)))
        .collect(Collectors.partitioningBy(e -> e.getValue() != null,
            Collectors.mapping(e -> Optional.ofNullable(e.getValue()).orElse(e.getKey()),
                Collectors.toList())));
    List<String> failed = map.get(false);
    if (!failed.isEmpty())
        throw new IllegalStateException("The following key(s) failed: " + failed);
    return map.get(true);
}
It collects two meaningful lists, containing the failed keys for failed lookups and a list of successfully mapped values. Note that both lists could be returned.
You could also use Objects::requireNonNull in the pipeline instead of a filter and catch the resulting NullPointerException outside the stream.
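A sketch of that idea (requireNonNull is applied via map() here, since filter() expects a boolean-valued predicate; the rethrown exception type is just an example):
List<String> func(List<String> old, Map<String, String> oldToNew) {
    try {
        return old.stream()
            .map(oldToNew::get)
            .map(Objects::requireNonNull)   // throws NullPointerException on the first failed lookup
            .collect(Collectors.toList());
    } catch (NullPointerException e) {
        // at least one lookup failed; handle as appropriate
        throw new IllegalStateException("lookup failed", e);
    }
}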

Spring Data + QueryDSL empty predicate + Predicate chaining

let me get straight to the point.
I am using Spring Data JPA with QueryDSL in a project and I cannot figure out this myself.
I have the QueryDSL predicates in static methods that take arguments, and if the argument is not valid the method should return an "empty predicate":
public static BooleanExpression byWhateverId(Long whateverId) {
    if (whateverId == null) return [insert magic here];
    // if the parameter is OK, return the usual predicate
    return QClass.property.whateverId.eq(whateverId);
}
Now I want to be able to chain these predicates using AND/OR operators:
someRepository.findAll(byWhateverId(someParam).and(bySomethingElseId(1)));
The problem here is that at this point I don't know whether 'someParam' is null or not (of course I can check but that's a lot of IFs).
I also know I can use BooleanBuilder class but that seems also like a lot of code that should not be needed.
Does anybody know what could be inserted instead of "[insert magic here]"?
Or maybe I am missing something somewhere...
Thanks!
You can return null for non-matching predicates in byWhateverId and bySomethingElseId and combine the predicates via ExpressionUtils.allOf().
In your case
Predicate where = ExpressionUtils.allOf(byWhateverId(someParam), bySomethingElseId(1));
someRepository.findAll(where);
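Put together, a sketch of how this could look (assuming QueryDSL's ExpressionUtils, which skips null arguments when combining predicates):
public static BooleanExpression byWhateverId(Long whateverId) {
    // returning null here means "no restriction"; allOf() ignores null predicates
    if (whateverId == null) return null;
    return QClass.property.whateverId.eq(whateverId);
}

// at the call site
Predicate where = ExpressionUtils.allOf(byWhateverId(someParam), bySomethingElseId(1L));
someRepository.findAll(where);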
A 4-year-old question, but anyway...
You can return an SQL predicate which is always true, like true = true:
public static BooleanExpression alwaysTrue() {
    return Expressions.TRUE.isTrue();
}
If you have a bunch of these, the generated SQL won't be super nice, so you might want to keep such usages to a minimum.
Sorry, I completely forgot about this.
The right solution (from my point of view) is to use BooleanBuilder.
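For reference, a minimal sketch of the BooleanBuilder variant (the Q-type fields below are hypothetical, named after the methods in the question):
// BooleanBuilder implements Predicate, so it can be passed to findAll() directly
BooleanBuilder builder = new BooleanBuilder();
if (someParam != null) {
    builder.and(QClass.property.whateverId.eq(someParam));
}
builder.and(QClass.property.somethingElseId.eq(1L));
someRepository.findAll(builder);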

Serializing JsonArray as array and not object using Gson

I'm using the Gson library, but when it serializes the JsonArray object, it seems to serialize it as an object rather than a JSON array, i.e.
{ "elements": [ {"name":"value1"}, {"name":"value2"} ] }
How do I prevent the elements wrapper from being serialized?
I went to see the doctor, because my foot hurt when I walked on it. The doctor said, "Don't walk on it."
Generally, when working with an API like Gson, one would rather not even know that JsonArray exists, and they'd instead just use the data binding part of the API. So, instead of manually building a JsonArray, and then deal with serializing it, just feed a Java List or array to Gson.toJson(). For example:
List<Object> list = new ArrayList<>();
list.add("one");
list.add(2);
list.add(new Foo());
Gson gson = new Gson();
String json = gson.toJson(list);
System.out.println(json);
If that approach doesn't fit your needs and you're stuck using a JsonArray for some reason, you might be tempted to just call its toString() method, since that does currently create what's ultimately desired. I wouldn't use it, because nothing in the documentation guarantees that toString() creates a valid JSON representation of the enclosed array contents, so it might not return a String of the same format in future releases of Gson.
At any rate, if you really want to use a JsonArray, it should serialize well enough as follows.
JsonElement one = new JsonPrimitive("one");
JsonElement two = new JsonPrimitive(2);
JsonObject foo = new JsonObject();
foo.addProperty("foo", new Foo().foo);
JsonArray jsonArray = new JsonArray();
jsonArray.add(one);
jsonArray.add(two);
jsonArray.add(foo);
System.out.println(new Gson().toJson(jsonArray));
// ["one",2,{"foo":"hi"}]
Note: This answer is based on the Gson 2.2 API. I don't recall whether earlier versions of Gson included the overloaded toJson(JsonElement) methods.
If the toJson method is already being used in this fashion (to serialize a JsonArray), but the output is as demonstrated in the original question, recall that Java doesn't consider the runtime type when selecting amongst overloaded methods. It binds to the compile time type. (Lame -- I know.) So, you may need to cast the argument type to JsonElement, to let the compiler know which method to bind to. The following demonstrates what might be effectively happening in the original question.
System.out.println(new Gson().toJson((Object)jsonArray));
// {"elements":["one",2,{"foo":"hi"}]}

Why does MemoryStream.GetBuffer() always throw?

The following code will always throw
UnauthorizedAccessException (MemoryStream's internal buffer cannot be accessed.)
byte[] buf1 = { 2, 3, 5, 7, 11 };
var ms = new MemoryStream(buf1);
byte[] buf2 = ms.GetBuffer(); // exception will be thrown here
This is in a plain old console app and I'm running as an admin. I can't imagine a more privileged setting I could give this code. So why can't I get at this buffer? (And if nobody can, what's the point of the GetBuffer method?)
The MSDN docs say:
To create a MemoryStream instance with a publicly visible buffer, use MemoryStream, MemoryStream(Byte[], Int32, Int32, Boolean, Boolean), or MemoryStream(Int32).
Am I not doing that?
P.S. I don't want to use ToArray() because that makes a copy.
Here is the documentation for MemoryStream(byte[]) constructor that you're using. It specifically says:
This constructor does not expose the underlying stream. GetBuffer throws UnauthorizedAccessException.
You should use this constructor instead, with publiclyVisible = true.
Check the docs for MemoryStream.GetBuffer()
To create a MemoryStream instance with a publicly visible buffer, use MemoryStream, MemoryStream(Byte[], Int32, Int32, Boolean, Boolean), or MemoryStream(Int32). If the current stream is resizable, two calls to this method do not return the same array if the underlying byte array is resized between calls. For additional information, see Capacity.
You need to use a different constructor.
To add to what others have already put in here...
Another way to get your code to work is to change it to the following line:
byte[] buf2 = ms.ToArray();
You appear to be using MemoryStream(Byte[]), which does not match any of the three versions mentioned in the docs.
