How to perform XRANGE on Redis Streams using Spring Boot

I am using the code below to perform an XRANGE operation on a Redis Stream, but it does not give any output even when the range contains values. Is this approach correct, or does it need to be done differently? I am using Spring Boot 2.2.4.
String from = start + "-0";
String to = end + "-0";
Range<String> range = Range.closed(from, to);
List<MapRecord<String, Object, Object>> mapRecords = template.opsForStream().range("SAMPLE.STREAM", range);
Iterator<MapRecord<String, Object, Object>> iterator = mapRecords.iterator();
while (iterator.hasNext()) {
    MapRecord<String, Object, Object> current = iterator.next();
    log.info("Record Id: {}, Stream: {}, Value: {}", current.getId(), current.getStream(), current.getValue());
}
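For anyone debugging the same symptom, a minimal sanity check (a sketch reusing the question's template, log, and stream key; not a confirmed fix): query the stream with an unbounded range first, which corresponds to XRANGE SAMPLE.STREAM - +. If this also returns nothing, the problem is the key or the template's serializers rather than the range bounds.

// Sanity check: fetch everything in the stream (equivalent to XRANGE SAMPLE.STREAM - +).
Range<String> all = Range.unbounded(); // org.springframework.data.domain.Range
List<MapRecord<String, Object, Object>> everything =
        template.opsForStream().range("SAMPLE.STREAM", all);
// If this logs 0, the range bounds were never the problem.
log.info("Total records in SAMPLE.STREAM: {}", everything.size());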

Related

Spring Boot Neo4jRepository Find All Items With Stream

You know that there is a streaming find-all method under JpaRepository:
@QueryHints(value = {
    @QueryHint(name = HINT_FETCH_SIZE, value = ""),
    @QueryHint(name = HINT_CACHEABLE, value = "false"),
    @QueryHint(name = HINT_READONLY, value = "true")
})
Stream<PaidKeyword> findAllBy();
Can I use the same or a similar method with Neo4jRepository?
My goal is simple: to fetch all the data (~5M records) in chunks, efficiently. Otherwise, I get timeout errors...
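No answer is recorded here, but a hedged sketch of one workaround: Neo4jRepository extends PagingAndSortingRepository, so the data can be pulled in bounded pages instead of one long-running query. PaidKeywordRepository and process() are hypothetical names used only for illustration.

// Fetch ~5M records in bounded chunks; each findAll(pageable) is a separate, short query.
Pageable pageable = PageRequest.of(0, 10_000); // page size is an assumption: tune to your heap and timeouts
Page<PaidKeyword> page;
do {
    page = paidKeywordRepository.findAll(pageable); // hypothetical repository bean
    page.forEach(this::process);                    // hypothetical per-record handler
    pageable = pageable.next();
} while (page.hasNext());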

HashMap with a Pair has a strange value in the database

When I insert an object into my database through a REST controller, the value of the map, which is a Pair, ends up as a really strange string in the database. Is this serialized? Can someone explain?
"openHours": {
"monday": {
"first": "19:58",
"second": "20:58"
}
}
Part of the model:
@ElementCollection
var openHours: Map<String, Pair<LocalTime, LocalTime>> = HashMap()
Database:
key: monday
value:
aced00057372000b6b6f746c696e2e50616972fa1b06813de78f780200024c000566697273747400124c6a6176612f6c616e672f4f626a6563743b4c00067365636f6e6471007e000178707372000d6a6176612e74696d652e536572955d84ba1b2248b20c0000787077030413c5787371007e000377030414c578
This is the Java object serialization format.
You can use an ObjectInputStream to deserialize the object, like so:
ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(/* ... your data here ... */));
Pair<LocalTime, LocalTime> pair = (Pair<LocalTime, LocalTime>) ois.readObject();
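The blob shows up because JPA has no built-in mapping for kotlin.Pair, so the provider falls back to Java serialization for the @ElementCollection values. If you would rather store readable text, one option (a sketch, not the only fix; javax-era JPA package names assumed) is an AttributeConverter that flattens the pair to a string:

import javax.persistence.AttributeConverter;
import javax.persistence.Converter;
import java.time.LocalTime;
import kotlin.Pair;

// Stores the pair as e.g. "19:58-20:58" instead of a serialized Java object.
@Converter
public class LocalTimePairConverter implements AttributeConverter<Pair<LocalTime, LocalTime>, String> {

    @Override
    public String convertToDatabaseColumn(Pair<LocalTime, LocalTime> attribute) {
        return attribute.getFirst() + "-" + attribute.getSecond();
    }

    @Override
    public Pair<LocalTime, LocalTime> convertToEntityAttribute(String dbData) {
        String[] parts = dbData.split("-");
        return new Pair<>(LocalTime.parse(parts[0]), LocalTime.parse(parts[1]));
    }
}

You would then attach the converter to the map's values via @Convert on the collection attribute; check your JPA provider's documentation for the exact placement next to @ElementCollection.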

How to collect a map from a Set of objects that each hold a list, using Collectors.toMap

I have a class Element with a list; my intended output is like this:
Map<String , List<Element>>
{
    1 = [Element3, Element1],
    2 = [Element2, Element1],
    3 = [Element2, Element1],
    4 = [Element2]
}
My input is a set of Element objects. I used forEach to get the desired outcome, but I'm looking for how to collect it using Collectors.toMap. Any inputs are much appreciated.
Set<Element> changes = new HashSet<>();
List<String> interestList = new ArrayList<>();
interestList.add("1");
interestList.add("2");
interestList.add("3");
Element element = new Element(interestList);
changes.add(element);

interestList = new ArrayList<>();
interestList.add("2");
interestList.add("3");
interestList.add("4");
element = new Element(interestList);
changes.add(element);

Map<String, List<Element>> collect2 = new HashMap<>();
// The lambda parameter is renamed to "e" because it may not shadow the local
// variable "element" above, and the add() call must use that parameter
// (the original "elementList.add(Element)" does not compile).
changes.forEach(e -> {
    e.getInterestedList().forEach(tracker -> {
        collect2.compute(tracker, (key, val) -> {
            List<Element> elementList = val == null ? new ArrayList<>() : val;
            elementList.add(e);
            return elementList;
        });
    });
});
class Element {
    List<String> interestedList;
    static AtomicInteger sequencer = new AtomicInteger(0);
    String mName;

    public Element(List<String> aList) {
        interestedList = aList;
        mName = "Element" + sequencer.incrementAndGet();
    }

    public List<String> getInterestedList() {
        return interestedList;
    }

    @Override
    public String toString() {
        return mName;
    }
}
You can do it by using Collectors.groupingBy instead of Collectors.toMap, along with Collectors.mapping, which adapts a collector to another collector:
Map<String, List<Element>> result = changes.stream()
        .flatMap(e -> e.getInterestedList().stream().map(t -> Map.entry(t, e)))
        .collect(Collectors.groupingBy(
                Map.Entry::getKey,
                Collectors.mapping(Map.Entry::getValue, Collectors.toList())));
You need to use the Stream.flatMap method first to pair each element of the inner lists with the current Element instance. I did this via Java 9's Map.entry(key, value) method. If you're not on Java 9 yet, you can change it to new AbstractMap.SimpleEntry<>(key, value).
After flat-mapping, we need to collect Map.Entry instances. I'm using Collectors.groupingBy to classify the entries by key (which holds each element of the inner lists, i.e. what you call tracker in your code). Then, as we don't want List<Map.Entry<String, Element>> instances as the values of the map, we transform each Map.Entry<String, Element> of the stream to just the Element (that's why Map.Entry::getValue is the first argument of Collectors.mapping). We also need a downstream collector (here Collectors.toList()), so that the outer Collectors.groupingBy collector knows where to place all the adapted elements of the stream that belong to each group.
A shorter and surely more efficient way to do the same (similar to your attempt) could be:
Map<String, List<Element>> result = new HashMap<>();
changes.forEach(e ->
        e.getInterestedList().forEach(t ->
                result.computeIfAbsent(t, k -> new ArrayList<>()).add(e)));
This uses Map.computeIfAbsent, which is a perfect fit for your use case.
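And since the question literally asks about Collectors.toMap: it can be done, but you need a merge function to concatenate the lists that collide on the same key, which is exactly the bookkeeping Collectors.groupingBy does for you. A sketch (Java 9+, same Map.entry flat-mapping as above):

Map<String, List<Element>> viaToMap = changes.stream()
        .flatMap(e -> e.getInterestedList().stream().map(t -> Map.entry(t, e)))
        .collect(Collectors.toMap(
                Map.Entry::getKey,                                   // tracker value becomes the key
                entry -> new ArrayList<>(List.of(entry.getValue())), // one-element mutable list
                (a, b) -> { a.addAll(b); return a; }));              // merge on key collision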

TensorFlow serving function using tf.Estimator causes an error when called from Java

I have successfully created the model and wanted to export it for prediction from a Java client, but invoking prediction via the prediction stub from Java errors out because I need to feed the serialized Example into a placeholder object when calling predict:
You must feed a value for placeholder tensor 'input_example_tensor' with dtype string and shape [?]
Can anyone help me out with creating tensor placeholders using protobuf in Java? The full error is below:
io.grpc.StatusRuntimeException: INVALID_ARGUMENT: You must feed a value for placeholder tensor 'input_example_tensor' with dtype string and shape [?]
[[Node: input_example_tensor = Placeholder[dtype=DT_STRING, shape=[?], _device="/job:localhost/replica:0/task:0/device:CPU:0"]()]]
at io.grpc.stub.ClientCalls.toStatusRuntimeException(ClientCalls.java:221)
at io.grpc.stub.ClientCalls.getUnchecked(ClientCalls.java:202)
at io.grpc.stub.ClientCalls.blockingUnaryCall(ClientCalls.java:131)
at tensorflow.serving.PredictionServiceGrpc$PredictionServiceBlockingStub.predict(PredictionServiceGrpc.java:332)
My signature definition, as reported by saved_model_cli, is as follows:
The given SavedModel SignatureDef contains the following input(s):
  inputs['inputs'] tensor_info:
      dtype: DT_STRING
      shape: (-1)
      name: Placeholder:0
The given SavedModel SignatureDef contains the following output(s):
  outputs['classes'] tensor_info:
      dtype: DT_STRING
      shape: (-1, 2)
      name: dnn/head/Tile:0
  outputs['scores'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 2)
      name: dnn/head/predictions/probabilities:0
Method name is: tensorflow/serving/classify
Please find below the Java code used to create the request object:
long start1 = System.currentTimeMillis();
HashMap<String, Feature> inputFeatureMap = new HashMap<>();
ByteString inputStr = null;
List<ByteString> inputList = new ArrayList<>();
HashMap<String, Object> inputData = new HashMap<>();
inputData.put("bid", Float.parseFloat("-1.628"));
inputData.put("day_of_week", "6");
inputData.put("hour_of_day", "5");
inputData.put("connType", "wifi");
inputData.put("geo", "0");
inputData.put("size", "Phone");
inputData.put("cat", "arcadegame");
inputData.put("os", "7");
inputData.put("conv", Float.parseFloat("4"));
inputData.put("time", Float.parseFloat("650907"));
inputData.put("conn", Float.parseFloat("5"));
for (Map.Entry<String, Object> entry : inputData.entrySet()) {
    Feature feature = null;
    String featureName = entry.getKey();
    Object featureValue = entry.getValue();
    if (featureValue instanceof Float) {
        feature = Feature.newBuilder()
                .setFloatList(FloatList.newBuilder().addValue(Float.parseFloat(featureValue.toString())))
                .build();
    } else if (featureValue instanceof String) {
        feature = Feature.newBuilder()
                .setBytesList(BytesList.newBuilder().addValue(ByteString.copyFromUtf8(featureValue.toString())))
                .build();
    } else if (featureValue instanceof Integer) {
        feature = Feature.newBuilder()
                .setInt64List(Int64List.newBuilder().addValue(Integer.parseInt(featureValue.toString())))
                .build();
    }
    if (feature != null) {
        inputFeatureMap.put(featureName, feature);
    }
}
// Build the Example once, after the loop; the original rebuilt it on every
// iteration, which only worked by accident because the map accumulates.
Features features = Features.newBuilder().putAllFeature(inputFeatureMap).build();
inputStr = Example.newBuilder().setFeatures(features).build().toByteString();
TensorProto.Builder asyncReBuilder = TensorProto.newBuilder();
asyncReBuilder.addStringVal(inputStr);
// The shape must match the number of string values added above. The original
// used inputList.size(), but inputList is never populated, so the declared
// shape was [0] for a tensor actually holding one serialized Example.
TensorShapeProto.Dim idsDim2 = TensorShapeProto.Dim.newBuilder().setSize(1).build();
TensorShapeProto idsShape2 = TensorShapeProto.newBuilder().addDim(idsDim2).build();
asyncReBuilder.setDtype(DataType.DT_STRING).setTensorShape(idsShape2);
TensorProto proto = asyncReBuilder.build();

// Generate gRPC request
com.google.protobuf.Int64Value version = com.google.protobuf.Int64Value.newBuilder()
        .setValue(modelVersion).build();
Model.ModelSpec modelSpec = Model.ModelSpec.newBuilder().setName(modelName).setVersion(version).build();
Predict.PredictRequest request = Predict.PredictRequest.newBuilder()
        .setModelSpec(modelSpec)
        .putAllInputs(ImmutableMap.of("inputs", proto))
        .build();
// Request gRPC server
PredictResponse response;
try {
    response = blockingStub.predict(request);
    long end = System.currentTimeMillis();
    long diff = end - start1;
    System.out.println("diff: " + diff);
    System.out.println("Response output count is - " + response.getOutputsCount());
    System.out.println("outputs are: - " + response.getOutputs());
    System.out.println("*********************************************");
    // response = asyncStub.predict(request);
    System.out.println("PREDICTION COMPLETE>>>>>>");
} catch (StatusRuntimeException e) {
    e.printStackTrace();
    return;
}
NOTE: I had originally exported the model using the following export function:
def _make_serving_input_fn(working_dir):
  """Creates an input function reading from raw data.

  Args:
    working_dir: Directory to read transformed metadata from.

  Returns:
    The serving input function.
  """
  raw_feature_spec = RAW_DATA_METADATA.schema.as_feature_spec()
  # Remove label since it is not available during serving.
  raw_feature_spec.pop(LABEL_KEY)

  def serving_input_fn():
    raw_input_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(
        raw_feature_spec)
    raw_features, _, default_inputs = raw_input_fn()
    # Apply the transform function that was used to generate the materialized
    # data.
    _, transformed_features = (
        saved_transform_io.partially_apply_saved_transform(
            os.path.join(working_dir, transform_fn_io.TRANSFORM_FN_DIR),
            raw_features))
    serialized_tf_example = tf.placeholder(dtype=tf.string, shape=[None])
    receiver_tensors = {'examples': serialized_tf_example}
    return tf.estimator.export.ServingInputReceiver(transformed_features,
                                                    receiver_tensors)

  return serving_input_fn
Anyway, I resolved it using a different serving export function, given below:
def _make_serving_input_fn(working_dir):
  """Creates an input function reading from raw data.

  Args:
    working_dir: Directory to read transformed metadata from.

  Returns:
    The serving input function.
  """
  raw_feature_spec = RAW_DATA_METADATA.schema.as_feature_spec()
  # Remove label since it is not available during serving.
  raw_feature_spec.pop(LABEL_KEY)

  def serving_input_fn():
    raw_input_fn = input_fn_utils.build_parsing_serving_input_fn(
        raw_feature_spec, default_batch_size=None)
    raw_features, _, inputs = raw_input_fn()
    # Apply the transform function that was used to generate the materialized
    # data.
    _, transformed_features = (
        saved_transform_io.partially_apply_saved_transform(
            os.path.join(working_dir, transform_fn_io.TRANSFORM_FN_DIR),
            raw_features))
    return tf.estimator.export.ServingInputReceiver(transformed_features, inputs)

  return serving_input_fn
The change was to get the inputs from the deprecated contrib function input_fn_utils, then apply the transformation, and finally create and return a ServingInputReceiver().

Spring Data Neo4j Ridiculously Slow Over REST

public List<Errand> interestFeed(Person person, int skip, int limit)
        throws ControllerException {
    person = validatePerson(person);
    String query = String.format(
            "START n=node:ErrandLocation('withinDistance:[%.2f, %.2f, %.2f]') RETURN n ORDER BY n.added DESC SKIP %s LIMIT %s",
            person.getLongitude(), person.getLatitude(),
            person.getWidth(), skip, limit);
    String queryFast = String.format(
            "START n=node:ErrandLocation('withinDistance:[%.2f, %.2f, %.2f]') RETURN n SKIP %s LIMIT %s",
            person.getLongitude(), person.getLatitude(),
            person.getWidth(), skip, limit);
    Set<Errand> errands = new TreeSet<Errand>();
    System.out.println(queryFast);
    Result<Map<String, Object>> results = template.query(queryFast, null);
    Iterator<Errand> objects = results.to(Errand.class).iterator();
    return copyIterator(objects);
}

public List<Errand> copyIterator(Iterator<Errand> iter) {
    long start = System.currentTimeMillis();
    List<Errand> copy = new ArrayList<Errand>();
    while (iter.hasNext()) {
        Errand e = iter.next();
        copy.add(e);
        System.out.println(e.getType());
    }
    long end = System.currentTimeMillis();
    p((end - start) / 1000.0);
    return copy;
}
When I profile the copyIterator function, it takes about 6 seconds to fetch just 10 results. I use Spring Data Neo4j REST to connect to a Neo4j server running on my local machine. I even put in a print statement to see how fast the iterator is converted to a list, and it does appear slow. Does each iterator.next() make a new HTTP call?
If Errand is a node entity, then yes: spring-data-neo4j will make an HTTP call for each entity to fetch all of its labels (this is Neo4j's fault; it doesn't return labels when you return a whole node in Cypher).
You can enable debug-level logging on org.springframework.data.neo4j.rest.SpringRestCypherQueryEngine to log all Cypher statements going to Neo4j.
To avoid this call, use @QueryResult: http://docs.spring.io/spring-data/data-neo4j/docs/current/reference/html/#reference_programming-model_mapresult
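For reference, a rough sketch of what such a projection can look like (annotation names from SDN 3.x's org.springframework.data.neo4j.annotation package; the projected columns are illustrative assumptions, not taken from the question):

import org.springframework.data.neo4j.annotation.QueryResult;
import org.springframework.data.neo4j.annotation.ResultColumn;

// A plain projection mapped from the query's returned columns, so no node
// entity is hydrated and no per-node label lookups are triggered.
@QueryResult
public interface ErrandFeedItem {

    @ResultColumn("n.type")
    String getType();

    @ResultColumn("n.added")
    Long getAdded();
}

An interface like this is typically used as the return type of a repository @Query method whose query returns those columns, instead of returning whole nodes.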
