Serializing an object to a JSON input stream using GSON? - gson

I'm not sure if I'm asking for the right thing, but is it possible to make the Gson.toJson(...) family of methods work in "streaming mode" while serializing to JSON? Say, there are cases when using an Appendable is not possible:
final String json = gson.toJson(value);
final byte[] bytes = json.getBytes(charset);
try ( final InputStream inputStream = new ByteArrayInputStream(bytes) ) {
    inputStreamConsumer.accept(inputStream);
}
The example above is not perfect in this scenario, because:
It generates a string json as a temporary buffer.
The json string produces a new byte array just to wrap it up into a ByteArrayInputStream instance.
I think it's not a big problem to write a CharSequence-to-InputStream adapter and get rid of the byte array copy, but I still couldn't get rid of the temporary string buffer while feeding the inputStreamConsumer efficiently. So I'd expect something like:
try ( final InputStream inputStream = gson.toJsonInputStream(value) ) {
    inputStreamConsumer.accept(inputStream);
}
Is it possible using just GSON somehow?

According to this comment, this cannot be done using GSON.
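That said, if the goal is only to avoid the intermediate String and byte[], a common workaround outside of Gson itself is to pipe the Appendable that toJson(value, writer) writes to into an InputStream. This is a hedged sketch, not a Gson feature; gson, value, and inputStreamConsumer are the names from the question, and serialization runs on a helper thread so the pipe does not fill up and deadlock:
final PipedInputStream pipedIn = new PipedInputStream();
final PipedOutputStream pipedOut = new PipedOutputStream(pipedIn);

// Serialize on a separate thread; Gson writes straight into the pipe.
final Thread serializer = new Thread(() -> {
    try (Writer writer = new OutputStreamWriter(pipedOut, StandardCharsets.UTF_8)) {
        gson.toJson(value, writer);
    } catch (IOException e) {
        throw new UncheckedIOException(e);
    }
});
serializer.start();

try (InputStream inputStream = pipedIn) {
    inputStreamConsumer.accept(inputStream);
}
serializer.join();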

Related

How to merge a serialized protobuf with a another protobuf without deserializing the first

I'm trying to understand if it's possible to take a serialized protobuf that makes up part of another protobuf and merge them together without having to deserialize the first protobuf.
For example, given a protobuf wrapper:
syntax = "proto2";
import "content.proto";
message WrapperContent {
  required string metatData = 1;
  required Content content = 2;
}
And then imagine we get a serialized version of Content below (i.e. Content that is coming from a remote client):
syntax = "proto2";
message Content {
  required string name = 1;
  required bytes payload = 2;
}
Do you know of any way I can inject the serialized Content into the WrapperContent without first having to deserialize Content?
The reason I'm trying to inject Content without deserializing it is to save on the overhead of deserializing the message.
If the answer is "no, it's not possible", that is still helpful.
Thanks, Mike.
In protobuf, submessages are encoded on the wire the same way as bytes fields.
So you can make a modified copy of your wrapper:
message WrapperContentBytes {
  required string metatData = 1;
  required bytes content = 2;
}
and write the already serialized content data into the content field.
Decoders can then use the unmodified WrapperContent message to decode the submessage as well. The binary data on the wire is identical, so decoders cannot tell the difference.
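As an illustration only: assuming Java classes have been generated from the modified WrapperContentBytes schema above, injecting the pre-serialized Content could look roughly like this (receiveContentBytes() is a hypothetical stand-in for however the client bytes arrive):
// Hypothetical sketch based on classes generated from WrapperContentBytes above.
byte[] serializedContent = receiveContentBytes(); // raw Content bytes from the remote client

WrapperContentBytes wrapper = WrapperContentBytes.newBuilder()
        .setMetatData("some metadata")
        .setContent(ByteString.copyFrom(serializedContent)) // injected without parsing
        .build();

byte[] wireBytes = wrapper.toByteArray();
// A reader using the original WrapperContent schema can parse wireBytes unchanged,
// because a length-delimited submessage and a bytes field encode identically.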

Java Stream BufferedReader file stream

I am using Java 8 streams to create a stream from a CSV file.
I am using BufferedReader.lines(); the docs for BufferedReader.lines() say:
After execution of the terminal stream operation there are no guarantees that the reader will be at a specific position from which to read the next character or line.
import java.io.BufferedReader;
import java.io.FileNotFoundException;
import java.io.FileReader;
import java.io.Reader;

public class Streamy {
    public static void main(String args[]) {
        Reader reader = null;
        BufferedReader breader = null;
        try {
            reader = new FileReader("refined.csv");
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        }
        breader = new BufferedReader(reader);
        long l1 = breader.lines().count();
        System.out.println("Line Count " + l1); // this works correctly
        long l2 = breader.lines().count();
        System.out.println("Line Count " + l2); // this gives 0
    }
}
It looks like after reading the file for the first time, the reader does not return to the beginning of the file. What is the way around this problem?
It looks like after reading the file for the first time, the reader does not return to the beginning of the file.
No - and I don't know why you would expect it to given the documentation you quoted. Basically, the lines() method doesn't "rewind" the reader before starting, and may not even be able to. (Imagine the BufferedReader wraps an InputStreamReader which wraps a network connection's InputStream - once you've read the data, it's gone.)
What is the way around for this problem
Two options:
Reopen the file and read it from scratch
Save the result of lines() to a List<String>, so that you're then not reading from the file at all the second time. For example:
List<String> lines = breader.lines().collect(Collectors.toList());
As an aside, I'd strongly recommend using Files.newBufferedReader instead of FileReader - the latter always uses the platform default encoding, which isn't generally a good idea.
And for that matter, to load all the lines into a list, you can just use Files.readAllLines... or Files.lines if you want the lines as a stream rather than a list. (Note the caveats in the comments, however.)
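A minimal sketch of those suggestions, assuming the refined.csv file from the question and UTF-8 content:
Path path = Paths.get("refined.csv");

// Load everything once, then work over the in-memory list as often as needed.
List<String> lines = Files.readAllLines(path, StandardCharsets.UTF_8);
System.out.println("Line Count " + lines.size());
System.out.println("Line Count " + lines.stream().count()); // same count, no re-read

// Or open a fresh stream each time; Files.lines defaults to UTF-8.
try (Stream<String> stream = Files.lines(path)) {
    System.out.println("Line Count " + stream.count());
}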
Probably the cited fragment from the JavaDoc needs to be clarified. Usually you would expect that after reading the whole file the reader will point to the end of the file. But with streams it depends on whether a short-circuiting terminal operation is used and whether the stream is parallel. For example, if you use
String magicLine = breader.lines()
        .filter(str -> str.startsWith("magic"))
        .findAny()
        .orElse(null);
your reader will likely stop after the first matching line is found (because there is no need to read further), or read the whole input file if no such line is found. If you perform the same operation on a parallel stream, the resulting position will be unpredictable, because the input is split into implementation-dependent chunks where the search is performed. That's why the documentation is written this way.
As for workarounds, please read @JonSkeet's answer. And consider closing your streams via the try-with-resources construct.
If there are no guarantees that the reader will be at a specific line, why wouldn't you create two readers?
reader1=new FileReader("refined.csv");
reader2=new FileReader("refined.csv");

Can I write different types of messages to a chronicle-queue?

I would like to write different types of messages to a chronicle-queue, and process messages in consumers depending on their types.
How can I do that?
Chronicle Queue provides low-level building blocks you can use to write any kind of message, so it is up to you to choose the right data structure.
As an example, you can prefix the data you write to a chronicle with a small header containing some metadata, and then use it as a discriminator for data processing.
To achieve this I use Wire:
try (DocumentContext dc = appender.writingDocument())
{
    final Wire wire = dc.wire();
    final ValueOut valueOut = wire.getValueOut();
    valueOut.typePrefix(m.getClass());
    valueOut.marshallable(m);
}
When reading back I:
try (DocumentContext dc = tailer.readingDocument())
{
    final Wire wire = dc.wire();
    final ValueIn valueIn = wire.getValueIn();
    final Class clazz = valueIn.typePrefix();
    // msgPool is a preallocated hashmap containing the messages I read
    final ReadMarshallable readObject = msgPool.get(clazz);
    valueIn.readMarshallable(readObject);
    // readObject can now be used
}
You can also write/read a generic object. This will be slightly slower than using your own scheme, but it is a simple way to always read the type you wrote.
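For completeness, a hedged sketch of that generic write/read variant; the ValueOut.object(...) and ValueIn.object() calls are my assumption about the Chronicle Wire API, so check them against the version you actually use:
// Write: type information is stored alongside the payload.
try (DocumentContext dc = appender.writingDocument()) {
    dc.wire().getValueOut().object(m);
}

// Read: a new instance is allocated per message, hence the slight overhead.
try (DocumentContext dc = tailer.readingDocument()) {
    final Object message = dc.wire().getValueIn().object();
    // dispatch on message.getClass() as needed
}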

Serializing JsonArray as array and not object using Gson

I'm using the Gson library, but when it serializes a JsonArray object, it seems to serialize it as an object rather than a JSON array, i.e.
{ elements: [ {name:"value1"}, {name:"value2"}]}
How do I prevent the elements wrapper from being serialized?
I went to see the doctor, because my foot hurt when I walked on it. The doctor said, "Don't walk on it."
Generally, when working with an API like Gson, one would rather not even know that JsonArray exists, and they'd instead just use the data binding part of the API. So, instead of manually building a JsonArray and then dealing with serializing it, just feed a Java List or array to Gson.toJson(). For example:
List<Object> list = new ArrayList<>();
list.add("one");
list.add(2);
list.add(new Foo());
Gson gson = new Gson();
String json = gson.toJson(list);
System.out.println(json);
If that approach doesn't fit your needs and you're stuck using a JsonArray for some reason, then you might be tempted to just call its toString() method, since that does currently create what's ultimately desired. I wouldn't use it, though, because there is nothing in the documentation that says toString() is guaranteed to create a valid JSON representation of the enclosed array contents, so it might not return a String of the same format in future releases of Gson.
At any rate, if you really want to use a JsonArray, it should serialize well enough as follows.
JsonElement one = new JsonPrimitive("one");
JsonElement two = new JsonPrimitive(2);
JsonObject foo = new JsonObject();
foo.addProperty("foo", new Foo().foo);
JsonArray jsonArray = new JsonArray();
jsonArray.add(one);
jsonArray.add(two);
jsonArray.add(foo);
System.out.println(new Gson().toJson(jsonArray));
// ["one",2,{"foo":"hi"}]
Note: This answer is based on the Gson 2.2 API. I don't recall whether earlier versions of Gson included the overloaded toJson(JsonElement) methods.
If the toJson method is already being used in this fashion (to serialize a JsonArray), but the output is as demonstrated in the original question, recall that Java doesn't consider the runtime type when selecting amongst overloaded methods. It binds to the compile time type. (Lame -- I know.) So, you may need to cast the argument type to JsonElement, to let the compiler know which method to bind to. The following demonstrates what might be effectively happening in the original question.
System.out.println(new Gson().toJson((Object)jsonArray));
// {"elements":["one",2,{"foo":"hi"}]}

Map Support in Shell Scripting

I am new to shell scripting, but I am comfortable with Java maps. I just want to know how I can use a map in shell scripting. Below is the functionality I need to reproduce in shell:
HashMap<String, ArrayList<String>> users = new HashMap<String, ArrayList<String>>();
String username = "test_user1";
String address = "test_user1_address";
String emailId = "test_user1_emailId";
ArrayList<String> values = new ArrayList<String>();
values.add(address);
values.add(emailId);
users.put(username, values);
String anotherUser = "test_user2";
if (users.containsKey(anotherUser)) {
System.out.println("Do some stuff here");
}
In short, I want to use a map that has a String key and either a Vector or an ArrayList as the value (otherwise I can live with arrays instead of ArrayList and manage the indexes manually), a put method to insert entries, and one more method to check for the presence of a key in the map.
The above is sample code.
Thank you in advance.
bash does not support nested structures like this. Either use separate variables for each array, or use something more capable such as Python.
