I have a use case where I take the query results from one DynamoDB table and store them, serialized, in another DynamoDB table.
Now when I use Gson to deserialize the data being retrieved,
I get this error:
java.lang.RuntimeException: Unable to invoke no-args constructor for class java.nio.ByteBuffer. Register an InstanceCreator with Gson for this type may fix this problem.
at com.google.gson.internal.ConstructorConstructor$12.construct(ConstructorConstructor.java:210)
at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter.read(ReflectiveTypeAdapterFactory.java:186)
at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$1.read(ReflectiveTypeAdapterFactory.java:103)
at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter.read(ReflectiveTypeAdapterFactory.java:196)
at com.google.gson.internal.bind.TypeAdapterRuntimeTypeWrapper.read(TypeAdapterRuntimeTypeWrapper.java:40)
at com.google.gson.internal.bind.MapTypeAdapterFactory$Adapter.read(MapTypeAdapterFactory.java:187)
at com.google.gson.internal.bind.MapTypeAdapterFactory$Adapter.read(MapTypeAdapterFactory.java:145)
at com.google.gson.Gson.fromJson(Gson.java:810)
at com.google.gson.Gson.fromJson(Gson.java:775)
My method looks like this:
public void store(MyCustomObject obj) {
    String primaryKey = obj.getKey();
    List<Map<String, AttributeValue>> results = AmazonDynamoDB.query(...).getItems();
    Gson gson = new Gson();
    List<String> records = results.stream()
            .map(mappedResult -> gson.toJson(mappedResult))
            .collect(Collectors.toList());
    Map<String, AttributeValue> attributeMap = transformToAttributeMap(records);
    PutItemRequest putItemRequest = new PutItemRequest().withItem(attributeMap);
    AmazonDynamoDB.putItem(...);
}
The method to retrieve the records looks something like this:
public void retrieve(String id) {
    QueryRequest...
    Map<String, AttributeValue> records = DynamoDB.query(...).getItems();
    List<String> serializedRecords = new ArrayList<>();
    List<AttributeValue> values = records.get("key");
    for (AttributeValue attributeValue : values) {
        serializedRecords.add(attributeValue.getS());
    }
    Gson gson = new Gson();
    Type recordType = new TypeToken<Map<String, AttributeValue>>() { }.getType();
    List<Map<String, AttributeValue>> actualRecords = serializedRecords.stream()
            .map(record -> gson.fromJson(record, recordType))
            .collect(Collectors.toList());
}
What am I doing wrong?
The problem is that the AttributeValue class has a java.nio.ByteBuffer field named b. Gson tries to deserialize data into it, but ByteBuffer has no default constructor, so Gson cannot deserialize the b field.
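If you want to stick with Gson, one workaround (the one the error message itself suggests) is to register an InstanceCreator for ByteBuffer; a minimal sketch:
// Minimal sketch: tell Gson how to instantiate ByteBuffer so reflective
// deserialization of AttributeValue's "b" field no longer fails on construction.
// Note: this only fixes instantiation; binary attributes still may not
// round-trip meaningfully through JSON this way.
Gson gson = new GsonBuilder()
        .registerTypeAdapter(ByteBuffer.class,
                (InstanceCreator<ByteBuffer>) type -> ByteBuffer.allocate(0))
        .create();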
An alternative solution is to use the DynamoDB document API from the AWS SDK. The following example should work:
AmazonDynamoDBClient client = new AmazonDynamoDBClient(
new ProfileCredentialsProvider());
Item item = new DynamoDB(client).getTable("user").getItem("Id", "user1");
String json = item.toJSON();
Item deserialized = Item.fromJSON(json);
You should modify the credentials provider according to your setup.
Not exactly the best workaround/answer, but I was able to do this:
Item item = new Item().withJSON("document", jsonStr);
Map<String,AttributeValue> attributes = InternalUtils.toAttributeValues(item);
return attributes.get("document").getM();
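For the reverse direction, the same internal helper class can turn the stored attribute map back into an Item so the JSON can be read out again. A rough sketch, assuming the AWS SDK v1 document API and that the value was stored under "document" as above:
// Rough sketch of the reverse direction: attribute map -> Item -> JSON string.
// InternalUtils is an SDK-internal class, so this may break between SDK versions.
private static String readBackAsJson(Map<String, AttributeValue> storedAttributes) {
    Item item = InternalUtils.toItem(storedAttributes);
    return item.getJSON("document");
}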
I am using Spring and stored procedures to retrieve data from a mySQL database. I have the stored procedure and parameters working OK but I'm having problems mapping the result set. At the moment I have some truly ugly code to get the values and I'm sure there has to be a better, cleaner and more elegant way. Can anyone guide me to a better solution?
After the stored procedure class, I have:
List<String> outList = new ArrayList<String>();
Map<String, Object> outMap = execute(parameters_map);
List list = (List) outMap.get("#result-set-1");
for (Object object : list) {
    Map map2 = (Map) object;
    outList.add((String) map2.get("runname"));
}
return outList;
runname is the column from the database query.
Is there a better way to achieve this?
An example from the Spring docs using a RowMapper:
public class JdbcActorDao implements ActorDao {

    private SimpleJdbcCall procReadAllActors;

    public void setDataSource(DataSource dataSource) {
        JdbcTemplate jdbcTemplate = new JdbcTemplate(dataSource);
        jdbcTemplate.setResultsMapCaseInsensitive(true);
        this.procReadAllActors = new SimpleJdbcCall(jdbcTemplate)
                .withProcedureName("read_all_actors")
                .returningResultSet("actors",
                        BeanPropertyRowMapper.newInstance(Actor.class));
    }

    public List getActorsList() {
        Map m = procReadAllActors.execute(new HashMap<String, Object>(0));
        return (List) m.get("actors");
    }

    // ... additional methods
}
It took a while to interpret the Spring docs, but I finally got there.
My solution:
SimpleJdbcCall simpleJdbcCall = new SimpleJdbcCall(jdbcTemplate)
        .withProcedureName("DistinctRunNames")
        .withoutProcedureColumnMetaDataAccess();
simpleJdbcCall.addDeclaredParameter(new SqlParameter("environment", Types.VARCHAR));
simpleJdbcCall.addDeclaredParameter(new SqlParameter("username", Types.VARCHAR));
simpleJdbcCall.addDeclaredParameter(new SqlParameter("test_suite", Types.VARCHAR));
SqlParameterSource parameters = new MapSqlParameterSource()
        .addValue("environment", environment)
        .addValue("username", username)
        .addValue("test_suite", testSuite);
Map map = simpleJdbcCall.returningResultSet("runnames", new ParameterizedRowMapper<RunNameBean>() {
    public RunNameBean mapRow(ResultSet rs, int rowNum) throws SQLException {
        RunNameBean runNameBean = new RunNameBean();
        runNameBean.setName(rs.getString("runname"));
        return runNameBean;
    }
}).execute(parameters);
return (List) map.get("runnames");
I had problems with expected versus actual parameters and had to break up the simpleJdbcCall calls, but it maps the results into a list beautifully.
Thank you for the answers; they helped me learn about Spring mapping.
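For reference, the row mapper above assumes a simple RunNameBean along these lines (a minimal sketch; the actual bean isn't shown in the post):
// Minimal sketch of the bean assumed by the row mapper above
public class RunNameBean {

    private String name;

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }
}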
My client retrieves JSON content as below:
{
"table": "tablename",
"update": 1495104575669,
"rows": [
{"column5": 11, "column6": "yyy"},
{"column3": 22, "column4": "zzz"}
]
}
In the rows array the keys are not fixed. I want to retrieve each key and value and save them into a Map using Gson 2.8.x.
How can I configure Gson to deserialize this?
Here is my idea:
public class Dataset {
    private String table;
    private long update;
    private List<Rows> lists; // <-- a little confused here,
    // or: private List<HashMap<String, Object>> lists;

    // setters/getters
}

public class Rows {
    private HashMap<String, Object> map;
    // ...
}

Dataset k = gson.fromJson(jsonStr, Dataset.class);
log.info(k.getRows().size()); // <-- I get two null objects
Thanks.
Gson does not support such a thing out of the box. It would be easier if you could make the property name fixed. If not, there are a few options that would probably help you.
If the property name is fixed (rows), just rename the Dataset.lists field to Dataset.rows.
If the possible name set is known in advance, tell Gson to pick up alternative names using the @SerializedName annotation.
If the possible name set is really unknown and may change in the future, you might want to try to make it fully dynamic using a custom TypeAdapter (streaming mode; requires less memory, but harder to use) or a custom JsonDeserializer (object mode; requires more memory to store intermediate tree views, but it's easy to use) registered with GsonBuilder.
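For option #1, the Dataset class would simply declare the field under the name the JSON uses (a minimal sketch):
import java.util.List;
import java.util.Map;

// Option #1 sketch: the field name matches the JSON property "rows" directly
public class Dataset {

    private String table;
    private long update;
    private List<Map<String, Object>> rows; // formerly "lists"

    // getters/setters omitted for brevity
}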
For option #2, you can simply add the alternative names:
@SerializedName(value = "lists", alternate = "rows")
final List<Map<String, Object>> lists;
For option #3, bind a downstream List<Map<String, Object>> type adapter and try to detect the name dynamically. Note that I omit a deserialization strategy for the Rows class for simplicity; I believe you might want to remove the Rows class in favor of a plain Map<String, Object> anyway. Another note: declare the field as Map rather than a concrete implementation. HashMap is unordered, whereas telling Gson you only need a Map lets it pick an ordered map such as LinkedTreeMap (a Gson internal) or LinkedHashMap, which might be important for datasets.
// Type tokens are immutable and can be declared constants
private static final TypeToken<String> stringTypeToken = new TypeToken<String>() {
};
private static final TypeToken<Long> longTypeToken = new TypeToken<Long>() {
};
private static final TypeToken<List<Map<String, Object>>> stringToObjectMapListTypeToken = new TypeToken<List<Map<String, Object>>>() {
};
private static final Gson gson = new GsonBuilder()
.registerTypeAdapterFactory(new TypeAdapterFactory() {
@Override
public <T> TypeAdapter<T> create(final Gson gson, final TypeToken<T> typeToken) {
if ( typeToken.getRawType() != Dataset.class ) {
return null;
}
// If the actual type token represents the Dataset class, then pick the bunch of downstream type adapters
final TypeAdapter<String> stringTypeAdapter = gson.getDelegateAdapter(this, stringTypeToken);
final TypeAdapter<Long> primitiveLongTypeAdapter = gson.getDelegateAdapter(this, longTypeToken);
final TypeAdapter<List<Map<String, Object>>> stringToObjectMapListTypeAdapter = gson.getDelegateAdapter(this, stringToObjectMapListTypeToken);
// And compose the bunch into a single dataset type adapter
final TypeAdapter<Dataset> datasetTypeAdapter = new TypeAdapter<Dataset>() {
@Override
public void write(final JsonWriter out, final Dataset dataset) {
// Omitted for brevity
throw new UnsupportedOperationException();
}
@Override
public Dataset read(final JsonReader in)
throws IOException {
in.beginObject();
String table = null;
long update = 0;
List<Map<String, Object>> lists = null;
while ( in.hasNext() ) {
final String name = in.nextName();
switch ( name ) {
case "table":
table = stringTypeAdapter.read(in);
break;
case "update":
update = primitiveLongTypeAdapter.read(in);
break;
default:
lists = stringToObjectMapListTypeAdapter.read(in);
break;
}
}
in.endObject();
return new Dataset(table, update, lists);
}
}.nullSafe(); // Making the type adapter null-safe
@SuppressWarnings("unchecked")
final TypeAdapter<T> typeAdapter = (TypeAdapter<T>) datasetTypeAdapter;
return typeAdapter;
}
})
.create();
final Dataset dataset = gson.fromJson(jsonReader, Dataset.class);
System.out.println(dataset.lists);
The code above would then print:
[{column5=11.0, column6=yyy}, {column3=22.0, column4=zzz}]
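Note that the read method above assumes a Dataset with a three-argument constructor and a directly accessible lists field, roughly like this sketch:
import java.util.List;
import java.util.Map;

// Sketch of the Dataset shape the type adapter above is written against
final class Dataset {

    final String table;
    final long update;
    final List<Map<String, Object>> lists;

    Dataset(final String table, final long update, final List<Map<String, Object>> lists) {
        this.table = table;
        this.update = update;
        this.lists = lists;
    }
}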
I am trying to configure Gson as my JSON mapper to accept "snake_case" query parameters and translate them into standard Java "camelCase" fields.
First of all, I know I could use the @SerializedName annotation to customise the serialized name of each field, but this would involve some manual work.
After doing some search, I believe the following approach should work (please correct me if I am wrong).
Use Gson as the default JSON mapper of Spring Boot
spring.http.converters.preferred-json-mapper=gson
Configuring Gson before GsonHttpMessageConverter is created as described here
Customising the Gson naming policy in step 2 according to GSON Field Naming Policy
private GsonHttpMessageConverter createGsonHttpMessageConverter() {
Gson gson = new GsonBuilder()
.setFieldNamingPolicy(FieldNamingPolicy.LOWER_CASE_WITH_UNDERSCORES)
.create();
GsonHttpMessageConverter gsonConverter = new GsonHttpMessageConverter();
gsonConverter.setGson(gson);
return gsonConverter;
}
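The converter is then registered as a bean so that Spring Boot picks it up (one possible wiring; the exact setup from the linked article may differ):
// One possible wiring (an assumption -- the linked article's setup may differ):
// exposing the converter as a bean lets Spring Boot use it instead of the default one.
@Bean
public GsonHttpMessageConverter gsonHttpMessageConverter() {
    return createGsonHttpMessageConverter();
}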
Then I create a simple controller like this:
@RequestMapping(value = "/example/gson-naming-policy")
public Object testNamingPolicy(ExampleParam data) {
return data.getCamelCase();
}
With the following Param class:
import lombok.Data;
@Data
public class ExampleParam {
private String camelCase;
}
But when I call the controller with the query parameter ?camel_case=hello, data.camelCase is not populated (it is null). When I change the query parameter to ?camelCase=hello it is set, which means my setting is not working as expected.
Any hint would be highly appreciated. Thanks in advance!
It's a nice question. If I understand how Spring MVC works behind the scenes, no HTTP message converters are used for @ModelAttribute-driven parameters. This is easy to inspect by throwing an exception from your ExampleParam constructor or the ExampleParam.setCamelCase method (de-Lombok first): Spring uses its bean utilities, which call the public (!) ExampleParam.setCamelCase, to set the DTO value. Another proof is that Gson.fromJson is never invoked, regardless of how your Gson converter is configured. The camelCase parameter only appears to work because the default Gson naming strategy happens to match what Spring does, so that part is just a coincidence.
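For example, a quick diagnostic version of the DTO (de-Lombok'd, with a throwing setter) makes this visible:
// Diagnostic sketch (de-Lombok'd): the exception below fires during Spring's data
// binding, showing that the setter is called by Spring's bean utilities, not by Gson.
public class ExampleParam {

    private String camelCase;

    public String getCamelCase() {
        return camelCase;
    }

    public void setCamelCase(final String camelCase) {
        throw new UnsupportedOperationException("populated by Spring data binding, not by Gson");
    }
}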
In order to make it work, you have to create a custom Gson-aware HandlerMethodArgumentResolver implementation. Let's assume we support POJOs only (not lists, maps, or primitives).
@Configuration
@EnableWebMvc
class WebMvcConfiguration
extends WebMvcConfigurerAdapter {
private static final Gson gson = new GsonBuilder()
.setFieldNamingPolicy(LOWER_CASE_WITH_UNDERSCORES)
.create();
@Override
public void addArgumentResolvers(final List<HandlerMethodArgumentResolver> argumentResolvers) {
argumentResolvers.add(new HandlerMethodArgumentResolver() {
@Override
public boolean supportsParameter(final MethodParameter parameter) {
// It must be never a primitive, array, string, boxed number, map or list -- and whatever you configure ;)
final Class<?> parameterType = parameter.getParameterType();
return !parameterType.isPrimitive()
&& !parameterType.isArray()
&& parameterType != String.class
&& !Number.class.isAssignableFrom(parameterType)
&& !Map.class.isAssignableFrom(parameterType)
&& !List.class.isAssignableFrom(parameterType);
}
@Override
public Object resolveArgument(final MethodParameter parameter, final ModelAndViewContainer mavContainer, final NativeWebRequest webRequest,
final WebDataBinderFactory binderFactory) {
// Now we're deconstructing the request parameters creating a JSON tree, because Gson can convert from JSON trees to POJOs transparently
// Also note parameter.getGenericParameterType() -- it's better than Class<?>, which cannot hold generic type parameterization
return gson.fromJson(
parameterMapToJsonElement(webRequest.getParameterMap()),
parameter.getGenericParameterType()
);
}
});
}
...
private static JsonElement parameterMapToJsonElement(final Map<String, String[]> parameters) {
final JsonObject jsonObject = new JsonObject();
for ( final Entry<String, String[]> e : parameters.entrySet() ) {
final String key = e.getKey();
final String[] value = e.getValue();
final JsonElement jsonValue;
switch ( value.length ) {
case 0:
// As far as I understand, this must never happen, but I'm not sure
jsonValue = JsonNull.INSTANCE;
break;
case 1:
// If there's a single value only, let's convert it to a string literal
// Gson is good at "weak typing": strings can be parsed automatically to numbers and booleans
jsonValue = new JsonPrimitive(value[0]);
break;
default:
// If there are more than 1 element -- make it an array
final JsonArray jsonArray = new JsonArray();
for ( int i = 0; i < value.length; i++ ) {
jsonArray.add(value[i]);
}
jsonValue = jsonArray;
break;
}
jsonObject.add(key, jsonValue);
}
return jsonObject;
}
}
So, here are the results:
http://localhost:8080/?camelCase=hello => (empty)
http://localhost:8080/?camel_case=hello => "hello"
I'm new to Gson and I wonder how to convert JSON data to a LinkedHashMap<String, List<String>>.
My JSON data is shown below:
{ "data":
{
"data1": ["asdf", "qwer"],
"data2": ["xczv", "aweqrfds123", "sfdgq234"],
"data3": ["dsafasd", "xcvr123", "sdfa324123"]
}
}
The field names under data are dynamic, so I want to convert the JSON under data to a LinkedHashMap<String, List<String>>.
How can I do that?
You can use a TypeToken to convert it into the expected type with Gson#fromJson(Reader, Type).
As per the JSON string, the full structure is a LinkedHashMap<String, LinkedHashMap<String, ArrayList<String>>>.
Sample code:
BufferedReader reader = new BufferedReader(new FileReader(new File("json.txt")));
Type type = new TypeToken<LinkedHashMap<String,LinkedHashMap<String,ArrayList<String>>>>() {}.getType();
LinkedHashMap<String,LinkedHashMap<String,ArrayList<String>>> data = new Gson().fromJson(reader, type);
LinkedHashMap<String,ArrayList<String>> innerMap = data.get("data");
System.out.println(new GsonBuilder().setPrettyPrinting().create().toJson(innerMap));
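The inner map is the LinkedHashMap<String, List<String>> you're after, so reading one of the dynamic keys is just a lookup:
// Each dynamic key under "data" maps to its list of strings
List<String> data1 = innerMap.get("data1");
System.out.println(data1); // [asdf, qwer]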
This is not how it works in the Gson world: you can't convert JSON to any Java class you want unless you do all of that manually. The common approach is described below:
Create a Java class, which matches your JSON format, e.g. you can use a Java class generator described here: http://jsongen.byingtondesign.com/
Use GsonBuilder to read your JSON from a file and map it onto the generated class
I've used that approach and the Java file that has been generated (after I've fixed a minor syntax error in your initial JSON) looks like this:
package com.json;
import java.util.List;
public class Data {

    private List data1;
    private List data2;
    private List data3;

    public List getData1() {
        return this.data1;
    }

    public void setData1(List data1) {
        this.data1 = data1;
    }

    public List getData2() {
        return this.data2;
    }

    public void setData2(List data2) {
        this.data2 = data2;
    }

    public List getData3() {
        return this.data3;
    }

    public void setData3(List data3) {
        this.data3 = data3;
    }
}
To start working with the newly created class you can use the template below:
Reader is = new InputStreamReader(new FileInputStream(new File("<path-to-json>")), "UTF-8");
Gson gson = new GsonBuilder().create();
Data d = gson.fromJson(is, Data.class);
// Start using your d instance here
I am using JdbcTemplate to get data from a database in Spring MVC.
My query is:
SELECT count(A.MEETING_ID),ITEM_TBL.REG_EMAIL FROM ITEM_TBL,MEETINGS_TBL WHERE ITEM_TBL.MEETING_ID=MEETINGS_TBL.MEETING_ID
GROUP BY ITEM_TBL.REG_EMAIL
This returns rows like:
11 nishant#gmail.com
12 abhilasha#yahoo.com
13 shiwani#in.com
I want to store these values into a HashMap. Can you please help me with how to do this using JdbcTemplate?
Thanks
You need a ResultSetExtractor.
You can achieve that using the code below.
String sql = "SELECT count(A.MEETING_ID),ITEM_TBL.REG_EMAIL FROM ITEM_TBL,MEETINGS_TBL WHERE ITEM_TBL.MEETING_ID=MEETINGS_TBL.MEETING_ID
GROUP BY ITEM_TBL.REG_EMAIL";
ResultExtractor mapExtractor = new ResultSetExtractor() {
public Object extractData(ResultSet rs) throws SQLException {
Map<String, String> mapOfKeys = new HashMap<String, String>();
while (rs.next()) {
String key = rs.getString("MEETING_ID");
String obj = rs.getString("REG_EMAIL");
/* set the business object from the resultset */
mapOfKeys.put(key, obj);
}
return mapOfKeys;
}
};
Map map = (HashMap) jdbcTemplate.query(sql.toString(), mapExtractor);
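If you prefer something more compact, the same extraction can be written with a lambda, keyed by email instead (emails are unique per GROUP BY group, whereas counts can collide); a sketch:
// Sketch: lambda-based ResultSetExtractor keyed by email; the cast makes the
// compiler pick the ResultSetExtractor overload of JdbcTemplate.query().
Map<String, String> countsByEmail = jdbcTemplate.query(sql,
        (ResultSetExtractor<Map<String, String>>) rs -> {
            Map<String, String> result = new HashMap<>();
            while (rs.next()) {
                result.put(rs.getString("REG_EMAIL"), rs.getString("MEETING_COUNT"));
            }
            return result;
        });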