Spring OAuth2: can't get additional information from ClientDetailsService - spring

I am using OAuth2 with Spring Security.
My authentication manager configures clients via the database: clients.jdbc(dataSource());
Everything works, but when requesting a token, I get an exception:
2017-04-28 11:14:39.656 WARN 1200 --- [io-8096-exec-10] o.s.s.o.p.c.JdbcClientDetailsService : Could not decode JSON for additional information: BaseClientDetails [clientId=myclientid, clientSecret=mysecret, scope=[myscope], resourceIds=[], authorizedGrantTypes=[authorization_code, refresh_token], registeredRedirectUris=null, authorities=[], accessTokenValiditySeconds=36000, refreshTokenValiditySeconds=36000, additionalInformation={}]
with trace:
org.codehaus.jackson.JsonParseException: Unexpected character ('a' (code 97)): expected a valid value (number, String, array, object, 'true', 'false' or 'null') at [Source: java.io.StringReader@557a138f; line: 1, column: 2]
...
...
The additional_information column (varchar(4096)) in the client_details table contains "asdf" for this entry.
I also tried changing the type of additional_information from String to Map<String, Object> and inserting the string into the map under the key 'info'. After this I get the same error, with a different trace:
org.codehaus.jackson.JsonParseException: Unrecognized token 'asdf': was expecting at [Source: java.io.StringReader@3e8a7a77; line: 1, column: 21]
...
...
How can I include additional_information in my token?
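For reference, JdbcClientDetailsService expects the additional_information column to contain a JSON object (e.g. {"info":"asdf"}), not a bare string, which is why Jackson rejects "asdf". Below is a minimal sketch, assuming the standard Spring Security OAuth2 classes, of registering the client through the service (so the column is written as valid JSON) and copying the map into the token with a TokenEnhancer; the class and key names are illustrative, not taken from the question:

import java.util.Map;

import org.springframework.security.oauth2.common.DefaultOAuth2AccessToken;
import org.springframework.security.oauth2.common.OAuth2AccessToken;
import org.springframework.security.oauth2.provider.ClientDetails;
import org.springframework.security.oauth2.provider.OAuth2Authentication;
import org.springframework.security.oauth2.provider.client.BaseClientDetails;
import org.springframework.security.oauth2.provider.client.JdbcClientDetailsService;
import org.springframework.security.oauth2.provider.token.TokenEnhancer;

public class AdditionalInfoExample {

    // Registering the client through the service serializes additionalInformation
    // as JSON, so the column ends up holding {"info":"asdf"} instead of a bare string.
    void registerClient(JdbcClientDetailsService clientDetailsService) {
        BaseClientDetails client = new BaseClientDetails(
                "myclientid", "", "myscope", "authorization_code,refresh_token", "");
        client.setClientSecret("mysecret");
        client.addAdditionalInformation("info", "asdf");
        clientDetailsService.addClientDetails(client);
    }

    // A TokenEnhancer can then copy the client's additional information into the token.
    static class ClientInfoTokenEnhancer implements TokenEnhancer {
        private final JdbcClientDetailsService clientDetailsService;

        ClientInfoTokenEnhancer(JdbcClientDetailsService clientDetailsService) {
            this.clientDetailsService = clientDetailsService;
        }

        @Override
        public OAuth2AccessToken enhance(OAuth2AccessToken accessToken,
                                         OAuth2Authentication authentication) {
            String clientId = authentication.getOAuth2Request().getClientId();
            ClientDetails client = clientDetailsService.loadClientByClientId(clientId);
            Map<String, Object> info = client.getAdditionalInformation();
            DefaultOAuth2AccessToken enhanced = new DefaultOAuth2AccessToken(accessToken);
            enhanced.setAdditionalInformation(info);
            return enhanced;
        }
    }
}

The enhancer would then be registered in the authorization server configuration, e.g. endpoints.tokenEnhancer(new ClientInfoTokenEnhancer(clientDetailsService)). Alternatively, if the row is edited by hand, storing valid JSON such as {"info":"asdf"} in additional_information should also make the decode warning go away.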

Related

Hi Team, I am having a "Spawn failed" issue while deploying JupyterHub in our Kubernetes namespace

I am getting the below error while deploying JupyterHub:
Spawn failed: (422) Reason: error HTTP response headers:
,"status":"Failure","message":"PersistentVolumeClaim "" is invalid: [metadata.name: Required value: name or generateName is required, metadata.labels: Invalid value: "-42hattacharjee-5f-5f-41nusuya": a valid label must be an empty string or consist of alphanumeric characters, '-', '' or '.', and must start and end with an alphanumeric character (e.g. 'MyValue', or 'my_value', or '12345', regex used for validation is '(([A-Za-z0-9][-A-Za-z0-9.]*)?[A-Za-z0-9])?')]",
I have tried removing the username from the "pvcNameTemplate" parameter but am still getting this error.

ElasticSearch hive SerializationError handler

Using Elasticsearch version 6.8.0
hive> select * from provider1;
OK
{"id","k11",}
{"id","k12",}
{"id","k13",}
{"id","k14",}
{"id":"K1","name":"Ravi","salary":500}
{"id":"K2","name":"Ravi","salary":500}
{"id":"K3","name":"Ravi","salary":500}
{"id":"K4","name":"Ravi","salary":500}
{"id":"K5","name":"Ravi","salary":500}
{"id":"K6","name":"Ravi","salary":"sdfgg"}
{"id":"K7","name":"Ravi","salary":"sdf"}
{"id":"k8"}
{"id":"K9","name":"r1","salary":522}
{"id":"k10","name":"r2","salary":53}
Time taken: 0.179 seconds, Fetched: 14 row(s)
ADD JAR /home/smrafi/elasticsearch-hadoop-6.8.0/dist/elasticsearch-hadoop-6.8.0.jar;
CREATE external TABLE hive_es_with_handler( data STRING)
STORED BY 'org.elasticsearch.hadoop.hive.EsStorageHandler'
TBLPROPERTIES(
'es.resource' = 'test_eshadoop/healthCareProvider',
'es.nodes' = 'vpc-pid-pre-prod-es-cluster-b7thvqfj3tp45arxl34gge3yyi.us-east-2.es.amazonaws.com',
'es.input.json' = 'yes',
'es.index.auto.create' = 'true',
'es.write.operation'='upsert',
'es.nodes.wan.only' = 'true',
'es.port' = '443',
'es.net.ssl'='true',
'es.batch.size.entries'='1',
'es.mapping.id' ='id',
'es.batch.write.retry.count'='-1',
'es.batch.write.retry.wait'='60s',
'es.write.rest.error.handlers' = 'es, ignoreBadRecords',
'es.write.data.error.handlers' = 'customLog',
'es.write.data.error.handler.customLog' = 'com.verisys.elshandler.CustomLogOnError',
'es.write.rest.error.handler.es.client.resource'="error_es_index/error",
'es.write.rest.error.handler.es.return.default'='HANDLED',
'es.write.rest.error.handler.log.logger.name' = 'BulkErrors',
'es.write.data.error.handler.log.logger.name' = 'SerializationErrors',
'es.write.rest.error.handler.ignoreBadRecords' = 'com.verisys.elshandler.IgnoreBadRecordHandler',
'es.write.rest.error.handler.es.return.error'='HANDLED');
insert into hive_es_with_handler10 select * from provider1;
Below is the exception trace; it failed, complaining that the error-handler index is not present:
Caused by: org.elasticsearch.hadoop.serialization.EsHadoopSerializationException: org.codehaus.jackson.JsonParseException: Unexpected character (',' (code 44)): was expecting a colon to separate field name and value at [Source: [B@1e3f0aea; line: 1, column: 7]
at org.elasticsearch.hadoop.serialization.json.JacksonJsonParser.nextToken(JacksonJsonParser.java:95)
at org.elasticsearch.hadoop.serialization.ParsingUtils.doFind(ParsingUtils.java:168)
at org.elasticsearch.hadoop.serialization.ParsingUtils.values(ParsingUtils.java:151)
at org.elasticsearch.hadoop.serialization.field.JsonFieldExtractors.process(JsonFieldExtractors.java:213)
at org.elasticsearch.hadoop.serialization.bulk.JsonTemplatedBulk.preProcess(JsonTemplatedBulk.java:64)
at org.elasticsearch.hadoop.serialization.bulk.TemplatedBulk.write(TemplatedBulk.java:54)
at org.elasticsearch.hadoop.hive.EsSerDe.serialize(EsSerDe.java:171)
at org.apache.hadoop.hive.ql.exec.FileSinkOperator.process(FileSinkOperator.java:725)
at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:897)
at org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:95)
at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:897)
at org.apache.hadoop.hive.ql.exec.TableScanOperator.process(TableScanOperator.java:130)
at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:148)
at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:550)
... 9 more
Caused by: org.codehaus.jackson.JsonParseException: Unexpected character (',' (code 44)): was expecting a colon to separate field name and value at [Source: [B@1e3f0aea; line: 1, column: 7]
at org.codehaus.jackson.JsonParser._constructError(JsonParser.java:1433)
at org.codehaus.jackson.impl.JsonParserMinimalBase._reportError(JsonParserMinimalBase.java:521)
at org.codehaus.jackson.impl.JsonParserMinimalBase._reportUnexpectedChar(JsonParserMinimalBase.java:442)
at org.codehaus.jackson.impl.Utf8StreamParser.nextToken(Utf8StreamParser.java:500)
at org.elasticsearch.hadoop.serialization.json.JacksonJsonParser.nextToken(JacksonJsonParser.java:93)
... 22 more
I tried to use a custom SerializationErrorHandler, but it is of no use: the handler never comes into play, and the job stops completely instead of continuing with the good records, even though it returns HANDLED by default.
It seems you have invalid JSON.
As mentioned in the docs, this is not handled by Hive:
Serialization Error Handlers are not yet available for Hive. Elasticsearch for Apache Hadoop uses Hive’s SerDe constructs to convert data into bulk entries before being sent to the output format. SerDe objects do not have a cleanup method that is called when the object ends its lifecycle. Because of this, we do not support serialization error handlers in Hive as they cannot be closed at the end of the job execution
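Given that limitation, one workaround is to drop malformed rows before they ever reach the es-hadoop storage handler, in whatever pipeline produces the source table. A minimal, hypothetical Java sketch (assuming the raw rows are available as plain strings; class and method names are illustrative) that keeps only lines Jackson can parse as JSON objects:

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

import com.fasterxml.jackson.databind.ObjectMapper;

public class JsonRowFilter {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Returns only the rows that parse as JSON objects; rows such as
    // {"id","k11",} are dropped instead of failing the whole bulk write.
    static List<String> keepValidJson(List<String> rows) {
        List<String> valid = new ArrayList<>();
        for (String row : rows) {
            try {
                if (MAPPER.readTree(row).isObject()) {
                    valid.add(row);
                }
            } catch (Exception e) {
                // malformed JSON: skip (or log it somewhere) rather than abort the job
            }
        }
        return valid;
    }

    public static void main(String[] args) {
        List<String> rows = Arrays.asList(
                "{\"id\",\"k11\",}",                                 // invalid row
                "{\"id\":\"K1\",\"name\":\"Ravi\",\"salary\":500}"); // valid row
        System.out.println(keepValidJson(rows));
    }
}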

SpringBoot emoji input

I have a SpringBoot application. On one of the POST endpoints, I want a field in the request body that can contain emoji.
In my request object I have the following field:
private final String name;
In the body of my request I have the following input:
"name" : "Hello, playground \Ud83d\Ude0a \Ud83e\Udd21\Ud83d\Udc68\U200d\Ud83c\Udf73 \U2663\Ufe0f"
which contains some emojis.
The exception I get is:
com.fasterxml.jackson.databind.JsonMappingException: Unrecognized character escape 'U' (code 85)
and
Caused by: com.fasterxml.jackson.core.JsonParseException: Unrecognized character escape 'U' (code 85)
I tried to set the following flag on my object mapper with no luck:
.configure(JsonParser.Feature.ALLOW_BACKSLASH_ESCAPING_ANY_CHARACTER, true)
Any ideas?
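For what it's worth, JSON only defines the lowercase \uXXXX escape; an uppercase \U is not a valid escape, which is exactly what "Unrecognized character escape 'U' (code 85)" is complaining about, and ALLOW_BACKSLASH_ESCAPING_ANY_CHARACTER would only make Jackson treat \U as a literal 'U' rather than decode the emoji. A minimal sketch of the difference, assuming plain Jackson (names are illustrative):

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;

public class EmojiEscapeDemo {
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();

        // Valid: JSON unicode escapes use a lowercase 'u'; the surrogate pair
        // \ud83d\ude0a decodes to a single smiley emoji.
        String valid = "{\"name\":\"Hello, playground \\ud83d\\ude0a\"}";
        System.out.println(mapper.readTree(valid).get("name").asText());

        // Invalid: '\U' (uppercase) is not a recognized escape and triggers
        // "Unrecognized character escape 'U' (code 85)".
        String invalid = "{\"name\":\"\\Ud83d\\Ude0a\"}";
        try {
            mapper.readTree(invalid);
        } catch (JsonProcessingException e) {
            System.out.println("Rejected: " + e.getOriginalMessage());
        }
    }
}

So the likely fix is for the client to send the escapes in lowercase (\ud83d\ude0a and so on) or to send the raw UTF-8 emoji characters directly, rather than changing mapper flags on the server side.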

Unexpected character (')' (code 41)) in JMeter request

After recording, I parameterized the request body with variables and ran it.
Body data:
{
"data: [
{
"gsVersion":"1.0",
"stepCode":"${stepCode}",
"stepName":"${stepName}",
"familyId":"${familyId}",
"listGeoEventAddOfStep": [
{
"id":0,
"geoEventCode":"${geoEventCode}",
"name":"${geoEventName}")},
"description":"${geoEventDes}"
}
]
}
]
}
However, it gave the response:
{
"timestamp":1514976739620,
"status":400,
"error":"Bad Request",
"exception":"org.springframework.http.converter.HttpMessageNotReadableException",
"message":"Could not read document: Unexpected character (')' (code 41)): was expecting comma to separate Object entries\n at [Source: java.io.PushbackInputStream#4fa24716; line: 1, column: 175]\n at [Source: java.io.PushbackInputStream#4fa24716; line: 1, column: 150] (through reference chain: com.geopost.controller.requestbody.RequestBodyList[\"data\"]->java.util.ArrayList[0]->com.geopost.dto.GeoStepAddDTO[\"listGeoEventAddOfStep\"]->java.util.ArrayList[0]); nested exception is com.fasterxml.jackson.databind.JsonMappingException: Unexpected character (')' (code 41)): was expecting comma to separate Object entries\n at [Source: java.io.PushbackInputStream#4fa24716; line: 1, column: 175]\n at [Source: java.io.PushbackInputStream#4fa24716; line: 1, column: 150] (through reference chain: com.geopost.controller.requestbody.RequestBodyList[\"data\"]->java.util.ArrayList[0]->com.geopost.dto.GeoStepAddDTO[\"listGeoEventAddOfStep\"]->java.util.ArrayList[0])",
"path":"/rest/steps"
}
Extracting the exception message for more clarity:
Could not read document: Unexpected character (')' (code 41)):
was expecting comma to separate Object entries
at [Source: java.io.PushbackInputStream@4fa24716; line: 1, column: 175]
at [Source: java.io.PushbackInputStream@4fa24716; line: 1, column: 150]
(through reference chain: com.geopost.controller.requestbody.RequestBodyList["data"]
->java.util.ArrayList[0]->com.geopost.dto.GeoStepAddDTO["listGeoEventAddOfStep"]
->java.util.ArrayList[0]);
nested exception is com.fasterxml.jackson.databind.JsonMappingException:
Unexpected character (')' (code 41)):
was expecting comma to separate Object entries
at [Source: java.io.PushbackInputStream@4fa24716; line: 1, column: 175]
at [Source: java.io.PushbackInputStream@4fa24716; line: 1, column: 150]
(through reference chain: com.geopost.controller.requestbody.RequestBodyList["data"]
->java.util.ArrayList[0]
->com.geopost.dto.GeoStepAddDTO["listGeoEventAddOfStep"]
->java.util.ArrayList[0])
How do I fix this?
Your JSON is invalid; for example, it contains an irrelevant ) sign. A valid JSON body would be:
{"data":[{"gsVersion":"1.0","stepCode":"${stepCode}","stepName":"${stepName}","familyId":"${familyId}","listGeoEventAddOfStep":[{"id":0,"geoEventCode":"${geoEventCode}","name":"${geoEventName}"}],"description":"${geoEventDes}"}]}
You can check your JSON with an online validator.

Calabash-android: getting an error when trying to press on an object by coordinates using the 'perform_action' command

I’m trying to use the rect output from query with the perform_action command.
For example:
query("* text:’Hello’", :y)
[
[0] 226.0
]
Trying:
perform_action('long_press_coordinate',200,y)
And getting the error:
RuntimeError: Action 'long_press_coordinate' unsuccessful: Can not deserialize instance of java.lang.String[] out of END_OBJECT token
at [Source: java.io.StringReader@412a8480; line: 1, column: 61] (through reference chain: sh.calaba.instrumentationbackend.Command["arguments"])
Is it a syntax issue that I’m dealing with, or something more?
How do I 'turn' the y value into a regular number?
I found code that works:
y = query("* text:'Hello'", :y)
perform_action('long_press_coordinate', 200, y[0])
query returns an array, so y[0] is the numeric y coordinate. Hope it helps.
