How to parse JSON with a dynamic property name in OIC? - oracle

I need to consume and parse incoming JSON from a third-party system in my code. I used RestTemplate to do it. The response from the system looks like this:
{
  "data": {
    "05AAAFW9419M11Q": {
      "gstin": "05AAAFW9419M11Q",
      "error_cd": "SWEB_9035",
      "message": "Invalid GSTIN / UID"
    }
  }
}
The problem is that the property name ("05AAAFW9419M11Q" in this case) is dynamic, and in the next response it will be another string. How can I parse this JSON in Oracle Integration Cloud when the name is not fixed? The response wrapper does not capture any data apart from the payload used when configuring the adapter, which is fair enough, since the field name itself changes.
Is there any workaround for this?

You will have to go to PL/SQL and dynamic SQL. If it is always the value of the gstin entry, you can get the path of the key with
select '$.data.' || json_query(js_column, '$.data.*.gstin')
  into v_key_path
  from table_with_json_column
 where ... conditions ... ;
(assuming there is only one "data" entry per JSON payload) and later build a dynamic query based on json_table.

Related

How to limit the number of aliases in the gqlgen library (Go)

I'm using the gqlgen package to create a GraphQL server. However, I can't limit the number of aliases. FixedComplexityLimit limits the complexity of the query. This is possible in the JS community thanks to the graphql-no-alias npm package; I need that kind of thing.
I want to limit the number of aliases to prevent batching attacks. Let me explain with an example.
query {
  productsByIds(productIds: "353573855") {
    active {
      id
      path
      title
    }
  }
  productsByIds2: productsByIds(productIds: "353573855") {
    active {
      id
      path
      title
    }
  }
}
The above query should give an error, while the one below should work. This is just an example; I have more complex schemas, which is why the complexity limit didn't work for me.
query {
  productsByIds(productIds: "353573855") {
    active {
      id
      path
      title
    }
  }
  products {
    active {
      id
      path
      title
    }
  }
}
I'm afraid you have to come up with something of your own for that. If you think the request itself or the response could become too large, you can limit them in your router config. For example, with fiber you could do:
routerConfig := fiber.Config{ReadBufferSize: maxRequestSize, WriteBufferSize: maxResponseSize}
router := fiber.New(routerConfig)
router.Post("/graphql", adaptor.HTTPHandler(gqlHandler))
If it's really just the aliases you want to prevent, you need to parse the request. You can either do so in some custom middleware before the request gets passed to the gqlHandler (advantage: you can stop processing the request entirely as soon as an alias shows up; disadvantage: you're basically duplicating code from a library, and the request gets parsed again later if you keep the standard gqlHandler). Or, and that's what I propose, you check the already-parsed request:
import gqlLib "github.com/99designs/gqlgen/graphql"
...
oCtx := gqlLib.GetOperationContext(ctx)
fragmentToSelections := getFragmentsSelectionsByName(oCtx)
selectionSet := oCtx.Operation.SelectionSet
An alias can be detected by a selection having an Alias that differs from its Name. The selection set above is just the query root in this example: selectionSet[0] is an unaliased request, selectionSet[1] is an aliased one.
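To make the check concrete, here is a self-contained sketch. It uses a simplified stand-in for gqlparser's ast.Field (in gqlgen the real selections come from oCtx.Operation.SelectionSet); the Field type and countAliases helper are illustrative, not gqlgen API:

```go
package main

import "fmt"

// Field is a simplified stand-in for gqlparser's ast.Field:
// only the Alias/Name pair matters for alias detection.
type Field struct {
	Alias      string
	Name       string
	Selections []Field
}

// countAliases walks the selection tree and counts fields whose
// Alias differs from their Name, i.e. explicitly aliased fields.
func countAliases(fields []Field) int {
	n := 0
	for _, f := range fields {
		if f.Alias != "" && f.Alias != f.Name {
			n++
		}
		n += countAliases(f.Selections)
	}
	return n
}

func main() {
	// Mirrors the example query: one plain field, one aliased copy.
	query := []Field{
		{Alias: "productsByIds", Name: "productsByIds"},  // unaliased
		{Alias: "productsByIds2", Name: "productsByIds"}, // aliased
	}
	fmt.Println(countAliases(query)) // prints 1
}
```

With a count like this you can reject the operation (or compare against a configured limit) before executing any resolvers.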

Implementing filters in ORM golang

I am working on an API which takes in parameters for filters, like this:
/api/endpoint?filter_key_1=filter_value_1&...
I've previously worked with Spring, where the Criteria API allows dynamically building SQL queries without much hassle. In Go I'm using gorm to handle the ORM operations. Is there any way to build the queries with optional parameters without writing redundant code?
For example:
If the request sent is:
/api/endpoint?fk_1=fv_1&fk_2=fv_2&fk_3=fv_3
Query generated should be :
select * from table where fk_1 = fv_1 AND fk_2 = fv_2 AND fk_3 = fv_3
but in case of :
/api/endpoint?fk_1=fv_1
Query generated should be:
select * from table where fk_1 = fv_1
Currently my approach is to check whether each parameter is present and build the query as a string:
query := "select * from table where "
if fk_1 != "" {
    query += "fk_1 = fv_1"
}
// ... and so on
but this seems very awkward and error-prone.
Any help will be appreciated! Thanks
EDIT
Building on @bjornaer's answer, what helped me was to get the map[string][]string into a form I can pass to gorm, map[string]interface{}. This thread helps with the same.
Now there's no longer a need for redundant checks or string operations in the filters.
So it seems to me your question has two parts:
1. you need to retrieve the query values from your URL, and
2. insert them into your db query.
I don't see how you are handling your requests, so let's assume you use the http package: from req.URL you get the URL object, and calling its Query() method yields a map[string][]string of your query parameters. With those in a variable URLQuery, let's pause and look at how you query with gorm:
db, err := gorm.Open(sqlite.Open("gorm.db"), &gorm.Config{
QueryFields: true,
})
Here I open a SQLite database; then you can pass a map of conditions to filter your query, for example:
result := db.Where(map[string]interface{}{"name": "jinzhu", "age": 20}).Find(&users)
Now, from the example above, replace the literal map with your variable. Note that map[string]interface{}(URLQuery) will not compile, because URLQuery is a map[string][]string with different value types; you first have to copy it into a map[string]interface{}:
result := db.Where(urlQueryAsMap).Find(&users)
You can find the details in the gorm docs.

Parse a JSON object using ASPjson

In Classic ASP (VBScript) I can do a general request for POST using Request.Form or for GET using Request.QueryString, which gives me the entire string that was sent.
But I now need to receive a JSON object from the client side.
This is an example of what it might look like:
{
"firstName": "John",
"lastName" : "Smith",
"age" : 25
}
How do I receive this entire object (which I will then parse with ASP.JSON)?
PS: I know that I can probably convert the JSON object to a string on the client side and then parse it as text on the server side, but that feels like a workaround rather than a straight solution.
First of all, I wouldn't use that AspJson, but this one: https://github.com/rcdmk/aspJSON
Second, you're not receiving an object per se, but a request that contains a string version of the JSON object — most likely raw bytes, which is why you have to BinaryRead it and convert it into a body string first.
You can then parse the body with any parser you want.
Now let's look at some example code:
<%Response.LCID = 1033%>
<!--#include file="__jsonObject.class.v3.8.1.asp" -->
Set UTF8Enc = CreateObject("System.Text.UTF8Encoding") ' .NET COMPONENT, required on the server app pool
Set JSON = new JSONobject
lngBytesCount = Request.TotalBytes
request_body = UTF8Enc.GetString(Request.BinaryRead(lngBytesCount))
Set request_json = JSON.parse(request_body)
first_name = request_json("firstName")
last_name = request_json("lastName")
age = request_json("age")

Oracle APEX with JMeter

I'm starting a load test with JMeter. I have a form where I can insert a record; in the URL of the insert action there is a parameter called p_json. I'm doing a very basic test with a table containing one column. The p_json looks like this:
{
"pageItems":{
"itemsToSubmit":[
{
"n":"P13_ROWID",
"v":"",
"ck":"9P9SjzLAQLGkBy_q7phVqLeAJFI"
},
{
"n":"P13_TEST",
"v":"fgjgghjhgjghjghjghjghjghjghj"
}
],
"protected":"QnL4629OYon2MxvQzUtEag",
"rowVersion":"",
"formRegionChecksums":{
}
},
"salt":"188736967333118203740478635219988206060"
}
How can I generate the ck value and the protected value in the JSON in my JMeter request?
Is there any documentation on how to do load testing with APEX? Is there an Oracle tool for this?
According to this answer:
When Value Protected of a hidden item is set to YES, a checksum is generated when your page is loaded. When you submit the page with a different value, the checksum is no longer valid and you get the error.
So my expectation is that it is a matter of simple correlation: you need to extract the value from the previous response using a suitable JMeter Post-Processor, store it in a JMeter variable, and replace this QnL4629OYon2MxvQzUtEag value with the variable.

Multi get returns source as null after bulk update

I am using Elasticsearch multi-get to read documents after a bulk update. It returns some document sources as null.
MultiGetRequestBuilder builder = client.prepareMultiGet();
builder.setRefresh(true);
builder.add(indexName, type, idsList);
MultiGetResponse multiResponse = builder.execute().actionGet();
for (MultiGetItemResponse response : multiResponse.getResponses())
{
String customerJson = response.getResponse().getSourceAsString();
System.out.println("customerJson::" + customerJson);
}
Any issues in my code? Thanks in advance.
When you say "some return sources as null", I assume the get response is marking them as not existing?
If that's the case, then maybe:
- some index requests in the bulk are failing due to a mapping or random error, or
- you need to refresh your index between indexing and the multi-get (i.e. your docs are not available for search yet):
transportClient.admin().indices().prepareRefresh(index).execute();
Good luck.
EDIT: You answered your own question in the comment, but for readability's sake: when using get or multi-get, if a routing key was used when indexing, it must be specified again during the get; otherwise the wrong shard is determined using the default routing and the get fails.
