How can I execute this API using Zoho Deluge?

Could you please help me with this? I have tried my best to get this function up and running.
Here are the articles I viewed:
https://www.zoho.com/creator/help/script/invoking-a-function.html
This is the curl request I am trying to execute:
curl --location --request POST 'https://gw.cmtelecom.com/v1.0/message' \
--header 'Content-Type: application/json' \
--data-raw '{
"messages": {
"authentication": {
"producttoken": "test"
},
"msg": [
{
"from": "00919538893819",
"to": [
{
"number": "00918892449978"
}
],
"body": {
"type": "auto",
"content": "This is a WhatsApp message"
},
"allowedChannels": [
"WhatsApp"
],
"richContent": {
"conversation": [
{
"template": {
"whatsapp": {
"namespace": "5c198301_106c_4fc2_a2f6_7556d8847746",
"element_name": "otp",
"language": {
"policy": "deterministic",
"code": "en"
},
"components": [
{
"type": "body",
"parameters": [
{
"type": "text",
"text": "Dhanush"
},
{
"type": "text",
"text": "627728289"
}
]
}
]
}
}
}
]
}
}
]
}
}'
Is there a different way we could execute this?

You should define a variable with the JSON data first, and then pass the data as a string to invokeUrl.
See the following example:
void APICall()
{
data = {"messages":{"authentication":{"producttoken":"******"},"msg":{{"from":"00919538893819","to":{{"number":"00918892449978"}},"body":{"type":"auto","content":"This is a WhatsApp message"},"allowedChannels":{"WhatsApp"},"richContent":{"conversation":{{"template":{"whatsapp":{"namespace":"5c198301_106c_4fc2_a2f6_7556d8847746","element_name":"otp","language":{"policy":"deterministic","code":"en"},"components":{{"type":"body","parameters":{{"type":"text","text":"Dhanush"},{"type":"text","text":"627728289"}}}}}}}}}}}}};
response = invokeUrl
[
url: "https://gw.cmtelecom.com/v1.0/message"
type: POST
parameters: data.toString()
headers: {"Content-Type": "application/json"}
];
info response;
}
Please do not post API authentication tokens in your questions!

Related

Elasticsearch completion suggester issue

Issue: the completion suggester with a custom keyword lowercase analyzer is not working as expected. We can reproduce the issue with the following steps.
I am not able to understand what is causing the issue here. However, if we search for "PRAXIS CONSULTING AND INFORMATION SERVICES PRIVATE", it does give a result.
Create index
curl -X PUT "localhost:9200/com.tmp.index?pretty" -H 'Content-Type: application/json' -d'{
"mappings": {
"dynamic": "false",
"properties": {
"namesuggest": {
"type": "completion",
"analyzer": "keyword_lowercase_analyzer",
"preserve_separators": true,
"preserve_position_increments": true,
"max_input_length": 50,
"contexts": [
{
"name": "searchable",
"type": "CATEGORY"
}
]
}
}
},
"settings": {
"index": {
"mapping": {
"ignore_malformed": "true"
},
"refresh_interval": "5s",
"analysis": {
"analyzer": {
"keyword_lowercase_analyzer": {
"filter": [
"lowercase"
],
"type": "custom",
"tokenizer": "keyword"
}
}
},
"number_of_replicas": "0",
"number_of_shards": "1"
}
}
}'
Index document
curl -X PUT "localhost:9200/com.tmp.index/_doc/123?pretty" -H 'Content-Type: application/json' -d'{
"namesuggest": {
"input": [
"PRAXIS CONSULTING AND INFORMATION SERVICES PRIVATE LIMITED."
],
"contexts": {
"searchable": [
"*"
]
}
}
}
'
Issue - the completion suggest query does not return a result
curl -X GET "localhost:9200/com.tmp.index/_search?pretty" -H 'Content-Type: application/json' -d'{
"suggest": {
"legalEntity": {
"prefix": "PRAXIS CONSULTING AND INFORMATION SERVICES PRIVATE LIMITED.",
"completion": {
"field": "namesuggest",
"size": 10,
"contexts": {
"searchable": [
{
"context": "*",
"boost": 1,
"prefix": false
}
]
}
}
}
}
}'
You are facing this issue because the default value of the max_input_length parameter is 50.
Below is the description of this parameter from the documentation:
Limits the length of a single input, defaults to 50 UTF-16 code
points. This limit is only used at index time to reduce the total
number of characters per input string in order to prevent massive
inputs from bloating the underlying datastructure. Most use cases
won’t be influenced by the default value since prefix completions
seldom grow beyond prefixes longer than a handful of characters.
If you query with the string below, which is exactly 50 characters, you will get a response:
PRAXIS CONSULTING AND INFORMATION SERVICES PRIVATE
Now if you add one or two more characters to that string, it will no longer return the result:
PRAXIS CONSULTING AND INFORMATION SERVICES PRIVATE L
You can keep this default behaviour, or you can update your index mapping with a larger max_input_length value and reindex your data.
{
"mappings": {
"dynamic": "false",
"properties": {
"namesuggest": {
"type": "completion",
"analyzer": "keyword_lowercase_analyzer",
"preserve_separators": true,
"preserve_position_increments": true,
"max_input_length": 100,
"contexts": [
{
"name": "searchable",
"type": "CATEGORY"
}
]
}
}
},
"settings": {
"index": {
"mapping": {
"ignore_malformed": "true"
},
"refresh_interval": "5s",
"analysis": {
"analyzer": {
"keyword_lowercase_analyzer": {
"filter": [
"lowercase"
],
"type": "custom",
"tokenizer": "keyword"
}
}
},
"number_of_replicas": "0",
"number_of_shards": "1"
}
}
}
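Since the updated mapping has to be applied by reindexing (as noted above), here is a minimal sketch of that step, assuming the new mapping is created on a fresh index; the name com.tmp.index.v2 is only an example:
# After creating com.tmp.index.v2 with the updated mapping above (PUT localhost:9200/com.tmp.index.v2),
# copy the existing documents across with the _reindex API:
curl -X POST "localhost:9200/_reindex?pretty" -H 'Content-Type: application/json' -d'
{
"source": { "index": "com.tmp.index" },
"dest": { "index": "com.tmp.index.v2" }
}'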
After updating the index and reindexing, you will get a response like the one below:
"suggest": {
"legalEntity": [
{
"text": "PRAXIS CONSULTING AND INFORMATION SERVICES PRIVATE LIMITED",
"offset": 0,
"length": 58,
"options": [
{
"text": "PRAXIS CONSULTING AND INFORMATION SERVICES PRIVATE LIMITED.",
"_index": "74071871",
"_id": "123",
"_score": 1,
"_source": {
"namesuggest": {
"input": [
"PRAXIS CONSULTING AND INFORMATION SERVICES PRIVATE LIMITED."
],
"contexts": {
"searchable": [
"*"
]
}
}
},
"contexts": {
"searchable": [
"*"
]
}
}
]
}
]
}

Copy field value to a new field in existing index

I have a document whose structure contains an object field with a nested field inside it. The nested field stores all interactions that occurred in an internal communication.
I now need to create a new field inside the nested field, with a new type, which will store the old field's content parsed with a new analyzer.
How can I copy the data from the old field to the new field inside the nested field?
My index mapping:
curl -XPUT 'localhost:9200/problems?pretty' -H 'Content-Type: application/json' -d '
{
"settings": {
"number_of_shards": 1
},
"mappings": {
"problem": {
"properties": {
"problemid": {
"type": "long"
},
"subject": {
"type": "text",
"index": true
},
"usermessage": {
"type": "object",
"properties": {
"content": {
"type": "nested",
"properties": {
"messageid": {
"type": "long",
"index": true
},
"message": {
"type": "text",
"index": true
}
}
}
}
}
}
}
}
}'
My New Field:
curl -XPUT 'localhost:9200/problems/_mapping/problem?pretty' -H 'Content-Type: application/json' -d '
{
"properties": {
"usermessage": {
"type": "object",
"properties": {
"content": {
"type": "nested",
"properties": {
"message_accents" : {
"type" : "text",
"analyzer" : "ignoreaccents"
}
}
}
}
}
}
}
'
Data Example:
{
"problemid": 1,
"subject": "Test",
"usermessage": {
"content": [
{
"messageid": 1,
"message": "Hello"
},
{
"messageid": 2,
"message": "Its me"
}
]
}
}
My script to copy fields:
curl -XPOST 'localhost:9200/problems/_update_by_query' -H 'Content-Type: application/json' -d '
{
"query": {
"match_all": {
}
},
"script": "ctx._source.usermessage.content.message_accents = ctx._source.usermessage.content.message"
}'
I tried the code below, but it didn't work; it returns an error.
curl -XPOST 'localhost:9200/problems/_update_by_query' -H 'Content-Type: application/json' -d '
{
"query": {
"match_all": {
}
},
"script": "ctx._source.usermessage.content.each { elm -> elm.message_accents = elm.message }"
}
'
Error:
"script":"ctx._source.usermessage.content.each { elm -> elm.message_accents = elm.message }","lang":"painless","caused_by":{"type":"illegal_argument_exception","reason":"unexpected token ['{'] was expecting one of [{, ';'}]."}},"status":500}%

Bing Visual Search Not Returning Image Tag properly

Here is my response from the Bing Visual Search API (v7).
Expected response:
https://learn.microsoft.com/en-us/bing/search-apis/bing-visual-search/reference/response-objects
Actual response:
{
"_type": "ImageKnowledge",
"instrumentation": {
"_type": "ResponseInstrumentation"
},
"tags": [
{
"displayName": "",
"actions": [
{
"actionType": "MoreSizes"
},
{
"actionType": "ImageById"
}
]
}
],
"image": {**strong text**
"imageInsightsToken": "ccid_w4YMbjM9"
},
"imageQualityHints": [
{
"category": "UnknownFormat"
}
],
"debugInfo": {}
}
Can anyone help?

Google GA4 batchRunReports throws 500 (Internal Server Error) when the API has no records

https://developers.google.com/analytics/devguides/reporting/data/v1/rest/v1alpha/TopLevel/batchRunReports
Request :
{
"entity": {
"propertyId": "XXXXXXXX"
},
"requests": [
{
"entity": {
"propertyId": "XXXXXXXX"
},
"dimensions": [
{
"name": "date"
},
{
"name": "dateHour"
},
{
"name": "firstUserCampaignName"
}
],
"metrics": [
{
"name": "sessions"
}
],
"dateRanges": [
{
"startDate": "2021-04-06",
"endDate": "2021-04-07"
}
],
"metricAggregations": [
"TOTAL"
],
"dimensionFilter": {
"andGroup": {
"expressions": [
{
"filter": {
"fieldName": "medium",
"stringFilter": {
"matchType": "EXACT",
"value": "Test"
}
}
}
]
}
},
"orderBys": [
{
"desc": true,
"metric": {
"metricName": "sessions"
}
},
{
"desc": false,
"dimension": {
"dimensionName": "dateHour"
}
}
],
"keepEmptyRows": true
}
]
}
Response:
{
"error": {
"code": 500,
"message": "Internal error encountered.",
"status": "INTERNAL"
}
}
But if I remove the following property from the request:
"metricAggregations": [
"TOTAL"
],
I am able to see the following response, where there are no rows:
{
"reports": [
{
"metricHeaders": [
{
"name": "sessions",
"type": "TYPE_INTEGER"
}
],
"metadata": {},
"dimensionHeaders": [
{
"name": "date"
},
{
"name": "dateHour"
},
{
"name": "firstUserCampaignName"
}
],
"kind": "analyticsData#runReport"
}
],
"kind": "analyticsData#batchRunReports"
}
Any idea how to prevent the 500 Internal Server Error in this case?
This error blocks the Google API call for an hour.
Furqan, there seems to be an issue with the Data API where a call using metricAggregations fails when the generated report is empty. In the meantime, to work around this error, you can modify the query so that the resulting report contains more than zero rows.
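As a rough sketch of that workaround in practice (the v1alpha endpoint is taken from the reference linked above; the ACCESS_TOKEN variable and the request-body files are assumptions for illustration), you can detect the 500 and retry the same request without metricAggregations:
# request.json        - the batchRunReports body shown above (with metricAggregations)
# request-no-agg.json - the same body with the metricAggregations block removed
ENDPOINT="https://analyticsdata.googleapis.com/v1alpha:batchRunReports"

STATUS=$(curl -s -o response.json -w "%{http_code}" \
  -H "Authorization: Bearer ${ACCESS_TOKEN}" \
  -H "Content-Type: application/json" \
  -d @request.json \
  "$ENDPOINT")

# The aggregated call can return HTTP 500 when the report is empty;
# fall back to the aggregation-free request so the call still succeeds.
if [ "$STATUS" -eq 500 ]; then
  curl -s \
    -H "Authorization: Bearer ${ACCESS_TOKEN}" \
    -H "Content-Type: application/json" \
    -d @request-no-agg.json \
    "$ENDPOINT" > response.json
fi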

ElasticSearch multiple AND/OR query

I have a schema like below -
{
"errorCode": "e015",
"errorDescription": "Description e015",
"storeId": "71102",
"businessFunction": "PriceFeedIntegration",
"createdDate": "2021-02-20T09:17:04.004",
"readBy": [
{
"userId": "scha3055"
},
{
"userId": "abcd1234"
}
]
}
I'm trying to search for a combination of "errorCode", "storeId", and "businessFunction" with a date range, like below:
{
"query": {
"bool": {
"must": [
{
"terms": {
"errorCode": [
"e015",
"e020",
"e022"
]
}
},
{
"terms": {
"storeId": [
"71102",
"71103"
]
}
},
{
"range": {
"createdDate": {
"gte": "2021-02-16T09:17:04.000",
"lte": "2021-02-22T00:00:00.005"
}
}
}
]
}
}
}
But when I add another condition with "businessFunction" the query does not work.
{
"query": {
"bool": {
"must": [
{
"terms": {
"errorCode": [
"e015",
"e020",
"e022"
]
}
},
{
"terms": {
"storeId": [
"71102",
"71103"
]
}
},
{
"terms": {
"errorDescription": [
"Description e020",
"71103"
]
}
},
{
"range": {
"createdDate": {
"gte": "2021-02-16T09:17:04.000",
"lte": "2021-02-22T00:00:00.005"
}
}
}
]
}
}
}
What am I missing in the query? When I add the third "terms" condition, the query does not work. Please suggest a fix or let me know of an alternate way.
In your query you are searching for "Description e020", but in your example document you stored "Description e015".
Short answer, which I hope is right for you:
"Description e015" will have been indexed as the two terms ["description", "e015"].
Use match_phrase instead of terms:
...
{
"match_phrase": {
"errorDescription": "Description e015"
}
},
{
"range": {
"createdDate": {
"gte": "2021-02-16T09:17:04.000",
"lte": "2021-02-22T00:00:00.005"
}
}
}
....
Without knowing your mapping, I think your errorDescription field is analyzed.
Another option, not recommended: if your field is analyzed and you require an exact match, search on errorDescription.keyword:
{
"terms": {
"errorDescription.keyword": [
"Description e015"
]
}
}
UPDATE
Long answer:
As I mentioned previously, your field value was probably analyzed, i.e. converted from "PriceFeedIntegration" to "pricefeedintegration".
There are two options:
Search against your field's .keyword sub-field, i.e. businessFunction.keyword.
Change your field mapping so the value is not analyzed; then terms returns results just as you expect.
Option 1:
This is the easy way. If you never run full-text searches on that field, it is wasteful to keep the default dual text/keyword mapping; but if that does not matter to you, use this option, as it is the simplest.
Check your businessFunction.keyword field (created by default if you don't specify a mapping).
Indexing data without a mapping into the my000001 index:
curl -X "POST" "http://localhost:9200/my000001/_doc" \
-H "Content-type: application/json" \
-d $'
{
"errorCode": "e015",
"errorDescription": "Description e015",
"storeId": "71102",
"businessFunction": "PriceFeedIntegration",
"createdDate": "2021-02-20T09:17:04.004"
}'
Check
curl -X "GET" "localhost:9200/my000001/_analyze" \
-H "Content-type: application/json" \
-d $'{
"field": "businessFunction.keyword",
"text": "PriceFeedIntegration"
}'
Result:
{
"tokens": [
{
"token": "PriceFeedIntegration",
"start_offset": 0,
"end_offset": 20,
"type": "word",
"position": 0
}
]
}
Get the results using businessFunction.keyword
curl -X "GET" "localhost:9200/my000001/_search" \
-H "Content-type: application/json" \
-d $'{
"query": {
"bool": {
"must": [
{
"terms": {
"errorCode": [
"e015",
"e020",
"e022"
]
}
},
{
"terms": {
"storeId": [
"71102",
"71103"
]
}
},
{
"terms": {
"businessFunction.keyword": [
"PriceFeedIntegration2",
"PriceFeedIntegration"
]
}
},
{
"range": {
"createdDate": {
"gte": "2021-02-16T09:17:04.000",
"lte": "2021-02-22T00:00:00.005"
}
}
}
]
}
}
}' | jq
Why isn't this recommended as the default option?
"The default dynamic string mappings will index string fields both as
text and keyword. This is wasteful if you only need one of them."
https://www.elastic.co/guide/en/elasticsearch/reference/current/tune-for-disk-usage.html
Option 2:
Run the following against the my000001 index:
curl -X "GET" "localhost:9200/my000001/_analyze" \
-H "Content-type: application/json" \
-d $'{
"field": "businessFunction",
"text": "PriceFeedIntegration"
}'
You can see that your field value was analyzed (tokenized, lowercased, and otherwise modified depending on the analyzer and the value provided).
Results:
{
"tokens": [
{
"token": "pricefeedintegration",
"start_offset": 0,
"end_offset": 20,
"type": "<ALPHANUM>",
"position": 0
}
]
}
That is the reason why your search doesn't return results: "PriceFeedIntegration" doesn't match "pricefeedintegration".
"The problem isn't with the term query; it is with the way the data has been indexed."
Your businessFunction field value was analyzed.
If you need to find (search/filter) by exact values, you probably need to change your "businessFunction" field mapping so that it is not analyzed (a keyword type).
Changing the mapping requires deleting your index and creating it again with the required mapping; if you try to create an index that already exists, you will get a "resource_already_exists_exception" error.
Here is the background that you need to know in order to solve your problem:
https://www.elastic.co/guide/en/elasticsearch/guide/master/_finding_exact_values.html#_finding_exact_values
Create a Mapping on a new my000005 index
curl -X "PUT" "localhost:9200/my000005" \
-H "Content-type: application/json" \
-d $'{
"mappings" : {
"properties" : {
"businessFunction" : {
"type" : "keyword"
},
"errorDescription" : {
"type" : "text"
},
"errorCode" : {
"type" : "keyword"
},
"createdDate" : {
"type" : "date"
},
"storeId": {
"type" : "keyword"
}
}
}
}'
Indexing data
curl -X "POST" "http://localhost:9200/my000005/_doc" \
-H "Content-type: application/json" \
-d $'
{
"errorCode": "e015",
"errorDescription": "Description e015",
"storeId": "71102",
"businessFunction": "PriceFeedIntegration",
"createdDate": "2021-02-20T09:17:04.004"
}'
Get the results that you expect, using terms on businessFunction:
curl -X "GET" "localhost:9200/my000005/_search" \
-H "Content-type: application/json" \
-d $'{
"query": {
"bool": {
"must": [
{
"terms": {
"errorCode": [
"e015",
"e020",
"e022"
]
}
},
{
"terms": {
"storeId": [
"71102",
"71103"
]
}
},
{
"terms": {
"businessFunction": [
"PriceFeedIntegration2",
"PriceFeedIntegration"
]
}
},
{
"range": {
"createdDate": {
"gte": "2021-02-16T09:17:04.000",
"lte": "2021-02-22T00:00:00.005"
}
}
}
]
}
}
}' | jq
This answer is based on what I assume your mapping and your needs are.
In the future, share your mapping and your ES version in order to get a better answer from the community:
curl -X "GET" "localhost:9200/yourindex/_mappings"
Please read this https://www.elastic.co/guide/en/elasticsearch/guide/master/_finding_exact_values.html#_finding_exact_values
and this https://www.elastic.co/blog/strings-are-dead-long-live-strings
