I want to figure out how to search a table's field for a keyword and return the rows that match, e.g. search for the word apple in the description field of a post table.
https://docs.servicenow.com/bundle/newyork-application-development/page/integrate/inbound-rest/concept/c_TableAPI.html#r_TableAPI-GET
If you are using the ServiceNow Table API to fetch results, you can do this on any column of any table. All you need to do is use the sysparm_query parameter with the CONTAINS keyword on the column you are trying to search.
For example:
I want to fetch all the incidents that contain the string "unable to connect" in their descriptions:
https://myinstance.service-now.com/api/now/v2/table/incident?sysparm_query=descriptionCONTAINSunable+to+connect
This returns all the incident records that contain the string "unable to connect" in their descriptions.
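If you need to combine conditions or trim the response, the other standard Table API parameters can be added to the same request. A sketch (the active=true condition, field list, and limit here are just illustrative) could look like:

https://myinstance.service-now.com/api/now/v2/table/incident?sysparm_query=descriptionCONTAINSunable+to+connect^active=true&sysparm_fields=number,short_description,description&sysparm_limit=10

The ^ joins encoded-query conditions with AND, sysparm_fields restricts the columns returned, and sysparm_limit caps the number of records.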
Hello everyone, I am working on a system and need to add a query that receives a search term stored in the variable $search and fetches all matching values for a specific key in a JSON field in the database.
I have tried the code below:
$query = Book::whereJsonContains('book_details->author',['like'=>"%{$search}%"])->get();
Book is my model, book_details is the JSON field name, and author is the key. I want to retrieve every book whose author is related to the search term.
In Laravel, you can perform a query that retrieves all matching values for a specific key in a JSON field by utilizing the whereJsonContains method. A revised version of your code could look like this:
$query = Book::whereJsonContains('book_details->author', $search)->get();
This query will return all Book records where the value of the author key in the book_details JSON field contains the search term stored in the $search variable.
Note that you don't need to wrap the search term in % characters: whereJsonContains checks whether the JSON value contains the given value exactly, rather than performing a LIKE-style pattern match.
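If you actually need partial matching on the author key (for example, so that "tolk" matches "Tolkien"), one sketch, assuming book_details is a native MySQL JSON column, is to use Laravel's JSON path syntax with a like clause instead:

$query = Book::where('book_details->author', 'like', "%{$search}%")->get();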
I have an index named web with a field called csr that has a datatype of string. The reason we made it so is that we have migrated from MySQL and are now trying to use Elasticsearch.
The csr field represents the data that was linked/joined with another table during the MySQL phase (it shows that a web record HAS MANY linked csr records).
Here's an example screenshot of the index using Kibana:
Now I'm trying to do a range query filter to display all the documents that belong to a certain range of csr.csr_story_value. For example, I want to display documents with csr.csr_story_value values ranging from 4 to 5. However, since there are multiple JSON elements in the csr field, I presume the query just matches against the highest value in the csr field. Here's an example screenshot:
MAIN QUESTION
Is there a way in Elasticsearch to combine a match query with this range query so I can match on my kgp_id and cli_id and pick out a specific JSON element in this field?
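Roughly, this is the kind of combined query I have in mind, assuming the csr field could be mapped as a nested type (the kgp_id and cli_id values below are placeholders):

GET web/_search
{
  "query": {
    "nested": {
      "path": "csr",
      "query": {
        "bool": {
          "must": [
            { "match": { "csr.kgp_id": 123 } },
            { "match": { "csr.cli_id": 456 } },
            { "range": { "csr.csr_story_value": { "gte": 4, "lte": 5 } } }
          ]
        }
      }
    }
  }
}

With a plain object mapping, the match and range clauses can match against different elements of the csr array, which seems to be what is happening in my case.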
I am looking to create a URL on a string field called "session name" in the Kibana interface, in order to be redirected to another search.
In my URL I need to use two queries: one based on the field value, for which I used '{{value}}', and a second one using the #timestamp value.
I have been trying "{{rawValue:#timestamp}}", but in vain.
My goal is to link two searches based on the fields "session name" and the timestamp.
Can anyone help?
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"4eb9f840-3969-11e8-ae19-552e148747c3\",\"filter\":[],\"query\":{\"language\":\"lucene\",\"query\":\"\"}}"
}
The above snippet is the exported JSON of a Kibana visualization. Without exporting this JSON, is there a direct way to get the
\"index\":\"4eb9f840-3969-11e8-ae19-552e148747c3\"
index id? If I am not wrong, this is supposed to be the index pattern id, as it is common across visualizations that use the same index.
So, you can retrieve all index patterns using this query
GET .kibana/_search?q=type:index-pattern&size=100
Additionally, you can retrieve a specific index pattern given its name using
GET .kibana/_search?q=type:index-pattern%20AND%20index-pattern.title:indexname
Similarly, regarding visualizations, you can retrieve one by name using
GET .kibana/_search?q=type:visualization%20AND%20visualization.title:vizname
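To get at the index id without exporting, one sketch (using the _source query-string parameter for source filtering) is to return only the visualization's searchSourceJSON, which embeds the index id:

GET .kibana/_search?q=type:visualization%20AND%20visualization.title:vizname&_source=visualization.kibanaSavedObjectMeta.searchSourceJSON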
I have many JSON documents to be stored in Elasticsearch.
All the JSON documents have a few fields which always exist, but some of the fields need not exist all the time.
At maximum there would be 5000 fields in the complete JSON.
Some of the fields can contain either string or date values, and new fields can also be added in new documents.
Problems Faced:
As some of the fields can have either date type or string type, Elasticsearch gives an exception while inserting new documents.
Exception:
Wrong mapping type current type [Date] mapped Type [text]
Question 1:
Can we set the type of a field to something like (*) instead of date or string, so that it can store any type?
Can this be done after insertion, or must it be declared initially, before documents are added?
But if new fields are to be added, I would have to re-insert all the documents.
Question 2:
Although there are 5000 fields, can we index only certain fields?
If yes, how?
I want something similar to SQL:
Step 1: Create Table with 5000 columns.
Step 2: Create index only on 10 fields
Step 3: If more indexes are needed, add them dynamically
Are the same things possible in Elasticsearch?
I mean, can it be done in the following way?
1. First insert documents with no fields indexed.
2. Then index only the certain fields which are needed.
3. Add more indexes if required on already existing documents.
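For reference, a rough sketch of what this could look like as an Elasticsearch mapping (the index name, type name, and field names are placeholders; assuming ES 6.x-style syntax). "dynamic": false keeps unmapped fields in _source without indexing them, and "index": false keeps a mapped field stored but unsearchable; new fields can be added later with the put mapping API, but an existing field's type cannot be changed without reindexing:

PUT myindex
{
  "mappings": {
    "_doc": {
      "dynamic": false,
      "properties": {
        "title":       { "type": "text" },
        "created_at":  { "type": "date" },
        "mixed_field": { "type": "keyword", "index": false }
      }
    }
  }
}

If all 5000 fields were actually mapped, the default index.mapping.total_fields.limit of 1000 would also need to be raised.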