Bing Search API is not responding consistently to the same query (images)

I am building a customized search engine with the Bing API. When I get image results from the Bing API, it does not respond consistently to the same query, “mattress”.
It is very strange: one moment there are no results, and the next moment there are some results.
How can I get consistent results for the same query?
Please check this screen capture video; it shows the difference between the two responses.
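For reproducing this, a minimal sketch of how the same image query can be repeated against the Bing Image Search v7 endpoint to check whether the result count really fluctuates between calls (the subscription key is read from a hypothetical `BING_KEY` environment variable):

```python
import os
import time
import requests

# Bing Image Search v7 endpoint and subscription-key header.
ENDPOINT = "https://api.bing.microsoft.com/v7.0/images/search"
HEADERS = {"Ocp-Apim-Subscription-Key": os.environ["BING_KEY"]}  # placeholder key

def image_result_count(query: str) -> int:
    """Run one image search and return how many results came back."""
    resp = requests.get(ENDPOINT, headers=HEADERS, params={"q": query, "count": 50})
    resp.raise_for_status()
    return len(resp.json().get("value", []))

# Repeat the same query a few times to see whether the count fluctuates.
for i in range(5):
    print(f"attempt {i + 1}: {image_result_count('mattress')} results")
    time.sleep(1)
```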

Related

How can I get the raw query sent from Kibana?

When I use Kibana to search logs, the response time is very slow. How can I grab the raw query sent to Elasticsearch from Kibana? I'd like to analyse why the query is so slow and how to improve it.
You can view the raw query, response time, request time, etc. from the "Inspect" option in visualizations or on the Discover page of Kibana.
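Once you have copied the raw request body from the Inspect panel, one way to investigate the slowness is to re-run it directly against Elasticsearch with the `profile` option enabled. A minimal sketch, assuming an unsecured local cluster on port 9200 and a hypothetical index name `my-logs`:

```python
import json
import requests

# Raw query body copied from Kibana's Inspect panel (hypothetical example).
query = {
    "query": {"match": {"message": "error"}},
    "profile": True,  # ask Elasticsearch to break down where the time is spent
}

resp = requests.post(
    "http://localhost:9200/my-logs/_search",
    headers={"Content-Type": "application/json"},
    data=json.dumps(query),
)
resp.raise_for_status()
result = resp.json()
print("took (ms):", result["took"])
print(json.dumps(result["profile"], indent=2)[:2000])  # first part of the profile tree
```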

Using a local Elasticsearch version in Postman

I am trying to use my Elasticsearch server, installed on my local machine, with Postman, i.e. with the help of Postman I want to POST data and retrieve it with a GET operation, but I am unable to do it because I am getting the error unknown key [High] for create index.
So please help me with the same.
If you want to add a document to your index,
your URL should look something like this (for document ID 1):
PUT http://localhost:9200/test/_doc/1
A good place to start:
https://www.elastic.co/guide/en/elasticsearch/reference/current/getting-started-index.html
For indexing a document in the index:
PUT http://localhost:9200/my_index/_doc/1
For retrieving the indexed document:
GET http://localhost:9200/my_index/_doc/1
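The same PUT/GET round trip can also be scripted outside Postman. A minimal sketch against a local node (no security enabled is assumed; `my_index` is the same example index name as above):

```python
import requests

BASE = "http://localhost:9200"

# Index (create or overwrite) document 1 in my_index.
doc = {"title": "Test movie", "year": 2021}
r = requests.put(f"{BASE}/my_index/_doc/1", json=doc)
r.raise_for_status()
print("indexed:", r.json()["result"])  # "created" or "updated"

# Retrieve the same document by ID.
r = requests.get(f"{BASE}/my_index/_doc/1")
r.raise_for_status()
print("source:", r.json()["_source"])
```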
Introduction:
Elasticsearch is a distributed, RESTful search and analytics engine capable of addressing a growing number of use cases. As the heart of the Elastic Stack, it centrally stores your data for lightning fast search, fine‑tuned relevancy, and powerful analytics that scale with ease.
Kibana is a free and open user interface that lets you visualize your Elasticsearch data and navigate the Elastic Stack. Do anything from tracking query load to understanding the way requests flow through your apps.
Logstash is a free and open server-side data processing pipeline that ingests data from a multitude of sources, transforms it, and then sends it to your favorite “stash.”
Elasticsearch exposes itself through a REST API, so in this case you don't have to use Logstash, as we are adding data directly to Elasticsearch.
How to add it directly:
You can create an index and type using:
{{url}}/index/type
where the index is like a table and the type is just a unique data type that we will be storing in the index, e.g. {{url}}/movielist/movie
https://praveendavidmathew.medium.com/visualization-using-kibana-and-elastic-search-d04b388a3032
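For the unknown key [High] error itself: the create-index request body only accepts top-level keys such as `settings`, `mappings`, and `aliases`, so an unexpected key in the JSON body typically triggers exactly that kind of message. A minimal sketch of a well-formed create-index call, reusing the movielist example (field names are hypothetical):

```python
import requests

BASE = "http://localhost:9200"

# Only settings, mappings and aliases are valid top-level keys here;
# anything else triggers "unknown key [...] for create index".
body = {
    "settings": {"number_of_shards": 1, "number_of_replicas": 0},
    "mappings": {
        "properties": {
            "title": {"type": "text"},
            "year": {"type": "integer"},
        }
    },
}

r = requests.put(f"{BASE}/movielist", json=body)  # hypothetical index name
r.raise_for_status()
print(r.json())  # expect {"acknowledged": true, "index": "movielist", ...}
```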

Scanning and finding sensitive data in an Elasticsearch index in an efficient way

What I have: an Elasticsearch database used for full-text search purposes.
What my requirement is: in a given Elasticsearch index, I need to detect sensitive data such as IBAN numbers, credit card numbers, passport numbers, social security numbers, addresses, etc. and report them to the client. There will be checkboxes as input parameters. For instance, the client can select credit card number and passport number, then click the detect button. After that, the system will start scanning the index and report the documents which include a credit card number and a passport number. The aim is to have more than 200 sensitive data types, and clients will be able to make multiple selections over these types.
What I have done: I have created a C# application and used the NEST library for ES queries. In order to detect each sensitive data type, I have created regular expressions and some special validation rules in my C# app, which work well for a manually given input string.
In my C# app, I have created a match-all query with the scroll API. When the user clicks the detect button, my app iterates over all the source records returned by the scroll API, and for each record it executes the sensitive data finder code based on the client's selection.
The problem here is searching all the source records in the ES index, extracting the sensitive data, and preparing the report as fast as possible for a large number of documents. I know ES is designed for full-text search, not for scanning the whole system and retrieving data. However, all the data is in Elasticsearch right now and I need to use this database for the detection operation.
I am wondering if I can do this in a different and more efficient way. Can this problem be solved by writing an Elasticsearch plugin instead of a C# app? Or is there a better solution for scanning all the source data in an ES index?
Thanks for any suggestions.
The passport number and other sensitive-information detection algorithms should run once, at indexing time, or perhaps asynchronously as a separate job that updates documents with flags representing the presence of sensitive information. Based on those flags, the relevant documents can then be searched; a sketch of such a flagging job follows below.
Search-time analysis in this case would be very costly and should be avoided.
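A minimal sketch of such a flagging job, assuming plain HTTP access to the cluster and a single illustrative regex (the index name `documents`, the field `content`, and the detector are hypothetical; the real job would plug in the existing 200+ detectors and validation rules):

```python
import json
import re
import requests

BASE = "http://localhost:9200"
INDEX = "documents"          # hypothetical index name
TEXT_FIELD = "content"       # hypothetical field holding the raw text

# One illustrative detector; the real job would reuse the existing rules.
DETECTORS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def detect(text: str) -> list[str]:
    """Return the names of all detectors that match the given text."""
    return [name for name, rx in DETECTORS.items() if rx.search(text or "")]

def flag_index() -> None:
    # Open a scroll over every document in the index.
    resp = requests.post(
        f"{BASE}/{INDEX}/_search",
        params={"scroll": "2m"},
        json={"size": 500, "query": {"match_all": {}}},
    ).json()

    while True:
        hits = resp["hits"]["hits"]
        if not hits:
            break

        # Build a bulk body that writes the detected types back onto each document.
        lines = []
        for hit in hits:
            found = detect(hit["_source"].get(TEXT_FIELD, ""))
            lines.append(json.dumps({"update": {"_index": INDEX, "_id": hit["_id"]}}))
            lines.append(json.dumps({"doc": {"sensitive_types": found}}))
        requests.post(
            f"{BASE}/_bulk",
            data="\n".join(lines) + "\n",
            headers={"Content-Type": "application/x-ndjson"},
        ).raise_for_status()

        # Fetch the next scroll page.
        resp = requests.post(
            f"{BASE}/_search/scroll",
            json={"scroll": "2m", "scroll_id": resp["_scroll_id"]},
        ).json()

if __name__ == "__main__":
    flag_index()
```

Once the job has run, the client's report reduces to a simple terms query on the `sensitive_types` flag field for the selected types, instead of a full re-scan of every document.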

Emulating google search on elasticsearch

I am trying to emulate Google search using Elasticsearch.
When I search for "My name is kalam" in Google, it shows results for the movie "I Am Kalam". How is Google search able to provide this result, and how does Google check the relevance of the data searched? Is this implementable in Elasticsearch as well?
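As a rough illustration of what Elasticsearch gives you out of the box here: a `match` query with its default OR operator already returns and ranks documents that share only some of the query terms, which is one ingredient of the behaviour described above. A minimal sketch (the `movies` index and `title` field are hypothetical):

```python
import requests

BASE = "http://localhost:9200"
INDEX = "movies"  # hypothetical index with a "title" text field

# Index a couple of titles to compare against.
for i, title in enumerate(["I am Kalam", "My Name Is Khan"], start=1):
    requests.put(f"{BASE}/{INDEX}/_doc/{i}", json={"title": title}).raise_for_status()
requests.post(f"{BASE}/{INDEX}/_refresh").raise_for_status()

# A match query with the default OR operator: documents sharing any of the
# query terms are returned, ranked by how well they match (BM25 scoring).
query = {"query": {"match": {"title": "My name is kalam"}}}
resp = requests.post(f"{BASE}/{INDEX}/_search", json=query)
resp.raise_for_status()
for hit in resp.json()["hits"]["hits"]:
    print(hit["_score"], hit["_source"]["title"])
```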

GSA Search Result Logs

I am using GSA Appliance 7.2
We are planning to improve the search experience, and for that I want to analyze the search logs. At the moment the search logs only give us the user IP and the search query link.
Beyond this, I want to know which link the user clicked and on which page of the results he found it, e.g. on the first page or the second page.
Please help me to get detailed search logs.
Thank you
It is possible to generate search reports in the GSA admin console under Reports > Search Reports. These reports are a kind of summary in which you can see the following details:
Number of search queries per day;
Number of search queries per hour;
Top keywords;
Top search queries;
Position of clicks;
Page of clicks;
Top clicked URLs;
Top IPs of clients which are used to perform search queries.
With Reports > Serving Logs, you can track, per query, which clients searched and which search results were returned by the GSA, but you can't analyze their click behaviour and user journey through a website. For that you'll need to implement Google Analytics, Omniture, or another web analytics tool. For a GSA-GA integration, please consult this document.

Resources