As part of a school research project, I would like to know whether certain websites use Elasticsearch as their search engine. Is there any way of finding out?
Thanks!
There's no way to know what search engine a website uses internally unless they tell you.
But there are many use cases listed on the official Elastic page; putting the companies/services that use your product on your official page is a common practice.
I'm really after some advice here please.
I have a membership website, and I am using WordPress to manage access to my paid subscriptions.
I'm interested in Kibana for its data visualization, but my question is really whether I'm using it for the wrong purpose.
My point is: can Kibana be embedded and made to look white-labelled?
I'm pretty sure Kibana is really meant for internal use, hence my question.
Thanks!
Yes, that is possible. You can create various visualizations and combine them into one dashboard in Kibana.
The dashboard can then be shared and embedded, as discussed in this LINK. As an example, the share URL comes in the following format:
<iframe src="http://abc.myserver.com:5601/app/kibana#/dashboard/a6c99100-b2b2-11e7-8aa0-9fc1ad35f7e7?embed=true&_g=()" height="600" width="800"></iframe>
You can either share a snapshot of the dashboard's current state, or share the saved dashboard so that your users get the most recent saved version whenever they reload the page. With embed=true in the URL, Kibana's navigation chrome is hidden, which helps with the white-labelled look you're after.
Let me know if it helps.
I'm searching for a working search UI demo or tutorial for building a search UI/frontend, ideally in PHP or JS.
I have never built an Elasticsearch application, but I have already done projects with Lucene, Solr, epoq and Google Search.
I have already searched the internet, but most examples are very simple and incomplete.
Examples:
github.com/scotchfield/elasticsearch-react-example/
github.com/spalger/elasticsearch-angular-example
There are also APIs for PHP and JS:
www.elastic.co/guide/en/elasticsearch/client/javascript-api/current/quick-start.html
www.elastic.co/guide/en/elasticsearch/client/php-api/2.0/_quickstart.html
What an example should contain, from my point of view (see the sketch after this list):
* Basic search field
* Filters based on indexed ES fields
* Result view
* Filter interaction with the results
* Paging
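A minimal sketch of how those points could map onto a single Elasticsearch request, using the official PHP client linked above; the index name, field names and values here are hypothetical placeholders, not taken from any existing demo:

<?php
// Hypothetical example: one search request covering the points above.
// Assumes a "products" index with "name" and "category" fields.
require 'vendor/autoload.php';

use Elasticsearch\ClientBuilder;

$client = ClientBuilder::create()
    ->setHosts(['localhost:9200'])      // assumed local Elasticsearch node
    ->build();

$params = [
    'index' => 'products',              // hypothetical index
    'body'  => [
        'from' => 0,                    // paging: offset
        'size' => 10,                   // paging: page size
        'query' => [
            'bool' => [
                // basic search field: free-text input from the user
                'must'   => [['match' => ['name' => 'blue dress']]],
                // filter based on an indexed field (e.g. a facet the user clicked)
                'filter' => [['term' => ['category' => 'clothes']]],
            ],
        ],
        // aggregation to drive the filter UI (facet counts that react to the query)
        'aggs' => [
            'categories' => ['terms' => ['field' => 'category']],
        ],
    ],
];

$response = $client->search($params);

// result view: render the hits
foreach ($response['hits']['hits'] as $hit) {
    echo $hit['_source']['name'], PHP_EOL;
}

// filter interaction: render the facet counts next to the results
foreach ($response['aggregations']['categories']['buckets'] as $bucket) {
    echo $bucket['key'], ' (', $bucket['doc_count'], ')', PHP_EOL;
}

The UI work is then mostly filling in and reading these parts of the request and response: search box -> match query, facet clicks -> filter and aggs, pager -> from/size.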
I was thinking something like this already exists, but I found no matching one. I think it's better to ask before investing time in creating it.
Thanks in advance
densanki
I found this live demo interesting:
http://demo.searchkit.co/imdb
If you already know the Elasticsearch query syntax and some basic concepts like aggregations, you can test all of that on https://demo.elastic.co/app/kibana. But again, this would require some basic knowledge of the Elastic Stack.
Otherwise, if you just want to touch/try the most common Elasticsearch features without going into implementation details, you can check these three demos of the third-party SearchKit UI component.
There's also Elastic UI that you might be interested in: https://elastic.github.io/eui/#/
If you want something ready-made for the time-series use-case, you might want to check Sematext Cloud. There's a simpler UI than Kibana there, but there's API access, too, so you can develop your own.
I want to scrape a few websites, and many people suggested Scrapy. It is Python-based, and since I am very familiar with PHP I looked for alternatives.
I found a crawler, PHPCrawl. I am not sure if it is just a crawler or whether it provides scraping facilities as well. If it can be used for scraping, will it support XPath or regular expressions?
How does it compare with Scrapy, which is in Python?
Please suggest which is best to use for scraping websites.
Thanks
PHPCrawl is a pure crawler: it delivers the pages it finds and their source code to users "as they are" (together with some context information). Therefore it's fast, it's able to use multiple processes, and it has tons of options for configuring it.
I can't say much about Scrapy since I haven't used it so far.
Yes, of course.
But as I said, PHPCrawl delivers the page sources, and you have to extract the data you want from them yourself, for example with XPath or regular expressions.
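As a minimal sketch of that split of responsibilities, assuming PHPCrawl's documented override-class setup; the start URL, page limit, XPath expression and regex below are hypothetical placeholders:

<?php
// Hypothetical example: crawl a site with PHPCrawl and extract data from each
// delivered page with DOMXPath and a regular expression.
require("libs/PHPCrawler.class.php");

class MyScraper extends PHPCrawler
{
    function handleDocumentInfo($DocInfo)
    {
        // PHPCrawl hands every found page to this callback "as it is".
        if ($DocInfo->http_status_code != 200 || $DocInfo->source == "") {
            return;
        }

        // Extraction option 1: XPath on the delivered source
        $dom = new DOMDocument();
        @$dom->loadHTML($DocInfo->source);
        $xpath = new DOMXPath($dom);
        foreach ($xpath->query("//h1") as $node) {            // placeholder selector
            echo $DocInfo->url . ": " . trim($node->textContent) . PHP_EOL;
        }

        // Extraction option 2: a plain regular expression on the source
        if (preg_match('/<title>(.*?)<\/title>/si', $DocInfo->source, $m)) {
            echo "Title: " . trim($m[1]) . PHP_EOL;
        }
    }
}

$crawler = new MyScraper();
$crawler->setURL("http://www.example.com/");        // placeholder start URL
$crawler->addContentTypeReceiveRule("#text/html#"); // only fetch HTML pages
$crawler->setPageLimit(50);                         // keep the test crawl small
$crawler->go();

So PHPCrawl handles the crawling side, and the XPath/regex part is ordinary PHP that you write yourself.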
Does anyone have an idea about searching for users/images on MySpace using the corresponding APIs? The site said that using their API allows you to search for people, videos, images and music, but is that still possible now?
I think this wiki might help you:
http://wiki.developer.myspace.com/index.php?title=Albums_MySpace_v1
It shows how to fetch user images. Cheers!
When it comes to URL rewriting there are some alternatives these days, like the IIS7 module or Urlrewriter.NET. However, as far as I can see those two are based on wildcards, which I sadly cannot use.
My problem is that the data I'm working with has no real structure. A made-up example:
Something.aspx?page=4 might be /Weapons/Flails/
Something.aspx?page=5 might be /Clothes/Dresses/Blue/
i.e. there is no clear mapping between the page id and the kind of page it points to. I guess this requires some kind of lookup (slugs?) in a database.
What is the easiest way to implement this? Do any of the existing alternatives offer a solution, or do I have to build my own module?
Thank you.
IIS7 has a regular-expression option, not just wildcards, and it also has rewrite maps, which I think is what you are looking for: a rewrite map could be your lookup table.
When you set up a rewrite rule, use the drop-down to find the regex option.
Also, when you create the rule, the option for using the map is there.
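As a rough sketch (assuming the IIS URL Rewrite module; the map entries below are just the made-up examples from the question), a rewrite map in web.config could look like this:

<system.webServer>
  <rewrite>
    <rewriteMaps>
      <!-- friendly URL -> real page; in practice you would generate and maintain these entries -->
      <rewriteMap name="StaticRewrites">
        <add key="/Weapons/Flails/" value="/Something.aspx?page=4" />
        <add key="/Clothes/Dresses/Blue/" value="/Something.aspx?page=5" />
      </rewriteMap>
    </rewriteMaps>
    <rules>
      <rule name="Rewrite from map" stopProcessing="true">
        <match url=".*" />
        <conditions>
          <!-- look the requested path up in the map; rewrite only when there is a hit -->
          <add input="{StaticRewrites:{REQUEST_URI}}" pattern="(.+)" />
        </conditions>
        <action type="Rewrite" url="{C:1}" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>

Because the map lives in web.config, it works best while the list of URLs is small and fairly static; for a large or frequently changing set, a database lookup in your own module (as suggested below) is the better fit.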
You can also use the Managed Fusion URL Rewriter and Reverse Proxy. It supports the Apache mod_rewrite syntax for configuring rewriting. And you can use the method described on my blog to create a module that does a database lookup of these old IDs and redirects them to the correct location.
http://www.coderjournal.com/2008/12/creating-extension-module-net-url-rewriter-reverse-proxy/
Please contact me through my blog if you would like help setting up this type of rewriting.