Elasticsearch with Yii 2.0: Error: Elasticsearch request failed: 7 - Failed to connect to ##.##.##.### port 9200: Connection refused

I have Elasticsearch properly configured on my server. I can do everything from the command line using cURL. I can even connect to it using cURL from a PHP script outside Yii. However, I can't seem to get it to work from within Yii 2.0.
In my config, I have:
'elasticsearch' => [
    'class' => 'yii\elasticsearch\Connection',
    'nodes' => [
        ['http_address' => 'localhost:9200'],
        // configure more hosts if you have a cluster
    ],
],
But when I try to do a simple query in Yii, I get the error below. Note how it is using my server's IP address rather than 'localhost' or '127.0.0.1'. Note: I've hashed out my IP address for security.
Elasticsearch Database Exception – yii\elasticsearch\Exception
Elasticsearch request failed: 7 - Failed to connect to ##.##.##.### port 9200: Connection refused
Error Info: Array
(
    [requestMethod] => GET
    [requestUrl] => http://##.##.##.###:9200/profiles/profile/_search
    [requestBody] => {"size":100,"query":{"match_all":{}}}
    [responseHeaders] => Array
        (
        )
    [responseBody] =>
)

I was able to fix this error by updating Elasticsearch to a version > 1.3.0, which is the minimum requirement for yiisoft/yii2-elasticsearch.
Run curl -X GET 'http://127.0.0.1:9200' to check which version you are running.

First, follow these steps to download Elasticsearch.
wget https://download.elasticsearch.org/elasticsearch/elasticsearch/elasticsearch-1.5.2.tar.gz
mkdir es
tar -xzf elasticsearch-1.5.2.tar.gz -C es --strip-components=1
cd es
./bin/elasticsearch
Then you should be able to access localhost:9200 and get something like this:
{
    "name" : "Sigyn",
    "cluster_name" : "elasticsearch",
    "version" : {
        "number" : "2.4.0",
        "build_hash" : "ce9f0c7394dee074091dd1bc4e9469251181fc55",
        "build_timestamp" : "2016-08-29T09:14:17Z",
        "build_snapshot" : false,
        "lucene_version" : "5.5.2"
    },
    "tagline" : "You Know, for Search"
}
Second, follow the instructions at https://github.com/yiisoft/yii2-elasticsearch, and you are done.
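For reference, here is a minimal sketch (not from the original answer) of a working component config and a query against the 'profiles' index used in the question. The autodetectCluster property is an assumption based on the extension's documentation; if your extension version supports it, turning it off keeps requests on the node you configured instead of a publicly advertised node address.

// config/web.php (sketch)
'components' => [
    'elasticsearch' => [
        'class' => 'yii\elasticsearch\Connection',
        'autodetectCluster' => false, // assumption: avoid switching to the node's public IP
        'nodes' => [
            ['http_address' => '127.0.0.1:9200'],
        ],
    ],
],

// somewhere in a controller or model (sketch)
use yii\elasticsearch\Query;

$rows = (new Query())
    ->from('profiles', 'profile') // index, type
    ->limit(100)                  // equivalent to {"size":100,"query":{"match_all":{}}}
    ->all();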

Related

OpenSearch docker instance only allowing HTTPS connections

I'm trying to get OpenSearch configured on my local machine, and am deploying it through docker-compose using the following configuration:
opensearch:
  image: opensearchproject/opensearch:1.0.0
  restart: unless-stopped
  ports:
    - "9200:9200"
    - "9300:9300"
  environment:
    discovery.type: single-node
The instance starts successfully; however, when trying to access it through the web interface, it only accepts HTTPS connections with the default basic auth credentials (admin:admin). For example,
https://localhost:9200 asks me to enter administrator credentials, and upon doing so, returns the expected response:
{
    "name" : "a39dcf825899",
    "cluster_name" : "docker-cluster",
    "cluster_uuid" : "d2ZBZDQRTyG6SvYlCmX3Iw",
    "version" : {
        "distribution" : "opensearch",
        "number" : "1.0.0",
        "build_type" : "tar",
        "build_hash" : "34550c5b17124ddc59458ef774f6b43a086522e3",
        "build_date" : "2021-07-02T23:22:21.383695Z",
        "build_snapshot" : false,
        "lucene_version" : "8.8.2",
        "minimum_wire_compatibility_version" : "6.8.0",
        "minimum_index_compatibility_version" : "6.0.0-beta1"
    },
    "tagline" : "The OpenSearch Project: https://opensearch.org/"
}
However, when attempting to connect to the instance over HTTP, I get an empty response. In Chrome, the request simply fails with an empty response (screenshot omitted).
Using the OpenSearch Python client from a Django instance running in a separate Docker container (part of the same docker-compose.yml), I get:
opensearchpy.exceptions.ConnectionError: ConnectionError(('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))) caused by: ProtocolError(('Connection aborted.', RemoteDisconnected('Remote end closed connection without response')))
For reference, the code I am using to connect the OpenSearch Python client to the OpenSearch instance is:
cls._os_client = OpenSearch(
    [{"host": 'opensearch', "port": '9200'}],
    use_ssl=False,
    verify_certs=False,
    ssl_assert_hostname=False,
    ssl_show_warn=False
)
How can I configure OpenSearch to allow insecure HTTP connections?
You can disable the security plugin: just add DISABLE_SECURITY_PLUGIN=true to your environment.
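A sketch of the compose service from the question with that variable added (development use only, since it turns off TLS and authentication entirely):

opensearch:
  image: opensearchproject/opensearch:1.0.0
  restart: unless-stopped
  ports:
    - "9200:9200"
    - "9300:9300"
  environment:
    discovery.type: single-node
    DISABLE_SECURITY_PLUGIN: "true"

After recreating the container, a plain curl http://localhost:9200 should work without credentials.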

How to correctly implement elasticsearch on top of sql db datasource [duplicate]

In one of my projects, I am planning to use Elasticsearch with MySQL.
I have successfully installed Elasticsearch. I am able to manage indexes in ES separately, but I don't know how to integrate it with MySQL.
I have read a couple of documents, but I am a bit confused and don't have a clear idea.
As of ES 5.x, this feature is available out of the box with the Logstash JDBC plugin.
It will periodically import data from the database and push it to the ES server.
One has to create a simple import file like the one given below (which is also described here) and use Logstash to run it. Logstash supports running this script on a schedule.
# file: contacts-index-logstash.conf
input {
    jdbc {
        jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
        jdbc_user => "user"
        jdbc_password => "pswd"
        schedule => "* * * * *"
        jdbc_validate_connection => true
        jdbc_driver_library => "/path/to/latest/mysql-connector-java-jar"
        jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
        statement => "SELECT * from contacts where updatedAt > :sql_last_value"
    }
}
output {
    elasticsearch {
        index => "contacts"
        document_type => "contact"
        document_id => "%{id}"
        hosts => ["ES_NODE_HOST:9200"]
    }
}
# "* * * * *" -> run every minute
# sql_last_value is a built in parameter whose value is set to Thursday, 1 January 1970,
# or 0 if use_column_value is true and tracking_column is set
You can download the mysql jar from maven here.
If the indexes do not exist in ES when this script is executed, they will be created automatically, just like a normal POST call to Elasticsearch.
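For reference (not part of the original answer), the import is started by pointing Logstash at the config file; the exact path to the logstash binary depends on how it was installed:

bin/logstash -f contacts-index-logstash.conf

With the schedule option set as above, Logstash keeps running and re-executes the statement every minute, using sql_last_value to pick up only rows updated since the previous run.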
Finally, I was able to find the answer; sharing my findings.
To use Elasticsearch with MySQL you will need the JDBC (Java Database Connectivity) importer; with JDBC drivers you can sync your MySQL data into Elasticsearch.
I am using Ubuntu 14.04 LTS, and you will need to install Java 8 to run Elasticsearch, as it is written in Java.
The following are the steps to install Elasticsearch 2.2.0 and elasticsearch-jdbc 2.2.0; please note that both versions have to be the same.
After installing Java 8, install Elasticsearch 2.2.0 as follows:
# cd /opt
# wget https://download.elasticsearch.org/elasticsearch/release/org/elasticsearch/distribution/deb/elasticsearch/2.2.0/elasticsearch-2.2.0.deb
# sudo dpkg -i elasticsearch-2.2.0.deb
This installation procedure will install Elasticsearch in /usr/share/elasticsearch/, with its configuration files placed in /etc/elasticsearch.
Now let's do some basic configuration in the config file; here /etc/elasticsearch/elasticsearch.yml is our config file.
You can open the file for editing with:
nano /etc/elasticsearch/elasticsearch.yml
and change the cluster name and node name.
For example:
# ---------------------------------- Cluster -----------------------------------
#
# Use a descriptive name for your cluster:
#
cluster.name: servercluster
#
# ------------------------------------ Node ------------------------------------
#
# Use a descriptive name for the node:
#
node.name: vps.server.com
#
# Add custom attributes to the node:
#
# node.rack: r1
Now save the file and start Elasticsearch:
/etc/init.d/elasticsearch start
To test whether ES is installed, run the following:
curl -XGET 'http://localhost:9200/?pretty'
If you get the following, then your Elasticsearch is installed now :)
{
    "name" : "vps.server.com",
    "cluster_name" : "servercluster",
    "version" : {
        "number" : "2.2.0",
        "build_hash" : "8ff36d139e16f8720f2947ef62c8167a888992fe",
        "build_timestamp" : "2016-01-27T13:32:39Z",
        "build_snapshot" : false,
        "lucene_version" : "5.4.1"
    },
    "tagline" : "You Know, for Search"
}
Now let's install elasticsearch-jdbc.
Download it from http://xbib.org/repository/org/xbib/elasticsearch/importer/elasticsearch-jdbc/2.3.3.1/elasticsearch-jdbc-2.3.3.1-dist.zip, extract it into /etc/elasticsearch/, and also create a "logs" folder there (the path of logs should be /etc/elasticsearch/logs).
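A quick sketch of those steps (assumptions: unzip is available and the archive extracts into an elasticsearch-jdbc-<version> directory; note the download above is 2.3.3.1 while the run command further down references a 2.2.0.0 path, so use whichever version you actually installed consistently):

cd /etc/elasticsearch
wget http://xbib.org/repository/org/xbib/elasticsearch/importer/elasticsearch-jdbc/2.3.3.1/elasticsearch-jdbc-2.3.3.1-dist.zip
unzip elasticsearch-jdbc-2.3.3.1-dist.zip
mkdir -p /etc/elasticsearch/logs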
I have one database created in MySQL named "ElasticSearchDatabase", and inside it a table named "test" with the fields id, name, and email.
cd /etc/elasticsearch
and run the following:
echo '{
    "type":"jdbc",
    "jdbc":{
        "url":"jdbc:mysql://localhost:3306/ElasticSearchDatabase",
        "user":"root",
        "password":"",
        "sql":"SELECT id as _id, id, name, email FROM test",
        "index":"users",
        "type":"users",
        "autocommit":"true",
        "metrics": {
            "enabled" : true
        },
        "elasticsearch" : {
            "cluster" : "servercluster",
            "host" : "localhost",
            "port" : 9300
        }
    }
}' | java -cp "/etc/elasticsearch/elasticsearch-jdbc-2.2.0.0/lib/*" -"Dlog4j.configurationFile=file:////etc/elasticsearch/elasticsearch-jdbc-2.2.0.0/bin/log4j2.xml" "org.xbib.tools.Runner" "org.xbib.tools.JDBCImporter"
Now check whether the MySQL data was imported into ES or not:
curl -XGET http://localhost:9200/users/_search/?pretty
If all goes well, you will be able to see all your MySQL data in JSON format, and if there is any error, you will be able to see it in the /etc/elasticsearch/logs/jdbc.log file.
Caution: in older versions of ES, the elasticsearch-river-jdbc plugin was used; it is completely deprecated in the latest versions, so do not use it.
I hope I could save you some time :)
Any further thoughts are appreciated
Reference URL: https://github.com/jprante/elasticsearch-jdbc
The logstash JDBC plugin will do the job:
input {
    jdbc {
        jdbc_connection_string => "jdbc:mysql://localhost:3306/testdb"
        jdbc_user => "root"
        jdbc_password => "factweavers"
        # The path to our downloaded jdbc driver
        jdbc_driver_library => "/home/comp/Downloads/mysql-connector-java-5.1.38.jar"
        jdbc_driver_class => "com.mysql.jdbc.Driver"
        schedule => "* * * * *"
        # our query
        statement => "SELECT * FROM testtable where Date > :sql_last_value order by Date"
        use_column_value => true
        tracking_column => "Date"
    }
}
output {
    stdout { codec => json_lines }
    elasticsearch {
        hosts => "localhost:9200"
        index => "test-migrate"
        document_type => "data"
        document_id => "%{personid}"
    }
}
To make this simpler, I have created a PHP class to set up MySQL with Elasticsearch. Using my class you can sync your MySQL data into Elasticsearch and also perform full-text search. You just need to set your SQL query and the class will do the rest for you.
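The class itself is not included in the answer, so the following is only a hypothetical usage sketch of the kind of interface described; the class name, constructor arguments, and method names are illustrative, not a real API:

<?php
// Hypothetical sketch: sync rows returned by a SQL query into an index, then search it.
$sync = new MysqlElasticsearchSync(
    ['host' => '127.0.0.1', 'user' => 'root', 'password' => '', 'database' => 'mydb'], // MySQL connection
    ['hosts' => ['localhost:9200'], 'index' => 'contacts']                             // Elasticsearch target
);

$sync->setQuery('SELECT id, name, email FROM contacts'); // the SQL that feeds the index
$sync->sync();                                           // push the rows to Elasticsearch

$results = $sync->search('john');                        // full-text search on the synced data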

Error while connecting Logstash and Elasticsearch

I am very new to ELK; I installed ELK version 5.6.12 on a CentOS server. Elasticsearch and Kibana work fine, but I cannot connect Logstash to Elasticsearch.
I have set the environment variables as:
export JAVA_HOME=/usr/local/jdk1.8.0_131
export PATH=/usr/local/jdk1.8.0_131/bin:$PATH
I run a simple test:
bin/logstash -e 'input { stdin { } } output { elasticsearch { host => localhost:9200 protocol => "http" port => "9200" } }'
I get this error:
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /etc/logstash/logstash.yml/log4j2.properties. Using default config which logs errors to the console
Simple "slash" mentioned in official documentation of Logstash works like following :
$bin/logstash -e 'input { stdin { } } output { stdout {} }'
Hello
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
The stdin plugin is now waiting for input:
{
    "@version" => "1",
    "host" => "localhost",
    "@timestamp" => 2018-11-01T04:44:58.648Z,
    "message" => "Hello"
}
What could be the problem?

Reading Rabbit MQ From LogStash (ELK Instance)

I have 2 servers, one RabbitMQ, and one ELK server.
Both are independently running as they should. My ELK instance receives input messages from several sources across my network, but both servers are in the same internal network.
I am trying to make Logstash read from RabbitMQ, to get any log messages I pushed in there.
Here's my logstash conf.d file:
input {
    rabbitmq {
        host => "1.66.66.66"
        queue => "logs"
        durable => true
        exchange => "event.log"
        threads => 1
        prefetch_count => 50
        port => 5672
        user => "elk"
        password => "*******"
    }
}
filter {
}
output {
    elasticsearch {
        hosts => ["1.66.66.1:9200"]
        sniffing => false
        manage_template => false
        index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
        document_type => "%{[@metadata][type]}"
    }
}
After I run a config test ($ sudo service logstash configtest), I get: ConfigTest OK
So this seems good.
But then, when it runs, I get the following error in my /var/log/logstash/logstash.log file:
{:timestamp=>"2017-06-15T10:28:18.727000-0400", :message=>"Connection refused (Connection refused)", :class=>"Manticore::SocketException", :backtrace=>["/opt/logstash/vendor/bundle/jruby/1.9/gems/manticore-0.6.0-java/lib/manticore/response.rb:37:in `initialize'", "org/jruby/RubyProc.java:281:in `call'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/manticore-0.6.0-java/lib/manticore/response.rb:79:in `call'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/manticore-0.6.0-java/lib/manticore/response.rb:256:in `call_once'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/manticore-0.6.0-java/lib/manticore/response.rb:153:in `code'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.18/lib/elasticsearch/transport/transport/http/manticore.rb:84:in `perform_request'", "org/jruby/RubyProc.java:281:in `call'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.18/lib/elasticsearch/transport/transport/base.rb:257:in `perform_request'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.18/lib/elasticsearch/transport/transport/http/manticore.rb:67:in `perform_request'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.18/lib/elasticsearch/transport/transport/sniffer.rb:32:in `hosts'", "org/jruby/ext/timeout/Timeout.java:147:in `timeout'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.18/lib/elasticsearch/transport/transport/sniffer.rb:31:in `hosts'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.18/lib/elasticsearch/transport/transport/base.rb:79:in `reload_connections!'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:72:in `sniff!'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:60:in `start_sniffing!'", "org/jruby/ext/thread/Mutex.java:149:in `synchronize'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:60:in `start_sniffing!'", "org/jruby/RubyKernel.java:1479:in `loop'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:59:in `start_sniffing!'"], :level=>:error}
If, from my ELK server, I do:
curl -X GET http://1.66.66.1:9200
I get the proper response:
{
    "name" : "Hera",
    "cluster_name" : "elasticsearch",
    "cluster_uuid" : "-NmCwDk_RA2qzTwWlNNizQ",
    "version" : {
        "number" : "2.4.5",
        "build_hash" : "c849dd13904f53e63e88efc33b2ceeda0b6a1276",
        "build_timestamp" : "2017-04-24T16:18:17Z",
        "build_snapshot" : false,
        "lucene_version" : "5.5.4"
    },
    "tagline" : "You Know, for Search"
}
If you know anything I can try, that would be much appreciated. I am running Ubuntu 16.04 on both servers. Thanks!

ssh tunnel for elasticsearch

I am on a VPN which does not allow access to Elasticsearch directly, so I am trying to SSH tunnel to an external box that has access.
I am tunneling with the following:
ssh -L 12345:<elastic_ip>-east-1.aws.found.io:9200
but then if I curl:
curl http://user:pass@localhost:12345
I get:
{"ok":false,"message":"Unknown cluster."}
Yet, if I try this from the box directly:
curl http://user:pass@<elastic_ip>-east-1.aws.found.io:9200
I get:
{
    "status" : 200,
    "name" : "instance",
    "cluster_name" : "<cluster>",
    "version" : {
        "number" : "1.7.2",
        "build_hash" : "<build>",
        "build_timestamp" : "2015-09-14T09:49:53Z",
        "build_snapshot" : false,
        "lucene_version" : "4.10.4"
    },
    "tagline" : "You Know, for Search"
}
What am I doing wrong?
Here is how you can do it using SSH tunneling with PuTTY.
Below are the steps you need to take in order to configure SSH tunneling using PuTTY:
Download PuTTY from here and install it.
Configure PuTTY tunneling for the Elasticsearch 9300 and 9200 ports (the original answer showed this in a screenshot, omitted here).
After configuring, you'll need to open the SSH connection and make sure it is connected.
You may look at the SSH event log in order to validate your tunnel. Here is a link on how to do it.
Below is Elasticsearch client code written in Java that shows how to connect to the remote Elasticsearch cluster using the local ports (9090 and 9093) forwarded over the PuTTY SSH client.
import org.elasticsearch.action.admin.indices.create.CreateIndexRequest;
import org.elasticsearch.action.admin.indices.create.CreateIndexResponse;
import org.elasticsearch.client.transport.TransportClient;
import org.elasticsearch.common.settings.ImmutableSettings;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.transport.InetSocketTransportAddress;

public class App
{
    public static void main( String[] args ) throws Exception
    {
        Settings settings = ImmutableSettings.settingsBuilder()
                .put("cluster.name", "my-cluster").build();
        // Connect through the locally forwarded transport port instead of the remote host.
        TransportClient client = new TransportClient(settings)
                .addTransportAddress(
                        new InetSocketTransportAddress("localhost", 9093));
        CreateIndexResponse rs = client.admin().indices().create(
                new CreateIndexRequest("tunnelingindex"))
                .actionGet();
        System.out.println(rs.isAcknowledged());
        client.close();
    }
}
The code creates an index named tunnelingindex on Elasticsearch.
Hope it helps.
This is a problem with the HTTP protocol: requests carry hostnames, not only IP addresses, and if you issue the request against localhost, that hostname is what gets passed to the cluster.
There are basically two solutions, both quite hacky:
1. Set up your Elasticsearch hostname as localhost so it will recognize your query.
2. Set up your /etc/hosts to point <elastic_ip>-east-1.aws.found.io to 127.0.0.1, connect your SSH tunnel using the direct IP, and then curl the real address (see the curl sketch below).
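As an illustration of the hostname issue described above (a sketch, not part of the original answer): because the hosted-cluster proxy decides which cluster you mean from the HTTP Host header, you can also keep the tunnel as-is and override the header on the request, assuming the proxy accepts it:

curl -H "Host: <elastic_ip>-east-1.aws.found.io" http://user:pass@localhost:12345

If this returns the same JSON as the direct request, it confirms that the "Unknown cluster" response was caused by the localhost hostname reaching the cluster proxy.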
