Elasticsearch and Redis Remote Servers - Laravel

I deployed a Laravel application on AWS Elastic Beanstalk.
I want to incorporate caching with Redis as my cache driver, as well as Elasticsearch.
I managed to run both services locally (Redis on port 6379 and Elasticsearch on 9200),
but now I want them to run on remote servers and simply specify their endpoints in my .env file.
Can anyone let me know how I can obtain remote URLs for Redis and Elasticsearch?
Update:
I found out that Heroku offers the ability to create a Redis instance, from which one can obtain a Redis URL. I presume something similar exists for Elasticsearch.
If this is not the right way to do it, please let me know how it works.
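To illustrate, once I have remote endpoints I expect the relevant .env entries to look roughly like this (the hostnames below are placeholders, and the Elasticsearch key name depends on which client package is used):

    CACHE_DRIVER=redis
    REDIS_HOST=my-redis-instance.example.com
    REDIS_PASSWORD=secret
    REDIS_PORT=6379

    # key name varies by Elasticsearch client package
    ELASTICSEARCH_HOST=https://my-es-instance.example.com:9200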

Related

Elasticsearch - Collecting logs from devices not on server LAN

I am trying to build familiarity with SIEM systems in general and decided to set up an Elastic Stack on DigitalOcean. Everything was successful, and my server is producing logs for itself as localhost. It's been interesting to tinker with visualizations and all that good stuff.
Obviously my interest isn't in logs from this remote server, though. I would like to configure some devices on my home network to send logs.
Current setup on server: filebeat > logstash > elasticsearch > kibana.
When I install filebeat onto, say, my laptop and configure the .yml file in a similar way to the server's (comment out the elasticsearch output, uncomment the logstash output), it is not able to connect. Basically, I just set the hosts to serverip:logstash port and enabled filebeat on the system. Running the setup commands leads to a "couldn't connect to any configured elasticsearch hosts" error.
Instead of a direct answer, can someone explain to me generally what I need to be considering for this process? What is happening when connecting from outside the server's LAN? And how do I handle authentication to the server, if needed?
Thank you, really. I know that the information is out there but I am deep in a rabbit hole and having a hard time finding what I need.
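For reference, the relevant part of my laptop's filebeat.yml looks roughly like this (the IP and port are placeholders for my droplet's address and the Beats input port):

    # elasticsearch output commented out, as on the server
    #output.elasticsearch:
    #  hosts: ["localhost:9200"]

    # logstash output pointed at the remote server
    output.logstash:
      hosts: ["203.0.113.10:5044"]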
By default, the HTTP API is bound to only the host's local loopback interface,
ensuring that it is not accessible to the rest of the network. Because the API
includes neither authentication nor authorization and has not been hardened or
tested for use as a publicly-reachable API, binding to publicly accessible IPs
should be avoided where possible.
Even if you set "http.host: 0.0.0.0", you still need to open the port for your laptop (better if you have a static public IP and open the port only for that address).
For authentication, you will have to look into the X-Pack security features.
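For example, a minimal sketch of opening the port, assuming a ufw firewall on the server and a Logstash Beats input listening on 5044 (the laptop IP below is a placeholder):

    # allow only the laptop's public IP to reach the Beats input
    sudo ufw allow from 198.51.100.7 to any port 5044 proto tcp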
BR Alexey.

How to host Moqui on AWS EC2

Is there a way to host Moqui on AWS? I was trying to host Moqui using an EC2 instance but couldn't figure out a way to connect them.
The Run and Deploy document on moqui.org has a section for a simple recommended deployment using Elastic Beanstalk and RDS:
https://www.moqui.org/m/docs/framework/Run+and+Deploy#AWSElasticBeanstalkandRDS
With more details about how you want to set things up on AWS, the answer might vary from this.
For clustered setups, things get more involved: you need the right settings for Hazelcast AWS discovery, and it is best to use an external ElasticSearch server such as an AWS ElasticSearch instance, configuring Moqui through environment variables to use the Java REST Client mode instead of the Embedded Node mode. Settings for the moqui-hazelcast and moqui-elasticsearch components can be seen in the MoquiConf.xml file in each component.

Get back data from Elasticsearch

I'm new to ELK. We have a Spring Boot backend on a dedicated AWS instance. We have the ELK stack on another instance (to the outside world, only Kibana is available). Information is shipped to ELK via Amazon SQS.
This information includes logs and some business history about the user (registration, any other action, etc.).
Given this, I have a question: is it possible to get back information by action and by user, and use it in the backend responses?
I am guessing you want data present in Elasticsearch to be available to the Spring Boot application. It is definitely possible. You will need to open up the Elasticsearch port on the Elasticsearch machine specifically to the EC2 instance on which Spring Boot is running. How to open the port will depend on whether they are in the same VPC, different VPCs, different AWS accounts, etc. Once the port is open, you can use either Spring Data Elasticsearch or plain REST calls to access the Elasticsearch API.
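As a minimal sketch of the Spring Data Elasticsearch route (the index name, class, and field names below are hypothetical, chosen to match the user/action data described in the question):

    import java.util.List;
    import org.springframework.data.annotation.Id;
    import org.springframework.data.elasticsearch.annotations.Document;
    import org.springframework.data.elasticsearch.repository.ElasticsearchRepository;

    // Hypothetical document mapped to a "user-actions" index; adjust the
    // index name and fields to whatever your SQS pipeline actually stores.
    @Document(indexName = "user-actions")
    class UserAction {
        @Id
        private String id;
        private String userId;
        private String action; // e.g. "registration"
        // getters and setters omitted for brevity
    }

    // Spring Data derives the Elasticsearch query from the method name,
    // so the backend can fetch history by user and action at runtime.
    public interface UserActionRepository
            extends ElasticsearchRepository<UserAction, String> {
        List<UserAction> findByUserIdAndAction(String userId, String action);
    }

The repository can then be injected into a service or controller and its results merged into the backend responses.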

Memcache on Kubernetes

I have a Spring Boot API running on a Google Cloud Kubernetes cluster, and I want a caching server for my API, so I thought I would use memcache.
I tried two ways of doing it:
I downloaded memcache from the Google launcher, which basically deploys an instance of memcache on a VM. I then assigned an external IP to my VM, whitelisted my IP to try it locally, and of course opened port 11211 (the default one). For the client side I used this guy, specified the IP address, but I still get connection cancelled: java.util.concurrent.CancellationException: Cancelled, and the docs are bad so I couldn't find anything that helps.
I decided to try another way, which was following this tutorial, and now I have the memcached cluster, but I don't know how to consume these pods from my other cluster. Or should the pods be on the same cluster I have the API running on?
I would appreciate any help; this is my first encounter with global caching.
So I figured it out based on Jonah Benton's advice.
It was actually pretty simple: I used this tutorial to create a new pod running memcached in my cluster, and then I used this client to connect to it, and it worked like a charm!
Hope it helps someone.
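In case it is useful to someone else, here is a minimal sketch of the connection code with the spymemcached client, assuming the memcached pods sit behind a Service in the same cluster (the DNS name below is a placeholder):

    import java.net.InetSocketAddress;
    import net.spy.memcached.MemcachedClient;

    public class CacheDemo {
        public static void main(String[] args) throws Exception {
            // a memcached Service in the same cluster is reachable via its DNS name
            MemcachedClient client = new MemcachedClient(
                    new InetSocketAddress("memcached.default.svc.cluster.local", 11211));

            client.set("greeting", 3600, "hello from memcached"); // TTL in seconds
            System.out.println(client.get("greeting"));

            client.shutdown();
        }
    }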

Elasticsearch Access Log

I'm trying to track down who is issuing queries to an Elasticsearch cluster. Elasticsearch doesn't appear to have an access log.
Is there a place where I can find out which IP is hitting the cluster?
Elasticsearch doesn't provide any security out of the box, and that is on purpose and by design.
So you have a couple of solutions:
1. Don't leave your ES cluster exposed to the open world; put it behind a firewall (i.e. whitelist the hosts that can access ports 9200/9300 on your nodes).
2. Look into the Shield plugin for Elasticsearch in order to secure your environment.
3. Put an nginx server in front of your cluster to act as a reverse proxy.
4. Add simple basic authentication with either the elasticsearch-jetty plugin or simply the elasticsearch-http-basic plugin, which also allows you to whitelist the client IPs that are allowed to access your cluster.
If you want access logs, you need either 2 or 3, but all of the solutions above will let you secure your ES environment.
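For option 3, a minimal sketch of an nginx reverse proxy that records the client IP of every request, assuming Elasticsearch listens on localhost:9200 (the listen port and file paths are placeholders):

    # e.g. /etc/nginx/conf.d/es-proxy.conf (hypothetical path)
    server {
        listen 8080;

        # every request, including the client IP, is recorded here
        access_log /var/log/nginx/es_access.log;

        location / {
            proxy_pass http://127.0.0.1:9200;
        }
    }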
