Configure Varnish on a different server - Magento

This is my situation:
I have two dedicated servers, one with a Magento installation and the other with nothing on it.
I would like to dedicate most of the second server's RAM to Varnish.
I can't find an example of how to run Varnish alone on one server and have it handle requests for mysite.com, which lives on the first server.
Is this possible, or must Varnish run on the same server?

It's completely possible to have Varnish on a separate server.
Specify the IP address and port where your Magento web server resides in the backend definition below (the values shown are placeholders):
# Default backend definition. Set this to point to your content server.
backend default {
    .host = "192.0.2.10";   # replace with the IP of the Magento server
    .port = "8080";         # replace with the port the Magento web server listens on
}
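On the Varnish box itself you then start Varnish listening on the public HTTP port and loading that VCL. A minimal sketch of the launch command (the VCL path and cache size are placeholders for your setup):

```shell
# -a: listen publicly on port 80
# -f: load the VCL file containing the backend definition above
# -s: give the cache most of the machine's RAM (48G is just an example)
varnishd -a :80 -f /etc/varnish/default.vcl -s malloc,48G
```

With that in place, point mysite.com's DNS (or your load balancer) at the Varnish server so requests hit the cache first and only misses go to the Magento server.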

Related

Laravel + nginx + Subdomain + Load balancer

My wildcard subdomains are not working when I am using a load balancer. I have edited the nginx config so the domain is .xxx.com on both the load balancer and both of my app servers. The servers are setup using Forge.
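For reference, a host entry of that form in nginx looks like the following sketch (the root path is a placeholder; a leading dot matches both the bare domain and every subdomain):

```nginx
server {
    listen 80;
    # ".xxx.com" is shorthand for "xxx.com *.xxx.com"
    server_name .xxx.com;
    root /home/forge/xxx.com/public;
}
```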
When I visit a subdomain, the app interprets it as the main domain. For example, visiting subdomain.xxx.com shows me the homepage of xxx.com, and visiting subdomain.xxx.com/blog shows me xxx.com/blog (which is a 404). The URL in the browser also changes and doesn't include the subdomain.
The same code works on my staging server, which leads me to believe that the load balancer is causing the issue. I don't have a LB on the staging server.
I have restarted nginx, cleared the route and config cache.
Looking at the request in Telescope, I see that the host is set to the main domain (not the subdomain).
Why is the subdomain not working when using a load balancer?
It turns out the DNS hadn't propagated yet. Weird result.

Serving a website using Caddy

I have created an application and want to serve it using Caddy.
On my localhost, if I run the application on 127.0.0.1:9000 and set it as the proxy target in the Caddyfile, it works. I figured I have to serve my website the same way in production.
Now I am trying to serve it on my EC2 instance. I tried serving it on localhost, on 127.0.0.1, and even on the domain directly, but Caddy does not work here. One of the things I noticed is that the URL automatically changes from http to https, which I figure means Caddy is at least running and recognizing the request, but it is not actually able to find the content.
Below is my Caddyfile.
abc.xyz.com {
    proxy / zbc.xyz.com:9000 {
        transparent
    }
}
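For reference, `proxy` with `transparent` is Caddy v1 syntax; if the server is running Caddy v2, the equivalent Caddyfile would be the following sketch (assuming the app listens locally on port 9000 on the same instance):

```caddyfile
abc.xyz.com {
    # reverse_proxy replaces v1's "proxy ... { transparent }";
    # it passes Host and sets X-Forwarded-* headers by default
    reverse_proxy 127.0.0.1:9000
}
```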

Separate frontend and backend with Heroku

I have an application, let's call it derpshow, that consists of two repositories, one for the frontend and one for the backend.
I would like to deploy these using Heroku, preferably on the same domain. I would also like to use pipelines for both parts separately, with a staging and production environment for each.
Is it possible to get both apps running on the same domain, so that the frontend can call the backend on /api/*? Another option would be to serve the backend on api.derpshow.com and the frontend on app.derpshow.com, but that complicates security somewhat.
What are the best practices for this? The frontend is simply static files, so it could even be served from S3 or similar, but I still need the staging and production environments, automatic testing, and so on.
Any advice is greatly appreciated!
For what you are trying to do, you should use a web server to serve the static content and to proxy requests to the container (Gunicorn, Tomcat, etc.) holding your app. This is also best practice.
Assume you use nginx as the web server, because it's easier to set up. The nginx config file would look like this:
# Server definition for project A
server {
    listen 80;
    server_name derpshow.com www.derpshow.com;

    location / {
        # Proxy to Gunicorn.
        proxy_pass http://127.0.0.1:<projectA port>;
        # etc...
    }
}

# Server definition for project B
server {
    listen 80;
    server_name api.derpshow.com www.api.derpshow.com;

    location / {
        # Proxy to Gunicorn on a different port.
        proxy_pass http://127.0.0.1:<projectB port>;
        allow 127.0.0.1;
        deny all;
        # etc...
    }
}
And that's it.
OLD ANSWER: Try using nginx-buildpack; it allows you to run NGINX in front of your app server on Heroku. Then run your apps on different ports, map one to api.derpshow.com and the other to app.derpshow.com, and restrict calls to api.derpshow.com to localhost only.
Would just like to contribute what I recently did. I had a NodeJS w/ Express backend and a plain Bootstrap/vanilla JS frontend (using just XMLHttpRequest to communicate). To connect the two, you can simply tell Express to serve static files (i.e. serve requests for /index.html, /img/pic1.png, etc.).
For example, to tell Express to serve the assets in directory test_site1, simply do:
app.use(express.static('<any-directory>/test_site1'));
Many thanks to this post for the idea: https://www.fullstackreact.com/articles/deploying-a-react-app-with-a-server/
Note that all these answers appear to be variations on merging the code so that it is served by one monolithic server.
Jozef's answer adds an entire nginx server in front of everything (both the frontend and the backend) to reverse-proxy requests.
My answer is about letting your backend server serve frontend requests; I am sure there is also a way to let the frontend server serve backend requests.

Redirect :80 to :443 (http to https) with Wakanda Server

I've set up a Wakanda server hosted on an Amazon EC2 instance, with SSL certificates installed as per the Wakanda documentation. Accessing the home page via https works easily enough, but incoming traffic on port 80 won't redirect to 443 automatically.
Being on an Amazon AWS instance with an Elastic IP, I've tried setting up a load balancer to handle the traffic routing for me as a possible solution. While it reports that it's routing "Load Balancer Port = 80" to "Instance Port = 443", it doesn't seem to be redirecting traffic either.
I may be missing something entirely in the way the load balancer is supposed to work, but is there a way for Wakanda Server to automatically route incoming http traffic to https? Edit: I have also tried setting up a .htaccess file in my webFolder directory to manually redirect traffic, though I'm finding very limited documentation on whether that is even a viable option.
Thanks!
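One common approach, independent of Wakanda, is to put a small HTTP listener on port 80 whose only job is to redirect. Sketched here with nginx in front (the server name is a placeholder):

```nginx
server {
    listen 80;
    server_name example.com;
    # Redirect all plain-HTTP traffic to HTTPS, preserving host and path
    return 301 https://$host$request_uri;
}
```

Note also that a classic ELB listener mapping load balancer port 80 to instance port 443 just forwards plain HTTP to the HTTPS port; it does not issue a redirect, which would explain the behaviour described above. The usual fix is to forward LB port 80 to an instance port that serves the redirect.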

Enable page caching on Nginx

I have a CDN for my website that uses Nginx and Drupal.
In my nginx configuration, I am trying to enable page-level caching so requests like "website.com/page1" can be served from the CDN. Currently, I am only able to serve static files from the CDN (GET requests on 'website.com/sites/default/files/abc.png').
All page-level requests always hit the back-end web server.
What nginx config should I add in order for "website.com/page1" requests to also be served from the CDN?
Thanks!
If I understand you correctly, you want to set up another Nginx instance that works as a basic CDN in front of your current web server (Nginx or Apache?) on which Drupal resides. You need a reverse-proxying Nginx server to cache both static assets and pages. Since what you wrote isn't entirely clear to me, this is what I've assumed.
If you want a setup like this, then you should read up on how to set up Nginx as a caching reverse proxy.
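A minimal sketch of such a caching reverse proxy, assuming the Drupal backend listens on 127.0.0.1:8080 and the cache path is a placeholder:

```nginx
# Goes at the http level: a disk cache with a 10 MB key zone,
# where unused entries expire after 60 minutes
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=drupal_cache:10m inactive=60m;

server {
    listen 80;
    server_name website.com;

    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_cache drupal_cache;
        # Cache successful page responses for 10 minutes
        proxy_cache_valid 200 10m;
    }
}
```

In practice Drupal's session cookies and Cache-Control headers also influence what nginx will cache, so caching anonymous pages usually needs additional proxy_ignore_headers / proxy_no_cache tuning on top of this.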
