Use Let's Encrypt for multiple domains in Lighttpd

It seems to me that a lot of guides and info about Lighttpd and Let's Encrypt are outdated (5+ years old).
I have a newly installed Ubuntu 20.04 server with Lighttpd and 20+ virtual domains on one IP.
How do I install Let's Encrypt certificates in this setup in an easy way and keep them updated?
I was using the Hiawatha web server before, where this was very easy since the creator included a script that parsed all domains out of the config file and requested new certs for them.
I have found guides for single domains where you need to merge files (to create .pem files), but how is this done for multiple domains in a good way?
I'm also pointing some domains to the same file: example1.com and example2.com show the same file (/var/www/example.com/public/index.html).
I find it confusing that I can't seem to find this information easily. Most info about Lighttpd seems to be old.

I find it confusing that I can't seem to find this information easily. Most info about Lighttpd seems to be old.
Had you focused on primary sources of information, you would have easily found the official lighttpd documentation at https://wiki.lighttpd.net/:
lighttpd TLS documentation
lighttpd TLS Let's Encrypt
I have found guides for single domains where you need to merge files (to create .pem files), but how is this done for multiple domains in a good way?
This is no longer needed (though still supported) since lighttpd 1.4.53, which introduced ssl.privkey alongside ssl.pemfile.
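For illustration, here is a minimal sketch of what that can look like for two of the domains from the question, assuming certbot-issued certificates in the default /etc/letsencrypt/live/ layout (the paths and domain names are assumptions):

    # lighttpd >= 1.4.53: ssl.privkey lets you point at certbot's
    # fullchain.pem/privkey.pem directly, no merged .pem file needed
    $SERVER["socket"] == ":443" {
        ssl.engine  = "enable"
        # default certificate (used when no host block below matches)
        ssl.pemfile = "/etc/letsencrypt/live/example1.com/fullchain.pem"
        ssl.privkey = "/etc/letsencrypt/live/example1.com/privkey.pem"
    }

    # per-host certificate, selected via SNI
    $HTTP["host"] == "example2.com" {
        ssl.pemfile = "/etc/letsencrypt/live/example2.com/fullchain.pem"
        ssl.privkey = "/etc/letsencrypt/live/example2.com/privkey.pem"
    }

Renewal then never touches the lighttpd config: a certbot renew cron job followed by a lighttpd reload keeps all 20+ domains current.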

Related

Unable to set up custom domain on Heroku using Google Domains DNS?

To preface this, I am new to backend web development, so I'm coming at this totally clueless. My past experience is with Netlify, which makes it pretty seamless to add a custom domain to a website with their free DNS service.
To start, I am working on a Flask application that ideally I would like to put on a subdomain of my website (i.e. app.my-website.whatever, not actually my real domain since it includes my real name) on a different host, in this case Heroku, while keeping my main website (www.my-website.whatever) on Netlify. This required me to switch from using Netlify's DNS to using the DNS tools provided by Google Domains.
After deploying the app on the free domain, which went just fine, I tried setting up my domain for the website, following these steps:
On my website's dashboard, I went to Settings > Domains > Add domain and under domain entered app.my-website.whatever, including the subdomain of course.
Copied the DNS Target Heroku gave me.
At my dashboard for my-website.whatever at domains.google.com, under DNS > Default name servers > Resource records, I added a custom record with the hostname app.my-website.whatever (Google Domains automatically adds the .my-website.whatever), type CNAME, TTL 600, and in the Data field I pasted the DNS Target.
In my Ubuntu (WSL) terminal, when I type host app.my-website.whatever, the output says app.my-website.whatever is an alias for {bunch-of-random-characters}.herokudns.com.
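For reference, the record created at domains.google.com corresponds to a zone-file entry roughly like the following (the CNAME target is a placeholder for the actual DNS Target copied from Heroku):

    ; sketch only: hostname and target are placeholders
    app.my-website.whatever.  600  IN  CNAME  random-chars.herokudns.com.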
Unfortunately, this has not been successful. When I try to visit the domain, I usually get an error such as DNS_PROBE_FINISHED_NXDOMAIN or alternatively ERR_SSL_UNRECOGNIZED_NAME_ALERT. I've also tried the same thing with just www.my-website.whatever, and the same issues occur.
When I try to visit the site, most browsers will automatically append https://, which I would assume doesn't quite work since I do not have a cert set up for my site, which I need to do manually.
Does the above error mean that there is a problem related to SSL, or is it something else? Is it because my browser forces https:// that I cannot see anything changing (i.e. would http:// work?)?
From what I can tell, I should be able to do all of this on the free tier, but I have some confusion about a few details and feel like I could be missing some other things:
Do I need a certificate/SSL for my custom domain to work at all with Heroku?
If it could possibly be an easier solution: Is there a better alternative to Heroku in my case?
With regards to setting up the cert, I tried following the tutorial here:
https://medium.com/@bantic/free-tls-with-letsencrypt-and-heroku-in-5-minutes-807361cca5d3
For certbot, as the tutorial explains, you are given two strings like so: <long-string>.<other-long-string>, and you need to serve a file at /.well-known/acme-challenge/ with the name <long-string> (no extension). As an unrelated issue, I cannot get Flask to serve this file, even on a local dev server; I just get a 404 message, which the certbot utility also reports. I can create another file, such as a simple .txt file, in that same directory, and it will serve just fine.
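For what it's worth, the usual way to serve those extensionless challenge files from Flask is an explicit route rather than relying on the static folder. A minimal sketch, where the challenge directory location is an assumption:

    # Sketch: serve HTTP-01 challenge files from a local directory.
    # CHALLENGE_DIR is an assumed location; adjust to your layout.
    import os
    from flask import Flask, send_from_directory

    app = Flask(__name__)
    CHALLENGE_DIR = os.path.join(app.root_path, ".well-known", "acme-challenge")

    @app.route("/.well-known/acme-challenge/<token>")
    def acme_challenge(token):
        # send_from_directory rejects paths that escape CHALLENGE_DIR
        return send_from_directory(CHALLENGE_DIR, token)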
I'll admit, these issues feel a bit basic, but I genuinely am lost, and none of the guides or posts I see online seem to have any remedy or explanation for what is happening here.
If there is any more information I should share, please let me know.

How to allow external custom domains to run a Laravel app on my server?

My app is a Laravel app, running on Nginx, provisioned by Forge, and SSL certificates are provided by CloudFlare.
It is hosted at a URL like https://www.myapp.com
My app’s customers are businesses, and already own their domains:
https://www.customer1.com
https://www.customer2.com
https://www.customer3.com
etc.
I want my customers to run MyApp from the sub-domains of their choice:
https://some-name.customer1.com
https://some-other-name.customer2.com
https://any-name-they-want.customer3.com
etc.
My customers should not install anything; MyApp still runs on myapp.com, not on their servers.
My customers should only (if possible) modify their DNS, probably adding a CNAME like "some-name" that points to "myapp.com".
I followed this amazing article: Dynamic custom domain routing in Laravel.
However, I can't get it to work in an HTTPS (SSL) environment. The browser returns:
This site can’t provide a secure connection
some-name.customer1.com uses an unsupported protocol.
ERR_SSL_VERSION_OR_CIPHER_MISMATCH
The client and server don't support a common SSL protocol version or cipher suite.
How should Nginx and/or SSL certificates be configured?
This is still not a very simple problem.
However, Caddy does generate SSL certificates automatically (if replacing Nginx with Caddy is an option for you).
You can check the documentation for more details.
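To give an idea of what this looks like, here is a Caddyfile sketch (Caddy v2) using on-demand TLS, so a certificate is obtained the first time a customer subdomain is requested; the ask endpoint and the upstream port are assumptions:

    # Global options: only issue certificates for domains that the
    # ask endpoint (your app) confirms belong to a customer.
    {
        on_demand_tls {
            ask http://localhost:8000/check-domain
        }
    }

    # Catch-all HTTPS site: any hostname a customer points at the
    # server is terminated here and proxied to the app.
    https:// {
        tls {
            on_demand
        }
        reverse_proxy 127.0.0.1:8080
    }

The ask endpoint matters: without it, anyone could point a stray domain at your server and make Caddy request certificates for it.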

How do you deploy (using Roots Trellis) to a domain that has CloudFlare proxying it?

I built a site using Roots' Trellis that now uses CloudFlare, thus proxying traffic, which, understandably, prevents Trellis from deploying via MyExample.com.
I'm aware I can connect via the IP or an un-proxied CNAME (e.g. ssh.MyExample.com), but I am unclear about which file(s) to edit in Trellis so the deploy uses the IP or un-proxied domain.
It seems that editing the /hosts/production file would do the trick, but the rest of the Roots ecosystem depends on the values in these files, and I'm afraid that re-running the deploy will corrupt the server. This has been my experience with similar issues in the past.
Can anyone confirm the steps to achieve this?
Edit trellis/hosts/production at lines 5 and 8, below [production] and [web], and update them with the un-proxied domain (e.g. ssh.myexample.com).
Save, and commit your change in Git.
Navigate to the trellis/ directory and run your deploy using the proxied domain, e.g. ./bin/deploy.sh production myexample.com
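For orientation, the edited inventory might look roughly like this (a sketch; the exact line layout of hosts/production varies between Trellis versions):

    # trellis/hosts/production (Ansible inventory), sketch only
    [production]
    ssh.myexample.com

    [web]
    ssh.myexample.com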

How to set up SSL on a Magento multisite

We just moved a bunch of our websites from one server to another (obviously changing their IP addresses in the process), some of which were multisites in Magento. The domains are not parked, but the multisites work. I don't completely understand how Magento handles multisites, but that's not necessarily what my question is about.
When we moved the websites, the multisites were broken. Eventually, we hired a freelancer to fix them. Last night, I put the entire website package (multisites included) on their own dedicated IP addresses. They're on the same server, in the same place, but they have their own IP address on that server. I just walked in Monday morning and SSL is broken on all the multisites, but works on the main website. Can anybody tell me what to do here? I have access to the certificate, bought through a third party. When I try to list the certs in cPanel, it just lists the main website as a "controlled certificate". My question is: why did these work on the original IP address after being transferred to the new server? And how do I set up SSL on the multisites? I have cPanel, but I'm actually an admin who has worked WITHOUT cPanel for many years (not in web hosting), so I don't know much about SSL.
It depends on what certificate you have and on the server config. The only thing you should check in Magento is that the secure URL is https and that it is enabled for each site in the admin. The certificate setup depends on what type of certificate you are using. If it's a wildcard/UCC certificate that covers all domains/subdomains, then you simply install the certificate and the issuing authority's certificate, set up the vhosts, and it should work for all sites. If it's individual certificates per website, you will need to install each certificate onto the server, but this can be complex.
You also need to configure the Apache vhost for each site so there is a host listening on port 443 for each one; there should be SSL directives pointing to the certificate files. Check the vhost for the working site and compare it to the others to see if anything is missing. With individual certificates (i.e. one per domain), you normally need multiple IPs on your server, one per domain.
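As a reference point, a per-site SSL vhost looks roughly like this (a sketch with a placeholder domain and paths; compare against the working site's vhost rather than copying this verbatim):

    # Sketch: per-site Apache SSL vhost; ServerName, DocumentRoot and
    # certificate paths are placeholders
    <VirtualHost *:443>
        ServerName childsite.example.com
        DocumentRoot /var/www/magento
        SSLEngine on
        SSLCertificateFile      /etc/ssl/certs/childsite.example.com.crt
        SSLCertificateKeyFile   /etc/ssl/private/childsite.example.com.key
        SSLCACertificateFile    /etc/ssl/certs/issuer-ca-bundle.crt
    </VirtualHost>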
This issue got complex because our server was set up using cPanel, which means it had a bunch of pre-installed programs (like, for example, sendmail, dovecot, etc.). The program that was giving me grief over this issue was suPHP. I couldn't figure out how to make the multisites work independently of the parent website. So, say I have www.frattoys.com as the parent and a bunch of child sites that pull from it, like frattanks.com, irontap.com, etc. Those child websites share source code with frattoys.com. suPHP wasn't allowing frattanks and irontap to pull code from frattoys if they were independent websites, with independent cPanel and user accounts. That's why I was trying to install SSL on top of addon domains or subdomains; hence the question. As it turned out, I ended up uninstalling suPHP and replacing it with FastCGI. That way, I could set the permissions to what they needed to be, share the code with frattoys, and install SSL on the child websites without too many issues.
The eventual solution to my problems was to install FastCGI (uninstalling suPHP), create independent users for each child website, and install SSL certificates from WHM for each child user.

Can gitlab be installed with Cherokee web server?

I've looked all over and can't figure out if you can use Cherokee instead of Apache or Nginx for GitLab. I'd rather not run multiple web servers (and I imagine they would conflict anyway). I'm giving this a shot on Ubuntu Server 12.10.
For the record, I've already installed GitLab with this guide up to the Nginx section (with all default settings other than passwords, email addresses, and hostname). I'd like to install GitLab at git.mydomain.com, and I would prefer the local server files to be located at /var/www/git.mydomain.com, as I keep all of my domains under /var/www/.
Since you already have all of the Ruby config done, you just need to hook Cherokee up for hosting RoR by following this guide: http://cherokee-project.com/doc/cookbook_ror.html
My only problem turned out to be an issue with Ruby. Once that was resolved, I set up GitLab to use a port (though sockets should work too). Everything seems to work pretty well, except for an issue with pushing over HTTPS, but that might have something to do with my local Eclipse/eGit install.
So yes, GitLab will work with Cherokee.
