I have a domain that uses two separate VirtualHost files: one for :80 and one for :443.
The :80 setup is pretty easy; its only job is to redirect to :443:
<VirtualHost *:80>
# This is the first host, so it's the default.
# So although I've specified a ServerName and ServerAlias, anything not specified elsewhere will also end up here.
ServerName www.domain.com
ServerAlias domain.com
ErrorLog ${APACHE_LOG_DIR}/error.log
CustomLog ${APACHE_LOG_DIR}/access.log combined
# Redirect everything to https:
RewriteEngine on
RewriteRule ^(.*)$ https://www.domain.com$1 [R=301,L]
</VirtualHost>
The :443 host simply needs to add www to the beginning of the URL if it is absent:
<VirtualHost *:443>
# This is the first host, so it's the default.
# So although I've specified a ServerName and ServerAlias, anything not specified elsewhere will also end up here.
ServerName www.domain.com
ServerAlias domain.com
ErrorLog ${APACHE_LOG_DIR}/ssl.error.log
CustomLog ${APACHE_LOG_DIR}/ssl.access.log combined
# Redirect everything which is not already on the real www domain name to that:
RewriteEngine on
RewriteCond %{HTTP_HOST} !www.domain.com
RewriteRule ^(.*)$ https://www.domain.com$1 [R=301]
ErrorDocument 404 /404.html
</VirtualHost>
There is one case in which these rewrites seem to fail:
https://domain.com should point to https://www.domain.com, but it points to https://www.domain.com%24/# . Obviously the trailing characters prevent the hostname from resolving.
What is causing this issue? I already had help creating these VirtualHost files, but it seems they are still not fully working as expected.
But I also want to rewrite my URLs to nicer ones. I think my rule is correct, and the rewrite block in :443 now looks like the following:
RewriteEngine on
RewriteCond %{HTTP_HOST} !www.domain.com
RewriteRule ^(.*)$ https://www.domain.com$1 [R=301]
RewriteRule ^subpage/(.+)/?$ subpage.html?$1 [NC]
This should rewrite
https://www.domain.com/subpage/2 -> https://www.domain.com/subpage.html?2, but it just points to my 404 page now.
It might be something obvious, but I'm not seeing my mistake.
NOTE: This doesn't solve the problem as stated here, but it does solve the underlying problem.
As this is a production environment (how awkward), my web server, which was pretty weak to start with, got flooded rather quickly. My company greatly increased my budget (we were expecting a lot of traffic, but had hoped it would build up slowly, which it didn't), so I was able to set up multiple servers and place two HAProxy load balancers in front of them. I used the HAProxy configuration to solve my problems:
frontend http
bind MY_IP:80
redirect prefix http://www.domain.com code 301 if { hdr(host) -i domain.com }
redirect scheme https code 301 if !{ ssl_fc }
frontend https
bind MY_IP:443 ssl crt /etc/haproxy/certs/domain.com.pem
redirect prefix https://www.domain.com code 301 if { hdr(host) -i domain.com }
reqadd X-Forwarded-Proto:\ https
default_backend app_pool
backend app_pool
balance roundrobin
redirect scheme https if !{ ssl_fc }
server app-1 MY_IP:80 check
server app-2 MY_IP:80 check
server app-3 MY_IP:80 check
server app-4 MY_IP:80 check
This will always redirect to the www.domain.com version and enforce HTTPS as well. Setting this up in HAProxy is, in my opinion, much easier than doing it with VirtualHost directives.
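For anyone who has to stay on plain Apache, a minimal sketch of the same www + HTTPS canonicalisation in the :443 VirtualHost could look like the block below. This is only a sketch using the placeholder domain from the question; note that %{REQUEST_URI} already carries the leading slash, so nothing else needs to be appended:
RewriteEngine on
# Anything not already on www.domain.com gets sent there, keeping the original path.
RewriteCond %{HTTP_HOST} !^www\.domain\.com$ [NC]
RewriteRule ^ https://www.domain.com%{REQUEST_URI} [R=301,L]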
My setup is done and it works. Is it the correct way?
I have a Windows server and I installed XAMPP on it. Different domains point to different IP addresses on the server, and every site on this server runs HTTPS. I went through a lot of tutorials and set up a self-signed cert for each site.
Then I configured the server with the settings below.
The config works, but I am not sure whether it is secure enough. I am afraid I missed something important.
I need the site to be reachable by the URLs below:
http://sitea.com (will redirect to https://sitea.com)
http://www.sitea.com (will also redirect to https://sitea.com)
https://sitea.com (this is great)
https://www.sitea.com (will be forced to the non-www version, https://sitea.com, because the program needs it)
My configuration is listed below. May I ask if it is good enough or if I missed something?
C:\xampp\apache\conf\extra\httpd-vhosts.conf:
<VirtualHost 192.168.242.121:80>
ServerName sitea.com
ServerAlias www.sitea.com
Redirect permanent / https://sitea.com/
</VirtualHost>
<VirtualHost 192.168.242.121:443>
DocumentRoot "S:/websites/sitea/"
ServerName sitea.com
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)(.*) [NC]
RewriteRule (.*) https://%2%{REQUEST_URI} [L,R=301]
SSLEngine on
SSLCertificateFile "ssl/sitea.com/server.crt"
SSLCertificateKeyFile "ssl/sitea.com/server.key"
AccessFileName .htaccess
ErrorLog "S:/websites/sitea/logs/error.log"
CustomLog "S:/websites/sitea/logs/access.log" common
<Directory S:/websites/sitea/>
Options FollowSymLinks
AllowOverride All
Require all granted
</Directory>
</VirtualHost>
<VirtualHost 192.168.242.120:80>
ServerName siteb.com
ServerAlias www.siteb.com
Redirect permanent / https://siteb.com/
</VirtualHost>
<VirtualHost 192.168.242.120:443>
DocumentRoot "S:/websites/siteb/"
ServerName siteb.com
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)(.*) [NC]
RewriteRule (.*) https://%2%{REQUEST_URI} [L,R=301]
SSLEngine on
SSLCertificateFile "ssl/siteb.com/server.crt"
SSLCertificateKeyFile "ssl/siteb.com/server.key"
AccessFileName .htaccess
ErrorLog "S:/websites/siteb/logs/error.log"
CustomLog "S:/websites/siteb/logs/access.log" common
<Directory S:/websites/siteb/>
Options FollowSymLinks
AllowOverride All
Require all granted
</Directory>
</VirtualHost>
C:\Windows\System32\drivers\etc\hosts:
192.168.242.121 sitea.com www.sitea.com
192.168.242.120 siteb.com www.siteb.com
Thank you!
Enabling HTTPS on a website does not stop website vulnerabilities; it only secures the data being transferred between the web server and the client, i.e. someone cannot eavesdrop on what the server and client are saying to each other. If a website has a vulnerability, people will still be able to exploit it.
In your Apache configuration it looks like some of your rules can be bypassed by accessing the website directly, i.e. by typing its IP address into a web browser. This could allow someone to bypass your mandated HTTPS, for example. You should set up a redirect or catch-all rule if you want to prevent this.
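One way to close that gap, sketched here with the first site's IP from the question (the ServerName is just a placeholder), is an explicit catch-all VirtualHost declared before the real one, so that requests arriving with a bare IP or an unknown Host header are refused instead of served:
<VirtualHost 192.168.242.121:80>
    # Hypothetical default vhost: because it is defined first for this IP:port,
    # any request whose Host header matches no other ServerName lands here.
    ServerName catchall.invalid
    <Location "/">
        Require all denied
    </Location>
</VirtualHost>
A similar block on port 443 (with a certificate attached) would cover direct HTTPS access by IP address as well.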
I have an apache2 server on Ubuntu 16.04 LTS and a Laravel 5.6 project called book_donation in /var/www/html. When I visit localhost the Laravel welcome screen appears, no problem, but whenever I click on any URL it gives:
Not Found
The requested URL /book_donation/public/login was not found on this
server. Apache/2.4.18 (Ubuntu) Server at 127.0.0.1 Port 80
and here is my .htaccess:
<IfModule mod_rewrite.c>
<IfModule mod_negotiation.c>
Options -MultiViews -Indexes
</IfModule>
RewriteEngine On
# Handle Authorization Header
RewriteCond %{HTTP:Authorization} .
RewriteRule .* - [E=HTTP_AUTHORIZATION:%{HTTP:Authorization}]
# Redirect Trailing Slashes If Not A Folder...
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_URI} (.+)/$
RewriteRule ^ %1 [L,R=301]
# Handle Front Controller...
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^ index.php [L]
</IfModule>
and here is my /etc/apache2/sites-available/000-default.conf:
<VirtualHost *:80>
# The ServerName directive sets the request scheme, hostname and port that
# the server uses to identify itself. This is used when creating
# redirection URLs. In the context of virtual hosts, the ServerName
# specifies what hostname must appear in the request's Host: header to
# match this virtual host. For the default virtual host (this file) this
# value is not decisive as it is used as a last resort host regardless.
# However, you must set it for any further virtual host explicitly.
#ServerName www.example.com
ServerAdmin webmaster@localhost
DocumentRoot /var/www/html
<Directory "/var/www/html">
AllowOverride all
Require all granted
</Directory>
# Available loglevels: trace8, ..., trace1, debug, info, notice, warn,
# error, crit, alert, emerg.
# It is also possible to configure the loglevel for particular
# modules, e.g.
#LogLevel info ssl:warn
ErrorLog ${APACHE_LOG_DIR}/error.log
CustomLog ${APACHE_LOG_DIR}/access.log combined
# For most configuration files from conf-available/, which are
# enabled or disabled at a global level, it is possible to
# include a line for only one particular virtual host. For example the
# following line enables the CGI configuration for this host only
# after it has been globally disabled with "a2disconf".
#Include conf-available/serve-cgi-bin.conf
</VirtualHost>
# vim: syntax=apache ts=4 sw=4 sts=4 sr noet
My PHP version is 7.1 and my Apache version is 2.4.18. How do I solve this?
As stated in the docs, you should configure your web server's document / web root to be the public directory.
Create a new file /etc/apache2/sites-available/book_donation.test.conf:
<VirtualHost *:80>
ServerAdmin webmaster@localhost
ServerName book_donation.test
ServerAlias www.book_donation.test
DocumentRoot /var/www/html/book_donation/public
ErrorLog ${APACHE_LOG_DIR}/error.log
CustomLog ${APACHE_LOG_DIR}/access.log combined
</VirtualHost>
Edit your /etc/hosts file and add the following:
127.0.1.1 book_donation.test
Then run:
sudo a2ensite book_donation.test.conf
sudo service apache2 restart
Then you should be able to visit book_donation.test and see your site.
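Depending on the rest of your Apache configuration, Laravel's .htaccess (quoted in the question) also needs mod_rewrite enabled and overrides allowed for the new document root. If that is not already covered by the existing /var/www/html block, a sketch of the extra directives for the new VirtualHost would be:
<Directory /var/www/html/book_donation/public>
    # Let Laravel's .htaccess take effect under the new document root.
    AllowOverride All
    Require all granted
</Directory>
together with sudo a2enmod rewrite if the rewrite module is not already enabled.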
I was facing the same problem: any new route was not working and gave a 404 error. php artisan route:list showed the new route details properly, but it still was not working. I also tried the following commands to make it work:
php artisan view:clear
php artisan config:clear
php artisan cache:clear
Here are my project details:
My DocumentRoot is /var/www/html/project/public. When I added a new route in web.php it was not working and gave a 404 error, while the root route was working fine; in other words, no new route was working. So I made the following changes and it started working.
I changed /etc/apache2/sites-available/project.conf:
<Directory /var/www/html/project/public>
    AllowOverride All
</Directory>
Then I enabled URL rewriting by running sudo a2enmod rewrite
and restarted the Apache server with sudo systemctl restart apache2
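To verify that both changes took effect, something along these lines can be run (/login is just an example path taken from the earlier error message; adjust the host name to match your vhost):
sudo apache2ctl -M | grep rewrite_module   # confirm mod_rewrite is loaded
curl -I http://localhost/login             # a non-root route should no longer return 404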
I am using Crafter CMS with multi-tenancy. I am trying to set up Apache 2.4 on RHEL 7 as a reverse proxy: http://site.example.com -> ajp://localhost:9009/?crafterSite=site
Here is my Apache2 virtual host configuration. I have ensured that mod_proxy and mod_rewrite are loaded. I can reach Crafter Delivery through the proxy, but the rewrite isn't working, as Crafter doesn't know which site I am trying to load. Does anyone have any suggestions on how to get this working?
<VirtualHost *:80>
ServerName site.example.com
LogLevel alert rewrite:trace3
RewriteEngine On
RewriteRule ^$ /?crafterSite=site [QSA,L]
<Proxy *>
Order allow,deny
Allow from all
</Proxy>
ProxyRequests Off
ProxyPreserveHost On
ProxyPass / ajp://localhost:9009/
ProxyPassReverse / ajp://localhost:9009/
</VirtualHost>
Try changing the rewrite rule to be:
RewriteRule (.*) $1?crafterSite=site [QSA,PT]
Where site is your site ID.
The differences are:
It rewrites anything coming in, regardless of the URL, and preserves it (see the (.*) and $1).
It is a passthrough (PT), not a redirect. This means it augments the request with the parameter and lets it go straight through to Crafter Engine.
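Put together with the proxy directives from the question, the whole VirtualHost might then look roughly like this (still a sketch; site remains a placeholder for the real site ID):
<VirtualHost *:80>
    ServerName site.example.com
    RewriteEngine On
    # Keep the original path, append the crafterSite parameter, and pass the
    # request through (PT) so the proxy mapping below still applies.
    RewriteRule (.*) $1?crafterSite=site [QSA,PT]
    ProxyRequests Off
    ProxyPreserveHost On
    ProxyPass / ajp://localhost:9009/
    ProxyPassReverse / ajp://localhost:9009/
</VirtualHost>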
I've been looking around and trying to find a solution to force HTTPS on an Apache reverse proxy behind an AWS ELB, without success.
My sites-enabled config file looks like this:
<VirtualHost *:80>
ServerAlias *.domain.net
RewriteEngine On
RewriteCond %{HTTP:X-Forwarded-Proto} http
RewriteRule https:// %{SERVER_NAME}%{REQUEST_URI} [R=301,L]
ProxyPass / http://{10.10.10.21}/
ProxyPassReverse / http://{10.10.10.21}/
</VirtualHost>
However, I never get any redirect back to the browser when I hit the server on port 80. The ProxyPass and ProxyPassReverse are kicking in, but not the redirect.
I can see that by enabling rewrite trace level 8.
I've been on this for too long now...
Any help will be greatly appreciated!
What I am trying to achieve is the following:
I want to have numerous subdomains such as abc.domain.com redirect to a URL such as www.domain.com/something?subdomain=abc
Since I am redirecting to a fully qualified domain, I needed to use a reverse proxy to avoid changing the URL in the browser (using the [P] flag and enabling mod_proxy and some other modules).
This is my DNS setup
*.domain.com. 14400 A 111.111.11.1
This is my virtual host configuration for apache
<VirtualHost 111.111.11.1:80>
ServerName www.domain.com
ServerAlias *.domain.com
DocumentRoot /var/www/html
ErrorLog /var/www/logs
UseCanonicalName off
RewriteEngine on
RewriteCond %{REQUEST_URI} !^/images
RewriteCond %{HTTP_HOST} !^www\.domain\.com$
RewriteRule ^(.+) %{HTTP_HOST}$1 [C]
RewriteRule ^([^.]+)\.domain\.com(.*) http://www.domain.com/something?subdomain=$1 [P,L]
</VirtualHost>
This setup is working fine (let me know if you think you can improve it, of course).
My main problem comes when I try to set up https://
This is my Apache virtual host configuration for HTTPS:
<VirtualHost 111.111.11.1:443>
ServerName www.domain.com:443
ServerAlias *.domain.com
DocumentRoot /var/www/html
SSLEngine on
SSLProtocol all -SSLv2
SSLCipherSuite ALL:!ADH:!EXPORT:!SSLv2:RC4+RSA:+HIGH:+MEDIUM:+LOW
SSLCertificateFile /etc/httpd/conf.d/cert/server.crt
SSLCertificateKeyFile /etc/httpd/conf.d/cert/server.key
<Directory "/var/www/cgi-bin">
SSLOptions +StdEnvVars
</Directory>
SetEnvIf User-Agent ".*MSIE.*" \
nokeepalive ssl-unclean-shutdown \
downgrade-1.0 force-response-1.0
CustomLog logs/ssl_request_log \
"%t %h %{SSL_PROTOCOL}x %{SSL_CIPHER}x \"%r\" %b"
RewriteEngine on
RewriteCond %{REQUEST_URI} !^/images
RewriteCond %{HTTPS_HOST} !^www\.domain\.com$
RewriteRule ^(.+) %{HTTPS_HOST}$1 [C]
RewriteRule ^([^.]+)\.domain\.com(.*) https://www.domain.com/something?subdomain=$1 [P,L]
</VirtualHost>
Whenever I call https://abc.domain.com, the response I get is the homepage, but no matter what I append after the subdomain, I get the same response. It's like the rewrite isn't kicking in.
Any help would be appreciated, or please share how you would set up a reverse proxy, rewrite, wildcard subdomains and SSL all together.
Thanks,
I have had this same problem as well. The only way I solved it was to put the different domains that need a secure connection on different Listen ports, because I was limited in IP addresses.
From my understanding, the problem is that with the HTTPS protocol the host name is not included in the request as the connection is made. So when the request reaches the server, Apache just uses the first match for the IP and port the connection was received on, because it does not know which domain was requested.
The only workaround for this is to have a different IP for each domain, or a different port.
Unfortunately you are out of luck using HTTPS with a wildcard domain setup; I don't believe there is any way to get it to work.
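For what it's worth, a rough illustration of the different-ports workaround described above (host names, document roots and certificate paths are all placeholders):
# Each certificate gets its own port, so Apache never has to choose an SSL
# vhost by host name. Add the Listen lines only if they are not already
# present in the main config.
Listen 443
Listen 8443
<VirtualHost 111.111.11.1:443>
    ServerName www.domain.com
    DocumentRoot /var/www/html
    SSLEngine on
    SSLCertificateFile /etc/httpd/conf.d/cert/domain.crt
    SSLCertificateKeyFile /etc/httpd/conf.d/cert/domain.key
</VirtualHost>
<VirtualHost 111.111.11.1:8443>
    ServerName www.otherdomain.com
    DocumentRoot /var/www/otherdomain
    SSLEngine on
    SSLCertificateFile /etc/httpd/conf.d/cert/otherdomain.crt
    SSLCertificateKeyFile /etc/httpd/conf.d/cert/otherdomain.key
</VirtualHost>
The second site is then reached as https://www.otherdomain.com:8443/, which is the trade-off for not having a separate IP address per domain.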