Rewrite User-Agent in apache2 .conf file on proxy

I am running Apache 2.4.38 on a Debian server and would like to be able to rewrite the User-Agent within the following .conf file:
<Location /user1>
Redirect 302 /user1 http://www.example.com/user1
ErrorDocument 403 /errors/403.html
</Location>
I would like the User-Agent to be modified so that it can be identified on the remote server the request is forwarded to. In the above .conf file the destination of the request is http://www.example.com/user1; on that destination server the request should appear with a custom User-Agent (overwriting the one that connected to my server), or at least have something appended to it.
Is this possible? If so, how would I go about it?
Thanks.
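For reference, here is a minimal sketch of one common way to do this with mod_proxy and mod_headers, assuming the request is actually proxied to the remote server rather than redirected (with a 302 the client makes a fresh request on its own, so nothing set here would reach the destination); the agent strings are placeholders:
<Location /user1>
# Proxy instead of redirect, so headers set here are forwarded to the remote server
ProxyPass http://www.example.com/user1
ProxyPassReverse http://www.example.com/user1
# Overwrite the client's User-Agent entirely (requires mod_headers)
RequestHeader set User-Agent "my-proxy-agent/1.0"
# ...or append a marker to whatever the client sent:
# RequestHeader edit User-Agent "^(.*)$" "$1 (via my-proxy)"
ErrorDocument 403 /errors/403.html
</Location>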

Related

My Laravel .env config file is downloadable via IP address! How do I solve this on a LiteSpeed server?

Hello, I am creating a new Laravel project on an OpenLiteSpeed server and I noticed a glaring security issue.
I've added the following rule to the .htaccess in the site root, and it works fine: it blocks anyone who tries to download the .env file via www.mywebsite.com/.env:
<Files ~ "\.(env|json|config.js|md|gitignore|gitattributes|lock)$">
Order allow,deny
Deny from all
</Files>
But to my surprise, the .env file can still be downloaded easily by accessing the server IP directly, e.g. 127.0.0.1/.env 😯
How do I solve this? I have access to edit httpd_config.conf.
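As a hedged sketch of one commonly suggested approach: OpenLiteSpeed generally honors Apache-compatible mod_rewrite rules but not the <Files>/Order/Deny directives used above, and whatever rule you use has to apply to the virtual host that actually answers requests made to the bare IP (often the default/Example vhost), otherwise the site's .htaccess is never consulted. A rewrite rule mirroring the pattern above could look like:
RewriteEngine On
# Return 403 for .env and the other sensitive extensions listed above
RewriteRule \.(env|json|config\.js|md|gitignore|gitattributes|lock)$ - [F,L]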

Moodle 3.5 with Reverse proxy

I'm having trouble setting up a Moodle instance behind an Apache proxy.
Here's my apache front-end that proxies to the running server.
<VirtualHost *:80>
ServerName public.domain.com
ProxyRequests Off
ProxyPreserveHost On
ProxyPass / http://10.10.10.10:81/moodle/
ProxyPassReverse / http://10.10.10.10:81/moodle/
</VirtualHost>
And in Moodle's config.php:
$CFG->wwwroot = 'http://public.domain.com';
The installation completes without problems, but when it's finished and I open http://public.domain.com in a browser,
it redirects to: http://public.domain.com/moodle/index.php?sessionstarted=1&lang=en...
Does anyone know what might be happening?
The best way to fix this issue is to move the Moodle installation on the internal host to the web server's root.
Move your Moodle on 10.10.10.10 so that it lives at / rather than at /moodle.
Note that if you use SSL on the external Apache (which is recommended), you should also add this line to your config:
$CFG->sslproxy = true;
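For reference, a hedged sketch of what the SSL-terminating front-end VirtualHost might then look like (hostname, certificate paths, and backend address are placeholders to adapt):
<VirtualHost *:443>
ServerName public.domain.com
SSLEngine on
SSLCertificateFile /etc/ssl/certs/public.domain.com.crt
SSLCertificateKeyFile /etc/ssl/private/public.domain.com.key
ProxyRequests Off
ProxyPreserveHost On
ProxyPass / http://10.10.10.10:81/
ProxyPassReverse / http://10.10.10.10:81/
</VirtualHost>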
I have finally been able to fix the problem. I am writing this answer with a greater level of detail so that other people who hit this problem can follow it.
First we need to edit the Apache2 config for our site.
In general, the Apache2 configuration for your site can be found under /etc/apache2/sites-enabled. Depending on whether you are using HTTP or HTTPS, you need to edit the corresponding configuration file; the default name for HTTP is 000-default.conf and for HTTPS 000-default-ssl.conf.
Add the following lines inside the <VirtualHost *:80> ... </VirtualHost> section.
# MOODLE
ProxyRequests Off
ProxyPreserveHost On
ProxyPass "/" "http://10.10.10.10:81/moodle/"
ProxyPassReverse "/" "http://10.10.10.10:81/moodle/"
Then we need to restart the Apache2 web server; this can be done with the command service apache2 restart.
Now we also need to edit a few things in Moodle's config.php file. This file can be found at /var/www/html/moodle on the server with the IP (in this case) 10.10.10.10, if you used the default install location from the Moodle guides.
In config.php we append the following lines below the default $CFG declarations. Please make sure to change all values according to your server setup.
$CFG->wwwroot = 'http://public.domain.com';
$CFG->dirroot = '/var/www/html/moodle';
$CFG->reverseproxy = true;
//$CFG->sslproxy = true; //UNCOMMENT this line if you are using SSL
Attention: if you are not serving Moodle from the root directory of your public web server, make sure that you do not use the same directory name that Moodle uses on the backend server. For example, http://public.domain.com/moodle will fail if Moodle is installed on the backend at /var/www/html/moodle, because the two paths are identical and the proxy ends up looping. My easy fix for this problem was to just move the Moodle installation to /var/www/html/moodley, including all required changes in config.php. This fixed every problem I had.

Sending files over ftp using Apache Camel

I've been trying to copy a file from a local directory to a remote directory. The problem is that I don't know how to specify the address of the destination, which is another computer. Can someone please show an example of an FTP URL with an IP address? It would be helpful if the URL also included the user name, password, port and a specific path. Thanks.
It's easy; read the "URI format" section of the Camel FTP manual (http://camel.apache.org/ftp2.html).
According to Camel documentation:
URI format
ftp://[username@]hostname[:port]/directoryname[?options]
sftp://[username@]hostname[:port]/directoryname[?options]
ftps://[username@]hostname[:port]/directoryname[?options]
Just replace the hostname with the remote IP.
The URL can also include the password:
ftp://user:password@192.168.10.20:21/dir/subdir
or you can define user and password in options:
ftp://192.168.10.20:21/dir/subdir?user=user&password=password

NGrok and Laravel

I want to deploy my local Laravel website online with ngrok.
I'm using Laragon with an Apache server, and I use this command:
ngrok http -host-header=rewrite site.dev:80
It almost works, but the asset files (like CSS/images) still link to my local server (site.dev). And it's the same for my links; the Laravel routing helper
{{ route('ngo') }} returns site.dev/ngo instead of my online tunnel (http://number.ngrok.io/ngo)
I've tried to:
Edit the http.conf (https://forum.laragon.org/topic/88/allow-outside-other-devices-phones-tablets-to-access-your-local-server-using-ngrok)
Change my Laravel App url in config/app.php
Change my url in .env file
Nothing works.
I ran into this problem myself just now, but also found a way to fix it:
Run the ngrok command without the -host-header=rewrite part, resulting in
ngrok http site.dev:80
After this, edit your http.conf file and add the ngrok domain as a server alias. For example:
ServerAlias nd3428do.ngrok.io
The problem is that Laravel's route helper uses the Host header, which was being rewritten to site.dev. Without the -host-header part the header isn't rewritten, and now it works.
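For clarity, a sketch of roughly where that alias goes in the Apache vhost (the ngrok hostname and DocumentRoot are placeholders for whatever ngrok prints and wherever your Laragon project lives):
<VirtualHost *:80>
DocumentRoot "C:/laragon/www/site/public"
ServerName site.dev
# Placeholder ngrok hostname; replace with the one ngrok prints
ServerAlias nd3428do.ngrok.io
</VirtualHost>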

Let's Encrypt Error "urn:acme:error:unauthorized"

I use Let's Encrypt and get the error:
urn:acme:error:unauthorized :: The client lacks sufficient authorization :: Error parsing key authorization file: Invalid key authorization: malformed token
I try: sudo service nginx stop
but get the error: nginx service not loaded
So I had a lot of trouble with this stuff. Basically, the error means that certbot was unable to find the file it was looking for when testing that you own the site. This has a number of potential causes, so I'll try to summarize, because I encountered most of them when I set this up. For more reference material, I found the GitHub readme much more useful than the docs.
First thing to note is that the nginx service needs to be running for the ACME authorization to work. It looks like you're saying it's not, so start by spinning that up.
sudo service nginx start
With that going, everything here is based on the file location of the website you're trying to create a certificate for. If you don't know where that is, it will be in the relevant configuration file under /etc/nginx, which depends largely on your version of NGINX, but is usually /etc/nginx/nginx.conf, /etc/nginx/sites-enabled/[site-name], or /etc/nginx/conf/[something].conf. Note that the configuration file (or at least its directory) should be listed in /etc/nginx/nginx.conf, so you might start there.
This is an important location, because it is the folder that certbot needs to modify. certbot creates some files in a nested folder structure so that the URL it then requests returns the data from those files. The folder it tries to create sits under the root directory you give it, at:
/.well-known/acme-challenge
It will then try to create a file with an obscure name (I think it's a GUID) and read that file back via a URL, something like:
http://example.com/.well-known/acme-challenge/abcdefgh12345678
This is important, because if your root directory is poorly configured, the URL will not match the folder and the authorization will fail. And if certbot does not have write permissions to the folders when you run it, the file will not be created, so the authorization will also fail. I encountered both of these issues.
Additionally, you may have noticed that the above URL is http, not https. This is also important. I was using an existing encryption tool, so I had to configure NGINX to serve the /.well-known folder tree on port 80 instead of 443 while still keeping most of my data behind the secure HTTPS URL. These two things make for a somewhat complicated NGINX file, so here is an example configuration to reference.
server {
listen 80;
server_name example.com;
location '/.well-known/acme-challenge' {
default_type "text/plain";
root /home/example;
}
location '/' {
return 301 https://$server_name$request_uri;
}
}
This allows port 80 for everything related to the certbot challenges, while retaining security for the rest of my website. You can modify the directory permissions to ensure that certbot has access to write the files, or simply run it as root:
sudo ./certbot-auto certonly
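For reference, a webroot-style invocation matching the example configuration above might look like this (the path and domain are placeholders):
sudo ./certbot-auto certonly --webroot -w /home/example -d example.com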
After you get the certificate, you'll have to set it up in your config as well, but that's outside the scope of this question, so here's a link.
