I want nginx to prevent users who have folders inside /webroot/uploads
e.g.
/webroot/uploads/user1
/webroot/uploads/user2
/webroot/uploads/user999
from executing any shell script or binary (php, pl, py).
Malicious code is often hidden in jpg or gif files, like badfile.php.jpg.
I also see malicious binary files being uploaded to the folder.
Here are my preliminary rules:
location ~ /webroot/uploads/(.+)\.php$ {
    deny all;
}
location ~ /webroot/uploads/(.+)\.pl$ {
    deny all;
}
But I am not sure it is robust enough, so I would appreciate your help.
nginx doesn't have CGI support for exactly this reason: by default, people can't upload random scripts or executables and then run them.
If you have a FastCGI bridge which executes files, check its configuration and whether you can deny the webroot/uploads directory.
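For instance, a minimal sketch (the /uploads/ URL prefix is an assumption): a ^~ prefix location takes precedence over regex locations such as location ~ \.php$, so requests under it never reach the FastCGI backend at all.
location ^~ /uploads/ {
    # "^~" beats regex locations, so nothing here reaches the PHP handler;
    # additionally deny the dangerous extensions outright
    location ~* \.(php|pl|py|cgi|sh)$ {
        deny all;
    }
}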
You could also force uploaded files to not have the execute bit set, though (depending on who is running the files, see below) that may not help. You can do this with something like upload_store_access user:rw (see the HttpUploadModule documentation for details).
One last point is a vulnerability by misconfiguration, through which someone could have random files (not ending in .php) executed by the PHP handler. Follow this article for the details and the correct configuration.
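In case that link goes stale: the classic misconfiguration of this kind is the cgi.fix_pathinfo path-info issue. A sketch of the usual nginx-side mitigation (the FPM socket path is an assumption); setting cgi.fix_pathinfo=0 in php.ini gives the same protection on the PHP side:
location ~ \.php$ {
    # refuse requests where the .php file doesn't actually exist,
    # defeating tricks like /uploads/badfile.jpg/x.php
    try_files $uri =404;
    include fastcgi_params;
    fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    fastcgi_pass unix:/var/run/php-fpm.sock;  # socket path is an assumption
}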
How is it advised to alternate between online and local development, since you want to modify your websites locally?
Do you systematically change all URLs (by search/replace) in your project code to fit the local URL scheme, and sometimes create a personal SSL certificate for https, or do you use another solution like localhost aliases, rewrite rules, or online development tools?
What could be an automatic solution to avoid these tedious modifications? Search/replace sometimes looks quite primitive and time-costly, since I develop during the few hours left after my main work.
What workflows do you use to facilitate development?
Have a nice day,
For all the beginners, here's the thing.
I've created a config.php file which contains constants: one config file for the local project folder and one for the online server folder.
Inside this config file, I've created a constant (constants are then available everywhere in the project) to define the main URL of the project, e.g.:
define('CST_MAIN_URL', 'http://www.myproject.com/'); // for the online config.php file
define('CST_MAIN_URL', 'http://localhost:8888/'); // for the local config.php file
Thus, each header or redirection can work with that constant, like:
header('Location: ' . CST_MAIN_URL . 'index.php');
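If you'd rather not maintain two config files at all, a minimal sketch of an automatic variant is to pick the base URL from the host name (the host names here are assumptions):
<?php
// one config.php for both environments: choose the base URL
// from the request's host name (host names are assumptions)
if ($_SERVER['HTTP_HOST'] === 'localhost:8888') {
    define('CST_MAIN_URL', 'http://localhost:8888/');
} else {
    define('CST_MAIN_URL', 'http://www.myproject.com/');
}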
Then, some things have to be done with the RewriteEngine in your .htaccess file, for instance whenever you must adjust the behavior of MAMP/WAMP when a question mark or a trailing slash gets in the way. But, unfortunately, you need at least a basic understanding of regular expressions to master those URL rewritings.
Hope it helps.
The trouble is, while the PL/SQL procedures do generate HTML, I cannot make the image folder work. That is, when I try to insert an IMG tag, the browser shows that it can't find the file in the /xxx/img folder.
I tried to redefine DocumentRoot in httpd.conf - it works only on that folder itself, not recursively.
I tried to change DOCUMENT_ROOT in dads.conf - it doesn't work at all.
So the question is, how can I make images deep inside that root folder show up?
At last I have found an answer and the reason for this behavior.
The reason is Oracle's hand-made handler, pls_handler, which is used for any DAD, set up as an Apache Location.
When I created folders for storing images, like $ORACLE_HOME/htdocs/myapp/img, they fell under this directive:
<Location /myapp>
    SetHandler pls_handler
    # lots of stuff
</Location>
And thus, anything under the $ORACLE_HOME/htdocs/myapp folder was processed as a PL/SQL procedure.
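One way out, as a sketch (using the example path from above), is to override the handler for the image folder with a more specific Location, so Apache serves it as plain static files again:
<Location /myapp/img>
    # hand these URLs back to Apache's built-in static file handler
    SetHandler default-handler
</Location>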
This is a plain Apache configuration issue. You simply must define an alias in your Apache configuration file.
Assume that your image resources are in a directory /middleware/project/img. Then just add the following line to your httpd.conf or (that's where I configure it) dads.conf:
Alias /i/ "/middleware/project/img/"
If you now have a file alert.png in your /middleware/project/img directory, you can access it with the URL /i/alert.png.
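Depending on your Apache version you may also need to grant access to the aliased directory explicitly; a sketch for Apache 2.4, using the example path from above:
Alias /i/ "/middleware/project/img/"

<Directory "/middleware/project/img">
    # Apache 2.4 denies filesystem access unless explicitly granted
    Require all granted
</Directory>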
I am building an application which will allow users to upload images. Mostly, it will work with mobile browsers on slow internet connections. I was wondering if there are best practices for this. Is doing some encryption, then doing the transfer and decoding on the server, a trick to try? Or something else?
You would want something with resumable uploads, preferably. Since your connections are slow, you'd need something that can resume where you left off. A module I've come across over the years is the nginx upload module:
http://www.grid.net.ru/nginx/upload.en.html
According to the site:
The module parses request body storing all files being uploaded to a directory specified by upload_store directive. The files are then being stripped from body and altered request is then passed to a location specified by upload_pass directive, thus allowing arbitrary handling of uploaded files. Each of file fields are being replaced by a set of fields specified by upload_set_form_field directive. The content of each uploaded file then could be read from a file specified by $upload_tmp_path variable or the file could be simply moved to ultimate destination. Removal of output files is controlled by directive upload_cleanup. If a request has a method other than POST, the module returns error 405 (Method not allowed). Requests with such methods could be processed in alternative location via error_page directive.
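Pulling the directives from that quote together, a minimal configuration sketch could look like this (the paths and the backend location are assumptions):
location /upload {
    # store incoming files here while the request body is parsed
    upload_store /var/nginx/uploads;
    # pass the rewritten request (file bodies stripped) to this location
    upload_pass /process;
    # tell the backend where each uploaded file was stored
    upload_set_form_field $upload_field_name.path "$upload_tmp_path";
    # remove stored files if the backend answers with these codes
    upload_cleanup 400 404 499 500-505;
}

location /process {
    proxy_pass http://127.0.0.1:8080;  # your application backend (assumption)
}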
A common method to block websites is to go to this directory:
C:\Windows\System32\drivers\etc
and add the entries to the 'hosts' file.
But anybody can re-edit the file and remove the entries.
So, is there some kind of batch programming to block certain websites?
You have a few options for this.
Change admin rights: set yourself up as the supervisor and everyone else as something else, and lock edit permissions.
Write a bat file that opens both the browser and a second bat file that writes the website entries back to the hosts file. If you do this, every single time they start the web browser the websites are re-added to the blocked list in the background, forcing them to exit the browser if they want to change the file; and if that happens, the websites are re-blocked when they open the browser again. Effective, and beyond some people's ability to bypass (see the sketch after these options).
An example can be found here: http://www.makeuseof.com/tag/launch-multiple-programs-single-shortcut-using-batch-file/
Similar to the method above, have that bat file launch when someone accesses the hosts file, with a timeout function to rewrite the hosts file after some amount of time...
Password-protect the System32 folder, but this could prove problematic for a plethora of reasons.
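As a sketch of the bat-file option (the site name and browser path are assumptions, and the script must run elevated to be able to write to the hosts file):
@echo off
rem re-add the blocked site to the hosts file, then launch the browser
set HOSTS=%SystemRoot%\System32\drivers\etc\hosts
findstr /c:"127.0.0.1 www.example.com" "%HOSTS%" >nul || (
    echo 127.0.0.1 www.example.com>>"%HOSTS%"
)
start "" "C:\Program Files\Mozilla Firefox\firefox.exe"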
I was wondering how to create and debug this kind of script, which can become a bit of a headache if you are not used to writing them (like me).
Do you use a tool to create them?
Any tips to debug what's going on, instead of just creating a local structure and seeing what happens in the browser?
Note to readers: the old answer doesn't work anymore.
As of version 2.4, Apache no longer allows the RewriteLogLevel and RewriteLog directives. They are now bundled into the single LogLevel directive (see the Log Files documentation), which supports module-specific log levels with prefixes and trace[1-8] constants. To set the highest level of logging specifically for the rewrite module, you now use the following:
LogLevel warn rewrite:trace8
You can use any regex testing tool to help you test your patterns against URLs (I'm using "The Regex Coach", a Windows app). This will only help you with the pattern; you should already know the general logic/flow of how rewriting works.
To debug, you must be able to edit the Apache config file: use RewriteLogLevel 9 and RewriteLog /path/to/rewrite.log to see exact details of what is going on during URL rewriting (because it's a server config, you will have to restart Apache to have the new config applied).
You need level 9 if you want to debug a problematic rule. Level 3 or any other pretty low value will only show you an overview of what is going on, without going into details.
Do not use level 9 on a busy/production server, as it may generate a huge log within a few seconds.
If you need to do 301 (permanent) redirects, do 302 instead during the testing period (until you are happy with the rule and the results, then change to 301), as modern browsers do cache 301 redirects. You may otherwise end up in the frustrating situation where you have completely changed the rule (or even deleted it) but the browser still does the redirect. The only cure in such cases is to clear the browser cache and reload the page.
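For example, a sketch (the paths are placeholders):
# during testing: a temporary redirect that browsers won't cache for good
RewriteRule ^/old-page$ /new-page [R=302,L]
# once happy with the rule and results, switch to the permanent version:
# RewriteRule ^/old-page$ /new-page [R=301,L]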
You can set the RewriteLog directive in your virtual host configuration.
It will write the necessary info to the file you specify.
RewriteLog "/usr/local/var/apache/logs/rewrite.log"
Further, use the RewriteLogLevel directive to control the amount of logging:
RewriteLogLevel 3
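Put together, a sketch of how these directives fit into an Apache 2.2 virtual host (the server name, paths, and example rule are assumptions):
<VirtualHost *:80>
    ServerName www.example.com
    DocumentRoot /var/www/example

    RewriteEngine On
    RewriteLog "/usr/local/var/apache/logs/rewrite.log"
    RewriteLogLevel 3

    # an example rule whose processing will show up in the rewrite log
    RewriteRule ^/old$ /new [R=302,L]
</VirtualHost>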
Read through the mod_rewrite documentation for more details.