How to process dynamic URLs as static pages using nginx? - mod-rewrite

I want nginx to serve dynamic URLs as static pages. For example,
given the URL "/book?name=ruby_lang&published_at=2014",
nginx should serve a static file (which is generated automatically) named:
"book?name=ruby_lang&published_at=2014.html" or:
"book-name-eq-ruby_lang-published_at-eq-2014.html"
Is this possible?
NOTE:
1. There is no static file named
"book?name=ruby_lang&published_at=2014.html" nor
"book-name-eq-ruby_lang-published_at-eq-2014.html";
however, I can generate them if needed.
2. I can't change the URL that is given to the consumer, i.e. my consumer can only send requests to me via
"/book?name=ruby_lang&published_at=2014"
and not via any other URL.

If you are OK with generating the HTML files yourself, you could simply use nginx's rewrite module. Example:
rewrite ^/book /book-name-eq-$arg_name-published_at-eq-$arg_published_at.html last;
If you need to make sure that name and published_at are valid, you can instead do something like this:
location = /book {
    if ($arg_name !~ "^[A-Za-z\d_-]+$") { return 404; }
    if ($arg_published_at !~ "^\d{4}$") { return 404; }
    rewrite ^/book /book-name-eq-$arg_name-published_at-eq-$arg_published_at.html last;
}
This makes sure that published_at is a valid 4-digit number and that name is a valid identifier (letters, digits, underscores, and hyphens).
To make sure a book is only accessible from one URL, return 404 when the requested URL is the HTML file itself. Add this before the previous rule:
location ~ /book-(.*).html {
    return 404;
}
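Putting the pieces together, here is a minimal sketch of how this could sit in one server block. The root path is an assumption, and the guard location in this sketch uses nginx's internal directive instead of a plain return 404, so that the internally rewritten URI can still be served while direct requests for the generated .html files still get a 404:

server {
    listen 80;
    # assumption: the automatically generated .html files live here
    root /workspace/generated_books;

    location = /book {
        if ($arg_name !~ "^[A-Za-z\d_-]+$") { return 404; }
        if ($arg_published_at !~ "^\d{4}$") { return 404; }
        rewrite ^/book /book-name-eq-$arg_name-published_at-eq-$arg_published_at.html last;
    }

    # only reachable through the rewrite above (URIs changed by rewrite count
    # as internal requests); a direct request for /book-....html returns 404
    location ~ ^/book-.*\.html$ {
        internal;
    }
}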

OK, thanks to @Alon Gubkin's help, I finally solved this problem (see: http://siwei.me/blog/posts/nginx-try-files-and-rewrite-tips). Here are some tips:
Use 'try_files' instead of 'rewrite'.
Use hyphens '-' instead of underscores '_' in your static file names; otherwise nginx gets confused when it expands the $arg_* variables into your file name, because underscores are treated as part of the variable name. E.g. use "platform-$arg_platform.json" instead of "platform_$arg_platform.json" (see the sketch just after this list).
Take a look at nginx's built-in variables.
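To illustrate the second tip, here is a sketch of what goes wrong with underscores; the location and parameter names mirror the config below and are otherwise hypothetical:

location /popup_pages {
    # BROKEN: nginx reads variable names greedily (letters, digits and
    # underscores), so "$arg_platform_product_" is parsed as one variable,
    # i.e. the query parameter "platform_product_", not as $arg_platform
    # followed by the literal "_product_".
    try_files /platform_$arg_platform_product_$arg_product.json /default.json;

    # OK: the hyphen ends the variable name, so $arg_platform and
    # $arg_product are expanded as intended.
    # try_files /platform-$arg_platform-product-$arg_product.json /default.json;
}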
This is my nginx config snippet:
server {
    listen 100;
    charset utf-8;
    root /workspace/test_static_files;
    index index.html index.htm;

    # nginx will first look for the '/platform-$arg_platform-...' file;
    # if it is not found, it falls back to /default.json
    location /popup_pages {
        try_files /platform-$arg_platform-product-$arg_product.json /default.json;
    }
}
I also put my code on GitHub so that anyone interested in this issue can take a look:
https://github.com/sg552/server_dynamic_urls_as_static_files

Related

use many ssi with fastcgi_cache in nginx

I am using fastcgi_cache in nginx to speed up a PHP script that connects to the database and selects articles.
Here is my flow:
First, I add an SSI include in my index page:
<!--# include file="/templates/1-5-list.html" -->
Then I add a location block in the nginx conf to map the HTML path to PHP:
location ~ (\d*?)-(\d*?)-list.html {
    try_files $uri /articles/list.php?p1=$1&p2=$2;
}
After that, I apply fastcgi_cache to list.php:
# outside the server{}
fastcgi_cache_path /home/cache/articles levels=1 keys_zone=articles_cache:10m max_size=1024m inactive=1h;
fastcgi_cache_key $scheme$host$request_uri$request_method;

# inside the server{}
location ~ /list.php$ {
    fastcgi_cache articles_cache;
    fastcgi_cache_valid 200 2h;
    ...
}
Everything is OK so far, and the caching works well.
However, if I have two or more SSI includes in my index:
<!--# include file="/templates/1-5-list.html" -->
<!--# include file="/templates/2-5-list.html" -->
the second include returns exactly the same result as the first one. FAIL!
I searched inside the cache directory and found that the KEY used for caching is httplocalhost/articlesGET, which means the two includes share the same KEY. I think this is the cause.
My question is: how can I modify fastcgi_cache_key so that they get different KEYs? I've tried adding fastcgi_cache_key inside the location{}, but it failed.
$request_uri in an nginx SSI subrequest refers to the parent request's URI.
Use $uri in the cache key for the included fragment instead:
fastcgi_cache_key $scheme$host$uri$request_method;
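A hedged sketch of where the adjusted key would go. Note that if both fragments end up internally redirected to the same script URI (as with the try_files fallback above), you may also need $args in the key so that each parameter combination gets its own cache entry:

# outside the server{}
fastcgi_cache_path /home/cache/articles levels=1 keys_zone=articles_cache:10m max_size=1024m inactive=1h;
# $uri and $args vary per SSI subrequest, unlike $request_uri,
# which always refers to the parent page's URI
fastcgi_cache_key $scheme$host$uri$args$request_method;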

NGINX domain rewrite rules

What I am trying to achieve using nginx is this:
http://domain.com - redirects to http://otherdomain/page.html
http://www.domain.com - redirects to http://otherdomain/page.html
http://domain.com/* - redirects to http://otherdomain/*
Basically, only the bare domain and the www host should be redirected to a single URL. Everything else should be redirected to the other domain while keeping the path, like this:
http://domain.com/subdomain/page.html -> http://otherdomain/subdomain/page.html
If you have questions please let me know. Thank you!
You can use $request_uri (see http://nginx.org/en/docs/varindex.html).
Probably something like below:
server {
    location = / {
        rewrite ^ http://otherdomain/page.html;
    }
    location /subdomain {
        rewrite ^ http://otherdomain$request_uri;
    }
}
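If you want this to cover every path rather than just /subdomain, a hedged alternative using return (permanent redirects assumed) could look like this:

server {
    server_name domain.com www.domain.com;

    # the bare domain (and www) land on the static landing page
    location = / {
        return 301 http://otherdomain/page.html;
    }

    # everything else keeps its path and query string
    location / {
        return 301 http://otherdomain$request_uri;
    }
}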

Laravel quick start guide route returns 404

So I am a new Laravel user. I went straight to the documentation to get started, but I have a question.
In app/routes.php, I wrote this:
Route::get('users', function() {
    return 'hello';
});
Route::get('/', function() {
    return View::make('hello');
});
When I hit
127.0.0.1/aerial/public/
it works fine. When I hit
127.0.0.1/aerial/public/index.php/users or 127.0.0.1/aerial/public/users or localhost/aerial/public/index.php/users
it returns 404. My environment is nginx.
You should probably convert the .htaccess file in your public directory to nginx format and make sure you have mod_rewrite (or the nginx rewrite handling used for that) enabled.
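As a rough sketch of what that usually looks like for Laravel on nginx: the server root points at the public directory and anything that is not a real file is handed to index.php. The paths and the PHP-FPM socket below are assumptions for your setup:

server {
    listen 80;
    root /var/www/aerial/public;
    index index.php;

    location / {
        # send everything that is not an existing file or directory to the front controller
        try_files $uri $uri/ /index.php?$query_string;
    }

    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass unix:/var/run/php-fpm.sock;
    }
}

With the root pointed at public/ like this, the routes would be reachable as 127.0.0.1/users rather than 127.0.0.1/aerial/public/users.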

Redirect to other domain if static files do not exist

I would like to configure nginx so that:
if the requested path is a static file, it is served;
otherwise, the request is redirected to an external domain: http://example.net/.
I understand that I need to use the try_files directive, but I do not understand how to implement the fallback mechanism.
Found it after many tests. What happens when a static file is not found is defined by the last parameter of try_files. You can point that parameter at a named (internal) location. In my case I used:
server {
    ...
    root /srv/example.org/web/static;
    index index.html;
    try_files $uri @redirect-to-dot-net;

    location @redirect-to-dot-net {
        rewrite ^ http://example.net redirect;
    }
}
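If you also want the missing path carried over to the other domain (an assumption; the config above always redirects to the bare http://example.net), the named location could preserve it like this:

location @redirect-to-dot-net {
    # keep the original path and query string on the other domain
    rewrite ^ http://example.net$request_uri redirect;
}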

xsendfile only works from index

I'm trying to send a file to the user using X-Sendfile within the CodeIgniter framework.
It is all installed correctly; my problem is that it only seems to work from the root, even though every page goes through index.php anyway.
This is my function:
function _output_file($name, $root_location, $public_location = FALSE)
{
    if (function_exists('apache_get_modules') && in_array('mod_xsendfile', apache_get_modules())) {
        header('Content-Description: File Transfer');
        header('Content-Type: application/octet-stream');
        if (strstr($_SERVER["HTTP_USER_AGENT"], "MSIE") != FALSE) {
            header('Content-Disposition: attachment; filename='.urlencode($name));
        } else {
            header('Content-Disposition: attachment; filename="'.$name.'"');
        }
        // 86400 seconds is one day
        header('Expires: '.gmdate('D, d M Y H:i:s', (TIME_NOW + 86400)));
        header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
        header('Pragma: public');
        header('X-Sendfile: '.$root_location);
        exit;
    } else {
        redirect(site_url($public_location));
    }
}
If I place this at the top of my index.php and load the root it works fine, but if I try to access it from domain.com/controller/function it returns a 404 error.
It is definitely using the index.php file since if I replace the function call with die("test"); this displays to the screen.
I believe it's something to do with what permissions X-Sendfile has to access the file, but since it works from the root index.php I would have thought it had complete permissions; presumably it's based on what the request URL is, which I find strange.
So... does anyone have any suggestions as to how I can get X-Sendfile to work through CodeIgniter, from a URL such as "domain.com/files/get/12"?
XSendFile has some specific security-related path quirks... and depending on your server configuration, these issues can sometimes occur over HTTPS even when HTTP seems to be working correctly.
If you run into mysterious 404s while using mod_xsendfile and you can confirm that the files being served really do exist, then you probably need to configure XSendFilePath in your Apache confs.
Add the following to the appropriate conf (httpd.conf, ssl.conf, httpd-ssl.conf, etc) and/or inside the appropriate VirtualHost declaration (if using vhosts)...
XSendFilePath /Absolute/Path/To/Your/Working/Directory/
Note: You cannot add this to an .htaccess file. It must be placed into an Apache conf.
Generally, xsendfile will try to figure out the working directory automatically, but sometimes it can't. This directive tells it explicitly what directory (or directories) should be accessible through xsendfile. Chances are that a mysterious 404 means that your directory isn't passing the whitelist check for some reason. This will fix that.
And don't forget to restart Apache after you change the config.
Prefixing a method name with an underscore makes it inaccessible through the URL.
From the documentation:
Private Functions
In some cases you may want certain functions hidden from public access. To make a function private, simply add an underscore as the name prefix and it will not be served via a URL request. For example, if you were to have a function like this:
private function _utility()
{
// some code
}
Trying to access it via the URL, like this, will not work: example.com/index.php/blog/_utility/
It seems this question never got a working response; in the end I just created a file in my root called "getfile.php". It's not perfect, but it gets the job done for now. Here it is for anyone who may find it useful.
<?php
define('BASEPATH', 'just done to stop direct access being disallowed');

function show_getfile_error()
{
    echo 'You do not have permission to download this file, if you think this is a mistake please get in contact.';
    exit;
}

include('applications/config/database.php');
$mysqli = new mysqli($db['default']['hostname'], $db['default']['username'], $db['default']['password'], $db['default']['database']);

if(!preg_match('%^[0-9]+$%', $_GET['key']))
{
    show_getfile_error();
}
else
{
    $query = mysqli_query($mysqli, 'SELECT * FROM getfiles WHERE getfile_key = '.(int)$_GET['key']);
    $result = mysqli_fetch_array($query, MYSQLI_ASSOC);
    if(!$result || $result['getfile_ip'] != $_SERVER['REMOTE_ADDR'])
    {
        show_getfile_error();
    }
    header('Content-Description: File Transfer');
    header('Content-Type: application/octet-stream');
    if (strstr($_SERVER["HTTP_USER_AGENT"], "MSIE") != FALSE) {
        header('Content-Disposition: attachment; filename='.urlencode($result['getfile_name']));
    } else {
        header('Content-Disposition: attachment; filename="'.$result['getfile_name'].'"');
    }
    // 86400 seconds is one day
    header('Expires: '.gmdate('D, d M Y H:i:s', (TIME_NOW + 86400)));
    header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
    header('Pragma: public');
    header('X-Sendfile: '.$result['getfile_location']);
}
?>
I came across the 404 error today. If the path you pass to the header contains any components that lie outside of the root folder that index.php is in, you get a 404. Make sure the path is relative to index.php, not an absolute path.
