How to select environment by IP in Laravel - laravel-4

I want Laravel to set the environment to "local" when the visitor is from 127.0.0.1, but the Request object is not available yet in bootstrap/start. Is there any built-in way to do this, or am I going to have to access $_SERVER['HTTP_CLIENT_IP'] directly?

IP-based environment detection via $app->detectEnvironment() was removed recently (I believe in 4.1) because it was not very secure. A user can spoof their IP address and thus potentially access areas of the site you don't want them to access -- or see debug information meant to stay private, for example.
I know you asked for an IP solution, but the built-in method for detecting your environment should look like this:
$env = $app->detectEnvironment(array(
    'local' => array('MYHOSTNAME'),
));
If you have multiple systems/hostnames working from localhost, just add them to the array. This will keep things much more secure than trying to make IP addresses work.
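If you're not sure which hostname to list, a quick check (just a sketch, not part of the original answer) is PHP's gethostname():

// Run this once on each machine to see the hostname you should list under 'local'
echo gethostname(); // e.g. "my-dev-machine.local" (output is machine-specific)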

This sort of thing should work.
$env = $app->detectEnvironment(function() {
    if ($_SERVER['REMOTE_ADDR'] == '127.0.0.1') { return 'local'; }
    return 'production'; // fall back so the environment is never left unset
});
Swap out REMOTE_ADDR for HTTP_CLIENT_IP if that's how your infrastructure works.
If you want to use the Request object, you might be able to do it in an App::before filter, but I'm not certain if you can change the environment at runtime.

Related

Laravel forcing HTTP for assets

This is a little bit strange, because most of the questions here want to force HTTPS.
While learning AWS Elastic Beanstalk, I am hosting a Laravel site there. Everything is fine, except that none of my JavaScript and CSS files are being loaded.
I have referenced them in the Blade view as:
<script src="{{asset('assets/backend/plugins/jquery/jquery.min.js')}}"></script>
The first thing I tried was looking into the file/folder permissions in the root of my project by SSHing into the EC2 instance. That didn't work, even when I set the permissions on the public folder to 777.
Later I found out that the site's main page URL was HTTP while all the asset URLs were HTTPS.
I don't want to get into SSL certificates just yet, if possible.
Is there any way I can force my asset URLs to be HTTP only?
Please forgive my naivety. Any help would be appreciated.
This usually happens if your site is behind a reverse proxy: the URL facade trusts your local instance behind the proxy, which might not use SSL, and that can be misleading/wrong.
That is probably the case on an EC2 instance, as SSL termination happens at the load balancers/HA proxies.
I usually add the following to my AppServiceProvider.php:
// At the top of AppServiceProvider.php: use Illuminate\Support\Str;

public function boot()
{
    if (Str::startsWith(config('app.url'), 'https')) {
        \URL::forceScheme('https');
    } else {
        \URL::forceScheme('http');
    }
}
Of course this requires that you've set app.url / APP_URL. If you are not using that, you can just get rid of the if statement, but that is a little less elegant and prevents you from developing over non-HTTPS.
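For reference, the relevant .env line would look something like this (the URL here is just a placeholder):

APP_URL=http://my-app.us-east-1.elasticbeanstalk.com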

Laravel shared cookie detection issue in domain and subdomain

I am working on Laravel 5.4.30.
Imagine that we have a domain example.com and a subdomain dev.example.com. The main domain is for the master branch and the dev subdomain is for the develop branch. We have a cookie notice that is hidden after clicking the Hide Cookie Notice button. This works by setting a cookie forever.
We have set the SESSION_DOMAIN config to the respective domain for each environment.
For main domain:
SESSION_DOMAIN=example.com
For dev subdomain:
SESSION_DOMAIN=dev.example.com
Now here is the issue. If we go to example.com and click to hide the cookie notice, a cookie is set forever for the main domain. After that we go to dev.example.com and do the same, so a cookie is set for the subdomain as well. Note that this cookie is set after the previous one. (The order is important.)
Now if we refresh the subdomain, we see that notice again (not hidden). The browser has matched the main cookie because .example.com is set in the cookie's domain parameter, so every subdomain is affected. But the view still shows the notice because it cannot read any cookie telling it to hide.
Anyway, I don't want to share that cookie across all subdomains. How can I achieve that? I think I should add a prefix to the cookie name, but I don't know how to make Laravel add that prefix automatically.
Any solutions?
You need to implement your own "retrieving" and "setting" of cookies.
Retrieving (has, get) cookies
Create yourself a new class (anywhere you like, but I would use app/Foundation/Facades/) named Cookie.
use \Illuminate\Support\Facades\Cookie as CookieStock;

class Cookie extends CookieStock {
    // implement your own has(...);
    public static function has($key)
    {
        return ! is_null(static::$app['request']->cookie(PREFIX . $key, null)); // get the prefix from the .env file, in your case based on APP_ENV
    }

    // implement your own get(...);
    public static function get($key = null, $default = null) {...}
}
Now open up config/app.php and change the corresponding alias (cookie).
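For example, the alias entry in config/app.php could end up looking roughly like this (assuming the class above lives in the App\Foundation\Facades namespace):

'aliases' => [
    // ...
    'Cookie' => App\Foundation\Facades\Cookie::class,
],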
Setting (make) cookies
Create yourself a new provider (use artisan), copy-paste the code from Illuminate\Cookie\CookieServiceProvider.php, and change the namespaces.
Again open up config/app.php and replace the corresponding service provider with the new one.
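Likewise for the providers array, a sketch (the namespace of your copied provider is an assumption):

'providers' => [
    // Illuminate\Cookie\CookieServiceProvider::class, // replaced by:
    App\Providers\CookieServiceProvider::class,
    // ...
],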
Create yourself a new class (anywhere you like, but I would use app/Foundation/Cookie/) named CookieJar.
use \Illuminate\Cookie\CookieJar as CookieJarStock;

class CookieJar extends CookieJarStock {
    // Override any method you think is relevant (my guess is make(); I am not sure at the moment about the queue-related methods)
    public function make($name, $value, $minutes = 0, $path = null, $domain = null, $secure = false, $httpOnly = true)
    {
        // check before applying the PREFIX
        if (!empty($name)) {
            $name = PREFIX . $name; // get the PREFIX the same way as before
        }

        return parent::make($name, $value, $minutes, $path, $domain, $secure, $httpOnly);
    }
}
Update the code in your own cookie service provider to use your implementation of CookieJar (line 19).
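In practice that means the copied register() method ends up binding your CookieJar instead of the framework's; a sketch based on the 5.4-era provider (adjust it to whatever your copied code actually looks like):

public function register()
{
    $this->app->singleton('cookie', function ($app) {
        $config = $app->make('config')->get('session');

        // Return the extended CookieJar instead of Illuminate\Cookie\CookieJar
        return (new \App\Foundation\Cookie\CookieJar)->setDefaultPathAndDomain(
            $config['path'], $config['domain'], $config['secure']
        );
    });
}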
Run $ composer dump-autoload, and you should be done.
Update
Since BorisD.Teoharov brought up that the framework might change the signature of CookieJarStock's make() (or any other cookie-related method) between major versions, I made an example repository that includes a test that can be used as-is and will fail if such a signature change happens.
It is as simple as this:
public function test_custom_cookie_jar_can_be_resolved()
{
    resolve(\App\Foundation\Cookie\CookieJar::class);

    $this->assertTrue(true);
}
A detailed how-to can be inspected in the corresponding commit diff.
I've setup test environments to make sure, I'm not missing any details.
In my former answer I thought invalidating cookies would be sufficient for this case, but as @BorisD suggested, it is not, and I've confirmed that in my tests.
So there are a few important notes, coming from my experiences...
Don't mix Laravel versions in subdomains - If using SESSION_DOMAIN, you need to make sure your Laravel version matches between the root domain and subdomains. I experimented with 5.4 under the example.com domain and 5.6 under dev.example.com, and this showed me some inconsistency in dealing with cookies; some important changes were made between these versions, so you can be sure it will not work correctly if you mix versions. I finally ended up with Laravel 5.6 on both domains, so I'm not 100% sure this works on Laravel 5.4, but I think it should.
Make sure all your subdomains use the same APP_KEY - otherwise Laravel will be unable to decrypt the cookie and will return a null value, because all encryption/decryption in Laravel uses this app key.
SESSION_DOMAIN - I've pointed SESSION_DOMAIN at the same root domain, e.g. example.com, for both sites. With this setting, I can create a cookie on the root domain and retrieve it correctly on both domains. With the same setting, creating a cookie on a subdomain also pushes the new value to the root domain's cookie, so it gets overridden. So I guess everything works here as requested in the original question.
Cookie make parameters - In case you want to use a subdomain in SESSION_DOMAIN, you can safely do that too. However, you need to make sure that the important, let's call them global, cookies are defined in a slightly different way. The Cookie make syntax is:
Cookie make(string $name, string $value, int $minutes, string $path = null, string $domain = null, bool $secure = false, bool $httpOnly = true)
What's important here is that you need to pass your root domain for this particular cookie when creating it, for example like this:
return response($content)->cookie('name','value',10,null,'example.com')
Conclusions:
With this config, you should be able to access your Cookies properly under subdomains and your root domain.
You will probably need to update your Laravel installations to 5.6, which will force you to upgrade to at least PHP 7.1 (there were some changes to cookies in PHP as well).
And finally, in your code, don't rely on a cookie's existence, but only on its value (I don't know whether that applies in your case).
You could set a prefix for the cookie name depending on the environment.
First, add COOKIE_PREFIX to your env file.
COOKIE_PREFIX=dev
Then, use it when setting your cookie
$cookie = cookie(env('COOKIE_PREFIX', 'prod') . '_name', 'value', $minutes);
Then, retrieve it like so
$value = $request->cookie(env('COOKIE_PREFIX', 'prod') . '_name');
One more cause of this: for both apps, APP_KEY and APP_NAME should be the same in the .env file. That worked for me after two days of trying; I checked each library internally to arrive at this solution.
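In other words, both deployments should share something like this in their .env files (values are placeholders):

APP_NAME=Example
APP_KEY=base64:SAME_GENERATED_KEY_ON_BOTH_APPS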

TYPO3 behind Proxy

I'm trying to get a TYPO3 (6.2) instance running behind a (forwarding) proxy (Squid). I have set
'HTTP' => array(
    'adapter' => 'curl',
    'proxy_host' => 'my.local.proxy.ip',
    'proxy_port' => '8080',
)
as well as
'SYS' => array(
    'curlProxyServer' => 'http://my.local.proxy.ip:8080',
    'curlUse' => '1'
)
The proxy doesn't ask for credentials.
When I try to update the extension list, I get the error message
Update Extension List
Could not access remote resource http://repositories.typo3.org/mirrors.xml.gz.
If I try Get preconfigured distribution, it says
1342635425
Could not access remote resource http://repositories.typo3.org/mirrors.xml.gz.
According to the proxy log, the server doesn't even try to connect to the proxy.
I can easily download the file using wget on the command line.
OK, I've investigated the issue a bit more, and from what I can tell, TYPO3 doesn't even try to connect anywhere.
I used tcpdump and Wireshark to analyze the network traffic. The site claims to have tried sending an HTTP request to repositories.typo3.org, so I'd expect to find either a proxy connection attempt or a DNS query followed by an attempt to connect to that IP. (Of course, the latter is known not to work.) However, none of this happens.
I've tried some slight changes in the variable curlProxyServer. The documentation clearly states
String: Proxyserver as http://proxy:port/. Deprecated since 4.6 - will be removed in TYPO3 CMS 7. See below for http options.
So I tried adding the trailing "/" and removing the "http://" - no change. I'm confident there's no problem whatsoever regarding the proxy as the proxy isn't even contacted and has been working perfectly fine for everything else for years.
The error message comes from \TYPO3\CMS\Extensionmanager\Utility\Repository\Helper::fetchFile(). This one uses \TYPO3\CMS\Core\Utility\GeneralUtility::getUrl() to get the actual file content.
According to your setting, it should use the first part of the function, because curlUse is set and the URL starts with http or https.
So what you would need to do now is to throw some debug lines in the code and check at what point the request goes wrong.
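For example (purely a sketch, not something from TYPO3 itself), a temporary line near the top of getUrl() would show whether the curl branch is entered and which proxy setting it sees:

// Temporary debug line inside \TYPO3\CMS\Core\Utility\GeneralUtility::getUrl()
error_log('getUrl: url=' . $url
    . ' curlUse=' . $GLOBALS['TYPO3_CONF_VARS']['SYS']['curlUse']
    . ' proxy=' . $GLOBALS['TYPO3_CONF_VARS']['SYS']['curlProxyServer']);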
Looking at the source code, three possibilities come to mind:
The curl proxy parameter does not support a scheme, so it should be 'curlProxyServer' => 'my.local.proxy.ip:8080'.
Some redirect does not work.
Your proxy has problems with https, because the TYPO3 TER should be queried over https.

What's the base URL for my app?

In Camping/Rack, how can I get the base URL for my app? I want to know so I can put it in an email it sends.
In development it might be
http://localhost:9292
or
http://localhost:80/game
or in production
http://fancy-snake.heroku.com
So far I have
url = @env['rack.url_scheme'] + "://" + @env['HTTP_HOST'] + R(LoginX, u.secret)
This seems to work for the first and third cases. I don't know if it's right if the app is at localhost/prefix.
You have to be a little careful with this, as there are some subtle potential traps. The Rack::Request class will probably be helpful here.
First, you can’t really get the url for the app, as it may be responding to multiple urls (via Rack routes, Apache config, etc), so you’re looking at getting the url for the particular request. If you’re only serving requests from one url this won’t matter.
The scheme for the request is in the env hash under rack.url_scheme, but this is only for the “last leg” of the request. If your app is behind a proxy of some sort (Nginx, Apache, etc.) then you want to get the scheme of the real request, not the request from the proxy to the machine your app is running on. If you’ve configured your proxy correctly it should be setting a header so you can tell what the original scheme was. Rack::Request has a scheme method that takes these headers into account.
The host for the url is probably in the env hash under the HTTP_HOST key, but this header is not necessarily present (admittedly that’s pretty unlikely nowadays). You should fall back on SERVER_NAME and SERVER_PORT. Additionally there’s the issue of handling proxied requests: you want the hostname of the original request, not the backend server. Again, Rack::Request provides host_with_port and host methods that deal with these issues.
Rack::Request also provides a base_url method that combines scheme and host, and additionally only includes the port if it differs from the default (80 or 443).
The location that your app is mounted is in the env hash under the SCRIPT_NAME key. This would be /game in your second example, and can be empty if your app mounted at the root of your server. Again, Rack::Request provides a script_name method, although this one simply returns the value of the entry in the env hash.
So, in summary, you probably want to use something like this:
req = Rack::Request.new env
url = req.base_url + req.script_name
which looks pretty simple, but is taking care of various possibilities for you.
Additionally, you might find the Rack specification useful to have a read of; it contains details of the various entries that should be in the env hash.
Camping has a helper called URL which returns the absolute URL to your app:
URL() # => #<URL:http://test.ing/blog/>
URL() + "view/12" # => #<URL:http://test.ing/blog/view/12>
URL("/view/12") # => #<URL:http://test.ing/blog/view/12>

CodeIgniter PHP native sessions without using cookies or a URL session id, but matching browser fingerprints in a database

Because of European privacy law being harshly applied in the Netherlands, I want to keep my company's site user friendly without nagging users with questions about whether it's okay to store a cookie on their computer that allows me to access their client data.
What I need is a way to "overwrite" the native PHP sessions class so that, at the point where the native class requests the cookie that stores the PHPSESSID, I can place my own code there that checks the browser's fingerprint and matches it to a session id, which I can use to resume the normal operation of the native class.
My idea is:
table sess_fingerprints
Fields: fingerprint - phpsessid
function getsessionid($userfingerprint)
{
    // Use query bindings so the fingerprint value is escaped properly
    $result = $this->db->query(
        "SELECT phpsessid
         FROM `sessiondatabase`.`sess_fingerprints`
         WHERE `sess_fingerprints`.`fingerprint` = ?",
        array($userfingerprint)
    );

    if ($result->num_rows() != 0) {
        return $result->row()->phpsessid;
    }
}
and from that point on the native PHP session code just works as it normally would.
So, my question is: is it possible to overwrite only the "cookie" part of the PHP session class? If so, how? I haven't found a way to do that yet.
I'm aware of being able to pass along the session id via URLs etc., but that's not secure enough for my web applications.
PHP provides support for custom session handlers:
http://php.net/manual/en/session.customhandler.php
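If you go that route, a minimal sketch (assuming PHP >= 5.4; the storage helpers and the fingerprint lookup below are hypothetical) could look like this:

class FingerprintSessionHandler implements SessionHandlerInterface
{
    public function open($savePath, $sessionName) { return true; }
    public function close() { return true; }

    public function read($id)
    {
        // Load the serialized session data for $id from your own storage
        return (string) my_session_read($id); // hypothetical helper
    }

    public function write($id, $data)
    {
        return my_session_write($id, $data); // hypothetical helper
    }

    public function destroy($id) { return my_session_delete($id); } // hypothetical helper
    public function gc($maxlifetime) { return true; }
}

session_set_save_handler(new FingerprintSessionHandler(), true);

// Use the id found via the fingerprint lookup instead of a cookie;
// $sessionIdFromFingerprint would come from something like getsessionid() above.
if (!empty($sessionIdFromFingerprint)) {
    session_id($sessionIdFromFingerprint);
}
ini_set('session.use_cookies', '0');
session_start();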
I think I have found the solution to my problem.
I'm going to override the functions related to cookies by using http://php.net/manual/en/function.override-function.php
Thank you all for thinking along.
