Hey folks, I'm working on a CI app that provides public as well as private (secure) access. I have not implemented SSL before, but I understand that part of the setup is specifying which folders should be accessed using HTTPS.
I would like some advice on how I should structure my folders to facilitate that.
Does the setup only affect the controllers folder? In other words, should I split my app controllers between public and secure subfolders under the standard CI controllers folder?
Do I need to do anything to my views and models folders? Anything else I should be paying attention to?
Your help would be appreciated.
Thanks.
OK, the best way to split things up the way you want would be to:
Set up your CodeIgniter app under a folder, say /var/www, and ensure everything is working as you want.
Set the base URL for the site in CodeIgniter's config.php to just "/".
Create an Apache virtual host for the secure portion of the site, listening for requests on port 443. Install your certificate and so on; http://www.namecheap.com is good for certs. Set the web root to the CodeIgniter folder, e.g. /var/www.
Create a further Apache virtual host for the non-secure version of the website, pointing to the same directory, e.g. /var/www (a sketch of both vhosts follows these steps).
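For reference, a minimal sketch of that vhost pair, assuming the domain example.com and certificate paths that you would substitute with your own:

<VirtualHost *:443>
    ServerName example.com
    DocumentRoot /var/www
    SSLEngine on
    # The certificate paths below are placeholders
    SSLCertificateFile /etc/ssl/certs/example.com.crt
    SSLCertificateKeyFile /etc/ssl/private/example.com.key
</VirtualHost>

<VirtualHost *:80>
    ServerName example.com
    DocumentRoot /var/www
</VirtualHost>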
All being well, you will now be able to access the entire site using either HTTPS or standard HTTP. You mentioned wanting to take things a step further by only allowing access to certain controllers via HTTPS and others over plain HTTP. What I would do for this is the following.
Create a CodeIgniter library, call it, say, Ssl.php, under your application/libraries folder, and put in the following code:
class Ssl {

    // "require" is a reserved word in PHP, so the method needs another name
    public function require_ssl()
    {
        // Is the current request being served over SSL?
        // Note the key is 'HTTPS' (case-sensitive); some servers set it to 'off'
        if (empty($_SERVER['HTTPS']) || $_SERVER['HTTPS'] === 'off')
        {
            // No. Do something here: display an error, redirect... up to you
            show_error('This resource must be accessed through an SSL encrypted connection.');
        }
    }
}
Now, in your application controllers, simply load the library the usual way with $this->load->library('ssl'), and for any controller method that you wish to require an SSL connection, call the $this->ssl->require_ssl() method before any execution starts.
You could even go a step further and drop that require_ssl() call into a controller's __construct() function, or into a base controller that your secure controllers extend.
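For example, a minimal sketch of that base-controller approach (MY_Controller in application/core is CodeIgniter's standard extension point; the controller name below is just for illustration):

class MY_Controller extends CI_Controller {

    public function __construct()
    {
        parent::__construct();
        // Every controller extending this one now requires HTTPS
        $this->load->library('ssl');
        $this->ssl->require_ssl();
    }
}

// A secure controller then simply extends the base class:
class Account extends MY_Controller {

    public function index()
    {
        // Only reachable over an SSL connection
    }
}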
I hope this helps in some way.
Hey, I am in the middle of developing a CI app that is successfully running with HTTPS/SSL.
I think you are a bit confused. As far as I know, you can only set up an SSL-enabled site by creating a new site, or "virtual host" if you are using Apache, for example.
So essentially, if you were using Apache, you would create a virtual host to handle requests on port 443 for, say, https://example.com, and then set the web root to, say, /var/www or wherever your CI app sits. You would also have to configure Apache to use your certificate file, once you have bought the cert and downloaded the bits and bobs after generating the certificate request. It's easier than it sounds.
Is there any reason why you can't just have your entire app running through SSL, rather than an encrypted and a non-encrypted section? There is a small CPU overhead for SSL, but it is minimal.
I hope this helps in one way or another.
EDIT IN RESPONSE TO COMMENT:
You're welcome. It's a minimal overhead. For the hassle, I would just put it all under an SSL vhost. Plus, if you were to split content between SSL and non-SSL, you may notice that including non-SSL content on an SSL page gives users a pesky "insecure content" message in their browser, which may put them off and create needless doubt.
It may be quite difficult to split things the way you want, as you would need separate root index.php CI files for each vhost to allow CI to route correctly. You couldn't just point a vhost at a directory such as application/controllers/private/, because CodeIgniter wouldn't know how to handle the request without some severe modification to its core routing.
I would honestly just stick everything under an SSL vhost. Or, another option would be to set up two CI apps running from the same system/core CI folder... if that makes sense, but then sharing content such as libraries and models will become tedious.
I have searched for my issue in many ways, but I can't seem to find a matching case, so I'm asking here.
I have a Laravel app which is installed on a server, and everything works correctly. The domain is set as HTTP-only and is configured from AWS. However, we need another domain which should work only over HTTPS. The HTTP domain points to the server instance, and the HTTPS one points to a CloudFront distribution whose origin is the HTTP domain. The issue is that when I open the HTTPS domain, all of the links and images are loaded from the HTTP domain.
To be more concrete, let's say I have http://mysite-notsecure.example.com and https://mysite-secure.example.com.
When I open http://mysite-notsecure.example.com, everything works as it should and there are no issues. However, when I open https://mysite-secure.example.com, the site loads, and files like app.js and app.css load with the correct host, but things like fonts, images, and links load from http://mysite-notsecure.example.com.
Because most of the URLs are built with the url() function, I think the issue has something to do with APP_URL, which was first set to http://mysite-notsecure.example.com. When I added the new domain, I set it to empty (APP_URL=), but the URLs are still built the same way (I cleared the config cache).
What should I do in order for my site to build the URLs according to the current host?
I don't need any other change between the two domains. They should load everything exactly the same; only the host should differ, without redirecting to the other domain.
It turned out there were two different issues.
I'll describe them here, because there is a slight chance someone could be dealing with one of them.
First, I printed the contents of the $_SERVER variable on both domains, and the host in both was the same: the HTTP domain.
This issue was from the CloudFront configuration. It turned out the Host header was removed from the CF distribution behavior, so CF replaced it with the origin's value (the origin being the HTTP domain). After this was fixed, the host in $_SERVER appeared correctly.
But the initial URL-building issue was something else I hadn't thought of. After clearing the cache to remove the debugging output and seeing the right URLs on the HTTPS domain, I switched back to the HTTP one and saw that all the URLs there now pointed to the HTTPS domain. That is when it hit me that these domains share not only the configuration but also the cache. Most of the URLs on the page I was testing with came from a function that caches its results, so whichever domain populated the cache first determined the URLs served on both. When I included the host in the cache key, everything worked correctly.
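For illustration, a minimal sketch of that fix; the cache key, lifetime, and helper function are assumptions, not the actual code from the site:

use Illuminate\Support\Facades\Cache;

// Include the current host in the cache key so each domain gets its own entry
$urls = Cache::remember('page_urls:' . request()->getHost(), 60, function () {
    return buildPageUrls(); // hypothetical function generating the cached URLs
});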
Hope this helps someone else.
Go to your .env file, set APP_URL=https://mysite-secure.example.com/, and use href="{{ asset('folder-path') }}" in your layout or Blade files.
I have an issue where I can't decide what's for what and which approach is best. I hope someone can provide some explanations to clear up my confusion.
So here it is: I have a site which is quite large, currently running on the Yii framework, which I am in the process of migrating to the L5 framework. This website has several sub sites, structured like the example below:
1) www.example.com
2) www.example.com/{username}
3) explore.example.com
4) explore.example.com/{organization}
5) connect.example.com
6) coin.example.com
7) m.example.com
8) e3.example.com
The current hosting method uses one project to host everything. Here comes the problem: should one of the sub sites need to be disabled for whatever reason, the whole website would have to be stopped, the code changed to disable the site, and the entire site deployed again.
Back to the L5 framework: I noticed that in Laravel I can likewise host everything in one project using the following routing method in routes.php:
Route::group(['domain' => '{account}.myapp.com'], function()
{
Route::get('user/{id}', function($account, $id)
{
//
});
});
Again the same question arises: if a sub site is to be disabled, I would need to change the code to disable the site and redeploy the whole site again. So I was wondering whether it is practical to host each sub site in a new project as a module, such that any change to that sub site only affects that particular module while the others keep running.
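As an aside, one lighter-weight way to get that "disable without redeploying everything" behavior inside a single project is a config flag per sub site; a minimal sketch, where the subsites config file and its keys are hypothetical:

// config/subsites.php (hypothetical config file)
return [
    'explore' => env('SUBSITE_EXPLORE_ENABLED', true),
    'connect' => env('SUBSITE_CONNECT_ENABLED', true),
];

// routes.php: register a sub site's routes only when its flag is on
if (config('subsites.explore')) {
    Route::group(['domain' => 'explore.example.com'], function () {
        // explore routes...
    });
}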
Additionally, I would like to ask: if I host each sub site as a separate project, my website requires the user to log in before they can navigate to any of the pages or applications. How do I tell the other modules that the user has logged in, so everything can proceed as usual?
Finally, of course, if anyone has any other suggestions or approaches, feel free to enlighten me too. My main motives are to achieve:
1) Maximum development flexibility
2) Fail tolerance
3) Sub sites can share the same login with the main website
(You are only required to log in at the main page, and you will then be authorized
to use the rest of the web application.)
Thank you.
I guess the answer will be too late. Nevertheless, it might help someone else.
Today I faced a similar dilemma myself: should I use sub sites or separate applications for the public site and the admin site of my application?
At first it seemed that I should have separate apps to avoid bringing down both sites whenever I upgrade just one of them. But I did not want to duplicate all the Laravel boilerplate code all over again in my repository.
Then it came to me that I can actually implement everything as sub sites, but deploy the Laravel app to multiple subdomains. Thus there will be a single app in the repository, but a different copy for each subdomain.
Drawback: the code of the controllers, views (and some site-specific models) will be duplicated for each deployed site. That's not an issue for a small site; it's just a bunch of unused files. If I wanted a clean solution, I could implement some modular structure and bring in the site-specific MVC stuff during deployment, but that's another topic. There are some third-party solutions for modular Laravel apps. Of course, a modular solution adds complexity and maintenance, so I have nothing against file duplication on the server if that's not an issue for you.
Regarding authentication: I haven't tried it with Laravel, but I have experience implementing cookies for a top-level domain in a pure PHP application. Essentially, you can configure your session cookie to be effective for all subdomains, and then users will have to log in just once on any of your sites. Here seems to be a working solution for Laravel:
Persisting sessions across subdomains in Laravel 5
Just set the top-level domain to .example.com and theoretically you should be good to go. For development purposes, I'd store this value in .env, though.
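That .env-driven setup would look something like this; a minimal sketch using the SESSION_DOMAIN key that recent Laravel 5 versions already read in config/session.php:

// .env
// SESSION_DOMAIN=.example.com

// config/session.php: the cookie domain comes from .env, defaulting to null
'domain' => env('SESSION_DOMAIN', null),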
I want to prevent users from accessing my "~/Content/..." folder. I wrote the following in "Global.asax.cs" and put this line of code above all the other routes:
routes.IgnoreRoute("Content/{*pathInfo}");
But it does not work; in fact, a user can see every file in the Content folder by typing the URL into the browser. Am I missing something?
How did you figure out that it does not work? Give an example.
You may have put it last in the routing table, so try to move it up so that it gets added to the routing table first. The route collection is an ordered list of routes.
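For instance, a typical RegisterRoutes in Global.asax.cs with the ignore rules registered before any MapRoute calls (the route name and defaults are the stock MVC template values):

public static void RegisterRoutes(RouteCollection routes)
{
    // Ignore rules must come first: the route collection is evaluated in order
    routes.IgnoreRoute("Content/{*pathInfo}");
    routes.IgnoreRoute("{resource}.axd/{*pathInfo}");

    routes.MapRoute(
        "Default",
        "{controller}/{action}/{id}",
        new { controller = "Home", action = "Index", id = UrlParameter.Optional }
    );
}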
Also try this: routes.IgnoreRoute("Content/");, though your version of the ignore rule is also correct and should work.
Lastly, I do not know what you mean when you say the user can see all the contents of the Content folder. Isn't that the point? Users must be able to download files from that folder; we usually just need MVC to ignore those requests so that IIS can serve the files directly.
Or did you mean that directory browsing is enabled and you want to disable it? In that case, go to IIS Manager, select your website, look for the Directory Browsing option, and disable it as shown here.
Your problem cannot be solved by routing constraints. There are three significant steps in processing a request:
IIS gets the request.
IIS looks at the filesystem for a file that directly corresponds to the requested URL.
If IIS doesn't find such a file, it hands the request to ASP.NET MVC for processing.
So you need to configure folder security to forbid direct access to the files while still allowing access through the application, as here.
But I don't recommend securing a folder that should be shared; I doubt your site has no images to display :) If you have some secured content, you need to create another folder for it.
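If you do go that route, files in the protected folder are typically served through a controller action instead of directly by IIS; a minimal sketch, where the folder and action names are assumptions (App_Data is used because IIS never serves it directly):

[Authorize]
public class FilesController : Controller
{
    // Streams a file from a folder that IIS does not expose directly
    public ActionResult Download(string fileName)
    {
        // Strip any path segments to prevent directory traversal
        var safeName = System.IO.Path.GetFileName(fileName);
        var path = Server.MapPath("~/App_Data/ProtectedContent/" + safeName);

        if (!System.IO.File.Exists(path))
            return HttpNotFound();

        return File(path, System.Web.MimeMapping.GetMimeMapping(safeName));
    }
}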
I am trying to set up multiple stores within the same hosting account, and I have studied many interesting guides out there on the matter.
It seems to me I have figured out the simplest solution for my case: map both of my dot-com sites onto the same directory on the host, and modify .htaccess to launch a different website depending on the URL, like so:
SetEnvIf Host .*anatscraftonia.* MAGE_RUN_CODE="anatscraftonia";
SetEnvIf Host .*anatscraftonia.* MAGE_RUN_TYPE="website";
My first store works fine, but when I go to anatscraftonia.com, all I ever get is a Magento 404 page.
I checked all the settings: multiple stores are defined, and the code above is copied and pasted from the Admin console under the Website I added. I have the Home page enabled for All Stores, and the Base URL redefined for both Secure and Unsecure. I also tried changing "website" to "store", with no improvement.
What else am I missing? How do I even know what page it is trying to go to, or whether it gets "no-route" or is just totally whacked?
I threw together some quick-and-dirty code a few years ago to log the controller dispatch process in Magento Community 1.3x.
http://alanstorm.com/magento_controller_dispatch_logging
I don't think you'll be able to drop those files into a current installation, but it should give you an idea of where to stick some logging functions to see why Magento is routing to a 404.
I am not 100% sure what you are trying to achieve, but why not use a single instance of Magento to handle both stores, with Magento's own multi-store capability?
Full details here: http://dx3webs.com/front/2010/08/magento-multistore-setup-under-plesk/ (the post contains links to cPanel instructions as well).
The instructions are for multiple domains but will work with subfolders.
I have googled this many times, but now I have to ask it here.
I want to set up a development/production workflow for a website.
My constraint is that I use Facebook Connect (Facebook Graph now), so I need to have dev and prod on the same domain and server (to be able to log in and test the features).
I thought I would edit the CodeIgniter index.php to redirect if the request has a specific user agent (I can edit the one in my Firefox).
Do you think that's a good idea, or do you have a better one?
And now comes the eternal question: how can I deploy this the easy way?
Should I use Capistrano or Phing?
Or simply a script with SVN?
Please help me; I'm totally new to this deployment thing. I used to work directly in production for my little websites, or on other domains, but now that's not possible anymore.
For me, I'd have something like two application folders, one called "production" and one called "development". Then in your index.php file, where you set your application folder, you can use PHP to determine which one to use. Just set your $application_folder variable to whichever one you need. (You could base this on anything: a cookie, an IP address, or something similar.)
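A minimal sketch of that index.php switch; the host-name check is just one possible trigger (a cookie or IP check would work the same way), and the folder names match the "production"/"development" pair described above:

// index.php: pick the application folder per environment
// The dev host name below is an assumption; substitute your own check
if (isset($_SERVER['HTTP_HOST']) && $_SERVER['HTTP_HOST'] === 'dev.example.com')
{
    $application_folder = 'development';
}
else
{
    $application_folder = 'production';
}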