What does Website Isolation / user isolation do in Laravel Forge? - hosting

In Laravel Forge there's an option called Website Isolation, which some people also refer to as user isolation. I'm not quite sure what it does or what the benefits of such functionality are. It would be great if someone could explain this!

Website Isolation runs each site on your server under its own Linux user. That means you can assign different users to different sites: you can give someone SSH and SFTP access to a specific site without allowing them access to other sites on the same server. It also helps prevent malicious code in one site from affecting another site on the same server, since the sites run as different users.
See https://blog.laravel.com/forge-user-isolation

I found out that one of the biggest advantages of using Website / User Isolation is that Laravel Forge gives each isolated site its own PHP-FPM pool, independent of the other sites on the same server. So in situations where one site's PHP processes need restarting, or one site is overloaded, the other sites on the server keep working perfectly fine.
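For a concrete picture, an isolated site's PHP-FPM pool definition looks roughly like the sketch below (the user name, socket path and PHP version are illustrative, not Forge's exact output); nginx itself stays shared across all sites and just proxies requests into each site's socket:

    ; illustrative pool file, e.g. /etc/php/8.2/fpm/pool.d/example.com.conf
    [example.com]
    user = example              ; the site's own isolated Linux user
    group = example
    listen = /run/php/example.com.sock
    listen.owner = www-data     ; the shared nginx reads this socket
    listen.group = www-data
    pm = ondemand               ; workers spawned per-site, on demand
    pm.max_children = 10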

Related

Must strange site visitor user agents be avoided? If yes, how?

I am using shared hosting. My site was showing "ERR_CONNECTION_REFUSED", so I went to see the visitors to my (SSL) site. I found that instead of regular names in the "User Agent" list, the cPanel visitors list is showing this user agent:
"Expanse indexes the network perimeters of our customers. If you have any questions or concerns, please reach out to: scaninfo#example.com"
I want to know whether this is harmful, and if yes, how to avoid such unknown user agents. Is there something I should do with the ".htaccess" file? Once again, I am using shared hosting (so I have limited access).
The ERR_CONNECTION_REFUSED you saw when accessing your website had nothing to do with the visitor you saw in cPanel; you likely had a separate issue with your server configuration or shared hosting provider.
That "visitor" was an internet crawler, most likely from Palo Alto Networks, which owns Expanse. Long story short, it shouldn't cause any harm. They say their crawlers are used to index/categorize URLs around the internet and to spot malicious content.
I advise you to ignore it, since there's not much you can do. They presumably crawl from many IP ranges, so you wouldn't be able to blacklist all of them anyway.
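That said, if you ever do want to turn away a specific crawler and your shared host runs Apache, a minimal .htaccess sketch looks like this (the "Expanse" pattern is just this example, and blocking by user agent is best-effort, since the header is trivially spoofed):

    # Illustrative .htaccess rules: refuse requests whose User-Agent contains "Expanse"
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} Expanse [NC]
    RewriteRule ^ - [F]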

Best practice to store App Key in Laravel

I have been doing a lot of research on this and I can't seem to find a definitive answer. Obviously security is a big issue these days; hacks are happening all over the place at major companies that invest millions into security, and they're still getting hacked.
I work with Laravel a lot and use shared hosting with Hostgator or some similar company of good repute. Laravel comes with built-in functions for encrypting database info and decrypting it for the user when requested.
However, I have a question about how secure this ACTUALLY is. If someone gets into my cPanel, my app key, which is used for encryption, is right there in front of them. Granted, my cPanel password is the one that's auto-generated by Hostgator, and it's complete gibberish with semicolons and alphanumeric strings all over, so it's not easy to guess.
But I'm trying to learn a little bit more about security. If my app key in my env file is locked securely behind my cPanel login, is Laravel's built-in "encrypt()" method "enough" to call an app "secure"? Are there other measures within Laravel or my host provider that could make it more secure than just tight passwords? Is there some practice of referencing the app key from an external source that isn't located in the cPanel area, so that even if my cPanel got hacked, my app key wouldn't be in those files and get exposed?
I'm not a security expert, but there are a few points I can share from my experience working at highly secured companies.
First, Laravel itself is fine. You can generally trust open source software, since it's transparent and security bugs get discovered and addressed early. So you don't need to improve Laravel; just use it as is, preferably an LTS version.
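For context, Laravel's encrypt()/decrypt() helpers and the Crypt facade derive their key entirely from the APP_KEY value in .env, which is exactly why protecting that file matters. A minimal sketch of the documented API:

    <?php

    use Illuminate\Support\Facades\Crypt;

    // Both calls derive their AES key from APP_KEY in .env --
    // anyone who obtains that key can decrypt all stored ciphertext.
    $ciphertext = Crypt::encryptString('secret database value');
    $plaintext  = Crypt::decryptString($ciphertext); // 'secret database value'

There is no way around that dependency from within Laravel. If your host lets you set server-level environment variables, though, APP_KEY doesn't have to live in a file under the account at all, which addresses the "external source" idea from the question.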
Then, cPanel is a liability. You should minimize the weak points on your system, i.e. those that are externally accessible. Get a VPS or a private server and access it via SSH; don't run tools like cPanel and phpMyAdmin on it. The less software you have that talks to the outside world, the less vulnerable you are to bugs in that software.
In my current company the production server can only be accessed via SSH from a single IP address: the address of the dev server. So I log in to the dev server first, and from there log in to prod. The server denies connections from all other IPs.
If you are limited to using cPanel or something similar, consider protecting the login page with HTTP Basic Auth; some hosting providers allow that.
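On Apache hosts that let you drop in .htaccess files, that protection is only a few lines (the path below is illustrative; the .htpasswd file is created with the htpasswd utility, and many shared hosts expose the same feature through a "Directory Privacy" panel):

    # Illustrative .htaccess fragment: HTTP Basic Auth for a directory
    AuthType Basic
    AuthName "Restricted area"
    AuthUserFile /home/youruser/.htpasswd
    Require valid-user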
You also want to keep your system and software up to date, though not bleeding-edge new either, as the newest releases may have bugs that haven't been caught yet. Our devops prefer to stay a couple of minor versions behind, so that the community has time to test things out and get hacked for you.
That's all I know as a web dev; sure enough, there are specialized tools and DDoS protection services, but that's beyond a dev's concern IMO. If you just follow these steps, you should be safe. Hope that helped a bit, cheers :)

Device-based access policy for Laravel

Security is not my area of expertise. I am working on a lightweight administrative Laravel web app for internal use by the employees of a small company:
The app is intended to be used only by the employees
Remote work (from home) is not uncommon
Smartphones and laptops are usually used when working remotely
I would like to secure it as much as possible, beyond authentication, access controls or 2FA. I am trying to think of ways to make it virtually invisible to the public but still available to the employees. Defining proper rules for crawlers might make it a bit more obscure, but I think more could be done, and network-based restrictions would limit employee flexibility.
Based on this I got the idea that the app could be made available only if the request comes from an authorized device. I am not sure whether this is a good approach, though, and I don't know how to tackle the problem of authorizing the various devices and making that information available to the server during communication.
That is: how would I tag a device as authorized so that I only have to do it once, and then reliably validate that information in the web app? Regular authentication as well as role-based access would still be in place, but the app could return a 404 response if the accessing device is not whitelisted.
Is there a way to achieve something like this while not making it too restrictive for the users or painful to set up? Or is there a better method for achieving the same result?
Consider a VPN?
If you are hosting the app on an internal network, you could see if the IT dept. can set up VPN access for remote work (in most cases, this is already in place); then the app does not need to be accessible over the internet via a public URL. Instead you can simply navigate to the internal address once you're in the network through the VPN: no public access and no need to worry about pesky web crawlers!
It also makes it easier to moderate your application. For example, if an employee leaves the company you can simply revoke their VPN access and they'll no longer be able to access the application.
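If a VPN isn't an option and you still want the device-whitelist idea from the question, one sketch in Laravel is a middleware that checks a long-lived device cookie issued once per approved device. Everything here (the cookie name, the authorized_devices table) is hypothetical, not a built-in Laravel feature:

    <?php

    namespace App\Http\Middleware;

    use Closure;
    use Illuminate\Http\Request;
    use Illuminate\Support\Facades\DB;

    // Hypothetical middleware: the app answers 404 unless the request
    // carries a device token that an admin registered beforehand.
    class RequireAuthorizedDevice
    {
        public function handle(Request $request, Closure $next)
        {
            // Laravel encrypts and signs cookies by default, so this
            // value cannot be forged or read without the APP_KEY.
            $token = $request->cookie('device_token');

            // `authorized_devices` is an assumed table holding SHA-256
            // hashes of the tokens issued to each approved device.
            $known = $token !== null && DB::table('authorized_devices')
                ->where('token_hash', hash('sha256', $token))
                ->exists();

            // 404 rather than 403, so the app looks nonexistent to
            // unknown devices, as the question suggests.
            abort_unless($known, 404);

            return $next($request);
        }
    }

Registering the middleware globally (or on every web route) keeps the 404 in front of the login page itself. Note that a stolen laptop still carries its cookie, so this complements rather than replaces authentication.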

Multiple WordPress installations on a single domain name

I want to set up multiple WordPress installations on a single domain name. I need one WordPress installation for my web developer website, and other installations for developing customer websites. After development I would have to move these new websites to the client's server. The technical staff of my web hosting company suggested two possibilities:
1) development.domainame.com (create a subdomain for the additional WordPress installation)
2) domainame.com/development (use a subfolder for the additional WordPress installation)
They recommended solution 1). May I know the pros and cons of each solution, and which would be best practice? Thanks in advance.
PS: I can't use multisite; it's not suited for my case.
Your web host's technical staff is correct. You have 2 options:
Use Subdomains. ex: sub1.mydomain.com vs sub2.mydomain.com
Use Subfolders. ex: mydomain.com/folder1 vs mydomain.com/folder2
The benefit of using subdomains is that the "environments" are separated. The two websites act as if they were entirely different websites, making it easier to migrate the client's site in the future.
The benefit of using subfolders is that it's easier to switch between the sites. You could access all the files using the same FTP login.
I would suggest using subdomains. It separates the sites into two clearly different sites, makes it easier to restrict client/user access per subdomain, and decreases the chances that a change to one site affects the other. (For example, with two WordPress sites in subfolders it would be much easier to accidentally edit the wrong MySQL database table.)
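If you do end up with subfolders anyway, you can reduce that last risk by giving each install its own database and table prefix in its wp-config.php; a sketch with illustrative values:

    <?php
    // wp-config.php for the development copy (values are illustrative)
    define('DB_NAME', 'wp_dev');          // separate database per install
    define('DB_USER', 'wp_dev_user');
    define('DB_PASSWORD', 'change-me');
    define('DB_HOST', 'localhost');

    $table_prefix = 'dev_';               // a distinct prefix makes it harder
                                          // to edit the wrong site's tables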

Setting up web farm for DNN 6.2.6 CE with multiple file servers

We are planning to convert our website, which is running on a single server, to a web farm with two servers on Windows 2008 R2. I'm afraid I haven't found a lot of documentation on how to achieve this. Can anyone please point me to the proper documentation? The one document I found is
http://www.datasprings.com/resources/articles-information/creating-a-webfarm-for-your-dotnetnuke-site
This one explains using a single UNC share as the file server, but we are looking to use every server in the web farm as a file server (i.e. have the dotnetnuke folder on each server's local drive), since a UNC share becomes a single point of failure. So my questions are:
Can we do a DNN web farm with multiple file servers, and if so, how?
How should module updates be done? Do they need to be done on each server separately, does DNN have any built-in mechanism for this, or do we need to use DFS replication between the servers?
Also, we use heavy caching. Since we have to use file-based caching in DNN CE web farms, how does caching work with multiple file servers?
Finally, please let me know of any gotchas I need to be aware of. Any help is greatly appreciated.
The recommended way of doing a web farm for DNN is to use a single UNC share. Even with the paid editions of DNN that is the recommended approach.
Is it possible to do it any other way? Yes, but there is nothing built into DNN to help you do so.
If you want to use multiple file servers, you start running into issues with file-based caching, module installations, etc.
Using a UNC share is the best and easiest way to run a DNN portal on a web farm. If you use a single UNC share from all servers, you shouldn't get cache issues.
I had set up a web farm for a DNN portal that ran on four web servers, with a fifth being used as the file server (UNC share) and DNN database server, and it worked quite well.
One more thing you should consider: sessions.
DNN itself does not make use of session variables, but if your own modules or third-party modules use session state, it would be good to implement a Session State Server.
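For reference, pointing ASP.NET at an out-of-process session state server is a small web.config change on every farm node (the host name below is illustrative; 42424 is the state service's default port, and the timeout is in minutes):

    <!-- Illustrative web.config fragment: shared session state for all farm nodes -->
    <system.web>
      <sessionState mode="StateServer"
                    stateConnectionString="tcpip=state-server-host:42424"
                    timeout="20" />
    </system.web>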
