Where or how to host secret code (kept secret from the hoster)?

I would be very interested to know where you host websites whose code must not get out to the public. For example, I found out that big hosters in Germany give their support staff full access to all files in their clients' hosting accounts. I assume that it is relatively easy to sneak into a support job there. I am aware that the employees have signed contracts stating that they are to keep the customers' data secret. But what if a well-known website is hosted there and an employee downloads a copy and searches the code for weak points at home to exploit them for himself?
Is there any way to protect particularly sensitive (PHP) files from the hoster?
How do the big players do it? Do they all host on their own servers?
I am curious about your answers.

Related

Best practice to store App Key in Laravel

I have been doing a lot of research on this and I can't seem to find a definitive answer. Obviously security is a big issue these days; major companies that invest millions into security are still getting hacked all over the place.
I work with Laravel a lot and use shared hosting with HostGator or some similar company of good repute. Laravel comes with built-in functions for encrypting database info and decrypting it for the user when requested.
However, I have a question about how secure this ACTUALLY is. If someone gets into my cPanel, my app key, which is used for encryption, is right there in front of them. Granted, my cPanel password is the one that's auto-generated by HostGator and it's complete gibberish with semicolons and alphanumeric strings all over, so it's not easy to guess.
But I'm trying to learn a little bit more about security. If my app key in my .env file is locked securely behind my cPanel login, is Laravel's built-in encrypt() method "enough" to call an app "secure"? Are there other measures within Laravel or from my host provider that could make it more secure than just tight passwords? Is there some practice of referencing the app key through an external source that's not located in the cPanel area, so that even if my cPanel got hacked, my app key wouldn't be in those files and wouldn't get exposed?
I'm not a security expert, but there are a few points I can share from my experience in working at highly-secured companies.
First, Laravel itself is fine. You can generally trust open source software since it's transparent and security bugs get discovered and addressed early. So you don't need to improve Laravel, just use it as is, preferably an LTS version.
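For reference, a minimal sketch of the round trip in question on a recent Laravel version; everything hinges on the APP_KEY value read from .env, and the string being encrypted here is just an example:

use Illuminate\Support\Facades\Crypt;

// Both calls read the key from config('app.key'), i.e. APP_KEY in .env.
// The default cipher is AES-256-CBC; anyone holding the key can decrypt everything.
$ciphertext = Crypt::encrypt('secret payload');
$plaintext = Crypt::decrypt($ciphertext); // throws DecryptException if the key is wrong

The ciphertext is worthless without APP_KEY, which is why keeping the .env file out of reach matters more than the cipher itself.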
Then, cPanel is a liability. You should minimize the weak points on your system, i.e. those that are externally accessible. Get a VPS or a private server and access it via SSH; don't run tools like cPanel and phpMyAdmin on it. The less software you have that talks to the outside world, the less vulnerable you are to bugs in that software.
In my current company the production server can only be accessed via SSH from a single IP address: the address of the dev server. So I log in to the dev server first and then log in from there to prod; connections from all other IPs are denied.
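The client side of that setup can be captured in ~/.ssh/config with OpenSSH's ProxyJump directive; the host names here are hypothetical, and the deny-all rule itself lives in the prod server's firewall:

# ~/.ssh/config -- prod only accepts SSH from the dev box
Host dev
    HostName dev.example.com
    User deploy

Host prod
    HostName prod.internal.example.com
    User deploy
    ProxyJump dev

With that in place, "ssh prod" transparently hops through the dev box (OpenSSH 7.3 or newer).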
If you are limited to using cPanel or something similar, consider protecting the login page with HTTP Basic Auth; some hosting providers allow that.
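On an Apache host that is a few lines of .htaccess (the path is hypothetical, and the .htpasswd file is generated with the htpasswd utility); many hosting panels expose the same thing as "password protect directories":

AuthType Basic
AuthName "Restricted area"
AuthUserFile /home/youruser/.htpasswd
Require valid-user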
You also want to keep your system and software up to date. Not too new either, as the newest releases may have bugs that haven't been caught yet. Our devops prefer to stay a couple of minor versions behind, so that the community has time to test things out and get hacked for you.
That's all I know as a web dev; sure enough there are specialized tools and DDoS protection services, but that's beyond a dev's concern IMO. If you just follow these steps, you should be safe. Hope that helped a bit, cheers :)

Secure folder contents and delete them after certain number of days

I would like to secure a folder so that no one can cut or copy any file, or the contents of any file, without a "secure" password (I'm happy to get rid of the password bit as well, so that no one can cut, copy or move any file or file contents out of the folder). Also, if all files and folders inside my root folder could be deleted after a certain number of days, that would be great. This is to stop people from copying and distributing my files to others without my permission, and to have the folder contents "expire" after a certain number of days (e.g. 7 days).
Currently, I manually copy the folder to other people's machines, so I do have physical access to their machines.
PS. I am happy to write a script as well, in case there is a way to execute a script every time I open the folder.
I understand I can't stop people from stealing file contents by manually retyping them into another file or taking photos of them, but I want to make it harder for them.
This is not a PowerShell issue, nor is there a solution provided by PowerShell. This is a data risk management issue as well as a reality check.
Don't get me wrong, you can write a script that encrypts data,
https://blogs.technet.microsoft.com/heyscriptingguy/2015/03/06/powertip-encrypt-files-with-powershell
You could even just use EFS, but each of those has several limitations.
https://technet.microsoft.com/en-us/library/bb457116.aspx
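For example, EFS can be turned on for a folder from the built-in command line (the path here is hypothetical):

rem Encrypt the folder and everything below it with EFS
cipher /e /s:C:\Data\ProtectedFolder

But EFS is transparent to any user you have authorized, which is exactly the kind of limitation meant here: it protects data at rest, not data from the person you shared it with.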
Then there are password-encrypted zip files. But...
None of the above stops cut/copy/paste/print, and there is no way to make them do so.
Here is the simple truth to data security which I deliver at all my public speaking engagements and customer deployment engagements.
Nothing can defeat an ocular attack. Meaning...
'If I can see your data, I can take your data.'
It may take me longer than being able to just bulk exfiltrate your data (copy to a USB drive, CD, DVD, native print, etc.), but I can just take a picture, photocopy it, screen-grab it from another device, or manually write it down.
Either method allows me to walk away with it and give it to whomever I like.
You can only mitigate / slow down / prevent bulk exfiltration using DLP/RMS protection solutions.
Why are you putting this manually on their systems, vs hosting it in the cloud where they can access it? If you do this in MS Azure, you can leverage Azure Information Protection.
RMS for individuals and Azure Information Protection
RMS for individuals is a free self-service subscription for users in an organization who need to open files that have been protected by the Azure Rights Management service from Azure Information Protection. If these users cannot be authenticated by Azure Active Directory and their organization does not have Active Directory Rights Management (AD RMS), this free sign-up service can create an account in Azure Active Directory for a user. As a result, these users can now authenticate by using their company email address and then read the protected files on computers or mobile devices.
https://learn.microsoft.com/en-us/information-protection/understand-explore/rms-for-individuals
Why are you not heavily watermarking your data?
Putting passwords on files and folders does not prevent that ocular attack.
Neither does DLP/RMS. You can apply cut/copy/paste/print policies, remove access after a certain date, and restrict access as per the feature set using policies.
Yet, again, this is just prevention against the bulk dumping / sharing of your data, not against the fine-grained, patient, write-it-down or capture-it-from-a-remote-camera approach. Even if you block cut/copy/paste on the host, I can bring that host up in a screen-sharing session (think Remote Desktop) and take screenshots of the RDP session using the tools on the machine I connect from. Heck, I could create a webcast and share it with a group, meaning I open it on my system and let people view it with me.
No DLP solution is 100%. Anyone telling you this is lying.
As one who has been doing Info/CyberSec for almost two decades and has evaluated, deployed and used several DLP solutions, what I state here is from experience. DLP is important, and businesses must look to it as another mitigation in their risk strategies, but they must do so with real vision and a sense of reality.
No matter who it is from, no technology can prevent this ocular avenue. If you don't want your data leaving your control, then don't share it. Yet, since you are in the education business, that is not an option.
I'll say it again, and again...
'If I can see your data, I can take your data.'

Looking for a way (preferably an API) to determine Effective Permissions on Active Directory object

We have a custom Active Directory integrated web app that helps users perform some self-service on their accounts (e.g. update photo, change phone number, reset password etc.) Our app runs on domain-joined servers, as Local System, and is thus able to authenticate to the AD using the server account(s).
We use a service connection point that the app's clients use to locate an instance of our app. (Our app clients are hard-coded to look for certain keywords which are published in the service connection point's keywords attribute.)
We recently had a situation wherein someone (we believe accidentally) changed the keywords on one of the service connection points, resulting in an outage, since the clients could no longer find our SCP when querying the AD for our keyword(s).
The customer is a bit upset about this and wishes for us to provide them the ability to determine who can change the keywords on our SCPs. This feedback was passed on from our sales guys to us, and now we need to provide some way of helping them figure out who can change the keywords on our SCPs.
So, we're looking for an API to help us to determine Effective Permissions on our Active Directory service connection point objects, so we can alleviate this situation for the customer. We couldn't quite find an Effective Permissions / Access API that could help us list all the users who have effective write access to the keyword and other attributes on our SCPs.
Is there an API/other way that one can use to determine Effective Permissions on an Active Directory object?
It needs to be able to list all the users who have a specified access on a specified set of attributes of an Active Directory object.
This Stack Overflow post may be able to help you. LINQ to LDAP should also let you access the information pretty easily.
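If you just need to see who holds write access on those objects, a starting point is to dump the DACL with dsacls, which ships with the AD DS admin tools (the DN below is hypothetical):

rem Lists the ACEs on the SCP object, including WriteProperty grants
dsacls "CN=MyApp-SCP,CN=MyApp,CN=System,DC=example,DC=com"

Note that this enumerates ACEs rather than computing true effective permissions: to answer "who can write this attribute" you still have to expand group memberships and account for inheritance, which is what the Effective Permissions tab in Active Directory Users and Computers does interactively.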

Azure, Sync Framework and Access Control Service: Are there obvious shortcomings or problems in using these technologies together?

I have a desktop application which uses flat files (some xml and small pictures) as data. I want this data to be available on other PCs which have the desktop application installed and usable by a smartphone client (WP7 at the moment) as well.
It should be very easy for the user to synchronize this data. He should be able to use accounts he already possesses (Live login, Googlemail, Facebook, ...).
I thought about using Azure Blob Storage to save the data in Azure, the Sync Framework to perform the actual synchronization and the Access Control Service to handle authentication.
I have not used any of these technologies before, so any advice would be great, but I'm looking foremost for errors or shortcomings in this strategy that I don't see yet. Is this approach viable at all?
Windows Azure is basically a virtualized datacentre. It is elaborate and complicated and is pitched at corporations who don't want to own their server infrastructure or hardware.
If I understand correctly, what you want is a cloud fileserver, not a whole LAN. Windows SkyDrive fulfils this requirement nicely and offers 25GB of storage per member with no charge for membership.
About Hotmail and Windows Live

People often confuse Hotmail and Windows Live, because when you set up a Hotmail account it uses Windows Live for authentication and therefore you end up with a Windows Live account and all the associated facilities, including SkyDrive. However, it is entirely possible to set up a Windows Live account using any email address as the username.

If you do this, it is important to be aware that the Windows Live password associated with a given email address is completely independent of the password required by the mail server that hosts mail for the account. This can cause a great deal of user confusion. For Hotmail (or any other mail server that uses Windows Live for authentication) they are guaranteed to be the same password.
There is no official Microsoft framework support for SkyDrive. There is an open source project called SkyDriveApiClient, but it only works with the full .NET framework. I tried porting it but the author was a bit of an architecture astronaut, and it is absolutely riddled with [Serializable] which is not available on WP7x.
The WP7 guys have said that the WP7 framework will probably include support for SkyDrive but not in Mango (WP7.1) and given that Microsoft's typical release cycle is 18 months and Mango has yet to hit the streets, I'd say it will be two years before you can count on intrinsic cloud file services for WP7.
Roll-your-own wouldn't be hard; WCF services are dead easy to use from WP7. But that's not really cloud, since you have to provide and maintain the server infrastructure yourself. For this reason, and given the MS timetable, I have put a great deal of effort into producing my own SkyDrive client for WP7. Core functionality is complete and I am now refactoring, improving robustness and adding performance enhancements like local caching of tokens (cookies, essentially). I don't intend to release it; I have a number of apps planned that depend on this functionality and it suits me fine that there is a substantial barrier to competition.
I didn't tell you that to tease you. My point is that I'm so sure SkyDrive is the right answer that I put a lot of work into making it happen.
Cloud file storage is a perfect fit for mobile devices.
Azure is not a good answer for the sort of phone apps individuals want, because the data store isn't shared in a way that requires indexing or supports high levels of concurrency.
I can certainly think of corporate phone apps that would benefit from using SQL Server as storage.
Azure can do file services, but it represents an ongoing expense. Nobody's going to put up with that when Google and Microsoft both give away web-based cloud storage.
I can personally attest that if you're determined, it is possible to use SkyDrive from WP7.
Cloud storage is the only way you're going to get programmatically accessible storage that's shared by your user's mobile device and his computer. One of the things I intend to do that depends on shared storage is write a Silverlight app that lets you prepare map routes with multiple waypoints on a desktop computer and a companion app that uses them on WP7.
The Windows Live team has released what they call support for WP7. They supply a sample project showing you how to instantiate a browser object, load their login pages, manipulate them to log in, and use their JavaScript API to manipulate SkyDrive.
This has one big advantage: browser cookies and cached credentials. The disadvantages are obvious; technical shortcomings notwithstanding, the Windows Live team seems to think the only thing people want to do with a phone is tag their photos and fiddle with social media.
I have finished my own libraries. They do not support most of the social media twaddle. I have treated SkyDrive as no more or less than a cloud file system, providing:
Authenticate(username, password)
CreateFolder(folderpath[, blocking=false])
Delete(fileOrFolderPath[, blocking=false])
SaveString(filepath, value[, blocking=false])
LoadString(filepath)
I could handle binaries, but Convert.ToBase64String makes this unnecessary and strings are convenient for XML. CreateFolder, Delete and SaveString are optionally blocking. LoadString is always blocking because it's a function that returns the loaded string. CreateFolder is recursive, so you can create an entire path in one call (e.g. /folder1/folder2/folder3). Calling CreateFolder on a pre-existing path has no effect, and SaveString uses CreateFolder to ensure the path is valid, making it unnecessary to create a filepath in advance. Authenticate loads the file system (except file content) into memory, eliminating server chatter. This is asynchronous, and a FileSystemReady event announces when the file system is completely loaded. The model is maintained as you add and remove files and folders.
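So, in the same shorthand as the signatures above, typical usage would look like this (the account and paths are hypothetical):

Authenticate("user@example.com", "password")   // async; wait for FileSystemReady
CreateFolder("/app/data", blocking=true)       // creates the whole path in one call
SaveString("/app/data/settings.xml", xml)      // ensures the path exists first
xml = LoadString("/app/data/settings.xml")     // always blocking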
This was a lot of work, and no one responded to my attempt to make it an open source project, so I'm not inclined to give the fruits of my labour away; but provided your plans don't compete with mine, I could be persuaded to come to an arrangement.

Single GoDaddy account suitable for hosting multiple dev projects?

I want a single hosting account where I can put up my development sites and small sites I do for friends; some might be experiments, some might be public. None will get huge traffic. They'll all be using either roll-my-own PHP or CodeIgniter with MySQL.
I'll want to be pointing multiple domain names at different directories under this account. I'll also probably make use of rewrites extensively.
I'm not in the US, but US hosting is far more economical. Is GoDaddy a good choice given my requirements? I'm looking at the base account as it allows unlimited domain names.
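For concreteness, the kind of per-domain rewrite I mean looks roughly like this .htaccess sketch (domain and directory names are hypothetical):

RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?friendsproject\.com$ [NC]
RewriteCond %{REQUEST_URI} !^/friendsproject/
RewriteRule ^(.*)$ /friendsproject/$1 [L]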
What I hate about GoDaddy is that their domain registrations are "expensive". With privacy it comes to essentially $18/domain, compared to someone like DreamHost (which has free privacy; about $10/domain).
I personally use DreamHost to register my domains and Rackspace to serve the content.
Their smallest instance is ~$12/month.
I like the freedom Rackspace gives me; it is a full Linux box with whatever you want on it. Shared hosts often aren't flexible enough for quirky frameworks/requirements. In your case, any shared hosting will do since you are using PHP/CI.
I'm looking at the base account as it allows unlimited domain names.
Nowadays, just about everyone offers unlimited domain names and what not. Not really a killer feature.
In the end shared hosting is shared hosting. You are sharing a space with other users. If it is experimental then it won't matter.
Something you may wish to consider is the money-back policy. For instance, I at one point had an account with MochaHost, and they only offer a full refund within 30 days and a limited refund within 180 days. After that, they keep your money. Something to consider.
