Hosting on Google Drive - web-hosting

I'm new to web development and had to design a website for college. I decided to build it from scratch. I am currently hosting the website on Google Drive and it's working fine. Does this mean that it will work fine on a hosting company's web server? The main reason I ask is the many errors I get when running markup validation!

There are a lot of variables involved in this decision. The main assumption I'll make is that it's a flat site with no server scripting involved. If that's the case, you shouldn't have any issues copying the files over to another web server, though you should fix up those markup validation errors.
When I mention server scripting, I'm referring to PHP/ASP/ColdFusion/etc. code that may be in your pages.
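One quick way to verify the flat-site assumption: serve the folder with a dumb static file server and click through every page. If everything works locally, any static web host can serve it. A minimal sketch using Python's standard library (assumes Python 3 is installed; the port is arbitrary):

    import http.server
    import socketserver

    PORT = 8000  # arbitrary local port

    # SimpleHTTPRequestHandler serves files from the current directory,
    # exactly as a basic static host would - run this from the site folder.
    with socketserver.TCPServer(("", PORT), http.server.SimpleHTTPRequestHandler) as httpd:
        print(f"Serving at http://localhost:{PORT}/")
        httpd.serve_forever()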

Related

Best practice to store App Key in Laravel

I have been doing a lot of research on this and I can't seem to find a definitive answer. Obviously, security is a big issue these days; even major companies that invest millions into security are still getting hacked.
I work with Laravel a lot and use shared hosting with HostGator or a similar reputable company. Laravel comes with built-in functions for encrypting database info and decrypting it for the user when requested.
However, I have a question about how secure this ACTUALLY is. If someone gets into my cPanel, my app key, which is used for encryption, is right there in front of them. Granted, my cPanel password is the one auto-generated by HostGator, and it's complete gibberish with semicolons and alphanumeric strings all over, so it's not easy to guess.
But I'm trying to learn a little more about security. If the app key in my .env file is locked securely behind my cPanel login, is Laravel's built-in encrypt() method "enough" to call an app "secure"? Are there other measures within Laravel or from my hosting provider that could make it more secure than just tight passwords? Is there some practice of referencing the app key from an external source that isn't located in the cPanel area, so that even if my cPanel got hacked, my app key wouldn't be in those files and get exposed?
I'm not a security expert, but there are a few points I can share from my experience working at highly secured companies.
First, Laravel itself is fine. You can generally trust open-source software, since it's transparent and security bugs get discovered and addressed early. So you don't need to improve Laravel; just use it as-is, preferably an LTS version.
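As for encrypt() itself: any symmetric encryption, Laravel's included, is exactly as strong as the secrecy of the key, so the real question is where the key lives. Here is a minimal sketch of that idea in Python, using the third-party cryptography package purely as a stand-in for Laravel's encrypter (the APP_KEY environment variable is hypothetical):

    import os
    from cryptography.fernet import Fernet

    # Hypothetical: the key is injected by the environment (process manager,
    # secrets store), not kept in a file inside the web root / cPanel area.
    key = os.environ.get("APP_KEY", "").encode() or Fernet.generate_key()
    fernet = Fernet(key)

    token = fernet.encrypt(b"sensitive database value")
    print(fernet.decrypt(token))  # whoever holds `key` can do exactly this

The takeaway: moving the key out of files the control panel can read, into an environment variable or a separate secrets store, means a cPanel breach alone no longer exposes it.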
Second, cPanel is a liability. You should minimize the weak points on your system, i.e. those that are externally accessible. Get a VPS or a private server and access it via SSH; don't run tools like cPanel and phpMyAdmin on it. The less software you have that talks to the outside world, the less vulnerable you are to bugs in that software.
At my current company, the production server can only be accessed via SSH from a single IP address: that of the dev server. So I log in to the dev server first and then log in from there to production; connections from all other IPs are denied.
If you are limited to using cPanel or something similar, consider protecting the login page with HTTP Basic Auth; some hosting providers allow that.
You also want to keep your system and software up to date. Not too new, either, as the newest releases may have bugs that haven't been caught yet. Our devops prefer to stay a couple of minor versions behind, so that the community has time to test things out and get hacked for you.
That's all I know as a web dev. Sure enough, there are specialized tools and DDoS protection services, but those are beyond a dev's concern, IMO. If you just follow these steps, you should be safe. Hope that helped a bit, cheers :)

When to self host ASP.NET Web API

While reading a blog I found that ASP.NET Web API can be self-hosted. There are loads of links explaining how to self-host Web API, but I could not find any explaining when it makes sense over IIS. Could someone please point me to a couple of scenarios where self-hosting Web API is better suited than IIS hosting?
Thanks,
Ravi
Generally speaking, you would self-host Web API to get better performance and to get rid of the unnecessary parts of the IIS pipeline. Additionally, you get better control over handling HTTP requests, configuration, and so forth. Since you have fewer dependencies on other applications, your deployment and troubleshooting get easier and less complicated.
Having said that, you have to write code to handle everything, even simple things (such as returning static files) that IIS does for you.
Thanks to ASP.NET Core, you can host your apps on Linux, macOS, and Windows, so going cross-platform is another reason to use self-hosted apps.
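To make the trade-off concrete, here is a language-agnostic sketch of what self-hosting means: your own process owns the entire HTTP pipeline. (Python is used here only for brevity; in ASP.NET Web API the equivalent would be the self-host/OWIN packages, not this code.) Note how anything the handler does not explicitly implement, static files included, simply fails:

    import json
    from http.server import HTTPServer, BaseHTTPRequestHandler

    class ApiHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path == "/api/status":
                # The one route this process knows how to serve.
                body = json.dumps({"status": "ok"}).encode()
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)
            else:
                # No IIS in front of you: static files, caching, and logging
                # are all your code's responsibility now.
                self.send_error(404)

    # The process itself listens on the port; there is no host web server.
    HTTPServer(("localhost", 8080), ApiHandler).serve_forever()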

Data synchronization in a Chrome packaged app

In my pursuit to write a Chrome packaged app, I am struggling to get my data synchronized to the app so that it can be used in offline mode.
My data lives on a server and I access it through a RESTful service; in this case I use .NET MVC Web API.
What I have tried: Breezejs, because of its easy offline capabilities. The problem is that window.localStorage is not available in packaged apps; I did try switching it out for IndexedDB, but with no luck.
I tried chrome.storage, which worked great with the built-in sync, but it is not a big truck, and I need at least a 10-tonner: the storage quota is far too small for my data.
So my question: is there a silver bullet, compatible with packaged apps, that wraps XMLHttpRequest to make it easier to fetch data from a RESTful service, stores it in IndexedDB so it can be used offline, and syncs the changes back when the app goes online?
I know I will probably have to write my own, but if someone who has already jumped through all the hoops and complexities of synchronization can guide me, that would be awesome.
Have you looked at using the syncFileSystem API?
As long as you are happy to sync your data into Google Drive, this API should meet your needs: it spares you from implementing sync code yourself while still working offline.
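If syncFileSystem doesn't fit and you do end up rolling your own, the core pattern is smaller than it looks: cache records locally, queue changes made while offline, and replay the queue against the REST service when connectivity returns. A storage-agnostic sketch of that pattern (shown in Python only for brevity; in a packaged app the dict and list would be IndexedDB object stores, and the URL, endpoint, and record shape here are hypothetical):

    import json
    import urllib.request

    API_BASE = "https://example.com/api"  # hypothetical REST endpoint

    store = {}   # local cache of records, keyed by id (IndexedDB in the app)
    outbox = []  # changes made while offline, in order

    def save_locally(record):
        """Apply a change locally and remember it for later sync."""
        store[record["id"]] = record
        outbox.append(record)

    def sync():
        """Replay queued changes once back online (last write wins)."""
        while outbox:
            record = outbox[0]
            req = urllib.request.Request(
                f"{API_BASE}/items/{record['id']}",
                data=json.dumps(record).encode(),
                headers={"Content-Type": "application/json"},
                method="PUT",
            )
            urllib.request.urlopen(req)  # raises on failure, so the item stays queued
            outbox.pop(0)                # dequeue only after the server accepted it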

Best practice for web hosting a website with Scrapy spiders running in the backend

Maybe I am missing something about Scrapy, but here is what I am about to do:
I have created a website based on information I am crawling from the Internet using Scrapy crawl spiders. However, I am stuck on how to get my website live. I am considering web hosting, but most service providers do not allow installing those scripts on their servers. Of course I could rent a server, but that is too expensive for me at the moment. Could anyone please shed some light on this if you have similar experience? The website is based on ASP.NET, so the web host will need to support MS SQL and ASP.NET as well as Scrapy. Is there something in Scrapy that would let me get the spiders running without installing anything? Much appreciated.
Cheers,
Ray
You would need a hosting service where you can install the scrapyd service so that you can automate your screen scraping. I've never done it, as I am just getting started with Scrapy, but here is the information on scrapyd: http://readthedocs.org/docs/scrapy/en/latest/topics/scrapyd.html
You might want to look at Virtual Dedicated Servers for hosting, as they are cheaper than co-located or dedicated servers but give you more control than shared hosting.
I have found a lot of success in deploying and running my spiders periodically using Heroku, for free. You can read about the steps here.
Alternatively, you could use Scrapyd to host your spiders and actually send requests, alongside ScrapydWeb.
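For context on the scrapyd route: once it is installed on the server, it exposes a small HTTP API, so a cron job or even your ASP.NET site can queue crawls without shell access. A minimal sketch (the project and spider names are hypothetical; the port is scrapyd's default):

    import urllib.parse
    import urllib.request

    # schedule.json queues a crawl; scrapyd runs the spider in its own process.
    data = urllib.parse.urlencode({
        "project": "mysite_crawlers",  # name the project was deployed under
        "spider": "articles",          # spider to run
    }).encode()

    with urllib.request.urlopen("http://localhost:6800/schedule.json", data) as resp:
        print(resp.read())  # e.g. b'{"status": "ok", "jobid": "..."}'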

Browser Sync across many machines

Everyone remembers Google Browser Sync, right? I thought it was great. Unfortunately, Google decided not to upgrade the service to Firefox 3.0. Mozilla is developing a replacement for Google Browser Sync as part of the Weave project. I have tried using Weave and found it to be very, very slow or totally inoperable. Granted, it is in an early development phase right now, so I can't really complain.
This specific problem of browser sync got me thinking, though. What do all of you think of Mozilla, or someone else, making a server/client package that we, the users, could run on our own 'main' machines? You would just have to know your own IP, or have some way to announce it to your client browsers at work or wherever.
There are several problems I can think of with this: non-static IPs, opening up ports on your local machine, etc. It just seems that Mozilla does not want to handle the traffic created by many people syncing their browsers. There is no way for them to monetize this traffic, since all the uploaded data must be encrypted.
Mozilla Weave is capable of running on personal servers. It uses WebDAV to communicate with HTTP servers and can be configured to connect to private servers. I've tried setting it up on my own servers, but with no success (mainly because I'm not very good at configuring WebDAV in Apache).
I'm hoping Mozilla Weave eventually allows FTP access so I can easily use my server to host my Firefox profile.
If you're interested in trying Mozilla Weave on a personal server, there's a tutorial here:
http://marios.tziortzis.com/page/blog/article/setting-up-mozilla-weave-on-your-server/
Browser Sync is up on Google Code now. It doesn't look like anything has been done with it yet, though, as far as making it hostable on personal servers/computers.
I've been using the Firefox ScrapBook extension, synced via FolderShare. It takes a little setup, but the nice thing is that ScrapBook grabs a local copy of each page, so it works offline or if the site goes away.
Not a complete solution to this problem, but I've found FoxMarks to be a really nice bookmark-syncing extension.
