I'm reading this Rails tutorial, which suggests using Cloud9 for the IDE. It also recommends Heroku for hosting your app.
It turns out that Cloud9 has built-in hosting of some sort; every Cloud9 app also has a public URL, and you can run Rails yourself (albeit with WEBrick by default).
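For context, this is roughly how you run it inside a workspace (a minimal sketch; from what I can tell, $IP and $PORT are environment variables the workspace provides):

```
# run from the Rails app directory in the Cloud9 terminal;
# $IP and $PORT are set by the workspace (typically 0.0.0.0 and 8080),
# and the app then answers on the workspace's public URL
rails server -b $IP -p $PORT
```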
However much I Google, I can't find any details about the hosting offering Cloud9 provides. WEBrick aside, what are the limits on RAM, etc.? I know Cloud9 uses Docker and Ubuntu to build a VM, but I can't find much more than that.
(While I am following the recommendation to use Heroku for hosting, I can't help but wonder how the two compare.)
Disclosure: I work for Cloud9 :) http://c9.io/site/about
Cloud9 is meant for development, so its hosting is like running Rails on your laptop and pointing your domain at it (albeit with a fair bit more bandwidth). It's OK for showing your project to some friends or testers, but not for running a proper website. You'd also have to make your project public, which means others will be able to see your source code.
Also, on free accounts your project is archived after a week without development, so it won't be accessible from the outside world any more. It is unarchived when you work on it again.
Only premium accounts' workspaces are kept running, because we can't afford to keep free users' workspaces taking up RAM and disk when they're not being actively used. Free users get 512 MB RAM + 1.5 GB disk, and premium users get 1 GB RAM + 5 GB disk per project.
If I recall correctly, 512 MB of RAM is allocated to non-premium users and 1 GB to premium users. That is enough for a developer to work with.
Also, unless you're willing to share your source code with your visitors, there is no way to use it as a hosting platform.
Private Workspace = Private to developers/workspace users.
Public Workspace = Public to all; even the source code.
https://c9.io/site/blog/2013/05/can-i-use-cloud9-to-do-x/
Is it possible to serve my production domain through the Laravel Valet service?
I have a domain which I have purchased, but I want to serve my website through Valet.
Laravel Valet is pretty much exclusively for local development and is not recommended for running production code. It's essentially a lightweight shortcut that lets you start developing in Laravel quickly. It is also maintained exclusively for the Mac; I mention this because most production websites run on a Linux or Windows environment, and support for running on a Mac would be relatively small. (https://www.quora.com/What-is-the-Server-Operating-System-market-share)
Off the top of my head, this is also not a good idea because Valet proxies every folder inside a directory where you've run valet park to a http://[folder_name].test URL, which could theoretically open up security holes if you accidentally moved something in your local development environment into the parked folder. Essentially, it opens you up to a bunch of security issues, and that is just the first of them.
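To make that concrete, here is a minimal sketch of the behaviour on the Mac running Valet (the directory names are just placeholders):

```
# on the Mac running Valet
cd ~/Sites
valet park              # every subdirectory of ~/Sites is now served
mkdir client-backup     # any folder dropped in later...
# ...is immediately served at http://client-backup.test on this machine,
# whether or not you ever meant to expose it
```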
I know it's tempting to just publish the site because everything "just works" in your local environment, but I would highly recommend finding a dedicated host for your website running in production. There are plenty of hosting packages that let you run your Laravel application safely and cheaply.
Shared hosting is a lot easier than Valet. I spent a good 4 hours trying to get Valet working on my Mac because I had about 5 versions of PHP installed. If you use Valet on a production server, then you're going to have to pay more for a dedicated server that will allow you to install the software, and you're getting a less productive environment. You're basically paying more for less quality. A shared hosting site will at least provide support to fix any server issues not related to your web app, and you're paying less.
I have TFS 2015 installed on one of the company's servers. When I access TFS through web access it is extremely slow: it takes more than 5 minutes for a page to load, and sometimes even longer. If I restart the server, TFS becomes a little faster (a page needs only a minute or so to load), but it soon slows down again.
The server itself is okay; the CPU and memory are not even fully utilized (utilization sits around 20%-40%).
Other applications that are installed on the server are working fine, so it's just TFS.
Any suggestions?
Log in to the application tier machine and try to use web access from there, to see whether you get the same behavior.
Check the network connection between the application tier machine and the data tier machine if you set up TFS in a multiple-server configuration. You may also try turning off the firewall and anti-virus software on those machines.
Clean the cache folder on the application tier; it is usually located at C:\TfsData\ApplicationTier\_fileCache (a command sketch follows these suggestions).
Check the requirements and compatibility documentation to see whether your TFS is set up in an appropriate environment.
If the items above do not help, you may need to consider moving your TFS to different hardware.
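For the cache-cleaning suggestion, here is a minimal sketch of what that usually looks like on a default single-server install, run from an elevated command prompt on the application tier (the path is the default one mentioned above; adjust it if your installation differs):

```
rem stop IIS so TFS releases the cache files
iisreset /stop

rem recreate the web access file cache (default location)
rd /s /q C:\TfsData\ApplicationTier\_fileCache
md C:\TfsData\ApplicationTier\_fileCache

rem bring IIS back up; TFS repopulates the cache on demand
iisreset /start
```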
Hello, I have been wanting to get into working with a framework, and Laravel seems like a decent one to try.
I have seen a lot of tutorials that tell you how to set up Laravel locally with Homestead or variants.
I want to install and set up Laravel on my dedicated remote server with my hosting company. From there I want to be able to work on it from my local MacBook or Mac Pro.
I have not been able to find a good tutorial to make this happen in the fashion I want to do it.
I work with PHP and related technologies daily, but I usually log in over FTP, edit files with TextWrangler, save them, and go about my day, so my methods are dated and not efficient.
One side note: I also have a Dell PowerEdge server running CentOS and VestaCP in my office as my development server, so nothing is done locally per se (on my own computer). The question and answer will therefore apply both to my remote server and to my in-office development server.
Any suggestions are always welcome.
Best Regards,
Bradley
Assuming you have full root access to your remote servers, you should install Composer on them and install Laravel in whichever way suits you. Then you can edit your project files just as if you were working on them locally.
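For example, a minimal sketch of that first step over SSH (it assumes PHP is already installed on the server, and the project name myapp is just a placeholder):

```
# install Composer globally on the server
curl -sS https://getcomposer.org/installer | php
mv composer.phar /usr/local/bin/composer

# create a fresh Laravel project; point the web server's document root
# at myapp/public once it finishes
composer create-project laravel/laravel myapp
```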
Seriously though, the biggest thing you should add to your development arsenal (in case you haven't already), and the one that will make your development process so much more resilient, is Git.
Set up a free Bitbucket account, get a free Git client, and learn how commits, pushes, pulls, branches and deployments work. The easiest approach for deployment is to use a service such as Envoyer.
That way you can develop and test locally (even if 'locally' is a remote machine) and not really have to worry about breaking your app by making a mistake in a controller or something on the live server.
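A rough sketch of that workflow from the command line (the repository URL, branch name, and file path are placeholders, and your default branch may be named differently):

```
# one-time setup inside the project folder
git init
git add .
git commit -m "Initial commit"
git remote add origin git@bitbucket.org:youruser/myapp.git
git push -u origin master

# day to day: branch, commit, push, then deploy from the repository
git checkout -b feature/contact-form
git add app/Http/Controllers/ContactController.php
git commit -m "Add contact form"
git push -u origin feature/contact-form
```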
We're transitioning from Rackspace dedicated boxes to a completely cloud-based Azure environment, for both production and development servers, and as an MS shop we're going to be using Visual Studio Team Services. As an MS ISV partner we have a bunch of MSDN seats, so our developers will all have an MSDN w/ VS Premium account, which we'll use with Team Services/TFS. We're replicating our production web server on a virtual machine, but after some refactoring we will eventually move to an Azure website.
My question is about what happens when users leave the company. Right now we have everyone log into a development server using RDP and develop on that server. When someone is gone, we shut off their access to that server.
With Team Services, when a user opens a project, do they automatically get the entire project downloaded to their local development environment/machine? If someone leaves the company, is there a process in VSO that secures that code and removes it from them or makes it inaccessible? Is there any way to lock it down when we need to? I can't seem to find a procedure for this.
To add or remove someone from the account, go to the Users hub on the home page for your account. If you remove a user from it, that user will no longer be able to access your account.
When users connect to your account, they'll need to take some action to get source code. That would be cloning in the case of Git, or creating a workspace and running a get in the case of TFVC.
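For example, a rough sketch of both routes (the account, project, workspace, and local path names are placeholders; the tf commands assume a developer command prompt where tf.exe is available):

```
rem Git: cloning pulls the whole repository down to the local machine
git clone https://youraccount.visualstudio.com/DefaultCollection/_git/MyProject

rem TFVC: create a workspace, map it to a local folder, then run a get
tf workspace /new MyWorkspace /collection:https://youraccount.visualstudio.com/DefaultCollection /noprompt
tf workfold /map $/MyProject C:\src\MyProject /collection:https://youraccount.visualstudio.com/DefaultCollection /workspace:MyWorkspace
cd /d C:\src\MyProject
tf get /recursive
```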
If the user has source code, for example, on a machine, there is no way to remotely remove it. They won't be able to get updates, etc., but there's nothing running on the computer that would be able to erase the code the user has already obtained.
All the source code sharing systems I know of allow zipping up or browsing the local repository, including VS Team Services.
Daniel Mann is correct. Developing on shared servers via RDP is terrible for productivity: development is graphics- and disk-intensive, it often requires admin rights, reboots and crashes affect everyone, debugging triggers system interrupts, and out-of-memory loops are great fun on a shared machine, i.e. they stuff everybody else around. (Even with RDP you can copy and paste, map a network drive locally, or upload to the net.)
If you're doing critical stuff, the ONLY thing that really works is physically bringing people in to a non-internet-connected machine/network with USB disabled. However, these mechanisms, especially denying internet access, will halve productivity.
This is why most organizations rely on legal contracts. On a 2M project, is it worth making it a 4M project? There are cases where this is required, normally around national security / CIA / defence, but not for IP, where there are better / trickier ways.
Pretty much all binaries are reverse-engineerable with little effort if you really want to do it; obfuscation does very little.
Typically, how long should it take to install the Oracle WebCenter Suite?
We have a team of 3 developers trying to install WCS; however, it seems to be taking a little too long.
It's really hard to say without any environment info, such as DB version, clustering, network, load balancing, etc.
Normally, for a local development installation with the correct database and OS versions, and a little bit of luck, bringing up a standalone WebCenter stack should take around 1-3 days.
If your developers are really stuck with the installation, I would suggest getting an Oracle pre-built VM as a good starting point, without being held up by the environment:
http://www.oracle.com/technetwork/community/developer-vm/index.html#wcp
Oracle WebCenter Portal VM
It really depends, but assuming a local, non-clustered content install, you should be able to knock it out in a few hours.
Some factors that can extend the process:
web tier installation
slow X11 over VPN
clustered
networking issues
not doing a proper pre-install checklist (e.g., not having credentials ready)
I've seen it take as long as a week for a non-expert to install.
Update if you have any specific questions.
-ryan