So I set up a virtual machine with Vagrant using the VirtualBox box provided by Laravel's Homestead (on my OS X environment).
I used the NFS option for sharing my directories with my local environment, and this sped up response times drastically. On a typical app, I'm getting 20-40 ms load times per page. However, I noticed that file uploads are terribly slow. I can upload a 1 MB file on a simple form that does nothing with the file, and it will take about 30 seconds to a minute. Is this normal, or is there a way to speed things up even further besides using NFS shares?
This has been driving me crazy for some time, and no amount of toying with Vagrant's or VirtualBox's settings (sendfile, NFS, adjusting packet sizes, etc.) helped. But with the help of this answer on a similar problem relating to failed image uploads, I've finally cracked it:
The key, for me, was changing my hosts file to resolve my Homestead domains to the homestead VM's IP of 192.168.10.10, rather than to 127.0.0.1. (When doing this, you also drop the port :8000 from the URL, so you just navigate to homestead.app).
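For reference, the hosts entry ended up looking like this (homestead.app is just the domain I happen to use):

192.168.10.10  homestead.app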
This changed my file upload speed from around 25 KB/sec to 5980 KB/sec!
It appears that the Laravel docs were updated a week ago to reflect this change. I wonder if your Yosemite install coincided with the docs change, and you set up your new box with 192.168.10.10 while your old hosts setup pointed to 127.0.0.1.
Even though the Laravel docs have been corrected—so this issue shouldn't present itself for new Homestead installs—there are still a lot of tutorials floating around in the wild that suggest resolving Homestead domains to 127.0.0.1. Hopefully this answer will help head off some aggravation!
Things that come to my mind:
Edit /etc/nginx/nginx.conf, find the "sendfile" setting, and change it to "off", as shown in the example after this list (I've seen some reports of people having issues with it turned on when using NFS);
Having xdebug enabled slows things down, and although I wouldn't expect it to make as huge a difference as the one you're noticing, if nothing else works it might be worth disabling it and seeing if that helps;
If none of these help, create a repo on GitHub to reproduce the issue and I'll have a look and see if I can find anything.
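For the sendfile suggestion, the change in /etc/nginx/nginx.conf looks roughly like this (shown in isolation; the directive sits inside the existing http block):

# inside the http { ... } block of /etc/nginx/nginx.conf
sendfile off;

Then reload nginx for the change to take effect, e.g. with sudo service nginx reload.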
Related
Before posting this, I did some research and tried different solutions. The question is: how do I configure a system so that it is possible to SSH into its Vagrant box from an external/different network?
I have a Windows machine at home. I have installed Vagrant and am now able to access the contents both via HTTP and SSH from any device connected to the very same network.
What I want to do is to be able to get a laptop, go to a nice little café just across the river, sit down and work on my project which sits in that Vagrant box on my home desktop PC.
I am quite terrible at networking and not sure what the solution is. Do I need to make my home desktop a server? If so, which steps should I take? Do I need to configure something in my router software? Or do I need to create some kind of VPN setup where Vagrant thinks I am actually requesting its contents from the same home network? Or perhaps I'm just better off giving up and setting up a droplet on DigitalOcean instead?
To moderators: please don't close this question because the answers are opinion-based. I am happy to listen to these opinions, and I want to know which steps to follow to achieve what I want.
Thanks
Why not just copy your Vagrantfile to the laptop and spin up an instance there? It would be much less work, faster, and importantly much safer than opening up your desktop computer to the world.
I think your own suggestion of a remote server is also a valid option, although not quite as simple as just using the laptop.
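As a rough sketch of the first option (the repository URL and project name are placeholders, and this assumes the Vagrantfile is committed alongside your project):

git clone https://example.com/you/your-project.git
cd your-project
vagrant up
vagrant ssh

The first vagrant up on the laptop will download the box image; subsequent runs reuse it.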
I have set up my WordPress site on Homestead. I have a Windows machine running it and it works nicely; I can access my configured URL when I edit the hosts file. Now I want to access the domain from a different Windows machine that sits on the same network, but this time editing the hosts file does not work: it simply can't resolve it and times out. Is it even possible to do that? I would prefer to get it working with the hosts file rather than using vagrant share or a similar service.
Thanks in advance!
I had the same problem accessing my Homestead install across the network until I recently discovered I could perform a Network Preview for my Prepros projects (https://prepros.io/).
Homestead files are mapped to your local computer; adding them to Prepros projects will give indirect access to your Homestead website.
It's been working quite well for me.
So I set up Homestead on a Win 10 machine, and it seems like I find myself with this issue every now and then.
What happens is, Homestead works just fine for a few days; then all of a sudden, when I start it up, I can no longer access http://local.domain.dev (which is what I mapped to the IP 192.168.10.10) without appending :8000 to it.
I really don't get it. I think it has something to do with Windows updates? Although I can't really tell since Win 10 updates are so, I dunno, sneaky? I never know when there were updates until I see it on the notification bar. Most of the time, it doesn't tell you it's updating during bootup like Win 7 did.
In any case, this is really becoming a pain since having the domain is important to some of the features of the site I'm working on and specifying a port somehow screws it up (whole different story, don't ask).
So, I guess I'm asking whether anyone else is experiencing the same issues on their Windows 10 machine and whether they have a permanent fix for this?
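For reference, the mapping lives in the Windows hosts file (C:\Windows\System32\drivers\etc\hosts) and looks like this, with local.domain.dev being my own example domain:

192.168.10.10  local.domain.dev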
This issue has popped up every now and then since I upgraded to Win 10, and the last time I had this issue, what I did to fix it was to update Vagrant and VBox.
I have Vagrant 1.7.4 and VBox 5.0.4; I think those are still the latest ones.
I just ran vagrant provision in a futile attempt at getting my customized synced_folder directive to work, and now my whole guest box is wiped out.
Is this normal? I don't see any references to Vagrant docs about this behavior.
As per the doc:
Provisioners in Vagrant allow you to automatically install software, alter configurations, and more on the machine as part of the vagrant up process.
The only things I have in my config's shell provisioning commands are installation commands. Nothing about wiping anything out.
I do have an app.vm.provision block for Puppet that sets the FQDN, user name, and box name (along with the usual module_path, manifests_path, and manifests_file). Maybe this caused things to be reset?
The Answer
Is vagrant provision supposed to wipe out all your data?
No. Vagrant should never harm your "data" (i.e., websites, code, etc.).
...now my whole guest box is wiped out. Is this normal?
Yes. Your Vagrant environment (in other words, the guest operating system created in a virtual environment by Vagrant) is volatile, and you should be able to destroy and recreate it at will without having any impact on your working files (because those should be kept in your local, or host, file system).
Explanation
On Vagrant's website, the very first thing they tell you is this:
Create and configure lightweight, reproducible, and portable development environments.
Your development environment allows you to work. You work on your data, in your development work environment. When you are done with your "development work environment," you should be able to delete it freely without affecting your data in the least.
Further, you should be able to send a collaborating developer your Vagrantfile so that they can create the exact same development environment you used to create your data (i.e., write your program, build your website, and so forth). Then, when you provide them with your code, they can use it in an environment identical to the one that your code was created in without having to reconfigure their own setup.
For more details about how your data files (code, working files, etc.) are kept safely in your computer while making them accessible to your guest system created by Vagrant, see my answer to this question.
So what appears to have happened was that when I set up a synced folder, it wiped out everything because there was nothing on my host machine in that synced folder. Unless there is a way to recover the lost data, there should be an unmistakable WARNING in their docs that this can happen.
I set up the synced_folder to point at my whole home directory. When I created a new machine, I cloned the one project I had saved and decided to sync just my individual projects instead of my whole user directory this time. When I reloaded, the project directory was empty, since it was empty on my host machine.
So I guess: make sure the directories on your host machine are already set up with the data before configuring your Vagrantfile with synced_folder information.
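For illustration, a per-project mapping in the Vagrantfile looks roughly like this (the paths are just example placeholders, not taken from the setup above):

# sync a single project directory into the guest, rather than the whole home directory
config.vm.synced_folder "~/projects/myproject", "/home/vagrant/projects/myproject"

With a mapping like this, the guest path simply mirrors whatever already exists on the host side, so an empty host directory shows up as an empty guest directory.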
There are a number of threads discussing slow resolution of /etc/hosts entries on Mountain Lion. The resolution of my custom host (someserver.dev) is fast when I'm connected to the internet. The moment I go offline, the resolution can take up to 30 seconds. I can't seem to find a way to get the resolution to happen quickly without either being online or installing a DNS server.
I've found that the entries in /etc/hosts need to be on individual lines. For instance, I was also having the problem with this config:
192.168.0.13 my1stmachine.local my2ndmachine.local my3rdmachine.local
But after I put each entry on its own line, no slowness anymore:
192.168.0.13 my1stmachine.local
192.168.0.13 my2ndmachine.local
192.168.0.13 my3rdmachine.local
Good luck!
Your best option is probably to install a local DNS server like dnsmasq, as seen in this blog post by Justin Carmony. It's a bit annoying to set up, but the resolutions come back very fast for your local dev servers. You can also add a wildcard entry like this in the dnsmasq.conf file:
address=/.dev/127.0.0.1
So that anything.dev resolves to your localhost. Hope that helps!
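As a follow-up note, on OS X the usual companion step (following the standard dnsmasq-on-Mac setup that posts like the one linked above describe; treat the exact path as an assumption) is to point the .dev TLD at the local dnsmasq instance with a resolver file:

# /etc/resolver/dev: route *.dev lookups to the local dnsmasq
nameserver 127.0.0.1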