Vagrant hangs on Windows 7

vagrant up seems to hang on Windows 7.
My Vagrantfile:
VAGRANTFILE_API_VERSION = "2"
ENV['VAGRANT_DEFAULT_PROVIDER'] = 'docker'
Vagrant.configure("2") do |config|
  config.vm.network "forwarded_port", guest: 80, host: 8080, auto_correct: true
  config.vm.define "elk" do |elk|
    elk.vm.synced_folder "./www", "/var/www"
    elk.vm.provider "docker" do |d|
      d.build_dir = "./Docker"
    end
  end
end
Output of vagrant up --debug:
==> elk: Syncing folders to the host VM...
INFO machine: Calling action: sync_folders on provider VirtualBox (3c7dc34c-6fcf-4ace-87d1-0602b664e783)
DEBUG environment: Attempting to acquire process-lock: machine-action-740c202843bdf6334148bb69e000ec99
DEBUG environment: Attempting to acquire process-lock: dotlock
INFO environment: Acquired process lock: dotlock
INFO environment: Released process lock: dotlock
INFO environment: Acquired process lock: machine-action-740c202843bdf6334148bb69e000ec99
INFO environment: Released process lock: machine-action-740c202843bdf6334148bb69e000ec99
INFO runner: Preparing hooks for middleware sequence...
INFO runner: 1 hooks defined.
INFO runner: Running action: #<Vagrant::Action::Builder:0x3786018>
INFO warden: Calling IN action: #<Vagrant::Action::Builtin::SyncedFolders:0x38079d8>
INFO subprocess: Starting process: ["C:\\windows\\System32\\WindowsPowerShell\\v1.0\\/powershell.EXE", "-NoProfile", "-ExecutionPolicy", "Bypass", "$PSVersionTable.PSVersion.Major"]
DEBUG subprocess: Selecting on IO
DEBUG subprocess: stdout: 2

Found a related Vagrant issue here:
https://github.com/mitchellh/vagrant/issues/3139
Updating PowerShell to 3.0 solved the issue:
http://www.microsoft.com/en-us/download/details.aspx?id=34595
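If you want to confirm which PowerShell version Vagrant is picking up before upgrading, you can run the same query Vagrant issues in the log above from a command prompt (assuming powershell.exe is on your PATH):
powershell -NoProfile -ExecutionPolicy Bypass -Command "$PSVersionTable.PSVersion.Major"
If this prints 2, as in the debug output above, you are on the version affected by the hang; 3.0 or later avoids it.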

Related

Consul UI does not show

I'm running a single-node Consul (v1.8.4) on Ubuntu 18.04. The consul service is up, and the UI is enabled (the default).
But when I try to access http://192.168.37.128:8500/ui, I get:
This site can’t be reached: 192.168.37.128 took too long to respond.
ui.json:
{
  "addresses": {
    "http": "0.0.0.0"
  }
}
consul.service file:
[Unit]
Description=Consul
Documentation=https://www.consul.io/
[Service]
ExecStart=/usr/bin/consul agent -server -ui -data-dir=/temp/consul -bootstrap-expect=1 -node=vault -bind=192.168.37.128 -config-dir=/etc/consul.d/
ExecReload=/bin/kill -HUP $MAINPID
LimitNOFILE=65536
[Install]
WantedBy=multi-user.target
systemctl status consul
● consul.service - Consul
Loaded: loaded (/etc/systemd/system/consul.service; disabled; vendor preset: enabled)
Active: active (running) since Sun 2020-10-04 19:19:08 CDT; 50min ago
Docs: https://www.consul.io/
Main PID: 9477 (consul)
Tasks: 9 (limit: 4980)
CGroup: /system.slice/consul.service
└─9477 /opt/consul/bin/consul agent -server -ui -data-dir=/temp/consul -bootstrap-expect=1 -node=vault -bind=1
agent.server.raft: heartbeat timeout reached, starting election: last-leader=
agent.server.raft: entering candidate state: node="Node at 192.168.37.128:8300 [Candid
agent.server.raft: election won: tally=1
agent.server.raft: entering leader state: leader="Node at 192.168.37.128:8300 [Leader]
agent.server: cluster leadership acquired
agent.server: New leader elected: payload=vault
agent.leader: started routine: routine="federation state anti-entropy"
agent.leader: started routine: routine="federation state pruning"
agent.leader: started routine: routine="CA root pruning"
agent: Synced node info
This shows the agent bound at 192.168.37.128:8300.
The issue was the firewall; I had to open port 8500:
sudo ufw allow 8500/tcp
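A quick way to tell a firewall problem apart from a listener problem in a case like this is to check both sides (a sketch, assuming the default HTTP port of 8500):
# on the Consul host: is the agent listening on 0.0.0.0:8500?
sudo ss -lntp | grep 8500
# from the Consul host itself (this bypasses the firewall)
curl -sI http://127.0.0.1:8500/ui/
# from another machine; a timeout here with success above points at the firewall
curl -sI http://192.168.37.128:8500/ui/
# and check the firewall rules
sudo ufw status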

vagrant hangs after fixing a port collision

I am trying to bring up Vagrant on a Windows machine.
It hangs after:
Fixed port collision for 22 => 2222. Now on port 2200.
Part of the debug log is below:
DEBUG subprocess: Waiting for process to exit. Remaining to timeout: 32000
DEBUG subprocess: Exit status: 0
INFO warden: Calling IN action: #<Vagrant::Action::Builtin::HandleForwardedPortCollisions:0x44e7760>
DEBUG environment: Attempting to acquire process-lock: fpcollision
DEBUG environment: Attempting to acquire process-lock: dotlock
INFO environment: Acquired process lock: dotlock
INFO environment: Released process lock: dotlock
INFO environment: Acquired process lock: fpcollision
INFO handle_port_collisions: Detecting any forwarded port collisions...
DEBUG handle_port_collisions: Extra in use: []
DEBUG handle_port_collisions: Remap: {}
DEBUG handle_port_collisions: Repair: true
INFO handle_port_collisions: Attempting to repair FP collision: 2222
INFO handle_port_collisions: Repaired FP collision: 2222 to 2200
INFO interface: info: Fixed port collision for 22 => 2222. Now on port 2200.
INFO interface: info: ==> vlad: Fixed port collision for 22 => 2222. Now on port 2200.
==> vlad: Fixed port collision for 22 => 2222. Now on port 2200.
INFO environment: Released process lock: fpcollision
DEBUG environment: Attempting to acquire process-lock: dotlock
INFO environment: Acquired process lock: dotlock
INFO environment: Released process lock: dotlock
INFO warden: Calling IN action: #<VagrantPlugins::ProviderVirtualBox::Action::PrepareNFSValidIds:0x44431c8>
INFO subprocess: Starting process: ["C:/Program Files/Oracle/VirtualBox/VBoxManage.exe", "list", "vms"]
INFO subprocess: Command not in installer, restoring original environment...
DEBUG subprocess: Selecting on IO
DEBUG subprocess: stdout: "vlad_vlad" {efce349f-2b2e-40db-9a14-2298d3024638}
DEBUG subprocess: Waiting for process to exit. Remaining to timeout: 32000
DEBUG subprocess: Exit status: 0
INFO warden: Calling IN action: #<VagrantPlugins::SyncedFolderNFS::ActionCleanup:0x435c3a8>
DEBUG host: Searching for cap: nfs_prune
DEBUG host: Checking in: windows
INFO nfs: Host doesn't support pruning NFS. Skipping.
INFO warden: Calling IN action: #<Vagrant::Action::Builtin::SyncedFolderCleanup:0x4263ca8>
INFO subprocess: Starting process: ["C:\\Windows\\System32\\WindowsPowerShell\\v1.0/powershell.EXE", "-NoProfile", "-ExecutionPolicy", "Bypass", "$PSVersionTable.PSVersion.Major"]
INFO subprocess: Command not in installer, restoring original environment...
DEBUG subprocess: Selecting on IO
DEBUG subprocess: stdout: 2
I do not have the slightest idea of how to proceed. Any help is appreciated.
After installing a newer PowerShell (the debug log again ends at the PowerShell version check and shows version 2, the same situation as in the first question above), Vagrant continued with the setup of the VM.

Vagrant up issue

We have a network that requires a proxy to reach the internet. I have a local Mac that we're setting up for Drupal development, with HTTP_PROXY and ALL_PROXY set, and everything works fine using the bower, gem and librarian-puppet scripts from our development house to install dependencies. I am now using a Vagrantfile they created to set up a Vagrant (VirtualBox) machine and I'm hitting issues. I initially got errors early in the script, but after installing vagrant-proxyconf and adding config.proxy.http, https and no_proxy lines to the Vagrantfile, it now gets a lot further before erroring with:
==> default: Info: Loading facts
==> default: Could not retrieve fact='nodejs_latest_version', resolution='': Connection refused - connect(2)
==> default: Could not retrieve fact='nodejs_stable_version', resolution='': Connection refused - connect(2)
==> default: Error: Invalid version string ''! at /tmp/vagrant-puppet/modules-71ca71d9c718e523371ddd53fd147634/nodejs/manifests/install.pp:165 on node drupal.dev
==> default: Error: Invalid version string ''! at /tmp/vagrant-puppet/modules-71ca71d9c718e523371ddd53fd147634/nodejs/manifests/install.pp:165 on node drupal.dev
The SSH command responded with a non-zero exit status. Vagrant assumes that this means the command failed. The output for this command should be in the log above. Please read the output to determine what went wrong.
The 'SSH command responded with a non-zero exit status' message appears a few times during the script run, but this is the first one that stops it. If I connect the machine directly to an internet connection, the script runs through successfully, but that isn't a long-term solution. I'm not sure whether the issue is on the Mac or in the VirtualBox VM, or which file or setting I should be looking at. If anyone can shed some light on where the issue might be, or where I should be looking, it would really be appreciated.
The debug output ends like this:
==> default: Info: Loading facts DEBUG ssh: stderr: Could not retrieve fact='nodejs_latest_version', resolution='': Connection refused - connect(2)
INFO interface: info: Could not retrieve fact='nodejs_latest_version', resolution='': Connection refused - connect(2) INFO interface: info: ==> default: Could not retrieve fact='nodejs_latest_version', resolution='': Connection refused - connect(2) ==> default: Could not retrieve fact='nodejs_latest_version', resolution='': Connection refused - connect(2) DEBUG ssh: stderr: Could not retrieve fact='nodejs_stable_version', resolution='': Connection refused - connect(2)
INFO interface: info: Could not retrieve fact='nodejs_stable_version', resolution='': Connection refused - connect(2) INFO interface: info: ==> default: Could not retrieve fact='nodejs_stable_version', resolution='': Connection refused - connect(2) ==> default: Could not retrieve fact='nodejs_stable_version', resolution='': Connection refused - connect(2) DEBUG ssh: stderr: Error: Invalid version string ''! at /tmp/vagrant-puppet/modules-71ca71d9c718e523371ddd53fd147634/nodejs/manifests/install.pp:165 on node drupal.dev
INFO interface: info: Error: Invalid version string ''! at /tmp/vagrant-puppet/modules-71ca71d9c718e523371ddd53fd147634/nodejs/manifests/install.pp:165 on node drupal.dev INFO interface: info: ==> default: Error: Invalid version string ''! at /tmp/vagrant-puppet/modules-71ca71d9c718e523371ddd53fd147634/nodejs/manifests/install.pp:165 on node drupal.dev ==> default: Error: Invalid version string ''! at /tmp/vagrant-puppet/modules-71ca71d9c718e523371ddd53fd147634/nodejs/manifests/install.pp:165 on node drupal.dev DEBUG ssh: stderr: Error: Invalid version string ''! at /tmp/vagrant-puppet/modules-71ca71d9c718e523371ddd53fd147634/nodejs/manifests/install.pp:165 on node drupal.dev
INFO interface: info: Error: Invalid version string ''! at /tmp/vagrant-puppet/modules-71ca71d9c718e523371ddd53fd147634/nodejs/manifests/install.pp:165 on node drupal.dev INFO interface: info: ==> default: Error: Invalid version string ''! at /tmp/vagrant-puppet/modules-71ca71d9c718e523371ddd53fd147634/nodejs/manifests/install.pp:165 on node drupal.dev ==> default: Error: Invalid version string ''! at /tmp/vagrant-puppet/modules-71ca71d9c718e523371ddd53fd147634/nodejs/manifests/install.pp:165 on node drupal.dev DEBUG ssh: Exit status: 1 ERROR warden: Error occurred: The SSH command responded with a non-zero exit status. Vagrant assumes that this means the command failed. The output for this command should be in the log above. Please read the output to determine what went wrong. INFO warden: Beginning recovery process... INFO warden: Recovery complete. ERROR warden: Error occurred: The SSH command responded with a non-zero exit status. Vagrant assumes that this means the command failed. The output for this command should be in the log above. Please read the output to determine what went wrong. INFO warden: Beginning recovery process... INFO warden: Calling recover: # INFO warden: Beginning recovery process... INFO warden: Calling recover: # INFO warden: Beginning recovery process... INFO warden: Recovery complete. INFO warden: Recovery complete. INFO warden: Recovery complete. INFO warden: Beginning recovery process... INFO warden: Recovery complete. ERROR warden: Error occurred: The SSH command responded with a non-zero exit status. Vagrant assumes that this means the command failed. The output for this command should be in the log above. Please read the output to determine what went wrong. INFO warden: Beginning recovery process... INFO warden: Recovery complete. ERROR warden: Error occurred: The SSH command responded with a non-zero exit status. Vagrant assumes that this means the command failed. The output for this command should be in the log above. Please read the output to determine what went wrong. INFO warden: Beginning recovery process... INFO warden: Calling recover: # INFO warden: Beginning recovery process... INFO warden: Recovery complete. INFO warden: Recovery complete. ERROR warden: Error occurred: The SSH command responded with a non-zero exit status. Vagrant assumes that this means the command failed. The output for this command should be in the log above. Please read the output to determine what went wrong. INFO warden: Beginning recovery process... INFO warden: Recovery complete. ERROR warden: Error occurred: The SSH command responded with a non-zero exit status. Vagrant assumes that this means the command failed. The output for this command should be in the log above. Please read the output to determine what went wrong. INFO warden: Beginning recovery process... INFO warden: Calling recover: # INFO warden: Beginning recovery process... INFO warden: Recovery complete. INFO warden: Recovery complete. INFO warden: Beginning recovery process... INFO warden: Recovery complete. INFO warden: Beginning recovery process... INFO warden: Recovery complete. INFO environment: Released process lock: machine-action-d3538fe6302dd66edcdb4c05283597a6 INFO environment: Running hook: environment_unload INFO runner: Preparing hooks for middleware sequence... INFO runner: 2 hooks defined. INFO runner: Running action: environment_unload # ERROR vagrant: Vagrant experienced an error! Details: ERROR vagrant: # ERROR vagrant: The SSH command responded with a non-zero exit status. 
Vagrant assumes that this means the command failed. The output for this command should be in the log above. Please read the output to determine what went wrong. ERROR vagrant: /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/plugins/communicators/ssh/communicator.rb:236:in execute'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/plugins/communicators/ssh/communicator.rb:246:insudo' /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/plugins/provisioners/puppet/provisioner/puppet.rb:251:in run_puppet_apply'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/plugins/provisioners/puppet/provisioner/puppet.rb:124:inprovision' /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/builtin/provision.rb:133:in run_provisioner'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:95:incall' /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:95:in block in finalize_action'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:34:incall' /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:34:in call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/builder.rb:116:incall' /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/runner.rb:66:in block in run'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/util/busy.rb:19:inbusy' /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/runner.rb:66:in run'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/environment.rb:428:inhook' /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/builtin/provision.rb:121:in call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/builtin/provision.rb:121:inblock in call' /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/builtin/provision.rb:103:in each'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/builtin/provision.rb:103:incall' /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:34:in call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/plugins/providers/virtualbox/action/check_accessible.rb:18:incall' /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:34:in call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:95:inblock in finalize_action' /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:34:in call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:34:incall' /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/builder.rb:116:in call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/runner.rb:66:inblock in run' /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/util/busy.rb:19:in busy'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/runner.rb:66:inrun' /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/builtin/call.rb:53:in call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:34:incall' /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:95:in block in finalize_action'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:34:incall' /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:34:in call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/builder.rb:116:incall' /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/runner.rb:66:in block in run'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/util/busy.rb:19:inbusy' /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/runner.rb:66:in run'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/builtin/call.rb:53:incall' /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:34:in call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/builtin/config_validate.rb:25:incall' /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:34:in call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/plugins/providers/virtualbox/action/check_virtualbox.rb:17:incall' /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:34:in call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/builder.rb:116:incall' /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/runner.rb:66:in block in run'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/util/busy.rb:19:inbusy' /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/runner.rb:66:in run'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/machine.rb:214:inaction_raw' /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/machine.rb:191:in block in action'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/environment.rb:516:inlock' /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/machine.rb:178:in call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/machine.rb:178:inaction' /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/plugins/commands/provision/command.rb:30:in block in execute'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/plugin/v2/command.rb:226:inblock in with_target_vms' /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/plugin/v2/command.rb:220:in each'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/plugin/v2/command.rb:220:inwith_target_vms' /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/plugins/commands/provision/command.rb:29:in execute'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/cli.rb:42:inexecute' /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/environment.rb:301:in cli'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/bin/vagrant:174:in' INFO interface: error: The SSH command responded with a non-zero exit status. Vagrant assumes that this means the command failed. The output for this command should be in the log above. Please read the output to determine what went wrong. The SSH command responded with a non-zero exit status. Vagrant assumes that this means the command failed. The output for this command should be in the log above. Please read the output to determine what went wrong. INFO interface: Machine: error-exit ["Vagrant::Errors::VagrantError", "The SSH command responded with a non-zero exit status. Vagrant\nassumes that this means the command failed. The output for this command\nshould be in the log above. Please read the output to determine what\nwent wrong."]
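The 'Connection refused' comes from facter running inside the guest, so a useful first step is to check from inside the VM whether the proxy settings actually made it in and whether the guest can reach the internet through them. A sketch (the nodejs.org URL is just an example target):
vagrant ssh
# inside the guest:
env | grep -i proxy          # did vagrant-proxyconf export the proxy variables?
cat /etc/environment         # vagrant-proxyconf typically writes them here on Linux guests
curl -v http://nodejs.org/   # can the guest reach the internet through the proxy?
If the variables are missing or the curl fails, the proxy is not reaching the provisioning environment and the Puppet facts will keep failing with Connection refused; if it succeeds, the problem is more likely in how the nodejs module resolves its version facts.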

Can't connect to Ubuntu box from Windows host in Vagrant

I'm having problems trying to SSH into an Ubuntu box inside my Windows machine. I'm following the basic Getting Started tutorial from the Vagrant site (http://docs.vagrantup.com/v2/getting-started/up.html).
Software Versions:
Vagrant: 1.7.2
Virtualbox: 4.3.22
OS: Windows 7
Box: hashicorp/precise32
The scenario:
When I run vagrant up, everything goes according to plan: the services and the VM start, but in the final step, where Vagrant tests the SSH connection, it fails after several Warning: Connection refused. Retrying... messages.
After that, when I open the VirtualBox GUI, I can log into the box and confirm that openssh-server is running. When I try to SSH from the command line I get this error:
ssh 127.0.0.1 -p 2222
ssh: connect to host 127.0.0.1 port 2222: Bad file number
When I run vagrant ssh (with logging at INFO level), I get this output:
INFO global: Vagrant version: 1.7.2
INFO global: Ruby version: 2.0.0
INFO global: RubyGems version: 2.0.14
INFO global: VAGRANT_EXECUTABLE="C:\\dev\\tools\\Vagrant\\embedded\\gems\\gems\\vagrant-1.7.2\\bin\\vagrant"
INFO global: VAGRANT_INSTALLER_EMBEDDED_DIR="C:\\dev\\tools\\Vagrant\\embedded"
INFO global: VAGRANT_INSTALLER_ENV="1"
INFO global: VAGRANT_INSTALLER_VERSION="2"
INFO global: VAGRANT_INTERNAL_BUNDLERIZED="1"
INFO global: VAGRANT_LOG="INFO"
INFO global: Plugins:
INFO global: - bundler = 1.7.11
INFO global: - mime-types = 1.25.1
INFO global: - rdoc = 4.0.0
INFO global: - rest-client = 1.6.8
INFO global: - vagrant-share = 1.1.3
INFO manager: Registered plugin: box command
INFO manager: Registered plugin: destroy command
INFO manager: Registered plugin: global-status command
INFO manager: Registered plugin: halt command
INFO manager: Registered plugin: help command
INFO manager: Registered plugin: init command
INFO manager: Registered plugin: list-commands command
INFO manager: Registered plugin: vagrant-login
INFO manager: Registered plugin: package command
INFO manager: Registered plugin: plugin command
INFO manager: Registered plugin: provision command
INFO manager: Registered plugin: push command
INFO manager: Registered plugin: rdp command
INFO manager: Registered plugin: reload command
INFO manager: Registered plugin: resume command
INFO manager: Registered plugin: ssh command
INFO manager: Registered plugin: ssh-config command
INFO manager: Registered plugin: status command
INFO manager: Registered plugin: suspend command
INFO manager: Registered plugin: up command
INFO manager: Registered plugin: version command
INFO manager: Registered plugin: ssh communicator
INFO manager: Registered plugin: winrm communicator
INFO manager: Registered plugin: Arch guest
INFO manager: Registered plugin: CoreOS guest
INFO manager: Registered plugin: Darwin guest
INFO manager: Registered plugin: Debian guest
INFO manager: Registered plugin: ESXi guest.
INFO manager: Registered plugin: Fedora guest
INFO manager: Registered plugin: FreeBSD guest
INFO manager: Registered plugin: Funtoo guest
INFO manager: Registered plugin: Gentoo guest
INFO manager: Registered plugin: Linux guest.
INFO manager: Registered plugin: Mint guest
INFO manager: Registered plugin: NetBSD guest
INFO manager: Registered plugin: NixOS guest
INFO manager: Registered plugin: OmniOS guest.
INFO manager: Registered plugin: OpenBSD guest
INFO manager: Registered plugin: PLD Linux guest
INFO manager: Registered plugin: RedHat guest
INFO manager: Registered plugin: SmartOS guest.
INFO manager: Registered plugin: Solaris guest.
INFO manager: Registered plugin: Solaris 11 guest.
INFO manager: Registered plugin: SUSE guest
INFO manager: Registered plugin: TinyCore Linux guest.
INFO manager: Registered plugin: Ubuntu guest
INFO manager: Registered plugin: Windows guest.
INFO manager: Registered plugin: Arch host
INFO manager: Registered plugin: BSD host
INFO manager: Registered plugin: Mac OS X host
INFO manager: Registered plugin: FreeBSD host
INFO manager: Registered plugin: Gentoo host
INFO manager: Registered plugin: Linux host
INFO manager: Registered plugin: null host
INFO manager: Registered plugin: Red Hat host
INFO manager: Registered plugin: Slackware host
INFO manager: Registered plugin: SUSE host
INFO manager: Registered plugin: Windows host
INFO manager: Registered plugin: kernel
INFO manager: Registered plugin: kernel
INFO manager: Registered plugin: docker-provider
INFO manager: Registered plugin: Hyper-V provider
INFO manager: Registered plugin: VirtualBox provider
INFO manager: Registered plugin: ansible
INFO manager: Registered plugin: CFEngine Provisioner
INFO manager: Registered plugin: chef
INFO manager: Registered plugin: docker
INFO manager: Registered plugin: file
INFO manager: Registered plugin: puppet
INFO manager: Registered plugin: salt
INFO manager: Registered plugin: shell
INFO manager: Registered plugin: atlas
INFO manager: Registered plugin: ftp
INFO manager: Registered plugin: heroku
INFO manager: Registered plugin: local-exec
INFO manager: Registered plugin: noop
INFO manager: Registered plugin: NFS synced folders
INFO manager: Registered plugin: RSync synced folders
INFO manager: Registered plugin: SMB synced folders
INFO global: Loading plugins!
INFO manager: Registered plugin: vagrant-share
INFO vagrant: `vagrant` invoked: ["ssh"]
INFO environment: Environment initialized (#<Vagrant::Environment:0x36893d0>)
INFO environment: - cwd: C:/dev/local-server
INFO environment: Home path: C:/Users/vitallan/.vagrant.d
INFO environment: Local data path: C:/dev/local-server/.vagrant
INFO environment: Running hook: environment_plugins_loaded
INFO runner: Preparing hooks for middleware sequence...
INFO runner: 1 hooks defined.
INFO runner: Running action: #<Vagrant::Action::Builder:0x2c494c8>
INFO environment: Running hook: environment_load
INFO runner: Preparing hooks for middleware sequence...
INFO runner: 1 hooks defined.
INFO runner: Running action: #<Vagrant::Action::Builder:0x2cc93a0>
INFO cli: CLI: [] "ssh" []
INFO loader: Set :root = #<Pathname:C:/dev/local-server/Vagrantfile>
INFO loader: Loading configuration in order: [:home, :root]
INFO command: Active machine found with name default. Using provider: virtualbox
INFO environment: Getting machine: default (virtualbox)
INFO environment: Uncached load of machine.
INFO base: VBoxManage path: C:\Program Files\Oracle\VirtualBox\VBoxManage.exe
INFO subprocess: Starting process: ["C:\\Program Files\\Oracle\\VirtualBox\\VBoxManage.exe", "--version"]
INFO meta: Using VirtualBox driver: VagrantPlugins::ProviderVirtualBox::Driver::Version_4_3
INFO base: VBoxManage path: C:\Program Files\Oracle\VirtualBox\VBoxManage.exe
INFO loader: Set "27710496_machine_default" = []
INFO loader: Loading configuration in order: [:home, :root, "27710496_machine_default"]
INFO box_collection: Box found: hashicorp/precise32 (virtualbox)
INFO environment: Running hook: authenticate_box_url
INFO host: Autodetecting host type for [#<Vagrant::Environment: C:/dev/local-server>]
INFO host: Detected: windows!
INFO runner: Preparing hooks for middleware sequence...
INFO runner: 2 hooks defined.
INFO runner: Running action: #<Vagrant::Action::Builder:0x376dbc8>
INFO warden: Calling IN action: #<VagrantPlugins::LoginCommand::AddAuthentication:0x4727a18>
INFO warden: Calling OUT action: #<VagrantPlugins::LoginCommand::AddAuthentication:0x4727a18>
INFO loader: Set :"28616568_hashicorp/precise32_virtualbox" = #<Pathname:C:/Users/vitallan/.vagrant.d/boxes/hashicorp-VAGRANTSLASH-precise32/1.0.0/virtualbox/Vagrantfile>
INFO loader: Loading configuration in order: [:"28616568_hashicorp/precise32_virtualbox", :home, :root, "27710496_machine_default"]
INFO machine: Initializing machine: default
INFO machine: - Provider: VagrantPlugins::ProviderVirtualBox::Provider
INFO machine: - Box: #<Vagrant::Box:0x474dfb0>
INFO machine: - Data dir: C:/dev/local-server/.vagrant/machines/default/virtualbox
INFO base: VBoxManage path: C:\Program Files\Oracle\VirtualBox\VBoxManage.exe
INFO subprocess: Starting process: ["C:\\Program Files\\Oracle\\VirtualBox\\VBoxManage.exe", "--version"]
INFO meta: Using VirtualBox driver: VagrantPlugins::ProviderVirtualBox::Driver::Version_4_3
INFO base: VBoxManage path: C:\Program Files\Oracle\VirtualBox\VBoxManage.exe
INFO subprocess: Starting process: ["C:\\Program Files\\Oracle\\VirtualBox\\VBoxManage.exe", "showvminfo", "d2575078-1bc6-448b-9c70-04e21249f33b"]
INFO subprocess: Starting process: ["C:\\Program Files\\Oracle\\VirtualBox\\VBoxManage.exe", "showvminfo", "d2575078-1bc6-448b-9c70-04e21249f33b", "--machinereadable"]
INFO command: With machine: default (#<VagrantPlugins::ProviderVirtualBox::Provider:0x487e4f8 #logger=#<Log4r::Logger:0x487e4c8 #fullname="vagrant::provider::virtualbox", #outputters=[], #additive=true, #name="virtualbox", #path="vagrant::provider", #parent=#<Log4r::Logger:0x37e65e8 #fullname="vagrant", #outputters=[#<Log4r::StderrOutputter:0x375fd50 #mon_owner=nil, #mon_count=0, #mon_mutex=#<Mutex:0x375fc18>, #name="stderr", #level=0, #formatter=#<Log4r::DefaultFormatter:0x375dbd8 #depth=7>, #out=#<IO:<STDERR>>>], #additive=true, #name="vagrant", #path="", #parent=#<Log4r::RootLogger:0x37e6540 #level=0, #outputters=[]>, #level=2, #trace=false>, #level=2, #trace=false>, #machine=#<Vagrant::Machine: default (VagrantPlugins::ProviderVirtualBox::Provider)>, #driver=#<VagrantPlugins::ProviderVirtualBox::Driver::Meta:0x4897560 #logger=#<Log4r::Logger:0x48a4148 #fullname="vagrant::provider::virtualbox::meta", #outputters=[], #additive=true, #name="meta", #path="vagrant::provider::virtualbox", #parent=#<Log4r::Logger:0x487e4c8 #fullname="vagrant::provider::virtualbox", #outputters=[], #additive=true, #name="virtualbox", #path="vagrant::provider", #parent=#<Log4r::Logger:0x37e65e8 #fullname="vagrant", #outputters=[#<Log4r::StderrOutputter:0x375fd50 #mon_owner=nil, #mon_count=0, #mon_mutex=#<Mutex:0x375fc18>, #name="stderr", #level=0, #formatter=#<Log4r::DefaultFormatter:0x375dbd8 #depth=7>, #out=#<IO:<STDERR>>>], #additive=true, #name="vagrant", #path="", #parent=#<Log4r::RootLogger:0x37e6540 #level=0, #outputters=[]>, #level=2, #trace=false>, #level=2, #trace=false>, #level=2, #trace=false>, #interrupted=false, #vboxmanage_path="C:\\Program Files\\Oracle\\VirtualBox\\VBoxManage.exe", #uuid="d2575078-1bc6-448b-9c70-04e21249f33b", #version="4.3.22", #driver=#<VagrantPlugins::ProviderVirtualBox::Driver::Version_4_3:0x48d4568 #logger=#<Log4r::Logger:0x48ed108 #fullname="vagrant::provider::virtualbox_4_3", #outputters=[], #additive=true, #name="virtualbox_4_3", #path="vagrant::provider", #parent=#<Log4r::Logger:0x37e65e8 #fullname="vagrant", #outputters=[#<Log4r::StderrOutputter:0x375fd50 #mon_owner=nil, #mon_count=0, #mon_mutex=#<Mutex:0x375fc18>, #name="stderr", #level=0, #formatter=#<Log4r::DefaultFormatter:0x375dbd8 #depth=7>, #out=#<IO:<STDERR>>>], #additive=true, #name="vagrant", #path="", #parent=#<Log4r::RootLogger:0x37e6540 #level=0, #outputters=[]>, #level=2, #trace=false>, #level=2, #trace=false>, #interrupted=false, #vboxmanage_path="C:\\Program Files\\Oracle\\VirtualBox\\VBoxManage.exe", #uuid="d2575078-1bc6-448b-9c70-04e21249f33b">>, #cap_logger=#<Log4r::Logger:0x491d600 #fullname="vagrant::capability_host::vagrantplugins::providervirtualbox::provider", #outputters=[], #additive=true, #name="provider", #path="vagrant::capability_host::vagrantplugins::providervirtualbox", #parent=#<Log4r::Logger:0x37e65e8 #fullname="vagrant", #outputters=[#<Log4r::StderrOutputter:0x375fd50 #mon_owner=nil, #mon_count=0, #mon_mutex=#<Mutex:0x375fc18>, #name="stderr", #level=0, #formatter=#<Log4r::DefaultFormatter:0x375dbd8 #depth=7>, #out=#<IO:<STDERR>>>], #additive=true, #name="vagrant", #path="", #parent=#<Log4r::RootLogger:0x37e6540 #level=0, #outputters=[]>, #level=2, #trace=false>, #level=2, #trace=false>, #cap_host_chain=[[:virtualbox, #<#<Class:0x491d8d0>:0x4b76518>]], #cap_args=[#<Vagrant::Machine: default (VagrantPlugins::ProviderVirtualBox::Provider)>], #cap_caps={:docker=>#<Vagrant::Registry:0x491d7f8 
#items={:public_address=>#<Proc:0x3806538#C:/dev/tools/Vagrant/embedded/gems/gems/vagrant-1.7.2/plugins/providers/docker/plugin.rb:54>, :proxy_machine=>#<Proc:0x3806490#C:/dev/tools/Vagrant/embedded/gems/gems/vagrant-1.7.2/plugins/providers/docker/plugin.rb:59>}, #results_cache={}>, :hyperv=>#<Vagrant::Registry:0x491d780 #items={:public_address=>#<Proc:0x380ed10#C:/dev/tools/Vagrant/embedded/gems/gems/vagrant-1.7.2/plugins/providers/hyperv/plugin.rb:25>}, #results_cache={}>, :virtualbox=>#<Vagrant::Registry:0x491d708 #items={:forwarded_ports=>#<Proc:0x383fcd0#C:/dev/tools/Vagrant/embedded/gems/gems/vagrant-1.7.2/plugins/providers/virtualbox/plugin.rb:27>, :nic_mac_addresses=>#<Proc:0x383fc58#C:/dev/tools/Vagrant/embedded/gems/gems/vagrant-1.7.2/plugins/providers/virtualbox/plugin.rb:32>, :public_address=>#<Proc:0x2a38508#C:/dev/tools/Vagrant/embedded/gems/gems/vagrant-share-1.1.3/lib/vagrant-share.rb:39>}, #results_cache={}>}>)
INFO machine: Calling action: ssh on provider VirtualBox (d2575078-1bc6-448b-9c70-04e21249f33b)
INFO runner: Preparing hooks for middleware sequence...
INFO runner: 1 hooks defined.
INFO runner: Running action: #<Vagrant::Action::Builder:0x460fde8>
INFO warden: Calling IN action: #<VagrantPlugins::ProviderVirtualBox::Action::CheckVirtualbox:0x4686480>
INFO base: VBoxManage path: C:\Program Files\Oracle\VirtualBox\VBoxManage.exe
INFO subprocess: Starting process: ["C:\\Program Files\\Oracle\\VirtualBox\\VBoxManage.exe", "--version"]
INFO meta: Using VirtualBox driver: VagrantPlugins::ProviderVirtualBox::Driver::Version_4_3
INFO base: VBoxManage path: C:\Program Files\Oracle\VirtualBox\VBoxManage.exe
INFO subprocess: Starting process: ["C:\\Program Files\\Oracle\\VirtualBox\\VBoxManage.exe", "list", "hostonlyifs"]
INFO warden: Calling IN action: #<VagrantPlugins::ProviderVirtualBox::Action::CheckCreated:0x4686468>
INFO subprocess: Starting process: ["C:\\Program Files\\Oracle\\VirtualBox\\VBoxManage.exe", "showvminfo", "d2575078-1bc6-448b-9c70-04e21249f33b", "--machinereadable"]
INFO warden: Calling IN action: #<VagrantPlugins::ProviderVirtualBox::Action::CheckAccessible:0x4686450>
INFO subprocess: Starting process: ["C:\\Program Files\\Oracle\\VirtualBox\\VBoxManage.exe", "showvminfo", "d2575078-1bc6-448b-9c70-04e21249f33b", "--machinereadable"]
INFO warden: Calling IN action: #<VagrantPlugins::ProviderVirtualBox::Action::CheckRunning:0x4686438>
INFO subprocess: Starting process: ["C:\\Program Files\\Oracle\\VirtualBox\\VBoxManage.exe", "showvminfo", "d2575078-1bc6-448b-9c70-04e21249f33b", "--machinereadable"]
INFO warden: Calling IN action: #<Vagrant::Action::Builtin::SSHExec:0x4686420>
INFO subprocess: Starting process: ["C:\\Program Files\\Oracle\\VirtualBox\\VBoxManage.exe", "showvminfo", "d2575078-1bc6-448b-9c70-04e21249f33b", "--machinereadable"]
INFO subprocess: Starting process: ["C:\\Program Files\\Oracle\\VirtualBox\\VBoxManage.exe", "showvminfo", "d2575078-1bc6-448b-9c70-04e21249f33b", "--machinereadable"]
INFO subprocess: Starting process: ["C:\\Program Files (x86)\\Git\\bin/ssh.EXE"]
INFO ssh: Invoking SSH: ["vagrant@127.0.0.1", "-p", "2222", "-o", "Compression=yes", "-o", "DSAAuthentication=yes", "-o", "LogLevel=FATAL", "-o", "StrictHostKeyChecking=no", "-o", "UserKnownHostsFile=/dev/null", "-o", "IdentitiesOnly=yes", "-i", "C:/Users/vitallan/.vagrant.d/insecure_private_key"]
Does anyone know how I can SSH into my box?
EDIT: I tried opening new forwarded ports (as @user1389596 suggested), but it still doesn't work.
Hmm, ssh 127.0.0.1 is the loopback address (the same machine). Shouldn't you use the local LAN IP of your Ubuntu server instead?
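As a general debugging step here, you can ask Vagrant for the exact SSH parameters it uses and then try them by hand with verbose output (the identity-file path below is the one from the log; adjust for your user):
vagrant ssh-config
# prints Host, HostName (127.0.0.1), Port (2222), User and IdentityFile for the machine
ssh -v vagrant@127.0.0.1 -p 2222 -i C:/Users/vitallan/.vagrant.d/insecure_private_key
The -v output shows whether the TCP connection is opened at all. The "Bad file number" error from the Git-bundled ssh.exe on Windows usually means the connection could not be established in the first place, which points at the port forward or a local firewall rather than at SSH authentication.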

Running Selenium Grid through Vagrant

I'm trying to migrate from running my Selenium server and clients all on my Mac to having the server run in a Vagrant VM and the clients run locally on my Mac.
I'm using Vagrant 1.4.3 running on Mac OS X 10.9.1 to launch an Ubuntu 13.10 VM. Once the VM is launched, I install Java, Node.js and a few other dependencies that are required for my testing environment. After installing Selenium 2.39.0 (the latest as of this writing), here are the relevant configurations.
I SSH into my Vagrant VM and run the following:
java -jar /usr/local/bin/selenium-server-standalone-*.jar \
-role hub \
-trustAllSSLCertificates \
-hubConfig /vagrant/hub.json
/vagrant on the VM maps to the root of my project directory on my local Mac. Here's the relevant config from my Vagrantfile.
config.vm.box = "saucy64"
config.vm.box_url = "http://cloud-images.ubuntu.com/vagrant/saucy/20140202/saucy-server-cloudimg-amd64-vagrant-disk1.box"
# ...
config.vm.define "testing" do |test|
  test.vm.network :forwarded_port, guest: 3444, host: 4444
  test.vm.network :private_network, ip: "192.168.50.6"
  # ...
end
Here is the hub config that the Selenium Grid hub uses on the Vagrant VM. The Selenium hub listens on port 3444 inside the VM, which is port-mapped to 4444 outside the VM, facing my Mac.
{
  "browserTimeout": 180000,
  "capabilityMatcher": "org.openqa.grid.internal.utils.DefaultCapabilityMatcher",
  "cleanUpCycle": 2000,
  "maxSession": 5,
  "newSessionWaitTimeout": -1,
  "nodePolling": 2000,
  "port": 3444,
  "throwOnCapabilityNotPresent": true,
  "timeout": 30000
}
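Before digging into the node config, it may help to confirm that the forwarded port actually reaches the hub from the Mac (a quick check, assuming the standard Grid 2 console endpoint):
# from the Mac: port 4444 on the host is forwarded to 3444 inside the VM
curl -I http://127.0.0.1:4444/grid/console
# or over the private network, with no port forwarding involved
curl -I http://192.168.50.6:3444/grid/console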
Here's how I launch Selenium on my Mac as a node.
java -jar selenium-server-standalone-*.jar \
-role node \
-trustAllSSLCertificates \
-nodeConfig node.mac.json
And here's the node config which tries to talk to the Hub running inside Vagrant.
{
  "capabilities": [
    {
      "platform": "MAC",
      "seleniumProtocol": "WebDriver",
      "browserName": "firefox",
      "maxInstances": 1
    },
    {
      "platform": "MAC",
      "seleniumProtocol": "WebDriver",
      "browserName": "chrome",
      "maxInstances": 1
    }
  ],
  "configuration": {
    "proxy": "org.openqa.grid.selenium.proxy.DefaultRemoteProxy",
    "hubHost": "127.0.0.1",
    "hubPort": 4444,
    "hub": "http://127.0.0.1:4444/grid/register",
    "maxSession": 1,
    "port": 4445,
    "register": true,
    "registerCycle": 2000,
    "remoteHost": "http://127.0.0.1:4445",
    "role": "node",
    "url": "http://127.0.0.1:4445"
  }
}
Lastly, here's what I get in the Terminal on the Mac side.
Feb 02, 2014 9:29:07 PM org.openqa.grid.selenium.GridLauncher main
INFO: Launching a selenium grid node
21:29:18.706 INFO - Java: Oracle Corporation 24.51-b03
21:29:18.706 INFO - OS: Mac OS X 10.9.1 x86_64
21:29:18.713 INFO - v2.39.0, with Core v2.39.0. Built from revision ff23eac
21:29:18.773 INFO - Default driver org.openqa.selenium.ie.InternetExplorerDriver registration is skipped: registration capabilities Capabilities [{platform=WINDOWS, ensureCleanSession=true, browserName=internet explorer, version=}] does not match with current platform: MAC
21:29:18.802 INFO - RemoteWebDriver instances should connect to: http://127.0.0.1:4445/wd/hub
21:29:18.803 INFO - Version Jetty/5.1.x
21:29:18.804 INFO - Started HttpContext[/selenium-server/driver,/selenium-server/driver]
21:29:18.804 INFO - Started HttpContext[/selenium-server,/selenium-server]
21:29:18.804 INFO - Started HttpContext[/,/]
21:29:18.864 INFO - Started org.openqa.jetty.jetty.servlet.ServletHandler#593aa24f
21:29:18.864 INFO - Started HttpContext[/wd,/wd]
21:29:18.866 INFO - Started SocketListener on 0.0.0.0:4445
21:29:18.867 INFO - Started org.openqa.jetty.jetty.Server#48ef85f3
21:29:18.867 INFO - using the json request : {"class":"org.openqa.grid.common.RegistrationRequest","capabilities":[{"platform":"MAC","seleniumProtocol":"WebDriver","browserName":"firefox","maxInstances":1},{"platform":"MAC","seleniumProtocol":"WebDriver","browserName":"chrome","maxInstances":1},{"platform":"MAC","seleniumProtocol":"WebDriver","browserName":"iphone","maxInstances":1},{"platform":"MAC","seleniumProtocol":"WebDriver","browserName":"ipad","maxInstances":1}],"configuration":{"nodeConfig":"node.mac.json","port":4445,"host":"192.168.50.1","hubHost":"127.0.0.1","registerCycle":2000,"trustAllSSLCertificates":"","hub":"http://127.0.0.1:4444/grid/register","url":"http://127.0.0.1:4445","remoteHost":"http://127.0.0.1:4445","register":true,"proxy":"org.openqa.grid.selenium.proxy.DefaultRemoteProxy","maxSession":1,"role":"node","hubPort":4444}}
21:29:18.868 INFO - Starting auto register thread. Will try to register every 2000 ms.
21:29:18.868 INFO - Registering the node to hub :http://127.0.0.1:4444/grid/register
21:30:25.079 INFO - Registering the node to hub :http://127.0.0.1:4444/grid/register
21:31:31.254 INFO - Registering the node to hub :http://127.0.0.1:4444/grid/register
21:32:35.416 INFO - Registering the node to hub :http://127.0.0.1:4444/grid/register
21:33:41.581 INFO - Registering the node to hub :http://127.0.0.1:4444/grid/register
21:34:47.752 INFO - Registering the node to hub :http://127.0.0.1:4444/grid/register
21:35:51.908 INFO - Registering the node to hub :http://127.0.0.1:4444/grid/register
21:36:56.045 INFO - Registering the node to hub :http://127.0.0.1:4444/grid/register
21:38:00.189 INFO - Registering the node to hub :http://127.0.0.1:4444/grid/register
And here's what I get in the Terminal on the Vagrant VM side.
Feb 03, 2014 5:28:53 AM org.openqa.grid.selenium.GridLauncher main
INFO: Launching a selenium grid server
2014-02-03 05:28:54.780:INFO:osjs.Server:jetty-7.x.y-SNAPSHOT
2014-02-03 05:28:54.811:INFO:osjsh.ContextHandler:started o.s.j.s.ServletContextHandler{/,null}
2014-02-03 05:28:54.823:INFO:osjs.AbstractConnector:Started SocketConnector#0.0.0.0:3444
Feb 03, 2014 5:29:20 AM org.openqa.grid.selenium.proxy.DefaultRemoteProxy isAlive
WARNING: Failed to check status of node: Connection refused
Feb 03, 2014 5:29:22 AM org.openqa.grid.selenium.proxy.DefaultRemoteProxy isAlive
WARNING: Failed to check status of node: Connection refused
Feb 03, 2014 5:29:22 AM org.openqa.grid.selenium.proxy.DefaultRemoteProxy onEvent
WARNING: Marking the node as down. Cannot reach the node for 2 tries.
Feb 03, 2014 5:29:24 AM org.openqa.grid.selenium.proxy.DefaultRemoteProxy isAlive
WARNING: Failed to check status of node: Connection refused
Feb 03, 2014 5:29:26 AM org.openqa.grid.selenium.proxy.DefaultRemoteProxy isAlive
WARNING: Failed to check status of node: Connection refused
Feb 03, 2014 5:29:28 AM org.openqa.grid.selenium.proxy.DefaultRemoteProxy isAlive
WARNING: Failed to check status of node: Connection refused
Feb 03, 2014 5:29:30 AM org.openqa.grid.selenium.proxy.DefaultRemoteProxy isAlive
WARNING: Failed to check status of node: Connection refused
Feb 03, 2014 5:29:32 AM org.openqa.grid.selenium.proxy.DefaultRemoteProxy isAlive
WARNING: Failed to check status of node: Connection refused
Google turns up nothing useful for this situation. Can anybody help me determine why the hub and the node can't talk to each other?
I have a similar setup, where my Selenium server (a.k.a. hub) is on a remote VM and a client (a.k.a. node) is on my local machine. I've been seeing the same error:
Feb 04, 2014 5:29:22 PM org.openqa.grid.selenium.proxy.DefaultRemoteProxy isAlive
WARNING: Failed to check status of node: Connection refused
Feb 04, 2014 5:29:22 PM org.openqa.grid.selenium.proxy.DefaultRemoteProxy onEvent
WARNING: Marking the node as down. Cannot reach the node for 2 tries.
I talked to our Ops team, and they told me that my VM is sitting on a different network in a different location. Even though the node machine is able to reach the hub, the hub can never reach the node; it's like a one-way street. They suggested getting another VM that sits on the same network.
Hope that helps.
I don't know much about Selenium, but I suspect the issue is the use of 127.0.0.1. In particular, the VM has no way to connect back to the host on its own loopback address, and you don't forward port 4445.
Since you already specify a private_network address (192.168.50.6), you could try using it directly, without any port forwarding.
The first answer was partially correct. You do have to ensure that the communication path from the node to the hub and from the hub back to the node is clear, and that each side can connect on the specific ports: technically you are running two servers, one on the node listening on one port and one on the hub listening on another.
Try this:
I had the same problem, but fixed it by adding the host field:
"host": [ip or hostname of node],
Here is my node config file:
{
  "capabilities": [
    {
      "platform": "MAC",
      "browserName": "firefox",
      "version": "28",
      "maxInstances": 1
    },
    {
      "platform": "MAC",
      "browserName": "chrome",
      "version": "34",
      "maxInstances": 1
    }
  ],
  "configuration": {
    "port": 5556,
    "hubPort": 5555,
    "host": "10.50.10.101",     // this is the IP of my node
    "hubHost": "10.50.10.100",  // this is the IP of my grid hub
    "nodePolling": 2500,
    "registerCycle": 10500,
    "register": true,
    "cleanUpCycle": 2500,
    "maxSession": 5,
    "role": "node"
  }
}
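Applied to the original Vagrant setup, the same idea would be to register the node with the hub over the private network and to advertise an address the hub can reach back to. A sketch only (the 192.168.50.x addresses come from the Vagrantfile and the registration request above; the flags are the standard Grid 2 ones):
# on the Mac: talk to the hub directly over the private network,
# and advertise the Mac's address on that network instead of 127.0.0.1
java -jar selenium-server-standalone-*.jar \
-role node \
-hub http://192.168.50.6:3444/grid/register \
-host 192.168.50.1 \
-port 4445
Here -hub points at the hub's own port inside the VM, and -host is the address the hub will use to call the node back, so it must be reachable from inside the VM.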
