Vagrant hangs after fixed port collision

I am trying to bring up Vagrant on a Windows machine.
It hangs after
Fixed port collision for 22 => 2222. Now on port 2200.
A part of the debug log is below:
DEBUG subprocess: Waiting for process to exit. Remaining to timeout: 32000
DEBUG subprocess: Exit status: 0
INFO warden: Calling IN action: #<Vagrant::Action::Builtin::HandleForwardedPortCollisions:0x44e7760>
DEBUG environment: Attempting to acquire process-lock: fpcollision
DEBUG environment: Attempting to acquire process-lock: dotlock
INFO environment: Acquired process lock: dotlock
INFO environment: Released process lock: dotlock
INFO environment: Acquired process lock: fpcollision
INFO handle_port_collisions: Detecting any forwarded port collisions...
DEBUG handle_port_collisions: Extra in use: []
DEBUG handle_port_collisions: Remap: {}
DEBUG handle_port_collisions: Repair: true
INFO handle_port_collisions: Attempting to repair FP collision: 2222
INFO handle_port_collisions: Repaired FP collision: 2222 to 2200
INFO interface: info: Fixed port collision for 22 => 2222. Now on port 2200.
INFO interface: info: ==> vlad: Fixed port collision for 22 => 2222. Now on port 2200.
==> vlad: Fixed port collision for 22 => 2222. Now on port 2200.
INFO environment: Released process lock: fpcollision
DEBUG environment: Attempting to acquire process-lock: dotlock
INFO environment: Acquired process lock: dotlock
INFO environment: Released process lock: dotlock
INFO warden: Calling IN action: #<VagrantPlugins::ProviderVirtualBox::Action::PrepareNFSValidIds:0x44431c8>
INFO subprocess: Starting process: ["C:/Program Files/Oracle/VirtualBox/VBoxManage.exe", "list", "vms"]
INFO subprocess: Command not in installer, restoring original environment...
DEBUG subprocess: Selecting on IO
DEBUG subprocess: stdout: "vlad_vlad" {efce349f-2b2e-40db-9a14-2298d3024638}
DEBUG subprocess: Waiting for process to exit. Remaining to timeout: 32000
DEBUG subprocess: Exit status: 0
INFO warden: Calling IN action: #<VagrantPlugins::SyncedFolderNFS::ActionCleanup:0x435c3a8>
DEBUG host: Searching for cap: nfs_prune
DEBUG host: Checking in: windows
INFO nfs: Host doesn't support pruning NFS. Skipping.
INFO warden: Calling IN action: #<Vagrant::Action::Builtin::SyncedFolderCleanup:0x4263ca8>
INFO subprocess: Starting process: ["C:\\Windows\\System32\\WindowsPowerShell\\v1.0/powershell.EXE", "-NoProfile", "-ExecutionPolicy", "Bypass", "$PSVersionTable.PSVersion.Major"]
INFO subprocess: Command not in installer, restoring original environment...
DEBUG subprocess: Selecting on IO
DEBUG subprocess: stdout: 2
I do not have the slightest idea of how to proceed. Any help is appreciated.

After updating PowerShell, Vagrant continued with the setup of the VM.
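(For context, the port-collision message itself is harmless: Vagrant automatically remaps a forwarded port that is already in use, which is exactly what the log above shows. A minimal Vagrantfile sketch of the setting involved, with illustrative values:)

Vagrant.configure("2") do |config|
  # The SSH forwarded port; auto_correct lets Vagrant move the host port
  # (here to 2200) when 2222 is already taken by another VM.
  config.vm.network "forwarded_port", guest: 22, host: 2222,
    id: "ssh", auto_correct: true
end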

Related

JMeter doesn't show results when executed on a slave, both GUI and CLI

Introduction:
I'm new to JMeter, so I'm working on basic tasks, but I have a mentor. We are working with a distributed architecture: one master and one slave on my local network, executing a real-world, working test plan.
Problem:
No response from the slave:
We have installed and configured JMeter 5.3 and Java 1.8.0_271 on both the slave and the master, which are Windows 10 machines. Each machine can ping the other. When the test plan is executed on the master, from the JMeter GUI or CLI, the slave recognizes the order and starts the process, and it also notifies the end, but nothing is written to the .jtl file other than the file header. We have tried with RMI enabled and disabled. The logs don't show errors.
System specs:
OS: Windows 10; JMeter 5.3; Java 1.8; JMeter plugin: jmeter-plugins-manager-1.4.jar (plugins installed: Custom Thread Groups, 3 Basic Graphs, Console Status Logger); Windows Firewall disabled on both machines; JMeter properties: master remote_hosts = slave IP, slave remote_hosts = 127.0.0.1.
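For reference, those properties correspond to entries like the following in jmeter.properties (a sketch; the slave IP is the one appearing in the logs below, and server.rmi.ssl.disable matches the SSL lines in the GUI log):

# jmeter.properties on the master
remote_hosts=192.168.1.135
server.rmi.ssl.disable=true
# jmeter.properties on the slave
remote_hosts=127.0.0.1
server.rmi.ssl.disable=true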
JMeter GUI execution log (Master):
2020-12-11 11:33:45,446 INFO o.a.j.e.DistributedRunner: Configuring remote engine: 192.168.1.135
2020-12-11 11:33:45,446 INFO o.a.j.r.RmiUtils: Disabling SSL for RMI as server.rmi.ssl.disable is set to 'true'
2020-12-11 11:33:45,624 INFO o.a.j.e.DistributedRunner: Starting distributed test with remote engines: [192.168.1.135] # Fri Dec 11 11:33:45 CET 2020 (1607682825623)
2020-12-11 11:33:45,624 INFO o.a.j.e.ClientJMeterEngine: running clientengine run method
2020-12-11 11:33:45,629 INFO o.a.j.r.RmiUtils: Disabling SSL for RMI as server.rmi.ssl.disable is set to 'true'
2020-12-11 11:33:45,629 INFO o.a.j.r.RmiUtils: Disabling SSL for RMI as server.rmi.ssl.disable is set to 'true'
2020-12-11 11:33:45,630 INFO o.a.j.s.BatchSampleSender: Using batching (client settings) for this run. Thresholds: num=100, time=60000
2020-12-11 11:33:45,630 INFO o.a.j.s.DataStrippingSampleSender: Using DataStrippingSampleSender for this run
2020-12-11 11:33:45,630 INFO o.a.j.r.RmiUtils: Disabling SSL for RMI as server.rmi.ssl.disable is set to 'true'
2020-12-11 11:33:45,630 INFO o.a.j.r.RmiUtils: Disabling SSL for RMI as server.rmi.ssl.disable is set to 'true'
2020-12-11 11:33:45,630 INFO o.a.j.s.BatchSampleSender: Using batching (client settings) for this run. Thresholds: num=100, time=60000
2020-12-11 11:33:45,630 INFO o.a.j.s.DataStrippingSampleSender: Using DataStrippingSampleSender for this run
2020-12-11 11:33:45,630 INFO o.a.j.r.RmiUtils: Disabling SSL for RMI as server.rmi.ssl.disable is set to 'true'
2020-12-11 11:33:45,630 INFO o.a.j.r.RmiUtils: Disabling SSL for RMI as server.rmi.ssl.disable is set to 'true'
2020-12-11 11:33:45,630 INFO o.a.j.r.RmiUtils: Disabling SSL for RMI as server.rmi.ssl.disable is set to 'true'
2020-12-11 11:33:45,630 INFO o.a.j.r.RmiUtils: Disabling SSL for RMI as server.rmi.ssl.disable is set to 'true'
2020-12-11 11:33:45,641 INFO o.a.j.r.RmiUtils: Disabling SSL for RMI as server.rmi.ssl.disable is set to 'true'
2020-12-11 11:33:45,641 INFO o.a.j.r.RmiUtils: Disabling SSL for RMI as server.rmi.ssl.disable is set to 'true'
2020-12-11 11:33:45,641 INFO o.a.j.s.BatchSampleSender: Using batching (client settings) for this run. Thresholds: num=100, time=60000
2020-12-11 11:33:45,641 INFO o.a.j.s.DataStrippingSampleSender: Using DataStrippingSampleSender for this run
2020-12-11 11:33:48,060 INFO o.a.j.e.ClientJMeterEngine: sent test to 192.168.1.135 basedir='.'
2020-12-11 11:33:48,060 INFO o.a.j.e.ClientJMeterEngine: Sending properties {}
2020-12-11 11:33:48,075 INFO o.a.j.e.ClientJMeterEngine: sent run command to 192.168.1.135
2020-12-11 11:33:48,075 INFO o.a.j.e.DistributedRunner: Remote engines have been started:[192.168.1.135]
2020-12-11 11:33:48,721 INFO o.a.j.g.u.JMeterMenuBar: setRunning(true, 192.168.1.135)
2020-12-11 11:35:49,030 INFO o.a.j.g.u.JMeterMenuBar: setRunning(false, 192.168.1.135)
JMeter CLI interface:
Creating summariser <summary>
Created the tree successfully using fileExample.jmx
Configuring remote engine: 192.168.1.135
Starting distributed test with remote engines: [192.168.1.135] # Fri Dec 11 12:41:46 CET 2020 (1607686906826)
Remote engines have been started:[192.168.1.135]
Waiting for possible Shutdown/StopTestNow/HeapDump/ThreadDump message on port 4445
summary = 0 in 00:00:00 = ******/s Avg: 0 Min: 9223372036854775807 Max: -9223372036854775808 Err: 0 (0.00%)
Tidying up remote # Fri Dec 11 12:43:51 CET 2020 (1607687031041)
... end of run
Guides and articles read:
Our own guide, which has worked in the past (installed in production), based on:
https://jmeter.apache.org/usermanual/jmeter_distributed_testing_step_by_step.html
https://jmeter.apache.org/usermanual/remote-test.html
JMeter Summary report in distributed mode
https://loadium.com/blog/jmeter-distributed-testing-step-by-step/
https://cwiki.apache.org/confluence/display/jmeter/JMeterFAQ#How_to_do_remote_testing_the_.27proper_way.27.3F
Question:
So the question is basically: what can be wrong that makes the slave recognize the start call and send the end signal, but not write the .jtl file?
EDIT WITH SOLUTION
The .csv file on which the test is based must be on the slave, inside the JMeter bin folder: /bin/data/yourFile.csv
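In other words, any data file the test plan reads has to exist at the path the slave expects, e.g. (paths illustrative):

rem run on each slave machine
copy yourFile.csv C:\jmeter\bin\data\yourFile.csv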
What about the slave log? Given the symptoms, I can think of 3 possible reasons:
1. Your test relies on an external data file, i.e. a CSV file or a .properties file; if this is the case, you need to copy all the dependent files to all the slave machines.
2. Your test relies on a plugin which is not installed on the slave(s); either copy your JMeter installation from the master to all the slaves or use the JMeter Plugins Manager to install the missing plugins.
3. Your RMI configuration is not correct, i.e. the port used for communication from the slave back to the master is random by default; you might want to set it explicitly and open it in the firewall (see the sketch below).
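For reason 3, the relevant jmeter.properties entries look like this (a sketch; the port numbers are illustrative, the point being that fixed ports can be opened in the firewall):

# on the master: fix the port the slaves use to send results back
client.rmi.localport=4000
# on the slave(s): fix the ports the server side listens on
server.rmi.localport=4000
server_port=1099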
To be able to tell more, I need to see your jmeter-server.log file. It is generated in the "bin" folder of your JMeter installation, given that you launch the slave process via the jmeter-server.bat file; if you use another approach, you can set the desired log file name/location via the -j command-line argument.

The Oozie job does not run, with the message [AM container is launched, waiting for AM container to Register with RM]

I ran a shell job from the Oozie examples.
However, the YARN application is not executed.
Detailed information from the YARN UI & logs:
https://docs.google.com/document/d/1N8LBXZGttY3rhRTwv8cUEfK3WkWtvWJ-YV1q_fh_kks/edit
The YARN application status is:
Application Priority: 0 (Higher Integer value indicates higher priority)
YarnApplicationState: ACCEPTED: waiting for AM container to be allocated, launched and register with RM.
Queue: default
FinalStatus Reported by AM: Application has not completed yet.
Finished: N/A
Elapsed: 20mins, 30sec
Tracking URL: ApplicationMaster
Log Aggregation Status: DISABLED
Application Timeout (Remaining Time): Unlimited
Diagnostics: AM container is launched, waiting for AM container to Register with RM
The application attempt status is:
Application Attempt State: FAILED
Elapsed: 13mins, 19sec
AM Container: container_1607273090037_0001_02_000001
Node: N/A
Tracking URL: History
Diagnostics Info: ApplicationMaster for attempt appattempt_1607273090037_0001_000002 timed out
                                           Node Local Request   Rack Local Request   Off Switch Request
Num Node Local Containers (satisfied by)   0
Num Rack Local Containers (satisfied by)   0                    0
Num Off Switch Containers (satisfied by)   0                    0                    1
NodeManager log:
2020-12-07 01:45:16,237 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.scheduler.ContainerScheduler: Starting container [container_1607273090037_0001_01_000001]
2020-12-07 01:45:16,267 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerImpl: Container container_1607273090037_0001_01_000001 transitioned from SCHEDULED to RUNNING
2020-12-07 01:45:16,267 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.monitor.ContainersMonitorImpl: Starting resource-monitoring for container_1607273090037_0001_01_000001
2020-12-07 01:45:16,272 INFO org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor: launchContainer: [bash, /tmp/hadoop-oozie/nm-local-dir/usercache/oozie/appcache/application_1607273090037_0001/container_1607273090037_0001_01_000001/default_container_executor.sh]
2020-12-07 01:45:17,301 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.monitor.ContainersMonitorImpl: container_1607273090037_0001_01_000001's ip = 127.0.0.1, and hostname = localhost.localdomain
2020-12-07 01:45:17,345 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.monitor.ContainersMonitorImpl: Skipping monitoring container container_1607273090037_0001_01_000001 since CPU usage is not yet available.
2020-12-07 01:45:48,274 INFO logs: Aliases are enabled
2020-12-07 01:54:50,242 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService: Cache Size Before Clean: 496756, Total Deleted: 0, Public Deleted: 0, Private Deleted: 0
2020-12-07 01:58:10,071 INFO SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for appattempt_1607273090037_0001_000001 (auth:SIMPLE)
2020-12-07 01:58:10,078 INFO org.apache.hadoop.yarn.server.nodemanager.containermanager.ContainerManagerImpl: Stopping container with container Id: container_1607273090037_0001_01_000001
What is the problem?

Vagrant up issue

We have a network that requires a proxy to go to the internet. I have a local Mac that we’re setting up for Drupal development, with HTTP_PROXY and ALL_PROXY set, and everything works fine using the bower, gem and librarian-puppet scripts from our development house to install these. I am now using a Vagrantfile they have created to set up a Vagrant (VirtualBox) machine, and I am hitting issues. I initially got issues early on in the script, but I have installed vagrant-proxyconf and added config.proxy.http, https and no_proxy lines to the file, which now gets a lot further before erroring with:
==> default: Info: Loading facts
==> default: Could not retrieve fact='nodejs_latest_version', resolution='': Connection refused - connect(2)
==> default: Could not retrieve fact='nodejs_stable_version', resolution='': Connection refused - connect(2)
==> default: Error: Invalid version string ''! at /tmp/vagrant-puppet/modules-71ca71d9c718e523371ddd53fd147634/nodejs/manifests/install.pp:165 on node drupal.dev
==> default: Error: Invalid version string ''! at /tmp/vagrant-puppet/modules-71ca71d9c718e523371ddd53fd147634/nodejs/manifests/install.pp:165 on node drupal.dev
The SSH command responded with a non-zero exit status. Vagrant assumes that this means the command failed. The output for this command should be in the log above. Please read the output to determine what went wrong.
I see ‘The SSH command responded with a non-zero exit status.’ a few times in the script execution, but this is the first one that stops the script from running. If I connect the machine directly to an internet connection the script runs through successfully, but this isn’t a long-term solution. I’m not sure if the issue is on the Mac or in the VirtualBox VM, or what file or setting I should be looking at. If anyone can help shed some light on where the issue might be, or where I should be looking, it would really be appreciated.
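For reference, the vagrant-proxyconf settings described above look roughly like this in the Vagrantfile (a sketch; the proxy address is a placeholder):

Vagrant.configure("2") do |config|
  # Only apply the proxy settings when the plugin is actually installed.
  if Vagrant.has_plugin?("vagrant-proxyconf")
    config.proxy.http     = "http://proxy.example.com:8080"
    config.proxy.https    = "http://proxy.example.com:8080"
    config.proxy.no_proxy = "localhost,127.0.0.1"
  end
end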
The debug output ends like this:
==> default: Info: Loading facts
DEBUG ssh: stderr: Could not retrieve fact='nodejs_latest_version', resolution='': Connection refused - connect(2)
INFO interface: info: Could not retrieve fact='nodejs_latest_version', resolution='': Connection refused - connect(2)
INFO interface: info: ==> default: Could not retrieve fact='nodejs_latest_version', resolution='': Connection refused - connect(2)
==> default: Could not retrieve fact='nodejs_latest_version', resolution='': Connection refused - connect(2)
DEBUG ssh: stderr: Could not retrieve fact='nodejs_stable_version', resolution='': Connection refused - connect(2)
INFO interface: info: Could not retrieve fact='nodejs_stable_version', resolution='': Connection refused - connect(2)
INFO interface: info: ==> default: Could not retrieve fact='nodejs_stable_version', resolution='': Connection refused - connect(2)
==> default: Could not retrieve fact='nodejs_stable_version', resolution='': Connection refused - connect(2)
DEBUG ssh: stderr: Error: Invalid version string ''! at /tmp/vagrant-puppet/modules-71ca71d9c718e523371ddd53fd147634/nodejs/manifests/install.pp:165 on node drupal.dev
INFO interface: info: Error: Invalid version string ''! at /tmp/vagrant-puppet/modules-71ca71d9c718e523371ddd53fd147634/nodejs/manifests/install.pp:165 on node drupal.dev
INFO interface: info: ==> default: Error: Invalid version string ''! at /tmp/vagrant-puppet/modules-71ca71d9c718e523371ddd53fd147634/nodejs/manifests/install.pp:165 on node drupal.dev
==> default: Error: Invalid version string ''! at /tmp/vagrant-puppet/modules-71ca71d9c718e523371ddd53fd147634/nodejs/manifests/install.pp:165 on node drupal.dev
DEBUG ssh: stderr: Error: Invalid version string ''! at /tmp/vagrant-puppet/modules-71ca71d9c718e523371ddd53fd147634/nodejs/manifests/install.pp:165 on node drupal.dev
INFO interface: info: Error: Invalid version string ''! at /tmp/vagrant-puppet/modules-71ca71d9c718e523371ddd53fd147634/nodejs/manifests/install.pp:165 on node drupal.dev
INFO interface: info: ==> default: Error: Invalid version string ''! at /tmp/vagrant-puppet/modules-71ca71d9c718e523371ddd53fd147634/nodejs/manifests/install.pp:165 on node drupal.dev
==> default: Error: Invalid version string ''! at /tmp/vagrant-puppet/modules-71ca71d9c718e523371ddd53fd147634/nodejs/manifests/install.pp:165 on node drupal.dev
DEBUG ssh: Exit status: 1
ERROR warden: Error occurred: The SSH command responded with a non-zero exit status. Vagrant assumes that this means the command failed. The output for this command should be in the log above. Please read the output to determine what went wrong.
INFO warden: Beginning recovery process...
INFO warden: Recovery complete.
ERROR warden: Error occurred: The SSH command responded with a non-zero exit status. Vagrant assumes that this means the command failed. The output for this command should be in the log above. Please read the output to determine what went wrong.
INFO warden: Beginning recovery process...
INFO warden: Calling recover: #
INFO warden: Beginning recovery process...
INFO warden: Calling recover: #
INFO warden: Beginning recovery process...
INFO warden: Recovery complete.
INFO warden: Recovery complete.
INFO warden: Recovery complete.
INFO warden: Beginning recovery process...
INFO warden: Recovery complete.
ERROR warden: Error occurred: The SSH command responded with a non-zero exit status. Vagrant assumes that this means the command failed. The output for this command should be in the log above. Please read the output to determine what went wrong.
INFO warden: Beginning recovery process...
INFO warden: Recovery complete.
ERROR warden: Error occurred: The SSH command responded with a non-zero exit status. Vagrant assumes that this means the command failed. The output for this command should be in the log above. Please read the output to determine what went wrong.
INFO warden: Beginning recovery process...
INFO warden: Calling recover: #
INFO warden: Beginning recovery process...
INFO warden: Recovery complete.
INFO warden: Recovery complete.
ERROR warden: Error occurred: The SSH command responded with a non-zero exit status. Vagrant assumes that this means the command failed. The output for this command should be in the log above. Please read the output to determine what went wrong.
INFO warden: Beginning recovery process...
INFO warden: Recovery complete.
ERROR warden: Error occurred: The SSH command responded with a non-zero exit status. Vagrant assumes that this means the command failed. The output for this command should be in the log above. Please read the output to determine what went wrong.
INFO warden: Beginning recovery process...
INFO warden: Calling recover: #
INFO warden: Beginning recovery process...
INFO warden: Recovery complete.
INFO warden: Recovery complete.
INFO warden: Beginning recovery process...
INFO warden: Recovery complete.
INFO warden: Beginning recovery process...
INFO warden: Recovery complete.
INFO environment: Released process lock: machine-action-d3538fe6302dd66edcdb4c05283597a6
INFO environment: Running hook: environment_unload
INFO runner: Preparing hooks for middleware sequence...
INFO runner: 2 hooks defined.
INFO runner: Running action: environment_unload #
ERROR vagrant: Vagrant experienced an error! Details:
ERROR vagrant: #
ERROR vagrant: The SSH command responded with a non-zero exit status. Vagrant assumes that this means the command failed. The output for this command should be in the log above. Please read the output to determine what went wrong.
ERROR vagrant: /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/plugins/communicators/ssh/communicator.rb:236:in `execute'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/plugins/communicators/ssh/communicator.rb:246:in `sudo'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/plugins/provisioners/puppet/provisioner/puppet.rb:251:in `run_puppet_apply'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/plugins/provisioners/puppet/provisioner/puppet.rb:124:in `provision'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/builtin/provision.rb:133:in `run_provisioner'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:95:in `call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:95:in `block in finalize_action'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:34:in `call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:34:in `call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/builder.rb:116:in `call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/runner.rb:66:in `block in run'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/util/busy.rb:19:in `busy'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/runner.rb:66:in `run'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/environment.rb:428:in `hook'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/builtin/provision.rb:121:in `call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/builtin/provision.rb:121:in `block in call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/builtin/provision.rb:103:in `each'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/builtin/provision.rb:103:in `call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:34:in `call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/plugins/providers/virtualbox/action/check_accessible.rb:18:in `call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:34:in `call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:95:in `block in finalize_action'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:34:in `call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:34:in `call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/builder.rb:116:in `call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/runner.rb:66:in `block in run'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/util/busy.rb:19:in `busy'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/runner.rb:66:in `run'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/builtin/call.rb:53:in `call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:34:in `call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:95:in `block in finalize_action'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:34:in `call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:34:in `call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/builder.rb:116:in `call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/runner.rb:66:in `block in run'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/util/busy.rb:19:in `busy'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/runner.rb:66:in `run'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/builtin/call.rb:53:in `call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:34:in `call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/builtin/config_validate.rb:25:in `call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:34:in `call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/plugins/providers/virtualbox/action/check_virtualbox.rb:17:in `call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:34:in `call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/builder.rb:116:in `call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/runner.rb:66:in `block in run'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/util/busy.rb:19:in `busy'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/runner.rb:66:in `run'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/machine.rb:214:in `action_raw'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/machine.rb:191:in `block in action'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/environment.rb:516:in `lock'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/machine.rb:178:in `call'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/machine.rb:178:in `action'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/plugins/commands/provision/command.rb:30:in `block in execute'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/plugin/v2/command.rb:226:in `block in with_target_vms'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/plugin/v2/command.rb:220:in `each'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/plugin/v2/command.rb:220:in `with_target_vms'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/plugins/commands/provision/command.rb:29:in `execute'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/cli.rb:42:in `execute'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/environment.rb:301:in `cli'
/opt/vagrant/embedded/gems/gems/vagrant-1.7.4/bin/vagrant:174:in `'
INFO interface: error: The SSH command responded with a non-zero exit status. Vagrant assumes that this means the command failed. The output for this command should be in the log above. Please read the output to determine what went wrong.
The SSH command responded with a non-zero exit status. Vagrant assumes that this means the command failed. The output for this command should be in the log above. Please read the output to determine what went wrong.
INFO interface: Machine: error-exit ["Vagrant::Errors::VagrantError", "The SSH command responded with a non-zero exit status. Vagrant\nassumes that this means the command failed. The output for this command\nshould be in the log above. Please read the output to determine what\nwent wrong."]

Can't connect to Ubuntu box from Windows host in Vagrant

I'm having problems trying to SSH into an Ubuntu box inside my Windows machine. I'm following the basic Getting Started tutorial from the Vagrant site (http://docs.vagrantup.com/v2/getting-started/up.html).
Software Versions:
Vagrant: 1.7.2
VirtualBox: 4.3.22
OS: Windows 7
Box: hashicorp/precise32
The scenario:
When I run vagrant up everything goes according to plan: services and the VM start, but the final step, in which Vagrant tests the SSH connection, fails after several Warning: Connection refused. Retrying... messages.
After that, when I open the VirtualBox GUI, I can log in to the box and check that openssh-server is running. When I try to SSH through the command line I get this error:
ssh 127.0.0.1 -p 2222
ssh: connect to host 127.0.0.1 port 2222: Bad file number
When I run vagrant ssh (with the logs at INFO level), I get this output:
INFO global: Vagrant version: 1.7.2
INFO global: Ruby version: 2.0.0
INFO global: RubyGems version: 2.0.14
INFO global: VAGRANT_EXECUTABLE="C:\\dev\\tools\\Vagrant\\embedded\\gems\\gems\\vagrant-1.7.2\\bin\\vagrant"
INFO global: VAGRANT_INSTALLER_EMBEDDED_DIR="C:\\dev\\tools\\Vagrant\\embedded"
INFO global: VAGRANT_INSTALLER_ENV="1"
INFO global: VAGRANT_INSTALLER_VERSION="2"
INFO global: VAGRANT_INTERNAL_BUNDLERIZED="1"
INFO global: VAGRANT_LOG="INFO"
INFO global: Plugins:
INFO global: - bundler = 1.7.11
INFO global: - mime-types = 1.25.1
INFO global: - rdoc = 4.0.0
INFO global: - rest-client = 1.6.8
INFO global: - vagrant-share = 1.1.3
INFO manager: Registered plugin: box command
INFO manager: Registered plugin: destroy command
INFO manager: Registered plugin: global-status command
INFO manager: Registered plugin: halt command
INFO manager: Registered plugin: help command
INFO manager: Registered plugin: init command
INFO manager: Registered plugin: list-commands command
INFO manager: Registered plugin: vagrant-login
INFO manager: Registered plugin: package command
INFO manager: Registered plugin: plugin command
INFO manager: Registered plugin: provision command
INFO manager: Registered plugin: push command
INFO manager: Registered plugin: rdp command
INFO manager: Registered plugin: reload command
INFO manager: Registered plugin: resume command
INFO manager: Registered plugin: ssh command
INFO manager: Registered plugin: ssh-config command
INFO manager: Registered plugin: status command
INFO manager: Registered plugin: suspend command
INFO manager: Registered plugin: up command
INFO manager: Registered plugin: version command
INFO manager: Registered plugin: ssh communicator
INFO manager: Registered plugin: winrm communicator
INFO manager: Registered plugin: Arch guest
INFO manager: Registered plugin: CoreOS guest
INFO manager: Registered plugin: Darwin guest
INFO manager: Registered plugin: Debian guest
INFO manager: Registered plugin: ESXi guest.
INFO manager: Registered plugin: Fedora guest
INFO manager: Registered plugin: FreeBSD guest
INFO manager: Registered plugin: Funtoo guest
INFO manager: Registered plugin: Gentoo guest
INFO manager: Registered plugin: Linux guest.
INFO manager: Registered plugin: Mint guest
INFO manager: Registered plugin: NetBSD guest
INFO manager: Registered plugin: NixOS guest
INFO manager: Registered plugin: OmniOS guest.
INFO manager: Registered plugin: OpenBSD guest
INFO manager: Registered plugin: PLD Linux guest
INFO manager: Registered plugin: RedHat guest
INFO manager: Registered plugin: SmartOS guest.
INFO manager: Registered plugin: Solaris guest.
INFO manager: Registered plugin: Solaris 11 guest.
INFO manager: Registered plugin: SUSE guest
INFO manager: Registered plugin: TinyCore Linux guest.
INFO manager: Registered plugin: Ubuntu guest
INFO manager: Registered plugin: Windows guest.
INFO manager: Registered plugin: Arch host
INFO manager: Registered plugin: BSD host
INFO manager: Registered plugin: Mac OS X host
INFO manager: Registered plugin: FreeBSD host
INFO manager: Registered plugin: Gentoo host
INFO manager: Registered plugin: Linux host
INFO manager: Registered plugin: null host
INFO manager: Registered plugin: Red Hat host
INFO manager: Registered plugin: Slackware host
INFO manager: Registered plugin: SUSE host
INFO manager: Registered plugin: Windows host
INFO manager: Registered plugin: kernel
INFO manager: Registered plugin: kernel
INFO manager: Registered plugin: docker-provider
INFO manager: Registered plugin: Hyper-V provider
INFO manager: Registered plugin: VirtualBox provider
INFO manager: Registered plugin: ansible
INFO manager: Registered plugin: CFEngine Provisioner
INFO manager: Registered plugin: chef
INFO manager: Registered plugin: docker
INFO manager: Registered plugin: file
INFO manager: Registered plugin: puppet
INFO manager: Registered plugin: salt
INFO manager: Registered plugin: shell
INFO manager: Registered plugin: atlas
INFO manager: Registered plugin: ftp
INFO manager: Registered plugin: heroku
INFO manager: Registered plugin: local-exec
INFO manager: Registered plugin: noop
INFO manager: Registered plugin: NFS synced folders
INFO manager: Registered plugin: RSync synced folders
INFO manager: Registered plugin: SMB synced folders
INFO global: Loading plugins!
INFO manager: Registered plugin: vagrant-share
INFO vagrant: `vagrant` invoked: ["ssh"]
INFO environment: Environment initialized (#<Vagrant::Environment:0x36893d0>)
INFO environment: - cwd: C:/dev/local-server
INFO environment: Home path: C:/Users/vitallan/.vagrant.d
INFO environment: Local data path: C:/dev/local-server/.vagrant
INFO environment: Running hook: environment_plugins_loaded
INFO runner: Preparing hooks for middleware sequence...
INFO runner: 1 hooks defined.
INFO runner: Running action: #<Vagrant::Action::Builder:0x2c494c8>
INFO environment: Running hook: environment_load
INFO runner: Preparing hooks for middleware sequence...
INFO runner: 1 hooks defined.
INFO runner: Running action: #<Vagrant::Action::Builder:0x2cc93a0>
INFO cli: CLI: [] "ssh" []
INFO loader: Set :root = #<Pathname:C:/dev/local-server/Vagrantfile>
INFO loader: Loading configuration in order: [:home, :root]
INFO command: Active machine found with name default. Using provider: virtualbox
INFO environment: Getting machine: default (virtualbox)
INFO environment: Uncached load of machine.
INFO base: VBoxManage path: C:\Program Files\Oracle\VirtualBox\VBoxManage.exe
INFO subprocess: Starting process: ["C:\\Program Files\\Oracle\\VirtualBox\\VBoxManage.exe", "--version"]
INFO meta: Using VirtualBox driver: VagrantPlugins::ProviderVirtualBox::Driver::Version_4_3
INFO base: VBoxManage path: C:\Program Files\Oracle\VirtualBox\VBoxManage.exe
INFO loader: Set "27710496_machine_default" = []
INFO loader: Loading configuration in order: [:home, :root, "27710496_machine_default"]
INFO box_collection: Box found: hashicorp/precise32 (virtualbox)
INFO environment: Running hook: authenticate_box_url
INFO host: Autodetecting host type for [#<Vagrant::Environment: C:/dev/local-server>]
INFO host: Detected: windows!
INFO runner: Preparing hooks for middleware sequence...
INFO runner: 2 hooks defined.
INFO runner: Running action: #<Vagrant::Action::Builder:0x376dbc8>
INFO warden: Calling IN action: #<VagrantPlugins::LoginCommand::AddAuthentication:0x4727a18>
INFO warden: Calling OUT action: #<VagrantPlugins::LoginCommand::AddAuthentication:0x4727a18>
INFO loader: Set :"28616568_hashicorp/precise32_virtualbox" = #<Pathname:C:/Users/vitallan/.vagrant.d/boxes/hashicorp-VAGRANTSLASH-precise32/1.0.0/virtualbox/Vagrantfile>
INFO loader: Loading configuration in order: [:"28616568_hashicorp/precise32_virtualbox", :home, :root, "27710496_machine_default"]
INFO machine: Initializing machine: default
INFO machine: - Provider: VagrantPlugins::ProviderVirtualBox::Provider
INFO machine: - Box: #<Vagrant::Box:0x474dfb0>
INFO machine: - Data dir: C:/dev/local-server/.vagrant/machines/default/virtualbox
INFO base: VBoxManage path: C:\Program Files\Oracle\VirtualBox\VBoxManage.exe
INFO subprocess: Starting process: ["C:\\Program Files\\Oracle\\VirtualBox\\VBoxManage.exe", "--version"]
INFO meta: Using VirtualBox driver: VagrantPlugins::ProviderVirtualBox::Driver::Version_4_3
INFO base: VBoxManage path: C:\Program Files\Oracle\VirtualBox\VBoxManage.exe
INFO subprocess: Starting process: ["C:\\Program Files\\Oracle\\VirtualBox\\VBoxManage.exe", "showvminfo", "d2575078-1bc6-448b-9c70-04e21249f33b"]
INFO subprocess: Starting process: ["C:\\Program Files\\Oracle\\VirtualBox\\VBoxManage.exe", "showvminfo", "d2575078-1bc6-448b-9c70-04e21249f33b", "--machinereadable"]
INFO command: With machine: default (#<VagrantPlugins::ProviderVirtualBox::Provider:0x487e4f8 @logger=#<Log4r::Logger:0x487e4c8 @fullname="vagrant::provider::virtualbox", @outputters=[], @additive=true, @name="virtualbox", @path="vagrant::provider", @parent=#<Log4r::Logger:0x37e65e8 @fullname="vagrant", @outputters=[#<Log4r::StderrOutputter:0x375fd50 @mon_owner=nil, @mon_count=0, @mon_mutex=#<Mutex:0x375fc18>, @name="stderr", @level=0, @formatter=#<Log4r::DefaultFormatter:0x375dbd8 @depth=7>, @out=#<IO:<STDERR>>>], @additive=true, @name="vagrant", @path="", @parent=#<Log4r::RootLogger:0x37e6540 @level=0, @outputters=[]>, @level=2, @trace=false>, @level=2, @trace=false>, @machine=#<Vagrant::Machine: default (VagrantPlugins::ProviderVirtualBox::Provider)>, @driver=#<VagrantPlugins::ProviderVirtualBox::Driver::Meta:0x4897560 @logger=#<Log4r::Logger:0x48a4148 @fullname="vagrant::provider::virtualbox::meta", @outputters=[], @additive=true, @name="meta", @path="vagrant::provider::virtualbox", @parent=#<Log4r::Logger:0x487e4c8 @fullname="vagrant::provider::virtualbox", @outputters=[], @additive=true, @name="virtualbox", @path="vagrant::provider", @parent=#<Log4r::Logger:0x37e65e8 @fullname="vagrant", @outputters=[#<Log4r::StderrOutputter:0x375fd50 @mon_owner=nil, @mon_count=0, @mon_mutex=#<Mutex:0x375fc18>, @name="stderr", @level=0, @formatter=#<Log4r::DefaultFormatter:0x375dbd8 @depth=7>, @out=#<IO:<STDERR>>>], @additive=true, @name="vagrant", @path="", @parent=#<Log4r::RootLogger:0x37e6540 @level=0, @outputters=[]>, @level=2, @trace=false>, @level=2, @trace=false>, @level=2, @trace=false>, @interrupted=false, @vboxmanage_path="C:\\Program Files\\Oracle\\VirtualBox\\VBoxManage.exe", @uuid="d2575078-1bc6-448b-9c70-04e21249f33b", @version="4.3.22", @driver=#<VagrantPlugins::ProviderVirtualBox::Driver::Version_4_3:0x48d4568 @logger=#<Log4r::Logger:0x48ed108 @fullname="vagrant::provider::virtualbox_4_3", @outputters=[], @additive=true, @name="virtualbox_4_3", @path="vagrant::provider", @parent=#<Log4r::Logger:0x37e65e8 @fullname="vagrant", @outputters=[#<Log4r::StderrOutputter:0x375fd50 @mon_owner=nil, @mon_count=0, @mon_mutex=#<Mutex:0x375fc18>, @name="stderr", @level=0, @formatter=#<Log4r::DefaultFormatter:0x375dbd8 @depth=7>, @out=#<IO:<STDERR>>>], @additive=true, @name="vagrant", @path="", @parent=#<Log4r::RootLogger:0x37e6540 @level=0, @outputters=[]>, @level=2, @trace=false>, @level=2, @trace=false>, @interrupted=false, @vboxmanage_path="C:\\Program Files\\Oracle\\VirtualBox\\VBoxManage.exe", @uuid="d2575078-1bc6-448b-9c70-04e21249f33b">>, @cap_logger=#<Log4r::Logger:0x491d600 @fullname="vagrant::capability_host::vagrantplugins::providervirtualbox::provider", @outputters=[], @additive=true, @name="provider", @path="vagrant::capability_host::vagrantplugins::providervirtualbox", @parent=#<Log4r::Logger:0x37e65e8 @fullname="vagrant", @outputters=[#<Log4r::StderrOutputter:0x375fd50 @mon_owner=nil, @mon_count=0, @mon_mutex=#<Mutex:0x375fc18>, @name="stderr", @level=0, @formatter=#<Log4r::DefaultFormatter:0x375dbd8 @depth=7>, @out=#<IO:<STDERR>>>], @additive=true, @name="vagrant", @path="", @parent=#<Log4r::RootLogger:0x37e6540 @level=0, @outputters=[]>, @level=2, @trace=false>, @level=2, @trace=false>, @cap_host_chain=[[:virtualbox, #<#<Class:0x491d8d0>:0x4b76518>]], @cap_args=[#<Vagrant::Machine: default (VagrantPlugins::ProviderVirtualBox::Provider)>], @cap_caps={:docker=>#<Vagrant::Registry:0x491d7f8 @items={:public_address=>#<Proc:0x3806538@C:/dev/tools/Vagrant/embedded/gems/gems/vagrant-1.7.2/plugins/providers/docker/plugin.rb:54>, :proxy_machine=>#<Proc:0x3806490@C:/dev/tools/Vagrant/embedded/gems/gems/vagrant-1.7.2/plugins/providers/docker/plugin.rb:59>}, @results_cache={}>, :hyperv=>#<Vagrant::Registry:0x491d780 @items={:public_address=>#<Proc:0x380ed10@C:/dev/tools/Vagrant/embedded/gems/gems/vagrant-1.7.2/plugins/providers/hyperv/plugin.rb:25>}, @results_cache={}>, :virtualbox=>#<Vagrant::Registry:0x491d708 @items={:forwarded_ports=>#<Proc:0x383fcd0@C:/dev/tools/Vagrant/embedded/gems/gems/vagrant-1.7.2/plugins/providers/virtualbox/plugin.rb:27>, :nic_mac_addresses=>#<Proc:0x383fc58@C:/dev/tools/Vagrant/embedded/gems/gems/vagrant-1.7.2/plugins/providers/virtualbox/plugin.rb:32>, :public_address=>#<Proc:0x2a38508@C:/dev/tools/Vagrant/embedded/gems/gems/vagrant-share-1.1.3/lib/vagrant-share.rb:39>}, @results_cache={}>}>)
INFO machine: Calling action: ssh on provider VirtualBox (d2575078-1bc6-448b-9c70-04e21249f33b)
INFO runner: Preparing hooks for middleware sequence...
INFO runner: 1 hooks defined.
INFO runner: Running action: #<Vagrant::Action::Builder:0x460fde8>
INFO warden: Calling IN action: #<VagrantPlugins::ProviderVirtualBox::Action::CheckVirtualbox:0x4686480>
INFO base: VBoxManage path: C:\Program Files\Oracle\VirtualBox\VBoxManage.exe
INFO subprocess: Starting process: ["C:\\Program Files\\Oracle\\VirtualBox\\VBoxManage.exe", "--version"]
INFO meta: Using VirtualBox driver: VagrantPlugins::ProviderVirtualBox::Driver::Version_4_3
INFO base: VBoxManage path: C:\Program Files\Oracle\VirtualBox\VBoxManage.exe
INFO subprocess: Starting process: ["C:\\Program Files\\Oracle\\VirtualBox\\VBoxManage.exe", "list", "hostonlyifs"]
INFO warden: Calling IN action: #<VagrantPlugins::ProviderVirtualBox::Action::CheckCreated:0x4686468>
INFO subprocess: Starting process: ["C:\\Program Files\\Oracle\\VirtualBox\\VBoxManage.exe", "showvminfo", "d2575078-1bc6-448b-9c70-04e21249f33b", "--machinereadable"]
INFO warden: Calling IN action: #<VagrantPlugins::ProviderVirtualBox::Action::CheckAccessible:0x4686450>
INFO subprocess: Starting process: ["C:\\Program Files\\Oracle\\VirtualBox\\VBoxManage.exe", "showvminfo", "d2575078-1bc6-448b-9c70-04e21249f33b", "--machinereadable"]
INFO warden: Calling IN action: #<VagrantPlugins::ProviderVirtualBox::Action::CheckRunning:0x4686438>
INFO subprocess: Starting process: ["C:\\Program Files\\Oracle\\VirtualBox\\VBoxManage.exe", "showvminfo", "d2575078-1bc6-448b-9c70-04e21249f33b", "--machinereadable"]
INFO warden: Calling IN action: #<Vagrant::Action::Builtin::SSHExec:0x4686420>
INFO subprocess: Starting process: ["C:\\Program Files\\Oracle\\VirtualBox\\VBoxManage.exe", "showvminfo", "d2575078-1bc6-448b-9c70-04e21249f33b", "--machinereadable"]
INFO subprocess: Starting process: ["C:\\Program Files\\Oracle\\VirtualBox\\VBoxManage.exe", "showvminfo", "d2575078-1bc6-448b-9c70-04e21249f33b", "--machinereadable"]
INFO subprocess: Starting process: ["C:\\Program Files (x86)\\Git\\bin/ssh.EXE"]
INFO ssh: Invoking SSH: ["vagrant@127.0.0.1", "-p", "2222", "-o", "Compression=yes", "-o", "DSAAuthentication=yes", "-o", "LogLevel=FATAL", "-o", "StrictHostKeyChecking=no", "-o", "UserKnownHostsFile=/dev/null", "-o", "IdentitiesOnly=yes", "-i", "C:/Users/vitallan/.vagrant.d/insecure_private_key"]
Does anyone know how I can SSH into my box?
EDIT: I tried opening new forwarded ports (as @user1389596 suggested), but it still doesn't work.
Hmm, ssh 127.0.0.1 is the loopback address (the same machine). Shouldn't you use the local LAN IP of your Ubuntu server instead?
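For what it's worth, with Vagrant's default NAT networking 127.0.0.1:2222 is the expected address: VirtualBox forwards host port 2222 to guest port 22. Whether that forwarding rule actually exists can be checked from the host, e.g. (the VM name is illustrative; VBoxManage list vms shows the real one):

VBoxManage showvminfo "local-server_default" --machinereadable | findstr Forwarding
rem expected to print something like: Forwarding(0)="ssh,tcp,127.0.0.1,2222,,22"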

Vagrant hangs on Windows 7

vagrant up seems to hang on Windows 7
My Vagrantfile:
VAGRANTFILE_API_VERSION = "2"
ENV['VAGRANT_DEFAULT_PROVIDER'] = 'docker'
Vagrant.configure("2") do |config|
  config.vm.network "forwarded_port", guest: 80, host: 8080, auto_correct: true
  config.vm.define "elk" do |elk|
    elk.vm.synced_folder "./www", "/var/www"
    elk.vm.provider "docker" do |d|
      d.build_dir = "./Docker"
    end
  end
end
Output of vagrant up --debug:
==> elk: Syncing folders to the host VM...
INFO machine: Calling action: sync_folders on provider VirtualBox (3c7dc34c-6fcf-4ace-87d1-0602b664e783)
DEBUG environment: Attempting to acquire process-lock: machine-action-740c202843bdf6334148bb69e000ec99
DEBUG environment: Attempting to acquire process-lock: dotlock
INFO environment: Acquired process lock: dotlock
INFO environment: Released process lock: dotlock
INFO environment: Acquired process lock: machine-action-740c202843bdf6334148bb69e000ec99
INFO environment: Released process lock: machine-action-740c202843bdf6334148bb69e000ec99
INFO runner: Preparing hooks for middleware sequence...
INFO runner: 1 hooks defined.
INFO runner: Running action: #<Vagrant::Action::Builder:0x3786018>
INFO warden: Calling IN action: #<Vagrant::Action::Builtin::SyncedFolders:0x38079d8>
INFO subprocess: Starting process: ["C:\\windows\\System32\\WindowsPowerShell\\v1.0\\/powershell.EXE", "-NoProfile", "-ExecutionPolicy", "Bypass", "$PSVersionTable.PSVersion.Major"]
DEBUG subprocess: Selecting on IO
DEBUG subprocess: stdout: 2
Found a related Vagrant issue here:
https://github.com/mitchellh/vagrant/issues/3139
Updating PowerShell to v3.0 solved the issue:
http://www.microsoft.com/en-us/download/details.aspx?id=34595
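A quick way to check the installed major version before and after updating (this is the same query Vagrant itself runs, as seen in the debug output above):

powershell -NoProfile -ExecutionPolicy Bypass -Command "$PSVersionTable.PSVersion.Major"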
