I need to run Docker on my new Windows 10 Home laptop.
Until recently, the Docker website said it didn't work on Windows 10; it now says you need Windows 7.1 or later.
But it also says that virtualization must be enabled.
Task Manager on my machine reports that virtualization is not enabled (though it does show "Hyper-V support: yes").
I gather that I need Windows 10 Pro to get virtualization capability.
Before I upgrade, can anyone confirm that Windows 10 Pro does indeed run Docker?
Does it run it without trouble?
First, we are talking about Windows hosting a Linux VM, with Docker running inside that Linux VM.
Windows itself won't support Docker natively before Windows Server 2016 SR3+.
Second, the Docker installation on Windows page says:
Your machine must be running Windows 7.1, 8/8.1 or newer to run Docker. Windows 10 is not currently supported
Actually (from this article), Windows 10 is also supported, but:
as it turns out, Hyper-V and VirtualBox will not run together side-by-side in 64-bit modes. And Scott's blog post about rebooting into a hypervisorlaunchtype off mode of Windows 8.1 worked flawlessly for Windows 10
See "Switch easily between VirtualBox and Hyper-V with a BCDEdit boot Entry in Windows 8.1" (which applies to Windows 10 as well)
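A minimal sketch of that boot-entry approach (run in an elevated Command Prompt; the entry name "No Hyper-V" is just an example, and the GUID placeholder must be replaced with the one bcdedit prints):

```shell
:: Copy the current boot entry; bcdedit prints the new entry's GUID
bcdedit /copy {current} /d "No Hyper-V"

:: Turn the hypervisor off for the copied entry (substitute the printed GUID)
bcdedit /set {GUID} hypervisorlaunchtype off
```

On reboot you can then pick "No Hyper-V" to run VirtualBox, or the original entry to get Hyper-V back for Docker.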
The issue is that although I have enabled virtualization everywhere, both WSL 2 and Docker Desktop see it as disabled.
Details:
Environment:
Windows 10 Pro, OS build 19043
x64-based system
FUJITSU Lifebook E546
motherboard FJNB291
Configuration:
Hyper-V: tried both enabled and disabled, checked in "Turn Windows features on or off"; I also tried enabling/disabling it via Command Prompt and PowerShell, which seemed to work fine
BIOS Virtualization Technology: enabled
BIOS Intel VT-d: tried both enabled and disabled
Virtualization-based security: enabled (I could not disable it anyway)
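Before toggling anything further, it may help to confirm what Windows itself reports; a quick check from an elevated Command Prompt might look like this (output labels can vary by Windows build):

```shell
:: Virtualization status as Windows sees it (firmware/CPU flags)
systeminfo | find "Virtualization"

:: Hypervisor launch setting in the boot configuration
:: (WSL 2 and Docker Desktop need this to be "auto", not "off")
bcdedit /enum | find "hypervisorlaunchtype"
```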
Try disabling the Hyper-V hypervisor.
Follow these steps:
1. Open Command Prompt as administrator.
2. Run bcdedit /set hypervisorlaunchtype off and press Enter.
Then restart your PC. Hope this works!
Reference: virtualbox - virtualization is enabled but not working
This is part of a migration from Vista to Windows 7. I now have a dual-boot computer, with Win7 as the preferred OS. From time to time I might need to go back to Vista to see how things were configured there, and then go back to Win7 to configure/install the same app there.
This is a computer that had very complex settings, and it was difficult and risky to upgrade in place, i.e. to install Win7 over Vista.
To avoid countless reboots, I would like to always run Win7 and, when needed, fire up VMware Workstation and start a Vista machine whose hard disk is the physical HDD where Vista currently resides. I would expect the VMware machine to run the OS installed on that HDD, and I would expect Vista not to notice that the hardware changed. My apps are not hardware-dependent.
Is this possible?
It's possible, and there are a few ways you could go about doing this.
The easy way
VMware Workstation allows you to boot from your existing partition/disk, but only if it's an IDE disk.
https://www.vmware.com/support/ws5/doc/ws_disk_dualboot.html
The hard way
You can capture the Windows Vista OS as a .wim image with the Windows deployment tool ImageX.exe, then use other tools to create a bootable ISO. You would have to update the image, though, every time enough changes accumulate in Vista that you want to see them in VMware.
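The capture step of the hard way might look roughly like this (the drive letters, image path, and image name are examples, not from the original):

```shell
:: Capture the Vista system partition (here D:) into a WIM image
imagex /capture D: C:\images\vista.wim "Vista dual-boot capture"

:: Later, apply image 1 from that WIM to a blank virtual disk mounted as V:
imagex /apply C:\images\vista.wim 1 V:
```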
I am trying to install Ubuntu on Windows using Oracle VM VirtualBox from an ISO disk image. When I try to install, the installer shows the prompt "The computer currently has no operating systems". I am wondering whether it should have detected my Windows operating system, or whether it is only trying to detect operating systems within VirtualBox.
I ask because it gives only two options to install:
-- Erase disk and install Ubuntu
-- Resize partition for Ubuntu
I do not want to erase all the files on my Windows operating system. Does anybody know what the 'disk' in this prompt means?
The installer is looking for OS installations on the virtual disk inside the VM, not on the host machine. You are perfectly safe selecting "Erase disk and install Ubuntu"; in most VM situations, that is the correct choice.
I have repeatedly failed to install Oracle 10g on my Windows 7 machine. I selected the right download for my operating system, but the installer opens a terminal that asks me to press Enter to exit while showing the error: Operating system version 6.1 is not supported -- Failed
I tried a hack found here: http://windows7bugs.wordpress.com/2010/02/18/install-oracle-10g-onward-database-developer-on-windows-7/ but still failed. The hack required me to alter the ?PATH\database\stage\prereq\db\refhost.xml as well as the ?PATH\database\stage\prereq\db_prereqs\db\refhost.xml files, adding these lines under CERTIFIED_SYSTEMS:
<OPERATING_SYSTEM>
<VERSION VALUE="6.1"/>
</OPERATING_SYSTEM>
Having done all this and many other hacks, I have still failed. I tried getting Windows XP Mode for Windows 7, found here: http://www.microsoft.com/windows/virtual-pc/download.aspx
to create an XP virtual environment so that Oracle could install, but I couldn't get it since I am a Windows 7 Home Premium, 64-bit user.
Has anyone running Windows 7 64-bit Home Premium (or any other Windows 7 edition) faced the same problem and managed to install Oracle? If so, please explain how you did it. Thanks in advance.
Oracle is an enterprise software tool, and as such it is only supported on the Professional editions of Windows. I agree that this is a bit provoking, especially when the OTN Download License appears to encourage self-learning, but that's the way it is.
Have you considered using a virtual image instead? Install Oracle's VirtualBox on your PC and then you can run the OTN Developer Day VM.
I have Windows file sharing enabled on an OS X 10.4 computer. It's accessible via \\rudy\myshare for all the Windows users on the network, except for one guy running Vista Ultimate 64-bit edition.
All the other users are running Vista or XP, all 32-bit. All the workgroup information is the same, and all log in with the same username/password.
The Vista 64 guy can see the Mac on the network, but his login is rejected every time.
Now, I imagine that Vista Ultimate has something configured differently from the Business version and XP, but I don't really know where to look. Any ideas?
Try changing the local security policy on that Vista box for "Local Policies\Security Options\Network Security: LAN manager authentication level" from “Send NTLMv2 response only” to “Send LM & NTLM - use NTLMv2 session security if negotiated”.
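If the Local Security Policy console isn't available on that box, the same setting can be changed via the registry; a value of 1 for LmCompatibilityLevel corresponds to "Send LM & NTLM - use NTLMv2 session security if negotiated" (run elevated and reboot afterwards):

```shell
reg add HKLM\SYSTEM\CurrentControlSet\Control\Lsa /v LmCompatibilityLevel /t REG_DWORD /d 1 /f
```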
No — I have successfully done this with my Vista 64-bit machine. You may want to try connecting using the machine's IP address instead, or check the log files on the Mac to see what the rejection error was.