Does Kinect for Windows v2 work with Parallels? - macOS

Does Kinect for Windows v2 work on a Mac Pro with Windows 8.1 running on top of Parallels?

Considering Kinect v2's minimum hardware requirements below (copied from this MSDN blog), it is not possible for Windows 8/8.1 running on top of Parallels to recognize and run Kinect v2. The latest version of Parallels (v10, as of the time of this answer) only supports DirectX 10, which is below the DirectX 11 minimum requirement. I have tried it myself with no success, even with Parallels Gaming Mode. Moreover, for the Kinect to be recognized you need the full USB 3.0 bandwidth.
An alternative solution, as discussed in this MSDN blog, is to use Windows To Go or to install Windows using Boot Camp.
Kinect v2's minimum required capabilities:
64-bit (x64) processor
4 GB Memory (or more)
i7 3.1 GHz (or higher)
Built-in USB 3.0 host controller (Intel or Renesas chipset).
If you’re adding USB 3.0 functionality to your existing PC through an adapter, please ensure that it is a Windows 8 compliant device and that it supports Gen-2. See the troubleshooting section of Getting Started for more information.
DX11 capable graphics adapter (see list of known good adapters below)
Intel HD 4400 integrated display adapter
ATI Radeon HD 5400 series
ATI Radeon HD 6570
ATI Radeon HD 7800 (256-bit GDDR5, 2 GB/1000 MHz)
NVidia Quadro 600
NVidia GeForce GT 640
NVidia GeForce GTX 660
NVidia Quadro K1000M
A Kinect v2 sensor, which includes a power hub and USB cabling.

Related

Android Emulator, graphical glitches on Windows 11

I have a Windows 11 machine with an RTX 3050 graphics card. It's a Dell G15 laptop. I cannot find a (good) solution to the graphical glitches that appear in the Android Emulator.
The only "solution" I found was to change hw.gpu.mode in the config.ini file from auto to guest. That fixes the glitches, but it causes really bad performance issues, and one app I developed with Flutter for my company straight up doesn't load (it loads when hw.gpu.mode=auto).
I'd appreciate it if you could point me in the right direction to solving this. Let me know if you need any other details about my machine:
OS: Windows 11 Home Single Language [64-bit]
Kernel: 10.0.22000.0
CPU: 11th Gen Intel(R) Core(TM) i5-11260H @ 2.60GHz
GPU: NVIDIA GeForce RTX 3050 Laptop GPU
Nvidia driver version: 516.94 (Downloaded the "Game ready driver" from GeForce Experience)
This problem occurs only on emulators running Android 12+; I personally installed one device with Android 11 and a second device with Android 13 for testing.
It seems like dedicated Nvidia GPUs are causing the problem. I have a 3060 laptop with the same issue, and when I set hw.gpu.mode to guest it seems to work. My guess is that this setting switches rendering from the GPU to the CPU. I would recommend trying to set Android Studio to use the integrated graphics instead of the dedicated GPU. Since I have an 8-core CPU and yours has fewer cores, I'm guessing that's why my performance doesn't suffer as badly.

What config is needed for 75 fps in a WebGL application?

I want to make a WebGL application (using Three.js) for the Oculus Rift DK2. The DK2 needs 75 fps rendering output for the best view, but I have a problem: one PC outputs 75 fps, while the other outputs only 60 fps.
My PC configs:
PC1 (outputs 75fps)
Intel Core i5
10 GB DDR3
GeForce GTX 650
Windows 10
Firefox Nightly with the WebVR add-on
PC2 (outputs 60fps)
Intel Core i7
16 GB DDR3
GeForce GTX 650
Windows 7
Firefox Nightly with the WebVR add-on
What in PC2's configuration prevents it from outputting 75 fps?
Here are some suggestions:
Oculus runtime 0.8 (direct mode only)
GeForce drivers v358.70 beta (as per the 0.8 runtime instructions)
Firefox Nightly w/ following flags (set via about:config)
dom.vr.enabled: true
gfx.vr.mirror-texture: true
layout.frame_rate: 75 (to match DK2 refresh)
https://mail.mozilla.org/pipermail/web-vr-discuss/2015-November/000929.html
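If it's unclear whether Firefox is actually rendering at 75 Hz once those flags are set, one quick sanity check is to measure how often requestAnimationFrame fires. The snippet below is a minimal sketch using only standard browser APIs (no Three.js or WebVR specifics assumed); it averages the callback interval over roughly two seconds and logs the estimated refresh rate.

// Estimate the refresh rate the browser is rendering at by averaging
// requestAnimationFrame intervals over a short sampling window.
function estimateRefreshRate(sampleMs: number = 2000): Promise<number> {
  return new Promise((resolve) => {
    let frames = 0;
    const start = performance.now();
    const tick = (now: number): void => {
      frames += 1;
      if (now - start < sampleMs) {
        requestAnimationFrame(tick);
      } else {
        resolve((frames * 1000) / (now - start));
      }
    };
    requestAnimationFrame(tick);
  });
}

estimateRefreshRate().then((fps) => {
  // Near 75 means the browser is synced to the DK2's native refresh;
  // ~60 means it is still vsynced to a 60 Hz display.
  console.log(`Estimated refresh rate: ${fps.toFixed(1)} fps`);
});

If PC2 reports ~60 even with direct mode and layout.frame_rate set to 75, the browser is most likely still syncing to a 60 Hz monitor rather than to the DK2.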

How to Render With OpenGL on a Tesla-equipped Windows-based Host

I used to think that Tesla cards would not support the OpenGL API, but I recently learned that Tesla products can also be used for visualization via OpenGL.
I have a workstation with two Intel E5 CPUs and one Tesla C2050. According to https://developer.nvidia.com/opengl-driver, the Tesla C2050 should support at least OpenGL version 3.
Now I'd like to run a rendering service program using OpenGL 3.3 on that workstation, but without success.
The following is what I tried.
If I log in through RDP remote desktop, the supported OpenGL version is 1.1 due to the virtual graphics adapter. I then used the tscon command to reconnect the session to the physical console; as a result, the RDP connection was lost. When I reconnected, all the windows had been resized to 800x600 and the detected OpenGL support was still 1.1.
If I log in with a monitor plugged into some kind of "integrated graphics adapter", the supported OpenGL version is still 1.1, maybe because the program was started on the screen attached to the basic adapter. But the Tesla GPU does not have a graphics output port.
I wonder how I should configure the host to enable the use of the Tesla GPU for OpenGL-based rendering.
I have solved this problem.
First, the Tesla C2050 is in fact a dedicated video card and has one DVI display port.
More importantly, the BIOS on the motherboard was set to boot on the integrated GPU. Changing this setting to the PCI-E card solved the problem of being unable to access the Tesla card.
Next, about graphics API support.
The official driver on the NVIDIA site offers support for OpenGL 4.4.
The Tesla card can then be used to render via OpenGL just the same as a Quadro or GeForce card. There's no notable difference, and no special configuration is necessary.

What do I need to do to get WebGL up and running on my Mac?

I've got a 2007 Mac Pro, 8 GB RAM, 2 x NVIDIA GeForce 7300 GT (256 MB). I tried to look at a couple of Google's WebGL demos, for example this one, but am unable to do so because my system is not WebGL compatible.
I'm running Lion and the latest version of Chrome - what else do I need to do? Or is my 'bleeding-edge' workstation now a relic of the past?
You need a compatible browser and halfway decent hardware (which you have).
See http://get.webgl.org for better instructions.
[EDIT!]
Actually, after looking through get.webgl.org a bit more, they explicitly state that your card is incompatible:
If you have the following graphics cards, WebGL is unsupported and is disabled by default:
Mac:
ATI Radeon HD2400
ATI Radeon 2600 series
ATI Radeon X1900
GeForce 7300 GT
This is probably because of driver bugs that they've found to affect the stability of the browser. (Most vendors have lousy OpenGL support, even on systems like the Mac!)
You may still be able to force WebGL on by navigating to about:flags in Chrome and seeing if it has an Enable WebGL option.
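As a quick check of what the browser actually exposes, a small script like the one below can help. It is a minimal sketch against standard browser APIs (nothing Chrome-specific assumed); it tries to create a WebGL context and prints the renderer string. The WEBGL_debug_renderer_info extension is optional and may not be available in every browser.

// Try to create a WebGL context and report which renderer backs it.
function checkWebGL(): void {
  const canvas = document.createElement("canvas");
  const gl = (canvas.getContext("webgl") ||
    canvas.getContext("experimental-webgl")) as WebGLRenderingContext | null;
  if (!gl) {
    console.log("WebGL is not available (unsupported or disabled by the browser).");
    return;
  }
  // Where permitted, this extension exposes the unmasked GPU name.
  const dbg = gl.getExtension("WEBGL_debug_renderer_info");
  const renderer = dbg
    ? gl.getParameter(dbg.UNMASKED_RENDERER_WEBGL)
    : gl.getParameter(gl.RENDERER);
  console.log("WebGL is available, renderer:", renderer);
}

checkWebGL();

If no context can be created at all on the GeForce 7300 GT, that is consistent with the blacklist above rather than a missing browser feature.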

What graphics cards can I use for Windows Phone 7 development?

I downloaded the new SDK (7.1) for Windows Phone 7 development. When I try to run a Silverlight application I get a message telling me my graphics card isn't supported, so the experience is downgraded. I can't run XNA programs at all.
I have a GeForce 6600 family card, which I thought would be good enough, but I guess not. Can anyone tell me some suitable graphics cards that are also inexpensive and will support dual monitors at 1920x1280?
The Windows Phone Emulator requires a DirectX 10 or later graphics card with WDDM 1.1 driver. AFAIK the latest certified drivers (certainly for Windows 7) for the GeForce 6600 family of graphics cards fit these requirements, so you may just need to update your drivers.
Do you have the newest graphics card drivers installed? I can run XNA games (tested with a Windows Phone) on Intel onboard graphics (an Intel HD 3200, I think).
If updating the driver doesn't help, choose a card from the current or the previous generation. How much money do you want to spend?
