Incorrect GL_VENDOR and GL_RENDERER in Chromium build on macOS

I'm building Chromium under macOS Catalina (10.15.7) following the instructions at https://chromium.googlesource.com/chromium/src/+/master/docs/mac_build_instructions.md.
The compilation succeeds, but the resulting binary has problems with hardware acceleration: the browser renders some pages with strange artifacts (black rectangles in various places on the page).
Output of chrome://gpu:
Driver vendor: ANGLE
Driver version: 2.1.14218
Pixel shader version: 1.00
Vertex shader version: 1.00
Max. MSAA samples: 4
GL_VENDOR: Google Inc.
GL_RENDERER: ANGLE (ATI Technologies Inc., AMD Radeon Pro 570 OpenGL Engine, OpenGL 4.1 core)
GL_VERSION: OpenGL ES 2.0.0 (ANGLE 2.1.14218 git hash: f9e59ad07855)
But the Chromium binary downloaded from https://chromium.woolyss.com/ doesn't have these rendering and hardware-acceleration problems.
Downloaded Chromium's output of chrome://gpu:
Driver vendor: ATI
Driver version: 3.10.19
Pixel shader version: 4.10
Vertex shader version: 4.10
Max. MSAA samples: 8
GL_VENDOR: ATI Technologies Inc.
GL_RENDERER: AMD Radeon Pro 570 OpenGL Engine
GL_VERSION: 4.1 ATI-3.10.19
The Chromium versions are absolutely identical. For some reason my build chooses ANGLE as the GL renderer instead of the native ATI driver.
I use the following args.gn to build:
ffmpeg_branding="Chrome"
is_official_build=true
proprietary_codecs=true
is_component_build=false
enable_stripping=true
is_debug=false
enable_nacl=false
blink_symbol_level=0
strip_absolute_paths_from_debug_symbols=true
symbol_level=0
dcheck_always_on=false
enable_plugins=true
enable_pdf=true
If anyone has encountered a similar problem, where should I look to fix this?
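For comparison outside the browser, the strings chrome://gpu reports come straight from glGetString on the underlying context, so a standalone probe shows what the OS hands a plain native GL context. A minimal sketch, assuming GLFW is installed:

// Prints the same GL_VENDOR / GL_RENDERER / GL_VERSION strings that
// chrome://gpu reports, but for a native context created by GLFW.
#include <GLFW/glfw3.h>
#include <cstdio>

int main() {
    if (!glfwInit()) return 1;
    glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);  // offscreen probe window
    GLFWwindow* win = glfwCreateWindow(64, 64, "gl-probe", nullptr, nullptr);
    if (!win) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(win);
    printf("GL_VENDOR:   %s\n", (const char*)glGetString(GL_VENDOR));
    printf("GL_RENDERER: %s\n", (const char*)glGetString(GL_RENDERER));
    printf("GL_VERSION:  %s\n", (const char*)glGetString(GL_VERSION));
    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}

If this probe prints the AMD strings, the OS side is fine and the difference lies in which backend the build selects at startup; Chromium builds from this period also accepted a --use-gl=desktop command-line switch to force the native backend instead of ANGLE, which may be worth trying when comparing the two binaries.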

Related

Can I use Alea.cuBase / Alea GPU with CUDA 8.0?

I just tried to run the Alea TK samples on a machine with a GTX 1070, and:
CUDA 7.5 installs, but doesn't seem to work there. NVIDIA says CUDA 8.0RC should be used with this GPU: https://devtalk.nvidia.com/default/topic/949823/cuda-setup-and-installation/when-the-cuda-toolkit-will-support-gtx1070-graphics-card-/
CUDA 8.0 also installs successfully there, but it seems all the bindings in Alea.cuBase target CUDA 7.5; basically, all samples fail when trying to load CUDA 7.5's "cu*64_75.dll" libraries, even though the 8.0 version includes similar ones with an "_80" suffix.
The same samples run without any issues on machines with older GPUs (and thus CUDA 7.5).
Is there any way to address this, or should I wait for an updated version of Alea.cuBase?
The GTX 1070 contains a new GPU based on the Pascal architecture. Starting with Alea GPU V3 beta 17 we also support Pascal, so give it a try. CUDA 8 should work as well, but you have to use the new Alea GPU version 3 beta release. The old Alea GPU v2.2 cannot compile for Pascal.
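Independently of Alea, the CUDA runtime can confirm what architecture the card reports; Pascal parts such as the GTX 1070 return compute capability 6.x, which the CUDA 7.5-era tooling cannot compile for, as the answer above notes. A minimal C++ sketch, compiled with nvcc:

// Reports the compute capability of device 0; 6.x indicates Pascal.
#include <cuda_runtime.h>
#include <cstdio>

int main() {
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        printf("No CUDA device found.\n");
        return 1;
    }
    cudaDeviceProp prop;
    cudaGetDeviceProperties(&prop, 0);
    printf("%s: compute capability %d.%d\n", prop.name, prop.major, prop.minor);
    return 0;
}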

How to render with OpenGL on a Tesla-equipped Windows host

I used to think that Tesla cards did not support the OpenGL API, but I recently learned that Tesla products can also be used for visualization via OpenGL.
I have a workstation with 2 Intel E5 CPUs and 1 Tesla C2050. According to https://developer.nvidia.com/opengl-driver, the Tesla C2050 should support at least OpenGL 3.
Now I'd like to run a render-service program using OpenGL 3.3 on that workstation, but without success.
Here is what I tried.
If I log in through RDP remote desktop, the supported OpenGL version is 1.1 due to the virtual graphics adapter. I used the tscon command to reconnect to the physical console; as a result, the RDP connection was lost. When I reconnected, all the windows had been resized to 800x600 and the detected OpenGL support was still 1.1.
If I log in with a monitor plugged into some kind of "integrated graphics adapter", the supported OpenGL version is still 1.1, maybe because the program was started on the screen attached to the basic adapter. But the Tesla GPU does not have a graphics output port.
How should I configure the host to enable the Tesla GPU for OpenGL-based rendering?
I have solved this problem.
First, the Tesla C2050 is in fact a dedicated video card and has one DVI display port.
More importantly, the motherboard BIOS was set to boot on the integrated GPU. Changing this setting to the PCI-E card solved the problem of being unable to access the Tesla card.
Next, about graphics API support: the official driver on NVIDIA's site offers OpenGL 4.4 support, and the Tesla card can render via OpenGL just the same as a Quadro or GeForce card. There is no notable difference and no special configuration is necessary.
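To verify a setup like this, it helps to explicitly request the context version the render service needs and see whether the driver grants it; if the probe below succeeds, the 1.1 software fallback is out of the picture. A minimal sketch, assuming GLFW:

// Requests a 3.3 core-profile context; failure usually means the process
// is still running on a basic/virtual adapter rather than the real GPU.
#include <GLFW/glfw3.h>
#include <cstdio>

int main() {
    if (!glfwInit()) return 1;
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
    GLFWwindow* win = glfwCreateWindow(64, 64, "gl33-probe", nullptr, nullptr);
    if (!win) {
        printf("Driver refused an OpenGL 3.3 core context.\n");
        glfwTerminate();
        return 1;
    }
    glfwMakeContextCurrent(win);
    printf("Got context: %s\n", (const char*)glGetString(GL_VERSION));
    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}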

Does Kinect for Windows v2 work with Parallels?

Does Kinect for Windows v2 work on a Mac Pro using Windows 8.1 running on top of Parallels?
Considering Kinect v2's minimum hardware requirements below (copied from this MSDN blog), it is not possible for Windows 8/8.1 running on top of Parallels to recognize and run Kinect v2. The latest version of Parallels, v10 as of the time of this answer, only supports DirectX 10, which is below the minimum requirement (a quick feature-level check is sketched after the list). I have tried it myself, with no success even with Parallels Gaming Mode. Moreover, for the Kinect to be recognized you need the full USB 3.0 bandwidth.
An alternative solution, as discussed in this MSDN blog, is to use Windows To Go or to install Windows using Boot Camp.
Kinect v2 minimum required capabilities:
64 bit (x64) processor
4 GB Memory (or more)
i7 3.1 GHz (or higher)
Built-in USB 3.0 host controller (Intel or Renesas chipset).
If you’re adding USB 3.0 functionality to your existing PC through an adapter, please ensure that it is a Windows 8 compliant device and that it supports Gen-2. See the troubleshooting section of Getting Started for more information.
DX11 capable graphics adapter (see list of known good adapters below)
Intel HD 4400 integrated display adapter
ATI Radeon HD 5400 series
ATI Radeon HD 6570
ATI Radeon HD 7800 (256-bit GDDR5, 2 GB/1000 MHz)
NVidia Quadro 600
NVidia GeForce GT 640
NVidia GeForce GTX 660
NVidia Quadro K1000M
A Kinect v2 sensor, which includes a power hub and USB cabling.
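The DirectX part of this list can be verified programmatically: Kinect v2 needs Direct3D feature level 11_0, and a DirectX 10-class adapter (such as the one Parallels v10 exposes) will report a lower level. A hedged sketch in C++, Windows-only, linked against d3d11.lib:

// Asks the D3D11 runtime for the highest feature level the adapter supports.
#include <d3d11.h>
#include <cstdio>

int main() {
    D3D_FEATURE_LEVEL level = D3D_FEATURE_LEVEL_9_1;
    // A null feature-level array lets the runtime pick the highest supported one.
    HRESULT hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                   nullptr, 0, D3D11_SDK_VERSION,
                                   nullptr, &level, nullptr);
    if (FAILED(hr)) {
        printf("No D3D11 hardware device available.\n");
        return 1;
    }
    printf("Max feature level: 0x%04x\n", (unsigned)level);
    printf(level >= D3D_FEATURE_LEVEL_11_0
               ? "Meets the Kinect v2 DX11 requirement.\n"
               : "Below the Kinect v2 minimum.\n");
    return 0;
}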

OpenGL 3.3 on Ubuntu (Virtual Machine)

I need OpenGL 3.3 or higher to use GLSL 3.3. My problem is that I have Mac OS X, which doesn't let me use an OpenGL version higher than 2.1. I've installed an Ubuntu virtual machine inside my system, but it also only has OpenGL 2.1. I don't understand what's going on, because I have an AMD Radeon HD 6490M 256 MB, which is compatible with OpenGL 4.1. Is there any way I can use OpenGL 3.3 or higher without repartitioning my disk?
I don't understand what's going on, because I have an AMD Radeon HD 6490M 256 MB, which is compatible with OpenGL 4.1
The GPU serves the host machine. The virtual machine sees only some dumb framebuffer device, or an OpenGL API that the VM software passes through to the host.
If you want to leverage the OpenGL 4 capabilities, you must install an OS that can access the GPU natively. Also, if you want to run Linux, you'll have to install the proprietary fglrx drivers (also called Catalyst for Linux), as the open-source drivers that ship as the distribution default haven't caught up yet.
Is there any way I can use OpenGL 3.3 or higher without repartitioning my disk?
Upgrade to OS X 10.9 when it comes out or grab the beta.
Or find some VM software for OS X that supports VGA passthrough.
If you're willing to repartition you can install Windows or Linux natively and use the drivers from AMD.
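To see what the guest is actually being handed, checking the renderer string from inside the VM is instructive, since virtual and software adapters identify themselves there. A minimal sketch, assuming GLFW inside the guest ("llvmpipe" and "SVGA3D" are the typical strings for Mesa's software rasterizer and VMware's virtual GPU, respectively):

// Prints the renderer the guest OS sees; a virtual/software adapter here
// means the Radeon is not visible to the VM.
#include <GLFW/glfw3.h>
#include <cstdio>
#include <cstring>

int main() {
    if (!glfwInit()) return 1;
    GLFWwindow* win = glfwCreateWindow(64, 64, "vm-probe", nullptr, nullptr);
    if (!win) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(win);
    const char* renderer = (const char*)glGetString(GL_RENDERER);
    printf("GL_RENDERER: %s\n", renderer);
    if (renderer && (strstr(renderer, "llvmpipe") || strstr(renderer, "SVGA3D")))
        printf("Virtual or software adapter; the host GPU is not passed through.\n");
    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}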

What do I need to do to get WebGL up and running on my Mac?

I've got a 2007 Mac Pro, 8 GB RAM, 2 x NVIDIA GeForce 7300 GT (256 MB). I tried to look at a couple of Google's WebGL demos, for example this one, but am unable to do so because my system is "not WebGL compatible".
I'm running Lion and the latest version of Chrome - what else do I need to do? Or is my 'bleeding-edge' workstation now a relic of the past?
You need a compatible browser and halfway decent hardware (which you have).
See http://get.webgl.org for better instructions.
[EDIT!]
Actually, after looking through get.webgl.org a bit more, they explicitly state that your card is incompatible:
If you have the following graphics cards, WebGL is unsupported and is disabled by default:
Mac:
ATI Radeon HD2400
ATI Radeon 2600 series
ATI Radeon X1900
GeForce 7300 GT
This is probably because of driver bugs that they've found affect the stability of the browser. (Most vendors have lousy OpenGL support, even on systems like the Mac!)
You may still be able to force WebGL on by navigating to about:flags in Chrome and seeing if it has an Enable WebGL option.