How to upgrade OpenGL on Ubuntu? - ubuntu-20.04

I'm running Ubuntu 20.04.5 on an HP EliteBook 850. lspci shows the VGA controller:
00:02.0 VGA compatible controller: Intel Corporation Haswell-ULT Integrated Graphics Controller (rev 0b)
glxinfo gives:
OpenGL version string: 3.0 Mesa 21.2.6
How can I upgrade OpenGL to at least version 3.2? I followed the procedure described at this link, but afterward glxinfo still shows OpenGL 3.0. Is there a concise guide somewhere describing the correct procedure?
Thanks!
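A detail worth checking first, since Mesa reports a different version for each profile: the "OpenGL version string" line printed by glxinfo is the compatibility-profile version, while programs that explicitly request a core-profile context on Haswell typically get 3.3 or higher (glxinfo lists that on a separate "OpenGL core profile version string" line). Below is a minimal sketch to confirm what a core context gives you; it assumes the freeglut3-dev package is installed, and the package name, file name, and reported version are illustrative, not taken from the question.
#include <stdio.h>
#include <GL/freeglut.h>

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    /* Ask for a 3.2+ core-profile context instead of the legacy default. */
    glutInitContextVersion(3, 2);
    glutInitContextProfile(GLUT_CORE_PROFILE);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA);
    glutCreateWindow("GL version check");
    /* Prints the version of the context actually created, e.g. a 4.x core profile. */
    printf("%s\n", (const char *)glGetString(GL_VERSION));
    return 0;
}
Compile with gcc check.c -lglut -lGL and run it; if the printed version is 3.2 or higher, no upgrade is needed, only a core-profile context request in the application.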

Related

Ubuntu 20.04 Asus TUF Dash F15 Can't Install Nvidia Driver

I have installed a fresh copy of Ubuntu 20.04 on my brand new Asus TUF Dash F15 laptop. Then I went into the software update settings and switched to the nvidia-460 proprietary driver using the GUI. I applied the changes and restarted, and I got stuck at a boot loading screen. The error is /dev/nvm1e0 clean: xxxblocks/xxx.
Specs: Nvidia RTX 3060 mobile, Intel i7, 40 GB RAM, 1 TB storage
I have tried installing the NVIDIA drivers four different ways, following the official NVIDIA installation guide and multiple Medium articles and forums. Every time I get the same problem and have to Alt+Shift+F2 my way into a terminal and sudo apt-get purge nvidia* to be able to boot back in.
I've installed the NVIDIA driver on Ubuntu many other times following the official NVIDIA docs and never ran into this error. I am concerned it may be a hardware incompatibility at this point.
I've been trying for five days now and have re-installed Ubuntu multiple times, varying the partitions and installation methods. Everything works fine until I try to switch to the NVIDIA driver. I have even tried some older and newer driver versions and the NVIDIA CUDA Toolkit 10 and 11. Please help, thank you.

Is it possible to get tensorflow-gpu working on a MacBook Pro with High Sierra?

I am trying to install tensorflow-gpu on my MacBook Pro and have tried pretty much everything, which I will briefly mention. But first, here is my setup:
MacBook Pro Retina 15"
High Sierra 10.13.4
NVIDIA GT650M card
Over the past two weeks I have tried all sorts of combinations and am fed up with the drivers, versions, environment variables, and so on, especially under High Sierra 10.13.4.
If anybody has had success with this, could they please tell me or point me to the versions and method for:
CUDA
CUDNN
tensorflow-gpu (I understand 1.1 is the highest available for Mac)
Xcode (I have 9.2)
and anything else.
I have my Anaconda environment working well for all of the machine learning stuff on the CPU and consider using the GPU to be the next challenge.
Here's a link that may help.
I have installed it using the same guidelines, and it's working for me on a MacBook Pro with the same configuration.
I was able to compile TensorFlow 1.4.0 with GPU support for my MacBook Pro (Retina, 15-inch, Late 2013) with a GT 750M card under macOS 10.12.6 using the instructions from the following link:
http://paolino.me/tutorial/tensorflow/machine-learning/deep-learning/gpu/2017/11/18/installing-tensorflow-1.4.0-macos-cuda/

OpenGL 3.3 on OSX with FreeGLUT

I am using the following configuration:
Mac OS X v10.9.1
Intel HD Graphics 4000, 1024 MB
My goal is to use OpenGL 3.3 with FreeGLUT. Is there a way to achieve that?
glxinfo gives me:
OpenGL vendor string: Intel Inc.
OpenGL renderer string: Intel HD Graphics 4000 OpenGL Engine
OpenGL version string: 2.1 INTEL-8.18.29
OpenGL shading language version string: 1.20
and the programs where I try to open a 3.3 context give me errors.
However, this site https://developer.apple.com/graphicsimaging/opengl/capabilities/ states that the HD 4000 should support 4.1. Is that only for GLSL, or is there any way to use it with FreeGLUT? The reason I want to use FreeGLUT is that the course I am taking right now requires the assignments to compile on the course computers, which use FreeGLUT, and I would like to be able to work from home.
Mac OS X supports OpenGL 3.2 and later contexts only if you request a core profile context. In addition, you have to initialize FreeGLUT with:
glutInitContextVersion(3,2); /* or later versions, core was introduced only with 3.2 */
glutInitContextProfile(GLUT_CORE_PROFILE);
Another solution is given at https://stackoverflow.com/a/13751079/524368

GLUT on OS X with OpenGL 3.2 Core Profile

Is it possible to use GLUT on OS X Lion or OS X Mountain Lion with the core profile (so I can use GLSL 1.50)?
Can I use the built-in GLUT, or do I need a third-party library such as FreeGLUT?
And are there any simple 'Hello world' applications available for OS X, with either an Xcode project or a makefile?
You need at least Mac OS X Lion (10.7) for basic OpenGL 3.2 support. To use the OpenGL 3.2 Core Profile, just add
glutInitDisplayMode(GLUT_3_2_CORE_PROFILE | ... | ...);
in your main function. You can check it with:
std::printf("%s\n%s\n",
glGetString(GL_RENDERER), // e.g. Intel HD Graphics 3000 OpenGL Engine
glGetString(GL_VERSION) // e.g. 3.2 INTEL-8.0.61
);
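Putting the pieces together, here is a minimal 'Hello world' of the kind asked about above. It is a sketch assuming Apple's bundled GLUT framework; the window title, size, and clear colour are arbitrary choices.
#include <GLUT/glut.h>
#include <stdio.h>

static void display(void)
{
    glClearColor(0.2f, 0.3f, 0.4f, 1.0f); /* arbitrary background colour */
    glClear(GL_COLOR_BUFFER_BIT);
    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    /* GLUT_3_2_CORE_PROFILE is Apple's extension for requesting a core context. */
    glutInitDisplayMode(GLUT_3_2_CORE_PROFILE | GLUT_DOUBLE | GLUT_RGBA);
    glutInitWindowSize(640, 480);
    glutCreateWindow("Hello core profile");
    printf("%s\n%s\n", glGetString(GL_RENDERER), glGetString(GL_VERSION));
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
From a terminal it should build with something like clang main.c -framework GLUT -framework OpenGL -o hello (expect deprecation warnings on newer systems).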
GLUT does not support OpenGL 3.2, as it provides no functionality to specify the desired OpenGL context version. Also, GLUT's functionality relies on APIs that are not available with the OpenGL 3.2 Core Profile.
You have to switch to FreeGLUT or GLFW.
flyx is wrong: OpenGL 3.2 is the version that added core and compatibility profiles (not 3.3). However, Apple just doesn't support compatibility profiles at all (no reason, they just don't). GLUT ships with Xcode as a framework and you can use it that way, so you can do it in a completely non-standard, platform-specific way.

Does OS X Lion provide OpenCL image support for Radeon 5770?

On OS X Snow Leopard (10.6.8), OpenCL image support is not available on my Mac Pro with a Radeon 5770 graphics card. Indeed, this is believed to be common to all AMD/ATI Radeon cards under Snow Leopard and earlier. Specifically:
clGetDeviceInfo(cdDevices[uiDeviceUsed], CL_DEVICE_IMAGE_SUPPORT, sizeof(g_bImageSupport), &g_bImageSupport, NULL);
results in g_bImageSupport being false.
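For anyone who wants to run the same check outside the sample (cdDevices and g_bImageSupport above come from Apple's example code), here is a self-contained sketch that lists every device on the first platform and prints whether it reports image support; the variable names and buffer sizes are illustrative.
#include <stdio.h>
#include <OpenCL/opencl.h>

int main(void)
{
    cl_platform_id platform;
    cl_device_id devices[16];
    cl_uint num_devices = 0;

    clGetPlatformIDs(1, &platform, NULL);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_ALL, 16, devices, &num_devices);

    for (cl_uint i = 0; i < num_devices; ++i) {
        char name[256];
        cl_bool image_support = CL_FALSE;
        clGetDeviceInfo(devices[i], CL_DEVICE_NAME, sizeof(name), name, NULL);
        clGetDeviceInfo(devices[i], CL_DEVICE_IMAGE_SUPPORT,
                        sizeof(image_support), &image_support, NULL);
        printf("%s: images %s\n", name, image_support ? "supported" : "not supported");
    }
    return 0;
}
On OS X it builds with clang devices.c -framework OpenCL.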
I want to know whether anyone who has the final release of 10.7 (Lion) and a Radeon 5770 graphics card in a Mac Pro can check whether CL_DEVICE_IMAGE_SUPPORT now returns true for this hardware.
An easy test is to download the Apple sample code for the ray-traced Quaternion Julia-Set:
http://developer.apple.com/library/mac/#samplecode/OpenCL_RayTraced_Quaternion_Julia-Set_Example/Introduction/Intro.html
and build and run it. Sadly, the output on my system is:
Connecting to AMD ATI Radeon HD 5770...
Qjulia requires images: Images not supported on this device.
Hope to hear that this now works in Lion ...
David.
I'm running Lion 10.7.1 with a Radeon 5770 and the given example works great, running at around 150 fps. So... yay!
For me it works fine with an ATI Radeon 5870, with the initial figure running at around 290 fps. The example requires 10.7 to run.
As per my comment, I'm running 10.7 with a Radeon 5870. The sample app can connect to the card and renders the "thing" fine.
