Sometimes, bugs in my CUDA programs cause the desktop graphics to break (in Windows). Typically, the screen remains somewhat readable, but when graphics change, such as when dragging a window, lots of semi-random colored pixels and small blocks appear.
I have tried to reset the GPU and driver by changing the desktop resolution, but that doesn't help. The only fix I have found is to reboot the computer.
Is there a program out there or some trick I can use to get the driver and GPU to reset without rebooting?
Because the same problem sometimes occurs on Unix and Google forwarded me to this thread, I hope this helps somebody else.
On Ubuntu, unloading and reloading the NVIDIA kernel module solved the problem for me:
sudo rmmod nvidia_uvm
sudo modprobe nvidia_uvm
Edit:
If you are on Tesla hardware on Linux and can run nvidia-smi, then you can reset the GPU using
nvidia-smi -r
or
nvidia-smi --gpu-reset
Here is the man output for this switch:
Resets GPU state. Can be used to clear double bit ECC errors or
recover hung GPU. Requires -i switch to target specific device.
Available on Linux only.
Otherwise...
The way to truly reset the hardware is to reboot.
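That said, if the corruption stems from your own CUDA process, you can at least clear that process's CUDA state without rebooting by calling the standard runtime function cudaDeviceReset(). A minimal sketch (note this only tears down the calling process's context and allocations; it will not repair display output that is already corrupted):

#include <cuda_runtime.h>
#include <cstdio>

int main()
{
    // ... suppose some buggy kernels ran here ...

    // Destroy all allocations and reset the primary context
    // on the current device, for this process only.
    cudaError_t err = cudaDeviceReset();
    if (err != cudaSuccess)
        std::fprintf(stderr, "cudaDeviceReset failed: %s\n",
                     cudaGetErrorString(err));
    return 0;
}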
What you describe shouldn't happen. I recommend testing with different hardware and letting us know if it still occurs.
To reset the graphics stack in Windows, press Win+Ctrl+Shift+B.
I have a GeForce GTX 260 with the NVIDIA GPU SDK 4.2 and I am experiencing the same problems.
Sometimes, while developing, I have bugs in my programs. These cause the screen to show the random colored pixels described in this post.
As stated here, if I change the resolution they do not disappear. Moreover, if I only change the COLOUR DEPTH from 32 to 16 bits, the random colored pixels disappear, but going back to 32 bits (without rebooting) makes them appear again.
The last bug that caused this behaviour was declaring data in __constant__ memory but passing it to the kernel as a pointer argument:
test<<<grid, threadsPerBlock>>>( cuda_malloc_data, cuda_constant_data );
If I do not pass cuda_constant_data, then there is no bug (and consequently, the random coloured pixels do not appear).
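For comparison, here is a minimal sketch of the usual __constant__ pattern (the names are hypothetical, not the poster's actual code): the buffer is declared as a symbol, filled from the host with cudaMemcpyToSymbol, and read directly inside the kernel instead of being passed as a kernel argument.

__constant__ float const_data[256];   // lives in the constant memory bank

__global__ void test(float *data)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    data[i] += const_data[i % 256];   // reference the symbol directly
}

void launch(float *cuda_malloc_data, const float *host_data,
            dim3 grid, dim3 threadsPerBlock)
{
    // Fill the constant bank from the host...
    cudaMemcpyToSymbol(const_data, host_data, sizeof(const_data));
    // ...then launch WITHOUT passing the constant buffer as an argument.
    test<<<grid, threadsPerBlock>>>(cuda_malloc_data);
}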
from "device manager", under Display adapters tab, find the driver
disable it
press win + ctrl +shift + B (monitor will blink)
enable the driver
there you go.
If a hung process is keeping the GPU busy, find and kill it:
ps -ef
Look for something like: root 4066644 1 99 08:56 ? 04:32:25 /opt/conda/bin/python /data/
kill 4066644
I created a fractal animation using JWildfire. It consists of 5772 1024x768 still images. I tried importing all of the images into OpenShot in one step and had serious problems, so I broke it down into three "parts" of 1924 images apiece. That was still problematic, but I got the images imported. However, only the first 1924 will animate: when I tried to create subsequent video frames like I did with the first 1924 images, OpenShot would indicate that the first image in the set was not valid. I tested that image and even resaved it, and it opens in other apps without a problem. So, I imported the last two sets of 1924 without creating the video frame (I don't recall the exact term), thinking that I could edit the first frame to include all 5772 images. Apparently, I was wrong. How can I fix this so that it plays through all 5772 images rather than just the first 1924? The PC that I have is far from top notch. Here are my specs:
Processor Intel(R) Core(TM) i3-3225 CPU @ 3.30GHz 3.30 GHz
Installed RAM 8.00 GB (7.88 GB usable)
Device ID 87BC0DCC-B603-4158-9700-09CEF99A171C
Product ID 00330-80000-00000-AA170
System type 64-bit operating system, x64-based processor
Pen and touch No pen or touch input is available for this display
Edition Windows 10 Pro
Version 21H2
Installed on 7/11/2020
OS build 19044.1503
Experience Windows Feature Experience Pack 120.2212.3740.0
I'm using OpenShot 2.6.1 64-bit. Any help is greatly appreciated. Thanks.
I solved my problem by installing Shotcut. It made creating three separate image sequences and adding them to a timeline trivial. Now, the animation plays through seamlessly just as it should. Shotcut seems to be written much better. It doesn't take up nearly as much RAM with all 5772 images loaded. It wasn't anywhere near as problematic to get all of those images loaded and creating the three image sequences was super easy. The UI isn't very "shiny", but it gets the job done. So, goodbye OpenShot and hello Shotcut! Oh, and the xml/mlt movie file is much smaller than the osp movie file. So, that's awesome too.
So I have a big 32 inch display with a resolution of 1440p, and I want to set the DPI scaling to 75% instead of 100%. But I can't find any way to do so on multiple monitors.
I currently have:
Display 1 [2560 x 1440] (Main display I want to change)
Display 2 [2560 x 1440] (This one is 27 inches so it's fine as is)
Display 3 [3840 x 2160] (Set to 100%, fine as it is)
This trick changes DPI scaling via some registry keys (LogPixels & Win8DpiScaling), but when I use it, it downscales display 3 instead of display 1.
Is there a way to get this to work? I see no reason for Microsoft to limit display scaling like this.
Note: I have a 2070 Super, and all the displays are plugged directly into the GPU via DisplayPort, with the latest available firmware at the time of writing (September 2021).
The tl;dr:
Technical limitations aside, there are very solid user experience reasons why this probably isn't allowed.
No, Windows will not let you set UI scaling below 100%.
(even if a stable workaround were to be discovered, most users would probably be quite unhappy with the results)
While I would love¹ to be proven incorrect, the implications of scaling at less than 100% are so fraught that this limitation is unlikely to change in the near future.
Background:
This has been the case for ages, likely since Windows first introduced the feature.
Compatibility with current software
The only ~purely technical~ reason I can think of:
The 100% scaling size likely corresponds to the smallest base image resources (e.g. Explorer and Taskbar icons, mouse and text cursors) included in various existing Microsoft and third-party applications.
User experience
Going below the 100% point may cause small UI text and icons, especially in application toolbars and the Taskbar, to be blurred to the point of ambiguity.
Those fine lines in the taskbar 'Windows' menu icon? Blurred or gone.
Taken to the extreme, the UI ~might~ become so unreadable that the user cannot read the text even in the 'Settings' window and is therefore 'stuck': unable to navigate through 'Settings' to restore the original '100%' scaling mode.
(Luckily, Windows is never used to run any SCADA software where confusing two icons could theoretically cost money or lives.)
Performance:
Since carefully-designed graphic assets simply don't exist below the 100% size, allowing sub-100% scaling would also likely cause extra CPU/GPU workload - that is why only certain fixed up-sampling sizes are shown on the normal Display settings screen, and why the Advanced scaling settings screen warns that custom scaling between 100-500% is "not recommended".
That might also apply to any fixed scaling option offered below 100%, and absolutely would for custom scaling sizes.
Some people enjoy reading:
Vector-based TrueType/OpenType fonts usually contain a ~lot~ of manual tweaking / hints to enable readable display of very small point sizes.
The marketing department & friends of the C-suite
Could they implement this at a limited range of options? 90%? 75%?
Perhaps - but it's extra testing for a horrible-looking edge case.
The existence of the option, even if only available as a registry hack, might cause some people to actually use it in kiosks and other public-facing displays; this risks the same sort of bad PR as when a BSOD is seen on the 'arrivals' screen at a train station or airport monitor.
Combined with the first example below, even a 90% option could cause trouble in some environments.
Example and tutorial:
Imagine how Windows might look displayed on one of those cheapo '1080p-supported' projectors that actually only contains an imager with a native pixel resolution of, say, 1024x576 (or even 480x234).
Windows thinks it can send 1080p, since that's what the HDMI connection advertises, so it does: any text / vector content looks atrocious.
(At least in this case the user could normally² unplug the projector and reconnect to a normal monitor to restore functionality.)
See for yourself... while connected to any monitor (at that monitor's native resolution), with Windows set to 100% scaling:
Open Windows Notepad
Type or paste in any block of text
Now, use the Zoom Out command from the View menu³ five or more times in a row
While not an exact analogue, you may still see how hard it could be to read down-sampled text, even when very high-contrast (the best-case scenario).
¹: As someone currently typing this very answer on a 1080p connection to a 55" 4K television as a second monitor, I came across the question very much hoping this was possible. Sadly, logic intervened and killed my potential joy.
²: Unless the computer is actually stored somewhere locked or inaccessible, such as a NUC-style PC hidden above the false ceiling in a conference room.
³: Alternatively, press <CTRL>-<Minus> five or more times.
First of all, after 4 hours of debugging, it turns out there is no problem with my code. But I'm curious why I had the issue that I had.
I created a fullscreen window with D3D11 rendering. The problem occurred when I alt-tabbed away from the window before I had Present() in my loop (I simply hit this issue before implementing the rendering function). In that case, after minimizing the window, the red and blue channels on my screen were swapped (yes, literally).
It took me a long time to find because I suspected my swap chain or the window itself (SDL). Can you help me find the reason for this bug, for educational purposes?
This is usually due to a graphics driver bug with RGBA swap chains. You can try updating your driver (run Windows Update). But to improve compatibility, you can change your swap chain surface format to BGRA (specifically, B8G8R8A8_UNORM). As long as you are just doing normal rendering (and not doing anything fancy like UpdateSubresource directly to the back buffer), you should be able to leave everything else as-is and it will render correctly.
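For example, if you create the device and swap chain in C++, the fix is a single line in the swap chain description (a sketch; the helper name and hwnd are placeholders for your own setup, and everything else stays as before):

#include <windows.h>
#include <dxgi.h>

DXGI_SWAP_CHAIN_DESC MakeBgraSwapChainDesc(HWND hwnd)
{
    DXGI_SWAP_CHAIN_DESC desc = {};
    desc.BufferCount       = 2;
    desc.BufferDesc.Format = DXGI_FORMAT_B8G8R8A8_UNORM; // BGRA instead of
                                                         // DXGI_FORMAT_R8G8B8A8_UNORM
    desc.BufferUsage       = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    desc.OutputWindow      = hwnd;   // the window to present into
    desc.SampleDesc.Count  = 1;      // no MSAA
    desc.Windowed          = TRUE;
    return desc;
}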
How can I capture the screen with Haskell on Mac OS X?
I've read Screen capture in Haskell?. But I'm working on a Mac Mini, so the Windows solution is not applicable, and the GTK solution does not work because on Macs it only captures a black screen.
How can I capture the screen with … and OpenGL?
Only with some luck. OpenGL is primarily a drawing API, and the contents of the main framebuffer are undefined unless they are drawn to by OpenGL functions themselves. That OpenGL could be abused for screenshots at all was due to the way graphics systems used to manage their on-screen windows' framebuffers: after a window without a predefined background color/brush was created, its initial framebuffer content was simply whatever was on the screen right before the window's creation. If an OpenGL context was created on top of this, that content could be read out using glReadPixels, thereby creating a screenshot.
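For illustration, the read-back step of that old trick looked roughly like this (a minimal sketch; it assumes a current OpenGL context whose framebuffer already happens to contain the desired pixels):

#include <stdlib.h>
#ifdef __APPLE__
#include <OpenGL/gl.h>
#else
#include <GL/gl.h>
#endif

/* Copy the current framebuffer into a malloc'd RGBA buffer.
   The caller frees the result; rows are stored bottom-up. */
unsigned char *read_framebuffer(int width, int height)
{
    unsigned char *pixels = (unsigned char *)malloc((size_t)width * height * 4);
    glPixelStorei(GL_PACK_ALIGNMENT, 1);   /* tightly packed rows */
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    return pixels;
}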
Today window compositing has become the norm, which makes abusing OpenGL for taking screenshots almost impossible. With compositing, each window has its own off-screen framebuffer, and the screen's contents are composited only at the end. If you used the method outlined above, which relies on uninitialized memory containing the desired content, on a compositing window system, the results would vary wildly: from solid clear color, through wildly distorted junk fragments, to data noise.
Since taking a screenshot reliably must take into account a lot of idiosyncrasy of the system this is to happen on, it's virtually impossible to write a truly portable screenshot program.
And OpenGL is definitely the wrong tool for it, even though people (myself included) were able to abuse it for this in the past.
I programmed this C code to capture the screen on Macs and to show it in an OpenGL window through the function glDrawPixels:
opengl-capture.c
http://pastebin.com/pMH2rDNH
Coding the FFI for Haskell is quite trivial. I'll do it soon.
This might be useful to find the solution in C:
NeHe Productions - Using gluUnProject
http://nehe.gamedev.net/article/using_gluunproject/16013/
Apple Mailing Lists - Re: Screen snapshot example code posted
http://lists.apple.com/archives/cocoa-dev/2005/Aug/msg00901.html
Compiling OpenGL programs on Windows, Linux and OS X
http://goanna.cs.rmit.edu.au/~gl/teaching/Interactive3D/2012/compiling.html
Grab Mac OS Screen using GL_RGB format
I'm working on an app based on DirectX 10 using SlimDX. I would like to enable vsync as in DirectX 9, but the fps doesn't seem to lock to 60Hz (which it does if I'm using DirectX 9). I'm setting vsync like this:
SwapChain.Present(1, PresentFlags.None);
Did I do something wrong?
Btw, I'm running Win7 with an ATI HD5570 video card. After some googling, I gather that ATI can force vsync in certain games, so I wonder if that's related.
A reference to C++ code will do as well; I'll translate it myself.
Thanks
The first argument of SwapChain.Present is the sync interval. 0 indicates that presentation should occur immediately, without synchronization; a value of 1 through 4 indicates that presentation should be synchronized with the nth vertical blank.
So, to lock the frame rate to the refresh rate, use it like this:
SwapChain.Present(1, PresentFlags.None);
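Since a C++ reference will do as well: on the native side this is IDXGISwapChain::Present, and the first argument works the same way (swapChain below is assumed to be your IDXGISwapChain pointer):

swapChain->Present(1, 0);   // synchronize with the next vertical blank (vsync on)
swapChain->Present(0, 0);   // present immediately (no vsync)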
You can also try to force vsync using the Catalyst Control Center.