I'm trying to create a fullscreen, transparent OpenGL window in Xlib, with the purpose of applying shaders to the screen. Imagine applying a blur or pixelate effect, for example. For now, I'm trying to do something really simple, such as rendering the current date in the middle, blurring the background, and that's it.
What's the appropriate setup for this? Should I capture the desktop when my app launches and use that as a bitmap to manipulate in my shader? Or should I actually use some kind of transparency protocol?
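By "transparency protocol" I mean something like picking a 32-bit ARGB GLX visual and relying on the compositor. A rough sketch of that route (untested; error handling omitted, assumes GLX 1.3+ and a running compositor):

```c
/* Sketch: ARGB (alpha-capable) OpenGL window on X11.
   Assumes GLX 1.3+ and a running compositor; error checks omitted. */
#include <X11/Xlib.h>
#include <GL/glx.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    int attribs[] = {
        GLX_RENDER_TYPE,   GLX_RGBA_BIT,
        GLX_DRAWABLE_TYPE, GLX_WINDOW_BIT,
        GLX_DOUBLEBUFFER,  True,
        GLX_RED_SIZE, 8, GLX_GREEN_SIZE, 8, GLX_BLUE_SIZE, 8,
        GLX_ALPHA_SIZE, 8,                 /* the important part */
        None
    };
    int n = 0;
    GLXFBConfig *cfgs = glXChooseFBConfig(dpy, DefaultScreen(dpy), attribs, &n);

    /* Pick a config whose X visual is 32-bit, so alpha reaches the compositor. */
    GLXFBConfig fb = cfgs[0];
    XVisualInfo *vi = NULL;
    for (int i = 0; i < n; i++) {
        XVisualInfo *cand = glXGetVisualFromFBConfig(dpy, cfgs[i]);
        if (cand && cand->depth == 32) { fb = cfgs[i]; vi = cand; break; }
        if (cand) XFree(cand);
    }

    XSetWindowAttributes swa = {0};
    swa.colormap = XCreateColormap(dpy, DefaultRootWindow(dpy), vi->visual, AllocNone);
    swa.border_pixel = 0;
    swa.background_pixel = 0;
    Window win = XCreateWindow(dpy, DefaultRootWindow(dpy), 0, 0, 800, 600, 0,
                               vi->depth, InputOutput, vi->visual,
                               CWColormap | CWBorderPixel | CWBackPixel, &swa);
    XMapWindow(dpy, win);     /* fullscreen via _NET_WM_STATE_FULLSCREEN later */

    GLXContext ctx = glXCreateNewContext(dpy, fb, GLX_RGBA_TYPE, NULL, True);
    glXMakeCurrent(dpy, win, ctx);
    glClearColor(0.0f, 0.0f, 0.0f, 0.0f);  /* zero alpha = see-through */
    glClear(GL_COLOR_BUFFER_BIT);
    glXSwapBuffers(dpy, win);
    /* ... event loop ... */
    XCloseDisplay(dpy);
    return 0;
}
```

Though as I understand it, a transparent window only lets the desktop show through; the compositor won't give my shader the pixels behind the window, so for a blur I'd presumably still need to grab the backdrop myself (e.g. XGetImage on the root window) and upload it as a texture.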
Does it make sense to use WS_EX_NOREDIRECTIONBITMAP window style when rendering with Vulkan?
From MSDN:
WS_EX_NOREDIRECTIONBITMAP The window does not render to a redirection surface. This is for windows that do not have visible content or that use mechanisms other than surfaces to provide their visual.
It would be reasonable to assume that, since we create our own surface with vkCreateWin32SurfaceKHR and our own swapchain and buffers with vkCreateSwapchainKHR, we do not need any intermediate surface to render to.
With WS_EX_NOREDIRECTIONBITMAP set, it does work as expected on an NVIDIA GPU (content is rendered, though I didn't measure a performance benefit). However, it doesn't work on an Intel GPU: content is not visible and the window is completely transparent.
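For reference, the setup in question boils down to this sketch (window-class registration, instance creation and error checks omitted):

```c
/* Sketch: a window with no DWM redirection surface, rendered to via Vulkan.
   Window-class registration, instance creation and error checks omitted. */
#include <windows.h>
#define VK_USE_PLATFORM_WIN32_KHR
#include <vulkan/vulkan.h>

VkSurfaceKHR create_surface(VkInstance instance, HINSTANCE hinst)
{
    HWND hwnd = CreateWindowExW(
        WS_EX_NOREDIRECTIONBITMAP,      /* no redirection surface for this window */
        L"MyWindowClass", L"Vulkan",    /* class assumed registered elsewhere */
        WS_OVERLAPPEDWINDOW | WS_VISIBLE,
        CW_USEDEFAULT, CW_USEDEFAULT, 1280, 720,
        NULL, NULL, hinst, NULL);

    VkWin32SurfaceCreateInfoKHR info = {
        .sType = VK_STRUCTURE_TYPE_WIN32_SURFACE_CREATE_INFO_KHR,
        .hinstance = hinst,
        .hwnd = hwnd,
    };
    VkSurfaceKHR surface = VK_NULL_HANDLE;
    vkCreateWin32SurfaceKHR(instance, &info, NULL, &surface);
    return surface;   /* the swapchain is then created against this surface */
}
```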
Basically what I want to achieve is a sprite highlight animation effect as displayed below.
The idea is that the white translucent gradient sprite moves on top of the other sprite (left to right), using a blend mode like Overlay (Photoshop). The difficult part is that the top gradient sprite should only be drawn on the visible pixels of the sprite underneath; the rest of the gradient overlay should be discarded so it doesn't affect the background or other sprites underneath (like in the image to the far right).
Is it possible to achieve that effect with a clever combination of OpenGL blend modes (and how)? Or would I have to create a custom shader to combine these sprites?
Background: I'm using libgdx with OpenGL ES 2.0 and the app runs on Desktop, Android and iOS.
There are many ways to do it. The simplest one:
You should render the button and the highlight in a single pass. In the fragment shader, after sampling the button texture and the highlight texture, compute the output color as for ordinary blending (it could be mix(c1, c2, c2.a)) and take the alpha from the button texture only. Of course, enable blending in the usual way: (srcalpha, 1-srcalpha).
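A rough sketch of such a shader as a GLSL ES 1.00 source string (the uniform and varying names here are placeholders; bind them however your batch does):

```c
/* Sketch: single-pass button + highlight blend (GLSL ES 1.00).
   u_button / u_highlight / v_texCoords are placeholder names. */
#include <GLES2/gl2.h>

static const char *frag_src =
    "#ifdef GL_ES\n"
    "precision mediump float;\n"
    "#endif\n"
    "varying vec2 v_texCoords;\n"
    "uniform sampler2D u_button;\n"
    "uniform sampler2D u_highlight;\n"
    "void main() {\n"
    "    vec4 c1 = texture2D(u_button, v_texCoords);\n"
    "    vec4 c2 = texture2D(u_highlight, v_texCoords);\n"
    "    vec3 rgb = mix(c1.rgb, c2.rgb, c2.a); /* blend where highlight covers */\n"
    "    gl_FragColor = vec4(rgb, c1.a);       /* alpha from the button only */\n"
    "}\n";

/* Standard blending, so the result is clipped to the button's own shape: */
static void enable_standard_blend(void)
{
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);  /* (srcalpha, 1-srcalpha) */
}
```

Because the output alpha comes from the button alone, the gradient contributes nothing outside the button's visible pixels, which is exactly the clipping behavior asked for.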
I need to draw an overlay consisting of lines and text on top of another application. The application in question is a 3D outside-world viewpoint, and the overlay is a head-up display.
I don't have access to any kind of callback from the outside-world application that would let me execute draw code in its draw loop.
Drawing directly over the application's window will result in flickering as the draw loops will not be synchronized, so to me that doesn't seem like an option.
One method I can think of is to capture the outside world application's pixels and stream them into my application, so I can draw the overlay on top in the same draw loop, but that seems very inefficient.
Is there an efficient way to draw over the outside world application without flickering?
Is it possible to draw something over the final graphics card output / at the monitor's refresh rate?
P.S. It doesn't have to be OpenGL, but the HUD is already written in OpenGL, so that would make it easier.
To repeat what I said in the comments: I've come across quite a few apps that hijack the 3D API calls and inject their own code to draw stuff right at the end of each frame - Steam, TeamSpeak, Mumble. Since the drawing happens inside the same application there's no flickering, and you can draw directly rather than copying the result somewhere and compositing. I've never done it before and probably won't do a good job explaining it.
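I haven't written one myself, but on Windows the shape of it would be roughly the following, as a DLL injected into the target process, using the third-party MinHook library. drawHud() is a placeholder for your overlay code, and the usual caveats apply (matching the target's bitness, saving/restoring GL state, anti-cheat systems):

```c
/* Sketch: injected DLL hooks SwapBuffers and draws the HUD just before
   each frame is presented. Uses the MinHook hooking library. */
#include <windows.h>
#include <MinHook.h>

typedef BOOL (WINAPI *SwapBuffers_t)(HDC);
static SwapBuffers_t real_SwapBuffers;

extern void drawHud(void);  /* hypothetical: your GL overlay drawing */

static BOOL WINAPI hooked_SwapBuffers(HDC hdc)
{
    drawHud();                    /* the app's GL context is current here */
    return real_SwapBuffers(hdc); /* then present the frame as usual */
}

BOOL WINAPI DllMain(HINSTANCE inst, DWORD reason, LPVOID reserved)
{
    (void)inst; (void)reserved;
    if (reason == DLL_PROCESS_ATTACH) {
        MH_Initialize();
        MH_CreateHookApi(L"gdi32.dll", "SwapBuffers",
                         hooked_SwapBuffers, (LPVOID *)&real_SwapBuffers);
        MH_EnableHook(MH_ALL_HOOKS);
    }
    return TRUE;
}
```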
A related question is here: Overlaying on a 3D fullscreen application
I am using libgdx for a small app project, and I need to somehow take a series of sprites and place them (or rather their pixels) into a Pixmap. The basic idea is to take random sprites that are generated through various means while the app is running and, only at specific times, merge some of them onto a single background sprite.
I believe most of this can be done easily, but the step of getting the sprite images into the Pixmap isn't quite so obvious to me. The sprites also have various transparent and semi-transparent pixels, so simply grabbing the color of each pixel while everything is drawn on the same screen isn't really applicable either, as that would obviously pick up the background colors as well.
If there is a suitable alternative to this that would accomplish what I am looking for I would also love to hear it. Any help is highly appreciated.
I think you want to render your sprites to an off-screen buffer (called an "FBO" or FrameBuffer in libgdx), blending them as they're added, and then render that off-screen buffer to the screen as a single draw call? If so, this question should help: libgdx SpriteBatch render to texture
This requires OpenGL ES 2.0, which will eliminate support for some older devices.
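For context, this is roughly what libgdx's FrameBuffer wraps at the GL ES 2.0 level (a sketch, not libgdx's actual implementation):

```c
/* Sketch: texture-backed framebuffer (roughly what libgdx's FrameBuffer does).
   Error checks omitted; check glCheckFramebufferStatus() in real code. */
#include <GLES2/gl2.h>

GLuint make_fbo(int w, int h, GLuint *out_tex)
{
    GLuint tex, fbo;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    /* clamp so non-power-of-two sizes stay legal in ES 2.0 */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, tex, 0);

    /* While the FBO is bound: draw the background sprite first, then the
       others with normal alpha blending; the composite accumulates in tex. */
    glBindFramebuffer(GL_FRAMEBUFFER, 0);   /* back to the screen */
    *out_tex = tex;
    return fbo;
}
```

In libgdx terms that is roughly new FrameBuffer(Format.RGBA8888, w, h, false), then fbo.begin(), draw your sprites with a SpriteBatch, fbo.end(), and fbo.getColorBufferTexture() holds the merged result (note that it comes back flipped on the y axis).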
I am developing a document-based application for Mac OS X. It's a kind of media player, but instead of playing audio or video files it is supposed to open text files containing metadata that specifies OpenGL animations. I would like to mimic Apple's QuickTime X window style. This means I have to do all the window drawing myself, because Cocoa has no appropriate window style.
There is one thing that gives me headaches: the rounded corners usually found on Mac OS X windows. I tried using the borderless window mask and working some CGS magic - there are some private Apple headers which allow window shaping, but they are of course undocumented. I was able to cut rectangular holes in my window's edges, but I couldn't figure out how Apple achieves rounded corners.
Creating a transparent window and drawing the frame myself does not work, because an OpenGL viewport is always rectangular, and the only way around that is to turn on NSOpenGLCPSurfaceOpacity for alpha transparency and use the stencil buffer or shaders to cut out the edges, which seems like a hell of a lot of overhead.
If I put an OpenGLView into a standard Cocoa window with a title bar, the bottom edges are rounded. It seems this happens at the NSThemeFrame stage of the view hierarchy. Any ideas how this is done?
Use a layer-backed view, and do your drawing in the CALayer on an invisible window. Layers include automatic handling of rounded corners and borders.
Background for CALayer is in the Core Animation Programming Guide. To create a layer for NSView, you need to call [view setWantsLayer:YES]. You would create a CAOpenGLLayer and assign it to the view using setLayer:.
See CALayerEssentials for sample code demonstrating how to use CAOpenGLLayer among other layer types.
Since Rob's suggestion didn't work and no one else contributed to the discussion, I settled on using the stencil buffer to crop the window's corners. I did this by creating a texture from the window's background and rendering it into the stencil buffer, discarding all transparent pixels. Looks fine, but it's slow when resizing the window :/
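Roughly the sequence, in case it helps someone (legacy fixed-function GL; draw_window_shape() and draw_content() stand in for my actual drawing, and the pixel format needs stencil bits):

```c
/* Sketch: mask the window's rounded corners via the stencil buffer.
   draw_window_shape() draws a quad textured with the window background;
   draw_content() is the real rendering. */
#include <OpenGL/gl.h>   /* Mac OS X header path */

extern void draw_window_shape(void);
extern void draw_content(void);

void render_with_rounded_corners(void)
{
    glClearStencil(0);
    glClear(GL_COLOR_BUFFER_BIT | GL_STENCIL_BUFFER_BIT);

    /* Pass 1: write 1s into the stencil wherever the shape texture is opaque. */
    glEnable(GL_STENCIL_TEST);
    glStencilFunc(GL_ALWAYS, 1, 0xFF);
    glStencilOp(GL_KEEP, GL_KEEP, GL_REPLACE);
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
    glEnable(GL_ALPHA_TEST);
    glAlphaFunc(GL_GREATER, 0.0f);    /* discard fully transparent texels */
    draw_window_shape();
    glDisable(GL_ALPHA_TEST);
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);

    /* Pass 2: draw normally, clipped to the stenciled shape. */
    glStencilFunc(GL_EQUAL, 1, 0xFF);
    glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
    draw_content();
    glDisable(GL_STENCIL_TEST);
}
```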