Does the backbuffer that a RenderTargetView points to automagically change after Present?

I've created a ID3D11RenderTargetView by specifying the ID3D11Texture2D* of back buffer 0 in the swap chain. But I'm unclear on what happens after _pSwapChain->Present(0, 0) is called.
Presumably what the ID3D11RenderTargetView had been pointing to is now the front buffer. Does Present somehow change the ID3D11RenderTargetView object to point to the new back buffer or does the application have to keep requesting the new 0 back buffer from the swap chain? I haven't found an explanation of exactly how the ID3D11RenderTargetView is supposed to always point to the new back buffer. I'm also unclear on how many buffers make sense to be in the swap chain.

Basically it is 'automagically' done in DirectX 11.
See Anatomy of Direct3D 11 Create Device and DX11 DeviceResources
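As a hedged sketch of what that means in practice (device, context, and swap chain assumed created as in those samples; the variable names are invented): the render target view is created once from buffer 0 and simply keeps working across Present calls.
#include <d3d11.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Assumed created elsewhere: device (ID3D11Device*),
// context (ID3D11DeviceContext*), swapChain (IDXGISwapChain*).
ComPtr<ID3D11Texture2D> backBuffer;
ComPtr<ID3D11RenderTargetView> rtv;
swapChain->GetBuffer(0, IID_PPV_ARGS(&backBuffer));   // D3D11: buffer 0 is all you ever ask for
device->CreateRenderTargetView(backBuffer.Get(), nullptr, &rtv);

// Per frame: rebind and draw; no need to re-query the buffer or
// recreate the RTV after Present.
const float clearColor[4] = { 0.f, 0.f, 0.f, 1.f };
context->OMSetRenderTargets(1, rtv.GetAddressOf(), nullptr);
context->ClearRenderTargetView(rtv.Get(), clearColor);
// ... draw ...
swapChain->Present(1, 0);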
This is not true in DirectX 12 where you have to explicitly bind to each back-buffer, as well as ensure all video memory that is 'in flight' for the frame is left in place until after the GPU is done with that frame. That's why DXGI 1.4 added IDXGISwapChain3::GetCurrentBackBufferIndex.
See Anatomy of Direct3D 12 Create Device and DX12 DeviceResources
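A hedged sketch of that per-frame bookkeeping in D3D12 (descriptor heap, per-buffer RTVs, and fence assumed created up front; names invented for illustration):
#include <d3d12.h>
#include <dxgi1_4.h>

// Assumed: swapChain3 (IDXGISwapChain3*), rtvHeap holding one RTV per back
// buffer, rtvDescriptorSize from GetDescriptorHandleIncrementSize, and
// commandList (ID3D12GraphicsCommandList*).
UINT frameIndex = swapChain3->GetCurrentBackBufferIndex();   // DXGI 1.4

D3D12_CPU_DESCRIPTOR_HANDLE rtv = rtvHeap->GetCPUDescriptorHandleForHeapStart();
rtv.ptr += SIZE_T(frameIndex) * rtvDescriptorSize;   // pick this frame's RTV

commandList->OMSetRenderTargets(1, &rtv, FALSE, nullptr);
// ... record draws, close and execute the command list ...
swapChain3->Present(1, 0);
// The index changes after Present: query it again next frame, and fence-wait
// before reusing any resources still 'in flight' for this one.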
There are a few other swapchain 'evolution' changes between DX11 & DX12 style. See The Care and Feeding of Modern Swap Chains.

Related

SDL2 - Combine front and back buffer?

I am rendering images with flickering objects (usually 30Hz) using double buffering. For screenshots, I would like to blend together the current and the previous buffer, without having to store the previous buffer permanently.
How would I access SDL2's current front and back buffer and blend them into one buffer?
From the SDL_RenderPresent documentation:
The backbuffer should be considered invalidated after each present; do not assume that previous contents will exist between frames.
The reason is probably that every backend does things differently, so SDL cannot guarantee anything about what a buffer contains after it has been presented (without incurring an unnecessary performance penalty).
So you have to store the previous buffer yourself. That said, you probably don't have to copy the buffer every time; just do it for the frame you want a screenshot of.
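A minimal C++ sketch of that approach, assuming an SDL_Renderer (the helper names are invented): read the back buffer right before SDL_RenderPresent, keep only the previous frame's copy, and blend the two on demand.
#include <SDL.h>
#include <cstdint>
#include <vector>

// Copy the current back buffer into `out` (must run before SDL_RenderPresent).
void capture_frame(SDL_Renderer* r, int w, int h, std::vector<uint8_t>& out) {
    out.resize(size_t(w) * h * 4);
    SDL_RenderReadPixels(r, nullptr, SDL_PIXELFORMAT_ARGB8888,
                         out.data(), w * 4);
}

// 50/50 per-byte blend of two captured frames, e.g. for the screenshot.
void blend_frames(const std::vector<uint8_t>& a, const std::vector<uint8_t>& b,
                  std::vector<uint8_t>& out) {
    out.resize(a.size());
    for (size_t i = 0; i < a.size(); ++i)
        out[i] = uint8_t((unsigned(a[i]) + unsigned(b[i])) / 2);
}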

Your app called glInvalidateFramebuffer before presenting renderbuffer

"Your app called glInvalidateFramebuffer before presenting renderbuffer" is an error message I get in the line:
int retVal = UIApplicationMain(argc, argv, nil, @"myAppDelegate");
of my main.m file.
There is not a single call to the glInvalidateFramebuffer method in my project. What is more, the project uses OpenGL ES 2.0 (running with a GLKView) and the problematic method is part of OpenGL ES 3.0.
I use a number of offscreen frame buffers to draw procedural textures. This error was not reported before iOS 10. It does not prevent rendering, does not show any visible issues, and it is impossible to pinpoint its exact location in code (other than main.m) using Capture Frame.
This is almost certainly a false alarm by Xcode's GPU Report.
When presenting, neither the multisample buffer nor the depth buffer is required any longer.
The multisample buffer has already been resolved, and to present the pixels the depth values are no longer necessary.
This means that GLKView does the correct thing: it invalidates them before presenting.
Note: the false warning goes away with GLKViewDrawableMultisampleNone instead of 4x.
So in GLKView's case, it is triggered by the invalidation of the multisample buffer after resolving.
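For reference, a hedged sketch of the kind of present path GLKView is presumably running (written with OpenGL ES 3.0 names; on ES 2.0 the same thing is exposed as the EXT_discard_framebuffer extension, which the report apparently surfaces under its ES 3.0 name; msaaFbo, resolveFbo, width, and height are assumptions):
// Resolve the multisample framebuffer into the single-sample one.
glBindFramebuffer(GL_READ_FRAMEBUFFER, msaaFbo);
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, resolveFbo);
glBlitFramebuffer(0, 0, width, height, 0, 0, width, height,
                  GL_COLOR_BUFFER_BIT, GL_NEAREST);

// The multisample color and depth contents are now disposable, so tell the
// driver it may discard them; this is the call the GPU Report flags.
const GLenum attachments[] = { GL_COLOR_ATTACHMENT0, GL_DEPTH_ATTACHMENT };
glInvalidateFramebuffer(GL_READ_FRAMEBUFFER, 2, attachments);

// The resolved renderbuffer is then presented (presentRenderbuffer: in EAGL).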
Just for reference: when I do multisample anti-aliasing, I get the same message with Instruments 8, but when I switch to Instruments 7 the message is gone. I'm not running with GLKView; I use CAEAGLLayer instead.

DirectX to OpenGL hot-swap, doesn't display on Win32 window

During the development of my engine, I'm trying to implement a feature that enables hot-swapping between OpenGL and DirectX. Currently I'm testing on the Win32 platform, and came across the following problem:
I implemented both renderers (OpenGL 3.0 and Direct3D 11); both work fine on their own. The swapping mechanism is the following:
Destroy the current rendering context, and build up the new one. For example: Release all DirectX objects, and then create OpenGL context, via WGL. I'm trying to implement this, using only one window (HWND).
Swapping from OpenGL 3.0 to DirectX11 works. (After destroying OpenGL, DirectX renders fine)
Destroying OpenGL and then recreating OpenGL again, works. Same with DirectX.
When I try to swap from DirectX to OpenGL, the window stops displaying the newly drawn content and only shows the last DirectX frame that was drawn.
To construct the OpenGL context I'm using WGL. The class for the window was created with the CS_OWNDC style. I'm using SwapBuffers to flip the window buffers. Before setting up the context, I use SetPixelFormat with the previously returned value from ChoosePixelFormat. The created context is version 3.0, ensured via wglCreateContextAttribsARB.
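A hedged sketch of that setup (constants from wglext.h, error handling omitted, hwnd assumed); note that SetPixelFormat is only allowed once per window, which matters when tearing contexts down and rebuilding them on the same HWND:
#include <windows.h>
#include <GL/gl.h>

// From wglext.h:
#define WGL_CONTEXT_MAJOR_VERSION_ARB 0x2091
#define WGL_CONTEXT_MINOR_VERSION_ARB 0x2092

PIXELFORMATDESCRIPTOR pfd = {};
pfd.nSize = sizeof(pfd);
pfd.nVersion = 1;
pfd.dwFlags = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
pfd.iPixelType = PFD_TYPE_RGBA;
pfd.cColorBits = 32;
pfd.cDepthBits = 24;

HDC hdc = GetDC(hwnd);                                    // CS_OWNDC window class
SetPixelFormat(hdc, ChoosePixelFormat(hdc, &pfd), &pfd);  // legal only once per window

HGLRC temp = wglCreateContext(hdc);                       // legacy context to bootstrap
wglMakeCurrent(hdc, temp);
typedef HGLRC (WINAPI *PFNWGLCREATECONTEXTATTRIBSARB)(HDC, HGLRC, const int*);
auto wglCreateContextAttribsARB =
    (PFNWGLCREATECONTEXTATTRIBSARB)wglGetProcAddress("wglCreateContextAttribsARB");

const int attribs[] = { WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
                        WGL_CONTEXT_MINOR_VERSION_ARB, 0, 0 };
HGLRC ctx = wglCreateContextAttribsARB(hdc, nullptr, attribs);
wglMakeCurrent(hdc, ctx);
wglDeleteContext(temp);
// ... render, then:
SwapBuffers(hdc);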
Additional information:
All of the DirectX references are released; this was checked by calling ReportLiveDeviceObjects and verifying that ID3D11Device1::Release returned 0. ID3D11DeviceContext1::ClearState and Flush were called to ensure object destruction.
None of the OpenGL calls report an error via glGetError; this is checked after every call. The same goes for all OS and WGL calls.
The OpenGL rendering calls are executing as expected, for example:
OpenGL rendering with 150 fps
Swap to DirectX, render with 60 fps (VSYNC)
Swap back to OpenGL, rendering again with 150 fps (not more)
There are other scenarios where OpenGL renders with more than 150 fps, so the rendering calls are executing properly.
My guess is that the flipping of the buffers doesn't work somehow; however, SwapBuffers returns TRUE anyway.
I tried using SaveDC and RestoreDC before and after using DirectX, but this did not lead to a solution.
Using wglSwapLayerBuffers instead of SwapBuffers gives no change.
Can I somehow restore the HWND, or HDC to the original state, or do you guys have any idea why this might happen?
Guess I posted my question too soon; however, this is how I solved it.
I dug around the documentation for DirectX, and for the function CreateSwapChainForHwnd, I found the following:
Because you can associate only one flip presentation model swap chain at a time with an HWND, the Microsoft Direct3D 11 policy of deferring the destruction of objects can cause problems if you attempt to destroy a flip presentation model swap chain and replace it with another swap chain.
I was using DXGI_SWAP_EFFECT_FLIP_SEQUENTIAL in my swap chain descriptor, and this could mean that DirectX sets up a flip swap chain for the window, so when I then try to use the window with OpenGL, it somehow fails to swap the buffers.
The solution for this, is to not use FLIP mode for creating the swap chain:
DXGI_SWAP_CHAIN_DESC1 scd = {};                   // zero-init; only the relevant fields shown
scd.SwapEffect = DXGI_SWAP_EFFECT_DISCARD;        // bitblt model instead of FLIP
scd.Scaling = DXGI_SCALING_ASPECT_RATIO_STRETCH;
You have to set Scaling to something other than DXGI_SCALING_NONE, or the creation will fail.
The interesting part is that DirectX still does not properly destroy the flip model on the window, although I did everything the documentation suggested (the ClearState and Flush calls).
See the Remarks section of CreateSwapChainForHwnd.
Edit: I found this question after some time. If anybody still has some idea, how to revert back to using GDI again instead of the DWM backbuffer, it is greatly appreciated.

directX capture renderer in C++

I have to capture (via shared memory or GPU memory) the DirectX render of a Windows application called "Myapp" and apply this render (view) to four simple DirectX applications (showing exactly the same view as the first Windows application "Myapp").
Some people mention the back buffer, and others mention FrontBufferData.
1) How can I easily get the DirectX render of a DirectX Windows application in C++?
2) How can I easily and quickly share this render with 4 other DirectX applications in C++?
Thanks in advance
You can never get the rendering data from the back buffer of a third-party application; the only interface Microsoft provides is GetFrontBufferData(). This function is the only way to take an antialiased screenshot, and it's very slow.
The front buffer contains the data currently displayed on your screen.
The back buffer contains the data being drawn, but not yet presented.
When you call Present, DirectX swaps the two buffers by simply exchanging the buffer pointers: the front buffer becomes the back buffer and the back buffer becomes the front buffer. This is called surface flipping.
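A hedged Direct3D 9 sketch of the GetFrontBufferData path mentioned above (device creation and the desktop size are assumed):
#include <d3d9.h>

// device (IDirect3DDevice9*), screenW, screenH assumed.
IDirect3DSurface9* surf = nullptr;
device->CreateOffscreenPlainSurface(screenW, screenH, D3DFMT_A8R8G8B8,
                                    D3DPOOL_SYSTEMMEM, &surf, nullptr);
device->GetFrontBufferData(0, surf);   // slow: full GPU-to-CPU readback

D3DLOCKED_RECT lr;
surf->LockRect(&lr, nullptr, D3DLOCK_READONLY);
// lr.pBits points at the captured pixels, row stride lr.Pitch.
surf->UnlockRect();
surf->Release();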
There are many ways to share memory between processes.
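As one illustration (the mapping name and size are invented), Win32 named shared memory is a common choice for fanning one frame out to several local processes:
#include <windows.h>

const DWORD frameBytes = 1920 * 1080 * 4;   // assumed frame size
HANDLE mapping = CreateFileMappingW(INVALID_HANDLE_VALUE, nullptr,
                                    PAGE_READWRITE, 0, frameBytes,
                                    L"Local\\MyAppFrame");
void* view = MapViewOfFile(mapping, FILE_MAP_ALL_ACCESS, 0, 0, frameBytes);
// Producer: memcpy the locked surface bits into `view` after each capture.
// Each viewer: OpenFileMappingW + MapViewOfFile with the same name, then
// upload the bytes into its own texture every frame.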
Can I ask a question: what do you want to do with the rendering data?
Thanks for your answer.
I just want to show the render/view of the application "Myapp" in 4 other DirectX views, without changes (in C++).

OpenGL 3.1+ with Ruby

I followed this post to play with OpenGL (programmable pipeline) on Ruby
Basically, I'm just trying to create a blue window, and here's the code.
Ray::GL.major_version = 3
Ray::GL.minor_version = 2
Ray::GL.core_profile = true # if you want/need one
window = Ray::Window.new("Test Window", [800, 600])
window.make_current
glClearColor(0, 0, 1, 1);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glEnable(GL_DEPTH_TEST);
Instead, I got a white window. This indicated that I was missing something, but I couldn't figure out what, as the resources for OpenGL on Ruby seemed limited. I have been searching all over the web, but all I found was fixed-pipeline OpenGL stuff for Ruby.
Yes, I could use Ray's built-in functions to set the background color and draw stuff, but I didn't want to do that. I just wanted to use Ray to setup the window, then called OpenGL APIs directly. However, I couldn't figure out what I was missing in the code above.
I would greatly appreciate any hint or pointer on this (maybe I need to swap the buffers? but then I don't know how to do that with Ray). Is anybody familiar with Ray who can give me some hints on this?
Or, are there any other tools that would allow me to set up an OpenGL binding (for the programmable, non-fixed pipeline)?
It would appear that you set the clear color to blue, then cleared the back buffer to make it blue. But, as you said, you have not swapped the buffers to put the back buffer onto your screen. As far as swapping buffers goes, here's another answer from Stack Overflow:
"Swapping the front and back buffer of a double buffered window is a function provided by the underlying graphics system, i.e. Win32 GDI, or X11 GLX. The function's you're looking for are wglSwapBuffers and/or glXSwapBuffers. On MacOS X NSOpenGLViews are automatically swapped.
However most likely you're using some framework, like GLUT, GLFW or Qt, which provide a portable wrapper around those functions. Read the framework's documentation."
I've never used Ray, so I'd say just keep rooting around in the documentation or look through example projects to see how buffer swapping is done.
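To make the missing step concrete in C++ terms (GLFW is used here purely to illustrate the pattern the quoted answer describes; Ray will have its own equivalent call):
#include <GLFW/glfw3.h>

int main() {
    glfwInit();
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
    GLFWwindow* win = glfwCreateWindow(800, 600, "Test Window", nullptr, nullptr);
    glfwMakeContextCurrent(win);
    while (!glfwWindowShouldClose(win)) {
        glClearColor(0, 0, 1, 1);
        glClear(GL_COLOR_BUFFER_BIT);
        glfwSwapBuffers(win);   // without this, the blue clear never reaches the screen
        glfwPollEvents();
    }
    glfwTerminate();
}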
