I've been trying, without success, to manipulate the samples produced by kAudioUnitType_Generator audio units by attaching an AURenderCallbackStruct to the input of the audio unit immediately downstream. I managed to get this working on OS X using the following simple graph:
(input callback) -> multichannel mixer -> (input callback) -> default output
But I've failed with the following (even simpler) graphs that start with a generator unit:
speech synthesis -> (input callback) -> default output | fails in the render callback with kAudioUnitErr_Uninitialized
audio file player -> (input callback) -> default output | fails when scheduling the file region with kAudioUnitErr_Uninitialized
I've tried just about everything I can think of, from setting the ASBD format to matching sample rates, but I always get these errors. Does anyone know how to set up a graph where we can manipulate the samples from these nice generator units?
Below are the failing render callback function and the graph instantiation method for the speech synthesis attempt. The audio file player version is almost identical, except for setting up the file playback, of course. Both of these setups work if I remove the callback and add an AUGraphConnectNodeInput in its place...
static OSStatus RenderCallback(void *inRefCon,
                               AudioUnitRenderActionFlags *ioActionFlags,
                               const AudioTimeStamp *inTimeStamp,
                               UInt32 inBusNumber,
                               UInt32 inNumberFrames,
                               AudioBufferList *ioData) {
    AppDelegate *app = (AppDelegate *)inRefCon;
    AudioUnit inputUnit = app->_speechUnit;
    OSStatus status = noErr;
    status = AudioUnitRender(inputUnit, ioActionFlags, inTimeStamp, 0, inNumberFrames, ioData);
    // *** ERROR *** kAudioUnitErr_Uninitialized, code: -10867
    // ... insert processing code here...
    return status;
}
- (int)createSynthGraph {
    AUGRAPH_CHECK( NewAUGraph( &_graph ) );
    AUNode speechNode, outputNode;
    // speech synthesizer
    AudioComponentDescription speechCD = {0};
    speechCD.componentType = kAudioUnitType_Generator;
    speechCD.componentSubType = kAudioUnitSubType_SpeechSynthesis;
    speechCD.componentManufacturer = kAudioUnitManufacturer_Apple;
    // output device (speakers)
    AudioComponentDescription outputCD = {0};
    outputCD.componentType = kAudioUnitType_Output;
    outputCD.componentSubType = kAudioUnitSubType_DefaultOutput;
    outputCD.componentManufacturer = kAudioUnitManufacturer_Apple;
    AUGRAPH_CHECK( AUGraphAddNode( _graph, &outputCD, &outputNode ) );
    AUGRAPH_CHECK( AUGraphAddNode( _graph, &speechCD, &speechNode ) );
    AUGRAPH_CHECK( AUGraphOpen( _graph ) );
    AUGRAPH_CHECK( AUGraphNodeInfo( _graph, outputNode, NULL, &_outputUnit ) );
    AUGRAPH_CHECK( AUGraphNodeInfo( _graph, speechNode, NULL, &_speechUnit ) );
    // setup stream formats:
    AudioStreamBasicDescription streamFormat = [self streamFormat];
    AU_CHECK( AudioUnitSetProperty( _speechUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Output, 0, &streamFormat, sizeof(streamFormat) ) );
    // setup callback:
    AURenderCallbackStruct callback;
    callback.inputProc = RenderCallback;
    callback.inputProcRefCon = self;
    AUGRAPH_CHECK( AUGraphSetNodeInputCallback( _graph, outputNode, 0, &callback ) );
    // init and start
    AUGRAPH_CHECK( AUGraphInitialize( _graph ) );
    AUGRAPH_CHECK( AUGraphStart( _graph ) );
    return 0;
}
You must connect these nodes together with AUGraphConnectNodeInput. The AUGraph will not initialize audio units that are not connected to other units in the graph, which is why you see kAudioUnitErr_Uninitialized.
You could also try manually initializing them before starting the graph.
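As a rough sketch (reusing _graph, speechNode, outputNode, _speechUnit and the check macros from the question), either of these should get past kAudioUnitErr_Uninitialized:
// Option 1: replace the input callback with a real connection, so that
// AUGraphInitialize will initialize the generator as part of the graph:
AUGRAPH_CHECK( AUGraphConnectNodeInput( _graph, speechNode, 0, outputNode, 0 ) );

// Option 2: keep the input callback on the output node, but manually
// initialize the otherwise-unconnected generator before rendering from it:
AU_CHECK( AudioUnitInitialize( _speechUnit ) );

// ...then, in either case:
AUGRAPH_CHECK( AUGraphInitialize( _graph ) );
AUGRAPH_CHECK( AUGraphStart( _graph ) );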
The kAudioUnitErr_Uninitialized error appears when your audio unit is not initialized. Just initialize your graph before setting the properties that are failing; this will initialize all opened audio units in the graph. From the discussion of AUGraphInitialize in AUGraph.h:
AudioUnitInitialize() is called on each opened node/AudioUnit
(get ready to render) and SubGraph that are involved in a
interaction.
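So for the audio file player variant, one ordering that follows this advice would look roughly like the sketch below (names like _filePlayerUnit, _audioFile and region are placeholders for whatever your setup actually uses):
// Initialize the graph first, so the opened units are ready to render...
AUGRAPH_CHECK( AUGraphInitialize( _graph ) );

// ...and only then schedule the file on the (now initialized) player unit.
// If the player only feeds a render callback and is not connected to anything,
// it may additionally need a manual AudioUnitInitialize, as in the other answer.
AU_CHECK( AudioUnitSetProperty( _filePlayerUnit, kAudioUnitProperty_ScheduledFileIDs,
                                kAudioUnitScope_Global, 0, &_audioFile, sizeof(_audioFile) ) );
AU_CHECK( AudioUnitSetProperty( _filePlayerUnit, kAudioUnitProperty_ScheduledFileRegion,
                                kAudioUnitScope_Global, 0, &region, sizeof(region) ) );

AUGRAPH_CHECK( AUGraphStart( _graph ) );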
Related
I'm trying to create a screen-capture app that can get a raw bitmap of a desktop window from the GPU. I did it via GDI and it works properly, but using both DirectX and the DXGI Desktop Duplication API I get a black screen or an error.
void dump_buffer() {
    IDirect3D9* d3d = nullptr;
    d3d = Direct3DCreate9(D3D_SDK_VERSION);
    D3DPRESENT_PARAMETERS d3dpp;
    out("1");
    ZeroMemory(&d3dpp, sizeof(d3dpp));
    D3DDISPLAYMODE d3ddm;
    if (FAILED(d3d->GetAdapterDisplayMode(D3DADAPTER_DEFAULT, &d3ddm)))
        err("IDirect3D9");
    d3dpp.Windowed = TRUE;
    d3dpp.SwapEffect = D3DSWAPEFFECT_DISCARD;
    d3dpp.BackBufferFormat = d3ddm.Format;
    d3dpp.EnableAutoDepthStencil = FALSE;
    IDirect3DDevice9* d3ddev = nullptr;
    out("2");
    d3d->CreateDevice(
        D3DADAPTER_DEFAULT,                  // use the video adapter that Windows treats as the default (primary) one
        D3DDEVTYPE_HAL,                      // use hardware rendering (Hardware Abstraction Layer)
        GetDesktopWindow(),                  // window handle
        D3DCREATE_SOFTWARE_VERTEXPROCESSING, // don't use the T&L features, for better compatibility with more video adapters
        &d3dpp,                              // pointer to the pre-filled presentation parameters structure (D3DPRESENT_PARAMETERS)
        &d3ddev                              // pointer that receives the created Direct3D device object
    );
    if (d3ddev == nullptr)
        err("IDirect3DDevice9");
    out("3");
    IDirect3DSurface9 *pRenderTarget = nullptr;
    IDirect3DSurface9 *pDestTarget = nullptr;
    // sanity checks.
    // get the render target surface.
    HRESULT hr = d3ddev->GetRenderTarget(0, &pRenderTarget);
    if (FAILED(hr)) {
        err("GetRenderTarget");
    }
    // get the current adapter display mode.
    // hr = pDirect3D->GetAdapterDisplayMode(D3DADAPTER_DEFAULT,&d3ddisplaymode);
    out("4");
    D3DDISPLAYMODE mode;
    hr = d3d->GetAdapterDisplayMode(D3DADAPTER_DEFAULT, &mode);
    if (FAILED(hr)) {
        err("GetAdapterDisplayMode");
    }
    // create a destination surface.
    out(mode.Width);
    out(mode.Height);
    out(mode.Format);
    hr = d3ddev->CreateOffscreenPlainSurface(mode.Width,
                                             mode.Height,
                                             mode.Format,
                                             D3DPOOL_SYSTEMMEM,
                                             &pDestTarget,
                                             nullptr);
    if (FAILED(hr)) {
        err("CreateOffscreenPlainSurface");
    }
    out("5");
    // copy the render target to the destination surface.
    hr = d3ddev->GetRenderTargetData(pRenderTarget, pDestTarget);
    if (FAILED(hr)) {
        err(DXGetErrorDescription9A(hr));
        err("GetRenderTargetData");
    }
    // save its contents to a bitmap file.
    out("6");
    hr = D3DXSaveSurfaceToFile(file,
                               D3DXIFF_BMP,
                               pDestTarget,
                               nullptr,
                               nullptr);
    out("7");
    // clean up.
    pRenderTarget->Release();
    pDestTarget->Release();
}
P.S.: Do I understand correctly that by using GetDesktopWindow() I am capturing the desktop window, i.e. an image of the entire screen?
Any help in solving this problem would be appreciated.
I am working on a D2D/D3D interoperability program. After receiving an ID3D11Texture2D from outside, it should be rendered into a designated area. I tried CreateDxgiSurfaceRenderTarget, but it has no effect.
The code is below. It runs "normally": there are no errors and the debugging information is displayed, but the window shows a black screen. If I ignore the texture parameter and instead only call _back_render_target->FillRectangle, that works.
HRESULT CGraphRender::DrawTexture(ID3D11Texture2D* texture, const RECT& dst_rect)
{
    float dpi = GetDpiFromD2DFactory(_d2d_factory);
    CComPtr<ID3D11Texture2D> temp_texture2d;
    CComPtr<ID2D1RenderTarget> temp_render_target;
    CComPtr<ID2D1Bitmap> temp_bitmap;
    D3D11_TEXTURE2D_DESC desc = { 0 };
    texture->GetDesc(&desc);
    CD3D11_TEXTURE2D_DESC capture_texture_desc(DXGI_FORMAT_B8G8R8A8_UNORM, desc.Width, desc.Height, 1, 1, D3D11_BIND_RENDER_TARGET);
    HRESULT hr = _d3d_device->CreateTexture2D(&capture_texture_desc, nullptr, &temp_texture2d);
    RETURN_ON_FAIL(hr);
    CComPtr<IDXGISurface> dxgi_capture;
    hr = temp_texture2d->QueryInterface(IID_PPV_ARGS(&dxgi_capture));
    RETURN_ON_FAIL(hr);
    D2D1_RENDER_TARGET_PROPERTIES rt_props = D2D1::RenderTargetProperties(D2D1_RENDER_TARGET_TYPE_DEFAULT, D2D1::PixelFormat(DXGI_FORMAT_B8G8R8A8_UNORM, D2D1_ALPHA_MODE_PREMULTIPLIED), dpi, dpi);
    hr = _d2d_factory->CreateDxgiSurfaceRenderTarget(dxgi_capture, rt_props, &temp_render_target);
    RETURN_ON_FAIL(hr);
    D2D1_BITMAP_PROPERTIES bmp_prop = D2D1::BitmapProperties(D2D1::PixelFormat(DXGI_FORMAT_B8G8R8A8_UNORM, D2D1_ALPHA_MODE_PREMULTIPLIED), dpi, dpi);
    hr = temp_render_target->CreateBitmap(D2D1::SizeU(desc.Width, desc.Height), bmp_prop, &temp_bitmap);
    RETURN_ON_FAIL(hr);
    CComPtr<ID3D11DeviceContext> immediate_context;
    _d3d_device->GetImmediateContext(&immediate_context);
    if (!immediate_context)
    {
        return E_UNEXPECTED;
    }
    immediate_context->CopyResource(temp_texture2d, texture);
    D2D1_POINT_2U src_point = D2D1::Point2U();
    D2D1_RECT_U src_rect = D2D1::RectU(0, 0, desc.Width, desc.Height);
    hr = temp_bitmap->CopyFromRenderTarget(&src_point, temp_render_target, &src_rect);
    RETURN_ON_FAIL(hr);
    D2D1_RECT_F d2d1_rect = { (float)dst_rect.left, (float)dst_rect.top, (float)dst_rect.right, (float)dst_rect.bottom };
    _back_render_target->DrawBitmap(temp_bitmap, d2d1_rect);
    return S_OK;
}
I had a brief look at DirectXTK's SpriteBatch code; it introduces a bunch of device-context settings whose relationships I don't understand yet, and I don't know whether my existing code has any influence on them, such as setting the render target view, the viewport, or even shaders.
Is there a relatively simple and effective method, as simple and brute-force as ID2D1RenderTarget's DrawBitmap?
I am trying to create a Direct3D 11 texture array holding multiple pages of text rendered using DirectWrite and Direct2D. Suppose layouts holds the IDWriteTextLayouts for the individual pages; then I try to do the following:
{
    D3D11_TEXTURE2D_DESC desc;
    ::ZeroMemory(&desc, sizeof(desc));
    desc.ArraySize = static_cast<UINT>(layouts.size());
    desc.BindFlags = D3D11_BIND_RENDER_TARGET;
    desc.Format = DXGI_FORMAT_B8G8R8A8_UNORM;
    desc.Height = height;
    desc.MipLevels = 1;
    desc.SampleDesc.Count = 1;
    desc.Usage = D3D11_USAGE_DEFAULT;
    desc.Width = width;

    auto hr = this->_d3dDevice->CreateTexture2D(&desc, nullptr, &retval.Texture);
    if (FAILED(hr)) {
        throw std::system_error(hr, com_category());
    }
}

for (auto &l : layouts) {
    ATL::CComPtr<IDXGISurface> surface;
    {
        auto hr = retval.Texture->QueryInterface(&surface);
        if (FAILED(hr)) {
            // The code fails here with E_NOINTERFACE "No such interface supported."
            throw std::system_error(hr, com_category());
        }
    }

    // Go on creating the RT from 'surface'.
}
The problem is that the code fails at the designated line, where I try to obtain the IDXGISurface interface from the ID3D11Texture2D, if there is more than one page (desc.ArraySize > 1). I eventually found in the documentation (https://learn.microsoft.com/en-us/windows/win32/api/dxgi/nn-dxgi-idxgisurface) that this is by design:
If the 2D texture [...] does not consist of an array of textures, QueryInterface succeeds and returns a pointer to the IDXGISurface interface pointer. Otherwise, QueryInterface fails and does not return the pointer to IDXGISurface.
Is there any other way to obtain the individual DXGI surfaces in the texture array to draw to them one after the other using Direct2D?
As I could not find any way to address the sub-surfaces, I now create a staging texture with one layer to render to and copy the result into the texture array using ID3D11DeviceContext::CopySubresourceRegion.
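Roughly, the workaround looks like the sketch below (based on the snippets above; names like pageDesc, pageTexture and ctx are illustrative, and the actual Direct2D drawing into the intermediate texture is omitted):
// One-slice intermediate texture of the same size/format; Direct2D renders into this.
D3D11_TEXTURE2D_DESC pageDesc;
::ZeroMemory(&pageDesc, sizeof(pageDesc));
pageDesc.ArraySize = 1;
pageDesc.BindFlags = D3D11_BIND_RENDER_TARGET;
pageDesc.Format = DXGI_FORMAT_B8G8R8A8_UNORM;
pageDesc.Height = height;
pageDesc.MipLevels = 1;
pageDesc.SampleDesc.Count = 1;
pageDesc.Usage = D3D11_USAGE_DEFAULT;
pageDesc.Width = width;

ATL::CComPtr<ID3D11Texture2D> pageTexture;
auto hr = this->_d3dDevice->CreateTexture2D(&pageDesc, nullptr, &pageTexture);
if (FAILED(hr)) {
    throw std::system_error(hr, com_category());
}

ATL::CComPtr<ID3D11DeviceContext> ctx;
this->_d3dDevice->GetImmediateContext(&ctx);

for (UINT i = 0; i < static_cast<UINT>(layouts.size()); ++i) {
    // QueryInterface for IDXGISurface succeeds here, because ArraySize == 1.
    ATL::CComPtr<IDXGISurface> surface;
    hr = pageTexture->QueryInterface(&surface);
    if (FAILED(hr)) {
        throw std::system_error(hr, com_category());
    }

    // ... create the D2D render target from 'surface' and draw layouts[i] ...

    // Copy the rendered page into mip 0 of array slice 'i' of the array texture.
    const UINT dst = D3D11CalcSubresource(0, i, 1);
    ctx->CopySubresourceRegion(retval.Texture, dst, 0, 0, 0, pageTexture, 0, nullptr);
}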
How about Texture => IDXGIResource1 => CreateSubresourceSurface ?
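I have not verified that Direct2D accepts such a surface, but as a sketch of that suggestion (requires dxgi1_2.h; resource1, surface2 and subresource are illustrative names):
// Get the DXGI view of the array texture and wrap a single subresource.
ATL::CComPtr<IDXGIResource1> resource1;
auto hr = retval.Texture->QueryInterface(&resource1);
if (FAILED(hr)) {
    throw std::system_error(hr, com_category());
}

const UINT subresource = D3D11CalcSubresource(0, /* array slice */ 0, 1);
ATL::CComPtr<IDXGISurface2> surface2;
hr = resource1->CreateSubresourceSurface(subresource, &surface2);
if (FAILED(hr)) {
    throw std::system_error(hr, com_category());
}

// IDXGISurface2 derives from IDXGISurface, so 'surface2' could then be handed to
// ID2D1Factory::CreateDxgiSurfaceRenderTarget as in the other snippets -- whether
// Direct2D accepts it for a given format/usage is something to verify.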
I want to use the glReadPixels() function to take a screenshot of my scene, and it works great if I don't use multisampling. But if I do, I get GL_INVALID_OPERATION from glResolveMultisampleFramebufferAPPLE(). Is there a way to resolve this problem?
My save function:
var wid = GLint()
var hei = GLint()
glGetRenderbufferParameteriv(GLenum(GL_RENDERBUFFER), GLenum(GL_RENDERBUFFER_WIDTH), &wid)
glGetRenderbufferParameteriv(GLenum(GL_RENDERBUFFER), GLenum(GL_RENDERBUFFER_HEIGHT), &hei)
let byteLength = Int(hei * wid) * 4
let bytes = UnsafeMutablePointer<GLubyte>.alloc(byteLength)
// init non-multisampled frame buffer
var framebuffer: GLuint = 0
var colorRenderbuffer: GLuint = 0
glGenFramebuffersOES(1, &framebuffer)
glBindFramebufferOES(GLenum(GL_FRAMEBUFFER_OES), framebuffer)
glGenRenderbuffersOES(1, &colorRenderbuffer)
glBindRenderbufferOES(GLenum(GL_RENDERBUFFER_OES), colorRenderbuffer)
glRenderbufferStorageOES(GLenum(GL_RENDERBUFFER_OES), GLenum(GL_RGBA8_OES), wid, hei)
glFramebufferRenderbufferOES(GLenum(GL_FRAMEBUFFER_OES), GLenum(GL_COLOR_ATTACHMENT0_OES), GLenum(GL_RENDERBUFFER_OES), colorRenderbuffer)
glBindFramebufferOES(GLenum(GL_DRAW_FRAMEBUFFER_APPLE), framebuffer)
var `default`: GLint = 0
glGetIntegerv(GLenum(GL_FRAMEBUFFER_BINDING_OES), &`default`)
glBindFramebufferOES(GLenum(GL_READ_FRAMEBUFFER_APPLE), GLuint(`default`));
myglGetError() // OK
glResolveMultisampleFramebufferAPPLE()
myglGetError() // GL_INVALID_OPERATION
glBindFramebuffer(GLenum(GL_FRAMEBUFFER), framebuffer)
glReadPixels(0, 0, GLsizei(wid), GLsizei(hei), GLenum(GL_RGBA), GLenum(GL_UNSIGNED_BYTE), bytes)
glBindFramebuffer(GLenum(GL_FRAMEBUFFER), GLuint(`default`));
glDeleteFramebuffers(1, &framebuffer)
I use the default framebuffer initialized by GLKit, with glkView.drawableMultisample = GLKViewDrawableMultisample.Multisample4X.
I have tried your sample and it seems that it works after some modifications.
Modified code:
var wid = GLint()
var hei = GLint()
glGetRenderbufferParameteriv(GLenum(GL_RENDERBUFFER), GLenum(GL_RENDERBUFFER_WIDTH), &wid)
glGetRenderbufferParameteriv(GLenum(GL_RENDERBUFFER), GLenum(GL_RENDERBUFFER_HEIGHT), &hei)
var def: GLint = 0
glGetIntegerv(GLenum(GL_FRAMEBUFFER_BINDING_OES), &def)
// init non-multisampled frame buffer
var framebuffer: GLuint = 0
var colorRenderbuffer: GLuint = 0
glGenFramebuffersOES(1, &framebuffer)
glBindFramebufferOES(GLenum(GL_FRAMEBUFFER_OES), framebuffer)
glGenRenderbuffersOES(1, &colorRenderbuffer)
glBindRenderbufferOES(GLenum(GL_RENDERBUFFER_OES), colorRenderbuffer)
glRenderbufferStorageOES(GLenum(GL_RENDERBUFFER_OES), GLenum(GL_RGBA8_OES), wid, hei)
glFramebufferRenderbufferOES(GLenum(GL_FRAMEBUFFER_OES), GLenum(GL_COLOR_ATTACHMENT0_OES), GLenum(GL_RENDERBUFFER_OES), colorRenderbuffer)
glBindFramebufferOES(GLenum(GL_DRAW_FRAMEBUFFER_APPLE), framebuffer)
// commented out:
// here GL_FRAMEBUFFER_BINDING_OES would be overridden by the previous call to
// 'glBindRenderbufferOES(GLenum(GL_RENDERBUFFER_OES), colorRenderbuffer)'
//var def: GLint = 0
//glGetIntegerv(GLenum(GL_FRAMEBUFFER_BINDING_OES), &def)
glBindFramebufferOES(GLenum(GL_READ_FRAMEBUFFER_APPLE), GLuint(def));
var err = glGetError()
print(String(format: "Error %X", err))
glResolveMultisampleFramebufferAPPLE()
err = glGetError()
print(String(format: "Error %X", err)) // GL_INVALID_OPERATION
glBindFramebuffer(GLenum(GL_FRAMEBUFFER), framebuffer)
Also, here is a quote from the APPLE_framebuffer_multisample.txt extension specification which, as far as I understand, explains why the modified code works:
Calling BindFramebuffer with <target> set to FRAMEBUFFER binds the framebuffer to both DRAW_FRAMEBUFFER_APPLE and READ_FRAMEBUFFER_APPLE.
APPLE_framebuffer_multisample
My app plays music in the background. I have the audio key enabled in Background Modes, and my audio session looks like this:
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
NSError *err = nil;
[audioSession setCategory:AVAudioSessionCategoryPlayback error:&err];
if( err ){
    NSLog(@"There was an error setting the category of the audio session");
}
[audioSession setMode:AVAudioSessionModeDefault error:&err];
if( err ){
    NSLog(@"There was an error setting the mode of the audio session");
}
[[AVAudioSession sharedInstance] setActive:YES error:&err];
if( err ){
    NSLog(@"There was an error activating the audio session");
}
I'm playing via an AUGraph configured with two nodes, Remote I/O and Mixer:
AudioComponentDescription outputcd;
outputcd.componentFlags = 0;
outputcd.componentFlagsMask = 0;
outputcd.componentManufacturer = kAudioUnitManufacturer_Apple;
outputcd.componentSubType = kAudioUnitSubType_RemoteIO;
outputcd.componentType = kAudioUnitType_Output;
// Multichannel mixer unit
AudioComponentDescription MixerUnitDescription;
MixerUnitDescription.componentType = kAudioUnitType_Mixer;
MixerUnitDescription.componentSubType = kAudioUnitSubType_AU3DMixerEmbedded;
MixerUnitDescription.componentManufacturer = kAudioUnitManufacturer_Apple;
MixerUnitDescription.componentFlags = 0;
MixerUnitDescription.componentFlagsMask = 0;
Also, according to a Technical Q&A, I added:
UInt32 maxFPS = 4096;
AudioUnitSetProperty(_mixerUnit, kAudioUnitProperty_MaximumFramesPerSlice,kAudioUnitScope_Global, 0, &maxFPS,sizeof(maxFPS));
But still no luck: my app keeps crashing on ExtAudioFileRead in the render callback function roughly 10 seconds after I lock the iPhone. Any suggestions?
It is important to mention that this bug does not reproduce on iOS 7.
The issue was Data Protection being enabled in the app's capabilities. When the device was locked, the files were encrypted and could no longer be read for background playback, hence the crash.
Changing the file protection attributes of the audio files fixes this issue.
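For example, a minimal sketch (the path and documentsDirectory variable are hypothetical) that marks an already-written audio file as not protected, so it stays readable while the device is locked:
NSString *path = [documentsDirectory stringByAppendingPathComponent:@"track.m4a"]; // hypothetical file
NSError *error = nil;
NSDictionary *attrs = @{ NSFileProtectionKey : NSFileProtectionNone };
BOOL ok = [[NSFileManager defaultManager] setAttributes:attrs
                                           ofItemAtPath:path
                                                  error:&error];
if( !ok ){
    NSLog(@"Could not change file protection: %@", error);
}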