Core Audio: Recording in .wav format doesn't work properly

I use AudioFileWritePackets() to write the data during recording, but it does not return noErr. The code I am using is:
if(appDelegate.screenRecording){
    cPacket = 0;
    inNumPackets = inCompleteAQBuffer->mPacketDescriptionCount;
    if (AudioFileWritePackets(nAudioFile, false, numBytes, mPacketDescs, mPacketIndex, &nPackets, inCompleteAQBuffer->mAudioData) == noErr)
    {
        mPacketIndex += nPackets;
        NSLog(@"sample result");
    }
    else {
        NSLog(@"ext err");
    }
}
Every time, it hits the NSLog(@"ext err") branch.
Please help me solve this.
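In case it helps narrow this down, here is a small diagnostic sketch, reusing the variable names from the snippet above, that captures the actual OSStatus instead of discarding it; the returned code usually indicates what Core Audio rejected:
OSStatus tStatus = AudioFileWritePackets(nAudioFile, false, numBytes, mPacketDescs, mPacketIndex, &nPackets, inCompleteAQBuffer->mAudioData);
if (tStatus == noErr)
{
    mPacketIndex += nPackets;
}
else
{
    // Log the raw code; Core Audio errors are often four-character codes.
    NSLog(@"AudioFileWritePackets failed: %d", (int)tStatus);
}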

Related

VLCSharp Unity3D, video Streaming freezes but audio still working

I want to take an RTMP link and stream the video in my Unity project. It works fine... but only for about 4 frames, because after that the video freezes while the audio keeps playing.
For this code I'm using VS Code, Unity3D, and the public example the creators have on their GitHub.
// This is Start(), where I initialize things.
void Start()
{
    Core.Initialize(Application.dataPath);
    _libVLC = new LibVLC();
    PlayPause();
}
// Then we have the PlayPause() method, where we start the MediaPlayer and give it the media URL; this works for a few frames.
private void PlayPause()
{
    if (_mediaPlayer == null)
    {
        _mediaPlayer = new MediaPlayer(_libVLC);
    }
    if (_mediaPlayer.IsPlaying)
    {
        _mediaPlayer.Pause();
    }
    else
    {
        _isPlaying = true;
        if (_mediaPlayer.Media == null)
        {
            // playing remote media
            _mediaPlayer.Media = new Media(_libVLC, new Uri(URL));
        }
        _mediaPlayer.Play();
    }
}
// This method is executed every frame and updates the video texture.
private void Update()
{
    // A few checks before updating the video
    if (!_isPlaying) return;
    if (URL.Equals(null)) URL = "http://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4"; // if URL is null, fall back to the Big Buck Bunny sample video
    // Update the video texture
    if (_tex == null)
    {
        // If the reported size is non-zero and a frame is ready, create the texture
        uint i_videoHeight = 0;
        uint i_videoWidth = 0;
        _mediaPlayer.Size(0, ref i_videoWidth, ref i_videoHeight);
        var texptr = _mediaPlayer.GetTexture(out bool updated);
        if (i_videoWidth != 0 && i_videoHeight != 0 && updated && texptr != IntPtr.Zero)
        {
            Debug.Log("Creating texture with height " + i_videoHeight + " and width " + i_videoWidth);
            _tex = Texture2D.CreateExternalTexture((int)i_videoWidth,
                (int)i_videoHeight,
                TextureFormat.RGBA32,
                false,
                true,
                texptr);
            RenderSettings.skybox.mainTexture = _tex;
        }
    }
    else if (_tex != null)
    {
        var texptr = _mediaPlayer.GetTexture(out bool updated);
        if (updated)
        {
            _tex.UpdateExternalTexture(texptr);
        }
    }
}
This is my first real question on Stack Overflow, so this post may have some issues; I'm open to suggestions for improving it.
I found that the problem is with my stream server's latency, and not with my Unity project.
But still: is there a way to keep the video playing despite high latency?
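One thing that may be worth trying, as an assumption on my side rather than something from the LibVLCSharp sample: give LibVLC a larger network cache so short latency spikes don't stall the video. The option can be set globally when the LibVLC instance is created, or per Media; the 3000 ms value below is arbitrary.
// Global: pass the option when creating LibVLC (value in milliseconds).
_libVLC = new LibVLC("--network-caching=3000");

// Or per media, before handing it to the MediaPlayer:
var media = new Media(_libVLC, new Uri(URL));
media.AddOption(":network-caching=3000");
_mediaPlayer.Media = media;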

Understanding DirectoryWatcher

I've been trying to use and understand the DirectoryWatcher class from Microsoft's Cloud Mirror sample. It uses ReadDirectoryChangesW to monitor changes to a directory. I don't think it's reporting all changes, to be honest. In any event, I had a question about the key part of the code, which is as follows:
concurrency::task<void> DirectoryWatcher::ReadChangesAsync()
{
    auto token = _cancellationTokenSource.get_token();
    return concurrency::create_task([this, token]
    {
        while (true)
        {
            DWORD returned;
            winrt::check_bool(ReadDirectoryChangesW(
                _dir.get(),
                _notify.get(),
                c_bufferSize,
                TRUE,
                FILE_NOTIFY_CHANGE_ATTRIBUTES,
                &returned,
                &_overlapped,
                nullptr));
            DWORD transferred;
            if (GetOverlappedResultEx(_dir.get(), &_overlapped, &transferred, 1000, FALSE))
            {
                std::list<std::wstring> result;
                FILE_NOTIFY_INFORMATION* next = _notify.get();
                while (next != nullptr)
                {
                    std::wstring fullPath(_path);
                    fullPath.append(L"\\");
                    fullPath.append(std::wstring_view(next->FileName, next->FileNameLength / sizeof(wchar_t)));
                    result.push_back(fullPath);
                    if (next->NextEntryOffset)
                    {
                        next = reinterpret_cast<FILE_NOTIFY_INFORMATION*>(reinterpret_cast<char*>(next) + next->NextEntryOffset);
                    }
                    else
                    {
                        next = nullptr;
                    }
                }
                _callback(result);
            }
            else if (GetLastError() != WAIT_TIMEOUT)
            {
                throw winrt::hresult_error(HRESULT_FROM_WIN32(GetLastError()));
            }
            else if (token.is_canceled())
            {
                wprintf(L"watcher cancel received\n");
                concurrency::cancel_current_task();
                return;
            }
        }
    }, token);
}
After reviewing an answer to another question, here's what I don't understand about the code above: isn't the code potentially re-calling ReadDirectoryChangesW before the prior call has returned a result? Or is this code indeed correct? Thanks for any input.
Yes, my testing seems to confirm that there should be another while loop around the call to GetOverlappedResultEx, similar to the sample code provided in that other answer; with that change, the notifications appear to fire properly (see the sketch below).
Shouldn't there also be a call to CancelIo in there, too? Or is that not necessary for some reason?
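For reference, here is a rough sketch of the restructuring I had in mind, reusing the member names from the sample above: issue ReadDirectoryChangesW once, then poll GetOverlappedResultEx in an inner loop until that read completes or the token is cancelled. The CancelIoEx call is my assumption about how to abandon the outstanding read on cancellation; it is not taken from the Cloud Mirror sample.
concurrency::task<void> DirectoryWatcher::ReadChangesAsync()
{
    auto token = _cancellationTokenSource.get_token();
    return concurrency::create_task([this, token]
    {
        while (true)
        {
            // Issue exactly one overlapped read...
            DWORD returned;
            winrt::check_bool(ReadDirectoryChangesW(
                _dir.get(), _notify.get(), c_bufferSize, TRUE,
                FILE_NOTIFY_CHANGE_ATTRIBUTES, &returned, &_overlapped, nullptr));

            // ...and wait for that read to finish before issuing another one.
            DWORD transferred = 0;
            for (;;)
            {
                if (GetOverlappedResultEx(_dir.get(), &_overlapped, &transferred, 1000, FALSE))
                {
                    break; // read completed, process the notifications
                }
                if (GetLastError() != WAIT_TIMEOUT)
                {
                    throw winrt::hresult_error(HRESULT_FROM_WIN32(GetLastError()));
                }
                if (token.is_canceled())
                {
                    CancelIoEx(_dir.get(), &_overlapped); // abandon the outstanding read
                    wprintf(L"watcher cancel received\n");
                    concurrency::cancel_current_task();
                    return;
                }
            }

            // Walk the FILE_NOTIFY_INFORMATION records exactly as in the original code.
            std::list<std::wstring> result;
            FILE_NOTIFY_INFORMATION* next = _notify.get();
            while (next != nullptr)
            {
                std::wstring fullPath(_path);
                fullPath.append(L"\\");
                fullPath.append(std::wstring_view(next->FileName, next->FileNameLength / sizeof(wchar_t)));
                result.push_back(fullPath);
                next = next->NextEntryOffset
                    ? reinterpret_cast<FILE_NOTIFY_INFORMATION*>(reinterpret_cast<char*>(next) + next->NextEntryOffset)
                    : nullptr;
            }
            _callback(result);
        }
    }, token);
}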

LinqToTwitter User Stream memory leak

I'm having an issue with LinqToTwitter 4.1 where having a user stream open will eventually cause the program's memory usage to balloon out of control. This does not occur when the program starts running, but only after some time, normally after a day or two.
Using the ANTS Memory Profiler, I found this reference chain preventing System.Byte[] from being collected. The full profiler results can be downloaded here.
Code:
private async Task<Streaming> TwitterSteam(string trackHashTags, string twitterUserIds)
{
    var stream = (from strm in TwitterCtx.Streaming.WithCancellation(CloseStreamSource.Token)
                  where strm.Type == StreamingType.Filter &&
                        strm.Track == trackHashTags
                        && strm.Follow == (string.IsNullOrEmpty(twitterUserIds) ? "41553192" : twitterUserIds)
                  select strm).StartAsync(async strm =>
                  {
                      string message = string.IsNullOrEmpty(strm.Content) ? "Keep-Alive" : strm.Content;
                      if (message == "Keep-Alive")
                      {
                          IsRunning = true;
                      }
                      else
                      {
                          JsonData data = JsonMapper.ToObject(message);
                          Status tweet = new Status(data);
                          LogClient.LogInfo("Received Tweet: " + tweet.Text, null, LogType.Info, null);
                          ConvertToMessage(tweet);
                          IsRunning = true;
                      }
                  }).Result.SingleOrDefault();
    return stream;
}
Can anyone provide insight as to why this is occurring and how I can prevent it?

No Data Sources show in Google Fit

I am trying to record location and speed from the Sensors API in Google Fit in my Watch Face Service (Android Wear). Here's the code I use:
private void connectFitnessDataListeners() {
    Fitness.SensorsApi.findDataSources(mGoogleApiClient,
            new DataSourcesRequest.Builder()
                    .setDataTypes(DataType.TYPE_ACTIVITY_SAMPLE)
                    .setDataTypes(DataType.TYPE_LOCATION_SAMPLE)
                    .setDataTypes(DataType.TYPE_SPEED)
                    // Can specify whether data type is raw or derived.
                    .setDataSourceTypes(DataSource.TYPE_RAW)
                    .build())
            .setResultCallback(new ResultCallback<DataSourcesResult>() {
                @Override
                public void onResult(DataSourcesResult dataSourcesResult) {
                    Log.i(TAG, "Find Data Sources: " + dataSourcesResult.getStatus().toString());
                    for (DataSource dataSource : dataSourcesResult.getDataSources()) {
                        // only include dataSources from the watch
                        Log.i(TAG, "Data source found: " + dataSource.toString());
                        Log.i(TAG, String.format("Data Source device %s and type: %s", dataSource.getDevice().toString(), dataSource.getDataType().getName()));
                        if (dataSource.getDevice() != Device.getLocalDevice(getBaseContext())) continue;
                        // FIXME: Turn this into a loop over an array
                        if (dataSource.getDataType().equals(DataType.TYPE_ACTIVITY_SAMPLE) && (mActivitySampleListener == null)) {
                            Log.i(TAG, "Data source for ACTIVITY_SAMPLE found! Registering.");
                            mActivitySampleListener = registerFitnessDataListener(dataSource, DataType.TYPE_ACTIVITY_SAMPLE);
                        } else if (dataSource.getDataType().equals(DataType.TYPE_LOCATION_SAMPLE) && (mLocationSampleListener == null)) {
                            Log.i(TAG, "Data source for LOCATION_SAMPLE found! Registering.");
                            mLocationSampleListener = registerFitnessDataListener(dataSource, DataType.TYPE_ACTIVITY_SAMPLE);
                        } else if (dataSource.getDataType().equals(DataType.TYPE_SPEED) && (mSpeedListener == null)) {
                            Log.i(TAG, "Data source for SPEED found! Registering.");
                            mSpeedListener = registerFitnessDataListener(dataSource, DataType.TYPE_SPEED);
                        }
                    }
                }
            }
    );
}
However, when I run this in an otherwise working onConnected call, I get no DataSources in the for loop (the Find Data Sources log call shows success).
Any thoughts as to what I should pursue?
UPDATE:
If I remove the setDataSourceTypes restriction, I get only this:
Data source found: DataSource{derived:Application{com.google.android.gms::null}:Device{Sony:SmartWatch 3:15e991d4::3:2}::DataType{com.google.speed[speed(f)]}}
Data Source device Device{Sony:SmartWatch 3:15e991d4::3:2} and type: com.google.speed
So why don't I see location information from the GPS?
This was my developer error: setDataTypes takes a vararg list of DataTypes, so by chaining three separate calls my example ended up only requesting TYPE_SPEED.
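For completeness, here is a sketch of the corrected builder, passing all three types in a single vararg call and otherwise identical to the request in the question:
DataSourcesRequest request = new DataSourcesRequest.Builder()
        // All desired data types go in one vararg call; each additional
        // setDataTypes call replaces the previous list, which is why only
        // TYPE_SPEED showed up before.
        .setDataTypes(DataType.TYPE_ACTIVITY_SAMPLE,
                DataType.TYPE_LOCATION_SAMPLE,
                DataType.TYPE_SPEED)
        .setDataSourceTypes(DataSource.TYPE_RAW)
        .build();
// Pass this request to Fitness.SensorsApi.findDataSources(mGoogleApiClient, request)
// and attach the same ResultCallback as in the question.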

alBufferData() sets AL_INVALID_OPERATION when using buffer ID obtained from alSourceUnqueueBuffers()

I am trying to stream audio data from disk using OpenAL's buffer queueing mechanism. I load and enqueue 4 buffers, start the source playing, and check at regular intervals to refresh the queue. Everything looks like it's going splendidly, up until the first time I try to load data into a recycled buffer I got from alSourceUnqueueBuffers(). In this situation, alBufferData() always sets AL_INVALID_OPERATION, which, according to the official v1.1 spec, it doesn't seem like it should be able to do.
I have searched extensively on Google and StackOverflow, and can't seem to find any reason why this would happen. The closest thing I found was someone with a possibly-related issue in an archived forum post, but details are few and responses are null. There was also this SO question with slightly different circumstances, but the only answer's suggestion does not help.
Possibly helpful: I know my context and device are configured correctly, because loading small wav files completely into a single buffer and playing them works fine. Through experimentation, I've also found that queueing 2 buffers, starting the source playing, and immediately loading and enqueueing the other two buffers throws no errors; it's only when I've unqueued a processed buffer that I run into trouble.
The relevant code:
static constexpr int MAX_BUFFER_COUNT = 4;

#define alCall(funcCall) {funcCall; SoundyOutport::CheckError(__FILE__, __LINE__, #funcCall) ? abort() : ((void)0); }

bool SoundyOutport::CheckError(const string &pFile, int pLine, const string &pfunc)
{
    ALenum tErrCode = alGetError();
    if(tErrCode != 0)
    {
        auto tMsg = alGetString(tErrCode);
        Log::e(ro::TAG) << tMsg << " at " << pFile << "(" << pLine << "):\n"
                        << "\tAL call " << pfunc << " failed." << end;
        return true;
    }
    return false;
}

void SoundyOutport::EnqueueBuffer(const float* pData, int pFrames)
{
    static int called = 0;
    ++called;
    ALint tState;
    alCall(alGetSourcei(mSourceId, AL_SOURCE_TYPE, &tState));
    if(tState == AL_STATIC)
    {
        Stop();
        // alCall(alSourcei(mSourceId, AL_BUFFER, NULL));
    }
    ALuint tBufId = AL_NONE;
    int tQueuedBuffers = QueuedUpBuffers();
    int tReady = ProcessedBuffers();
    if(tQueuedBuffers < MAX_BUFFER_COUNT)
    {
        tBufId = mBufferIds[tQueuedBuffers];
    }
    else if(tReady > 0)
    {
        // the fifth time through, this code gets hit
        alCall(alSourceUnqueueBuffers(mSourceId, 1, &tBufId));
        // debug code: make sure these values go down by one
        tQueuedBuffers = QueuedUpBuffers();
        tReady = ProcessedBuffers();
    }
    else
    {
        return; // no update needed yet.
    }
    void* tConverted = convert(pData, pFrames);
    // the fifth time through, we get AL_INVALID_OPERATION, and call abort()
    alCall(alBufferData(tBufId, mFormat, tConverted, pFrames * mBitdepth/8, mSampleRate));
    alCall(alSourceQueueBuffers(mSourceId, 1, &mBufferId));
    if(mBitdepth == BITDEPTH_8)
    {
        delete (uint8_t*)tConverted;
    }
    else // if(mBitdepth == BITDEPTH_16)
    {
        delete (uint16_t*)tConverted;
    }
}

void SoundyOutport::PlayBufferedStream()
{
    if(!StreamingMode() || !QueuedUpBuffers())
    {
        Log::w(ro::TAG) << "Attempted to play an unbuffered stream" << end;
        return;
    }
    alCall(alSourcei(mSourceId, AL_LOOPING, AL_FALSE)); // never loop streams
    alCall(alSourcePlay(mSourceId));
}

int SoundyOutport::QueuedUpBuffers()
{
    int tCount = 0;
    alCall(alGetSourcei(mSourceId, AL_BUFFERS_QUEUED, &tCount));
    return tCount;
}

int SoundyOutport::ProcessedBuffers()
{
    int tCount = 0;
    alCall(alGetSourcei(mSourceId, AL_BUFFERS_PROCESSED, &tCount));
    return tCount;
}

void SoundyOutport::Stop()
{
    if(Playing())
    {
        alCall(alSourceStop(mSourceId));
    }
    int tBuffers;
    alCall(alGetSourcei(mSourceId, AL_BUFFERS_QUEUED, &tBuffers));
    if(tBuffers)
    {
        ALuint tDummy[tBuffers];
        alCall(alSourceUnqueueBuffers(mSourceId, tBuffers, tDummy));
    }
    alCall(alSourcei(mSourceId, AL_BUFFER, AL_NONE));
}

bool SoundyOutport::Playing()
{
    ALint tPlaying;
    alCall(alGetSourcei(mSourceId, AL_SOURCE_STATE, &tPlaying));
    return tPlaying == AL_PLAYING;
}

bool SoundyOutport::StreamingMode()
{
    ALint tState;
    alCall(alGetSourcei(mSourceId, AL_SOURCE_TYPE, &tState));
    return tState == AL_STREAMING;
}

bool SoundyOutport::StaticMode()
{
    ALint tState;
    alCall(alGetSourcei(mSourceId, AL_SOURCE_TYPE, &tState));
    return tState == AL_STATIC;
}
And here's an annotated screen cap of what I see in my debugger when I hit the error:
I've tried a bunch of little tweaks and variations, and the result is always the same. I've wasted too many days trying to fix this. Please help :)
This error occurs when you try to fill a buffer with data while that buffer is still queued to the source.
Also, this code is wrong:
if(tQueuedBuffers < MAX_BUFFER_COUNT)
{
    tBufId = mBufferIds[tQueuedBuffers];
}
else if(tReady > 0)
{
    // the fifth time through, this code gets hit
    alCall(alSourceUnqueueBuffers(mSourceId, 1, &tBufId));
    // debug code: make sure these values go down by one
    tQueuedBuffers = QueuedUpBuffers();
    tReady = ProcessedBuffers();
}
else
{
    return; // no update needed yet.
}
You can fill a buffer with data only when it is unqueued from the source, but your first if block picks a tBufId that is still queued to the source. Rewrite the code like so:
if(tReady > 0)
{
    // the fifth time through, this code gets hit
    alCall(alSourceUnqueueBuffers(mSourceId, 1, &tBufId));
    // debug code: make sure these values go down by one
    tQueuedBuffers = QueuedUpBuffers();
    tReady = ProcessedBuffers();
}
else
{
    return; // no update needed yet.
}
