Deploying an OS X app as an available device (camera) - macos

I have an existing OS X app (Xcode 7, Swift) that collects real-time data (both text and photos) from various local and internet (JSON) sources and presents it in a user window.
I want to expose the same data as an output stream that appears to be just an OS X add-on camera, so that it can be selected as a camera source in other OS X programs.
I'm having trouble finding any framework documentation or examples for an app acting as a camera. Any help is greatly appreciated.

Have you seen this project? https://github.com/rsodre/ofxFakam It seems the way to add a video recording source to OS X is to create and install a QuickTime component. Unfortunately, I don't know much more than that. On the plus side, it all happens in user space!
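Once such a component is installed, other apps discover it the same way they discover real cameras. As a quick sanity check, here is a minimal Swift sketch (using AVFoundation, which superseded the QuickTime-era APIs; it assumes macOS 10.15+) that lists the video capture devices other programs would see:

import AVFoundation

// Every video capture device the system exposes; a correctly installed
// virtual camera should appear here alongside the built-in and USB ones.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInWideAngleCamera, .externalUnknown],
    mediaType: .video,
    position: .unspecified
)
for device in discovery.devices {
    print(device.localizedName, device.uniqueID)
}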

Related

How to build a Flutter Firebase app for both mobile and desktop at the same time

I see many samples and videos on how to use Firebase on mobile, and they call this "multiplatform". However, I don't see much for desktop. There is one video on Firebase with Flutter on Windows that uses the web APIs, and it seems to work. However, I do not see any tutorials covering both mobile and desktop. Firebase would be a great example for syncing between desktop and mobile; we have such an app in development right now. Desktop development is new, but I'm surprised how little there is.
There is a library called firebase_dart, but the documentation seems weak.
The package firedart, with the video linked below, works on both desktop (Linux) and Android without much modification.
What needed to be modified?
I had difficulty with the button at the very top of the phone, so I added a SizedBox.
I had difficulty with debugPrint and print, so I added a Text widget showing the results (converted to a string). That also worked.
Although I would prefer not to use fluent_ui, it does work for both desktop and mobile. I'm not sure what to do about the Realtime Database, but I think I can make firedart work for user sync between mobile and desktop.
It would be better if I could get firebase_dart to work.
https://www.youtube.com/watch?v=Tw7L2NkhwPc

Audio problems in a NativeScript app on iOS

I have a NativeScript app that uses text-to-speech functionality (@nativescript-community/texttospeech).
This works fine so far.
Now I open a modal view with the @nstudio/camera-plus plugin, which lets the user capture a photo.
After closing the camera view, the text-to-speech audio is only half as loud as before.
Is the camera messing up the audio settings? And if so, how can I avoid that?
Nativescript 8.1
texttospeech 3.0.3
camera-plus 4.0.3
I didn't find the cause of this, but at least found a workaround.
I put this above every audio-related task:
AVAudioSession.sharedInstance().setCategoryWithOptionsError( AVAudioSessionCategoryPlayAndRecord, AVAudioSessionCategoryOptions.DefaultToSpeaker);
That sets the volume back to full.
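For reference, this is what that NativeScript line does in native terms; the equivalent standard iOS AVAudioSession call in Swift (shown only to clarify the workaround):

import AVFoundation

// Re-assert the play-and-record category with output routed to the
// speaker; this restores full playback volume after the camera view
// has reconfigured the shared audio session.
do {
    try AVAudioSession.sharedInstance().setCategory(
        .playAndRecord,
        options: [.defaultToSpeaker]
    )
} catch {
    print("Failed to set audio session category: \(error)")
}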

Mac OS: get full control of a webcam (USB connected)

The task:
OS: Mac OS X 10.9 +
Description:
There is a webcam connected to a Mac via USB. I need to find a way to access its brightness, pan, color temperature, focus, etc.
I also need a way to apply image filters to the camera's video stream.
I need to be able to control the camera while it is being used by other programs such as Skype, so that I can, for example, transmit a video stream with increased contrast during a Skype video call.
Reference app: https://itunes.apple.com/app/webcam-settings/id533696630?mt=12
Solution:
This is the question.
As far as I understand, I must find a custom kext (driver) in order to perform all this magic.
Could you please point me in the right direction: libraries, drivers, etc.?
You can use the OpenCV library to capture camera frames, apply filters, etc.
http://docs.opencv.org/2.4/doc/tutorials/introduction/display_image/display_image.html
Then you can feed a virtual cam that in turn feeds into Skype, etc.
http://download.cnet.com/Virtual-Webcam/3000-2348_4-75754338.html
There are also many open-source virtual webcams available.
I hope this helps.
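To make the capture-and-filter step concrete in native macOS terms, here is a rough Swift sketch using AVFoundation for capture and Core Image for the filter, as an alternative to the OpenCV route above. The class and queue names are illustrative; note this only filters frames inside your own process, so feeding the result into Skype still requires a separate virtual-camera driver:

import AVFoundation
import CoreImage

final class FilteredCapture: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let ciContext = CIContext()

    func start() throws {
        // Open the default camera (the USB webcam, if it is the default).
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        session.addInput(try AVCaptureDeviceInput(device: camera))

        // Receive raw frames on a background queue.
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.frames"))
        session.addOutput(output)
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

        // Boost contrast -- the same kind of per-frame filtering the
        // answer suggests doing with OpenCV.
        let filter = CIFilter(name: "CIColorControls")!
        filter.setValue(CIImage(cvPixelBuffer: pixelBuffer), forKey: kCIInputImageKey)
        filter.setValue(1.4, forKey: kCIInputContrastKey)

        // Render the filtered image back into the frame's pixel buffer;
        // a virtual-camera driver would hand this frame on to Skype.
        if let filtered = filter.outputImage {
            ciContext.render(filtered, to: pixelBuffer)
        }
    }
}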

Flash Player interface problem

I am facing an interface freeze issue with a Flash Projector running a Flex state-based application. The Flash Projector exe was generated from the standalone Flash Player, version 10.2. The target machine on which the problem occurred has 10.3.
Basically, "screen freeze" means that the user interface keeps running as usual in Flash Player, but it does not respond to any user input (like button presses). If we Alt-Tab to another application, the state changes in the Flash Player. There is a display with buttons on the screen, but touching the buttons, or doing anything else, gets no response. Rebooting the computer fixes the problem.
Can you suggest why this is happening? Is there any known bug in Flash Player?
The problem is that this is hard to reproduce on the developer workstation, as it doesn't always happen. But it happens quite often on the target machine, which runs an Intel Atom N270. What debugging steps can you suggest?
Video of the problem: http://www.youtube.com/watch?v=z25oV9QWRyk
Have you tried publishing the projector with 10.1 or a version of Flash newer than 10.2? If you are able to publish it as a SWF first, you can use the stand-alone projector exe (downloadable here) to load it and create a projector from it.
According to this Adobe bug issue (registration required), version 10.1 was supposed to have resolved this, but it sounds like it may have reappeared in 10.2.

Simple record and play application

I am trying to make a simple application that records what the user says, say on click of a Record button, and plays it back, say on click of a Play button.
Can anyone suggest an appropriate way to do this?
Thanks,
Miraaj
You can use QuickTime Kit's capture APIs to record a movie of the audio, and QTMovie (from the same framework) to convert it to a more conventional format for audio files and to play back both the intermediate file and the converted file.
There used to be a QuickTime Kit Programming Guide, but it didn't cover capturing and is now gone from developer.apple.com. You should file a bug against the docs.
This answer will work in a Cocoa (Mac) app. If you meant to ask about the iPhone, you should re-tag your question, as the solution will be completely different for a Cocoa app vs. a Cocoa Touch (iPhone) app.
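QuickTime Kit has since been deprecated, so as a rough sketch of the same record-then-play flow using its successor, AVFoundation (the class name, file location, and recording settings below are illustrative choices, not requirements):

import AVFoundation

final class SoundBox {
    private let fileURL = FileManager.default.temporaryDirectory
        .appendingPathComponent("recording.m4a")
    private var recorder: AVAudioRecorder?
    private var player: AVAudioPlayer?

    // Wire this to the Record button.
    func startRecording() throws {
        let settings: [String: Any] = [
            AVFormatIDKey: kAudioFormatMPEG4AAC,
            AVSampleRateKey: 44_100,
            AVNumberOfChannelsKey: 1,
        ]
        recorder = try AVAudioRecorder(url: fileURL, settings: settings)
        recorder?.record()
    }

    // Wire this to the Play button.
    func stopAndPlay() throws {
        recorder?.stop()
        player = try AVAudioPlayer(contentsOf: fileURL)
        player?.play()
    }
}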
I used DirectSound to create an entire internet phone application a few years ago. Your question is far simpler; you won't have to deal with the circular buffer as critically. DirectSound is pretty mainstream, you can find a lot of help with it in forums, and it's free!
