Is there a way to interact with a built-in camera in any smartwatch? - wear-os

I am trying to figure out whether there is a way to interact with a built-in camera on a smartwatch. Wear OS, watchOS, or any other smartwatch OS would suffice, but I haven't been able to find any comprehensive documentation on this.
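For what it's worth, since Wear OS apps are ordinary Android apps, one thing I can at least do is probe at runtime whether the watch reports any camera hardware at all. A minimal sketch (the class name is my own):

    import android.content.Context;
    import android.content.pm.PackageManager;

    // Minimal sketch: ask the platform whether this watch has any camera at all.
    // Most Wear OS watches will return false here, which is the crux of the problem.
    public final class CameraProbe {
        public static boolean hasAnyCamera(Context context) {
            PackageManager pm = context.getPackageManager();
            // FEATURE_CAMERA_ANY is true if the device has at least one camera,
            // facing any direction.
            return pm.hasSystemFeature(PackageManager.FEATURE_CAMERA_ANY);
        }
    }

If this returns true, the regular android.hardware.camera2 APIs should apply just as on a phone; if it returns false (the common case), there is simply no camera to talk to.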

Related

Where can I find an example of creating a FaceTime-comparable camera in OS X?

Many of us are working from home more, and I envy the Windows guys who have a virtual webcam plugin for OBS (Open Broadcaster Software). OBS and the Windows plugin are open source projects. As a competent software engineer I should be able to create a plugin that works on OS X, but I am not a hardened OS X dev.
I am sure I am not googling for the correct APIs and subsystems. If someone could help me with the Apple concept map to this obscure topic, I would be grateful for a set of crumbs that leads to the OS X API call(s) needed to create a camera. I know it can be done, since SnapCam does it, but that is a closed-source app.
I am aware of the workaround for OBS, which:
1) uses code injection and requires disabling security features,
2) doesn't even work in current versions of OS X, and
3) requires yet another app running with video previews, etc.
I like the challenge of creating this plugin. I am also wise enough to ask for a roadmap first, if one is available.
Someone beat me to it. https://github.com/johnboiles/obs-mac-virtualcam
I thought I would search github.com directly with the query "virtual camera macos site:github.com". Constraining the search to just GitHub was quite useful.

Is it possible to use the ML Kit SDK on Wear OS?

I am wondering if anyone knows whether it is possible to use the ML Kit SDK on Wear OS devices. I know Wear OS is based on Android, and I've seen references online to Firebase notifications working on Wear OS.
I have googled combinations of the terms "ML Kit", "Firebase ML Kit" and "Wear OS" but not found any definitive answers.
I don't have code on hand, but I am wondering whether it would even be possible to import and use the SDK in a Wear OS app in the first place.
The expected result would be the ability to instantiate and use some of the machine learning models from the ML Kit API on a Wear OS watch.
Battery usage and efficiency are secondary at the moment; I'm just wondering if it is possible at all.
I have never tried it, but I think it is possible. In the end, Android Wear 2.0 (Wear OS) is just another regular Android, and it provides direct internet access from the device.
I was able to use RenderScript successfully on Wear devices, but on the other hand I faced an issue where the Google Awareness API was not available on Wear. This seems not to be the expected state.
In the end you will probably need to do a feasibility study on your own.
You can take one of the Android ML Kit samples and simply run it on a Wear device.
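For instance, here is a minimal sketch of on-device image labeling along the lines of the standard Android samples, assuming the standalone com.google.mlkit image-labeling artifact (at the time of this question the same functionality shipped under the Firebase ML Kit SDK, so artifact and package names may differ for you):

    import android.graphics.Bitmap;
    import com.google.mlkit.vision.common.InputImage;
    import com.google.mlkit.vision.label.ImageLabel;
    import com.google.mlkit.vision.label.ImageLabeler;
    import com.google.mlkit.vision.label.ImageLabeling;
    import com.google.mlkit.vision.label.defaults.ImageLabelerOptions;

    // Sketch: run the default on-device image-labeling model on a bitmap.
    // Nothing here is Wear-specific; the open question is whether the
    // Google Play services dependencies resolve on your target watch.
    public final class WearLabeler {
        public static void labelBitmap(Bitmap bitmap) {
            // Wrap the bitmap; the second argument is the rotation in degrees.
            InputImage image = InputImage.fromBitmap(bitmap, 0);
            ImageLabeler labeler =
                    ImageLabeling.getClient(ImageLabelerOptions.DEFAULT_OPTIONS);
            labeler.process(image)
                    .addOnSuccessListener(labels -> {
                        for (ImageLabel label : labels) {
                            android.util.Log.d("WearLabeler",
                                    label.getText() + " " + label.getConfidence());
                        }
                    })
                    .addOnFailureListener(e ->
                            android.util.Log.e("WearLabeler", "Labeling failed", e));
        }
    }

If the ML Kit dependencies install and run on the watch, this should behave exactly as it does on a phone; if they don't, that in itself answers the feasibility question.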

Can I develop Scripts for DJI Drones and in which programming language?

I am thinking about buying a DJI Mavic Pro drone to develop my own scripts for deep learning experiments like autonomous flight, object recognition, and more.
I want to know which libraries or SDKs are out there for this, and which programming languages they require.
I know some cheap drones let you program scripts in Python or various other languages, and also let you modify prewritten functions of their SDK, but what about DJI?
All SDKs and documentation are available here: http://developer.dji.com/mobile-sdk/
For the Mavic Pro, we support iOS and Android, and for both we offer an SDK with all the controls you might want, along with a VideoPreviewer for handling the video feed and a UILibrary for drone-specific app UI objects.
For languages:
iOS: Objective-C, Swift or C++ will work fine.
Android: Java.
Finally if you are interested in object recognition, check out this project:
https://github.com/game-of-drones/dji-mobilesdk-vision
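To give a flavor of what the Android side looks like, here is a minimal sketch of commanding a takeoff with the 4.x-era DJI Mobile SDK for Android. Class and callback names have shifted between SDK releases, and app registration against the developer portal is assumed to have already succeeded, so treat this as an outline rather than a drop-in implementation:

    import dji.common.error.DJIError;
    import dji.common.util.CommonCallbacks;
    import dji.sdk.flightcontroller.FlightController;
    import dji.sdk.products.Aircraft;
    import dji.sdk.sdkmanager.DJISDKManager;

    // Hedged sketch: assumes DJISDKManager.registerApp(...) has already
    // succeeded and a Mavic is connected over USB/Wi-Fi via the remote.
    public class TakeoffExample {
        public static void takeOff() {
            // getProduct() returns the currently connected product, if any.
            Aircraft aircraft = (Aircraft) DJISDKManager.getInstance().getProduct();
            if (aircraft == null || !aircraft.isConnected()) {
                return; // no aircraft connected yet
            }
            FlightController controller = aircraft.getFlightController();
            controller.startTakeoff(new CommonCallbacks.CompletionCallback() {
                @Override
                public void onResult(DJIError error) {
                    if (error == null) {
                        // Takeoff accepted: the aircraft climbs to a low hover.
                    } else {
                        // error.getDescription() explains why the command failed.
                    }
                }
            });
        }
    }

From there, the same FlightController exposes virtual stick control for scripted flight, and the video feed from the VideoPreviewer can be handed to whatever vision pipeline you like.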
For a Mavic, I think it would have to be iOS or Android:
http://developer.dji.com/mobile-sdk/

Mac app with Unity3D

I need to create a Mac app which shows 3D models and supports multi-touch events. It will run on a Mac computer with a Mac OS X touch screen. I will be using Unity3D to create the 3D models in a scene. I want to know how I can integrate these Unity models/scenes into my Mac app.
One option is to embed the Unity Web Player in my Mac app, but the issue is that the Web Player doesn't seem to recognize multi-touch events like pinch-to-zoom.
The other option is to integrate/embed the Unity3D models into my app directly, but I have no clue how to do that. In the Unity build options I can build a Mac standalone, but that produces a .app file directly, so I don't think I can use it to integrate with my Mac app.
Any help on this is really appreciated.
Thanks
Why bother making a second app and integrating the Unity app into the first? Why not just make the whole thing in Unity?
I forgot to mention: there are a couple of free Unity demos that illustrate how to create or modify meshes procedurally (if you are trying to make anything more complex than basic primitive shapes).
What kind of 3D models are you trying to make? Unity3D can only make primitives and really isn't designed for 3D modeling, so if you are trying to make more than just boxes, spheres, etc., you will need another program. Blender is free and works well with Unity. There is a plugin here that will allow you to make 3D models in Unity: http://gamedraw.mixeddimensions.com/ I haven't tried it myself, but they have a free version you could try before buying the full plugin to see if it fits your needs. I agree with the previous reply that if you are going to use Unity, you might as well make the whole app in Unity.

Syncing an iPod or iPhone with Cocoa

I'm creating an iTunes clone in Cocoa (don't ask why, it's not evil) and I want to be able to sync my iPod with it. This means: music, photos, videos and podcasts. I couldn't really find anything, since Google only shows articles about iPod touch and iPhone programming, but I'm actually creating a desktop application for Mac OS X, and I also want to be able to sync click-wheel iPods.
Is there an API or should I read and write directly to the USB port?
Can anyone help me? Thanks
Apple jealously guards sync capability and doesn't provide an API. As far as I know you can't even use iTunes automation to make it do the syncing for you.
Ever-resourceful, the open source community has reverse-engineered the protocols and the libimobiledevice project exists to provide a sync library for Linux-based systems. I don't believe the library will build on OSX -- it relies on the Linux USB architecture -- but if you need to write your own sync library, it will provide you with a good starting point to understand the protocol and device workings.
