Sending images using I2C

I have a question: is it possible to send images from one device to another using I2C communication?
Can someone help me, please?
I am currently using a camera with a WiFi module to capture and transfer data, but I need to split the photo capture and the transmission across two devices, to reduce the size of the capturing device. So I am looking for a way to send data saved on an SD card on one device to another device, which is why I am asking whether it is possible to send images over I2C.
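It should be possible in principle: I2C carries raw bytes and has no notion of files, so the usual pattern is to read the image off the SD card and write it out in small chunks that the receiver reassembles. Below is a minimal, untested sketch of the sending side, assuming a Linux master using the i2c-dev interface; the slave address 0x28, chunk size, file path, and pacing delay are all assumptions to adapt to your hardware.

    /*
     * Sketch: sending a file over I2C in small chunks from a Linux master.
     * Assumptions (not from the original post): the sender runs Linux with
     * the i2c-dev driver, the receiver listens as a slave at address 0x28
     * and reassembles chunks in order. Error handling is minimal.
     */
    #include <fcntl.h>
    #include <linux/i2c-dev.h>
    #include <stdio.h>
    #include <sys/ioctl.h>
    #include <unistd.h>

    #define SLAVE_ADDR 0x28   /* hypothetical receiver address */
    #define CHUNK_SIZE 32     /* I2C transfers work best in small chunks */

    int send_file_over_i2c(const char *dev, const char *path)
    {
        int bus = open(dev, O_RDWR);
        if (bus < 0) { perror("open i2c bus"); return -1; }
        if (ioctl(bus, I2C_SLAVE, SLAVE_ADDR) < 0) { perror("set address"); close(bus); return -1; }

        FILE *f = fopen(path, "rb");
        if (!f) { perror("open image"); close(bus); return -1; }

        unsigned char chunk[CHUNK_SIZE];
        size_t n;
        while ((n = fread(chunk, 1, sizeof chunk, f)) > 0) {
            if (write(bus, chunk, n) != (ssize_t)n) { perror("write"); break; }
            usleep(1000); /* give the slave time to store the chunk */
        }
        fclose(f);
        close(bus);
        return 0;
    }

    int main(void)
    {
        /* e.g. a JPEG saved on the SD card of the capture device */
        return send_file_over_i2c("/dev/i2c-1", "/sd/photo.jpg");
    }

Bear in mind that I2C tops out at a few hundred kbit/s on most hardware, so a large photo will take a while; you also need some framing (length header, checksum) so the receiver knows when the image is complete.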

Related

Sending Bluetooth Advertising Packets and Getting Some Answers

I want to build something with a Raspberry Pi Zero and write it in Go.
I have never tried Bluetooth before, and my goal is:
sending a dynamic packet that will change every second; an iOS app will read this message, and with a button press the client will send a message back without a connection.
Is Bluetooth Advertising what I am looking for, and do you know any GoLang library for it? Where should I start?
There are quite a lot of parts to your question. If you want to be connection-less, then the BLE roles are Broadcaster (beacon) and Observer (scanner). There are a number of "standard" beacon formats out there; they are summarized nicely on this cheat sheet.
Of course you can create your own format, as these all use either the Service Data or Manufacturer Data field in a BLE advertisement.
On Linux (Raspberry Pi) the official Bluetooth stack is BlueZ, which documents the available APIs at: https://git.kernel.org/pub/scm/bluetooth/bluez.git/tree/doc
If you want to stay connection-less, then each device is going to have to change its role regularly. This requires a bit of careful thought about how long each device listens and broadcasts, as you don't want both devices talking at the same time, or both listening at the same time.
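To make that duty-cycle concrete, here is a rough sketch of the alternation loop. The start/stop functions are hypothetical placeholders for whatever your BLE stack provides (e.g. BlueZ over D-Bus); the one-second advertise window matches your "changes every second" requirement, while the longer listen window and the randomized start offset are assumptions meant to keep two identical devices out of lock-step.

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>
    #include <unistd.h>

    /* Hypothetical placeholders: swap in your BLE stack's real calls.
       These stubs just print what they would do. */
    static void start_advertising(void) { puts("advertising..."); }
    static void stop_advertising(void)  { puts("stop advertising"); }
    static void start_scanning(void)    { puts("scanning..."); }
    static void stop_scanning(void)     { puts("stop scanning"); }

    int main(void)
    {
        srand((unsigned)time(NULL));
        /* Random initial offset so two identical devices don't end up
           broadcasting and listening in lock-step with each other. */
        usleep((useconds_t)((rand() % 1000) * 1000));

        for (;;) {
            start_advertising();   /* Broadcaster role */
            sleep(1);              /* the payload changes every second */
            stop_advertising();

            start_scanning();      /* Observer role: listen for replies */
            sleep(2);              /* listen longer than we talk */
            stop_scanning();
        }
    }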
You might find the following article of interest to get you started with BLE and Go:
https://towardsdatascience.com/spelunking-bluetooth-le-with-go-c2cff65a7aca

How can I capture microphone data and route it to a virtual microphone device?

Recently, I wanted to get my hands dirty with Core Audio, so I started working on a simple desktop app that will apply effects (e.g. echo) to the microphone data in real time, so that the processed data can then be used in communication apps (e.g. Skype, Zoom, etc.).
To do that, I figured that I have to create a virtual microphone, to be able to send the processed (with the applied effects) data to communication apps. For example, the user would select this new (virtual) microphone device as the Input Device in a Zoom call so that the other users in the call can hear her with her voice processed.
My main concern is that I need to find a way to "route" the voice data captured from the physical microphone (e.g. the built-in mic) to the virtual microphone. I've spent some time reading the book "Learning Core Audio" by Adamson and Avila; in Chapter 8 the authors explain how to write an app that a) uses an AUHAL to capture data from the system's default input device and b) then sends the data to the system's default output using an AUGraph. Following this example, I figured that I also need to create an app that captures the microphone data only while it's running.
So, what I've done so far:
I've created the virtual microphone, for which I followed the NullAudio driver example from Apple.
I've created the app that captures the microphone data.
For both of the above "modules" I'm certain that they work as expected independently, since I've tested them in various ways. The only missing piece now is how to "connect" the physical mic to the virtual mic: I need to connect the output of the physical microphone to the input of the virtual microphone.
So, my questions are:
Is this something trivial that can be achieved using the AUGraph approach, as described in the book? Should I just find the correct way to configure the graph in order to achieve this connection between the two devices?
The only related thread I found is this one, where the author states that the routing is done by
"sending this audio data to the driver via a socket connection. So other apps that request audio from our virtual mic in fact get this audio from the user-space application that listens to the mic at the same time (so it should be active),"
but I'm not quite sure how to even start implementing something like that.
The whole process I went through for capturing data from the microphone seems quite long, and I was wondering if there's a more optimal way to do it. The book seems to be from 2012, with some corrections made in 2014. Has Core Audio changed dramatically since then, so that this process can now be achieved more easily, with just a few lines of code?
I think you'll get more results by searching for the term "play through" instead of "routing".
The Adamson / Avila book has an ideal play-through example that, unfortunately for you, only works when both input and output are handled by the same device (e.g. the built-in hardware on most Mac laptops and iPhone/iPad devices).
Note that there is another audio device concept called "playthru" (see kAudioDevicePropertyPlayThru and related properties) which seems to be a form of routing internal to a single device. I wish it were a property that let you set a forwarding device, but alas, no.
Some informal doco on this: https://lists.apple.com/archives/coreaudio-api/2005/Aug/msg00250.html
I've never tried it, but you should be able to connect input to output on an AUGraph like this. AUGraph is, however, deprecated in favour of AVAudioEngine, which, last time I checked, did not handle non-default input/output devices well.
I instead manually copy buffers from the input device to the output device via a ring buffer (TPCircularBuffer works well). The devil is in the detail, and much of the work is deciding what properties you want and their consequences. Some common and conflicting example properties:
minimal lag
minimal dropouts
no time distortion
In my case, if the output is lagging too far behind the input, I brutally dump everything bar 1 or 2 buffers (see the sketch below). There is also some dated Apple sample code called CAPlayThrough which elegantly speeds up the output stream; you should definitely check it out.
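To make the ring-buffer copy concrete, here is a condensed, untested sketch of the two render callbacks. It assumes TPCircularBuffer, an input AUHAL and an output unit already configured with matching stream formats, and that the ring was initialized once with TPCircularBufferInit(&gRing, 65536); the 64 KB size and kMaxBacklogBytes threshold are assumptions to tune for your lag/dropout trade-off.

    #include <string.h>
    #include <AudioToolbox/AudioToolbox.h>
    #include "TPCircularBuffer.h"

    static TPCircularBuffer gRing;      /* TPCircularBufferInit(&gRing, 65536) at startup */
    enum { kMaxBacklogBytes = 8192 };   /* assumed: roughly 1-2 buffers' worth */

    /* Producer side: call this from the input AUHAL's render callback
       with the bytes you just pulled out of AudioUnitRender(). */
    static void HandleCapturedAudio(const void *bytes, int32_t len)
    {
        TPCircularBufferProduceBytes(&gRing, bytes, len);
    }

    /* Consumer side: the output unit's render callback. */
    static OSStatus OutputCallback(void *refCon, AudioUnitRenderActionFlags *flags,
                                   const AudioTimeStamp *ts, UInt32 bus,
                                   UInt32 frames, AudioBufferList *ioData)
    {
        (void)refCon; (void)flags; (void)ts; (void)bus; (void)frames;

        int32_t want = (int32_t)ioData->mBuffers[0].mDataByteSize;
        int32_t avail;
        void *tail = TPCircularBufferTail(&gRing, &avail);

        /* Lag policy from the answer above: if we've fallen too far
           behind, brutally drop the backlog, keeping only fresh audio. */
        if (avail > want + kMaxBacklogBytes) {
            TPCircularBufferConsume(&gRing, avail - want);
            tail = TPCircularBufferTail(&gRing, &avail);
        }

        int32_t n = avail < want ? avail : want;
        if (n > 0)
            memcpy(ioData->mBuffers[0].mData, tail, n);
        if (n < want)  /* underrun: pad with silence instead of stalling */
            memset((char *)ioData->mBuffers[0].mData + n, 0, (size_t)(want - n));
        TPCircularBufferConsume(&gRing, n);
        return noErr;
    }

Keeping all produces on the input thread and all consumes (including the backlog dump) on the output thread preserves TPCircularBuffer's single-producer/single-consumer guarantee.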
And if you find a simpler way, please tell me!
Update
I found a simpler way:
create an AVCaptureSession that captures from your mic
add an AVCaptureAudioPreviewOutput that references your virtual device
When routing from microphone to headphones, it sounded like it had a few hundred milliseconds' lag, but if AVCaptureAudioPreviewOutput and your virtual device handle timestamps properly, that lag may not matter.

How should I transfer a large 30 MB file from wear to mobile?

I am currently using the DataApi and Asset class to transfer a 30 MB file from wear to mobile. I am using an IntentService, but the file never gets to the mobile. The wear device freezes and says "Application is not responding, do you want to wait?"
Should I use a SyncAdapter to send this over? I'm not sure of the best way to do this.
It is better to use the ChannelApi methods: first obtain a channel and then use Channel#sendFile() to send the file across. ChannelApi is built for transferring large files; it doesn't sync across all connected devices but sends only to the target you opened the channel to, and it also saves space on the sender side. If you don't need syncing across multiple connected devices, this is the API to use.

AT commands to send an image/audio clip through SMS

I have previously used AT commands to send a simple text message from a PC to a mobile via a GSM modem; this is the link I referred to. However, I was wondering if it is possible to send images or short audio clips using AT commands. I have searched a lot but couldn't find any source. Any kind of help will be appreciated.
I don't think it's possible with a plain GSM modem; you may need a GPRS modem instead. I'm also not sure there is a standard set of AT commands for sending MMS, although it's clear that some manufacturers have their own AT command sets. On my putal modem the commands are AT^UPLOADFILE, AT^MMSSEND and AT+EMMSEXE. You should refer to your own modem's development manual.
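For reference, the mechanics of driving a modem from a PC are the same as for plain SMS: open the serial port and write AT commands. Here is a rough sketch on Linux; the port path, baud rate, and the vendor-specific MMS command syntax (shown with "..." placeholders, per the commands mentioned above) are all assumptions you must adapt from your modem's manual.

    /*
     * Sketch: sending AT commands to a modem over a serial port on Linux.
     * AT^UPLOADFILE / AT^MMSSEND are vendor-specific (see above); their
     * exact arguments are omitted here and must come from the manual.
     */
    #include <fcntl.h>
    #include <stdio.h>
    #include <string.h>
    #include <termios.h>
    #include <unistd.h>

    static void send_at(int fd, const char *cmd)
    {
        char buf[256];
        write(fd, cmd, strlen(cmd));
        write(fd, "\r", 1);
        usleep(500 * 1000);                 /* crude: wait for the reply */
        int n = (int)read(fd, buf, sizeof buf - 1);
        if (n > 0) { buf[n] = '\0'; printf("%s -> %s\n", cmd, buf); }
    }

    int main(void)
    {
        int fd = open("/dev/ttyUSB0", O_RDWR | O_NOCTTY);  /* assumed port */
        if (fd < 0) { perror("open modem"); return 1; }

        struct termios tio;
        tcgetattr(fd, &tio);
        cfmakeraw(&tio);
        cfsetispeed(&tio, B115200);         /* assumed baud rate */
        cfsetospeed(&tio, B115200);
        tcsetattr(fd, TCSANOW, &tio);

        send_at(fd, "AT");                  /* sanity check */
        /* Vendor-specific MMS sequence; syntax is illustrative only: */
        send_at(fd, "AT^UPLOADFILE=...");   /* push the image to the modem */
        send_at(fd, "AT^MMSSEND=...");      /* then trigger the MMS send */

        close(fd);
        return 0;
    }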

Accessing/monitoring battery status through SMBus

I am currently trying to monitor my battery status through SMBus.
I have a battery along with a control board that constantly outputs the battery status.
This control board is connected to my motherboard through an I2C-USB module.
I need to write a program that recognizes the SMBus connection and reports the battery status to the user.
I'm a beginner when it comes to dealing with smart batteries and I2C/SMBus, and I'm somewhat lost as to how to approach this problem.
Any help or suggestions would be appreciated. Thanks.
Your question is a bit lacking. What kind of I2C-USB module? Or rather, does it come with a Linux driver? If it does, you probably won't need to write one; an application will do. You can read more about I2C and SMBus here.
Basically what you need is the I2C address of the control board (a single byte). Once you have the address, you (as the master) issue read commands over the I2C bus to the control board using its address and read the response. If there's a driver for the I2C-USB module, this should be straightforward enough: plug in the device, open() it (/dev/[i2c-usb-name], where [i2c-usb-name] is the name of the device), then follow the driver implementer's guide on how to set up and send data over that device (typically using read()/write() or ioctl()). Here is some additional information on working with I2C from user space: http://www.mjmwired.net/kernel/Documentation/i2c (select topics in the menu on the left-hand side).
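As a concrete illustration of that user-space approach, here is a short sketch using the libi2c helpers from i2c-tools (build with -li2c). It assumes the adapter's driver exposes /dev/i2c-0 and that the control board follows the Smart Battery spec at the usual slave address 0x0B; verify the device node, address, and register numbers against your hardware's documentation.

    /*
     * Sketch: reading Smart Battery registers over SMBus from user space.
     * Assumptions: /dev/i2c-0, slave address 0x0B, and SBS register
     * numbers; all three must be checked against your control board.
     */
    #include <fcntl.h>
    #include <i2c/smbus.h>
    #include <linux/i2c-dev.h>
    #include <stdio.h>
    #include <sys/ioctl.h>
    #include <unistd.h>

    #define BATTERY_ADDR 0x0B   /* typical Smart Battery slave address */
    #define REG_VOLTAGE  0x09   /* Voltage(), in mV, per the SBS spec */
    #define REG_CHARGE   0x0D   /* RelativeStateOfCharge(), in percent */

    int main(void)
    {
        int fd = open("/dev/i2c-0", O_RDWR);
        if (fd < 0) { perror("open bus"); return 1; }
        if (ioctl(fd, I2C_SLAVE, BATTERY_ADDR) < 0) { perror("set address"); return 1; }

        /* SMBus "read word": write the register number, read two bytes back. */
        int mv  = i2c_smbus_read_word_data(fd, REG_VOLTAGE);
        int soc = i2c_smbus_read_word_data(fd, REG_CHARGE);
        if (mv < 0 || soc < 0) { perror("smbus read"); return 1; }

        printf("Battery: %d mV, %d%% charged\n", mv, soc);
        close(fd);
        return 0;
    }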
If you must write the driver yourself, the first stop for a Linux device driver beginner is the LDD3. Read it, it's quite a pleasant read.