Custom camera sensor driver is not working alongside display driver - linux-kernel

I am trying to bring up an OV7695 camera driver on the i.MX8MM EVK. It is a MIPI-based camera. After the initial development, the camera driver works when the display driver is disabled, but it stops working when I enable the display driver.
Setup
IMX8MM board
OV7695 camera sensor driver
MIPI to HDMI converter
HDMI display
Buildroot
Linux - 5.4
GStreamer to stream the video to the display and store video.
v4l2grab to capture images.
fb-test to test the display.
Use Cases
Use Case-1:
Boot the device with both the camera and the display connected. The display works; the camera does not (streaming to the display, image capture, and video storage all fail).
Use Case-2:
Boot the device with the camera connected and the display not connected.
There is no display; the camera works (saving images and video works; streaming to the display cannot be tested).
Use Case-3:
Boot the device with only the camera connected, without the display. The camera works (saving images and video works; streaming to the display cannot be tested). Connect the HDMI display: the camera stops working, and the display works.
Use Case-4:
Boot the device with the camera connected, without the display. Video and image storage work. Start video streaming with GStreamer, then connect the HDMI display: video is streamed to the HDMI display.
Software Analysis
When the display driver is loaded, it creates the /dev/fb0 node.
The camera driver uses the /dev/video0 node.
When image/video capture is not working, it fails at the V4L2 ioctl VIDIOC_DQBUF. We are using the v4l2grab utility to store images.
https://github.com/twam/v4l2grab/blob/master/v4l2grab.c#L284
It returns EAGAIN in a loop, and the application waits forever (because of the application logic, not the driver). The device enters this state once the display driver is loaded (/dev/fb0 is created).
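For context, here is a minimal sketch of how such a capture loop typically waits before dequeuing a buffer (this is not the author's code; it only assumes /dev/video0 was opened non-blocking and streaming was already started, as v4l2grab does). If select() keeps timing out here while /dev/fb0 exists, the sensor/CSI pipeline is not delivering frames at all, which matches the EAGAIN behavior described above:

```c
/* Minimal sketch: wait for a frame with select() before VIDIOC_DQBUF.
 * Assumes fd was opened on /dev/video0 with O_NONBLOCK and that
 * VIDIOC_STREAMON has already been issued, as v4l2grab does.
 */
#include <errno.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <sys/select.h>
#include <linux/videodev2.h>

static int dequeue_frame(int fd, struct v4l2_buffer *buf)
{
    fd_set fds;
    struct timeval tv = { .tv_sec = 2, .tv_usec = 0 };

    FD_ZERO(&fds);
    FD_SET(fd, &fds);

    /* Block until the driver signals a filled buffer or the timeout hits. */
    int r = select(fd + 1, &fds, NULL, NULL, &tv);
    if (r == 0) {
        fprintf(stderr, "timeout: no frame from the sensor/CSI pipeline\n");
        return -1;
    }
    if (r < 0)
        return -1;

    memset(buf, 0, sizeof(*buf));
    buf->type   = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    buf->memory = V4L2_MEMORY_MMAP;

    /* EAGAIN here despite a successful select() would point at the driver. */
    if (ioctl(fd, VIDIOC_DQBUF, buf) < 0) {
        fprintf(stderr, "VIDIOC_DQBUF: %s\n", strerror(errno));
        return -1;
    }
    return 0;
}
```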
I would appreciate it if someone could help me extend the debugging effort.
Thanks,
Avijit

Related

using joystick on one raspberry pi to control input on second raspberry pi

I have a Raspberry Pi 4 (let's call it the server) that's operating servos and motors on a boat. I've managed to control the boat using another Raspberry Pi 4 (let's call it the client) connected via SSH. The first remote control was via a touchscreen and GUI, which ended up pretty useless. I want to try controlling it with a joystick, which I've placed on the client side. My problem is how to use the joystick signal that I get from a script on the client as an input signal on the server side. Is there a way to do that via SSH or something else? In the long term there will be a camera, radar, and sensors placed on the boat. Is a websocket the best solution for this type of remote control?
Personally, I think I'd set up MQTT for this, which is ideal for transferring short messages and doesn't require absolute "lock-step" communication back and forth. That means you can start in any order and the second half doesn't necessarily need to be present.
I'd install mosquitto on either RasPi, and then:
let the RasPi with the devices attached subscribe to a "joystick" topic so it gets all joystick movements, and
let the RasPi with the joystick attached publish each movement on that topic.
Enter [mqtt] in the StackOverflow search box for examples.
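To make that concrete, here is a minimal sketch of the subscriber side in C using libmosquitto (the broker address, topic name, and payload format are placeholders, not part of the original answer):

```c
/* Minimal libmosquitto sketch of the "joystick" topic idea above.
 * Assumes a mosquitto broker is running on the Pi that drives the
 * servos (BROKER_IP is a placeholder). Build: gcc joy_sub.c -lmosquitto
 */
#include <stdio.h>
#include <stdbool.h>
#include <mosquitto.h>

#define BROKER_IP "192.168.1.10"   /* placeholder: Pi running mosquitto */

static void on_message(struct mosquitto *mosq, void *obj,
                       const struct mosquitto_message *msg)
{
    /* Each joystick movement arrives as a short text payload,
     * e.g. "x=12 y=-30"; parse it and drive the servos here. */
    printf("joystick: %.*s\n", msg->payloadlen, (char *)msg->payload);
}

int main(void)
{
    mosquitto_lib_init();

    struct mosquitto *mosq = mosquitto_new("boat-server", true, NULL);
    mosquitto_message_callback_set(mosq, on_message);

    if (mosquitto_connect(mosq, BROKER_IP, 1883, 60) != MOSQ_ERR_SUCCESS) {
        fprintf(stderr, "could not reach broker\n");
        return 1;
    }

    /* Subscribe once; the client Pi publishes to the same topic with
     * mosquitto_publish(mosq, NULL, "joystick", len, payload, 0, false); */
    mosquitto_subscribe(mosq, NULL, "joystick", 0);

    /* Handles reconnects and dispatches messages to on_message(). */
    mosquitto_loop_forever(mosq, -1, 1);

    mosquitto_destroy(mosq);
    mosquitto_lib_cleanup();
    return 0;
}
```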
Alternatively, you could use Redis "pub/sub" just the same.
Enter [redis] in the StackOverflow search box for examples.

ESP32 AsyncWebServer combined with WifiClient communicating with same IP address

My setup is as follows: an ESP32 devkit and a Sonos music player. The idea is to control the Sonos box using a WiFiClient. With the ESP32 and its connected buttons, I send commands to the Sonos to play a certain music file, set the volume, and many other things. This works well.
The music files are just URLs that tell the Sonos where to fetch the music file from.
The next step was to add an SD card to my ESP32 and store music files on the SD. I used ESPAsyncWebServer, and I can retrieve the music files from my ESP32. The Sonos plays them perfectly.
It's the combination that fails sometimes. While the async web server does its job and sends out the music data, and the ESP is talking to the same Sonos to change the volume or add a song to the playlist/queue, it sometimes hangs right there: the WiFiClient is no longer able to connect or reconnect, and no more music data is sent out. The ESP32 does not really crash; it still functions, the buttons keep working, etcetera.
My WiFiClient only connects to port 1400 on the Sonos. The Sonos gets the music from the web server on port 80.
The bare question is, is it even possible to use WiFiClient objects together with the ESPAsyncWebServer when both are connected to the same IP address (the Sonos box) ?
Thanks in advance for any pointers ;-)

routing audio data from application to virtual audio driver in MAC OS

I'm very new to macOS. I want to route the audio data captured from the real hardware mic/speaker to a virtual audio driver (a null audio driver). How can I invoke the driver from my application, and how do I communicate between the driver and the application?
For this task, which method should I follow? Any recommendations of existing approaches are welcome.
Any help / suggestions will be appreciated.

Initial Connection for network setup from Smartphone to raspberry

I have a Raspberry Pi 3 Model B
OS: Raspbian (Jessie)
I want to create a connection from my smartphone to a headless Raspberry Pi, and connect the headless Raspberry Pi to a network (the one the smartphone is connected to),
like the vacuum cleaner robot (iRobot), a web camera that has an app, or any device that needs an initial connection to set up its network.
The initial connection needs to be made from a remote device, mainly a smartphone app.
What did I try:
1. Bluetooth - I've managed to set up Bluetooth on the Raspberry Pi. When I try to connect, a popup appears on the Raspberry Pi asking to confirm the connection; because the Raspberry Pi needs to be headless, this option is not good for us.
If I set up Bluetooth as follows: sudo hciconfig hci0 sspmode 0
then when I try to connect to the Raspberry Pi, a popup appears on the smartphone requiring a PIN code. This option might be good, if I knew the PIN code.
2. P2P - I tried to use Wi-Fi Direct from my smartphone.
I've created a file called p2p.conf inside /etc/wpa_supplicant
and inside this file I wrote the following:
ctrl_interface=/var/run/wpa_supplicant
update_config=1
device_name=raspberry
device_type=1-0050F204-1
driver_param=use_p2p_group_interface=1
p2p_go_intent=1
p2p_go_ht40=1
Then I stopped wpa_supplicant and started it with:
sudo wpa_supplicant -B -iwlan0 -Dnl80211 -c/etc/p2p.conf
I started wpa_cli
and then ran p2p_find,
and it started scanning.
When I search for the Raspberry Pi from the smartphone, I find it; when I try to connect, it asks me for a PIN code, and the PIN code is shown in the Raspberry Pi's wpa_cli.
I have two problems with this option:
1. the network needs to be connected.
2. the PIN code changes, and I can't make it fixed.
Is there a way to make an initial connection from the smartphone to a headless Raspberry Pi in order to set up its network?

Using Sony Camera API on AndroidWear

I want to create an app that can start and stop recording on a Sony AS100VR camera using the Camera Remote API.
I can get this working from my Nexus using a direct Wi-Fi connection, but when I establish a direct Wi-Fi connection from my Sony smartwatch, it fails at the SSDP detection stage.
It's definitely connected to the camera's SSID over Wi-Fi, but it can't detect the camera.
I have tried playing with retries and timeout values, but I have sort of run out of ideas.
It falls into the catch (InterruptedIOException e) block with a java.net.SocketTimeoutException.
Any suggestions gratefully appreciated!
UDP multicast is not available on the smartwatch, so SSDP discovery fails.
There is a fail-safe approach for any UPnP-based application:
In most cases the resource URL structure stays the same except for the IP address, so when SSDP discovery fails, let the user directly input the IP address (perhaps in the form of a UI picker), fetch "DeviceDescription.xml" or its equivalent, and then set up the services.
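As a rough illustration of that fallback (reduced to plain sockets; the IP address, port, and path below are placeholders, not Sony-confirmed values), fetching the device description by hand is just an HTTP GET against the user-supplied address:

```c
/* Sketch of the SSDP fallback: fetch the UPnP device description
 * directly once the user has entered the camera's IP address.
 * IP, port, and path are placeholders, not Sony-confirmed values.
 */
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>

int main(void)
{
    const char *ip = "192.168.122.1";   /* user-supplied camera address */
    char request[256];
    snprintf(request, sizeof(request),
             "GET /DeviceDescription.xml HTTP/1.1\r\n"
             "Host: %s\r\n"
             "Connection: close\r\n\r\n", ip);

    int sock = socket(AF_INET, SOCK_STREAM, 0);
    struct sockaddr_in addr = { .sin_family = AF_INET,
                                .sin_port   = htons(8080) };
    inet_pton(AF_INET, ip, &addr.sin_addr);

    if (connect(sock, (struct sockaddr *)&addr, sizeof(addr)) < 0) {
        perror("connect");
        return 1;
    }
    write(sock, request, strlen(request));

    /* Print the description XML; a real app would parse the service
     * URLs out of it and continue exactly as after SSDP discovery. */
    char buf[1024];
    ssize_t n;
    while ((n = read(sock, buf, sizeof(buf) - 1)) > 0) {
        buf[n] = '\0';
        fputs(buf, stdout);
    }
    close(sock);
    return 0;
}
```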
Have you taken a look at the CameraRemoteSampleApp that comes with the Camera Remote API SDK? I assume when you say Smartwatch you mean you are using a Sony SmartWatch 3 that supports a direct WiFi connection? If so, you should be able to modify the sample app with minimal changes and run it on the SW3.
