Regarding direction calculation using gyroscope, magnetometer and accelerometer (rotation-free)

We are using a MEMS tri-axial sensor that contains an accelerometer, magnetometer and gyroscope, and we have already calibrated the accelerometer and magnetometer. The sensor is used in a borehole application, and we have calculated the deviation of the borehole from the accelerometer. We are now stuck on computing a direction that is free from rotation (i.e. if the sensor rotates about its own axis, the reported direction should not change). Is it possible to calculate a rotation-free direction using the three readings (accelerometer, magnetometer, gyroscope)? If yes, please let me know.
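One common approach (a sketch, not specific to any particular sensor part): build an Earth-fixed North/East/Down frame from the calibrated gravity and magnetic-field vectors, then read the heading off that frame. Because the frame is anchored to gravity and the Earth's field rather than to the housing, rolling the tool about its own axis leaves the computed azimuth unchanged. The function name and axis convention below (sensor reports gravity on +z when level) are assumptions for illustration.

```python
import numpy as np

def tilt_compensated_heading(accel, mag):
    """Earth-referenced azimuth (degrees, 0-360) from one accelerometer
    and one magnetometer sample, both 3-vectors in the sensor frame.

    Assumes the accelerometer gives the local gravity ("down") direction
    when the tool is stationary. Rolling the tool about its own axis
    does not change the result, since the North/East/Down frame is
    rebuilt from gravity and the magnetic field at every sample."""
    down = accel / np.linalg.norm(accel)        # unit "down" in sensor frame
    east = np.cross(down, mag)                  # perpendicular to gravity and field
    east /= np.linalg.norm(east)
    north = np.cross(east, down)                # completes the right-handed frame
    # Heading of the sensor's x-axis projected into the horizontal plane.
    return np.degrees(np.arctan2(east[0], north[0])) % 360.0
```

Note that near-vertical boreholes are the hard case: when the tool axis aligns with gravity, the horizontal projection degenerates and the gyroscope is needed to carry the heading through that region.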
Thanks,
S.Naseer

Related

How can I combine properties of rotational data?

I have a project I'm working on that requires rotational data (yaw, pitch, roll) to be taken from a few different sensors to be combined in code.
The first sensor gives me good angles but has a very bad drift problem.
The second sensor has very good angles with minimal drift, but only a -90 to 90 degree range of motion.
My question is how can I combine these two sensors data so that I have minimum drifting and a 360° range?
I can provide sample data if needed.
Thanks in advance!
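A minimal way to combine two such sensors (a sketch, with illustrative names and a hand-picked gain, not from any library): treat the drift of the 360° sensor as a slowly varying bias, and whenever the accurate sensor is inside its ±90° range, nudge the bias estimate toward their difference. Outside that range, coast on the bias-corrected 360° reading alone.

```python
def fuse(angle_a, angle_b, bias, alpha=0.02):
    """One step of a simple bias-tracking fusion of two angle sensors.

    angle_a : reading (degrees) from the full-360-degree sensor that drifts
    angle_b : reading from the accurate sensor, or None when the motion
              is outside its -90..90 degree range
    bias    : running estimate of sensor A's drift offset
    Returns (fused_angle, updated_bias)."""
    corrected = (angle_a - bias) % 360.0
    if angle_b is not None and -90.0 <= angle_b <= 90.0:
        # shortest signed difference between the two estimates
        err = ((corrected - (angle_b % 360.0)) + 180.0) % 360.0 - 180.0
        bias += alpha * err          # slowly absorb the drift into the bias
        corrected = (angle_a - bias) % 360.0
    return corrected, bias
```

The gain alpha trades drift-correction speed against passing the second sensor's noise through; a Kalman filter does the same thing with a statistically chosen gain.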

3D Stereo_camera and 2D LiDAR pointclouds data fusion

I have two separate pointclouds (type sensor_msgs/PointCloud2) from two different sensors, a 3D stereo camera and a 2D LiDAR. I want to know how I can fuse these two pointclouds, given that the stereo pointcloud is 3D with a fixed length while the 2D LiDAR pointcloud has a variable length.
If someone has worked on it please help me, your help will be highly appreciated.
Thanks
I studied this in my research.
First, you have to calibrate the two sensors to recover their extrinsics. There are a few open-source packages you can play with, which I list below.
Second, fuse the data. The simple way is to apply the calibration transform and publish it with tf. The complicated way is to deploy pipelines such as depth-image-to-LiDAR alignment and depth-map variance estimation and fusion. You can choose the easier route of landmark-based EKF estimation, or follow Ji Zhang's (CMU) visual-LiDAR-inertial fusion work for direct 3D-feature-to-LiDAR alignment. The choice is yours.
(1)
http://wiki.ros.org/velo2cam_calibration
Guindel, C., Beltrán, J., Martín, D. and García, F. (2017). Automatic Extrinsic Calibration for Lidar-Stereo Vehicle Sensor Setups. IEEE International Conference on Intelligent Transportation Systems (ITSC), 674–679.
Pros: pretty accurate and an easy-to-use package. Cons: you have to make a rigid calibration board.
(2) https://github.com/ankitdhall/lidar_camera_calibration
LiDAR-Camera Calibration using 3D-3D Point correspondences, arXiv 2017
Pros: easy to use, easy to make the hardware. Cons: may not be as accurate.
There were a couple of others I listed in my thesis; I'll go back, check, and update here if I remember.
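Once the extrinsics are known, the "simple way" above boils down to expressing both clouds in one frame. A sketch (hypothetical extrinsic values, plain numpy instead of the ROS message types): the variable-length LiDAR scan is transformed point by point into the camera frame, after which the two arrays can simply be stacked, regardless of their lengths.

```python
import numpy as np

def transform_cloud(points, R, t):
    """Transform an (N, 3) array of LiDAR points into the camera frame
    using the extrinsic rotation R (3x3) and translation t (3,) from
    calibration. N may vary from scan to scan."""
    return points @ R.T + t

# Hypothetical extrinsics: LiDAR mounted 10 cm behind the camera,
# same orientation (identity rotation).
R = np.eye(3)
t = np.array([0.0, 0.0, 0.10])

lidar_points = np.array([[1.0, 0.0, 0.0],      # variable-length 2D scan
                         [0.0, 2.0, 0.0]])
camera_points = np.array([[0.5, 0.5, 1.0]])    # fixed-size stereo cloud

# Both clouds are now in the camera frame and can be concatenated.
fused = np.vstack([transform_cloud(lidar_points, R, t), camera_points])
```

In ROS you would normally publish R and t as a static tf transform and let tf2 do this per-point math when converting the PointCloud2 messages.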

How to detect the variation of coefficients in a matrix

I am trying to find a way to detect a hot object using Panasonic's Grid-EYE, a sensor that outputs an 8x8 matrix with a temperature in each cell.
Is there a common algorithm for detecting large variations in the coefficients of my matrix (allowing both very hot and very cold objects to be detected)?
In the attached image, for example, I'd like to detect the red squares. The algorithm should return "Object with temperature 28°C detected".
[Image: temperature profile when a human enters the camera's field of view (distance = 1 m)]
Thanks in advance for your answers :)
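A common first approach for this kind of frame is outlier detection against the frame's own statistics: flag any cell whose deviation from the mean exceeds a few standard deviations. Because the test is on the absolute deviation, it catches both very hot and very cold cells. A sketch (function name and the 2-sigma threshold are illustrative choices):

```python
import numpy as np

def detect_outlier_cells(frame, z_thresh=2.0):
    """Flag cells of an 8x8 Grid-EYE frame whose temperature deviates
    from the frame mean by more than z_thresh standard deviations.
    Returns (mask, temps): a boolean 8x8 mask and the flagged values."""
    mean, std = frame.mean(), frame.std()
    mask = np.abs(frame - mean) > z_thresh * std
    return mask, frame[mask]

frame = np.full((8, 8), 22.0)   # ambient background
frame[3:5, 3:5] = 28.0          # warm object (e.g. a person at 1 m)
mask, temps = detect_outlier_cells(frame)
for t in temps:
    print(f"Object with temperature {t:.0f}\u00b0C detected")
```

For objects spanning several cells you would typically follow this with connected-component labelling on the mask so each blob is reported once with, say, its maximum temperature.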

How to do navigation with a 3-axis accelerometer, a 3-axis gyroscope and GPS

As the title states, I want to implement dead reckoning with an accelerometer and a gyroscope. The accelerometer supplies linear acceleration, which gives velocity after one integration, or the distance travelled during a sample interval after two. Integrating the gyroscope output gives the change in heading. So, given an initial position, I can dead-reckon a new position from the distance and angle.
The idea is simple, but the reality is not. The accelerometer and gyroscope are unstable and are affected by temperature and misaligned axes. I know there is a popular method called the Kalman filter for combining these sensors with GPS to protect the navigation solution from output noise, but I think it is beyond my competence for now.
First, how do I remove the gravity component that is mixed into the raw accelerometer output?
Second, how do I correct the gyroscope error?
Last, how do I implement a Kalman filter with accelerometer, gyroscope and GPS?
Any suggestion is welcome; if you can give some code, that's best of all!
Thank you.
[Edit 2013/12/12]:
I have given up using the accelerometer to calculate velocity and distance because of its large drift, and because the error grows and grows under double integration. Luckily, this project is for a car, so I now read the velocity from the CAN bus instead; it proved much more accurate than the accelerometer. The velocity question is therefore solved, but I am still unsure about the others. Hoping for more good answers.
Look here: https://stackoverflow.com/a/19764828/2521214
And read the whole thing, including the comments!
Also view this: http://www.youtube.com/watch?v=C7JQ7Rpwn2k (the video link is from AlexWien's answer in the same thread).
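For the first sub-question (removing gravity), the usual scheme is: estimate the device's tilt (e.g. with a complementary or Kalman filter), express the known gravity vector in the sensor frame using that tilt, and subtract it from the raw reading. A minimal sketch, assuming a Z-up device that reads +g on its z-axis when level and at rest; the function name and Euler convention are illustrative, and sign conventions vary between IMUs:

```python
import math

def remove_gravity(ax, ay, az, roll, pitch, g=9.81):
    """Subtract the gravity component from one raw accelerometer
    sample (m/s^2), given roll and pitch in radians from a separate
    attitude estimate. Returns the linear acceleration 3-tuple."""
    # Gravity rotated into the sensor frame for this convention:
    gx = -g * math.sin(pitch)
    gy = g * math.sin(roll) * math.cos(pitch)
    gz = g * math.cos(roll) * math.cos(pitch)
    return ax - gx, ay - gy, az - gz
```

For the second sub-question, the standard trick is to estimate the gyro bias while the vehicle is known to be stationary (zero-rate update) and subtract it; a full GPS/INS Kalman filter then keeps both the bias and the tilt estimate up to date continuously.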

Motion Tracking using Sensor fusion

Currently I am using an accelerometer, gyroscope and magnetometer for a motion-tracking application. I have 9-DOF sensor-fusion functionality that calculates orientation and performs gravity cancellation on the accelerometer data. How do I now calculate the position of the object in three dimensions? Kindly suggest any algorithm that could give good accuracy.
An extended Kalman filter can give you the best results for motion tracking if you are working on a real-time application. I would suggest the book Multi-Sensor Data Fusion with MATLAB (CRC Press).
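To make the limitation concrete: once gravity is cancelled and the acceleration is rotated into the world frame, position comes from double integration, and any residual bias makes the position error grow quadratically with time. A sketch of that naive step (trapezoidal rule, illustrative function name), which in practice only holds between external position fixes such as an EKF measurement update:

```python
import numpy as np

def integrate_position(lin_acc, dt):
    """Double-integrate (N, 3) world-frame, gravity-free acceleration
    samples at fixed spacing dt into velocity and position using the
    trapezoidal rule. Starts from zero velocity and position; bias
    errors grow quadratically with time, so this is short-term only."""
    vel = np.cumsum((lin_acc[:-1] + lin_acc[1:]) * 0.5 * dt, axis=0)
    vel = np.vstack([np.zeros(3), vel])
    pos = np.cumsum((vel[:-1] + vel[1:]) * 0.5 * dt, axis=0)
    pos = np.vstack([np.zeros(3), pos])
    return vel, pos
```

This is why practical motion tracking needs an aiding measurement (GPS, vision, a zero-velocity detector) feeding the filter; orientation alone does not bound the position drift.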
