I have a small assignment that consists of reading the luminance from a sensor (GND, VCC, sig).
How can I connect the cables to the board? What constraints file do I need?
Has anybody found an example?
Thank you!
Here are the images of the board and sensor:
board
sensor
The XADC is component number 14 in the image below.
Diagram
So, I connected the sensor's pins to the board's pins (GND, VCC, sig).
After that, I created a module file and I configured the constraints file to map the AD14 pin.
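For reference, a constraints fragment of the kind meant here might look like the sketch below. The pin locations `J3`/`K3` are placeholders, not real board pins; `vauxp14`/`vauxn14` are the auxiliary-channel-14 ports generated by Xilinx's XADC wizard, which is what "AD14" maps to. Replace the placeholders with the package pins your particular board routes to that channel (check the board's master XDC file).

```tcl
## Hypothetical pin locations -- replace J3/K3 with your board's actual package pins
set_property PACKAGE_PIN J3 [get_ports vauxp14]
set_property PACKAGE_PIN K3 [get_ports vauxn14]
set_property IOSTANDARD LVCMOS33 [get_ports vauxp14]
set_property IOSTANDARD LVCMOS33 [get_ports vauxn14]
```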
I am using a PIC24FJ128GA010 to read the microphone values. When I read the raw values from the sensor I get about 270, but over time the value goes down to 90 and stops there. It is powered from 3.3 V, and when noise is made the values still change, even while the reading is slowly drifting down to 90. I am not sure why this is the case. I just have 3.3 V, ground, and an analog input pin connected to the microphone. The microphone, I believe, picks up signal from -1.7 V to 1.7 V.
I am setting AD1CON2 = 0x0000; where AVss and AVdd is used as Vref.
Is there a way I can fix the readings from the microphone so they display 90 right away? I know that 90 represents 0, and I will probably need to offset that so the values are read just on the positive side. Would using a capacitor help fix this issue?
I am very new to programming microcontrollers and would appreciate any help.
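The drift is consistent with a coupling capacitor slowly charging; the usual hardware fix is to bias the microphone output at mid-supply with a resistor divider. On the software side you can also track the slowly moving DC baseline and subtract it, so the quiet level reads 0 regardless of drift. A minimal sketch of that idea (my own suggestion, not from the thread; plain C, not PIC24-specific):

```c
#include <assert.h>
#include <stdint.h>

/* How slowly the baseline follows the signal: larger = slower. */
#define BASELINE_SHIFT 6

/* Baseline estimate in fixed point, scaled by 2^BASELINE_SHIFT. */
static int32_t baseline_q = 0;

/* Exponential moving average of the raw ADC reading tracks the slow
 * DC drift; returning raw minus baseline centers quiet input at 0. */
static int16_t remove_dc(int16_t raw)
{
    baseline_q += raw - (baseline_q >> BASELINE_SHIFT);
    return (int16_t)(raw - (baseline_q >> BASELINE_SHIFT));
}
```

Feed every ADC sample through `remove_dc()`; after a settling period (a few hundred samples here), a steady input such as the quiet-level 90 comes out as 0, while audio wiggles appear as positive and negative excursions around it.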
I have two separate point clouds (type = sensor_msgs/PointCloud2) from two different sensors, a 3D stereo camera and a 2D LiDAR. I wanted to know how I can fuse these two point clouds, given that the stereo point cloud has a fixed length while the 2D LiDAR point cloud has a variable length.
If someone has worked on this, please help; it will be highly appreciated.
Thanks
I studied this in my research.
The first step is to calibrate the two sensors to find their extrinsics. There are a few open-source packages you can play with, which I listed below.
The second step is to fuse the data. The simple way is to just apply the calibration transform and publish it with tf. The complicated way is to deploy pipelines such as depth-image-to-LiDAR alignment and depth-map variance estimation and fusion. You can do it the easy way, e.g. a landmark-based EKF estimation, or you can follow Ji Zhang's (CMU) visual-LiDAR-inertial fusion work for direct 3D-feature-to-LiDAR alignment. The choice is yours.
(1) http://wiki.ros.org/velo2cam_calibration
Guindel, C., Beltrán, J., Martín, D. and García, F. (2017). Automatic Extrinsic Calibration for Lidar-Stereo Vehicle Sensor Setups. IEEE International Conference on Intelligent Transportation Systems (ITSC), 674–679.
Pros: pretty accurate and an easy-to-use package. Cons: you have to fabricate a rigid calibration board.
(2) https://github.com/ankitdhall/lidar_camera_calibration
LiDAR-Camera Calibration using 3D-3D Point correspondences, arXiv 2017
Pros: easy to use and easy to make the hardware. Cons: may not be as accurate.
There were a couple of others I listed in my thesis; I'll go back, check for them, and update here, if I remember.
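As a minimal illustration of the "simple way" above: once the extrinsic calibration is known, every LiDAR point is mapped into the camera frame by a rigid-body transform, after which the two clouds can simply be concatenated (the variable LiDAR length doesn't matter, since points are transformed one by one). A sketch, with made-up rotation/translation values (in ROS, tf or pcl_ros does this step for you):

```c
#include <assert.h>

/* Extrinsic calibration between the sensors: p_cam = R * p_lidar + t */
typedef struct {
    double R[3][3]; /* rotation matrix  */
    double t[3];    /* translation, same units as the points */
} Extrinsic;

/* Map one LiDAR point into the camera frame. */
static void transform_point(const Extrinsic *e, const double p[3], double out[3])
{
    for (int i = 0; i < 3; ++i)
        out[i] = e->R[i][0] * p[0] + e->R[i][1] * p[1]
               + e->R[i][2] * p[2] + e->t[i];
}
```

In practice you would loop `transform_point` over the LiDAR cloud and append the results to the stereo cloud's point buffer before republishing the merged PointCloud2.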
How to use ad7490 ADC in Android Things?
I have a 16x16 sensor matrix. For this sensor matrix I want to read the data of each cell and its position. I want to connect the AD7490 ADC via SPI and GPIO pins.
My requirement is: how do I read the AD7490 A/D value over the SPI pins?
Note: I have also implemented this on an Arduino Uno using a 7495 shift register IC and a 74HC4067 mux. With these I am able to generate the 16x16 matrix and read the analog data of each position. Now I want to implement the same thing in Android Things with the AD7490.
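On Android Things you would clock the AD7490 through the `SpiDevice` API (a two-byte full-duplex transfer per conversion) and then decode the returned 16-bit frame. Per my reading of the AD7490 datasheet, in the default single-ended, straight-binary mode the frame carries the 4-bit channel address in the top bits and the 12-bit result in the low bits; verify this against the datasheet for your configuration. A decoding sketch in C:

```c
#include <assert.h>
#include <stdint.h>

/* Decode one 16-bit AD7490 DOUT frame (assumed format: bits [15:12] are
 * the channel address of this result, bits [11:0] the 12-bit conversion,
 * straight-binary coding). */
static void ad7490_decode(uint16_t frame, uint8_t *channel, uint16_t *value)
{
    *channel = (uint8_t)(frame >> 12);  /* which of the 16 inputs this is */
    *value   = frame & 0x0FFF;          /* 12-bit conversion result       */
}
```

The frame itself would come from assembling the two received bytes, e.g. `(rx[0] << 8) | rx[1]`, after an SPI transfer that also shifts out the control word selecting the next channel.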
I'm working on a school project, an "automatic railway system".
My project is supposed to close the gate when the train is coming into the station, with a buzzer on, a 90-second countdown shown on a 7-segment display, and an LED flashing.
After the train leaves the station, the gate opens and the buzzer and LED turn off.
I tried to use a DC motor to open and close the gate, but it didn't give me the accurate angle that I need, so I'm trying to use a servo motor.
I need it to open the gate at position zero and close it at position 90.
All the code I found on the internet uses PWM and timers, which I didn't cover in my course, so can anyone help me do this with simple code, please?
I'm using an ATmega32 running at 16 MHz.
It depends on your analog servo's PWM frequency specification. Once you know the servo's specification, you can set up the PWM using the built-in features of the CodeVisionAVR compiler, or you can do some research on the PWM registers.
Here is an example of a PWM setup:
// using OC0 (PB3)
DDRB.3 = 1;          // set PB3 as output (CodeVisionAVR bit-access syntax)
TCCR0 = 0b01110001;  // Phase Correct PWM mode, no prescaler, inverted output
TCNT0 = 0;           // reset the counter

// to assign a value to your PWM:
OCR0 = 127;          // ~50% duty cycle, since it is an 8-bit timer
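Note that Timer0 with no prescaler runs this PWM at roughly 31 kHz, far faster than a standard hobby servo expects; servos typically want a ~50 Hz frame with a 1-2 ms high pulse, which is why the 16-bit Timer1 (Fast PWM with ICR1 as TOP) is the usual choice on the ATmega32. As a sanity check of the timer arithmetic (my own addition, plain C math rather than register code; prescaler 8 is an assumption):

```c
#include <assert.h>
#include <stdint.h>

#define F_CPU     16000000UL  /* ATmega32 clock from the question   */
#define PRESCALER 8UL         /* assumed Timer1 prescaler           */
#define SERVO_HZ  50UL        /* standard analog-servo frame rate   */

/* Timer ticks for a pulse of 'us' microseconds at F_CPU / PRESCALER
 * (16 MHz / 8 = 2 MHz, i.e. 2 ticks per microsecond). */
static uint16_t ticks_for_us(uint32_t us)
{
    return (uint16_t)(F_CPU / PRESCALER / 1000000UL * us);
}

/* TOP value (ICR1) for a 50 Hz frame; the timer counts 0..TOP, so -1. */
static uint16_t servo_top(void)
{
    return (uint16_t)(F_CPU / PRESCALER / SERVO_HZ - 1);
}
```

With these numbers, ICR1 = 39999 gives the 50 Hz frame, and on many servos OCR1A = `ticks_for_us(1000)` (1 ms) sits near the 0 degree end while `ticks_for_us(1500)` (1.5 ms) is near 90 degrees; the exact pulse-to-angle mapping must come from your servo's datasheet.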
I am interested in determining the unit of measurement from my point cloud.
I researched and found this, where it says the unit depends on the device used to gather the data.
I have a 64e Velodyne LiDAR system. However, I can't find anywhere that mentions what units it gathers data in (I might just be overlooking it).
Here is a link to their user manual: manual
The 64e Velodyne LiDAR system's manual mentions the range of the sensor in meters (see page 2, second paragraph):
The HDL-64E is rated to provide usable returns up to 120 meters.
Appendix C of the manual describes their software for reading and visualizing lidar data, the Digital Sensor Recorder (DSR). It says on page 14, last paragraph:
Note: In live display mode, click on the double arrow button to begin display. The concentric gray circles and grid lines represent 10 meter increments from the sensor, which is depicted on the screen by a white circle.
You can use the sample data that comes on the DSR software CD to extract descriptive information and statistics from the cloud and check the units yourself.
A few months ago, I posted a tutorial about how one could do that using FUSION, the US Forest Service's software for lidar processing. See here:
Extracting descriptive information from a LiDAR cloud (.las files)