Localization using ultrasonic sensors - algorithm

I don't know if this is the right place to ask this question.
I am working on a project in which I have to use only ultrasonic sensors to do simultaneous localization and mapping (SLAM) for a robot. I have 8 such sensors. Assume that I have enough computation power and only this limited sensing capability (the 8 ultrasonic sensors).
What would be an appropriate algorithm to use in this case?

I found SLAM for Dummies to be very helpful.

As your question already indicates, the class of algorithms to use is SLAM.
There are many possible SLAM implementations; see http://openslam.org for a collection.
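If you end up implementing one yourself, a common building block for sonar-based SLAM is an occupancy grid updated in log-odds form from each range reading, with localization (e.g. a particle filter or EKF) run against that map. Below is a minimal sketch of just the grid-update step, assuming a pose estimate is already available; the struct name, grid sizes and log-odds constants are illustrative, not taken from any particular library.

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Minimal log-odds occupancy grid updated from one ultrasonic reading.
// In a full SLAM system the sensor pose would come from the localization
// step; here it is simply passed in.
struct SonarGrid {
    int width, height;
    double resolution;              // metres per cell
    std::vector<double> logOdds;    // row-major, 0 = unknown

    SonarGrid(int w, int h, double res)
        : width(w), height(h), resolution(res), logOdds(w * h, 0.0) {}

    // sx, sy: sensor position; beamAngle: beam direction (rad);
    // range: measured distance; maxRange: sensor's maximum range.
    void update(double sx, double sy, double beamAngle,
                double range, double maxRange) {
        const double lFree = -0.4;  // log-odds decrement for free cells
        const double lOcc  = 0.85;  // log-odds increment for the hit cell
        const double step  = resolution * 0.5;
        // Cells between the sensor and the echo are probably free.
        for (double d = 0.0; d < std::min(range, maxRange); d += step) {
            mark(sx + d * std::cos(beamAngle),
                 sy + d * std::sin(beamAngle), lFree);
        }
        // If the echo returned before max range, the end cell is occupied.
        if (range < maxRange) {
            mark(sx + range * std::cos(beamAngle),
                 sy + range * std::sin(beamAngle), lOcc);
        }
    }

    void mark(double x, double y, double delta) {
        int cx = static_cast<int>(x / resolution);
        int cy = static_cast<int>(y / resolution);
        if (cx < 0 || cy < 0 || cx >= width || cy >= height) return;
        logOdds[cy * width + cx] += delta;
    }
};
```

A real ultrasonic implementation would also spread the update over the sensor's wide beam cone rather than a single ray, which matters a lot when you only have 8 sonars.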

Related

What is the more common way to build up a robot control structure?

I’m a college student and I’m trying to build an underwater robot with my team.
We plan to use an stm32 and an RPi. We will put our controller on the stm32 and the high-level algorithms (like path planning, object detection, …) on the RPi. The reason we designed it this way is that the controller needs to run fast, while the high-level algorithms need more computing overhead.
But later I found out there are tons of packages on ROS that support IMUs and other attitude sensors. Therefore, I assume many people build their controller on a board that can run ROS, such as the RPi.
As far as I know, the RPi is slower than the stm32 and has fewer ports to connect to sensors and motors, which makes me think that the RPi is not a desirable place to run a controller.
So I'm wondering: did I design it all wrong?
Robot applications vary so much, and the suitable structure depends heavily on the use case, so it is difficult to give a standard answer; I'll just share my thoughts for your reference.
In general, I think a Linux SBC (e.g. RPi) + MCU controller (e.g. stm32/esp32) is a good solution for many use cases. I personally use RPi + ESP32 for a few robot designs, for these reasons:
Linux is not a good realtime OS, while an MCU is good at handling time-critical tasks like motor control and IMU filtering (see the sketch after this list);
Some protection mechanisms need to stay reliable even when the central "brain" hangs or the whole system runs into low voltage;
MCUs are cheaper, smaller and flexible enough to distribute to any part of the robot, which also supports modular design thinking;
Many newer MCUs are actually powerful enough to handle sophisticated tasks and can offload a lot from the central CPU.
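To make that split concrete, here is a minimal sketch of the kind of code that can live on the MCU side: it accepts velocity setpoints over the serial link from the SBC, runs the control loop locally, and stops the motor if the commands stop arriving. The message format, pin, gains and timing are all made up for illustration (Arduino-style C++, which also runs on an ESP32), not a drop-in design.

```cpp
// Hypothetical MCU-side sketch: the Linux SBC sends lines like "V 0.35\n"
// with a wheel-speed setpoint in m/s; the MCU runs a PI loop at a fixed
// rate and fails safe if the SBC goes quiet.
const int MOTOR_PWM_PIN = 5;             // illustrative pin number
const unsigned long CMD_TIMEOUT_MS = 300;

float setpoint = 0.0f;                   // desired wheel speed (m/s)
float kp = 2.0f, ki = 0.5f;              // made-up gains
float integral = 0.0f;
unsigned long lastCmdMs = 0;

float readWheelSpeed() {
    return 0.0f;                         // placeholder: encoder feedback here
}

void setup() {
    Serial.begin(115200);                // link to the Linux SBC
    pinMode(MOTOR_PWM_PIN, OUTPUT);
}

void loop() {
    // 1. Accept new setpoints from the SBC, if any.
    if (Serial.available()) {
        String line = Serial.readStringUntil('\n');
        if (line.startsWith("V ")) {
            setpoint = line.substring(2).toFloat();
            lastCmdMs = millis();
        }
    }
    // 2. Protection: if the "brain" goes silent, stop the motor.
    if (millis() - lastCmdMs > CMD_TIMEOUT_MS) setpoint = 0.0f;

    // 3. Fixed-rate PI control, independent of what Linux is doing.
    float error = setpoint - readWheelSpeed();
    integral += error * 0.01f;           // 10 ms loop period
    float u = kp * error + ki * integral;
    analogWrite(MOTOR_PWM_PIN, constrain((int)(u * 255), 0, 255));
    delay(10);
}
```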

How does signal to noise ratio work in the OMNeT++?

I wonder how I can implement surrounding noise signals in OMNeT++.
As far as I know, there is an SNR box where I can input a number as a percentage, but I can't find it in the manual.
You are looking in the wrong place. OMNeT++ itself does not implement anything regarding SNR, because it is a generic discrete event simulator that does not know anything about domain models.
What you are referring to is the INET framework, which is a network simulation model written for OMNeT++. So you have to look for documentation on the model's website, not in the generic OMNeT++ manual. There is detailed documentation about transmission modeling and noise models, for example here: https://inet.omnetpp.org/docs/users-guide/ch-transmission-medium.html#background-noise-models
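As a side note, SNR in these radio models is not a percentage but a power ratio, usually expressed in dB, and the background noise model is what supplies the noise-power term. A generic illustration of the arithmetic (deliberately not tied to INET's actual API):

```cpp
#include <cmath>
#include <cstdio>

// Generic illustration: SNR is the ratio of received signal power to noise
// power, usually reported in decibels.
double snrDb(double signalPowerW, double noisePowerW) {
    return 10.0 * std::log10(signalPowerW / noisePowerW);
}

int main() {
    const double rxSignal = 1e-9;   // 1 nW received signal (made-up value)
    const double noise    = 1e-12;  // 1 pW background noise (made-up value)
    std::printf("SNR = %.1f dB\n", snrDb(rxSignal, noise));  // prints 30.0 dB
    return 0;
}
```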

How is the bug algorithm implemented on an obstacle avoidance robot using the Arduino platform?

The robotic buggy is based on the Arduino Mega and has a single ultrasonic sensor at the front, with two Sharp IR sensors, one on each side.
I want to implement the Bug2 algorithm but I have no idea how to begin. I've looked at the source code on this site but it makes no sense to me.
Does anyone know of any sources that provides simple to follow instructions on how to implement the algorithm?
From what I saw in this tutorial, beyond prior knowledge of programming you also need a basic knowledge of linear algebra; if you already have extensive programming knowledge, what remains is studying linear algebra, trigonometry and so on.
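As a more concrete starting point, Bug2 is easiest to think of as a two-state machine: drive along the straight line from start to goal (the "m-line") until the front sensor reports an obstacle, then follow the obstacle's boundary until you are back on the m-line and closer to the goal than where you first hit it. Below is a heavily simplified Arduino-style sketch of that state machine; it assumes you have some position estimate (e.g. wheel odometry) and it stubs out all the hardware-specific parts, so every function name, pin and threshold here is a placeholder rather than working code for your buggy.

```cpp
// Hypothetical Bug2 top-level state machine (Arduino-style C++).
enum BugState { GO_TO_GOAL, FOLLOW_WALL };
BugState state = GO_TO_GOAL;

const float OBSTACLE_CM = 20.0;       // front ultrasonic trigger distance
const float goalX = 2.0, goalY = 0.0; // goal in metres; robot starts at (0,0)
float hitX = 0.0, hitY = 0.0;         // where the obstacle was first met

// Placeholders: replace with your odometry, ultrasonic and Sharp IR code.
float poseX() { return 0.0; }
float poseY() { return 0.0; }
float frontDistanceCm() { return 100.0; }
void driveTowards(float x, float y) {}   // steer toward a point
void followWallLeft() {}                 // keep the side IR at a set range

float distToGoal(float x, float y) {
    float dx = goalX - x, dy = goalY - y;
    return sqrt(dx * dx + dy * dy);
}

// Distance from (x, y) to the start-goal line (the "m-line").
float distToMLine(float x, float y) {
    return fabs(x * goalY - y * goalX) / distToGoal(0.0, 0.0);
}

void setup() {}

void loop() {
    float x = poseX(), y = poseY();
    switch (state) {
        case GO_TO_GOAL:
            if (frontDistanceCm() < OBSTACLE_CM) {
                hitX = x; hitY = y;        // remember the hit point
                state = FOLLOW_WALL;
            } else {
                driveTowards(goalX, goalY);
            }
            break;
        case FOLLOW_WALL:
            followWallLeft();
            // Bug2 leave condition: back on the m-line and strictly closer
            // to the goal than at the hit point.
            if (distToMLine(x, y) < 0.05 &&
                distToGoal(x, y) < distToGoal(hitX, hitY) - 0.05) {
                state = GO_TO_GOAL;
            }
            break;
    }
}
```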

Advice for interfacing strain gauges to PC

I'm using an Arduino to excite and amplify strain gauges on a rod; the resulting voltage will be picked up by the analog inputs available on the Arduino. I need to plot the torque taken by that rod against time on a graph, and the easiest way I see to do this is using the Processing language, as the basic Arduino environment does not provide a graphical display.
Any tips on where to start? I only have prior experience with MATLAB, and a bit with Java.
EDIT: I should add a specific question: how do I assign a variable in Processing to the physical values read on the Arduino (the varying voltage on an analog input)?
Thanks.
Since you have experience with MATLAB, consider using the ArduinoIO API provided by The MathWorks. It basically lets you interface your Arduino with MATLAB, with all the pin I/O features available. So let MATLAB do the plotting work for you and just use your Arduino to collect the data.
I can personally vouch for how useful this API is. It's powering my master's thesis (building Arduino-powered vehicles and doing control on them).
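Whichever plotting side you pick (Processing, or MATLAB reading the serial port directly), the Arduino side can simply stream timestamped readings over serial. A minimal sketch along those lines; the pin, baud rate and sample rate are placeholders, and converting volts to torque needs your own calibration constant:

```cpp
// Arduino side: sample the amplified strain-gauge signal and stream it as
// "millis,volts" lines that a PC-side program can parse and plot.
const int GAUGE_PIN = A0;        // amplifier output -> analog input
const float VREF = 5.0;          // ADC reference voltage (board-dependent)

void setup() {
    Serial.begin(115200);
}

void loop() {
    int raw = analogRead(GAUGE_PIN);     // 0..1023 on a 10-bit ADC
    float volts = raw * VREF / 1023.0;
    // torque = k * (volts - offset), with k and offset from your calibration.
    Serial.print(millis());
    Serial.print(',');
    Serial.println(volts, 4);
    delay(10);                           // roughly 100 samples per second
}
```

On the Processing side, if I remember its serial library correctly, you open the port with the Serial class from processing.serial, read each line, split it on the comma and convert the second field to a float; that float is the variable you plot.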

Using Accelerometer in Wiimote for Physics Practicals

I have to develop some software in my school to utilize the accelerometer in the Wiimote for recording data from experiments, for example finding the acceleration and velocity of a moving object. I understand how the accelerometer values will be used but I am sort of stuck on the programming front. There is a set of things that I would like to do:
Live streaming of data from the Wiimote via bluetooth
Use the accelerometer values to find velocity and displacement via integration
Plot a set of results
Avoid the use of the infrared sensor on the Wiimote
Can anyone give me their thoughts on how to go about this? It would also be great if people could direct me to existing projects that utilize the Wiimote, and suggest what would be the best programming language to use for this. My current bet is on using Visual Basic.
Any sort of help is greatly appreciated.
There are some famous projects using the Wii remote by Johnny Chung Lee.
They use C# and you can download the source.
By and large they are the reverse of what you want - they use the camera, but you should be able to use the source as a starting point and to analyse the data coming from the remote.
NOTE: At the time of writing the Wiimote library linked to is unavailable, but as it's an MSDN site it should be back soon.
Addendum: It looks like this is now available on Codeplex
This also has a link to various applications built on the library. Wii Drum High looks like it just reads the accelerometer.
I have written some software to do some of what you ask. Check out wiiphysics.site88.net.
You will find it very tricky to get decent results from integrating the acceleration data.
It is written in C#.
One problem is your initial conditions (fine if you start at rest); the other is that by the time you get to displacement you will have accumulated a lot of noise (the acceleration data from a Wiimote is only 8-bit).
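To make that concrete, the usual approach is to estimate the sensor bias while the remote is at rest and then integrate twice, e.g. with the trapezoidal rule; any residual bias or 8-bit quantisation error then grows linearly in the velocity and quadratically in the displacement, which is why the results drift. A small self-contained sketch with made-up data (plain C++ rather than C# or Visual Basic, just to show the arithmetic):

```cpp
#include <cstddef>
#include <cstdio>
#include <vector>

// Trapezoidal double integration of 1-D accelerometer samples.
int main() {
    const double dt = 0.01;   // 100 Hz sample rate (assumed)
    const std::vector<double> accel = {0.0, 0.3, 0.6, 0.6, 0.3, 0.0,
                                       -0.3, -0.6, -0.6, -0.3, 0.0};  // m/s^2
    const double bias = 0.0;  // estimate this from samples taken at rest

    double v = 0.0, x = 0.0;  // initial conditions: starting at rest
    for (std::size_t i = 1; i < accel.size(); ++i) {
        double a0 = accel[i - 1] - bias;
        double a1 = accel[i] - bias;
        double vNew = v + 0.5 * (a0 + a1) * dt;  // velocity step
        x += 0.5 * (v + vNew) * dt;              // displacement step
        v = vNew;
    }
    std::printf("final velocity %.4f m/s, displacement %.5f m\n", v, x);
    return 0;
}
```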
