How to get the RSSI in the MAC layer and the network layer with OMNeT++ 6 and the INET 4.4 framework, for wireless nodes with multiple radio interfaces

Using OMNeT++ 6 and the INET 4.4 framework:
I want to simulate a WiFi network that includes some nodes, where each node has several radio interfaces (for example, two).
Can I get the RSSI value when a packet is received in the MAC layer? If the answer is yes, how do I get it? Should the RSSI be measured separately by each radio interface that received the packet?
And what about the network layer?
Please include the necessary code in addition to the explanation.
Thanks in advance.

Ad. 1. In the MAC layer it is possible to get the power of the received signal – take a look at OMNET++: How to obtain wireless signal power?. The RSSI is proportional to the power of the signal, but the IEEE 802.11 standard does not specify how the RSSI is related to that power; the mapping is vendor specific.
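For illustration, a minimal sketch against INET 4.4 (the include path of SignalTag_m.h has moved between INET minor versions, and the hook point – here a free function you would call from your MAC module's receive path – is up to you). Note that each radio receives its own copy of a frame, so each copy carries the power measured by that particular radio:

```cpp
// Sketch for INET 4.4: read the received signal power from a packet's
// SignalPowerInd tag inside a MAC-layer module (e.g. a class derived from
// Ieee80211Mac, or any module that sees packets coming up from the radio).
#include <cmath>
#include "inet/common/packet/Packet.h"
#include "inet/physicallayer/wireless/common/contract/packetlevel/SignalTag_m.h"

using namespace inet;
using namespace inet::physicallayer;

void logReceptionPower(Packet *packet)
{
    // The radio attaches a SignalPowerInd tag to every received packet.
    if (auto powerInd = packet->findTag<SignalPowerInd>()) {
        auto rx = powerInd->getPower();                 // power in watts
        double dBm = 10 * std::log10(rx.get() * 1000);  // convert W -> dBm
        EV_INFO << "reception power: " << dBm << " dBm\n";
    }
}
```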
Ad. 2. The power of the received signal, as well as the radio interface ID, are available to developers in the network layer as well – in the same way as presented in the answer mentioned above.
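A corresponding sketch for the network layer: the SignalPowerInd tag stays attached to the packet, and the InterfaceInd tag (attached on the way up through the link layer) tells you which of the node's radio interfaces delivered it. Again, the hook point is hypothetical:

```cpp
// Sketch for INET 4.4: at the network layer the SignalPowerInd tag is still
// attached to the packet, and InterfaceInd identifies which of the node's
// radio interfaces delivered it (useful with multi-radio hosts).
#include "inet/common/packet/Packet.h"
#include "inet/linklayer/common/InterfaceTag_m.h"
#include "inet/physicallayer/wireless/common/contract/packetlevel/SignalTag_m.h"

using namespace inet;
using namespace inet::physicallayer;

void logArrivalInfo(Packet *packet)
{
    if (auto ifaceInd = packet->findTag<InterfaceInd>())
        EV_INFO << "arrived on interface id " << ifaceInd->getInterfaceId() << "\n";
    if (auto powerInd = packet->findTag<SignalPowerInd>())
        EV_INFO << "reception power: " << powerInd->getPower() << "\n";
}
```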

Related

Are two JESD204B FPGA masters able to form a high-speed serial link composed of multiple serial lanes?

I need to realize a point-to-point multi-gigabit connection between two FPGAs. For that I can use 4 × 6.25 Gbps serial lanes and transport the data over an optical link. The problem is that the connection realized over those 4 optical lanes must look like a single point-to-point channel (a single channel with >20 Gbps).
Bonding lanes together into one channel is exactly what, for example, JESD204B/C does when connecting a fast ADC or DAC to an FPGA through HSSI.
I was wondering whether a JESD204B IP core instantiated on two distant FPGAs would be able to assemble a channel composed of 4 lanes and function as a transport channel for the data.
I somehow feel that a problem might occur during the synchronization phase, because the two IP cores always act as masters and each expects a component (an ADC or DAC) to be attached for synchronization.
The link is to be established between an Intel and a Xilinx FPGA.

Tap/NFC-like Eddystone Experience

Is it possible (and how) to have Eddystone-URL provide functionality similar to NFC, so that only a user in close proximity is able to get the URL?
I've been testing with the eddystone-beacon library on an Intel Bluetooth 4 enabled WiFi card and can send the signal successfully. But I find that I can receive the signal from far away (20+ m), when I'd like to limit it to within one meter.
The library has an option to attenuate the power (txPowerLevel: -22, // override TX Power Level), but I find that changing this only affects the distance calculation, not the ability to receive the signal.
Is this perhaps an issue with the hardware (maybe a dedicated USB dongle would allow control)?
Eddystone-URL is not designed to work this way using Google's standard services. However, it is possible to do what you want if you have a dedicated app on the mobile device that detects the beacon.
If this is an option for you, then you won't want to reduce the transmitter power on your hardware device. Even if you get hardware that allows this, sending a very weak signal will lead to unpredictable minimum detection ranges of 3 feet or more on devices with strong receivers, and no detection at all (even when touching the beacon) on devices with weak receivers.
Instead, leave it at the maximum transmission power and then filter for a strong RSSI on the receiving device, showing the detection only when the RSSI meets a threshold. You'll still have trouble with varying strengths of receivers, but it is much more predictable. I have used this technique combined with a device database that tracks the strongest signal level seen for a device model, so I know what RSSI a specific device model will detect when it is right next to the beacon.
If you are game for this approach, you can use the Android Beacon Library to detect Eddystone-URL for your app on Android devices, and the iOS beacon tools on iOS devices.
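To illustrate the gist of that technique – threshold filtering against a per-model calibration of the strongest RSSI seen at touch distance – here is a sketch in C++; all names and numbers are made up for illustration, and the real logic would live in your Android or iOS app:

```cpp
// Sketch of near-field filtering by RSSI: accept a beacon sighting only when
// the received strength is close to the maximum this device model ever
// reports at touch distance. All names and numbers are illustrative.
#include <map>
#include <string>

// Strongest RSSI (dBm) observed right next to the beacon, per device model,
// collected beforehand into a calibration database.
const std::map<std::string, int> kMaxRssiAtTouch = {
    {"Nexus 5", -35},
    {"iPhone 6", -40},
};

bool isWithinTapRange(const std::string &deviceModel, int rssiDbm)
{
    auto it = kMaxRssiAtTouch.find(deviceModel);
    int reference = (it != kMaxRssiAtTouch.end()) ? it->second : -45; // fallback
    // Allow a small margin below the touch-distance maximum; anything weaker
    // is treated as "too far away" and the URL is not shown.
    return rssiDbm >= reference - 10;
}
```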

Time Signal Reception with RTL-SDR USB Dongle in Europe?

How can I receive a time signal with an unmodified RTL-SDR USB TV dongle here in Europe?
RTL-SDR dongles are able to receive the frequency range 52–2200 MHz.
Here in Europe, radio-controlled clocks receive DCF77, a time signal broadcast at 77.5 kHz, but as 77.5 kHz is far below 52 MHz, that's out.
The GPS L1 signal is at 1575.42 MHz, so that's within the dongle's range, but the signal is way too weak to be received with the TV antenna. An active GPS antenna is needed, and to supply the antenna with power I'd have to make some modifications to the electronics, which I don't really want to do.
In the old days of analog TV broadcasting we had Teletext/Videotext here in Germany, which contained a time signal, but those times are long gone.
ADS-B reception with the dongle works like a charm, but unfortunately they did not put time or date bits into the data packets.
So: does anybody have an idea where, in the part of the spectrum receivable by an unmodified RTL-SDR dongle, there is a time signal that could be easily decoded?
I'm well aware that getting the time over the network via NTP, or via a GPS modem via NMEA 0183, would be way easier, but I'm curious and just want to play around with the dongle a bit. Precision is not important; +/- 2 seconds is fine. And I'd like to do it the SDR way, so using the dongle as originally intended (as a DVB-T receiver with the original software) defeats the purpose (i.e. learning and DIY).
The GPS L1 signal is at 1575.42 MHz, so that's within the dongle's range, but the signal is way too weak to be received with the TV antenna. An active GPS antenna is needed, and to supply the antenna with power I'd have to make some modifications to the electronics, which I don't really want to do.
Well, first of all, GPS is really weak, but it still works below the noise floor; that's something important to realize – I've seen it more than once that people worry because they can't see GPS on a PSD display. You won't; you'll need signal processing to recover it from all the noise.
The modifications aren't all that complicated; basically, you need a capacitor to let the AC component through to the RTL dongle, and a voltage source to feed the active antenna; the required component is usually called a bias-T.
Nevertheless, an active antenna will be necessary – your RTL dongle's noise figure probably isn't low enough to receive GPS signals on its own.
In the old days of analog TV broadcasting we had Teletext/Videotext here in Germany, which contained a time signal, but those times are long gone.
True; I haven't looked at local FM stations, but RDS might be the way to go – it can contain a clock-time signal; the German Wikipedia claims that mainly publicly owned stations transmit that information field.
Have a look at gr-rds, a GNU Radio implementation of the Radio Data System. If you don't have a working GNU Radio installation (yet), you might try the GNU Radio LiveSDR image, which contains a ready-to-use gr-rds.
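For orientation, here is a sketch of what decoding the RDS clock-time payload (group 4A) involves once a tool like gr-rds hands you the four 16-bit blocks of a group. The bit layout and the MJD-to-date formula follow my reading of the RDS spec (IEC 62106); double-check against the standard before relying on it:

```cpp
// Sketch: decode an RDS clock-time (group 4A) message from the four 16-bit
// RDS blocks. Layout per IEC 62106: a 17-bit Modified Julian Day split across
// blocks B and C, a 5-bit UTC hour split across C and D, a 6-bit minute and a
// signed local-time offset (in half hours) in block D.
#include <cstdint>

struct ClockTime { int year, month, day, hour, minute, offsetHalfHours; };

bool decodeRdsCt(uint16_t b, uint16_t c, uint16_t d, ClockTime &out)
{
    if ((b >> 12) != 4 || (b & 0x0800))   // require group type 4, version A
        return false;
    uint32_t mjd = ((uint32_t)(b & 0x0003) << 15) | (c >> 1);
    out.hour   = ((c & 0x0001) << 4) | (d >> 12);
    out.minute = (d >> 6) & 0x3F;
    int off    = d & 0x1F;
    out.offsetHalfHours = (d & 0x0020) ? -off : off;

    // MJD -> calendar date (formula from the RDS spec annex).
    int yp = (int)((mjd - 15078.2) / 365.25);
    int mp = (int)((mjd - 14956.1 - (int)(yp * 365.25)) / 30.6001);
    out.day   = mjd - 14956 - (int)(yp * 365.25) - (int)(mp * 30.6001);
    int k     = (mp == 14 || mp == 15) ? 1 : 0;
    out.year  = 1900 + yp + k;
    out.month = mp - 1 - k * 12;
    return true;
}
```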

Measure input voltage of Raspberry Pi B+ running Ubuntu

I just learned that a flashing red LED indicates a voltage below 4.63 V on a Raspberry Pi Model B+.
Is there a command to determine the voltage programmatically?
I tried vcgencmd measure_volts, but it yields 1.2000V regardless of the input source and the LED status, and it doesn't seem to be related to the 4.63 V mentioned above.
Update
Let me describe the situation in a bit more detail:
I'm powering the Raspberry Pi from a lead-acid battery built into a mobile robot. After operating the robot for a while, the voltage seems to drop below a critical minimum, potentially damaging the file system. Therefore, I'd like to detect low voltage automatically (and trigger the robot to return to the charging station).
I'm asking here on Stack Overflow since I assume the solution is not robotics-specific, but generally applicable to other machines.
Yes you can. As described in the topic Under-voltage warnings, you can detect low voltage by reading GPIO 35. For reading a GPIO, you can refer to this topic:
Python Script to read one pin
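A minimal sketch of that GPIO read via the sysfs interface (assumes root privileges; GPIO 35 is the under-voltage detect line on the B+, and it reportedly reads 0 while the supply is below the ~4.63 V threshold – verify pin and polarity for your board revision):

```cpp
// Sketch: poll the Raspberry Pi B+ under-voltage line (GPIO 35) via sysfs.
#include <fstream>
#include <iostream>

int main()
{
    // Export the pin and set it as input (the export is a no-op if done already).
    std::ofstream("/sys/class/gpio/export") << 35;
    std::ofstream("/sys/class/gpio/gpio35/direction") << "in";

    std::ifstream value("/sys/class/gpio/gpio35/value");
    char state = '1';
    value >> state;

    if (state == '0')
        std::cout << "under-voltage: supply below threshold\n";
    else
        std::cout << "power OK\n";
    return state == '0' ? 1 : 0;
}
```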
Have a look at the Adafruit INA219 sensor: https://learn.adafruit.com/downloads/pdf/adafruit-ina219-current-sensor-breakout.pdf.
This sensor can be placed between the battery and the Raspberry Pi and measures the current and the voltage along this connection (0–26 V and max. 3.2 A). It communicates via the I2C bus. Together with an Arduino you can easily build a battery watchdog for your Raspberry Pi. A sample program and the Arduino driver can be found here: https://github.com/adafruit/Adafruit_INA219.
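For example, a minimal Arduino watchdog sketch using that driver (assuming the breakout's default I2C address; the threshold value is illustrative):

```cpp
// Minimal battery-watchdog sketch for an Arduino with an Adafruit INA219
// breakout wired into the battery lead of the Raspberry Pi.
#include <Wire.h>
#include <Adafruit_INA219.h>

Adafruit_INA219 ina219;   // default I2C address 0x40

void setup() {
  Serial.begin(115200);
  ina219.begin();
}

void loop() {
  float busVoltage = ina219.getBusVoltage_V();  // battery-side voltage
  float current_mA = ina219.getCurrent_mA();    // current flowing into the Pi

  Serial.print("V="); Serial.print(busVoltage);
  Serial.print(" I="); Serial.println(current_mA);

  if (busVoltage < 4.7)   // illustrative low-voltage threshold
    Serial.println("LOW VOLTAGE - send the robot home");

  delay(1000);
}
```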
According to https://raspberrypi.stackexchange.com/questions/7414/is-it-possible-to-detect-input-voltage-using-only-software, it's not possible to do this at the software level without additional hardware.

How to modify the TI SensorTag CC2650 Firmware to speed up data transfer?

I'd like to modify the SensorTag software from TI for the CC2650STK kit so that it speeds up the reading and also the transmission of the sensor values.
Do I need to modify only the sensor software (the CCS BLE sensor stack from TI) or also the Android app?
I principally need only one temperature reading, so the other sub-question is: how can the other sensors be deactivated if they are not needed, or if they conflict with the higher rate of the temperature sensor?
What do you mean by "speeding up"?
There are a number of different things you might mean:
- Reduce the latency between opening the mobile app and displaying a reading.
- Refactor the mobile app to make it simpler to get new readings.
- Increase the frequency with which notifications are sent by the device, if you use it in that way.
- Change the firmware interaction with the sensors to obtain a reading.
Each of these meanings entails a different approach.
The period for each sensor is described in the User Guide that you reference and is typically between hundreds of milliseconds and one or two seconds. Do you really need readings more frequently than that? Typically each sensor needs a certain amount of time to obtain a reliable reading; this is described in the sensor's data sheet, along with options for working with the sensor.
More generally, 'speed' will be a function of the Bluetooth handshake, the throughput available over the physical radio link, the processing within the SensorTag, and the processing within the sensors. I would expect the physical link to be the most variable part of this.
It is up to the mobile app to decide which sensor services it wishes to use.
Have you studied the Software Developer's Guide, available at the same page as the BLE Stack?
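To make the period and sensor-selection points concrete: as far as I recall from TI's SensorTag User Guide (treat the details as assumptions to verify against your firmware version), each sensor service exposes a one-byte configuration characteristic – writing 0x00 disables the sensor – and a one-byte period characteristic holding the measurement period in units of 10 ms, with a sensor-specific minimum (around 300 ms for the IR temperature sensor). A small sketch of that encoding:

```cpp
// Sketch: encode the byte values a mobile app would write to a CC2650
// SensorTag sensor's GATT characteristics (details per TI's User Guide;
// UUIDs and minimum periods should be verified against your firmware).
#include <cstdint>
#include <cstdio>

// Config characteristic: 0x01 enables a sensor, 0x00 disables it.
constexpr uint8_t kSensorOn  = 0x01;
constexpr uint8_t kSensorOff = 0x00;

// Period characteristic: one byte, period = value * 10 ms.
uint8_t periodByte(unsigned periodMs, unsigned minMs = 300)
{
    if (periodMs < minMs) periodMs = minMs;  // clamp to the sensor's minimum
    if (periodMs > 2550)  periodMs = 2550;   // one byte: max 255 * 10 ms
    return (uint8_t)(periodMs / 10);
}

int main()
{
    // Fastest allowed IR-temperature readings: 300 ms -> 0x1E.
    std::printf("IR temp period byte: 0x%02X\n", periodByte(300));
    // Every other sensor is disabled by writing kSensorOff to its config
    // characteristic from the mobile app.
    std::printf("config off byte: 0x%02X\n", kSensorOff);
    return 0;
}
```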
