How to calculate the statistics PDR, delay, and broadcast success rate for a protocol working in the physical and link layers using OMNeT++ 5.0 and Veins 4.4? - omnet++

What programming instructions can be written in the omnetpp.ini file or a .ned file to calculate the statistics packet delivery ratio (PDR), delay, and broadcast success rate for a protocol working in the physical and link layers, using OMNeT++ 5.0 and Veins 4.4?
Please reply ...
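For what it's worth, OMNeT++ 5.0 has no single ini switch that produces PDR, delay or broadcast success rate directly; the usual pattern is to record the raw counts and per-packet delays through the signal/statistic mechanism and compute the ratios from the result files afterwards. A minimal sketch (the signal names pktSent, pktRecvd and endToEndDelay are placeholders, not Veins identifiers; the @signal/@statistic syntax itself is standard OMNeT++):

In the .ned file of the module that sends and receives the frames:

    @signal[pktSent](type=long);
    @signal[pktRecvd](type=long);
    @signal[endToEndDelay](type=simtime_t);
    @statistic[pktSent](source=pktSent; record=count);
    @statistic[pktRecvd](source=pktRecvd; record=count);
    @statistic[endToEndDelay](source=endToEndDelay; record=mean,max,vector);

In omnetpp.ini, make sure results are written at all:

    **.scalar-recording = true
    **.vector-recording = true

The module's C++ code registers the signals with registerSignal() and calls emit() when a packet is sent or received (for delay, emit simTime() minus the packet's creation time). PDR and broadcast success rate are then just the received count divided by the sent count, computed from the recorded scalars after the run, e.g. in the IDE's Analysis tool or with scavetool. Note also that Veins 4.4's Mac1609_4 already records a number of MAC/PHY scalars in its finish() method (sent packets, received broadcasts, lost packets and so on); check Mac1609_4.cc for the exact names before adding your own.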

Related

Using Ettus USRPs in radar applications

We are using two USRPs and GNU Radio Companion to build an OFDM radar. The first USRP is an N210, which is used as a transmitter through its Tx/Rx port. The second USRP is an N200, which is used as a receiver through its Rx2 port. They are connected together through the so-called MIMO cable to synchronize them. The N210 is connected to the host PC through a gigabit Ethernet cable. The samples that feed the transmitter USRP come from a "File Source" block, and the samples collected from the receiver USRP go to a "File Sink" block.
Initially, an external loopback cable is used between the Tx/Rx port of the N210 and the Rx2 port of the N200. Whenever we run the flowgraph, we expect that for every sample transferred from the File Source to the N210, there should be a corresponding sample with roughly the same value coming from the N200 to the File Sink. However, we have noticed that the N200 produces a stream of random samples before the awaited samples start to appear! The length and the values of this stream of random samples vary each time we re-run the flowgraph. Of course this issue is an obstacle for our application, because in radar the range of the target to be detected is estimated from the delay time, which in turn is computed from the number of noise samples at the receiver that precede the reception of the actual transmitted samples.
The question is: how can we guarantee or force the receiving USRP (N200) not to receive any samples before the transmitting USRP (N210) starts to transmit the required samples? Shouldn't this be the task of GNU Radio, or do we have to do something in GNU Radio to force this to happen?
Thanks.
We are using two USRPs and GNU Radio Companion to build an OFDM radar. The first USRP is an N210, which is used as a transmitter through its Tx/Rx port. The second USRP is an N200, which is used as a receiver through its Rx2 port.
This sounds like you're reproducing my 2013 bachelor thesis!
Whenever we run the flowgraph, we expect that for every sample transferred from the File Source to the N210, there should be a corresponding sample with roughly the same value coming from the N200 to the File Sink.
No, that would only work if you start and stop both USRPs with the same command times and the same number of samples to be acquired.
However, we have noticed that the N200 produces a stream of random samples before the awaited samples start to appear!
Well, over-the-air delay, and the state of the DSP chain. This is expected. Use timed commands to make the timing deterministic, and you'll know how many samples to ignore.
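For reference, this is roughly what timed streaming looks like with the UHD C++ API (the device addresses, the fc32 format and the one-second start time are placeholders; in GRC the same effect comes from the start-time/command-time settings of the UHD source and sink blocks):

    #include <uhd/usrp/multi_usrp.hpp>

    int main() {
        // Open both devices as one multi_usrp (addresses are placeholders).
        uhd::usrp::multi_usrp::sptr usrp =
            uhd::usrp::multi_usrp::make("addr0=192.168.10.2,addr1=192.168.10.3");

        // The slave (N200) takes clock and time from the MIMO cable.
        usrp->set_clock_source("mimo", 1);
        usrp->set_time_source("mimo", 1);
        usrp->set_time_now(uhd::time_spec_t(0.0), 0);   // reset device time on the master

        // Tell the RX side to start streaming at a known device time, so the
        // number of samples received before the TX burst is deterministic.
        uhd::stream_args_t stream_args("fc32");
        uhd::rx_streamer::sptr rx_stream = usrp->get_rx_stream(stream_args);
        uhd::stream_cmd_t cmd(uhd::stream_cmd_t::STREAM_MODE_START_CONTINUOUS);
        cmd.stream_now = false;
        cmd.time_spec  = uhd::time_spec_t(1.0);         // start exactly at t = 1 s
        rx_stream->issue_stream_cmd(cmd);

        // On the TX side, set has_time_spec/time_spec in the uhd::tx_metadata_t of
        // the first packet to the same instant, so TX and RX start at a known offset.
        return 0;
    }

With both sides tied to the same device time, the offset between the first received sample and the first transmitted sample is fixed and can simply be subtracted.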

How does CSMA/CA work on XBee?

I'm trying to implement the S-MAC protocol on Waspmote XBee sensors, and I know the XBee has its own CSMA/CA. So first of all I need to understand the basics of XBee collision avoidance.
Two senders are set up in API mode in the libraries, both periodically sending single bytes to a common receiver. I reduced the delay and made many changes in the libraries to provoke a collision and see how the algorithm works. But when I monitor the data at the receiver, everything looks as expected: byte1, byte2 .. byte1, byte2.
Do you have any idea how I can cause a collision?
Are you sniffing the 802.15.4 traffic? That's the only way you'd see a collision.
The XBee module buffers the data you want to send, using the host communication parameters (baud rate, API mode, etc.) and then sends it out over 802.15.4 at 250kbps. The module has all of the collision avoidance built in, and will retransmit as necessary to deliver your message. If it's unable to deliver after some number of transmission attempts, you'll get a Transmit Status frame indicating failure.
On the receiving end, it buffers the data and delivers it to the local host using local serial settings (baud rate and API mode).
If you're trying to implement S-MAC, you need a different radio processor where you have low-level control over the radio. The XBee module provides an application layer and handles the MAC layer itself.
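That said, you can at least observe the module's retries and failures from the host side by watching the Transmit Status frames it returns in API mode. A rough sketch of checking the status byte (handleFrame() is a hypothetical helper, and the codes 0x00/0x01/0x02 follow the common 802.15.4 XBee convention of success / no ACK / CCA failure; double-check them against your firmware's API reference):

    #include <cstddef>
    #include <cstdint>
    #include <cstdio>

    // payload points at the API frame data of one frame read from the XBee's
    // serial port, i.e. the bytes between the length field and the checksum.
    void handleFrame(const uint8_t* payload, std::size_t len) {
        if (len >= 3 && payload[0] == 0x89) {            // Transmit Status frame
            unsigned frameId = payload[1];
            unsigned status  = payload[2];
            switch (status) {
                case 0x00: std::printf("frame %u delivered\n", frameId); break;
                case 0x01: std::printf("frame %u: no ACK after retries\n", frameId); break;
                case 0x02: std::printf("frame %u: CCA failure (channel busy)\n", frameId); break;
                default:   std::printf("frame %u: status 0x%02X\n", frameId, status); break;
            }
        }
    }

A no-ACK or CCA-failure status is the closest you will get to "seeing" a collision without a sniffer, because by the time the data reaches the receiver's serial port the MAC layer has already resolved it.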

Arduino Serial Transmission Missing One Character

I am using an Arduino Leonardo to transmit a string to a Wi-Fi module. The format of the command that the Wi-Fi module recognizes is:
AT60,1,content to a server
I am using a virtual server (TCP/IP Builder) to check the content I receive.
Here is the content I want to send:
smart/device/deviceCmd?userId=1010002003&deviceId=A00019999990002&cmd=ON
Since I want to send it again and again, I use a loop to send it. On the virtual server side, the content I got is:
smart/device/deviceCmd?userId=1010002003&devceId=A00019999990002&cmd=ON
smart/device/deviceCmd?userId=1010002003&devceId=A00019999990002&cmd=ON
smart/device/deviceCmd?userId=1010002003&dviceId=A00019999990002&cmd=ON
smart/device/deviceCmd?userId=1010002003&eviceId=A00019999990002&cmd=ON
smart/device/deviceCmd?userId=1010002003&devieId=A00019999990002&cmd=ON
smart/device/deviceCmd?userId=1010002003deviceId=A00019999990002&cmd=ON
smart/device/deviceCmd?userId=1010002003&dviceId=A00019999990002&cmd=ON
smart/device/deviceCmd?userId=1010002003&dviceId=A00019999990002&cmd=ON
smart/device/deviceCmd?userId=1010002003&deiceId=A00019999990002&cmd=ON
smart/device/deviceCmd?userId=1010002003&dviceId=A00019999990002&cmd=ON
This is the QUESTION: there is one persistent mistake in the content I received, which is that the deviceId part is never correct. It's so weird.
Here is the related part of the code:
//In Uart.cpp
//These three lines send a formatted string of the form "AT60,1,content"
Serial1.write("AT60,");
Serial1.write(channelID); //channel ID = 1 here
Serial1.write(reportIsFire, 76);
//In Uart.h
//Definition of the string I need to send, which has 76 characters.
char reportIsFire[76] = ",smart/device/deviceCmd?userId=1010002003&deviceId=A00019999990002&cmd=ON \n";
Here is some background on this application:
I am using the Arduino 1.5.8 IDE with Visual Studio.
Since the Arduino serial buffer is only 64 bytes, I have already changed the buffer size to 128 bytes in "HardwareSerial.h" so I can send out this large string.
The baud rate is 115200 and I am using Serial1. I have used Serial1 to transmit a few other characters and it works fine.
I would appreciate any ideas about this question.
I am betting that the serial baud rate of the Arduino is not 100% correct. Increasing the buffer size will not matter if the data is being lost due to a timing issue in the physical link.
I'd recommend double-checking the code that initializes the serial baud rate generator. It may be possible to get a rate closer to 115,200 by adjusting the available settings, altering the main clock speed (if possible), implementing some form of flow control, or all of the above.
In extreme cases, you may consider using a special-frequency oscillator. Many Microchip PICs use an internal or external 4 MHz or 8 MHz crystal, but this can produce far too much timing error for lengthy serial transmissions at high speed. In that case something special, like a 7.3728 MHz crystal, can be used, bringing the accuracy to exactly 100% (at least on some PIC devices).
Lastly, another consideration is whether any pre-emptive code is running on the device, such as interrupts or timers, which could inadvertently interfere with the serial output.
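To put a number on the baud-rate point: here is a back-of-the-envelope check assuming a 16 MHz AVR clock (as on the Leonardo) and the standard UBRR divisor formula; whether the core uses double-speed (U2X) mode changes the exact figure:

    #include <cmath>
    #include <cstdio>

    int main() {
        const double f_cpu = 16000000.0, baud = 115200.0;
        // Double-speed mode (U2X = 1): UBRR = f_cpu / (8 * baud) - 1, rounded.
        long ubrr = std::lround(f_cpu / (8.0 * baud) - 1.0);        // -> 16
        double actual = f_cpu / (8.0 * (ubrr + 1));                 // -> ~117647 baud
        std::printf("UBRR=%ld  actual=%.0f baud  error=%+.2f%%\n",
                    ubrr, actual, 100.0 * (actual - baud) / baud);  // about +2.1 %
        // A 7.3728 MHz crystal divides evenly (7372800 / (8 * 115200) = 8),
        // which is why such odd-looking crystal values give exactly 0 % error.
        return 0;
    }

Roughly 2% of error on one end is already close to the usual tolerance for async serial once the module's own clock error is added, which is why checking both ends (or simply dropping to a lower rate) is worthwhile.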
I don't have an answer, but I suspect the most likely problem is that the Wifi card can't read characters at a sustained 115200 baud rate. If possible, set the Wifi baud rate and the Arduino Serial.begin() to a lower rate, such as 57600 or 19200.
If the Arduino baud rate was simply inaccurate, I'd expect to see the problem appearing at random locations in the string, rather than about 40 characters in.

USB2.0 Transfer using usb_submit_urb gives kernel panic

Scenario
I am building Ethernet packets in an application and transferring them over USB 2.0.
Inside the USB class driver, I issue a request to send these packets to a bulk endpoint using usb_submit_urb. My Ethernet packet size is 112 bytes. I am able to transfer 8000 packets in ~200 ms without considering presentation time.
I make an ioctl call to send packets at a very fast rate, roughly one ioctl every 3-4 µs. Inside my ioctl I issue usb_submit_urb, which is a non-blocking call, unlike usb_bulk_msg.
Problem
If I consider presentation time, the kernel panics and the dmesg log reads "kernel panic - not syncing: Fatal exception in interrupt". For info: with presentation time, packets wait in the hardware device until timestamp_in_packet == hardware time.
I need to understand how EHCI behaves under such conditions, and what the status of the endpoint queues can be in such a scenario. I am working on an Ethernet-over-USB chip. What is the actual cause of such kernel panics?
Any input will help a lot.

Creating a new task in FreeRTOS for USART reception

I am using the EVK1105 development board with AVR Studio 5 as the development IDE for my AVR project.
I am using FreeRTOS on it. The board has three USART ports. An external module is connected to my AVR32 board via USART in RS-232 mode. It sends continuous serial data to my board on USART0 at 19230 baud, 7 data bits, odd parity, 1 stop bit, normal channel mode. I created a new task for this purpose. After every 9 data bytes it sends '\n' and '\r'. So in my task I keep collecting the 9 data bytes in a string buffer and then transmit them on USART1. I am using a polling method to collect data from USART0, which is the receiving port. But I am facing a problem receiving data. I don't know whether it is a timing issue or whether the scheduler switches tasks while the data is being collected, but I do not get the required data.
Here is what I have already checked while troubleshooting:
1. Connected my external module to my PC's HyperTerminal, which gives me a perfect result.
2. Implemented the same thing (receive from USART0 and transmit whatever is received to USART1) without FreeRTOS. It works fine.
Please suggest what may be wrong. I am using a queue to communicate between the TX and RX tasks, to pass the string buffer from USART0 to USART1. Is the problem in how I handle the queue? How can I troubleshoot the queue?
I am using a delay of 50 ms in the infinite loop of my RX task. Can that create a problem? If I don't use any delay, the OS crashes. Please suggest some good practices for creating a new task in FreeRTOS so that I do not run into timing issues.
For such a use case, I would not use a polling method with a 50 ms delay to retrieve data from the UART peripheral. You can easily lose received data, depending on the system load and the UART reception buffer size.
At least use an interrupt on UART data reception that copies every received byte into a local buffer that will be read by your TX thread.
You can have an even better solution using a DMA channel to receive your data frame and be notified when 9 bytes have been received. I don't know if your AVR device has a DMA peripheral or not.
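As a rough illustration of the interrupt-plus-queue approach with generic FreeRTOS calls (the two usartX_* helpers are placeholders for the AVR32/ASF register access, the ISR registration is port-specific, and older FreeRTOS versions spell the type names xQueueHandle etc.):

    #include <stdint.h>
    #include "FreeRTOS.h"
    #include "queue.h"
    #include "task.h"

    extern uint8_t usart0_read_byte(void);     // placeholder: read USART0 RHR
    extern void    usart1_write_byte(uint8_t); // placeholder: write to USART1

    static QueueHandle_t rxQueue;              // created with xQueueCreate(128, sizeof(uint8_t))

    // USART0 receive interrupt handler: one byte in, straight into the queue.
    void usart0_rx_isr(void) {
        BaseType_t woken = pdFALSE;
        uint8_t c = usart0_read_byte();
        xQueueSendFromISR(rxQueue, &c, &woken);    // never blocks; drops the byte if full
        portYIELD_FROM_ISR(woken);                 // macro name varies slightly per port
    }

    // TX task: block on the queue instead of polling with a 50 ms delay.
    void vTxTask(void *params) {
        (void)params;
        uint8_t c;
        for (;;) {
            if (xQueueReceive(rxQueue, &c, portMAX_DELAY) == pdTRUE) {
                usart1_write_byte(c);
            }
        }
    }

With this structure the TX task blocks on the queue rather than busy-waiting, so no explicit delay is needed and no byte time is ever slept past.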
Are you still working on this? The statement of your problem is vague, but I have several suggestions/leading questions.
1) You may want some documentation to see what the registers are. Get the giant datasheet PDFs at
http://www.atmel.com/dyn/products/product_docs.asp?category_id=163&family_id=607&subfamily_id=2138&part_id=4117
2) In this and an earlier post you state that you have, in some cases, been able to RX data. You will need to find the USART HW initialization code in those example projects and get it into the FreeRTOS example project. In particular, calls to
gpio_enable_module() with {AVR32_USART0_RXD_0_0_PIN, AVR32_USART0_RXD_0_0_FUNCTION}
to connect the USART pins to the CPU, and, I believe,
InitRs232()
Just doing this requires poking around a lot of code - there are a lot of dependencies.
3) What function are you calling to retrieve data from USART0? 19 kbaud is approximately 2000 bytes/sec, or 1 byte every 0.5 ms, so 50 ms polling is nowhere near enough. I'd suggest that your RX task poll continuously (never sleep explicitly) but at a lower priority than the TX task.
4) Concentrate on debugging the RX task at the call that retrieves data. Use the debugger to look at the hardware registers for the USART. In particular:
In the USART0 CR register, AVR32_USART_CR_RXEN_MASK should be set to enable RX.
In the USART0 CSR register, AVR32_USART_CSR_RXRDY_MASK will indicate whether there is new data.
You can also check the overflow flag to see if you have missed some data.
When the read of the USART0 RHR occurs, it should return a byte that you sent.
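Put together, the check at the point where you read a byte might look roughly like this (the AVR32_USART0 struct access and the OVRE/RXCHR mask names follow the ASF/avr32 header conventions and are assumptions on my part; verify them against your toolchain's usart headers):

    #include <avr32/io.h>   // device header from the AVR32 toolchain

    // Poll USART0 for one byte; returns -1 if nothing is pending.
    // RX must already have been enabled via the CR register (RXEN).
    int usart0_poll_byte(void) {
        if (AVR32_USART0.csr & AVR32_USART_CSR_RXRDY_MASK) {
            if (AVR32_USART0.csr & AVR32_USART_CSR_OVRE_MASK) {
                // Overrun flag set: bytes were lost because polling was too slow.
            }
            return (int)(AVR32_USART0.rhr & AVR32_USART_RHR_RXCHR_MASK);
        }
        return -1;    // no new data this time around
    }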
If you are still working on this I can look into this a bit more.
