VANET cognitive radio simulation - do I need MiXiM? - OMNeT++

I am implementing spectrum sensing for VANETs using SUMO, OMNeT++ and Veins. With these three, I believe I can simulate traffic scenarios. Is it also possible to perform spectrum sensing within the nodes (secondary users in the VANET) with only those three software packages, or do I need to install MiXiM for the cognitive radio part as well?
Thanks,
Rop

You ask "is it possible" and you mentioned C++ libraries containing simulation models. This makes the question somewhat hard to answer. Yes, the libraries you mention can support you to write a simulation that does what you described.
If your question is whether any of the libraries already contains code that implements the functionality you describe, the answer is no. You need to write that part yourself.
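To give a feeling for what "writing that part yourself" could look like, below is a minimal sketch of a periodic energy-detection sensing application written against the plain OMNeT++ API (C++). The module name, the parameters thresholdDbm and senseInterval, and the measureChannelPowerDbm() helper are assumptions made purely for illustration; they are not part of Veins, INET or SUMO. In a real Veins setup you would replace the helper with a query to whatever radio/PHY model you use and feed the decision into your channel-selection logic.

    // Minimal sketch of periodic energy-detection spectrum sensing (OMNeT++).
    // Assumption: the NED file declares the parameters used below; the power
    // measurement is a placeholder and must be wired to your radio/PHY model.
    #include <omnetpp.h>

    using namespace omnetpp;

    class SpectrumSensingApp : public cSimpleModule
    {
      private:
        cMessage *senseTimer = nullptr;
        double thresholdDbm = 0;    // energy-detection threshold [dBm]
        double senseInterval = 0;   // time between sensing rounds [s]

      public:
        virtual ~SpectrumSensingApp() { cancelAndDelete(senseTimer); }

      protected:
        virtual void initialize() override {
            thresholdDbm = par("thresholdDbm");
            senseInterval = par("senseInterval");
            senseTimer = new cMessage("senseTimer");
            scheduleAt(simTime() + senseInterval, senseTimer);
        }

        virtual void handleMessage(cMessage *msg) override {
            if (msg == senseTimer) {
                double powerDbm = measureChannelPowerDbm();   // placeholder hook
                bool primaryUserPresent = (powerDbm > thresholdDbm);
                EV << "sensed " << powerDbm << " dBm, primary user present: "
                   << primaryUserPresent << "\n";
                // ...hand the decision to your channel-selection logic here...
                scheduleAt(simTime() + senseInterval, senseTimer);
            }
        }

        // Placeholder: replace with a query to the PHY model you actually use.
        double measureChannelPowerDbm() {
            return -95.0 + uniform(0.0, 30.0);   // dummy value for the sketch
        }
    };

    Define_Module(SpectrumSensingApp);

The structure is just the usual OMNeT++ self-message timer pattern; the sensing decision itself (energy detection against a fixed threshold) is the simplest possible choice, and you may well want something more elaborate.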

Related

Comparing FLoRa (INET) and ns-3 for LoRaWAN

I am new to INET and ns-3.
I am currently deciding between FLoRa (OMNeT++ based) and LoRaWAN (ns-3 based). Which one is better in terms of features, and which one is easier to learn quickly?
I would really appreciate it if someone could guide me. I am not focusing on machine learning, just on resource allocation problems. Have a nice day.
Based on my personal experience, FLoRa (OMNeT++) has the following limitations:
FLoRa does not consider mobility.
FLoRa does not take into account any type of interference (intra- or inter-spreading-factor interference).
A LoRaWAN gateway should implement 8 parallel reception paths, but this is not modeled in FLoRa.
With ADR, the network server should assign spreading factors; this feature is not supported in FLoRa.
FLoRa does not support ADR in unconfirmed mode.
Simulations with multiple gateways have problems.
FLoRa does not provide the long range defined by LoRaWAN.
The above features are implemented in the ns-3 based LoRaWAN module. Compared to ns-3 LoRaWAN, the FLoRa implementation is also harder to work with.

DE1-SoC Board FPGA for evolvable hardware

I would like to reproduce the experiment by Dr. Adrian Thompson, who used a genetic algorithm to produce a chip (FPGA) that could distinguish between two different sound signals in an extremely efficient way. For more information, please visit this link:
http://archive.bcs.org/bulletin/jan98/leading.htm
After some research I found this FPGA board:
http://www.terasic.com.tw/cgi-bin/page/archive.pl?Language=English&CategoryNo=167&No=836&PartNo=1
Is this board capable of reproducing Dr. Adrian Thompson's experiment, or do I need another one?
Thank you for your support.
In terms of programmable logic, the DE1-SoC is roughly 20x bigger and has roughly 70x as much embedded memory. Practically any modern FPGA is bigger than the Xilinx XC6216 cited in his papers, as was linked for you in the other instance of this question you asked.
That said, most modern FPGAs don't allow the same fine-grained configuration as older FPGAs did: the internal routing and block structures are more complex, and FPGA vendors want to protect their products and compel you to use their CAD tools.
In short, yes, the DE1-SoC will be able to contain any design from 12+ years ago. As for replicating the specific functions, you should do some more research to determine if the methods used are still feasible with modern chips and CAD tools.
Edit:
user1155120 elaborated on the features of the XC6216 (see link below) that were of value to Thompson.
Fast Configuration: A larger device will generally take longer to configure, as you have to send more configuration data. That said, I/O interfaces are faster than they were 15 years ago, so it depends on your definition of "fast".
Reconfiguration: Cyclone V chips (like the one in the DE1-SoC) do support partial reconfiguration, but the subscription version of the Quartus II software is required, in addition to a separate license to support PR. I don't believe it supports wildcard reconfiguration, though I could be mistaken.
Memory-Mapped Addressing: The DE1-SoC's internal data can be accessed through the USB Blaster interface. However, this requires using System Console on the host PC, so it is not direct access.

Choosing an FPGA with enough inputs

I need an FPGA with 50 I/O pins. I'm going to use it as a MUX. I thought about using a MUX or a CPLD, but the guy I'm designing this circuit for says he might need more features in the future, so it has to be an FPGA.
So I'm looking for one with plenty of design examples on the internet. Can you suggest anything (for example, a family)?
Also if you could tell me what I should consider when picking, that would be great. I'm new to this and still learning.
This is a very open question, and a complete answer, if one is possible at all given all the options, would be very long. What I suggest is that you make a list of all current and future requirements. This will help you communicate your needs (here and elsewhere) and force you, and the people you work with on this project, to think about them more carefully. Saying that "more features in the future" will be needed is meaningless; would you buy the most capable FPGA on the market? No.
When you've compiled this list and thought about the requirements, post them here again, and then you'll get plenty of help.
Another possibility to get feedback and help is to describe what you are trying to do/solve. Maybe an FPGA is not the best solution -- people here will tell you that.
I agree with Saar, but you have to go back one step further: when you decide which technology to target, keep in mind that an FPGA needs a lot of things to run, e.g. different voltages for the core, I/O, and auxiliary supplies, and probably more. You also need some kind of configuration mechanism, as an FPGA is in general (there are exceptions) SRAM based and therefore needs to be configured at startup. CPLDs are less flexible but much easier to handle...

FPGA Place & Route

For programming FPGAs, is it possible to write my own place & route routines? [The point is not that mine would be better; the point is whether I have the freedom to do so.] Or does the place & route stage output undocumented bitfiles, essentially forcing me to use proprietary tools?
Thanks!
There's been some discussion of this on comp.arch.fpga in the past. The conclusion is generally that unless you want to attract intense legal action from the FPGA companies, you probably don't want to do something like this. Bitfile formats are closely guarded secrets of the FPGA companies, and you would likely have to understand the file format in order to do what you want to do. That implies that you would need to reverse engineer the format, and that (if you made your tool public in any way) would get you a lawsuit in short order.
I will add that there probably are intermediate files and that you likely wouldn't read or write the bitfile itself to do what you want to do, but those intermediate files tend to be undocumented as well. Read the EULA for your FPGA synthesis tool (ISE from Xilinx, for example) - any kind of reverse engineering is strictly forbidden. It seems that the only way we'll ever have open source alternatives in this space is for an open source FPGA architecture to emerge.
I agree with annccodeal, but to amplify a little bit, on Xilinx, there may be a few ways to do this. The XDL file format allows (or used to allow) explicit placement and routing. In addition, it should be possible to script the FPGA Editor to implement custom routing.
As regards placement, there is a rich infrastructure to constrain technology mapping of logic to primitives and to control placement of those primitives. For example LUT_MAP constraints can control technology mapping and LOC and RLOC constraints can determine placement. In practice, these allow the experienced designer great control over how a design is implemented without requiring them to duplicate man-centuries of software development to generate a bitstream directly.
You may also find interesting the current state of the art in FPGA CAD research software, such as VPR. In my opinion these tools are challenged to keep up with the vendors' own tools, which must cope with modern heterogeneous FPGAs with splittable 6-LUTs, DSP blocks, etc.
Happy hacking.

What is your experience with software model checking? [closed]

What types of applications have you used model checking for?
What model checking tool did you use?
How would you summarize your experience w/ the technique, specifically in evaluating its effectiveness in delivering higher quality software?
In the course of my studies, I had a chance to use Spin, and it aroused my curiosity as to how much actual model checking is going on and how much value organizations are getting out of it. In my work experience, I've worked on business applications, where there is (naturally) no consideration of applying formal verification to the logic. I'd really like to learn about SO folks' model-checking experience and thoughts on the subject. Will model checking ever become a more widely used development practice that we should have in our toolkit?
I just finished a class on model checking and the big tools we used were Spin and SMV. We ended up using them to check properties on common synchronization problems, and I found SMV just a little bit easier to use.
Although these tools were fun to use, I think they really shine when you combine them with something that dynamically enforces constraints on your program (so that it's a bit easier to verify 'useful' things about it). We ended up taking the Spring WebFlow framework, which uses an XML file to describe a state machine specifying which web pages can transition to which others, and using SMV to perform verification on such applications (shameless plug here).
To answer your last question, I think model checking is definitely useful to have, but I lean more towards using unit testing as a technique that makes me feel comfortable about delivering my final product.
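For readers who have not seen what such a "common synchronization problem" looks like from a model checker's point of view, here is a small self-contained illustration, written in C++ rather than Promela or SMV purely so it can be compiled and run as-is: two threads increment a shared counter without synchronization, and an assertion about the final value fails only on some interleavings. A tool that enumerates interleavings finds such a schedule reliably, whereas ordinary test runs usually miss it. (Strictly speaking the unsynchronized access is a data race and therefore undefined behavior in C++; the example only illustrates the interleaving problem and is not taken from the tools mentioned above.)

    // Lost-update race: two threads increment a shared counter without locking.
    // A checker that explores interleavings finds schedules where increments
    // are lost and the final assertion fails; most concrete runs never hit it.
    #include <cassert>
    #include <thread>

    int counter = 0;                  // intentionally NOT atomic and NOT locked
    const int N = 1000;

    void worker() {
        for (int i = 0; i < N; ++i)
            counter = counter + 1;    // non-atomic read-modify-write
    }

    int main() {
        std::thread t1(worker), t2(worker);
        t1.join();
        t2.join();
        assert(counter == 2 * N);     // violated on interleavings with lost updates
        return 0;
    }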
We have used several model checkers in teaching, systems design, and systems development. Our toolbox includes SPIN, UPPAAL, Java Pathfinder, PVS, and Bogor. Each has its strengths and weaknesses. All find problems with models that are simply impossible for human beings to discover. Their usability varies, though most are push-button automated.
When to use a model checker? I'd say any time you are describing a model that must have (or not have) particular properties and it is any larger than a handful of concepts. Anyone who thinks that they can describe and understand anything larger or more complex is fooling themselves.
What types of applications have you used model checking for?
We used the Java Path Finder model checker to verify some safety properties (deadlock, race conditions) and temporal properties (specified in linear temporal logic). It supports classical assertions (like NotNull) on Java (bytecode) - it is a program model checker.
What model checking tool did you use?
We used Java Path Finder (for academic purposes). It's open source software developed by NASA initially.
How would you summarize your experience w/ the technique, specifically in evaluating its effectiveness in delivering higher quality software?
Program model checking has a major problem with state-space explosion (memory and disk usage). But there is a wide variety of techniques to mitigate this and handle large artifacts, such as partial order reduction, abstraction, and symmetry reduction.
I used SPIN to find a concurrency issue in PLC software. It found an unsuspected race condition that would have been very tough to find by inspection or testing.
By the way, is there a "SPIN for Dummies" book? I had to learn it out of "The SPIN Model Checker" book and various on-line tutorials.
I've done some research on this subject during my time at the university, extending the State Exploring Assembly Model Checker.
We used a virtual machine to walk every possible path/state of the program, using A* and heuristics that depend on the kind of error (deadlock, I/O errors, ...).
It was inspired by Java Pathfinder and it worked with C++ code (everything GCC could compile).
But in our experience this kind of technology will not be used in business applications soon, because of GUI-related problems, the work necessary to create an initial test environment, and the enormous hardware requirements (you need lots of RAM and disk space because of the gigantic state space).
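To make the kind of error such a tool hunts for concrete, here is a small self-contained C++ example of the classic lock-order deadlock. An exhaustive state-space exploration like the one described above flags it, because some interleavings leave each thread holding one mutex while waiting for the other; a handful of ordinary test runs will usually pass. (The example is illustrative only and is not taken from the checker mentioned above.)

    // Classic lock-order deadlock: thread A takes m1 then m2, thread B takes
    // m2 then m1. A model checker explores the interleaving where each thread
    // holds one lock and waits forever for the other.
    #include <iostream>
    #include <mutex>
    #include <thread>

    std::mutex m1, m2;

    void workerA() {
        std::lock_guard<std::mutex> a(m1);
        std::lock_guard<std::mutex> b(m2);   // may block forever if B holds m2
        std::cout << "A done\n";
    }

    void workerB() {
        std::lock_guard<std::mutex> a(m2);
        std::lock_guard<std::mutex> b(m1);   // may block forever if A holds m1
        std::cout << "B done\n";
    }

    int main() {
        std::thread ta(workerA), tb(workerB);
        ta.join();
        tb.join();
        // Fix: acquire both mutexes in one consistent order, e.g. std::scoped_lock(m1, m2).
        return 0;
    }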
