How to set the communication range in meters in the related .cc files of INET 4.4 - OMNeT++

How can I set the communication range in meters in the relevant .cc files of INET 4.4?
I don't mean in INI files such as omnetpp.ini.
To be more precise: when the following line in omnetpp.ini takes effect, which functions in which files are called to handle it?
*.host*.wlan[0].radio.transmitter.communicationRange = 250m
Thanks in advance

Lines in an INI file are not commands and are not executed. Instead, whenever a module is initialized, it queries the whole INI file, and if it finds a matching line it reads that value. If you are looking for the place where this INI value is read, search the source for par("communicationRange"). You will most likely find it in a radio module's initialize() method.
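For illustration, here is a minimal sketch of how such a value is typically picked up in an OMNeT++ module's initialize(). The class and member names below are made up; the actual INET 4.4 radio and transmitter classes are more involved, but the par() call is the part to search for:

#include <omnetpp.h>

using namespace omnetpp;

// Hypothetical module (not actual INET source), showing where an
// omnetpp.ini value such as communicationRange ends up being read.
class SketchTransmitter : public cSimpleModule
{
  protected:
    double communicationRange = 0;  // meters

    virtual void initialize() override
    {
        // par() looks the parameter up; the simulation kernel fills it in
        // from the matching omnetpp.ini line (or the NED default).
        communicationRange = par("communicationRange").doubleValue();
        EV << "communicationRange = " << communicationRange << " m\n";
    }
};

Define_Module(SketchTransmitter);

For par() to succeed, the parameter also has to be declared in the module's .ned file (e.g. double communicationRange @unit(m);), so the NED declaration and the initialize() code in the .cc file go together.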

Related

Combining object files into a single file

I have been using the -r (relocatable) flag of arm-none-eabi-ld to combine a bunch of object files in a specific folder (using the wildcard *.o). It produces a single object file as I expected, but I am running into an issue.
After combining the files, I loaded the combined object file and a few of the individual object files it is composed of into Ghidra and compared the disassembly. Some functions from the individual object files match their counterparts in the combined object file exactly (same mnemonics and same function size), while other functions differ between the combined object file and the individual files. For example, a function called HAL_FLASH_ERASE occupies 630 bytes in flash.o, while the same function occupies 396 bytes in combined.o, and the underlying assembly instructions are different.
I have used this flag in the past and it worked perfectly: the functions were identical, with a one-to-one match between the combined object file and the individual object files. This time, however, my directory contains a large number of files rather than just a few. Could the linker be overwriting parts of the output when the -r (relocatable) flag is used? Apart from this method, is there another way to combine multiple object files without losing the one-to-one mapping between them? Thanks for your time.

How to show the same docs on multiple pages in Read the Docs

In our Read the Docs project we have a use case where we need to show some specific docs on multiple pages within the same version of the docs. At the moment we do this in one of the following ways:
Copy-paste the content into each page's rst file
Write it in one of the concerned files with a label and use :std:ref: in the rest of the files to point readers to the main file
I would like to write the content in only one file and then show it (without any redirection for the user) in each of the files. Is that possible?
Use the include directive in the parent file.
.. include:: includeme.rst
Note that the included file will be interpreted in the context of the parent file. Therefore section levels (headings) in the included file must be consistent with the parent file, and labels in the included file might generate duplicate warnings.
You can use the include directive for this purpose.
Say that you write the text in dir/text.rst.
The following will include it in other documents:
.. include:: /dir/text.rst
where the path is either relative (then without the leading slash) or absolute, which is possible in Sphinx (see the docs):
in Sphinx, when given an absolute include file path, this directive
takes it as relative to the source directory

How can I specify filesystem rules?

I have the following two challenges:
I'd like to assert that the filename of an XML file is always equal to a certain string in the file itself
I'd like to assert that every folder called 'Foo' contains a file called 'bar.xml'
How can I do this with Sonar? Is there already a plugin available for this?
There's no plugin for that; you will have to write your own.
For the first point, you can write a sensor that parses the XML files and checks whether the name of the file appears in the file itself; this should not be complicated.
For the second point, you would have to write a sensor that is executed only on folders.
You can check the "Extension Guide" documentation for code samples showing how to do that.
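Independent of the Sonar sensor API, the checks themselves are simple directory and file walks. A stand-alone C++17 sketch of just the checking logic (not a Sonar plugin; the root path taken from the command line is an assumption for the example) might look like this:

// Stand-alone sketch of the two checks (not a Sonar sensor):
// 1) an XML file's base name must appear somewhere inside the file,
// 2) every directory named "Foo" must contain a file "bar.xml".
#include <filesystem>
#include <fstream>
#include <iostream>
#include <sstream>
#include <string>

namespace fs = std::filesystem;

int main(int argc, char** argv)
{
    const fs::path root = argc > 1 ? argv[1] : ".";

    for (const auto& entry : fs::recursive_directory_iterator(root)) {
        // Check 1: XML files must mention their own (stem) name.
        if (entry.is_regular_file() && entry.path().extension() == ".xml") {
            std::ifstream in(entry.path());
            std::stringstream buf;
            buf << in.rdbuf();
            if (buf.str().find(entry.path().stem().string()) == std::string::npos)
                std::cout << "filename not found inside: " << entry.path() << '\n';
        }
        // Check 2: every folder called "Foo" must contain bar.xml.
        if (entry.is_directory() && entry.path().filename() == "Foo") {
            if (!fs::exists(entry.path() / "bar.xml"))
                std::cout << "missing bar.xml in: " << entry.path() << '\n';
        }
    }
}

A Sonar sensor would essentially wrap this logic and report the findings as issues instead of printing them.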

Prolog - Finding the current directory, relative directory for 'tell' predicate

I'm having trouble figuring out how to get Prolog to write a text file where I want it. I'm currently doing a bunch of operations and then using
tell('output.txt')
to record the output. The problem is that this creates the file in the SWI \bin\ folder. I was wondering if there's a way to make it create the file in the directory containing the actual .pl file, so that even if the source file is moved (and it will be), the text file gets created right where the source file is.
Long story short: is there a way to get the location of the source file once it has been consulted?
Many thanks!
You can get the names of all the loaded files using source_file/1.
From the SWI-Prolog manual:
source_file(?File)
True if File is a loaded Prolog source file. File is the absolute and
canonical path to the source-file.

What is the best way to edit the middle of an existing flat file?

I have tool that creates variables for a simulation. The current workflow involves hand copying those variables into the simulation input file. The input file is a standard flat file, i.e. not binary or XML. I would like to automate the addition of the variables to the flat input file.
The variables copy over existing variables in the file, e.g.
New Variables:
Length 10
Height 20
Depth 30
Old Variables:
...
Weight 100
Age 20
Length 10
Height 20
Depth 30
...
I would like the new variables to overwrite the old ones. They are about 200 lines into the flat input file.
Thanks for any insights.
P.S. This is on Windows.
If you're stuck with flat files, then you're stuck updating them the old-fashioned way: read from the original and write to a temp file, and for each row either write the original row or change the data and write that instead. To add data, write it to the temp file at the appropriate point; to delete data, simply don't copy it from the original file.
Finally, close both files and rename the temp file to the original file name.
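A minimal sketch of that approach, assuming whitespace-separated "Name Value" lines; the file names and the variable list here are taken from the example above, not from your actual tool:

// Copy-to-temp-and-rename sketch: replace known variables, copy the rest.
#include <cstdio>
#include <fstream>
#include <map>
#include <sstream>
#include <string>

int main()
{
    // New values that should replace the old ones in the input file.
    const std::map<std::string, std::string> newValues = {
        {"Length", "10"}, {"Height", "20"}, {"Depth", "30"}};

    std::ifstream in("simulation.inp");
    std::ofstream out("simulation.inp.tmp");

    std::string line;
    while (std::getline(in, line)) {
        std::istringstream fields(line);
        std::string name;
        fields >> name;
        auto it = newValues.find(name);
        if (it != newValues.end())
            out << name << ' ' << it->second << '\n';  // replace the value
        else
            out << line << '\n';                       // copy unchanged
    }
    in.close();
    out.close();

    // Replace the original with the updated copy.
    std::remove("simulation.inp");
    std::rename("simulation.inp.tmp", "simulation.inp");
}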
Alternatively, it might be time to think about a little database.
For something like this I'd be looking at a simple template engine. You'd have a base template with predefined marker tokens in place of the variable values; you then pass the required values to the engine along with the template, and it spits out the resulting file, all present and correct. There are a number of open-source template engines available in Java that would meet your needs, and I imagine such things are also available in your language of choice. You could even roll your own without too much difficulty.
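As a rough illustration of the roll-your-own variant, the sketch below substitutes {{Name}}-style markers in a template file; the marker syntax and file names are invented for the example, not prescribed by any particular engine:

// Tiny template substitution: replace every {{Key}} token with its value.
#include <fstream>
#include <map>
#include <sstream>
#include <string>

std::string renderTemplate(std::string text,
                           const std::map<std::string, std::string>& values)
{
    for (const auto& [key, value] : values) {
        const std::string token = "{{" + key + "}}";
        for (std::size_t pos = text.find(token); pos != std::string::npos;
             pos = text.find(token, pos + value.size()))
            text.replace(pos, token.size(), value);
    }
    return text;
}

int main()
{
    std::ifstream in("input.template");   // base template with {{...}} markers
    std::stringstream buf;
    buf << in.rdbuf();

    std::ofstream out("simulation.inp");  // rendered flat input file
    out << renderTemplate(buf.str(), {{"Length", "10"},
                                      {"Height", "20"},
                                      {"Depth", "30"}});
}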
Note that under Unix one would probably look at using mmap(), because you can then use functions such as memmove() to move the data around and add new data, or truncate() the result if the file ends up smaller (you may also want to use truncate() to grow the file).
Under MS-Windows you have the MapViewOfFileEx() function to do the same thing. The API is different, though, and I'm not exactly sure how to grow or shrink the file there (MSDN also documents a truncate()-like function, and maybe that works).
Of course, it's important to use memcpy() or memmove() properly so that you don't overwrite the wrong data or run outside the buffer.
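For the Unix side only, here is a minimal sketch of the mmap()/memmove()/ftruncate() idea, deleting count bytes starting at offset (error handling kept to a minimum; the Windows equivalent with MapViewOfFileEx() is not shown):

// POSIX-only sketch: delete `count` bytes starting at `offset` from `path`
// by mapping the file, shifting the tail down with memmove(), and truncating.
#include <cstring>
#include <fcntl.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

bool erase_range(const char* path, off_t offset, size_t count)
{
    int fd = open(path, O_RDWR);
    if (fd < 0) return false;

    struct stat st;
    if (fstat(fd, &st) < 0) { close(fd); return false; }
    size_t size = static_cast<size_t>(st.st_size);

    char* data = static_cast<char*>(
        mmap(nullptr, size, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0));
    if (data == MAP_FAILED) { close(fd); return false; }

    // Shift everything after the deleted range down over it.
    std::memmove(data + offset, data + offset + count, size - offset - count);

    munmap(data, size);
    ftruncate(fd, static_cast<off_t>(size - count));  // file is now smaller
    close(fd);
    return true;
}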
