External OMNeT++ Project as Feature

I have two OMNeT++ projects, A and B, and A currently needs B. Is it possible to declare B as a feature of A somehow, so I can turn it on and off? I want to keep the projects separate because B can be reused in other projects. I added a feature using a new .oppfeatures file in A and added corresponding #ifdef statements in the C++ code. Currently I am struggling with the NED file:
import namespaceB.B;

network Network
{
    parameters:
        [...]
        bool bDisabled = default(false);
    submodules:
        [...]
        b: B if !bDisabled;
}
How can I conditionally import the NED file of B? If I use a wildcard for the import, OMNeT++ complains about "no such module type" in the submodule declaration. Is my idea of having an external project as a feature possible at all? Any other idea how I can accomplish this (maybe a git submodule or something)?
Thanks!

Try to follow the pattern that INET uses for its TCP implementations. INET has its own TCP module (in the TCP_INET feature) and also two alternative implementations defined by the TCP_lwIP and TCP_NSC features. Take a look at the src/makefrag file to see the wizardry of adding compiler and linker flags based on whether a particular feature is enabled, e.g.:
WITH_TCP_LWIP := $(shell (cd .. && ./bin/inet_featuretool -q isenabled TCP_lwIP && echo enabled) )
ifeq ($(WITH_TCP_LWIP), enabled)
INCLUDE_PATH += -Iinet/transportlayer/tcp_lwip/lwip/include -Iinet/transportlayer/tcp_lwip/lwip/include/ipv4 -Iinet/transportlayer/tcp_lw
endif
This code checks whether the TCP_lwIP feature is enabled and adds the necessary include paths. You can also add linker flags if you have to link against the external project, as sketched below. (You can also add compiler and linker flags in the .oppfeatures file if you don't need anything complicated.)
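For the external-project case, a rough, untested sketch for project A's makefrag could follow the same pattern. Here the feature name WITH_B, the path variable B_PROJ and the library name are placeholders, and the exact variable names and the feature tool invocation depend on your OMNeT++ version and project layout:

# placeholder names; modeled on the INET snippet above
WITH_B := $(shell (cd .. && opp_featuretool isenabled WITH_B && echo enabled) )
ifeq ($(WITH_B), enabled)
# headers of project B
INCLUDE_PATH += -I$(B_PROJ)/src
# link against the library built by project B
LIBS += -L$(B_PROJ)/out/$(CONFIGNAME)/src -lB
endif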
Now, how to deal with the optional NED types? Also take a look at how the TCP module is defined in the StandardHost:
tcp: <default(firstAvailable("Tcp", "TcpLwip", "TcpNsc"))> like ITcp
As the given module is optional, you always have to have "something" that can be used in its place, at least a dummy module. There is an ITcp interface module (that is always present) and all the different TCP implementations "implement" it. What you see here is that tcp is a submodule that looks like the ITcp interface; during network setup, the actual implementation will be either Tcp, TcpLwip or TcpNsc, in this particular order, whichever is actually present in the NED type tree. By disabling a feature, we actually remove the given NED type from the NED path too (the feature excludes the NED package that contains the implementation).
from the OMNeT++ API (NED functions):
string firstAvailable(...)
Accepts any number of strings, interprets them as NED type names (qualified or unqualified), and returns the first one that exists and its C++ implementation class is also available. Throws an error if none of the types are available.
This NED function was specifically invented and added to OMNeT++ for this purpose (to scan for the presence of a given NED type).
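Translated to the question, a minimal, untested sketch could look like the following (IB and BDummy are hypothetical names). B in project B would have to declare "like IB", so the interface must live somewhere both projects can see, e.g. in project A or a small shared package:

// module interface, always present in the NED tree
moduleinterface IB
{
}

// dummy fallback in project A, used when the feature providing project B is disabled
module BDummy like IB
{
}

network Network
{
    submodules:
        // picks namespaceB.B if its NED package is available, otherwise the dummy
        b: <default(firstAvailable("namespaceB.B", "BDummy"))> like IB;
}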

Related

Compiling proto2 syntax file with proto3 compiler

I have a proto file written in proto2 syntax. I am compiling this proto file using the proto3 compiler. Although it builds successfully, it shows the following error at runtime. Can someone please help me?
[libprotobuf FATAL google/protobuf/extension_set.cc:102] Multiple extension registrations for type "x.y.z.a", field number 200.
terminate called after throwing an instance of 'google::protobuf::FatalException'
what(): Multiple extension registrations for type "x.y.z.a", field number 200.
The error indicates that, somehow, your program has two copies of the definition for this extension. This is probably not protoc's fault, but rather some bug in the way your program is being built.
Here's my theory: your proto file has been separately compiled and linked into two different components/libraries that are both then loaded into the same program. One of these components is yours, the other is someone else's that shares the same protocol. The other component was already using protobuf 3.5.1, but yours was using 2.3.0. This means you actually had two copies of libprotobuf in your program. Because of this, the two copies of the extension were loaded by different copies of libprotobuf, and therefore there was no error. But now you've switched your component to protobuf 3.5.1, so only one copy of libprotobuf is loaded, and both copies of the proto file are loaded into that one copy of libprotobuf. So now, you get an error.
To fix this, you need to make sure that your program contains exactly one compiled copy of every proto file. If two components need to share a protocol, then that protocol needs to be factored out into a separate component to be shared.
It sounds like you have a message x.y.z.a, and you have multiple places where you define an extension with id 200 for it.
So something like this:
package x.y.z;

message a {
  extensions 200 to 255;
}

extend a {
  optional int32 foo = 200;
}

extend a {
  optional int32 bar = 200;
}
So check for such duplicated extensions, which could be defined in multiple files.
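If that is the case, the fix is to give each extension its own field number within the declared range (or to factor the shared extension out into a single .proto that both places import). A sketch:

package x.y.z;

message a {
  extensions 200 to 255;
}

extend a {
  optional int32 foo = 200;
  optional int32 bar = 201;  // unique number, no longer clashes with foo
}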

How to check for a macro defined in a C file in a Makefile?

I have a #define ONB in a C file which (with several #ifndef...#endifs) changes many aspects of a program's behavior. Now I want to change the project makefile (or, even better, Makefile.am) so that if ONB is defined and some other options are set accordingly, it runs some special commands.
I searched the web but all I found was checking for environment variables... So is there a way to do this? Or must I change the C code to check for that in environment variables? (I prefer not changing the code because it is a really big project and I do not know everything about it.)
Questions: My reputation is insufficient to ask in comments, so I will have to ask here:
How and when is the define added to the target in the first place?
Do you essentially want a way to query the binaries after compilation to determine whether a particular define was used?
It would be helpful if you could give a concrete example, i.e. what are the special commands you want run, and what are the .c .h files involved?
Possible solution: Depending on what you need, you could use LLVM tools to generate and examine the AST of your code to see if a define is used. But this seems a little like over-engineering.
Possible solution: You could also use #include to pull in the .c or header files and have a conditional #error generated, or compile a probe file (to a .o) and check whether the compile fails to learn whether the macro is defined. But this has its own issues depending on how things are set up in your makefile.
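A rough sketch of that second idea, assuming GNU make and gcc/clang (check_onb.c and the included file name are placeholders):

/* check_onb.c -- tiny probe file; its compilation fails when ONB is missing */
#include "file_with_onb.c"   /* the C file (or header) that may define ONB */
#ifndef ONB
#error "ONB is not defined"
#endif

In the Makefile, let the result of compiling the probe drive a variable:

# "yes" when the probe compiles, i.e. when ONB is defined
ONB_DEFINED := $(shell $(CC) -fsyntax-only check_onb.c >/dev/null 2>&1 && echo yes)
ifeq ($(ONB_DEFINED),yes)
# placeholder for whatever the special commands should trigger
SPECIAL_STEPS = special-step
endif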

autoconf computed include file name

Would anybody know if there's any way of computing the name of an included file in autoconf?
I have a project capable of building one of several variants currently based on an identity defined in configure.ac - my aim is to be able to identify the variant from the CLI when autoconf/configure is run and include an m4 file if/as appropriate. AFAICT, only string literals are allowed as the filename in either the include or sinclude macro calls ... and it's now starting to drive me to distraction =:-O
Any help most gratefully received
DP
The identity is set up at configure runtime, not when configure is created. Any m4-level (e.g. m4sugar) manipulation of configure happens before then, so this doesn't look possible.

Windows Static Library with Default Functions

I would like to create a static library (.lib) in Windows that can be used in subsequent builds as a "backup" for undefined functions.
For instance let's say I have foobar.lib which has definitions for:
FOO
BAR
and I have some other program that defines FOO only and is to be built into a DLL that must export FOO and BAR. I want to be able to use foobar.lib to automatically export the default definition of BAR in the resulting DLL (and ignore the definition of FOO in foobar.lib).
I have tried passing foobar.lib to the linker but I get a multiply defined symbols error (/FORCE is supposed to override that, but with strong warnings that it probably won't work as expected). I've also tried using /NODEFAULTLIB:foobar.lib, but then it completely ignores the library and says BAR is undefined.
I am almost 100% certain there is a way to do this because I use an application that does this (Abaqus) to allow users to write plug-ins and not have to define all of the required exports for the plug-in DLL. And they do not use the /FORCE option.
I figured out a solution (not sure if it is the only or best solution).
I was trying to define FOO and BAR in foobar.lib using one object file (foobar.obj). If I split it up into foo.obj and bar.obj and then use those to create foobar.lib the linker can effectively ignore the appropriate .obj files.
So the short answer is: one function per object file for the static library.
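A rough sketch of such a build from the Visual Studio command prompt (file names are placeholders):

:: foo.c defines only FOO, bar.c defines only BAR
cl /c foo.c bar.c
:: bundle the one-function-per-file objects into the static library
lib /OUT:foobar.lib foo.obj bar.obj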

Can we add a specific CFLAG to an eCos package when a cdl_option is enabled?

Say we have a package named CYGPKG_FOO, which has a cdl_option CYGPKG_FOO_FEATURE_A_ENABLE. I want a specific gcc flag (e.g. "-DFEATURE_A=1") added to the CFLAGS of this package when this cdl_option is enabled.
But "The eCos Component Writer's Guide" said:
http://ecos.sourceware.org/docs-3.0/cdl-guide/build.make.html#BUILD.FLAGS
From the link above, it looks like we can't add/remove CFLAGS based on cdl_option selections...
So my question here is: can I do what was described in first paragraph, and if yes, how?
Thanks!
-DFEATURE_A=1 is just the same as writing #define FEATURE_A 1 in a source or header file.
When CYGPKG_FOO_FEATURE_A_ENABLE is set, this causes some preprocessor symbols to be defined in the autogenerated include files. Check out the install/include/pkgconf/ directory after running ecosconfig. You can add further define lines to your CDL to cause further symbols to be defined, if the defaults are not to your liking.
If you want to precisely control what values those symbols take, you may be able to do so with a further cdl_option, perhaps with a legal_values or a calculated directive; if not, you can add suitable defines in a header file inside your package which switch on the presence or absence of a preprocessor symbol.
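For example, an untested CDL sketch along those lines (note that the symbol still ends up as a #define in the generated pkgconf header rather than as a -D flag in CFLAGS):

cdl_option CYGPKG_FOO_FEATURE_A_ENABLE {
    display       "Enable feature A"
    flavor        bool
    default_value 0
    # when the option is enabled, also emit "#define FEATURE_A 1"
    # into this package's pkgconf header
    define        FEATURE_A
}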
