I want to propose a solution for building our large distributed control system. The current implementation is written in C++, and I need to rewrite it.
I have several questions:
The system should have a hot-plug feature; I don't know whether there are any OSGi implementations that support the C++ programming model.
Which ESB would be best for real-time performance and flexible routing, given that large volumes of messages will be transferred quickly between nodes?
Since integration is very important in our system, which MOM could be used to build my ESB under the same real-time and flexible-routing constraints?
Which open-source SCA implementation is suitable for the C++ programming model?
I'm eagerly looking forward to your answers!
Thanks very much!
If you require a C++ runtime, I would look at Trentino (http://trentino.sourceforge.net/), which is sponsored by Siemens.
There are a number of Java-based runtimes. One that supports dynamic deployment of contributions is Fabric3 (www.fabric3.org).
I'm working on a site which allows contractors to upload blueprints and place various assemblies on them for estimating purposes. I'd like to use an algorithm so that if they place an assembly on a blueprint and then click on it, they can see the other assemblies most commonly used with that one (e.g. if they place a breaker box, it could show that 50' of electrical wire is commonly also used), based on data from all the blueprints currently saved in the system.
Based on the research I've done, it appears that affinity analysis would be the best way to determine which other assemblies are most commonly used with each assembly. I was wondering if anyone can provide feedback on whether this is the best algorithm to use or if there is a better one. All the blueprint/assembly data is stored in an MS SQL database, and the backend of the website is written in C#, with the data provided to the site via a REST service.
Any information or suggestions would be greatly appreciated.
Thank you in advance.
"counting"
The problem, as you described it, does not require any particular algorithm. It is a simple counting operation and can be performed by any decent database using COUNT(), GROUP BY, and WHERE.
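Just to make the counting concrete, here is a minimal sketch of such a co-occurrence query. It is shown in Java/JDBC rather than the site's C# backend, and the table and column names (BlueprintAssembly, BlueprintId, AssemblyId) are assumptions for illustration, not the actual schema:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class RelatedAssemblies {

    // Hypothetical schema: BlueprintAssembly(BlueprintId, AssemblyId),
    // one row per assembly placed on a blueprint.
    private static final String SQL =
        "SELECT other.AssemblyId, COUNT(*) AS occurrences " +
        "FROM BlueprintAssembly placed " +
        "JOIN BlueprintAssembly other ON other.BlueprintId = placed.BlueprintId " +
        "WHERE placed.AssemblyId = ? AND other.AssemblyId <> ? " +
        "GROUP BY other.AssemblyId " +
        "ORDER BY occurrences DESC";

    // Prints the assemblies that most often appear on the same blueprints
    // as the given assembly, most frequent first.
    public static void printCommonlyUsedWith(Connection conn, int assemblyId) throws Exception {
        try (PreparedStatement ps = conn.prepareStatement(SQL)) {
            ps.setInt(1, assemblyId);
            ps.setInt(2, assemblyId);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.printf("assembly %d: used together %d times%n",
                            rs.getInt("AssemblyId"), rs.getInt("occurrences"));
                }
            }
        }
    }
}
```

The same self-join and GROUP BY would work directly from the C# backend or as a view/stored procedure behind the REST service.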
Is it good application architecture if
I use multiple technologies, leveraging the strong points of each?
For example:
encryption in Python,
integration of services in Java, etc.
Or should I stick to one technology, like Java, since I am comfortable with it?
Also, the reason for this question is that I am thinking of developing a new application in which speed is a major concern I am targeting.
The database I prefer for now is MongoDB.
Any suggestions for technologies apart from these?
And will this approach help speed up the application?
Writing the main application in one language only is an easier approach than dividing your application and attempting to write each piece in the language best suited for the task, unless you are fluent in several languages and the ones chosen are particularly suited to the specific groups of tasks that make up the functionality.
Because MongoDB has a Java driver, there's nothing wrong with writing your main application in Java and relying on libraries written in other languages (MongoDB itself is written in C++, C, and JavaScript).
As long as the other components you rely on are well maintained, there's no reason to switch from your preferred language to match what any of your libraries are using.
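For what it's worth, here is a minimal sketch of using the official MongoDB Java driver from a Java application; the connection string and the database/collection names are assumptions for illustration only:

```java
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.MongoDatabase;
import org.bson.Document;

public class MongoQuickstart {
    public static void main(String[] args) {
        // Connect to a local MongoDB instance (connection string is an assumption).
        try (MongoClient client = MongoClients.create("mongodb://localhost:27017")) {
            MongoDatabase db = client.getDatabase("myapp");
            MongoCollection<Document> users = db.getCollection("users");

            // Insert one document, then read it back.
            users.insertOne(new Document("name", "alice").append("age", 30));
            Document found = users.find(new Document("name", "alice")).first();
            System.out.println(found == null ? "not found" : found.toJson());
        }
    }
}
```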
If you add artificial intelligence to your program in the future and part of the code has to run on a GPU, you are forced to have a hybrid program; learning a new language along with the details of the underlying algorithms is certainly more of a burden than learning an API.
Decide where to draw the line: what you will write in your preferred language and what will be written by others. It's certainly better to choose libraries and programs that you interface with that are written in languages you understand (assuming they are open source). If what you interface with has no source available, it becomes a 'black box' which simply must work; there are occasions when that is acceptable and occasions when there is no choice.
I am working on a project where I'd like to develop some static source code analysis tools. The source code will be in multiple proprietary languages that interact with one another. So I am looking for a project that defines an abstract model/AST and can do data-flow analysis, so that I can translate each proprietary language into the model and analyze the data flow/tree.
Does such a project exist?
Not open source, but designed and proven useful for building tools to handle multiple, complex languages: our DMS Software Reengineering Toolkit.
DMS contains strong parsing machinery (capable of handling difficult languages such as C++) that builds ASTs automatically from just a grammar description, plus libraries to support construction of symbol tables and various kinds of control- and data-flow analysis.
OP will have to provide grammar and semantic descriptions of his proprietary languages, but I think he is expecting that. If he wants to model flows across the languages, he'll have to organize his flow analyses for the individual languages to be compatible. The fact that DMS uses uniform infrastructure/data structures to support all these activities, even for different languages, will make this easier.
He should not expect a project involving multiple languages to be easy or quick, regardless of the framework he finds. Our intention with DMS was to make this practical.
I think the Object Management Group's (OMG) Specification for the Knowledge Discovery Metamodel (KDM) is kind of in the space you're looking for. (See http://www.omg.org/spec/KDM/). It's part of the Architecture Driven Modernization (ADM) activity at the OMG. KDM has been republished by ISO as ISO/IEC 19506:2012(E).
From the introduction:
This International Standard defines a meta-model for representing existing software assets, their associations, and operational environments, referred to as the Knowledge Discovery Meta-model (KDM).
You'll likely have to do most of the heavy lifting yourself, but at least the metamodel has been provided.
More as a side note: if you are not too interested in syntactic details and have a free choice of platform, you might as well analyze code for a VM, such as .NET bytecode.
There are compilers for C#, F#, C++/CLI, and Visual Basic (most of them, of course, from a well-known, large software company :-) ).
They all compile to bytecode programs, which can be inspected by tools like Mono.Cecil, which allow you to construct control-flow graphs, etc.
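I can't speak for Mono.Cecil in detail, but the same idea carries over to other VMs. As a rough sketch on the JVM (using the ASM library, which is my substitution for this example, not something the answer mentioned), this walks the bytecode of an already-compiled class and lists the calls each method makes, with no source code involved:

```java
import org.objectweb.asm.ClassReader;
import org.objectweb.asm.ClassVisitor;
import org.objectweb.asm.MethodVisitor;
import org.objectweb.asm.Opcodes;

// Lists every method of a compiled class and the method calls it contains,
// working purely from bytecode.
public class CallLister {
    public static void main(String[] args) throws Exception {
        ClassReader reader = new ClassReader("java.lang.String");
        reader.accept(new ClassVisitor(Opcodes.ASM9) {
            @Override
            public MethodVisitor visitMethod(int access, String name, String descriptor,
                                             String signature, String[] exceptions) {
                System.out.println("method: " + name + descriptor);
                return new MethodVisitor(Opcodes.ASM9) {
                    @Override
                    public void visitMethodInsn(int opcode, String owner, String calledName,
                                                String calledDescriptor, boolean isInterface) {
                        System.out.println("  calls " + owner + "." + calledName);
                    }
                };
            }
        }, 0);
    }
}
```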
I'm a fairly advanced hobby programmer. I consider myself capable at Objective-C, Java, some straight C, Python, and general MVC design.
I've written quite a few programs but they have all been relatively self-contained, using external libraries occasionally.
When reading about larger projects and/or more complicated programs, I hear a lot of talk along the lines of "write one part in X and this part in Y."
Since I have a lack of experience with this, I was wondering if someone could point me in the right direction. What general designs/mechanisms are employed for applications or projects written in more than one language? What is involved in a "scriptable" design?
Thanks for any guidance on the topic!
-Chase
There is no single "right way". A multitude of approaches exist, including the .NET way, where all the languages are hosted inside a common runtime environment with well-specified interoperability constraints, and the good old Unix way, where all the components communicate via pipes or sockets using simple text-based protocols.
For the latter you can read a classic book: http://en.wikipedia.org/wiki/The_Unix_Programming_Environment
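As a tiny illustration of the Unix way, here is a sketch of a line-oriented filter; a component written like this, in any language, can be combined with others through pipes, e.g. `cat input.txt | java UppercaseFilter | sort`:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

// A minimal Unix-style filter: read lines from stdin, write transformed
// lines to stdout, so other programs can be chained before or after it.
public class UppercaseFilter {
    public static void main(String[] args) throws Exception {
        BufferedReader in = new BufferedReader(new InputStreamReader(System.in));
        String line;
        while ((line = in.readLine()) != null) {
            System.out.println(line.toUpperCase());
        }
    }
}
```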
It depends on what you need to do. For example, if you want to build an online poker game, then most probably you would use Java for the application and Flash/Flex for the interface. Java has the power of the libraries, and Flash/Flex is widely available and offers a rich interface.
If you have software that receives input from an online application and produces output on a specific device (a label printer, for example), then your online-facing software (Java/PHP/Python) would best communicate with a specially designed program on the target computer. For that program I'd use C++ for its technical power, rigor, and speed compared to Java.
The idea is to identify the languages that suit your purpose best. In my opinion it is ideal to use one language for everything, which is why I like Java, as it seems to fit everything, although it has something of a reputation for slowness.
I see things roughly this way:
1. Engineered, machine-oriented stuff: C++ (and languages of its kind)
2. Mobile, multifunctional stuff (mainly middleware): Java
3. Online, browser-based stuff: PHP, especially for B2C (people-oriented) applications
4. Python, Ruby, etc. are, from my point of view, somewhere between Java and PHP, but I have never really worked with them, so I cannot give an exact opinion
You can link them together depending on your needs.
Are there any client-server frameworks similar to SETI available?
I have such a client-server model, where volunteers sign up as clients (agents or nodes, call them what you will) and donate their idle computing resources.
So I will need to write a framework to distribute and track the work units (or jobs) given to agents.
Is there any such framework available that I could use? Then I could save the time for writing the job-processing logic, etc.
Further, I hope the framework will also handle OS compatibility issues, agent binary updates, etc.
Please also give any other general suggestions about such distributed computing projects that you think I should investigate.
Look at BOINC, which is a general framework for handling SETI-style projects.
Edit to expand: in fact, IIRC, BOINC is a spinoff of SETI@home. It will probably handle all of your requirements.