How do I get "xmi:type" value - acceleo

I am trying to generate a Java program that implements a state machine from a UML model using Acceleo.
In my model I have entries like:
<subvertex xmi:type="uml:State" xmi:id="{BB1999-E740-4e7d-A1BE-F099BEXYD970}" name="WaitingApproval">
I want to check the value of "xmi:type", but I cannot work out how to access it from Acceleo. (I have tried every combination of getters I can think of, and the type only appears as part of a longer string if I dump the whole vertex.)

If you are on the subvertex reference, you must be on a Region. The xmi:type attribute is the way XMI handles polymorphic references: since subvertex is defined as Vertex [*], XMI must specify the concrete type of each element in the collection. To check this, you simply need to test the type of the element (using oclIsTypeOf or oclIsKindOf).
So, from a Region:
[template public test(r : Region)]
[r.subvertex->filter(State)/] --> filters all States from the subvertex collection
which is equivalent to
[r.subvertex->select(oclIsKindOf(State))/]
and, if you want only State elements themselves (no subclasses),
[r.subvertex->select(oclIsTypeOf(State))/]
[/template]
You can also handle them in different templates by adding a template guard:
[template public test(r : Region)]
[r.subvertex.test2()/]
[/template]
[template public test2(s : Vertex) ? (oclIsKindOf(State))]
[s/] is a state for sure
[/template]
You can also avoid the guard by rewriting the above templates like this:
[template public test(r : Region)]
[r.subvertex.test2()/]
[/template]
[template public test2(v : Vertex)/]
[template public test2(s : State)]
[s/] is a state for sure
[/template]
EDIT
If you absolutely want the type value as a String, you have to check the element's metaclass and ask for its name:
...
[s.eClass().name/] --> the result is a String; s.eClass() gets the EClass
...

Related

What's the usage of org.springframework.data.repository.query.parser.Part?

As you can see from the title, I'd appreciate it if somebody could explain the usage of this class.
There's an inner enum Type; how is it used?
public static enum Type {

    BETWEEN(2, "IsBetween", "Between"), IS_NOT_NULL(0, "IsNotNull", "NotNull"),
    IS_NULL(0, "IsNull", "Null"), LESS_THAN("IsLessThan", "LessThan"),
    LESS_THAN_EQUAL("IsLessThanEqual", "LessThanEqual"),
    GREATER_THAN("IsGreaterThan", "GreaterThan"),
    GREATER_THAN_EQUAL("IsGreaterThanEqual", "GreaterThanEqual"),
    BEFORE("IsBefore", "Before"), AFTER("IsAfter", "After"),
    NOT_LIKE("IsNotLike", "NotLike"), LIKE("IsLike", "Like"),
    STARTING_WITH("IsStartingWith", "StartingWith", "StartsWith"),
    ENDING_WITH("IsEndingWith", "EndingWith", "EndsWith"),
    NOT_CONTAINING("IsNotContaining", "NotContaining", "NotContains"),
    CONTAINING("IsContaining", "Containing", "Contains"),
    NOT_IN("IsNotIn", "NotIn"), IN("IsIn", "In"),
    NEAR("IsNear", "Near"), WITHIN("IsWithin", "Within"),
    REGEX("MatchesRegex", "Matches", "Regex"),
    EXISTS(0, "Exists"), TRUE(0, "IsTrue", "True"), FALSE(0, "IsFalse", "False"),
    NEGATING_SIMPLE_PROPERTY("IsNot", "Not"), SIMPLE_PROPERTY("Is", "Equals");

    // Need to list them again explicitly as the order is important
    // (esp. for IS_NULL, IS_NOT_NULL)
    private static final List<Part.Type> ALL = Arrays.asList(
        IS_NOT_NULL, IS_NULL, BETWEEN, LESS_THAN, LESS_THAN_EQUAL,
        GREATER_THAN, GREATER_THAN_EQUAL, BEFORE, AFTER, NOT_LIKE, LIKE,
        STARTING_WITH, ENDING_WITH, NOT_CONTAINING, CONTAINING, NOT_IN, IN,
        NEAR, WITHIN, REGEX, EXISTS, TRUE, FALSE,
        NEGATING_SIMPLE_PROPERTY, SIMPLE_PROPERTY);

    ...
}
Part is internal to Spring Data. It is not intended to be used by client code, so unless you are implementing your own Spring Data module you shouldn't use it, or anything inside it, at all.
A Part is basically an element of an AST that will typically end up as an element of a WHERE clause, or its equivalent depending on the store in use.
E.g. if you have a method findByNameAndDobBetween(String, Date, Date), parsing the method name will result in two parts: one for the name condition and one for the DOB-between condition.
The Type enum lists all the different kinds of conditions that are possible.
The constructor arguments of each constant are the number of method arguments the condition requires and one or more Strings that identify this type inside a method name.
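To make that concrete, here is a minimal sketch (not from the original answer) that watches the parsing from the outside via PartTree, the class in the same package that actually produces Parts. The Person domain class and its properties are assumptions for illustration only.
import org.springframework.data.repository.query.parser.Part;
import org.springframework.data.repository.query.parser.PartTree;

public class PartTreeDemo {

    // Hypothetical domain class, only here so the property names resolve.
    static class Person {
        String name;
        java.util.Date dob;
    }

    public static void main(String[] args) {
        // Parse the derived query method name into its conditions (Parts).
        PartTree tree = new PartTree("findByNameAndDobBetween", Person.class);
        for (Part part : tree.getParts()) {
            // Expected output (roughly): "name SIMPLE_PROPERTY (1 args)"
            //                            "dob BETWEEN (2 args)"
            System.out.println(part.getProperty().toDotPath() + " "
                    + part.getType() + " (" + part.getNumberOfArguments() + " args)");
        }
    }
}
The SIMPLE_PROPERTY and BETWEEN values printed here are exactly the Type constants listed in the enum above, which is why the number of required arguments is part of each constant's definition.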

How to avoid caching or concurrency issues in Telerik Data Access ORM?

I want to calculate the average PM2.5 concentration over a date range for each city through a stored procedure. The first input parameter of the stored procedure is a city-name string like Beijing,NewYork, and the other input parameters are beginTime and endTime. The output class is called AvgPM25.
public class AvgPM25 {
    public String CityName { get; set; }
    public decimal AvgValue { get; set; }
}
I pass parameters like Beijing,NewYork,20140801,20140802 and it calculates and outputs the result. My problem is that sometimes when I change the parameters, e.g. the search time, so the input becomes Beijing,NewYork,20130801,20140802, it still outputs the previous search result. Is this problem due to caching or concurrency? I have tried setting the identity of the AvgPM25 class's CityName property to True, setting the cache policy to NoCache, and setting the concurrency mode to Changed, but I still have this problem. How can I fix it?
Although you change the search time, the identity is not changed. Changing the output class as below may help, or you can try setting identity to True on the AvgPM25 class's properties:
public class AvgPM25 {
    // identity: true, cache: default
    public String CityName { get; set; }
    public decimal AvgValue { get; set; }
    // identity: true, cache: default; combined from the two time strings, e.g. "2013080120140802"
    public string SearchTime { get; set; }
}

How to declare a value type in CIL: `.class value` or just `.class`?

I have taken a look at a C# struct FooStruct in ILDASM, and have seen the following:
ILDASM here displays two differing declarations:
one starting with .class value public (rear window & front window's title bar)
one starting with just .class public (front window)
And I wonder which syntax (if not both) is the correct one for declaring a value type? Is the value modifier strictly necessary, or optional, or a syntax error?
Short answer: Value type definitions only require extends [mscorlib]System.ValueType; the value attribute appears to be optional and has no apparent effect.
I assume that the CLI specification (ECMA-335) would be the best place to look for an authoritative answer.
MUST a value type definition include the value attribute?
Section II.10 deals with defining types. More specifically, subsection II.10.1.3 states:
The type semantic attributes specify whether an interface, class, or value type shall be defined. … If [the interface] attribute is not present and the definition extends
(directly or indirectly) System.ValueType, and the definition is not for System.Enum, a value type shall be defined (§II.13). Otherwise, a class shall be defined (§II.11).
The value attribute is not mentioned at all in the whole section.
Conclusion: A correct value type definition does not have to include value. Deriving from System.ValueType is sufficient.
MAY a value type definition include the value modifier?
The CLI standard also contains a grammar for ILASM (in section VI.C.3). According to that grammar, there exists a value attribute for .class type definitions. I additionally searched the standard for concrete value type definitions and found these examples:
.class public sequential ansi serializable sealed beforefieldinit System.Double extends System.ValueType …
.class private sealed Rational extends [mscorlib]System.ValueType …
.class value sealed public MyClass extends [mscorlib]System.ValueType …
Conclusion: A value attribute may be included in a value type definition.
And what does the value attribute MEAN?
I tried to compile these three IL type definitions into an assembly:
.class public sealed … A extends [mscorlib]System.ValueType { … }
.class value public sealed … B extends [mscorlib]System.ValueType { … }
.class value public sealed … C extends [mscorlib]System.Object { … } // !!!
There was no compilation error, even though the value attribute is used in a reference type declaration (see last line). Looking at the resulting assembly using Visual Studio 2012's Object Browser reveals two value types (struct) A and B, and one reference type (class) C.
Speculation: The presence of the value attribute has no effect whatsoever on the type definition. It is only there as a potential aid for humans in spotting value type definitions.
This great book contains a simple answer: when you provide an extends clause, the value flag is ignored; but if you don't provide extends and do use value, then ilasm will declare the given type as a value type.
In other words, value was introduced as syntactic sugar to quickly declare a value type.

Multiple Custom Writable formats

I have multiple input sources, and I have used Sqoop's codegen tool to generate a custom class for each input source:
public class SQOOP_REC1 extends SqoopRecord implements DBWritable, Writable
public class SQOOP_REC2 extends SqoopRecord implements DBWritable, Writable
On the Map side, based on the input source, I create objects of the above 2 classes accordingly.
I have the key as type "Text" and since I have 2 different types of values, I kept the value output type as "Writable".
On the reduce side, I accept the value type as Writable.
public class SkeletonReduce extends Reducer<Text, Writable, Text, Text> {
    public void reduce(Text key, Iterable<Writable> values, Context context)
            throws IOException, InterruptedException {
    }
}
I also set
job.setMapOutputValueClass(Writable.class);
During execution, it does not enter the reduce function at all.
Could someone tell me if it is possible to do this? If so, what am I doing wrong?
You can't specify Writable as your map output value type; it has to be a concrete type. All records need to have the same (concrete) key and value types, in Mappers and Reducers. If you need different types, you can create some kind of hybrid Writable that contains either an "A" or a "B" inside. It's a little ugly, but it works and is done a lot in Mahout, for example.
But I don't know why any of this would make the reducer not run; that is likely something quite separate and not answerable from this information.
Look into extending GenericWritable for your value type. You need to define the set of classes which are allowed (SQOOP_REC1 and SQOOP_REC2 in your case). It's not as efficient because it creates new object instances in the readFields method (but you can override this if you have a small set of classes: just keep instance variables of both types and a flag which denotes which one is valid).
http://hadoop.apache.org/common/docs/r0.20.1/api/org/apache/hadoop/io/GenericWritable.html
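For what it's worth, a minimal sketch of such a GenericWritable subclass could look like the following; it assumes the Sqoop-generated SQOOP_REC1/SQOOP_REC2 classes from the question and is a sketch, not a drop-in implementation.
import org.apache.hadoop.io.GenericWritable;
import org.apache.hadoop.io.Writable;

// Wraps either SQOOP_REC1 or SQOOP_REC2 so both can travel as one map output value type.
public class SqoopRecordWrapper extends GenericWritable {

    @SuppressWarnings("unchecked")
    private static final Class<? extends Writable>[] TYPES = new Class[] {
            SQOOP_REC1.class,
            SQOOP_REC2.class
    };

    @Override
    protected Class<? extends Writable>[] getTypes() {
        return TYPES;
    }
}
In the mapper you would then wrap each record with set(...), emit the wrapper, declare job.setMapOutputValueClass(SqoopRecordWrapper.class), and in the reducer call get() and check the concrete type with instanceof.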
OK, I think I figured out how to do this, based on a suggestion given by Doug Cutting himself:
http://grokbase.com/t/hadoop/common-user/083gzhd6zd/multiple-output-value-classes
I wrapped the class using ObjectWritable
ObjectWritable obj = new ObjectWritable(SQOOP_REC2.class, sqoop_rec2);
And then on the reduce side, I can get the name of the wrapped class and cast it back to the original class.
if (val.getDeclaredClass().getName().equals("SQOOP_REC2")) {
    SQOOP_REC2 temp = (SQOOP_REC2) val.get();
    ...
}
And don't forget
job.setMapOutputValueClass(ObjectWritable.class);
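Pulling those pieces together, here is a hedged sketch of what the reduce side could look like with ObjectWritable values; the handling of SQOOP_REC1/SQOOP_REC2 is left as comments since their fields aren't shown in the question.
import java.io.IOException;

import org.apache.hadoop.io.ObjectWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

public class SkeletonReduce extends Reducer<Text, ObjectWritable, Text, Text> {

    @Override
    public void reduce(Text key, Iterable<ObjectWritable> values, Context context)
            throws IOException, InterruptedException {
        for (ObjectWritable val : values) {
            // Inspect the wrapped class and cast back to the original type.
            if (val.getDeclaredClass() == SQOOP_REC1.class) {
                SQOOP_REC1 rec1 = (SQOOP_REC1) val.get();
                // ... handle records from the first input source
            } else if (val.getDeclaredClass() == SQOOP_REC2.class) {
                SQOOP_REC2 rec2 = (SQOOP_REC2) val.get();
                // ... handle records from the second input source
            }
        }
    }
}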

Gson, How to write a JsonDeserializer for Generic Typed Classes?

Situation
I have a class that holds a generic type, and it also has a non-zero arg constructor. I don't want to expose a zero arg constructor because it can lead to erroneous data.
public class Geometries<T extends AbstractGeometry> {
    private final GeometryType geometryType;
    private Collection<T> geometries;

    public Geometries(Class<T> classOfT) {
        this.geometryType = lookup(classOfT); // strict typing
    }
}
There are several (known and final) classes that may extend AbstractGeometry.
public final class Point extends AbstractGeometry { .... }
public final class Polygon extends AbstractGeometry { .... }
Example json:
{
    "geometryType" : "point",
    "geometries" : [
        { ...contents differ... hence AbstractGeometry },
        { ...contents differ... hence AbstractGeometry },
        { ...contents differ... hence AbstractGeometry }
    ]
}
Question
How can I write a JsonDeserializer that will deserialize a generic typed class (such as Geometries)?
CHEERS :)
p.s. I don't believe I need a JsonSerializer, this should work out of the box :)
Note: This answer was based on the first version of the question. The edits and subsequent question(s) change things.
p.s. I don't believe I need a JsonSerializer, this should work out of the box :)
That's not the case at all. The JSON example you posted does not match the Java class structure you apparently want to bind to and generate.
If you want JSON like that from Java like that, you'll definitely need custom serialization processing.
The JSON structure is
an object with two elements
element 1 is a string named "geometryType"
element 2 is an object named "geometries", with differing elements based on type
The Java structure is
an object with two fields
field 1, named "geometryType", is a complex type GeometryType
field 2, named "geometries" is a Collection of AbstractGeometry objects
Major Differences:
JSON string does not match Java type GeometryType
JSON object does not match Java type Collection
Given this Java structure, a matching JSON structure would be
an object with two elements
element 1, named "geometryType", is a complex object, with elements matching the fields in GeometryType
element 2, named "geometries", is a collection of objects, where the elements of the different objects in the collection differ based on specific AbstractGeometry types
Are you sure that what you posted is really what you intended? I'm guessing that either or both of the structures should be changed.
Regarding any question on polymorphic deserialization, please note that the issue has been discussed a few times on StackOverflow already. I posted links to four such questions and answers (some with code examples) at "Can I instantiate a superclass and have a particular subclass be instantiated based on the parameters supplied".
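For completeness, here is a hedged sketch (not from the original answer) of what a custom JsonDeserializer for the posted Geometries class could look like. It assumes the JSON keeps "geometries" as an array, and it introduces a hypothetical typeFor lookup and an assumed add(...) method on Geometries, neither of which appears in the posted class.
import java.lang.reflect.Type;

import com.google.gson.JsonDeserializationContext;
import com.google.gson.JsonDeserializer;
import com.google.gson.JsonElement;
import com.google.gson.JsonObject;
import com.google.gson.JsonParseException;

public class GeometriesDeserializer implements JsonDeserializer<Geometries<AbstractGeometry>> {

    @Override
    public Geometries<AbstractGeometry> deserialize(JsonElement json, Type typeOfT,
            JsonDeserializationContext context) throws JsonParseException {
        JsonObject root = json.getAsJsonObject();

        // Resolve the concrete element type from the "geometryType" string.
        Class<AbstractGeometry> concreteType = typeFor(root.get("geometryType").getAsString());

        Geometries<AbstractGeometry> result = new Geometries<>(concreteType);
        for (JsonElement element : root.getAsJsonArray("geometries")) {
            // Delegate each element to Gson using the concrete type.
            result.add(context.deserialize(element, concreteType)); // assumes an add(...) method
        }
        return result;
    }

    @SuppressWarnings("unchecked")
    private static Class<AbstractGeometry> typeFor(String name) {
        // Hypothetical mapping; the real lookup(...) from the question is not shown.
        switch (name) {
            case "point":   return (Class<AbstractGeometry>) (Class<?>) Point.class;
            case "polygon": return (Class<AbstractGeometry>) (Class<?>) Polygon.class;
            default: throw new JsonParseException("Unknown geometryType: " + name);
        }
    }
}
You would register it with Gson's public API, roughly new GsonBuilder().registerTypeAdapter(Geometries.class, new GeometriesDeserializer()).create(), and adjust the generics and collection handling to whatever Geometries actually exposes.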
