How to export a GMF diagram outside Eclipse? - Papyrus

One of the features of Papyrus that I find really useful is the ability to programmatically interrogate the UML models that it creates by using the UML2 runtime outside the Eclipse UI. This is great for running simple tools to, e.g., produce documentation using POI or write model-driven configuration for the Talend MDM tool. However, while traversing and processing the model tree is easily achieved by loading up the resources in a resource set, manipulating the diagrams in the .notation files has proven to be more of a challenge.
I have got to the point (by examining the source for org.eclipse.papyrus.infra.export.ExportAllDiagrams) where I can load all the resources and find the Diagram elements from the .notation file thus:
File uml = new File(model + ".uml");
File di = new File(model + ".di");
File notation = new File(model + ".notation");
URI umlUri = URI.createFileURI(uml.getAbsolutePath());
URI diUri = URI.createFileURI(di.getAbsolutePath());
URI notationUri = URI.createFileURI(notation.getAbsolutePath());

final ModelSet resourceSet = new ModelSet();
resourceSet.getPackageRegistry().put(UMLPackage.eNS_URI, UMLPackage.eINSTANCE);
resourceSet.getPackageRegistry().put(NotationPackage.eNS_URI, NotationPackage.eINSTANCE);
resourceSet.getResourceFactoryRegistry().getExtensionToFactoryMap().put(Resource.Factory.Registry.DEFAULT_EXTENSION, new XMIResourceFactoryImpl());
resourceSet.getResourceFactoryRegistry().getExtensionToFactoryMap().put("notation", new XMIResourceFactoryImpl());
resourceSet.getResourceFactoryRegistry().getExtensionToFactoryMap().put(UMLResource.FILE_EXTENSION, UMLResource.Factory.INSTANCE);

try {
    resourceSet.getLoadOptions().put(XMLResource.OPTION_DEFER_IDREF_RESOLUTION, true);
    resourceSet.getLoadOptions().put(XMLResource.OPTION_DEFER_ATTACHMENT, true);
    resourceSet.getResource(diUri, true);
    resourceSet.getResource(umlUri, true);
    resourceSet.getResource(notationUri, true);

    List<Diagram> diagrams = new ArrayList<Diagram>();
    for (Iterator<Notifier> i = resourceSet.getAllContents(); i.hasNext();) {
        Notifier n = i.next();
        if (n instanceof Diagram) {
            diagrams.add((Diagram) n);
        }
    }
    //export(diagrams);
} finally {
    // Unload the resource set so that we don't leak loads of UML content in the CacheAdapter
    unload(resourceSet);
}
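The unload helper called in the finally block isn't shown in the question; a minimal version (an assumption, not the original code) might look like this:
private static void unload(ResourceSet resourceSet) {
    // Unload each resource, then clear the set, so the shared CacheAdapter
    // doesn't keep the loaded UML content alive.
    for (Resource resource : resourceSet.getResources()) {
        resource.unload();
    }
    resourceSet.getResources().clear();
}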
However, the ExportAllDiagrams class ultimately uses org.eclipse.gmf.runtime.diagram.ui.render.util.CopyToImageUtil to render the diagram, at which point it fails because it relies on the DiagramUIRenderPlugin and DiagramUIRenderPlugin.getInstance() returns null.
I then had a look at org.eclipse.gmf.runtime.diagram.ui.render.clipboard.DiagramSVGGenerator but had similar problems with the need for various Eclipse plugins to be initialised.
I have no experience of the Eclipse plugin system, but I am assuming that the platform loads and initialises plugins and that, therefore, the approaches tried so far need to be running within the Eclipse GUI environment in order to work. Is there any other method that could be used to easily render the diagrams to SVG without relying on the whole of the Eclipse runtime?

Related

How does one configure VSTS specific load test context parameters for own azure agents intelligently

We recently moved our load test agents from AWS to Azure, and are thus making the transition to full use of VSTS.
It was described that, for the moment, to get a load test file working with VSTS using our own VMs for testing, we need to provide two context parameters, UseStaticLoadAgents and StaticAgentsGroupName, in each .loadtest file.
Our load test solution is getting very large, and we have multiple .loadtest files where we have to set these two values each time. This leads us into the situation where, if we were to change our agent group name for example, we would have to update each individual load test file with the new information.
I'm looking at a way to centralise this until a nicer way is implemented by Microsoft. The idea was to use a load test plugin to add these context parameters, with the plugin drawing the needed values from a centralised config file.
However, it seems that neither the hooks in the load test plugin nor simply using the initialise method to manually set these values is working, likely because they are set after full initialisation.
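A minimal sketch of that plugin approach, assuming the standard ILoadTestPlugin interface and the LoadTest.Context dictionary (the class name and values here are placeholders; in reality they would come from the central config file):
using Microsoft.VisualStudio.TestTools.LoadTesting;

public class StaticAgentContextPlugin : ILoadTestPlugin
{
    public void Initialize(LoadTest loadTest)
    {
        // Placeholder values; as described above, values set here appear
        // to be overwritten after full initialisation, so this approach
        // does not stick for these two keys.
        loadTest.Context["UseStaticLoadAgents"] = "true";
        loadTest.Context["StaticAgentsGroupName"] = "MyAgentGroup";
    }
}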
Has anyone got a nice, code-focused solution to manage this and stop us depending on adding brittle values in the editor? Or has anyone even got the above approach to work?
The .loadtest file is an XML file, so you can update it programmatically, for example:
string filePath = #"XXX\LoadTest1.loadtest";
XmlDocument doc = new XmlDocument();
doc.Load(filePath);
XmlNamespaceManager nsmgr = new XmlNamespaceManager(doc.NameTable);
nsmgr.AddNamespace("ns", "http://microsoft.com/schemas/VisualStudio/TeamTest/2010");
XmlNode root = doc.DocumentElement;
XmlNode nodeParameters = root.SelectSingleNode("//ns:RunConfigurations/ns:RunConfiguration[#Name='Run Settings1']/ns:ContextParameters", nsmgr);
if(nodeParameters!=null)
{
//nodeParameters.SelectSingleNode("//ns:ContextParameter[#Name='UseStaticLoadAgents']").Value = "agent1";
foreach (XmlNode n in nodeParameters.ChildNodes)
{
switch (n.Attributes["Name"].Value)
{
case "Parameter1":
n.Attributes["Value"].Value = "testUpdate";
break;
case "UseStaticLoadAgents":
n.Attributes["Value"].Value = "agent1";
break;
case "StaticAgentsGroupName":
n.Attributes["Value"].Value = "group1";
break;
}
}
}
doc.Save(filePath);
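Since the solution contains many .loadtest files, the same edit can be applied to all of them in one pass. A small sketch (UpdateContextParameters is a hypothetical helper wrapping the XmlDocument logic above; the root path is a placeholder):
// Requires System.IO; applies the update to every .loadtest file under the root.
foreach (string file in Directory.EnumerateFiles(@"XXX", "*.loadtest", SearchOption.AllDirectories))
{
    UpdateContextParameters(file); // hypothetical helper wrapping the code above
}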

How do I validate XML against XSD (separate documents) in DNX Core 5.0 (ASP.NET 5)?

I am porting some code to ASP.NET 5, and want to target DNX Core 5.0. However, I am having trouble locating the types that are required to validate an XML document against an XSD document.
Here is the code:
var xsdStream = this.GetType().GetTypeInfo().Assembly.GetManifestResourceStream(xsdPath);
using (XmlReader xsd = XmlReader.Create(xsdStream))
{
    XmlSchemaSet schema = new XmlSchemaSet();
    schema.Add(null, xsd);
    XmlReaderSettings xmlReaderSettings = new XmlReaderSettings();
    xmlReaderSettings.ValidationType = ValidationType.Schema;
    xmlReaderSettings.Schemas.Add(schema);
    using (XmlReader xmlReader = XmlReader.Create(xmlPath, xmlReaderSettings))
    {
        try
        {
            while (xmlReader.Read());
        }
        catch (Exception ex)
        {
            throw new Exception(string.Format(Resources.Messages.XmlValidationFailed, xmlPath), ex);
        }
    }
}
As you can see, all I want is to stop on the first error and throw an exception indicating what the error is.
The problems are:
The XmlSchemaSet class doesn't exist in the System.Xml.Schema namespace (or anywhere else I have found).
The XmlReaderSettings.ValidationType and XmlReaderSettings.Schemas properties do not exist.
I checked the MSDN documentation, which has a slightly different approach. However, as before, XmlSchemaSet doesn't exist, and neither does XDocument.Validate(). I have also searched several of the ASP.NET projects for an example but can't seem to find any.
What facilities (if any) exist in DNX Core 5.0 to validate XML against an XSD? I would prefer to do this using streams if possible, but if absolutely necessary I will accept an approach that reads the entire documents into memory at once.
There is no support for XSD in the first release. If I heard right in one of the tweets, posts, bugs or community standups, it is being considered for a later release.
PS: Pawel should answer this and get the credit ... but we should close this question.

Save image (via ImageWriter / FileImageOutputStream) to the filesystem without use of a File object

As a learning task, I am converting the software I use every day to NIO, with the somewhat arbitrary objective of having zero remaining instances of java.io.File.
I have been successful in every case except one. It seems an ImageWriter can only write to a FileImageOutputStream, which requires a java.io.File.
Path path = Paths.get(inputFileName);
InputStream is = Files.newInputStream(path, StandardOpenOption.READ);
BufferedImage bi = ImageIO.read(is);
...
Iterator<ImageWriter> iter = ImageIO.getImageWritersBySuffix("jpg");
ImageWriter writer = iter.next();
ImageWriteParam param = writer.getDefaultWriteParam();
File outputFile = new File(outputFileName);
ImageOutputStream ios = new FileImageOutputStream(outputFile);
IIOImage iioi = new IIOImage(bi, null, null);
writer.setOutput(ios);
writer.write(null, iioi, param);
...
Is there a way to do this with a java.nio.file.Path? The Java 8 API doc for ImageWriter only mentions FileImageOutputStream.
I understand there might only be a symbolic value to doing this, but I was under the impression that NIO is intended to provide a complete alternative to java.io.File.
A RandomAccessFile, constructed with just a String for a filename, can be supplied to the FileImageOutputStream constructor.
This doesn't "use NIO" any more than just using the File in the first place, but it doesn't require File to be used directly.
For direct support of Path (or to "use NIO"), the FileImageOutputStream (or RandomAccessFile) could be extended, or a type deriving from the ImageOutputStream interface created, but ... how much work is it worth?
The intended way to instantiate an ImageInputStream or ImageOutputStream in the javax.imageio API, is through the ImageIO.createImageInputStream() and ImageIO.createImageOutputStream() methods.
You will see that both these methods take Object as their parameter. Internally, ImageIO will use a service lookup mechanism and delegate the creation to a provider able to create a stream based on the parameter. By default, there are providers for File, RandomAccessFile and InputStream.
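For example, on the output side, passing an OutputStream obtained from a Path keeps java.io.File out of user code entirely. A sketch (bi, writer and param as in the question; the default JDK provider wraps the stream in a cached ImageOutputStream):
Path out = Paths.get(outputFileName);
try (OutputStream os = Files.newOutputStream(out);
     ImageOutputStream ios = ImageIO.createImageOutputStream(os)) {
    // ImageIO delegates to the registered stream provider for OutputStream.
    writer.setOutput(ios);
    writer.write(null, new IIOImage(bi, null, null), param);
}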
But the mechanism is extendable. See the API doc for the javax.imageio.spi package for a starting point. If you like, you can create a provider that takes a java.nio.file.Path and creates a FileImageOutputStream based on it, or alternatively create your own implementation using some fancier NIO backing (i.e. a SeekableByteChannel).
Here's source code for a sample provider and stream I created to read images from a byte array, which you could use as a starting point.
(Of course, I have to agree with @user2864740's thoughts on the cost/benefit of doing this, but as you are doing this for the sake of learning, it might make sense.)

Reload rb scripts from different locations in JRuby

BACKGROUND:
I am using JRuby in an Eclipse plugin for my product. I have a bunch of scripts that define a DSL and perform operations for me. I want to be able to dynamically reload these scripts whenever required. The scripts themselves could change on the file system, and moreover the location of the scripts could also change. I could even have multiple copies on the file system of slightly modified/changed scripts. Each time, I want the scripts from one specific location to be utilised.
As I have understood so far, using "load" instead of "require" should do the job. So now, if I use "load 'XXX.rb'" before calling any Ruby methods/functions, it will reload XXX.rb, picking up the new changes.
PROBLEM:
In my code I am using a ScriptingContainer to run scriptlets that access Ruby functions. I set load paths on this scripting container to indicate from which locations the scripts should be loaded. However, the problem is that on subsequent calls, and even with different instances of ScriptingContainer, I have noticed that the scripts that were loaded the first time are utilised every time. "load" reloads them, but after those scripts have been loaded once, the next time I might need to load similar scripts from a different location, and that is not happening.
My assumption was that utilising a different scripting container instance should have done the job, but it seems that the load paths are globally set somewhere, and calling "setLoadPath" on new ScriptingContainer instances either does not modify existing paths or only appends. If the latter is true, then presumably when searching for scripts they are always found on the oldest paths set, and newer load paths get ignored.
Any ideas?
The solution is to specify scope for a ScriptingContainer instance when creating it. One of the ScriptingContainer constructors takes in a parameter of type LocalContextScope, use one of the constants to define the scope. See LocalContextScope.java
To test this defect and solution I have written a small snippet. You may try it out:
import java.util.Arrays;

import org.jruby.embed.LocalContextScope;
import org.jruby.embed.ScriptingContainer;
import org.jruby.javasupport.JavaEmbedUtils;
import org.jruby.javasupport.JavaEmbedUtils.EvalUnit;
import org.jruby.runtime.builtin.IRubyObject;

public class LoadPathProblem {
    public static void main(String[] args) {
        // Create the first container
        ScriptingContainer c1 = new ScriptingContainer();
        // FIX: ScriptingContainer c1 = new ScriptingContainer(LocalContextScope.SINGLETHREAD);

        // Set a load path for scripts
        String path1[] = new String[] { ".\\scripts\\one" };
        c1.getProvider().setLoadPaths(Arrays.asList(path1));

        // Run a script that requires loading scripts on the load path above
        EvalUnit unit1 = c1.parse("load 'test.rb'\n" + "testCall");
        IRubyObject ret1 = unit1.run();
        System.out.println(JavaEmbedUtils.rubyToJava(ret1));

        // Create the second container, completely independent of the first one
        ScriptingContainer c2 = new ScriptingContainer();
        // FIX: ScriptingContainer c2 = new ScriptingContainer(LocalContextScope.SINGLETHREAD);

        // Set a different load path for this container
        String path2[] = new String[] { ".\\scripts\\two" };
        c2.getProvider().setLoadPaths(Arrays.asList(path2));

        // Run a script that requires loading scripts on the different path
        EvalUnit unit2 = c2.parse("load 'test.rb'\n" + "testCall");
        IRubyObject ret2 = unit2.run();
        /*
         * PROBLEM: Expected here that the function testCall will be called from
         * .\scripts\two\test.rb, but as you can see the output says that
         * the function was still called from .\scripts\one\test.rb.
         */
        System.out.println(JavaEmbedUtils.rubyToJava(ret2));
    }
}
Test scripts to try out the above code can be in different folders but with the same filename ("test.rb" for the above example):
./scripts/one/test.rb
def testCall
  "Called one"
end
./scripts/two/test.rb
def testCall
  "Called two"
end

Visual Studio 2008 Add-In Check if Hierarchy Item is solution folder

I've got a Visual Studio add-in written by a developer who is no longer at the company, and I have no idea how to debug it. But I want to add a feature so it can recurse into solution folders.
Sounds simple, but I'm not sure the API allows testing for this?
Well there's got to be a way because AnkhSVN and VisualSVN work fine with Solution Folders.
Stack Overflow, I'm reaching out for some help on this issue.
Thanks
Notes:
- We are using solution folders to hide "Dependency Projects", which are basically a list of project references that we probably don't care about in the particular solution and want to hide by default.
public class Connect : IDTExtensibility2, IDTCommandTarget
{
    public void GetProjectLocations(DTE2 dte)
    {
        UIHierarchy UIH = dte.ToolWindows.SolutionExplorer;
        try
        {
            UIHierarchyItem UIHItemd = UIH.UIHierarchyItems.Item(1);
        }
        catch (Exception E)
        {
            Debug.Write(E);
        }
        UIHierarchyItem UIHItem = UIH.UIHierarchyItems.Item(1); // this looks suspect to me
        // Iterate through first-level nodes.
        for (int i = 1; i <= UIHItem.UIHierarchyItems.Count; i++)
        {
            Project TempGeneralProjObj = dte.Solution.Item(i);
            if (TempGeneralProjObj.Kind == PrjKind.prjKindCSharpProject)
            {
            }
        }
    }
}
So far, from my tests, it appears that solution folders will, surprisingly, cast to type Project, and once that is done the Project.ProjectItems property will hold a list of the projects that may exist underneath that folder. So, in short, this is one way to at least get information about how things are structured. The problem, however, is that each ProjectItem located underneath a solution folder appears to cast fine to type ProjectItem but doesn't seem to be castable to a Project.
This is how I'm currently detecting a solution folder in my loop.
// {66A26720-8FB5-11D2-AA7E-00C04F688DDE} is the solution folder project kind
// (EnvDTE80.ProjectKinds.vsProjectKindSolutionFolder).
if (project.Kind == "{66A26720-8FB5-11D2-AA7E-00C04F688DDE}")
{
    // TODO: Do your thing
}
This has been frustrating me, and I've also noticed a bug in how ActiveReports handles solution folders, which is related to this same issue.
UPDATE!
OK, so I found the solution, but I can't claim it 100% because I found most of it on Macaw's blog.
It appears that my original findings were right on; however, in order to get the actual project type for each ProjectItem under the solution item, you need to look at the ProjectItem.SubProject property.
Macaw takes a recursive approach to walking the project structure, which I would also normally recommend; however, in my case I wanted a single-method implementation that simply logs an XML representation of the project for research purposes, so I ended up using a Stack. For reference, you can find my code below, which successfully handles at least one level of solution folders full of projects (and no other specialty solution items).
XElement rootNode = new XElement("Solution");
rootNode.Add(new XAttribute("Name", _applicationObject.Solution.FullName));
Stack<Project> projectStack =
    new Stack<Project>(_applicationObject.Solution.Projects.Cast<Project>());
while (projectStack.Count > 0)
{
    var project = projectStack.Pop();
    var solutionItemName = "Project";
    if (project.Kind == "{66A26720-8FB5-11D2-AA7E-00C04F688DDE}")
    {
        // Solution folder: queue up its sub-projects and label the node as a folder.
        foreach (ProjectItem innerProject in project.ProjectItems)
        {
            if (innerProject.SubProject != null)
            {
                projectStack.Push(innerProject.SubProject);
            }
        }
        solutionItemName = "Folder";
    }
    var projectNode = new XElement(
        solutionItemName,
        new XAttribute("Name", project.Name),
        new XAttribute("Kind", project.Kind)
    );
    rootNode.Add(projectNode);
    foreach (ProjectItem item in project.ProjectItems)
    {
        var itemNode = new XElement("Item", new XAttribute("Name", item.Name));
        projectNode.Add(itemNode);
        if (item.Properties == null)
        {
            continue;
        }
        foreach (Property property in item.Properties)
        {
            var propertyNode = new XElement(property.Name, property.Value);
            itemNode.Add(propertyNode);
        }
    }
}
Judging by this post, and by apparent bugs in other add-ins, this isn't the most intuitive design, but that's what we have to live with.
To debug a Visual Studio add-in, load the source code into a copy of Visual Studio that is not running the add-in. Then configure the project to start a second copy of Visual Studio when you "run" the project; that second copy will run the add-in, with the first able to breakpoint and debug it.
Make sure you have a batch file (or equivalent) to clean up, so that you can always get back to running VS without the plugin.
Useful resources ...
How to debug a Visual Studio .NET 2005 Add-In
Walkthrough: Debugging an Add-in Project
