BACKGROUND:
I am using JRuby in an Eclipse plugin for my product. I have a bunch of scripts that define a DSL and perform operations for me. I want to be able to dynamically reload these scripts whenever required. The scripts themselves could change on the file system, and the location of the scripts could also change. I could even have multiple, slightly modified copies of the scripts on the file system. Each time, I want the scripts from a specific location to be used.
As I understand it so far, using "load" instead of "require" should do the job: if I call "load 'XXX.rb'" before calling any Ruby methods/functions, XXX.rb will be reloaded and pick up any changes.
PROBLEM:
In my code I am using ScriptingContainer to run scriptlets that call Ruby functions. I set load paths on this scripting container to indicate which locations the scripts should be loaded from. However, on subsequent calls, and even with different ScriptingContainer instances, the scripts that were loaded the first time are used every time. "load" does reload them, but once those scripts have been loaded, I cannot later load similarly named scripts from a different location.
My assumption was that using a different ScriptingContainer instance should do the job, but it seems the load paths are set globally somewhere, and calling "setLoadPaths" on a new ScriptingContainer instance either does not modify the existing paths or only appends to them. If the latter is true, then scripts are presumably always found on the oldest paths first and the newer load paths are effectively ignored.
Any ideas???
The solution is to specify a scope for the ScriptingContainer instance when creating it. One of the ScriptingContainer constructors takes a parameter of type LocalContextScope; use one of its constants to define the scope. See LocalContextScope.java.
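For example, a minimal sketch of that fix (SINGLETHREAD is just one of the available scope constants):
// Each container now keeps its own runtime state and load paths instead of sharing them globally.
ScriptingContainer container = new ScriptingContainer(LocalContextScope.SINGLETHREAD);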
To test this defect and the solution, I have written a small snippet. You may try it out:
import java.util.Arrays;

import org.jruby.embed.LocalContextScope;
import org.jruby.embed.ScriptingContainer;
import org.jruby.javasupport.JavaEmbedUtils;
import org.jruby.javasupport.JavaEmbedUtils.EvalUnit;
import org.jruby.runtime.builtin.IRubyObject;

public class LoadPathProblem {
    public static void main(String[] args) {
        // Create the first container
        ScriptingContainer c1 = new ScriptingContainer();
        // FIX: ScriptingContainer c1 = new ScriptingContainer(LocalContextScope.SINGLETHREAD);

        // Set a load path for scripts
        String[] path1 = new String[] { ".\\scripts\\one" };
        c1.getProvider().setLoadPaths(Arrays.asList(path1));

        // Run a script that loads a file from the load path above
        EvalUnit unit1 = c1.parse("load 'test.rb'\n" + "testCall");
        IRubyObject ret1 = unit1.run();
        System.out.println(JavaEmbedUtils.rubyToJava(ret1));

        // Create the second container, completely independent of the first one
        ScriptingContainer c2 = new ScriptingContainer();
        // FIX: ScriptingContainer c2 = new ScriptingContainer(LocalContextScope.SINGLETHREAD);

        // Set a different load path for this container
        String[] path2 = new String[] { ".\\scripts\\two" };
        c2.getProvider().setLoadPaths(Arrays.asList(path2));

        // Run a script that loads a file from the different path
        EvalUnit unit2 = c2.parse("load 'test.rb'\n" + "testCall");
        IRubyObject ret2 = unit2.run();
        /*
         * PROBLEM: Expected here that the function testCall will be called from
         * .\scripts\two\test.rb, but as you can see the output shows that the
         * function was still called from .\scripts\one\test.rb.
         */
        System.out.println(JavaEmbedUtils.rubyToJava(ret2));
    }
}
Test scripts to try out the above code can be placed in different folders but with the same file name ("test.rb" in the above example):
./scripts/one/test.rb
def testCall
"Called one"
end
./scripts/two/test.rb
def testCall
"Called two"
end
I'm trying to delay the construction of every 'page' (i.e. a Wt::WWidget inside my global Wt::WStackedWidget) until it is needed. Therefore I'm using an approach similar to the DeferredWidget from Wt's Widget Gallery example.
However, when I load a library using require and the content is not loaded with the first request (e.g. inside WWidget::load()), the execution of JavaScript code is not delayed until the library has loaded, i.e. running the following code
wApp->require("myLibrary.js"); // defines function MyFunction()
doJavaScript("MyFunction();");
runs without error when it is requested on the first loaded page, but when the content is loaded after a user event, the following JavaScript error occurs:
MyFunction is not defined
Question: How should I overcome this error, or how should I correctly delay the loading of my (large) JavaScript library until it is needed?
Further research
Inspecting the source code of WebRenderer::collectJS:
JavaScript updates seem to be performed before new libraries are loaded:
// Executing javascript updates, including doJavaScript calls.
for (unsigned i = 0; i < changes.size(); ++i) {
changes[i]->asJavaScript(sout, DomElement::Priority::Update);
delete changes[i];
}
...
// Loading new libraries.
int librariesLoaded = loadScriptLibraries(*js, app);
Shouldn't the JavaScript updates be delayed until the new libraries are loaded?
Further research - Part 2
Executing JavaScript code (which may depend on required libraries) is delayed in two different places, i.e. inside
WebRenderer::collectJavaScript: delays execution of all JavaScript code (including invisible changes) until all previously required libraries (excluding newly required libraries, e.g. from inside WWidget::load) are loaded.
WebRenderer::collectJS: delays execution of some JavaScript code until all required libraries (including newly required libraries, e.g. from inside WWidget::load) are loaded.
I am not sure about the JavaScript script-loader behaviour, but in my Wt experience I make it happen this way:
1) My JavaScript library is loaded in my main page at start-up with require.
2) If I later need some new function, I write it in my script string like this:
std::string javacode =
    "function MyTest() {"
    "  alert('test');"
    "}"
    "MyTest();";
doJavaScript(javacode);
If you want to load a JavaScript file and run some function after it has loaded, you should call require in the constructor of your container class. Then override the function bool Wt::WCompositeWidget::loaded() and put your doJavaScript calls in that function.
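A rough, untested sketch of that approach (the answer above mentions loaded(), but this sketch uses the protected virtual load() hook for the same purpose; MyContainer, myLibrary.js and MyFunction are placeholder names):
#include <Wt/WApplication>
#include <Wt/WCompositeWidget>
#include <Wt/WContainerWidget>

class MyContainer : public Wt::WCompositeWidget
{
public:
    MyContainer(Wt::WContainerWidget *parent = 0)
        : Wt::WCompositeWidget(parent)
    {
        setImplementation(new Wt::WContainerWidget());
        // Request the library as early as possible, in the constructor.
        Wt::WApplication::instance()->require("myLibrary.js");
    }

protected:
    virtual void load()
    {
        Wt::WCompositeWidget::load();
        // By the time the widget is actually loaded, the required library should be available.
        doJavaScript("MyFunction();");
    }
};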
We recently moved from AWS to Azure for the location of our load test agents, and are thus transitioning to making full use of VSTS.
It was described that, for the moment, to get a load test file working with VSTS while using our own VMs for testing, we need to provide two context parameters, UseStaticLoadAgents and StaticAgentsGroupName, in each .loadtest file.
Our load test solution is getting very large, and we have multiple .loadtest files in which we have to set these two values each time. If we were to change our agent group name, for example, we would have to update each individual load test file with the new information.
I'm looking for a way to centralise this until a nicer mechanism is implemented by Microsoft. The idea was to use a load test plugin to add these context parameters, with the plugin drawing the needed values from a centralised config file.
However, neither the hooks in the load test plugin nor simply using the initialise method to set these values manually seems to work, likely because they are set after full initialisation.
Has anyone got a nice, code-focused solution to manage this and stop us depending on brittle values added in the editor? Or has anyone even got the above approach to work?
The .loadtest file is an XML file, so you can update it programmatically, for example:
using System.Xml;

string filePath = @"XXX\LoadTest1.loadtest";

XmlDocument doc = new XmlDocument();
doc.Load(filePath);

XmlNamespaceManager nsmgr = new XmlNamespaceManager(doc.NameTable);
nsmgr.AddNamespace("ns", "http://microsoft.com/schemas/VisualStudio/TeamTest/2010");

XmlNode root = doc.DocumentElement;
XmlNode nodeParameters = root.SelectSingleNode("//ns:RunConfigurations/ns:RunConfiguration[@Name='Run Settings1']/ns:ContextParameters", nsmgr);
if (nodeParameters != null)
{
    //nodeParameters.SelectSingleNode("//ns:ContextParameter[@Name='UseStaticLoadAgents']").Value = "agent1";
    foreach (XmlNode n in nodeParameters.ChildNodes)
    {
        switch (n.Attributes["Name"].Value)
        {
            case "Parameter1":
                n.Attributes["Value"].Value = "testUpdate";
                break;
            case "UseStaticLoadAgents":
                n.Attributes["Value"].Value = "agent1";
                break;
            case "StaticAgentsGroupName":
                n.Attributes["Value"].Value = "group1";
                break;
        }
    }
}
doc.Save(filePath);
I'm currently using the MvvmCross DownloadCache plugin -- and it's working alright -- it's especially nice when I just need to drop in an image URL and it automagically downloads/caches the image and serves up a UIImage.
I was hoping to leverage the code for one other use case: I'd like to grab source images from URLs and cache the files on the local file system, but what I really want in this case is the image's path on the local file system rather than the UIImage itself.
What would help me most is an example of how I might accomplish that. Is it possible to make that happen in a PCL, or does it need to go into platform-specific code?
Thanks -- that works, but just in case anyone else is following along, I wanted to document how I got Mvx.Resolve<IMvxFileDownloadCache>() to work. In my Setup.cs (in the Touch project), I had:
protected override void InitializeLastChance ()
{
Cirrious.MvvmCross.Plugins.DownloadCache.PluginLoader.Instance.EnsureLoaded();
Cirrious.MvvmCross.Plugins.File.PluginLoader.Instance.EnsureLoaded();
Cirrious.MvvmCross.Plugins.Json.PluginLoader.Instance.EnsureLoaded();
...
}
But that wasn't enough, because nothing inside the DownloadCache plugin actually registers IMvxFileDownloadCache (which is what I was expecting, but it's just not the case).
So then I tried adding this line here:
Mvx.LazyConstructAndRegisterSingleton<IMvxFileDownloadCache, MvxFileDownloadCache>();
But that failed because the MvxFileDownloadCache constructor takes a few arguments. So I ended up with this:
protected override void InitializeLastChance ()
{
...
var configuration = MvxDownloadCacheConfiguration.Default;
var fileDownloadCache = new MvxFileDownloadCache(
configuration.CacheName,
configuration.CacheFolderPath,
configuration.MaxFiles,
configuration.MaxFileAge);
Mvx.RegisterSingleton<IMvxFileDownloadCache>(fileDownloadCache);
...
}
And the resolve works okay now.
Question:
I do wonder whether two MvxFileDownloadCache objects configured in exactly the same way will cause issues by stepping on each other. I could avoid that question by changing the cache name on the one I'm constructing by hand, but I do want it to be a single cache (the assets will be the same).
If you look at the source for the plugin, you'll find https://github.com/MvvmCross/MvvmCross/blob/3.2/Plugins/Cirrious/DownloadCache/Cirrious.MvvmCross.Plugins.DownloadCache/IMvxFileDownloadCache.cs - that will give you a local file path for a cached file:
public interface IMvxFileDownloadCache
{
void RequestLocalFilePath(string httpSource, Action<string> success, Action<Exception> error);
}
You can get hold of a service implementing this interface using Mvx.Resolve<IMvxFileDownloadCache>()
To then convert that into a system-wide file path, try NativePath in https://github.com/MvvmCross/MvvmCross/blob/3.2/Plugins/Cirrious/File/Cirrious.MvvmCross.Plugins.File/IMvxFileStore.cs#L27
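Putting those two pieces together, a rough sketch (untested; the URL is a placeholder, it assumes the DownloadCache and File plugins are loaded as in the Setup.cs above, and it assumes NativePath has the shape string NativePath(string path)):
var cache = Mvx.Resolve<IMvxFileDownloadCache>();
var fileStore = Mvx.Resolve<IMvxFileStore>();

cache.RequestLocalFilePath(
    "http://example.com/some-image.png",   // placeholder source URL
    localPath =>
    {
        // localPath is relative to the file store; NativePath maps it to a full system path.
        var nativePath = fileStore.NativePath(localPath);
        // use nativePath here (e.g. hand it to platform-specific code)
    },
    error =>
    {
        // handle the failed download
    });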
Is there a way to control the MSTest video recording file names, or to name the folders after the test? It seems to generate a different GUID every time, which makes it very difficult to map a test to its corresponding video recording files.
The only solution I can see is to read the TRX file and map the GUID to the test name.
Any suggestions ??
If you're not opposed to doing it by hand, it's pretty easy. I encountered the same problem and needed the videos to be somewhere predictable so I could email links to them. In the end my solution was to code the functionality in by hand. It's a bit involved, but not too difficult.
First, you'll need to have Expression Encoder 4 installed.
Then you'll need to add these references to your project:
Microsoft.Expression.Encoder
Microsoft.Expression.Encoder.Api2
Microsoft.Expression.Encoder.Types
Microsoft.Expression.Encoder.Utilities
Next, you need to add the following using directives:
using Microsoft.Expression.Encoder.Profiles;
using Microsoft.Expression.Encoder.ScreenCapture;
Then you can use [TestInitialize] and [TestCleanup] to define the recording behavior. These methods run at the beginning and end of each test, respectively. It can look something like this:
private ScreenCaptureJob screenCapJob;

[TestInitialize]
public void startVideoCapture()
{
    screenCapJob = new ScreenCaptureJob();
    screenCapJob.CaptureRectangle = RectangleSelectionUtilities.GetScreenRect(0);
    screenCapJob.CaptureMouseCursor = true;
    screenCapJob.ShowFlashingBoundary = false;
    screenCapJob.OutputScreenCaptureFileName = "path you want to save to";
    screenCapJob.Start();
}

[TestCleanup]
public void stopVideoCapture()
{
    screenCapJob.Stop();
}
Obviously this code needs some error and edge case handling, but it should get you started.
You should also know that the free version of Expression Encoder 4 limits you to 10 minutes per video file, so you may want to make a timer that will start a new video for you when it hits 10 minutes.
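A rough sketch of that timer idea (untested; it assumes a stopped ScreenCaptureJob can simply be pointed at a new output file and restarted, and the naming scheme is a placeholder):
private System.Timers.Timer segmentTimer;
private int segmentIndex;

private void startSegmentTimer()
{
    // Fire shortly before the 10-minute limit and roll over to a new file.
    segmentTimer = new System.Timers.Timer(9.5 * 60 * 1000);
    segmentTimer.AutoReset = true;
    segmentTimer.Elapsed += (sender, args) =>
    {
        screenCapJob.Stop();
        segmentIndex++;
        // Placeholder naming scheme: _part1, _part2, ... next to the original path.
        screenCapJob.OutputScreenCaptureFileName = "path you want to save to" + "_part" + segmentIndex;
        screenCapJob.Start();
    };
    segmentTimer.Start();
}
You would call startSegmentTimer from startVideoCapture and stop the timer again in stopVideoCapture.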
I have code like this:
[CodedUITest]
public class CodedUITest1
{
    [TestMethod]
    public void CodedUITestMethod1()
    {
        using (var dlg = new MyWinForm())
        {
            dlg.Show();
            System.Threading.Thread.Sleep(2000);

            this.UIMap.AssertMethod1();
            this.UIMap.RecordedMethod1();
            this.UIMap.AssertMethod2();
        }
    }
}
The code was running fine when I manually launched the app (before invoking the test), without the using block that creates the control directly.
I'd like to just use a reference to create an instance of the control and go from there, instead of relying on determining a path to the executable and opening it. As it is, the app just gets stuck with a ContextSwitchDeadlock.
Is there a way to do Coded UI tests without launching a process (using the reference and creating the control in the test code), or is there something wrong with the way I'm trying to do it?
It might be possible if you invoke the Coded UI test portions (this.UIMap...) on a separate thread. But the way you have it now, they are both on the same thread, so you are going to get deadlocked.
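A rough, untested sketch of one way to split them: give the form its own STA thread with a real message loop, and keep the UIMap calls on the test thread (the Sleep is a crude stand-in for proper synchronization):
[TestMethod]
public void CodedUITestMethod1()
{
    var formThread = new System.Threading.Thread(() =>
    {
        using (var dlg = new MyWinForm())
        {
            // Run a real message loop for the form instead of a bare Show().
            System.Windows.Forms.Application.Run(dlg);
        }
    });
    formThread.SetApartmentState(System.Threading.ApartmentState.STA);
    formThread.IsBackground = true;
    formThread.Start();

    System.Threading.Thread.Sleep(2000);   // crude wait for the form to appear

    this.UIMap.AssertMethod1();
    this.UIMap.RecordedMethod1();
    this.UIMap.AssertMethod2();
}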