I started using Spring Integration SFTP and I have some questions.
Filters are not working. Here is my example configuration:
Sftp.inboundAdapter(ftpFileSessionFactory())
        .preserveTimestamp(true)
        .deleteRemoteFiles(false)
        .remoteDirectory(integrationProperties.getRemoteDirectory())
        .filter(sftpFileListFilter()) // doesn't work
        .patternFilter("*.xlsx") // doesn't work
And my ChainFileListFilter:
private ChainFileListFilter<ChannelSftp.LsEntry> sftpFileListFilter() {
    ChainFileListFilter<ChannelSftp.LsEntry> chainFileListFilter = new ChainFileListFilter<>();
    chainFileListFilter.addFilter(new SftpPersistentAcceptOnceFileListFilter(metadataStore(), "INT"));
    chainFileListFilter.addFilter(new SftpSimplePatternFileListFilter("*.xlsx"));
    return chainFileListFilter;
}
If I understand correctly, only XLSX files should be saved in the local directory. If so, it doesn't work with this configuration. Am I doing something wrong, or have I misunderstood?
How can I configure SFTP so that each downloaded file emits a message? I see two parameters in the docs, max-messages-per-poll and max-fetch-size, but I don't know how to set them up so that every file emits a message. I would like to sync files once every 24 hours and produce a batch job queue. Maybe there is a workaround?
Is there a built-in filter which allows me to fetch only files with changed content? The best solution would be to check the checksums of the files.
I will be grateful for your help and explanations.
You cannot combine filter() and patternFilter(). Only one of them can be used: the last one overrides whatever you used before. In other words, either filter() or patternFilter(), not both. By default the logic is like this:
public SftpInboundChannelAdapterSpec patternFilter(String pattern) {
    return filter(composeFilters(new SftpSimplePatternFileListFilter(pattern)));
}

private CompositeFileListFilter<ChannelSftp.LsEntry> composeFilters(FileListFilter<ChannelSftp.LsEntry> fileListFilter) {
    CompositeFileListFilter<ChannelSftp.LsEntry> compositeFileListFilter = new CompositeFileListFilter<>();
    compositeFileListFilter.addFilters(fileListFilter,
            new SftpPersistentAcceptOnceFileListFilter(new SimpleMetadataStore(), "sftpMessageSource"));
    return compositeFileListFilter;
}
So, technically, you don't need your custom one if you don't use an external persistent MetadataStore. But if you do, think about swapping SftpSimplePatternFileListFilter and SftpPersistentAcceptOnceFileListFilter, since it is better to check the pattern before storing the file in the MetadataStore.
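For example, a reordered version of the filter bean from the question (same metadataStore() bean and "INT" prefix) could look like this:
private ChainFileListFilter<ChannelSftp.LsEntry> sftpFileListFilter() {
    ChainFileListFilter<ChannelSftp.LsEntry> chain = new ChainFileListFilter<>();
    // Check the file name pattern first, so only *.xlsx entries ever reach the metadata store.
    chain.addFilter(new SftpSimplePatternFileListFilter("*.xlsx"));
    // Then remember accepted files, so each one is downloaded only once (also across restarts).
    chain.addFilter(new SftpPersistentAcceptOnceFileListFilter(metadataStore(), "INT"));
    return chain;
}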
It is a fact that every synced remote file that passes those filters is stored in the local directory, and a message for that local file is emitted as soon as the poller requests it.
The maxFetchSize plays its role when we load remote files into the local directory. The maxMessagesPerPoll is used by the poller, but those messages are already built from the local files. A message is emitted per local file, not as a batch for all of them; a batch is not what messaging is designed for.
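As a rough sketch (reusing the beans from the question; the local directory, the cron expression and the channel name are placeholder assumptions), a daily sync that emits one message per fetched file could look like this:
@Bean
public IntegrationFlow sftpInboundFlow() {
    return IntegrationFlows
            .from(Sftp.inboundAdapter(ftpFileSessionFactory())
                            .preserveTimestamp(true)
                            .deleteRemoteFiles(false)
                            .remoteDirectory(integrationProperties.getRemoteDirectory())
                            .filter(sftpFileListFilter())
                            .localDirectory(new File("/tmp/sftp-local"))
                            .maxFetchSize(100),                  // fetch at most 100 remote files into the local dir per sync
                    e -> e.poller(Pollers.cron("0 0 2 * * *")    // sync once a day
                            .maxMessagesPerPoll(-1)))            // then emit a message for every fetched local file
            .channel("batchJobQueue")                            // each file message feeds the batch job queue
            .get();
}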
Please share more info about what does not work with the files. The SftpPersistentAcceptOnceFileListFilter checks not only the file name but also the mtime of the file. So it is not about any checksum, but rather about the last-modified timestamp of the file.
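There is no checksum-based filter out of the box. If content-based change detection is really required, a custom filter along these lines could be applied to the synced local files via localFilter(); this is a hypothetical sketch, not a built-in class:
// Hypothetical example, not part of Spring Integration: accept a local file only
// when its content hash differs from the one recorded in the MetadataStore.
public class ChangedContentFileListFilter implements FileListFilter<File> {

    private final ConcurrentMetadataStore store;

    public ChangedContentFileListFilter(ConcurrentMetadataStore store) {
        this.store = store;
    }

    @Override
    public List<File> filterFiles(File[] files) {
        List<File> accepted = new ArrayList<>();
        for (File file : files) {
            try {
                // Spring's org.springframework.util.DigestUtils
                String hash = org.springframework.util.DigestUtils.md5DigestAsHex(Files.readAllBytes(file.toPath()));
                String previous = this.store.get(file.getName());
                if (!hash.equals(previous)) {
                    this.store.put(file.getName(), hash);
                    accepted.add(file);
                }
            }
            catch (IOException e) {
                throw new UncheckedIOException(e);
            }
        }
        return accepted;
    }
}
It could then be registered with .localFilter(new ChangedContentFileListFilter(metadataStore())) on the inbound adapter spec.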
I have a requirement to download a file from S3 based on message content. In other words, the file to download is not known in advance; I have to search for and find it at runtime. S3StreamingMessageSource doesn't seem to be a good fit because:
It relies on polling, whereas I need to wait for the message.
I can't find any way to create an S3StreamingMessageSource dynamically in the middle of a flow. gateway(IntegrationFlow) looks interesting, but what I need is a gateway(Function<Message<?>, IntegrationFlow>), which doesn't exist.
Another candidate is S3MessageHandler, but it has no support for listing files, which I need for finding the desired file.
I could implement my own message handler using the AWS API directly; I'm just wondering if I'm missing something, because this doesn't seem like an unusual requirement. After all, not every app just sits there and keeps polling S3 for new files.
There is S3RemoteFileTemplate with a list() method which you can use in a handle(). Then split() the result and call S3MessageHandler for each remote file to download it.
The latter also has functionality to download a whole remote directory.
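A rough sketch of such a flow (the bucket name, the prefix payload and the output channel are placeholder assumptions; the terminal step here reads each object directly via AmazonS3, with S3MessageHandler and its DOWNLOAD command being the alternative mentioned above):
@Bean
public IntegrationFlow s3SearchAndDownloadFlow(AmazonS3 amazonS3) {
    S3RemoteFileTemplate template = new S3RemoteFileTemplate(new S3SessionFactory(amazonS3));
    return f -> f
            // the incoming payload is assumed to be the key prefix to search under
            .handle(String.class, (prefix, headers) -> template.list("my-bucket/" + prefix))
            .split()                                    // one message per S3ObjectSummary
            .handle(S3ObjectSummary.class, (summary, headers) ->
                    amazonS3.getObject(summary.getBucketName(), summary.getKey()))
            .channel("downloadedObjects");              // S3Object payloads (content stream + metadata) end up here
}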
For anyone coming across this question, this is what I did. The trick is to:
Set the filters later, not at construction time. Note that there is no addFilters or getFilters method, so the filter can only be set once and filters can't be added afterwards. @artem-bilan, this is inconvenient.
Call S3StreamingMessageSource.receive() manually.
.handle(String.class, (fileName, h) -> {
    if (messageSource instanceof S3StreamingMessageSource) {
        S3StreamingMessageSource s3StreamingMessageSource = (S3StreamingMessageSource) messageSource;
        ChainFileListFilter<S3ObjectSummary> chainFileListFilter = new ChainFileListFilter<>();
        chainFileListFilter.addFilters(
                new S3SimplePatternFileListFilter("**/*/*.json.gz"),
                new S3PersistentAcceptOnceFileListFilter(metadataStore, ""),
                new S3FileListFilter(fileName)
        );
        s3StreamingMessageSource.setFilter(chainFileListFilter);
        return s3StreamingMessageSource.receive();
    }
    log.warn("Expected: {} but got: {}.",
            S3StreamingMessageSource.class.getName(), messageSource.getClass().getName());
    return messageSource.receive();
}, spec -> spec
        .requiresReply(false) // in case all messages got filtered out
)
We recently moved our load test agents from AWS to Azure, and are thus transitioning to making full use of VSTS.
It was described that, for the moment, to get a load test file working with VSTS while using our own VMs for testing, we need to provide two context parameters, UseStaticLoadAgents and StaticAgentsGroupName, in each loadtest file.
Our load test solution is getting very large, and we have multiple loadtest files where we have to set these two values each time. This leads to the situation where, if we were to change our agent group name, for example, we would have to update each individual load test file with the new information.
I'm looking for a way to centralise this until a nicer way is implemented by Microsoft. The idea was to use a load test plugin to add these context parameters, with the plugin drawing the needed values from a centralised config file.
However, it seems that neither the hooks in the load test plugin nor simply using the initialise method to set these values manually is working, likely because they are set after full initialisation.
Has anyone got a nice, code-focused solution to manage this and stop us depending on adding brittle values in the editor? Or has anyone even gotten the above approach to work?
The loadtest file is an XML file, so you can update it programmatically, for example:
string filePath = @"XXX\LoadTest1.loadtest";
XmlDocument doc = new XmlDocument();
doc.Load(filePath);
XmlNamespaceManager nsmgr = new XmlNamespaceManager(doc.NameTable);
nsmgr.AddNamespace("ns", "http://microsoft.com/schemas/VisualStudio/TeamTest/2010");
XmlNode root = doc.DocumentElement;
XmlNode nodeParameters = root.SelectSingleNode("//ns:RunConfigurations/ns:RunConfiguration[@Name='Run Settings1']/ns:ContextParameters", nsmgr);
if (nodeParameters != null)
{
    //nodeParameters.SelectSingleNode("//ns:ContextParameter[@Name='UseStaticLoadAgents']").Value = "agent1";
    foreach (XmlNode n in nodeParameters.ChildNodes)
    {
        switch (n.Attributes["Name"].Value)
        {
            case "Parameter1":
                n.Attributes["Value"].Value = "testUpdate";
                break;
            case "UseStaticLoadAgents":
                n.Attributes["Value"].Value = "agent1";
                break;
            case "StaticAgentsGroupName":
                n.Attributes["Value"].Value = "group1";
                break;
        }
    }
}
doc.Save(filePath);
I'm trying to figure out the purpose of the file /var/resource_config.json in Magento. It appears to be a cache of some configuration, but I can't see where in the source code it is created and/or updated.
I'm in the process of setting up local/dev/staging/prod environments for an EE1.12 build and want to figure out if I can safely exclude it from my repo or whether I need to script some updates to it for deploys.
Maybe the flash image uploader in admin creates it?
Any ideas or directions to look?
This is a configuration cache file for the "alternative media store" system. This is a system where requests for media files are routed through get.php, which allows you to store media in the database instead of the file system. (That may be a gross oversimplification, as I've never used the feature myself.)
You can safely (and should) exclude this file from deployments/source control, as it's a cache file and will be auto-generated as needed. See the following code block in the root-level get.php for more information.
if (!$mediaDirectory) {
    $config = Mage_Core_Model_File_Storage::getScriptConfig();
    $mediaDirectory = str_replace($bp . $ds, '', $config['media_directory']);
    $allowedResources = array_merge($allowedResources, $config['allowed_resources']);
    $relativeFilename = str_replace($mediaDirectory . '/', '', $pathInfo);

    $fp = fopen($configCacheFile, 'w');
    if (flock($fp, LOCK_EX | LOCK_NB)) {
        ftruncate($fp, 0);
        fwrite($fp, json_encode($config));
    }
    flock($fp, LOCK_UN);
    fclose($fp);

    checkResource($relativeFilename, $allowedResources);
}
Speaking in general terms, Magento's var folder serves the same purpose as the *nix var folder:
Variable files—files whose content is expected to continually change during normal operation of the system—such as logs, spool files, and temporary e-mail files.
and it should be isolated to particular systems (i.e. not part of deployments).
In Visual Studio debug mode it's possible to hover over variables to show their value and then right-click to "Copy", "Copy Expression" or "Copy Value".
In case the variable is an object and not just a basic type, there's a + sign to expand and explore the object. Is there a way to copy all of that to the clipboard?
In the immediate window, type
?name_of_variable
This will print out everything, and you can manually copy that anywhere you want, or use the immediate window's logging features to automatically write it to a file.
UPDATE: I assume you were asking how to copy/paste the nested structure of the values so that you could either search it textually, or save it on the side and later compare the object's state to it. If I'm right, you might want to check out the commercial extension to Visual Studio that I created, called OzCode, which lets you do these things much more easily through the "Search" and "Compare" features.
UPDATE 2: To answer @ppumkin's question, our new EAP has a new Export feature that allows users to export variable values to JSON, XML, Excel, or C# code.
Full disclosure: I'm the co-creator of the tool I described here.
You can run the code below in the Immediate window and it will export the serialized XML representation of an object to an XML file:
(new System.Xml.Serialization.XmlSerializer(obj.GetType())).Serialize(new System.IO.StreamWriter(@"c:\temp\text.xml"), obj)
Source: Visual Studio how to serialize object from debugger
Most popular answer from https://stackoverflow.com/a/23362097/2680660:
With any luck you have Json.Net in your appdomain already. In which case pop this into your Immediate window:
Newtonsoft.Json.JsonConvert.SerializeObject(someVariable)
Edit: With .NET Core 3.0, the following works too:
System.Text.Json.JsonSerializer.Serialize(someVariable)
There is an extension called Object Exporter that does this conveniently.
http://www.omarelabd.net/exporting-objects-from-the-visual-studio-debugger/
Extension: https://visualstudiogallery.msdn.microsoft.com/c6a21c68-f815-4895-999f-cd0885d8774f
You can add a watch for that object, and in the watch window, expand and select everything you want to copy and then copy it.
By using attributes to decorate your classes and methods, you can have a specific value from your object displayed during debugging with the DebuggerDisplay attribute, e.g.:
[DebuggerDisplay("Person - {Name} is {Age} years old")]
public class Person
{
    public string Name { get; set; }
    public int Age { get; set; }
}
I always use:
string myJsonString = JsonConvert.SerializeObject(<some object>);
Then I copy the string value, which unfortunately also copies the backslashes.
To remove the backslashes, go here:
https://www.w3schools.com/jsref/tryit.asp?filename=tryjsref_replace
Then, within the <p id="demo">Visit Microsoft!</p> element, replace the text with the text you copied.
Then replace the var res = str.replace("Microsoft", "W3Schools"); line with
var res = str.replace(/\\/g, '')
Run these new changes, but don't forget to click the "Try it" button on the right.
Now you should have all the text of the object in JSON format, which you can drop into a JSON formatter like http://jsonformatter.org, or, to create a POCO, you can use http://json2csharp.com/
ObjectDumper.NET
This is an awesome way!
You probably need this data for a unit test, so create a temporary Sandbox.cs test, or create a console app.
Make sure to get the NuGet package ObjectDumper.NET, not ObjectDumper.
Run this test (or console app).
View the test output or the text file to get the C# initializer code!
Code:
[TestClass]
public class Sandbox
{
    [TestMethod]
    public void GetInitializerCode()
    {
        var db = TestServices.GetDbContext();
        var list = db.MyObjects.ToList();
        var literal = ObjectDumper.Dump(list, new DumpOptions
        {
            DumpStyle = DumpStyle.CSharp,
            IndentSize = 4
        });
        Console.WriteLine(literal); // Some test runners will truncate this, so use the file in that case.
        File.WriteAllText(@"C:\temp\dump.txt", literal);
    }
}
I used to use Object Exporter, but it is 5 years old and no longer supported in Visual Studio. It seems like Visual Studio Extensions come and go, but let's hope this NuGet package is here to stay! (Also it is actively maintained as of this writing.)
Google led me to this 8-year-old question and I ended up using ObjectDumper to achieve something very similar to copy-pasting debugger data. It was a breeze.
I know the question asked specifically about information from the debugger, but ObjectDumper gives information that is basically the same. I'm assuming those who google this question are like me and just need the data for debugging purposes and don't care whether it technically comes from the debugger or not.
I know I'm a bit late to the party, but I wrote a JSON implementation for serializing an object, if you prefer to have JSON output. Uses Newtonsoft.Json reference.
private static void WriteDebugJSON(dynamic obj, string filePath)
{
    using (StreamWriter d = new StreamWriter(filePath))
    {
        d.Write(JsonConvert.SerializeObject(obj));
    }
}
I just right-clicked on the variable and selected Add Watch; that brings up the Watch window containing all the values. I selected everything and pasted it into a text editor, that's all.
Object Dumper is a free and open source extension for Visual Studio and Visual Studio Code.
"Dump as" commands are available via context menu in the Code and Immediate windows.
It's exporting objects to:
C# object initialization code,
JSON,
Visual Basic object initialization code,
XML,
YAML.
I believe that combined with the Diff tool it can be helpful.
I'm the author of this tool.
If you have a list and you want to find a specific element:
In the immediate window, type
myList.Any(s => s.ID == 5062);
If this returns true:
var myDebugVar = myList.FirstOrDefault(s => s.ID == 5062);
?myDebugVar
Useful tips here; I'll add my preference for when I next end up here asking this question again in the future.
If you don't mind adding an extension (one that doesn't require output files or the like), there's the Hex Visualizer extension for Visual Studio by Mladen Mihajlovic; he has done versions since 2015.
It provides a nice display of the array via the usual magnifying-glass view from the Locals window.
https://marketplace.visualstudio.com/items?itemName=Mika76.HexVisualizer2019 is the 2019 version.
If you're in debug mode, you can copy any variable by calling copy() in the debug terminal.
This works with nested objects, removes truncation, and copies the complete value.
Tip: you can right-click a variable, click Copy as Expression, and then paste that into the copy() function.
System.IO.File.WriteAllText("b.json", page.DebugInfo().ToJson())
Works great to avoid having to deal with the string debug format (\" for quotes).
As @OmerRaviv says, you can go to Debug → Windows → Immediate, where you can type:
myVariable
(as @bombek pointed out in the comments, you don't need the question mark), although as some have found, this is limited to 100 lines.
I found a better way was to right-click the variable → Add Watch, then press the + for anything I wanted to expand, then use @GeneWhitaker's solution: select all with Ctrl+A, copy with Ctrl+C, and paste into a text editor with Ctrl+V.