I want to download multiple XML files from a web service API. I have a query that gets a JSON document:
= Json.Document(Web.Contents("http://reports.sem-o.com/api/v1/documents/static-reports?DPuG_ID=BM-086&page_size=100"))
and manipulates it to get a list of file names such as PUB_DailyMeterDataD1_201812041627.xml in a column on an Excel spreadsheet.
I hoped to write a function to run against this list of names and fetch all the data, so first I worked on one file: PUB_DailyMeterDataD1_201812041627
= Xml.Tables(Web.Contents("https://reports.sem-o.com/documents/PUB_DailyMeterDataD1_201812041627.xml"))
This gets an XML table, which I manipulate to get the data I want (the half-hourly metered MWh for generator GU_401970).
Now I want to turn the query into a function to automate the process across all the XML files available from the service. The function requires a variable to be substituted for the filename. As preparation for the function I tried this:
let
Filename="PUB_DailyMeterDataD1_201812041627.xml",
Source = (Web.Contents("https://reports.sem-o.com/documents/Filename")),
(followed by the manipulating M code)
This doesn't work.
Then this:
let
Filename="PUB_DailyMeterDataD1_201812041627.xml",
Source = Xml.Tables(Web.Contents("https://reports.sem-o.com/documents/[Filename]")),
I get:
DataFormat.Error: Xml processing failed. Either the input is invalid or it isn't supported. (Internal error: Data at the root level is invalid. Line 1, position 1.)
Details:
Binary
So I'm stuck here. Can you help?
Thanks,
Conor
You append strings with the "&" operator in Power Query. [Somename] is the syntax for referencing a field within a table; a normal variable is referenced just by its name. So in your example this would work:
let
    Filename = "PUB_DailyMeterDataD1_201812041627.xml",
    Source = Xml.Tables(Web.Contents("https://reports.sem-o.com/documents/" & Filename)),
It sounds like you have an existing query that drills down to a list of filenames and you are trying to use that list to import the files from the URL, though. Assuming the column you got the filenames from is called "Filename", you could add a custom column with this in it:
Xml.Tables(Web.Contents("https://reports.sem-o.com/documents/" & [Filename]))
This will load the resulting table onto the row of each filename.
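If you would rather wrap this up as the reusable function described in the question, a minimal sketch could look like the following; the function name fnGetMeterData is made up here, and the comment marks where your existing manipulation steps would go:
(Filename as text) as table =>
let
    Source = Xml.Tables(Web.Contents("https://reports.sem-o.com/documents/" & Filename))
    // ...your existing manipulation steps go here, ending with the table you want...
in
    Source
Save that as its own query (named fnGetMeterData, say) and the custom column simply becomes fnGetMeterData([Filename]).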
I used to have a series of independent arrays (e.g. name(), id(), description()), and I could check whether a value existed in a specific array by doing name.include?("Mark").
Now that I have moved to a much more elegant way to manage these independent arrays (see here for background: How do I convert an Array with a JSON string into a JSON object (ruby)), I am trying to figure out how to do the same.
In short I put all the independent arrays in a single structure so that I can reference the content as object().name, object().id, object().description.
However, I now can't see how to check whether the object array has the value "Mark" in its name attribute.
I have tried object.name.include?("Mark") but it doesn't quite like it.
I have also tried to use has_value? but that doesn't seem to work either (likely because it used to be a hash before I imported it into the structure, but now it is no longer a hash - see here: How do I convert an Array with a JSON string into a JSON object (ruby)).
Thoughts? How can I check whether object.name contains a certain string?
Thanks.
If you want to find all customers called Mark you can write the following:
customers_named_mark = array_of_customers.select{|c| c.name == 'Mark' }
This will return a potentially empty array.
If you want to find the first customer named Mark, write
customer_named_mark = array_of_customers.detect{|c| c.name == 'Mark' }
This will return the first matching item or nil.
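If you only need a yes/no answer rather than the matching records themselves, Enumerable#any? gives you that directly; a small sketch, reusing the array_of_customers from above:
# true if at least one customer is named exactly 'Mark'
array_of_customers.any? { |c| c.name == 'Mark' }

# true if at least one customer's name merely contains the substring 'Mark'
array_of_customers.any? { |c| c.name.include?('Mark') }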
I am already familiar with How can I save an object to a file?
But what if we have to store multiple objects (say, hashes) in a file?
I tried appending YAML.dump(hash) to a file from various locations in my code, but the difficult part is reading it back. As a YAML dump can extend over many lines, do I have to parse the file myself? That would only complicate the code. Is there a better way to achieve this?
PS: The same issue exists with Marshal.dump, so I prefer YAML as it's more human-readable.
YAML.dump creates a single YAML document. If you have several YAML documents together in a file then you have a YAML stream, so when you appended the results from several calls to YAML.dump you ended up with a stream.
If you try reading this back using YAML.load you will only get the first document. To get all the documents back you can use YAML.load_stream, which will give you an array with an entry for each of the documents.
An example:
f = File.open('data.yml', 'w')
YAML.dump({:foo => 'bar'}, f)
YAML.dump({:baz => 'qux'}, f)
f.close
After this data.yml will look like this, containing two separate documents:
---
:foo: bar
---
:baz: qux
You can now read it back like this:
all_docs = YAML.load_stream(File.open('data.yml'))
This will give you an array like [{:foo=>"bar"}, {:baz=>"qux"}].
If you don’t want to load all the documents into an array in one go you can pass a block to load_stream and handle each document as it is parsed:
YAML.load_stream(File.open('data.yml')) do |doc|
# handle the doc here
end
You could save multiple objects by writing a delimiter between them (something to mark that one object is finished and the next one begins). You could then process the file in two steps (see the sketch after this list):
read the file, splitting it around each delimiter
use YAML to restore a hash from each chunk
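A rough sketch of that delimiter approach, where the delimiter string and the file name are arbitrary choices:
require 'yaml'

DELIMITER = "\n#----#\n" # arbitrary marker that should never appear in the data

# appending one object at a time
File.open('objects.dat', 'a') do |f|
  f.write(YAML.dump({ "first_name" => "John" }) + DELIMITER)
end

# reading every object back
hashes = File.read('objects.dat')
             .split(DELIMITER)
             .reject(&:empty?)
             .map { |chunk| YAML.load(chunk) }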
Now, this would be a bit cumbersome, and there is a much simpler solution. Let's say you have three hashes to save:
student = { first_name: "John"}
restaurant = { location: "21 Jump Street" }
order = { main_dish: "Happy Meal" }
You can simply put them in an array and then dump them:
objects = [student, restaurant, order]
dump = YAML.dump(objects)
You can restore your objects easily:
saved_objects = YAML.load(dump)
saved_student = saved_objects[0]
Depending on the relationship between your objects, you may prefer to use a Hash instead of an array to save them (so that you can refer to them by name instead of relying on their order).
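For example, a minimal sketch of that Hash variant, reusing the three hashes defined above (the string keys are arbitrary labels):
require 'yaml'

# student, restaurant and order are the hashes defined above
objects = { "student" => student, "restaurant" => restaurant, "order" => order }
dump = YAML.dump(objects)

saved_objects = YAML.load(dump)          # the inner hashes use symbol keys, so on
                                         # Psych 4 (Ruby >= 3.1) use YAML.unsafe_load here
saved_student = saved_objects["student"] # => {:first_name=>"John"}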
Is there some way to save array/list/collection data to a file while debugging in VS2010?
For example, in this code:
var addressGraphs = from a in context.Addresses
where a.CountryRegion == "Canada"
select new { a, a.Contact };
foreach(var ag in addressGraphs) {
Console.WriteLine("LastName: {0}, Addresses: {1}", ag.Contact.LastName.Trim(),
ag.Contact.Addresses.Count());
foreach(var Address in ag.Contact.Addresses) {
Console.WriteLine("...{0} {1}", Address.Street1, Address.City);
}
}
I'd like to set a breakpoint on the first 'foreach' line and then save the data in 'addressGraphs' to a file.
where 'a' contains fields such as:
int addressID
string Street1
string City
<Etc.>
and 'Contact' contains fields such as:
string FirstName
string LastName
int contactID
<Etc.>
I'd like the file to contain the values of each of the fields for each item in the collection.
I don't see an obvious way to do this. Is it possible?
When your breakpoint is hit, open up the Immediate window and use Tools.LogCommandWindowOutput to dump the output to a file:
>Tools.LogCommandWindowOutput c:\temp\temp.log
?addressGraphs
>Tools.LogCommandWindowOutput /off
Note: You can use Log which is an alias for Tools.LogCommandWindowOutput
Update:
The > character is important. Also, the log alias is case sensitive.
I also encountered this question, but in VS2013. I had to save the contents of an array while debugging.
For example, I needed to save the contents of a double array named "trimmedInput". I did it like this:
Open the QuickWatch window from the Debug menu (Ctrl+D, Q).
Put your variable in the Expression field and press the Reevaluate button.
You'll see all the values. Now you can select them all (Ctrl+A) and copy them (Ctrl+C).
Paste (Ctrl+V) them into your favorite editor, Notepad for example, and use them.
That's the simplest way I know of, with no additional effort. Hope this description helps you!
Something similar is possible with this method:
I built an extension method that I use in all of my projects; it is a more general and more powerful ToString() that shows the content of any object.
I included the source code in this link:
https://rapidshare.com/files/1791655092/FormatExtensions.cs
UPDATE:
You just have to put FormatExtensions.cs in your project and change the namespace of FormatExtensions to match the base namespace of your project. Then, when you hit your breakpoint, you can type this in your Watch window:
myCustomCollection.ToStringExtended()
And copy the output wherever you want
On the Visual Studio Gallery, search for: Object Exporter Extension.
Be aware: as far as I have worked with it, it has a bug that blocks you from exporting an object once in a while.
You can also call methods in the Immediate Window, and so I think your best bet would be to use an ObjectDumper object, like the one in the LINQ samples or this one, and then write something like this in the Immediate Window:
File.WriteAllText("myFileName.txt", ObjectDumper.Dump(addressGraphs));
Depending on which ObjectDumper you decide to use, you may be able to customize it to suit your needs, and to be able to tell it how many levels deep you want it to dig into your object when it's dumping it.
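If you don't have an ObjectDumper to hand, a minimal stand-in is easy to sketch. This is not the LINQ sample class, just a hypothetical helper that writes one line per item, listing each public property as Name=Value (nested objects simply fall back to their ToString output):
using System;
using System.Collections;
using System.IO;
using System.Linq;

static class SimpleDumper
{
    // Returns one line per item, e.g. "a=..., Contact=..." for the anonymous type above.
    public static string Dump(IEnumerable items)
    {
        var writer = new StringWriter();
        foreach (var item in items)
        {
            var props = item.GetType()
                            .GetProperties()
                            .Select(p => p.Name + "=" + p.GetValue(item, null));
            writer.WriteLine(string.Join(", ", props));
        }
        return writer.ToString();
    }
}
With that compiled into the project, something like File.WriteAllText("myFileName.txt", SimpleDumper.Dump(addressGraphs.ToList())); in the Immediate Window should write the whole collection out.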
Here's a solution that handles collections. It's a VS visualizer that displays the collection values in a grid while debugging, and can also save them to the clipboard or to CSV, XML and text files. I'm using it in VS2010 Ultimate. While I haven't tested it extensively, I have tried it on List and Dictionary.
http://tinyurl.com/87sf6l7
It handles the following collections:
•System.Collections classes
◦System.Collections.ArrayList
◦System.Collections.BitArray
◦System.Collections.Hashtable
◦System.Collections.Queue
◦System.Collections.SortedList
◦System.Collections.Stack
◦All classes derived from System.Collections.CollectionBase
•System.Collections.Specialized classes
◦System.Collections.Specialized.HybridDictionary
◦System.Collections.Specialized.ListDictionary
◦System.Collections.Specialized.NameValueCollection
◦System.Collections.Specialized.OrderedDictionary
◦System.Collections.Specialized.StringCollection
◦System.Collections.Specialized.StringDictionary
◦All classes derived from System.Collections.Specialized.NameObjectCollectionBase
•System.Collections.Generic classes
◦System.Collections.Generic.Dictionary
◦System.Collections.Generic.List
◦System.Collections.Generic.LinkedList
◦System.Collections.Generic.Queue
◦System.Collections.Generic.SortedDictionary
◦System.Collections.Generic.SortedList
◦System.Collections.Generic.Stack
•IIS classes, as used by
◦System.Web.HttpRequest.Cookies
◦System.Web.HttpRequest.Files
◦System.Web.HttpRequest.Form
◦System.Web.HttpRequest.Headers
◦System.Web.HttpRequest.Params
◦System.Web.HttpRequest.QueryString
◦System.Web.HttpRequest.ServerVariables
◦System.Web.HttpResponse.Cookies
As well as a couple of VB6-compatible collections
In "Immediate Window" print following to get the binary dump:
byte[] myArray = { 02,01,81,00,05,F6,05,02,01,01,00,BA };
myArray
.Select(b => string.Format("{0:X2}", b))
.Aggregate((s1, s2) => s1 + s2)
This will print something like:
0201810005F60502010100BA
Change the '.Aggregate(...)' call to add spaces between bytes, or whatever you like.
I'm getting all the collections using [[NSFontManager sharedFontManager] collectionNames], but I see some untranslated strings (such as "com.apple.AllFonts"). Is there a way to localize them? I see that Font Book translates them successfully, so maybe I am doing something wrong.
Thanks,
—Albe
Apple prefixes all of its internal collection names with "com.apple", probably to avoid conflicts. Depending on what you're doing, you could:
Skip any collection name that begins with "com.apple" -- they're not collections created by the user.
If a collection name begins with "com.apple", split it and just take the last part of the name. Something like if ([name hasPrefix:@"com.apple"]) name = [[name componentsSeparatedByString:@"."] objectAtIndex:2]; would work.