We need to run some queries against a MongoDB database from Bash shell scripts. Using eval and Mongo's printjson() gives me text output, but it still needs to be parsed. Using other scripting languages (Python, Ruby, Erlang, etc.) is not an option.
I looked at JSON.sh (a JSON parser written in Bash: https://github.com/rcrowley/json.sh) and it appears to be close to a solution, except that it does not recognize data types that exist in BSON but not in JSON. Before I try to modify it to recognize the BSON data types, is anyone aware of an existing solution?
Thanks.
10/11: Stennie notes below that I have received an answer in the MongoDB User group, and provides a URL. The answer is very nice and complete, and begins, "MongoDB actually uses what we call Mongo Extended JSON which differs a bit from the vanilla JSON standard..." so I will have to modify the parser. Thanks to all.
Do you perhaps want to use tojson() rather than printjson() and loop through the result of tojson() to parse the fields?
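For example, something along these lines might work (a minimal sketch, assuming a local mongod and a hypothetical mydb.movies collection; both names are only placeholders):

# --quiet suppresses the shell banner so only the tojson() output reaches stdout.
json=$(mongo --quiet mydb --eval 'tojson(db.movies.findOne())')

# The output is Mongo Extended JSON; a Bash parser such as JSON.sh can then
# split it into key/value lines for further processing in the script.
echo "$json"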
Currently I am working on moving some API DDT tests (data from CSV) from RobotFramework to JMeter, and what troubles me is the lack of a proper JSON assertion that can ignore some keys during comparison. I am totally new to JMeter, so I am not sure whether such an option exists.
I am pretty sure we are using the wrong tool for this job, especially because the functional testers would take over writing new tests. However, my approach (to make it as easy as possible for the functional testers) is to create a JMeter plugin which takes the response and compares it to a baseline (excluding ignored keys defined in its GUI). What do you think? Is there any built-in I can use instead? Or do you know of an existing plugin?
Thanks in advance
The "proper" assertion is JSON Assertion available since JMeter 4.0
You can use arbitrary JSON Path queries to filter response according to your expected result
If that is not enough, you can always go for the JSR223 Assertion; the Groovy language has built-in JSON support, so it will be far more flexible than any existing or future plugin.
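For illustration, here is a minimal JSR223 Assertion sketch in Groovy. It assumes the baseline JSON is stored in a JMeter variable named baseline and that timestamp and userId are the keys to ignore; all three names are hypothetical.

def slurper = new groovy.json.JsonSlurper()
def actual = slurper.parseText(prev.getResponseDataAsString())   // the sampler response
def expected = slurper.parseText(vars.get('baseline'))           // the stored baseline

// Drop the keys that should not take part in the comparison.
['timestamp', 'userId'].each { key ->
    actual.remove(key)
    expected.remove(key)
}

if (actual != expected) {
    AssertionResult.setFailure(true)
    AssertionResult.setFailureMessage('Response differs from baseline (ignored keys excluded)')
}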
Please find below the approach that I can think of:
Take a dump of the response/HTML/JSON source for the baseline using the "Save Responses to a file" listener.
Take the response dump for the AUT that needs to be compared, or simply the dump of the second run.
Use two FTP Samplers to fetch the locally saved response dumps.
Use a Compare Assertion to compare the responses of the two FTP calls. In the Compare Assertion you can use "RegEx String and Substitution" to mask the timestamps or user IDs with something common to both, so that they are ignored in the comparison.
You need to take care of how you save and fetch the responses.
Hope this helps.
I wrote a little program that creates a hash called movies. Then I can add, update, delete, and display all current movies in the hash by typing the title.
Instead of having it start a new hash each time, I want it to save anything that is added to a file and, when an entry is updated or deleted, update or delete that key/value pair in the file. I also want the program to auto-load the file on startup and create it if it doesn't exist.
I have no idea how to go about doing this.
After reading a lot of the comments, I have decided that maybe I should do this with SQL instead; that seems like a much better approach!
You can't store Ruby objects directly on the disk; you will first need to convert them to some sequence of bytes (i.e. a string). This is called serialization, and there are several different ways to do it and several different formats the data could be in. I think I would recommend JSON, but you might also want to try YAML or Marshal.
Any of those libraries will allow you to convert your hash into a string and allow you to convert that same string back into a hash. Then you can use Ruby's File class to save and load that string from the disk.
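For instance, here is a minimal sketch using JSON; the file name movies.json and the sample entry are only placeholders:

require 'json'

FILE = 'movies.json'

# Load the hash from disk on startup, creating an empty one if the file is missing.
movies = File.exist?(FILE) ? JSON.parse(File.read(FILE)) : {}

movies['Inception'] = 'great'            # add or update an entry in memory
File.write(FILE, JSON.generate(movies))  # save the whole hash back to disk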
This should get you pointed in the right direction. From here you can search for more specific things like "how do I convert a hash to JSON" or "how do I write a string to a file".
You have the ability to marshal your data in a few ways.
YAML, if you would like to use a gem, or JSON. There is also the built-in Marshal module.
ri tells us:
Marshal
(from ruby site)
------------------------------------------------------------------------------
The marshaling library converts collections of Ruby objects into a byte stream, allowing them to be stored outside the currently active script. This data may subsequently be read and the original objects reconstituted.
Marshaled data has major and minor version numbers stored along with the object information. In normal use, marshaling can only load data written with the same major version number and an equal or lower minor version number. If Ruby's "verbose" flag is set (normally using -d, -v, -w, or --verbose) the major and minor numbers must match exactly. Marshal versioning is independent of Ruby's version numbers. You can extract the version by reading the first two bytes of marshaled data.
And I will leave it at that for Marshal. But there is a bit more documentation there.
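As a quick sketch of what that looks like in practice (the file name and contents are only examples):

movies = { 'Inception' => 'great' }

# Marshal.dump produces a binary byte stream, so use the binary file helpers.
File.binwrite('movies.dat', Marshal.dump(movies))

# Reconstitute the original hash from the bytes on disk.
restored = Marshal.load(File.binread('movies.dat'))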
You can also use IO#puts to write to a file, and then modify that file to load later, which I sometimes use for config settings. Why use YAML or another external format when Ruby itself is easy enough for a user to modify? You use YAML when the data needs to be more generally accessible, as the Tin Man points out.
For example, this file is a sample intended for interactive editing (with constraints, of course), but it is simply valid Ruby. It gets read by a Ruby program and yields a valid object (in this case a Hash stored in a constant).
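A minimal sketch of that idea, assuming a user-editable file named settings.rb (the file name and keys are only examples):

# settings.rb contains nothing but plain Ruby, e.g.:
#   SETTINGS = { 'host' => 'localhost', 'port' => 27017 }

load './settings.rb'    # evaluates the file and defines the SETTINGS constant
puts SETTINGS['host']   # the constant is now an ordinary Hash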
I want to parse nested configurations in Bash, like the one below:
[foo]
  [bar]
    key="value"
  [baz]
    key="value"
I tried this .ini parser but it does not support nesting. Later I found out that nesting isn't allowed in .ini files.
I searched for a YAML parser for Bash, but I couldn't find much. Nested configuration parsing in Bash seems like a basic problem to me, so I would guess a trivial solution exists, but I could not find one. Does a trivial solution for parsing nested configuration in Bash exist? If yes, which one?
EDIT
I want to write a script/program for automated backup and restore of databases. The configuration needs to be flexible so that I can select databases on different hosts, with different users and passwords and with different backup intervals. Oh, and I want to learn Bash. But I am starting to think that Bash is not the right tool for my problem.
Bash is not the right language for this. There are no nested arrays, and dynamic variable assignment is a bit of a minefield compared to languages like Python and Ruby. That said, it sounds like you're specifying the format and the parser yourself, so you could simply use a hierarchical naming scheme for your configuration:
foo_bar_key="value"
foo_baz_key="value"
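A minimal sketch of reading such a file back in Bash; the file name backup.conf is only a placeholder, and the keys match the example above:

# Every line is a valid Bash assignment, so the file can simply be sourced.
source ./backup.conf

echo "$foo_bar_key"

# "Nested" values can also be reached dynamically via indirect expansion.
section="foo_baz"
var="${section}_key"
echo "${!var}"    # prints the value of foo_baz_key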
I wrote a Yamlesque parser in response to this similar question.
It will parse
foo:
  bar:
    key: value
  baz:
    key: value
into Bash associative arrays. It is 100% Bash, but it needs Bash 4.x.
I was going to use YAML because it has a great feature called merge (the "<<" key)!
And I'm using 'yaml-cpp' as the parser, since I'm working in C++.
But yaml-cpp does not support merge. What alternatives do I have?
Another script, another parser, another way to parse, or anything else is fine, as long as I can use the merge feature.
But I don't need to merge more than one object. I just need to define something, create another object that inherits from the first one, and override some values. That's it.
Thanks for reading.
If you're unable to wait and need merges, you can follow the suggestion by "barma" on the yaml-cpp issue: http://code.google.com/p/yaml-cpp/issues/detail?id=41#c12
The change is to insert the lines below into the FindValueForKey template (between the for-loop and the return 0):
const Node *pValueMerge = FindValueForKey(std::string("<<"));
if (pValueMerge) {
    return pValueMerge->FindValueForKey(key);
}
The problem (as I mentioned on the issue page) is that the spec allows
<<: [*dict1, *dict2]
to merge multiple dictionaries; but it appears you don't need that.
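For reference, the single-inheritance case described in the question looks like this in YAML (all names are only examples):

defaults: &defaults      # anchor the base mapping
  host: localhost
  port: 5432

production:
  <<: *defaults          # merge in everything from "defaults"
  host: db.example.com   # then override a single value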
Ask 'yaml-cpp' to implement the feature.
Problem
Using YAML merge keys.
Solution
Another script, another parser, another way to parse, or anything else is fine, as long as I can use the merge feature.
The following YAML implementations support the desired feature as of this writing:
Ruby 2.x
Python 2.x / 3.x
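For example, in Ruby the merge key is resolved by the standard YAML (Psych) library when loading; a minimal, self-contained sketch:

require 'yaml'

yaml = <<-YML
defaults: &defaults
  host: localhost
  port: 5432
production:
  <<: *defaults
  host: db.example.com
YML

data = YAML.load(yaml)
puts data['production']['port']   # => 5432, merged in from "defaults"
puts data['production']['host']   # => db.example.com, the overridden value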
I've got a situation where I need to alter the contents of a cached file based on one of the query string arguments passed in. I'd love to use sed to do a simple regular-expression replacement of a value based on that argument, but I can't figure that one out. I could use a Ruby script to do the replacement for me, but I can't seem to access the query string for the request within the script. The documentation for mod_ext_filter says:
In addition to the standard CGI environment variables, DOCUMENT_URI, DOCUMENT_PATH_INFO, and QUERY_STRING_UNESCAPED will also be set for the program.
Um yeah, can't seem to access those.
Does anybody have any experience with this, or a better solution?
Doh! Looks like I simply need to read the ENV hash within Ruby. Pretty dumb of me.
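A minimal sketch of what that looks like in the filter script: an external filter reads the document on stdin and writes the filtered version to stdout, and the PLACEHOLDER token being replaced here is hypothetical.

#!/usr/bin/env ruby
# QUERY_STRING_UNESCAPED is set by mod_ext_filter (see the quoted documentation above).
query = ENV['QUERY_STRING_UNESCAPED'].to_s

body = STDIN.read                        # the cached file content being filtered
body = body.gsub('PLACEHOLDER', query)   # hypothetical token to rewrite
STDOUT.write(body)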
Using PHP's $_SERVER superglobal, we are able to get the query string values.
echo $_SERVER['REQUEST_URI'];
Then pass the URL arguments as variables to the file to make it dynamic.
Refer to: PHP.net