Word 2010 combining INCLUDEPICTURE and IF - image

Using MS Word 2010, I am trying to place an INCLUDEPICTURE field into a block of an IF statement. While both the IF statement and the INCLUDEPICTURE work correctly separately, they do not work in combination.
IF Statement:
{ IF { MERGEFIELD condition \* MERGEFORMAT } = "expression" "true" "false" \* MERGEFORMAT }
This works correctly.
INCLUDEPICTURE:
{ INCLUDEPICTURE "picture.png" }
This works correctly, too.
Combination of the two:
{ IF { MERGEFIELD condition \* MERGEFORMAT } = "expression" "{ INCLUDEPICTURE "picture.png" }" "false" \* MERGEFORMAT }
This does not work. If the IF expression is true, nothing is displayed at all.
How can I combine both the IF statement and the INCLUDEPICTURE command?

This is a well-known problem (i.e. you are right, it doesn't work).
Unfortunately, there isn't a particularly good solution - the simplest involves using a blank 1-pixel image file.
The usual starting point is to invert the nesting so that you have something more like this...
{ INCLUDEPICTURE "{ IF "{ MERGEFIELD condition }" = "expression" "picture.png" }" }" \d }
This always tries to insert a picture, and will report (and insert) an error in the case where { MERGEFIELD condition } <> "expression". The simplest resolution is to have a blank 1-pixel picture that you can include instead, e.g.
{ INCLUDEPICTURE "{ IF "{ MERGEFIELD condition }" = "expression" "picture.png" "blank1.png" }" }" \d }
It is sometimes clearer to pull the test and assignment out and do them separately, particularly if there are multiple tests. In this case,
{ SET picname "{ IF "{ MERGEFIELD condition }" = "expression" "picture.png" "blank1.png" }" }
or if you prefer,
{ IF "{ MERGEFIELD condition }" = "expression" "{ SET picname "picture.png" }" "{ SET picname "blank1.png" }" }
You still need an IF nested inside the INCLUDEPICTURE to make it work. You can use:
{ INCLUDEPICTURE "{ IF TRUE { picname } }" \d }
If you merge those nested fields to an output document, the fields will remain in the output. If you want the fields to be resolved (e.g. because you need to send the output to someone who does not have the image files) then you need something more like this:
{ IF { INCLUDEPICTURE "{ IF TRUE { picname } }" } { INCLUDEPICTURE "{ IF TRUE { picname } }" \d } }
I believe you can reduce this to
{ IF { INCLUDEPICTURE "{ picname }" } { INCLUDEPICTURE "{ IF TRUE { picname } }" \d } }
In fact, I believe you can insert the full path+name of any graphic file that you know exists instead of the first { picname }, e.g.
{ IF { INCLUDEPICTURE "the full pathname of blank1.png" } { INCLUDEPICTURE "{ IF TRUE { picname } }" \d } }
But you should check that those work for you.
EDIT
FWIW, some recent tests suggest that whereas the pictures appear unlinked, a save/re-open displays a reconstituted link (with a \* MERGEFORMATINET near the end), and the pictures are expected to be at the locations indicated in those links. Whether this is due to a change in Word I cannot tell. If anything has changed, it looks to be an attempt to allow some relative path addressing in the Relationship records that Word creates inside the .docx.
Some observations...
- Make sure paths have doubled-up backslashes, e.g. c:\\mypath\\blank1.png. This is usually necessary for any paths hard-coded into fields. For paths that come in via nested field codes, please check.
- As a general point, it is easier to work with INCLUDEPICTURE fields when the document is a .doc, not .docx, and to ensure that File->Options->Advanced->General->Web options->Files->"Update links on save" is checked. Otherwise, Word is more likely to replace INCLUDEPICTURE fields with a result that cannot be redisplayed as a field using Alt-F9.
- When you want to treat the comparands in an IF field as strings, it is advisable to surround them with double-quotes, as I have done. Otherwise, a { MERGEFIELD } field that resolves to the name of a bookmark may not behave as you would hope. Beyond that, spacing and quoting are largely a matter of personal choice.
So far, none of these field constructions will deal with the situation where you have path names for pictures that may or may not exist. If that is what you need, please modify your original question.

Step by step guide:
bibadia's answer works, but Word does not tell you when you make mistakes, so it is very hard to get right. I hope this step-by-step answer helps.
Step 1: Add a Picture
In a Word 2013 .docx (no idea about other versions) add
{ INCLUDEPICTURE "C:\\picture.png" }
Note: Use Ctrl+F9 to insert the { } braces; never type them in, as they will not work.
Use \\ and not \ in the path.
Run the mail merge, do Ctrl+A then F9 to show the picture.
Step 2: Auto Show it
To edit the field codes in the mail merge document, press Ctrl+A then Shift+F9. Change the field to
{ SET picname "C:\\picture.png" }
{ INCLUDEPICTURE "{ IF TRUE { picname } }" \d }
Run the mail merge - the picture should show up, no need for Ctrl+A then F9
Step 3: Unlink it
Remove the \d
This will let you email the doc, because the \d makes the document keep a link to the image file rather than including the image itself.
Step 4: add an IF
Use bibadia's solution, i.e.
{ SET picname "{ IF "{ MERGEFIELD condition }" = "expression" "picture.png" "blank1.png" }" }
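Putting steps 2-4 together, the merge document ends up with two fields along these lines (same placeholder names and paths as above; drop the \d, as in step 3, if you want the picture embedded rather than linked):
{ SET picname "{ IF "{ MERGEFIELD condition }" = "expression" "picture.png" "blank1.png" }" }
{ INCLUDEPICTURE "{ IF TRUE { picname } }" \d }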

Another option that I've tested and which works is to use an IF statement to check an expression (in my example, checking that the entry is not null): if it is not null, display the image; otherwise display some custom text (if you don't want text, just use empty quotation marks, i.e. ""):
{IF {MERGEFIELD my_photo_variable_name} <> "" {INCLUDEPICTURE "{IF TRUE {MERGEFIELD my_photo_variable_name}}" \d} "Text to display if no picture available"}
Which translates as:
If there is a value for the image my_photo_variable_name, include the image in the mail merge.
If there is no value, i.e. no image, then display the custom text Text to display if no picture available.

Related

Filtering in Tablesorter. Unexpected behaviour

In my tablesorter I applied this addParser to the column I'm showing in this question. It works well, but I found some unexpected behaviour when filtering in a particular way.
The results without filtering look like the next picture:
The code for the addParser is the following:
$.tablesorter.addParser({
  // set a unique id
  id: 'kilogramos',
  is: function(s) {
    // return false so this parser is not auto detected
    return false;
  },
  format: function(s) {
    // format your data for normalization
    return parseFloat(s.replace(' Kg','').replace('.',''));
  },
  // set type, either numeric or text
  type: 'numeric'
});
If I use ">=" it seems to apply the addParser, because I can leave out the "." and the " Kg" and it still finds the 11.689 Kg results.
But it seems that if I don't use operators like ">" or ">=", the behaviour changes and the filter needs the dot to find what I am looking for. The next pictures show what I mean.
In this last picture, I don't use the operators and it doesn't find any results. Instead, it now needs the "." (and even with the " Kg" it works), as the next image shows:
I just don't want the "." or the " Kg" to be required in any case.
Any help? Thanks
I think all you're missing is a "filter-parsed" class in the header (demo)
<th class="sorter-kilogramos filter-parsed">Kg</th>

logstash name fields dynamically

I have a dynamic field; the field format looks like
A-B-C::D_[randomNum]
The field is dynamic because of the randomNum. I want to change the '-' to '_' and remove the [randomNum], so that it looks as follows:
A_B_C::D
Is there any plugin / strategy to solve this problem?
You should be able to achieve this with a mutate/gsub filter
filter {
  mutate {
    gsub => [
      # replace random num suffix
      "fieldname", "_\d+", "",
      # replace all dashes with underscores
      "fieldname", "-", "_"
    ]
  }
}
Make sure to replace fieldname with your actual field name.
UPDATE
Given your comments, it turned out that it's the field names that are dynamic, not the values. For this reason, you cannot use the above solution, but the next one, using the ruby filter, should work:
filter {
  ruby {
    code => "
      newhash = {}
      event.to_hash.each {|key, value|
        if key =~ /^CISCO/ then
          newkey = key.gsub(/_\d+/, '').gsub('-', '_')
          newhash[newkey] = event[key]
          event.remove(key)
        end
      }
      newhash.each {|key, value|
        event[key] = value
      }
    "
  }
}
After this filter runs, your event will have the field A_B_C::D instead of the original A-B-C::D_num

Elasticsearch escape hyphenated field in groovy script

I am attempting to add a field to a document, doing something similar to https://www.elastic.co/guide/en/elasticsearch/reference/current/docs-update.html#_scripted_updates. However, I appear to be running into issues due to the field being hyphen-separated (the hyphen appears to be treated as a minus sign) as opposed to underscore-separated.
Example body below:
{"script":"ctx._source.path.to.hyphen-separated-field = \"new data\""}
I attempted to escape the hyphens with a backslash, but with no luck.
You can access the field using square brackets, i.e. simply do it like this:
{"script": "ctx._source.path.to['hyphen-separated-field'] = \"new data\""}
This one worked for me on 2.x (or maybe other versions as well):
"script": {
  "inline": "ctx._source.path.to[field] = val",
  "params": {
    "val": "This is the new value",
    "field": "hyphen-separated-field"
  }
}
Or this will also work
{"script": "ctx._source.path.to.'hyphen-separated-field' = 'new data'"}

Bash: Reading from array

I'm writing a script in bash, and I'm trying to read from an array. When I iterate through the array with the code below:
for item in "${!functionDict[@]}"
do
  echo "{ $item : ${functionDict[$item]} }" >> array.txt
done
it outputs (in "array.txt"):
{ month_start_date($year_selected, $month_selected) : return $date; }
{ logWarning($message) : return logEvent($message, $c_event_warning); }
{ daysPastLastQuarterX($curYear, $curMonth, $curDay, $selected_year, $selected_quarter, $nDays) : return false;:return false;:return false;:return false;:return true;:return $delta > $nDays; }
{ setExcelLabelCell($sheet, $cell, $label, $width) : }
{ asCurrencyString($value) : return formatCurrency($value); }
{ getNumericMonthName($m) : return $numericMonth; }
{ normalize_for_PDF(&$text) : }
However, I'm having trouble querying individual elements from the array.
I've tried:
string='month_start_date($year_selected, $month_selected)'
echo "test_output: ${functionDict[$string]}"
but I get
test_output: <blank>
I've also tried inserting some RegEx wildcards, in case there is some whitespace around the key.
echo 'size of array: '"${#functionDict[@]}"
echo "TEST: functDict[logWarning] = ${functionDict[.*'logWarning($message)'.*]}"
I get
size of array: 157 //I didn't copy/paste all the elements in the array in this post
TEST: functDict[logWarning] = <blank>
Alas, I'm stuck. The content that I'm trying to get back is the "return _" items, or just a blank for the keys that don't have any "return" items.
Credit for the answer goes to @gniourf_gniourf (see comments).
There was an extra leading space in all of the keys, which I hadn't included in my test queries.
(Providing an answer here so that people know this question has been solved. Hopefully this is okay practice on SO.)
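For illustration, a minimal sketch of the fix (the exact key text is just an example; the point is the leading space):
declare -A functionDict
key=' month_start_date($year_selected, $month_selected)'   # note the leading space
functionDict[$key]='return $date;'

# The lookup only succeeds when the query string carries the same leading space
string=' month_start_date($year_selected, $month_selected)'
echo "test_output: ${functionDict[$string]}"                # -> return $date;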

CouchDB Filtered Replication

Trying out filters for replication, I stumbled upon a problem.
While my filter works as an entry in the _replicator database, it doesn't when using cURL.
The filter in the design document is:
{
  "_id": "_design/partial",
  "filters": {
    "mobile": "function(doc, req) {
      if (doc._attachments) {
        var result = new Boolean(true);
        for (attachment in doc._attachments) {
          if (attachment.content_type == 'image/jpeg') {
            return true;
          }
          if (doc._attachments.length > 1024) {
            result = false;
          }
        }
        return result;
      } else {
        return true;
      }
    }"
  }
}
The cURL line:
curl -X POST http://admin:pass@192.168.178.13:5985/_replicate -d '{\"source\":\"http://admin:pass@192.168.2:5984/docs2\",\"target\":\"docs2_partial\",\"filter\":\"partial/mobile\",\"create_target\":true}' -H "Content-Type: application/json"
I created the _design/partial document on both target and source, but all documents are being replicated, even the one with an attached binary bigger than 1 MB.
Any help is appreciated!
The cURL reply is:
{"ok":true,"session_id":"833ff96d21278a24532d116f57c45f31","source_last_seq":32,"replication_id_version":2,"history":[{"session_id":"833ff96d21278a24532d116f57c45f31","start_time":"Wed, 17 Aug 2011 21:43:46 GMT","end_time":"Wed, 17 Aug 2011 21:44:22 GMT","start_last_seq":0,"end_last_seq":32,"recorded_seq":32,"missing_checked":0,"missing_found":28,"docs_read":28,"docs_written":28,"doc_write_failures":0}]}
Using either " instead of \" (inside the JSON) or " instead of ' (for the outer quoting), the result is:
{"error":"bad_request","reason":"invalid UTF-8 JSON: [...]}
Now I think perhaps the logic of your filter function simply has a bug. Here is how I read your filter policy:
- All docs that have no attachments pass
- All docs that have an image/jpeg attachment pass
- Docs with more than 1,024 attachments fail
- In any other case, the docs pass
That sounds like perhaps an incorrect policy. Another way to restate this policy is "Docs with more than 1024 attachments fail, everything else passes." However since you wrote so much code, I suspect my summary is not the true policy.
Another quick note, on what looks like a bug. Given:
for (attachment in doc._attachments) { /* ... */ }
The attachment variable will be things like "index.html" or "me.jpeg", i.e. filenames. To get the attachment content-type, you need:
var type;
// This is WRONG
type = attachment.content_type; // type set to undefined
// This is RIGHT
type = doc._attachments[attachment].content_type; // type set to "text/html" etc.
To avoid this bug, you could change your code to make things more clear:
for (attachment_filename in doc._attachments) { /* ... */ }
Next, doc._attachments.length will tell you the number of attachments in the document, not for example the length of the current attachment. It is odd that you test for that inside the loop, because the expression will never change. Are you trying to test for attachment size instead?
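Putting those two corrections together, here is a sketch of how the filter might look if the intent is to reject documents that carry a non-JPEG attachment larger than 1 MB (that policy and the 1 MB cutoff are guesses based on the question, not something you have confirmed):
function(doc, req) {
  // documents with no attachments always pass
  if (!doc._attachments) {
    return true;
  }
  for (var name in doc._attachments) {
    var att = doc._attachments[name];  // the stub holding content_type and length
    if (att.content_type === 'image/jpeg') {
      continue;                        // JPEG attachments are always acceptable
    }
    if (att.length > 1024 * 1024) {    // length is the attachment size in bytes
      return false;                    // skip docs with a large non-JPEG attachment
    }
  }
  return true;
}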
What is the output from curl (i.e. from CouchDB)?
From your example, my first guess is that you have a quoting error. Inside single-quotes, you do not need to escape the double-quotes. Try removing all those backslashes. What happens?
If you are on Windows, the single quote is not valid in the shell. In that case, keep the backslashes and just change the single-quote to a double-quote.
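For example, on a Unix-style shell the request with the quoting fixed would look roughly like this (same source, target and filter values as in your question):
curl -X POST 'http://admin:pass@192.168.178.13:5985/_replicate' \
     -H 'Content-Type: application/json' \
     -d '{"source":"http://admin:pass@192.168.2:5984/docs2","target":"docs2_partial","filter":"partial/mobile","create_target":true}'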
