Proper command line for ExifTool PDF meta tagging - macOS

I'm having trouble getting ExifTool to write custom meta tags to my PDF files. I'm in the macOS Terminal with ExifTool installed. Here's a sample command:
exiftool -overwrite_original -config new.config -XMP::pdfx:document_id=”7A 2017 091 d” -XMP::pdfx:description=”Booklet on stuff and more stuff” pdf_files/7A_2017_091_d.pdf
Here are the config lines:
#user-defined pdfs
%Image::ExifTool::UserDefined = (
  'Image::ExifTool::XMP::pdfx' => {
    document_id => { },
    description => { },
  },
);
1; #end
All I get back is:
Warning: Tag 'XMP::pdfx:Document_id' is not defined
Warning: Tag 'XMP::pdfx:Description' is not defined
Nothing to do.
What am I doing wrong?

This is the config file that worked for me:
#user-defined pdfs
%Image::ExifTool::UserDefined = (
  'Image::ExifTool::XMP::pdfx' => {
    document_id => { Writable => 'string' },
    description => { Writable => 'string' },
  },
);
1; #end
And here is the command line with results (you don't need the config file to read the XMP tags, only to write them):
C:\>exiftool -config temp.txt -xmp-pdfx:document_id="test" -xmp-pdfx:description="Description Test" y:/!temp/test.pdf
1 image files updated
C:\>exiftool -xmp-pdfx:document_id -xmp-pdfx:description y:/!temp/test.pdf
Document id : test
Description : Description Test
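For completeness, the equivalent write on macOS would look something like the following, using the paths from the question. One caveat: the sample command in the question shows curly quotes (”), which the shell treats as ordinary characters rather than as quoting, so be sure to type straight quotes (").

```sh
# Assumes the working config above is saved as new.config in the current directory.
exiftool -config new.config -overwrite_original \
  -XMP-pdfx:document_id="7A 2017 091 d" \
  -XMP-pdfx:description="Booklet on stuff and more stuff" \
  pdf_files/7A_2017_091_d.pdf
```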


Function to read role, environment file in masterless Puppet

I'm working with Puppet 4.5 in a masterless configuration and am trying to create a Puppet function that reads a simple config file assigning roles and environments. I don't have any integration with Hiera/Facter that I can change.
The file format is:
host1::java_app_node::qa
host2::nodejs_app_node::prod
The Puppet function that will read this file lives in a module called homebase. I want the function to return a hash (or array of hashes) of the split config values, so I can use them in templates.
In modules/homebase/manifests/init.pp I define:
$role_file = 'puppet://role.lst'
I then created modules/homebase/functions/get_roles.pp as follows:
function homebase::get_roles() {
  $func_name = 'homebase::get_roles()'
  if ! File.exists?($::homebase::role_file) {
    fail("Could not find #{$::homebase::role_file}")
  }
  hosts = { }
  File.open($::homebase::role_file).each |line| {
    parts = line.split(/::/)
    hosts[parts[0]] = { 'host' => parts[0], 'role' => parts[1], 'env' => parts[2] }
  }
  return hosts
}
In other classes, I then want to call:
class myapp {
  $servers = homebase::get_roles().each | k, v | {
    $v['host'] if $v['role'] =~ /myapp/ && $v['env'] == $environment
  }
  file { 'myapp.cfg':
    ensure => file,
    path   => '/opt/myapp/myapp.cfg',
    source => template("/myapp/myapp.cfg.erb"),
    mode   => '0644',
    owner  => myuser,
    group  => myuser,
  }
}
Seems like there would be a better way to do this. Am I completely off base?
There turned out to be a much easier way to do this than trying to write a function to read a non-standard configuration file. Instead, I used a site.pp file to create node {} entries. I also parameterized the myapp class to take inputs based on the node.
So my site.pp looks like:
node 'server1.mydomain', 'server2.mydomain' {
  $myvar = [ 'val1', 'val2' ]
  class { 'myapp':
    values => $myvar
  }
}
This could probably be improved. One issue is that with a non-Puppet configuration file I was also able to control execution from my bash wrapper script. Much of the need for that went away, though, with the node definitions.
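For reference, the function approach can also work, but it has to be written in the Puppet language: the snippet in the question mixes Ruby (File.exists?, File.open, bare local variables) into a .pp file, which will not compile. A rough sketch in pure Puppet, assuming the role file is shipped inside the module as modules/homebase/files/role.lst:

```puppet
# Sketch only: read the module file and fold each "host::role::env" line
# into a hash keyed by hostname, skipping lines without '::'.
function homebase::get_roles() {
  $lines = split(file('homebase/role.lst'), "\n")
  $valid = $lines.filter |$line| { $line =~ /::/ }
  $valid.reduce({}) |$hosts, $line| {
    $parts = split($line, '::')
    $hosts + { $parts[0] => { 'host' => $parts[0], 'role' => $parts[1], 'env' => $parts[2] } }
  }
}
```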

Logfile won't appear in Elasticsearch

I'm very new to Logstash and Elasticsearch. I am trying to ship my first log into Logstash so that I can (correct me if that is not the purpose) search it using Elasticsearch.
I have a log whose lines basically look like this:
2016-12-18 10:16:55,404 - INFO - flowManager.py - loading metadata xml
So, I have created a config file test.conf that looks like this:
input {
  file {
    path => "/home/usr/tmp/logs/mylog.log"
    type => "test-type"
    id   => "NEWTRY"
  }
}
filter {
  grok {
    match => { "message" => "%{YEAR:year}-%{MONTHNUM:month}-%{MONTHDAY:day} %{HOUR:hour}:%{MINUTE:minute}:%{SECOND:second} - %{LOGLEVEL:level} - %{WORD:scriptName}.%{WORD:scriptEND} - " }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "ecommerce"
    codec => line { format => "%{year}-%{month}-%{day} %{hour}:%{minute}:%{second} - %{level} - %{scriptName}.%{scriptEND} - \"%{message}\"" }
  }
}
And then: ./bin/logstash -f test.conf
I do not see the log in Elasticsearch when I go to http://localhost:9200/ecommerce or http://localhost:9200/ecommerce/test-type/NEWTRY.
Please tell me what I am doing wrong... :\
Thanks,
Heather
I eventually found a solution:
I added both sincedb_path => "/dev/null" (which, from what I understood, is meant for testing environments only) and start_position => "beginning" to the file input plugin, and the log appeared in both Elasticsearch and Kibana.
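In other words, the working input block looks something like this (both settings are options of the file input plugin):

```
input {
  file {
    path           => "/home/usr/tmp/logs/mylog.log"
    type           => "test-type"
    id             => "NEWTRY"
    start_position => "beginning"
    sincedb_path   => "/dev/null"   # testing only: forget read positions between runs
  }
}
```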
Thanks anyway for responding and trying to help!
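One note on verifying the result: the id => "NEWTRY" setting is the plugin instance's id, not a document id, so http://localhost:9200/ecommerce/test-type/NEWTRY will not return the event. A simpler check (assuming Elasticsearch on its default port) is to search the index directly:

```sh
# Lists indexed events whose grok'd level field is INFO.
curl 'http://localhost:9200/ecommerce/_search?pretty&q=level:INFO'
```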

Puppet notify Exec not working

Here is my code; don't worry about the variables, which are set in the original code. I am just putting a small snippet here to show you what it's doing. The following code updates /etc/sysctl.d/pgsql.conf but does not trigger the notify, so the Exec never reloads the settings. What is wrong here?
$sysctl_config = "/etc/sysctl.d/pgsql.conf"
exec { 'update_sysctl_shmall':
  unless  => "grep -q ^kernel.shmall ${sysctl_config}",
  command => "/bin/echo \"kernel.shmall = ${shmall}\" >> ${sysctl_config}",
}
file { '/etc/sysctl.d/pgsql.conf':
  ensure => present,
  notify => Exec['reload_sysctl']
}
exec { 'reload_sysctl':
  provider    => shell,
  command     => '/bin/sysctl --system',
  logoutput   => 'on_failure',
  refreshonly => true,
}
The following code:
file { '/etc/sysctl.d/pgsql.conf':
  ensure => present,
  notify => Exec['reload_sysctl']
}
only ensures that the /etc/sysctl.d/pgsql.conf file exists. If the file already exists it does nothing, and that is why the Exec was never notified to reload the settings.
Please check the following links about notifications in Puppet: 1, 2.
UPDATE:
Consider using the audit metaparameter:
file { '/etc/sysctl.d/pgsql.conf':
  audit  => 'content',
  ensure => present,
  notify => Exec['reload_sysctl']
}
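Another option worth considering (a sketch of mine, not from the links above): since it is the exec resource, not the file resource, that actually appends to the file, let that exec send the refresh itself:

```puppet
# When 'unless' finds no kernel.shmall line, the command runs and then
# notifies the reload exec; if the line is already present, nothing fires.
exec { 'update_sysctl_shmall':
  unless  => "grep -q ^kernel.shmall ${sysctl_config}",
  command => "/bin/echo \"kernel.shmall = ${shmall}\" >> ${sysctl_config}",
  notify  => Exec['reload_sysctl'],
}
```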

Logstash exec plugin - available arguments

I have the following in my logstash configuration file:
input {
  file {
    path => "C:/myfile.txt"
  }
}
output {
  exec {
    command => 'mytest.bat %message% %path%'
    interval => 0
  }
}
The %message% and %path% parameters are being passed to the batch file.
I am expecting to see:
message contain the line of the input file currently being parsed
path contain C:/myfile.txt
However, this is what the batch file receives:
message "%message%"
path "C:/logstash-1.5.0/vendor/bundle/jruby/1.9/bin"
What is the correct way to define the placeholders for:
the current line to be output
the name of file being parsed
Thanks
Please modify your config to:
input {
  file {
    path => "C:/myfile.txt"
  }
}
output {
  exec {
    command => "mytest.bat %{message} %{path}"
    interval => 0
  }
}
If you want to reference a field from the event, use %{message}, not %message%.

Retain signature files and db images while migrating in Rhodes (RhoMobile)

I am using Rhodes version 3.4.2 and RMS 2.2.1.13 to build an application.
What I want: when migrating to a newer build that includes some data migrations, I would like all the image files and signature files from the previous build to still be present.
The image and signature files are stored in the root directory of the app, not in the default db-files folder for RhoMobile.
Right now, when I try to upload an image from the new build that was saved by the previous build, I get the following error: Rho::AsyncHttp.upload_file: finished with error 26 failed to open/read local data from file.
Any suggestions, anyone?
Update with code:
result = Rho::AsyncHttp.upload_file(
  :url => url,
  :ssl_verify_peer => false,
  :multipart => [
    {
      :filename => filePath,
      :name => imageValuesJSON,
      :content_type => imageHeader
    },
    {
      :body => "",
      :name => imageValuesJSON,
      :content_type => imageHeader
    }
  ]
)
