CFE syntax errors

Trying to write a CFEngine 3 promise that will take an entire directory and move it down one level.
I've used my policy hub to distribute the promise, but I've not yet folded it into my active promises.cf.
Here's the promise:
body common control
{
  bundlesequence => { dirstructure };
}
#Find out by existence of directories if filesystem is old structure or new
#Set classes for each instance. If old, copy down one level.
#If new file system already, pat yourself on the back
bundle agent dirstructure
{
  classes:
    "oldFILEstructure" expression => isdir("/old/dir/structure/");
    "newFILEstructure" expression => isdir("/new/dir/structure/");

  reports:
    oldFILEstructure::
      "system has old file structure..";
    newFILEstructure::
      "system has new file structure..";

  methods:
    oldFILEstructure::
      "migratedirectories" usebundle => movedirectories
}
bundle agent movedirectories
{
  files:
    "/new/dir/"
      copy_from => local_cp ("/old/dir/structure/.");
      depth_search => recurse ("inf");
}
I've based the promise on this "isdir" reference and this local_cp example, both from CFEngine.
When invoked, I get the following error output, and I'm trying to figure out why.
:/var/cfengine/inputs/standalone# cf-agent --no-lock --inform --file ./file_structure.cf
./file_structure.cf:41:12: error: syntax error
depth_search => recurse ("inf");
^
./file_structure.cf:41:12: error: Expected promiser string, got 'depth_search'
depth_search => recurse ("inf");
^
./file_structure.cf:41:15: error: Expected ';', got '=>'
depth_search => recurse ("inf");
^
./file_structure.cf:41:23: error: Expected promiser string, got 'recurse'
depth_search => recurse ("inf");
^
./file_structure.cf:41:25: error: Expected ';', got '('
depth_search => recurse ("inf");
^
./file_structure.cf:41:31: error: Expected ';', got ')'
depth_search => recurse ("inf");
^
./file_structure.cf:41:32: error: Expected promiser string, got ';'
depth_search => recurse ("inf");
^
./file_structure.cf:42:1: error: Expected ';', got '}'
}

files:
  "/new/dir/"
    copy_from => local_cp ("/old/dir/structure/.");
    depth_search => recurse ("inf");
}
You have an extra semicolon at the end of the copy_from line.
A semicolon (;) marks the end of a promise, so the parser treats the following depth_search line as the start of a new promise, which is why it complains about an unexpected promiser. Switch the semicolon at the end of the copy_from line to a comma (,):
files:
  "/new/dir/"
    copy_from => local_cp ("/old/dir/structure/."),
    depth_search => recurse ("inf");
}
Additionally, you may want to check out the transformer attribute. It may or may not be a good fit for your case:
bundle agent example
{
  files:
    "/old/dir/structure" -> { "jira:EXAMPLE-1234" }
      transformer => "/bin/mv /old/dir/structure /new/dir/structure",
      comment => "The standard is to use the new location because x, y, z. Bad thing Q or U might happen if this is not managed properly.";
}

Related

Sphinx-autodoc with napoleon (Google Doc String Style): Warnings and Errors about Block quotes and indention

I am using Sphinx 4.4.0 with the napoleon extension (Google docstring style). I have these two problems:
WARNING: Block quote ends without a blank line; unexpected unindent.
ERROR: Unexpected indentation.
I found some information about these messages on the internet, but I cannot map it to my code. My problem is that I do not even understand the messages; I do not see where the problem could be.
This is the code:
def read_and_validate_csv(basename, specs_and_rules):
    """Read a CSV file with respect to specifications about format and
    rules about valid values.

    Hints: Do not use objects of type type (e.g. str instead of "str") when
    specifying the column type.

        specs_and_rules = {
            'TEMPLATES': {
                'T1l': ('Int16', [-9, ' '])
            },
            'ColumnA': 'str',
            'ColumnB': ('str', 'no answer'),
            'ColumnC': None,
            'ColumnD': (
                'Int16',
                -9, {
                    'len': [1, 2, (4-8)],
                    'val': [0, 1, (3-9)]
                }
        }

    Returns:
        (pandas.DataFrame): Result.
    """
These are the original messages:
.../bandas.py:docstring of buhtzology.bandas.read_and_validate_csv:11: WARNING: Block quote ends without a blank line; unexpected unindent.
.../bandas.py:docstring of buhtzology.bandas.read_and_validate_csv:15: ERROR: Unexpected indentation.
.../bandas.py:docstring of buhtzology.bandas.read_and_validate_csv:17: ERROR: Unexpected indentation.
.../bandas.py:docstring of buhtzology.bandas.read_and_validate_csv:19: WARNING: Block quote ends without a blank line; unexpected unindent.
.../bandas.py:docstring of buhtzology.bandas.read_and_validate_csv:20: WARNING: Block quote ends without a blank line; unexpected unindent.
reStructuredText is not Markdown, and indentation alone is not enough to demarcate the code block. reStructuredText calls this a literal block. Although the use of :: is one option, you might want to explicitly specify the language (overriding the default) with the use of the code-block directive.
Also I noticed that you have invalid syntax in your code block—a missing ) and extra spaces in your indentation—which could have caused those errors.
Try this.
def read_and_validate_csv(basename, specs_and_rules):
    """Read a CSV file with respect to specifications about format and
    rules about valid values.

    Hints: Do not use objects of type type (e.g. str instead of "str") when
    specifying the column type.

    .. code-block:: python

        specs_and_rules = {
            'TEMPLATES': {
                'T1l': ('Int16', [-9, ' '])
            },
            'ColumnA': 'str',
            'ColumnB': ('str', 'no answer'),
            'ColumnC': None,
            'ColumnD': (
                'Int16',
                -9, {
                    'len': [1, 2, (4-8)],
                    'val': [0, 1, (3-9)]
                }
            )
        }

    Returns:
        (pandas.DataFrame): Result.
    """

Can a logstash filter error be forwarded to elastic?

I'm having these json parsing errors from time to time:
[2022-01-07T12:15:19,872][WARN ][logstash.filters.json ] Error parsing json
{:source=>"message", :raw=>" { the invalid json }", :exception=>#<LogStash::Json::ParserError: Unrecognized character escape 'x' (code 120)
Is there a way to get the :exception field in the logstash config file?
I opened the exact same thread on the Elastic forum and got a working solution there. Thanks to @Badger on the forum, I ended up using the following raw ruby filter:
ruby {
    code => '
        @source = "message"
        source = event.get(@source)
        return unless source
        begin
            parsed = LogStash::Json.load(source)
        rescue => e
            event.set("jsonException", e.to_s)
            return
        end
        @target = "jsonData"
        if @target
            event.set(@target, parsed)
        end
    '
}
which extracts the info I needed:
"jsonException" => "Unexpected character (',' (code 44)): was expecting a colon to separate field name and value\n at [Source: (byte[])\"{ \"baz\", \"oh!\" }\r\"; line: 1, column: 9]",
Or, as the author of the solution suggested, get rid of the @target part and use the normal json filter for the rest of the data.
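For reference, here is a minimal sketch of that second approach (my own illustration, not from the thread): keep the ruby filter above only to capture the exception text into jsonException, and let the stock json filter do the actual parsing. The source and target names are the ones used above; tag_on_failure shows the filter's default failure tag.
filter {
    json {
        # parse the raw message into a dedicated field
        source => "message"
        target => "jsonData"
        # events that fail to parse keep the original message and get this tag,
        # so they can be routed, inspected, or dropped further down the pipeline
        tag_on_failure => ["_jsonparsefailure"]
    }
}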

Why does puppet think my custom fact is a string?

I am trying to create a custom fact I can use as the value for a class parameter in a hiera yaml file.
I am using the openstack/puppet-keystone module and I want to use fernet-keys.
According to the comments in the module I can use this parameter.
# [*fernet_keys*]
#   (Optional) Hash of Keystone fernet keys
#   If you enable this parameter, make sure enable_fernet_setup is set to True.
#   Example of valid value:
#   fernet_keys:
#     /etc/keystone/fernet-keys/0:
#       content: c_aJfy6At9y-toNS9SF1NQMTSkSzQ-OBYeYulTqKsWU=
#     /etc/keystone/fernet-keys/1:
#       content: zx0hNG7CStxFz5KXZRsf7sE4lju0dLYvXdGDIKGcd7k=
#   Puppet will create a file per key in $fernet_key_repository.
#   Note: defaults to false so keystone-manage fernet_setup will be executed.
#   Otherwise Puppet will manage keys with File resource.
#   Defaults to false
So I wrote this custom fact ...
[root@puppetmaster modules]# cat keystone_fernet/lib/facter/fernet_keys.rb
Facter.add(:fernet_keys) do
  setcode do
    fernet_keys = {}
    puts('Debug keyrepo is /etc/keystone/fernet-keys')
    Dir.glob('/etc/keystone/fernet-keys/*').each do |fernet_file|
      data = File.read(fernet_file)
      if data
        content = {}
        puts("Debug Key file #{fernet_file} contains #{data}")
        fernet_keys[fernet_file] = { 'content' => data }
      end
    end
    fernet_keys
  end
end
Then in my keystone.yaml file I have this line:
keystone::fernet_keys: '%{::fernet_keys}'
But when I run puppet agent -t on my node I get this error:
Error: Could not retrieve catalog from remote server: Error 500 on SERVER: Server Error: Evaluation Error: Error while evaluating a Function Call, "{\"/etc/keystone/fernet-keys/1\"=>{\"content\"=>\"xxxxxxxxxxxxxxxxxxxx=\"}, \"/etc/keystone/fernet-keys/0\"=>{\"content\"=>\"xxxxxxxxxxxxxxxxxxxx=\"}}" is not a Hash. It looks to be a String at /etc/puppetlabs/code/environments/production/modules/keystone/manifests/init.pp:1144:7 on node mgmt-01
I had assumed that I had formatted the hash correctly because facter -p fernet_keys output this on the agent:
{
  /etc/keystone/fernet-keys/1 => {
    content => "xxxxxxxxxxxxxxxxxxxx="
  },
  /etc/keystone/fernet-keys/0 => {
    content => "xxxxxxxxxxxxxxxxxxxx="
  }
}
The code in the keystone module looks like this (with line numbers):
1142
1143   if $fernet_keys {
1144     validate_hash($fernet_keys)
1145     create_resources('file', $fernet_keys, {
1146       'owner'     => $keystone_user,
1147       'group'     => $keystone_group,
1148       'subscribe' => 'Anchor[keystone::install::end]',
1149     }
1150     )
1151   } else {
Puppet does not necessarily think your fact value is a string -- it might do, if the client is set to stringify facts, but that's actually beside the point. The bottom line is that Hiera interpolation tokens don't work the way you think. Specifically:
Hiera can interpolate values of any of Puppet’s data types, but the
value will be converted to a string.
(Emphasis added.)
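If you need the value to stay a Hash, one workaround (a sketch of my own, not from the thread; profile::keystone_fernet is a made-up class name) is to skip the Hiera interpolation and pass the structured fact to the class from Puppet code, where its type is preserved:
class profile::keystone_fernet {
  # Hypothetical wrapper class: $facts['fernet_keys'] keeps its Hash type here,
  # whereas '%{::fernet_keys}' interpolated in Hiera is converted to a String.
  class { 'keystone':
    fernet_keys => $facts['fernet_keys'],
  }
}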

Getting error Aws::S3::Errors::InvalidBucketName when creating bucket

I am trying to create a bucket with the help of aws-sdk-ruby from the Ruby console. The following is the code I am running in my console:
Aws.config.update({
  :region => "myRegion",
  :credentials => Aws::Credentials.new("access_key", "secret_key"),
  :endpoint => "http://Ip",
  :force_path_style => true
})
bucket_name = "abc"
bucket = s3.bucket(bucket_name)
bucket.create({ acl: "authenticated-read", grant_full_control: "GrantFullControl"})
The last line gives this error:
Aws::S3::Errors::InvalidBucketName:
I don't see this error in the documentation here. When can one get such an error, and how can it be resolved?
I was also getting this error. The main reason I was getting it was that my bucket name started with a capital letter. Bucket names must be all lowercase.
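A small sketch of how you might guard against that (my own illustration; safe_bucket_name is a made-up helper, and the check only covers the basic rules: 3-63 characters, lowercase letters, digits, dots, and hyphens, starting and ending with a letter or digit):
require "aws-sdk-s3"

def safe_bucket_name(name)
  # lowercase the name and reject anything that still breaks the basic naming rules
  candidate = name.downcase
  unless candidate.match?(/\A[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]\z/)
    raise ArgumentError, "#{name.inspect} is not a valid S3 bucket name"
  end
  candidate
end

s3 = Aws::S3::Resource.new
bucket = s3.bucket(safe_bucket_name("MyBucket")) # becomes "mybucket"
bucket.create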

mocha/chai test Unexpected token =>

I'm running a test, and I'm getting an unexpected error.
I'm sorting the results:
docs.sort((a, b) => m_ids.findIndex(id => a._id.equals(id)) -
    m_ids.findIndex(id => b._id.equals(id)));
The error I'm getting is definitely related to that line:
mbp:test testlab$ mocha .
/Users/testlab/Documents/workspace/KBase/controller/KBase.js:112
docs.sort((a, b) => m_ids.findIndex(id => a._id.equals(id)) -
^^
SyntaxError: Unexpected token =>
I was thinking about skipping the code by putting an if statement around it, but that doesn't seem to work either.
I rewrote the sort so it doesn't use =>, and it now works both on the webpage and under chai/mocha.
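For anyone hitting the same thing, here is a sketch of that kind of rewrite (my own, reusing the variable names from the question): replace the arrow functions with plain function expressions so the file parses on runtimes without ES2015 arrow support.
docs.sort(function (a, b) {
    // same ordering as before: sort docs by the position of their _id in m_ids
    return m_ids.findIndex(function (id) { return a._id.equals(id); }) -
           m_ids.findIndex(function (id) { return b._id.equals(id); });
});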
