Organizing the output of a shell script into tables within a text file - shell

I am working with a Unix shell script whose output looks like this:
EVENT DATE: 2019-05-12
TrapLogId  Severity  EventTime  Model  Description
1604       [major]   05:59:50   14     Network Interface Down: service 1-16
1605       [major]   05:59:51   14     Network Interface Down: service 1-15
EVENT DATE: 2019-05-13
TrapLogId  Severity  EventTime  Model  Description
1619       [minor]   07:58:50   30     Delayed Subscriber Mapping
1620       [minor]   08:03:49   79     Failed Reload: File syntax
1621       [clear]   08:04:49   79     Failed Reload Cleared: File syntax
1622       [clear]   08:28:50   30     Delayed Subscriber Mapping Cleared
EVENT DATE: 2019-05-15
TrapLogId  Severity  EventTime  Model  Description
1627       [minor]   01:43:58   22     Misconfigured Network Awareness: 10.1.17.0/24
1628       [clear]   01:48:58   22     Misconfigured Network Awareness Cleared
I'm trying to organize it into a table like this:
EVENT DATE  TrapLogId  Severity  EventTime  Model  Description
2019-05-12  1604       [major]   05:59:50   14     Network Interface Down: service 1-16
2019-05-12  1605       [major]   05:59:51   14     Network Interface Down: service 1-15
2019-05-13  1619       [minor]   07:58:50   30     Delayed Subscriber Mapping
2019-05-13  1620       [minor]   08:03:49   79     Failed Reload: File syntax
2019-05-13  1621       [clear]   08:04:49   79     Failed Reload Cleared: File syntax
2019-05-13  1622       [clear]   08:28:50   30     Delayed Subscriber Mapping Cleared
2019-05-15  1627       [minor]   01:43:58   22     Misconfigured Network Awareness: 10.1.17.0/24
2019-05-15  1628       [clear]   01:48:58   22     Misconfigured Network Awareness Cleared
How do I parse it? How do I export it into a table using the shell?
The output I want to organize into a table has this shape:
event date 1
header
content 1
event date 2
header
content 2
etc.
I want it as:
event date (as part of the header) header
content 1
content 2
content 3

You can pipe your script's output to:
awk 'BEGIN {
    # print the combined header once, followed by a blank line
    print "EVENT DATE    TrapLogId     Severity     EventTime   Model  Description"
    print
}
/EVENT DATE/ { date = $3 }    # remember the date for the rows that follow
match($3, "[0-9][0-9]:[0-9][0-9]:[0-9][0-9]") {
    # data row: print the date plus the first four fields at fixed widths
    printf( "%-14s%-14s%-13s%-12s%-3s", date, $1, $2, $3, $4)
    # blank out the consumed fields so only the Description is left in $0
    for(i=1;i<=4;i++) $i=""
    print
}
'
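If you save the body of the program (everything between the single quotes) to a file, say organize.awk (a hypothetical name, as is events.sh for whatever script produces the report), the whole pipeline becomes:
./events.sh | awk -f organize.awk > events_table.txt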

$ cat tst.awk
BEGIN { OFS="\t"; dateTag="EVENT DATE" }
{ gsub(/^[[:space:]]+|[[:space:]]+$/,"") }   # trim leading/trailing whitespace
/^[^0-9]/ {
    if ( $0 ~ dateTag ) {
        date = $NF                           # remember the current event date
    }
    else if ( !doneHdr++ ) {
        numCols = NF                         # the header fixes the column count
        gsub(/[[:space:]]+/,OFS)
        print dateTag, $0                    # print the combined header once
    }
}
/^[0-9]/ {
    rest = desc = $0
    sub("([[:space:]]+[^[:space:]]+){"(NF-numCols)+1"}$","",rest)   # keep only the fixed fields
    sub("^([^[:space:]]+[[:space:]]+){"numCols-1"}","",desc)        # keep only the Description
    gsub(/[[:space:]]+/,OFS,rest)
    print date, rest, desc
}
$ awk -f tst.awk file | column -s$'\t' -t
EVENT DATE  TrapLogId  Severity  EventTime  Model  Description
2019-05-12  1604       [major]   05:59:50   14     Network Interface Down: service 1-16
2019-05-12  1605       [major]   05:59:51   14     Network Interface Down: service 1-15
2019-05-13  1619       [minor]   07:58:50   30     Delayed Subscriber Mapping
2019-05-13  1620       [minor]   08:03:49   79     Failed Reload: File syntax
2019-05-13  1621       [clear]   08:04:49   79     Failed Reload Cleared: File syntax
2019-05-13  1622       [clear]   08:28:50   30     Delayed Subscriber Mapping Cleared
2019-05-15  1627       [minor]   01:43:58   22     Misconfigured Network Awareness: 10.1.17.0/24
2019-05-15  1628       [clear]   01:48:58   22     Misconfigured Network Awareness Cleared
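Note the design choice here: the script joins the fixed columns with tabs (OFS="\t") precisely so that column -s$'\t' -t can align the result afterwards; since the Description itself contains spaces, splitting on whitespace alone could never keep it in a single column.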

Related

bash - count, process, and increment through multiple "Tasks" in a log file

I have log files that are broken down into between 1 and 4 "Tasks". In each "Task" there are sections for "WU Name" and "estimated CPU time remaining". Ultimately, I want the bash script output to look like this 3-Task example:
Task 1 Mini_Protein_binds_COVID-19_boinc_ 0d:7h:44m:28s
Task 2 shapeshift_pair6_msd4X_4_f_e0_161_ 0d:4h:14m:22s
Task 3 rep730_0078_symC_reordered_0002_pr 1d:1h:38m:41s
So far, I can count the Tasks in the log; I can isolate the x number of characters I want from the "WU Name"; I can convert the "estimated CPU time remaining" in seconds to days:hours:minutes:seconds; and I can output all of that in 'pretty' columns. The problem is that I can only process 1 Task, using:
# Initialize counter
counter=1
# Count how many iterations
cnt_wu=`grep -c "WU name:" /mnt/work/sec-conv/bnc-sample3.txt`
# Iterate the loop cnt_wu times
while [ $counter -le ${cnt_wu} ]
do
    core_cnt=$counter
    wu=`cat /mnt/work/sec-conv/bnc-sample3.txt | grep -Po 'WU name: \K.*' | cut -c1-34`
    sec=`cat /mnt/work/sec-conv/bnc-sample3.txt | grep -Po 'estimated CPU time remaining: \K.*' | cut -f1 -d"."`
    dhms=`printf '%dd:%dh:%dm:%ds\n' $(($sec/86400)) $(($sec%86400/3600)) $(($sec%3600/60)) $(($sec%60))`
    echo "Task ${core_cnt}" $'\t' $wu $'\t' $dhms | column -ts $'\t'
    counter=$((counter + 1))
done
Note: /mnt/work/sec-conv/bnc-sample3.txt is a static one-Task sample used only for this script's development.
What I can't figure out is the next step: processing x number of Tasks. I can't figure out how to leverage the while/counter combination properly, or how to increment through the occurrences of Tasks.
Adding bnc-sample.txt (contains 3 Tasks)
1) -----------
name: Rosetta@home
master URL: https://boinc.bakerlab.org/rosetta/
user_name: XXXXXXX
team_name:
resource share: 100.000000
user_total_credit: 10266.993660
user_expavg_credit: 512.420495
host_total_credit: 10266.993660
host_expavg_credit: 512.603669
nrpc_failures: 0
master_fetch_failures: 0
master fetch pending: no
scheduler RPC pending: no
trickle upload pending: no
attached via Account Manager: no
ended: no
suspended via GUI: no
don't request more work: no
disk usage: 0.000000
last RPC: Wed Jun 10 15:55:29 2020
project files downloaded: 0.000000
GUI URL:
name: Message boards
description: Correspond with other users on the Rosetta#home message boards
URL: https://boinc.bakerlab.org/rosetta/forum_index.php
GUI URL:
name: Your account
description: View your account information
URL: https://boinc.bakerlab.org/rosetta/home.php
GUI URL:
name: Your tasks
description: View the last week or so of computational work
URL: https://boinc.bakerlab.org/rosetta/results.php?userid=XXXXXXX
jobs succeeded: 117
jobs failed: 0
elapsed time: 2892439.609931
cross-project ID: 3538b98e5f16a958a6bdd2XXXXXXXXX
======== Tasks ========
1) -----------
name: shapeshift_pair6_msd4X_4_f_e0_161_X_0001_0001_fragments_abinitio_SAVE_ALL_OUT_946179_730_0
WU name: shapeshift_pair6_msd4X_4_f_e0_161_X_0001_0001_fragments_abinitio_SAVE_ALL_OUT_946179_730
project URL: https://boinc.bakerlab.org/rosetta/
received: Mon Jun 8 09:58:08 2020
report deadline: Thu Jun 11 09:58:08 2020
ready to report: no
state: downloaded
scheduler state: scheduled
active_task_state: EXECUTING
app version num: 420
resources: 1 CPU
estimated CPU time remaining: 26882.771040
slot: 1
PID: 28434
CPU time at last checkpoint: 3925.896000
current CPU time: 4314.761000
fraction done: 0.066570
swap size: 431 MB
working set size: 310 MB
2) -----------
name: rep730_0078_symC_reordered_0002_propagated_0001_0001_0001_A_v9_fold_SAVE_ALL_OUT_946618_54_0
WU name: rep730_0078_symC_reordered_0002_propagated_0001_0001_0001_A_v9_fold_SAVE_ALL_OUT_946618_54
project URL: https://boinc.bakerlab.org/rosetta/
received: Mon Jun 8 09:58:08 2020
report deadline: Thu Jun 11 09:58:08 2020
ready to report: no
state: downloaded
scheduler state: scheduled
active_task_state: EXECUTING
app version num: 420
resources: 1 CPU
estimated CPU time remaining: 26412.937920
slot: 2
PID: 28804
CPU time at last checkpoint: 3829.626000
current CPU time: 3879.975000
fraction done: 0.082884
swap size: 628 MB
working set size: 513 MB
3) -----------
name: Mini_Protein_binds_COVID-19_boinc_site3_2_SAVE_ALL_OUT_IGNORE_THE_REST_0aw6cb3u_944116_2_0
WU name: Mini_Protein_binds_COVID-19_boinc_site3_2_SAVE_ALL_OUT_IGNORE_THE_REST_0aw6cb3u_944116_2
project URL: https://boinc.bakerlab.org/rosetta/
received: Mon Jun 8 09:58:47 2020
report deadline: Thu Jun 11 09:58:46 2020
ready to report: no
state: downloaded
scheduler state: scheduled
active_task_state: EXECUTING
app version num: 420
resources: 1 CPU
estimated CPU time remaining: 27868.559616
slot: 0
PID: 30988
CPU time at last checkpoint: 1265.356000
current CPU time: 1327.603000
fraction done: 0.032342
swap size: 792 MB
working set size: 668 MB
Again, I appreciate any guidance!
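A sketch of one way to do this in a single pass (this is not from the thread; the file path and the 34-character truncation follow the question). Each Task block contains one "WU name:" line followed later by one "estimated CPU time remaining:" line, so awk can pair them up and number the Tasks without an outer counter loop:
awk '
/WU name:/ { name = substr($NF, 1, 34) }   # keep the first 34 chars of the WU name
/estimated CPU time remaining:/ {
    sec  = int($NF)                        # drop the fractional seconds
    dhms = sprintf("%dd:%dh:%dm:%ds", sec/86400, sec%86400/3600, sec%3600/60, sec%60)
    printf "Task %d\t%s\t%s\n", ++n, name, dhms
}
' /mnt/work/sec-conv/bnc-sample3.txt | column -ts $'\t'
Tasks are numbered in the order they appear in the log; each "estimated CPU time remaining" is paired with the most recent "WU name" above it.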

How to fetch log content between two dates in Windows PowerShell?

In our project, we are getting AppDynamics logs (application logs) and machine logs, and sometimes the size of the logs grows until it eats up the disk. What I am trying to do is keep the content between two dates, like 10 Nov and 13 Nov, and delete the rest. Since we are working in a Windows environment, this needs to be done in PowerShell. It is easier to handle such things in Linux, but I am not good at PowerShell scripting. Below are the logs and the code snippet.
[AD Thread-Metric Reporter1] 10 Nov 2019 14:47:32,899 ERROR ManagedMonitorDelegate - Error sending metrics - will requeue for later transmission
com.singularity.ee.agent.commonservices.metricgeneration.metrics.MetricSendException: Connection back off limitation in effect: /controller/instance/702/metrics
at com.singularity.ee.agent.commonservices.metricgeneration.AMetricSubscriber.publish(AMetricSubscriber.java:350)
at com.singularity.ee.agent.commonservices.metricgeneration.MetricReporter.run(MetricReporter.java:113)
at com.singularity.ee.util.javaspecific.scheduler.AgentScheduledExecutorServiceImpl$SafeRunnable.run ]
[AD Thread-Metric Reporter1] 11 Nov 2019 14:46:32,899 ERROR ManagedMonitorDelegate - Error sending metrics - will requeue for later transmission
com.singularity.ee.agent.commonservices.metricgeneration.metrics.MetricSendException: Connection back off limitation in effect: /controller/instance/702/metrics
at com.singularity.ee.agent.commonservices.metricgeneration.AMetricSubscriber.publish(AMetricSubscriber.java:350)
at com.singularity.ee.agent.commonservices.metricgeneration.MetricReporter.run(MetricReporter.java:113)
at com.singularity.ee.util.javaspecific.scheduler.AgentScheduledExecutorServiceImpl$SafeRunnable.run ]
[extension-scheduler-pool-5] 13 Nov 2019 18:45:40,634 INFO ReportMetricsConfigSupplier - Basic metrics will be collected and reported through the SIM extension because SIM is enabled.
[extension-scheduler-pool-8] 14 Nov 2019 18:47:18,650 INFO ReportMetricsConfigSupplier - Basic metrics will be collected and reported through the SIM extension because SIM is enabled. ]
Code Snippet with file paths
# Get Start Time
$startDTM = (Get-Date)
$zstart = Read-Host -prompt '
Enter your start date in "10 Nov" format. Start date must be earlier than stop date.'
$zstop = Read-Host -prompt 'Enter your stop date in "13 Nov" format. Stop date must be later than start date.'
# $zstart = '10 Nov'
# $zstop = '13 Nov'
$zstart= Select-String $zstart "$env:userprofile\Desktop\machine-log.txt" | Select-Object -ExpandProperty LineNumber
$zstop= Select-String $zstop "$env:userprofile\Desktop\machine-log.txt" | Select-Object -ExpandProperty LineNumber
$AppLog = gc $env:userprofile\Desktop\machine-log.txt
$i = 0
$array = @()
foreach ($line in $AppLog){
foreach-object { $i++ }
if (($i -ge $zstart) -and ($i -le $zstop))
{$array += $line}}
$array | Out-File -encoding ascii -filepath $env:userprofile\Desktop\logfile-output.txt
The error I get while executing the script:
8923 8924 8925 8926 8927 8928 8929 8930 8931 8932 8933 8934 8935 8936 8937 8938 8939 8940 8941 8942 8943
8944 8945 8946 8947 8948". Error: "Cannot convert the "System.Object[]" value of type "System.Object[]" to
type "System.Int32"."
At C:\Users\xa_abbasmn\Documents\Logs\test.ps1:13 char:5
+ if (($i -ge $zstart) -and ($i -le $zstop))
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (:) [], RuntimeException
+ FullyQualifiedErrorId : ComparisonFailure
Could not compare "18" to "1 15 16 17 31 45 46 47 48 62 76 77 91 105 106 107 121 135 136 137 151 152 196 210
211 225 239 240 241 255 269 270 271 285 299 300 314 315 329 330 331 332 346 390 404 447 448 449 463 477 478
492 506 507 508 522 536 537 538 552 566 567 581 582 583 627 641 642 643 657 671 685 686 687
The PowerShell and Windows versions:
Name: Windows PowerShell ISE Host - Version : 5.1.14409.1018
Name: Microsoft Windows Server 2012 R2 Standard 64bit
Your help will be highly appreciated.
Best regards,
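A note on the error (a sketch, not from the thread): Select-String returns one match object per matching line, so $zstart and $zstop hold arrays of line numbers whenever a date occurs on more than one line, and PowerShell cannot compare the Int32 counter $i against a System.Object[]. Reducing the boundaries to scalars avoids the comparison failure:
# Take the first matching line for the start and the last one for the stop
$zstart = Select-String $zstart "$env:userprofile\Desktop\machine-log.txt" |
          Select-Object -First 1 -ExpandProperty LineNumber
$zstop  = Select-String $zstop "$env:userprofile\Desktop\machine-log.txt" |
          Select-Object -Last 1 -ExpandProperty LineNumber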
Windows 10 64-bit. PowerShell 5. Does not require admin privileges.
How to quickly and efficiently extract text from a large logfile from input1 to input2 using powershell 5?
Sample logfile below.
For testing purposes copy your logfile to the desktop and name it logfile.txt
What is the default text editor in Windows Server 2012 R2 Standard 64-bit? What program do you have associated with .txt files? (See the start-process notepad line near the end of the script.)
# Get Start Time
$startDTM = (Get-Date)
$zstart = Read-Host -prompt '
Enter your start date in "10 Nov" format (w/o quotes). Start date must be earlier than stop date.'
$zstop = Read-Host -prompt 'Enter your stop date in "13 Nov" format (w/o quotes). Stop date must be later than start date.'
# $zstart = '10 Nov'
# $zstop = '13 Nov'
$zstart= Select-String $zstart "$env:userprofile\Desktop\logfile.txt" | Select-Object -ExpandProperty LineNumber
$zstop= Select-String $zstop "$env:userprofile\Desktop\logfile.txt" | Select-Object -ExpandProperty LineNumber
$AppLog = gc $env:userprofile\Desktop\logfile.txt
$i = 0
$array = @()
foreach ($line in $AppLog) {
    foreach-object { $i++ }
    if (($i -ge $zstart) -and ($i -le $zstop))
    { $array += $line }
}
$array | Out-File -encoding ascii -filepath $env:userprofile\Desktop\logfile-edited.txt
# begin get file size
$y = (Get-ChildItem "$env:userprofile\Desktop\logfile.txt" | Measure-Object -property length -sum)
$y = [System.Math]::Round(($y.Sum /1KB),2)
$z = (Get-ChildItem "$env:userprofile\Desktop\logfile-edited.txt" | Measure-Object -property length -sum)
$z = [System.Math]::Round(($z.Sum /1KB),2)
# end get file size
# get Stop Time
$endDTM = (Get-Date)
Write-Host "
Extracted $z KB from $y KB. The extraction took $(($endDTM-$startDTM).totalseconds) seconds"
start-process -wait notepad $env:userprofile\Desktop\logfile-edited.txt
remove-item $env:userprofile\Desktop\logfile-edited.txt
exit
logfile.txt:
[AD Thread-Metric Reporter1] 09 Nov 2019 14:48:32,899 ERROR ManagedMonitorDelegate - Error sending metrics
com.singularity.ee.agent.commonservices.metricgeneration.metrics.MetricSendException: Connection back off limitation
at com.singularity.ee.agent.commonservices.metricgeneration.AMetricSubscriber.publish(AMetricSubscriber.java:350)
at com.singularity.ee.agent.commonservices.metricgeneration.MetricReporter.run(MetricReporter.java:113)
at com.singularity.ee.util.javaspecific.scheduler.AgentScheduledExecutorServiceImpl$SafeRunnable.run ]
[AD Thread-Metric Reporter1] 10 Nov 2019 14:47:32,899 ERROR ManagedMonitorDelegate - Error sending metrics
com.singularity.ee.agent.commonservices.metricgeneration.metrics.MetricSendException: Connection back off limitation
at com.singularity.ee.agent.commonservices.metricgeneration.AMetricSubscriber.publish(AMetricSubscriber.java:350)
at com.singularity.ee.agent.commonservices.metricgeneration.MetricReporter.run(MetricReporter.java:113)
at com.singularity.ee.util.javaspecific.scheduler.AgentScheduledExecutorServiceImpl$SafeRunnable.run ]
[AD Thread-Metric Reporter1] 11 Nov 2019 14:46:32,899 ERROR ManagedMonitorDelegate - Error sending metrics
com.singularity.ee.agent.commonservices.metricgeneration.metrics.MetricSendException: Connection back off limitation
at com.singularity.ee.agent.commonservices.metricgeneration.AMetricSubscriber.publish(AMetricSubscriber.java:350)
at com.singularity.ee.agent.commonservices.metricgeneration.MetricReporter.run(MetricReporter.java:113)
at com.singularity.ee.util.javaspecific.scheduler.AgentScheduledExecutorServiceImpl$SafeRunnable.run ]
[extension-scheduler-pool-5] 13 Nov 2019 18:45:40,634 INFO ReportMetricsConfigSupplier - Basic metrics will be collected
[extension-scheduler-pool-8] 14 Nov 2019 18:47:18,650 INFO ReportMetricsConfigSupplier - Basic metrics will be collected
Results of running script on logfile.txt:
[AD Thread-Metric Reporter1] 10 Nov 2019 14:47:32,899 ERROR ManagedMonitorDelegate - Error sending metrics
com.singularity.ee.agent.commonservices.metricgeneration.metrics.MetricSendException: Connection back off limitation
at com.singularity.ee.agent.commonservices.metricgeneration.AMetricSubscriber.publish(AMetricSubscriber.java:350)
at com.singularity.ee.agent.commonservices.metricgeneration.MetricReporter.run(MetricReporter.java:113)
at com.singularity.ee.util.javaspecific.scheduler.AgentScheduledExecutorServiceImpl$SafeRunnable.run ]
[AD Thread-Metric Reporter1] 11 Nov 2019 14:46:32,899 ERROR ManagedMonitorDelegate - Error sending metrics
com.singularity.ee.agent.commonservices.metricgeneration.metrics.MetricSendException: Connection back off limitation
at com.singularity.ee.agent.commonservices.metricgeneration.AMetricSubscriber.publish(AMetricSubscriber.java:350)
at com.singularity.ee.agent.commonservices.metricgeneration.MetricReporter.run(MetricReporter.java:113)
at com.singularity.ee.util.javaspecific.scheduler.AgentScheduledExecutorServiceImpl$SafeRunnable.run ]
[extension-scheduler-pool-5] 13 Nov 2019 18:45:40,634 INFO ReportMetricsConfigSupplier - Basic metrics will be collected
PowerShell in four hours at YouTube
Parse and extract text with PowerShell at Bing
How to quickly and efficiently extract text from a large logfile from line x to line y using powershell 5?
How to quickly and efficiently extract text from a large logfile from date1 to date2 using powershell 5?

rsyslog - Avoid pushing certain logs to /var/log/messages

I have an EC2 Linux server, and I am tracking the logs of my application server using rsyslog so that I can push these logs to Loggly.
The problem is, rsyslog is also logging these in /var/log/messages, which I don't want. Is there any way to avoid this? Can I filter out certain messages in /etc/rsyslog.conf so that they are not pushed to /var/log/messages?
****** UPDATE *******
I tried adding the following lines in rsyslog.conf:
if $programname == 'programName' then {
    *.err /var/log/messages
} else {
    *.info;mail.none;authpriv.none;cron.none /var/log/messages
}
However, upon restarting rsyslog, I see the following error:
Dec 11 08:01:46 <hostname> rsyslogd: the last error occured in /etc/rsyslog.conf, line 37:"if $programname == 'programName' then {"
Dec 11 08:01:46 <hostname> rsyslogd: warning: selector line without actions will be discarded
Dec 11 08:01:46 <hostname> rsyslogd-3000: unknown priority name "" [try http://www.rsyslog.com/e/3000 ]
Dec 11 08:01:46 <hostname> rsyslogd: the last error occured in /etc/rsyslog.conf, line 39:"} else {"
Dec 11 08:01:46 <hostname> rsyslogd: warning: selector line without actions will be discarded
Dec 11 08:01:46 <hostname> rsyslogd-3000: unknown priority name "" [try http://www.rsyslog.com/e/3000 ]
Dec 11 08:01:46 <hostname> rsyslogd: the last error occured in /etc/rsyslog.conf, line 41:"}"
Dec 11 08:01:46 <hostname> rsyslogd: warning: selector line without actions will be discarded
I suppose my version of rsyslog (5.8.10) doesn't support if / else. Is there any other way to do this?
Thanks.
First send the message to the file that you want, then use stop to prevent further actions:
if $programname == 'apache2' then {
    action(type="omfile" file="/var/log/apache2/rewrite.log" name="action-omfile-apache2-rewrite")
    stop
}
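Note that this action()/stop syntax is RainerScript, which requires rsyslog v7 or later, so on 5.8.10 it fails with exactly the kind of errors shown above. As a sketch, the legacy equivalent for rsyslog 5.x is a property-based filter followed by the discard action (~), with & applying the next action to the same filter:
:programname, isequal, "apache2"    /var/log/apache2/rewrite.log
& ~
These lines must appear before the catch-all rule that writes to /var/log/messages, so the matching messages are discarded before they reach it.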

Shell script to parse a log and convert it to CSV

I need a shell script to parse a log file and look for a certain pattern. If that pattern is found, it should take key values from that line and put them into a CSV.
Example:
Here is the log file I have:
*webauthRedirect: Mar 24 08:57:50.903: #EMWEB-6-PARSE_ERROR: webauth_redirect.c:1034 parser exited. client mac= a0:88:b4:d3:55:8c bytes parsed = 0 and bytes read = 213
*webauthRedirect: Mar 24 08:57:50.903: #EMWEB-6-HTTP_REQ_BEGIN_ERR: http_parser.c:579 http request should begin with a character
*ewmwebWebauth1: Mar 04 11:33:46.870: #PEM-6-GUESTIN: pem_api.c:7851 Guest user logged in with user account (mrathi_dev) MAC address 00:1e:65:39:10:8e, IP address 192.168.133.146.
*ewmwebWebauth1: Mar 04 11:33:46.870: #AAA-5-AAA_AUTH_NETWORK_USER: aaa.c:2178 Authentication succeeded for network user 'mrathi_dev'
*ewmwebWebauth1: Mar 04 11:33:46.858: #APF-6-USER_NAME_CREATED: apf_ms.c:6532 Username entry (mrathi_dev) with length (10) created for mobile 00:1e:65:39:10:8e
*mmListen: Mar 24 08:57:49.030: #APF-6-RADIUS_OVERRIDE_DISABLED: apf_ms_radius_override.c:1085 Radius overrides disabled, ignoring source 4
*webauthRedirect: Mar 24 08:57:47.008: #EMWEB-6-PARSE_ERROR: webauth_redirect.c:1034 parser exited. client mac= 5c:a:5b:60:f1:a7 bytes parsed = 0 and bytes read = 440
*webauthRedirect: Mar 24 08:57:47.008: #EMWEB-6-HTTP_REQ_BEGIN_ERR: http_parser.c:579 http request should begin with a character
*webauthRedirect: Mar 24 08:57:45.453: #EMWEB-6-PARSE_ERROR: webauth_redirect.c:1034 parser exited. client mac= 5c:a:5b:60:f1:a7 bytes parsed = 0 and bytes read = 440
*webauthRedirect: Mar 24 08:57:45.453: #EMWEB-6-HTTP_REQ_BEGIN_ERR: http_parser.c:579 http request should begin with a character
All I am interested in is the #PEM-6-GUESTIN line. I need to take the user ID, MAC, and IP address from that line and put them in a CSV. Only log lines with that status are required.
This is my first time working with shell scripts, and all your help would be appreciated.
I think it is easiest to use grep to filter and sed to extract the groups with a regex:
grep "#PEM-6-GUESTIN" log.txt | sed -r "s/.*user account \((.*)\).* MAC address (.*), IP address (.*)\.\*\*.*/\1,\2,\3/"
And the output is in CSV format:
mrathi_dev,00:1e:65:39:10:8e,192.168.133.146
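To write the rows to a file rather than the terminal, redirect the pipeline's output (guest_logins.csv is a placeholder name):
grep "#PEM-6-GUESTIN" log.txt | sed -r "s/.*user account \((.*)\) MAC address (.*), IP address (.*)\..*/\1,\2,\3/" > guest_logins.csv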

Expect script to simulate SNMP

I want to monitor a device that doesn't support SNMP, so I have tried to get a counter via an expect script. This script connects to the device using SSH, logs the output to a file, and then parses the output to get the desired counter.
When I execute the script from the console I get the following desired output:
root@box:/path# ./GGSN-PDP-Contexts.expect
.1.3.6.1.4.1.6147.2.1
Integer32
310838
However, when I try to get the result using snmpget, it doesn't work!
root@box:/path# snmpget -m TDP-MIB -v 2c -c TM_Com_Pub localhost .1.3.6.1.4.1.6147.2.1
TDP-MIB::PDPContextsNumber = No Such Instance currently exists at this OID
By the way, this is the relevant configuration in snmpd.conf:
pass .1.3.6.1.4.1.6147.2.1 /usr/bin/expect /path/GGSN-PDP-Contexts.expect
And this is the expect script that I'm using:
#!/usr/bin/expect -f
# Constants
set user "user"
set device "10.10.222.176"
set pass "blablabla"
set timeout -1
set prompt "GGSN-LV02#"
set file "./GGSN-PDP-Contexts.log"
# Options
match_max 100000
log_user 0
# Access to device
spawn ssh -oStrictHostKeyChecking=no -oCheckHostIP=no $user@$device
expect "*?assword:*"
send -- "$pass\r"
# Commands execution
expect -exact "$prompt"
send -- "display pdp-number\r"
log_file -a $file
# Logging
expect -exact "$prompt"
log_file
send -- "quit\r"
# Get the value
set result [exec cat $file | grep "ALL GTP" | cut -d " " -f14]
set value [format %d $result]
# Print the value
puts ".1.3.6.1.4.1.6147.2.1"
puts "Integer32"
puts $value   ;# If I replace the $value with a number, it doesn't work either
# Erase log file
exec rm $file
close
Could you give me any hint? Thanks in advance!
EDIT:
In addition, these are the last lines of snmpget's debug output:
trace: snmp_comstr_parse(): snmp_auth.c, 135:
dumph_recv: SNMP version
dumpx_recv: 02 01 01
dumpv_recv: Integer: 1 (0x01)
trace: snmp_comstr_parse(): snmp_auth.c, 147:
dumph_recv: community string
dumpx_recv: 04 0A 54 4D 5F 43 6F 6D 5F 50 75 62
dumpv_recv: String: TM_Com_Pub
trace: _snmp_parse(): snmp_api.c, 4149:
dumph_recv: PDU
trace: snmp_pdu_parse(): snmp_api.c, 4255:
dumpv_recv: Command RESPONSE
trace: snmp_pdu_parse(): snmp_api.c, 4336:
dumph_recv: request_id
dumpx_recv: 02 04 3B 9E CF 74
dumpv_recv: Integer: 1000263540 (0x3B9ECF74)
trace: snmp_pdu_parse(): snmp_api.c, 4347:
dumph_recv: error status
dumpx_recv: 02 01 00
dumpv_recv: Integer: 0 (0x00)
trace: snmp_pdu_parse(): snmp_api.c, 4358:
dumph_recv: error index
dumpx_recv: 02 01 00
dumpv_recv: Integer: 0 (0x00)
trace: snmp_pdu_parse(): snmp_api.c, 4376:
dumph_recv: VarBindList
trace: snmp_pdu_parse(): snmp_api.c, 4406:
dumph_recv: VarBind
trace: snmp_parse_var_op(): snmp.c, 166:
dumph_recv: Name
dumpx_recv: 06 09 2B 06 01 04 01 B0 03 02 01
dumpv_recv: ObjID: TDP-MIB::PDPContextsNumber
trace: snmp_pdu_parse(): snmp_api.c, 4415:
dumph_recv: Value
TDP-MIB::PDPContextsNumber = No Such Instance currently exists at this OID
Also, this is my current MIB:
TDP-MIB DEFINITIONS ::= BEGIN

IMPORTS
    MODULE-IDENTITY, OBJECT-TYPE, Integer32, enterprises
        FROM SNMPv2-SMI
    OBJECT-GROUP
        FROM SNMPv2-CONF;

TDP MODULE-IDENTITY
    LAST-UPDATED "201210080000Z" -- 8/oct/2012
    ORGANIZATION "TELEFONICA"
    CONTACT-INFO "Authors: Hernan Romano / Antonio Ocampo
                  Email: h.romanoc@pucp.edu.pe / aocampo@pucp.edu.pe"
    DESCRIPTION  "MIB for managing devices that lack SNMP"
    REVISION     "201210080000Z" -- 08/oct/2012
    DESCRIPTION  "Revision 2.1"
    ::= { enterprises 6147 }

Nokia             OBJECT IDENTIFIER ::= { TDP 1 }
Huawei            OBJECT IDENTIFIER ::= { TDP 2 }
TDPMIBConformance OBJECT IDENTIFIER ::= { TDP 3 }

ClearCodeGroup1 OBJECT-TYPE
    SYNTAX      Integer32
    MAX-ACCESS  read-only
    STATUS      current
    DESCRIPTION "Clear Code Group 1"
    ::= { Nokia 1 }

PDPContextsNumber OBJECT-TYPE
    SYNTAX      Integer32
    MAX-ACCESS  read-only
    STATUS      current
    DESCRIPTION "PDP Contexts Number"
    ::= { Huawei 1 }

TDPMIBGroup OBJECT IDENTIFIER
    ::= { TDPMIBConformance 1 }

--grupoTDP OBJECT-GROUP
--    OBJECTS {
--        ClearCodeGroup1,
--        PDPContextsNumber
--    }
--    STATUS current
--    DESCRIPTION "Objects for monitoring devices that lack SNMP"
--    ::= { TDPMIBGroup 1 }

END
Finally solved :) The trouble was the file path: snmpd runs pass scripts from its own working directory (typically /), not from the script's directory, so the relative log path resolved somewhere unexpected.
Instead of relative path:
set file "./GGSN-PDP-Contexts.log"
I put the absolute path:
set file "/FULL_PATH/GGSN-PDP-Contexts.log"
and snmpget works!!
root@box:/path# snmpget -m TDP-MIB -v 2c -c TM_Com_Pub localhost .1.3.6.1.4.1.6147.2.1
TDP-MIB::PDPContextsNumber = INTEGER: 319291
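A quick way to reproduce this class of failure (a sketch: for GET requests, snmpd invokes a pass script as PROG -g OID) is to run the script the way the daemon does, from a different working directory:
cd / && /usr/bin/expect /path/GGSN-PDP-Contexts.expect -g .1.3.6.1.4.1.6147.2.1
If the OID/type/value triple prints correctly there, snmpget should work as well.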
