The following is on macOS High Sierra with Perl v5.28.2. The version of Net::MAC::Vendor is 1.265. This is sort of a repeat post; my prior post lacked a lot of detail.
I am trying to put together a script to list IP addresses, MAC addresses, and vendors on a network. The Net::MAC::Vendor::lookup function is returning a timeout error, amongst other things. I have checked a few IEEE links that were supposed to have the OUI data, but they are all dead, returning no data. I have seen a number of mentions stating that the file can be found in some Linux installations, but I have searched high and low and have not found an oui.txt file on my system. If I downloaded a copy, I wouldn't know where to put it or how to get the Net::MAC::Vendor functions to locate it. Also, if I did find a working link, I still wouldn't know how to direct the vendor lookup function to use it.
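(The closest lead I have found is Net::MAC::Vendor::load_cache, which the docs say can be given an alternate source before lookup() is called. The sketch below is what I would try if I had a local oui.txt, but I have not been able to verify that it works; the file path is just a placeholder.)
#!/usr/bin/perl
use strict;
use warnings;
use Net::MAC::Vendor;

# Hypothetical local copy of the IEEE OUI data; adjust the path.
# Whether load_cache() takes a plain path or needs a file:// URL may
# depend on the module version, so treat this as untested.
my $oui_file = '/Users/{username}/oui.txt';
Net::MAC::Vendor::load_cache( $oui_file );

# With the cache pre-loaded, lookup() should not need to contact the IEEE site.
my $vendor_info = Net::MAC::Vendor::lookup('D8:D7:75:00:00:00');
print $vendor_info->[0], "\n" if $vendor_info;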
The errors I am getting are as follows:
Use of uninitialized value in concatenation (.) or string at /Users/{username}/perl5/perlbrew/perls/perl-5.28.2/lib/site_perl/5.28.2/Net/MAC/Vendor.pm line 320.
Failed fetching [https://services13.ieee.org/RST/standards-ra-web/rest/assignments/download/?registry=MA-L&format=html&text=D8-D7-75] HTTP status []
message [Connect timeout] at simplemacvendor.pl line 23.
Could not fetch data from the IEEE! at simplemacvendor.pl line 23.
The sample code:
#!/usr/bin/perl
use strict;
use feature qw(say);
use Data::Dumper qw(Dumper);
use Net::MAC::Vendor;
open(ARP, "arp -na|") || die "Failed $!\n";
my @arp_table;
while (<ARP>) {
    if ($_ =~ m/incomplet/) { next; }
    if ($_ =~ m/Address/)   { next; }
    my @line = split(' ', $_);
    my $computer = {};
    $line[1] =~ s/(\()([0-9\.]*)(\))/$2/;
    $computer->{ip}  = $line[1];
    $computer->{mac} = $line[3];
    $computer->{if}  = $line[5];
    say Dumper($computer);
    # Get vendor info
    my $vendor_info = Net::MAC::Vendor::lookup( $computer->{mac} ); # line 23
    $computer->{vendor} = $vendor_info->[0];
    push @arp_table, $computer;
}
print "ARP Table with vendors:\n";
for my $i (0 .. $#arp_table) {
    print "$arp_table[$i]{ip}\t";
    print "$arp_table[$i]{if}\t";
    print "$arp_table[$i]{mac}\t";
    print "$arp_table[$i]{vendor}";
    print "\n";
}
Related
I needed to write a solution to write data to, and then print, RFID labels en masse, each generated as a .png image from a template Python script, with data taken from a database or an Excel file.
To print, the program simply calls the relevant system utility (CUPS on Unix systems) via subprocess.check_call(print_cmd), passing the image file (saved on a RAM-mounted file system for minimal disk usage).
Now it also needs to run on Windows systems, but there is not really a decent system utility for that, and the solutions under a similar question ("command line tool for print picture?") don't account for print-job completion or for the job ending in an error; also, the margins are all screwed up and the image is always rotated 90 degrees for some reason.
How can I sanely print an image using a command or a script in Windows, and wait for it to complete successfully or get an error back if the job fails?
Possibly with no dependencies.
If you can install dependencies, there are many programs that offer a solution out-of-the-box.
The only sane way I could find to solve this with no dependencies is to create a PowerShell script that accounts for it:
[CmdletBinding()]
param (
    [string] $file = $(throw "parameter is mandatory"),
    [string] $printer = "EXACT PRINTER NAME HERE"
)
$ERR = "UserIntervention|Error|Jammed"
$status = (Get-Printer -Name $printer).PrinterStatus.ToString()
if ($status -match $ERR) { exit 1 }
# https://stackoverflow.com/a/20402656/17350905
# only sends the print job to the printer
rundll32 C:\Windows\System32\shimgvw.dll,ImageView_PrintTo $file $printer
# wait until the printer is in "Printing" status
do {
    $status = (Get-Printer -Name $printer).PrinterStatus.ToString()
    if ($status -match $ERR) { exit 1 }
    Start-Sleep -Milliseconds 100
} until ( $status -eq "Printing" )
# wait until printing is done
do {
    $status = (Get-Printer -Name $printer).PrinterStatus.ToString()
    if ($status -match $ERR) { exit 1 }
    Start-Sleep -Milliseconds 100
} until ( $status -eq "Normal" )
I would then need to slightly modify the print subprocess call to
powershell -File "path\to\print.ps1" "C:\absolute\path\to\file.png"
Then there are a couple of necessary setup steps:
(Disclaimer: I don't use Windows in English, so I don't know exactly what the English names of these things are; I will use italics for them.)
Create an example image, right-click it, and select Print.
From the print dialog that opens, set up all the default options you want (orientation, margins, paper type, etc.) for the specific printer you're going to use.
Go to the printer settings; under Tools, edit Printer Status Monitoring.
Set the monitoring frequency to "only during print jobs"; it should be disabled by default.
In the next tab, set the polling frequency to the minimum available, 100 ms, during print jobs (you can use a lower frequency for the "while not printing" option).
Assuming the following:
only your program is running this script
there's always only one print job at a time for a given printer
the printer drivers were not written by a monkey and they actually report the current, correct printer status
This little hack will let you print an image from a command and await job completion, with error handling, using only software preinstalled on Windows.
Further optimization could be done by keeping the PowerShell subprocess alive and only passing it commands in the & "path\to\print.ps1" "C:\absolute\path\to\file.png" format, waiting for standard output to report an OK or a KO; but that is only worthwhile if mass printing is required.
Having had to work on this again, I just wanted to add a simpler solution in "pure" Python using the pywin32 package:
import time
import subprocess
from typing import List

try:
    import win32print as wprint

    PRINTERS: List[str] = [p[2] for p in wprint.EnumPrinters(wprint.PRINTER_ENUM_LOCAL)]
    PRINTER_DEFAULT = wprint.GetDefaultPrinter()
    WIN32_SUPPORTED = True
except Exception:
    print("[!!] an error occurred while retrieving printers")
    # you could throw an exception or whatever

# bla bla do other stuff

if "WIN32_SUPPORTED" in globals():
    # illustrative call site; in the real program this runs after the definitions below
    __printImg_win32(file, printer_name)


def __printImg_win32(file: str, printer: str = ""):
    if not printer:
        printer = PRINTER_DEFAULT
    # verify prerequisites here

    # I still prefer to print by calling rundll32 directly,
    # because of the default-printer-settings shenanigans
    # and also because I've reliably used it to spool millions of jobs
    subprocess.check_call(
        [
            "C:\\Windows\\System32\\rundll32",
            "C:\\Windows\\System32\\shimgvw.dll,ImageView_PrintTo",
            file,
            printer,
        ]
    )
    __monitorJob_win32(printer)


def __monitorJob_win32(printer: str, timeout=16.0):
    p = wprint.OpenPrinter(printer)

    # wait for the job to be scheduled
    t0 = time.time()
    while (time.time() - t0) < timeout:
        ptrr = wprint.GetPrinter(p, 2)
        # unsure about those two flags, but they are definitely not errors;
        # they seem to mean "moving paper forward"
        if ptrr["Status"] != 0 and ptrr["Status"] not in [1024, 1048576]:
            raise RuntimeError("Printer is in error (status %d)!" % ptrr["Status"])
        if ptrr["cJobs"] > 0:
            break
        time.sleep(0.1)
    else:
        raise RuntimeError("Printer timeout scheduling job!")

    # await job completion
    t0 = time.time()
    while (time.time() - t0) < timeout:
        ptrr = wprint.GetPrinter(p, 2)
        if ptrr["Status"] != 0 and ptrr["Status"] not in [1024, 1048576]:
            raise RuntimeError("Printer is in error (status %d)!" % ptrr["Status"])
        if ptrr["cJobs"] == 0 and ptrr["Status"] == 0:
            break
        time.sleep(0.1)
    else:
        raise RuntimeError("Printer timeout waiting for completion!")

    wprint.ClosePrinter(p)
Useful additional resources:
Print image files using python
Catch events from printer in Windows
pywin32's win32print "documentation"
I simply want to have a REST API server that I can call to update a file via a URL; that's it.
Here is the file:
mytextfile:
key1 = value1
key2 = value2
On the client, a script will be run which sends a string or strings to the API server.
The API server will receive them, for example /update.script?string1="blah"&string2="fun" (pretend it's URL-encoded).
The server should then parse these strings and call an exec function, or even another script on the system, which runs some sed command to update a file.
Language or implementation doesn't matter.
Looking for fresh ideas.
All suggestions are appreciated.
I don't get it: What exactly is your problem/question?
My approach to the problem "modifying a file from inside a CGI script using URL-encoded arguments" would be:
Pick a language you like and start coding, in my case with Perl.
#!/usr/bin/perl
use strict; use warnings;
Fetch all your arguments. I will use the CGI module of Perl here:
use CGI::Carp;
use CGI;
my $cgi = CGI->new;
# assuming we don't have multivalued fields:
my %arguments = $cgi->Vars; # handles (almost) *all* decoding and splitting
# validate arguments
# send back CGI header to acknowledge the request
# the server will make a HTTP header from that
Now either call a special subroutine / function with them …
updateHandler(%arguments);
...;
my $filename = 'path to yer file name.txt';
sub updateHandler {
    my %arguments = @_;
    # open yer file, loop over yer arguments, whatever

    # read in the file
    open my $fileIn, '<', $filename or die "Can't open file for reading";
    my @lines = <$fileIn>;
    close $fileIn;

    # open the file for writing, completely ignoring concurrency issues:
    open my $fileOut, '>', $filename or die "Can't open file for writing";

    # loop over all lines, make substitutions, and print them out
    foreach my $line (@lines) {
        # assuming a file format with key-value pairs:
        # keys start at the first column
        # and are separated from values by an '=',
        # surrounded by any number of whitespace characters
        my ($key, $value) = split /\s*=\s*/, $line, 2;
        $value = $arguments{$key} // $value;
        # you might want to make sure $value ends with a newline
        print $fileOut $key, " = ", $value;
    }
}
Please don't use this rather insecure and suboptimal code! I just wrote this as a demonstration that this isn't really complicated.
… or contrive a way to send your arguments to another script (although Perl is more than well suited for file manipulation tasks). Choose one of qx{}, system, or exec, depending on what output you need from the script, or decide to pipe your arguments to it using the open my $fh, '|-', $command mode of open.
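For example, a minimal sketch of the piped-open variant, where the helper script name and the one-pair-per-line argument format are invented for illustration:
# Sketch only: pipe the key/value pairs to a hypothetical helper script
# that does the actual file editing, one "key=value" pair per line.
open my $fh, '|-', '/usr/local/bin/update-config.sh'
    or die "Can't run helper: $!";
while ( my ($key, $value) = each %arguments ) {
    print {$fh} "$key=$value\n";
}
close $fh or die "Helper exited with status $?";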
As for the server to run this script on: Apache looks fine to me, unless you have very special needs (your own protocol, single-threading, low security, low performance), in which case you might want to code your own server. Using the HTTP::Daemon module you might manage a simplistic server in under 50 lines.
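To give a feel for the size, a bare-bones single-threaded HTTP::Daemon loop might look roughly like this; the port, the /modify path, and the reuse of updateHandler() are arbitrary choices for the sketch, not a finished server:
#!/usr/bin/perl
use strict; use warnings;
use HTTP::Daemon;
use HTTP::Response;
use HTTP::Status qw(RC_OK RC_NOT_FOUND);

# Listen on an arbitrary port; every request is handled in this one loop.
my $d = HTTP::Daemon->new( LocalPort => 8080 ) or die "Can't listen: $!";
print "Listening at ", $d->url, "\n";

while ( my $conn = $d->accept ) {
    while ( my $request = $conn->get_request ) {
        if ( $request->uri->path eq '/modify' ) {
            my %arguments = $request->uri->query_form;
            # ... update the file here, e.g. with updateHandler(%arguments) ...
            $conn->send_response(
                HTTP::Response->new( RC_OK, 'OK', undef, "File Updated\n" )
            );
        }
        else {
            $conn->send_error(RC_NOT_FOUND);
        }
    }
    $conn->close;
}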
When using Apache, I'd strongly suggest using mod_rewrite to put the /path into the PATH_INFO environment variable. When using one script to represent your whole REST API, you could use the PATH_INFO to choose one of many methods/subroutines/functions. This also eliminates the need to name the script in the URL.
For example, turn the URL
http://example.com/rest/modify/filename?key1=value1
into
/cgi-bin/rest-handler.pl/modify/filename?key1=value1
Inside the Perl script, we would then have $ENV{PATH_INFO} containing /modify/filename.
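A dispatch on $ENV{PATH_INFO} could then look something like the sketch below; the action names and the showHandler() routine are invented here purely for illustration:
# Route based on the path components that follow the script name.
my @path = grep { length } split m{/}, $ENV{PATH_INFO} // '';
my ($action, $target) = @path;    # e.g. ('modify', 'filename')

my %dispatch = (
    modify => \&updateHandler,
    show   => \&showHandler,      # hypothetical read-only handler
);

if ( my $handler = $dispatch{ $action // '' } ) {
    $handler->(%arguments);       # pass $target along too if the handler needs the file name
}
else {
    print "Status: 404 Not Found\r\n\r\n";
}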
This is a bit Perl-centric, but just pick any language you are comfortable with and start coding, leveraging whatever module you can use on the way.
I would use a newer Perl framework, like Mojolicious. If I make a file (test.pl):
#!/usr/bin/env perl
use Mojolicious::Lite;
use Data::Dumper;
my $file = 'file.txt';
any '/' => sub {
    my $self = shift;
    my @params = $self->param;
    my $data = do $file;
    $data->{$_} = $self->param($_) for @params;
    open my $fh, '>', $file or die "Cannot open $file";
    local $Data::Dumper::Terse = 1;
    print $fh Dumper $data;
    $self->render( text => "File Updated\n" );
};
app->start;
Then run morbo test.pl
and visit http://localhost:3000/?hello=world (or run ./test.pl get /?hello=world)
then I get in file.txt:
{
'hello' => 'world'
}
and so on.
I've tested my program on a dozen Windows machines, a half dozen Macs, and a Linux machine, and it works without error on both the Windows and Linux machines but not on the Macs. My program is designed to work with protein database files, which are text files that range from 250 MB to 10 GB. I took 1/10th of the 250 MB file to make a sample file for debugging purposes, but found that the error did not occur with the smaller file.
I've narrowed down the bug to this section of code; in this section, $tempFile is the protein database file:
open(ps_file, "..".$slash."dataset".$slash.$tempFile)
    or die "couldn't open $tempFile";
while(<ps_file>){
    chomp;
    my @curLine = split(/\t/, $_);
    my $filter = 1;
    if($taxon){
        chomp($curLine[2]);
        print "line2 ".$curLine[2].",\t".$taxR{$curLine[2]}."\n";
        $filter = $taxR{$curLine[2]};
    }
    if($filter){
        checkSeq(@curLine);
    }
}
This is a screenshot of the output of that print statement showing special characters:
This is what the output looks like on a Windows Machine:
Here is an example of one line from $tempFile:
>sp|P48255|ABCX_CYAPA Probable ATP-dependent transporter ycf16 OS=Cyanophora paradoxa GN=ycf16 PE=3 SV=1 MSTEKTKILEVKNLKAQVDGTEILKGVNLTINSGEIHAIMGPNGSGKSTFSKILAGHPAYQVTGGEILFKNKNLLELEPEERARAGVFLAFQYPIEIAGVSNIDFLRLAYNNRRKEEGLTELDPLTFYSIVKEKLNVVKMDPHFLNRNVNEGFSGGEKKRNEILQMALLNPSLAILDETDSGLDIDALRIVAEGVNQLSNKENSIILITHYQRLLDYIVPDYIHVMQNGRILKTGGAELAKELEIKGYDWLNELEMVKK CYAPA
The problem probably lies in inconsistent line-endings. If, as I suspect, trailing whitespace is not significant, you're better off removing that instead of chomping.
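A quick way to test that theory is to look for stray carriage returns in the file, for instance with a one-liner along these lines (the file name is a placeholder for the real database file):
# Print the numbers of the lines that still contain a CR (\r) character.
perl -ne 'print "CR found on line $.\n" if /\r/' ../dataset/proteins.txt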
Also note:
Bareword filehandles such as ps_file are package-global variables that are subject to action at a distance; use lexical filehandles instead.
Use File::Spec or Path::Class to handle file paths in a platform-independent way.
Include the full file path and the error message ($!) if there is an error opening a file.
In
chomp;
my @curLine = split(/\t/, $_);
my $filter = 1;
if($taxon){
    chomp($curLine[2]);
$curLine[2] comes from a string that was read in as a line and chomped. I don't see why you are chomping that again.
Here's a tidied-up version of your code snippet:
use File::Spec::Functions qw( catfile );

my $input_file = catfile('..', dataset => $tempFile);

open my $ps_file, '<', $input_file
    or die "couldn't open '$input_file': $!";

while (my $line = <$ps_file>) {
    $line =~ s/\s+\z//; # remove all trailing whitespace
    my @curLine = split /\t/, $line;
    my $filter = 1;
    if ($taxon) {
        my $field = $curLine[2];
        $filter = $taxR{ $field };
        print join("\t", "line2 $field", $filter), "\n";
    }
    if ($filter) {
        checkSeq(@curLine);
    }
}
I use the following Perl code without any problems in Ubuntu, but when I try it in XP using ActivePerl it hangs: no error messages, just a blank screen. Are there any issues I should be aware of when moving code between standard Perl and ActivePerl, or between Windows and Ubuntu?
sub do_search
{
    my $term = shift @_;
    my $page = 1;
    my @results;

    while (scalar @results < $opts{maxresults})
    {
        my $rset = $handle->search({ query => $term, page => $page, rpp => $opts{rpp} });
        print "Searching for $term (page $page)\n" if $opts{verbose};

        if (ref $rset eq 'HASH' && exists $rset->{results})
        {
            # break out if no results came back
            last unless @{ $rset->{results} };
            push @results, @{ $rset->{results} };
            printf "Now we have %d entries\n", scalar @results if $opts{verbose};
        }

        # go to the next page
        $page++;
    }

    print_post($_) foreach @results;
}
Source: http://www.ibm.com/developerworks/web/library/l-perl-twitter/index.html
-Thanks
There is a quite extensive manual page about Windows-specific Perl problems: perlwin32.
The only non-core package the script uses is Net::Twitter, which seems to have good test results under Windows (see its platform test matrix).
I'm trying to port a Perl script over from Unix to Windows, but am having a near-impossible time getting it to work because of the unsupported forking pipes in the open function. Here's the code:
sub p4_get_file_content {
    my $filespec = shift;
    return 'Content placeholder!' if ($options{'dry-run'});
    debug("p4_get_file_content: $filespec\n");
    local *P4_OUTPUT;
    local $/ = undef;
    my $pid = open(P4_OUTPUT, "-|");
    die "Fork failed: $!" unless defined $pid;
    if ($pid == 0) { # child
        my $p4 = p4_init();
        my $result = undef;
        $result = $p4->Run('print', $filespec);
        die $p4->Errors() if $p4->ErrorCount();
        if (ref $result eq 'ARRAY') {
            for (my $i = 1; $i < @$result; $i++) {
                print $result->[$i];
            }
        }
        $p4->Disconnect();
        exit 0;
    }
    my $content = <P4_OUTPUT>;
    close(P4_OUTPUT) or die "Close failed: ($?) $!";
    return $content;
}
The error is:
'-' is not recognized as an internal or external command,
operable program or batch file.
Does anyone know how to make this work? Thanks!
Mike
I know it's not a direct answer to your question, but it looks like you're scripting something on top of Perforce in Perl? If so, you might find that an existing library already does what you want and save yourself a lot of headaches, or at least get some sample code to work from.
For example:
P4Perl
P4::Server
P4::C4
EDIT: Now that I know what you're doing, I'm guessing you're trying to port p42svn to Windows, or at least make it compatible with Windows. See this thread for a discussion of this exact issue. The recommendation (untested) is to try the code samples listed at http://perldoc.perl.org/perlfork.html under "Forking pipe open() not yet implemented" to explicitly create the pipe instead.
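Roughly, that workaround replaces the open(P4_OUTPUT, "-|") call with an explicit pipe/fork pair, along these lines (a sketch adapted from the perlfork examples, not tested on Windows):
# Emulate my $pid = open(my $fh, "-|"): returns the pid and a read handle in
# the parent, and (0, undef) in the child, whose STDOUT now feeds the pipe.
sub pipe_from_fork {
    pipe my $reader, my $writer or die "pipe failed: $!";
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid) {                        # parent: keep the read end
        close $writer;
        return ($pid, $reader);
    }
    close $reader;                     # child: write end becomes STDOUT
    open STDOUT, '>&', $writer or die "can't dup STDOUT: $!";
    return (0, undef);
}

# In p4_get_file_content() the open() call would then become something like
#   my ($pid, $p4_output) = pipe_from_fork();
# with <$p4_output> and close($p4_output) used in place of the bareword handle.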
It's not going to work as-is; you'll need to find another method to accomplish what it's doing. It doesn't look like there's a burning need for the fork-pipe, but it's hard to tell, since I don't know what a p4 is and a lot of your code is being lost to angle-bracket interpretation.