Refreshing a Sinatra app reading from updated files - Ruby

I'm writing a Sinatra app in Ruby that gathers information about my network into two different files. The first, a .csv, holds the IP addresses and given names of all workstations on the network. The second, a .txt, is pulled from an Asterisk server and lists the active SIP channels linked to specific IP addresses.
My app merely compiles the information from these two files and builds tables on a web page matching up users with the stations they are currently logged into. However, I want the app to behave as if it were real time. The two files are automatically regenerated every 5 minutes, but even though the files the app reads are being overwritten, the app's output doesn't change. Is there a way to rig it so the app reads the "new" files after they are written?
I've dug around on Stack Overflow and seen Kqueue mentioned as a way for Python users to simply watch those files for changes before re-reading them, which would be really nice. Is there a Ruby equivalent? Additionally, I want the app to be accessible at all times and ideally only "refresh" when it actually needs to update something.
I also came to a funny little epiphany when I realized I'm never closing my files after I read them. I won't post the whole app, but here's where I read the files:
# Pulls active SIP channels from Asterisk
$sip = {}
File.open('sip.txt').each do |line|
  userid, ip = line.split(" ")
  $sip[ip] = userid[0..3]
end

# Prepares a hash of all stations and their IP addresses
$machines = {}
CSV.foreach('/Volumes/Scripts/report-51.csv') do |row|
  name = row[1]
  address = row[0]
  $machines[name] = address
end
Is it possible that because I never close the files, the program never opens up the "new" documents? Just trying to brainstorm here.

It's hard to tell the overall structure of what's going on, but yes: whatever code is being called every 5 minutes to do the file reading should close and reopen the files on every call (see the sketch below). That should solve your problem.
You may also consider switching to a database for more robust results.
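A minimal sketch of that idea for the code in the question: move the reads into a helper and call it from the route, so every request reopens (and closes) the current files. The '/' route and the erb template are placeholders, since the rest of the app isn't shown.
require 'sinatra'
require 'csv'
# Re-read both files on demand. File.foreach and CSV.foreach open and
# close the files on every call, so each request sees whatever the
# 5-minute job last wrote.
def load_data
  sip = {}
  File.foreach('sip.txt') do |line|
    userid, ip = line.split(" ")
    sip[ip] = userid[0..3]
  end
  machines = {}
  CSV.foreach('/Volumes/Scripts/report-51.csv') do |row|
    machines[row[1]] = row[0]
  end
  [sip, machines]
end
get '/' do                      # example route; adjust to the real app
  @sip, @machines = load_data   # fresh data on every page load
  erb :index                    # whatever view builds the tables
end
As for a Ruby counterpart to Kqueue-style watching, the listen gem (which wraps the OS change-notification facilities) can run a callback when the files change, roughly like this; the watched directories are guesses based on the paths in the question.
require 'listen'   # gem install listen
listener = Listen.to('.', '/Volumes/Scripts') do |modified, added, removed|
  # reload the hashes (or just set a "stale" flag) only when something changed
end
listener.start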

Related

AppleScript to read notifications for the apps in the Dock

I want to have a script that can detect whether I have new messages in my messaging apps (Slack, Lync, ...).
Is it possible to use AppleScript to read whether there is an active notification on any of the apps in the Dock?
If you do:
`getconf DARWIN_USER_DIR`/com.apple.notificationcenter/db
(a line I found at Ask Different), you'll get back:
/var/folders/_d/pg2g_[some_funny_numbers]/0//com.apple.notificationcenter/db: is a directory
Inside this/my folder I found:
db db-shm db-wal db2upgraded
When some action happens (I sent a notification), only db-wal gets updated almost at once.
So, in principle it should be possible to write an AppleScript (saved as a Stay Open app) that periodically checks whether "db-wal" has changed (by comparing saved sizes or change dates) and, ONLY if so, searches it for some keywords (Slack, lync, …), again comparing the number of occurrences, thus learning whether something new has arrived. Admittedly it sounds awkward, but it could work; a rough sketch of this polling loop follows below.
It would be much more elegant to use a folder script, but as no file is moved and no folder is opened or closed, such a script cannot be invoked.
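The change-detection part of this is language-agnostic; just to illustrate the polling loop (here in Ruby rather than AppleScript), assuming a placeholder path for db-wal and treating it as opaque bytes, since it is binary SQLite WAL data:
# Illustration of the polling idea only, not a finished tool.
db_wal   = '/var/folders/_d/pg2g_[some_funny_numbers]/0/com.apple.notificationcenter/db/db-wal'  # placeholder
keywords = %w[Slack lync]
last_stat = nil
counts    = Hash.new(0)
loop do
  stat = [File.size(db_wal), File.mtime(db_wal)]
  if stat != last_stat                    # only re-scan when size or mtime changed
    last_stat = stat
    bytes = File.binread(db_wal)
    keywords.each do |kw|
      n = bytes.scan(kw).length
      puts "something new mentioning #{kw}" if n > counts[kw]
      counts[kw] = n
    end
  end
  sleep 30                                # arbitrary polling interval
end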

get_dir_file_info() hangs when run on a large directory

I have made a little function that deletes files based on date. Prior to doing the deletions, it lets the user choose how many days/months back to delete files, telling them how many files it would remove and how much disk space it would free up.
It worked great in my test environment, but when I attempted to test it on a larger directory (approximately 100K files), it hangs.
I’ve stripped everything else from my code to confirm that it is the get_dir_file_info() call that is causing the issue.
$this->load->helper('file');
$folder = "iPad/images/";
set_time_limit (0);
echo "working<br />";
$dirListArray = get_dir_file_info($folder);
echo "still working";
When I run this, the page loads for approximately 60 seconds, then displays only the first message “working” and not the following message “still working”.
It doesn’t seem to be a system/PHP memory problem, as the page comes back after 60 seconds, and the server respects my set_time_limit(), which I’ve had to use for other processes.
Is there some other memory/time limit I might be hitting that I need to adjust?
From the CI user guide, get_dir_file_info() is:
Reads the specified directory and builds an array containing the filenames, filesize, dates, and permissions. Sub-folders contained within the specified path are only read if forced by sending the second parameter, $top_level_only to FALSE, as this can be an intensive operation.
So if you have 100k files, the best way to handle this is to cut it into two steps:
First: use get_filenames('path/to/directory/') to retrieve all your file names without their information.
Second: use get_file_info('path/to/file', $file_information) to retrieve a specific file's info only when you need it, as you might not need all the file information immediately; it can be done on file-name click or something similar.
The idea is not to force your server to deal with a large amount of processing while in production; that would kill two things: responsiveness and performance.

Automatically saving notebook (or other) files in Mathematica

I have been facing this problem for some time now, a laziness caused in part by the fact that Microsoft Office automatically saves the files you are working on, with versions and automatic recovery.
Many times when I am starting a new notebook in Mathematica to do some tests or whatever, I often forget to save what I am doing.
Every now and then, depending on the computer I am using, the computer crashes and all the beautiful work I was doing is lost forever...
Is there a way to get around this other than manically saving my files every five minutes? How about file versioning?
BTW: Using MMA V8
Regarding autosaving, you may want to check out the NotebookAutoSave option, which can be set to True through Format -> Option Inspector. You have to choose "Selected notebook", then go to Notebook Options -> File Options, and set NotebookAutoSave to True. Then your notebook will be saved after every evaluation. Whether or not this is a satisfactory solution depends, of course, on the situation.
But my experience is that the most reliable way is to develop a CTRL+S reflex - this one never lets me down and works quite well.
As for the versioning, it is much easier with packages, for which you can use WorkBench which has integrated support for CVS and support for SVN via Eclipse plugin. For notebooks, I refer you to this SO thread. You may also find this Mathgroup discussion of some interest.
EDIT
For M8, for auto-saving purposes you can probably also run
RunScheduledTask[NotebookSave[EvaluationNotebook[]],{300}]
But I cannot test this code at the moment.
EDIT2
I just came across this post in the Toolbag repository - which may also be an alternative for the autosave part of the question (but please see also the discussion in comments on the relative advantages of scheduled tasks vs. Dynamic)
Since you have MMA version 8 you could use:
saveTask = CreateScheduledTask[FrontEndExecute[FrontEndToken["Save"]], 5*60];
StartScheduledTask[saveTask];
to save every 5 minutes (change the term 5*60 for other timings).
To remove the auto-save task use:
RemoveScheduledTask[saveTask];
To save only a fixed, specific notebook, store its handle in nb (finding it using Notebooks, SelectedNotebook, InputNotebook or EvaluationNotebook) and use FrontEndToken[nb,"Save"] instead of just FrontEndToken["Save"]
I have a Mathematica package that provides auto-backup functionality. When enabled, the current notebook--call it "blah.nb"--will be backed up to "blah.nb~" after a configurable amount of time has elapsed. I use it constantly and it has saved me from losing work many, many times. It's better than autosaving since it doesn't touch the actual notebook file: if you screw something up or something gets corrupted you don't want to overwrite your main file. :)
It's on GitHub here.
I've got an autosave routine that saves a copy of every open, modified notebook every 5 minutes (or whatever interval you prefer). It leaves your manually-saved copy alone and saves a "swap file" in a separate directory that can be easily recovered if need be. The code (to be copied into init.m) is given in this answer: https://mathematica.stackexchange.com/questions/18380/automatic-recovery-after-crash/65852#65852, and copied below:
Motivated by the same concerns, I wrote the following code and added it to my init.m file. There are two main entries you'll want to change to use this. The global variable $SwapDirectory is where the swap files are saved (by swap file, I mean it in the Vim sense: an "extra" copy of your notebook, separate from your manually saved copy, that periodically captures any new work). The swap files are organized within the swap directory in a directory structure which "mirrors" their original file locations, and have ".swp" appended to their file names. The other variable you might want to change is the number of seconds between autosaves, indicated by the "300" (corresponding to 5 minutes) near the bottom of the code below. At the appropriate times, this code will (automatically, in the background) save swap files for ALL open notebooks, unless they are unmodified from their manually-saved versions (this exception makes the code more efficient and, more importantly, prevents the storage of swap files for documentation notebooks, for example).
In its current form, the code does not filter for only the input cells, but hopefully you can use the other answers to make that modification yourself.
Some things to note:
1) the Mathematica Put command seems to have trouble writing to network drives, even when offline access is enabled. Therefore, it is probably best to choose a SwapDirectory that is on your local machine.
2) Within SwapDirectory, you should create a sub-directory called "Recovery". This is where the AutoSaveSwap routine will make an initial save of any notebooks for which there is NO existing manual save location.
3) To recover a notebook's swap file, simply evaluate
RecoverSwap["filePath"]
where "filePath" is a string giving the file path of the MANUALLY-SAVED copy of the file (i.e., not the file that was created by AutoSave). This will pop up a window containing the most recent auto-saved version of the file. The manually saved version is NEVER overwritten unless you explicitly choose to do so. Once the recovered version pops up, you can save it wherever you like, or discard it at your discretion.
4) You should probably add this code to the KERNEL version of init.m ($UserBaseDirectory/Kernel/init.m) rather than the frontend version... this way, if you quit and restart the kernel, the autosave feature will also restart. On the other hand, this means that you must evaluate at least one expression after each start or restart to begin auto-saving. Once this initial evaluation is done, you do NOT need to have evaluated a cell for it to be backed up (unlike the built-in autosave utility).
Hope this helps someone! Feel free to respond with any questions, suggestions, or requests for improvement you may have. And if you find this post useful, upvotes would be most appreciated! Take care.
$SwapDirectory= "C:\\Users\\pacoj\\Swap Files\\";
SaveSwap[nb_NotebookObject]:=Module[
{fileName, swapFileName, nbout, nbdir, nbdirout, recoveryDir},
If[ ! SameQ[Quiet[NotebookFileName[nb]], $Failed],
(* if the notebook is already saved to the file system *)
fileName = Last[ FileNameSplit[ NotebookFileName[nb]] ];
swapFileName = fileName <> ".swp";
nbdir = Rest[FileNameSplit # NotebookDirectory[nb]];
nbdirout= FileNameJoin[ FileNameSplit[$SwapDirectory]~Join~nbdir]<>"\\";
If[!DirectoryQ[nbdirout], CreateDirectory[nbdirout]];
nbout = NotebookGet[nb];
Put[nbout, nbdirout <> swapFileName],
(* else, if the file has never been saved, save as untitled *)
recoveryDir= $SwapDirectory <> "Recovery\\\";
fileName= ("WindowTitle" /. NotebookInformation[nb])<>".nb";
NotebookSave[nb, recoveryDir <> fileName]
]
];
RecoverSwap::noswp= "swap file `1` not found in expected location";
RecoverSwap[nbfilename_String]:=Module[
{fileName, swapFileName, nbin, nbdir, nbdirout},
fileName= Last[ FileNameSplit[ nbfilename] ];
swapFileName= fileName <> ".swp";
nbdir= Most[ Rest[FileNameSplit # nbfilename] ];
nbdirout= FileNameJoin[ FileNameSplit[$SwapDirectory]~Join~nbdir]<>"\\\";
If[ FileNames[swapFileName, {nbdirout}] == {},
Message[RecoverSwap::noswp,nbdirout <> swapFileName]; Return[],
nbin= Get[nbdirout <> swapFileName]; NotebookPut[nbin]
]
];
AutoSaveSwaps= CreateScheduledTask[
SaveSwap /# Select[Notebooks[], "ModifiedInMemory" /. NotebookInformation[#]&],
300
]
StartScheduledTask[AutoSaveSwaps]

Script to Add Computer Accounts to AD from list

OK, I have tried to Google this and keep running into things that are close but not quite there. I mess with them for a few hours and can't bridge the gap to what I need.
Requirements: Read a list of computer names and add them to specific OUs.
The list can be formatted however I like, but right now I have it as a CSV.
/////////
Comp1,Computers,cold,Alaska,mydomain,com,
Comp2,servers,New Jersey,test,temp,training,Room3,trainers,mydomain,com,
Comp3,computers,New Jersey,test,temp,training,Room3,students,restricted,mydomain,com
Comp4,computers,New Jersey,test,temp,training,Room3,students,power users,mydomain,com
////////
As you can see, the domains portion is not the same on all the machines.
I tried using a VBScript, but all I would get is "unable to connect to LDAP", so I was thinking about storing the lines in an array and using dsadd, building the command line from the variables in the array.
I already have the portion written to browse for the file, and dsquery, dsadd, etc are all on the server that this will be run from.
This is probably a lot easier than I am trying to make it; I tend to overcomplicate things if I don't finish them right away.
Look at this:
Automating the creation of computer accounts

FTP FileWatcher

So I am in this little predicament where I am stuck watching a few FTP folders to see if new files have been added to them. If they have, it needs to raise an event with the file name, thereby telling something else to download that file.
This is a pretty simple object to make; I was just curious if anyone knew how expensive this operation would be.
I plan on using the NLST command because I don't need file size information, and there will be no sub-directories in the folder. Each file in the folder will have exactly 25 characters in its name.
There could be anywhere from 10 to 'maybe' a couple thousand (max around 2000) files per folder (usually on the lower end, 100-300, but currently growing).
The files are anywhere from 250 KB to a very, VERY unlikely 10 MB (usually within the 250 KB to 4 MB range).
There possibly could be up to a few hundred folders (in which case I could change the watch frequency depending on number of folders), but currently there are only a few (6-10ish).
There also would be multiple logins for the ftp server, different logins would have access to different folders.
I am not asking for an implementation, just whether anyone with first- or second-hand knowledge of FTP knows how this could affect my network.
I am not opposed to putting in file retention times or changing the frequency with which I check for new files.
Do you have any control over the remote servers? FTP isn't really optimized for this, and you could probably do a lot better with some sort of dedicated mini-server. You could use file system monitoring on the remote side and just send out the filenames when they arrive rather than continuously polling. You'd only need to have one connection open too, rather than the two that FTP requires.
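For what it's worth, the polling approach described in the question might look roughly like this in Ruby with the standard library's Net::FTP (host, credentials, folder names, and the 60-second interval are placeholders): remember which names have been seen per folder, NLST each folder, and report only the new ones.
require 'net/ftp'
require 'set'
HOST     = 'ftp.example.com'                 # placeholder connection details
USER     = 'user'
PASSWORD = 'secret'
FOLDERS  = ['incoming_a', 'incoming_b']      # placeholder folder names
seen = Hash.new { |h, k| h[k] = Set.new }
loop do
  Net::FTP.open(HOST) do |ftp|
    ftp.login(USER, PASSWORD)
    ftp.passive = true
    FOLDERS.each do |folder|
      names = ftp.nlst(folder)               # names only, no sizes or dates
      (names - seen[folder].to_a).each do |name|
        puts "new file in #{folder}: #{name}" # hand off to the downloader here
        seen[folder] << name
      end
    end
  end
  sleep 60                                   # polling interval to tune
end
Each poll costs one control connection plus one NLST listing per folder, so the traffic is dominated by the directory listings rather than the files themselves.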
