Python: is SimpleHTTPServer only capable of serving directories?

I have some experience in Node.js and PHP and almost zero experience in Python, and I need to write a very simple one-page script on my small VM that takes input through GET and executes some bash commands based on that input. So I thought Python would be better suited for it since it already comes pre-installed with Ubuntu, plus it would give me an opportunity to start learning Python.
I Googled all day and only found references to SimpleHTTPServer, which seems to be only for serving static pages. Where do I start in Python to achieve a mix of dynamic server-side pages and bash scripting with minimal additional installation and configuration?
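For reference, a minimal sketch of the kind of thing the question describes, using only the Python 3 standard library (`http.server`, the module that `SimpleHTTPServer` became in Python 3). The `cmd` query parameter and the whitelisted commands are illustrative placeholders, not part of any existing API:

```python
# Minimal dynamic GET handler using only the Python standard library.
# The query parameter "cmd" and the commands below are illustrative.
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

# Whitelist of allowed commands; never pass raw user input to the shell.
ALLOWED = {"uptime": ["uptime"], "disk": ["df", "-h"]}

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        query = parse_qs(urlparse(self.path).query)
        cmd = query.get("cmd", [""])[0]
        if cmd not in ALLOWED:
            self.send_response(400)
            self.end_headers()
            self.wfile.write(b"unknown command")
            return
        # Run the whitelisted command and return its stdout as plain text.
        out = subprocess.run(ALLOWED[cmd], capture_output=True, text=True).stdout
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(out.encode())

# To serve: HTTPServer(("", 8000), Handler).serve_forever()
# then e.g. http://localhost:8000/?cmd=uptime
```

On Python 2 (the default on older Ubuntu) the equivalent classes live in the `BaseHTTPServer` module instead.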

Related

How to build a command line package from scripts?

I have spent time working on a bioinformatics project and produced numerous scripts, and now I would like to use them to build a bioinformatics tool that runs in the command-line terminal, with the customary manual and binary files. I would like to be able to 1) protect the code, 2) make it tidy by not having to juggle multiple scripts, and 3) share the code with anyone interested.
Since I don't really know where to start, I would like to ask for orientation on the topic. I have been reading about script compilation and I think this could work, but I have scripts in three different languages, mainly Python and bash, so I have not seen any tutorial covering this specific case.
Any help, such as sharing resources (videos, manuals, software, etc.) or giving tips, is appreciated. I know this is a VERY open question, so open answers are also welcome.
You could use the Python argparse library to build a command-line application that accepts arguments and flags. With this method, you can provide flags for user input and run your different scripts, including the bash scripts, based on that input.
https://realpython.com/command-line-interfaces-python-argparse/
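A minimal sketch of that argparse approach, assuming hypothetical subcommand names and script paths (`align`, `stats`, `scripts/align.sh` are all placeholders for your own scripts):

```python
# Sketch of an argparse dispatcher that fronts a mix of Python and bash steps.
import argparse
import subprocess

def main(argv=None):
    parser = argparse.ArgumentParser(prog="biotool",
                                     description="Run pipeline steps")
    sub = parser.add_subparsers(dest="command", required=True)

    # Subcommand backed by an existing bash script (path is illustrative).
    align = sub.add_parser("align", help="run the bash alignment step")
    align.add_argument("fasta", help="input FASTA file")

    # Subcommand implemented directly in Python.
    stats = sub.add_parser("stats", help="run the Python stats step")
    stats.add_argument("--min-length", type=int, default=0)

    args = parser.parse_args(argv)
    if args.command == "align":
        # Hand off to the existing bash script unchanged.
        return subprocess.run(["bash", "scripts/align.sh", args.fasta]).returncode
    elif args.command == "stats":
        print(f"computing stats with min length {args.min_length}")
        return 0

# Example: main(["stats", "--min-length", "50"])
```

Each existing script becomes one subcommand, so users see a single tool with a unified `--help`, which also addresses point 2 of the question.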
Similarly, you can write a bash script that presents the user with options and runs your other scripts based on input.
https://www.redhat.com/sysadmin/arguments-options-bash-scripts
I'm not sure what you mean by "protect the code". If you mean hide it: as far as I know, you cannot easily hide bash or Python code, or turn them into binaries, if you want to share the script.

Is mod_perl what I'm looking for? FastCGI? PSGI/Plack?

I have exhausted my ability to find answers on the web for this one. I'm trying to install mod_perl on windows and there are many dead ends.
Is mod_perl even what I'm looking for?
I have a collection of web apps used within my company's local network for database and file system interface. The web server runs Apache 2.2 and ActivePerl 5.16 using DBI, DBD::mysql, and CGI. The clients get their dynamic content via AJAX calls (jQuery.getJSON) to the Perl scripts using CGI parameters. The traffic is extremely light - only 4 or so users and only a few queries at a time here and there.
The issue I'm having is that the latency is unacceptable for the nature of these apps. The delay is typically around 400 ms, all waiting time. I have experimented with increasingly simplistic Perl scripts and believe all of the delay is the Perl interpreter. I've looked into FastCGI, but as I understand it, that deals mostly with high traffic, which is not my problem: mine is the per-call overhead at low traffic. So it seems like an Apache-embedded Perl interpreter (which is my understanding of mod_perl) would solve my overhead-related latency issues.
How do you install it in a post Randy Kobes world?
All the resources I've found for installing mod_perl on my setup involve a server, theory5x.uwinnipeg.ca, formerly run by Randy Kobes and now defunct after his passing. ActivePerl's ppm does not have any built-in mod_perl packages, and the website shows "build failed" for all of them.
Here is an ActiveState community post explaining why there is no ppm.
I did find this resource that seems to have all the missing pieces but for Strawberry Perl.
So I'm left to think the only way to do this is to install from source, but I have no understanding of how to do that. I have zero familiarity with Linux, and it seems like most of this material is geared toward it. Worse yet, I have 64-bit Windows XP and a Windows Server to install it on.
The other thing that crossed my mind is that maybe I need to install some kind of distribution like XAMPP instead of putting together all the pieces myself. I'd be quite nervous to change course now and risk breaking my working-but-slow apps.
Is mod_perl even what I'm looking for?
I hope not.
There are issues with mod_perl. Your Apache, mod_perl, and perl all need to be built with compatible compilers and architectures so they can be linked at run time; there is no running a 32-bit Apache with a 64-bit perl when you are using mod_perl. In my experience, mod_perl should also be compiled against the header files for your specific versions of both Apache and perl. Presuming you get all this secret sauce mixed up correctly, you are now running a web server that can be crashed by a poorly written perl script. But on the bright side, this is more efficient than plain CGI.
After a few years of this madness, FastCGI was invented. By running as a persistent but separate process, the web server was able to achieve mod_perl (or mod_PHP, or mod_python) efficiency without the binary-compatibility requirements or the stability risks. Just think of the freedom! An Apache module that cares only about binary compatibility with its Apache host and can farm out tasks to Perl, Python, C, or even Visual Basic. (I just had an evil thought about trying to do web services with Forth or Lisp, but that would just be crazy.)
Running on a Linux distro (or other canned XAMPP-style stack) can make setup and maintenance of mod_perl easier, because they distribute it in a package that has been compiled to work with the Apache and perl packages they supply. Unfortunately, if you want to run a version of Apache or perl that is not "official" to your distro, get ready to DIY. Even so, a distro's packages do not mitigate the stability issues inherent in running mod_(language-of-choice).
In any case, before you're up and running in your new configuration, your existing CGI scripts will need to be modified. You can choose to rewrite them to mod_perl, FastCGI, or PSGI/Plack standards. If you choose to rewrite to PSGI/Plack standards, then you can care much less about the specifics of your web server's current or future configuration.
How do you install it in a post Randy Kobes world?
The last link in your question appears to be spot on. Do you have a religious or PHB-based reason to prefer ActivePerl over Strawberry Perl? In the end, mod_perl must be built against your specific version of Apache and your specific version of perl. That means either compiling it yourself, somebody else wrapping up builds for multiple Apache/perl version combos, or somebody else wrapping up a single build and asking you to use their preferred versions of Apache and perl.
If you choose the mod_perl route and believe even slightly that server software should be kept up to date (XP? Seriously?), then be prepared to either roll your own or trust your 3rd party to keep you up to date. Of course, if you're a hit-and-run developer, well that frees up your choices considerably...
tl;dr:
FastCGI is your friend. Particularly if you are running Windows and like to keep server software up to date.
mod_perl works best when supported by a responsible distro, or by a responsible developer who is comfortable building it from source. ...repeatedly.
It's been an eternity since I've installed mod_perl on Windows so I'm not sure I can help you with that.
But your understanding that FastCGI "deals mostly with high traffic" is not correct. FastCGI and mod_perl offer very similar performance benefits, because both execute your scripts with a persistent interpreter, eliminating the overhead of starting perl and compiling your code on each request. So there is no reason not to give FastCGI a shot.
You might also want to look at the PSGI/Plack API, which lets you write server-agnostic code that can run under vanilla CGI, FastCGI, mod_perl, or a PSGI-aware server such as Starman or uwsgi. All of these except vanilla CGI offer a persistent environment that reduces the overhead of executing your scripts.

Ruby (not Rails) on IIS 7 via FastCGI

Hopefully this is a simple one. I want to run Ruby on IIS 7+ the same way I would PHP, Perl, Python, etc. (that is, by setting a handler mapping for .rb files).
Every time I Google for it I get Rails. But I don't want to run Rails, I would just like to run straight Ruby.
I know I'll have to set up my headers and all sorts of other stuff, since, unlike PHP, Ruby isn't technically native to the web.
I'm looking for a manual installation for this, not something automated with an executable.
My reason for needing this is simply for learning.
Any answers would be very helpful! Thanks!

Uploading to FTP server with as many threads as possible

This might sound like a strange request, but I'm hoping I have more luck here than I've had googling for the same topic.
I'm searching for a Windows based application that allows me to upload files to an FTP server via the command line, across as many threads as possible.
I'm currently trialing WinSCP, which has a simple scripting interface that I can invoke from the command line. However, whilst it is a) Windows based and b) command-line driven/scriptable, it doesn't make use of multiple threads to parallelise uploads of large files.
It seems I'm forever limited to achieving 2 of my 3 goals. For example, FileZilla is a) Windows based and b) multithreaded for uploads, but unfortunately lacks any command-line or scripting capabilities :/
Does anyone know of anything that might be able to achieve all 3 of my desires?
Well, FileZilla is GPL, so you could fork it and create a command-line/scriptable version sans the GUI. You'd have to implement the scripting engine yourself, though.
Alternately, you could implement a client on top of Twisted FTP (in Python).
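Along the same lines, here is a hedged sketch of a command-line uploader using only the Python standard library (`ftplib` plus a thread pool), which runs on Windows. Host and credentials are placeholders. Note it parallelises across files, one connection per worker; splitting a single large file into parallel segments needs REST-based transfers, which `ftplib` does not do out of the box:

```python
# Sketch: multi-connection FTP upload from the command line.
import sys
from concurrent.futures import ThreadPoolExecutor
from ftplib import FTP
from pathlib import Path

def upload_one(host, user, password, path):
    # Each worker opens its own connection; ftplib.FTP is not thread-safe.
    with FTP(host) as ftp:
        ftp.login(user, password)
        with open(path, "rb") as f:
            ftp.storbinary(f"STOR {Path(path).name}", f)
    return path

def upload_all(host, user, password, paths, workers=8):
    # Fan the files out across a pool of worker threads.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(upload_one, host, user, password, p)
                   for p in paths]
        return [f.result() for f in futures]

# Usage: python ftpup.py ftp.example.com user pass file1 file2 ...
```

That covers all three goals: Windows based, command-line driven, and multithreaded for uploads.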

Ruby server-side back end scripting tools/resources

I am trying to learn how Ruby is used in a server-based back-end environment. For example, I want to run a Ruby script 24/7 on a server. What are the best practices for this, and how does one go about doing it?
Can anyone provide some resources on how to do this, or put a label on what I am trying to do? I am unsure of the terms I am supposed to be googling.
Use cron. From the OS's point of view, a Ruby app is just a script, like bash.
Also, all Unix OSes have some kind of daemon-script mechanism (see the examples in /etc/init.d).
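A minimal illustration of the cron approach, assuming the script lives at /home/deploy/worker.rb (all paths here are placeholders): start the script at boot, and have a watchdog entry relaunch it if it dies.

```crontab
# start the worker at boot
@reboot /usr/bin/ruby /home/deploy/worker.rb >> /home/deploy/worker.log 2>&1
# watchdog: every 5 minutes, relaunch if no matching process is running
*/5 * * * * pgrep -f worker.rb > /dev/null || /usr/bin/ruby /home/deploy/worker.rb >> /home/deploy/worker.log 2>&1
```

Install with `crontab -e`; this is the quick-and-dirty alternative to writing a proper init.d daemon script.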
Try BackgroundRb: it is a Rails plugin that works like a Linux daemon. You can use any classes/models defined in your Rails application within the background code, and you can also pass data to/from the background process.
