Is there a quick and easy way to make a Ruby/Tk script run in a browser? I wrote a handy-dandy little Ruby/Tk script to do stuff. A friend saw it and said that his friends would love to use it, but they are extremely non-technical and cannot handle installing Ruby (even though I showed him how simple it is). What he wants is for me to give him a link that someone can click in a browser to magically make the tool run.
I've looked around and get the impression that it cannot be done; "it just does not work that way". But I have not seen a clear "no". I see things on how to get Ruby to run in a browser, but not the Tk part. I also looked at rubyscript2exe, but get the impression that it was abandoned.
No, you can't directly run Ruby in the browser like that.
There are websites such as RubyFiddle which let you run short snippets. What they're actually doing is executing the code remotely and then displaying the result.
Because the code is executed remotely, there's no way of running something interactive (like a Tk UI) with it. There are some services which give you a hosted Ruby environment with a working terminal, but even these aren't going to work with a Tk UI.
This section of the Ruby Toolbox https://www.ruby-toolbox.com/categories/packaging_to_executables has a good list of the tools available to package Ruby apps for distribution, though, so you might be able to send them a simple installer that lets them run it locally?
I've had a lot of success doing this with https://github.com/larsch/ocra/ which is actively maintained.
Related
I've written a game in Ruby. Currently the only way to interact with it is with a CLI. It's packaged up as a gem.
I would like to put it on a website, for people to use. How would I go about that?
To clarify, I want a browser window that has a command prompt (like http://tryruby.org) and that loads my gem first. I'm familiar with Heroku, which I presume is the avenue of choice for this?
I am looking for a way to automate testing and web page form filling, and I also want to extract web page data and store it in our database on a permanent basis. Is there any way to fulfill such requirements using Ruby? If so, please point me to the Ruby modules that can help me.
Yes, you can do all these tasks using Ruby and some gems.
I recommend taking a look at the Nokogiri gem for data extraction:
https://github.com/sparklemotion/nokogiri
And the Capybara gem for testing and automating forms and the like:
https://github.com/jnicklas/capybara
P.S.: The Capybara gem does much more than just this, but it applies to your case too.
Since some web pages may not be valid XML, you can also use regular expressions to fetch the data you want from a page. Sometimes an XML-reader approach just fails.
Sample:
require 'open-uri'

page_content = URI.open("http://your_page.com").read
# the /m flag lets . match across newlines; [regexp, 1] returns the first capture group
page_body = page_content[/<body>(.*)<\/body>/mi, 1]
# do whatever you want with it
As VBSlover said, Capybara is useful for dealing with browser-related stuff.
Doing this in an automated way every n minutes or the like is also possible with the whenever gem.
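whenever is configured through a config/schedule.rb file in your project; a sketch, where the rake task name is hypothetical:

```ruby
# config/schedule.rb -- read by the `whenever` command, which
# writes the corresponding entries into your crontab
every 30.minutes do
  rake "scrape:run"   # hypothetical rake task that does the scraping
end
```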
For database storage there are plenty of very good gems out there.
Final answer: there is nothing you can't do with Ruby nowadays. Okay, maybe except writing some really (!) high-performance code / 3D engines.
Edit:
If you can tell me exactly what you want to do, I may be able to suggest some matching gems.
Usually "there is a gem for it" is a good saying. You can browse rubygems.org for the keywords you need, or look at https://www.ruby-toolbox.com/ for some categorized/ranked suggestions for your problem. :)
EDIT 2:
Have a look at http://watir.com/
Maybe just play around with it in some little painless scripts to get a feeling for it and to see whether it is the solution for you.
Watir drives browsers the same way people do. It clicks links, fills
in forms, presses buttons. Watir also checks results, such as whether
expected text appears on the page.
Once it has clicked through everything for you, just scrape the results (or whatever you need) from the web page using an XML parser (Nokogiri would be a good choice) or some regexps.
Then stuff your data into your database. ActiveRecord comes to mind for this, but it may or may not be overkill. Depending on your database, choose whatever adapter/connection gem you like (again: there are MANY).
If you want to do this every hour or the like, just use the whenever gem (it manages a cronjob for you) or simply write an infinite loop with sleep(x) in it. There is more than one way to do it. :)
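The infinite-loop variant can be sketched in plain Ruby; the iterations keyword here exists only so the loop can be bounded (e.g. for testing), and the task itself is a placeholder:

```ruby
# Run the given block every `interval` seconds.
# iterations defaults to infinity, i.e. loop until interrupted.
def run_periodically(interval, iterations: Float::INFINITY)
  done = 0
  while done < iterations
    yield            # do the scraping / storing work here
    done += 1
    sleep interval if done < iterations
  end
  done
end

# Placeholder task: record a timestamp three times with no delay
runs = []
run_periodically(0, iterations: 3) { runs << Time.now }
puts runs.size  # => 3
```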
First of all, you need a proper operating system: use Linux, BSD, or macOS.
Windows will do for some people, but not for you as a Ruby developer; too many libraries need C extensions, which are a pain in the ass to compile under Cygwin.
I recommend installing a Ruby version manager so you can try out different Ruby versions; I prefer RVM, the Ruby Version Manager.
Install Ruby 1.9.3; it is the standard nowadays.
Through RubyGems, install the mechanize gem, which does pretty much all the website automation you will need. It is a successor of Perl's WWW::Mechanize.
Nokogiri would also be useful for parsing XML data such as (X)HTML, but remember that you need the libxml libraries installed on your system first.
Ah, regarding your question:
Yes, you can read web pages using Ruby. For example, to read this page (using the httpclient gem):
require 'httpclient'

http = HTTPClient.new
http.get "http://stackoverflow.com/questions/14235393/can-i-read-webpage-data-using-ruby"
Done.
I tried to install Squeak/Pharo on an Ubuntu server machine.
./squeak -vm-display-null ./Pharo-1.2.2-12353/Pharo-1.2.image
It executed, but there was no command line. Is there no way to use it without a GUI?
Have a look at Coral; it provides a scripting interface to Pharo. Not sure where to find up-to-date documentation, but there is a build on the Pharo build server.
You can pass scripts as a parameter to the VM:
./squeak -vm-display-null ./Pharo-1.2.2-12353/Pharo-1.2.image myScript.st
But that's all you can do apart from Coral. Otherwise you should use GNU Smalltalk.
I believe that in the current VM you have to use a full "file:///" URL, a choice made a while back and only recently discussed as one to be reversed.
I am not sure if I understand your needs correctly, but I guess you could write a few-line read-eval loop and pass it as a script argument at startup.
Other than that, most headless uses of Smalltalk are for web servers (Seaside, AIDA), in which case there is usually an admin URL which lets you poke around the image by sending messages to objects and similar. If you have a Seaside one-click image, you could try out:
http://localhost:8080/tools/classbrowser
http://localhost:8080/tools/screenshot
http://localhost:8080/tools/versionuploader
to give you a taste of what can be done.
There is Coral, but you might also look into a lighter version of Pharo called "Pharo Kernel":
Pharo Kernel is a small Smalltalk kernel stripped down from the Pharo Core image. Meanwhile, there is also a 3 MB Pharo-Kernel-Gofer image available that has networking support and Gofer (a Pharo tool for loading packages) installed.
Check it out at https://ci.inria.fr/pharo-contribution/view/Pharo-Kernel-2.0/job/PharoKernel2.0/
I'm wondering how to go about creating an online IRB that runs in the browser. I have an idea to include an IRB console in my blog and give users the option to send code blocks from my tutorials directly into the IRB console so they can play around with them.
_why did this previously, but of course it is gone now: Cached Version
TryRuby is still available here, with source code at GitHub.
Well, you could use the sandbox that _why created, but you'll need to be able to patch your Ruby, and it seems to only work on Ruby 1.8.5.
That's insanely dangerous. Don't do this. You expose your system to all sorts of vulnerabilities when you allow users to execute arbitrary Ruby code.
Anyway there are some client-side Ruby implementations in JavaScript/Flash. Take a look at HotRuby.
I suspect you would need to run Ruby in a sandbox to prevent "bad" commands from being run.
I am writing a WiX-based installer for our software. I need to download some non-trivial dependencies (like Sql Server Express 2008), then install them.
I could just use wget, but having the console open to show progress could be very confusing for non-technical people. Instead, I have been looking for a program that works just like wget but shows its progress in a very simple window with the file name and a progress bar.
If I could show a small message that would be fantastic, but just having the GUI progress is the main thing.
I would even be interested in an existing program that almost does this, which I could recompile to add whatever I need. Since this is in an installer, it can't depend on .Net or anything else that needs installing to work.
Is anyone aware of such a program?
Why not get the wget sources and remove the console output from them?
Since I did not find such a program, I wrote one. I used the latest libcurl available for Windows.
The code is not beautiful and the program is not feature-complete, but it does what I need it to do: download over HTTP while displaying a simple, attractive window.
The title bar is customizable on the command line, and I intend to allow window positioning too.
The project is hosted on google code: http://code.google.com/p/installerget/