Proxy for Ruby HTTP traffic

I have a Ruby script that posts data to a URL:
require 'httparty'
data = {}
data[:client_id] = '123123'
data[:key] = '123321'
url = "http://someserver.com/endpoint/"
response = HTTParty.post(url, :body => data)
Now I am using Charles to sniff the HTTP traffic. This works great from the browser, but not from the terminal, where I run my script:
$ ruby MyScript.rb
How can I tell Ruby or my Terminal.app to use the Charles proxy at http://localhost:88888?
Update: Another solution would be to see the request before it is sent, so that I would not necessarily need the proxy.

Setting the proxy as timmah suggested should work.
Anyway, 88888 is not a valid port! I think you want to use 8888 (the Charles proxy default port).
So the right commands would be:
export http_proxy=localhost:8888
ruby MyScript.rb
If your script were to use https://, you would also/instead need to specify an HTTPS proxy like so:
export https_proxy=localhost:8888
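If you would rather not rely on environment variables, HTTParty can also be pointed at a proxy per request, and (depending on your HTTParty version) it can dump the raw request and response, which would also cover the update in the question. A minimal sketch, assuming Charles is listening on localhost:8888 and reusing the URL and data from the question:
require 'httparty'

data = { :client_id => '123123', :key => '123321' }
url = "http://someserver.com/endpoint/"

# route the request through the Charles proxy and echo the raw
# HTTP conversation to the terminal so you can inspect it
response = HTTParty.post(url,
  :body => data,
  :http_proxyaddr => 'localhost',
  :http_proxyport => 8888,
  :debug_output => $stdout)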

Related

How to mock a 3rd-party library

An important part of my project is to log in to a remote server with SSH and do something with files on it:
Net::SSH.start(@host, @username, :password => @password) do |ssh|
  ssh.exec!(rename_files_on_remote_server)
end
How to test it?
I think I could run a local SSH server and check the file names on it (maybe it could live in my test/spec directory).
Or maybe someone could point me to a better solution?
I think it's enough to test that you're sending the correct commands to the SSH server. Your application presumably doesn't implement the server, so you have to trust that the server is working correctly and is tested.
If you do implement the server then you'd need to test that, but as far as the SSH stuff goes, I'd do some mocking like this (RSpec 2 syntax):
describe "SSH Access" do
let (:ssh_connection) { mock("SSH Connection") }
before (:each) do
Net::SSH.stub(:start) { ssh_connection }
end
it "should send rename commands to the connection" do
ssh_connection.should_receive(:exec!).ordered.with("expected command")
ssh_connection.should_receive(:exec!).ordered.with("next expected command")
SSHAccessClass.rename_files!
end
end
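One caveat: if rename_files! uses the block form of Net::SSH.start, as the question's code does, the stub presumably needs to yield the mock to that block rather than just return it; something like:
Net::SSH.stub(:start).and_yield(ssh_connection)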
Your suggested solution is similar to how I've done it before:
Log into the local machine. For convenience you could use 'localhost' or '127.0.0.1', but for a better simulation of network activity you might want to use the full hostname. On Mac OS and Linux you can grab the host easily by using:
`hostname`
or
require 'socket'
hostname = Socket.gethostname
which should be universal.
From there create or touch a file on the local machine after logging in, so you can test for the change with your test code.
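For illustration, a rough sketch of that approach, assuming an SSH server is running locally and key-based login works for the current user (the marker path and the touch command are placeholders for whatever your real code does):
require 'socket'
require 'net/ssh'

hostname = Socket.gethostname   # or simply 'localhost'
marker   = "/tmp/ssh_test_#{Time.now.to_i}"

Net::SSH.start(hostname, ENV['USER']) do |ssh|
  # run the command under test; here it just touches a marker file
  ssh.exec!("touch #{marker}")
end

# verify the side effect on the "remote" (actually local) machine
raise "marker file was not created" unless File.exist?(marker)
File.delete(marker)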

Why can't I see SNMP Traps coming in?

I'm attempting to use Ruby SNMP to capture SNMP traps from various devices. To test this, I'm sending traps from my laptop using the 'snmptrap' command. I can see in packet captures, as well as in the 'snmptrapd' utility when I run it, that the traps are being sent and are arriving at my server (the server is the manager). I'm using the following example code, exactly as it appears in the documentation demo, to set up a TrapListener.
require 'snmp'
require 'logger'
log = Logger.new(STDOUT)
m = SNMP::TrapListener.new do |manager|
  manager.on_trap_default do |trap|
    log.info trap.inspect
  end
end
m.join
I'm sending an SNMPv2c trap, and nothing ever appears on the screen...
Here is the command I'm using to send a test SNMP trap, in the event that it's useful:
snmptrap -v 2c -c public hostname_goes_here SNMP-NOTIFICATION-MIB::snmpNotifyType SNMPv2-MIB::sysLocation
Any suggestions appreciated! Thanks!
I was stuck on this for a long time as well. It turns out that by default, TrapListener only listens on 127.0.0.1. To make it listen on ALL interfaces on the port you specified (or the default port 162), pass a :Host option: '0' makes it listen on ALL interfaces, or you can provide a specific IP address.
require 'snmp'
require 'logger'

log = Logger.new(STDOUT)
m = SNMP::TrapListener.new(:Host => 0) do |manager|
  manager.on_trap_default do |trap|
    log.info trap.inspect
  end
end
m.join

Perl Script to Monitor URL Using proxy credentials?

Please help with the following code; it is not working in our environment.
use LWP;
use strict;
my $url = 'http://google.com';
my $username = 'user';
my $password = 'mypassword';
my $browser = LWP::UserAgent->new('Mozilla');
$browser->credentials("172.18.124.11:80","something.co.in",$username=>$password);
$browser->timeout(10);
my $response=$browser->get($url);
print $response->content;
OUTPUT :
Can't connect to google.com:80 (timeout)
LWP::Protocol::http::Socket: connect: timeout at C:/Perl/lib/LWP/Protocol/http.pm line 51.
OS: windows XP
Regards, Gaurav
Do you have an HTTP proxy at 172.18.124.11? I assume LWP is not using the proxy. You might want to pass env_proxy => 1 to the new() call.
You also have a mod-perl2 tag on this question. If this code runs inside mod_perl2, it's possible that the http_proxy environment variable is not visible to the code. You can check this, e.g., by printing $browser->proxy('http').
Or just set the proxy explicitly with $browser->proxy('http', 'http://172.18.124.11:80');
Also, I assume you don't have use warnings on, because new() takes a hash, not just a string. It's a good idea to always enable warnings; that will save you a lot of trouble.

Sending an email from R using the sendmailR package

I am trying to send an email from R using the sendmailR package. The code below works fine when I run it on my PC, and I receive the email. However, when I run it on my MacBook Pro, it fails with the following error:
library(sendmailR)
from <- sprintf("<sendmailR@%s>", Sys.info()[4])
to <- "<myemail@gmail.com>"
subject <- "TEST"
body <- "TEST"
sendmail(from, to, subject, body,
         control=list(smtpServer="ASPMX.L.GOOGLE.COM"))
Error in socketConnection(host = server, port = port, blocking = TRUE) :
cannot open the connection
In addition: Warning message:
In socketConnection(host = server, port = port, blocking = TRUE) :
ASPMX.L.GOOGLE.COM:25 cannot be opened
Any ideas as to why this would work on a PC, but not a mac? I turned the firewall off on both machines.
Are you able to send email via the command-line?
So, first of all, fire up a Terminal and then
$ echo "Test 123" | mail -s "Test" user@domain.com
Look into /var/log/mail.log, or better use
$ tail -f /var/log/mail.log
in a different window while you send your email. If you see something like
... setting up TLS connection to smtp.gmail.com[xxx.xx.xxx.xxx]:587
... Trusted TLS connection established to smtp.gmail.com[xxx.xx.xxx.xxx]:587:\
TLSv1 with cipher RC4-MD5 (128/128 bits)
then you succeeded. Otherwise, it means you have to configure your mailing system. I have been using Postfix with Gmail for two years now, and I have never had a problem with it. Basically, you need to grab the Equifax certificate, Equifax_Secure_CA.pem, from here: http://www.geotrust.com/resources/root-certificates/. (They were using Thawte certificates before, but they changed last year.) Then, assuming you use Gmail:
Create relay_password in /etc/postfix and put a single line like this (with your correct login and password):
smtp.gmail.com login@gmail.com:password
then in a Terminal,
$ sudo postmap /etc/postfix/relay_password
to update Postfix lookup table.
Add the certificates in /etc/postfix/certs, or any folder you like, then
$ sudo c_rehash /etc/postfix/certs/
(i.e., rehash the certificates with OpenSSL).
Edit /etc/postfix/main.cf so that it includes the following lines (adjust the paths if needed):
relayhost = smtp.gmail.com:587
smtp_sasl_auth_enable = yes
smtp_sasl_password_maps = hash:/etc/postfix/relay_password
smtp_sasl_security_options = noanonymous
smtp_tls_security_level = may
smtp_tls_CApath = /etc/postfix/certs
smtp_tls_session_cache_database = btree:/etc/postfix/smtp_scache
smtp_tls_session_cache_timeout = 3600s
smtp_tls_loglevel = 1
tls_random_source = dev:/dev/urandom
Finally, just reload the Postfix process, with e.g.
$ sudo postfix reload
(a combination of start/stop works too).
You can choose a different port for the SMTP, e.g. 465.
It's still possible to use SASL without TLS (the above steps are basically the same), but in either case the main problem is that your login information sits in a plain text file... Also, should you want to use your MobileMe account, just replace the Gmail SMTP server with smtp.me.com.

How to scrape a _private_ google group?

I'd like to scrape the discussion list of a private Google group. It's a multi-page list and I might have to do this again later, so scripting sounds like the way to go.
Since this is a private group, I need to log in to my Google account first.
Unfortunately I can't manage to log in using wget or Ruby Net::HTTP. Surprisingly, Google Groups is not accessible with the ClientLogin interface, so all the code samples are useless.
My Ruby script is embedded at the end of the post. The response to the authentication query is a 200 OK, but there are no cookies in the response headers, and the body contains the message "Your browser's cookie functionality is turned off. Please turn it on."
I got the same output with wget. See the bash script at the end of this message.
I don't know how to work around this. Am I missing something? Any ideas?
Thanks in advance.
John
Here is the ruby script:
# a ruby script
require 'net/https'
http = Net::HTTP.new('www.google.com', 443)
http.use_ssl = true
path = '/accounts/ServiceLoginAuth'
email='john@gmail.com'
password='topsecret'
# form inputs from the login page
data = "Email=#{email}&Passwd=#{password}&dsh=7379491738180116079&GALX=irvvmW0Z-zI"
headers = { 'Content-Type' => 'application/x-www-form-urlencoded',
'user-agent' => "Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/533.2 (KHTML, like Gecko) Chrome/6.0"}
# Post the request and print out the response to retrieve our authentication token
resp, data = http.post(path, data, headers)
puts resp
resp.each {|h, v| puts h+'='+v}
#warning: peer certificate won't be verified in this SSL session
Here is the bash script:
# A bash script for wget
CMD=""
CMD="$CMD --keep-session-cookies --save-cookies cookies.tmp"
CMD="$CMD --no-check-certificate"
CMD="$CMD --post-data='Email=john#gmail.com&Passwd=topsecret&dsh=-8408553335275857936&GALX=irvvmW0Z-zI'"
CMD="$CMD --user-agent='Mozilla'"
CMD="$CMD https://www.google.com/accounts/ServiceLoginAuth"
echo $CMD
wget $CMD
wget --load-cookies="cookies.tmp" http://groups.google.com/group/mygroup/topics?tsc=2
Have you tried Mechanize for Ruby?
The Mechanize library is used for automating interaction with websites; you could log in to Google and browse your private Google group, saving what you need.
Here is an example where Mechanize is used for Gmail scraping.
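For what it's worth, here is a rough Mechanize sketch of the idea. The login URL and the Email/Passwd field names are assumptions taken from the form the script above already posts to, so they may need adjusting:
require 'mechanize'

agent = Mechanize.new
agent.user_agent_alias = 'Mac Safari'

# log in using the same form fields the hand-rolled script posts
login_page = agent.get('https://www.google.com/accounts/ServiceLogin')
form = login_page.forms.first
form['Email']  = 'john@gmail.com'
form['Passwd'] = 'topsecret'
agent.submit(form)

# the agent keeps the session cookies, so the private group pages
# can now be fetched and parsed
page = agent.get('http://groups.google.com/group/mygroup/topics?tsc=2')
page.links.each { |link| puts link.text }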
I did this previously by logging in manually with Firefox and then used Chickenfoot to automate browsing and scraping.
Found this PHP Solution to scraping private Google Groups.
