SOAP + SSL + Ruby Savon - HTTPClient::ConnectTimeoutError: execution expired - ruby

I am trying to get a simple ruby script to send requests to a SOAP API but I am not able to get responses back.
This is what I am trying to do:
require 'date'
require 'savon'
# Create the client
client = Savon::Client.new do
  wsdl.document = File.expand_path("path to wsdl document", __FILE__)
end
# Setup namespaces and credentials
client.wsdl.namespace = "http://www.example.com"
client.wsse.credentials "[USERNAME]", "[PASSWORD]"
# Setup ssl configuration
client.http.auth.ssl.cert_key_file = "path to key pem file"
client.http.auth.ssl.cert_file = "path to cert pem file"
client.http.auth.ssl.verify_mode = :peer
# execute request
response = client.request :sub, :get_data do
  soap.body = { "sub:id" => "123456" }
end
This request finishes with:
D, [2011-05-05T10:21:45.014588 #22136] DEBUG -- : SOAP request: "http://www.example.com"
D, [2011-05-05T10:21:45.014743 #22136] DEBUG -- : Content-Type: text/xml;charset=UTF-8, SOAPAction: "getData"
D, [2011-05-05T10:21:45.014787 #22136] DEBUG -- : <?xml version="1.0" encoding="UTF-8"?><env:Envelope ...(XML request)... </env:Body></env:Envelope>
D, [2011-05-05T10:21:45.014864 #22136] DEBUG -- : HTTPI executes HTTP POST using the httpclient adapter
HTTPClient::ConnectTimeoutError: execution expired
However, when I try to send the same request via curl, it works (copying the xml request above to the soap-request.xml file):
curl -k -v --header "Content-Type: text/xml;charset=UTF-8, SOAPAction: 'getData'" https://www.example.com -d @soap-request.xml --cert-type PEM --cert path_to_pem_file_with_both_key_and_cert
Any ideas about what I'm missing in the ruby script?
Thanks in advance.
UPDATE:
The code above works if the WSDL document is correct. However, if there is no WSDL document or it is erroneous, just replace the client declaration with this:
# Create the client
client = Savon::Client.new do
  wsdl.endpoint = "https://whateverendpoint.com"
  wsdl.namespace = "http://whatevernamespace.com"
end
Finally, it is also a good idea to catch possible faults as described in Savon's documentation:
begin
  # execute request
  response = client.request :sub, :get_data do
    soap.body = { "sub:id" => "123456" }
  end
rescue Savon::SOAP::Fault => fault
  puts fault.to_s
end

Have you tried extending the HTTP timeout? I had the same problem with some of my SOAP calls that took very long on the server side. What I did was this:
jira = Savon::Client.new do
  wsdl.document = 'http://jira.xxx.com/rpc/soap/jirasoapservice-v2?wsdl'
end
jira.http.read_timeout = 300

# Print an elapsed-seconds counter to stderr while the request is running
done = false
dot_printer = Thread.new do
  sec = 0
  until done
    sleep 1
    $stderr.print "\b\b\b%03i" % sec
    sec += 1
  end
end

resp = jira.request :get_issues_from_filter do
  soap.body = { :in0 => jira_token, :in1 => 18579 }
end
done = true
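Note that the error in the question is HTTPClient::ConnectTimeoutError, i.e. a connect timeout rather than a read timeout, so it may also be worth raising the open timeout on the same HTTPI request object. A small sketch, assuming your HTTPI version exposes open_timeout (the values are arbitrary):
jira.http.open_timeout = 30   # seconds allowed to establish the TCP/SSL connection
jira.http.read_timeout = 300  # seconds allowed to read the response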

Related

How to use standard library to parse URL params in Ruby?

I am using the following code with "uri" and "CGI" to parse the params of the URL:
require 'socket'
require 'uri'
require 'cgi'

server = TCPServer.new 8888
while session = server.accept
  request = session.gets
  p "request", request
  url = "http://somewebsite.com" + request.sub("GET ", "").sub(" HTTP/1.1", "").gsub(/(\r|\n)/, "")
  uri = URI(url)
  params = CGI.parse(uri.query)
  p "params", params

  session.print "HTTP/1.1 200\r\n"                      # 1
  session.print "Content-Type: text/html\r\n"           # 2
  session.print "\r\n"                                   # 3
  session.print "Hello world! The time is #{Time.now}"  # 4
  session.close
end
I had to "make up" a full URL by adding the http://somewebsite.com to the path, and use uri and CGI functions to do it. If the browser uses http://localhost:8888/?a=123&b=hello then it works well. But if the browser tried to access http://localhost:8888/favicon.ico or http://localhost:8888 then it broke right away, saying cannot split. (failing at CGI.parse(uri.query))
I can change the line to
params = uri.query ? CGI.parse(uri.query) : nil
but the whole thing seems a bit hacky: I need to make up a URL, and CGI.parse breaks if the query doesn't exist. Is there actually a better way to do this with the standard library? (Should something else be used instead of URI and CGI?)
(Using standard library has the advantage of automatically handling the cases for %20 and multiple params as in http://localhost:8888/?a=123&b=hello%20world&b=good, giving
{"a"=>["123"], "b"=>["hello world", "good"]}
as the result.)
Why do you need to make up anything? You don't need a full URI to use CGI.parse. Something like this should work:
require 'socket'
require 'cgi'

server = TCPServer.new 8888
while session = server.accept
  request = session.gets
  method, full_path = request.split(' ')
  path, params = full_path.split('?')
  params = CGI.parse(params) if params

  session.print "HTTP/1.1 200\r\n"                      # 1
  session.print "Content-Type: text/html\r\n"           # 2
  session.print "\r\n"                                   # 3
  session.print "Hello world! The time is #{Time.now}"  # 4
  session.print "\nparams: #{params}"
  p "params:", params
  session.close
end
You could also just use Rack if you don't want to reinvent the CGI wheel. Rack is not technically part of the standard library, but it is as close as you get.
# Gemfile
source 'https://rubygems.org'
gem 'rack'
Then run $ bundle install.
# application.rb
class Application
  # This is the main entry point for Rack
  # @param env [Hash]
  # @return [Array] status, headers, body
  # @see https://www.rubydoc.info/gems/rack/Rack/Request
  # @see https://www.rubydoc.info/gems/rack/Rack/Response
  def self.call(env)
    request = Rack::Request.new(env)
    Rack::Response.new(
      "Hello " + (request.params["name"] || "World"),
      200,
      { "Content-Type" => "text/plain" }
    ).finish
  end
end
request.params is a hash that contains query string parameters and parameters from the request body for POST requests.
# config.ru
require 'rack'
require_relative 'application'
run Application
Run $ rackup to start the server.
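As a quick usage check (assuming the default rackup port of 9292 and the name parameter used above):
curl "http://localhost:9292/?name=Ruby"
# => Hello Ruby
curl "http://localhost:9292/"
# => Hello World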

Net::HTTP Proxy list

I understand that you can use a proxy with Ruby's Net::HTTP. However, I have no idea how to do this with a bunch of proxies. I need Net::HTTP to switch to another proxy and send another POST request after every POST request. Also, is it possible to make Net::HTTP switch to another proxy if the previous proxy is not working? If so, how?
Code I'm trying to implement the script in:
require 'net/http'

sleep(8)
http = Net::HTTP.new('URLHERE', 80)
http.read_timeout = 5000
http.use_ssl = false
path = 'PATHHERE'
data = '(DATAHERE)'
headers = {
  'Referer' => 'REFERER HERE',
  'Content-Type' => 'application/x-www-form-urlencoded; charset=UTF-8',
  'User-Agent' => '(USERAGENTHERE)'
}
resp, data = http.post(path, data, headers)
# Output on the screen -> we should get either a 302 redirect (after a successful login) or an error page
puts 'Code = ' + resp.code
puts 'Message = ' + resp.message
resp.each { |key, val| puts key + ' = ' + val }
puts data
Given an array of proxies, the following example will make a request through each proxy in the array until it receives a "302 Found" response. (This isn't actually a working example because Google doesn't accept POST requests, but it should work if you insert your own destination and working proxies.)
require 'net/http'

destination = URI.parse "http://www.google.com/search"
proxies = [
  "http://proxy-example-1.net:8080",
  "http://proxy-example-2.net:8080",
  "http://proxy-example-3.net:8080"
]

# Create your POST request_object once
request_object = Net::HTTP::Post.new(destination.request_uri)
request_object.set_form_data({ "q" => "stack overflow" })

proxies.each do |raw_proxy|
  proxy = URI.parse raw_proxy
  # Create a new http_object for each new proxy
  http_object = Net::HTTP.new(destination.host, destination.port, proxy.host, proxy.port)
  # Make the request
  response = http_object.request(request_object)
  # If we get a 302, report it and break
  if response.code == "302"
    puts "#{proxy.host}:#{proxy.port} responded with #{response.code} #{response.message}"
    break
  end
end
You should also probably do some error checking with begin ... rescue ... end each time you make a request. If you don't do any error checking and a proxy is down, control will never reach the line that checks for response.code == "302" -- the program will just fail with some type of connection timeout error.
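For example, here is a rough sketch of that error handling wrapped around the loop above (the rescued exception classes are just common connection failures; adjust them to whatever your proxies actually throw):
proxies.each do |raw_proxy|
  proxy = URI.parse raw_proxy
  http_object = Net::HTTP.new(destination.host, destination.port, proxy.host, proxy.port)
  http_object.open_timeout = 5  # fail fast if the proxy is unreachable
  begin
    response = http_object.request(request_object)
  rescue Timeout::Error, Errno::ECONNREFUSED, Errno::EHOSTUNREACH, SocketError => e
    puts "#{proxy.host}:#{proxy.port} failed: #{e.class}: #{e.message}"
    next  # try the next proxy in the list
  end
  if response.code == "302"
    puts "#{proxy.host}:#{proxy.port} responded with #{response.code} #{response.message}"
    break
  end
end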
See the Net::HTTPHeader docs for other methods that can be used to customize the Net::HTTP::Post object.

Ruby TCP server basics

Can someone explain to me what each part of this code is doing?
It would be helpful if someone could give me a step by step explanation.
Also, how could I upload files?
How do I manipulate a ruby server in general?
#!/usr/bin/env ruby
require 'socket'
require 'cgi'

server = TCPServer.new('127.0.0.1', 8888)
puts 'Listening on 127.0.0.1:8888'

loop {
  client = server.accept
  first_request_header = client.gets
  resp = first_request_header
  headers = ['http/1.1 200 ok',
             "date: #{CGI.rfc1123_date(Time.now)}",
             'server: ruby',
             'content-type: text/html; charset=iso-8859-1',
             "content-length: #{resp.length}\r\n\r\n"].join("\r\n")
  client.puts headers # send the headers to the client
  client.puts resp
  client.close
}
# required libraries (both are part of the standard library)
require 'socket'
require 'cgi'

# create a TCP server socket listening on localhost, port 8888
server = TCPServer.new('127.0.0.1', 8888)
puts 'Listening on 127.0.0.1:8888'

loop {
  # block until a client connects, then return a socket for that client
  client = server.accept
  # read the first line of the HTTP request (the request line, e.g. "GET / HTTP/1.1")
  first_request_header = client.gets
  # use that line as the response body
  resp = first_request_header
  # build the HTTP status line and response headers
  headers = ['http/1.1 200 ok',
             "date: #{CGI.rfc1123_date(Time.now)}",
             'server: ruby',
             'content-type: text/html; charset=iso-8859-1',
             "content-length: #{resp.length}\r\n\r\n"].join("\r\n")
  # send the response headers, then the body, to the client
  client.puts headers
  client.puts resp
  # close the client connection
  client.close
}
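One thing the script above does not do is read past the first line: client.gets only returns the request line, so the rest of the request (headers and any body, which is where an upload would arrive) is never read. A rough sketch of reading the whole request, assuming the same server socket as above:
client = server.accept
request_line = client.gets            # e.g. "GET /index.html HTTP/1.1"
headers = {}
# Header lines keep coming until a blank CRLF line terminates them
while (line = client.gets) && line != "\r\n"
  name, value = line.chomp.split(": ", 2)
  headers[name] = value
end
# For POST/PUT requests, Content-Length tells us how many body bytes to read
body = client.read(headers["Content-Length"].to_i) if headers["Content-Length"]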

Faraday (Ruby) Timeout Errors

I'm attempting to put a small payload generated in route A (Sinatra app) to Route B using Faraday. So the code basically looks like:
post "/routeA" do
foo.save
foo_id = foo.id
conn = Faraday.new(:url => "http://localhost:3001/routeB" ) do |builder|
builder.request :url_encoded
builder.response :logger
builder.adapter :net_http
end
resp = conn.put do |req|
req.url '/routeB'
req.headers['Content-Type'] = 'application/json'
req.body = {:id => foo_id }.to_json
req.options = {
#:timeout => 5, # see below, these aren't the problem
#:open_timeout => 2
}
end
# never gets here b/c Timeout error always thrown
STDERR.puts resp.body
end
put "/routeB" do
# for test purposes just log output
STDERR.puts request.body.read.to_s.inspect
status 202
body '{"Ok"}'
end
The problem is that it always throws a timeout error (I've run it without the timeout options and with the ones shown above, with the same result). However, the logs show the request is going through:
I, [2012-03-24T16:56:13.241329 #17673] INFO -- : put http://localhost:3001/routeB
D, [2012-03-24T16:56:13.241427 #17673] DEBUG -- request: Content-Type: "application/json"
#<Faraday::Error::TimeoutError>
DEBUG - POST (60.7987ms) /routeA - 500 Internal Server Error
"{\"id\":7}"
DEBUG - PUT (0.0117ms) /routeB - 202 Accepted
Not sure how to get past the timeout error? Any insight would be appreciated. Thanks.
The problem is that the application cannot respond to another request until it has finished with the current one. That is, when you make the PUT request to /routeB, the application receives it, but it is still busy with the current request (/routeA). That request in turn won't finish because it is waiting for the response from /routeB. I think this deadlock is what causes the timeout error.
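If that is indeed the cause, one way around it (just a sketch, and route_b_app.rb is a hypothetical file name) is to serve the receiving route from a separate process, so the PUT from /routeA is not waiting on the same single-threaded app:
# route_b_app.rb -- run separately, e.g.: ruby route_b_app.rb -p 3001
require 'sinatra'

put "/routeB" do
  # for test purposes just log output
  STDERR.puts request.body.read.to_s.inspect
  status 202
  body '{"Ok"}'
end
Alternatively, a threaded or multi-process app server would let the same application handle both requests at once.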

Web Server Flow in the Rack OAuth-2 Server

I'm trying to integrate the Rack OAuth-2 server into my Sinatra application, to use it in a web-server flow implementation, and I can't make it work :(. I have the following code in the OAuth controller:
require "rack/oauth2/sinatra"
module RestKit
module Network
class OAuth2 < Sinatra::Base
use Rack::Logger
set :sessions, true
set :show_exceptions, true
ENV["DB"] = "test"
DATABASE = Mongo::Connection.new[ENV["DB"]]
register Rack::OAuth2::Sinatra
oauth.authenticator = lambda do |username, password|
"Batman" if username == "cowbell" && password == "more" end
oauth.host = "localhost"
oauth.database = DATABASE
# 3. Obtaining End-User Authorization
before "/oauth/*" do
halt oauth.deny! if oauth.scope.include?("time-travel") # Only Superman can do that
end
get "/oauth/authorize" do
"client: #{oauth.client.display_name}\nscope: #{oauth.scope.join(", ")}\nauthorization: #{oauth.authorization}"
end
post "/oauth/grant" do
oauth.grant! "Batman"
end
post "/oauth/deny" do
oauth.deny!
end
# 5. Accessing a Protected Resource
before { #user = oauth.identity if oauth.authenticated? }
oauth_required "/user"
get "/user" do
#user
end
get "/list_tokens" do
oauth.list_access_tokens("Batman").map(&:token).join(" ")
end
end
end
end
Then I try to obtain an authorization code using curl from terminal with:
curl -i http://localhost:4567/oauth/authorize -F response_type=code -F client_id=[the ID] -F client_secret=[the secret] -F redirect_uri=http://localhost:4567/oauth/showcode
and I just get this as a response:
HTTP/1.1 400 Bad Request
Content-Type: text/plain
Content-Length: 20
Connection: keep-alive
Server: thin 1.2.11 codename Bat-Shit Crazy
Missing redirect URL
Do you have any ideas what I'm doing wrong? Thanks!
The end of your curl request is:
-F redirect_uri=http://localhost:4567/oauth/showcode
but you haven't defined that route in the code above, i.e. where is:
get "/oauth/showcode" do
? That's why the error is "Missing redirect URL".
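For instance, a minimal route along these lines (the code parameter is what the authorization redirect would normally carry back; this is only a sketch, not part of the original app):
get "/oauth/showcode" do
  # just display the authorization code that was redirected back to us
  "authorization code: #{params[:code]}"
end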
