http_server.rb
require 'socket'
require 'json'

server = TCPServer.new 5678

while session = server.accept
  request = session.gets
  puts request

  session.print "HTTP/1.1 200\r\n"             # 1
  session.print "Content-Type: text/html\r\n"  # 2
  session.print "\r\n"                         # 3

  output = {
    "error" => false,
    "total_marks" => "0"
  }
  session.puts(output.to_json)
  session.close
end
So this standalone Ruby HTTP server file works perfectly locally. I would prefer not to have to use Rails, because I also need to turn this into a Docker container. Is there any way to enable CORS solely inside this file for this simple server?
I'm new to Docker and Ruby, so the less complex the better.
In its simplest form, CORS is just an Access-Control-Allow-Origin response header (see the documentation at https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS), so it's easy to add to your example:
session.print "Access-Control-Allow-Origin: something.com\r\n"
I am using the following code with "uri" and "CGI" to parse the params of the URL:
require 'socket'
require 'uri'
require 'cgi'

server = TCPServer.new 8888

while session = server.accept
  request = session.gets
  p "request", request

  url = "http://somewebsite.com" + request.sub("GET ", "").sub(" HTTP/1.1", "").gsub(/(\r|\n)/, "")
  uri = URI(url)
  params = CGI.parse(uri.query)
  p "params", params

  session.print "HTTP/1.1 200\r\n"             # 1
  session.print "Content-Type: text/html\r\n"  # 2
  session.print "\r\n"                         # 3
  session.print "Hello world! The time is #{Time.now}" # 4
  session.close
end
I had to "make up" a full URL by prepending http://somewebsite.com to the path, and use URI and CGI functions to parse it. If the browser uses http://localhost:8888/?a=123&b=hello then it works well. But if the browser tries to access http://localhost:8888/favicon.ico or http://localhost:8888 then it breaks right away with a "cannot split" error, failing at CGI.parse(uri.query) because uri.query is nil.
I can change the line to
params = uri.query ? CGI.parse(uri.query) : nil
but the whole thing seems a bit hacky: I have to make up a URL, and CGI.parse breaks if the query doesn't exist. Is there actually a better way to do this with the standard library? (Should something else be used instead of URI and CGI?)
(Using the standard library has the advantage of automatically handling cases like %20 and repeated params, as in http://localhost:8888/?a=123&b=hello%20world&b=good, giving
{"a"=>["123"], "b"=>["hello world", "good"]}
as the result.)
Why do you need to make up anything? You don't need a full URI to use CGI.parse. Something like this should work:
require 'socket'
require 'cgi'

server = TCPServer.new 8888

while session = server.accept
  request = session.gets
  method, full_path = request.split(' ')
  path, params = full_path.split('?')
  params = CGI.parse(params) if params

  session.print "HTTP/1.1 200\r\n"             # 1
  session.print "Content-Type: text/html\r\n"  # 2
  session.print "\r\n"                         # 3
  session.print "Hello world! The time is #{Time.now}" # 4
  session.print "\nparams: #{params}"
  p "params:", params
  session.close
end
You could also just use Rack if you don't want to reinvent the CGI wheel. Rack is not technically part of the standard library, but it's as close as you get.
# Gemfile
gem 'rack'
Run $ bundle install.
# application.rb
class Application
  # This is the main entry point for Rack
  # @param env [Hash]
  # @return [Array] status, headers, body
  # @see https://www.rubydoc.info/gems/rack/Rack/Request
  # @see https://www.rubydoc.info/gems/rack/Rack/Response
  def self.call(env)
    request = Rack::Request.new(env)
    Rack::Response.new(
      "Hello " + (request.params["name"] || "World"),
      200,
      { "Content-Type" => "text/plain" }
    ).finish
  end
end
request.params is a hash that contains query string parameters and parameters from the request body for POST requests.
# config.ru
require 'rack'
require_relative 'application'
run Application
Run $ rackup to start the server.
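By default rackup serves on port 9292, so a quick smoke test might look like this (the name parameter is just an example):

$ rackup
$ curl "http://localhost:9292/?name=Ruby"
Hello Ruby

Without a name parameter the response falls back to "Hello World".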
My first question here... so be gentle :D
I have the following code:
require 'socket'

server = TCPServer.new('localhost', 8080)

loop do
  socket = server.accept
  # Do something with the URL parameters
  response = "Hello world"
  socket.print response
  socket.close
end
The point is that I want to be able to retrieve any parameters that have been sent in the URL of the HTTP request.
Example:
From this request:
curl "http://localhost:8080/?id=1&content=test"
I want to be able to retrieve something like this:
{id => "1", content => "test"}
I've been looking at CGI.parse[1] and similar solutions, but I haven't found a way to extract that data from a TCPSocket.
[1] http://www.ruby-doc.org/stdlib-1.9.3/libdoc/cgi/rdoc/CGI.html#method-c-parse
FYI: my need is for a minimal HTTP server that receives a couple of parameters, and I want to avoid the use of gems and/or full HTTP wrappers/helpers like Rack.
Needless to say... but thanks in advance.
If you want to see a very minimal server, here is one. It only splits the query string into an array of raw "key=value" strings; you'll need to do more to decode them and turn them into names and values.
There is a fuller explanation of the server code at https://practicingruby.com/articles/implementing-an-http-file-server.
require "socket"
server = TCPServer.new('localhost', 8080)
loop do
socket = server.accept
request = socket.gets
# Here is the first line of the request. There are others.
# Your parsing code will need to figure out which are
# the ones you need, and extract what you want. Rack will do
# this for you and give you everything in a nice standard form.
paramstring = request.split('?')[1] # chop off the verb
paramstring = paramstring.split(' ')[0] # chop off the HTTP version
paramarray = paramstring.split('&') # only handles two parameters
# Do something with the URL parameters which are in the parameter array
# Build a response!
# you need to include the Content-Type and Content-Length headers
# to let the client know the size and type of data
# contained in the response. Note that HTTP is whitespace
# sensitive and expects each header line to end with CRLF (i.e. "\r\n")
response = "Hello world!"
socket.print "HTTP/1.1 200 OK\r\n" +
"Content-Type: text/plain\r\n" +
"Content-Length: #{response.bytesize}\r\n" +
"Connection: close\r\n"
# Print a blank line to separate the header from the response body,
# as required by the protocol.
socket.print "\r\n"
socket.print response
socket.close
end
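If you do want the parameters as a proper hash, with %20 decoding and repeated keys handled, the standard library's CGI.parse that the question mentions can take the query string directly. A small sketch, assuming paramstring was extracted as above and the request actually contained a query string:

require "cgi"

# paramstring is the raw query string, e.g. "id=1&content=test"
params = paramstring ? CGI.parse(paramstring) : {}
# => {"id"=>["1"], "content"=>["test"]}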
I understand that you can use a proxy with Ruby's Net::HTTP. However, I have no idea how to do this with a list of proxies. I need Net::HTTP to switch to another proxy and send another POST request after every POST request. Also, is it possible to make Net::HTTP switch to another proxy if the previous proxy is not working? If so, how?
Code I'm trying to implement the script in:
require 'net/http'

sleep(8)

http = Net::HTTP.new('URLHERE', 80)
http.read_timeout = 5000
http.use_ssl = false

path = 'PATHHERE'
data = '(DATAHERE)'
headers = {
  'Referer'      => 'REFERER HERE',
  'Content-Type' => 'application/x-www-form-urlencoded; charset=UTF-8',
  'User-Agent'   => '(USERAGENTHERE)'
}

resp, data = http.post(path, data, headers)

# Output on the screen -> we should get either a 302 redirect (after a successful login) or an error page
puts 'Code = ' + resp.code
puts 'Message = ' + resp.message
resp.each { |key, val| puts key + ' = ' + val }
puts data
Given an array of proxies, the following example will make a request through each proxy in the array until it receives a "302 Found" response. (This isn't actually a working example because Google doesn't accept POST requests, but it should work if you insert your own destination and working proxies.)
require 'net/http'

destination = URI.parse "http://www.google.com/search"
proxies = [
  "http://proxy-example-1.net:8080",
  "http://proxy-example-2.net:8080",
  "http://proxy-example-3.net:8080"
]

# Create your POST request_object once
request_object = Net::HTTP::Post.new(destination.request_uri)
request_object.set_form_data({"q" => "stack overflow"})

proxies.each do |raw_proxy|
  proxy = URI.parse raw_proxy

  # Create a new http_object for each new proxy
  http_object = Net::HTTP.new(destination.host, destination.port, proxy.host, proxy.port)

  # Make the request
  response = http_object.request(request_object)

  # If we get a 302, report it and break
  if response.code == "302"
    puts "#{proxy.host}:#{proxy.port} responded with #{response.code} #{response.message}"
    break
  end
end
You should also probably do some error checking with begin ... rescue ... end each time you make a request. If you don't do any error checking and a proxy is down, control will never reach the line that checks for response.code == "302" -- the program will just fail with some type of connection timeout error.
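As a rough sketch of that error handling (the rescued exception classes here are illustrative rather than exhaustive), the loop body might become:

proxies.each do |raw_proxy|
  proxy = URI.parse raw_proxy
  http_object = Net::HTTP.new(destination.host, destination.port, proxy.host, proxy.port)

  begin
    response = http_object.request(request_object)
  rescue Errno::ECONNREFUSED, Errno::ETIMEDOUT, SocketError, Timeout::Error => e
    # This proxy is down or unreachable; report it and try the next one
    puts "#{proxy.host}:#{proxy.port} failed: #{e.class}"
    next
  end

  if response.code == "302"
    puts "#{proxy.host}:#{proxy.port} responded with #{response.code} #{response.message}"
    break
  end
end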
See the Net::HTTPHeader docs for other methods that can be used to customize the Net::HTTP::Post object.
I am trying to use Net::HTTP to add an API key to the HTTP headers of a client request, but it just doesn't seem to be working when I test it out. Basically the server requires the client's request headers to include an API key in order to serve the request.
Server code
require 'sinatra'

before do
  halt 400 if (env['API_KEY']) != 'wow'
end

get '/' do
  "boo"
end
Client code
require 'net/http'
require 'uri'

port = ENV['PORT'] || '7474'

res = Net::HTTP.start('localhost', port) { |h| h.get('/') }
res.add_field('api-key', 'wow')

res.each_header do |key, value|
  p "#{key} => #{value}"
end

puts (res.code == '200' && res.body == 'boo') ? 'OK' : 'FAIL'
This is the response I get back:
"x-frame-options => sameorigin"
"x-xss-protection => 1; mode=block"
"content-type => text/html;charset=utf-8"
"content-length => 0"
"connection => keep-alive"
"server => thin 1.5.0 codename Knife"
"api-key => wow"
FAIL
On the server, the HTTP header variables in env are prefixed with HTTP_, so you need to check env['HTTP_API_KEY']. From the documentation:
HTTP_ Variables: Variables corresponding to the client-supplied HTTP request headers (i.e., variables whose names begin with HTTP_). The presence or absence of these variables should correspond with the presence or absence of the appropriate HTTP header in the request.
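A sketch of the fix on both sides, assuming the key stays 'wow'. Note that the question's client also adds the header to the response object after the call has already been made; to actually send it, pass the header with the request itself:

# server: check the HTTP_-prefixed env variable
before do
  halt 400 if env['HTTP_API_KEY'] != 'wow'
end

# client: pass the header when making the request
res = Net::HTTP.start('localhost', port) { |h| h.get('/', 'Api-Key' => 'wow') }
puts (res.code == '200' && res.body == 'boo') ? 'OK' : 'FAIL'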
I am trying to get a simple Ruby script to send requests to a SOAP API, but I am not able to get responses back.
This is what I am trying to do:
require 'date'
require 'savon'

# Create the client
client = Savon::Client.new do
  wsdl.document = File.expand_path("path to wsdl document", __FILE__)
end

# Setup namespaces and credentials
client.wsdl.namespace = "http://www.example.com"
client.wsse.credentials "[USERNAME]", "[PASSWORD]"

# Setup ssl configuration
client.http.auth.ssl.cert_key_file = "path to key pem file"
client.http.auth.ssl.cert_file = "path to cert pem file"
client.http.auth.ssl.verify_mode = :peer

# Execute request
response = client.request :sub, :get_data do
  soap.body = { "sub:id" => "123456" }
end
This request finishes with:
D, [2011-05-05T10:21:45.014588 #22136] DEBUG -- : SOAP request: "http://www.example.com"
D, [2011-05-05T10:21:45.014743 #22136] DEBUG -- : Content-Type: text/xml;charset=UTF-8, SOAPAction: "getData"
D, [2011-05-05T10:21:45.014787 #22136] DEBUG -- : <?xml version="1.0" encoding="UTF-8"?><env:Envelope ...(XML request)... </env:Body></env:Envelope>
D, [2011-05-05T10:21:45.014864 #22136] DEBUG -- : HTTPI executes HTTP POST using the httpclient adapter
HTTPClient::ConnectTimeoutError: execution expired
However, when I try to send the same request via curl, it works (copying the xml request above to the soap-request.xml file):
curl -k -v --header "Content-Type: text/xml;charset=UTF-8, SOAPAction: 'getData'" https://www.example.com -d#soap-request.xml --cert-type PEM --cert path_to_pem_file_with_both_key_and_cert
Any ideas about what I'm missing in the ruby script?
Thanks in advance.
UPDATE:
The code above works if the WSDL document is correct. However, if there isn't one, or if it is erroneous, just replace the client declaration with this:
# Create the client
client = Savon::Client.new do
  wsdl.endpoint = "https://whateverendpoint.com"
  wsdl.namespace = "http://whatevernamespace.com"
end
Finally, it is also a good idea to catch possible faults as described in Savon's documentation:
begin
  # Execute request
  response = client.request :sub, :get_data do
    soap.body = { "sub:id" => "123456" }
  end
rescue Savon::SOAP::Fault => fault
  puts fault.to_s
end
Have you tried extending the HTTP timeout? I had the same problem with some of my SOAP calls that took very long on the server side. What I did was this:
jira = Savon::Client.new do
  wsdl.document = 'http://jira.xxx.com/rpc/soap/jirasoapservice-v2?wsdl'
end
jira.http.read_timeout = 300

done = 0
dotPrinter = Thread.new do
  sec = 0
  while done == 0 do
    sleep 1
    $stderr.print "\b\b\b%03i" % sec
    sec += 1
  end
end

resp = jira.request :get_issues_from_filter do
  soap.body = { :in0 => jira_token, :in1 => 18579 }
end
done = 1
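One related note: the error in the question is a connect timeout (HTTPClient::ConnectTimeoutError), which is raised while opening the connection rather than while reading the response. If that is what you are hitting, the corresponding knob would be the open timeout, e.g. jira.http.open_timeout = 300, assuming the HTTPI version you are using exposes it.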