public function customHeaders()
{
    return [
        'Authorization' => request()->header('Authorization'),
        'Content-Type' => 'multipart/form-data',
        'Access-Control-Allow-Credentials' => true,
    ];
}

public function post(Request $request)
{
    $response = Http::withHeaders($this->customHeaders())->post($this->url, $request->all());

    return response()->json($response->json(), $response->status());
}
I have no problem using application/json, but I get a 404 when using multipart/form-data.
public function customHeaders()
{
    return [
        'multipart' => [
            'Authorization' => request()->header('Authorization'),
            'Content-Type' => 'multipart/form-data',
            'Access-Control-Allow-Credentials' => true,
            'Content-Disposition' => 'multipart',
            'name' => 'multipart'
        ],
        'form-data' => [
            'Authorization' => request()->header('Authorization'),
            'Content-Type' => 'application/json',
            'Access-Control-Allow-Credentials' => true,
            'Content-Disposition' => 'form-data',
            'name' => 'form-data'
        ]
    ];
}
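A likely culprit is the hand-written 'Content-Type' => 'multipart/form-data' header: a multipart body is only parseable when that header also carries the boundary generated for that particular request, so hard-coding the bare media type produces a request many servers reject or mis-route. A minimal sketch of what a well-formed multipart request needs, shown in Python with only the standard library since the point is language-independent:

```python
import uuid

def encode_multipart(fields):
    """Encode simple form fields as a multipart/form-data body.

    Returns (content_type, body). The boundary embedded in the body
    must also appear in the Content-Type header; hard-coding the header
    to plain 'multipart/form-data' throws that boundary away, and the
    server can no longer split the body back into parts.
    """
    boundary = uuid.uuid4().hex
    lines = []
    for name, value in fields.items():
        lines.append("--" + boundary)
        lines.append('Content-Disposition: form-data; name="%s"' % name)
        lines.append("")  # blank line separates part headers from part body
        lines.append(str(value))
    lines.append("--" + boundary + "--")  # closing delimiter
    body = "\r\n".join(lines) + "\r\n"
    content_type = "multipart/form-data; boundary=" + boundary
    return content_type, body

content_type, body = encode_multipart({"title": "hello"})
```

In Laravel's HTTP client the multipart encoding (boundary included) is normally produced for you, for example by building the request with attach(), rather than by setting the Content-Type header manually.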
Below is my successful HTTP request in the DEV environment:
$response = Http::withHeaders([
    'Content-Type' => 'application/json',
    'Accept' => 'application/json'
])
    ->withToken('xxxxxxxxxxxxxx')
    ->post('https://xxxxxxxxx.com/v0.1/messages/', [
        'from' => [
            'type' => 'xxxx',
            'number' => 'xxxxxxxx',
        ],
        'to' => [
            'type' => 'xxxxx',
            'number' => 'xxxxxx',
        ],
        'message' => [
            'content' => [
                'type' => 'text',
                'text' => 'test message from laravel'
            ]
        ]
    ]);
But on production it's mandatory to add a proxy to the request.
Does anyone have an idea how to pass a proxy with the request above?
Thank you in advance.
You can specify Guzzle options using the withOptions method.
Hence:
$response = Http::withOptions([
    'proxy' => 'http://username:password@proxyhost.com:7000'
])->withHeaders( ...
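The proxy string is an ordinary URL of the form scheme://user:password@host:port. If the username or password contains reserved characters ('@', ':', '/'), they must be percent-encoded first or the URL parser will mis-split the string. A small language-neutral sketch (hypothetical helper and credentials):

```python
from urllib.parse import quote

def build_proxy_url(host, port, user=None, password=None, scheme="http"):
    """Assemble a proxy URL, percent-encoding the credentials so that
    characters like '@' or ':' inside a password cannot break the URL."""
    if user is None:
        return "%s://%s:%d" % (scheme, host, port)
    creds = quote(user, safe="") + ":" + quote(password or "", safe="")
    return "%s://%s@%s:%d" % (scheme, creds, host, port)

url = build_proxy_url("proxyhost.com", 7000, "username", "p@ss:word")
```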
I have been trying to make a simple POST request to an endpoint with a JSON payload using Guzzle, but I always get 400 Bad Request back.
I can make the same request in Postman and it works. Also, if I make the request using cURL, it works.
Can anyone tell from my code what I am doing wrong?
Here's the original cURL request:
curl "https://endpoint.com/" \
-H "Authorization: ApiKey pp_test_*********" \
--data '{
"shipping_address": {
"recipient_name": "Deon Botha",
"address_line_1": "Eastcastle House",
"address_line_2": "27-28 Eastcastle Street",
"city": "London",
"county_state": "Greater London",
"postcode": "W1W 8DH",
"country_code": "GBR"
},
"customer_email": "email@example.com",
"customer_phone": "123455677",
"customer_payment": {
"amount": 29.99,
"currency": "USD"
},
"jobs": [{
"assets": ["http://psps.s3.amazonaws.com/sdk_static/1.jpg"],
"template_id": "i6_case"
}, {
"assets": ["http://psps.s3.amazonaws.com/sdk_static/2.jpg"],
"template_id": "a1_poster"
}]
}'
And here is the Postman-generated PHP HttpRequest code, which also works:
<?php
$request = new HttpRequest();
$request->setUrl('https://endpoint.com/');
$request->setMethod(HTTP_METH_POST);
$request->setHeaders(array(
    'cache-control' => 'no-cache',
    'Connection' => 'keep-alive',
    'Content-Length' => '678',
    'Accept-Encoding' => 'gzip, deflate',
    'Host' => 'api.kite.ly',
    'Cache-Control' => 'no-cache',
    'Accept' => '*/*',
    'User-Agent' => 'PostmanRuntime/7.19.0',
    'Content-Type' => 'text/plain',
    'Authorization' => 'ApiKey pk_test_*******'
));
$request->setBody(' {
"shipping_address": {
"recipient_name": "Deon Botha",
"address_line_1": "Eastcastle House",
"address_line_2": "27-28 Eastcastle Street",
"city": "London",
"county_state": "Greater London",
"postcode": "W1W 8DH",
"country_code": "GBR"
},
"customer_email": "email@example.com",
"customer_phone": "12345667",
"customer_payment": {
"amount": 29.99,
"currency": "USD"
},
"jobs": [{
"assets": ["http://psps.s3.amazonaws.com/sdk_static/1.jpg"],
"template_id": "i6_case"
}, {
"assets": ["http://psps.s3.amazonaws.com/sdk_static/2.jpg"],
"template_id": "a1_poster"
}]
}');
try {
    $response = $request->send();
    echo $response->getBody();
} catch (HttpException $ex) {
    echo $ex;
}
But when I try to make the same request with Guzzle, it fails with 400 Bad Request, and I can't understand why.
$data = [
    "shipping_address" => [
        "recipient_name" => "Deon Botha",
        "address_line_1" => "Eastcastle House",
        "address_line_2" => "27-28 Eastcastle Street",
        "city" => "London",
        "county_state" => "Greater London",
        "postcode" => "W1W 8DH",
        "country_code" => "GBR"
    ],
    "customer_email" => "example@email.com",
    "customer_phone" => "+44 (0)784297 1234",
    "customer_payment" => [
        "amount" => 29.99,
        "currency" => "USD"
    ],
    "jobs" => [
        "assets" => ["http://psps.s3.amazonaws.com/sdk_static/1.jpg"],
        "template_id" => "i6_case"
    ]
];

$options = json_encode($data);

$response = $client->request('POST', config('services.endpoint.com'), [
    'headers' => [
        'Authorization' => config('services.endpoint.com.public_key'),
        'Content-Type' => 'application/json'
    ],
    $options
]);
If anyone can help me to debug this I'd be really grateful.
If you're using Guzzle 6 (and you probably should be), you're constructing the request in a more complicated way than you need to, such that the endpoint is not receiving the expected JSON. Try this instead:
$client = new Client([
'base_uri' => 'https://my.endpoint.com/api',
'headers' => [
'Accept' => 'application/json',
...other headers...
]
]);
$data = [...your big slab of data...];
$response = $client->post('/kitely/path', ['json' => $data]);
// a string containing the results, which will depend on the endpoint
// the Accept header says we will accept json if it is available
// then we can use json_decode on the result
$result = $response->getBody()->getContents();
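The reason the 'json' option works is that it performs two coupled steps at once: serializing the array to a JSON body and setting the matching Content-Type header. The failing code instead placed the pre-encoded string into the options array as a bare positional element, which Guzzle does not recognize, so the endpoint likely received no body at all. The same contract, sketched in Python with only the standard library (the URL and API key are placeholders):

```python
import json

def build_json_request(url, data, extra_headers=None):
    """Mimic what an HTTP client's `json` option does: encode the body
    and set the matching Content-Type in a single step, so the header
    and the body can never drift apart."""
    headers = {"Content-Type": "application/json"}
    headers.update(extra_headers or {})
    return {
        "method": "POST",
        "url": url,
        "headers": headers,
        "body": json.dumps(data),  # the serialized payload IS the body
    }

req = build_json_request(
    "https://endpoint.example/orders",
    {"customer_payment": {"amount": 29.99, "currency": "USD"}},
    {"Authorization": "ApiKey pk_test_placeholder"},
)
```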
I'm using Unirest for all my HTTP requests. I tried Guzzle and ran into the same issue you're facing.
So I installed Unirest in my project instead; the whole process is described in detail in their documentation. This works perfectly for me.
I am trying to perform an upload to BigQuery from Perl with a sample schema and some sample data. I ran into dead ends following the documentation they provide, so now I'm trying to mimic what the bq command line client successfully does.
I am tracing what bq does by adding a debug print of (method, uri, headers, body) to the request method in httplib2. I am tracing what my Perl library does by running Dumper on the response, which also includes the _request that was sent. The pattern in bq is: POST to an upload URL, get back a location to PUT the data to, then monitor the corresponding job with a series of GET requests until it completes.
In Perl my POST succeeds, but the PUT fails with Invalid Upload Request (with no hint as to why it is invalid). I am trying to find a difference between the two that could explain the failure, but I can't see one.
Here are the traces that I get (with the access token, IP addresses, and project id elided).
For the POST the information from Python is:
(
u'POST',
u'https://www.googleapis.com/upload/bigquery/v2/projects/<project ID>/jobs?uploadType=resumable&alt=json',
{
'content-length': '442',
'accept-encoding': 'gzip, deflate',
'accept': 'application/json',
'user-agent': u'bq/2.0 google-api-python-client/1.0',
'X-Upload-Content-Length': '84',
'X-Upload-Content-Type': 'application/octet-stream',
'content-type': 'application/json',
'Authorization': u'Bearer <access token>'
},
'{"configuration": {"load": {"sourceFormat": "NEWLINE_DELIMITED_JSON", "destinationTable": {"projectId": "<project id>", "tableId": "demo_api", "datasetId": "tmp_bt"}, "maxBadRecords": 0, "schema": {"fields": [{"type": "STRING", "mode": "required", "name": "demo_string"}, {"type": "INTEGER", "mode": "required", "name": "demo_integer"}]}}}, "jobReference": {"projectId": "<project id>", "jobId": "bqjob_r139e633b7e522cf7_0000014031d9fb49_1"}}'
)
The corresponding Perl gets an apparently successful response object (in which you can see the _request) of:
$VAR1 = bless( {
'_protocol' => 'HTTP/1.1',
'_content' => '',
'_rc' => '200',
'_headers' => bless( {
'connection' => 'close',
'client-response-num' => 1,
'location' => 'https://www.googleapis.com/upload/bigquery/v2/projects/<project id>/jobs?uploadType=resumable&upload_id=AEnB2Ur0mdwmZpMot6ftkgj1IkqK0f7oPbZrXWQekUDHK_E2o2HKznJO6DK2xPYCB-nhUGrMrEJJ7z1Tz9Crnka9e5EYGP1lWQ',
'date' => 'Tue, 06 Aug 2013 20:46:05 GMT',
'client-ssl-cert-issuer' => '/C=US/O=Google Inc/CN=Google Internet Authority',
'client-ssl-cipher' => 'RC4-SHA',
'client-peer' => '<some ip>:443',
'content-length' => '0',
'client-date' => 'Tue, 06 Aug 2013 20:46:05 GMT',
'content-type' => 'text/html; charset=UTF-8',
'client-ssl-cert-subject' => '/C=US/ST=California/L=Mountain View/O=Google Inc/CN=*.googleapis.com',
'server' => 'HTTP Upload Server Built on Jul 24 2013 17:20:01 (1374711601)',
'client-ssl-socket-class' => 'IO::Socket::SSL'
}, 'HTTP::Headers' ),
'_msg' => 'OK',
'_request' => bless( {
'_content' => '{"configuration":{"load":{"maxBadRecords":0,"destinationTable":{"datasetId":"tmp_bt","tableId":"perl","projectId":<project id>},"sourceFormat":"NEWLINE_DELIMITED_JSON","schema":{"fields":[{"mode":"required","name":"demo_string","type":"STRING"},{"mode":"required","name":"demo_integer","type":"INTEGER"}]}}},"jobReference":{"projectId":<project id>,"jobId":"perlapi_1375821964"}}',
'_uri' => bless( do{\(my $o = 'https://www.googleapis.com/upload/bigquery/v2/projects/<project id>/jobs?uploadType=resumable')}, 'URI::https' ),
'_headers' => bless( {
'user-agent' => 'libwww-perl/6.05',
'content-type' => 'application/json',
'accept' => 'application/json',
':X-Upload-Content-Type' => 'application/octet-stream',
'content-length' => 379,
':X-Upload-Content-Length' => '84',
'authorization' => 'Bearer <access token>'
}, 'HTTP::Headers' ),
'_method' => 'POST',
'_uri_canonical' => $VAR1->{'_request'}{'_uri'}
}, 'HTTP::Request' )
}, 'HTTP::Response' );
And then we have a PUT. On the Python side we sent:
(
'PUT',
'https://www.googleapis.com/upload/bigquery/v2/projects/<project id>/jobs?uploadType=resumable&alt=json&upload_id=AEnB2UpWMRCAOffqyR0d7zvGVtD-KWhrC9jGB-q_igecJgoyz_mIHgEFfs9cYoPxUwUxuflQScMzGxDsKKJ_CJPQq4Os-AkdZA',
{
'Content-Range': 'bytes 0-83/84',
'Content-Length': '84',
'Authorization': u'Bearer <access token>',
'user-agent': u'bq/2.0'
},
<apiclient.http._StreamSlice object at 0x10ce11150>
)
(I have verified that the stream slice object has the same 84 bytes as Perl.) And here is the Perl failure:
$VAR1 = bless( {
'_protocol' => 'HTTP/1.1',
'_content' => '{
"error": {
"errors": [
{
"domain": "global",
"reason": "badRequest",
"message": "Invalid Upload Request"
}
],
"code": 400,
"message": "Invalid Upload Request"
}
}
',
'_rc' => '400',
'_headers' => bless( {
'connection' => 'close',
'client-response-num' => 1,
'date' => 'Tue, 06 Aug 2013 20:46:07 GMT',
'client-ssl-cert-issuer' => '/C=US/O=Google Inc/CN=Google Internet Authority',
'client-ssl-cipher' => 'RC4-SHA',
'client-peer' => '<some IP address>:443',
'content-length' => '193',
'client-date' => 'Tue, 06 Aug 2013 20:46:07 GMT',
'content-type' => 'application/json',
'client-ssl-cert-subject' => '/C=US/ST=California/L=Mountain View/O=Google Inc/CN=*.googleapis.com',
'server' => 'HTTP Upload Server Built on Jul 24 2013 17:20:01 (1374711601)',
'client-ssl-socket-class' => 'IO::Socket::SSL'
}, 'HTTP::Headers' ),
'_msg' => 'Bad Request',
'_request' => bless( {
'_content' => '{"demo_string":"foo", "demo_integer":"2"}
{"demo_string":"bar", "demo_integer":"3"}
',
'_uri' => bless( do{\(my $o = 'https://www.googleapis.com/upload/bigquery/v2/projects/<project id>/jobs?uploadType=resumable&upload_id=AEnB2Ur0mdwmZpMot6ftkgj1IkqK0f7oPbZrXWQekUDHK_E2o2HKznJO6DK2xPYCB-nhUGrMrEJJ7z1Tz9Crnka9e5EYGP1lWQ')}, 'URI::https' ),
'_headers' => bless( {
'user-agent' => 'libwww-perl/6.05',
':Content-Length' => '84',
':Content-Range' => '0-83/84',
'content-length' => 84,
'authorization' => 'Bearer <access token>'
}, 'HTTP::Headers' ),
'_method' => 'PUT',
'_uri_canonical' => $VAR1->{'_request'}{'_uri'}
}, 'HTTP::Request' )
}, 'HTTP::Response' );
What should I try changing on the Perl side to make BigQuery respond to me like it does to bq?
Some of your PUT headers have colons in front of them, which the Python request's headers do not:
':Content-Length' => '84',
':Content-Range' => '0-83/84',
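Besides the leading colons (which likely mean those fields were never sent as ordinary headers), the traced values themselves differ: the Python client sent 'Content-Range: bytes 0-83/84', while the Perl request carried only '0-83/84', without the 'bytes' unit that resumable uploads expect. A small sketch of the expected header value, assuming inclusive byte offsets as defined for HTTP range responses:

```python
def content_range(first, last, total):
    """Format a Content-Range value for an upload chunk:
    'bytes <first>-<last>/<total>', where first and last are
    inclusive zero-based byte offsets and total is the full size."""
    assert 0 <= first <= last < total
    return "bytes %d-%d/%d" % (first, last, total)

value = content_range(0, 83, 84)  # an 84-byte payload sent in one chunk
```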
I suspect there is something malformed in the upload request. The error "Invalid Upload Request" is returned when the server fails to split the data payload out of the upload message. Your logging doesn't include the request bodies, so we can't compare them side by side for unexpected differences.
To make sure the problem is the upload itself, you could try a load request that reads data from Google Storage rather than including the data in the request payload. That would verify that the Perl API request path is working for you.
FYI: there is an alpha Perl Google APIs client that might help you out. I haven't tried it and don't know whether it's actively developed, but you might find some useful hints there. Check out https://code.google.com/p/google-api-perl-client/
I am writing a simple static Rack app. Check out the config.ru code below:
use Rack::Static,
  :urls => ["/elements", "/img", "/pages", "/users", "/css", "/js"],
  :root => "archive"

map '/' do
  run Proc.new { |env|
    [
      200,
      {
        'Content-Type' => 'text/html',
        'Cache-Control' => 'public, max-age=6400'
      },
      File.open('archive/splash.html', File::RDONLY)
    ]
  }
end

map '/pages/search.html' do
  run Proc.new { |env|
    [
      200,
      {
        'Content-Type' => 'text/html',
        'Cache-Control' => 'public, max-age=6400'
      },
      File.open('archive/pages/search.html', File::RDONLY)
    ]
  }
end

map '/pages/user.html' do
  run Proc.new { |env|
    [
      200,
      {
        'Content-Type' => 'text/html',
        'Cache-Control' => 'public, max-age=6400'
      },
      File.open('archive/pages/user.html', File::RDONLY)
    ]
  }
end
# Each map section is repeated for each HTML page served
I'd like to simplify this by storing the URL in a variable and writing one map section that says:
map url do
  run Proc.new { |env|
    [
      200,
      {
        'Content-Type' => 'text/html',
        'Cache-Control' => 'public, max-age=6400'
      },
      File.open('archive' + url, File::RDONLY)
    ]
  }
end
How can I correctly set this url variable?
How about:
static_page_mappings = {
  '/'                  => 'archive/splash.html',
  '/pages/search.html' => 'archive/pages/search.html',
  '/pages/user.html'   => 'archive/pages/user.html',
}

static_page_mappings.each do |req, file|
  map req do
    run Proc.new { |env|
      [
        200,
        {
          'Content-Type' => 'text/html',
          'Cache-Control' => 'public, max-age=6400',
        },
        File.open(file, File::RDONLY)
      ]
    }
  end
end
You shouldn't need the map part.
run Proc.new { |env|
  [
    200,
    {
      'Content-Type' => 'text/html',
      'Cache-Control' => 'public, max-age=6400'
    },
    File.open('archive' + env['PATH_INFO'], File::RDONLY)
  ]
}