I want to read a CSV file from WASB (the storage account's blob storage) using SparkR. I'm using a Jupyter notebook to implement it. It would be great if anyone could help me with examples.
The file can be read using the following command, where the path is the blob's WASB URL (of the form wasb://<container>@<account>.blob.core.windows.net/<path/to/file.csv>):
iris_data <- read.df(path = "", source = "com.databricks.spark.csv")
It is working perfectly. Thanks, everyone.
Azure Storage doesn't provide an R client library. To get a blob using R, you could call the REST API provided by Azure Storage. The HTTP request message would look like this:
GET https://{accountname}.blob.core.windows.net/{containername}/{blobname} HTTP/1.1
x-ms-version: 2016-05-31
x-ms-date: Thu, 20 Apr 2017 02:21:00 GMT
Authorization: SharedKey {accountname}:{yoursharedkey}
Host: {accountname}.blob.core.windows.net
The content of the blob will be returned in the response body. You could search the web for how to send a REST request from SparkR/R.
Bing Search: Call REST API from R
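For illustration, here is a rough sketch of that request with the SharedKey signature computed by hand. It is shown in C# with HttpClient only to make the signing steps concrete; the same steps can be reproduced from R or any other language that has an HTTP client and HMAC-SHA256 support. The account name, key, container and blob names are placeholders.
// Sketch only: GET a blob and sign the request with the storage account key.
// Replace the placeholder account, key, container and blob values.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Security.Cryptography;
using System.Text;
using System.Threading.Tasks;

class BlobDownload
{
    static async Task Main()
    {
        string account   = "{accountname}";
        string key       = "{base64accountkey}";   // access key from the portal
        string container = "{containername}";
        string blob      = "{blobname}";

        string date    = DateTime.UtcNow.ToString("R");   // RFC 1123, e.g. "Thu, 20 Apr 2017 02:21:00 GMT"
        string version = "2016-05-31";

        // String to sign for a GET that carries only x-ms-date and x-ms-version headers:
        // verb, eleven empty standard headers, canonicalized x-ms headers, canonicalized resource.
        string stringToSign =
            "GET\n" +
            "\n\n\n\n\n\n\n\n\n\n\n" +
            $"x-ms-date:{date}\nx-ms-version:{version}\n" +
            $"/{account}/{container}/{blob}";

        string signature;
        using (var hmac = new HMACSHA256(Convert.FromBase64String(key)))
            signature = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));

        var request = new HttpRequestMessage(HttpMethod.Get,
            $"https://{account}.blob.core.windows.net/{container}/{blob}");
        request.Headers.Add("x-ms-date", date);
        request.Headers.Add("x-ms-version", version);
        request.Headers.Authorization = new AuthenticationHeaderValue("SharedKey", $"{account}:{signature}");

        using var client = new HttpClient();
        var response = await client.SendAsync(request);
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}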
Google released this blog post which says:
If you authorize download requests to the Drive API using the access
token in a query parameter, you will need to migrate your requests to
authenticate using an HTTP header instead. Starting January 1, 2020,
download calls to files.get, revisions.get and files.export endpoints
which authenticate using the access token in the query parameter will
no longer be supported, which means you’ll need to update your
authentication method.
and then says:
For file downloads, redirect to the webContentLink which will instruct
the browser to download the content. If the application wants to
display the file to the user, they can simply redirect to the
alternateLink in v2 or webViewLink in v3.
However, if we use webContentLink then we will hit the 100 MB virus-scan warning page mentioned here.
I can see that the migration has been delayed; however, sooner or later this will happen, and we want to future-proof the application.
How will we be able to download content without hitting the 100 MB virus-scan limit after this change is implemented?
If you authorize download requests to the Drive API using the access token in a query parameter, you will need to migrate your requests to authenticate using an HTTP header instead.
Example using the access token as a query parameter:
GET https://www.googleapis.com/drive/v3/files/[FILEID]?access_token=[YOUR_ACCESS_TOKEN] HTTP/1.1
Accept: application/json
Example using a request header:
GET https://www.googleapis.com/drive/v3/files/[FILEID] HTTP/1.1
Authorization: Bearer [YOUR_ACCESS_TOKEN]
Accept: application/json
Assuming that you can use the HTTP header option, you should not have any issues with the download, as mentioned. The download issues only come into play if you can't add the authorization header, in which case I think you would need to go with the second option and export the files directly.
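As a rough sketch (not Google's official sample), the same header-based call can be used to fetch the file content by adding alt=media to files.get; the file ID, token, and output path below are placeholders.
// Sketch only: download Drive file content with the token in a header
// instead of a query parameter. [FILEID] and [YOUR_ACCESS_TOKEN] are placeholders.
using System;
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class DriveDownload
{
    static async Task Main()
    {
        using var client = new HttpClient();
        client.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", "[YOUR_ACCESS_TOKEN]");

        // alt=media asks files.get for the file bytes rather than its metadata.
        var url = "https://www.googleapis.com/drive/v3/files/[FILEID]?alt=media";
        using var response = await client.GetAsync(url);
        response.EnsureSuccessStatusCode();

        using var output = File.Create("downloaded-file");
        await response.Content.CopyToAsync(output);
    }
}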
I am testing an API. I need to upload an image along with an API key. I can upload the image separately, but I can't post the image and the API key together.
{
"apikey" : "12345kjl",
"image" : ""
}
It depends on your server implementation. In some cases you need to post a Base64-encoded image, in which case you can use a combination of the __FileToString() and __base64Encode() functions, like:
{
"apikey" : "12345kjl",
"image" : "${__base64Encode(${__FileToString(/path/to/the/file,,)},)}"
}
Another option is building a multipart POST request manually, so it will look like:
--boundary
Content-Type: application/json; charset=UTF-8
JSON Metadata
--boundary
Content-Type: file MIME type
File content
--boundary--
See the Testing REST API File Uploads in JMeter article for step-by-step instructions on implementing it.
In general, you should capture the "real" request using a sniffer tool like Fiddler or Wireshark and configure JMeter to send the same request (apart from dynamic parameters).
Since I was able to upload the image from Postman, I used JMeter to record the script from Postman and used the exact recorded configuration in the JMeter script.
I sent "apikey" from the Parameters section and the image from "File Upload".
I didn't add a Header Manager.
Actually, the Content-Type in the Header Manager was causing the problem.
I am trying to import an API into the Azure API Management service using the API Management REST API.
I tried the following two approaches but faced issues:
1) Using the Azure API Management XML structure.
-> I exported an existing API as XML from the portal
-> Created classes using the xsd tool. The implementation of calling the REST API is here.
Error:
{StatusCode: 400, ReasonPhrase: 'Bad Request', Version: 1.1, Content: System.Net.Http.StreamContent, Headers:
{
Date: Fri, 21 Aug 2015 09:58:10 GMT
Content-Length: 309
Content-Type: application/json; charset=utf-8
}}
2) I did the same thing with my own XML structure, which is available here,
but I am facing the same issue. Kindly help me with this.
Please refer to the discussion here: https://social.msdn.microsoft.com/Forums/azure/en-US/d62a4351-7ef3-4a9d-bb3f-beea6d90d321/how-to-import-api-from-xml-file-using-azure-api-management-rest-api?forum=azureapimgmt
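Whichever XML structure you send, it also helps to log the body of the 400 response (the application/json content shown above), since it spells out which part of the payload was rejected. A minimal sketch, assuming you already build the request with HttpClient:
// Sketch only: read the error detail that comes back with a 400 from the
// API Management REST API. 'request' stands for the PUT/POST you already build.
using System;
using System.Net.Http;
using System.Threading.Tasks;

static class ApimImportHelper
{
    public static async Task<string> SendAndReportAsync(HttpClient client, HttpRequestMessage request)
    {
        var response = await client.SendAsync(request);
        string body = await response.Content.ReadAsStringAsync();
        if (!response.IsSuccessStatusCode)
        {
            // The JSON body of the error response explains why the import was rejected.
            Console.WriteLine($"{(int)response.StatusCode} {response.ReasonPhrase}: {body}");
        }
        return body;
    }
}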
I am trying to consume the Rally REST web service in a Windows Phone app. I successfully fetched data using the URL "https://rally1.rallydev.com/slm/webservice/1.36/task?query=((Owner.Name = {0}) and (State != Completed))&order=Rank&fetch=true&stylesheet=/slm/doc/webservice/browser.xsl" and, using LINQ to XML, I am able to read the data. However, I am not able to consume the Create, Update and Delete operations. Can someone share the code to consume the services below in C#?
Create PUT/POST
XML https://rally1.rallydev.com/slm/webservice/1.37/task/create
JSON https://rally1.rallydev.com/slm/webservice/1.37/task/create.js
Update POST
XML https://rally1.rallydev.com/slm/webservice/1.37/task/ObjectID
JSON https://rally1.rallydev.com/slm/webservice/1.37/task/ObjectID.js
Delete DELETE
XML https://rally1.rallydev.com/slm/webservice/1.37/task/ObjectID
JSON https://rally1.rallydev.com/slm/webservice/1.37/task/ObjectID.js
Thanks,
Sunil
Try RestSharp; it supports all the HTTP operations and is available for Windows Phone 7.
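As a rough sketch (not official Rally sample code), the create/update/delete calls could look like this with RestSharp, assuming HTTP Basic authentication against WSAPI 1.37; the credentials, the ObjectID and the JSON field values are placeholders.
// Sketch only: create, update and delete a task via the Rally WSAPI JSON endpoints.
// Credentials, the ObjectID and the field values are placeholders.
using RestSharp;

public class RallyTasks
{
    public void Run()
    {
        var client = new RestClient("https://rally1.rallydev.com/slm/webservice/1.37");
        client.Authenticator = new HttpBasicAuthenticator("user@example.com", "password");

        // Create: POST to task/create.js with the new values wrapped in a "Task" object.
        var create = new RestRequest("task/create.js", Method.POST);
        create.AddParameter("application/json",
            "{\"Task\": {\"Name\": \"New task\", \"State\": \"Defined\"}}",
            ParameterType.RequestBody);
        client.ExecuteAsync(create, response =>
        {
            // response.Content holds the CreateResult JSON, including the new task's ref/ObjectID.
        });

        // Update: POST to task/{ObjectID}.js with only the fields that change.
        var update = new RestRequest("task/12345678901.js", Method.POST);
        update.AddParameter("application/json",
            "{\"Task\": {\"State\": \"Completed\"}}", ParameterType.RequestBody);
        client.ExecuteAsync(update, response => { });

        // Delete: DELETE task/{ObjectID}.js.
        var delete = new RestRequest("task/12345678901.js", Method.DELETE);
        client.ExecuteAsync(delete, response => { });
    }
}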
Context
We wish to "replay" web server access logs to generate load tests. JMeter came to mind, as I'd recently read blog posts about using JMeter in the cloud (e.g. firing up a number of Amazon EC2 instances to generate the load).
For years I had heard of JMeter's ability to replay access logs, but in reviewing this feature I found the following.
Access Log Sampler
DOES:
recreate sessions, i.e. handle the jsessionId token (though it tries to approximate sessions by IP address);
DOES NOT:
handle POST data (even if you could configure Apache/Tomcat to write POST data to the access log, the JMeter Access Log Sampler only handles the 'common' log format).
Post data would go a long way toward recreating actual load.
Additionally, the documentation describes the Access Log Sampler as "alpha code" even though it's 8 years old. It doesn't seem actively maintained. (That's longer than Gmail's beta.)
httperf
Another blog post pointed me to the httperf tool. I've started to read up on it:
blog: http://www.igvita.com/2008/09/30/load-testing-with-log-replay/
httpperf: http://code.google.com/p/httperf/
Summary
What's the best way to generate load testing 'scripts' from real user data?
What has worked best for you?
Pros and cons of various tools?
JMeter + HTTP Raw Request + Raw Data Source works well for me.
I will describe how we solve this problem using our own load-testing tool, Yandex.Tank.
It can handle a plain access.log, but only 'GET' requests. When we need to make other types of requests, we use other ammo formats (an ammo file contains all the requests that we are going to send to our server). Example:
342
POST / HTTP/1.1^M
Host: xxx.xxx.xxx.xxx:8080^M
Connection: keep-alive^M
Keep-Alive: 300^M
Content-Type: multipart/form-data; boundary=AGHTUNG^M
Content-Length: 1400^M
Connection: Close^M
^M
--AGHTUNG^M
Content-Disposition: form-data; name="fp"; filename="fp_tank"^M
Content-Type: application/octet-stream^M
Content-Transfer-Encoding: binary^M
...
--AGHTUNG--^M
The number ('342') on the first line is the size of the request that follows. The request is in its raw format. You could write a simple script in your favourite language that generates such ammo files from your access.log and then use it for load testing (see the sketch below).
This ammo format makes it really flexible. For example, this code generates ammo from FCGI logs (POST bodies are Base64-encoded). On the other hand, you will need to handle sessions manually.
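For illustration, a minimal sketch of such a generator: the two requests below are hard-coded stand-ins for whatever you reconstruct from your access.log (the parsing itself is up to you), and each one is written out with its byte size on the line before it.
// Sketch only: write requests into a size-prefixed ammo file.
// The two requests below stand in for whatever you rebuild from access.log.
using System.IO;
using System.Text;

class AmmoGenerator
{
    static void Main()
    {
        string[] requests =
        {
            "GET /index.html HTTP/1.1\r\nHost: example.com\r\n\r\n",
            "POST /submit HTTP/1.1\r\nHost: example.com\r\n" +
            "Content-Type: application/x-www-form-urlencoded\r\nContent-Length: 7\r\n\r\na=1&b=2"
        };

        using var ammo = new StreamWriter("ammo.txt", false, new UTF8Encoding(false));
        ammo.NewLine = "\n";
        foreach (var request in requests)
        {
            // First line: byte size of the raw request that follows, then the request itself.
            ammo.WriteLine(Encoding.UTF8.GetByteCount(request));
            ammo.WriteLine(request);
        }
    }
}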
You can easily replay access logs with POST data using ZebraTester. It has many plugins similar to JMeter, as well as the ability to add inline scripts with which you can easily target the POST payload, URLs, timestamps, etc. from the access logs. You can run load tests directly from the tool locally, or copy the recorded script to the SaaS portal to run massive million-virtual-user load tests.