I am trying to handle file uploads for a web app through Cloud Code. I am facing the following issues:
We can't add third-party middleware such as busboy to parse the uploads.
Express's built-in features such as req.files don't seem to work with the body parser parse.com provides.
I don't want to expose my app key in the client code.
I wanted to know if there is any other way to handle this.
Parse Cloud is not a Node environment, so it is no surprise that it does not support npm modules.
The middleware Parse provides for express.js does not support file uploads. Instead, you need to send the file contents as base64 to your endpoint and create a Parse.File object from that data. More info here.
Your app and client keys (except for the master key) are PUBLIC INFORMATION and NOT secrets. This is clearly stated in Parse's documentation, and you cannot hide them at all. Use CLPs, ACLs, and Cloud Code to protect your data from unauthorised access.
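To make the base64 approach concrete, here is a minimal sketch of a Cloud Code function. The function name "saveFile", the parameter names, and the "Upload" class are made up for illustration, and it is written in the newer async style (the original parse.com runtime used request/response callbacks instead):

// Sketch only: "saveFile", "fileName", "base64Data" and "Upload" are illustrative names.
// Parse is provided as a global inside the Cloud Code environment.
Parse.Cloud.define("saveFile", async (request) => {
  const { fileName, base64Data } = request.params;

  // Parse.File accepts the raw contents as a { base64: ... } payload.
  const file = new Parse.File(fileName, { base64: base64Data });
  await file.save();

  // Attach the file to an object so it can be queried and protected with ACLs later.
  const upload = new Parse.Object("Upload");
  upload.set("file", file);
  upload.set("owner", request.user);
  await upload.save();

  return file.url();
});

On the web client you can produce the base64 string with FileReader.readAsDataURL and strip the "data:...;base64," prefix before calling the function.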
I'm working on an Absinthe GraphQL API for my app. I'm still learning the procedure (so please go easy on me).
I have an Absinthe/GraphQL MyAppWeb.schema.ex file that I use for my queries and mutations. My question is: how do I use this API to authenticate the user on both the mobile and web apps?
How do I set a cookie (httpOnly and secure) in my web app, and access/refresh tokens for the mobile app, from a single Absinthe API that serves both my website and my mobile app? Basically, what I'm trying to learn is how to authenticate the user based on the specific platform.
If my question sounds a bit confusing, I would be happy to provide more information. I would really be grateful if someone could explain the procedure; I've been stuck on this for a while.
I would avoid using authentication mechanisms provided by Absinthe (if there are any). Depending on what front-end you are using, I would go with JSON API authentication. The flow on the server goes the following way:
Create an endpoint for login that receives a username and password and returns a refresh token.
Create an endpoint for exchanging a refresh token for an access token.
Use a library like Guardian to generate your refresh/access tokens.
Create a Phoenix plug for authentication that checks your tokens; Guardian has some built-in plugs for this.
Now on the device you have to implement:
The ability to save the refresh and access tokens on the device.
A global handler for injecting the access token into authorized requests.
A global handler for the case when the access token has expired (you usually check whether a request returns Unauthorized, and if so, request a new access token from the server using your refresh token), as sketched below.
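A rough client-side sketch of the last two handlers, assuming a fetch-based HTTP layer; the "/api/token" path and the response field names are placeholders for whatever your server exposes:

// Sketch only: "/api/token" and the token field names are placeholders.
type Tokens = { access: string; refresh: string };

async function refreshAccessToken(tokens: Tokens): Promise<Tokens> {
  // Exchange the refresh token for a fresh access token.
  const res = await fetch("/api/token", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ refresh_token: tokens.refresh }),
  });
  if (!res.ok) throw new Error("Refresh token rejected; force a re-login");
  const body = await res.json();
  return { access: body.access_token, refresh: tokens.refresh };
}

// Global handler: inject the access token and retry once on 401 after refreshing.
async function authorizedFetch(url: string, tokens: Tokens, init: RequestInit = {}): Promise<Response> {
  const withAuth = (t: Tokens): RequestInit => ({
    ...init,
    headers: { ...(init.headers as Record<string, string>), Authorization: `Bearer ${t.access}` },
  });

  let res = await fetch(url, withAuth(tokens));
  if (res.status === 401) {
    tokens = await refreshAccessToken(tokens);
    res = await fetch(url, withAuth(tokens));
  }
  return res;
}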
This may seem like a crude implementation; however, I would advise implementing your own system instead of using a black-box library when you have no idea how it works under the hood.
I'm trying to figure out how to upload a file to Google Cloud Storage using the REST API; I don't want to use the client library.
I read the documentation, but it was not helpful for a beginner in this field.
Can anyone give me a step-by-step guide on how to do this, and show what the URL/header/body format should look like? An example would also be very helpful.
If you're not going to use any of the helper libraries and are also a beginner, the hardest part of implementing an upload to GCS will likely be authenticating yourself. Let's ignore that for now.
The simplest way to upload an object to Google Cloud Storage is to make an HTTPS call to storage.googleapis.com that looks like this:
PUT /your-bucket-name/your-object.txt HTTP/1.1
Authorization: (YOUR ACCESS TOKEN GOES HERE)
Content-Length: 20
Content-Type: text/plain-or-whatever; charset=utf-8
Host: storage.googleapis.com
User-Agent: YourApplication/1.0
This is a test file
That will upload a file named "your-object.txt" of type "text/plain-or-whatever" to the bucket "your-bucket-name", with the contents "This is a test file."
If your bucket allows anonymous users to upload files (you shouldn't do that), then just don't include the Authorization line and you're done.
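For reference, the same request made from code, as a sketch using the global fetch available in modern browsers and Node 18+; the bucket name, object name, and accessToken variable are placeholders (see below for how to obtain a token):

// Sketch: "your-bucket-name", "your-object.txt" and accessToken are placeholders.
async function uploadTextObject(accessToken: string): Promise<void> {
  const res = await fetch("https://storage.googleapis.com/your-bucket-name/your-object.txt", {
    method: "PUT",
    headers: {
      Authorization: `Bearer ${accessToken}`,
      "Content-Type": "text/plain; charset=utf-8",
    },
    body: "This is a test file\n",
  });
  if (!res.ok) throw new Error(`Upload failed: ${res.status} ${await res.text()}`);
}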
Now, since you really don't want to use any client libraries, and that presumably includes Google's OAuth libraries, you're going to need to implement authorization yourself, so let me give you an overview.
First, though, if you want to try this out immediately, install the "gcloud" tool, log in with "gcloud auth login", and then print an access token with gcloud auth print-access-token. Then use the Authorization header Authorization: Bearer whatever.gcloudprintedout. That way you can be off and running with GCS quickly. But the token will only last an hour or so, so you'll need to implement OAuth for real.
Google Cloud APIs use OAuth to handle their requests, which is a powerful but not simple auth mechanism. There's extensive documentation on how OAuth with Google works: https://developers.google.com/identity/protocols/OAuth2
And there's also more general information on authorizing Google Cloud requests: https://cloud.google.com/docs/authentication
If you are running your application on a Google Cloud technology like App Engine or GCE, auth will be somewhat easier, but I will assume you're running this on your own machine. I will further assume that you want your application to have its own identity, rather than simply having you log in as part of the upload flow. For such a case, you'll need a service account, which will have an associated private key.
The basic flow for a service account is that you will create a JWT request for access credentials, then cryptographically sign that request with your private key, then send that signed request to Google. It will return you a token that may then be passed to your actual upload request later. You can keep using that token until it expires, at which time you'll need to build another JWT request to request another token.
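Here is a hedged sketch of that service-account flow in Node/TypeScript. The service account email, key file path, and scope are placeholders you would replace with your own; the token endpoint and the JWT-bearer grant type follow Google's documented OAuth 2.0 service-account flow:

import { createSign } from "node:crypto";
import { readFileSync } from "node:fs";

// Placeholders: substitute your own service account email and private key file.
const SERVICE_ACCOUNT = "uploader@my-project.iam.gserviceaccount.com";
const PRIVATE_KEY_PEM = readFileSync("service-account-key.pem", "utf8");

function base64url(input: string | Buffer): string {
  return Buffer.from(input).toString("base64url");
}

async function getAccessToken(): Promise<string> {
  const now = Math.floor(Date.now() / 1000);
  const header = { alg: "RS256", typ: "JWT" };
  const claims = {
    iss: SERVICE_ACCOUNT,
    scope: "https://www.googleapis.com/auth/devstorage.read_write",
    aud: "https://oauth2.googleapis.com/token",
    iat: now,
    exp: now + 3600, // the returned token is short-lived; request a new one after expiry
  };

  // Sign "<header>.<claims>" with the service account's RSA private key (RS256).
  const unsigned = `${base64url(JSON.stringify(header))}.${base64url(JSON.stringify(claims))}`;
  const signature = createSign("RSA-SHA256").update(unsigned).sign(PRIVATE_KEY_PEM);
  const jwt = `${unsigned}.${base64url(signature)}`;

  // Exchange the signed JWT for a bearer access token.
  const res = await fetch("https://oauth2.googleapis.com/token", {
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: new URLSearchParams({
      grant_type: "urn:ietf:params:oauth:grant-type:jwt-bearer",
      assertion: jwt,
    }),
  });
  const body = await res.json();
  return body.access_token; // use as "Authorization: Bearer <token>" on uploads
}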
Again, the client libraries entirely take care of this whole process for you. I am describing the approach of implementing everything exclusively on your own.
You can find the same example here:
https://stackoverflow.com/a/53955058/4345389
where I already explained how to upload a file to Google Cloud Storage using the REST API.
Are there any methods to prevent a web client from filling my Parse cloud storage with dummy files? As far as I can see, creating a Parse.File (without an object reference) requires only a valid API key, which is plain text on the client side. Does this imply that anyone can impersonate my valid JS client using the API key, or did I miss something in the API reference?
I am trying to upload files directly to S3, but as per my research it needs server-side code or a dependency on Facebook, Google, etc. Is there any way to upload files directly to Amazon using Fine Uploader only?
There are three ways to upload files directly to S3 using Fine Uploader:
Allow Fine Uploader S3 to send a small request to your server before each API call it makes to S3. Your server responds with a signature that Fine Uploader needs to make the request. This signature ensures the integrity of the request, and producing it requires your secret key, which should not be exposed client-side (see the sketch after this list). This is discussed here: http://blog.fineuploader.com/2013/08/16/fine-uploader-s3-upload-directly-to-amazon-s3-from-your-browser/.
Ask Fine Uploader to sign all requests client-side. This is a good option if you don't want Fine Uploader to make any requests to your server at all. However, it is critical that you don't simply hardcode your AWS secret key. Again, this key should be kept a secret. By utilizing an identity provider such as Facebook, Google, or Amazon, you can request very limited and temporary credentials which are fed to Fine Uploader. It then uses these credentials to submit requests to S3. You can read more about this here: http://blog.fineuploader.com/2014/01/15/uploads-without-any-server-code/.
The third way to upload files directly to S3 using Fine Uploader is to either generate temporary security credentials yourself when you create a Fine Uploader instance, or simply hard-code them in your client-side code. I would suggest you not hard-code security credentials.
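For context, here is a minimal client-side sketch of option 1. The bucket endpoint, access key value, and the /s3/signature path are placeholders for your own values; the option names follow Fine Uploader S3's request/signature configuration, and the secret key stays on your server:

// Sketch only: endpoint, accessKey and the /s3/signature path are placeholders.
declare const qq: any; // global provided by the Fine Uploader S3 script

const uploader = new qq.s3.FineUploader({
  element: document.getElementById("fine-uploader"),
  request: {
    endpoint: "https://my-upload-bucket.s3.amazonaws.com",
    accessKey: "AKIA-PUBLIC-ACCESS-KEY-ONLY", // the public key, never the secret
  },
  signature: {
    // Fine Uploader POSTs each policy/string-to-sign here; your server signs it
    // with the AWS secret key and returns the signature.
    endpoint: "/s3/signature",
  },
});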
Yes, you can do this with Fine Uploader. Here is a link that explains very well what you need to do: http://blog.fineuploader.com/2013/08/16/fine-uploader-s3-upload-directly-to-amazon-s3-from-your-browser/
Here is what you need. In this blog post the Fine Uploader team introduces serverless S3 uploads via JavaScript: http://blog.fineuploader.com/2014/01/15/uploads-without-any-server-code/
Since the Google Search API has been deprecated, I'd like to use the Bing Search API (now a Windows Azure API) in my Ruby apps.
However, Azure has a strange authentication pattern where you build a query URI, paste it into a browser, enter the key into the password box of the standard HTTP authentication dialog, and make a POST to see the results. I assume this generates a signature and passes it in a header somehow. I'd like to do the complete process in Ruby and skip the browser portion if possible.
I found one example in the source of an obscure Windows Azure storage gem, but I can't figure out how they're building the signature and making the call. Is there a simple way to do basic HTTP auth in Ruby?
I went ahead and used Faraday's built-in basic authentication scheme like so:
require 'faraday'
# Build the connection and attach HTTP Basic credentials to every request.
connection = Faraday.new "http://api.something.com/1/dudez"
connection.basic_auth "username", "password"
connection.get
I want to recommend the RestClient gem for this. I've used it with great success for GET'ing and POST'ing across domains. If you really have to act like a browser to implement the API, you can always use Capybara.
I'm sorry I haven't tried the Azure API myself, or I would give an example. :)
I recall doing this previously with another Azure API but am unable to find the code.
Look here for the details of the signature process:
http://msdn.microsoft.com/en-us/library/windowsazure/ee395415.aspx
I'm unable to tell immediately whether the Azure API uses the SharedSignature method.
The way to sign a request to Windows Azure blob storage through the REST API is described here: http://msdn.microsoft.com/en-us/library/dd179428.aspx.
Basically, you don't authenticate by simply adding some credentials to an HTTP header; you have to sign your request with the secret key that is associated with your storage account.
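As a heavily simplified illustration of what "sign your request" means, here is the HMAC step only, sketched with Node's crypto module (Ruby's OpenSSL::HMAC can do the same); the real Shared Key scheme requires the exact canonicalized string-to-sign described on that page, which is not shown here:

import { createHmac } from "node:crypto";

// Simplified sketch: the real string-to-sign must follow the canonicalization
// rules in the linked documentation; this only shows the signing step itself.
function sharedKeySignature(stringToSign: string, accountKeyBase64: string): string {
  const key = Buffer.from(accountKeyBase64, "base64"); // the storage account key
  return createHmac("sha256", key).update(stringToSign, "utf8").digest("base64");
}

// The resulting header then looks like:
//   Authorization: SharedKey <account-name>:<signature>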