I have created a collection in my Fauna database. I also made an index that returns all the data in that collection. I want to know if there is a way to get a JSON REST API working with this data. What I mean is that I want a URL I can GET using XMLHttpRequest and then parse as JSON.
You might say that Fauna is "HTTP native". All of the language drivers (JS, C#, Python, etc.) are built on top of HTTP requests; there are no special connections or connection pools to maintain. You can get pretty far using the JS driver in the browser, for example, without using a server.
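For example, here is a minimal sketch of that "no server" approach with the JS driver in the browser; the index name ('all_items') and the key secret are placeholders for your own values:

import faunadb from 'faunadb';
const q = faunadb.query;

const client = new faunadb.Client({ secret: 'YOUR_CLIENT_KEY_SECRET' });

// Page through the index and fetch each document; the result is already parsed JSON.
client
  .query(
    q.Map(
      q.Paginate(q.Match(q.Index('all_items'))),
      q.Lambda('ref', q.Get(q.Var('ref')))
    )
  )
  .then((page) => console.log(page.data))
  .catch(console.error);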
But to answer your question more directly: there are many ways to serve a REST API, and the right choice depends on how you want to serve it.
Examples
AWS
Here is a blog post on how to set up a REST API with AWS App Runner, for example.
https://fauna.com/blog/deploying-a-rest-api-with-aws-app-runner-and-fauna
Vercel and Netlify
There are also some example apps on GitHub that demonstrate a REST API with
Vercel:
https://github.com/vercel/next.js/tree/canary/examples/with-cookie-auth-fauna
Or Netlify:
https://github.com/netlify/netlify-faunadb-example
Both platforms host serverless functions that you can use to create a REST API.
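As a rough sketch, a Netlify function along these lines would expose the index as a GET endpoint that you can fetch with XMLHttpRequest and parse as JSON (the index name and the FAUNA_SECRET environment variable are assumptions about your setup):

const faunadb = require('faunadb');
const q = faunadb.query;

const client = new faunadb.Client({ secret: process.env.FAUNA_SECRET });

// GET /.netlify/functions/items -> JSON array of documents from the index
exports.handler = async () => {
  try {
    const page = await client.query(
      q.Map(
        q.Paginate(q.Match(q.Index('all_items'))),
        q.Lambda('ref', q.Get(q.Var('ref')))
      )
    );
    return { statusCode: 200, body: JSON.stringify(page.data) };
  } catch (err) {
    return { statusCode: 500, body: JSON.stringify({ error: err.message }) };
  }
};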
Related
My project is currently built with MySQL, Prisma, Apollo Server, and Nexus, and I need to receive raw data posted via a REST API, not through the GraphQL setup that is already developed. I want to process raw data passed in a POST to a path (for example, /api/data/status). Is there a way to create a REST API on the apollo-server?
The apollo-server runs in a Node environment, so you are able to use any HTTP client you want, for example (a minimal sketch follows this list):
axios
node-fetch
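For instance, a minimal node-fetch sketch for forwarding a raw payload from Node; the URL and payload shape are illustrative only, and axios works the same way:

const fetch = require('node-fetch');

async function postStatus(rawBody) {
  // Forward the raw data as JSON to the REST path from the question.
  const res = await fetch('http://localhost:4000/api/data/status', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(rawBody),
  });
  return res.json();
}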
Is it possible to trigger a file download in a browser from the GraphQL endpoint on an apollo-server-express application?
I have the endpoint written as a standard Express app.get handler (see below), but I would like to make use of the GraphQL context for the file download, so I'm wondering whether it's possible to trigger a download from a GraphQL endpoint.
Here's a bare-bones example of what I have on the express end in the app.get function:
app.get('/download-batch/:batchId', async (req, res) => {
  // Buffer.from() replaces the deprecated new Buffer() constructor
  res.send(Buffer.from('test'));
});
Any help would be much appreciated. Thanks!
Yes, but you would need to create a custom endpoint for that. You can't use the existing endpoint that serves your GraphQL requests.
On the custom endpoint you would add middleware and process the data into a buffer (or whatever format you need). That said, it is not really recommended: you end up with yet another endpoint that you could just as well serve from a plain REST API, and GraphQL is built mainly around the idea of a single endpoint.
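As a rough sketch of that approach (buildContext and the batches data source are hypothetical stand-ins for whatever context and data-access logic your Apollo server already uses):

app.get('/download-batch/:batchId', async (req, res) => {
  // Rebuild the same context your GraphQL resolvers get.
  const context = await buildContext({ req });
  const batch = await context.dataSources.batches.getById(req.params.batchId);

  // Tell the browser to download the response as a file.
  res.set('Content-Type', 'text/csv');
  res.set('Content-Disposition', `attachment; filename="batch-${req.params.batchId}.csv"`);
  res.send(Buffer.from(batch.csv));
});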
Boštjan Cigan mentions some solutions here and gives details about using GraphQL as a proxy with Minio: the backend asks Minio to generate a temporary link that can be sent back to the browser for direct access.
This is a valid solution for many use cases.
I have recently been experimenting with building a cross-domain web API, and wow, has it been a bumpy journey. I have not had any problems with modern browsers such as Chrome, FF, and Safari. The problem is with IE, which requires you to use XDR as opposed to $.ajax when making cross-domain calls. First question: if I were using Backbone.js, what is the recommended way of making cross-browser and cross-domain AJAX calls?
Another problem I had with IE is that when you make cross-domain AJAX requests, IE has a bunch of restrictions and limitations, such as "Only text/plain is supported for the request's Content-Type header". Because of that, I was unable to bind to my model using the C# MVC framework unless I bound it manually.
Anyway, my second and last question is: how do companies like Instagram, Facebook, and Twitter go about building their APIs? I am not looking for a complete guide; I just want to know how difficult it is.
JSONP
The current de facto standard is JSONP. It is basically a trick to send a JSON payload wrapped in a single JavaScript function call; the browser treats the response like a script file and executes it.
CORS
Moving forward, the way to go is CORS. Sadly, browser support (notably in IE) isn't there yet, and there are still some implementation differences between the modern browsers that do implement it.
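For reference, enabling CORS mostly comes down to response headers on the API side; here is a minimal Express-style sketch, where the allowed origin, methods, and headers are placeholders for your own policy:

app.use((req, res, next) => {
  res.set('Access-Control-Allow-Origin', 'https://www.example.com');
  res.set('Access-Control-Allow-Methods', 'GET, POST, PUT, DELETE');
  res.set('Access-Control-Allow-Headers', 'Content-Type');
  // Answer preflight requests without hitting the rest of the app.
  if (req.method === 'OPTIONS') return res.sendStatus(204);
  next();
});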
HTTP Method Overloading
Some APIs overload GET and POST requests using X-HTTP-Method-Override: PUT or ?_method=PUT.
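A sketch of the header variant with jQuery; the URL is illustrative, and the server has to be configured to honour the override (or the ?_method query string equivalent):

$.ajax({
  url: 'https://api.example.com/articles/42',
  type: 'POST', // tunnelled as PUT via the override header
  headers: { 'X-HTTP-Method-Override': 'PUT' },
  contentType: 'application/json',
  data: JSON.stringify({ title: 'Updated title' })
});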
easyXDM
A number of API providers implement easyXDM. This tends to be used more when they provide a JavaScript API or widget API, where developers load their JS and integrate it directly into the front-end code.
So, I implemented an API provider to be accessed by both web and mobile applications.
Most likely this will not be a large scale project, but I want to maximize my learning experience and geek out where I can.
Anyway, from what I understand, it seems like it's better to put the API provider service and the actual website on separate domains to make scaling easier.
For example, Twitter has the website at twitter.com and the API at api.twitter.com.
One immediate issue would be dealing with the cross-domain issue with AJAX.
From what I gather, there are two ways to implement cross-domain AJAX:
JSONP: I've heard about it, but don't know much beyond the name.
Proxy Server: my website is built on top of ASP.NET MVC, and I was thinking about creating an APIProxy controller to handle all cross-domain API requests.
That way, I would make an AJAX call via $.ajax(settings) and pass in the website URL that corresponds to the APIProxy controller. The APIProxy controller would then make the appropriate POST calls to the API server, process the JSON responses, and return them to the AJAX callback functions.
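For example, the browser-side call might look something like this (the /APIProxy/Call route and parameter names are hypothetical):

$.ajax({
  url: '/APIProxy/Call', // same-origin, so no cross-domain restrictions apply
  type: 'POST',
  dataType: 'json',
  data: { path: '/statuses/update', message: 'hello world' },
  success: function (response) {
    console.log('Response relayed by the proxy:', response);
  }
});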
I heard about flXHR, but I don't want to use Flash because devices like the iPad and a lot of mobile browsers don't support it.
Anyway, I just wanted to ask what some of the best practices are for managing a website with the API provider on a separate domain or subdomain.
When you request some JSON, the server returns an object or array. Script tags are not subject to the same-domain rule, so instead of making an AJAX call, you would essentially do this:
<script src="Http://api.example.com?param1=something&etc"></script>
That would load the JSON, and it would execute as JavaScript.
...But a simple object or array "executing" by itself isn't very useful. So when you request the JSON, you also include the name of a callback function. If the provider sees that a callback was provided, instead of just returning JSON, it actually returns JavaScript: the JSON is passed to your function as an argument.
Then, when the script loads, your function (which you already defined) is called, and given the JSON to work with.
That's JSONP.
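Here is the pattern end to end; the URL and callback name are illustrative, and the provider decides what the callback query parameter is called:

function handleData(json) {
  // Called automatically when the script below finishes loading.
  console.log('Got data:', json);
}

var script = document.createElement('script');
script.src = 'http://api.example.com/items?callback=handleData';
document.head.appendChild(script);

// The provider responds with JavaScript rather than bare JSON, e.g.:
//   handleData({ "items": [1, 2, 3] });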
Bibliography
Newton, Aaron. "Request.JSONP." Clientcide. 7 Dec. 2009. Web. 28 Jan. 2011.
I'm fairly new to WCF but am technically competent.
I am having trouble getting WCF to play nicely. I currently have a WSHttpBinding set up for a service, and it works when using the WCF Test Client supplied with VS2008. What I would like is to have the service accessible from the browser.
I currently return a JSON response from my service but am unable, as of yet, to access the data via a URL. I have seen lots of internet tutorials where they seem to be accessing data a bit like this:
http://localhost/Service.svc/MethodName?param1=value1&param2=value2
If I try to do that I get a 404. I am guessing it is looking for a literal file, but I don't know how to fix it.
Any help you can give would be great, thanks!
You can't do that with WSHttpBinding... you need to expose an endpoint using the WebHttpBinding and have your contract specify the right URI template in the [WebGet] attribute. Here are some pointers to get you started:
Rest in WCF
WebHttpBinding example
WebHttpBinding and JSON