How can I document existing API services with RAML?

We have a project that is nearly done and has hundreds of existing Web API services.
We documented everything, as far as design and example URLs go, on a wiki as we went.
Now that we are near the end, I have been tasked with finding a product that is truly meant for documenting Web API services, like API Blueprint or Swagger.
I migrated a few API services from our wiki to API Blueprint and Apiary.
But that editor is kind of clunky, and I have since stumbled onto RAML; I like its human-readable syntax much more, as well as its reusability features.
Our second goal, however, is to be able to use that same API documentation to perform integration testing.
Does anyone know of any tools that could auto-document our services without me having to manually move everything over? I need an environment that would also support integration testing.

It depends on the language.
For ASP.NET 5 / Web API, you have the RAML .NET Tools.
You can check all the available tools here and filter by language and type.

Related

Why do we use General Interface (GI)?

I am new to the topic of GI. I just want to know why we use GI and what GI is.
General Interface is a proprietary UI framework developed by TIBCO. It offers AJAX support and a rich-client experience. TIBCO used it for one of the AMX BPM interfaces, Workspace, and made it available for implementation via licensing.
This was another era: back then, multiple companies pushed their own (often proprietary) approach to rich-client web UIs that call back-end services. AJAX was all the rage, and everybody tried to establish a framework.
A lot of that changed in the following years as new tools emerged, like Google Web Toolkit (GWT), JSF, Wicket, etc.
Eventually, TIBCO itself used GWT in a second BPM UI, Openspace.
Arguably, most of these frameworks look complex and outdated compared to lighter, current web frameworks such as jQuery, Webix, etc., which are mostly plain JavaScript.

What is the simplest way to convert a WCF Data Services OData provider to Web API?

I'm currently looking at the feasibility of converting our current WCF Data Services OData provider to Web API OData.
I'm just a little confused by the OData implementation for Web API. With WCF Data Services, it sits on top of our ADO.NET entity model, which exposes a bunch of tables from the SQL Server backend; i.e., you point WCFDS at the ADO.NET model and then you have access to all of the tables through the standard OData syntax.
With Web API, from all my reading so far, do we create a controller or separate actions for every table/object that we want to expose? Am I missing something? Is there a way for the OData Web API controller to simply expose the entire model from the ADO.NET data model? Having to create an action for every table would be a mess and overkill.
Currently, if we need to add a table, we just map it in the EDMX and WCFDS automatically exposes it, since it's mapped to the entire context of the model.
Generating the model(s)
You can:
Use the convention model builder from ASP.NET Web API. This generates a different model from the one EF's own convention model builder produces: an EdmLib IEdmModel. See this question, though, if you're using model-first or database-first. This method seems really backwards, and it is, but it mostly works.
Serialize the EF model and rebuild it as an IEdmModel (see this question). Again, this is really inefficient. If you're using model-first or database-first, you'll want to just deserialize the EDMX file to build the IEdmModel. It still produces a different model internally, but at least CSDL is a more stable format than CLR code conventions, so you'll probably have fewer surprises than you'd get when using two different convention-based model builders.
The reason for this is that ASP.NET Web API OData extensions use EdmLib, while EF uses its own code, and there is no plan to make them work together. Maybe you'll find this rant useful if you're curious.
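To make the first of those options concrete, here is a minimal sketch, assuming the Web API OData v3-era packages (ODataConventionModelBuilder and MapODataRoute); the Customer type and the "odata" route prefix are placeholders, not anything from the question:

```csharp
using System.Web.Http;
using System.Web.Http.OData.Builder;
using Microsoft.Data.Edm;

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class ODataConfig
{
    public static void Register(HttpConfiguration config)
    {
        // Web API's own convention builder: produces an EdmLib IEdmModel,
        // independently of whatever model EF has built for the same classes.
        var builder = new ODataConventionModelBuilder();
        builder.EntitySet<Customer>("Customers");   // one line per exposed entity set
        IEdmModel model = builder.GetEdmModel();

        // Expose the model under /odata.
        config.Routes.MapODataRoute("odata", "odata", model);
    }
}
```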
Working on the API
Once you've generated the model from a single source (so that you can work on your model in one place), you'll indeed have to create a controller per entity, basically. The point of Web API is not to build things automagically, but to offer flexibility to the developer. The EntitySetController helps reduce redundancy, but it won't offer everything out of the box.
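As a rough sketch of that per-entity controller pattern, using the EntitySetController base class from the early Web API OData bits (later superseded by ODataController); the override names are from memory, Customer is the same placeholder type as in the previous sketch, and the in-memory list stands in for your real data source:

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Web.Http;
using System.Web.Http.OData;

public class CustomersController : EntitySetController<Customer, int>
{
    // Placeholder data source; in practice this would be your EF context.
    private static readonly List<Customer> Customers = new List<Customer>
    {
        new Customer { Id = 1, Name = "Contoso" },
        new Customer { Id = 2, Name = "Fabrikam" }
    };

    [Queryable]   // allow $filter, $orderby, $top, ... on this set
    public override IQueryable<Customer> Get()
    {
        return Customers.AsQueryable();
    }

    protected override Customer GetEntityByKey(int key)
    {
        return Customers.FirstOrDefault(c => c.Id == key);
    }
}
```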
Taking a step back
In the rant mentioned above, I also talk about the difference between a service-layer API and a data-layer API. ASP.NET Web API is better suited for services, while OData makes services awkward. On the other hand, OData makes data access a breeze (essentially being like a RESTful SQL) and by virtue of being directly attached to the data model, can automate a lot of things as you saw with WCF Data Services. ASP.NET Web API with OData extensions sits in the middle, and its value is not universally agreed upon (using certain bits of OData URI syntax on service APIs is certainly useful though).
Don't get too hyped up by the recent buzz around ASP.NET Web API; it and WCF Data Services are very different beasts that run at different layers in your design. Indeed, in a multi-tier architecture, you could very well see a service API built with ASP.NET Web API sitting on top of an OData API built with WCF Data Services.
My advice is to think carefully about what you're trying to build and, depending on the answer, either choose ASP.NET Web API and embrace the fact that the API you expose will be very different from a data-centric OData API, or stick with WCF Data Services.
A possible plan
You can find a lot of material on the web about service-layer APIs by searching for terms like "non-CRUD web/RESTful/hypermedia API" or by comparing products like ServiceStack, which advocate less data-oriented APIs.
If you're still unsure about the nature of your project, prototype it.
If you end up with a bunch of essentially identical controllers with Web API, each mapped to exactly one entity, then your API is heavily data-oriented. Go with WCF Data Services.
If you end up with a lot of OData actions and awkward entities with WCF Data Services, then you need more domain logic on the server side of the API, and data orientation doesn't offer you enough. Go with Web API. A good rule of thumb here is to treat OData actions just like you treat stored procedures in a SQL DBMS. Actually, treat any OData server as a DBMS, because that's what they are. If you wouldn't put it behind a SQL interface, don't put it behind an OData interface.
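To illustrate the analogy with a sketch (assuming the Web API OData v3-era model builder; "Deactivate" and the Customer/Customers names are placeholders): an OData action is declared on the model, much like a stored procedure is declared on a database, and then implemented by a controller method.

```csharp
using System.Web.Http.OData.Builder;
using Microsoft.Data.Edm;

public static class ActionModel
{
    public static IEdmModel Build()
    {
        var builder = new ODataConventionModelBuilder();
        builder.EntitySet<Customer>("Customers");

        // Declared like a stored procedure: bound to a Customer, returning a Customer.
        builder.Entity<Customer>()
               .Action("Deactivate")
               .ReturnsFromEntitySet<Customer>("Customers");

        return builder.GetEdmModel();
    }

    // The matching controller method would look roughly like:
    //   [HttpPost]
    //   public Customer Deactivate([FromODataUri] int key, ODataActionParameters parameters) { ... }
}
```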
Important (Update)
It was announced on March 27th, 2014 that WCF Data Services would be discontinued by Microsoft in favor of ASP.NET Web API. To handle the "data-layer" use cases I've described here, Microsoft has said that it is planning to extend ASP.NET Web API. Some community efforts are also underway. WCF Data Services will also be open-sourced at some point, so it's not impossible that a new maintainer will take over, though it's uncertain.

ASP.NET MVC REST frameworks

There are a number of REST frameworks around for ASP.NET MVC. Which one is the most mature, in your opinion? Following are a few I briefly looked at, but I couldn't decide.
Snooze
BistroMVC
Restful Service with WCF
OpenRasta
Siesta
REST support built into the ASP.NET MVC SDK
...and there are a few more.
Personally, I would go with the default ASP.NET routing engine, which is built and supported by Microsoft. This ensures that you won't one day find yourself in the position of having to migrate code that has become obsolete because the authors simply decided to abandon the project. Of course, if there is something specific you want to implement that isn't supported out of the box, you could search for alternatives. But as far as exposing a RESTful API is concerned, the routing engine should work just fine.
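For illustration, a minimal sketch of the "just the routing engine" approach, assuming classic ASP.NET MVC; the route, ProductsController, and the anonymous payload are placeholders:

```csharp
using System.Web.Mvc;
using System.Web.Routing;

public static class RouteConfig
{
    public static void RegisterRoutes(RouteCollection routes)
    {
        // A REST-ish URL mapped with the built-in routing engine, no extra framework.
        routes.MapRoute(
            "ProductApi",
            "api/products/{id}",
            new { controller = "Products", action = "Get", id = UrlParameter.Optional });
    }
}

public class ProductsController : Controller
{
    // GET /api/products/42
    public ActionResult Get(int? id)
    {
        var payload = new { Id = id ?? 0, Name = "Sample product" };
        return Json(payload, JsonRequestBehavior.AllowGet);   // serve JSON for GET requests
    }
}
```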
I completely agree with Darin.
But if you're looking for something closer to what WCF offers (a web service, versus a typical web site), I've been extremely happy with WCF REST.
There's a WCF REST Service Template available via Visual Studio's Extension Manager that will get you up and running fairly quickly.
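As a sketch of the kind of service that template scaffolds (hedged: the UriTemplate, type names, and JSON format choice here are placeholders, and the template itself also wires up routing and hosting):

```csharp
using System.ServiceModel;
using System.ServiceModel.Activation;
using System.ServiceModel.Web;

[ServiceContract]
[AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Allowed)]
public class ProductsService
{
    // WebGet/WebInvoke map HTTP verbs and URI templates onto service operations.
    [OperationContract]
    [WebGet(UriTemplate = "products/{id}", ResponseFormat = WebMessageFormat.Json)]
    public Product GetProduct(string id)
    {
        return new Product { Id = int.Parse(id), Name = "Sample product" };
    }
}

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}
```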
OpenRasta
I have been implementing a RESTful service using OpenRasta; only one word to describe it => "pure awesomeness" ...actually, that's two words.
For me the simplicity is a plus: the framework is easy to use and adapt to. Some of its conventions, in my opinion, really helped me understand REST. There are many integration points in the framework, and it is very easy to extend its functionality.
Watching video recordings of Seb's talks is very entertaining as well :) He is very opinionated (in a good way, IMO).
I agree with Darin. Personally, I think Apache Thrift is also an option for client-server communication.

Moving to a New Stack - AJAX, REST & NoSQL

All,
I'm beginning to explore open-source frameworks and tools for building web applications. What should I select and learn for the following layers?
Layer 1
A client-side JavaScript/AJAX library or framework that will invoke REST-style services provided by layer 2.
Layer 2
A framework to rapidly create REST-style services out of existing applications and out of a NoSQL document-oriented database provided by layer 3. I need this layer in cases where I need to expose REST-style services from my traditional apps and RDBMS.
Layer 3
Which NoSQL database should I use: CouchDB or MongoDB? Which would work well with layer 2?
Will I need an MVC framework like RoR, or a web/component framework like Wicket? Am I missing anything?
I also need recommendations for tooling/IDEs (and associated plugins) for the development environment. Thanks in advance for your answers/thoughts.
We've had pretty decent luck using a Java stack:
For the presentation layer, we use jQuery and jQuery UI, with FreeMarker for XHTML/CSS templating, including for the various UIs that invoke REST web services.
Restlet (www.restlet.org) is a wonderfully rich framework for crafting REST web services in Java. We decided to use it on a major product after it was strongly recommended to us by the engineering director of a top 10 e-commerce site in the US. And everything he said about it was true.
Unless you know you're going to face a really large amount of write volume, you're probably best off using one of the tried and true SQL databases supporting ACID transactional guarantees. We used Oracle, then switched to PostgreSQL, using the MyBatis (formerly iBatis) SQL Mapper to shield our code from the details of the database. With the advent of 64-bit addresses and scads of inexpensive DRAM, plus SSDs, these old workhorses do scale quite high.
If you are anticipating very large amounts of writes, by all means consider a so-called "NoSQL" database. I heard very good things about Vertica from the top network ops folks at a major technology company last week. MongoDB and CouchDB both look interesting. Or you may be able to leverage persistent distributed cache technology like Redis or EhCache to offload a traditional database.
The task you're trying to accomplish determines the technology you use.
If you're interested in the .NET platform, consider:
RavenDB for the data layer.
WCF Data Services for your REST services layer. Here's a guide to WCF Data Services with some great video tutorials.
If you need a web front-end, consider ASP.NET MVC or plain HTML as you see fit. jQuery and AJAX modules are well suited to both.
Arguably, ASP.NET MVC IS a REST endpoint of sorts, so you could skip the WCF services.
All the above can be leveraged with free tools, including:
Visual Studio Express
RavenDB's HTML interface at http://localhost:8080/
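For the REST services layer, here is a minimal sketch of a WCF Data Services endpoint, assuming an Entity Framework context named MyEntities (a placeholder); note that RavenDB also ships its own HTTP interface, so this layer mostly matters when exposing traditional RDBMS data:

```csharp
using System.Data.Services;
using System.Data.Services.Common;

// MyEntities is a placeholder for your EF ObjectContext/DbContext.
public class MyDataService : DataService<MyEntities>
{
    public static void InitializeService(DataServiceConfiguration config)
    {
        // Expose every entity set read-only; tighten the rules per set as needed.
        config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
        config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V3;
    }
}
```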

How can I integrate Oracle BI into an existing application?

I have an existing application written in Perl. Now I need to integrate this application with OBI. The plan is to have a button the user can click to open OBI in an iframe. OBI resides on a different server from the running application.
Has anyone done this before? What is the best practice for doing this, and how much effort does it involve?
Another question: is it possible to add customizations to the OBI content displayed in the iframe?
There are two ways to address the problem that I know of and have tried out. Depending on your needs, one or the other might be more appropriate (or both; they're not mutually exclusive). In both cases, the documentation is good and readily available.
The "Go URL"
The Go URL is documented more thoroughly in the Oracle Business Intelligence Presentation Services Administration Guide. It provides a quick and easy interface to the reports you've already defined, in the form of a URL. All that's needed to get it running is to fill in a few query parameters to point it at the report you want. You might need to include authentication tokens too.
Pros: very easy to try out.
Cons: harder to get security right.
The web services
The presentation server comes with a series of web services that enable a more programmatic way of querying your repository. The functionality offered through this channel goes further: for example, most catalog management, including report creation and editing, is possible. The full list fills a guide of its own: the Oracle Business Intelligence Web Services Guide.
Pros: better integration (i.e., no need for an IFRAME); easier to get the security right.
Cons: harder to set up; lots of XML; more advanced features (e.g. in-place drilldown) need an HTTP bridge that was a bitch to get right in my case. The generated HTML might clash a bit with yours and require cleaning up, notably in the CSS.
Embedding OBIEE reports inside a non-ADF web app is tough. If you have the option to rewrite your web application in ADF, your life will be a lot easier: just drag and drop reports and visualizations into your web application. Oracle's own Fusion Applications follow this approach. If your app is analytics-heavy, it might be a good option to explore. Here's a link to the Oracle doc.
