I got some info from this related question: Using SignalR with Azure Service Fabric.
Quoting from that question's comment history:
"After deploying this application, you can see that some calls to signalR fails and Some succeeds...."
I just want to know if anybody has had a similar implementation (SignalR with Web API...).
And I don't want to set the instance count to 1 for the stateless Web API.
You can configure the Load Balancer used by the Service Fabric cluster to persist sessions. This will result in a particular client always ending up on the same node.
You can do this by opening the Load Balancer assigned to the Service Fabric cluster in the Azure portal. Then go to "Load balancing rules", select the rule responsible for the SignalR port, and set "Session persistence" to "Client IP". This can also be done from PowerShell or an ARM template.
Related
Did anyone deploy an Azure Web App Bot using an App Service Environment?
Are there any key considerations to note? I know we have ASE ILB and ASE External.
Is it possible to host multiple Azure Web App Bots in one ASE? My question is primarily due to the default lockdown of internet traffic in the ASE ILB model, and what type of firewall exceptions we will need
to ensure the communication to Azure Bot Service / Direct Line functions smoothly.
It is possible to host multiple Azure Web App Bots in one ASE. However, care should be taken in how the bot dynamically looks up the pipe name, as there are multiple bots inside the same ASE. Also, the normal Direct Line or other channels would require a lot of whitelisting to allow traffic into the ILB, and Bot Service IPs can change, so it would be difficult to maintain long term.
'Test in Web Chat' is not expected to work within an ILB ASE. It calls out to the Direct Line channel and causes the channel to send a call to the bot's messaging endpoint. In most ASE or VNET scenarios that call will be blocked, and since there are no static IP addresses, the customer can't whitelist the incoming calls either. Other than that, the Direct Line channel and the Direct Line App Service Extension (DL ASE) should typically work as expected from within an ILB ASE setup. If you are implementing additional features such as OAuth or SSO, then you will need to add a rule to enable service tags for Azure AD.
For more info on DL ASE, please refer to https://learn.microsoft.com/en-us/azure/bot-service/bot-service-channel-directline-extension?view=azure-bot-service-4.0
I have written a web application that is, typically, installed internally by customers (based on IIS/MSSQL server).
When a customer wants to provide external access to the application, we offer the following supported scenarios:
Publish the application in their DMZ (pretty standard deployment).
Use our own platform where we host the application in our own cloud infrastructure for them.
However, because I have more and more customers who misunderstand the requirements for publishing an internal application, I would like to add a "one click" way of providing that service.
My idea is to have a reverse proxy installed on the customer's web server that will connect to a cloud server we control. When the application starts, it will connect to our server, authenticate, and maintain the connection. When a user wants to use the application, she will use a URL that directs her to our server (say https://myapp.mycompany.org/CustomerID or https://CustomerID.myapp.mycompany.org). The server will then look up the list of connections from the reverse proxies to find the one matching the customer ID and, if found, use that connection to relay the end user's connection.
In essence, that is the same thing as what Azure Application Proxy or TeamViewer do, only without the need for using Azure AD or TeamViewer.
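To make the lookup step concrete, here is a rough sketch of what I have in mind on our server's side (everything here is a placeholder: the AgentRegistry name and the choice of WebSockets as the tunnel transport are just assumptions for illustration):

// Hypothetical sketch: maps each customer ID to the persistent connection its
// on-premises reverse-proxy agent keeps open to our cloud server.
using System;
using System.Collections.Concurrent;
using System.Net.WebSockets;

public sealed class AgentRegistry
{
    private readonly ConcurrentDictionary<string, WebSocket> _agents =
        new ConcurrentDictionary<string, WebSocket>(StringComparer.OrdinalIgnoreCase);

    // Called when an on-premises agent connects outbound and authenticates.
    public void Register(string customerId, WebSocket socket) => _agents[customerId] = socket;

    // Called when an end user hits https://CustomerID.myapp.mycompany.org, so the
    // frontend can relay the request over that customer's tunnel.
    public bool TryGetAgent(string customerId, out WebSocket socket) =>
        _agents.TryGetValue(customerId, out socket);

    // Called when the agent's connection drops or the customer is deactivated.
    public void Unregister(string customerId) => _agents.TryRemove(customerId, out _);
}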
Is there an existing framework I can use to build such a service? I know I can write it on my own, but that is quite a large development effort.
I've created a simple Stateful Actor and a Web API (self-hosted) and deployed them to Azure. It has worked, and I can browse the nodes in Service Fabric Explorer.
Azure gives me a URL, but when I add /api/values to the end (which works fine locally), it downloads a file called 'values' that I can't open, as it is a binary file.
I want to call the web api from a Xamarin app (ie normal Rest api call) but if I can't call it via a browser I'm a bit stuck.
I would comment this on Stephen's answer, but I lack sufficient reputation.
To add a custom port to the Load Balancer after the Service Fabric cluster has been created, you can (in the newer Azure portal):
Navigate to the load balancer resource for your service fabric cluster.
Under "Settings" find the "Load balancing rules" option.
This will have at least two rules, more if you did setup custom rules during the setup of the cluster.
Add a new rule.
Give it a name
'Port' is the external port you'd like to hit.
'BackendPort' is the port your service is configured to listen on.
The defaults on the other settings work in a pinch.
Note if you have multiple ports to enable, they each need their own rule.
I do know the above worked in my 'hello world' sandbox project.
I'm climbing the Service Fabric learning curve myself, so I can't comment with authority on the other settings.
I have discovered what was missing.
https://azure.microsoft.com/en-us/documentation/articles/service-fabric-cluster-creation-via-portal/
This link walks through creating a Service Fabric cluster on Azure; in particular, the "Application input endpoints" field needs to include the port you want to use. For the samples, that is mostly port 80 or 8081.
There is supposed to be a way to add these ports afterwards, which I tried (as did a Microsoft support engineer), and it did not seem to work. You are supposed to be able to add these ports to the Load Balancer associated with the Service Fabric cluster.
I recreated my Service Fabric cluster exactly as I did before, but this time filled in the ports I want to use in the Node Type section, and now I can hit the Web API services I've deployed. This field can be left blank, which is what I did the first time round, and that was why I had issues.
This is not really related to Service Fabric; it's just how you set up your HTTP response headers in Web API. I'd recommend tagging this with asp.net or asp.net-web-api for a more thorough answer.
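For what it's worth, a common cause of the browser offering a download instead of showing the response is content negotiation: the browser asks for text/html and Web API answers with something it won't render inline. A minimal sketch for ASP.NET Web API 2 (assuming the usual WebApiConfig registration; with a self-hosted service, apply the same change to whatever HttpConfiguration you build) that lets the JSON formatter answer text/html requests:

using System.Net.Http.Headers;
using System.Web.Http;

public static class WebApiConfig
{
    public static void Register(HttpConfiguration config)
    {
        config.MapHttpAttributeRoutes();

        // Let the JSON formatter handle "Accept: text/html" so browsers display
        // the JSON inline instead of offering it as a download.
        config.Formatters.JsonFormatter.SupportedMediaTypes
              .Add(new MediaTypeHeaderValue("text/html"));
    }
}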
Tutorials and technical resources around Azure Service Fabric Stateless Web API tend to be slightly disjointed, given that the platform and resources are still quite immature.
This Stateless Web API tutorial, at the time of writing, is very effective.
As prerequisite to the tutorial:
Update Visual Studio to the latest version (Extensions and Updates)
Update the Service Fabric SDK to the latest version (Web Platform Installer)
Explicitly specify the Endpoint Port attribute (defined in ServiceManifest.xml) when setting up your Azure Service Fabric Cluster Node Type parameters (see the sketch after this list)
Following these steps will successfully allow deployment to both local and remote clusters, and will expose your Web API endpoints for consumption.
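As a rough illustration of that Endpoint prerequisite, here is a sketch of how a stateless service typically picks up the Endpoint declared in ServiceManifest.xml at runtime, so the port it listens on matches the one you open on the cluster. The endpoint name "ServiceEndpoint" and the MyWebApiListener type are assumptions; in the tutorial, its OwinCommunicationListener plays that role:

using System.Collections.Generic;
using System.Fabric;
using Microsoft.ServiceFabric.Services.Communication.Runtime;
using Microsoft.ServiceFabric.Services.Runtime;

internal sealed class WebApiService : StatelessService
{
    public WebApiService(StatelessServiceContext context) : base(context) { }

    protected override IEnumerable<ServiceInstanceListener> CreateServiceInstanceListeners()
    {
        yield return new ServiceInstanceListener(serviceContext =>
        {
            // Reads the <Endpoint Name="ServiceEndpoint" Port="..." /> element from ServiceManifest.xml.
            var endpoint = serviceContext.CodePackageActivationContext.GetEndpoint("ServiceEndpoint");
            string listeningAddress = $"http://+:{endpoint.Port}";

            // MyWebApiListener is a stand-in for whatever ICommunicationListener starts
            // your Web API pipeline on that address (e.g. the tutorial's OwinCommunicationListener).
            return new MyWebApiListener(listeningAddress);
        });
    }
}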
I have to implement an ASP.NET Web API which acts as a subscriber to RabbitMQ. A Windows service is going to publish messages to the Web API services. There will be more than one instance of the Web API running in the production environment. I am not sure how to open the subscriber channel in the Web API and keep it open until IIS restarts. There will be one publisher and several consumers.
Can anyone please guide me with some sample code to start with?
Any help will be hugely appreciated.
Generally, RabbitMQ subscriptions don't work well with IIS-hosted applications because you have no control over when the application is running. IIS will recycle, stop, and start the app as it sees fit.
If you must do it, open the connection to RabbitMQ and start subscribing when the application starts, in Global.asax.cs for example, and make sure to dispose of everything properly when it closes.
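If you do go that route, here is a minimal sketch (assuming the RabbitMQ.Client 6.x NuGet package, a queue named "orders", and classic ASP.NET with a Global.asax); adapt the message handling to your own needs:

using System;
using System.Text;
using System.Web;
using RabbitMQ.Client;
using RabbitMQ.Client.Events;

public class WebApiApplication : HttpApplication
{
    private static IConnection _connection;
    private static IModel _channel;

    protected void Application_Start()
    {
        // Open the connection and channel once, when the app starts.
        var factory = new ConnectionFactory { HostName = "localhost" };
        _connection = factory.CreateConnection();
        _channel = _connection.CreateModel();
        _channel.QueueDeclare(queue: "orders", durable: true, exclusive: false,
                              autoDelete: false, arguments: null);

        var consumer = new EventingBasicConsumer(_channel);
        consumer.Received += (sender, args) =>
        {
            var message = Encoding.UTF8.GetString(args.Body.ToArray());
            // Hand the message off to your own processing code here,
            // then acknowledge it.
            _channel.BasicAck(args.DeliveryTag, multiple: false);
        };
        _channel.BasicConsume(queue: "orders", autoAck: false, consumer: consumer);
    }

    protected void Application_End()
    {
        // Dispose everything when IIS shuts the app down (which it will, whenever it likes).
        _channel?.Dispose();
        _connection?.Dispose();
    }
}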
You are far better off building a Windows service for the subscription and either writing to a shared store that the IIS-hosted web service can access, or alternatively self-hosting the API inside the Windows service.
I am working on a program that uses Azure for its database. It works pretty well, except that I have to authorize every IP address that I access it from. So, if I go to a friend's house I have to authorize that IP, and if I go to a coffee shop I have to authorize that IP...
I am hoping that there is a way to authorize the connection from the program, whatever IP it is coming from. Or, worst-case scenario, turn off that security measure.
DON'T.
The idea behind firewalling your DB is to protect your data from anything that could have the SQL Server credentials, should they somehow leak. It's for your own safety.
Instead, try to write a quick web service with ASP.NET web services / JAX-RS / Rails / ... to expose the DB data in a sane, secure, and thoughtful manner. It's not hard, and there are tons of tutorials and books on the matter out there.
Although NOT recommended, if you want to turn off this security measure you can allow connections to your SQL database from all IP addresses by setting the IP address range to 0.0.0.0 - 255.255.255.255 in the Azure portal.
Another alternative would be to dynamically manage allowed IP addresses by using Azure Service Management API. You can manage Firewall rules using this API. You can read more about it here: http://msdn.microsoft.com/en-us/library/azure/dn505717.aspx
So what you could do is have a small service running in Azure. When your application starts, it sends the current IP address to your service and your service sets the IP address in the firewall rules. When the application terminates, it sends another request to your service and then your service removes that IP address from the firewall rule.
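As a rough sketch of the client side of that idea (the https://firewall.example.com/register endpoint is hypothetical and stands in for your own small Azure-hosted service; https://api.ipify.org is just one way to discover the caller's public IP):

using System;
using System.Net.Http;
using System.Threading.Tasks;

public static class FirewallRegistration
{
    private static readonly HttpClient Http = new HttpClient();

    // Call this on application startup; call a matching "unregister" endpoint on exit.
    public static async Task RegisterCurrentIpAsync(string customerKey)
    {
        // Discover the public IP this machine is using right now.
        string publicIp = await Http.GetStringAsync("https://api.ipify.org");

        // Ask your own Azure-hosted service to add a firewall rule for that IP.
        var response = await Http.PostAsync(
            $"https://firewall.example.com/register?key={customerKey}&ip={publicIp}",
            content: null);
        response.EnsureSuccessStatusCode();
    }
}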
As @Machinarius so eloquently said: DON'T. .NET already has a way of exposing data through OData services. You get Atom/XML or JSON, LINQ queries, caching, and security even down to the entity or operation level.
Exposing an EF model as an OData service is very easy. You can create an ASP.NET Web API OData endpoint using the "Web API 2 OData Controller with actions, using Entity Framework" template as described in the "Creating an OData Endpoint" tutorial.
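As a rough sketch of what that template gives you (Product and ProductsContext are assumed Entity Framework types here, and the exact OData namespace depends on which Web API OData package version you install):

using System.Linq;
using System.Web.Http;
using System.Web.OData;   // System.Web.Http.OData in the older OData v3 packages

public class ProductsController : ODataController
{
    private readonly ProductsContext _db = new ProductsContext();

    [EnableQuery]   // lets clients use $filter, $orderby, $top, etc.
    public IQueryable<Product> Get()
    {
        return _db.Products;
    }

    protected override void Dispose(bool disposing)
    {
        if (disposing)
        {
            _db.Dispose();
        }
        base.Dispose(disposing);
    }
}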
To call the service from a client, you add a service reference to it and then use the proxy to execute LINQ queries. It could be something as simple as:
// Assumes a service reference named ProductService pointing at the OData endpoint.
Uri uri = new Uri("http://localhost:1234/odata/");
var container = new ProductService.Container(uri);
// The LINQ query is translated into OData query options when it executes.
var myProducts = container.Products.Where(....);
Check "Calling an OData Service From a .NET Client" for a detailed tutorial.
As an alternative, if you need to access your application from random places, why not have a VM configured in Azure with your application installed? Whenever you need your app, fire up that VM, RDP into it, and work there. You would not need to update the connection, and it is much more secure than having to allow random IPs to access your database.
I realise this is not an answer to your question, but other Stack Overflow users have already provided significant input on your problem, and I do agree with them all. Do not disable the firewall. It is for your own good!