Azure function not showing up in portal - visual-studio

I've written my first Azure Function and published it to Azure. The publish reported success and created a new function app, but the app is empty. I've done a bit of research into why my function isn't showing up, and I've seen suggestions that you might need to change the path for the zip deploy, but I can't find where to change this or what to change it to. Other solutions are always welcome! (I published it directly from VS2017.)
When I send a Postman request to url/api/function, it also doesn't work. It doesn't crash or return an error; it just gives back nothing after a second.
I need this to work so that I can create a custom connector for Power Automate and use it in my Microsoft Flow.
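For reference, a minimal HTTP-triggered function of the kind the VS2017 template produces looks roughly like the sketch below (not the asker's actual code; the function name is illustrative). One thing worth ruling out: at AuthorizationLevel.Function, calling /api/<name> without the ?code=<function key> query parameter returns 401 Unauthorized.

using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class Function1
{
    // Responds at <app root>/api/Function1 once deployed
    [FunctionName("Function1")]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req,
        ILogger log)
    {
        log.LogInformation("HTTP trigger function processed a request.");
        return new OkObjectResult("Hello from Azure Functions");
    }
}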

Related

Generate Report in Plugin (Dynamics 365)

I'm just curious whether anyone has heard of a solution for getting a report (an SSRS report) in a plugin (I want to export it as a PDF from CRM and save it in SharePoint).
I have tried the following solution:
https://community.dynamics.com/crm/f/microsoft-dynamics-crm-forum/247210/how-to-run-the-crm-report-through-sdk?pifragment-97030=1#responses
That one no longer works because of authentication. I tried to authenticate my user with a WebClient inside the plugin, but no luck. Maybe someone knows how to do this.
There is an excellent article/blog posted by Bob Guidinger on generating a report and sending it by email for D365 Online.
Once you get the first step running, you can extend it to perform your specific operation.
The blog describes a combination of an Azure Function and a plugin.
Scheduling Reports in Dynamics 365 - Part 1
I tried this for one of my projects and it worked fine for me. There is a bit of a learning curve if you don't know how to create Azure Functions and a few other small pieces.
Happy Coding!!
Make sure to upvote, Accept if this helps!!

Issues with Swashbuckle

I have a WebAPI service, written in ASP.NET (not Core), for which I am trying to generate documentation, in order to allow other devs to use it. I found Swashbuckle, and installed it. Then, since I also use OData for some of my services, I added Swashbuckle.OData. Then, I modified the CustomProvider setting in SwaggerConfig to use the ODataSwaggerProvider. I also set ResolveConflictingActions(apiDescriptions => apiDescriptions.First()) because I had a few Actions with the same URL path, differing only by query string (I'll need to address that later). So far so good.
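For reference, the configuration described above looks roughly like the sketch below (following the Swashbuckle.OData sample; the API title is a placeholder and your registration may differ):

using System.Linq;
using System.Web.Http;
using Swashbuckle.Application;
using Swashbuckle.OData;

public class SwaggerConfig
{
    public static void Register()
    {
        GlobalConfiguration.Configuration
            .EnableSwagger(c =>
            {
                c.SingleApiVersion("v1", "My WebAPI Service");
                // Let Swashbuckle.OData describe the OData routes as well
                c.CustomProvider(defaultProvider =>
                    new ODataSwaggerProvider(defaultProvider, c, GlobalConfiguration.Configuration));
                // Pick the first action when two share a path and differ only by query string
                c.ResolveConflictingActions(apiDescriptions => apiDescriptions.First());
            })
            .EnableSwaggerUi();
    }
}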
Then I tested it. I started my web app and added "/swagger/" to the end of the URL. I got a message stating that it was loading the resource info. However, after several minutes, I got a browser error debug popup stating "Error: Not enough storage is available to complete this operation." It asks if I want to debug, and if I do, it takes me to the debugger in IE (the browser I'm using). The only code in the stack is either from jquery-1.8.0.min.js or swagger-ui.min.js (this part confuses me, as there is no "swagger-ui.min.js" file in my project; I'm assuming it's embedded in the DLL). No part of the stack trace leads back to my code, and all the code there is minified, so it's very difficult to debug.
However, I do know that it is at least partially working, as three of the controllers do show up in the resulting page after you close the error popup. You can navigate through them, and all the GETs, POSTs, PUTs, and DELETEs seem to be there, and you can test them.
Is it the case that whenever you navigate to the "/swagger/" url, Swagger hits all the URLs in the service, in order to generate the documentation? I'm wondering if maybe it is hitting an action that is taking a particularly long time to run, or possibly its generated documentation is taking too much disk space (I have plenty of space on my disk, but maybe it is referring to RAM?).
Anyway, even if that were not an issue, how can I get it to generate something, some kind of document file, that I can send off to someone? I see no new files added to my folders, so it would seem that it re-does the whole process every time you navigate to the swagger URL.
When I tried the Chrome browser, I no longer had the issue (I was using IE11 before). Not sure what the problem was, but this was the workaround.
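As an aside, if the goal is a document file you can send to someone: Swashbuckle serves the generated spec as plain JSON at a fixed route (by default /swagger/docs/v1), so you can simply download and save it. A minimal sketch, assuming that default route and a placeholder host/port:

using System.Net;

class SaveSwaggerSpec
{
    static void Main()
    {
        using (var client = new WebClient())
        {
            // Downloads the generated Swagger spec to a shareable file
            client.DownloadFile("http://localhost:12345/swagger/docs/v1", "swagger.json");
        }
    }
}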

Download build drop from hosted Team Foundation Service

Using the hosted Team Foundation Service at tfs.visualstudio.com, one has the option in a Build Definition to "Copy build output to the server" which creates a zip of the drop folder that can be downloaded over https via team web access. I really need to download this drop automatically, so I can chain input to the next stage in my build pipeline.
Unfortunately, the drop URL is not obvious, but can be created using the TfsDropDownloader.
TL;DR - I can't get the TfsDropDownloader to work; I'm hoping someone else has used this tool or a similar method to successfully download a drop from https://tfs.visualstudio.com
Using the command line TfsDropDownloader.exe I can do this:
TfsDropDownloader.exe /c:"https://MYPROJECTNAME.visualstudio.com/DefaultCollection" /t:"ProjectName" /b:"BuildDefinitionName" /u:username /p:password
...and get an empty zip file with the correct build label of the last successful build, e.g. BuildDefinitionName_20130611.1.zip.
Running the source code in the debugger shows that this is because the URL generated for downloading:
https://tflonline.visualstudio.com/DefaultCollection/_apis/resources/containers/804/drop/BuildDefinitionName_20130611.1.zip
...returns a content type of application/json, which is unsupported. This exception is swallowed by the application, but not before the empty zip file is created.
Is it possible the REST API on Team Foundation Service has changed in some way so the generated URL is no longer correct?
Note that I am using the "alternate credentials" defined on my Team Foundation Service account (i.e. not my live ID) - using anything else gets me TF30063: not authorized.
I got it working by using alternate credentials, but I also had to access the REST API via a different path.
The current TfsDropDownloader builds a URL that looks like this:
https://project.visualstudio.com/DefaultCollection/_apis/resources/containers/804/drop/BuildDefinitionName_20130611.1.zip
This returns empty JSON whenever I try to use it. I'm definitely authenticated, because if I tweak the URL to:
https://project.visualstudio.com/DefaultCollection/_apis/resources/containers/804/drop
I get a nice JSON listing of every single file in the drop, but no zip.
By spying on the SSL traffic to https://tfs.visualstudio.com with Fiddler while clicking the "Download drop as zip" link, I can see that there is another endpoint at:
https://project.visualstudio.com/DefaultCollection/ProjectName/_api/_build/ItemContent?buildUri=vstfs%3a%2f%2f%2fBuild%2fBuild%2f639&path=%2Fdrop
...which does give you a zip. The "vstfs%3a%2f%2f%2fBuild%2fBuild%2f639" portion is the URL encoded BuildUri.
So I've changed my version of GetServerPath in the TfsDropDownloader source to do this:
private static string GetServerPath(TfsConnection collection, IBuildDetail buildDetail)
{
    // Build the _api/_build/ItemContent URL: collection URI plus team project,
    // with the build's URI passed as a URL-encoded query string parameter.
    var downloadPath = string.Format("{0}{1}/_api/_build/ItemContent?buildUri={2}&path=%2Fdrop",
        collection.Uri,
        HttpUtility.UrlPathEncode(buildDetail.TeamProject),
        HttpUtility.UrlEncode(buildDetail.Uri.ToString()));

    return downloadPath;
}
This works for me for the time being. Hopefully this helps someone else with the same problem!
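For completeness, a minimal sketch (not from the original post) of downloading the zip once you have the ItemContent URL, using the alternate credentials; the URL, username, and password below are placeholders:

using System.Net;

class DropDownload
{
    static void Main()
    {
        // Placeholder URL in the ItemContent form described above
        var url = "https://project.visualstudio.com/DefaultCollection/ProjectName" +
                  "/_api/_build/ItemContent?buildUri=vstfs%3a%2f%2f%2fBuild%2fBuild%2f639&path=%2Fdrop";

        using (var client = new WebClient())
        {
            // "Alternate credentials" from your Team Foundation Service profile, not your Live ID
            client.Credentials = new NetworkCredential("username", "password");
            client.DownloadFile(url, "BuildDefinitionName_20130611.1.zip");
        }
    }
}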

Oracle CRM On-Demand - Use Tasks Status to update Service Request Status

We're using CRM On-Demand for our Service Group and I'm running into an application limitation and am wondering if anyone has a workaround or just some general ideas on how to accomplish our goal.
In the application, our major focus is the Service Request, and we drive users to create Tasks for all activities related to working toward closure. For example, a customer calls in and we need a technical resource to make a return call to diagnose the issue in detail, so a Task is assigned to that resource. Once that Task has been marked as completed, I'd like the Service Request's Status to be updated. I tried creating a workflow using JoinFieldValue(), which didn't work. I also tried a more basic approach of having a field on the Service Request be populated with the Status of the Task, but that did not work either.
Upon further investigation in the help file, there is a relationship from the Activity object to the Service Request object, but not one in the other direction.
So, has anyone else run into this limitation and found some other method to have a Status change on the Task update the Status of a Service Request?
(Also, I'd like to try and avoid writing a custom web service for this purpose, which is why I'm trying to use the tools in the app)
Thanks in advance for any ideas!
Actually, if I understood correctly, your issue relates to cross-object workflows.
OCOD doesn't manage this type of workflow, so you need a workaround.
To cover a cross-object workflow you have several possibilities:
Web services, as you said; but you could also imagine JavaScript code that calls the web services and is hosted directly in OCOD (in R19 you can host it in a Client Side Extension). That could be a good solution.
Another option is to use a report with custom look-up functionality that uses a "Callback" function.
I would prefer the first solution.

How entity edit URL from within plug-in in MS Dynamics CRM 4.0

I would like to have a workflow create a task, then email the assigned user that they have a new task, with a link to the newly created task in the body of the email. I have client-side code that correctly creates the edit URL using the entity's GUID and stores it in a custom attribute. However, when the task is created from within a workflow, the client script isn't run.
So, I think a plug-in should work, but I can't figure out how to determine the URL of the CRM installation. I'm authoring this in a test environment and definitely don't want to have to change things when I move to production. I'm sure I could use a config file, but it seems like the plug-in should be able to figure this out at runtime.
Anyone have any ideas how to access the URL of the crm service from within a plug-in? Any other ideas?
There is no simple way to do this, but there is a way.
MSCRM_CONFIG is the deployment database that holds physical deployment properties, like the URL from which users access the CRM deployment. The URL you probably want is the one stored under "ADWebApplicationRootDomain" in the MSCRM_CONFIG.dbo.DeploymentProperties table. You may need extra permissions to access this database.
Note that this doesn't work in a deployment that is an Internet-Facing Deployment.
Another way could be to query the discovery service to retrieve the same information (if you are on the Online edition of MSCRM 4).
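For the database route, a minimal sketch of the lookup (assuming the standard DeploymentProperties schema, where string values live in the NVarCharColumn column; the connection string is a placeholder):

using System.Data.SqlClient;

static string GetWebApplicationRootDomain()
{
    // Placeholder connection string; point it at the server hosting MSCRM_CONFIG
    const string connectionString =
        "Data Source=crmsqlserver;Initial Catalog=MSCRM_CONFIG;Integrated Security=SSPI";

    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand(
        "SELECT NVarCharColumn FROM dbo.DeploymentProperties " +
        "WHERE ColumnName = 'ADWebApplicationRootDomain'", connection))
    {
        connection.Open();
        return (string)command.ExecuteScalar();
    }
}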
What do you mean by "change things"?
If you create a custom workflow assembly, you can give it a server URL input. Once you register it with CRM, you can simply type in the server URL when you configure the workflow. You'll have to update the URL for any workflows that use the custom workflow assembly once you move to production, but you'll only have to do that once.
My apologies if this is what you meant you wanted to avoid.
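A minimal sketch of such an activity (using the CRM 4.0 Microsoft.Crm.Workflow attributes; the class name and input label are illustrative):

using System.Workflow.ComponentModel;
using Microsoft.Crm.Workflow;

[CrmWorkflowActivity("Build Task Edit URL")]
public class BuildTaskEditUrlActivity : Activity
{
    public static readonly DependencyProperty ServerUrlProperty =
        DependencyProperty.Register("ServerUrl", typeof(string), typeof(BuildTaskEditUrlActivity));

    // Appears as a configurable input when the workflow step is set up
    [CrmInput("Server Url")]
    public string ServerUrl
    {
        get { return (string)GetValue(ServerUrlProperty); }
        set { SetValue(ServerUrlProperty, value); }
    }

    protected override ActivityExecutionStatus Execute(ActivityExecutionContext executionContext)
    {
        // Combine ServerUrl with the task's GUID here to build the edit URL,
        // then write it to the custom attribute on the task.
        return ActivityExecutionStatus.Closed;
    }
}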
Edit: Sounds like you may be able to use the CustomConfiguration attribute when you register the plugin. Here's some more info.
http://blogs.msdn.com/crm/archive/2008/10/24/storing-configuration-data-for-microsoft-dynamics-crm-plug-ins.aspx
// Reads the CRM server URL from the registry of the CRM server itself,
// then strips the "MSCRMServices" suffix to get the web application root.
// Only works when the plug-in runs on the CRM server.
string url = ((string)Registry.LocalMachine
    .OpenSubKey("Software\\Microsoft\\MSCRM")
    .GetValue("ServerUrl"))
    .Replace("MSCRMServices", "");
