This is my first time using Microsoft Flow. It looks easy, but I am stuck at the moment.
I want to pass data from MS Forms to my stored procedure using MS Flow. I managed to get the response ID, but that is not enough; I want the rest of the data from the MS Forms response, as in the image below.
From the "List of response notifications" output, Response id will send the ID of the response, but Current item will only send this data:
{"webhookId":"f912d0db-1dd4-4c7e-b655-ae75d4bb3d7a","eventType":"ResponseAdded","resourceData":{"formId":"dVLH1V-fFkuoPzxppiOnlzixU_JOYMlMpqOmPhdZrLpUREV0OU5VFFNJVE5XUE0GS15DQVVWTlFaMC4u","responseId":3},"eventTime":"2018-12-10T01:13:15.4185181Z"}
My question is: how do I pass the user's entries in the MS Form to the stored procedure?
You should add an extra action to your flow to get the details of each response. The trigger only gives you the response ID; the Microsoft Forms "Get response details" action takes that ID and returns the submitted answers, which you can then pass to your stored procedure (for example via the SQL Server "Execute stored procedure" action).
My requirement is to fetch data from a REST API and store it in separate variables. I have the endpoint URL (https://abcdev.service-now.com/api/now/table/sc_multi_row_question_answer?sysparm_query=parent_id%3Dxxxxxxxxxxxxxxxxxxxx&sysparm_display_value=true&sysparm_exclude_reference_link=true&sysparm_fields=variable_set%2Cvalue%2Citem_option_new%2Crow_index), and it fetches the data for all 3 rows of the delegate-access request.
I have attached an image showing which fields I want to fetch.
I want to fetch the fields "user to be added", "level of access required", "inbox", "calendar", "tasks", "contacts", "notes", "journal", and "select required send rights", row by row. Kindly assist me with that.
You will receive the response as JSON containing all the rows, and you can use the Deserialize JSON Array activity to deserialize the JSON into individual records with all the columns you have mentioned.
Once you have the rows in the JSON array, you can loop through it and get the value of each field.
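The activities above do the parsing inside the UiPath workflow itself; purely to illustrate the shape of the data and the row-by-row loop, here is the same idea sketched in PHP. It assumes the standard ServiceNow Table API envelope (a top-level result array) and the field names from the sysparm_fields parameter in the URL above:

    <?php
    // Illustration only: parse a ServiceNow Table API response and read the
    // requested columns row by row. Assumes the usual {"result": [...]} envelope
    // returned by /api/now/table/... endpoints.
    $json = file_get_contents('response.json'); // body of the HTTP call, saved locally here
    $data = json_decode($json, true);

    foreach ($data['result'] as $row) {
        // Field names taken from sysparm_fields in the question's URL
        $variableSet = $row['variable_set']    ?? null;
        $value       = $row['value']           ?? null;
        $question    = $row['item_option_new'] ?? null;
        $rowIndex    = $row['row_index']       ?? null;

        echo "Row {$rowIndex}: {$question} = {$value}\n";
    }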
Situation: We get audit emails every morning. The body of the email contains a table with various columns. The column labelled 'ID' is the unique key for each row. I have to copy the data from the ID column, format it in Notepad++, and then paste it into a pre-filled SQL query, which I then run.
Question: Is it possible to automate this process? If so, where could I start? It would be nice to have something that runs either automatically or manually, reads the email, extracts the data from the column, formats it, and puts it into a query and executes it.
Additional Details: The emails are always from the same distro, fire at the same time every day, and the table columns are static.
My skill level: Beginner, but resourceful and eager to learn, so please don't crucify me if I am not clear.
Yes, it is possible. You can develop a VBA macro or a COM add-in if you need to distribute your solution on multiple machines. Basically you are interested in handling the NewMailEx event of the Application class which is fired when a new message arrives in the Inbox and before client rule processing occurs. You can use the Entry ID returned in the EntryIDCollection array to call the NameSpace.GetItemFromID method and process the item. This event fires once for every received item that is processed by Microsoft Outlook. The item can be one of several different item types, for example, MailItem, MeetingItem, or SharingItem.
After retrieving the incoming item you can process it by getting property values and adding them to the SQL Db.
You may find the Walkthrough: Create your first VSTO Add-in for Outlook article helpful.
I have a requirement for a job that needs to produce a report of the total appointments created across all locations and appointment types for a date range.
Below is the sequence to achieve this:
Read the list of 50,000 locations from Excel.
For each of the above locations, call SOAP Service A to get the list of appointments created at that location for a date range and appointment type. The request takes the appointment type and location.
For each of the above appointments, call SOAP Service B to get the order number.
I was thinking of creating multiple steps, one for each appointment type, and implementing multithreading with a given chunk size.
For example, if I have 100 locations with a chunk size of 10, 10 threads would execute the step.
Each step will have a reader that reads the Excel file. I am wondering whether I need to implement a composite reader that calls Service A to fetch the appointments and passes them to a processor that calls Service B for each appointment returned by Service A, or whether it is better to implement a composite processor which will:
Call Service A to get the list of appointments for each location chunk returned from the reader (using an async processor)
Pass the list to a second processor
Iterate over each item in the list and pass the appointment to Service B to get the order number (using an async processor)
Use an async item writer to generate the Excel output
Then, after all steps are done, combine the Excel files from each step into one.
Is there a better design to handle this scenario?
I am creating a payment page which uses PayFast.co.za.
When the user completes payment on the PayFast server, PayFast fires off a background call to a script on my server (itn.php) to verify the payment; then the user is redirected to the thank-you page. I use the thank-you page to execute the function that iterates through the cart data stored in the session and updates the database.
I want to do the processing of the cart data in the itn.php file instead of on the thank-you page.
So my question is: how do I store the data that the thank-you page function would have used before the user is redirected to the PayFast servers, and then retrieve it and process it later?
Do I use serialization? queues??
I notice that when I use queues for sending emails, the data is stored in the database and then processed later. I need to be able to store the cart data and associate it with the invoice ID, then recall it in the itn.php script that PayFast calls. PayFast also sends the new invoice ID to the itn.php page.
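One way to get what you describe, sketched below with plain PDO: persist the cart to a database table keyed by the invoice ID before redirecting the user to PayFast, then load it again inside itn.php once the ITN call arrives. The pending_carts table and column names are assumptions for illustration, not anything PayFast prescribes:

    <?php
    // Sketch only: store the cart before redirecting to PayFast, keyed by the
    // invoice id, then load it again inside itn.php after the ITN is validated.

    // Called from the checkout page, before redirecting the user to PayFast.
    function storeCart(PDO $db, string $invoiceId, array $cart): void
    {
        $stmt = $db->prepare(
            'INSERT INTO pending_carts (invoice_id, cart_json, created_at)
             VALUES (:invoice_id, :cart_json, NOW())'
        );
        $stmt->execute([
            ':invoice_id' => $invoiceId,
            ':cart_json'  => json_encode($cart), // serialize the session cart
        ]);
    }

    // Called from itn.php, using the invoice id PayFast sends with the ITN.
    function loadCart(PDO $db, string $invoiceId): ?array
    {
        $stmt = $db->prepare(
            'SELECT cart_json FROM pending_carts WHERE invoice_id = :invoice_id'
        );
        $stmt->execute([':invoice_id' => $invoiceId]);
        $json = $stmt->fetchColumn();

        return $json === false ? null : json_decode($json, true);
    }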
I'm trying to build an API which stores information in my CRM. To push the details, I have the following parameters/details to store:
Called_number, caller_number, agent_number, date, time, call_status, total_call_duration, Call_UUID, Recording_URL, conversation_duration
I've created the migration/table with the same column names mentioned above. The request protocol is HTTP, the request data type is a query string, and the response data type is JSON.
The data is sent via a simple URL from a third party, so I'm using a simple POST route to insert the data into the database, like this:
Route::post('/calllogs/{called_number}/{caller_number}/{agent_number}/{date}/{time}/{call_status}/{total_call_duration}/{call_UUID}/{recording_URL}/{conversation_duration}', 'CalllogController@insert');
Is there any way to secure this with some dynamic API key to prevent fake data from being inserted? I mean, anyone who knows the URL structure can construct the URL and insert data into my database. I want to have something like this:
Route::post('/calllogs/{api_key}/{caller_number}....
where I can check the api_key dynamically and then insert into the database.
Thanks.
This library (API Guard) is probably what you're looking for; it does exactly what you want: securing API calls with authorization keys.
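If you would rather not pull in a package, a minimal hand-rolled version of the same idea is a route middleware that checks the key before the request ever reaches your controller. The sketch below uses assumed names (a VerifyApiKey middleware and an api_keys table) and is not API Guard's actual API:

    <?php
    // app/Http/Middleware/VerifyApiKey.php -- illustrative sketch
    namespace App\Http\Middleware;

    use Closure;
    use Illuminate\Http\Request;
    use Illuminate\Support\Facades\DB;

    class VerifyApiKey
    {
        public function handle(Request $request, Closure $next)
        {
            // Read the key from a header (preferred) or from a route parameter.
            $key = $request->header('X-Api-Key', $request->route('api_key'));

            // Hypothetical api_keys table with 'key' and 'active' columns.
            $valid = DB::table('api_keys')
                ->where('key', $key)
                ->where('active', true)
                ->exists();

            if (! $valid) {
                return response()->json(['error' => 'Invalid API key'], 401);
            }

            return $next($request);
        }
    }

After registering the middleware under an alias in app/Http/Kernel.php, you can attach it to the call-log route. Reading the key from a header rather than a URL segment also keeps it out of access logs.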