Sync data between Zoho CRM and MySQL

I am planning to use Zoho CRM for my business. On one side I have clients who pay my business; on the other, I have online customers to whom I assign work given to me by clients. So basically my business is a kind of mediator.
Now I want to use Zoho CRM workflow automation, e.g. when a lead is created, a signup mail should be sent. I want to increase the lead score when a client performs a particular activity. I want to use a web form to capture leads.
My issue is that Zoho CRM allows only a small number of API calls, e.g. 500 per user per day. Given that, how do I capture leads directly into the CRM, and how do I increase the lead score?
How do you manage such scenarios?

The number of API calls per day depends on your subscription plan:
Standard: starts at 2,000 calls per day
Professional: starts at 3,000 calls per day
Enterprise: starts at 4,000 calls per day
Reference link
In many cases this will be enough; however, there are a few tricks for saving API calls, such as using the v4 API, which lets you insert or update multiple records (100 per request).
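For example, here is a minimal Python sketch of batched inserts; the v4 records endpoint and the OAuth token shown are placeholders, so adjust them for your data center and credentials:

import requests

ACCESS_TOKEN = "1000.xxxx.yyyy"  # placeholder OAuth token
BULK_URL = "https://www.zohoapis.com/crm/v4/Leads"  # assumed v4 records endpoint

def insert_leads(leads):
    # Batches of 100 (the per-request maximum): inserting 1,000 leads
    # costs 10 API calls instead of 1,000.
    headers = {"Authorization": "Zoho-oauthtoken " + ACCESS_TOKEN}
    for i in range(0, len(leads), 100):
        batch = leads[i:i + 100]
        resp = requests.post(BULK_URL, headers=headers, json={"data": batch})
        resp.raise_for_status()

insert_leads([{"Last_Name": "Doe", "Company": "Acme", "Email": "j.doe@example.com"}])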
Also, you can use custom functions (Zoho Deluge) in the CRM and set up your workflow rules like:
Each time a new lead is created with the status "Not contacted" then:
Send a welcome email
Create a case (Zoho Deluge)
Create a task (Zoho Deluge)
etc.
The rate limit for Zoho custom functions is not the same as for Zoho API calls (integration tasks allow 25,000 Zoho API calls/day using Deluge), so you can use both of them.
Reference link


Google Pay API for Passes: what are the constraints?

I have a couple of questions about the Google Pay API for Passes:
How many loyalty classes can we create?
What is the daily quota of this REST API, i.e. how many calls can we make per user per day?
Can we send the pass token links in an email with a self-generated QR code? Or does it have to be the "Add to Google Pay" button? Is this allowed according to Google's design and use cases?
Thank you for answering these questions.
How many loyalty classes can we create?
Unlimited
What is the daily quota of this REST API, i.e. how many calls can we make per user per day?
"We might limit the rate at which you can call our API. We recommend you keep your requests to no more than 20 requests per second."
Source: https://developers.google.com/pay/passes/guides/get-started/api-guidelines/performance-tips#limit
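If you are scripting against the REST API, a simple client-side throttle keeps you under that recommendation. A minimal Python sketch (the 20 req/s figure comes from the quote above; the iterable of prepared calls is hypothetical):

import time

class Throttle:
    # Cap outgoing calls at `rate` per second by sleeping between them.
    def __init__(self, rate=20):
        self.min_interval = 1.0 / rate
        self.last_call = 0.0

    def wait(self):
        delay = self.min_interval - (time.monotonic() - self.last_call)
        if delay > 0:
            time.sleep(delay)
        self.last_call = time.monotonic()

throttle = Throttle(rate=20)  # the 20 requests/second recommendation
for call in prepared_api_calls:  # hypothetical iterable of pending API calls
    throttle.wait()
    call()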
Can we send the pass token links in an email with a self-generated QR code? Or does it have to be the "Add to Google Pay" button? Is this allowed according to Google's design and use cases?
It would be a shame to use a QR code, especially if the user uses Gmail, which supports Go-To-Action markup (https://developers.google.com/gmail/markup/reference/go-to-action).
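For reference, a Go-To-Action button is declared with schema.org markup embedded in the email body. A minimal sketch (the URL and action name are placeholders):

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "EmailMessage",
  "potentialAction": {
    "@type": "ViewAction",
    "url": "https://example.com/save-pass",
    "name": "Save to Google Pay"
  }
}
</script>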

Query for deleted entities in Dynamics 365 XRM Tooling SDK

I am writing a generic integration that needs to use the database in Dynamics 365 for Customer Engagement as the system of record. I will be polling it from time to time to keep data up to date in other ancillary systems. I can know, of course, when records have changed by inspecting the "modifiedon" attribute: I can build a query saying "give me all the records that have changed since the last time I asked". It would, however, also be optimal to know what records have been DELETED since the last time I asked. I have a similar integration with Salesforce, where that is trivial to do with the API, but I can't see how to do it with the Dynamics 365 API.
It appears that my only option might be to keep a list of all the record primary keys in my integration, download ALL records existing in CRM on each poll, and then figure out the deleted ones on my own by their absence. That is pretty ugly and inefficient, though.
Any ideas or advice?
MS introduced change tracking for this purpose.
The change tracking feature in Dynamics 365 for Customer Engagement provides a way to keep the data synchronized in a performant way by detecting what data has changed since the data was initially extracted or last synchronized.
A sample Web API request:
GET [Organization URI]/api/data/v9.0/accounts?$select=name,accountnumber,telephone1,fax HTTP/1.1
Prefer: odata.track-changes
The response will include a delta link containing a delta token:
"#odata.deltaLink": "[Organization URI]/api/data/v9.0/accounts?$select=name,accountnumber,telephone1,fax&$deltatoken=919042%2108%2f22%2f2017%2008%3a10%3a44"
When you request that URI, you get the changes, including deleted entries:
{
  "@odata.context": "[Organization URI]/api/data/v9.0/$metadata#accounts(name,telephone1,fax)/$delta",
  "@odata.deltaLink": "[Organization URI]/api/data/v9.0/accounts?$select=name,telephone1,fax&$deltatoken=919058%2108%2f22%2f2017%2008%3a21%3a20",
  "value": [
    {
      "@odata.etag": "W/\"915244\"",
      "name": "Monte Orton",
      "telephone1": "555000",
      "fax": "10101",
      "accountid": "60c4e274-0d87-e711-80e5-00155db19e6d"
    },
    {
      "@odata.context": "[Organization URI]/api/data/v9.0/$metadata#accounts/$deletedEntity",
      "id": "2e451703-c686-e711-80e5-00155db19e6d",
      "reason": "deleted"
    }
  ]
}
Sample: Synchronize data with external systems using change tracking
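To make the polling loop concrete, here is a minimal Python sketch of the delta-token flow against the Web API (the org URI and token are placeholders; paging and error handling are omitted):

import requests

# Placeholders: substitute your organization URI and a real OAuth bearer token.
BASE = "https://yourorg.api.crm.dynamics.com/api/data/v9.0"
HEADERS = {
    "Authorization": "Bearer <access token>",
    "Prefer": "odata.track-changes",  # ask the service for a delta link
}

def initial_sync():
    # First pull: fetch everything and remember the delta link.
    body = requests.get(BASE + "/accounts?$select=name,telephone1,fax",
                        headers=HEADERS).json()
    return body["value"], body["@odata.deltaLink"]

def poll_changes(delta_link):
    # Later pulls: only changed rows plus deletion markers come back.
    body = requests.get(delta_link, headers=HEADERS).json()
    for entry in body["value"]:
        if entry.get("reason") == "deleted":
            print("deleted:", entry["id"])
        else:
            print("upserted:", entry["accountid"])
    return body["@odata.deltaLink"]  # save this for the next poll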
I would like to contribute to this question, which gave me excellent advice on the direction to take.
Although poorly documented in the official Dynamics docs, and not mentioned where change tracking is explained, this powerful feature is also available in the SOAP API:
https://learn.microsoft.com/en-us/dotnet/api/microsoft.xrm.sdk.messages.retrieveentitychangesrequest?view=dynamics-general-ce-9
https://learn.microsoft.com/en-us/dotnet/api/microsoft.xrm.sdk.messages.retrieveentitychangesresponse?view=dynamics-general-ce-9
I hope this helps someone.

How can we increase throughput on the LUIS programmatic API?

When using the LUIS programmatic API, we get frequent 429 errors ("too many requests") when doing a half-dozen GET and POST requests. We've inserted a pause in our code to deal with this.
We have a paid subscription key to LUIS, which indicates we should get 50 requests/second (see https://azure.microsoft.com/en-us/pricing/details/cognitive-services/language-understanding-intelligent-services/). However, it seems the paid subscription key can only be used for hitting the application endpoint. For Ocp-Apim-Subscription-Key in request headers, we must use our "programmatic key", which is associated with the Starter_Key, which is (apparently) rate-limited.
Am I missing something here? How do we get more throughput on the LUIS programmatic API?
One of our engineers arrived at the following answer, so I'm posting it here.
The programmatic API is limited to 5 requests per second, 100K requests per month for everyone. Your paid subscription key is only for the endpoint API, not for the programmatic API.
If you need more throughput:
Put your API requests in a queue. It is unlikely that you need to update your LUIS model non-stop, five times per second; you probably just have bursts of updates. Put them in a queue to stay within the limit (see the sketch after this list).
Don't try using the same user account to manage multiple LUIS models. Set up additional accounts for each model, which gives you additional programmatic keys. Each programmatic key gives you 5 requests per second.
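For the queueing suggestion above, a minimal Python sketch of a single worker draining a queue at no more than 5 requests per second (send_to_luis is a placeholder for the real programmatic API call):

import queue
import threading
import time

work = queue.Queue()

def send_to_luis(req):
    # Placeholder for the actual programmatic API request.
    pass

def worker():
    while True:
        req = work.get()
        send_to_luis(req)
        work.task_done()
        time.sleep(1 / 5)  # stay under the 5 requests/second limit

threading.Thread(target=worker, daemon=True).start()
# Producers enqueue model updates as they arrive, e.g.:
# work.put({"op": "add_utterance", "text": "book a flight"})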

Mixpanel API web segmentation and personalisation

Hi, I am interested in using Mixpanel on a website to track customer events. I would like to know if there is any way to use the API to personalise the website per customer, similar to segmentation for emails.
I would like to query the API for a single customer, asking whether they have completed several events.
For example, something like: if the customer has clicked out and their last visit was more than a month ago, display a banner advert.
Mixpanel does not seem like the right tool for the job you describe here.
While theoretically this might be possible (via Mixpanel's HTTP API), this will create unnecessary architectural complexity and add extra latency. If you need to customize your web site per user, store any user state in a database like MySQL or PostgreSQL. This will be both faster and easier.
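As an illustration of that approach, a minimal Python sketch using SQLite as a stand-in for MySQL/PostgreSQL (the schema and the 30-day threshold are invented for the example):

import sqlite3
from datetime import datetime, timedelta

conn = sqlite3.connect("users.db")  # stand-in for MySQL/PostgreSQL
conn.execute("""CREATE TABLE IF NOT EXISTS user_events
                (user_id TEXT, event TEXT, occurred_at TEXT)""")

def show_banner(user_id):
    # True if the user has clicked out AND their last visit was more than
    # a month ago. Timestamps are assumed stored as ISO-8601 strings,
    # which compare correctly as text.
    month_ago = (datetime.utcnow() - timedelta(days=30)).isoformat()
    clicked_out = conn.execute(
        "SELECT 1 FROM user_events WHERE user_id=? AND event='clicked_out'",
        (user_id,)).fetchone() is not None
    last_visit = conn.execute(
        "SELECT MAX(occurred_at) FROM user_events WHERE user_id=? AND event='visit'",
        (user_id,)).fetchone()[0]
    return clicked_out and last_visit is not None and last_visit < month_ago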

How to handle PHP calls for rapid API data updates to a database?

I am creating a web application with Laravel and Vue.js, and my app works with a third-party API that provides live data on the related market. I.e.:
When a user adds an order with a price, it is matched against the live market price before the order is confirmed.
I constantly update the market price feed. A cron job is used to do this, but the response time is more than 1123 ms in Laravel.
When using a direct PHP API call without the framework, the response time improves only slightly (e.g. 995 ms) before comparing data against the local database.
Please suggest a better way to retrieve this continuously updated data. The application is still in development; I want to know the correct type of service needed, and I need suggestions on what kind of server I should use. The application makes ~2,678,400 API requests monthly (about one per second).
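One common pattern for this kind of load (an illustration, not from the original thread): run a long-lived background poller that caches the latest price, so the order-confirmation path reads from the cache instead of calling the third-party API inline. A minimal Python sketch of the idea (the feed URL, cache key, and Redis are placeholders; the same pattern ports to PHP/Laravel):

import time
import requests
import redis

FEED_URL = "https://api.example-market.com/v1/price"  # placeholder third-party feed
cache = redis.Redis()

def poll_forever():
    # Background process: refresh the cached price about once per second.
    while True:
        price = requests.get(FEED_URL, timeout=2).json()["price"]
        cache.set("market:price", price, ex=5)  # expire if the poller stalls
        time.sleep(1)

def confirm_order(order_price):
    # Request path: compare against the cached price -- no external call.
    live = cache.get("market:price")
    if live is None:
        raise RuntimeError("price feed stale")
    return float(order_price) == float(live)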
