Zapier CLI Trigger - How to use defined sample data when no results are returned during setup

I am trying to prototype a trigger using the Zapier CLI and I am running into an issue with the 'Pull In Samples' section when setting up the trigger in the UI.
This step tries to pull in a live sample of data to use; however, the documentation states that if no results are returned it will fall back to the sample data configured for the trigger.
In most cases there will be no live data, so ideally I would prefer the sample data to be used in the first instance; however, my trigger never seems to use the sample, and I have not been able to find a concrete example of a 'no results' response.
The API I am using returns XML, so I am converting the result to JSON, which works fine when there is data.
When there are no results I have tried returning '[]', but that just hangs, and if I check the zapier http logs it loops HTTP requests until I cancel the sample check.
Returning '[{}]' produces an error saying that I need an 'id' field.
The definition I am using is:
module.exports = {
  key: 'getsmsinbound',
  noun: 'GetSMSInbound',
  display: {
    label: 'Get Inbound SMS',
    description: 'Check for inbound SMS'
  },
  operation: {
    inputFields: [
      { key: 'number', required: true, type: 'string', helpText: 'Enter the inbound number' },
      { key: 'keyword', required: false, type: 'string', helpText: 'Optional if you have configured a keyword and you wish to check for specific keyword messages.' },
    ],
    perform: getsmsinbound,
    sample: {
      id: 1,
      originator: '+447980123456',
      destination: '+447781484146',
      keyword: '',
      date: '2009-07-08',
      time: '10:38:55',
      body: 'hello world',
      network: 'Orange'
    }
  }
};
I'm hoping it's something obvious, as I've scoured the web and the Zapier documentation without any luck!

Sample data must be provided from your app and the sample payload is not used for this poll specifically. From the docs:
Sample results will NOT be used for a user's Zap testing step. That step requires data to be received by an event or returned from a polling URL. If a user chooses to "Skip Test", then the sample result, if provided, will be used.
Personally, I have never seen "Skip Test" show up. A while back I asked support about this:
That's a great question! It's definitely one of those "chicken and egg" situations when using REST Hooks - if there isn't a sample available, then everything just stalls.
When the Zap editor tries to obtain a "sample result", there are three places where it's going to look:
1. The Polling endpoint (in Step #3 of your trigger's setup) is invoked for the current user. If that returns "nothing", then the Zap editor will try the next step.
2. The "most recent record/data" in the Zap's history. Since this is a brand new Zap, there won't be anything present.
3. The Sample result (in Step #4 of your trigger's setup). The Zap editor will tell the user that there's "nothing to show", and will give the user the option to "skip test and continue", which will use the sample JSON that you've provided here.
In reality, it will just continue to retry the request over and over and never provide the user with a "skip test and continue" option. I just emailed again asking if anything has changed since then, but it looks like existing sample data is a requirement.
Perhaps create a record in your API by default and hide it from normal use and just send back that one?
Or send back dummy data even though Zapier says not to. I'm not sure, but I don't know how people can set up a Zap when no data has been created yet (Zapier says not many of their apps have this issue, but nearly every trigger I've created, and every use case I've seen for other applications, suggests otherwise).
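For what it's worth, here is a minimal sketch of that dummy-data workaround in the context of the trigger above. This is my own sketch, not an official Zapier pattern: fetchInboundSms is a hypothetical helper for the XML-to-JSON poll, and it assumes your zapier-platform-core version exposes bundle.meta.isLoadingSample:

const sample = {
  id: 1,
  originator: '+447980123456',
  destination: '+447781484146',
  keyword: '',
  date: '2009-07-08',
  time: '10:38:55',
  body: 'hello world',
  network: 'Orange'
};

const getsmsinbound = async (z, bundle) => {
  // fetchInboundSms is a hypothetical helper that polls the API and
  // converts the XML response into an array of JSON message objects.
  const results = await fetchInboundSms(z, bundle);
  // While the editor is pulling in samples, fall back to the canned sample
  // so setup can proceed; live polls with real data are unaffected.
  // (isLoadingSample is assumed available in recent zapier-platform-core.)
  if (results.length === 0 && bundle.meta.isLoadingSample) {
    return [sample];
  }
  return results;
};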

Related

Ruby YouTube Data API v3 insert caption always returns error

I am trying to use the Ruby SDK to upload videos to YouTube automatically. Inserting a video, deleting a video, and setting the thumbnail for a video works fine, but for some reason trying to add captions results in an invalid metadata client error regardless of the parameters I use.
I wrote code based on the documentation and code samples in other languages (I can't find any examples of doing this in Ruby with the current gem). I am using the google-apis-youtube_v3 gem, version 0.22.0.
Here is the relevant part of my code (assuming I have uploaded a video with id 'XYZ123'):
require 'googleauth'
require 'googleauth/stores/file_token_store'
require 'google-apis-youtube_v3'

def authorize
  # [... auth code omitted ...]
end

def get_service
  service = Google::Apis::YoutubeV3::YouTubeService.new
  service.key = API_KEY
  service.client_options.application_name = APPLICATION_NAME
  service.authorization = authorize
  service
end

body = {
  "snippet": {
    "videoId": 'XYZ123',
    "language": 'en',
    "name": 'English'
  }
}

s = get_service
s.insert_caption('snippet', body, upload_source: '/path/to/my-captions.vtt')
I have tried many different combinations, but the result is always the same:
Google::Apis::ClientError: invalidMetadata: The request contains invalid metadata values, which prevent the track from being created. Confirm that the request specifies valid values for the snippet.language, snippet.name, and snippet.videoId properties. The snippet.isDraft property can also be included, but it is not required. status_code: 400
It seems that there really is not much choice for the language and video ID values, and there is nothing remarkable about naming the captions as "English". I am really at a loss as to what could be wrong with the values I am passing in.
Incidentally, I get exactly the same response even if I just pass in nil as the body.
I looked at the OVERVIEW.md file included with the google-apis-youtube_v3 gem, and it referred to the Google simple REST client Usage Guide, which in turn mentions that most object properties do not use camel case (which is what the underlying JSON representation uses). Instead, in most cases properties must be sent using Ruby's "snake_case" convention.
Thus it turns out that the snippet should specify video_id and not videoId.
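So, adapting the body from the question, the working version looks like this (same values, only the key changes):

body = {
  snippet: {
    video_id: 'XYZ123',  # snake_case; the gem translates this to videoId
    language: 'en',
    name: 'English'
  }
}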
That seems to have let the request go through, which resolves this issue.
The response I'm getting now has a status of "failed" and a failure reason of "processingFailed", but that may be the subject of another question if I can't figure it out.

How can I get Google Auth working with Laravel?

I'd like to know if there's an easy fix for this error, which I'm getting while trying to add support for Google sign-in to my website; I can only reproduce it in a Laravel-based environment. Vanilla PHP applications run just fine.
This is my relevant code:
if ($request->has('googleToken')) {
    $client = new Google_Client(['client_id' => env('GOOGLE_PLATFORM_CLIENT_ID')]);
    $payload = $client->verifyIdToken($credentials['googleToken']);
    if (!$payload) {
        return response(['error' => 'Invalid token, please try using form-based authentication.'], Response::HTTP_FAILED_DEPENDENCY);
    }
    $user['googleToken'] = $credentials['googleToken'];
}
I know my validation is too relaxed, but please just focus on the fact that I'm testing; I plan to change this code in the near future.
The code above receives its data through an Axios PUT request from the frontend, with the payload looking like this:
{
  googleToken: "eyJhbGciOiJSUzI1NiIsImtpZCI6IjE5ZmUyYTdiNjc5NTIzOTYwNmNhMGE3NTA3OTRhN2JkOWZkOTU5NjEiLCJ0eXAiOiJKV1QifQ.eyJpc3MiOiJhY2NvdW50cy5nb29nbGUuY29tIiwiYXpwIjoiNTkyODkzNjE3ODYzLXRscDdvaDByaTk2dTZxZGxrOXYwbHAyanQyNDlkdDNsLmFwcHMuZ29vZ2xldXNlcmNvbnRlbnQuY29tIiwiYXVkIjoiNTkyODkzNjE3ODYzLXRscDdvaDByaTk2dTZxZGxrOXYwbHAyanQyNDlkdDNsLmFwcHMuZ29vZ2xldXNlcmNvbnRlbnQuY29tIiwic3ViIjoiMTE1NTg0MDg0NTE2OTMxOTQzODU...",
  mailAddress: "user@mail.com"
}
The problem is that the payload simply comes back as false. I decided to investigate, so I went to the definition of verifyIdToken inside Google_Client and, from there, jumped to the function that finally returns to its parent, which is verifyIdToken in the Verify class.
Inside that class there's a pretty loose try/catch block, to which I added a generic exception case so that I could quickly print the error message for debugging. This is the output I got:
OpenSSL unable to verify data: error:0909006C:PEM routines:get_name:no start line
This is what's failing internally, and from this point on I don't really know how to proceed, since the error feels cryptic, or at least it's outside my field of knowledge.
The OpenSSL error you quoted indicates that your client was not able to read any/further PEM-encoded data. Refer to https://www.openssl.org/docs/man1.1.1/man3/PEM_read.html.
OpenSSL unable to verify data: error:0909006C:PEM routines:get_name:no start line
Here, 'PEM routines' is the library within OpenSSL, 'get_name' is the function, and 'no start line' is the reason.
Is your client able to access the necessary certificates/keys?
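As a quick way to check, here is a minimal sketch (mine, not from the Google library) that fetches Google's federated sign-on certificates and tries to parse each one as PEM; if the certificates can't be fetched or read, you'll hit the same 'no start line' class of error:

$json = file_get_contents('https://www.googleapis.com/oauth2/v1/certs');
$certs = json_decode($json, true);

foreach ($certs as $kid => $pem) {
    // openssl_x509_read() fails when the input does not start with
    // "-----BEGIN CERTIFICATE-----", which is exactly what the
    // "no start line" reason means.
    $ok = openssl_x509_read($pem) !== false;
    echo $kid, ': ', $ok ? 'parsed' : openssl_error_string(), PHP_EOL;
}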

How to send to multiple recipients (WITHOUT LOOPING) while using variable alphanumeric sender ID?

I was wondering if there is a way to set multiple recipients WITHOUT looping over the list of recipients at my end. Also, most importantly, while using a variable alphanumeric sender ID and NOT buying a Twilio number?
I can do all this for a single recipient like this:
$twilio_client->messages->create(
    '+64*******', // to
    [
        'from' => 'foo',
        'body' => 'bar'
    ]
);
Works perfectly fine. However, it doesn't work with multiple receivers.
Also note, it was bloody-1-step easy to implement smsbroadcast.com.au and pass all this in a simple call (a single call, quick, with super easy documentation - unlike Twilio, which has like a billion lines of confusing documentation, 200 similar products, and YET no direct API to check the balance or do a simple thing such as multiple recipients!!)
After a lot of back and forth with support, plus reading all the over-written essays (the so-called documentation), I finally found a way to get it all done.
[Step 1: Configuration]
You have to implement the Notify product, which is a separate product from the Message product.
So, from the left menu: Notify > Services > Add New. Here you need to add a Messaging Service and have it selected for Notify.
On the Messaging Services page it will ask you to buy a Twilio number. Instead, just click Configure in the left menu and put in all the details you need.
Specifically and importantly, make sure you have Alpha Sender ID checked and a default Alpha Sender text entered there. This will be the default fallback in case your API call fails to accept the from param.
[Step 2: API Call]
// $notify_service_SID = the SID of the Notify Service you added in Step 1
$client = new Client($this->Account_SID, $this->auth_token);
$notify_obj = $client->notify->services($notify_service_SID);

// You need the receivers as an array of JSON strings, and make sure the
// numbers start with a country code so the alpha sender works correctly.
$receivers_json = [
    '{"binding_type":"sms","address":"+614********"}',
    '{"binding_type":"sms","address":"+614*******"}',
];

$call_ret = $notify_obj->notifications->create([
    'toBinding' => $receivers_json,
    'body' => $msg, // actual message goes here
    'sms' => [
        'from' => $sender, // alphanumeric variable sender
    ],
]);
[Step 3: Check for errors]
There is no direct way to check all errors in Twilio when implementing Notify. Here is my mixed approach:
1. Add exception handling to the notifications->create call (see the sketch below).
2. $call_ret will have err if the Notify call fails, but not when the underlying Message fails, because Notify just passes the call on to Message and there is no direct way to check Message errors through the Notify call. So check for $call_ret['err'].
3. Call the Alerts API: fetch all recent alerts, and match them against the Notification SID you received from the last call.
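For point 1, a minimal sketch of the exception handling (my own wrapper, not from Twilio's docs):

use Twilio\Exceptions\TwilioException;

try {
    $call_ret = $notify_obj->notifications->create([
        'toBinding' => $receivers_json,
        'body' => $msg,
        'sms' => ['from' => $sender],
    ]);
} catch (TwilioException $e) {
    // The Notify REST call itself was rejected; log it and stop here.
    error_log('Notify failed: ' . $e->getMessage());
    return;
}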
Here is how to do the Alerts check:
$alerts = @$client->monitor->v1->alerts->read();
if (is_array($alerts)) {
    foreach ($alerts as $av) {
        $t = @$av->toArray();
        @parse_str($t['alertText'], $alert_details);
        if (isset($alert_details['notificationSid']) && $alert_details['notificationSid'] == $call_ret['sid']) {
            $alert_err = $alert_details['description'];
            break;
        }
    }
}
$alert_err will carry the error message if there was an error. Apart from this, there is no direct way to do it. You can fetch these Alerts via cron jobs, or you can set up a webhook for them to call you back. Or simply implement a one-call, simple API that does it all in a super simple way, such as smsbroadcast.

Ruby neo4j-core mass processing data

Has anyone used Ruby neo4j-core to mass process data? Specifically, I am looking at taking in about 500k lines from a relational database and inserting them via something like:
Neo4j::Session.current.transaction.query
  .merge(m: { Person: { token: person_token } })
  .merge(i: { IpAddress: { address: ip, country: country,
                           city: city, state: state } })
  .merge(a: { UserToken: { token: token } })
  .merge(r: { Referrer: { url: referrer } })
  .merge(c: { Country: { name: country } })
  .break # This will make sure the query is not reordered
  .create_unique("m-[:ACCESSED_FROM]->i")
  .create_unique("m-[:ACCESSED_FROM]->a")
  .create_unique("m-[:ACCESSED_FROM]->r")
  .create_unique("a-[:ACCESSED_FROM]->i")
  .create_unique("a-[:ACCESSED_FROM]->r")
  .create_unique("i-[:IN]->c")
  .exec
However, doing this locally takes hours on hundreds of thousands of events. So far, I have attempted the following:
Wrapping Neo4j::Connection in a ConnectionPool and multi-threading it - I did not see much speed improvements here.
Doing tx = Neo4j::Transaction.new and tx.close every 1000 events processed - looking at a TCP dump, I am not sure this actually does what I expected. It does the exact same requests, with the same frequency, but just has a different response.
With Neo4j::Transaction I see a POST every time the .query(...).exec is called:
Request: {"statements":[{"statement":"MERGE (m:Person{token: {m_Person_token}}) ...{"m_Person_token":"AAA"...,"resultDataContents":["row","REST"]}]}
Response: {"commit":"http://localhost:7474/db/data/transaction/868/commit","results":[{"columns":[],"data":[]}],"transaction":{"expires":"Tue, 10 May 2016 23:19:25 +0000"},"errors":[]}
With Non-Neo4j::Transactions I see the same POST frequency, but this data:
Request: {"query":"MERGE (m:Person{token: {m_Person_token}}) ... {"m_Person_token":"AAA"..."c_Country_name":"United States"}}
Response: {"columns" : [ ], "data" : [ ]}
(Not sure if that is the intended behavior, but it looks like less data is transmitted via the non-Neo4j::Transaction technique - quite possibly I am doing something incorrectly.)
Some other ideas I had:
* Post-process into a CSV, SCP it up, and then use the neo4j-import command-line utility (although that seems kinda hacky).
* Combine both of the techniques I tried above.
Has anyone else run into this / have other suggestions?
Ok!
So you're absolutely right. With neo4j-core you can only send one query at a time. With transactions all you're really getting is the ability to rollback. Neo4j does have a nice HTTP JSON API for transactions which allows you to send multiple Cypher requests in the same HTTP request, but neo4j-core doesn't currently support that (I'm working on a refactor for the next major version which will allow this). So there are a number of options:
You can submit your requests via raw HTTP JSON to the APIs (see the sketch after this list). If you still want to use the Query API, you can use the to_cypher and merge_params methods to get the Cypher and params for that (merge_params is a private method currently, so you'd need send(:merge_params)).
You can load via CSV as you said. You can either
use the neo4j-import command which allows you to import very fast but requires you to put your CSV in a specific format, requires that you be creating a DB from scratch, and requires that you create indexes/constraints after the fact
use the LOAD CSV command which isn't as fast, but is still pretty fast.
You can use the neo4apis gem to build a DSL to import your data. The gem will create Cypher queries under the covers and will batch them for performance. See examples of the gem in use via neo4apis-twitter and neo4apis-github
If you are a bit more adventurous, you can use the new Cypher API in neo4j-core via the new_cypher_api branch on the GitHub repo. The README in that branch has some documentation on the API, but also feel free to drop by our Gitter chat room if you have questions on this or anything else.
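For the first option, here is a rough sketch (not part of neo4j-core's API) of batching several statements into one request against Neo4j's transactional HTTP endpoint, matching the /db/data/transaction URLs from the question; people stands in for your relational rows:

require 'net/http'
require 'json'
require 'uri'

uri = URI('http://localhost:7474/db/data/transaction/commit')

# One statement per row; all of them travel in a single HTTP request.
statements = people.map do |person|
  {
    statement: 'MERGE (m:Person {token: {token}})',
    parameters: { token: person[:token] }
  }
end

request = Net::HTTP::Post.new(uri)
request['Content-Type'] = 'application/json'
request['Accept'] = 'application/json'
request.body = { statements: statements }.to_json

response = Net::HTTP.start(uri.hostname, uri.port) { |http| http.request(request) }
puts JSON.parse(response.body)['errors']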
If you're implementing a solution which is going to make queries like the ones above, where you have multiple MERGE clauses, you'll probably want to profile your queries to make sure that you are avoiding the Eager operator (that post is a bit old, and newer versions of Neo4j have alleviated some of the need for care, but you can still look for Eager in your PROFILE output).
Also worth a look: Max De Marzi's post on Scaling Cypher Writes

How do I retrieve step count data from Google Fitness REST api?

Since I installed the Google Fit app on my Nexus 5 it has been tracking my step count and time spent walking. I'd like to retrieve this info via the Google Fitness REST api (docs) but I can't work out how to get any of that data from the REST api.
I've used the OAuth 2.0 playground to successfully list dataSources, but none of the examples I have tried have returned any fitness data whatsoever. I feel like I need something similar to a DataReadRequest from the Android SDK, but I'm not building an Android app -- I just want to access fitness data already stored by the Google Fit app.
Is it even possible to get the data gathered by the Google Fit app? If so, how can I read and aggregate step count data using the REST api?
It turns out that the answer is in the docs after all. Here is the format of the request.
GET https://www.googleapis.com/fitness/v1/users/{userId}/dataSources/{dataSourceId}/datasets/{datasetId}
The only supported {userId} value is me (with authentication).
Possible values for {dataSourceId} are available by running a different request.
The bit I missed was that {datasetId} is not really an ID, but is actually where you define the timespan you are interested in. The format for that variable is {startTime}-{endTime}, where the times are in nanoseconds since the epoch.
I was able to get this working by going through the Google PHP client and noticed that they append their start and finish times for the GET request with extra 0's - nine, in fact.
Use the same GET request format as mentioned in an answer above:
https://www.googleapis.com/fitness/v1/users/{userId}/dataSources/{dataSourceId}/datasets/{datasetId}
Now here is an example with the unix timestamp (php's time() function uses this)
https://www.googleapis.com/fitness/v1/users/me/dataSources/derived:com.google.step_count.delta:com.google.android.gms:estimated_steps/datasets/1470475368-1471080168
This is the response I get:
{
  "minStartTimeNs": "1470475368",
  "maxEndTimeNs": "1471080168",
  "dataSourceId": "derived:com.google.step_count.delta:com.google.android.gms:estimated_steps"
}
However, if you append nine 0's to the start and finish times in your GET request and shape it like this:
https://www.googleapis.com/fitness/v1/users/me/dataSources/derived:com.google.step_count.delta:com.google.android.gms:estimated_steps/datasets/1470475368000000000-1471080168000000000
It worked - this is the response I got:
{
  "minStartTimeNs": "1470475368000000000",
  "maxEndTimeNs": "1471080168000000000",
  "dataSourceId": "derived:com.google.step_count.delta:com.google.android.gms:estimated_steps",
  "point": [
    {
      "modifiedTimeMillis": "1470804762704",
      "startTimeNanos": "1470801347560000000",
      "endTimeNanos": "1470801347567000000",
      "value": [
        {
          "intVal": -3
        }
      ],
      "dataTypeName": "com.google.step_count.delta",
      "originDataSourceId": "raw:com.google.step_count.delta:com.dsi.ant.plugins.antplus:AntPlus.0.124"
    },
The response is a lot longer but I truncated it for the sake of this post. So when passing your datasets parameter into the request:
1470475368-1471080168 will not work, but 1470475368000000000-1471080168000000000 will.
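In PHP terms (a small sketch of the same trick, not from the original post), that means appending nine zeros to unix-second timestamps from time():

$end   = time();                 // now, in seconds
$start = $end - 7 * 24 * 3600;   // one week ago, in seconds

// Nanoseconds since the epoch = seconds followed by nine zeros.
$datasetId = $start . '000000000-' . $end . '000000000';

$url = 'https://www.googleapis.com/fitness/v1/users/me/dataSources/'
     . 'derived:com.google.step_count.delta:com.google.android.gms:estimated_steps'
     . '/datasets/' . $datasetId;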
This did the trick for me, hope it helps someone!
I tried the POST method with the URL and body below. This works; please check the inline comments too.
Use URL: https://www.googleapis.com/fitness/v1/users/me/dataset:aggregate
Method: POST
Body:
{
  "aggregateBy": [{
    "dataTypeName": "com.google.step_count.delta",
    "dataSourceId": "derived:com.google.step_count.delta:com.google.android.gms:estimated_steps"
  }],
  "bucketByTime": { "durationMillis": 86400000 }, // 24 hours
  "startTimeMillis": 1504137600000, // start time
  "endTimeMillis": 1504310400000    // end time
}
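If it helps, here is a sketch of sending that aggregate request with PHP's cURL (my own example, not from the answer above); $accessToken is assumed to be an OAuth2 token with a fitness read scope:

$body = json_encode([
    'aggregateBy' => [[
        'dataTypeName' => 'com.google.step_count.delta',
        'dataSourceId' => 'derived:com.google.step_count.delta:com.google.android.gms:estimated_steps',
    ]],
    'bucketByTime' => ['durationMillis' => 86400000], // daily buckets
    'startTimeMillis' => 1504137600000,
    'endTimeMillis' => 1504310400000,
]);

$ch = curl_init('https://www.googleapis.com/fitness/v1/users/me/dataset:aggregate');
curl_setopt_array($ch, [
    CURLOPT_POST => true,
    CURLOPT_POSTFIELDS => $body,
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_HTTPHEADER => [
        'Authorization: Bearer ' . $accessToken,
        'Content-Type: application/json',
    ],
]);
echo curl_exec($ch);
curl_close($ch);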
