I'm new to Mixpanel and I'm trying to use the JQL endpoint of the API to retrieve a daily feed of users who have been placed into experiments. My JQL is as follows:
function main() {
  return Events({
    from_date: "2022-04-01",
    to_date: "2022-04-01",
  })
  .filter(function(event) {
    return event.name == "$experiment_started";
  })
  .groupBy(["properties.experiment_name"], mixpanel.reducer.count());
}
This returns an empty array despite there being events for the date in question, and it returns figures when I remove the groupBy, so I know the problem must be the property name I'm using. The trouble is, I have tried almost every way of writing "Experiment Name" that I can think of: prefixing with $, upper case, lower case, using an underscore, you name it, all to no avail. The exact name in the UI is "Experiment name".
Can someone confirm which special way of writing "Experiment Name" will work, and how I can figure this out for myself in the UI so I don't have to break the hourly rate limit through trial and error?
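In the meantime, the workaround I'm planning to try is mapping out the raw properties objects so I can read the exact key name off a real event. This is an untested sketch, just swapping the groupBy for a map:
function main() {
  return Events({
    from_date: "2022-04-01",
    to_date: "2022-04-01",
  })
  .filter(function(event) {
    return event.name == "$experiment_started";
  })
  // Return the raw properties objects so the exact key names are visible,
  // rather than the display names shown in the UI.
  .map(function(event) {
    return event.properties;
  });
}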
After looking at some SO questions and issues on the RethinkDB GitHub, I couldn't come to a clear conclusion: is an atomic upsert possible?
Essentially I would like to perform the same operation as ZINCRBY using Redis.
If member does not exist in the sorted set, it is added with increment
as its score (as if its previous score was 0.0). If key does not
exist, a new sorted set with the specified member as its sole member
is created.
The current implementation appears to differ from almost all databases that I have used: the data is replaced or inserted, not updated. This is a simple use case, like updating the last visit, the number of clicks, or a product quantity, so I must be missing something obvious, because I cannot see a simple way to do it.
Yes, it is possible. After a get on the key, perform an atomic replace. Something like this might work:
function set_or_increment_score(player, points) {
  return r.table('scores').get(player).replace(
    // The object literal is wrapped in parentheses so the arrow function
    // returns it instead of treating the braces as a function body.
    row => ({
      id: player,
      score: r.branch(
        row.eq(null),                 // document doesn't exist yet
        points,                       // so insert it with the initial score
        row('score').add(points))     // otherwise increment the existing score
    }));
}
It has the following behaviour:
> set_or_increment_score("alice", 1).run(conn)
{ inserted: 1 }
> set_or_increment_score("alice", 2).run(conn)
{ replaced: 1 }
It works because get returns null when the document doesn't exist, and a replace on a non-existing document turns into an insert. See the documentation for replace.
So I ended up using the following code to work around the lack of an update-on-conflict:
r.db("test").table("t").insert(
{id:"A", type:"player", species:"warrior", score:0, xp:0, armor:0},
{conflict: function(id, oldDoc, newDoc) {
return newDoc.merge(oldDoc).merge(
{armor: oldDoc("armor").add(1)});
}
}
)
Do you think this is more readable/elegant or do you see any issues with the code compared to your sample?
I found an answer for finding all documents in a table with missing fields in this SO thread: RethinkDB - Find documents with missing field. However, I want to filter according to a missing field AND a certain value in a different field.
I want to return all documents that are missing the email field and whose isCurrent value is 1. In other words, I want all current clients who are missing the email field, so that I can add it.
The documentation on RethinkDB's site does not cover this case.
Here's my best attempt:
r.db('client').table('basic_info').filter(function (row) {
return row.hasFields({email: true }).not(),
/*no idea how to add another criteria here (such as .filter({isCurrent:1})*/
}).filter
Actually, you can do it in one filter, and it will also be faster than your current solution:
r.db('client').table('basic_info').filter(function (row) {
  return row.hasFields({email: true}).not()
    .and(row.hasFields({isCurrent: true}))
    .and(row("isCurrent").eq(1));
})
or:
r.db('client').table('basic_info').filter(function (row) {
  return row.hasFields({email: true}).not()
    .and(row("isCurrent").default(0).eq(1));
})
I just realized I can chain multiple .filter commands.
Here's what worked for me:
r.db('client').table('basic_info').filter(function (row) {
  return row.hasFields({email: true}).not();
}).filter({isCurrent: 1});
My next quest: put all of these into an array and then feed the email addresses in batch
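For anyone curious, something along these lines is what I have in mind for that batch step. It's a rough, untested sketch: conn is my open connection and emailFor() is a placeholder for wherever the addresses actually come from.
// Pull the matching docs into an array, then update each one with an email
// looked up from an external source (emailFor is a placeholder).
r.db('client').table('basic_info').filter(function (row) {
  return row.hasFields({email: true}).not();
}).filter({isCurrent: 1})
  .pluck('id')
  .coerceTo('array')
  .run(conn, function (err, clients) {
    if (err) throw err;
    clients.forEach(function (client) {
      r.db('client').table('basic_info').get(client.id)
        .update({email: emailFor(client.id)})
        .run(conn, function (err, result) { if (err) throw err; });
    });
  });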
I need some help with the infrastructure for storing business hours for a location on Parse.com. I already tried a separate class called BusinessHours, where each row has a pointer to the Location class. With a minimum of 7 rows (one per day of the week) for each location, the object count comes to over 10,000.
Then, in Swift, I do this to determine whether the location is open now:
for hour in hours {
    if hour.isClosedAllDay {
        isOpen = "closed".localized
    } else {
        let now = NSDate()
        if now.hasDayOffset(hour.weekday, closeWeekDay: hour.nextWeekday) {
            if hour.open != nil && hour.close != nil {
                let open = now.hourDateFromString(hour.open!, offset: now.dayOpenOffset(hour.weekday, closeWeekDay: hour.nextWeekday))
                let close = now.hourDateFromString(hour.close!, offset: now.dayCloseOffset(hour.weekday, closeWeekDay: hour.nextWeekday))
                if now.isBetween(open, close: close) {
                    isOpen = "open".localized
                    timeOfBusiness = hour.time!
                    break
                }
            }
        }
    }
}
Is there a better way to do this than having thousands of rows just for business hours? I was thinking of adding an object field to the Location class for the hours, but I don't know if that is the right way to go either.
Depending on how you want to edit and change the details, and the complexities of multiple opening times per day, I'd consider not using multiple columns and rows. Instead, you could simply store a JSON string in a single column which contains all of the required details.
Obviously you wouldn't be able to use this for querying, so if you need to do that then you need to keep something more like your current solution.
If you don't need querying, or you only need simple querying like 'is it open at all on a Monday', then a combined solution, supported by Cloud Code so the app doesn't need detailed knowledge of the JSON, could work well. For instance, you could have columns for general open hours each day and then the details in JSON, so you can get a rough answer by querying and then check the exact detail before presenting or using the result.
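To give a rough idea of what that Cloud Code could look like, here is a purely illustrative sketch. It assumes the hours end up stored as an array of per-day objects in a businessHours field on the Location class, with a weekday numbering of 1 to 7; the function name, field shape, and weekday convention are all assumptions, and overnight ranges that cross midnight aren't handled.
// Hypothetical Cloud Code: the client asks whether a Location is open right
// now without needing to understand the stored hours structure.
Parse.Cloud.define("isOpenNow", function(request, response) {
  var query = new Parse.Query("Location");
  query.get(request.params.locationId, {
    success: function(location) {
      var hours = location.get("businessHours") || [];
      var now = new Date();
      // Assumed convention: 1 = Monday ... 7 = Sunday.
      var weekday = now.getUTCDay() === 0 ? 7 : now.getUTCDay();
      var today = hours.filter(function(h) { return h.weekday === weekday; })[0];
      if (!today || today.isClosedAllDay) {
        return response.success(false);
      }
      // open/close are assumed to be zero-padded "HH:mmZ" strings, so a
      // lexicographic comparison against the current UTC time works.
      var hhmm = ("0" + now.getUTCHours()).slice(-2) + ":" +
                 ("0" + now.getUTCMinutes()).slice(-2) + "Z";
      response.success(hhmm >= today.open && hhmm < today.close);
    },
    error: function(error) {
      response.error(error);
    }
  });
});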
I ended up doing it like this in an array field called businessHours in my Location class:
[
{"close":"20:00Z","open":"12:00Z","time":"09:00 - 17:00","isClosedAllDay":false,"nextWeekday":1,"weekday":1},
{"close":"20:00Z","open":"12:00Z","time":"09:00 - 17:00","isClosedAllDay":false,"nextWeekday":2,"weekday":2},
{"close":"20:00Z","open":"12:00Z","time":"09:00 - 17:00","isClosedAllDay":false,"nextWeekday":3,"weekday":3},
{"close":"20:00Z","open":"12:00Z","time":"09:00 - 17:00","isClosedAllDay":false,"nextWeekday":4,"weekday":4},
{"close":"20:00Z","open":"12:00Z","time":"09:00 - 17:00","isClosedAllDay":false,"nextWeekday":5,"weekday":5},
{"close":"20:00Z","open":"12:00Z","time":"09:00 - 17:00","isClosedAllDay":false,"nextWeekday":6,"weekday":6},
{"close":"20:00Z","open":"12:00Z","time":"09:00 - 17:00","isClosedAllDay":false,"nextWeekday":7,"weekday":7}
]
and then looping through the objects as NSDictionary instances.
thanks Wain!
This has me stumped and I'm pulling my hair out here.
The simple query below finds speeches for the current user:
var _debug = function(cb) {
  console.log('_debug');
  var DebugParseObject = Parse.Object.extend("Speech");
  var debugQuery = new Parse.Query(DebugParseObject);
  debugQuery.equalTo("user", _getCurrentUser()); // Incorrect results only occur when I set the user with this line
  debugQuery.find({
    success: function(results) {
      console.log("Successfully retrieved " + results.length + " scores.");
      cb(results);
    },
    error: function(error) {
      console.log("Error: " + error.code + " " + error.message);
    }
  });
};
The Speech object class has the following extra columns:
title
body
speech_id
user (pointer)
Here is the weird part: the query will only return speeches whose body is a string of less than about 1000 characters.
For example, Speech A with a 500-character string in the body field will be returned as one of the speeches, BUT if I increase Speech A's body string to about 1500 characters, it will NOT be returned any longer.
I can't understand why.
Some further points:
It only happens when I filter by the user. If I search for all speeches or query by a different parameter (e.g. title), then the correct number is returned
This worked fine yesterday and before
I manually deleted a user earlier (removed the row from the table) while their linked speeches still existed
I changed those speeches' user value from the deleted user's id to a new user's id
The speeches appear to have the correct user
I tried re-saving the user object on the speech's user property and it didn't do anything
Any help would be great! I feel like I corrupted the user class when I deleted the user row, but I can't prove it.
The query syntax looks solid and you should be well within the storage limitations of Parse. In case you're curious, there's no explicit limit on string length, but Parse Objects are limited to 128k (except for Parse Files of course).
My guess is that something went awry when you copied a different user in place of the one you deleted. Manually changing data and pointers in the data browser is always risky and prone to errors.
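If you want to rule out the cached current-user object itself, one thing to try (just a sketch; the objectId string below is a placeholder) is querying against a bare pointer built from the user's objectId:
// Debugging sketch: query by a pointer built from the raw objectId instead
// of the cached current-user object. The id string is a placeholder.
var Speech = Parse.Object.extend("Speech");
var query = new Parse.Query(Speech);
var userPointer = new Parse.User();
userPointer.id = "REPLACE_WITH_USER_OBJECT_ID";
query.equalTo("user", userPointer);
query.find({
  success: function(results) {
    console.log("Found " + results.length + " speeches for that pointer.");
  },
  error: function(error) {
    console.log("Error: " + error.code + " " + error.message);
  }
});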
I have a simple document structure named Order with the fields id, name, userId and timeScheduled.
What I would like to do is create a view where I can find the document.id for those whose userId is some value and timeScheduled is after a given date.
My view:
"by_users_after_time": {
"map": "function(doc) { if (doc.userId && doc.timeScheduled) {
emit([doc.timeScheduled, doc.userId], doc._id); }}"
}
If I do
localhost:5984/orders/_design/Order/_view/by_users_after_time?startKey="[2012-01-01T11:40:52.280Z,f98ba9a518650a6c15c566fc6f00c157]"
I get every result back. Is there a way to access key[1] to do an if doc.userId == key[1], or something along those lines, and simply emit on the time?
This would be the SQL equivalent of:
select id from Order where userId = "f98ba9a518650a6c15c566fc6f00c157" and timeScheduled > 2012-01-01T11:40:52.280Z;
I did quite a few Google searches but I can't seem to find a good tutorial on working with multiple keys. It's also possible that my approach is entirely flawed, so any guidance would be appreciated.
You only need to reverse the key, because the userId is known:
function (doc) {
  if (doc.userId && doc.timeScheduled) {
    emit([doc.userId, doc.timeScheduled], 1);
  }
}
Then query with:
?startkey=["f98ba9a518650a6c15c566fc6f00c157","2012-01-01T11:40:52.280Z"]
NOTES:
the query parameter is startkey, not startKey;
the value of startkey is an array, not a string, so the double quotes go around the userId and date values, not around the array;
I emit 1 as the value, instead of doc._id, to save disk space. Every row of the result already has an id field with the doc._id, so there's no need to repeat it;
don't forget to set endkey=["f98ba9a518650a6c15c566fc6f00c157",{}], otherwise you get the data of all users > "f98ba9a518650a6c15c566fc6f00c157".
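Putting it together, the full request for your view would look something like this (same database and design document as in your question; the brackets and quotes need to be URL-encoded when the request is actually sent):
localhost:5984/orders/_design/Order/_view/by_users_after_time?startkey=["f98ba9a518650a6c15c566fc6f00c157","2012-01-01T11:40:52.280Z"]&endkey=["f98ba9a518650a6c15c566fc6f00c157",{}]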
The answer actually came from the couchdb mailing list:
Essentially, Date.parse() doesn't like the +0000 on the timestamps. By doing a substring and removing the +0000, everything worked.
For the record:
document.write(new Date("2012-02-13T16:18:19.565+0000"));   // Outputs Invalid Date
document.write(Date.parse("2012-02-13T16:18:19.565+0000")); // Outputs NaN
But if you remove the +0000, both lines of code work perfectly.
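For reference, the workaround amounts to stripping the offset before parsing, along these lines (a sketch that assumes every timestamp ends in a fixed "+0000" suffix, as in the examples above):
var raw = "2012-02-13T16:18:19.565+0000";
// Drop the trailing "+0000" (the last five characters) before parsing.
var trimmed = raw.substring(0, raw.length - 5);
document.write(new Date(trimmed));    // now parses to a valid Date
document.write(Date.parse(trimmed));  // now returns a timestamp instead of NaN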