How do you do math operations on r.now() - rethinkdb

I am implementing an Express session store with RethinkDB, and I have an 'expires' field that is calculated like so: r.now() + 24 * 60 * 60 * 1000 (1 day from now).
Can I do something like this?
r.now().add(millisecondsToAdd)
There is no API documentation for this.
It will also be useful for querying.
Note: I am using the official JavaScript driver.

You can do that:
r.now().add(24 * 60 * 60 * 1000)
However, add() works in seconds, not milliseconds. So to add one more day, it is:
r.now().add(24 * 60 * 60)
When you browse the API, the docs for add say this about time: https://www.rethinkdb.com/api/ruby/add/
time.add(number[, number, ...]) → time
sub works similarly to add, by the way.
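A minimal sketch of how this could look with the official JavaScript driver, assuming a hypothetical 'sessions' table and an already-open connection conn:

const r = require('rethinkdb');

// Store an expiry one day from now; note that add() takes seconds, not milliseconds.
r.table('sessions').insert({
  sid: 'some-session-id',
  expires: r.now().add(24 * 60 * 60) // 1 day
}).run(conn);

// The same expression works for querying, e.g. only sessions that have not expired yet:
r.table('sessions').filter(r.row('expires').gt(r.now())).run(conn);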

Related

Check cron syntax to execute now

I do heavy automation on servers.
Biggest problem is the time-based execution of automation things.
The idea is to use a cron-syntax-style to time the executions.
So I need a way to check whether a command that is paired with a cron syntax string can be executed now.
Things like:
./parser.sh 0 0 * * *
will only return OK at midnight, not during any other minute of the day.
Also
./parser.sh */10 0,1,2,3,4-22/4 * * *
and all combinations possible in cron syntax need to work.
There will be several executions per day, every execution has different syntax.
Is there any kind of stuff that can do this?
It is not possible to actually create cron jobs for this.
I can only use Bash, maybe statically compiled binaries, no Python or higher-level languages.
I already tried https://github.com/morganhk/BashCronParse, but it can only interpret single numbers and */n, not things like 1,2,3,4,5 or combinations of them.
I can't quite understand your question. But if you are trying to run parser.sh every minute of the day, use this:
./parser.sh * * * * *
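For what it's worth, the matching logic the question asks for can be sketched briefly. This is an illustration only, in JavaScript (the asker is restricted to Bash and would need to port it), and it ignores corner cases such as the OR semantics between day-of-month and day-of-week:

// Check whether one cron field (e.g. "*/10", "0,1,2,3", "4-22/4", "*") matches a value.
function fieldMatches(field, value) {
  return field.split(',').some(function (part) {
    var pieces = part.split('/');
    var range = pieces[0];
    var step = pieces[1] ? parseInt(pieces[1], 10) : 1;
    var lo, hi;
    if (range === '*') {
      lo = 0; hi = Infinity;
    } else if (range.indexOf('-') !== -1) {
      lo = Number(range.split('-')[0]);
      hi = Number(range.split('-')[1]);
    } else {
      lo = Number(range);
      hi = pieces[1] ? Infinity : lo; // "5/2" means "from 5 upward in steps of 2"
    }
    return value >= lo && value <= hi && (value - lo) % step === 0;
  });
}

// Usage: node check.js "*/10 0,1,2,3,4-22/4 * * *"
var now = new Date();
var f = process.argv[2].split(/\s+/);
var ok = fieldMatches(f[0], now.getMinutes()) &&
         fieldMatches(f[1], now.getHours()) &&
         fieldMatches(f[2], now.getDate()) &&
         fieldMatches(f[3], now.getMonth() + 1) &&
         fieldMatches(f[4], now.getDay()); // 0 = Sunday
console.log(ok ? 'OK' : 'not now');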

JMeter Time Lapse

I need JMeter to simulate 28 days of activity (via altering time stamps) in under 1 day. There is a time function, but it doesn't look like there is one that supports anything other than the current time. I also don't want to fake the dates, because things like leap years and what month it is would make the result incorrect.
So how can I get a time format as a delta of the current time (in milliseconds from epoch), and/or what is the best way to run a 1 day load test as if time was going 28 times faster?
UPDATE:
Thanks Dmitri T! I was able to modify your answer to what I needed (I have to restrict events to occur between 8am and 5pm).
For those that need it, I used the following in a JSR223 PreProcessor
// Pick a random day within the last 28 days
var timestamp = new Date();
timestamp.setDate(timestamp.getDate() - Math.floor(Math.random() * 28));
// Snap to 8am, then add a random offset of up to 9 hours (the 8am-5pm window)
timestamp.setHours(8, 0, 0, 0);
timestamp.setTime(timestamp.getTime() + Math.floor(Math.random() * (1000 * 60 * 60 * 9)));
vars.put("TIMESTAMP", timestamp.toISOString());
You can simulate this using a combination of the __time() and __longSum() functions, like:
Now: ${__time(,)} - I guess you are already aware of it
Now + 28 days: ${__longSum(${__time(,)},2419200000,)} - where 2419200000 is 1000 ms * 60 s * 60 min * 24 h * 28 days
If Unix timestamps don't work for you for some reason, you can use the __javaScript() function to convert them to a human-readable format, like:
${__javaScript(new Date(${__longSum(${__time(,)},2419200000,)}),)}
References:
Functions and Variables
The Function Helper Dialog
How to Use JMeter Functions
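As for running a 1-day test as if time were going 28 times faster, a minimal sketch of the mapping in a JSR223 element (JavaScript), assuming a hypothetical TEST_START property holding the epoch millis recorded once when the test began, and SIM_TIMESTAMP as an arbitrary variable name:

// Map elapsed wall-clock test time onto simulated time running 28x faster.
var SPEEDUP = 28;
var testStart = parseInt(props.get('TEST_START'), 10); // assumed: stored once at test start
var elapsed = new Date().getTime() - testStart;        // real elapsed test time in ms
var simulated = new Date(testStart + elapsed * SPEEDUP);
vars.put('SIM_TIMESTAMP', simulated.toISOString());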

Virality algorithm for different type of objects

For a project I need to rank certain objects based on events on/with that specific object. But the objects to be ranked aren't alike.
Some background: the application is a social-network-like document-management system. There are a lot of users, who can upload/post 'documents' of various types (videos, external articles, e.g. found on a relevant blog, articles written within the system itself, etc.). But user-to-user messages should also appear in the feed, as well as system messages, etc.
To break it down a little, let's assume these three object types should appear in the news feed, ranked/sorted on virality, which is based on events:
Documents
System messages
User-to-user (or user-to-group) messages
A few parameters that are important for the ranking, per object:
Documents
Number of views
Number of comments
Number of shares
Affinity with the document (user has commented on it, shared it, etc.)
Correspondence of tags the user is enlisted to
System messages
Importance level (e.g. 'Notice', 'Announcement')
User/group messages
Level of engagement in the conversation
And to make it harder, the date the object was created is important, as well as the date and correlation of the occurring events. And to add one more thing to the complexity: pretty much everything is relative; e.g. the number of views a document needs in order to be defined as 'viral', and as such appear in the news feed, depends on the average number of views. The same goes for comments, but for comments the posting date and the time between new comments are important as well... (oh, and in case it wasn't clear, ranking is always relative to a user, not system-wide).
My first thought was to define a max score (Sm) for each object type, define when an object reaches the Sm, and calculate the actual score (Sa). E.g. system messages have an Sm of 100, user/group messages 80, and documents have an Sm of 60. This means that if one of each object is created at exactly the same time, and no other parameters (comments etc.) are available yet, the system message will be listed first, the user message will come next, and last, but not least, the document.
So for each type of object, I'm looking for a formula like:
S(a) = S(m) * {calculations here}
For the system message it isn't that hard, I guess, as it only has two parameters that affect the Sa (date and importance level). So its scoring formula could look like this (I is the numeric importance level):
S(a) = S(m) * I * (1 / (now() - date_posted()))
Let's assume a notice has I=10 and an announcement has I=20; the scores for a notice posted yesterday and an announcement posted 2 days ago would be:
Notice: S(a) = 100 * 10 * (1 / 1) = 1000
Announcement: S(a) = 100 * 20 * (1 / 2) = 1000
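As a sketch in code, that calculation could look like this (hypothetical JavaScript; the age is assumed to be expressed in whole days and to be greater than zero):

// Score a system message: max score * importance * decay by age in days.
function systemMessageScore(maxScore, importance, ageInDays) {
  return maxScore * importance * (1 / ageInDays);
}

systemMessageScore(100, 10, 1); // notice posted yesterday        -> 1000
systemMessageScore(100, 20, 2); // announcement posted 2 days ago -> 1000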
Now for the documents, and I'm really breaking my head on that one...
I've got the following parameters to take into account:
V(o) = number of views
V(a) = average number of views
C(o) = total number of comments
C(a) = average number of comments on this type of object
C(u) = number of comments by the user
SH(o) = total number of shares of this object
SH(a) = average number of shares of this type of object
SH(u) = has the user shared the document (1 = no, 2 = yes)
T = number of enlisted tags
I found a simplified example of how Facebook calculates 'virality' here. They use the following formula:
Rank = Affinity * Weight * Decay
And if I translate that to my use case, the affinity would be the outcome of a calculation on the parameters listed above, the weight would be the score max, altered a bit based on the total number of views and shares divided by the average number of views and shares, and the decay would be a complex calculation based on the correlation of the events fired and the date the object was created.
I'm giving it a try:
Affinity = C(u) * SH(u) * T * SH(u)
Weight = S(m) * (V(o) / V(a)) * (SH(o) / SH(a)) * (C(o) / C(a))
Decay = (1 / (now() - date_created())) * (1 / (now() - date_of_last_comment()))
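In code, that attempt would look roughly like this (a hypothetical sketch only: doc is an assumed object holding the counts listed above, both ages are in days and greater than zero, and SH(u) appears twice exactly as in the affinity formula above):

function documentRank(doc, maxScore) {
  // Affinity = C(u) * SH(u) * T * SH(u), as written above
  var affinity = doc.userComments * doc.userShared * doc.enlistedTags * doc.userShared;
  // Weight = S(m) * (V(o)/V(a)) * (SH(o)/SH(a)) * (C(o)/C(a))
  var weight = maxScore *
    (doc.views / doc.avgViews) *
    (doc.shares / doc.avgShares) *
    (doc.comments / doc.avgComments);
  // Decay based on object age and time since the last comment
  var decay = (1 / doc.ageInDays) * (1 / doc.daysSinceLastComment);
  return affinity * weight * decay;
}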
This will get me some kind of ranking, but it lacks a few things:
it has no relation whatsoever with the ranking of a system message, and thus sorting would be meaningless
the frequency of new comments isn't taken into account
So now I'm stuck...
To get to the point, my questions are:
Is this a good approach, or should I try something totally different?
If so, what direction should I go to?

Time interval (in ms) from BPM (Midi tempo)

Does anybody know the formula?
I tried the following:
1000 / ((BPM * 24) / 60)
but it doesn't seem correct.
I don't think my answer is MIDI-specific, but to convert beats-per-minute to ms-per-beat, would this work?
ms_per_beat = 1000 * 60 / bpm
In other words, I think you have an extra "24" in there.
It is simply:
Time of 1 beat in ms = 1000 * 60 / BPM = 60000 / BPM
It looks like your formula assumes data coming from a standard MIDI file, where tempo is expressed in terms of ticks and there are 24 ticks per quarter note. It's not giving you ms per beat; it's giving you ms per tick.
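A small sketch of both conversions, for reference (plain JavaScript, nothing MIDI-library-specific):

function msPerBeat(bpm) {
  return 60000 / bpm; // 60,000 ms per minute / beats per minute
}

function msPerTick(bpm, ticksPerBeat) {
  return msPerBeat(bpm) / ticksPerBeat; // e.g. 24 ticks per quarter note (MIDI clock)
}

msPerBeat(120);     // 500 ms per beat
msPerTick(120, 24); // ~20.83 ms per tick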
I wrote an article on converting BPM to ms, and I made an online app called a Delay Time Calculator that does just that, including giving you dotted and triplet note values.

Time series chart data every 5 seconds? Worried about performance

We have time series data for every 5 seconds.
You can imagine what the data volume will be for a single day.
We have a requirement to show a graph/chart for a whole month
(for example, a user selects a month from Jan 1st 2011 to Jan 31st 2011).
The data loaded might be too heavy, and it may degrade the application's performance or run it out of memory.
How can we handle this requirement?
The Flotr API uses this method to draw the charts, which requires the data to be already loaded:
summaryGraph: function (data, bounds) {
  var p = Flotr.draw(
    $('summaryGraph'),
    [data],
    {
    }
  )
}
Please suggest any inputs for this. Thank you.
OK, you have time series data collected every 5 seconds. That leads to about 60 * 60 * 24 * 30 / 5 = 518,400 observations. Why not bin the time series data together, i.e. use a higher aggregation than 5 seconds? I assume you do not actually need 5-second precision on your x axis, or you'd be talking about charts hundreds of thousands of pixels wide.
What is your actual question?
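A rough sketch of such binning on the client side, assuming samples is an array of [timestampMs, value] points like the ones passed to Flotr.draw above (hourly averages shown; the bucket size is arbitrary):

function binHourly(samples) {
  var HOUR = 60 * 60 * 1000;
  var bins = {};
  samples.forEach(function (point) {
    var bucket = Math.floor(point[0] / HOUR) * HOUR;
    (bins[bucket] = bins[bucket] || []).push(point[1]);
  });
  return Object.keys(bins).sort(function (a, b) { return a - b; }).map(function (bucket) {
    var values = bins[bucket];
    var avg = values.reduce(function (a, b) { return a + b; }, 0) / values.length;
    return [Number(bucket), avg];
  });
}

// A month of 5-second samples (~518,400 points) collapses to ~744 hourly points.
// Aggregating on the server before sending data to the browser would be even better.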
Clients don't compromise; I think he has to show the data. I think Dojo provides a good solution, using a component which will be responsible for pointing to the URL of the servlet.
Not sure, experts may add more to this.
