Mixpanel API web segmentation and personalisation

Hi, I am interested in using Mixpanel on a website to track customer events. I would like to know if there is any way to use the API to personalise the website per customer, similar to segmentation for emails.
I would like to query the API for a single customer, asking whether they have completed several events.
For example, something like:
If the customer has clicked out and their last visit was more than a month ago, display a banner advert.

Mixpanel does not seem like the right tool for the job you describe here.
While this might theoretically be possible (via Mixpanel's HTTP API), it would create unnecessary architectural complexity and add extra latency. If you need to personalise your website per user, store the relevant user state in a database such as MySQL or PostgreSQL. That will be both faster and easier.
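To make the database approach concrete, here is a minimal sketch, assuming a hypothetical user_state table with clicked_out and last_visit columns; SQLite is used for brevity, but the same idea applies to MySQL or PostgreSQL.

```python
# Minimal sketch: decide per customer whether to show the banner, based on
# state you record yourself rather than querying Mixpanel. The user_state
# table and its columns (clicked_out, last_visit) are hypothetical.
import sqlite3
from datetime import datetime, timedelta


def should_show_banner(conn: sqlite3.Connection, customer_id: int) -> bool:
    row = conn.execute(
        "SELECT clicked_out, last_visit FROM user_state WHERE customer_id = ?",
        (customer_id,),
    ).fetchone()
    if row is None:
        return False

    clicked_out, last_visit = row
    # Show the banner only if the customer has clicked out before and their
    # last visit was more than a month ago.
    return bool(clicked_out) and (
        datetime.fromisoformat(last_visit) < datetime.now() - timedelta(days=30)
    )
```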

Related

Add cart functionality to API or to Client?

I am creating an e-commerce API using the Django Rest Framework. The API will handle the following areas:
Databases
User Registration
Permissions
Orders/Payments
There's still one area that I'm not quite sure how to implement in my project: the cart functionality. Would it be better to implement it on the client side (e.g. React/Ember) or on the server side (i.e. the API)?
One scenario that confused me is when the user is logged in on different platforms (e.g. website and mobile app). I want the user to have the same cart across multiple platforms.
In that particular use case, if you want cart persistence then it must live on the backend. The reason is that you need a single source of truth: the phone app and the web app cannot talk to each other unless they have some sort of "common ground" between them.
That's where the API comes in. It allows both ends to stay in sync by acting as that single source of truth.
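As a rough sketch of what that can look like with Django REST Framework (since the API is already DRF-based): the CartItem model, the shop.Product reference, and the field names below are hypothetical, but the point is that every client reads and writes the cart through the same authenticated endpoint.

```python
# Sketch of a server-side cart keyed to the authenticated user, so the web
# and mobile clients always see the same cart. Model/field names are illustrative.
from django.conf import settings
from django.db import models
from rest_framework import permissions, serializers, viewsets


class CartItem(models.Model):
    # Each row ties a product and quantity to one user account.
    user = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE)
    product = models.ForeignKey("shop.Product", on_delete=models.CASCADE)
    quantity = models.PositiveIntegerField(default=1)


class CartItemSerializer(serializers.ModelSerializer):
    class Meta:
        model = CartItem
        fields = ["id", "product", "quantity"]


class CartViewSet(viewsets.ModelViewSet):
    # Both the website and the mobile app call this same endpoint, which keeps
    # the API as the single source of truth for the cart.
    serializer_class = CartItemSerializer
    permission_classes = [permissions.IsAuthenticated]

    def get_queryset(self):
        return CartItem.objects.filter(user=self.request.user)

    def perform_create(self, serializer):
        serializer.save(user=self.request.user)
```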

Google Analytics - filtering out the traffic created by my bot

I built a website testing product that can be employed by QA and other teams. It is essentially a Chrome-based robot that runs all kinds of queries against a target website.
I would like to give my customers an option to filter out the bot traffic from their Google Analytics using GA filters, or simply filter it out by default.
Restrictions:
No changes on the tested website required
It must be a product-wide solution (no exceptions for specific customers)
It should not alter any typical parameters (user-agent, screen size, java support, referral, ...), as these can be altered by the user in my product
It shouldn't require any Chrome plugins (such as GA opt-out)
Filtering cannot be IP based, as the product can be run from anywhere at any time
I would like to avoid blocking GA objects from loading in the browser, as this can skew the test data
Ideally this would be implemented using something like a custom X- HTTP header or a custom cookie, but I have found no way to filter GA data based on those.
Any ideas on how to approach this?

Building A Social Network

So, I'm starting out building a social network web app. I'm looking into how to fit the parts of my stack together and I'm looking for some guidance about what various frameworks will allow me to do. My current stack idea is to have:
Firebase JSON API: serving user, post, comment, and all the other data
EmberFire: to plug that API into EmberJS
EmberJS: my front-end MVC (because I'm new to MVC and Ember seems the most accessible)
What I'm stumbling on at the moment is how I'm going to implement users with this stack. I've looked at basic authentication, but I haven't found anything that would let me permit certain actions and views for some users and not others - the basics of a social network, really.
Is it sensible to be doing this in front-end MVC? If so, what should I be using for authentication/personalisation? If not, should I just go with a PHP/SQL setup? I'd rather avoid that because my skills are all front-end.
If you are just getting started, Firebase is a great service to learn on due to their 'back end as a service' model - you will spend more time building/modeling your data and less time running/installing. Not that you won't want to learn more about that later, but it lets you focus on one piece at a time.
From an access perspective, JS/NoSQL vs PHP/MySQL isn't going to be the issue. They each have their own security requirements - it's more that PHP/MySQL has had more time to establish those rules. Additionally, Firebase being a hosted service has its own set of requirements.
Firebase security rules are a little weird when you first look at them, but they begin to make some sense after you sit with them for a bit. The Firebase docs are actually a pretty great resource. https://www.firebase.com/docs/security/
Basically, if you use a Firebase 'login provider', it makes Firebase act as both a database and an authentication manager, and the combination helps keep users 'fenced' to where you want them. You can use data from other paths, variables, validation rules, etc. You can even make a 'custom login provider' if you need to integrate with an existing one.
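As a rough illustration of that fencing, security rules along these lines restrict writes under a hypothetical posts path to the authenticated author (the path and field names are made up for the example, not taken from the question):

```
{
  "rules": {
    "posts": {
      "$postId": {
        ".read": true,
        ".write": "auth != null && newData.child('author').val() === auth.uid"
      }
    }
  }
}
```

Here $postId matches any child key, anyone may read a post, and a write only succeeds when the logged-in user's uid matches the author field being written.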
Finally, on the client end, your view can respond to whatever Firebase returns - if a user does 'hack' their way to a page they shouldn't be on client-side, no data is returned anyway, and no submitted information would be accepted because of the rules.

Website information update system?

I have developed a student portal website for my college using Joomla 2.5 and now I want some mechanism to regularly update information on it.
My problem is that there are many societies in my college that organize events frequently, and it is next to impossible to get their information in time to be updated on the site.
Is there some way those people can independently upload their events to the site without the administrator's involvement, and without messing with other parts of the back end?
The whole point of a CMS is to make things like this easier. As #emmanuel points out, this is why there are extensions: you should use a calendaring extension.
In my experience, one of the simplest things you can do, if most people on your campus have Google accounts, is to create a shared Google calendar and give create access to a representative of each club. Then embed that calendar on your site with one of the extensions for that. That way you don't have to deal with accounts on your site at all. There are a lot of ways to make it more complicated (like letting each club have its own calendar and then building a master calendar), but I think that could end up being more of a headache.
The biggest problem with calendars is getting people to list their events, because it is work for them. Sites with big empty calendars don't look very good. So you may want to make sure you have some events by finding out if there are some repeating events that you can set up.
You could try the JEvents component: http://www.jevents.net/
You could grant permissions to your sub-admin users so they can add/edit/delete their events from the front end without being given access to the back end of your site.

User-Generated Content View Validation

I am developing a user-generated content site. The goal is that users are rewarded if their content is viewed by a certain number of people. Whereas a user account is required to post content, an account is not required to view content.
I am currently developing the algorithm to count the number of valid views, and I am concerned about the possibility that users create bots to falsely inflate their view counts. I would exclude views from the content creator's IP, but I do not want to exclude valid views from other users with the same external IP address. The same external IP address could in fact account for a large number of valid views on a college campus or in a corporate setting.
The site is implemented in Python and hosted on Apache servers. The question is more theoretical in nature: how can I establish whether or not traffic from the same IP is legitimate? I can't find any content management systems that do this, so I was just going to implement it myself.
You cannot reliably do this. Any method you create can be automated.
That said, you can raise the bar. For instance every page viewed can have a random number encoded into a piece of JavaScript that will submit an AJAX request. Any view where you have that corresponding AJAX request is probably a real browser, and is likely to be a real human since few bots handle JavaScript correctly. But absolutely nothing stops someone from having an automatic script to drive a real browser.
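A minimal sketch of that token idea in Python (the storage is an in-memory set purely for illustration; a real site would persist tokens in something like Redis or a database, and hook these functions into its web framework's page-render and AJAX endpoints):

```python
# Sketch of per-view tokens: embed a token in the rendered page, have the
# page's JavaScript echo it back via AJAX, and count the view only if the
# token was actually issued. Storage here is in-memory for illustration only.
import secrets

_pending_tokens: set[str] = set()


def issue_view_token() -> str:
    # Called when rendering the content page; embed the token in the page's JS.
    token = secrets.token_urlsafe(16)
    _pending_tokens.add(token)
    return token


def confirm_view(token: str) -> bool:
    # Called from the AJAX endpoint; each issued token counts at most once.
    if token in _pending_tokens:
        _pending_tokens.discard(token)
        return True
    return False
```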
Well... you can make them log in (through Facebook or Google ID, etc., if you don't want to create your own infrastructure). This way it is much easier to track views.