Google Analytics - filtering out the traffic created by my bot

I built a website-testing product that can be used by QA and other teams. It is essentially a Chrome-based robot that performs all kinds of queries against a target website.
I would like to give my customers an option to filter out the bot traffic from their Google Analytics using GA filters, or simply filter it out by default.
Restrictions:
No changes on the tested website required
It must be a product-wide solution (no exceptions for specific customers)
It should not alter any typical parameters (user agent, screen size, Java support, referral, ...), as these can be altered by the user in my product
It shouldn't require any Chrome plugins (such as GA opt-out)
Filtering cannot be IP based, as the product can be run from anywhere at any time
I would like to avoid blocking GA objects from loading in the browser, as this can skew the test data
Ideally this would be implemented using something like a custom X- HTTP header or a custom cookie, but I have found no way to filter GA data based on either.
Any ideas how to approach that?
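For what it's worth, the header half of that idea is straightforward on the bot side. Here is a minimal sketch using the Selenium 4 C# bindings and the Chrome DevTools Protocol (the header name is made up, and this only shows the tagging side; whether GA filters can ever act on such a header is exactly the open question):

using System.Collections.Generic;
using OpenQA.Selenium.Chrome;

// Tag every request the controlled browser makes with a custom header.
// The header name is hypothetical; this does not by itself solve the
// GA-filtering side of the question.
var driver = new ChromeDriver();
driver.ExecuteCdpCommand("Network.enable", new Dictionary<string, object>());
driver.ExecuteCdpCommand("Network.setExtraHTTPHeaders", new Dictionary<string, object>
{
    ["headers"] = new Dictionary<string, object> { ["X-Acme-Bot"] = "1" }
});
driver.Navigate().GoToUrl("https://example.com/");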

Related

Why can't some performance fields from the Google Ads API be used together?

The Ad Performance Report documentation states that some fields, such as ConversionCategoryName, are not compatible with certain other fields: they cannot be selected together with Impressions, Clicks, Cost, etc.
However, they can be shown together in the same table on the Google Ads platform.
It is not always possible to compare the results of the API with the results of the web application. Google's web applications have different permissions than the ones they give to developers.
If the documentation marks a field as incompatible with certain other fields, then you cannot select them together in the API; there is no workaround for that.

Mixpanel API web segmentation and personalisation

Hi, I am interested in using Mixpanel on a web site to track customer events. I would like to know if there is any way to use the API to personalise the web site per customer, similar to segmentation for emails.
I would like to query the API for a single customer, asking whether they have achieved several events.
For example, something like:
If the customer has clicked out and their last visit was more than a month ago, display a banner advert.
Mixpanel does not seem like the right tool for the job you describe here.
While this might theoretically be possible (via Mixpanel's HTTP API), it would create unnecessary architectural complexity and add extra latency. If you need to customize your web site per user, store the user state in a database like MySQL or PostgreSQL. This will be both faster and easier.
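For illustration, here is a minimal sketch of that database approach, assuming a hypothetical PostgreSQL table events(customer_id, event_name, occurred_at) and the Npgsql driver (all names are made up for the example):

using Npgsql;

// Hypothetical schema: events(customer_id, event_name, occurred_at).
// True when the customer has a 'clicked_out' event and their most recent
// event is more than a month old.
static bool ShouldShowBanner(string connectionString, long customerId)
{
    const string sql = @"
        SELECT BOOL_OR(event_name = 'clicked_out')
               AND MAX(occurred_at) < now() - interval '1 month'
        FROM events
        WHERE customer_id = @id;";

    using var conn = new NpgsqlConnection(connectionString);
    conn.Open();
    using var cmd = new NpgsqlCommand(sql, conn);
    cmd.Parameters.AddWithValue("id", customerId);

    // No events at all yields SQL NULL (DBNull here): treat it as false.
    return cmd.ExecuteScalar() is bool show && show;
}

A single aggregate query keeps the check to one round trip, which is the latency advantage the answer is pointing at.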

How to implement complex Web API queries in ASP Core

I'm new to web API design, so I've tried to learn best practices of web API design using these articles:
1. Microsoft REST API Guidelines
2. Web API Design: Crafting Interfaces that Developers Love, from Apigee
Apigee recommends that web API developers follow these guidelines to build better APIs.
I need C# code implementing these recommendations in my Web APIs (in ASP Core), which serve as the back-end for native mobile apps and an AngularJS web site. I quote two of the recommendations here:
Sweep complexity behind the ‘?’
Most APIs have intricacies beyond the base level of a resource. Complexities can include many states that can be updated, changed, queried, as well as the attributes associated with
a resource.
Make it simple for developers to use the base URL by putting optional states and attributes behind the HTTP question mark. To get all red dogs running in the park:
GET /dogs?color=red&state=running&location=park
Partial response allows you to give developers just the information they need.
Take for example a request for a tweet on the Twitter API. You'll get much more than a typical Twitter app often needs, including the name of the person, the text of the tweet, a timestamp, how often the message was re-tweeted, and a lot of metadata.
Let's look at how several leading APIs handle giving developers just what they need in
responses, including Google who pioneered the idea of partial response.
LinkedIn
/people:(id,first-name,last-name,industry)
This request on a person returns the ID, first name, last name, and the industry.
LinkedIn does partial selection using this terse :(...) syntax which isn't self-evident.
Plus it's difficult for a developer to reverse engineer the meaning using a search engine.
Facebook
/joe.smith/friends?fields=id,name,picture
Google
?fields=title,media:group(media:thumbnail)
Google and Facebook have a similar approach, which works well.
They each have an optional parameter called fields, after which you put the names of the fields you want to be returned.
As you see in this example, you can also put sub-objects in responses to pull in other information from additional resources.
Add optional fields in a comma-delimited list
The Google approach works extremely well.
Here's how to get just the information we need from our dogs API using this approach:
/dogs?fields=name,color,location
Now I need C# code that handles these kinds of queries, or even more complex ones like this:
api/books?publisher=Jat&writer=tom&location=LA&fields=title,ISBN&orderBy=location desc,writer&limit=25&offset=50
So web API users will be able to send any kind of request they want, with different complexities, fields, ordering, etc., based on their needs.
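There is no single built-in feature that parses this grammar for you, but a minimal in-memory sketch of the pattern might look like the following. The Book record and its data are hypothetical; real code would push the filtering, ordering, and paging down into the database (e.g. LINQ to Entities) instead of using reflection over a list:

using System.Reflection;
using Microsoft.AspNetCore.Mvc;

// Hypothetical resource; in a real app this comes from your data layer.
public record Book(string Title, string ISBN, string Publisher, string Writer, string Location);

[ApiController]
[Route("api/[controller]")]
public class BooksController : ControllerBase
{
    private static readonly List<Book> Books = new()
    {
        new("Book A", "111-1", "Jat", "tom", "LA"),
        new("Book B", "222-2", "Acme", "ann", "NY"),
    };

    // GET api/books?publisher=Jat&writer=tom&fields=title,ISBN&orderBy=location desc&limit=25&offset=50
    [HttpGet]
    public IActionResult Get(string? publisher, string? writer, string? location,
                             string? fields, string? orderBy,
                             int limit = 25, int offset = 0)
    {
        IEnumerable<Book> query = Books;

        // Optional filters: applied only when the caller supplies them.
        if (publisher != null) query = query.Where(b => b.Publisher == publisher);
        if (writer != null)    query = query.Where(b => b.Writer == writer);
        if (location != null)  query = query.Where(b => b.Location == location);

        // Sorting: "orderBy=location desc" -> property name plus optional direction.
        if (orderBy != null)
        {
            var parts = orderBy.Split(' ', StringSplitOptions.RemoveEmptyEntries);
            var prop = FindProperty(parts[0]);
            if (prop == null) return BadRequest($"Unknown sort field '{parts[0]}'.");
            bool desc = parts.Length > 1 && parts[1].Equals("desc", StringComparison.OrdinalIgnoreCase);
            query = desc ? query.OrderByDescending(b => prop.GetValue(b))
                         : query.OrderBy(b => prop.GetValue(b));
        }

        // Paging.
        query = query.Skip(offset).Take(limit);

        // Partial response: project only the requested fields.
        if (fields == null) return Ok(query.ToList());
        var props = fields.Split(',', StringSplitOptions.RemoveEmptyEntries | StringSplitOptions.TrimEntries)
                          .Select(FindProperty).ToList();
        if (props.Contains(null)) return BadRequest("Unknown field in 'fields'.");
        return Ok(query.Select(b => props.ToDictionary(p => p!.Name, p => p!.GetValue(b))).ToList());
    }

    private static PropertyInfo? FindProperty(string name) =>
        typeof(Book).GetProperty(name, BindingFlags.Public | BindingFlags.Instance | BindingFlags.IgnoreCase);
}

For production use, something like Microsoft's OData for ASP.NET Core implements this query grammar ($filter, $orderby, $select, $top, $skip) so you don't have to hand-roll the parsing.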

Google Analytics event tracking dependent on source of visit

I am looking to test different traffic patterns within Google Analytics (direct traffic is abnormally high). I was curious if anyone knows how to create an event that fires when source = wildcard. To make this more difficult, it would be set up within Google Tag Manager using Universal Analytics.
I see the six event tags, but none of them sounds like it would do what I need.
Thanks
Google Tag Manager is not a tracking tool and knows nothing about the traffic source, so no preconfigured macro could be used in a rule to fire tags depending on source.
If you use "classic" asynchronous Analytics, you can set up a macro that reads the _utmz cookie and a rule that checks if it contains a given source string ("direct", "cpc", etc.).
However, Universal Analytics determines the traffic source on the server and does not store it client-side, so with UA this will not work.
A few traffic sources are easily recognizable on the respective landing page:
If no referrer is present, it's a direct visit/bookmark.
If there are campaign (utm) parameters in the URL, you can use those.
If there is a gclid parameter in the URL, you know it is google/cpc.
If the referrer is a Google domain with a country TLD and the parameter "q" is present (it will be empty with encrypted search but should still be there), it's an organic Google search.
If the referrer is a Bing domain with the parameter "q" present, it's an organic Bing search (and similarly for other search engines).
However, this will only work on landing pages; you need to write your own cookie to store the source for subsequent pages (the checks themselves are sketched as code below).
You can refine this approach to give rather similar results to Google Analytics but it will never match perfectly.
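Translated into code, the landing-page checks read roughly like this. C# is used for the sketch (in GTM itself this would be a JavaScript macro that also writes the result into your own cookie); the function is a hypothetical helper, not a GA or GTM API:

using System;
using System.Web; // HttpUtility ships with .NET Core 2.0+ / .NET 5+

// Classify a landing-page hit from its URL and referrer, following the
// checks listed above. Persisting the result in a first-party cookie for
// subsequent pages is left out.
static string ClassifySource(Uri landingUrl, Uri? referrer)
{
    var q = HttpUtility.ParseQueryString(landingUrl.Query);

    if (q["utm_source"] != null)      // campaign (utm) parameters in the URL
        return $"{q["utm_source"]}/{q["utm_medium"] ?? "(not set)"}";
    if (q["gclid"] != null)           // AdWords auto-tagging
        return "google/cpc";
    if (referrer == null)             // no referrer: direct visit/bookmark
        return "(direct)";

    var refQ = HttpUtility.ParseQueryString(referrer.Query);
    if (referrer.Host.Contains("google.") && refQ["q"] != null)
        return "google/organic";      // "q" may be empty with encrypted search
    if (referrer.Host.Contains("bing.") && refQ["q"] != null)
        return "bing/organic";

    return referrer.Host + "/referral";
}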
One of the most common reasons for abnormally high direct traffic is that campaign parameters are missing from paid traffic, either because you forgot to enable autotagging in your AdWords campaigns or because you have redirects that strip out campaign parameters (so paid traffic is lumped together with direct). The above approach would not help you discover this, so I suggest you check it manually before you do anything else.

How to fill out AJAX form programmatically and scrape results?

Basically, I want to use the Facebook Ads Manager Tool to estimate the number of users targeted by a particular set of targeting parameters. I know there is a published API available, but it is only usable if you are on their advertising application "whitelist." I am sure what I am asking is possible. Plus, it would be interesting to learn more about scraping.
Facebook's Ads Manager Tool is basically an AJAX UI for their ads API. In the process of creating a campaign, you can specify targeting parameters, and the page will dynamically report the number of users targeted as you modify the parameters. From what I've read on the web and here on Stack Overflow, it is possible to use Firebug or a similar tool to pick apart which requests are being made by the page and to where, and then mimic these calls to get the information you want.
I'm having trouble interpreting the panels of Firebug. I think the URI I'm trying to send a request to is www.facebook.com/ajax/inventory_estimator.php, though I'm not sure how to form a call.
So, if I want to write a script or program that takes a list of words to use as keywords and returns the estimated number of users for each keyword, how could I do it?
Link to Facebook's Ads Manager Tool, Campaign Creation Page:
http://www.facebook.com/ads/create
Yes, using an extension like Firebug to examine the HTTP requests is a good way to do this.
The Net tab is the one you want (last one).
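Once the Net tab shows the request the page fires, replaying it from your own code is mostly a matter of copying the URL, method, form fields, and session cookies. A rough C# sketch of that replay step, using the endpoint guessed in the question (the field names and cookie value are placeholders you would copy out of Firebug):

using System;
using System.Collections.Generic;
using System.Net.Http;

// Replay the AJAX call observed in Firebug's Net tab. The endpoint is the
// one guessed in the question; field names and cookies are placeholders.
var client = new HttpClient();
client.DefaultRequestHeaders.Add("Cookie", "<session cookies copied from your browser>");

var form = new FormUrlEncodedContent(new Dictionary<string, string>
{
    ["keywords"] = "golf",   // placeholder parameter name
    ["country"]  = "US",     // placeholder parameter name
});

var response = await client.PostAsync("http://www.facebook.com/ajax/inventory_estimator.php", form);
Console.WriteLine(await response.Content.ReadAsStringAsync()); // parse the estimate out of this

Loop that over your keyword list to get an estimate per keyword. The fragile part is the exact parameter set, which Facebook can change at any time, so re-check the Net tab whenever the calls stop working.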
Have you tried the irobotsoft web scraper? It has good AJAX support.
Check their forum here: http://irobotsoft.org/bb/YaBB.pl
