How does the Piwik extension for Magento 2 get data into its database?

The tracker collects data on the web page it's included in and sends it to Piwik by calling the HTTP tracking API. I found this information, but I am unable to find the piece of code that does it. Can anybody please help me with this?

If you have integrated the Henhed-Piwik extension into Magento 2 and configured your Piwik credentials under Stores > Configuration > Sales > Piwik API (Piwik host URL, user ID tracking, etc.), then, assuming you have compiled and upgraded your Magento 2 installation, the extension automatically places the JavaScript tracking code on your Magento site. Whenever someone visits the site, it automatically tracks their actions, clicks, impressions, referrals, etc., asynchronously.
The roles of piwik.js and piwik.php are as follows:
The tracked pages declare a global array, var _paq = _paq || [];, and push commands such as _paq.push(['trackPageView']); onto it. These commands specify which Piwik features you want to track.
The loader code g.type='text/javascript'; g.async=true; g.defer=true; g.src=u+'piwik.js'; defines the script execution mode (asynchronous) and loads the piwik.js tracker file.
piwik.js then sends the collected data to piwik.php, the HTTP tracking API endpoint of your Piwik installation, which stores it in the MySQL database you configured when installing and setting up Piwik.
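For reference, here is a minimal sketch of the standard asynchronous Piwik tracking snippet that such an extension injects into the page (the host piwik.example.com and the site ID 1 are placeholders for whatever you configured):

var _paq = _paq || [];
_paq.push(['trackPageView']);       // record a page view
_paq.push(['enableLinkTracking']);  // track outlinks and downloads
(function () {
    var u = '//piwik.example.com/';                 // placeholder: your Piwik host
    _paq.push(['setTrackerUrl', u + 'piwik.php']);  // piwik.php is the HTTP tracking API endpoint
    _paq.push(['setSiteId', '1']);                  // placeholder: your site ID in Piwik
    var d = document, g = d.createElement('script'), s = d.getElementsByTagName('script')[0];
    g.type = 'text/javascript'; g.async = true; g.defer = true; g.src = u + 'piwik.js';
    s.parentNode.insertBefore(g, s);
})();

Each tracking request that piwik.js sends ends up as an HTTP call to piwik.php (with parameters such as idsite, url and action_name), and that is the point where the data enters Piwik's MySQL database.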
Hope I have answered your question.

Related

Google Analytics not sending any events or custom dimensions data to dashboard

Scenario:
I am using this module: https://www.drupal.org/project/google_analytics version 3.1
It is using Google Analytics 4. Drupal version 8.9.x
We followed the documentation https://developers.google.com/analytics/devguides/collection/gtagjs/custom-dims-mets to create some custom dimensions and added them in the Google Analytics configuration accordingly.
When viewing the page source, I see the code is added there:
gtag("config", "G-MESUREMENT-KEY", {
"custom_map": {
"dimension1":"user_company",
"dimension2":"user_role",
"dimension3":"user_badge_access"
}
}
);
gtag("event", "custom", {
"user_company":"TEST Company",
"user_role":"authenticated, member_administrator, administrator",
"user_badge_access":"Office"
}
);
Using a Chrome GA debugger extension ("GTM/GA"), I can see the parameters being passed. The request payload shown in the Chrome debugger contains the values:
en=page_view&ep.anonymize_ip=true
en=page_view
en=custom&ep.user_company=Surface%20Oncology&ep.user_role=authenticated%2C%20member_administrator%2C%20administrator&ep.user_badge_access=Office
BUT I don't see the data when I open DebugView in the GA dashboard! The interesting part is that with another Chrome extension enabled ("Google Analytics Debugger 2.8"), which seems to open a debugger session connected to the GA dashboard, DebugView does show the events and parameter data. So there must be something that is restricting or refusing the connection to the GA dashboard to push the data.
I have read a lot of documentation and run a lot of tests, but failed to find the reason. The site is fully login-protected, but even the /user/login page, which is accessible to all, is not sending the data at all.
If someone can shed some light on the issue, it would really help me. Thanks in advance.
Actually, this was a mistake in my understanding. All the data is being pushed to the GA dashboard; there is no problem there. The reason I didn't see data when creating a comparison report based on custom dimension parameters is the difference in "scope".
So if you want to create a report based on event-scoped parameters, go to Engagement -> Events, and if you want to create a report based on user-scoped parameters, go to the Audience tab. Note that a user-scoped report will not show data if fewer than 10 users have been captured.
Documentation for reference: https://support.google.com/analytics/answer/2709828?hl=en
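As a side note on the DebugView behaviour described in the question: DebugView only surfaces hits that are flagged as debug traffic, which is what the "Google Analytics Debugger" extension does for you. If you want DebugView to show events without the extension, one option (sketched here against the gtag setup shown above, with the same placeholder measurement ID) is to send the debug flag yourself:

gtag("config", "G-MESUREMENT-KEY", {
  "debug_mode": true   // flag hits as debug traffic so they appear in DebugView; remove in production
});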

Azure Media Services for Xamarin?

I have created a console application with Azure Media Services.
In the app I am using the package
windowsazure.mediaservices
The way I am doing it is:
static IAsset CreateAssetAndUploadSingleFile(string filePath, string assetName, AssetCreationOptions options)
{
    IAsset asset = _context.Assets.Create(assetName, options);
    var assetFile = asset.AssetFiles.Create(Path.GetFileName(filePath));
    assetFile.Upload(filePath);
    return asset;
}
So I just want to know whether this package will work on Xamarin (I am not a Xamarin developer), since it is a portable project.
If it doesn't, is there an alternative package?
My basic purpose is to upload and encode.
That package is our current .NET SDK:
https://www.nuget.org/packages/windowsazure.mediaservices
It does not support .NET Core. See the dependencies.
It's not compiled for Xamarin, so I don't believe it works in Xamarin, but I'm not a Xamarin expert at all.
What is your scenario exactly? Why would you want to call the Media Services account directly from Xamarin anyway? You would only need to do that if you are creating a management application for the account administrator. Otherwise, don't put Media Services directly into any client code! You should hide it in your middle tier and only pass streaming URLs or SAS locators to the client application to upload content to.
For the upload-from-phone scenario, the middle tier should create an Asset, get a writable SAS locator for the Asset, and hand that to the client side. The client can then use the Azure Storage APIs to upload the content to that SAS URL directly (it ends up in an Azure Storage container).
I believe that Xamarin has client-side support for the Azure Storage APIs available.
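At the wire level, the client-side upload is just an HTTP PUT against the writable SAS URL. Here is a rough sketch of that call (shown in JavaScript/Node for brevity; from Xamarin the equivalent request can be made with HttpClient or the Azure Storage client library). The SAS URL and file path are placeholders, and the SAS URL is assumed to point at the target blob and include the SAS token:

// Minimal sketch: upload one file to a blob via a writable SAS URL (Put Blob REST operation).
// Assumes Node 18+ for the global fetch().
const fs = require('fs');

async function uploadToSas(sasUrl, filePath) {
  const body = fs.readFileSync(filePath);          // read the media file into memory
  const res = await fetch(sasUrl, {
    method: 'PUT',
    headers: { 'x-ms-blob-type': 'BlockBlob' },    // required header for the Put Blob operation
    body: body,
  });
  if (!res.ok) throw new Error('Upload failed: ' + res.status);
}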
As John answered, you don't do this stuff on a client; you will need to use SAS tokens and the like. I could explain everything here, but there are some nice guides and examples online.
Build 2018 video explaining how it works (including Azure Functions): https://www.youtube.com/watch?v=dEZkQNNpSIQ&feature=youtu.be&rel=0
The GitHub example from this video: https://github.com/Azure-Samples/xamarin-azure-businessreview
To understand it better, I recommend this guide. It is old, but it covers the entire process; just make sure to combine the new documentation with this old one.
Old documentation: https://learn.microsoft.com/en-us/previous-versions/msp-n-p/dn735912(v%3dpandp.10)
Official current documentation: https://learn.microsoft.com/en-us/azure/media-services/previous/media-services-dotnet-upload-files#upload-multiple-files-with-media-services-net-sdk
Probably useful for new readers.

Doing online surveys with Spring MVC and LimeSurvey

Hi, I'm working on a Spring MVC project and I need to add a survey option. The requirement is to send a link to a survey inside an email, or, if that's possible, send the questions inside the email itself; I will probably lean towards the first option. I'm thinking of using LimeSurvey or any other open-source survey tool that can be integrated with my Spring MVC web application.
Since I have not worked with a survey tool before, I assume LimeSurvey works like this:
1) you install LimeSurvey on your server
2) you create the survey with the LimeSurvey UI after it has been successfully installed on your server
3) the survey you just created gives you some kind of URL
If these assumptions are correct,
QUESTIONS:
1) How can I get that URL in my Spring MVC application so I can send it in an email? Since they are two different applications, does LimeSurvey save that URL in a database that my Spring MVC application could connect to in order to get it?
2) The same question goes for the results: can my Spring MVC application get access to the results? I need that information to create reports.
3) Can I create LimeSurvey surveys from my Spring MVC application? If LimeSurvey stores surveys in a database, can I create them from a UI in Spring and save them in the LimeSurvey database?
4) Is there a way to configure the valid time period of a survey? I don't want a user to be able to spend long periods of time on the survey (like an hour), or to keep accessing the survey from the link for an entire week, since this could be a problem for my server capacity.
1) URL: if a unique link for each user is needed: https://manual.limesurvey.org/RemoteControl_2_API#get_participant_properties . Otherwise a survey link is always the same, e.g. example.org/surveyid.html.
2) Results: https://manual.limesurvey.org/RemoteControl_2_API#export_statistics
3) Creating surveys: https://manual.limesurvey.org/RemoteControl_2_API#add_survey and/or https://manual.limesurvey.org/RemoteControl_2_API#import_survey etc.
4) Start and expiry date-time: https://manual.limesurvey.org/Survey_settings#Publication_.26_access_control
Then: I think LimeSurvey can be integrated with your MVC application. And remember: you can also write an extra plugin that uses the LimeSurvey core system with the newDirectRequest event (https://manual.limesurvey.org/NewDirectRequest) and return some JSON for your MVC if that is easier for you.
Denis
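For orientation, the RemoteControl 2 methods linked above are exposed as a JSON-RPC API over HTTP. Here is a rough sketch of what those calls look like on the wire (shown in JavaScript for brevity; from Spring MVC the same POSTs can be made with RestTemplate or any HTTP client). The endpoint URL, the credentials and the survey ID are placeholders:

// Minimal JSON-RPC helper for the LimeSurvey RemoteControl 2 API (Node 18+, global fetch).
async function rpc(endpoint, method, params) {
  const res = await fetch(endpoint, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ method: method, params: params, id: 1 }),
  });
  return (await res.json()).result;
}

async function demo() {
  const endpoint = 'https://example.org/index.php/admin/remotecontrol'; // placeholder: your LimeSurvey install
  const key = await rpc(endpoint, 'get_session_key', ['admin', 'password']); // placeholder credentials
  const stats = await rpc(endpoint, 'export_statistics', [key, 123]);        // 123 = placeholder survey id
  console.log(stats);                                                        // statistics document (base64-encoded)
  await rpc(endpoint, 'release_session_key', [key]);
}

demo();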

How do I show progress of a long-running server operation (Web API Commanding) in the Lightswitch HTML client?

I have a VS 2013 Lightswitch HTML Client application to which I've added a button that makes a Web API REST post. This basically 'refreshes' the data in the table from the original upstream source. This is all working correctly, but the operation takes a few minutes, and I want to report status to the user as it runs.
Right now, I've tried attaching a simple Refresh when the post returns as follows:
$.post("/api/data/", "Refresh", function (response) {
    screen.getData().then(function (newData) { screen.reQuery(); });
});
This doesn't actually seem to do a refresh (screen.reQuery is apparently the wrong call), but the better option would be to instead have the server report progress of this long-running operation.
One thought I had was to have the server call return "percent done" data in the response as it processes, but I don't know whether that would be delivered to the client piecemeal, nor the best way to display it to the user in Lightswitch.
I'm open to other third-party libraries that might help with this, but I'd like to stick with WebAPI for commanding instead of adding something like SignalR for now, if possible. Thanks!
In general, it doesn't seem like the best idea to run operations that take minutes within a single server request.
A reasonable alternative is to make a single call that in turn creates multiple Web Jobs (see Azure Web Jobs for more info). The work is broken into smaller individual tasks run as Web Jobs, and your HTML client queries the status of the Web Jobs rather than your Web API.
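Whichever way the work is offloaded, the client side typically ends up polling a status endpoint. A rough sketch of that pattern with jQuery, assuming a hypothetical /api/data/progress endpoint that reports { "percentDone": 0-100 }:

// Poll a (hypothetical) progress endpoint every 2 seconds and report back to the screen.
function pollProgress(onUpdate, onDone) {
    var timer = setInterval(function () {
        $.getJSON("/api/data/progress", function (status) {
            onUpdate(status.percentDone);      // e.g. write to a screen property bound to a label
            if (status.percentDone >= 100) {
                clearInterval(timer);
                onDone();                      // e.g. reload the screen's data here
            }
        });
    }, 2000);
}

The server side would need to persist the job's progress somewhere (a table, a cache, or the Web Job status) that such an endpoint can read.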

How to get an entity edit URL from within a plug-in in MS Dynamics CRM 4.0

I would like to have a workflow create a task, then email the assigned user that they have a new task and include a link to the newly created task in the body of the email. I have client-side code that correctly creates the edit URL using the entity's GUID and stores it in a custom attribute. However, when the task is created from within a workflow, the client script isn't run.
So, I think a plug-in should work, but I can't figure out how to determine the URL of the CRM installation. I'm authoring this in a test environment and definitely don't want to have to change things when I move to production. I'm sure I could use a config file, but it seems like the plug-in should be able to figure this out at runtime.
Does anyone have any ideas on how to access the URL of the CRM service from within a plug-in? Any other ideas?
There is no simple way to do this. However, there is one.
MSCRM_Config is the deployment database that holds physical deployment properties, like the URL from which users access the CRM deployment. The URL you probably want is the one stored under "ADWebApplicationRootDomain" in the MSCRM_CONFIG.dbo.DeploymentProperties table. You may need extra permissions to access this database.
Note that this doesn't work in a deployment that is an Internet Facing Deployment.
Another way could be to query the discovery service to retrieve the same information (in the case that you are on the Online edition of MSCRM 4).
What do you mean by "change things"?
If you create a custom workflow assembly, you can give it a server URL input. Once you register it with CRM, you can simply type in the server URL when you configure the workflow. You'll have to update the URL for any workflows that use the custom workflow assembly once you move to production, but you'll only have to do that once.
My apologies if this is what you meant you wanted to avoid.
Edit: It sounds like you may be able to use the CustomConfiguration attribute when you register the plugin. Here's some more info:
http://blogs.msdn.com/crm/archive/2008/10/24/storing-configuration-data-for-microsoft-dynamics-crm-plug-ins.aspx
// Requires: using Microsoft.Win32; reads the CRM server URL from the registry
// on the CRM server and strips the web-service path to get the base URL.
string url = ((string)Registry.LocalMachine
    .OpenSubKey("Software\\Microsoft\\MSCRM")
    .GetValue("ServerUrl")).Replace("MSCRMServices", "");
