Is there a way to add business-specific metadata to be stored along with an audio file stored on Google Cloud via Google Cloud Speech logging? - go

We are working on integrating with Google Cloud Speech for speech-to-text conversion with logging enabled. When the audio files are logged, we also need to store an additional identifier with each audio file so that, when we later retrieve the file from Google Cloud, we can associate it with an entity. Is that possible? Can we store user-provided metadata along with the audio file? We are going to stream audio data for conversion, and we need to store the audio file plus some metadata supplied by us.

Data logging is only a way to send your data anonymously to Google to help improve the Speech API; there is no metadata that can be used here. The only metadata associated with samples comprises the samples' audio properties, passed in to the API through the RecognitionConfig object.
Provided that you store your samples in Cloud Storage, you can use the bucket URIs to process sample requests, or the Cloud Storage client library to retrieve the samples by their unique filenames.
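If you control the samples in Cloud Storage yourself, one option is to attach your identifier as custom object metadata when you upload the audio, and read it back when you retrieve the file. Below is a minimal Go sketch using the cloud.google.com/go/storage client library; the bucket, object name, and metadata key are invented for illustration.

package main

import (
	"context"
	"fmt"
	"log"

	"cloud.google.com/go/storage"
)

func main() {
	ctx := context.Background()
	client, err := storage.NewClient(ctx)
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	obj := client.Bucket("my-audio-bucket").Object("call-1234.wav")

	// Upload the audio and attach a business identifier as custom metadata.
	audio := []byte("...raw audio bytes...")
	w := obj.NewWriter(ctx)
	w.ObjectAttrs.Metadata = map[string]string{"entity-id": "customer-42"}
	if _, err := w.Write(audio); err != nil {
		log.Fatal(err)
	}
	if err := w.Close(); err != nil {
		log.Fatal(err)
	}

	// Later: read the metadata back to associate the file with your entity.
	attrs, err := obj.Attrs(ctx)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(attrs.Metadata["entity-id"])
}

This keeps the entity association on the object itself, so no separate lookup table is needed when you pull the file back down.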

Related

Extract Dynamics 365 read audit log data from security compliance center

I want to extract read audit data from the Microsoft Security & Compliance Center. I have enabled audit logging for read audits, and I can see the audit log being created in the Security & Compliance Center. Now I want to extract or export that data from the Security & Compliance Center to an Azure Event Hub using a console app or a web API.
Can anyone help me with how to extract this data? I used the Audit History Extractor in XrmToolBox, and it extracts audit data from CRM, but I need to extract or export the read audit data for Dynamics CRM from the Security & Compliance Center.
How can I build this process? I have searched but don't see any proper resource.
It's a little bit tricky and not so straightforward. You can get a webhook trigger when new data is ready; then you need to parse it and send it to your Event Hub.
Office 365 Management Activity API reference
The Office 365 Management Activity API aggregates actions and events into tenant-specific content blobs, which are classified by the type and source of the content they contain.
To begin retrieving content blobs for a tenant, you first create a subscription to the desired content types. If you are retrieving content blobs for multiple tenants, you create multiple subscriptions to each of the desired content types, one for each tenant.
After you create a subscription, you can poll regularly to discover new content blobs that are available for download, or you can register a webhook endpoint with the subscription and we will send notifications to this endpoint as new content blobs are available.
Note:
When a subscription is created, it can take up to 12 hours for the first content blobs to become available for that subscription. The content blobs are created by collecting and aggregating actions and events across multiple servers and datacenters. As a result of this distributed process, the actions and events contained in the content blobs will not necessarily appear in the order in which they occurred. One content blob can contain actions and events that occurred prior to the actions and events contained in an earlier content blob. We are working to decrease the latency between the occurrence of actions and events and their availability within a content blob, but we can't guarantee that they appear sequentially.
Sample logs and schema reference.
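To make the flow concrete, here is a minimal Go sketch of the subscribe-then-poll pattern against the Management Activity API. It assumes you have already obtained an Azure AD bearer token for https://manage.office.com; the tenant ID and content type are placeholders, and parsing each blob and forwarding events to your Event Hub is left out.

package main

import (
	"fmt"
	"io"
	"log"
	"net/http"
)

const (
	tenantID = "YOUR-TENANT-ID"
	token    = "YOUR-AAD-BEARER-TOKEN"
	baseURL  = "https://manage.office.com/api/v1.0/" + tenantID + "/activity/feed"
)

func call(method, url string) []byte {
	req, err := http.NewRequest(method, url, nil)
	if err != nil {
		log.Fatal(err)
	}
	req.Header.Set("Authorization", "Bearer "+token)
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()
	body, err := io.ReadAll(resp.Body)
	if err != nil {
		log.Fatal(err)
	}
	return body
}

func main() {
	// 1. Create a subscription to a content type (one per tenant).
	call("POST", baseURL+"/subscriptions/start?contentType=Audit.General")

	// 2. Poll for available content blobs; each entry carries a contentUri
	// to download, parse, and forward to your Event Hub.
	blobs := call("GET", baseURL+"/subscriptions/content?contentType=Audit.General")
	fmt.Println(string(blobs))
}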

Can Google BigQuery read files from Google Cloud Storage private folder from Google Play Developer Console?

Google Play Developer Console provides access to a Google Cloud Storage folder with User Acquisition analytics data for my app at:
gs://pubsite_prod_rev_XXXXXXXX/acquisition/retained_installers
Can I get BigQuery to read from this data source?
If not, is there some way to link Google Play app analytics data to BigQuery (without Firebase)?
Thanks,
Yes, you can query a federated table where the files are stored on GCS. From the UI, you can open the add table dialog using the drop-down menu next to your data set, then specify a path, file type, etc. for the external table.
The benefit of linking with Firebase, or manually uploading from GCS yourself into BigQuery storage, is that your queries will be much faster due to the internal, optimized storage format.
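If you prefer to set this up programmatically rather than through the UI, here is a minimal Go sketch using the cloud.google.com/go/bigquery client to define a federated table over the GCS files (project, dataset, and table names are invented; the bucket path is the one from the question). Keep in mind the encoding caveat raised in the answer below.

package main

import (
	"context"
	"log"

	"cloud.google.com/go/bigquery"
)

func main() {
	ctx := context.Background()
	client, err := bigquery.NewClient(ctx, "my-project")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// Define an external (federated) table backed by CSV files in GCS;
	// queries read the files in place rather than from BigQuery storage.
	ext := &bigquery.ExternalDataConfig{
		SourceFormat: bigquery.CSV,
		SourceURIs:   []string{"gs://pubsite_prod_rev_XXXXXXXX/acquisition/retained_installers/*"},
		AutoDetect:   true,
	}
	table := client.Dataset("play_analytics").Table("retained_installers")
	if err := table.Create(ctx, &bigquery.TableMetadata{ExternalDataConfig: ext}); err != nil {
		log.Fatal(err)
	}
}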
As mentioned in the comment above, Google Play uses a different encoding, which cannot be loaded directly into BigQuery. Fortunately, there is a new feature just for this. :)
Please check the BigQuery Data Transfer Service. The page doesn't mention Google Play yet because that support is still at an early stage.
If you are interested, you could:
Finish the "Before you begin" section. That page is for AdWords, but the prerequisites are the same.
Fill in the enrollment form; you will find Google Play listed there.
After that we will send you more guidelines about ingesting data for Google Play.

Migrate smooth streaming encoded video files to Azure Media Service

My company streams videos using IIS Media Services to Silverlight players; the streams are delivered at adaptive bitrates (Microsoft Smooth Streaming). Because support for the Silverlight plugin has been dropped by all major browsers, we are planning to migrate our streaming platform to Azure.
I have checked the documentation and samples and read articles, but couldn't find anything on how to use existing Smooth Streaming encoded video without re-encoding. We have quite a large library to migrate, around 400GB, so re-encoding is not an option; we also plan to dynamically encrypt our content using AES. Does anyone know how to go about this?
You need to perform the following steps:
Create an Azure Media Services asset.
Upload files for the specified asset.
Run the "Windows Azure Media Encryptor" media processor.
Configure delivery options.
See https://github.com/Azure/azure-sdk-for-media-services/blob/dev/test/net/Scenario/JobTests.cs.
The method private IAsset CreateSmoothAsset() covers steps 1 and 2. There are various tests in that file covering encryption of an asset with the "Windows Azure Media Encryptor" processor (see the usage of GetMediaProcessor(_mediaContext, WindowsAzureMediaServicesTestConfiguration.MpEncryptorName)).
To configure delivery of protected content, see https://azure.microsoft.com/en-us/documentation/articles/media-services-protect-with-aes128/.
There is also a media processor called "Windows Azure Media Packager" which allows you to package your Smooth asset, for example to HLS.
You can onboard your existing Smooth Streaming assets to Azure Media Services without re-encoding them, and apply AES dynamic encryption and dynamic packaging to different streaming formats such as HLS, MPEG-DASH and Smooth Streaming. However, there are some limitations and constraints. If your content is already encrypted, such as Smooth Streaming + PlayReady, dynamically encrypting it with AES is not supported; your content needs to be in the clear if you want to use dynamic encryption. Also, your Smooth Streaming assets need to be compliant with the Smooth Streaming spec. There are tools that generate Smooth Streaming files that are not spec compliant, and those are not supported by Azure Media Services.
You can start with the "creating assets from existing storage blobs" article:
https://azure.microsoft.com/en-us/documentation/articles/media-services-copying-existing-blob/
I hope this answers your question.
Cenk
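As an illustration of the copy-from-existing-blobs approach that article describes, here is a minimal Go sketch using the github.com/Azure/azure-sdk-for-go/sdk/storage/azblob package (the article itself uses the .NET SDK). The connection string, container, and blob names are invented, and a private source account would need a SAS appended to the source URL.

package main

import (
	"context"
	"log"

	"github.com/Azure/azure-sdk-for-go/sdk/storage/azblob"
)

func main() {
	ctx := context.Background()

	// Connect to the storage account attached to your Media Services account.
	client, err := azblob.NewClientFromConnectionString("STORAGE-CONNECTION-STRING", nil)
	if err != nil {
		log.Fatal(err)
	}

	// Server-side copy of an existing Smooth Streaming blob into the
	// container backing the new asset; no download/re-upload, so the
	// 400GB library never leaves the datacenter.
	src := "https://oldaccount.blob.core.windows.net/smooth/video.ism"
	dst := client.ServiceClient().NewContainerClient("asset-xxxx").NewBlobClient("video.ism")
	if _, err := dst.StartCopyFromURL(ctx, src, nil); err != nil {
		log.Fatal(err)
	}
}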

How to pipe upload stream of large files to Azure Blob storage via Node.js app on Azure Websites?

I am building a video service using Azure Media Services and Node.js.
Everything went semi-fine until now; however, when I tried to deploy the app to Azure Web Apps for hosting, any large file fails with a 404.13 error.
Yes, I know about maxAllowedContentLength, and no, that is NOT a solution. It only goes up to 4GB, which is pathetic; even HDR environment maps can easily exceed that amount these days. I need to enable users to upload files up to 150GB in size. However, when Azure Web Apps receives a multipart request, it appears to buffer it into memory until it hits some threshold of bytes or seconds (upon which it returns a 404.13, or a 502 if my connection is slow) BEFORE running any of my server logic.
I tried the Transfer-Encoding: chunked header in the server code, but even if that would help, it doesn't actually matter, since Web Apps doesn't let the code run at all.
For the record: I am using Sails.js on the backend, and Skipper is handling the stream piping to the Azure Blob service. Localhost obviously works just fine regardless of file size. I made a duplicate of this question on the MSDN forums (where you can see what I have found so far), but those are as slow as always.
Client-side, I am using Ajax FormData to serialize the fields (one text field and one file) and send them, using the progress event to track upload progress.
Is there ANY way to make this work? I just want it to let my server-side logic handle the data stream, without buffering the bloody thing.
Rather than running all this data through your web application, you would be better off having your clients upload directly to a container in your Azure blob storage account.
You will need to enable CORS on your Azure Storage account to support this. Then, in your web application, when a user needs to upload data you would instead generate a SAS token for the storage account container you want the client to upload to and return that to the client. The client would then use the SAS token to upload the file into your storage account.
On the back-end, you could fire off a web job to do whatever processing you need to do on the file after it's been uploaded.
Further details and sample Ajax code to do this are available in this blog post from the Azure Storage team.
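The original stack here is Node.js, but to sketch the server-side half in one place: the web app's only job becomes minting a short-lived, write-only SAS URL and handing it to the browser. Below is a minimal Go sketch using github.com/Azure/azure-sdk-for-go/sdk/storage/azblob and its sas package; the account, key, container, and blob names are placeholders.

package main

import (
	"fmt"
	"log"
	"time"

	"github.com/Azure/azure-sdk-for-go/sdk/storage/azblob"
	"github.com/Azure/azure-sdk-for-go/sdk/storage/azblob/sas"
)

func main() {
	account, key := "myaccount", "BASE64-ACCOUNT-KEY"
	cred, err := azblob.NewSharedKeyCredential(account, key)
	if err != nil {
		log.Fatal(err)
	}

	// Create/write only, expiring in an hour: just enough for the client
	// to upload the blob directly, and nothing more.
	qp, err := sas.BlobSignatureValues{
		Protocol:      sas.ProtocolHTTPS,
		ExpiryTime:    time.Now().UTC().Add(time.Hour),
		ContainerName: "uploads",
		BlobName:      "video-150gb.mp4",
		Permissions:   sas.BlobPermissions{Create: true, Write: true}.String(),
	}.SignWithSharedKey(cred)
	if err != nil {
		log.Fatal(err)
	}

	// Hand this URL to the browser; it uploads blocks straight to storage,
	// so the 150GB stream never touches the web app.
	fmt.Printf("https://%s.blob.core.windows.net/uploads/video-150gb.mp4?%s\n",
		account, qp.Encode())
}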

How to share files created by an app?

How do I share files (music, video, images) created by my app? I am interested in sharing audio files specifically.
Imagine I have a program that generates wav file.
How do I take it from isolated storage?
Is it possible to send it as an e-mail attachment?
Save it on SkyDrive?
Share on Facebook?
Put it in the media library?
Or at least take it off the WP7 device in some way that is convenient for the user?
Any help regarding this topic would be welcome.
You cannot directly send it as an attachment through the EmailComposeTask; however, you can use your own implementation of an email-sending mechanism.
You can save it to SkyDrive, but then again you have to use a custom API layer (developed by you or by a third-party) to achieve this.
A better choice, in my opinion, would be a WCF service that transmits the byte array of the generated content to a specific location; this will ultimately give you more control over the transmission layer.
You can save images to the MediaLibrary - from where you can access it via the Zune Software and transfer to PC, etc.
This can be done with the MediaLibrary.SavePicture method. (Yes, this is an XNA method but it can be used from within a Silverlight application also.)
The other alternative is to upload it to a webserver and send it from there.
There is currently no way to save songs or movies.
How to upload a file to a webserver very much depends on the server, the software it is running, and any security concerns relating to the content.
There is the start of a discussion on this at Uploading XML files from WP7, possible, how to etc?
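For the upload-to-a-webserver route, the receiving end can be any endpoint that accepts an HTTP POST. As a hedged illustration of the server side only, here is a minimal Go handler that streams a multipart upload to disk without buffering the whole file in memory; the route, form field name, and output filename are invented, and the phone side would POST to it with HttpWebRequest.

package main

import (
	"io"
	"log"
	"net/http"
	"os"
)

func upload(w http.ResponseWriter, r *http.Request) {
	// Walk the multipart body part-by-part instead of parsing it all at once.
	mr, err := r.MultipartReader()
	if err != nil {
		http.Error(w, err.Error(), http.StatusBadRequest)
		return
	}
	for {
		part, err := mr.NextPart()
		if err == io.EOF {
			break
		}
		if err != nil {
			http.Error(w, err.Error(), http.StatusBadRequest)
			return
		}
		if part.FormName() != "audio" {
			continue
		}
		dst, err := os.Create("uploaded.wav")
		if err != nil {
			http.Error(w, err.Error(), http.StatusInternalServerError)
			return
		}
		if _, err := io.Copy(dst, part); err != nil { // stream to disk
			dst.Close()
			http.Error(w, err.Error(), http.StatusInternalServerError)
			return
		}
		dst.Close()
	}
	w.WriteHeader(http.StatusNoContent)
}

func main() {
	http.HandleFunc("/upload", upload)
	log.Fatal(http.ListenAndServe(":8080", nil))
}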
