Can I use native Parquet file tools in Azure Logic Apps?

Hi there, I need to convert a Parquet file to CSV using only Logic Apps native tools. Is that even possible?
I researched similar issues and found how to use Azure Functions to change the format, but that's not a native Logic Apps tool.

There's a custom connector that will transform Parquet to JSON for you.
It will also let you perform filtering and sorting operations on the data before it is returned.
Documentation can be found here: https://www.statesolutions.com.au/parquet-to-json/

Related

BigQuery Storage gRPC Write API

My use case here is to read from a database and write the rows into a BigQuery table.
For this I am trying to use the gRPC API, following this example file. Considering I'm new to protobuf and Go, I am unable to figure out how to write a DB row into a BigQuery table, and I'm specifically confused about this part. I'm not able to find any particular example of creating the request as a protobuf byte sequence and streaming it.
Any help is much appreciated.
The Go client provides a managedwriter package that you can use to stream data more easily. You can see how it is used in the integration tests.
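For a rough idea of the flow, here is a minimal sketch of appending a single row with managedwriter. It is a sketch under assumptions, not the library's own sample: RowMessage stands in for a protobuf type you would generate from a .proto file mirroring the table schema, and the project/dataset/table IDs are placeholders.

package main

import (
	"context"
	"fmt"
	"log"

	"cloud.google.com/go/bigquery/storage/managedwriter"
	"cloud.google.com/go/bigquery/storage/managedwriter/adapt"
	"google.golang.org/protobuf/proto"
)

func main() {
	ctx := context.Background()

	// Placeholder identifiers -- replace with your own.
	project, dataset, table := "my-project", "my_dataset", "my_table"

	client, err := managedwriter.NewClient(ctx, project)
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// RowMessage is a hypothetical protobuf message generated from a
	// .proto file that mirrors the BigQuery table schema.
	row := &RowMessage{Name: proto.String("alice"), Age: proto.Int64(30)}

	// Derive the schema descriptor the Storage Write API needs from
	// the generated message type.
	descriptor, err := adapt.NormalizeDescriptor(row.ProtoReflect().Descriptor())
	if err != nil {
		log.Fatal(err)
	}

	// Open a default stream against the destination table.
	stream, err := client.NewManagedStream(ctx,
		managedwriter.WithDestinationTable(
			fmt.Sprintf("projects/%s/datasets/%s/tables/%s", project, dataset, table)),
		managedwriter.WithSchemaDescriptor(descriptor),
	)
	if err != nil {
		log.Fatal(err)
	}

	// Serialize the row to its protobuf wire format -- this is the
	// "protobuf byte sequence" the question asks about.
	encoded, err := proto.Marshal(row)
	if err != nil {
		log.Fatal(err)
	}
	result, err := stream.AppendRows(ctx, [][]byte{encoded})
	if err != nil {
		log.Fatal(err)
	}
	// Block until the server acknowledges the append.
	if _, err := result.GetResult(ctx); err != nil {
		log.Fatal(err)
	}
}

For a database-to-BigQuery pipeline, you would loop over the DB rows, marshal each one into the generated message, and batch them into AppendRows calls.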
Also, if you are new to Go, would you consider using Java instead? There is a JsonStreamWriter available in Java that allows you to append JSONArray objects (as opposed to protobuf rows), and the samples are here: https://github.com/googleapis/java-bigquerystorage/tree/main/samples/snippets/src/main/java/com/example/bigquerystorage

Does go-pg support regex updates?

go-pg is a Go library for PostgreSQL. In SQL, one can update an entire column by applying a regular expression, e.g.:
UPDATE <some-table> SET x = regexp_replace(x, '^.*\/[0-9]+(.*)$', '\1hello');
Problem
According to the README, one can perform a bulk update. However, no information regarding regular expressions was found in either the issue tracker or the documentation.
Question
Does this library support regexp_replace updates?
It does not support it through the ORM layer, but it does support plain SQL. I personally do not like running it as such, but there seems to be no other choice when this library is used at the moment. One benefit is that the statement runs within the flow of the Go app: for example, once the file paths have been changed on disk, the database can be updated in a controlled way.
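As a concrete illustration, here is a minimal sketch of running that statement through go-pg's plain-SQL Exec. The connection settings and the files/path table and column names are placeholders:

package main

import (
	"fmt"
	"log"

	"github.com/go-pg/pg/v10"
)

func main() {
	// Placeholder connection settings -- adjust to your environment.
	db := pg.Connect(&pg.Options{
		Addr:     "localhost:5432",
		User:     "postgres",
		Password: "secret",
		Database: "mydb",
	})
	defer db.Close()

	// go-pg has no ORM helper for regexp_replace, so the UPDATE is
	// issued as plain SQL; the ? placeholders are filled in by go-pg.
	res, err := db.Exec(
		`UPDATE files SET path = regexp_replace(path, ?, ?)`,
		`^.*\/[0-9]+(.*)$`, `\1hello`,
	)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("updated %d rows\n", res.RowsAffected())
}

Because this runs inside the Go program, it can be sequenced after the on-disk renames, as described above.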

Tableau on Splunk... possible?

My BI team would like to access my Splunk data using Tableau; we don't want to use Splunk visualizations and would rather use our own BI tools.
We tried to use the Splunk connectors, but that failed, and the volume of data is too big to show in Tableau. Is there any solution to this problem? How can that be achieved, and what are your recommendations on this subject?
Thank you.
Yes, this is possible. You can use the Splunk SDK to pull your data out and feed it into Tableau.
We are doing the same thing with Qlik Sense.
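There is no official Splunk SDK for Go, but the SDKs are thin wrappers over Splunk's REST API, so a sketch of the same idea can call the search export endpoint directly. The host, credentials, and search below are placeholders; the resulting CSV is something Tableau can open as a data source:

package main

import (
	"io"
	"log"
	"net/http"
	"net/url"
	"os"
	"strings"
)

func main() {
	// Placeholder host and credentials -- replace with your own.
	// (splunkd's default self-signed certificate may need extra TLS
	// configuration on the client side.)
	endpoint := "https://splunk.example.com:8089/services/search/jobs/export"

	form := url.Values{}
	// Narrow the search as much as possible; exporting everything is
	// exactly what makes the volume unmanageable in Tableau.
	form.Set("search", `search index=main earliest=-24h | stats count by host`)
	form.Set("output_mode", "csv")

	req, err := http.NewRequest("POST", endpoint, strings.NewReader(form.Encode()))
	if err != nil {
		log.Fatal(err)
	}
	req.SetBasicAuth("admin", "changeme")
	req.Header.Set("Content-Type", "application/x-www-form-urlencoded")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	// Stream the CSV straight to a file that Tableau can open.
	out, err := os.Create("splunk_export.csv")
	if err != nil {
		log.Fatal(err)
	}
	defer out.Close()
	if _, err := io.Copy(out, resp.Body); err != nil {
		log.Fatal(err)
	}
}

Pre-aggregating in the Splunk search (as the stats command does here) is also one way around the data-volume problem, since Tableau then only sees the summarized rows.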

Cloud DLP - sample / how-to: convert a local file (Linux) to a secure CSV file

Do I really need a Google Cloud box to run a DLP conversion on a local file before uploading it to a Google Cloud BigQuery table?
I just want to convert a CSV file to a secure, data-protective file format.
Currently the DLP product is an API in the cloud.
You can call it from your local machine, similar to the sketch below, though this will only do simple redaction based on infoType findings, not record transforms (applying a transform to a whole column). For the full set of features, you would stream the data in as a table to Content.deidentify.
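As a rough sketch of the client-library route (Go here, under the same caveat that plain text only gets infoType-based redaction), you would read the local CSV and pass it to DeidentifyContent. The project ID, file name, and infoTypes are placeholders:

package main

import (
	"context"
	"fmt"
	"log"
	"os"

	dlp "cloud.google.com/go/dlp/apiv2"
	"cloud.google.com/go/dlp/apiv2/dlppb"
)

func main() {
	ctx := context.Background()
	client, err := dlp.NewClient(ctx)
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// Read the local CSV as plain text. Treated this way, DLP only
	// applies infoType-based redaction; column-level record transforms
	// require sending the data as a structured Table instead.
	contents, err := os.ReadFile("input.csv") // placeholder file name
	if err != nil {
		log.Fatal(err)
	}

	req := &dlppb.DeidentifyContentRequest{
		Parent: "projects/my-project/locations/global", // placeholder project
		InspectConfig: &dlppb.InspectConfig{
			InfoTypes: []*dlppb.InfoType{{Name: "EMAIL_ADDRESS"}, {Name: "PHONE_NUMBER"}},
		},
		DeidentifyConfig: &dlppb.DeidentifyConfig{
			Transformation: &dlppb.DeidentifyConfig_InfoTypeTransformations{
				InfoTypeTransformations: &dlppb.InfoTypeTransformations{
					Transformations: []*dlppb.InfoTypeTransformations_InfoTypeTransformation{{
						// Replace each finding with its infoType name,
						// e.g. "[EMAIL_ADDRESS]".
						PrimitiveTransformation: &dlppb.PrimitiveTransformation{
							Transformation: &dlppb.PrimitiveTransformation_ReplaceWithInfoTypeConfig{
								ReplaceWithInfoTypeConfig: &dlppb.ReplaceWithInfoTypeConfig{},
							},
						},
					}},
				},
			},
		},
		Item: &dlppb.ContentItem{
			DataItem: &dlppb.ContentItem_Value{Value: string(contents)},
		},
	}

	resp, err := client.DeidentifyContent(ctx, req)
	if err != nil {
		log.Fatal(err)
	}
	// The de-identified text can be written back out and uploaded to BigQuery.
	fmt.Println(resp.GetItem().GetValue())
}

So no dedicated Google Cloud machine is needed; the file stays local, and only the API calls go to the cloud.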

Google Cloud Logs Export Names

Is there a way to configure the names of the files exported from Logging?
Currently the exported files include colons. These are invalid characters in a path element in Hadoop, so PySpark, for instance, cannot read these files. Obviously the easy solution is to rename the files, but this interferes with syncing.
Is there a way to configure the names, or change them to not include colons? Any other solutions are appreciated. Thanks!
https://github.com/apache/hadoop/blob/trunk/hadoop-common-project/hadoop-common/src/site/markdown/filesystem/introduction.md
At this time, there is no way to change the naming convention when exporting log files, as this process is automated on the backend.
If you would like to request this feature in GCP, I would suggest filing it in the Public Issue Tracker (PIT). That page allows you to report bugs and request new features to be implemented within GCP.
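As a stopgap, since the export names themselves cannot be changed, one option is a small job that copies the exported objects to colon-free names. This is only a sketch with a placeholder bucket name, and because it deletes the originals it can interfere with syncing, as the question notes:

package main

import (
	"context"
	"log"
	"strings"

	"cloud.google.com/go/storage"
	"google.golang.org/api/iterator"
)

func main() {
	ctx := context.Background()
	client, err := storage.NewClient(ctx)
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	bucket := client.Bucket("my-log-export-bucket") // placeholder bucket

	// Walk the exported objects and copy any name containing a colon
	// to a Hadoop-safe name, then delete the original.
	it := bucket.Objects(ctx, nil)
	for {
		attrs, err := it.Next()
		if err == iterator.Done {
			break
		}
		if err != nil {
			log.Fatal(err)
		}
		if !strings.Contains(attrs.Name, ":") {
			continue
		}
		safe := strings.ReplaceAll(attrs.Name, ":", "-")
		src := bucket.Object(attrs.Name)
		if _, err := bucket.Object(safe).CopierFrom(src).Run(ctx); err != nil {
			log.Fatal(err)
		}
		if err := src.Delete(ctx); err != nil {
			log.Fatal(err)
		}
	}
}

Copying into a separate bucket instead of deleting in place would avoid the sync conflict, at the cost of storing the data twice.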
