Phantom Cassandra batch insert - phantom-dsl

Is there a batch insert API in phantom-dsl for batch inserting into Cassandra?
I tried searching but could not find it in the code.
Also, I could not find much documentation on the phantom library.

If you are looking for batch inserts to optimize performance, you are doing it wrong!
For further information, please refer to:
https://medium.com/@foundev/cassandra-batch-loading-without-the-batch-keyword-40f00e35e23e
Anyway, phantom supports batch statements, as @flavian commented:
https://github.com/websudos/phantom/wiki/Batch-statements
Also, check out my github project that shows how to implement things in phantom-dsl.
https://github.com/iamthiago/cassandra-phantom
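For reference, a batch in phantom looks roughly like the following. This is a minimal sketch against the old com.websudos API documented in the wiki above; the Songs table, its id and title columns, and the implicit Session and KeySpace are hypothetical and assumed to be defined elsewhere:

    import java.util.UUID
    import com.websudos.phantom.dsl._

    // Hypothetical table object; as with any phantom query, an implicit
    // Session and KeySpace must be in scope for .future() to execute.
    val batch = Batch.unlogged
      .add(Songs.insert.value(_.id, UUID.randomUUID()).value(_.title, "Song A"))
      .add(Songs.insert.value(_.id, UUID.randomUUID()).value(_.title, "Song B"))

    batch.future() // sends everything to Cassandra as a single batch statement

Keep in mind that Cassandra batches exist for atomicity, not throughput; for pure bulk loading, individual asynchronous inserts (the point of the Medium article above) are usually faster.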

Related

Celigo integrator.io logging

I have several flows. I want to save the result of these flows (success, failure, or something else) into an MS SQL table. I cannot find how I can achieve that.
Maybe we can save the output result into a file (not manually). I have never worked with Celigo before; I am just trying to find some options. Or maybe we don't have logging at all?

Azure Blob Storage lifecycle management - send report or log after run

I am considering using Azure Blob Storage's built-in lifecycle management feature for deleting blobs of a certain age.
However, due to a business requirement, it must be possible to generate a report or log statement after each daily execution of the defined ruleset. The report or log must state the number of blob blocks that were affected, e.g. deleted during the run.
I have read through the documentation and Googled to see if others have had similar inquiries, but so far without any luck.
So my question: do any of you know if and how I can get the built-in lifecycle management feature to do one of the following after each daily run:
Add a log statement to the storage account containing the Blob storage.
Generate and send a report to an endpoint I define.
If the above can't be done I will have to code the daily deletion job and report generation myself, which surely I can do, but I would like to use the built-in feature if possible.
I summarize the solution below.
If you want to know which blobs are deleted every day, we can configure Diagnostics settings on the storage account. After doing that, we will get logs for the read, write, and delete requests against the blob service. For more detail, please refer to here and here.
Regarding how to enable it, we can use the PowerShell command Set-AzStorageServiceLoggingProperty.
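Once logging is enabled, the analytics logs land as blobs in the well-known $logs container, so a daily report can be produced by scanning them. A rough sketch, assuming the legacy azure-storage Java SDK; the connection string is a placeholder, and the semicolon-delimited log format with a "DeleteBlob" operation-type field follows the Storage Analytics log conventions:

    import com.microsoft.azure.storage.CloudStorageAccount
    import com.microsoft.azure.storage.blob.CloudBlockBlob
    import scala.jdk.CollectionConverters._
    import scala.io.Source

    // $logs is the container Storage Analytics writes to once logging is on.
    val account = CloudStorageAccount.parse(sys.env("AZURE_STORAGE_CONNECTION_STRING"))
    val logs    = account.createCloudBlobClient().getContainerReference("$logs")

    // Count DeleteBlob operations across all blob-service log files; each
    // log line is semicolon-delimited and carries the operation type.
    val deletes = logs.listBlobs("blob/", true, null, null, null).asScala.collect {
      case b: CloudBlockBlob =>
        Source.fromInputStream(b.openInputStream()).getLines().count(_.contains(";DeleteBlob;"))
    }.sum

    println(s"Blobs deleted according to analytics logs: $deletes")

From there, writing the count to a table or POSTing it to an endpoint of your choosing is straightforward.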

Importing apache-poi into Oracle

Good morning,
I would like to generate an Excel file from Oracle, therefore I have imported POI 3.16 and all prerequisites based on the table at the bottom of this link:
http://poi.apache.org/overview.html#components
Exactly the following files:
commons-logging, commons-codec, commons-collections, log4j, poi.jar
The dbms command I have used:
dbms_java.loadjava('filename.jar -resolve');
Everything went fine, but all the classes within "org/apache/poi/hssf/usermodel/" remained invalid. The most important part. :)
Does anybody have an idea what the problem could be? Should I import any other classes? First I would like to try a solution that does not require checking log files on the hard disk or any action on the server itself. I have no access to the server, so I have to communicate with the administrators, which is complicated in our company :(. Of course, if there is no solution within Oracle, I have no other option...
Thanks in advance,
Sz.
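For context, the usermodel classes that stayed invalid are exactly the ones needed to build a workbook; this is roughly the code the loaded jars are meant to support, e.g. wrapped in a Java stored procedure (a minimal HSSF sketch; the file name and cell contents are illustrative):

    import java.io.FileOutputStream
    import org.apache.poi.hssf.usermodel.HSSFWorkbook

    // Build a one-cell .xls workbook with the HSSF usermodel API.
    val wb    = new HSSFWorkbook()
    val sheet = wb.createSheet("Report")
    val row   = sheet.createRow(0)
    row.createCell(0).setCellValue("Hello from POI")

    val out = new FileOutputStream("report.xls")
    try wb.write(out) finally out.close()

If code like this cannot resolve, it usually means a dependency jar is missing or itself invalid, so it is worth re-running loadjava with -resolve on each prerequisite jar first.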

Alfresco - Download statistics and user permissions report

Hey to every Alfresco pro out there!
Is there any way to create a report (graphical or textual, I don't care) showing the following information:
download count per file
how many times did user X download a specific file
which permissions do the users have
Are my goals easy to realize? Is there any plugin out there that I can use for this? (I already searched for some but couldn't find one.) Hope that you can help me :)
mtzE
There is nothing out-of-the-box that counts downloads. Maybe the audit service can be used to count reads, but you'll have to turn it on and configure it. Once turned on, the audit service writes records to a set of audit tables in your Alfresco database. You can then use any reporting tool to query those tables.
If you want to check the permissions a user has, you can use something like OpenCMIS to connect to the repository, traverse a folder path and, for each object, inspect its ACL to use as data in your report.
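A rough sketch of that OpenCMIS approach (the URL, credentials, and site path are placeholders; note that ACLs are only returned when the OperationContext asks for them):

    import scala.jdk.CollectionConverters._
    import org.apache.chemistry.opencmis.client.api.Folder
    import org.apache.chemistry.opencmis.client.runtime.SessionFactoryImpl
    import org.apache.chemistry.opencmis.commons.SessionParameter
    import org.apache.chemistry.opencmis.commons.enums.BindingType

    val params = Map(
      SessionParameter.USER         -> "admin",
      SessionParameter.PASSWORD     -> "admin",
      SessionParameter.ATOMPUB_URL  -> "http://localhost:8080/alfresco/api/-default-/public/cmis/versions/1.1/atom",
      SessionParameter.BINDING_TYPE -> BindingType.ATOMPUB.value()
    ).asJava

    val session = SessionFactoryImpl.newInstance().getRepositories(params).get(0).createSession()

    // ACLs are only populated when explicitly requested.
    val ctx = session.createOperationContext()
    ctx.setIncludeAcls(true)

    val folder = session.getObjectByPath("/Sites/my-site/documentLibrary", ctx).asInstanceOf[Folder]

    // Print principal -> permissions for every direct child of the folder.
    for (child <- folder.getChildren(ctx).asScala; ace <- child.getAcl.getAces.asScala)
      println(s"${child.getName}: ${ace.getPrincipalId} -> ${ace.getPermissions.asScala.mkString(", ")}")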
As Lista said, one way to create such reports is to use AAAR, but that is not required.

DBMS_Scheduler get/put file alternative

I have a side project I'm working on currently that requires me to copy a .csv file from a remote FTP server and save it locally. I figured I would use DBMS_SCHEDULER.GET_FILE, but I do not have permission. When I asked my manager, he said that I won't be able to get privileges to do this and should look up other ways.
After researching for a couple of days I keep coming back to DBMS_SCHEDULER; am I out of luck, or are my searching skills terrible?
Thanks
I'm not certain you'd want to use DBMS_SCHEDULER for this; from what I understand from the documentation (I have never used this myself), the FTP site would have to be completely open to all. There is a destination_permissions parameter, but it's only "Reserved for future use", i.e. there's no way of specifying any permissions at the moment.
If I'm right with this then I agree with your manager, though not necessarily for the same reasons (it seems like you'll never get permission to use DBMS_SCHEDULER which I hope is incorrect).
There are other methods of doing this:
UTL_TCP; this is simply a method of interacting over a TCP/IP protocol. Oracle Base has an article which includes an FTP package based on UTL_TCP and instructions on how to use it. This also requires the use of the UTL_FILE package, which can write OS files.
UTL_HTTP; I'm 99% certain it's possible to connect to an FTP server using this; it's certainly possible to connect to an SFTP/any server. It'll require a little more work but it would be worth it in the longer run. It would also require the use of UTL_FILE.
A Java stored procedure to FTP directly; this is probably the best approach: create one using one of the many Java FTP libraries (see the sketch after this list).
A Java stored procedure to call OS commands; this is the easiest method but the least extensible. Oracle released a white paper on calling OS commands from within PL/SQL back in 2008, but there's plenty of other material out there (including Oracle Base again).
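As an illustration of the Java-library route, here is a minimal sketch using Apache Commons Net's FTPClient (host, credentials, and paths are placeholders); inside Oracle the same logic would live in a Java stored procedure loaded into the database:

    import java.io.FileOutputStream
    import org.apache.commons.net.ftp.{FTP, FTPClient}

    val ftp = new FTPClient()
    try {
      ftp.connect("ftp.example.com")
      ftp.login("user", "password")
      ftp.enterLocalPassiveMode()           // friendlier to firewalls
      ftp.setFileType(FTP.BINARY_FILE_TYPE) // avoid newline mangling in transfers

      // Stream the remote .csv straight into a local file.
      val out = new FileOutputStream("/tmp/data.csv")
      try ftp.retrieveFile("/remote/path/data.csv", out)
      finally out.close()
    } finally {
      if (ftp.isConnected) { ftp.logout(); ftp.disconnect() }
    }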
Lastly, you could question whether this is actually what you want to do...
What scheduler do you use? Does it have event-driven scheduling? If so, there's no need to FTP from within Oracle; use UTL_FILE to write a file to the OS and then OS commands from there.
Was the other file originally in a database? If that's the case, you don't need to extract it. You could use DBMS_FILE_TRANSFER to collect it straight from the database, or even create a JDBC connection or (more simply) a database link to SELECT the data directly.
