Background job triggering afterDelete - parse-platform

I need to delete a record from inside a background job. Doing so triggers the afterDelete function, which is great because I have some logic in afterDelete, but it fails because the trigger runs without a request.user, which my afterDelete logic relies on.
I need to schedule this job to run from my dashboard – I'm not using the REST endpoint.
Is there a way to pass that context to afterDelete? How do I handle the situation?
Edit: This is all happening inside the context of Cloud Code. No Parse Android/iOS SDK is being used.

Related

Inject nifi web-api bean into custom processor

I am playing around with a NiFi custom processor.
How can I inject an instance of org.apache.nifi.web.StandardNiFiServiceFacade into my custom processor instance?
Background:
I am trying to stop the processor after it has executed. I understand that NiFi processors are meant for stream processing rather than batch processing, where the job executes just once, but to leverage NiFi's execution support this needs to be done. From further experimentation, I should be able to do it if an instance of StandardNiFiServiceFacade were available in the custom processor instance.
This is intentionally not made available to the processor API. If you are certain you want to have the processor tell the controller to stop scheduling it, then it can make an HTTP/REST API call to the API, as would be the case for user interface or programmatic API calls.
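As a rough illustration, here is a sketch of such a call using the JDK 11 HttpClient. The /run-status endpoint and the revision handling are from NiFi 1.x and should be verified against your version; the processor can discover its own id via getIdentifier().

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class StopSelf {

        public static void stopProcessor(String nifiUrl, String processorId,
                                         long revisionVersion) throws Exception {
            // The revision version must match the component's current revision;
            // fetch it first with GET /nifi-api/processors/{id} (parsing omitted).
            String body = "{\"revision\":{\"version\":" + revisionVersion + "},"
                        + "\"state\":\"STOPPED\"}";
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create(nifiUrl + "/nifi-api/processors/" + processorId + "/run-status"))
                    .header("Content-Type", "application/json")
                    .PUT(HttpRequest.BodyPublishers.ofString(body))
                    .build();
            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            if (response.statusCode() != 200) {
                throw new IllegalStateException("Stop failed: " + response.body());
            }
        }
    }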
Processors should, however, never be doing this. They are either scheduled to execute or not scheduled to execute. If the conditions needed to perform some function are no longer present, the processor can check for this, short-circuit its onTrigger call, and simply return. If the conditions are present, it can run.
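A minimal sketch of that short-circuit pattern, with a "done" flag standing in for whatever condition applies in your case:

    import java.util.Collections;
    import java.util.Set;
    import java.util.concurrent.atomic.AtomicBoolean;

    import org.apache.nifi.flowfile.FlowFile;
    import org.apache.nifi.processor.AbstractProcessor;
    import org.apache.nifi.processor.ProcessContext;
    import org.apache.nifi.processor.ProcessSession;
    import org.apache.nifi.processor.Relationship;
    import org.apache.nifi.processor.exception.ProcessException;

    public class OneShotProcessor extends AbstractProcessor {

        static final Relationship REL_SUCCESS = new Relationship.Builder()
                .name("success").build();

        // Hypothetical "job already ran" condition for a one-and-done job.
        private final AtomicBoolean done = new AtomicBoolean(false);

        @Override
        public Set<Relationship> getRelationships() {
            return Collections.singleton(REL_SUCCESS);
        }

        @Override
        public void onTrigger(ProcessContext context, ProcessSession session)
                throws ProcessException {
            if (done.get()) {
                context.yield(); // condition not met: short-circuit and return
                return;
            }
            FlowFile flowFile = session.get();
            if (flowFile == null) {
                return;
            }
            // ... perform the one-time work here ...
            session.transfer(flowFile, REL_SUCCESS);
            done.set(true);
        }
    }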
If you are triggering this custom processor from an upstream processor such as GenerateFlowFile, you may be able to leverage ExecuteScript to emulate a "one-and-done" job trigger; check out my blog post for Groovy script(s) that might help you achieve what you're trying to do.

Handling multiple Rest Services in ACTIVITI Process

I am completely new to Spring and Activiti, and I made myself a little project which works just fine. There are 4 service tasks, a REST controller, 1 process, 1 service, and 4 methods in this service.
When I call the server endpoint, I start my process, and it just goes step by step through my service tasks, calling service.method as defined in the expression ${service.myMethod()}.
But what I really need is a workflow that stops after a service call and waits until another request is sent, similar to a user task waiting for input; the whole process should pause until I send a request to another endpoint.
For example, myurl:8080/startprocess, and maybe the next day myurl:8080/continueprocess. Maybe even save some data for continued use.
Is there a simple predefined way to do this?
Best regards
You can use human tasks for that, or use a "signal intermediate catching event" (see Activiti's user guide) after each activity.
When you do that, the first REST call will start a new process instance that will execute your flow's activities until it reaches the signal element. When this happens, the engine saves its current state and returns control to the caller.
In order to make your flow progress, you have to send it a "signal", something you can do with a Java API call or via the REST API (see section 15.6.2 in the guide).
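A minimal sketch of that setup using Spring and the Activiti Java API. The process key "myProcess" and signal name "continueSignal" are placeholders that must match the signal intermediate catching event in your BPMN definition:

    import java.util.Collections;

    import org.activiti.engine.RuntimeService;
    import org.springframework.web.bind.annotation.PostMapping;
    import org.springframework.web.bind.annotation.RequestParam;
    import org.springframework.web.bind.annotation.RestController;

    @RestController
    public class ProcessController {

        private final RuntimeService runtimeService;

        public ProcessController(RuntimeService runtimeService) {
            this.runtimeService = runtimeService;
        }

        @PostMapping("/startprocess")
        public String start() {
            // Runs the service tasks until the signal catching event is reached;
            // the engine then persists state and returns control to the caller.
            return runtimeService.startProcessInstanceByKey("myProcess").getId();
        }

        @PostMapping("/continueprocess")
        public void resume(@RequestParam String processInstanceId) {
            // Find the execution waiting on the signal and wake it up.
            String executionId = runtimeService.createExecutionQuery()
                    .processInstanceId(processInstanceId)
                    .signalEventSubscriptionName("continueSignal")
                    .singleResult()
                    .getId();
            // Process variables can carry the "data for continued use".
            runtimeService.signalEventReceived("continueSignal", executionId,
                    Collections.<String, Object>singletonMap("resumedAt",
                            System.currentTimeMillis()));
        }
    }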

creating a pojo/ejb with spring 3 that always runs in the background

I have created apps in the past that had web pages calling the persistence layer to run queries or to insert, delete, etc. against a DB. However, nothing was left running in the background except for the persistence layer. Now I need to develop an app with a process that is always running in the background, waiting for messages to come through a ZeroMQ messaging system (I cannot change this at this point). I am a little lost as to how to set up the object so that it is always running and yet I can still control it or query results from it.
Is there any tutorial/examples that covers this configuration?
Thanks,
You could use some kind of timer to start a method every second that looks at a specific resource and processes the input taken from it.
If you use Spring, have a look at the @Scheduled annotation.
If your input arrives as some kind of Java method invocation, have a look at the java.util.concurrent package and at concurrent programming in general. But be aware that there are restrictions on creating your own threads in an EJB environment.
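A minimal sketch of the @Scheduled approach, assuming the JeroMQ client (org.zeromq) for the ZeroMQ side; the endpoint and socket type are placeholders:

    import java.util.concurrent.atomic.AtomicReference;

    import org.springframework.context.annotation.Configuration;
    import org.springframework.scheduling.annotation.EnableScheduling;
    import org.springframework.scheduling.annotation.Scheduled;
    import org.springframework.stereotype.Component;
    import org.zeromq.SocketType;
    import org.zeromq.ZContext;
    import org.zeromq.ZMQ;

    @Configuration
    @EnableScheduling
    class SchedulingConfig { }

    @Component
    public class ZmqListener {

        private final ZContext context = new ZContext();
        private final ZMQ.Socket socket;
        // Last processed message, queryable from other beans (e.g. a controller).
        private final AtomicReference<String> lastMessage = new AtomicReference<>();

        public ZmqListener() {
            socket = context.createSocket(SocketType.PULL); // assumed socket type
            socket.connect("tcp://localhost:5555");         // assumed endpoint
        }

        @Scheduled(fixedDelay = 1000)
        public void poll() {
            byte[] raw;
            // Drain whatever is queued without blocking the scheduler thread.
            while ((raw = socket.recv(ZMQ.DONTWAIT)) != null) {
                String msg = new String(raw, ZMQ.CHARSET);
                lastMessage.set(msg);
                // ... persist or process the message here ...
            }
        }

        public String getLastMessage() {
            return lastMessage.get();
        }
    }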

How to get notification of workflow errors?

I am having issues where a workflow stalls because there is a problem sending an email (the Send Email activity). Typically, this is solved simply by resuming the workflow. I'm wondering if there is any way to react to a workflow error, so the user knows they need to go in and resume the workflow.
I'm also wondering about this for a workflow that attempts to assign a task to a user who no longer exists in the CRM, or one with an invalid email address, which I assume would cause workflow errors as well.
Any other suggestions related to this sort of issue would be welcome.
Thanks!
My point of view is that monitoring can't be done inside CRM, because any CRM process could itself be the problematic one (what happens if the workflow that monitors other workflows fails?).
The way I have done this was by adding a SQL query (one that checks the workflow instance state) to a monitoring tool such as Nagios with the check_mssql_health plugin; alternatively, you can create a small service that sends notification emails over SMTP.
Off the top of my head, I can't think of an automated way to do it (you could try attaching a workflow to a workflow instance record, but I'm not sure if that will do it).
I'd probably try to build a utility to query workflow instance records, and then notify users if necessary based on their status.
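As a rough illustration of such a utility in Java: the AsyncOperationBase table, OperationType = 10 (workflow), and the status code values below are assumptions that should be verified against your CRM version before relying on anything like this.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;
    import java.util.Properties;

    import javax.mail.Message;
    import javax.mail.Session;
    import javax.mail.Transport;
    import javax.mail.internet.InternetAddress;
    import javax.mail.internet.MimeMessage;

    public class WorkflowMonitor {

        public static void main(String[] args) throws Exception {
            // StatusCode 31 = Failed and 10 = Waiting are common values for CRM
            // async operations, but verify them for your deployment (assumption).
            String sql = "SELECT Name, COUNT(*) AS Stalled FROM AsyncOperationBase "
                       + "WHERE OperationType = 10 AND StatusCode IN (10, 31) "
                       + "GROUP BY Name";
            StringBuilder report = new StringBuilder();
            try (Connection con = DriverManager.getConnection(
                     "jdbc:sqlserver://crm-sql;databaseName=Org_MSCRM", "user", "pass");
                 Statement st = con.createStatement();
                 ResultSet rs = st.executeQuery(sql)) {
                while (rs.next()) {
                    report.append(rs.getString("Name")).append(": ")
                          .append(rs.getInt("Stalled")).append(" stalled\n");
                }
            }
            if (report.length() > 0) {
                sendMail(report.toString());
            }
        }

        private static void sendMail(String body) throws Exception {
            Properties props = new Properties();
            props.put("mail.smtp.host", "smtp.example.com"); // assumed SMTP relay
            Session session = Session.getInstance(props);
            MimeMessage msg = new MimeMessage(session);
            msg.setFrom(new InternetAddress("crm-monitor@example.com"));
            msg.addRecipient(Message.RecipientType.TO,
                    new InternetAddress("admin@example.com"));
            msg.setSubject("Stalled CRM workflows");
            msg.setText(body);
            Transport.send(msg);
        }
    }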

Windows Workflow - Is there a way to guarantee only one workflow running?

The workflow is being published as a wcf service, and I need to guarantee that workflows execute sequentially. Is there a way--in code or in the config--to guarantee the runtime doesn't launch two workflows concurrently?
There is no way to configure the runtime to limit the number of workflows in progress.
Consider, though, that it's the responsibility of the workflow itself to control flow. Hence the workflow itself should have a means to determine whether another instance of itself is currently in progress.
I would consider creating an Activity that would transactionally attempt to update a DB record to the effect that an instance of this workflow is in progress. If it finds that another is currently in progress it could take the appropriate action. It could fail or it could queue itself using an EventActivity to be alerted when the previous workflow has completed.
You probably will need to check at workflow start for another running instance.
If found, cancel it.
I don't agree that this needs to be handled at the WorkflowRuntime level. I like the idea of a custom Activity, sort of a MutexActivity that would be a CompositeActivity that has a DB backend. The first execution would log to the database it has a hold of the mutex. Subsequent calls would queue up their workflow IDs and then go idle. When the MutexActivity completes, it would release the Mutex, load up the next workflow in the queue and invoke the contained child activities.
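Windows Workflow activities are written in .NET, so purely to illustrate the transactional claim/release pattern such a MutexActivity would implement, here is a JDBC sketch of the underlying idea (the WorkflowMutex table with Name and OwnerId columns is an assumption):

    import java.sql.Connection;
    import java.sql.PreparedStatement;

    public class WorkflowMutex {

        // Atomically claim the mutex: succeeds only if no other owner holds it.
        public static boolean tryAcquire(Connection con, String name,
                                         String workflowId) throws Exception {
            try (PreparedStatement ps = con.prepareStatement(
                    "UPDATE WorkflowMutex SET OwnerId = ? "
                  + "WHERE Name = ? AND OwnerId IS NULL")) {
                ps.setString(1, workflowId);
                ps.setString(2, name);
                return ps.executeUpdate() == 1; // one row claimed => we own it
            }
        }

        // Release the mutex so the next queued workflow can claim it.
        public static void release(Connection con, String name,
                                   String workflowId) throws Exception {
            try (PreparedStatement ps = con.prepareStatement(
                    "UPDATE WorkflowMutex SET OwnerId = NULL "
                  + "WHERE Name = ? AND OwnerId = ?")) {
                ps.setString(1, name);
                ps.setString(2, workflowId);
                ps.executeUpdate();
            }
        }
    }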
