Activiti REST API: complete process instance - spring-boot

I am working with the Activiti 5 REST API, integrated via the Spring Boot Activiti Starter, and I am trying to complete a process instance. I was able to instantiate a process definition, walk through the process instance's tasks and complete each of them. This works correctly until the end of the process, when there are no pending tasks left. At that point I would expect the process instance to be completed (i.e. completed: true), as I have an end event (terminateEventDefinition), but it is not.
I could not find a REST API to complete the process instance. So what is the correct way of managing process instance completion?
Thanks.

Perhaps I am missing something, but after the last task is completed the process will end normally and will no longer be visible in the /runtime/process-instances list.
Now, you mention that you complete the instance with a terminate end event. A terminate end event will end the instance, but it will not set the "completed" flag; terminate is typically used for cancelling a running process.
Instead of a terminate end event, you should use a regular end event, which should set the completed flag.
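For reference, here is a minimal sketch of how you could double-check this from the Java side, assuming the RuntimeService and HistoryService beans that the Spring Boot Activiti Starter auto-configures (the class and method names are just placeholders):

    import org.activiti.engine.HistoryService;
    import org.activiti.engine.RuntimeService;
    import org.activiti.engine.history.HistoricProcessInstance;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.stereotype.Service;

    @Service
    public class ProcessStatusChecker {

        @Autowired
        private RuntimeService runtimeService;

        @Autowired
        private HistoryService historyService;

        // Returns true once the instance has passed through an end event.
        public boolean isCompleted(String processInstanceId) {
            // A finished instance disappears from the runtime tables...
            boolean stillRunning = runtimeService.createProcessInstanceQuery()
                    .processInstanceId(processInstanceId)
                    .count() > 0;

            // ...and its historic entry gets an end time.
            HistoricProcessInstance historic = historyService
                    .createHistoricProcessInstanceQuery()
                    .processInstanceId(processInstanceId)
                    .singleResult();

            return !stillRunning && historic != null && historic.getEndTime() != null;
        }
    }

If you prefer to stay on the REST side, the history resources expose the same end-time information for finished instances.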
Again, maybe I am missing something in your description.
Thanks
Greg

Related

Obtaining a handle to a JobObject the process is running under

I can check whether my process is in a job via IsProcessInJob with a null job handle, but I need a handle to that job. How do I go about getting one?
We have an automation system that wraps processes in job objects. However, we have to call a script provided to us which launches a process outside of this job object, so we need to manually add that process to the job so that if the job fails and has to be killed, that process is killed as well.
Windows provides no supported method to open the job object associated with your process; it is the responsibility of the process that creates the job object to provide a way for other processes to find it, if it is desirable that they are able to do so. However, since the only reason you wanted to put the child into the job was so that it could be killed by the automation system, there is an alternative solution.
Instead of trying to put the child into the existing job object, create a new job object, one under your own control, and assign the child to it. Set the JOB_OBJECT_LIMIT_KILL_ON_JOB_CLOSE basic limit flag so that the child (and any children it may have) will be killed when your job object is deleted, and make sure that your handle to the job object isn't inheritable.
If the automation system kills your process, or if it exits for any other reason, Windows will automatically close the handle and delete the job object, killing the child processes as desired.
Harry Johnston's suggestion worked.
If the only reason to put the child into the job is so that it can be killed, you might be able to do that another way. Put the child into a newly created job object, with JOB_OBJECT_LIMIT_KILL_ON_JOB_CLOSE set, and make sure the job handle isn't inheritable. If your process is killed, the job handle will be closed, causing the child to be killed.
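If your wrapper process happens to run on the JVM, here is a rough sketch of the same idea using JNA 5 to reach the Win32 calls. The hand-rolled kernel32 mapping, the hard-coded 64-bit struct layout, and notepad.exe as the child are all assumptions made for illustration, not a drop-in implementation:

    import com.sun.jna.Memory;
    import com.sun.jna.Native;
    import com.sun.jna.Pointer;
    import com.sun.jna.WString;
    import com.sun.jna.win32.StdCallLibrary;

    public class KillOnCloseJob {

        // Hand-rolled mapping of just the Win32 calls we need (HANDLEs mapped as Pointers).
        public interface Kernel32 extends StdCallLibrary {
            Kernel32 INSTANCE = Native.load("kernel32", Kernel32.class);

            Pointer CreateJobObjectW(Pointer securityAttributes, WString name);
            boolean SetInformationJobObject(Pointer job, int infoClass, Pointer info, int infoLength);
            boolean AssignProcessToJobObject(Pointer job, Pointer process);
            Pointer OpenProcess(int desiredAccess, boolean inheritHandle, int processId);
        }

        // Win32 constants (values from WinNT.h).
        private static final int JOB_OBJECT_LIMIT_KILL_ON_JOB_CLOSE = 0x2000;
        private static final int JobObjectExtendedLimitInformation  = 9;
        private static final int PROCESS_SET_QUOTA = 0x0100;
        private static final int PROCESS_TERMINATE = 0x0001;

        public static void main(String[] args) throws Exception {
            Kernel32 k32 = Kernel32.INSTANCE;

            // Anonymous job object; the handle is not inheritable, so only this
            // process keeps the job alive.
            Pointer job = k32.CreateJobObjectW(null, null);

            // JOBOBJECT_EXTENDED_LIMIT_INFORMATION, 64-bit layout: 144 bytes in total,
            // LimitFlags at offset 16. Kill everything in the job when the last handle
            // to the job is closed, i.e. when this process exits or is killed.
            Memory info = new Memory(144);
            info.clear();
            info.setInt(16, JOB_OBJECT_LIMIT_KILL_ON_JOB_CLOSE);
            k32.SetInformationJobObject(job, JobObjectExtendedLimitInformation, info, (int) info.size());

            // Launch the child (notepad.exe is just a stand-in) and assign it to the job.
            Process child = new ProcessBuilder("notepad.exe").start();
            Pointer childHandle = k32.OpenProcess(PROCESS_SET_QUOTA | PROCESS_TERMINATE, false, (int) child.pid());
            k32.AssignProcessToJobObject(job, childHandle);

            child.waitFor();
        }
    }

The essential calls are the three mentioned above (CreateJobObject, SetInformationJobObject with JOB_OBJECT_LIMIT_KILL_ON_JOB_CLOSE, AssignProcessToJobObject); everything else is plumbing. A production version would create the child suspended so it cannot spawn grandchildren before it is placed in the job.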

Handling multiple REST services in an Activiti process

I am completely new to Spring and Activiti, and I made myself a little project which works just fine. There are 4 service tasks, a REST controller, 1 process, 1 service, and 4 methods in this service.
When I call the server endpoint, I start my process and it goes step by step through my service tasks, calling each service method as defined in the expression ${service.myMethod()}.
BUT, what I really need is a workflow that stops after a service call and waits until another request is sent, similar to a user task waiting for input; the whole process should pause until I send a request to another endpoint.
For example myurl:8080/startprocess, and maybe the next day myurl:8080/continueprocess. Maybe it could even save some data for continued use.
Is there a simple, predefined way to do this?
Best regards
You can use human tasks for that, or use a "signal intermediate catching event" (see Activiti's user guide) after each activity.
When you do that, the first REST call will start a new process instance that executes your flow's activities until it reaches the signal element. When this happens, the engine saves its current state and returns control to the caller.
In order to make your flow progress, you have to send it a "signal", something you can do with a Java API call or using the REST API (see item 15.6.2 in the guide).
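If the engine is embedded via the Spring Boot starter, the "continueprocess" endpoint can deliver that signal directly through RuntimeService. A minimal sketch, assuming a signal named continueRequest in the BPMN model and the signalEventReceived overload that targets a single execution (the path and names are placeholders):

    import java.util.Map;

    import org.activiti.engine.RuntimeService;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.web.bind.annotation.PathVariable;
    import org.springframework.web.bind.annotation.PostMapping;
    import org.springframework.web.bind.annotation.RequestBody;
    import org.springframework.web.bind.annotation.RestController;

    @RestController
    public class ContinueProcessController {

        @Autowired
        private RuntimeService runtimeService;

        // myurl:8080/continueprocess/{executionId} wakes up the execution that is
        // parked on the signal intermediate catching event.
        @PostMapping("/continueprocess/{executionId}")
        public void continueProcess(@PathVariable String executionId,
                                    @RequestBody Map<String, Object> data) {
            // "continueRequest" must match the signal referenced in your BPMN model.
            runtimeService.signalEventReceived("continueRequest", executionId, data);
        }
    }

The map passed along is stored as process variables on the instance, which covers the "save some data for continued use" part.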

Detach debugger (unknown) from process?

I am trying to attach a debugger (WinDbg, OllyDbg) to a running process, but I get an error saying a debugger is already attached. How can I detach that unknown debugger from the process?
The process is multi-threaded; one thread can be attached to a debugger and the others can't.
The process might be spawning a second process which attaches to the first process for debugging using DebugActiveProcess() in order to prevent people from debugging the first process. Keep in mind that a process cannot debug itself using this method, so a second process must be spawned to do this.
Things you could try:
Use any sort of process monitor or even Task Manager to figure out which process the first process spawns (see the sketch after this list)
Inject code into the second process to call DebugActiveProcessStop() to detach it from the first process
Hook DebugActiveProcess() (kernel32.DebugActiveProcess, ntdll.ZwDebugActiveProcess, or in kernel mode) and redirect it to attach to a different dummy process
Kill the second process
Prevent the second process from getting a handle to the first process with the needed permissions - DebugActiveProcess() will then fail
Use alternative debugging methods (Cheat Engine with VEH debugging, for example) that don't use the normal debugging APIs and therefore bypass this problem
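For the first item in the list, if you have a JVM available, Java 9's ProcessHandle is a quick way to see what the target has spawned (pass the target PID as the first argument; this only lists processes, it does not detach anything):

    public class ChildLister {
        public static void main(String[] args) {
            long targetPid = Long.parseLong(args[0]);

            // Walk every descendant of the target and print its PID and image path;
            // the process acting as the hidden debugger usually shows up here.
            ProcessHandle.of(targetPid).ifPresent(target ->
                    target.descendants().forEach(child ->
                            System.out.printf("pid=%d cmd=%s%n",
                                    child.pid(),
                                    child.info().command().orElse("<unknown>"))));
        }
    }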

FitNesse Slim test with SpringWirableFixture not terminating

I'm creating a FitNesse Slim test (decision table). In order to run the test I need to start my Spring application context, and for that I'm using a class that extends FixtureWirer. Starting the application context is not a problem, since the test completes successfully: on the page I can see that the test is complete and all the output values are compared. The problem is that the page with the final results never stops loading, although no exception is thrown. This only happens when I use the FixtureWirer to start the application context, so I'm guessing the problem is related to that, but I haven't been able to figure it out.
Thanks in advance.
Do you mean the test doesn't stop as it is supposed to? If so, can you try closing the context?
This may be because Spring's shutdown hook is not able to destroy beans that are blocked by some resource.
Can you check whether any live threads are still blocked after the test has executed?
If any live threads are left, the Slim server will not terminate (the SlimServer Java process will not be killed).
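A rough sketch of both suggestions in fixture form, assuming a plain annotation-based context (AppConfig and the class name are placeholders for whatever your FixtureWirer subclass really loads):

    import java.util.Map;

    import org.springframework.context.ConfigurableApplicationContext;
    import org.springframework.context.annotation.AnnotationConfigApplicationContext;
    import org.springframework.context.annotation.Configuration;

    public class SpringWiredDecisionTable {

        // Placeholder for your real configuration class.
        @Configuration
        static class AppConfig { }

        // One context for the whole test page.
        private static final ConfigurableApplicationContext CONTEXT =
                new AnnotationConfigApplicationContext(AppConfig.class);

        // Closing the context destroys the beans and stops the non-daemon
        // threads (schedulers, pools) that keep the Slim JVM alive.
        public boolean shutDownContext() {
            CONTEXT.close();
            return true;
        }

        // Diagnostic helper: list live non-daemon threads after the test has run.
        public String liveNonDaemonThreads() {
            StringBuilder names = new StringBuilder();
            for (Map.Entry<Thread, StackTraceElement[]> e : Thread.getAllStackTraces().entrySet()) {
                Thread t = e.getKey();
                if (t.isAlive() && !t.isDaemon()) {
                    names.append(t.getName()).append(' ');
                }
            }
            return names.toString().trim();
        }
    }

Calling shutDownContext from the last table on the page (or from a suite teardown) releases the beans explicitly; liveNonDaemonThreads helps confirm whether something is still holding the Slim JVM open.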

Windows Workflow - Is there a way to guarantee only one workflow running?

The workflow is being published as a WCF service, and I need to guarantee that workflows execute sequentially. Is there a way, in code or in configuration, to guarantee that the runtime doesn't launch two workflows concurrently?
There is no way to configure the runtime to limit the number of workflows in progress.
Consider, though, that it's the responsibility of the workflow itself to control flow. Hence the workflow itself should have a means to determine whether another instance of itself is currently in progress.
I would consider creating an Activity that would transactionally attempt to update a DB record to the effect that an instance of this workflow is in progress. If it finds that another is currently in progress, it could take the appropriate action: it could fail, or it could queue itself using an EventActivity to be alerted when the previous workflow has completed.
You probably will need to check at workflow start for another running instance.
If found, cancel it.
I don't agree that this needs to be handled at the WorkflowRuntime level. I like the idea of a custom Activity, a sort of MutexActivity that would be a CompositeActivity with a DB backend. The first execution would record in the database that it holds the mutex. Subsequent calls would queue up their workflow IDs and then go idle. When the MutexActivity completes, it would release the mutex, load the next workflow in the queue, and invoke the contained child activities.
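Stripped of the WF plumbing, the claim step described above is just a conditional update that either wins or doesn't. Purely as an illustration of that database side (the table and column names are invented, and the real thing would live inside the custom Activity, not in standalone code):

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;

    public class WorkflowMutex {

        // Try to claim the single mutex row for this workflow type.
        // Returns true if we now own it, false if another instance does.
        public boolean tryAcquire(Connection con, String workflowType, String instanceId) throws SQLException {
            String sql = "UPDATE workflow_mutex SET owner_instance_id = ? " +
                         "WHERE workflow_type = ? AND owner_instance_id IS NULL";
            try (PreparedStatement ps = con.prepareStatement(sql)) {
                ps.setString(1, instanceId);
                ps.setString(2, workflowType);
                return ps.executeUpdate() == 1;   // 0 rows updated => someone else holds it
            }
        }

        // Release on completion so the next queued instance can proceed.
        public void release(Connection con, String workflowType, String instanceId) throws SQLException {
            String sql = "UPDATE workflow_mutex SET owner_instance_id = NULL " +
                         "WHERE workflow_type = ? AND owner_instance_id = ?";
            try (PreparedStatement ps = con.prepareStatement(sql)) {
                ps.setString(1, workflowType);
                ps.setString(2, instanceId);
                ps.executeUpdate();
            }
        }
    }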
