An important point to note about the Microsoft Dynamics CRM event execution pipeline is the two-minute time limit on the execution of all custom components. Regardless of whether a plug-in or custom workflow activity executes synchronously or asynchronously, there is a two-minute limit on the execution of a (message) request. If your plug-in or custom workflow activity logic exceeds this limit, a System.TimeoutException is thrown.
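To make that concrete, here is a minimal plug-in sketch (the method name DoComplexComputation is a hypothetical placeholder, not part of the SDK). If the work inside it ran past the two-minute limit, the platform would abort the request and a System.TimeoutException would surface:

```csharp
using System;
using Microsoft.Xrm.Sdk;

// Minimal plug-in sketch. DoComplexComputation is a hypothetical placeholder
// standing in for any long-running custom logic.
public class LongRunningPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider
            .GetService(typeof(IPluginExecutionContext));

        // If this call runs longer than the platform's two-minute limit,
        // the execution is aborted and a System.TimeoutException is thrown.
        DoComplexComputation(context);
    }

    private void DoComplexComputation(IPluginExecutionContext context)
    {
        // ... long-running computation would go here ...
    }
}
```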
In Dynamics CRM on-premise, this two-minute value can be changed, although doing so is unsupported. In Dynamics CRM Online, there is no way to override it.
My team and I were faced with this dilemma, and after a lot of back-and-forth proof-of-concept work that included tens of design, build, and test cycles, we concluded that the best way forward, at least for our very complex computation, was an Azure-based computation service. We tried having custom components (plug-ins or custom workflow activities) call each other, but this ran into the IExecutionContext.Depth property (Microsoft.Xrm.Sdk), which enforces a maximum depth of 8 within a one-hour window (more details here: https://msdn.microsoft.com/en-us/library/microsoft.xrm.sdk.iexecutioncontext.depth.aspx)
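As a sketch of how the depth limit surfaces: each time one registered component triggers another, the Depth value on the execution context increments, and the platform fails the request once it exceeds the maximum. A common defensive guard reads the property up front:

```csharp
using System;
using Microsoft.Xrm.Sdk;

// Sketch of a depth guard. Depth starts at 1 for the original request and
// increments each time a plug-in or workflow activity triggers another one.
public class DepthAwarePlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider
            .GetService(typeof(IPluginExecutionContext));

        // Bail out when re-entered by another component, so chained
        // executions never approach the maximum-depth limit of 8.
        if (context.Depth > 1)
        {
            return;
        }

        // ... main logic for the original request ...
    }
}
```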
That left us with the option of running a Windows service (or a console app, which is not recommended) on an Azure server that pulls the data from CRM Online, applies its computation, and writes the results back to CRM and other applications where appropriate. This might not be the optimal solution for every similar scenario (and this post is provided as is, with no warranty). However, it worked really well for our complex calculation: we achieved strong performance (after tens of performance and code-optimisation cycles) and avoided the two-minute time limit that had been bugging us during our technical design work (more details here: https://msdn.microsoft.com/en-us/library/gg327941.aspx)
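The shape of that external worker, as a hedged sketch: it connects with Microsoft.Xrm.Tooling.Connector, queries pending records, computes, and updates the results. The entity and attribute names here (new_calculation, new_inputvalue, new_result, new_status) are hypothetical placeholders, not the ones we actually used:

```csharp
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;
using Microsoft.Xrm.Tooling.Connector;

// Polling-loop sketch for a Windows service hosted on an Azure server.
// All "new_" entity/attribute names are hypothetical placeholders.
public class CalculationWorker
{
    public void RunOnce(string connectionString)
    {
        // e.g. "AuthType=Office365;Username=user@org.onmicrosoft.com;
        //       Password=...;Url=https://org.crm.dynamics.com"
        var service = new CrmServiceClient(connectionString);

        var query = new QueryExpression("new_calculation")
        {
            ColumnSet = new ColumnSet("new_inputvalue")
        };
        query.Criteria.AddCondition("new_status", ConditionOperator.Equal, "Pending");

        foreach (Entity record in service.RetrieveMultiple(query).Entities)
        {
            // No two-minute limit applies here: the computation runs
            // outside the CRM sandbox, for as long as it needs.
            decimal result = Compute(record.GetAttributeValue<decimal>("new_inputvalue"));

            record["new_result"] = result;
            record["new_status"] = "Completed";
            service.Update(record);
        }
    }

    private decimal Compute(decimal input)
    {
        return input; // placeholder for the complex computation
    }
}
```

In practice a service like this would run the loop on a timer and batch its updates, but the key design point is the same: the heavy computation lives outside the event execution pipeline, so the sandbox limits no longer apply.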
Hope this helps – if you have any questions, please ask via a comment below.