Problem

Often there is overlap in functionality between different modules. In a monolithic approach, we split the shared functionality into reusable modules and import them where needed. Take Netflix as an example: to support a wide variety of devices, Netflix encodes every show in all the supported formats. Encoding is a CPU-intensive task. With a monolithic approach, we tend to scale up the capacity of the entire server, but if you look closely, only the encoding service needs the extra CPU. To address this, we can split the pipeline into a set of small tasks, each performing one dedicated operation on its own.

Solution

Here a complex process is split up into small individual processes. Assume Task A is CPU intensive: we can run it on a large machine to make it faster. Task B is a small task that does not need much compute, so we can run it on a serverless framework or scale it horizontally to increase execution throughput.
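
A minimal sketch of the idea, assuming hypothetical task names and a dict payload: each task is a small function with the same input/output shape, so the pipeline is just a chain of independent steps that can later be deployed on whatever hardware suits each of them.

```python
# Each stage takes a job dict and returns an updated job dict, so stages
# can be developed, tested, scaled, and deployed independently.

def transcode(job: dict) -> dict:
    # Task A: CPU-intensive; in production this would run on a large machine.
    job["encoded"] = f"encoded({job['source']})"
    return job

def generate_thumbnail(job: dict) -> dict:
    # Task B: lightweight; a good fit for serverless or small workers.
    job["thumbnail"] = f"thumb({job['source']})"
    return job

PIPELINE = [transcode, generate_thumbnail]

def run(job: dict) -> dict:
    for stage in PIPELINE:
        job = stage(job)
    return job

if __name__ == "__main__":
    print(run({"source": "show-123.mp4"}))
```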

To make it **event driven**, we can use queues between the tasks, thereby achieving segregation between the modules.
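
As a rough event-driven sketch, the following uses Python's standard-library queue and threading modules as a stand-in for a real message broker (Kafka, RabbitMQ, Azure Service Bus, etc.); the stage names are hypothetical. Each task only knows its input and output queue, never the other tasks:

```python
import queue
import threading

# Queues act as the "pipes" between the tasks.
encode_q = queue.Queue()
thumb_q = queue.Queue()
done_q = queue.Queue()

def worker(in_q, out_q, stage):
    # Generic consumer loop: pull a job, process it, push it downstream.
    while True:
        job = in_q.get()
        out_q.put(stage(job))
        in_q.task_done()

def transcode(job):
    return {**job, "encoded": True}

def thumbnail(job):
    return {**job, "thumbnail": True}

threading.Thread(target=worker, args=(encode_q, thumb_q, transcode), daemon=True).start()
threading.Thread(target=worker, args=(thumb_q, done_q, thumbnail), daemon=True).start()

encode_q.put({"source": "show-123.mp4"})
print(done_q.get())  # blocks until the job has flowed through both stages
```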

Why

Pros

  • Reusability of code (or modules)
  • Performance (large CPU-bound tasks can be allocated to large, CPU-optimized machines - vertical scaling)
  • Scalability (if a certain task takes longer to process, we can scale it horizontally to meet demand - see the sketch after this list)
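
A rough sketch of the scalability point, again using standard-library queues in place of a real broker: because the consumers are stateless, a slow stage can be scaled out just by starting more identical workers against the same queue. This assumes the broker delivers each message to exactly one consumer, as most work queues do.

```python
import queue
import threading

jobs = queue.Queue()
results = queue.Queue()

def slow_stage(job):
    # Placeholder for a task that takes a long time per item.
    return {**job, "processed": True}

def worker():
    while True:
        job = jobs.get()
        results.put(slow_stage(job))
        jobs.task_done()

# Horizontal scaling: N identical consumers share a single input queue.
N = 4
for _ in range(N):
    threading.Thread(target=worker, daemon=True).start()

for i in range(10):
    jobs.put({"id": i})

jobs.join()  # wait until every job has been processed by some worker
print(f"processed {results.qsize()} jobs across {N} workers")
```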

Cons

  • Difficult to debug (a single request now flows through multiple tasks and queues, so tracing a failure means correlating logs and messages across all of them)

Reference

Azure Architecture Center