We've all learned that separation of concerns is a fundamental good in software/systems design. Often, what is seen as a logical next step is to package these concerns into separately deployable artifacts - as services requiring remote invocation, for example.
Many a startup goes through these phases - start simply, with a monolithic codebase and a relational DB as the universal persistence mechanism. If the startup is successful, its growth inevitably requires more - more features, more specialization by market/locale and user segment, more experimentation, integration of new and acquired capabilities, external systems, partners, etc.
At some point, the monolithic code base becomes too complex and the engineering organization too large for every engineer to understand most of the code. So the order of the day is... isolation! Draw reasonable boundaries within the code and attempt to create good interfaces between isolatable portions (hopefully hiding implementation details of each from the others).
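As a minimal sketch of what such a boundary could look like while still inside a single codebase (the billing names below are hypothetical, chosen only for illustration), the idea is that other modules depend on a narrow interface and never on what sits behind it:

```typescript
// Hypothetical example: a billing boundary inside one codebase.
// Other modules depend only on this interface, never on the implementation.
export type ChargeResult =
  | { status: "charged"; receiptId: string }
  | { status: "declined"; reason: string };

export interface BillingService {
  chargeCustomer(customerId: string, amountCents: number): Promise<ChargeResult>;
}

// The implementation details (payment gateway, retries, ledger writes)
// stay private to the billing module.
class DefaultBillingService implements BillingService {
  async chargeCustomer(customerId: string, amountCents: number): Promise<ChargeResult> {
    // ...gateway call, ledger update, etc. would live here...
    return { status: "charged", receiptId: `rcpt-${customerId}-${amountCents}` };
  }
}

export const billingService: BillingService = new DefaultBillingService();
```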
And how can one isolate various code domains from each other in a way that makes enforcement of dependency management simple? Well - SOA (service-oriented architecture) of course!
So soon enough, instead of a monolithic code base, there's a rich panoply of services. But not without a cost - creating a lot of services and forcing remote invocation in the name of isolation is far more expensive than executing within the memory space of a single process. Intermediation between a multitude of services becomes the next challenge - with far-from-trivial issues of orchestration, latency, topology & discovery, geo-distributed failover, etc. If processing the same request now requires a cascade of a dozen or more service calls, how does one still make it performant? And how does one handle the versioning required across common interfaces, supporting both new and legacy clients?
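On the versioning question, one common shape of the problem is a shared interface that has to accept both the legacy and the new request format at once. A minimal sketch with entirely hypothetical field names (not from any particular system) might look like this:

```typescript
// Hypothetical request shapes for a shared user-profile service.
type GetProfileV1 = { userId: string };                             // legacy clients
type GetProfileV2 = { version: 2; userId: string; locale: string }; // new clients
type GetProfileRequest = GetProfileV1 | GetProfileV2;

interface Profile {
  userId: string;
  displayName: string;
  locale: string;
}

// Normalize legacy requests so the core logic only ever sees the current
// shape; legacy clients keep working, new clients get the richer contract.
function normalize(req: GetProfileRequest): GetProfileV2 {
  if ("version" in req && req.version === 2) return req;
  return { version: 2, userId: req.userId, locale: "en-US" }; // assumed default locale
}

async function getProfile(req: GetProfileRequest): Promise<Profile> {
  const { userId, locale } = normalize(req);
  // ...the cascade of downstream service calls would happen here...
  return { userId, displayName: "placeholder", locale };
}
```

The same normalization idea applies regardless of where the version signal lives - a request field, a URL path, or a header - the point is to confine legacy handling to the edge of the interface.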
Attacking these issues by sheer brute force is usually an expensive proposition - with costs rising rapidly. The only answer I have found is ensuring an adequate investment in quality technical design and architecture.