There are only so many times that an industry can solve the same problem before someone devises a way to turn the solution into a product. With data science, that limit is being reached, and Models as a Service is the “product”.

Models as a Service (MaaS) is the Quantitative Analyst’s (aka Quant or Data Scientist) answer to Software as a Service. Instead of providing a subscription to a software application, the MaaS paradigm offers a quantitative model and its data output as a subscription. That output is a dedicated feed of signals or measurements that drives proprietary models, visualizations, or decision support systems. In other words, these models become part of a broader analytical system that operates continuously. Such systems are commonplace in finance, and as big data and analytics permeate the fabric of every industry, the same trend will continue to unfold.
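To make the feed idea concrete, here is a minimal sketch of what consuming such a subscription might look like in Python. The endpoint URL, credential, and response fields are illustrative assumptions, not any particular provider’s API.

```python
# Hypothetical sketch: consuming a MaaS signal feed and driving a simple
# downstream decision rule. The endpoint, credential, and response fields
# are illustrative assumptions, not a real provider's API.
import requests

MAAS_FEED_URL = "https://api.example-maas.com/v1/signals"  # hypothetical endpoint
API_KEY = "your-subscription-key"                          # hypothetical credential


def fetch_latest_signals(limit: int = 100) -> list[dict]:
    """Pull the most recent batch of model outputs from the subscription feed."""
    resp = requests.get(
        MAAS_FEED_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        params={"limit": limit},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["signals"]  # assumed response shape


def drive_decision_system(signals: list[dict]) -> None:
    """Toy stand-in for proprietary logic layered on top of the MaaS output."""
    for signal in signals:
        if signal.get("sentiment", 0.0) < -0.5:
            print(f"Flag content {signal['content_id']} for review")


if __name__ == "__main__":
    drive_decision_system(fetch_latest_signals())
```

The point is that the model outputs arrive as data, ready to be piped into whatever proprietary system you already run.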

The MaaS paradigm isn’t new, but its broad applicability is. With so many companies racing to integrate social media analytics into their product and marketing strategies, the need for models that can make sense of this vast sea of data has never been more pressing. Similarly, the rise of the Internet of Things (IoT) is producing copious amounts of data that demand quantitative models to interpret. The common goal across all this data is modeling user behavior and the semantic structure of user-generated content. Models as a Service delivers the solution and saves businesses from reinventing the wheel.

Why else is MaaS a better approach than building models in-house? It comes down to core competencies. The first way to look at this is from the perspective of hardware. Electronics manufacturing involves a number of specializations. Component manufacturers focus on producing integrated circuits (ICs), each designed for a few specific functions. ICs are an essential part of the end product but aren’t consumer-ready per se. Device manufacturers connect these components in a proprietary design to create consumer-facing products. Each business focuses on its core strengths rather than trying to build the complete stack.

[Figure: hardware]

Software has a similar structure, where numerous components are required to build a consumer-facing product or service. Many technology companies are not in the business of making application servers or message queues, despite relying on these components, because doing so is not their core competency. Conversely, the companies that do build such infrastructure are generally not in the business of creating consumer-facing applications. Analytical systems go a step further in that they require models and data to drive the end product. Just as it doesn’t make sense to build your own message queue, it doesn’t always make sense to build your own models.

[Figure: software]

This is where a company like Zato Novo comes into play: our core competency is building models. The models in our flagship service, Panoptez, solve universal problems, such as identifying user interests, computing sentiment, analyzing the semantic structure of content, and clustering content. These models are like ICs that can be combined in different ways to create a consumer application.

[Figure: core competencies]
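To illustrate the “models as ICs” idea, here is a hypothetical sketch (not the actual Panoptez API): a stand-in client exposes two generic model calls, and a short application-level function wires them together into a proprietary rule.

```python
# Hypothetical sketch of combining generic MaaS "ICs" into an application.
# The client class and its methods are illustrative stand-ins, not the real
# Panoptez API; in practice each method would call a subscribed model endpoint.
from dataclasses import dataclass


@dataclass
class Post:
    author: str
    text: str


class MaaSClient:
    """Stand-in for a subscription client exposing generic model outputs."""

    def sentiment(self, text: str) -> float:
        # Would call the provider's sentiment model; returns a neutral stub here.
        return 0.0

    def interests(self, author: str) -> list[str]:
        # Would call a user-interest model; returns an empty stub here.
        return []


def prioritize_outreach(posts: list[Post], client: MaaSClient) -> list[Post]:
    """Proprietary application logic composed from two generic model calls."""
    return [
        post for post in posts
        if client.sentiment(post.text) < -0.25
        and "support" in client.interests(post.author)
    ]


if __name__ == "__main__":
    posts = [Post(author="alice", text="This update broke my workflow")]
    print(prioritize_outreach(posts, MaaSClient()))  # [] with the neutral stubs
```

The generic pieces are interchangeable; the value a business adds lives in the composition, not in rebuilding the components.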

By leveraging these generic components, businesses can focus on value-added development, such as proprietary domain-specific models, instead of getting distracted from their core mission by reinventing the wheel. Furthermore, Data Scientists are a scarce resource; partnering with a MaaS provider adds the scale needed to execute an analytics strategy effectively. It’s only a matter of time before every CIO is asked, “Who’s your MaaS provider?”

So is this the future of data science? Tell us what you think in the comments.