At this week’s series of SQL Server community events hosted under the SQLRelay umbrella, I presented a session titled “Managing multi-vendor technology environments”. It sounds like something large organisations have to do, which is true, but it’s now also something individual solution owners have to do – especially in the business analytics space.
Rarely is one vendor’s technology enough to meet a solution’s every requirement. Whether it’s the source systems, the storage and analysis layers, or the presentation layer, it’s increasingly common for business intelligence and business analytics solutions to use components from multiple sources.
Our use of the word vendor is also changing, especially as the open source community is now making significant contributions with Hadoop and R.
Below is both a link to my presentation and its overview.
Rarely has a single vendor managed to meet all of an organisation’s technology requirements, especially in an era when we increasingly depend on vendor-less open-source software. Instead, we regularly deploy portfolios of products and technologies to meet a business’s needs.
This style of enterprise computing isn’t new, but what’s becoming more common is for a single project to use and integrate multiple products and technologies. To find an example, we need look no further than modern business intelligence solutions.
Regardless of where it happens, deploying multi-vendor solutions increases risk: greater complexity, higher costs and project delays.
This session’s objective is to help us manage multi-vendor environments. We’ll consider when a single vendor’s “just good enough” solution may actually be the best decision, and how to manage the inevitable complexities of integrating, supporting and purchasing in multi-vendor environments.