My journey in building an Analytics Centre of Excellence

Freddy Loo
Dec 24, 2020


Big Data emerged in the mid-2000s, when it became clear that traditional Business Intelligence tools could not manage the size and complexity of the data generated by Web 2.0. Big Data has since generated a lot of excitement, and some have even touted it as the silver bullet that will end world hunger.

But the ability to use data for decision-making has in fact been around much longer; we’re talking 7,000 years ago, when accounting was first introduced in Mesopotamia to record the growth of herds and crops.

So, analytics is certainly not new. But if it has been around for so long, why do skepticism and challenges around analytics still persist?

Let’s address that by looking at examples of how Centres of Excellence (COEs) function in Malaysia. COEs are designed to address specific business problems. Some are designed to generate insights to serve customers better, reduce customer churn or measure a customer’s propensity to buy a product. Some shared-services analytics COEs support operational and regulatory reporting, while others are designed as R&D centres, experimenting with cutting-edge Internet of Things (IoT) solutions or serving as talent incubators. In my view, analytics COEs should be designed to address current, known problems as well as future, yet-to-be-known ones. This is one of the reasons we see many COEs start off strong but fail to sustain themselves when the problem statement changes.

The core function of analytics is to help businesses make informed decisions by finding patterns in data or alerting the business to an impending problem. However, the business landscape is ever-changing, and the problems we solve today may not be the problems we solve tomorrow.

You might be thinking “isn’t this the case for all solution implementation?”

Well, no. Let’s take an enterprise resource planning (ERP) implementation as an example. The problem an ERP tool sets out to solve is one of standardisation, efficiency and automation. This is done by designing, implementing and automating business processes using technology. Once complete, the business processes stay largely consistent for the foreseeable future, and the needs of both today and tomorrow are met.

Because of this, an analytics COE must be built with a focus on agility in order to manage change. I would define agility as the ability to define a problem early on, test solution feasibility through prototyping, and pivot the solution based on feedback. Continue to iterate until you get a “minimum viable product” (MVP): a product with just enough features to satisfy customers, which can then be used to gather more feedback for future iterations.

So that sets the stage. What, then, are my three key learnings from building Star Media Group’s analytics COE?

Firstly, define value upfront and get the right stakeholders on the bus. Value is hard to define, yet we still need a north star to rally the organisation together. Commonly, value is defined as either revenue generation or cost reduction.

My view is that the value of analytics is not easily attributable to revenue or cost, so we need to look at it in a slightly different way. As my boss put it: use analytics as a means to change the perspective and mindset of the organisation. In the process, the organisation will naturally align to its intended objective and drive up the necessary KPIs. For example, at Star, analytics is used to provide competitive, product and social insights to the go-to-market team, differentiating the value we bring to our clients, and readership behaviour is used as feedback for content creation.

Secondly, a culture of experimentation. Analytics is not a crystal ball that predicts the future; it can surface insights only through historical patterns or previously unavailable data points, and this has to happen iteratively. Give it a business problem, test it with data, refine it and test again. Repeat until we get a first cut of the results or conclude that the problem cannot be resolved yet. This experimentation culture must percolate throughout the entire organisation: the techie, the data scientist, the customer care agent and everyone on the business side all have to be part of the process. The good news is that, more often than not, this cycle uncovers additional insights that weren’t part of the original objective. My personal mantra is “Test and Iterate” or “Fail and Pivot”.

Thirdly, finding the right talent. The perfect analytics practitioner understands the business domain to unlock value, is a master of statistics who can uncover patterns, and is a tech guru who can manage the data pipeline. This perfect person is a unicorn that doesn’t exist, so we should instead build a team around these three pillars.

At Star, I call them the Suits, the Statisticians and the Techies. The Suits are the go-to-market team that engages the business to act on insights and unlock business value through process improvement or change management. The Statisticians are my data scientists, who crunch the numbers and build machine learning models. The Techies are responsible for sourcing, storing, curating and managing the entire data repository for use by the Suits and the Statisticians. Talent and teamwork are definitely the keys to success; as Steve Jobs said, “Great things in business are never done by one person. They’re done by a team of people.”

Written by Freddy Loo

I am a management consultant who likes data. My hobbies include stingless bees, running, scuba diving and gardening.
