Seven Phases of a Successful Analytics Project

Learn the seven crucial phases of a successful analytics project, whether it integrates generative AI or not. From data-gathering techniques to effective visualisation and predictive analysis, this guide by Sid Bhatia provides valuable insights for mastering analytics projects.

Have you decided that you want to take part in the GCC’s projected $2 billion generative AI market? Or indeed, do you want to get involved, in any capacity, in the region’s overall AI market, expected to reach $7 billion? If so, you are not alone. After all, who wants to be left behind as we approach the reality painted by these 2030 estimates? You may have already explored the potential of becoming an Everyday AI organisation where every employee is an AI-first thinker. If you have, you have likely rejoiced at what this means for your enterprise — more innovation, more efficiency, more customer loyalty, more employee satisfaction, and so on. 

But before you dive into the pond of prosperity, it might behove you to consider exactly where you are going, what steps you need to take to get there, and what success looks like. If Everyday AI is your goal, allow me to guide you through the seven steps of your first analytics project, which may or may not include AI. 

1. Know yourself

Even before the design stage comes the period of introspection. Look inward for an organisational need that a successful analytics project will fulfil. Talk to people about processes. Get to know them and their spreadsheets. And define a “get” or a “win” between you. Set timelines and KPIs so you can identify the data you need. Which leads us to…

2. Gather the data

Do not confine yourself to a single database. Look far and wide, and combine, mix, and merge the datasets that make sense for your use case. Data and IT teams will, of course, make internal stores available, but if the project requirements warrant it, be bold and look beyond. APIs can give you access to open data sources.

Plug-ins can speed up development. There is so much public data on which to draw (economic, demographic, and the rest), and each source can enrich the project when combined with all the information painstakingly collected as part of your organisation’s day-to-day activity.
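
To make this concrete, here is a minimal sketch of how a public source might be pulled over an API and blended with internal records. The endpoint URL, the field names, and the internal CSV file are hypothetical placeholders for illustration; a real project would substitute its own sources.

```python
import pandas as pd
import requests

# Hypothetical open-data endpoint returning JSON records such as
# [{"region": "Dubai", "population": 3600000}, ...] -- a placeholder, not a real API.
OPEN_DATA_URL = "https://example.org/api/demographics"

response = requests.get(OPEN_DATA_URL, timeout=30)
response.raise_for_status()
public_df = pd.DataFrame(response.json())

# Internal data painstakingly collected in day-to-day operations
# (assumed here to be a CSV export with a matching "region" column).
internal_df = pd.read_csv("internal_sales.csv")

# Combine the two sources on the shared key so the use case can draw on both.
combined = internal_df.merge(public_df, on="region", how="left")
print(combined.head())
```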

3. Clean the data

When it comes time to link data sources together so they can be interrogated as a single source, things get tricky. Up to 80% of project time is spent exploring, formatting, and cleaning data. How do you interpret null fields? How do you get around missing data? How should you deal with misspellings or inconsistencies? How do you shape the data inside the boundaries of your regulatory obligations? 

Domain experts from relevant business units must be part of the conversation so you can understand what everything means and how it should be treated as a model is built. The good news is that data-cleaning tools with generative AI interfaces are coming into play to take the labour burden away from data scientists.
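
As an illustration, a first pass at the questions above often looks something like the pandas sketch below. The column names, the spelling map, and the dropped sensitive column are invented for the example and stand in for whatever your data and your regulatory obligations actually require.

```python
import pandas as pd

df = pd.read_csv("combined_raw.csv")  # placeholder file name

# Null fields: decide explicitly how each one is interpreted.
df["revenue"] = df["revenue"].fillna(0)     # missing revenue treated as zero
df = df.dropna(subset=["customer_id"])      # rows without an ID are unusable

# Misspellings and inconsistencies: map known variants onto one canonical value.
city_map = {"dubia": "Dubai", "DUBAI": "Dubai", "abu dhabi": "Abu Dhabi"}
df["city"] = df["city"].str.strip().replace(city_map)

# Formatting: coerce types so later steps do not choke on stray strings.
df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")

# Regulatory boundaries: drop or mask fields you are not permitted to process.
df = df.drop(columns=["national_id"], errors="ignore")

# Duplicates introduced when sources were linked together.
df = df.drop_duplicates(subset=["customer_id", "order_date"])
```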

4. Enrich the data

Clean data is still not ready to play in the analytics sandbox. Think of it like this: you have had a shower, but now you must dress for dinner. And dress appropriately. You must create the right data shape to ensure you get the most value from the modelling process. Sources must be joined as needed.

You may have an entirely clean date field, for example, but your analysis may require deriving fields such as “day of the week”, “fiscal week”, or even “workday (Y/N)”, where you take account of weekends and national holidays. It is in the enrichment phase, more than any other, that bias can creep into your data and make its way from there into your results. Skilled data scientists will know the potential for misinterpretation and should step in to keep the project on the right track.
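
A sketch of that date-field example might look like this in pandas. The fiscal-year convention, the weekend definition, and the list of national holidays are assumptions for illustration; a real project would take them from the relevant finance calendar and jurisdiction.

```python
import pandas as pd

df = pd.DataFrame({"order_date": pd.to_datetime(["2024-01-01", "2024-03-15", "2024-06-07"])})

# Day of the week as a readable label.
df["day_of_week"] = df["order_date"].dt.day_name()

# Fiscal week, assuming (for illustration) a fiscal year starting 1 January,
# in which case the ISO week number is a reasonable proxy.
df["fiscal_week"] = df["order_date"].dt.isocalendar().week

# Workday flag: weekends plus an illustrative list of national holidays.
# Weekend definitions vary by region; Saturday/Sunday is assumed here.
national_holidays = pd.to_datetime(["2024-01-01", "2024-12-02"])  # placeholder dates
is_weekend = df["order_date"].dt.dayofweek >= 5
is_holiday = df["order_date"].isin(national_holidays)
df["workday"] = (~is_weekend & ~is_holiday).map({True: "Y", False: "N"})
```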

5. Visualise the data

Every decision-maker loves (or at least relies on) a good graph, chart, or diagram to help them see the big picture. The larger and more complex the dataset, the more critical and effective visualisation is at each stage of the modelling project. It is how we explore data and communicate our findings. 

Assuming you have gone through the previous steps — especially the cleaning phase — rigorously, APIs and plugins can help to push valuable insights to the right end users. This is a core deliverable of Everyday AI — not just to produce insights but to produce actionable ones. Line-of-business executives who lack the skills to interpret raw analytics results are served well by the right visualisation.
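
For completeness, here is a minimal matplotlib sketch of the kind of plain, aggregated visual that serves a line-of-business audience. The column names continue the hypothetical examples above, and the saved image stands in for whatever channel you use to push insights to end users.

```python
import matplotlib.pyplot as plt
import pandas as pd

df = pd.DataFrame({
    "city": ["Dubai", "Abu Dhabi", "Riyadh", "Dubai", "Riyadh"],
    "revenue": [120, 80, 95, 140, 60],
})

# Aggregate to the level the audience cares about, then plot it plainly.
by_city = df.groupby("city")["revenue"].sum().sort_values(ascending=False)

fig, ax = plt.subplots(figsize=(6, 4))
by_city.plot.bar(ax=ax, color="steelblue")
ax.set_title("Revenue by city")
ax.set_ylabel("Revenue")
fig.tight_layout()
fig.savefig("revenue_by_city.png")  # push the image to dashboards, decks, or chat channels
```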

6. Start predicting

This is what we picture when we think of AI, right? Machine learning algorithms rummage where we would never think to go and deliver actionable insights in the form of future trends. We get ahead of our competitors. In truth, this is not an unreasonable expectation. Unsupervised learning algorithms like clustering can go beyond graphs and raw stats. 

Clustering is the unprompted discovery of groups of items with something in common that a human analyst may never have spotted. Sometimes, the commonality that binds items together in a cluster can be operationalised because it ties back to real-world factors upon which an organisation can act.
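
As a sketch of the clustering idea, scikit-learn’s KMeans can group customers on a couple of behavioural features. The features, the sample values, and the choice of three clusters are assumptions for the example; in practice both the inputs and the number of clusters would come out of exploration with your domain experts.

```python
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical behavioural features per customer.
customers = pd.DataFrame({
    "annual_spend": [500, 4200, 390, 4700, 610, 3900, 12000, 11500],
    "orders_per_year": [2, 18, 3, 22, 4, 15, 40, 38],
})

# Scale the features so neither dominates purely because of its units.
scaled = StandardScaler().fit_transform(customers)

# Ask for three clusters; the "right" number is itself a modelling decision.
model = KMeans(n_clusters=3, n_init=10, random_state=42)
customers["cluster"] = model.fit_predict(scaled)

# Inspect each cluster to see whether the commonality ties back to something actionable.
print(customers.groupby("cluster").mean())
```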

7. Rinse, repeat

While a project like the one described is an important first step, it does not on its own make you an Everyday AI business. You must carry its lessons, successes and failures alike, into subsequent projects. Measure and present these findings to demonstrate progress, so that as you become more agile, everyone notices the proof of effectiveness and buys into the premise that AI can be a game-changer.

Maintain to gain

Over and above repeating your successes, you must monitor each model. Degradation of models — so-called “model drift” — can pull an Everyday AI organisation backwards on the maturity line. Code and data must be periodically reviewed to keep a model fresh and accurate and, therefore, valuable. A data scientist’s job is never done.
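
One lightweight way to watch for drift is to compare the distribution of a key input feature at training time with what the model sees in production. The sketch below uses a two-sample Kolmogorov-Smirnov test from SciPy; the simulated data, the chosen feature, and the alert threshold are illustrative assumptions, and a production setup would typically lean on a dedicated monitoring tool.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(42)

# Feature values captured when the model was trained (placeholder data).
training_feature = rng.normal(loc=100, scale=15, size=5000)

# The same feature as observed in production this month; the shift is simulated here.
live_feature = rng.normal(loc=110, scale=18, size=5000)

# Two-sample KS test: a small p-value suggests the distributions have diverged.
statistic, p_value = ks_2samp(training_feature, live_feature)

ALERT_THRESHOLD = 0.01  # illustrative cut-off, not a universal rule
if p_value < ALERT_THRESHOLD:
    print(f"Possible model drift: KS statistic {statistic:.3f}, p-value {p_value:.4f}")
else:
    print("No significant drift detected on this feature")
```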