Case study

How MLOps reduced model deployment time from 2 days to 5 minutes

Company

zally is a behavioural AI company based in Manchester. They deliver seamless, continuous authentication through their proprietary behavioural AI, verifying users passively based on how they interact with technology. Their easy-to-integrate APIs and SDKs empower developers to embed behavioural AI effortlessly into any application.

https://www.zally.com/
Headquarters
Manchester, UK
Industry
Behavioural AI

zally, a fast-growing behavioural AI company in Manchester, improved their experiment-to-production cycle by replacing manual processes with a robust MLOps pipeline - cutting the time to deploy their models from days to minutes.

The Challenge

Using Jupyter notebooks and classical machine learning, zally’s researchers and data scientists demonstrated that mobile interactions could be used to authenticate users. Like many startups, they prioritised prototyping over infrastructure at first — but as their approach matured, they needed better tools to scale.

They lacked automated experiment tracking, dataset versioning, and reproducibility. Deploying a model could take up to two days, and results were hard to replicate. The company was ready to move faster. Their tooling wasn’t.

Our Solution

We introduced a complete MLOps pipeline that made it easier for the team to track experiments, manage data, and deploy models quickly and reliably. The solution was designed to scale with their growth and support their shift toward more complex modelling.

The Results

In just six weeks, the team achieved:

  • Deployment time reduced from two days to five minutes
  • Experiment reproducibility improved from 0% to 100%
  • Manual processes largely automated

With solid infrastructure in place, zally’s tech team could focus on advancing their behavioural modelling, rather than managing deployment steps.

How We Built It

This started when we showed zally’s Chief AI Officer, Sarah Schlobohm, our open source project Matcha — a pre-packaged MLOps stack built for Microsoft Azure. She recognised its potential as a baseline but needed something aligned with zally’s technical strategy and infrastructure. Working closely together, we scoped a version designed for AWS, with a toolchain selected to support zally’s model development and deployment approach.

Our work focused on reproducibility, smooth deployment, and the ability to scale efficiently with demand.

We built and integrated a custom MLOps pipeline, with ZenML at its centre, that included:

  • Data versioning (using LakeFS) for auditability and traceability
  • Automated experiment tracking (using MLflow) to enable full data and model provenance
  • Continuous training, testing, and deployment pipelines (using ZenML) for consistent, rapid rollout to production
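To give a flavour of the first item: content-addressed versioning is the core idea behind git-like data tools such as LakeFS. The sketch below is a toy, standard-library-only illustration of that idea — not zally's implementation, and all names in it are hypothetical.

```python
import hashlib
import json

# Toy illustration of content-addressed dataset versioning, the idea
# behind git-like data tools such as LakeFS. A real system versions
# objects in a data lake; here we hash an in-memory list of records.

def dataset_version(records: list[dict]) -> str:
    """Derive a stable version id from the dataset's content."""
    canonical = json.dumps(records, sort_keys=True).encode()
    return hashlib.sha256(canonical).hexdigest()[:12]

v1 = dataset_version([{"user": 1, "taps": 42}])
v2 = dataset_version([{"user": 1, "taps": 42}])
v3 = dataset_version([{"user": 1, "taps": 43}])

assert v1 == v2  # identical data always maps to the same version id
assert v1 != v3  # any change to the data yields a new, traceable id
```

Because the version id is derived from the data itself, every experiment result can be traced back to the exact dataset snapshot that produced it.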

All of this was implemented using infrastructure-as-code, ensuring that changes could be made safely and repeatably over time. Our aim was not just to build a system, but to make sure the team could manage and adapt it independently.
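The continuous train/test/deploy flow can be sketched in plain Python. In a real ZenML pipeline each function below would be a `@step` and the flow itself a `@pipeline`; the trivial one-parameter model and the accuracy gate are hypothetical stand-ins, not zally's actual pipeline.

```python
# Simplified sketch of a continuous train -> evaluate -> deploy flow.
# In ZenML, each function would be decorated with @step and the flow
# with @pipeline; model, data, and threshold here are illustrative.

def train_model(data: list[tuple[float, float]]) -> float:
    """Fit a trivial one-parameter model y = w * x by least squares."""
    num = sum(x * y for x, y in data)
    den = sum(x * x for x, _ in data)
    return num / den

def evaluate_model(w: float, data: list[tuple[float, float]]) -> float:
    """Score the model: fraction of points predicted within 0.1."""
    hits = sum(1 for x, y in data if abs(w * x - y) <= 0.1)
    return hits / len(data)

def deploy_model(w: float) -> str:
    """Stand-in for pushing the model to a serving endpoint."""
    return f"deployed model with w={w:.2f}"

def pipeline(data: list[tuple[float, float]], min_score: float = 0.9) -> str:
    """Only promote models to production if they pass the quality gate."""
    w = train_model(data)
    score = evaluate_model(w, data)
    if score >= min_score:
        return deploy_model(w)
    return "deployment blocked: score below threshold"

print(pipeline([(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]))
# -> deployed model with w=2.00
```

The quality gate is what makes "five-minute deployments" safe: a model only reaches production automatically when its evaluation clears a pre-agreed bar.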

A Collaborative Project

This was a hands-on collaboration throughout. We worked alongside zally’s engineers and data scientists to design, implement, and iterate on the infrastructure together. That close partnership meant the final system reflected their ways of working - and that they were fully equipped to take it forward.

Why It Matters

Being able to run reliable, fast experiments is critical when developing models in a fast-changing space like behavioural biometrics. By solving for reproducibility and automation, the team can now test ideas quickly and deploy improvements with confidence.

What zally had to say

“We’re building a new type of AI here at zally and I need the best minds to work with the best tools in order to achieve our goal. Fuzzy Labs provided those tools, no question”

Patrick Smith - Founder and CEO

“I loved observing the teamwork between the Fuzzy Labs and zally engineering teams. Fuzzy Labs built the ‘racetrack’ to enable our data science and ML engineers to build and deploy models at scale. The whole experience was great, and I'm really hoping we can find opportunities to partner with Fuzzy Labs again in the future.”

Sarah Schlobohm - Chief AI Officer

“Oscar, Shubham, and Misha were a pleasure to work with. They each brought strong expertise with the tools they used, and were always friendly, approachable, and ready to help. What really stood out was how generously they shared their knowledge. They took the time to explain concepts clearly, answered all my questions with patience, and really helped me upskill along the way. Pairing with them was smooth, productive, and genuinely enjoyable. They had an amazing positive impact.”

Adil Said - Machine Learning Engineer

Try It Yourself

While the work on this project is proprietary, many of the tools we used are open source, including Matcha, our MLOps starter kit for Azure.

👉 mymatcha.ai

Matcha gets your AI systems up and running in minutes, with a full deployment pipeline built in.

Want to learn how to apply this to your team? Let’s talk.