The Data-First Strategy: Powering the Data + AI Economy in Telecom

Gurol Akman, CTO

Data-rich. Insight-poor. That two-word diagnosis applies to more telecom operators than anyone in the industry would comfortably admit. For decades, building the network was the job. The data the network produced was a byproduct — stored, sometimes analyzed, rarely weaponized. Now the rules have changed. AI doesn’t run on ambition; it runs on clean, consistent, trusted data. And the operators who figured that out first are pulling ahead of the ones still treating their data as exhaust.

And the urgency is no longer theoretical. Hyperscalers have connectivity in their sights — and the resources to act on it. AI-native players are eating into adjacent revenue streams. Governments and regulators are demanding accountability on fraud, spam, and service quality — and “we’re working on it” doesn’t satisfy them anymore. The comfortable timeline operators once had to modernize their data infrastructure has quietly expired. What used to be a strategic priority is now a competitive emergency. The operators who treat it as anything less are betting the company.

Data-First Strategy

The root cause of that unpreparedness is one the industry has been reluctant to confront directly. Siloed. Inconsistent. Unreconciled. Three words that describe the data reality inside too many Tier-1 operators — and three reasons why AI projects stall at the proof-of-concept stage. The OSS has one version of truth. The BSS has another. The customer experience platform is working from a third. Nobody’s lying — they’re just speaking different data languages that were never designed to talk to each other. You can throw the best machine learning models in the world at that environment. You’ll get impressive demos and disappointing production results. The problem was never the model.

Solving it requires something more fundamental than a new tool — it requires an architectural conviction. Building your own data framework is a bet that the foundation matters more than the feature, that the unglamorous work of defining a unified subscriber model, normalizing event schemas across domains, and enforcing data contracts between OSS and BSS systems will eventually pay off in ways no “off-the-shelf” platform can match. That’s the bet we made. Our Common Data Framework is built entirely on open-source software — no proprietary lock-in, no licensing leverage held over the operator. It ingests and consolidates the event records generated by every individual product and service across our VAS Consolidation and Digital Services solutions, turning what would otherwise be isolated fragments into a unified, cross-domain signal. It isn’t a product we sell — it’s the layer that makes every solution we deliver work. It’s what turns raw operator data into something an AI system can learn from, trust, and act on. The enabler, not the headline.
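To make the idea of a unified event model and a data contract concrete, here is a minimal sketch in Python. The field names, the `UnifiedEvent` type, and the `normalize_bss_record` mapping are illustrative assumptions, not the Common Data Framework's actual schema; the point is the pattern, where every domain's raw records are mapped onto one canonical shape, and records that violate the contract are rejected at the boundary rather than silently propagated.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical unified event record. Field names are illustrative,
# not an actual production schema.
@dataclass(frozen=True)
class UnifiedEvent:
    subscriber_id: str     # one canonical ID shared by OSS and BSS
    domain: str            # source domain, e.g. "oss", "bss", "vas"
    event_type: str        # normalized type, e.g. "sms_sent"
    occurred_at: datetime  # always UTC, never local time
    attributes: dict       # domain-specific extras, carried along

def normalize_bss_record(raw: dict) -> UnifiedEvent:
    """Map a raw BSS record onto the unified schema.

    The required-field check acts as a data contract: a record that
    is missing a field fails loudly here, at ingestion, instead of
    poisoning every model downstream.
    """
    for field in ("msisdn", "type", "ts"):
        if field not in raw:
            raise ValueError(f"contract violation: missing '{field}'")
    return UnifiedEvent(
        subscriber_id=raw["msisdn"],
        domain="bss",
        event_type=raw["type"].lower(),
        occurred_at=datetime.fromtimestamp(raw["ts"], tz=timezone.utc),
        attributes={k: v for k, v in raw.items()
                    if k not in ("msisdn", "type", "ts")},
    )
```

Each domain gets its own small normalizer, but they all emit the same `UnifiedEvent` — which is exactly what lets a model treat OSS and BSS signals as one stream.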

In practice, that foundation shows up in two very different places — one about protecting value, the other about generating it. On the protection side: every Tier-1 operator has a war story about spam. The campaign that slipped through for six hours before anyone noticed. The legitimate business blocked because its call pattern looked fraudulent. The regulatory inquiry that followed. Our AI-assisted Antispam solution was built for exactly that environment — high volume, high stakes, zero tolerance for false positives or false negatives. The AI handles detection and message categorization — distinguishing spam from legitimate traffic and classifying message types across the board: social media, sales campaigns, banking notifications. But what makes it reliable at operator scale is what’s underneath it: a normalized, cross-domain event stream from our Common Data Framework that gives the model the signal fidelity it needs to tell the difference between a spam burst and a flash sale. That distinction lives in the data, not the algorithm.
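The spam-burst-versus-flash-sale distinction comes down to features that no single domain can supply on its own. The sketch below is a deliberately simplified, rule-based stand-in for the real model — the feature names and thresholds are assumptions made for illustration — but it shows the kind of cross-domain evidence involved: BSS knows whether the sender is a registered business, the unified event history knows how many recipients have interacted with that sender before, and the feedback loop knows the complaint rate.

```python
# Illustrative only: feature names and thresholds are assumptions,
# not the production Antispam model.
def classify_burst(sender_features: dict) -> str:
    """Decide whether a sudden message burst looks like spam or a
    legitimate campaign, using signals that only a unified,
    cross-domain event stream can supply."""
    # BSS signal: is the sender a registered A2P business?
    registered = sender_features["sender_registered"]
    # History signal: share of recipients with a prior interaction
    prior_ratio = sender_features["recipients_with_history"]
    # Feedback signal: spam reports per message delivered
    complaint_rate = sender_features["complaint_rate"]

    if registered and prior_ratio > 0.5 and complaint_rate < 0.01:
        return "legitimate_campaign"
    return "suspected_spam"
```

A flash sale scores high on registration and prior relationships; a spam burst scores high on volume alone. Without the unified stream, the classifier only sees the volume — which is why the distinction lives in the data.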

The same foundation powers a very different kind of outcome on the monetization side. Every unaccepted offer is a small revenue failure. Multiply that across millions of subscribers and a dozen campaigns a month, and the number gets uncomfortable quickly. The culprit is almost never the offer itself — it’s relevance, or the lack of it. An upsell that lands at the wrong moment, for the wrong plan, on a device that can’t support the feature being sold, isn’t personalization. It’s noise. Our AI-driven Recommendation Engine exists to close that monetization gap — matching the right offer to the right subscriber at the right moment, based on a real-time, cross-domain view of who they are and how they use the network. That view comes from the Common Data Framework, which reconciles signals from across OSS and BSS domains — including the continuous stream of event records generated by our own products — into something the model can learn from. The result is a recommendation layer that gets sharper with every interaction — because it’s learning from data that’s consistent, not data that’s convenient.
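The relevance logic described above can be sketched in a few lines. This is a hypothetical simplification — the field names, eligibility rules, and scoring are assumptions, not the Recommendation Engine's internals — but it captures the two-stage shape the paragraph implies: first filter out offers the subscriber cannot actually use (wrong plan, unsupported device), then rank what remains against their real usage.

```python
# Hypothetical sketch: rank offers against a unified subscriber view.
# Field names ("plan", "device_tier", "usage") are illustrative.
def rank_offers(subscriber: dict, offers: list[dict]) -> list[dict]:
    """Filter out structurally irrelevant offers, then rank the rest
    by how well each matches the subscriber's recent usage."""
    # Stage 1: hard eligibility — never pitch a feature the plan or
    # device cannot support. This is where most "noise" is removed.
    eligible = [
        o for o in offers
        if o["target_plan"] == subscriber["plan"]
        and o["min_device_tier"] <= subscriber["device_tier"]
    ]
    # Stage 2: soft relevance — a data-heavy user sees data add-ons
    # first. Usage scores come from the unified event stream.
    return sorted(eligible,
                  key=lambda o: subscriber["usage"].get(o["category"], 0.0),
                  reverse=True)
```

The eligibility stage is the part that depends entirely on reconciled OSS/BSS data: plan and device facts live in different systems, and an inconsistent view is exactly what produces the upsell for a feature the device can't support.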

Sequence. That’s what connects both of those examples — and what I’d argue is the defining pattern of every successful AI deployment we’ve seen in this industry. There’s a sequencing problem at the heart of too many telecom AI strategies, and it goes something like this: the business identifies a high-value AI use case, procurement finds a capable vendor, the project kicks off with momentum — and then it hits the data layer and slows to a crawl. Months get spent on data preparation that was never budgeted. The model that looked sharp in the proof of concept turns mediocre in production. The business case quietly gets revised downward. We’ve seen this play out enough times to be certain of one thing: the operators who avoid it aren’t luckier or better funded. They just did the foundation work first. They treated data architecture as a strategic decision, not an infrastructure afterthought. That sequencing — data first, intelligence second — is the only order that works.

Which is ultimately what the “data-first” strategy means — not a technology choice, but a discipline. The telcos that win the next decade won’t necessarily be the ones who moved fastest on AI. They’ll be the ones who moved smartest on data. Clean it, unify it, enrich it with context, trust it — and then let the intelligence follow naturally. That’s not a complicated strategy. It’s just a disciplined one. And in an industry that has spent years chasing features before fixing foundations, discipline might turn out to be the rarest competitive advantage of all.
