I recently published an open letter to the address verification industry, a group to which I’ve belonged professionally for over 20 years. In it, I critiqued our lack of innovation over the years and suggested another way: making global address verification better by using local intelligence.
As hoped, the letter spurred a lot of interesting dialogue. To keep the conversation going, I’m posting a series of blog entries that dig deeper into the themes of technology, address data, and how we can meet the needs of our most demanding customers.
Part IV: Post-Script – The Innovator’s Dilemma and Local Disruption
In 1997, Harvard Business School professor Clayton Christensen published his landmark book, The Innovator’s Dilemma. His guiding question: why does innovation so often arrive as disruption? True innovation rarely comes from market leaders with established products. It comes instead from an upstart with little to lose and a scrappy mentality.
The minicomputer didn’t come from IBM, Christensen instructs us, though IBM had the deep pockets, experienced sales force, and engineering know-how to make it happen. No, IBM clung to the profits from its dominant mainframe technology while Digital Equipment Corporation (and a host of other upstarts) swooped in and launched the better, faster, cheaper alternative.
History will show the global address verification market is going through a similar disruption.
On the surface, it’s baffling that the older global address companies aren’t pushing local intelligence into the verification process. The new model makes so much more sense. But viewed through Christensen’s innovator’s dilemma lens, we can begin to understand why. They’re captive to their own success and to the demands of their big customers. Rather than look to the future of our industry – the rise of cross-border commerce, increasing parcel shipment, and consumers demanding ever faster delivery – they remain focused on protecting what they have.
It’s a hard cycle to break.
They aren’t doing it, and I don’t expect that to change. What I do expect is that Global Data Consortium will continue leaning into this new and developing market. We will build a critical mass of data partners providing best-in-country solutions; we will help our partners keep improving their data through faster feedback loops; and we will make our services less expensive over time (though, perhaps surprisingly, they’re already price-competitive with the major global generics).
This is the way of disruption. Quality keeps getting better, prices keep coming down, and eventually even the most price-sensitive customers want the better service.
Five years from now, I predict most of the market will have shifted to this model. We will no longer accept “good enough is good enough.” We’ll demand innovations that move local intelligence (so rich at the delivery level) upstream into the address verification process. Rather than relying on local intelligence to fix problems after the fact, our local data will prevent the problems that lead to mis-delivery and delays in the first place.
That’s when we can say the local disruption has truly taken hold. That’s the future that I want GDC to be part of.