Beneath the fanfare of its sleek new hardware, Apple’s latest product launch contained a quiet admission: when it comes to certain AI-powered features, it is still playing catch-up to its chief rival, Google. The most prominent example was the introduction of live translation in the AirPods Pro 3.
While Apple presented this as a major new feature, Google integrated the same capability into its Pixel Buds headphones years ago. The launch brings Apple's earbuds to feature parity with its competitor, but it also underscores that Google reached the market first with this AI-driven innovation.
This fits a broader pattern in which Google has been more aggressive about deploying user-facing AI tools. From advanced photo editing on Pixel phones to the deep integration of its Gemini AI model, Google has consistently been a step ahead in the software intelligence race, while Apple's own "Apple Intelligence" initiative has seen a slower, more cautious rollout.
The launch event shows that Apple recognizes these gaps and is working to close them. Yet by adding features its competitor has long offered, Apple is implicitly acknowledging Google's lead in the practical application of AI. The challenge for Apple will be to move from catching up to once again setting the pace of innovation.
