Forbes article by CEO Gary Drenik and Prosper podcast briefing highlight why reliable signal—not just larger models—will define the next era of AI
Prosper Insights & Analytics is calling attention to what it believes is the most important emerging issue in artificial intelligence: the growing shortage of high-quality, forward-looking data needed to make AI systems more reliable, explainable and economically useful. That perspective is outlined in a recent Forbes article by Prosper co-founder & CEO Gary Drenik, “AI’s Real Bottleneck Isn’t Algorithms, It’s The Rare Earths Of Data,” which argues that AI’s biggest constraint is no longer model architecture alone, but the scarcity of trustworthy data inputs that reflect real human intent and behavior over time.
“When data captures consumer intent, confidence and changing behavior before it appears in conventional reports, AI becomes more than an automation tool—it becomes a strategic intelligence asset.”
— Phil Rist, EVP-Strategic Initiatives, Prosper Insights & Analytics
In the article, Drenik compares scarce, high-value data to rare earth elements—inputs that are difficult to source, yet essential to powering modern technology at scale. He explains that many AI systems continue to struggle because they are trained primarily on backward-looking digital exhaust such as clicks, logs, scraped text and transactions, rather than clean, auditable and longitudinal signals that help explain why people make decisions and how they are likely to behave next. Prosper believes this distinction is becoming increasingly important as enterprises push AI from experimental demos into mission-critical forecasting, planning and decision support.
That same theme is explored in a Prosper Spotify podcast episode, “Prosper Insights AI & Innovation Briefing 2026 E25: The Rare Earth Moment: AI’s High-Quality Data Bottleneck.” As described, the podcast argues that the AI industry’s central challenge has shifted from algorithmic complexity to the scarcity of premium data capable of producing dependable, forward-looking insight. The episode emphasizes that proprietary datasets with longitudinal depth are becoming more valuable than broad open-web scraping when the goal is to predict macroeconomic shifts, market trends and consumer behavior with greater confidence.
“AI will not create durable enterprise value simply because a model is bigger or faster,” said Phil Rist, co-founder of Prosper Insights & Analytics and EVP-Strategic Initiatives. “The real advantage comes from the quality of the signal going in. When data captures consumer intent, confidence and changing behavior before it appears in conventional reports, AI becomes more than an automation tool—it becomes a strategic intelligence asset.”
Prosper notes that this issue has major implications for businesses, investors and policymakers alike. As accountability, regulation and performance expectations rise, organizations will increasingly need AI systems built on transparent, representative and well-governed data foundations. In Prosper’s view, the winners in the next phase of AI will not simply be those with the largest models, but those with access to scarce, high-integrity datasets that improve prediction, explainability and decision quality over time.
Drenik’s Forbes article also points to the growing market value of proprietary signal-rich data as AI leaders seek more dependable inputs for real-world applications.