New release enables in-database time-series analysis and scalable validation, supporting enterprise data governance and reliability
digna has announced the release of version 2026.04 of its Data Quality & Observability Platform, introducing expanded time-series analytics capabilities and new approaches to scalable data validation designed for enterprise data environments.
“The combination of integrated analytics and scalable validation reflects increasing demand for platforms that support both data reliability and a deeper understanding of data behavior.”
— digna
As organizations across financial services, fintech, and enterprise sectors continue to scale their data platforms, understanding how data behaves over time has become increasingly important. The latest release reflects a shift from static monitoring approaches toward behavioral analysis that enables teams to interpret trends, patterns, and structural changes in data.
With the introduction of a new Analytics Chart, digna enables users to perform time-series analysis directly within the platform. Built-in methods include linear, quadratic, and cubic regression, piecewise regression with configurable breakpoints, smoothing techniques, quantile analysis, and residual analysis. The platform also automatically identifies trends, seasonal patterns, and changes in data behavior.
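digna's internal implementation is not public, but the regression and residual analysis described above can be illustrated with a minimal NumPy sketch. Names and the sample series here are hypothetical; the point is how polynomial trend fitting plus residual inspection surfaces a day that breaks the pattern.

```python
import numpy as np

def fit_trend(values, degree=1):
    """Fit a polynomial trend (degree 1 = linear, 2 = quadratic,
    3 = cubic) to a metric series; return fitted values and residuals."""
    x = np.arange(len(values), dtype=float)
    coeffs = np.polyfit(x, values, degree)
    fitted = np.polyval(coeffs, x)
    residuals = values - fitted
    return fitted, residuals

# Hypothetical daily row counts: a steady upward trend with one spike.
counts = np.array([100, 103, 98, 107, 110, 180, 114, 118], dtype=float)
fitted, residuals = fit_trend(counts, degree=1)

# The day that deviates most from the fitted trend stands out
# in the residuals (index 5, the spike to 180).
outlier_day = int(np.argmax(np.abs(residuals)))
```

Piecewise regression with breakpoints, smoothing, and quantile analysis extend the same idea: fit an expected behavior, then examine where observations depart from it.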
By integrating analytics directly into the data environment, organizations can analyze data without exporting it to external tools or relying on separate data science workflows. This approach supports faster investigation of anomalies while maintaining control over data within enterprise environments.
In addition to analytics enhancements, the release introduces new capabilities in data validation aimed at improving consistency and scalability across complex data landscapes. These include reusable validation rule templates and centralized enumerations for defining allowed values across datasets and systems.
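digna has not published its rule format, but the idea of a reusable template bound to a centralized enumeration can be sketched as follows. The enumeration, column name, and row shape are all illustrative assumptions.

```python
# Centralized enumeration: defined once, referenced by many rules
# across datasets and systems (values are illustrative).
ALLOWED_CURRENCIES = {"EUR", "USD", "GBP"}

def make_enum_rule(column, allowed):
    """Reusable rule template: the named column's value must belong
    to a centrally defined set of allowed values."""
    def rule(row):
        return row.get(column) in allowed
    return rule

# Instantiate the same template for a specific dataset column.
currency_rule = make_enum_rule("currency", ALLOWED_CURRENCIES)

rows = [{"currency": "EUR"}, {"currency": "XXX"}]
violations = [r for r in rows if not currency_rule(r)]
```

Because the allowed values live in one place, updating the enumeration updates every rule instantiated from the template.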
All validation checks are executed directly within the source database, supporting performance, security, and governance requirements in regulated environments. This in-database approach allows organizations to enforce data quality standards without moving sensitive data outside controlled environments.
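The in-database pattern can be shown with a small SQLite sketch: the check is expressed as SQL and executed where the data lives, so only an aggregate result (a violation count) leaves the database. The schema and values are hypothetical, and any production engine would be used in place of the in-memory database here.

```python
import sqlite3

# Illustrative source database with one table of payments.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payments (id INTEGER, currency TEXT)")
conn.executemany(
    "INSERT INTO payments VALUES (?, ?)",
    [(1, "EUR"), (2, "USD"), (3, "XXX")],
)

# The validation runs inside the database; only the count of
# violating rows is returned, never the rows themselves.
bad = conn.execute(
    "SELECT COUNT(*) FROM payments "
    "WHERE currency NOT IN ('EUR', 'USD', 'GBP')"
).fetchone()[0]
```

Keeping the check in SQL means the sensitive rows never cross the database boundary, which is the governance property the release emphasizes.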
The release also introduces statistic-level relevance conditions, enabling teams to define when specific metrics should be considered relevant for anomaly evaluation. This helps reduce noise and ensures monitoring systems focus on meaningful deviations.
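How relevance conditions reduce noise can be sketched with a gating function: a statistic is only evaluated for anomalies when its condition holds. The function, metric names, and thresholds below are hypothetical, not digna's API.

```python
def flag_anomalies(stats, relevance, thresholds):
    """Flag a statistic as anomalous only if its relevance condition
    is satisfied and its value exceeds the configured threshold."""
    flagged = []
    for name, value in stats.items():
        # Statistics without an explicit condition are always relevant.
        is_relevant = relevance.get(name, lambda s: True)(stats)
        if is_relevant and value > thresholds.get(name, float("inf")):
            flagged.append(name)
    return flagged

stats = {"row_count": 3, "null_ratio": 0.9}
thresholds = {"null_ratio": 0.5}
# Only judge null_ratio when the load is large enough to be meaningful.
relevance = {"null_ratio": lambda s: s["row_count"] >= 100}

# With only 3 rows, the high null_ratio is suppressed as noise.
flagged = flag_anomalies(stats, relevance, thresholds)
```

On a full-sized load the same condition passes and the deviation is flagged, so monitoring attention goes to meaningful cases only.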
According to digna, the combination of integrated analytics and scalable validation reflects increasing demand for platforms that support both data reliability and deeper understanding of data behavior across enterprise systems.
As data infrastructures continue to grow in complexity, enabling teams to analyze and validate data within the same environment is becoming critical for maintaining operational stability and supporting decision-making.