Lightup Announces Beta Program for Breakthrough Data Quality Monitoring Solution to Make Data Decisions and Applications Dependable


Lightup, developer of a breakthrough data quality monitoring solution, today announced the launch of its beta program to address the growing problem of proactively detecting and identifying data issues that can have a devastating impact on data-driven application behavior and decisions. Backed by Andreessen Horowitz, Lightup is a developer-first solution that can be up and running in minutes, giving organizations an ideal way to ensure data quality for SQL data stores such as Snowflake and Databricks and for streaming data sources such as Kafka and Segment.


A data outage can hurt user experience as much as an IT failure or a software bug. It can strike an internal input data interface like a feature feed for an ML model, a third-party API endpoint such as a payment gateway, an output going straight to the user like a price quote, or a data stream like customer events going to an internal decision-maker. These problems in data pipelines escape detection by IT infrastructure monitoring (ITIM) and application performance monitoring (APM) alarms and scans, because those tools monitor only the health of infrastructure or application endpoints. They do not monitor the health of what is inside the infrastructure: the data the pipes are carrying. The result is good pipes, bad data.

Lightup is designed for data engineers and analytics engineers building and operating ETL and ELT pipelines. Unlike open-source data quality monitoring technologies that require developers to take on even more programming responsibilities, and enterprise approaches that require executive sponsorship and significant upfront investment, Lightup is a developer-first, pay-as-you-go SaaS solution. It is available in several tiers, including a free option so users can try it before making a commitment.

Lightup continuously tracks the data going into and coming out of a software product to detect significant changes that are indicative of degradation in data quality. Key data quality metrics such as data availability, data delay, and data volume indicators are built-in and start showing up instantly. In addition, the Lightup platform enables hands-off monitoring of data quality by automatically learning the normal behavior of those data quality metrics from past patterns and using that as a baseline to detect issues. This enables low-touch, fine-grained monitoring of data health that significantly cuts down the delay in detecting issues.
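The baseline-learning approach described above can be illustrated with a minimal sketch. This is not Lightup's actual implementation; it simply shows the general idea of flagging a data quality metric (here, hourly row volume, with illustrative numbers) when it deviates sharply from its learned historical baseline:

```python
from statistics import mean, stdev

def detect_anomaly(history, current, threshold=3.0):
    """Flag `current` as anomalous if it deviates from the historical
    baseline by more than `threshold` standard deviations."""
    baseline = mean(history)
    spread = stdev(history)
    if spread == 0:
        return current != baseline
    z_score = abs(current - baseline) / spread
    return z_score > threshold

# Hourly row counts observed in a pipeline table (illustrative data)
hourly_rows = [1020, 980, 1005, 995, 1010, 1000, 990, 1015]
print(detect_anomaly(hourly_rows, 1008))  # within normal range -> False
print(detect_anomaly(hourly_rows, 120))   # sudden volume drop -> True
```

Because the baseline is learned from past behavior rather than set by hand, a check like this needs no per-metric tuning, which is what makes the low-touch, fine-grained monitoring possible.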

“While it is well understood that data is the oxygen that fuels every application and process in an organization, companies are flying blind when it comes to understanding the health of data driving their applications,” said Manu Bansal, co-founder and CEO of Lightup. “With the data volumes, high cardinality, and complex data flows that we are all dealing with today, it is easy to end up with bad data in the pipeline. Lightup’s data quality monitoring solution provides data teams with a crystal clear understanding of the health and quality of the data fueling their applications. This ensures that data outages don’t silently turn into broken applications that can have a devastating impact on a company’s performance and bottom line.”


“Before we started using Lightup, we didn’t know the extent of our data quality issues, but with the Lightup SaaS solution, we identified and fixed problems with our data pipeline that we had been suffering from for months. We now have a lot more faith in our data, and we can focus our time on extending our core reporting and analysis capabilities and spend much less time on fixing data quality issues,” said Alex Dovenmuehle, co-founder of Big Time Data.

The Lightup platform is designed specifically for the purpose of catching data outages:

  • Architecture: Lightup hooks right up to a company’s data warehouse where the data already lives. There is no need to publish data out with an agent, and the system scales to data volumes as a byproduct of warehousing.
  • Data quality checks: Lightup provides ready-made data quality metrics for common outage symptoms. Visibility begins the moment warehouse credentials are plugged into Lightup.
  • Accurate real-time alerting: Lightup surfaces data outages instantaneously and accurately, focusing the developer’s attention where it’s needed the most.
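To make the "ready-made data quality checks" concrete, here is a hedged sketch of what a warehouse-side volume check might look like. The table name, column, and thresholds are hypothetical, and SQLite stands in for a real warehouse connection such as Snowflake or Redshift; the point is that the check runs as a query where the data already lives, with no agent exporting data out:

```python
import sqlite3  # stand-in for a real warehouse connection (e.g. Snowflake)

def check_volume(conn, table, min_rows):
    """Run a row-count query in the warehouse itself and return True
    if the table meets the minimum expected row volume."""
    (count,) = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()
    return count >= min_rows

# Hypothetical events table with 50 rows of sample data
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER, ts TEXT)")
conn.executemany("INSERT INTO events VALUES (?, ?)",
                 [(i, "2021-06-01") for i in range(50)])

print(check_volume(conn, "events", 10))   # expected volume met -> True
print(check_volume(conn, "events", 100))  # volume shortfall -> False
```

Running the metric query inside the warehouse is also why the approach scales with data volume as a byproduct of warehousing: only the aggregate leaves the database, not the rows.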

The Lightup platform integrates seamlessly with data warehouses used as destinations for product data, including BigQuery, Snowflake, Databricks, and Redshift. It can also run data quality checks on data flowing through ingestion systems such as Segment, Rudder, and Kafka. It connects directly to a company’s data sources and instantly alerts data teams via Slack, PagerDuty, email, or API when data breaks. The platform is enterprise-ready, with support for both SaaS and on-premise (private cloud or customer data center) deployment models.

