
Summary

Delivery error rate discrepancies between APIs and UIs arise from a combination of factors. APIs typically expose raw, real-time data, whereas UIs often display aggregated, summarized, filtered, or cached data. These differences stem from variations in data processing pipelines, tracking methodologies, aggregation methods, the handling of retries and bounces, timezone calculations, bot filtering, and the level of detail displayed. The error breakdowns shown in the Google Postmaster Tools (GPT) dashboard have also been noted as inaccurate, so key considerations include verifying data integrity and accounting for how GPT graphs are scaled.

Key findings

  • Real-Time vs. Aggregated Data: APIs provide real-time or near real-time data, while UIs display aggregated or summarized data.
  • Data Processing Pipelines: Differences in processing pipelines contribute to variations in error rates.
  • Data Filtering/Classifications: UIs may apply filters or classifications, altering the error rates.
  • Data Caching: UI dashboards often use cached data, causing discrepancies.
  • Retry Handling: APIs may count initial failures even when a later retry succeeds, while UIs show the final delivery status (see the sketch after this list).
  • Bounce Definitions: Different definitions of bounces affect error rate calculations.
  • Timezone Calculations: Timezone differences can impact daily metric reporting.
  • Bot Filtering: UIs may filter bot traffic, affecting error rates.
  • GPT Dashboard Inaccuracies: Error breakdowns in the GPT dashboard can be wildly inaccurate.
  • GPT Scaling Issues: Scaling discrepancies between API data and GPT graphs need to be taken into account.
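
As a concrete illustration of the retry point above, the short Python sketch below contrasts an attempt-level error rate (the API-style view, where an initial failure counts even if a later retry succeeds) with a final-status error rate (the UI-style view). The event records and statuses are hypothetical, not any particular provider's schema.

# Hypothetical attempt-level events: (message_id, attempt_timestamp, status).
events = [
    ("msg-1", "2024-05-01T10:00:00Z", "deferred"),   # soft failure, later retried
    ("msg-1", "2024-05-01T10:30:00Z", "delivered"),  # retry succeeded
    ("msg-2", "2024-05-01T11:00:00Z", "delivered"),
    ("msg-3", "2024-05-01T12:00:00Z", "bounced"),    # hard failure, no recovery
]

# API-style: every failed attempt counts, even if a later retry succeeded.
failed_attempts = sum(1 for _, _, status in events if status != "delivered")
attempt_error_rate = failed_attempts / len(events)

# UI-style: only the final status of each message counts.
last_status = {}
for message_id, _, status in sorted(events, key=lambda e: e[1]):
    last_status[message_id] = status
failed_messages = sum(1 for s in last_status.values() if s != "delivered")
final_error_rate = failed_messages / len(last_status)

print(f"attempt-level (API-style) error rate: {attempt_error_rate:.0%}")  # 50%
print(f"final-status (UI-style) error rate:   {final_error_rate:.0%}")    # 33%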

Key considerations

  • API Raw Data Verification: Check the raw API response for calculation errors before trusting either figure (see the sketch after this list).
  • Data Integrity Verification: Verify the details behind reported delivery error rates to confirm that the API and UI figures measure the same thing.
  • Data Source Appropriateness: Select the appropriate data source (API or UI) based on the use case.
  • Understanding Data Aggregation: Understand the aggregation methods used so that discrepancies can be reconciled.
  • Code Correctness: If API data is used, ensure the code that pulls and processes it is correct.
  • GPT Analysis: Expect API data to approximate, rather than identically replicate, the GPT graphs when validating them.
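
A minimal sketch of the raw-data verification idea: re-derive the error rate yourself from the raw API response and compare it with the dashboard. The JSON shape and field names below (deliveryErrors, errorClass, errorRatio) are illustrative assumptions loosely modelled on per-day traffic stats; substitute the actual fields from your provider's raw response.

raw_response = {  # pretend this came from an API call, e.g. requests.get(...).json()
    "date": "2024-05-01",
    "deliveryErrors": [
        {"errorClass": "PERMANENT_ERROR", "errorRatio": 0.004},
        {"errorClass": "TEMPORARY_ERROR", "errorRatio": 0.012},
    ],
}

# Recompute the headline rate instead of trusting any one graph.
total_error_ratio = sum(e["errorRatio"] for e in raw_response["deliveryErrors"])
permanent_only = sum(
    e["errorRatio"]
    for e in raw_response["deliveryErrors"]
    if e["errorClass"] == "PERMANENT_ERROR"
)

print(f"{raw_response['date']}: total {total_error_ratio:.2%}, permanent only {permanent_only:.2%}")
# If the dashboard shows a different number, check whether it plots the total,
# a single error class, or a rescaled/aggregated version of the same data.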

What email marketers say

12 marketer opinions

Discrepancies between API and UI delivery error rates stem from several factors. The UI often presents aggregated, summarized, or cached data, while the API provides more granular, real-time, and potentially unfiltered data. Variations in data processing pipelines, filtering of bot traffic, timezone calculations, the handling of retries and bounce classifications (hard vs. soft), and the use of different timeframes for calculations all contribute to these differences. Marketers also note that dashboard inaccuracies occur and that the scaling of GPT graphs needs to be taken into account.

Key opinions

  • Data Aggregation: The UI typically aggregates data, potentially hiding granular errors visible through the API.
  • Real-time vs. Cached Data: The API often provides real-time data, while the UI might use cached data, leading to inconsistencies.
  • Filtering and Calculations: The UI often applies filters (e.g., bot traffic) and calculations that are not present in the raw API data.
  • Retry Handling: The API may count initial delivery failures (even if successful on retry), while the UI reflects the final delivery status.
  • Bounce Classifications: Differences in defining and classifying bounces (hard vs. soft) between the API and UI can cause varying rates.
  • Timezone Calculations: Different timezone calculations (UTC vs. local) can impact the daily metrics reported by the API and the UI (see the sketch after this list).
  • Dashboard Inaccuracies: GPT dashboard readings for errors are often wildly inaccurate.
  • Scaling Inaccuracies: Scaling discrepancies between API data and the GPT graphs can also be an issue.
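
The timezone point is easiest to see with a small example. The sketch below buckets the same made-up failure timestamps into calendar days first in UTC and then in a local timezone, which is enough to move events across a day boundary and change the daily totals.

from collections import Counter
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Hypothetical failure events, stored as UTC timestamps.
failure_times = [
    datetime(2024, 5, 1, 23, 30, tzinfo=timezone.utc),
    datetime(2024, 5, 2, 0, 15, tzinfo=timezone.utc),
    datetime(2024, 5, 2, 6, 0, tzinfo=timezone.utc),
]

def daily_counts(times, tz):
    """Count failures per calendar day in the given timezone."""
    return Counter(t.astimezone(tz).date().isoformat() for t in times)

print("UTC buckets:    ", dict(daily_counts(failure_times, timezone.utc)))
print("Pacific buckets:", dict(daily_counts(failure_times, ZoneInfo("America/Los_Angeles"))))
# In UTC the counts are {2024-05-01: 1, 2024-05-02: 2}; in US Pacific all three
# events land on 2024-05-01, so "yesterday's" error count differs between reports.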

Key considerations

  • Data Granularity: Understand the level of detail provided by both the API and the UI and how the data is aggregated in each.
  • Processing Pipelines: Be aware of the different processing pipelines used for API and UI data and how these might impact the reported metrics.
  • Data Definitions: Clarify how metrics like bounces and delivery errors are defined and calculated in both the API and the UI (see the sketch after this list).
  • Data Latency: Consider the potential latency in UI data due to caching or batch processing.
  • Intended Use Case: Determine whether API data is the right source for the task and, if so, confirm that the code pulling it is correct, keeping GPT's inaccurate breakdowns in mind.
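
To illustrate the data-definitions point, the sketch below computes the same "delivery error rate" two ways over identical hypothetical final statuses: once counting only hard bounces and once counting hard and soft bounces alike. The status labels are placeholders, not a specific provider's classification.

# Hypothetical final statuses for ten messages.
statuses = ["delivered"] * 7 + ["hard_bounce", "soft_bounce", "soft_bounce"]

hard_only_rate = statuses.count("hard_bounce") / len(statuses)
hard_and_soft_rate = sum(
    1 for s in statuses if s in ("hard_bounce", "soft_bounce")
) / len(statuses)

print(f"hard bounces only: {hard_only_rate:.0%}")      # 10%
print(f"hard + soft:       {hard_and_soft_rate:.0%}")  # 30%
# An API that exposes both classes and a UI that charts only hard bounces will
# legitimately disagree, even though they read the same underlying events.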

Marketer view

Email marketer from StackExchange notes that API endpoints may have different levels of data granularity compared to the UI. The UI might aggregate data to simplify presentation, which could mask specific errors visible in the API.

19 Jun 2022 - StackExchange

Marketer view

Email marketer from Quora shares that UI delivery metrics may be calculated using a different timeframe than API data. The UI might show a daily summary, while the API could report hourly or even more granular data.

31 May 2025 - Quora

What the experts say

2 expert opinions

Experts agree that inconsistencies in delivery error rates between APIs and UIs arise from differing data processing and reporting methods. APIs often present raw, real-time data, while UIs use processed, summarized, or filtered data, leading to variations in reported error rates. These differences stem from disparities in tracking methodologies, processing times, and data aggregation approaches.

Key opinions

  • Different Data States: APIs provide raw, real-time data, whereas UIs show processed and summarized data.
  • Tracking Methodologies: Differing tracking methods between APIs and UIs contribute to inconsistent error rates.
  • Data Filtering: UIs may apply filters or classifications to the raw data, altering the error rates reported compared to the API (see the sketch after this list).
  • Processing Time: APIs deliver near real-time data, while UIs can lag behind due to processing.
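
A small sketch of the filtering effect: the error rate computed over all raw events (API-style) versus only over events that survive a UI-side classification filter. The "filtered" flag is a hypothetical stand-in for whatever a dashboard excludes, such as bot or suppressed traffic.

# Hypothetical raw events: (status, filtered_out_by_ui)
events = [
    ("error", False), ("error", True),            # one real error, one the UI filters out
    ("ok", False), ("ok", False), ("ok", False),
    ("ok", True),                                 # an "ok" event the UI also filters out
]

raw_rate = sum(1 for status, _ in events if status == "error") / len(events)

visible = [e for e in events if not e[1]]
ui_rate = sum(1 for status, _ in visible if status == "error") / len(visible)

print(f"raw (API-style) error rate:     {raw_rate:.0%}")  # 33%
print(f"filtered (UI-style) error rate: {ui_rate:.0%}")   # 25%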

Key considerations

  • Data Processing Awareness: Be aware of the data processing steps applied by the UI and how they differ from the raw data provided by the API.
  • Tracking Methodology Alignment: Understand the tracking methodologies used in both the API and the UI to reconcile differences in reported error rates.
  • Data Usage Context: Consider the intended use case for the data and choose the appropriate source (API or UI) based on the required level of detail and processing.

Expert view

Expert from Spam Resource explains that different definitions and tracking methodologies between the API and UI can lead to inconsistencies. The API might report raw data, while the UI applies filters or classifications that alter the error rates.

6 Apr 2025 - Spam Resource

Expert view

Expert from Word to the Wise, Laura Atkins, responds that API and UI discrepancies can arise from different processing times and data aggregation methods. The UI might reflect processed and summarized data, whereas the API may provide unprocessed, real-time data.

10 Jun 2025 - Word to the Wise

What the documentation says

5 technical articles

Email service provider documentation consistently indicates that discrepancies in delivery error rates between APIs and UIs stem from differences in data processing, aggregation, and reporting. APIs typically offer real-time or near real-time data reflecting immediate activity, whereas UIs often display aggregated, summarized, or delayed data. Factors like processing variations, caching, data sampling, and different levels of detail also contribute to these discrepancies.

Key findings

  • Real-time vs. Aggregated Data: APIs provide real-time or near real-time data, while UIs present aggregated or summarized data.
  • Processing Pipelines: Different processing pipelines for API and UI data contribute to variances in reported rates.
  • Data Sampling/Estimation: UI reporting may use data sampling or estimation for performance reasons, while API data reflects full records.
  • Level of Detail: UI dashboards and APIs can have different levels of detail and aggregation, leading to discrepancies.
  • Data Delay: UI reports might display data with slight delays, which are not present in the API data.

Key considerations

  • Data Latency: Acknowledge potential data latency in UI displays due to aggregation and processing.
  • Data Source Selection: Select the appropriate data source (API or UI) based on the required level of detail, real-time vs. aggregated views, and potential for estimation/sampling.
  • Consistency Checks: Regularly compare API and UI data to ensure consistency and to identify issues with data processing or reporting (see the sketch after this list).
  • Understand Aggregation Methods: Be aware of the aggregation methods used for data presented in the UI and their potential impact on reported error rates.
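
One way to act on the consistency-check recommendation is a routine comparison of the rate you compute from API data against the rate shown in the UI, flagging days where the gap exceeds what rounding, sampling, or aggregation could plausibly explain. The figures and tolerance below are placeholders.

# Hypothetical per-day rates: date -> (rate derived from API data, rate read off the UI).
daily_rates = {
    "2024-05-01": (0.012, 0.011),
    "2024-05-02": (0.034, 0.019),  # a gap this large is worth investigating
    "2024-05-03": (0.008, 0.008),
}

TOLERANCE = 0.005  # absolute difference treated as normal reporting noise

for day, (api_rate, ui_rate) in sorted(daily_rates.items()):
    gap = abs(api_rate - ui_rate)
    flag = "MISMATCH" if gap > TOLERANCE else "ok"
    print(f"{day}: api={api_rate:.3f} ui={ui_rate:.3f} gap={gap:.3f} {flag}")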

Technical article

Documentation from Postmark details that UI reporting could be subject to data sampling or estimation for performance reasons, whilst API data represents full records. Sampling can cause minor differences in delivery error rates.

8 Apr 2025 - Postmark

Technical article

Documentation from Mailgun explains that differences can arise due to the API providing real-time data while the UI might display aggregated or slightly delayed data. Processing variations and caching mechanisms can also contribute.

30 Oct 2021 - Mailgun
