The NDX Quality Assurance Model begins with our intuitive database design. Structured for accuracy and consistency in an ever-evolving environment of publication changes, targeting-rule updates, and ZIP-level population shifts, the database normalizes source data and maintains integrated activity and event tracking logs.
Our proprietary system and software controls provide standardized templates to ensure data compatibility and conformity with thousands of publications throughout the country. Integrity is protected through restrictions that block input of invalid data and through automated warning flags for non-conforming data.
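The distinction between blocking restrictions and warning flags can be illustrated with a minimal sketch. This is a hypothetical example, not NDX's actual software: the field names, rules, and thresholds are assumptions chosen for illustration.

```python
# Hypothetical template-driven input validation: hard errors block the
# record, warnings flag non-conforming data for analyst review.
# Field names and rules are illustrative, not NDX's actual schema.

REQUIRED_FIELDS = {"publication_id", "zip_code", "circulation"}

def validate_record(record):
    """Return (errors, warnings): errors block input, warnings flag review."""
    errors, warnings = [], []

    # Restriction: reject records missing required fields.
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append(f"missing required fields: {sorted(missing)}")

    # Restriction: circulation must be a non-negative integer.
    circ = record.get("circulation")
    if circ is not None and (not isinstance(circ, int) or circ < 0):
        errors.append("circulation must be a non-negative integer")

    # Warning flag: ZIP codes that don't conform to the 5-digit format
    # are accepted but flagged for review rather than rejected outright.
    zip_code = record.get("zip_code", "")
    if zip_code and not (zip_code.isdigit() and len(zip_code) == 5):
        warnings.append(f"non-conforming ZIP code: {zip_code!r}")

    return errors, warnings
```

Separating hard errors from soft warnings lets clearly invalid data be stopped at the door while borderline entries still enter the workflow with a visible flag.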
Continuous database scanning detects data exceptions and anomalies, such as large circulation variances or aged data, and reports issues to NDX analysts in real time.
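A scan of this kind can be sketched as a pass over the records that flags the two anomaly types named above. The thresholds and field names here are assumptions for illustration, not NDX's actual parameters.

```python
# Hypothetical anomaly scan: flag large circulation variances and aged
# data for analyst review. Thresholds and field names are illustrative.
from datetime import date

VARIANCE_THRESHOLD = 0.25   # flag circulation swings larger than 25%
MAX_AGE_DAYS = 365          # flag data not verified within the past year

def scan_for_exceptions(records, today):
    """Yield (record, reason) pairs describing each detected anomaly."""
    for rec in records:
        prior, current = rec["prior_circulation"], rec["circulation"]

        # Large circulation variance versus the prior reported figure.
        if prior and abs(current - prior) / prior > VARIANCE_THRESHOLD:
            yield rec, f"circulation variance {abs(current - prior) / prior:.0%}"

        # Aged data: last verification is older than the allowed window.
        age_days = (today - rec["last_verified"]).days
        if age_days > MAX_AGE_DAYS:
            yield rec, f"aged data: last verified {age_days} days ago"
```

Each flagged pair would feed the real-time reporting described above, giving analysts both the record and the reason it was surfaced.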
Controls capture and isolate exceptions; trained NDX analysts then investigate and rapidly resolve inconsistencies, running automated tests to study data variances and trends or contacting the original data provider directly for validation.