Why NBCE’s Data Integrity Problem Is Everyone’s Problem: The Numbers That License a Profession
Originally published: 2025-08-29
“If the data that drives accreditation and licensure is wrong, or unverifiable, then the decisions built on it cannot be trusted.”
Why this matters (to students, schools, and the public)
NBCE isn’t just another vendor. Its exams and institutional reports are the yardstick for CCE Policy 56 compliance (program pass-rate thresholds), accreditation pressure and oversight, graduate licensure trajectories, and the public-safety narrative the profession presents to lawmakers. That means NBCE’s data architecture (how it defines cohorts, calculates pass rates, and validates its outcomes) has real-world consequences for every new DC, every college, and every patient.
After all, NBCE, Norman Ouzts, Karlos Boghosian, and the rest of the folks with a “seat at the table” publicly claim that the NBCE protects the public and ensures that those who pass its tests are not a risk to public health. The question is: “Where’s the beef?”
A pattern you can’t ignore
Opaque or missing outcomes data
When asked to substantiate marketing claims with basic outcomes metrics (predictive validity, retake trajectories, time-to-license), NBCE has not produced the goods. If outcomes exist, publish them with methods. If they don’t, stop claiming them.
CLICK HERE for a copy of that story
“If you can’t show your work, you can’t ask the profession—or the public—to believe your conclusions.”
The Chiropractic Freedom Coalition requested a full accountability package from NBCE including: long-run pass-rate histories and trendlines; evidence linking NBCE performance to academic performance before and during chiropractic college; real-world practice outcomes, patient satisfaction, malpractice frequency/severity, career longevity, and safety records benchmarked both within chiropractic (passers vs. non-passers) and against other health professions.
It sought a transparent cost breakdown for exam development, administration, grading; graduate employment data (rates, roles, field alignment); patient outcome studies tied to licensure status; and broad stakeholder feedback (schools, doctors, patients) on exam relevance and public-safety value. Finally, it asked for NBCE’s analysis of alternative assessment methods, a rigorous cost-benefit justification for the exams, and detailed EBAS disclosures, how student-loan dollars flow to EBAS, and who its board members and principals are.
The NBCE produced none of this, yet it consistently makes marketing claims tied to these data. If the NBCE were a practitioner making clinical claims without evidence, its license would be sanctioned; yet state boards ignore the NBCE’s lack of data to substantiate its claims.
A compromised foundation: the Practice Analysis
Independent reviewers have raised red flags about NBCE’s Practice Analysis inputs and methods (sampling, weighting, instrument design). That document seeds exam blueprints and, ultimately, what students are taught and tested on. If the foundation is biased or brittle, downstream scores, and the high-stakes policies attached to them, inherit the flaw.
In reality, the NBCE’s 2025 Practice Analysis is a convenience sample dressed up as a census: 3,876 respondents, about five percent of an estimated 77,000 chiropractors, recruited largely via association e-blasts and social posts, not a probability draw. The sampling frame counted licenses, not people, inflating denominators where many DCs hold multiple licenses and warping state quotas (as low as three responses in small jurisdictions, error bars >50%). The resulting respondent pool skews older, whiter, more male, and full-time (part-time/<20hrs were sidelined), underrepresenting Gen Z, women, minority DCs, locums, academics, and semi-retirees. Post-hoc weighting on license counts can’t recover missing voices or fix self-selection.
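The arithmetic behind those error bars is easy to check. The sketch below is a minimal illustration, not NBCE’s methodology: it applies the standard 95% margin-of-error formula for a proportion to the figures cited above (3,876 respondents out of roughly 77,000 chiropractors, and a three-response quota in a small jurisdiction).

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion estimated from n responses."""
    return z * math.sqrt(p * (1 - p) / n)

respondents = 3_876
population = 77_000          # estimated chiropractors, per the text above
response_rate = respondents / population

print(f"Response rate: {response_rate:.1%}")              # ~5.0%
print(f"Margin of error, n=3: {margin_of_error(3):.0%}")  # ~57%, i.e. >50%
```

With only three responses in a jurisdiction, the estimate is statistically meaningless, and post-hoc weighting on license counts cannot shrink that uncertainty.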
The flaws in this survey, never mind the claims built on it, could serve as a case study in a course on research design and analysis, yet the profession is hanging its hat on it.
CLICK HERE for more on this story
LIFE University: a concrete, public example
Here’s an actual example of harm caused by NBCE data and its ties to Policy 56 and the CCE. Life University has publicly faced Policy 56 compliance problems. In response, Life adopted a “no boards, no degree” policy: students must now pass all four NBCE parts prior to graduation. Life’s own analysis indicates many students ultimately pass at high rates, but some take longer than six months after graduation; under Policy 56’s six-month clock, those later passes don’t count toward the metric that drives accreditation pressure. In other words, timing, not competence, becomes the issue, yet Policy 56 treats the timing artifact as a performance failure.
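A toy cohort makes the timing artifact concrete. The numbers below are hypothetical, not Life University’s actual data: every passer eventually demonstrates competence, but a six-month window only credits the early ones.

```python
# Hypothetical months from graduation to first passing score (None = never passed)
months_to_pass = [1, 2, 2, 3, 4, 5, 8, 9, 12, None]

def policy56_rate(months, window=6):
    """Share of the cohort passing within the six-month window."""
    return sum(1 for m in months if m is not None and m <= window) / len(months)

def ultimate_rate(months):
    """Share of the cohort that ever passes, regardless of timing."""
    return sum(1 for m in months if m is not None) / len(months)

print(f"Policy 56 rate:     {policy56_rate(months_to_pass):.0%}")  # 60%
print(f"Ultimate pass rate: {ultimate_rate(months_to_pass):.0%}")  # 90%
```

The same cohort looks failing under the stopwatch and strong in the long run; only the timing differs.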
“When a six-month stopwatch outweighs long-run competence, the metric becomes the mission.”
Centralization math that doesn’t add up
NBCE’s modeling around centralized Part IV has already shown material misses (e.g., travel-cost assumptions and access impacts). If the arithmetic for logistics is off, why should we assume the psychometrics and reporting are right?
NBCE’s Part IV centralization “reporting” substitutes PR for proof. A June 23 webinar muted dissent (no Q&A), offered no pilot outcomes, and buried real costs by cherry-picking travel figures while shifting expenses onto students. Claims of “inclusive” stakeholder input ring hollow given that many schools said “no” to centralization, yet the plan marched on. What follows now is a legal and logistical gamble: new format, vague refunds, no documented rulemaking at the state level. If NBCE can’t handle this change transparently and competently, why should boards, schools, or students trust anything else it’s doing?
EBAS secrecy under the NBCE umbrella
NBCE’s relationship to its for-profit EBAS arm raises basic transparency questions at the heart of data integrity and public trust. Who runs EBAS? How are decisions made? How much non-profit money, staff time, or infrastructure cross-subsidizes for-profit activities, and who benefits? Who are the Board of Directors? The employees? When the governance, financial flows, and oversight for a revenue-generating affiliate are opaque, it’s reasonable to ask whether incentives are aligned with validity, reliability, and public protection, or with volume and revenue. If an organization won’t be transparent about who’s in charge and where the money goes, why should stakeholders simply accept unverifiable claims about its data, methods, or exam design? Especially when there is ample evidence of problems.
“Opacity around money and control breeds opacity around methods and results.”
The Summit Group: selective access, selective influence
NBCE’s participation in the closed-door Summit Group compounds the trust problem. Not every school or organization gets a seat at that table, but those who do can shape the agenda, messaging, and expectations the rest of the field is later asked to live under. What concrete advantages, informational or political, accrue to invitees?
How do those advantages cascade into accreditation narratives, licensure expectations, and resource flows? If a handful of insiders can gatekeep the conversation, then metrics and policies promoted as “consensus” may actually be club consensus, not professional consensus. That’s not a solid foundation for high-stakes governance.
If NBCE is willing to shape policy in closed rooms where many of the very schools and boards that must rely on its data aren’t allowed in, why should anyone trust the data or the policies that emerge? When access is selective, outcomes are suspect, because the facts that “win” are often the ones already inside the room. High-stakes governance can’t rest on club conversations and private briefings; if they won’t open the doors, we shouldn’t open our trust.
“When the room is closed, the data that ‘wins’ is often the data that’s already in the room.”
What else could possibly go wrong by relying on NBCE data?
So far we’ve reviewed documented instances of how NBCE’s data integrity, data handling, and marketing claims don’t jibe; something is really off. But what about other possibilities that may not have been brought to light yet? If we already know that the NBCE is hiding data, pushing out false information based on flawed methods, operating in secret meetings, and concealing the activities of its for-profit venture, then what else could it be hiding? What other data integrity problems could exist that the profession just doesn’t know about yet?
For example, if the reports NBCE sends to schools, the CCE, and state regulatory boards rely on unverified inputs (say, an examinee’s self-reported anticipated graduation date rather than a registrar-verified date), then predictable failures follow: cohort drift (graduates assigned to the wrong year), false inclusions (non-graduates inside graduate cohorts), and omissions (verified graduates left out). Even modest cohorting errors would inflate or deflate pass rates, distorting how a program appears under Policy 56 and triggering misplaced accreditation pressure.
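To see how little miscohorting it takes, consider a hypothetical 100-graduate program with a true 82% pass rate and an assumed 80% compliance threshold (the illustrative threshold is an assumption, not a quotation of Policy 56): five omissions plus five false inclusions flip the program’s apparent status.

```python
true_cohort = [True] * 82 + [False] * 18   # 100 verified graduates, 82% true pass rate

def pass_rate(cohort):
    """Fraction of the cohort recorded as passing."""
    return sum(cohort) / len(cohort)

# Cohorting errors: omit 5 verified passers, falsely include 5 non-graduate failures.
miscohorted = [True] * 77 + [False] * 23

print(f"Verified cohort pass rate: {pass_rate(true_cohort):.0%}")   # 82%
print(f"Miscohorted pass rate:     {pass_rate(miscohorted):.0%}")   # 77%
```

Nothing about the program’s actual performance changed; the reported number simply crossed the line that triggers accreditation pressure.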
Now extend that logic to what downstream actors actually do with these numbers:
Schools: If a college relied on a flawed cohort report to prepare its Policy 56 submission, the school’s compliance posture could be misstated.
State boards: If a board formally or informally relied on the same flawed aggregates (or the expectations they create), two catastrophic outcomes follow:
False positives: Applicants who should not have been licensed may have been waved through because program/aggregate data looked compliant.
False negatives: Qualified applicants were delayed or denied because their program’s pass rate was artificially suppressed by miscohorting noise.
And what about CCE? Policy 56 is predicated on NBCE pass rates within six months of graduation. If CCE relies on NBCE-supplied aggregates (rather than school-verified cohorts paired with official individual score reports) to monitor compliance, then CCE’s enforcement is only as sound as NBCE’s black-box methods. What data do NBCE and CCE actually share, and is that data publicly available with methods, error rates, and version history? If not, the profession is being regulated by opaque numbers that outsiders cannot audit.
It’s Never a Problem Until It’s a Problem
This is not just an academic quibble; it’s a state-level scandal.
State Boards have a non-delegable duty to safeguard the public. If they outsourced practical gatekeeping to NBCE (and, indirectly, to CCE metrics) without demanding transparent methodology and verification, they outsourced accountability too. For all the NBCE hype about “protecting the public,” reliance on unverifiable, error-prone aggregates may have increased public risk, and we already have evidence of NBCE data problems in other contexts (outcomes opacity, practice-analysis defects, modeling errors), which only amplifies the concern.
“If the cohort is wrong, the conclusion is wrong, and when boards bet public safety on wrong conclusions, everyone loses.”
Rules for Thee and Not for Me
In modern science, “trust me” died a long time ago. Journals now expect (and funders increasingly require) data availability, code sharing, preregistration, and independent replication, because conclusions without underlying data are opinions, not evidence.
The same standard should apply, more so, when NBCE, CCE, FCLB, and state boards claim that NBCE exams “protect the public.” That is a public-health assertion, and public-health claims demand regulatory-grade transparency.
Transparency here isn’t complicated: publish the de-identified microdata and codebooks behind institutional reports; disclose cohort definitions, weighting, error rates, and version histories; open the Practice Analysis sampling frame, response rates, and raw tables; release outcomes datasets (time-to-license, retakes, predictive validity); and subject everything to independent audit with public findings. If a drug maker must open its protocols and datasets to justify safety, a testing monopoly that gates licensure and purports to protect public health should do no less. Until the numbers, methods, and incentives are visible, and verifiable, the system remains a black box. And a black box is not a basis for safeguarding the public; it’s a recipe for manufactured consensus and misplaced trust.
What transparency looks like (and the profession should demand)
Real transparency starts with publishing the methodology including a full, version-controlled data dictionary detailing cohorting rules, verification steps, error-rate studies, and change logs. Cohorts must be built on school-verified, registrar-certified graduation data, not examinee anecdotes or proxies. NBCE’s public claims should be backed by de-identified outcomes datasets (predictive validity, retake pathways, time-to-license) because, without outcomes, it didn’t happen. A truly independent third-party audit must scrutinize institutional reporting, the Practice Analysis, and any affiliate relationships (e.g., EBAS), with findings released publicly. Every aggregate report should include a plain-English limitations section, and when errors are discovered, NBCE must issue visible, versioned corrections.
Finally, because NBCE’s authority flows from state mandates, state boards should require this transparency as a condition of reliance and make clear that licensure decisions rest on individual official scores and registrar-verified graduation, not on unvetted “courtesy” aggregates.
What schools and students can do right now
Verify cohorts in-house. Build Policy 56 submissions on registrar-verified rosters + official individual score reports.
Ask for the methods in writing. If a report affects accreditation or licensure posture, request the cohorting rules, data sources, and disclaimers.
Document discrepancies. Keep a paper trail of corrections and reissues; patterns matter.
Engage your board. State boards should be partners in getting the data right — that’s public safety, not politics.
“Data integrity isn’t a technicality. It’s the difference between a fair system and a rigged one.”
Bottom line: NBCE’s data touch every high-stakes decision in chiropractic (accreditation, licensure, professional credibility). Until methodology is transparent, cohorts are school-verified, outcomes are published, and affiliate structures are disclosed, the profession should treat NBCE’s institutional reporting as unreliable by default and act accordingly.