Accredited Trust and Fiduciary Advisor

BEN Review Engine
Architecture & FAQ

How the ATFA CE review pipeline works — the moving parts, the tech stack, and answers to common questions from the working team.

Document Status: Draft — Internal
Version: v0.1
Author: Ben Hopf
I. Overview

This page documents the technical architecture behind the BEN Review Engine — the process system that powers the ATFA annual CE credit review cycle. It's intended as a working reference for the Director and CE Review Committee, not a final spec.

BEN handles the systematic parts of the review: ingesting submissions, classifying programs against a tiered database, researching unknown programs, generating recommendation reports, updating records after approval, and sending outbound communications. The judgment calls stay with the Director and Committee.

New here? Start with the CE Review Process Working Plan at memo.atfacertification.com — it covers the what and why. This page covers the how.
II. Pipeline Flow

[Pipeline diagram: Intake (CE App holder submissions → CSV export → Google Sheet staging) → BEN Review Engine (ingest, tier classification, match + clean against the Known Programs database, Tier 4 web/social research) → Recommendation report (color-coded Google Sheet) → Director review + annotation → Committee async review via comments → final decisions → BEN updates CE records, promotes T4 → T3, and sends holder notices and reminders. Published views at the memo., database., and architecture. atfacertification.com subdomains.]

The review cycle moves through four layers. BEN is active in two of them — the preliminary analysis and the post-approval output — with human review in between.

1. Intake — Certificate Holder → CE App → Google Sheet
Certificate holders submit CE credits via CE App. The Director exports submissions as a CSV and loads them into the staging Google Sheet. This step remains manual until CE App's API capabilities are confirmed.
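Until the API question is settled, the ingest step amounts to parsing a CSV export and flagging incomplete rows before BEN classifies anything. A minimal sketch in Python, assuming hypothetical column names — the actual CE App export headers may differ:

```python
import csv

# Hypothetical column names -- the real CE App export headers may differ.
EXPECTED_COLUMNS = {"holder_name", "program_name", "provider", "date", "hours"}

def load_submissions(csv_path):
    """Read a CE App CSV export into a list of submission dicts,
    setting aside rows with missing required fields for manual review."""
    clean, flagged = [], []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            missing = [c for c in sorted(EXPECTED_COLUMNS) if not (row.get(c) or "").strip()]
            if missing:
                flagged.append({**row, "_flag": "missing: " + ", ".join(missing)})
            else:
                clean.append(row)
    return clean, flagged
```

Separating clean rows from flagged ones up front means the data-error report falls out of ingest for free, before tier classification runs.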
2. BEN Review — BEN Review Engine
BEN ingests the staging sheet, classifies each submission into Tiers 1–4 against the Known Programs database, cleans and flags data errors, researches Tier 4 unknowns via web and social media sweep, and produces a color-coded recommendation report.
3. Human Review — ATFA Director → CE Review Committee
The Director reviews BEN's recommendation report, may override or annotate, and shares the finalized report with the Committee via Google Sheets (View + Comment access). Committee members review asynchronously, using comments as the audit trail for governance.
4. Output — BEN Review Engine
BEN receives final Committee decisions, updates CE records with approval statuses, auto-promotes any newly approved Tier 4 programs into the Tier 3 Known database, and sends outbound communications to certificate holders. The feedback loop compounds over time — each cycle reduces Unknown volume.
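The promotion step at the end of each cycle can be sketched as a simple de-duplicated append. The field names (`program_name`, `tier`, `status`) are illustrative, not the actual record schema:

```python
def promote_approved_unknowns(decisions, known_programs):
    """After a cycle, move newly approved Tier 4 programs into the
    Tier 3 Known list so next year's run recognizes them.
    `decisions` is a list of dicts -- field names are illustrative."""
    existing = {p.lower() for p in known_programs}
    promoted = []
    for d in decisions:
        name = d["program_name"].strip()
        if d["tier"] == 4 and d["status"] == "approved" and name.lower() not in existing:
            known_programs.append(name)
            existing.add(name.lower())
            promoted.append(name)
    return promoted
```

Because the check is case-insensitive and skips anything already present, re-running the step is safe — it never duplicates entries in the Known list.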
III. Tech Stack — v1

The v1 stack is deliberately simple — proven tools, low overhead, easy to hand off. Each layer can be upgraded independently as the process matures.

Intake Layer
CE App + Google Sheets
CE App handles holder-facing submission. CSV export → Google Sheet is the staging area. API integration pending support team response — could eliminate the manual export step.
BEN Engine Layer
Claude (Anthropic) + Web Search
Claude with web search enabled runs the classification, matching, cleanup, and Tier 4 research. Operates via a purpose-configured Claude Project with the Known Programs database in persistent context.
Review Layer
Google Sheets (shared)
BEN's color-coded recommendation report lives in Google Sheets. Director and Committee get View + Comment access. Comments serve as the official async decision record.
Output Layer
Google Sheets + Gmail
CE records updated in Google Sheets after approval. Outbound holder communications via Gmail (Director-reviewed drafts in Year 1, fully automated in future state). Published views via atfacertification.com subdomains.
Known Programs Database: Lives as a dedicated tab (or separate file) in Google Sheets. BEN reads it on every run. The Director is the only operator who edits it directly. A read-only HTML view will be published at database.atfacertification.com for Committee reference.
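For reference, one plausible shape for a Known Programs row, expressed as a Python dataclass. The field names here are assumptions for illustration, not the actual columns of the Sheets tab:

```python
from dataclasses import dataclass

# Illustrative schema only -- the actual tab's columns may differ.
@dataclass
class KnownProgram:
    name: str            # canonical program name used for matching
    provider: str        # sponsoring organization
    tier: int            # 1, 2, or 3 (Tier 4 entries are not in this DB)
    first_approved: str  # review cycle in which the program entered the DB
    notes: str = ""      # free-form Director notes
```

Keeping a `first_approved` field (or similar) would let the published read-only view show the Committee how the database has grown cycle over cycle.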
IV. Frequently Asked Questions

Questions from the working team, answered. This section grows as new questions come in.

How does BEN research unknown programs?

When a CE submission doesn't match anything in the Known Programs database (Tier 4 — Unknown), BEN runs a structured research sweep before flagging it for human review. The goal is to exhaust the automated research so the Director and Committee are spending their time on genuine judgment calls, not Googling.

The sweep works in layers:

  • Layer 1 — Direct web search: BEN searches for the program/event name, year, and location. It's looking for an official event website, registration page, agenda PDF, press coverage, or a sponsoring organization's page. If it finds an official agenda, it cross-references session titles, speakers, hours, and dates against what the holder submitted.
  • Layer 2 — LinkedIn: Conference organizers and speakers almost always post about their events. Attendees post recaps. This surfaces legitimacy signals even when there's no formal website.
  • Layer 3 — YouTube: Many industry conferences post recorded sessions. If BEN finds a recording of a claimed session, that's strong corroboration — and it can pull title, speaker, and approximate length from video metadata.
  • Layer 4 — Facebook / other social: Less reliable, but useful for smaller regional events. Event pages, attendee check-ins, and post-event recap posts can confirm an event existed even without a formal web presence.

BEN doesn't make the approval decision — it produces a research summary card for each Unknown submission with whatever it found: source URLs, event description, agenda match confidence, and a plain-language note. The Director and Committee make the final call.
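One way the layered sweep results could roll up into a summary card, sketched in Python. The layer names, confidence labels, and card fields are illustrative assumptions, not the engine's actual output format:

```python
def build_research_card(program, layer_results):
    """Assemble a summary card for one Tier 4 submission.
    `layer_results` maps a sweep layer name to the list of source URLs
    it found; the structure and confidence labels are illustrative."""
    sources = [url for urls in layer_results.values() for url in urls]
    if layer_results.get("web") and layer_results.get("youtube"):
        confidence = "strong"   # official agenda plus a recorded session
    elif sources:
        confidence = "partial"  # some corroboration, needs a human look
    else:
        confidence = "none"     # nothing found -- flag prominently
    layers_hit = sum(1 for urls in layer_results.values() if urls)
    return {
        "program": program,
        "sources": sources,
        "agenda_match_confidence": confidence,
        "note": f"{len(sources)} source(s) across {layers_hit} layer(s)",
    }
```

The point of the card is triage: a "strong" card is a quick ratification, a "none" card is where Director and Committee attention actually belongs.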

After each cycle, every newly approved Tier 4 program is automatically promoted to Tier 3 (Known), so the Unknown pool shrinks year over year.

How does the tier classification system work?

Every CE submission gets classified into one of four tiers. The tier determines how BEN handles the submission and how much human review it needs.

  • Tier 1 — Auto-Approved: ATFA-affiliated programs (TAF, TAI). BEN recommends approval automatically. No individual Committee review required — ratified in bulk.
  • Tier 2 — Reputable: Nationally recognized industry organizations (ABA, Cannon Financial Institute, FIRMA). BEN matches the submission against known session metadata and recommends approval when verified.
  • Tier 3 — Known: Programs previously approved by ATFA in prior cycles. BEN matches against historical records and recommends approval. This database grows automatically after every cycle.
  • Tier 4 — Unknown: Not in any existing database. BEN runs the web research sweep and produces a summary card. Director and Committee review individually.

Classification is based on program name matching — exact match first, then fuzzy match for typos and variations. BEN flags low-confidence matches for Director review rather than auto-classifying incorrectly.
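A minimal sketch of that matching logic using Python's standard-library `difflib`. The 0.85 cutoff and the low-confidence band are illustrative thresholds, not tuned values from the actual system:

```python
import difflib

def classify(program_name, known_db):
    """Classify a submission against the Known Programs DB.
    `known_db` maps canonical program name -> tier (1, 2, or 3).
    Thresholds are illustrative, not production-tuned."""
    name = program_name.strip().lower()
    db = {k.lower(): (k, v) for k, v in known_db.items()}
    if name in db:                          # exact match first
        canonical, tier = db[name]
        return tier, canonical, "exact"
    close = difflib.get_close_matches(name, db.keys(), n=1, cutoff=0.85)
    if close:                               # fuzzy match: typos, variations
        canonical, tier = db[close[0]]
        score = difflib.SequenceMatcher(None, name, close[0]).ratio()
        flag = "fuzzy" if score >= 0.92 else "fuzzy-low-confidence"
        return tier, canonical, flag
    return 4, None, "unknown"               # Tier 4: send to research sweep
```

The `fuzzy-low-confidence` band is what gets surfaced for Director review instead of being auto-classified — matching the "flag rather than guess" behavior described above.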

What is the tech stack, and why was it chosen?

The v1 stack was chosen for simplicity, low cost, and ease of handoff — not for technical elegance. The goal is a working system in Year 1, not an over-engineered one.

  • CE App (myceapp.com): Retained for holder-facing submission. Has bulk CSV export. API capabilities are being assessed — if available, the manual export step could be eliminated.
  • Google Sheets: Serves three roles: staging area for submissions, the Known Programs database, and the Committee review surface. Chosen because it's already familiar, supports View+Comment sharing, and creates an audit trail via comments.
  • Claude (Anthropic) with web search: Powers the BEN Review Engine. Classification logic, data cleanup, Tier 4 research, and recommendation report generation all run through Claude. Configured as a persistent Claude Project so the Known Programs database doesn't need to be re-uploaded each run.
  • Gmail: Outbound holder communications in Year 1 are Director-reviewed drafts. Future state: fully automated sends triggered by BEN after Committee approval.
  • Netlify + atfacertification.com subdomains: Static HTML pages for internal documentation and published read-only views of key data. No CMS, no backend — just files.

Each layer can be upgraded independently. The Google Sheets database could migrate to Airtable or Supabase. The CE App dependency could be replaced with a lightweight custom intake form. None of those upgrades require rebuilding the others.

How does the Committee review process work, exactly?

The Committee review is designed to be asynchronous — no meeting required unless a submission warrants real discussion.

  • What Committee members receive: A View + Comment link to a Google Sheet containing BEN's recommendation report. The sheet is color-coded: Tier 1 auto-approvals are pre-marked, verified Tier 2/3 matches have a recommended approval status, and Tier 4 unknowns have BEN's research summary card attached.
  • How to respond: Committee members add a comment to any row where they want to flag a concern, ask a question, or formally approve/deny. No edits to the sheet itself — comments only.
  • What constitutes approval: A formal approval decision is recorded when the Director confirms consensus from the Committee comments. The Director has final authority to close any open items.
  • When a call is needed: If a submission is genuinely contested — unusual program, disputed hours, credibility question — the Director escalates to a brief synchronous call. These should be rare.
  • The audit trail: All Committee comments are timestamped and attributed in Google Sheets. This creates a defensible governance record without any additional documentation overhead.

The Committee's workload should decrease each year as the Known Programs database grows and the Tier 4 Unknown pool shrinks.

Working Documents

This is one of four internal working documents covering the ATFA CE review process. Each page serves a distinct purpose — use the links below to navigate the full system.

  • Working Plan: CE Review Process — Working Plan (memo.atfacertification.com)
  • Technical Architecture: BEN Engine Architecture & FAQ (architecture.atfacertification.com)
  • System Prompt: BEN Review Engine — System Prompt (ben.atfacertification.com)
  • Policy & Decisions: CE Review Process & Policy (process.atfacertification.com)