Platform

A product architecture for healthcare mapping teams that need speed without losing review control.

This page is the technical narrative: how MedMapper moves from source metadata to validated outputs with evidence, workflow gates, and delivery paths that enterprise buyers can understand.

MEDMAPPER workspace / mapping review

Domains

Patient
Encounter
Diagnosis
Lab
Target Column           Source Expression     Confidence  Status
person_id               PATIENT.patient_id    0.98        approved
gender_concept_id       PATIENT.gender_code   0.91        approved
visit_occurrence_id     ENCOUNTER.visit_id    0.86        review
condition_source_value  DX.diagnosis_code     0.95        approved
Schema discovery -> Join inference -> Review queue -> Validation -> GitHub publish

Evidence panel

DX.diagnosis_code -> condition_source_value

Matched target semantics, vocabulary hints, and approved-pattern support inside the review policy.

Name match 0.93 / Pattern boost 0.88 / Validation ready
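The evidence panel above can be read as signals feeding one confidence number. A minimal sketch of that idea, assuming illustrative signal names, weights, and an approval threshold (none of these are MedMapper's actual scoring model):

```python
# Illustrative sketch: combining mapping-evidence signals into a single
# confidence score. Signal names, weights, and the review threshold are
# assumptions for illustration, not MedMapper internals.

def mapping_confidence(signals: dict[str, float],
                       weights: dict[str, float]) -> float:
    """Weighted average of evidence signals, clamped to [0, 1]."""
    total = sum(weights.values())
    score = sum(signals[name] * w for name, w in weights.items()) / total
    return max(0.0, min(1.0, score))

def review_status(score: float, approve_at: float = 0.90) -> str:
    """Mappings below the approval threshold stay in the review queue."""
    return "approved" if score >= approve_at else "review"

evidence = {"name_match": 0.93, "pattern_boost": 0.88}
weights = {"name_match": 0.6, "pattern_boost": 0.4}

score = mapping_confidence(evidence, weights)
print(round(score, 2), review_status(score))  # -> 0.91 approved
```

The point is explainability: each badge in the panel is a named input, so a reviewer can see why a mapping landed above or below the line.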
Deployment mode

SaaS or customer-deployed data plane with the same workspace flow and approval model.

Pipeline story

Each platform stage should feel like a product capability, not a backend checklist.

The layout is intentionally sequential so visitors understand the system quickly.

01

Connect and profile

Connect Snowflake, SQL Server, and enterprise data sources, then ground the platform in the actual schema.

02

Infer joins and domains

Use deterministic evidence and guarded AI reasoning to classify domains and propose join paths.

03

Review mappings

Approve mappings in a grid-first workbench with evidence panels, confidence states, and reviewer controls.

04

Validate outputs

Run validation, inspect lineage, and verify table readiness before shipping any generated artifacts.

05

Publish downstream

Deliver SQL, dbt-oriented outputs, and GitHub publish workflows once the project is ready for export.

Key surfaces

The website should show the platform through the actual review and delivery surfaces.

Use polished interface mockups here later, but keep the copy anchored in the real product structure now.

Join evidence

Evidence before approval

Use overlap score, type compatibility, and critic checks to make join confidence understandable instead of magical.
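One way to picture "understandable instead of magical": overlap, type compatibility, and critic warnings each move an explicit score. This is a hedged sketch with assumed names, weights, and checks, not the product's real inference logic:

```python
# Illustrative sketch of join-evidence scoring: value overlap, type
# compatibility, and critic checks feeding one explainable number.
# All names, weights, and checks are assumptions for illustration.

def value_overlap(left: set, right: set) -> float:
    """Share of distinct left-side key values found on the right side."""
    return len(left & right) / len(left) if left else 0.0

def join_confidence(left_vals: set, right_vals: set,
                    left_type: str, right_type: str,
                    critic_warnings: list[str]) -> float:
    if left_type != right_type:
        return 0.0  # incompatible key types: never propose the join
    overlap = value_overlap(left_vals, right_vals)
    # Each critic warning (e.g. fan-out risk) discounts the score.
    return overlap * (0.9 ** len(critic_warnings))

score = join_confidence({"p1", "p2", "p3", "p4"},
                        {"p1", "p2", "p3", "x9"},
                        "varchar", "varchar",
                        critic_warnings=["fan-out risk"])
print(round(score, 3))  # -> 0.675
```

Because every factor is a named term, a reviewer can see exactly which signal pulled a proposed join below the trust line.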

Lineage and validation

Trace every target path

Follow how source tables, transforms, and target models connect before the team trusts generated outputs.

Delivery

Approved-table outputs only

Move ready tables into SQL, dbt-oriented artifacts, and GitHub workflows while blocked tables stay visible and gated.
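The gating rule above can be sketched as a simple partition: only tables whose mappings are all approved and validated are exported, while blocked tables remain listed. Field and table names here are illustrative assumptions:

```python
# Illustrative sketch of a delivery readiness gate: export-ready tables
# are separated from blocked ones, and blocked tables stay visible.
# Field names and statuses are assumptions, not MedMapper internals.

from dataclasses import dataclass

@dataclass
class TableStatus:
    name: str
    all_mappings_approved: bool
    validation_passed: bool

def partition_for_delivery(tables: list[TableStatus]):
    """Split tables into export-ready and blocked buckets."""
    ready = [t.name for t in tables
             if t.all_mappings_approved and t.validation_passed]
    blocked = [t.name for t in tables
               if not (t.all_mappings_approved and t.validation_passed)]
    return ready, blocked

tables = [
    TableStatus("person", True, True),
    TableStatus("visit_occurrence", False, True),  # mapping still in review
    TableStatus("condition_occurrence", True, True),
]
ready, blocked = partition_for_delivery(tables)
print(ready, blocked)  # -> ['person', 'condition_occurrence'] ['visit_occurrence']
```

Keeping blocked tables in the output, rather than silently dropping them, is what makes the gate reviewable.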

Workbench

Mapping Workbench

Spreadsheet-grade review surface for source expressions, transform types, confidence bands, and bulk approvals.

Inference

Join Evidence Viewer

Inspect overlap scores, type compatibility, domain plausibility, and critic warnings before trusting join paths.

Traceability

Lineage and validation

Trace source-to-target paths and move from proposed mappings to export-ready artifacts with explicit validation gates.

Delivery

SQL, dbt, and GitHub publish

Generate approved-table outputs and push deterministic bundles into a governed delivery workflow.

Explainable AI posture

Lead with evidence, confidence scoring, and human review. Avoid language that implies unsupervised automation.

Enterprise deployment flexibility

Show that the same workspace can support SaaS tenants and customer-deployed data planes without changing the product story.

Governed delivery

Connect readiness gates, validation, SQL/dbt outputs, and GitHub publish into one coherent downstream narrative.

Platform walkthrough

See the architecture, workflow, and review controls in a product demo.

Book a demo