If your team spends time manually reviewing documents — checking fields, validating data, screening names — Omni can automate much of that work. This guide helps you think through how to convert your existing Standard Operating Procedures (SOPs) into Omni workflows.

Identifying Automation Opportunities

Not every process is a good fit for automation. Look for these signs that your manual review process is ready for Omni:
Signal | Why It Matters
Repetitive checks | The same verification steps are applied to every document
Consistent criteria | Pass/fail decisions follow defined rules, not subjective judgment
High volume | Your team processes dozens or hundreds of documents per day or week
Human error risk | Fatigue or inconsistency leads to missed issues
Structured inputs | Documents follow a predictable format (invoices, registration certificates, forms)

Common Processes That Omni Can Automate

  • Invoice verification — Checking amounts, dates, vendor details, line item totals
  • Vendor onboarding — Validating business registration documents and screening against AML watchlists
  • Compliance form review — Verifying required fields are present and valid
  • Business registration validation — Extracting and cross-checking company details
  • Contract review — Checking key terms, dates, and party information
  • Employee document checks — Validating submitted certifications, licenses, or identification documents
Omni works best with processes that have clear, documentable rules. If your review process relies heavily on subjective judgment or institutional knowledge that cannot be written down, it may need to be partially automated rather than fully automated.

Mapping Your SOP to an Omni Workflow

Follow these five steps to convert a manual process into an Omni workflow.
Step 1: Document Your Current Manual Process

Before building anything in Omni, write down exactly how your team currently handles the review:
  • Who performs the review?
  • What documents do they receive?
  • What do they check on each document?
  • What decisions do they make (approve, reject, escalate)?
  • Where do results go after the review?
This becomes your baseline. You will translate each of these into Omni configuration.
Step 2: Identify the Input Documents

List every document type that gets submitted for review. Be specific:
  • Is it a PDF invoice? A scanned business registration certificate? A spreadsheet?
  • Are there multiple documents per case, or just one?
  • Do documents need to be cross-referenced against each other?
Omni supports JPG, PNG, PDF, TXT, DOC, DOCX, XLS, XLSX, BMP, TIFF, WEBP, CSV, HTML, and MD files, with a limit of 5 items per folder. If a single case involves more than 5 documents, group related documents into multiple folders within the same profile.
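A lightweight pre-flight check in your intake code can catch unsupported formats and oversized folders before submission. A minimal sketch, assuming the supported-format list and 5-item folder limit stated above (the function name is illustrative, not an Omni API):

```python
# Pre-flight check for a folder of documents before upload.
# SUPPORTED mirrors the formats listed above; MAX_PER_FOLDER is the 5-item limit.
from pathlib import Path

SUPPORTED = {"jpg", "png", "pdf", "txt", "doc", "docx", "xls", "xlsx",
             "bmp", "tiff", "webp", "csv", "html", "md"}
MAX_PER_FOLDER = 5

def check_folder(paths):
    """Return a list of problems; an empty list means the folder is ready to submit."""
    problems = []
    if len(paths) > MAX_PER_FOLDER:
        problems.append(f"{len(paths)} items exceeds the {MAX_PER_FOLDER}-item folder limit")
    for p in paths:
        ext = Path(p).suffix.lstrip(".").lower()
        if ext not in SUPPORTED:
            problems.append(f"unsupported format: {p}")
    return problems
```

Running this before each upload keeps format errors out of your review queue entirely.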
Step 3: Define Pass/Fail Criteria

For each check your team performs, write down the exact criteria:
  • What makes a document acceptable? (e.g., “All required fields are present and the total matches line items”)
  • What makes a document rejected? (e.g., “Invoice date is in the future” or “Business registration is expired”)
  • What triggers escalation to a senior reviewer? (e.g., “AML screening returns a match” or “Key fields are unreadable”)
These criteria become the decision logic in your Omni policy.
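Written out this way, the criteria map naturally onto a three-way decision. A hedged sketch of that mapping (the check names are illustrative, not an Omni API; the precedence mirrors the criteria above: hard failures reject, ambiguity escalates, everything else approves):

```python
# Map individual check outcomes onto the APPROVE / REJECT / MANUAL_REVIEW decision
# used throughout this guide. Check names here are placeholders for your own criteria.
def decide(checks: dict) -> str:
    if checks.get("hard_failure"):        # e.g. invoice date is in the future
        return "REJECT"
    if checks.get("needs_escalation"):    # e.g. AML partial match, unreadable fields
        return "MANUAL_REVIEW"
    if all(checks.get(k) for k in ("fields_present", "totals_match")):
        return "APPROVE"
    return "MANUAL_REVIEW"                # anything unaccounted for goes to a human
```

Defaulting ambiguous cases to manual review is the safe choice: a false escalation costs minutes, a false approval costs trust.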
Step 4: Write It as a Natural Language Policy

Combine the checks and criteria from the previous steps into a structured policy statement. Use numbered steps and explicit language.
Template:
Review the submitted [document type] by performing the following checks:
1. Extract [list of fields] from the document
2. Verify that [specific condition]
3. Check that [another condition]
4. [Additional verification steps]

Decision criteria:
- APPROVE if all checks pass and all required fields are present
- REJECT if [specific failure conditions]
- FLAG for manual review if [ambiguous conditions]
See the Policy Writing Guide for detailed examples and best practices.
Step 5: Define What Data You Need Extracted

Decide what structured data your downstream systems need from each review. This becomes your output schema.
Ask yourself:
  • What fields does your backend system expect?
  • Do you need the raw extracted values, or just the pass/fail result?
  • Should the output include reasons for the decision?
Design the JSON output schema to match these requirements exactly.
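Once your backend consumes these results, a light shape check on the returned JSON guards against schema drift. A sketch, assuming the three-block layout used in the examples that follow (the function name is illustrative):

```python
# Minimal shape check: confirm the decision block an integration depends on is
# present and well-formed before routing. Field names follow the schemas in this guide.
VALID_RESULTS = {"APPROVE", "REJECT", "MANUAL_REVIEW"}
VALID_STATUSES = {"pending_review", "approved", "rejected"}

def validate_decision_block(result: dict) -> bool:
    """True if the result carries a routable decision block, False otherwise."""
    decision = result.get("decision")
    if not isinstance(decision, dict):
        return False
    return (decision.get("result") in VALID_RESULTS
            and decision.get("verificationStatus") in VALID_STATUSES
            and isinstance(decision.get("reasons"), list))
```

Failing fast here is cheaper than debugging a downstream system that silently mis-routed a case.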

Real-World Examples

These examples show how manual review processes translate into Omni workflows.

Example 1: Invoice Review

A finance team member opens each invoice PDF, manually checks:
  • Vendor name and invoice number are present
  • Invoice date is reasonable (not in the future, not older than 90 days)
  • Line item amounts add up to the stated total
  • Tax calculations are correct
  • No duplicate invoice numbers in the system
Time per invoice: 5-10 minutes. Error rate: ~3% (missed calculation errors, overlooked duplicates).
Policy:
Verify the submitted invoice document by:
1. Extracting vendor name, invoice number, invoice date, line items, subtotal, tax, and total amount
2. Verifying all required fields are present and non-empty
3. Validating that the invoice date is not in the future and not older than 90 days
4. Checking that the sum of line item amounts equals the stated subtotal
5. Verifying that tax is calculated correctly based on the subtotal
6. Approve if all checks pass; reject if amounts do not match or required fields are missing; flag for manual review if date is borderline
Engines: Text Verifier - Glove (extract and validate all fields)
Output schema:
{
  "invoice": {
    "vendorName": "string",
    "invoiceNumber": "string",
    "invoiceDate": "string",
    "lineItems": ["string"],
    "subtotal": "number",
    "tax": "number",
    "totalAmount": "number"
  },
  "validation": {
    "allFieldsPresent": "boolean",
    "dateValid": "boolean",
    "amountsMatch": "boolean",
    "taxCorrect": "boolean"
  },
  "decision": {
    "result": "APPROVE | REJECT | MANUAL_REVIEW",
    "verificationStatus": "pending_review | approved | rejected",
    "reasons": ["string"]
  }
}
Result: Review time drops from 5-10 minutes to seconds for straightforward invoices. The team only handles flagged cases manually.
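The arithmetic in this policy (steps 3-5) is easy to spot-check against extracted values in your own tests. A sketch, assuming line items are extracted as numeric amounts and a known tax rate (the 10% rate and the function name are purely illustrative):

```python
# Re-run the invoice policy's date and arithmetic checks on extracted values.
from datetime import date, timedelta

def validate_invoice(invoice: dict, today: date, tax_rate: float = 0.10) -> dict:
    """Return per-check booleans matching the validation block of the schema above."""
    inv_date = date.fromisoformat(invoice["invoiceDate"])
    return {
        # not in the future, not older than 90 days
        "dateValid": today - timedelta(days=90) <= inv_date <= today,
        # line items must sum to the stated subtotal (cent-level tolerance)
        "amountsMatch": abs(sum(invoice["lineItems"]) - invoice["subtotal"]) < 0.01,
        # tax must follow from the subtotal at the assumed rate
        "taxCorrect": abs(invoice["subtotal"] * tax_rate - invoice["tax"]) < 0.01,
    }
```

Checks like these make useful regression tests when you compare Omni's decisions against manual outcomes during a pilot.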

Example 2: Vendor Onboarding

When a new vendor is onboarded, a compliance officer:
  • Reviews the business registration certificate for company name, registration number, and incorporation date
  • Verifies the representative’s identity document
  • Manually searches the representative’s name in AML/sanctions databases
  • Cross-checks company details across documents
  • Records the results in a spreadsheet
Time per vendor: 20-30 minutes. Bottleneck: AML search is slow and manual lookup is error-prone.
Policy:
Verify vendor onboarding documents by:
1. Extracting company name, registration number, incorporation date, and representative name from the business registration certificate
2. Screening the representative's name against AML/sanctions watchlists
3. Cross-validating the representative name between the business registration and any submitted identity documents
4. Checking that the business registration is not expired
5. Approve if no AML matches found, all fields are consistent, and registration is valid; reject if AML screening returns a high-risk match; flag for manual review if AML returns a partial match or fields are inconsistent
Engines: Text Verifier - Glove + AML Search - Person
Output schema:
{
  "company": {
    "name": "string",
    "registrationNumber": "string",
    "incorporationDate": "string",
    "representativeName": "string"
  },
  "amlScreening": {
    "screened": "boolean",
    "matchFound": "boolean",
    "riskLevel": "string",
    "matchDetails": "string"
  },
  "validation": {
    "fieldsConsistent": "boolean",
    "registrationValid": "boolean"
  },
  "decision": {
    "result": "APPROVE | REJECT | MANUAL_REVIEW",
    "verificationStatus": "pending_review | approved | rejected",
    "reasons": ["string"]
  }
}
Result: AML screening is automated and instant. Clean cases are auto-approved. The compliance officer focuses only on flagged vendors.
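Cross-validating names between documents (step 3 of the policy) usually requires normalization, since registries and identity documents format names differently. A minimal sketch of the kind of comparison involved (the normalization rules and function names are illustrative, not how Omni matches internally):

```python
# Loose name comparison across documents: exact after normalization -> MATCH,
# shared tokens -> PARTIAL (flag for manual review, per the policy), else MISMATCH.
def normalize(name: str) -> str:
    """Lowercase, strip punctuation, and collapse whitespace for loose comparison."""
    cleaned = "".join(c if c.isalnum() or c.isspace() else " " for c in name)
    return " ".join(cleaned.lower().split())

def names_consistent(registration_name: str, id_name: str) -> str:
    a, b = normalize(registration_name), normalize(id_name)
    if a == b:
        return "MATCH"
    if set(a.split()) & set(b.split()):   # reordered or partially abbreviated names
        return "PARTIAL"
    return "MISMATCH"
```

Routing PARTIAL to manual review keeps borderline identities in front of a human, which matches the escalation criteria in the policy above.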

Example 3: Compliance Document Check

A compliance team reviews submitted regulatory documents:
  • Checks that all required sections are present (header, signature, dates, license numbers)
  • Verifies expiration dates are in the future
  • Confirms the document is addressed to the correct entity
  • Validates that reference numbers match internal records
Time per document: 10-15 minutes. Risk: Expired documents occasionally slip through during high-volume periods.
Policy:
Review the submitted compliance document by:
1. Extracting the document type, issuing authority, issue date, expiration date, license number, and entity name
2. Verifying that all required sections are present: header, body, signature block, and dates
3. Checking that the expiration date is in the future
4. Validating that the entity name matches the expected entity
5. Approve if all sections are present, the document is not expired, and entity names match; reject if the document is expired or missing required sections; flag for manual review if the entity name is a partial match
Engines: Text Verifier - Glove (validate completeness and field correctness)
Output schema:
{
  "document": {
    "type": "string",
    "issuingAuthority": "string",
    "issueDate": "string",
    "expirationDate": "string",
    "licenseNumber": "string",
    "entityName": "string"
  },
  "validation": {
    "allSectionsPresent": "boolean",
    "notExpired": "boolean",
    "entityNameMatch": "boolean"
  },
  "decision": {
    "result": "APPROVE | REJECT | MANUAL_REVIEW",
    "verificationStatus": "pending_review | approved | rejected",
    "reasons": ["string"]
  }
}
Result: Expired documents are caught automatically. Reviewers only handle edge cases where names partially match or sections are ambiguous.
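The completeness and expiry checks in this policy (steps 2-3) reduce to set arithmetic and a date comparison over the extracted values. A sketch, with section names taken from the policy above (the function name is illustrative):

```python
# Re-run the compliance policy's completeness and expiry checks on extracted values.
from datetime import date

REQUIRED_SECTIONS = {"header", "body", "signature_block", "dates"}

def check_document(sections_found: set, expiration: str, today: date) -> dict:
    """Return per-check booleans matching the validation block of the schema above."""
    return {
        "allSectionsPresent": REQUIRED_SECTIONS <= sections_found,
        "notExpired": date.fromisoformat(expiration) > today,
    }
```

Because the expiry check is pure date arithmetic, it is exactly the kind of rule that stops slipping through during high-volume periods once automated.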

Designing Effective Output Schemas

Your output schema determines what structured data you get back from each analysis. Design it with your downstream systems in mind. Every output schema should include these three blocks:
Block | Purpose | Example Fields
Extracted data | Raw values pulled from documents | vendorName, invoiceNumber, registrationDate
Validation results | Per-check pass/fail with reasons | amountsMatch, dateValid, fieldsConsistent
Decision | Final verdict and routing state | result, verificationStatus, reasons
If you omit the decision block, you will need to write your own decision logic based on the raw validation results. Including a decision block lets the AI agent make the final call based on your policy criteria.

Routing with verificationStatus

Each analysis exposes verificationStatus: approved, pending_review, or rejected. Branch your integrations on that enum.
verificationStatus | Recommended action
approved | Auto-accept and record in your systems
pending_review | Send to a human queue with the AI findings and extracted data
rejected | Deny or close the case per your policy
This approach means Omni does not replace your review team entirely. It clears straight-through cases and sends ambiguous or failed ones to the right queue with context.
Track how often each status appears and compare with manual outcomes. Adjust policy language or your output schema when pending_review or rejected rates drift from expectations.

Implementation pattern

status = analysis["decision"]["verificationStatus"]
if status == "approved":
    auto_accept(case)             # record the result in your systems
elif status == "pending_review":
    send_to_reviewer_queue(case)  # include the AI findings and extracted data
else:  # "rejected"
    run_denial_flow(case)         # or escalate manually, per your policy
Integrate this logic in your backend after retrieving analysis results from the Omni API; the handler functions shown here are placeholders for your own routing code.

Measuring Success

After deploying an Omni workflow, track these metrics to measure impact:
Metric | How to Measure | Target
Review time reduction | Average time per case, before vs. after | 50-80% reduction for auto-approved cases
Error rate | Percentage of incorrect decisions (compare Omni results against manual spot-checks) | Equal to or lower than the manual error rate
Throughput | Cases processed per day/week | Significant increase due to auto-processing
Escalation rate | Percentage of cases routed to human review | Should decrease over time as you refine policies
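The escalation-rate row reduces to simple counts over your analysis log. A sketch, assuming each logged record carries the verificationStatus field described earlier (the function name is illustrative):

```python
# Share of each verificationStatus across analysis results; watch the
# pending_review share over time to track your escalation rate.
from collections import Counter

def status_rates(records):
    """Map each verificationStatus to its share of all records, rounded to 3 places."""
    counts = Counter(r["verificationStatus"] for r in records)
    total = sum(counts.values())
    return {status: round(n / total, 3) for status, n in counts.items()}
```

Comparing these shares week over week shows whether policy refinements are actually moving cases out of the manual queue.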
Step 1: Pilot with a Single Workflow

Pick your highest-volume, most rule-based review process. Create an Omni workflow for it and run it in parallel with your manual process for 1-2 weeks.
Step 2: Compare Results

Check Omni’s decisions against your team’s manual decisions. Look for disagreements and investigate whether Omni or the human reviewer was correct.
Step 3: Refine the Policy

Based on the comparison, adjust your policy language, output schema, or score thresholds. Small changes often produce significant improvements.
Step 4: Expand Gradually

Once accuracy is validated, let Omni handle the process end-to-end. Then move on to automating additional review processes.

What’s Next?

Creating a Workflow

Step-by-step guide to building your first workflow in the dashboard.

Policy Writing Guide

Best practices for writing effective verification policies.