How to Find Gaps in an EASA SORA Application Before Submission


Preparing a SORA-based operational authorisation package is not just a matter of filling in one form. It is a document-control problem: the official EASA material, the application draft, CONOPS, operations manual, UAS evidence, risk tables, compliance matrices, mitigation records, pilot records, and national-authority notes all need to stay consistent.

Aviation.Bot is useful here because it treats the application as a workspace, not as a single uploaded PDF.

[Screenshot: Aviation.Bot SORA application gap review]

This article walks through a fictional but realistic example: NorthSea Aero Logistics, a commercial drone operator preparing a specific-category SORA application package for BVLOS infrastructure work in the Netherlands.

The demo uses public EASA and Dutch ILT source material, synthetic company files, and a real agent run. The workspace did not contain the final report upfront. Aviation.Bot created the pre-submission gap review during the recorded run.

This is preparation support, not regulatory advice. A qualified compliance lead and the relevant competent-authority process remain required before any real submission.

Why This Becomes A Workspace Problem

A SORA package has many moving parts:

  • the current SORA methodology and related AMC/GM material
  • the operational authorisation application form and current draft
  • the operator's concept of operations
  • UAS system, C2 link, containment, maintenance, and emergency-response evidence
  • risk registers and mitigation evidence matrices
  • remote pilot competence records
  • operations manual sections that must match the application
  • national or local authority constraints for the state of operation

The hard part is consistency. A single mismatch in altitude, operating area, mitigation owner, evidence ID, or document revision can trigger reviewer questions and rework later.

In the recorded demo, the generated gap review found exactly that kind of issue: the application draft used one altitude assumption, while the CONOPS and operational-volume evidence used another.
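That kind of altitude mismatch is a mechanical cross-document check. The sketch below illustrates the idea on hypothetical extracted text (in a real workspace the strings would come from the parsed PDF and DOCX files, and the pattern would cover more than altitudes):

```python
import re

# Hypothetical extracted text from three workspace documents; in practice
# these strings would come from parsing the actual PDF/DOCX files.
docs = {
    "application_draft": "Maximum operating altitude: 120 m AGL within the volume.",
    "conops": "Operations are planned up to 90 m AGL.",
    "operational_volume": "Upper limit of the operational volume: 90 m AGL.",
}

ALTITUDE = re.compile(r"(\d+)\s*m\s*AGL", re.IGNORECASE)

def altitude_assumptions(texts):
    """Collect the altitude figures each document states, keyed by document."""
    return {name: sorted({int(m) for m in ALTITUDE.findall(text)})
            for name, text in texts.items()}

def find_mismatches(texts):
    """Return document pairs whose stated altitude assumptions disagree."""
    found = altitude_assumptions(texts)
    names = list(found)
    return [(a, b) for i, a in enumerate(names) for b in names[i + 1:]
            if found[a] and found[b] and found[a] != found[b]]

mismatches = find_mismatches(docs)
# The application draft (120 m) disagrees with both the CONOPS and the
# operational-volume evidence (90 m).
```

A regex is only a first pass; the value is in surfacing the disagreement for a human owner, not in the extraction method itself.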

[Screenshot: Aviation.Bot SORA risk register]

What Goes Into The Workspace

For a real review, start from official sources and keep provenance outside the visible working folder: canonical URL, access date, direct download URL when applicable, file size, and checksum. In the demo pipeline, that source manifest is kept as a sidecar artifact rather than cluttering the user's workspace.
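A sidecar manifest like that is easy to generate. The following is a minimal sketch; the field names and the example URL are placeholders, not a required schema:

```python
import hashlib
import json
from datetime import date
from pathlib import Path

def manifest_entry(path, canonical_url, download_url=None):
    """Build one provenance record for a captured source file.

    Records canonical URL, access date, optional direct download URL,
    file size, and a SHA-256 checksum, as suggested in the article.
    """
    data = Path(path).read_bytes()
    return {
        "file": Path(path).name,
        "canonical_url": canonical_url,
        "download_url": download_url,
        "accessed": date.today().isoformat(),
        "size_bytes": len(data),
        "sha256": hashlib.sha256(data).hexdigest(),
    }

# Example: record a captured source PDF, then keep the manifest as a
# sidecar artifact outside the visible workspace folder.
Path("sora-annex.pdf").write_bytes(b"%PDF-1.4 example capture")
entry = manifest_entry("sora-annex.pdf", "https://example.org/sora-annex")
Path("source-manifest.json").write_text(json.dumps([entry], indent=2))
```

The checksum lets a later reviewer confirm that the file in the workspace is still the one that was originally captured.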

Useful EASA starting points include:

  • the Easy Access Rules for Unmanned Aircraft Systems (Regulations (EU) 2019/947 and 2019/945), which contain the SORA-related AMC/GM material
  • the operational authorisation application form (EASA Form 208) and its accompanying guidance

Because the fictional operator is based in the Netherlands, the demo also includes Dutch ILT pages captured as PDFs. This matters because local conditions can change what evidence is needed for BVLOS operations.

[Screenshot: Aviation.Bot EASA application draft]

Company Files To Add

After the official sources, add the operator files. A useful SORA review workspace normally includes:

  • current application form draft
  • CONOPS or equivalent operation description
  • UAS system description
  • draft operations manual
  • operational volume and ground risk buffer evidence
  • risk register
  • compliance matrix
  • mitigation evidence matrix
  • remote pilot competence matrix
  • maintenance, C2 link, containment, and emergency-response procedures
  • national or local-condition evidence where applicable

The demo uses synthetic DOCX, XLSX, and PDF files instead of Markdown because real SORA work normally happens across office documents and formal PDFs.

The Prompt Pattern

The visible demo prompt is deliberately bounded. It asks Aviation.Bot to compare a named document set and create a review artifact, not to "approve" an application.

Review this SORA application workspace. Start with the application draft, compare it with the SORA 2.5 annex, the form guidance, the CONOPS, risk register, compliance matrix, and supporting evidence. Identify missing evidence, inconsistent mitigations, national or local-condition gaps, and the highest-priority actions. Save the gap review in outputs as sora-application-gap-review.md.

The exact wording can change. The important pattern is:

  1. name the official sources and company files in scope
  2. ask for missing evidence and inconsistencies
  3. ask for a concrete review artifact
  4. keep the output bounded as preparation support
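The four-step pattern above can be captured as a small helper, so every review run uses the same bounded framing. This is an illustrative sketch, not an Aviation.Bot API:

```python
def build_review_prompt(official_sources, company_files, output_name):
    """Assemble a bounded gap-review prompt following the four-step pattern:
    name the scope, ask for gaps and inconsistencies, request a concrete
    artifact, and keep the framing as preparation support."""
    return (
        "Review this SORA application workspace.\n"
        f"Official sources in scope: {', '.join(official_sources)}.\n"
        f"Company files in scope: {', '.join(company_files)}.\n"
        "Identify missing evidence, inconsistent mitigations, national or "
        "local-condition gaps, and the highest-priority actions.\n"
        f"Save the gap review in outputs as {output_name}.\n"
        "This is preparation support, not an approval opinion."
    )

prompt = build_review_prompt(
    ["SORA 2.5 annex", "application form guidance"],
    ["application draft", "CONOPS", "risk register", "compliance matrix"],
    "sora-application-gap-review.md",
)
```

Naming the documents explicitly keeps the agent from wandering outside the intended scope, and the closing sentence keeps the output bounded.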

What The Agent Did In The Demo

The recorded run used a real model, not a scripted final answer. During the run, Aviation.Bot indexed the workspace, opened the application draft, inspected the official source PDFs, read the SORA annex and Form 208 material, checked the Dutch ILT captures, inspected Word documents and spreadsheets, searched for missing evidence files, and then wrote the gap review.

[Screenshot: Aviation.Bot SORA source document]

The final output is a pre-submission gap review. It is not an approval opinion. It gives the operator a practical list of items to review with a compliance owner.

The report flagged issues such as:

  • inconsistent altitude and operational-volume assumptions
  • missing KML/KMZ or equivalent location package evidence
  • missing adjacent-area population-density support
  • containment evidence ID mismatch and missing flight-test evidence
  • incomplete emergency-response revision and role alignment
  • operations manual maturity gaps
  • weak compliance-matrix traceability
  • missing insurance evidence for the ILT application context
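Several of these findings (the missing KML/KMZ package, the containment evidence ID mismatch) reduce to one mechanical question: does every evidence file referenced in a matrix actually exist on disk? A minimal sketch, assuming a CSV matrix with hypothetical `mitigation_id` and `evidence_file` columns (real matrices are often XLSX and would need a spreadsheet reader instead):

```python
import csv
from pathlib import Path

def missing_evidence(matrix_csv, evidence_dir):
    """List evidence files referenced in the matrix but absent on disk."""
    missing = []
    with open(matrix_csv, newline="") as fh:
        for row in csv.DictReader(fh):
            name = row["evidence_file"].strip()
            if name and not (Path(evidence_dir) / name).exists():
                missing.append((row["mitigation_id"], name))
    return missing

# Synthetic example: two referenced files, one of them absent.
Path("evidence").mkdir(exist_ok=True)
Path("evidence/containment-test.pdf").write_text("placeholder")
Path("matrix.csv").write_text(
    "mitigation_id,evidence_file\n"
    "M1,containment-test.pdf\n"
    "M2,flight-test-report.pdf\n"
)
gaps = missing_evidence("matrix.csv", "evidence")
# gaps -> [("M2", "flight-test-report.pdf")]
```

An agent run does this kind of check across all matrices at once; the point of the sketch is that each finding remains verifiable by hand.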

[Screenshot: Aviation.Bot SORA compliance matrix]

Why This Is Better Than A One-Off Chat Upload

Uploading a single PDF to a chat tool can help with reading. It does not solve the workspace problem.

For a SORA package, the useful workflow is:

  1. keep official sources and operator evidence in one folder
  2. index PDFs, Word files, spreadsheets, and notes together
  3. let the agent search and inspect the original source files
  4. create a reviewable output in the same workspace
  5. let a human owner verify each finding against the cited documents

Aviation.Bot is designed around that loop. The source documents remain visible, the generated report is just another workspace file, and the user can continue reviewing instead of copy/pasting between a PDF viewer, spreadsheet, file browser, and chat session.

How To Build This Workspace Yourself

  1. Install Aviation.Bot and create a new workspace folder for the application package.
  2. Download the relevant official EASA sources listed above and store them under a sources/easa/ folder.
  3. Add national or local authority material under a folder such as sources/naa/ when your operation depends on state-specific conditions.
  4. Add your own operator files under folders such as company/, risk/, and evidence/.
  5. Include the application draft, CONOPS, operations manual, risk register, compliance matrix, mitigation evidence, pilot competence records, UAS system description, and emergency-response material.
  6. Ask Aviation.Bot to compare the official sources with your company files and produce a gap review.
  7. Inspect every cited source and assign a human owner for each open action.

Do not stop at the AI output. Treat it as a review accelerant: a way to find inconsistencies and missing evidence, and to draft action lists, faster.

[Screenshot: Aviation.Bot generated SORA gap review caveat]

What To Avoid

Do not ask any AI tool to approve a SORA application. Do not rely on a generated answer without checking source documents. Do not use a workspace like this as a substitute for national requirements, privacy and data-protection review, insurance, liability, security, environmental constraints, or competent-authority guidance.

The useful promise is narrower and more practical: reduce the search tax, keep official sources and operator evidence together, and make the pre-submission review easier to inspect.

How Aviation.Bot Can Help

For SORA work, Aviation.Bot helps turn a scattered application package into a reviewable folder: EASA source PDFs, national-authority material, the application draft, CONOPS, operations manual sections, risk registers, evidence matrices, and generated gap reviews can all stay together.

You can choose the AI model or provider that fits your data boundary. Use a cloud model for maximum capability when policy allows it, or use a local/offline model when operational details, customer constraints, company policy, or EU GDPR concerns require tighter control. Aviation.Bot then adds the document workflow: indexing, source inspection, file-aware chat, generated reports, reviewable output files, and human approval before the result is used.

Learn more at aviation.bot.