Overview

A pharma software team maintained validation documentation for regulated modules by hand, pulling evidence from tickets, commits, and ad hoc test exports. Packages varied by release, traceability gaps triggered findings during internal reviews, and audit prep consumed cycles. Intelligex implemented a governed document automation flow that assembled validation packages from Jira, Git history, and test results, then routed them for electronic signatures in the existing Quality Management System (QMS). Audit trails became consistent and reproducible, approval cycles tightened, and change controls were simpler to justify—without replacing source tools or altering validated pipelines.

Client Profile

  • Industry: Pharmaceutical software supporting GxP workflows
  • Company size (range): Multi-product platform with regulated and nonregulated modules
  • Stage: Mature Jira/Git/CI and a QMS in place; validation packages assembled manually
  • Department owner: Product Management & R&D
  • Other stakeholders: QA/Validation, Regulatory/Compliance, Engineering, DevOps, Clinical/Manufacturing QA, Security/IT, Internal Audit

The Challenge

Validation deliverables—user and functional requirements, risk assessments, protocols, traceability matrices, execution reports, and approvals—were compiled from disparate sources. Product managers exported Jira epics and stories into spreadsheets, engineers captured screenshots of tests from CI tools, and QA pasted links to commits and build artifacts. Every release binder looked slightly different. Trace links from requirements to tests and defects broke when items were renamed, and change control boards spent time reconciling what changed and why before evaluating risk.

Evidence lived everywhere. Requirements sat in Jira, implementation and release notes were in Git commits and tags, automated suites generated results in CI, and deviations were logged in the QMS. Sign-offs often happened late because reviewers lacked a single, governed package with the right context. Internal reviewers and auditors asked for the same clarifications each cycle: which requirements applied, how they were tested, what failed, and where approvals were captured. Teams wanted a predictable path from intent to evidence to approval that respected existing tools and validation constraints under guidance such as ISPE GAMP 5 and the electronic records and signatures expectations of 21 CFR Part 11.

Why It Was Happening

Root causes were fragmentation and handoffs with no canonical model. Jira held requirements and defects with varying fields by team; Git tagged release candidates inconsistently; CI tools produced result files without durable links to requirement IDs; and the QMS recorded deviations and approvals in a way that was not easy to aggregate. With no shared traceability model across these systems, validation packages were rebuilt each release, and link rot crept in as items were edited or renamed.

Ownership was distributed. Product authored intent, Engineering shipped changes, QA/Validation compiled evidence, and Compliance reviewed. Each group optimized locally—tickets closed on time, code merged, tests green—but assembling proof that the right things were built and tested took extra effort. Hot fixes further complicated the picture when changes bypassed the full set of documents and had to be backfilled later.

The Solution

Intelligex delivered a document automation service that harvested requirements, code changes, and test evidence into a governed validation package, then routed it for electronic signatures in the QMS. The service mapped Jira items to a canonical requirement model, linked commits and tags from Git to those items, pulled automated and manual test results from CI and test management tools, and generated a traceability matrix, protocols, and execution reports. Human-in-the-loop review ensured the package reflected intent and risk classification before approvals were requested. All artifacts carried links back to sources and a full lineage, and nothing required a change to the validated operation of source systems.

  • Integrations: Jira for requirements, defects, and releases; Git providers such as GitHub, GitLab, or Bitbucket for commits, branches, and tags; CI systems like Jenkins or GitHub Actions for automated test results; test management tools (for example, Zephyr or Xray); QMS platforms such as ETQ Reliance, MasterControl, or TrackWise for approvals and deviations.
  • Traceability model: Canonical entities for requirement, risk, control, test case, execution, and deviation with stable IDs and effective dating. Mappings aligned Jira fields to validation constructs and preserved history through edits.
  • Evidence harvesting: Collected test logs, screenshots, and result summaries; associated them to requirement and risk IDs; captured environment and build metadata; and linked change requests and commits that implemented each requirement.
  • Package generation: Produced user and functional requirement extracts, risk assessments, protocols and reports, a bidirectional traceability matrix, deviation logs, and a release note with change summaries and impact. Embedded deep links to sources and file hashes for integrity.
  • Risk and impact checks: Highlighted requirements with elevated risk or unresolved deviations; flagged requirements without corresponding tests; and ensured effective dating aligned with release scope.
  • Approvals and signatures: Routed packages to designated roles in the QMS with electronic signatures aligned to Part 11 expectations. Exceptions required reason codes and expiring variances.
  • Audit and security: Immutable logs of extractions, mappings, edits, and approvals. Read-only access to source systems; all writes occurred in the QMS or document vault. Role-based permissions constrained who could assemble and approve packages.
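The canonical traceability model described above can be sketched in a few lines. This is a minimal illustration, not Intelligex's actual schema: the entity names, fields, and the `untested_requirements` check are hypothetical, chosen to show how stable IDs and effective dating make the "requirement without a corresponding test" flag mechanical rather than manual.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Hypothetical canonical entities; field names are illustrative only.
@dataclass(frozen=True)
class Requirement:
    req_id: str                    # stable ID that survives Jira renames and edits
    title: str
    risk_class: str                # e.g. "high", "medium", "low"
    effective_from: date
    effective_to: Optional[date] = None   # None = still in effect

@dataclass(frozen=True)
class TestExecution:
    exec_id: str
    test_case_id: str
    req_ids: tuple                 # requirements this execution provides evidence for
    passed: bool

def untested_requirements(reqs, execs, as_of):
    """Flag requirements in scope (by effective dating) with no passing execution."""
    covered = {r for e in execs if e.passed for r in e.req_ids}
    return [
        r.req_id for r in reqs
        if r.effective_from <= as_of
        and (r.effective_to is None or as_of <= r.effective_to)
        and r.req_id not in covered
    ]
```

A release-scoped coverage check like this is what turns the risk-and-impact bullet above into a deterministic gate: the package generator can refuse to route for signature while the list is non-empty.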

Implementation

  • Discovery: Mapped current validation flows by module. Collected representative Jira projects, Git workflows, build artifacts, and QMS templates. Identified document gaps, common traceability breaks, and audit findings from recent reviews.
  • Design: Defined the canonical traceability model and field mappings, package templates, and routing roles. Established evidence criteria, naming conventions, and effective-dating rules. Agreed on human-in-the-loop review steps and exception handling.
  • Build: Implemented connectors to Jira, Git, CI, and test management tools; built the harvesting and mapping services; configured package generators and QMS workflows for e-signatures; created dashboards for package status and trace coverage.
  • Testing/QA: Ran in shadow mode alongside manual compilation. Assembled packages for recent releases and compared results to existing binders and auditor feedback. Tuned mappings, templates, and routing thresholds. Included a review board with QA/Validation, Product, and Compliance.
  • Rollout: Activated package generation for one regulated module, then expanded by product area. Kept manual assembly as a controlled fallback during early cycles. Enabled QMS e-signature gating after teams confirmed package completeness.
  • Training/hand-off: Delivered role-based sessions for Product, Engineering, QA/Validation, and Compliance. Updated SOPs for validation documentation and change control. Transferred ownership of templates, mappings, and routing rules to QA/Validation under change control.

Results

Validation moved from document chasing to governed assembly. Requirements, changes, and test outcomes flowed into a standard package with a clear traceability matrix, so reviewers focused on risk and deviations instead of asking where evidence lived. Approval cycles tightened because approvers saw a consistent structure with live links to sources, and questions during internal and external reviews were answered from the same package rather than from memory.

Change control became easier to justify. Each package included a concise change summary tied to commits and requirements, with risk and impact noted from the same model. Deviations and exceptions were documented with lineage and approval, and effective dates aligned to the release scope. Jira, Git, CI, and the QMS remained the systems of record; the difference was an orchestration layer that turned everyday activity into a defensible validation trail.
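A commit-linked change summary of the kind described above can be derived mechanically when commit messages carry requirement IDs. The ID convention (`REQ-123`) and grouping logic below are illustrative assumptions, not the product's actual parser; the input would typically come from `git log` output between two release tags.

```python
import re

# Assumed ID convention for this sketch; real projects vary (e.g. Jira keys).
REQ_PATTERN = re.compile(r"\b(REQ-\d+)\b")

def change_summary(commits):
    """Group (sha, message) pairs by the requirement IDs found in their messages.

    Commits mentioning no requirement land under "UNLINKED" so they surface
    for review instead of silently dropping out of the change record.
    """
    summary = {}
    for sha, message in commits:
        ids = REQ_PATTERN.findall(message) or ["UNLINKED"]
        for rid in ids:
            summary.setdefault(rid, []).append((sha, message.splitlines()[0]))
    return summary
```

The "UNLINKED" bucket matters for change control: it is exactly the set of commits a board would otherwise have to reconcile by hand.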

What Changed for the Team

  • Before: Validation binders were assembled by hand from exports and screenshots. After: Packages were generated from Jira, Git, and test results with governed templates.
  • Before: Traceability broke when items were edited. After: A canonical model preserved history and maintained links through changes.
  • Before: Approvals arrived late with missing context. After: QMS e-signature workflows routed complete packages with role-based gates.
  • Before: Change control boards reconciled what changed. After: Packages carried commit-linked change summaries and impact checks.
  • Before: Audit questions required manual reconstruction. After: Packages included live links, file hashes, and logs for straightforward evidence.
  • Before: Hot fixes were backfilled ad hoc. After: Exceptions were documented with reason codes and expirations in the same flow.

Key Takeaways

  • Build a canonical trace model; consistent IDs and effective dating keep requirements, tests, and changes aligned.
  • Assemble from source systems; harvest Jira, Git, and CI rather than copying into spreadsheets.
  • Generate, then review; automate package creation and keep humans in the loop for risk and exceptions.
  • Route approvals in the QMS; e-signatures and role-based gates anchor compliance without new tools.
  • Preserve lineage and integrity; links, hashes, and logs make audits and change justifications straightforward.
  • Integrate, don’t replace; leave Jira, Git, CI, and the QMS as systems of record and add governance around them.

FAQ

What tools did this integrate with? The service read requirements and defects from Jira, commits and release tags from Git providers such as GitHub, GitLab, or Bitbucket, and automated test outputs from CI systems like Jenkins or GitHub Actions and from test management tools such as Zephyr or Xray. Packages and approvals flowed through the existing QMS (for example, ETQ Reliance, MasterControl, or TrackWise).

How did you handle quality control and governance? Templates and mappings lived under change control. Evidence criteria, naming conventions, and effective-dating rules were documented and reviewed. Packages entered a human-in-the-loop review before routing for e-signature, and exceptions required reason codes. All extractions, mappings, edits, and approvals were immutably logged. Work aligned to validation guidance in ISPE GAMP 5 and to the electronic records and signatures expectations of 21 CFR Part 11.
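One common way to make such logs tamper-evident, sketched here purely as an illustration of the technique rather than a description of the product's storage, is an append-only hash chain: each record includes the hash of its predecessor, so editing any earlier entry breaks every later link.

```python
import hashlib
import json
from datetime import datetime, timezone

GENESIS = "0" * 64  # sentinel hash for the first record

def append_event(log, actor, action, detail):
    """Append a tamper-evident record that chains to the previous record's hash."""
    prev_hash = log[-1]["hash"] if log else GENESIS
    record = {
        "ts": datetime.now(timezone.utc).isoformat(),
        "actor": actor,
        "action": action,
        "detail": detail,
        "prev": prev_hash,
    }
    record["hash"] = hashlib.sha256(
        (prev_hash + json.dumps(record, sort_keys=True)).encode()
    ).hexdigest()
    log.append(record)
    return log

def verify_chain(log):
    """Recompute every hash; any edit to an earlier entry invalidates the chain."""
    prev = GENESIS
    for rec in log:
        body = {k: v for k, v in rec.items() if k != "hash"}
        if rec["prev"] != prev:
            return False
        expected = hashlib.sha256(
            (prev + json.dumps(body, sort_keys=True)).encode()
        ).hexdigest()
        if rec["hash"] != expected:
            return False
        prev = rec["hash"]
    return True
```

Auditors can then verify integrity by recomputation rather than by trusting the log's custodian.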

How did you roll this out without disruption? The automation ran in shadow mode initially, assembling packages for active releases while the manual approach remained in place. Differences were reconciled with QA/Validation and Compliance, templates were tuned, and routing thresholds were set. After stable cycles, e-signature gating in the QMS was enabled for selected modules, then expanded.

What documents were generated and how was traceability handled? The package included requirement extracts, risk assessments, test protocols and execution reports, a bidirectional traceability matrix, deviation logs, and a release change summary. Traceability was based on a canonical model that linked each requirement to implementing commits and to test cases and executions, with effective dating and history preserved through edits.
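The bidirectional matrix described above is, at its core, two indexes built from the same set of links. The sketch below is a simplified assumption (requirement-to-test pairs only, with hypothetical IDs); the real model also spans risks, controls, executions, and deviations.

```python
from collections import defaultdict

def build_matrix(links):
    """Build forward (requirement -> tests) and backward (test -> requirements)
    maps from (req_id, test_id) link pairs."""
    forward, backward = defaultdict(set), defaultdict(set)
    for req_id, test_id in links:
        forward[req_id].add(test_id)
        backward[test_id].add(req_id)
    return dict(forward), dict(backward)

def orphans(forward, all_reqs, backward, all_tests):
    """Gaps in either direction: untested requirements and unlinked tests."""
    return (
        sorted(set(all_reqs) - forward.keys()),
        sorted(set(all_tests) - backward.keys()),
    )
```

Checking both directions is what makes the matrix "bidirectional" in practice: it catches requirements with no tests and tests that trace to no requirement in the release scope.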

How were electronic signatures managed? Approvals occurred in the QMS using established roles and electronic signatures. The service created approval tasks with references to the assembled package and embedded links to source evidence. Signature records, comments, and any variances were captured in the QMS to maintain compliance with Part 11 expectations.

Did this require changes to validated pipelines or source tools? No. Connectors operated in read-only mode for Jira, Git, and CI. All approvals and document control remained in the QMS. The orchestration added governed assembly and review steps around existing tools and workflows, preserving validated states while improving consistency.

You need a similar solution?

Get a FREE Proof of Concept & Consultation

No Cost, No Commitment!