Overview
Academic program strategy at a university dragged because committees debated stale data and accreditation constraints hidden in files. Enrollment projections, peer comparisons, and policy language lived in separate repositories, so meetings stalled on interpretation rather than options. Intelligex implemented a retrieval-augmented generation (RAG) layer across accreditation documents, peer catalogs, and internal enrollment projections with role-based access, and added an approval workflow for scenario summaries. Committees aligned faster, with fewer misinterpretations, clear citations to sources, and a documented trail from scenario to decision.
Client Profile
- Industry: Higher education (public research university)
- Company size (range): Multi-college, multi-campus institution
- Stage: Active portfolio refresh and accreditation cycle planning
- Department owner: Strategy, Analytics & Executive Leadership (Office of the Provost / Institutional Research)
- Other stakeholders: Deans and Department Chairs, Accreditation & Compliance, Enrollment Management, Finance, Faculty Senate, IT/Security, Library & Knowledge Management
The Challenge
Program review committees needed a coherent picture of demand, outcomes, costs, and accreditation constraints. In practice, accreditation standards and prior findings were buried in PDFs, peer program details were scattered across public catalogs, and enrollment projections lived in the institutional research warehouse. Materials were emailed before meetings, and members brought their own extracts. Debates centered on which clause applied or whether projections were current, slowing decisions on launches, revisions, or teach-outs.
The institution relied on SharePoint and Box for documents, Tableau for dashboards, and a warehouse for projections, but there was no cross-repository search with permissions intact or a way to summarize options with citations. Accreditation staff were pulled into late reviews to correct interpretations. Faculty and Deans asked for an approach that met them in their current tools, respected access rules, and produced scenario summaries that were easy to verify.
Why It Was Happening
Information and identity were fragmented. Accreditation language, prior self-studies, and conditions lived in document stores with inconsistent naming; peer catalogs used different formats; and projections followed separate refresh calendars. Searches worked within each system but not across them, and nothing bound a decision slide to the exact clause or dataset version behind it.
Governance arrived at the end. Scenario memos were drafted without required citations or review gates, and sensitive accreditation findings circulated beyond intended audiences. Without role-based access and an approval path, committees spent meetings reconstructing context and correcting misunderstandings rather than weighing trade-offs.
The Solution
We implemented a permissions-aware RAG service that indexed accreditation PDFs, peer program catalogs, and enrollment projections, then generated scenario summaries with inline citations and role-appropriate detail. Azure Cognitive Search provided cross-repository indexing; the RAG layer followed a "use your data" pattern to assemble answer cards with links to source passages and current projections. A lightweight approval workflow routed scenario summaries to Accreditation and Institutional Research for sign-off before committee use. Nothing was replatformed: SharePoint and Box remained the document sources, Tableau remained the analytics layer, and single sign-on governed access throughout.
- Cross-repository indexing of SharePoint and Box libraries with metadata and permission trimming (Azure Cognitive Search)
- Retrieval-augmented generation for summaries with citations to accreditation standards, prior reports, and peer catalog entries (Azure OpenAI: Use your data)
- Integration of enrollment projections from the warehouse into certified Tableau views embedded in summaries (Tableau)
- Role-based access aligned to campus identity groups so search and summaries respect entitlements (Okta Groups)
- Policy filters that block redistribution of restricted passages and watermark internal excerpts
- Scenario summary template with required citations, assumptions, and links to certified dashboards
- Approval workflow for Accreditation and Institutional Research with comments and tracked edits (Power Automate Approvals)
- Definitions catalog for program viability metrics maintained by Institutional Research; peer reference alignment to IPEDS taxonomies
- Audit trail capturing data versions, sources cited, approvers, and final language used
- Reference library links to accreditation bodies for context, such as CHEA
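The permission-trimming bullet above hinges on translating a user's identity groups into a search filter. A minimal sketch of that step, following the security-trimming pattern documented for Azure Cognitive Search (the `group_ids` field name and Okta group IDs are illustrative, not the deployed configuration):

```python
def permission_filter(user_group_ids):
    """Build an OData filter restricting results to documents whose
    group_ids field overlaps the caller's identity groups. This follows
    the security-trimming pattern documented for Azure Cognitive Search;
    field and group names here are illustrative."""
    if not user_group_ids:
        return None  # no groups resolved: deny the query outright
    joined = ",".join(user_group_ids)
    return f"group_ids/any(g: search.in(g, '{joined}'))"

# Example: a dean resolved to two hypothetical Okta groups
print(permission_filter(["okta-deans", "okta-provost-office"]))
# group_ids/any(g: search.in(g, 'okta-deans,okta-provost-office'))
```

Passing this string as the query's `filter` parameter is what keeps restricted accreditation findings out of both search results and any RAG summary built from them.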
Implementation
- Discovery: Cataloged accreditation standards, self-study archives, action letters, and peer program sources. Mapped Tableau views and warehouse refresh cycles for enrollment projections. Reviewed committee packets to identify recurring misinterpretations and missing citations. Aligned on access boundaries with Accreditation and Legal.
- Design: Defined indexing scope, metadata schema, and tagging for topic, college, degree level, and accreditation domain. Authored prompt and citation patterns for RAG. Designed the scenario summary template with required fields, and the approval workflow and role mappings. Documented policy filters for restricted content.
- Build: Connected SharePoint and Box libraries to Azure Cognitive Search; configured indexes and enrichments. Implemented the RAG service to assemble summaries with citations and embedded Tableau views. Built the scenario approval flow with comments and tracked changes. Stood up the definitions catalog and reference mappings to IPEDS.
- Testing and QA: Ran historical cases through the service to compare summaries with prior committee decisions. Validated permission trimming and citation accuracy. Stress-tested policy filters for restricted content. Verified that embedded Tableau views referenced certified projections and correct refresh windows.
- Rollout: Launched read-only search and draft summaries to a pilot group of Deans and Accreditation staff. After tuning, enabled the approval workflow and required cited summaries for committee packets. Expanded indexing to additional colleges and specialized accreditors.
- Training and hand-off: Delivered quick guides for committee secretaries on assembling packets, for Accreditation on reviewing citations, and for Institutional Research on maintaining the definitions catalog. Established a human-in-the-loop review for edge cases and a cadence to refresh tags and sources each term.
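The build and rollout steps above required each scenario summary to carry citations, assumptions, and a certified dashboard link before it could enter the approval flow. A minimal sketch of that gate check (field names are illustrative, not the actual template schema):

```python
# Required fields a scenario summary must carry before routing to
# Accreditation and Institutional Research. Names are illustrative.
REQUIRED_FIELDS = ("citations", "assumptions", "dashboard_link", "data_version")

def missing_fields(summary: dict) -> list:
    """Return the required fields still absent from a draft summary.
    An empty result means the draft is ready for the approval workflow."""
    return [f for f in REQUIRED_FIELDS if not summary.get(f)]

draft = {
    "citations": ["Standard 4.1; institutional self-study, 2022"],
    "assumptions": "Transfer enrollment held flat across the projection window",
}
print(missing_fields(draft))
# ['dashboard_link', 'data_version']
```

Running a check like this at drafting time is what moved clarifications out of the meeting itself, as described in the Results section.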
Results
Committees opened packets with scenario summaries that cited the exact clauses and peer references, and displayed current projections from certified dashboards. Clarifications moved from the middle of the meeting to the drafting phase, and debates centered on options and trade?offs rather than source hunts. Accreditation staff saw interpretations early and corrected them before circulation.
Alignment improved because every recommendation linked back to the same governed sources and metric definitions. Faculty, Deans, and Institutional Research referenced common language and views, which reduced rework on packet revisions. The approval workflow created a durable record of assumptions and edits, simplifying post?meeting follow?ups and audit requests.
What Changed for the Team
- Before: Packet prep relied on manual searches and unpublished interpretations. After: A permissions-aware search and RAG layer produced cited summaries from governed sources.
- Before: Projections and standards had mismatched refresh cycles. After: Summaries embedded certified Tableau views and stamped data versions.
- Before: Accreditation weighed in late. After: An approval step captured Accreditation and Institutional Research review before committee use.
- Before: Misinterpretations surfaced during meetings. After: Policy filters and citations framed context upfront with links to primary sources.
- Before: Decisions lacked a traceable rationale. After: An audit log tied sources, definitions, and approvals to each scenario.
Key Takeaways
- Unify accreditation language, peer references, and projections under a permissions-aware index so committees start from shared facts.
- Use retrieval-augmented generation with citations; summaries are only useful if they point back to primary sources.
- Embed certified dashboards and stamp data versions to avoid stale or conflicting views.
- Add a light approval gate with Accreditation and Institutional Research so interpretations are vetted before meetings.
- Maintain a definitions catalog aligned to common taxonomies to keep comparisons consistent across colleges.
FAQ
What tools did this integrate with?
We indexed SharePoint and Box libraries with Azure Cognitive Search, generated cited summaries using a "use your data" pattern in Azure OpenAI, embedded certified Tableau views for projections, and enforced role-based access via Okta groups. Scenario approvals ran through Power Automate Approvals. Accreditation context referenced bodies such as CHEA.
How did you handle quality control and governance?
Citations linked directly to clauses in accreditation PDFs and to certified Tableau views with version stamps. Policy filters blocked redistribution of restricted content and watermarked internal excerpts. The definitions catalog aligned metric names and cohort rules, and the approval workflow required Accreditation and Institutional Research sign-off on scenarios with tracked edits and comments. An audit log recorded sources, prompts, approvers, and final language.
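The audit log described above can be pictured as one record per approved scenario, capturing sources, data versions, approvers, and the final wording. A minimal sketch, assuming a flat record shape (all identifiers and paths are hypothetical):

```python
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone

@dataclass
class AuditRecord:
    """One audit-trail entry per approved scenario summary.
    The shape and all sample values are illustrative."""
    scenario_id: str
    sources: list        # clauses and catalog entries cited
    data_versions: dict  # dataset name -> version stamp
    approvers: list      # sign-off roles
    final_text: str      # language that reached the committee packet
    recorded_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

rec = AuditRecord(
    scenario_id="2024-eng-ms-revision",  # hypothetical scenario
    sources=["accreditation/standard-4.1.pdf", "peers/ipeds-14.0901.json"],
    data_versions={"enrollment_projection": "2024-10-01"},
    approvers=["accreditation-liaison", "ir-director"],
    final_text="Recommend phased revision with teach-out safeguards.")
print(asdict(rec)["scenario_id"])
# 2024-eng-ms-revision
```

Serializing records like this is what makes post-meeting audit requests a lookup rather than a reconstruction.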
How did you roll this out without disruption?
We began with read-only search and draft summaries while committees continued their existing process. After validating relevance, permissions, and citation integrity, we enabled the approval workflow and required cited summaries in packets. Existing repositories, dashboards, and identity systems remained; the solution orchestrated search, summarization, and governance around them.
How were peer catalogs and accreditation sources selected?
Peer references were aligned to IPEDS taxonomies and institutional peer lists, and accreditation documents included regional standards, recent action letters, and relevant specialized accreditor materials. Indexing scope and tags were reviewed with Accreditation and Deans each term to stay current.
How did you ensure sensitive accreditation findings didn't leak?
Role-based access limited who could search restricted collections, and the RAG layer generated summaries only from sources the user could access. Restricted content displayed as masked excerpts with links to request access. Policy filters prevented copying of non-shareable passages, and approvals were required before scenarios left the restricted audience.
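The masked-excerpt behavior described above can be sketched as a simple display-time check: entitled users see the passage, everyone else sees a stub pointing to the access-request process. A minimal sketch (the function and request URL are illustrative, not the deployed implementation):

```python
def render_excerpt(passage: str, entitled: bool, request_url: str) -> str:
    """Return the passage for entitled users; otherwise a masked stub
    with a link to request access. Names and URL are illustrative."""
    if entitled:
        return passage
    return f"[Restricted excerpt - request access: {request_url}]"

# An accreditation finding shown to a user outside the restricted group
print(render_excerpt(
    "Condition noted on faculty credentials for the graduate program.",
    entitled=False,
    request_url="https://intranet.example.edu/access-request"))
# [Restricted excerpt - request access: https://intranet.example.edu/access-request]
```

Keeping the check at render time, on top of the permission-trimmed index, means a restricted clause is never present in a summary a user can copy.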