Overview

A hospitality technology vendor’s knowledge base had grown unevenly across tools and formats, making search unreliable for customers and Customer Success Managers (CSMs). Articles varied in structure and naming, and the most relevant content was often buried or outdated. We partnered with Marketing & Customer Engagement, Support, and Product to replatform the help center to Zendesk Guide, introduce structured article templates and taxonomy, layer an AI search experience using retrieval-augmented generation (RAG) from a verified corpus, and institute a publish and review workflow. Customers and CSMs began finding accurate answers quickly, escalations tapered, and teams spent less time chasing links and more time resolving real issues.

Client Profile

  • Industry: Hospitality technology (property operations and guest experience)
  • Company size (range): Mid-market
  • Stage: Established SaaS with regular feature releases
  • Department owner: Marketing & Customer Engagement
  • Other stakeholders: Support/Helpdesk, Customer Success, Product Management, Technical Writing/Docs, IT/Security, Legal/Compliance

The Challenge

The existing knowledge base had accreted over time across a legacy portal and scattered document repositories. Some articles were written as long-form guides, others as quick fixes, and many referenced products by deprecated names. Search returned a mix of outdated release notes, duplicate answers, and mismatched content because metadata and titles lacked consistency. Customers resorted to submitting tickets for basic questions. CSMs kept private notes and links because they could not trust what surfaced.

Tooling and process made matters worse. The legacy site had limited support for structured fields, versioning, or workflow. Anyone with access could publish, and no one had a clear view of what had gone stale. The product changed frequently, but articles were not updated in step with releases. Budget and time constraints ruled out a complete rewrite. The vendor needed a path that preserved useful content, introduced structure and governance, and improved findability without forcing teams to learn new systems all at once.

Why It Was Happening

The root issues were fragmentation and a lack of shared standards. Content lived in multiple places with overlapping ownership. Authors used their own templates—or none at all—which led to inconsistent titles, steps, prerequisites, and screenshots. Tags were free-form, so search relevance signals were weak. Without a publishing workflow, articles drifted out of date and stayed there.

Search behaved like a generic keyword match across an ungoverned corpus. It had no way to prioritize current, verified answers over archived or speculative content. When releases shipped, there was no coordinated signal to refresh related articles, and no guardrails to prevent out-of-scope drafts from appearing in the public help center. In short, the system rewarded volume over clarity.

The Solution

We aligned content, search, and governance around a single help center without adding tool sprawl. The team migrated the knowledge base to Zendesk Guide to take advantage of structured templates, article states, and roles. We implemented standardized article templates with required fields and a clear taxonomy, and created a “verified” workflow so only reviewed content entered the AI search index. The AI search used retrieval-augmented generation (RAG) to summarize and cite approved articles, giving customers natural-language answers grounded in the official knowledge base rather than open-ended model output.

  • Replatformed to Zendesk Guide with structured templates and section hierarchy
  • Article templates with required fields (audience, product/module, version/release, steps, expected behavior, screenshots, related articles)
  • Controlled taxonomy (labels, product names, synonyms) with an editorial style guide
  • Role-based publishing workflow with draft, review, approve, and publish states
  • AI search layer using retrieval-augmented generation that draws only from verified articles, with citations for transparency
  • Indexing pipeline that excludes archived or unverified content and respects article permissions
  • Search tuning: synonyms, boosted fields, and query intent handling for common hospitality terms
  • Integration with Zendesk Support for ticket deflection and with the CSM workspace to surface suggested answers during calls
  • Analytics dashboard for search queries, zero-result reports, article health, and approval cycle time
  • Human-in-the-loop review for high-impact topics and release-adjacent updates
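
The verified-only indexing and grounded retrieval described above can be sketched as follows. This is a minimal illustration, not the production pipeline: the article store, field names, and word-overlap scoring are assumptions, and a real deployment would use an embedding model and the Zendesk API, with an LLM call at the generation step.

```python
from dataclasses import dataclass

@dataclass
class Article:
    id: str
    title: str
    body: str
    state: str  # e.g. "draft", "review", "verified", "archived"

def build_index(articles):
    # Guardrail: only articles in the "verified" state enter the AI search index.
    return [a for a in articles if a.state == "verified"]

def retrieve(index, query, k=3):
    # Toy relevance signal: word overlap. Production would use embeddings.
    q = set(query.lower().split())
    scored = [
        (len(q & set((a.title + " " + a.body).lower().split())), a)
        for a in index
    ]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [a for score, a in scored[:k] if score > 0]

def answer_with_citations(index, query):
    hits = retrieve(index, query)
    if not hits:
        return None  # nothing in the verified corpus: fall back to keyword search
    # The generation step (LLM call) is elided; the model would be instructed
    # to answer only from `context` and cite the bracketed article IDs.
    context = "\n\n".join(f"[{a.id}] {a.title}\n{a.body}" for a in hits)
    return {"context": context, "citations": [a.id for a in hits]}
```

Because archived and draft articles never enter the index, stale content cannot leak into generated answers, and every response carries citations back to its sources.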

Implementation

  • Discovery: Audited the existing help center, internal docs, and search logs to map top intents and common failure patterns. Interviewed Support, CSMs, and Product to define must-keep content and identify deprecations. Cataloged sensitive topics requiring restricted access.
  • Design: Defined the content model and templates, including required fields and acceptable values for product names and modules. Designed the Zendesk Guide IA (categories, sections) and labels. Mapped the publish workflow and review roles. Outlined the RAG index scope and guardrails.
  • Build: Stood up a Guide instance with the new hierarchy and templates. Migrated articles in batches, normalizing titles, metadata, and screenshots. Implemented the indexing job for AI search to include only “verified” articles and attach source citations. Tuned keyword search with synonyms and field boosts.
  • Testing/QA: Ran editorial QA on migrated content; verified permissions and visibility. Red-teamed the AI search with tricky queries, ensuring responses stayed within the knowledge base and fell back to extractive snippets when confidence was low. Included human-in-the-loop review on sensitive topics.
  • Rollout: Launched the new help center in phases. Kept the legacy site read-only with redirects for popular URLs. Enabled the AI search widget for a limited audience first, monitored analytics and feedback, then expanded once accuracy and coverage proved stable.
  • Training/hand-off: Trained authors and reviewers on templates, labels, and the approval workflow. Provided CSMs and Support with guidance on using suggested answers within Zendesk Support and the CSM workspace. Documented governance, style guidelines, and escalation paths.
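
The search-tuning step above (synonyms plus boosted fields) can be illustrated with a toy scorer. The synonym map, term lists, and boost value are hypothetical; in practice these live in the search platform's configuration rather than application code.

```python
# Hypothetical synonym map for common hospitality query terms.
SYNONYMS = {
    "wifi": ["wi-fi"],
    "checkin": ["check-in"],
}

def expand_query(query):
    # Add known synonyms so "wifi" also matches articles that say "wi-fi".
    terms = query.lower().split()
    expanded = list(terms)
    for t in terms:
        expanded.extend(SYNONYMS.get(t, []))
    return expanded

def score_article(article, terms, title_boost=3.0):
    # Field boosting: a match in the title counts more than a match in the body.
    title = set(article["title"].lower().split())
    body = set(article["body"].lower().split())
    score = 0.0
    for t in terms:
        if t in title:
            score += title_boost
        elif t in body:
            score += 1.0
    return score
```

The same idea applies whether the scorer is a few lines of Python or a managed search engine's synonym and boost settings: normalize vocabulary first, then weight the fields that carry the strongest relevance signal.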

Results

Customers could phrase questions in natural language and receive grounded answers with clear steps and links, instead of wading through mismatched search hits. Because the AI drew only from reviewed, current articles, responses stayed on-message and traceable. When content was missing, analytics surfaced the gap and the editorial team filled it using the standard template, keeping quality consistent as the library grew.

CSMs and Support spent less time hunting for canonical explanations and more time solving real issues. Escalations that previously stemmed from ambiguous or outdated docs were deflected by accurate self-service and in-context suggestions. Product teams benefited as well; releases were paired with targeted article updates through the publish workflow, and the help center reflected changes without last-minute scrambles.

What Changed for the Team

  • Before: Articles varied wildly in format and tone. After: Templates enforced consistent structure and terminology, with required fields like prerequisites and expected behavior.
  • Before: Anyone could publish at any time. After: A clear draft-review-approve workflow ensured content quality and accountability.
  • Before: Search surfaced a mix of outdated and duplicate content. After: AI search with RAG returned grounded answers sourced from verified articles, with citations.
  • Before: CSMs kept private notes to compensate for gaps. After: A single, trusted help center served both customers and internal teams.
  • Before: Releases outpaced documentation. After: Content updates were tied to release cues, with human-in-the-loop review on high-impact changes.

Key Takeaways

  • Structure matters: consistent templates and taxonomy are prerequisites for effective search.
  • Ground AI in verified content using retrieval-augmented generation; do not let models improvise beyond the help center.
  • Governance is as important as tooling: institute clear roles, states, and approvals so content stays current.
  • Integrate knowledge where people work—Support, CSM tools, and the help center—rather than creating a new destination to check.
  • Use search analytics to drive the editorial backlog; zero-result queries point directly to content gaps.
  • Phase migrations with redirects and fallbacks to avoid disrupting customers while improving quality.
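
The takeaway about zero-result queries can be made concrete with a small report over an exported search log. The log format here is an assumption; most search analytics tools export query/result-count pairs in some form.

```python
from collections import Counter

def zero_result_report(search_log, top_n=5):
    # search_log: iterable of (query, result_count) pairs from an analytics export.
    # Normalize queries so "Night Audit" and "night audit" count as one gap.
    misses = Counter(q.strip().lower() for q, count in search_log if count == 0)
    return misses.most_common(top_n)
```

Ranking the misses by frequency turns raw logs into an editorial backlog: the most-searched query with no answer is the next article to write.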

FAQ

What tools did this integrate with?
The new help center runs on Zendesk Guide, integrated with Zendesk Support for suggested answers and ticket deflection. The AI search layer indexes verified Guide articles only. For customer-facing teams, the same search experience was embedded in the CSM workspace and the customer portal. Identity and permissions continued to flow from the existing single sign-on provider.

How did you handle quality control and governance?
We implemented structured templates, a controlled taxonomy, and a publish workflow with defined reviewer roles. Only articles in a “verified” state entered the AI search index. Sensitive topics used restricted sections. We monitored search analytics, ran periodic content reviews, and required human-in-the-loop approval for high-impact topics and release-adjacent changes.
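
One way to picture this governance model is as a small state machine over article states and roles. The transition table and role names below are illustrative assumptions, not Zendesk's actual permission model.

```python
# Hypothetical transition table for the draft -> review -> approve -> publish workflow.
TRANSITIONS = {
    "draft": {"review"},
    "review": {"draft", "approved"},  # reviewer can approve or send back
    "approved": {"published"},
    "published": {"draft"},           # reopen for updates after a release
}

# Which role may move an article INTO each target state.
ROLES_FOR_TARGET = {
    "review": {"author"},
    "approved": {"reviewer"},
    "published": {"editor"},
    "draft": {"author", "reviewer", "editor"},
}

def advance(state, target, role):
    # Reject transitions the workflow does not allow.
    if target not in TRANSITIONS.get(state, set()):
        raise ValueError(f"illegal transition: {state} -> {target}")
    # Reject transitions the actor's role is not entitled to make.
    if role not in ROLES_FOR_TARGET[target]:
        raise PermissionError(f"role '{role}' may not move an article to '{target}'")
    return target
```

Encoding the workflow this way makes the accountability explicit: no path exists from draft straight to published, so nothing reaches the AI index without passing through review.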

How did you roll this out without disruption?
The rollout was phased. We migrated content in batches, kept the legacy site in read-only mode with redirects for popular URLs, and enabled the AI search widget for a limited audience first. Support and CSMs were briefed ahead of each phase, and a feedback loop captured issues quickly.

How does the AI search avoid hallucinations?
Responses are generated with retrieval-augmented generation from the verified corpus only, and each answer includes citations back to the source article. When a query falls outside the knowledge base or confidence is low, the system returns extractive snippets or routes to traditional search rather than synthesizing unsupported guidance.
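
This fallback behavior can be sketched as a thin dispatch layer over the retriever. The hit format, the `generator` callable, and the confidence threshold are assumptions for illustration; in practice the threshold would be tuned against a set of labeled queries.

```python
def respond(query, retriever, generator, confidence_threshold=0.55):
    # retriever(query) -> list of {"id", "snippet", "score"} dicts, best first.
    # generator(query, hits) -> grounded LLM answer (elided here).
    hits = retriever(query)
    if not hits:
        # Out of scope for the verified corpus: route to traditional search.
        return {"mode": "search"}
    if hits[0]["score"] < confidence_threshold:
        # Low confidence: show extractive snippets, do not synthesize.
        return {"mode": "snippets", "snippets": [h["snippet"] for h in hits]}
    return {
        "mode": "generated",
        "answer": generator(query, hits),
        "citations": [h["id"] for h in hits],
    }
```

The key design choice is that generation is the last resort, not the default: the system only synthesizes when retrieval confidence clears the bar, which is what keeps answers inside the knowledge base.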

What was migrated and what was left behind?
We migrated canonical product how-tos, troubleshooting guides, and policy pages after normalizing them to the new templates. Obsolete release notes, duplicated content, and author-only drafts were archived. Internal-only materials moved to restricted sections with appropriate permissions instead of being published publicly.

Need a similar solution?

Get a FREE Proof of Concept & Consultation. No cost, no commitment!