Overview

A global chemicals manufacturer was seeing uneven sourcing outcomes because buyers ran events differently based on personal preference. Intelligex standardized sourcing playbooks inside the existing eSourcing platform, introduced category-specific templates and scoring matrices, and added analytics for structured bid comparisons. Evaluations became defensible, collaboration with stakeholders was smoother, and the handoff to Legal moved without rework, all without replacing core systems.

Client Profile

  • Industry: Chemicals (specialty and commodity blends)
  • Company size: Multi-region operations with centralized procurement
  • Stage: Established procurement function with distributed category teams
  • Department owner: Procurement, Supply Chain & Logistics
  • Other stakeholders: Legal/Contracting, Finance, Quality, Health Safety & Environment (HSE), Manufacturing plants, IT/Enterprise Applications

The Challenge

Sourcing events were set up and evaluated differently by each buyer. Some used rich questionnaires for supplier capability and compliance; others focused only on price. Weightings varied by preference, and critical terms like delivery conditions, payment terms, and packaging specifications were handled inconsistently. Bid comparisons lived in personal spreadsheets, making it difficult to trace decisions or bring stakeholders into a consistent evaluation process.

In the chemicals space, supplier compliance is not optional. Regulatory regimes such as REACH and TSCA place specific obligations on manufacturers and their supply base. Without standardized questions and scoring, compliance sign-offs were delayed and sometimes missed during early evaluations. Legal teams received handoffs that were incomplete or ambiguous, so they had to renegotiate details that should have been settled during the RFP stage.

The company could not pause active sourcing or swap out its eSourcing tool. Budgets were tight, buyers were busy, and plant stakeholders needed reliable timelines. The ask was to bring order to the process using the tools already in place, with minimal disruption and clear governance.

Why It Was Happening

The root causes were fragmentation and reliance on offline workarounds. The eSourcing tool had no maintained playbooks or templates; buyers copied old events or started from scratch. Scoring criteria and weightings were not standardized by category, so value analysis differed from event to event. Bid comparisons were performed in spreadsheets with custom formulas, so version control and auditability suffered. There was no shared definition of total landed cost, and key commercial terms like delivery conditions were captured as free text.

Ownership of evaluation design was also unclear. Category managers, plant stakeholders, Quality, and HSE each added requirements, but there was no consistent way to translate those into structured templates or locked scoring sections. Without a shared foundation, buyers compensated with personal judgment and manual triage, which created uneven outcomes and made cross-event reporting impractical.

The Solution

Intelligex implemented standardized sourcing playbooks inside the existing platform, building templates and scoring matrices for priority categories and enabling analytics-driven bid comparisons. We integrated the eSourcing tool with the ERP item and vendor master, embedded mandatory compliance sections for Quality and HSE, and configured evaluation workflows with clear roles. The solution emphasized governance without slowing down buyers, and it connected accepted awards directly to contracting to avoid rework.

  • Category-specific templates in Coupa Sourcing (adaptable to other platforms), covering raw materials, intermediates, packaging, and tolling services.
  • Structured scoring matrices with locked sections for safety, quality, sustainability, and commercial terms; configurable weightings owned by category leads.
  • Analytics for side-by-side bid comparison, including total landed cost with delivery terms aligned to Incoterms rules; scenario modeling for multi-plant awards.
  • Supplier questionnaires that capture compliance attestations and documentation relevant to REACH and TSCA, with file requirements and expiration tracking.
  • Integration to ERP for material codes, vendor IDs, and payment terms; validation rules to prevent mismatches and free-text drift.
  • Workflow gates for evaluation panel reviews, conflict-of-interest attestations, and Legal pre-checks before award recommendations.
  • Standardized award justification reports auto-generated from the event, including score breakdowns, commentary, and attachments.
  • Connector to Contract Lifecycle Management (CLM), such as Icertis, to push award data, clause selections, and negotiated terms into draft contracts.
  • Role-based permissions and an immutable audit trail for template changes, weight adjustments, and evaluation actions.
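To make the scoring mechanics concrete, here is a minimal sketch of how a structured comparison like the one described above can work: a shared total landed cost definition feeds a commercial score, which is blended with locked section scores. All names, weights, and figures are hypothetical illustrations, not the client's actual configuration.

```python
from dataclasses import dataclass, field

# Hypothetical locked weights for the non-negotiable sections; in this sketch
# only the commercial weight is tunable by a category lead.
LOCKED_WEIGHTS = {"safety": 0.20, "quality": 0.20, "sustainability": 0.10}

@dataclass
class Bid:
    supplier: str
    unit_price: float   # per tonne
    freight: float      # per tonne; zero if included under the quoted delivery term
    duty: float         # per tonne
    scores: dict = field(default_factory=dict)  # section -> 0..100

def total_landed_cost(bid: Bid) -> float:
    """One shared landed-cost definition instead of per-buyer spreadsheet formulas."""
    return bid.unit_price + bid.freight + bid.duty

def compare(bids: list, commercial_weight: float = 0.50) -> dict:
    """Blend a cost-derived commercial score with locked technical sections.

    Weights sum to 1.0 (0.50 commercial + 0.50 locked sections)."""
    best_cost = min(total_landed_cost(b) for b in bids)
    results = {}
    for b in bids:
        commercial = 100 * best_cost / total_landed_cost(b)  # best bid scores 100
        technical = sum(LOCKED_WEIGHTS[s] * b.scores[s] for s in LOCKED_WEIGHTS)
        results[b.supplier] = round(commercial_weight * commercial + technical, 1)
    return results

bids = [
    Bid("A", 900, 50, 50, {"safety": 80, "quality": 90, "sustainability": 70}),
    Bid("B", 1050, 0, 0, {"safety": 95, "quality": 95, "sustainability": 90}),
]
print(compare(bids))  # e.g. {'A': 91.0, 'B': 94.6}
```

Supplier B wins here despite a higher landed cost because the locked safety, quality, and sustainability sections carry half the weight; that trade-off is visible and auditable rather than buried in a personal spreadsheet.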

Implementation

  • Discovery: Reviewed a representative set of past sourcing events across categories; audited spreadsheets and formulas used for bid comparisons; mapped stakeholder requirements from Quality, HSE, and plants; documented Legal’s contracting pain points.
  • Design: Defined playbooks per category with standard questions, mandatory documents, and scoring sections; established weight ranges and decision rights; set up evaluation workflows and reviewer roles; aligned commercial fields to ERP and standard delivery terms.
  • Build: Configured templates and questionnaires in the eSourcing tool; implemented analytics views and scenario filters; built validations and data mappings to ERP and CLM; created award justification report templates.
  • Testing/QA: Piloted with active events in a limited scope; ran evaluations in parallel with the legacy approach; calibrated scoring using dry runs; validated integration payloads and clause mapping with Legal.
  • Rollout: Phased by category and region; started with strategic raw materials and packaging, then expanded to services and the long tail; kept the legacy comparison spreadsheet available as a contingency during each wave.
  • Training/hand-off: Conducted short, role-based sessions for buyers and stakeholders; provided quick-reference guides and template change request workflows; established a governance cadence for template updates and periodic calibration sessions; included human-in-the-loop review at key approval gates.
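The "validations and data mappings to ERP" step above can be sketched as a simple allow-list check: event lines must reference known master-data values rather than free text. The master-data sets and field names below are hypothetical; a real build would sync them through the ERP integration.

```python
# Hypothetical master-data snapshots; a production integration would pull
# these from the ERP rather than hard-code them.
MATERIAL_CODES = {"RM-1001", "RM-1002", "PKG-2001"}
VENDOR_IDS = {"V-500", "V-501"}
PAYMENT_TERMS = {"NET30", "NET45", "NET60"}
DELIVERY_TERMS = {"EXW", "FCA", "FOB", "CFR", "CIF", "DAP", "DDP"}  # Incoterms subset

def validate_line(line: dict) -> list:
    """Return validation errors for one event line; an empty list means clean."""
    errors = []
    checks = [
        ("material_code", MATERIAL_CODES),
        ("vendor_id", VENDOR_IDS),
        ("payment_terms", PAYMENT_TERMS),
        ("delivery_terms", DELIVERY_TERMS),
    ]
    for field_name, allowed in checks:
        value = line.get(field_name)
        if value not in allowed:
            errors.append(f"{field_name}: {value!r} not in ERP master data")
    return errors

good = {"material_code": "RM-1001", "vendor_id": "V-500",
        "payment_terms": "NET30", "delivery_terms": "DAP"}
bad = dict(good, delivery_terms="delivered to gate, driver waits")
print(validate_line(good))  # []
print(validate_line(bad))   # one error flagging the free-text delivery term
```

Rejecting free-text values at entry is what prevents the "free-text drift" the solution calls out: every delivery term that reaches evaluation and contracting is one Legal has already mapped to a clause.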

Results

Evaluations became consistent and explainable. Buyers launched events from templates that already captured safety, quality, and commercial requirements, and panel members scored against predefined criteria rather than ad hoc checklists. Bid comparisons moved out of spreadsheets and into the platform, where stakeholders could see clear apples-to-apples views of pricing, delivery conditions, and non-price factors. Award recommendations were supported by standardized reports that stood up to executive and audit review.

Legal received handoffs with structured terms and clause selections based on event outcomes, so initial drafts reflected what suppliers had already agreed to. Disputes over delivery responsibilities and packaging specifications dropped because those terms were defined and scored during the event. Procurement cycles were more predictable, rework decreased, and cross-event analytics revealed which templates and weightings produced the most resilient awards.

What Changed for the Team

  • Before: Each buyer built events differently, with personal spreadsheets for comparisons. After: Buyers launched from category templates and used in-tool analytics for structured comparisons.
  • Before: Criteria and weightings shifted by preference. After: Scoring matrices and weight ranges were standardized and governed by category leads.
  • Before: Compliance checks were bolted on late. After: Quality and HSE requirements were embedded and scored within the event.
  • Before: Legal reworked award details during contracting. After: Award data flowed to CLM with agreed terms and clause guidance.
  • Before: Stakeholder reviews were email-driven and hard to audit. After: Panel workflows, conflict attestations, and audit trails lived in the platform.

Key Takeaways

  • Standardized playbooks in the eSourcing tool create consistent, defensible evaluations without slowing buyers down.
  • Structured scoring and analytics enable apples-to-apples comparisons and reduce reliance on fragile spreadsheets.
  • Embedding compliance and quality sections into templates avoids late-stage surprises and improves audit readiness.
  • Integrations to ERP and CLM keep data aligned from event setup through contract execution.
  • Governed templates work best with a cadence for calibration and a clear path to request updates when categories evolve.

FAQ

What tools did this integrate with?
We configured templates and analytics in the client’s existing eSourcing platform, with connectors to ERP for material and vendor master data and to CLM for contract creation. The approach shown here used Coupa Sourcing, SAP ERP, and Icertis, but the pattern applies to other suites with comparable capabilities.

How did you handle quality control and governance?
Category leads owned template content and weight ranges, and changes required a documented request and approval. Evaluation panels used locked scoring sections for safety, quality, and compliance, and the platform captured conflict-of-interest attestations and reviewer comments. We sampled closed events for calibration and kept an immutable audit log of all template edits and scoring actions.

How did you roll this out without disruption?
We piloted with active events in parallel, allowing buyers to check results against their legacy spreadsheets before full adoption. Rollout was phased by category, with clear fallback to prior methods during cutovers. Training focused on real events, not generic demos, and change champions in each team handled quick questions during the first waves.

How were category differences handled?
Templates were tailored by category and included only the mandatory sections relevant to that space. For example, raw materials emphasized assay specs and delivery handling, packaging included palletization and labeling, and tolling services covered capacity and quality system certifications. Weight ranges were governed centrally but adjustable within defined bounds.

How did you ensure confidential pricing and fair evaluation?
Buyers could enable blind scoring for non-price sections, and access to pricing fields was restricted by role. The platform enforced separation between price and technical evaluations where appropriate, and decision rationales were captured in the award justification report for transparency.

Need a similar solution?

Get a FREE Proof of Concept & Consultation

No Cost, No Commitment!