A structured RFP eliminates 80% of selection mistakes
Most universities choose their chatbot after a 30-minute demo and a pricing negotiation. Six months later, the tool gives irrelevant answers, nobody checks the analytics, and the admissions team goes back to the contact form.
The problem is not the chatbot. It is the absence of a proper specification document. Without formalised criteria, every stakeholder evaluates the solution against their own priorities — IT looks at integration, the admissions director wants leads, the finance director compares prices. The result is a decision by default, not by method.
This guide provides the 12 criteria to include in your chatbot RFP, organised into four blocks: functional, technical, compliance and support. Each criterion includes a concrete acceptance threshold and a recommended weighting for the evaluation grid.
The benchmarks cited come from the analysis of 200,000 chatbot sessions across 50 partner institutions between October 2025 and February 2026 (source: Skolbot internal data).
The 12-point checklist: overview
Before diving into the detail, here is the full grid. Each criterion is grouped by block and weighted according to its impact on student recruitment.
| # | Block | Criterion | Weight |
|---|---|---|---|
| 1 | Functional | Training on institution-specific data | 15% |
| 2 | Functional | Native multilingual support | 12% |
| 3 | Functional | Automatic open day registration | 10% |
| 4 | Functional | Analytics and reporting | 8% |
| 5 | Technical | CMS / CRM integration | 10% |
| 6 | Technical | Deployment timeline | 8% |
| 7 | Technical | Uptime SLA | 5% |
| 8 | Technical | Performance and response time | 5% |
| 9 | Compliance | GDPR and data hosting | 10% |
| 10 | Compliance | EU AI Act (transparency obligations) | 5% |
| 11 | Support | Onboarding and training | 7% |
| 12 | Support | Support SLA and dedicated CSM | 5% |
The total reaches 100%. Adjust the weightings to match your institution's priorities — but do not remove any criterion. A chatbot that excels functionally but fails on compliance exposes the university to real legal risk.
Functional requirements: what the chatbot must do
1. Training on institution-specific data (15%)
The chatbot must answer questions specific to your institution, not sector generalities. Analysis of 12,000 Skolbot conversations (Sept 2025 — Feb 2026) reveals that 89% of prospects ask about tuition fees and 78% about apprenticeship programmes. A chatbot that does not know your fees or your apprenticeship offering fails on the most frequent questions.
Acceptance threshold. The chatbot must correctly answer 90% of the top 10 prospect questions (fees, career outcomes, apprenticeships, accommodation, international exchanges, admissions requirements, placements, degree recognition, campus life, financial aid) within 48 hours of deployment.
Question to ask the vendor: "How is the chatbot fed content? Automatic scraping, manual import, or both? What is the update lag when a programme changes?"
2. Native multilingual support (12%)
58% of international prospects are not native speakers of the institution's primary language (source: language detection, 8,500 Skolbot conversations, 2025-2026). A monolingual chatbot cuts access to more than half of the international pipeline.
Acceptance threshold. Automatic language detection, response in the same language, coverage of at least 10 European languages without quality degradation.
Common trap. "Automatic translation" is not "native multilingual". A chatbot that translates its English response into German produces approximate content and misses the nuances of local education pathways (Studienkolleg in Germany, selectie in the Netherlands, classes préparatoires in France).
3. Automatic open day registration (10%)
The chatbot must detect visit intent and offer registration within the conversation, not simply link to a form. Tracking data across 35 institutions (2025-2026) shows an open day registration rate of 18.4% via chatbot versus 6.2% via form — a 3x factor.
Acceptance threshold. In-conversation registration (no external redirect), instant confirmation, personalised reminders at D-7 and D-1 with a no-show rate below 20%. For reference, the no-show rate without any reminder reaches 52% (source: tracking of 4,200 open day registrations across 12 institutions, 2025-2026).
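For illustration, the D-7 / D-1 reminder schedule in the threshold above is a simple date calculation; a minimal Python sketch (the function name and return shape are our own, not a vendor API):

```python
from datetime import date, timedelta

def reminder_dates(open_day: date) -> dict:
    """Compute the D-7 and D-1 reminder dates for an open day booking."""
    return {
        "d_minus_7": open_day - timedelta(days=7),
        "d_minus_1": open_day - timedelta(days=1),
    }

dates = reminder_dates(date(2026, 3, 21))
print(dates["d_minus_7"])  # 2026-03-14
print(dates["d_minus_1"])  # 2026-03-20
```

In an RFP conversation, the useful question is not how the dates are computed but which channel carries the reminders (email, SMS) and whether the no-show rate is reported per reminder step.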
4. Analytics and reporting (8%)
Without data, the chatbot is a black box. The dashboard must provide at minimum: conversation volume, top questions, resolution rate, human handoff rate, and conversions (open days, forms, applications).
Acceptance threshold. Dashboard accessible without technical skills, CSV/API export, segmentation by programme/campus/language, and alerts on anomalies (spike in questions on a topic = problem on the site or programme change).
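One way to make the anomaly alert concrete during vendor evaluation: ask how a spike in a topic's daily question volume is detected. A minimal sketch, assuming a simple trailing mean-plus-standard-deviations rule (vendors may use more sophisticated methods):

```python
from statistics import mean, stdev

def is_spike(history: list[int], today: int, threshold: float = 3.0) -> bool:
    """Flag today's question volume as anomalous when it exceeds the
    trailing mean by `threshold` standard deviations."""
    if len(history) < 2:
        return False  # not enough history to estimate variance
    mu, sigma = mean(history), stdev(history)
    return today > mu + threshold * max(sigma, 1.0)  # floor sigma to avoid zero-variance noise

# Daily volume of "tuition fees" questions over the past week, then today:
print(is_spike([12, 9, 11, 10, 13, 12, 11], 48))  # True: likely a broken page or a fee change
print(is_spike([12, 9, 11, 10, 13, 12, 11], 14))  # False: normal variation
```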
Technical requirements: how the chatbot integrates
5. CMS / CRM integration (10%)
The chatbot must integrate with your existing ecosystem, not replace it. Critical integrations: CMS (WordPress, Drupal, headless), CRM (HubSpot, Salesforce, SITS, Ellucian), and marketing automation tools.
Acceptance threshold. JavaScript snippet for the CMS (deployment without a developer), webhook or REST API for the CRM (real-time lead synchronisation), and complete technical documentation.
Question to ask: "Does your chatbot push leads into our CRM in real time or in batch? Which fields are synchronised?"
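As an illustration of what "real time" means in practice, here is a minimal Python sketch of a webhook-style lead push. The endpoint URL and field names are hypothetical; the real mapping would come from your CRM's webhook documentation:

```python
import json
import urllib.request

CRM_WEBHOOK = "https://crm.example.edu/webhooks/chatbot-leads"  # hypothetical endpoint

def build_lead(name: str, email: str, programme: str, language: str) -> dict:
    """Map chatbot conversation data onto the CRM's lead fields (illustrative names)."""
    return {
        "name": name,
        "email": email,
        "programme_of_interest": programme,
        "conversation_language": language,
        "source": "chatbot",
    }

def push_lead(lead: dict) -> None:
    """POST the lead to the CRM webhook as soon as the conversation yields it."""
    req = urllib.request.Request(
        CRM_WEBHOOK,
        data=json.dumps(lead).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(req)  # production code would add authentication and retries
```

A batch sync, by contrast, would accumulate leads and push them hourly or nightly; for a prospect who asks a question on Sunday evening, that delay costs the follow-up window.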
6. Deployment timeline (8%)
The seasonality of student recruitment makes timing critical. A chatbot deployed after the UCAS application deadline (January) or after Clearing (August) has missed its value window.
Acceptance threshold. Less than 2 weeks from contract signature to production, including training on institution content. Education-specialist solutions achieve 48 hours; generic solutions require 4 to 8 weeks of configuration.
7. Uptime SLA (5%)
67% of prospect activity occurs outside office hours, peaking on Sunday evenings (source: 200,000 Skolbot sessions, 2025-2026). A chatbot that goes down at weekends cancels the main competitive advantage.
Acceptance threshold. SLA of 99.9% minimum (about 8h46 of downtime per year), with real-time monitoring and alerts.
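To sanity-check a vendor's SLA claim, the annual downtime allowance follows directly from the uptime percentage; a quick Python sketch:

```python
def max_downtime_hours(sla_percent: float, hours_per_year: float = 8760.0) -> float:
    """Annual downtime allowed under an uptime SLA (non-leap year)."""
    return hours_per_year * (1 - sla_percent / 100)

print(round(max_downtime_hours(99.9), 2))  # 8.76 hours/year (about 8h46)
print(round(max_downtime_hours(99.5), 2))  # 43.8 hours/year
print(round(max_downtime_hours(99.0), 2))  # 87.6 hours/year
```

The gap between "three nines" and 99% is an order of magnitude: 99% uptime allows more than three full days of outage per year, easily enough to lose several peak Sunday evenings.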
8. Performance and response time (5%)
Acceptance threshold. Response time below 5 seconds for 95% of queries. Field data shows a median of 3 seconds for education-specialist AI chatbots, versus 47 hours for email and 72 hours for contact forms (source: mystery shopping audit across 80 institutions, 2025).
Compliance: what the law requires
9. GDPR and data hosting (10%)
Any chatbot that collects prospect data — including data from minors — must comply with GDPR (Regulation 2016/679). This is not optional; it is the European legal framework.
Acceptance threshold. Data hosted in the EU, signed DPA (Data Processing Agreement), accessible processing records, operational right to erasure within 72 hours, and explicit consent before any data collection. The ICO publishes specific guidance on AI and data protection relevant to education.
Critical question: "Where is conversation data hosted? Who has access? What is the deletion process upon request?"
10. EU AI Act — transparency and obligations (5%)
The European AI Act imposes transparency obligations (Article 50 of the final text, numbered Article 52 in the draft): prospects must know they are interacting with an AI. AI systems used to determine access or admission to education are classified as high-risk (Annex III), which triggers additional documentation and human oversight requirements wherever the chatbot feeds into admissions decisions.
Acceptance threshold. Explicit notice "You are chatting with an AI assistant" at the start of every conversation, accessible technical documentation of the AI system, and a mechanism to transfer to a human at any time.
Support: what makes the difference after signing
11. Onboarding and training (7%)
A high-performing chatbot poorly configured produces the same results as a mediocre chatbot. Onboarding must include: assisted initial setup, admissions team training, and content validation before go-live.
Acceptance threshold. Dedicated training session (not a generic webinar), joint validation of the chatbot on the 20 most frequent questions, and customised internal documentation.
12. Support SLA and dedicated CSM (5%)
Acceptance threshold. Support response time below 4 hours on business days, a dedicated CSM (Customer Success Manager) with education sector knowledge, and quarterly performance reviews with optimisation recommendations.
Evaluation grid: the ready-to-use template
Use this matrix to score each candidate solution. Each criterion is rated from 1 (insufficient) to 5 (excellent), then multiplied by its weight.
| Criterion | Wt. | Solution A | Solution B | Solution C |
|---|---|---|---|---|
| 1. Institution-specific training | 15% | _/5 × 0.15 = _ | _/5 × 0.15 = _ | _/5 × 0.15 = _ |
| 2. Native multilingual | 12% | _/5 × 0.12 = _ | _/5 × 0.12 = _ | _/5 × 0.12 = _ |
| 3. Open day auto-registration | 10% | _/5 × 0.10 = _ | _/5 × 0.10 = _ | _/5 × 0.10 = _ |
| 4. Analytics | 8% | _/5 × 0.08 = _ | _/5 × 0.08 = _ | _/5 × 0.08 = _ |
| 5. CMS/CRM integration | 10% | _/5 × 0.10 = _ | _/5 × 0.10 = _ | _/5 × 0.10 = _ |
| 6. Deployment timeline | 8% | _/5 × 0.08 = _ | _/5 × 0.08 = _ | _/5 × 0.08 = _ |
| 7. Uptime SLA | 5% | _/5 × 0.05 = _ | _/5 × 0.05 = _ | _/5 × 0.05 = _ |
| 8. Response time | 5% | _/5 × 0.05 = _ | _/5 × 0.05 = _ | _/5 × 0.05 = _ |
| 9. GDPR | 10% | _/5 × 0.10 = _ | _/5 × 0.10 = _ | _/5 × 0.10 = _ |
| 10. AI Act | 5% | _/5 × 0.05 = _ | _/5 × 0.05 = _ | _/5 × 0.05 = _ |
| 11. Onboarding | 7% | _/5 × 0.07 = _ | _/5 × 0.07 = _ | _/5 × 0.07 = _ |
| 12. Support / CSM | 5% | _/5 × 0.05 = _ | _/5 × 0.05 = _ | _/5 × 0.05 = _ |
| **TOTAL** | **100%** | **_/5** | **_/5** | **_/5** |
How to interpret the score. Below 3/5, the solution has structural gaps. Between 3 and 4, it works with trade-offs. Above 4, it covers the needs of a European higher education institution.
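The grid arithmetic can be scripted to compare shortlisted vendors side by side. This Python sketch uses the weightings from the table above; the criterion keys are our own shorthand, not part of any vendor's terminology:

```python
# Weightings from the 12-point grid (must sum to 1.0).
WEIGHTS = {
    "institution_training": 0.15, "multilingual": 0.12, "open_day": 0.10,
    "analytics": 0.08, "cms_crm": 0.10, "deployment": 0.08,
    "uptime_sla": 0.05, "response_time": 0.05, "gdpr": 0.10,
    "ai_act": 0.05, "onboarding": 0.07, "support_csm": 0.05,
}

def weighted_score(ratings: dict[str, int]) -> float:
    """Combine per-criterion ratings (1-5) into a single weighted score out of 5."""
    assert set(ratings) == set(WEIGHTS), "rate all 12 criteria"
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9, "weights must sum to 100%"
    return sum(ratings[k] * w for k, w in WEIGHTS.items())

solution_a = {k: 4 for k in WEIGHTS}
solution_a["ai_act"] = 2  # a weak compliance rating drags the total down
print(round(weighted_score(solution_a), 2))  # 3.9
```

If you adjust the weightings in committee, the assertion on the sum catches the most common spreadsheet mistake: weights that no longer add up to 100%.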
For a detailed comparison of market solutions, see our AI chatbot comparison for higher education. To understand why chatbots outperform contact forms, read our chatbot vs form analysis.
FAQ
Who should write the chatbot RFP within the institution?
The specification should be co-authored by three parties: the admissions directorate (defining functional needs), IT (validating technical and integration constraints), and the DPO or legal team (ensuring GDPR and AI Act compliance). A steering committee of 3 to 5 people is sufficient. Involving too many stakeholders lengthens the process without improving the document quality.
How long does it take to write a chatbot RFP?
With this grid as a starting point, allow 2 to 3 weeks from kickoff to finalised document. The longest phase is not the writing — it is internal alignment on priorities (criterion weighting). Start with the summary grid from this article, adjust the weightings in committee, then detail the acceptance thresholds.
Should the RFP include a budget range?
Yes, include a budget range. This filters out solutions outside your scope and avoids wasting time on demonstrations with vendors 5 times above budget. For an education-specialist AI chatbot, the range is between EUR 200 and 800 per month on a per-institution flat fee. Generic B2B solutions start at USD 2,500 per month. Including this information enables vendors to propose the most relevant offering.
Must the RFP reference the EU AI Act?
Yes, explicitly. The EU AI Regulation's obligations are entering into force progressively, and AI systems used for admission or assessment decisions in education fall under the high-risk category (Annex III). The RFP must require compliance with the transparency obligations (Article 50 of the final text) and verify the classification of the proposed system. Any vendor unable to document AI Act compliance in 2026 represents a risk. The UK Department for Education also provides guidance on safe AI use in education settings.
How should you evaluate chatbot response quality during the trial?
Prepare a list of 30 real questions drawn from your exchanges with prospects (email, phone, social media). Submit them to the chatbot in test mode and evaluate each response on three axes: accuracy (is the information correct?), completeness (does the response cover the question?), and tone (is the response appropriate for a student prospect?). A score of 80% or above on the 30 questions indicates a viable solution. For further guidance on return on investment, see our student chatbot ROI guide.
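To keep trial scoring consistent across evaluators, the three-axis rubric can be reduced to a simple calculation. This sketch is one possible operationalisation; treating each axis as pass/fail is our simplifying assumption (a 1-5 scale per axis works the same way):

```python
def question_score(accuracy: bool, completeness: bool, tone: bool) -> float:
    """Score one response as the fraction of the three axes it passes."""
    return sum([accuracy, completeness, tone]) / 3

def trial_verdict(per_question: list[float], threshold: float = 0.80) -> bool:
    """Average the per-question scores; at or above the threshold, the trial passes."""
    return sum(per_question) / len(per_question) >= threshold

# 24 fully correct answers plus 6 that miss completeness:
scores = [question_score(True, True, True)] * 24 + [question_score(True, False, True)] * 6
print(trial_verdict(scores))  # True: roughly 93% overall
```

Have two evaluators score the same 30 responses independently; large disagreements usually reveal an ambiguous question rather than an ambiguous answer.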
Test Skolbot on your institution in 30 seconds


