AI Chatbot · 10 min read

Case Study: How a Business School Increased Enrolment by 40% with AI

Composite case study: a Canadian business school deploys an AI chatbot and measures +40% qualified leads, +62% open house registrations and 280% ROI in 12 months.


Skolbot Team · March 22, 2026


Table of contents

  1. +40% qualified leads in one recruitment cycle: the results of an AI chatbot deployed in 48 hours
  2. PBA's challenge: a funnel leaking at every stage
  3. The solution: Skolbot deployed in 48 hours
  4. Results at 6 months: before and after
  5. Key success factors
     • 24/7 availability aligned with real prospect behaviour
     • Native multilingual support for international and bilingual prospects
     • Analytics as a decision tool, not a decorative dashboard
  6. Lessons learned and watch-outs
     • What worked well
     • What to monitor

+40% qualified leads in one recruitment cycle: the results of an AI chatbot deployed in 48 hours

Pacific Business Academy (PBA) is a fictional institution, but the numbers that follow are real. This case study is a composite synthesis built from data measured across several Skolbot partner institutions between 2024 and 2026. The metrics, timelines and results reflect the median observed in the field.

Why a composite case rather than a named testimonial? Because recruitment data is commercially sensitive. By aggregating results from 18 institutions, we can share verifiable figures without exposing any single establishment.

The starting point is the same everywhere: a school with a strong program, a decent website, and a recruitment funnel that loses 91% of visitors before the first point of contact.

PBA's challenge: a funnel leaking at every stage

PBA is a mid-sized business school (2,500 students, 3 campuses across British Columbia and Ontario) offering undergraduate, postgraduate and MBA programs. Its positioning is strong, its programs are accredited, and its 6-month graduate employment rate exceeds 90%.

The problem is not the product. It is the funnel.

Before chatbot deployment, here is the measured baseline:

Metric | Pre-chatbot value
Visitor-to-first-contact drop-off | 91%
Average email response time | 47 hours
Average contact form response time | 72 hours
Website bounce rate | 68%
Open house registrations via form | 6.2% of interested visitors
Prospect activity outside office hours | 67%
Peak activity | Sunday 8-9pm
Qualified leads per month | 120
Cost per lead | $55 CAD

Sources: mystery shopping audit (80 institutions, 2025), Skolbot interaction logs (200,000 sessions, Oct 2025 to Feb 2026), funnel analysis (30 institutions, 2025-2026 cohort).

The diagnosis is clear: 67% of prospect activity happens when nobody is at the desk. During the OUAC deadline period in January, this figure climbs to 74%. PBA's admissions team (4 people handling 3,000 enquiries per season) cannot physically respond on a Sunday evening at 9pm.

Result: the most motivated prospects abandon before even asking their first question. Analysis of 12,000 Skolbot conversations shows that 89% of prospects ask about tuition fees and 78% about co-op placements: information that is available on the website but that visitors cannot find quickly enough.

The solution: Skolbot deployed in 48 hours

PBA chose to deploy Skolbot on the basis of a structured RFP covering 12 functional, technical and compliance criteria.

Deployment timeline:

Step | Day
Contract signature and initial configuration | D0
Automatic scraping of website + viewbooks | D0-D1
Response validation on the 20 most frequent questions | D1
Production deployment (JavaScript snippet) | D2
Admissions team training (1h30) | D2
First conversation analysis | D7

What was activated:

  • AI chatbot trained on PBA-specific content (programs, fees, co-ops, campuses, student life)
  • Automatic language detection (30+ languages, including French for bilingual prospects)
  • In-conversation open house registration (no redirect to external form)
  • Personalised reminders at D-7 and D-1 before each open house event
  • Real-time CRM synchronisation (leads pushed to HubSpot)
  • Analytics dashboard: questions asked, activity patterns, resolution rate

The chatbot is available 24/7. It responds in 3 seconds, in the prospect's language. Complexity analysis shows that 72% of questions are simple FAQ (automatable), 21% require institution-specific context, and only 7% need a human. The admissions team now focuses on those 7% of complex cases.

Results at 6 months: before and after

The metrics below compare the pre-chatbot period (March to August 2025) with the post-chatbot period (September 2025 to February 2026).

Metric | Before | After | Change
Average response time | 47 hours | 3 seconds | -99.9%
Website bounce rate | 68% | 41% | -39.7%
Pages per session | 1.8 | 3.4 | +89%
Average session duration | 1 min 45 s | 4 min 12 s | +140%
Open house registration rate | 6.2% | 18.4% | +197%
Open house no-show rate | 52% | 14% | -73%
Qualified leads / month | 120 | 195 | +62%
Cost per lead | $55 CAD | $34 CAD | -38%
Prospects returning within 7 days | 12% | 34% | +183%
12-month ROI | n/a | 280% | n/a
Payback period | n/a | 5 months | n/a

Sources: Skolbot median results (18 institutions, 2024-2025), A/B test (22 sites, Sept to Dec 2025), cohort analysis (8,000 sessions, 2025).

Methodological note. The improvement includes the combined effect of the chatbot and parallel funnel optimisations (program pages, simplified forms). The chatbot alone does not account for 100% of the gain. But it is the chatbot that made the optimisations measurable: without conversation analytics, PBA would not have known what to optimise.

The financial impact deserves its own calculation. With a student lifetime value of $45,000 CAD over 4 years for a domestic undergraduate business program, and significantly more for international students ($120,000 CAD over 4 years), every additional qualified lead represents substantial revenue potential. For students funded through OSAP or Canada Student Loans, the institution's revenue is secured regardless of the student's personal financial situation. Our student chatbot ROI calculation guide breaks down the full formula.
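As a rough sketch of that arithmetic, using the lead and lifetime-value figures from this case study (the 10% lead-to-enrolment conversion rate is a hypothetical assumption, not a measured number):

```python
# Back-of-envelope revenue potential from additional qualified leads.
# LTV figures come from the case study; the conversion rate is assumed.
DOMESTIC_LTV_CAD = 45_000        # domestic undergraduate, 4 years
INTERNATIONAL_LTV_CAD = 120_000  # international student, 4 years

extra_leads_per_month = 195 - 120   # post-chatbot vs pre-chatbot leads/month
assumed_conversion = 0.10           # hypothetical lead-to-enrolment rate

extra_enrolments_per_year = extra_leads_per_month * 12 * assumed_conversion
revenue_potential = extra_enrolments_per_year * DOMESTIC_LTV_CAD
print(f"{extra_enrolments_per_year:.0f} extra enrolments, "
      f"${revenue_potential:,.0f} CAD in lifetime value")
```

Swapping in INTERNATIONAL_LTV_CAD for the domestic figure shows why international recruitment weighs so heavily in the business case.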

Key success factors

Three elements made the difference between a chatbot that performs and a chatbot that gathers dust.

24/7 availability aligned with real prospect behaviour

The chatbot never sleeps. This is a decisive advantage when 67% of activity happens outside office hours and the peak falls on Sunday evening. Skolbot data shows that during the OUAC deadline period, 81% of interactions occur outside office hours. Without a chatbot, those prospects leave without an answer and, in most cases, do not return.

Native multilingual support for international and bilingual prospects

PBA recruits across 12 countries. 58% of its international prospects are not native English speakers (source: Skolbot language detection, 2025-2026). In addition, a significant proportion of domestic prospects, particularly those in Quebec, New Brunswick and other bilingual regions, prefer to interact in French. Before the chatbot, these prospects had to navigate an English-language site and send an email, then hope for a response in their language within 72 hours. With Skolbot, they receive an answer in 3 seconds in their preferred language. The first-contact rate for international and bilingual prospects tripled.

Analytics as a decision tool, not a decorative dashboard

The Skolbot dashboard revealed that the most-asked question after tuition fees (89%) was "Do you offer co-op placements?" (78%). PBA repositioned co-ops as the lead element on its homepage and campaigns. This single change, identified through chatbot analytics, increased click-through to program pages by 23%.

Lessons learned and watch-outs

What worked well

  • 48-hour deployment captured the OUAC window without waiting for a 3-month IT project. The technical barrier that had delayed previous chatbot evaluations (a 6-week integration estimate from a generic vendor) simply did not apply. The JavaScript snippet went live on a Tuesday afternoon; by Thursday morning, the chatbot had already handled 47 conversations.
  • In-conversation open house registration tripled the registration rate compared to the standard form. The mechanism is simple: when the chatbot detects visit intent ("Can I visit the campus?", "When is the next open house?"), it offers registration within the same conversation thread. No new tab, no form to fill, no friction.
  • Chatbot + SMS reminders reduced no-shows from 52% to 14%, freeing places for additional prospects. The personalised reminder at D-1 included the prospect's name, chosen program and a one-click calendar link: a level of personalisation that would have required hours of manual work per event.

What to monitor

  • Initial content quality. The chatbot is only as good as the data it is trained on. If your website contains outdated information (last year's fees, discontinued programs), the chatbot will repeat it. Plan a content review before deployment.
  • ROI measurement at 30, 60 and 90 days. Do not judge a chatbot on the first week. Key metrics to track:
    • D30: conversation volume, resolution rate, first open house registrations via chatbot
    • D60: impact on bounce rate, increase in qualified leads, initial feedback from the admissions team
    • D90: calculable ROI (leads x conversion rate x student lifetime value vs chatbot cost)
  • Human handoff. The 7% of complex questions must reach a person, not disappear into a queue. Configure handoff to the CRM with real-time notification.
  • Compliance from day one. Any chatbot handling prospect data (including data from minors) must comply with PIPEDA at the federal level and provincial privacy legislation such as Loi 25 in Quebec or PIPA in British Columbia and Alberta. PBA verified Canadian data hosting, a signed DPA and transparency requirements (an explicit "You are chatting with an AI" notice) before going live. The OPC provides specific guidance on AI and data protection for organisations.
  • Stakeholder alignment. The admissions director, IT lead and privacy officer all signed off on the deployment plan using the 12-criterion evaluation grid. This prevented the post-launch friction that often kills chatbot projects: IT questioning the integration, legal questioning the data flow, or admissions questioning the tone of responses.
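The D90 formula above (leads x conversion rate x student lifetime value vs chatbot cost) can be sketched in a few lines. The conversion rate, lead count and monthly fee below are illustrative assumptions, not case-study figures:

```python
def chatbot_roi(leads: int, conversion_rate: float,
                lifetime_value_cad: float, chatbot_cost_cad: float) -> float:
    """ROI = (attributable revenue - chatbot cost) / chatbot cost."""
    revenue = leads * conversion_rate * lifetime_value_cad
    return (revenue - chatbot_cost_cad) / chatbot_cost_cad

# Illustrative 90-day check: 30 chatbot-attributed leads, an assumed 5%
# lead-to-enrolment rate, $45,000 CAD lifetime value, $1,000 CAD/month fee.
roi = chatbot_roi(leads=30, conversion_rate=0.05,
                  lifetime_value_cad=45_000, chatbot_cost_cad=3 * 1_000)
print(f"90-day ROI: {roi:.0%}")
```

Even under conservative assumptions, the ratio is dominated by lifetime value, which is why content quality and human handoff (both of which protect conversion) matter more than the subscription fee.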

To see how PBA's chosen solution compares against the market, read our AI chatbot comparison for higher education.

FAQ

Are the results in this case study guaranteed?

No. These are median results observed across 18 institutions, not a promise. Your outcome depends on three factors: your website traffic volume (more visitors means more conversion opportunities for the chatbot), content quality (a chatbot trained on incomplete data underperforms), and your admissions team's commitment to working the generated leads. The improvement includes the combined effect of the chatbot and concurrent funnel optimisations.

How long before the first results are visible?

Initial indicators appear in the first week: conversation volume, questions asked, first open house registrations. The impact on qualified leads is measurable at 30 days. Calculable ROI requires 90 days of data and tracking through to final enrolment. The median payback period is 5 months.

The case study mentions a fictional school. Why?

Recruitment data is commercially sensitive. No institution publishes its conversion rates, cost per lead or open house no-show rates. By building a composite case from anonymised real data, we share verifiable metrics without breaching partner confidentiality. Every figure is sourced and every source is identifiable.

Does the chatbot replace the admissions team?

No. It frees them. Analysis shows that 72% of questions are automatable FAQ and 21% require institution-specific context that the chatbot handles. Only 7% of cases need human intervention. The chatbot handles the remaining 93% around the clock, allowing the team to concentrate on the complex cases that genuinely influence a prospect's decision.

What does such a deployment cost?

Skolbot operates on a per-institution flat fee with unlimited conversations ($250 to $1,000 CAD/month depending on features). With cost per lead dropping from $55 CAD to $34 CAD and a 12-month ROI of 280%, the investment pays back in a median of 5 months. The detailed calculation is available in our student chatbot ROI guide.

Test Skolbot on your institution in 30 seconds

Related articles

  • Chatbot RFP Checklist for Higher Education: The Complete Specification Guide
  • Best AI Chatbot for Higher Education in Canada: 2026 Comparison
  • AI Chatbot for Canadian Universities: The Complete 2026 Guide

© 2026 Skolbot