AI visibility · 12 min read

Gen Z in 2026: Over Half Start Their University Search on AI Tools

More than half of Gen Z applicants now open ChatGPT or Perplexity before UCAS. Here's what UK higher education institutions can do to appear in those AI-generated answers.

Skolbot Team · May 15, 2026

Table of contents

  1. How Gen Z actually searches for universities in 2026
  2. What applicants see in an AI answer — and what they miss
  3. Why AI engines recommend the same 20 universities repeatedly
     • Training corpus weight
     • Real-time web access and crawlability
     • Cross-source corroboration
  4. The three signals Gen Z checks after an AI answer
     • Signal 1: Fees and funding transparency
     • Signal 2: Student voice and authentic evidence
     • Signal 3: Rankings and accreditation validation
  5. How your institution can appear in ChatGPT and Perplexity answers
     • 1. Implement Schema.org structured markup
     • 2. Make your programme pages factually dense
     • 3. Ensure your UCAS and third-party profiles are current
     • 4. Structure content for AI extraction, not scroll reading
  6. The Gen Z AI search landscape by the numbers

How Gen Z actually searches for universities in 2026

Gen Z no longer begins a university search on Google. The default starting point for prospective students born after 1997 is now a generative AI tool — ChatGPT, Perplexity, or Gemini — where they expect a direct, conversational answer rather than a ranked list of links to click through.

The data confirm the shift. The share of prospective students using generative AI in their university search rose from 26% in spring 2025 to 46% by late 2025 (Source: Skolbot internal tracking, 2025–2026 admissions cycle). By the time this cohort was preparing UCAS applications in early 2026, over half reported opening an AI tool before visiting any institutional website. The pattern is consistent across state-school and independent-school applicants, across subject areas, and across all UK regions.

This is not a marginal behaviour. When more than half of prospective students arrive at your website after an AI conversation has already shaped their shortlist, the question of whether your institution appeared in that conversation becomes an enrolment-critical issue.

The AI tools Gen Z uses fall into three types:

  • Conversational AI with web search — Perplexity and Gemini, which pull live web data and cite sources
  • Conversational AI from training data — ChatGPT in its base mode, which synthesises from its training corpus
  • Google AI Overviews — which appear above organic results on roughly 64% of informational education queries (Search Engine Land, March 2026)

Each tool selects institutions differently. All three weight verifiable, structured information over generic marketing copy.

What applicants see in an AI answer — and what they miss

An AI-generated answer to "best universities for psychology in the UK" looks nothing like a Google results page. It is a paragraph of flowing prose, sometimes with a bulleted shortlist, naming three to six institutions with a brief rationale for each. It reads like a knowledgeable friend's recommendation.

What the prospective student sees feels authoritative. What they rarely consider is which institutions the AI did not mention — and why.

The "why" matters for admissions teams. UK universities reach a 29% mention rate on ChatGPT and 38% on Perplexity — the highest rates in Europe, where the continental average is just 19% (Source: Skolbot GEO monitoring, 500 queries × 6 countries × 3 AI engines, Feb. 2026). Even at 38%, more than six in ten Perplexity answers about UK higher education name no specific institution. The AI answer routinely omits mid-ranking, specialist, and private higher education providers — not because they offer poorer programmes, but because their online presence does not yet speak in a language AI engines can process.

The data gap is structural. AI models extract information from three source types:

  1. Institutional web content that is factually dense, freshly updated, and semantically structured
  2. Third-party authority sources such as UCAS, QAA, OfS, the Guardian University Guide, and the Complete University Guide
  3. Structured markup — specifically Schema.org, which acts as a machine-readable identity card for your institution

A university whose programme pages read as marketing prose, whose UCAS entry is incomplete, and whose website carries no Schema.org markup is effectively invisible to the generation that searches with AI first.

Why AI engines recommend the same 20 universities repeatedly

AI engines are not impartial. They repeat what their training data says, and their training data overrepresents institutions with high historical web presence, abundant third-party mentions, and decades of SEO investment. Russell Group universities and the large post-92 institutions with strong domain authority appear by default. Everyone else must earn their place.

The mechanism has three reinforcing layers.

Training corpus weight

AI models learn from a snapshot of the web. Institutions that have published factual, citable content over many years — detailed programme descriptions, graduate salary data from HESA Graduate Outcomes, TEF ratings, QAA inspection reports — accumulate mention density that newer or more recently optimised websites cannot match overnight.

Real-time web access and crawlability

For AI tools with web search enabled (Perplexity, Gemini), the answer depends on what the crawler can reach and parse at the moment of query. Pages behind login walls, PDFs containing key data such as fees or entry requirements, and pages that block AI crawlers in robots.txt are simply absent from the answer pool.
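The robots.txt part of this is easy to check mechanically. The sketch below uses Python's standard-library robots.txt parser to test whether a given crawler user agent may fetch a programme page. The robots.txt content and the example.ac.uk URL are illustrative; GPTBot and PerplexityBot are the published user agents of OpenAI's and Perplexity's crawlers.

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt: blocks OpenAI's crawler, allows everyone else.
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

PAGE = "https://www.example.ac.uk/courses/psychology"  # placeholder URL
for bot in ("GPTBot", "PerplexityBot", "Googlebot"):
    verdict = "allowed" if parser.can_fetch(bot, PAGE) else "blocked"
    print(f"{bot}: {verdict}")
```

Against a live site you would call `parser.set_url("https://www.example.ac.uk/robots.txt")` and `parser.read()` instead of parsing a string, then run the same `can_fetch` checks for each AI user agent you care about.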

Cross-source corroboration

When an AI engine names an institution, it validates that name against multiple sources. A university that appears on UCAS, is listed in the Complete University Guide, has a QAA registered provider entry, and publishes its TEF rating on its homepage is corroborated across four authoritative sources. An institution present only on its own website cannot be corroborated — and therefore carries lower citation confidence.

The implication is that AI citation is not random. It is predictable, and it is addressable. For a detailed breakdown of the 10 signals AI uses to decide which schools to recommend, see our analysis of AI recommendation criteria for schools.

The three signals Gen Z checks after an AI answer

Gen Z uses AI as a first filter, not a final decision tool. After the AI conversation narrows the field to three or four institutions, the prospective student begins a secondary verification phase. Understanding this phase tells you what your website must deliver.

Signal 1: Fees and funding transparency

The first thing a prospective student checks on any shortlisted institution's website is the cost. This holds consistently across income backgrounds, subject areas, and institution types. 89% of prospects ask about tuition fees before any other topic (Source: analysis of 12,000 Skolbot chatbot conversations, Sept. 2025 – Feb. 2026). If this information is buried behind a "Request a prospectus" form or absent from the programme page entirely, the institution fails the first post-AI check.

Signal 2: Student voice and authentic evidence

Gen Z were algorithmically literate before they were teenagers. They distinguish between polished marketing copy and genuine student testimony. After an AI shortlist, they visit the institution's social media, search for student Reddit threads, and check Whatuni or StudentCrowd reviews. A prospectus photograph and a quote attributed to "Sarah, Business Studies, Year 2" do not pass scrutiny.

Signal 3: Rankings and accreditation validation

Ranking mentions in AI answers are frequently cited verbatim. After seeing "ranked 14th in the Guardian University Guide for Business," the prospective student opens the Guardian University Guide directly to verify. If your ranking data on your own website is inconsistent with third-party sources, or if your TEF or OfS registration status is not clearly stated, the verification fails — and trust erodes rapidly.

This three-signal verification loop means that AI visibility and website conversion are not independent problems. An institution that earns an AI mention but fails the post-AI check loses the applicant at the second stage. Both layers require attention. For more on what Gen Z needs from your site in that verification phase, see our article on what Gen Z expects from your school website.

How your institution can appear in ChatGPT and Perplexity answers

Appearing in AI-generated answers is achievable for any UK institution willing to treat it as a structured technical and content project. The discipline is Generative Engine Optimisation (GEO) — the counterpart to SEO for AI channels. Our full GEO guide for schools covers the complete framework; the four highest-impact actions for institutions starting now are below.

1. Implement Schema.org structured markup

Institutions with structured Schema.org markup gain an average of 12 additional percentage points of AI visibility compared to those without it (Source: Skolbot GEO monitoring, 500 queries × 6 countries × 3 AI engines, Feb. 2026). This is the single highest-ROI GEO intervention available to admissions and digital teams.

The minimum implementation for a UK higher education provider covers:

Schema type                      Key fields for AI citation
EducationalOrganization          Legal name, address, URL, logo, OfS registration, TEF rating, QAA status
EducationalOccupationalProgram   Programme title, award (BSc/BA/MSc), duration, UCAS tariff points, delivery mode
FAQPage                          Question–answer pairs on each admissions and programme page
AggregateRating                  Verified student ratings from Whatuni or StudentCrowd

Schema.org's EducationalOrganization specification provides the full vocabulary. Google Search Central's structured data documentation covers implementation syntax. For a UK-specific implementation walkthrough, see our guide to structured data for schools and AI visibility.
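As a concrete sketch, the EducationalOrganization row of the table could be emitted as JSON-LD along these lines. Every value is a placeholder; note also that OfS registration, TEF rating, and QAA status have no dedicated Schema.org properties, so teams typically carry them in page copy or in generic fields such as `identifier`. The block below sticks to properties defined in the core vocabulary.

```python
import json

# EducationalOrganization JSON-LD sketch; every value is a placeholder.
org = {
    "@context": "https://schema.org",
    "@type": "EducationalOrganization",
    "name": "Example University",
    "legalName": "Example University",
    "url": "https://www.example.ac.uk",
    "logo": "https://www.example.ac.uk/assets/logo.png",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 University Road",
        "addressLocality": "Exampleton",
        "postalCode": "EX1 1AA",
        "addressCountry": "GB",
    },
}

# Embed on every page inside the <head> as:
#   <script type="application/ld+json">{ ... }</script>
org_json = json.dumps(org, indent=2)
print(org_json)
```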

2. Make your programme pages factually dense

Marketing language — "world-class teaching," "vibrant campus community," "outstanding career support" — gives AI engines nothing to cite. Factual content does.

For each programme page, add:

  • UCAS tariff entry requirements (numerical, not ranges)
  • Graduate employment rate at six months, sourced from HESA
  • Median starting salary, sourced from Graduate Outcomes or HESA
  • Accrediting body (AACSB, AMBA, RICS, GDC, NMC — whichever applies)
  • TEF rating and year of award
  • Any specific Guardian University Guide or Complete University Guide ranking position

Each named entity — UCAS, HESA, QAA, TEF — is a cross-reference point that AI models use to verify institutional credibility. The denser the verifiable entity network, the higher the citation probability.
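A programme page carrying those data points can pair them with EducationalOccupationalProgram markup. The sketch below uses placeholder values; `timeToComplete` takes an ISO 8601 duration, and since Schema.org has no dedicated property for UCAS tariff points, free text on `programPrerequisites` is used here as one possible approximation.

```python
import json

# EducationalOccupationalProgram JSON-LD sketch; values are placeholders.
program = {
    "@context": "https://schema.org",
    "@type": "EducationalOccupationalProgram",
    "name": "BSc (Hons) Psychology",
    "educationalCredentialAwarded": "BSc (Hons)",
    "timeToComplete": "P3Y",  # ISO 8601 duration: three years
    "educationalProgramMode": "full-time",
    # No dedicated Schema.org property exists for UCAS tariff points;
    # free text on programPrerequisites is an approximation.
    "programPrerequisites": "112 UCAS tariff points",
    "provider": {
        "@type": "EducationalOrganization",
        "name": "Example University",
    },
}

program_json = json.dumps(program, indent=2)
print(program_json)
```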

3. Ensure your UCAS and third-party profiles are current

UCAS programme listings are indexed by AI models and treated as an authoritative source for entry requirements, places available, and programme structure. If your UCAS data is outdated or inconsistent with your own website, AI engines encounter contradictory signals and reduce citation confidence.

The same applies to your QAA registered provider entry, your OfS registration, and any entries in the Complete University Guide or Guardian University Guide. Audit these four sources against your website content before any GEO investment in your own pages — third-party corroboration must match.

4. Structure content for AI extraction, not scroll reading

AI engines extract passages, not full pages. Every H2 section on your admissions and programme pages should open with a direct, complete answer to the implied question in the heading. A 60-word paragraph containing a verifiable statistic and its source is cited ahead of a 400-word narrative about your institution's heritage.

This applies equally to Clearing content. Gen Z applicants in Clearing are among the highest-urgency searchers in the annual UCAS cycle. An institution with structured, current Schema.org data and a Clearing FAQ page with FAQPage markup will appear in AI responses during the Clearing window — an opportunity that the majority of UK institutions have not yet addressed. For a broader treatment of how GEO and SEO interact, see our article on SEO vs GEO for higher education search strategy.
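A Clearing FAQ page can carry FAQPage markup along the lines of the sketch below; the question and answer text are placeholders to be replaced with live Clearing information.

```python
import json

# FAQPage JSON-LD sketch for a Clearing page; question and answer text
# are placeholders.
faq = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "Which courses have Clearing places for 2026 entry?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "Placeholder: list current vacancies with UCAS "
                        "codes and the Clearing hotline number.",
            },
        },
    ],
}

faq_json = json.dumps(faq, indent=2)
print(faq_json)
```

Each question-answer pair on the page gets its own `Question` object in `mainEntity`, which is the unit of extraction AI engines work with.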

The Gen Z AI search landscape by the numbers

Metric                                                           Figure   Source
Prospective students using AI in university search (late 2025)   46%      Skolbot tracking, 2025–2026 cycle
UK ChatGPT mention rate for higher education queries             29%      Skolbot GEO monitoring, Feb. 2026
UK Perplexity mention rate for higher education queries          38%      Skolbot GEO monitoring, Feb. 2026
European average AI mention rate                                 19%      Skolbot GEO monitoring, Feb. 2026
AI visibility gain from Schema.org markup                        +12 pp   Skolbot GEO monitoring, Feb. 2026
Google AI Overviews on informational education queries           64%      Search Engine Land, March 2026
Prospects asking about fees before any other topic               89%      Skolbot chatbot analysis, 12,000 conversations

FAQ

Does being in the Russell Group guarantee AI visibility?

Not automatically. Russell Group institutions benefit from high training-corpus weight due to decades of published research, press coverage, and third-party mentions. However, for specific subject-combination queries — "part-time MSc data science London," "nursing degree with January intake" — a specialist or post-92 institution with strong Schema.org markup and factually dense programme pages can outperform a Russell Group university in AI citation. Niche authority on specific queries is achievable for any institution, regardless of league-table position.

Does AI visibility affect UCAS application volumes directly?

Not with a direct trackable link, but the connection is real. AI tools are now used for shortlisting before UCAS applications are submitted. An institution absent from AI answers is absent from the consideration set of the prospective students who use those tools. Given that 46% of prospective students now use AI in their search, AI invisibility structurally reduces an institution's shortlist appearances — and by extension its application volume from that segment.

What is the fastest way to improve AI visibility for a UK university?

Schema.org structured markup is the fastest single intervention with the highest measurable effect — a +12 percentage-point average gain in AI visibility (Source: Skolbot GEO monitoring, Feb. 2026). Implementation takes two to three days of development work and produces measurable results within four weeks. Programme page content rewrites take longer to influence AI responses (one to three months), and third-party source updates (UCAS, QAA, Guardian) take three to six months to propagate fully.

Does Clearing strategy need to account for AI search?

Yes. Clearing is the highest-urgency search moment in the UCAS calendar. Gen Z applicants in Clearing open ChatGPT or Perplexity before calling any institution's hotline. An institution with a Clearing FAQ page using FAQPage markup, current UCAS availability data, and Schema.org markup for any Clearing-specific programmes stands the best chance of appearing in AI answers during that window. Institutions without this infrastructure are effectively invisible to AI-first Clearing searchers.

How does AI search affect international student recruitment?

International prospective students — who cannot attend UK open days in person and rely heavily on digital research — are among the most frequent users of AI in their university search. For international recruitment, AI visibility is particularly high-stakes. Perplexity's 38% UK mention rate is the relevant benchmark; for queries in languages other than English, AI citation rates drop significantly, making multilingual content and Schema.org markup in multiple languages an emerging priority for UK institutions with international recruitment targets.



Related articles

  • 15 Signals LLMs Evaluate to Recommend Your Institution
  • 90-Day Plan to Get Cited by ChatGPT and Perplexity
  • Perplexity school visibility: audit and optimisation guide
© 2026 Skolbot