A privacy audit is not guesswork: it requires a structured checklist
A privacy audit in a college or university is a methodical inventory of what you collect, why you collect it, how you store it, and what you do with it. Without a structured checklist, gaps are guaranteed: the college fair spreadsheet nobody anonymized, the DPA never signed with the chatbot provider, prospect conversations stored without a retention period.
62% of institutions have no documented procedure for processing prospect data (Source: Skolbot survey of 62 marketing managers at higher education institutions, December 2025). This 20-point checklist covers the full scope of privacy compliance for a private college or university, including AI governance obligations. Every point is actionable and prioritized.
For the broader framework, consult our complete guide to student data protection.
Part 1 – Governance and legal basis (points 1 to 5)
1. Verify your institution's FERPA compliance officer
A FERPA compliance officer is essential for any institution receiving federal funding under Title IV. What the audit checks: has a compliance officer been formally designated? Do they have direct access to senior leadership? Do they hold a decision-making role (CIO, General Counsel) that would create a conflict of interest with their oversight duties?
Action: verify the compliance officer's designation, confirm their independence, and ensure any Department of Education notifications are current.
2. Compile and update the record of processing activities
A comprehensive data inventory is the cornerstone of privacy compliance. It must list every personal data processing operation: purpose, data categories, legal basis, retention periods, and recipients. The FTC and state attorneys general expect institutions to maintain current records.
Action: review each department (admissions, registrar, marketing, IT, finance) and verify that their processing activities appear in the record with an explicit legal basis.
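As a starting point, the record can live in a simple machine-readable structure. The sketch below is a minimal illustration in Python; the field names and values are assumptions, not a standard, and should be adapted to your institution's record:

```python
from dataclasses import dataclass, field

# Illustrative structure for one entry in the record of processing
# activities; field names and values are examples, not a standard.
@dataclass
class ProcessingActivity:
    department: str            # e.g. "Admissions"
    purpose: str               # why the data is processed
    data_categories: list      # what is collected
    legal_basis: str           # contract / legal obligation / legitimate interest / consent
    retention: str             # e.g. "3 years after last contact"
    recipients: list = field(default_factory=list)  # internal teams and vendors

entry = ProcessingActivity(
    department="Admissions",
    purpose="Respond to prospect inquiries via chatbot",
    data_categories=["name", "email", "program of interest"],
    legal_basis="legitimate interest",
    retention="3 years after last contact",
    recipients=["CRM vendor", "chatbot provider"],
)

# An entry with no legal basis is an immediate audit finding (see point 3).
assert entry.legal_basis, "every processing activity needs an explicit legal basis"
```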
3. Validate the legal basis for each processing activity
Four legal bases cover 95% of a college's processing activities: performance of a contract (enrollment, billing), legal obligation (submissions to accreditors, degree records), legitimate interest (marketing, analytics), and consent (newsletters, cookies). The classic mistake: basing everything on consent, which can be withdrawn at any time.
Action: for each entry in the record, verify the legal basis is correct. Migrate processing activities incorrectly based on consent to the appropriate basis.
4. Verify the existence and quality of Privacy Impact Assessments (PIAs)
Privacy Impact Assessments are mandated for federal agencies under OMB Circular A-130, and state privacy laws such as the CCPA/CPRA require risk assessments for high-risk processing; colleges should apply the same standard. For a college, high-risk processing includes at minimum: deploying an AI chatbot, using AI tools for admissions decisions, campus video surveillance, and marketing profiling.
Action: list all high-risk processing activities, verify a PIA exists for each, and confirm it is current (less than 2 years old or updated after any modification to the processing).
5. Document data subject rights response procedures
FERPA grants students specific rights (access to education records, amendment, consent for disclosure). State laws like CCPA add further rights (access, deletion, opt-out of sale). Your institution must be able to respond to each within mandated timeframes. The cost of acquisition per student ranges from USD 1,800 to USD 3,000 in the US (Source: estimates from NACAC, EAB, Ruffalo Noel Levitz); every deletion request therefore represents a measurable loss of marketing investment.
Action: test the procedure by simulating a deletion request. Measure the actual response time and the number of systems involved (CRM, chatbot, email platform, analytics, backups).
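A minimal way to run such a simulation is a small script that records when each system confirmed erasure against the statutory deadline. The sketch below is hypothetical: the system names, dates, and the 45-day window (the CCPA response timeframe) are stand-ins for your own systems and applicable deadline:

```python
from datetime import datetime, timedelta

# Hypothetical tracker for a simulated deletion request: which systems
# hold the prospect's data, and when each one confirmed erasure.
SYSTEMS = ["CRM", "chatbot", "email platform", "analytics", "backups"]
request_received = datetime(2026, 3, 2)
confirmed = {
    "CRM": datetime(2026, 3, 4),
    "chatbot": datetime(2026, 3, 3),
    "email platform": datetime(2026, 3, 9),
    # "analytics" and "backups" have not confirmed yet: audit finding
}

deadline = request_received + timedelta(days=45)  # e.g. the CCPA 45-day window
for system in SYSTEMS:
    done = confirmed.get(system)
    status = f"done in {(done - request_received).days} days" if done else "PENDING"
    print(f"{system:15s} {status}")
print("deadline:", deadline.date(),
      "| outstanding:", [s for s in SYSTEMS if s not in confirmed])
```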
Part 2 – Collection and consent (points 6 to 10)
6. Audit every data collection point
Personal data enters your system through dozens of channels: website forms, chatbot, campus visit registration, college fairs, Common App/Coalition Application platforms, unsolicited applications, phone calls. The audit must inventory every one.
89% of prospects ask a question about tuition and fees and 78% ask about internships and co-ops (Source: analysis of 12,000 Skolbot chatbot conversations, Sept 2025 – Feb 2026). These conversations generate personal data the moment an identifier is associated with them.
Action: map every form, chatbot, and physical collection point. For each, verify: what data is collected? Is the prospect informed? Is the legal basis displayed?
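For the website portion of the map, a first pass can be automated. The sketch below assumes the third-party requests and beautifulsoup4 packages, and the URLs are placeholders; it lists every form and its input fields so each can be checked against the record:

```python
import requests
from bs4 import BeautifulSoup  # pip install requests beautifulsoup4

# Hypothetical pages to scan; replace with your own sitemap or crawl.
PAGES = [
    "https://www.example.edu/apply",
    "https://www.example.edu/visit",
    "https://www.example.edu/request-info",
]

for url in PAGES:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for form in soup.find_all("form"):
        fields = [i.get("name") for i in form.find_all("input") if i.get("name")]
        # Each form found here must appear in the collection-point map,
        # with its purpose, legal basis, and privacy notice documented.
        print(url, "->", form.get("action"), fields)
```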
7. Check the compliance of consent forms
Under the CCPA and other state consumer privacy laws, consent must be freely given, specific, informed, and unambiguous: no pre-checked boxes, no bundled consent, and no conditioning access to information on the provision of data.
Action: audit every form. Verify that marketing boxes are unchecked by default, the text distinguishes each purpose, and a link to the privacy policy is visible.
8. Verify cookie consent management
While the US has no federal cookie law equivalent to the ePrivacy Directive, state laws like the CCPA and CPRA require opt-out mechanisms for tracking, and several states (Virginia, Colorado, Connecticut) have enacted comprehensive privacy laws. The audit checks: does your cookie banner offer a clear opt-out? Are tracking technologies properly disclosed? Is proof of consent retained?
Action: test the site with a clean browser. Verify that Google Analytics, Meta pixels, and other trackers respect user opt-out preferences.
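A rough first-pass check can be scripted: fetch the page with a fresh session (no cookies, no stored consent) and look for known tracker hostnames. The sketch below does not execute JavaScript, so it complements rather than replaces a real-browser test; the URL and tracker list are examples:

```python
import requests

# Known tracker hostnames to look for; extend with your own list.
TRACKERS = ["google-analytics.com", "googletagmanager.com",
            "connect.facebook.net", "doubleclick.net"]

# Hypothetical URL; a fresh HTTP request approximates a clean browser
# (no cookies, no stored consent state).
html = requests.get("https://www.example.edu", timeout=10).text

# Any tracker present in the initial page load fires before the visitor
# has had a chance to exercise an opt-out; flag it for review.
for tracker in TRACKERS:
    if tracker in html:
        print(f"FLAG: {tracker} loads before any consent interaction")
```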
9. Verify the processing of minors' data
Under COPPA (Children's Online Privacy Protection Act), children under 13 require verifiable parental consent. Many community colleges and dual-enrollment programs admit 16- and 17-year-olds whose data requires additional protections under state laws.
Action: verify that forms and the chatbot identify minors and trigger appropriate verification mechanisms (parental email, double opt-in).
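The age-gate logic itself is simple; what matters is that it reliably triggers the parental flow. A minimal sketch, assuming an 18-year threshold (set the threshold per your state's rules):

```python
from datetime import date

PARENTAL_CONSENT_AGE = 18  # assumption: adjust to your state's requirements

def requires_parental_consent(birthdate: date, today: date | None = None) -> bool:
    """Return True when the prospect is a minor and the parental
    double opt-in flow must run before any marketing processing."""
    today = today or date.today()
    age = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day))
    return age < PARENTAL_CONSENT_AGE

# Example: a 17-year-old dual-enrollment prospect triggers the flow.
print(requires_parental_consent(date(2008, 5, 1), today=date(2026, 3, 1)))  # True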
10. Check data minimization
The data minimization principle requires collecting only what is strictly necessary. A chatbot should not require a name and email to answer a question about programs.
Action: for each form, list the mandatory fields and verify they are justified by the stated purpose.
Part 3 – Storage and security (points 11 to 15)
11. Verify retention periods and automated purging
Best practices recommend 3 years after last contact for prospects, 7 years for financial records, and permanent retention for academic transcripts. The audit verifies that purging actually happens rather than existing only on paper.
Action: query the database. Are prospects older than 3 years still present? If so, automated purging is not working.
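The check translates directly into a query against the prospect table. A sketch, assuming a SQLite-style database with a prospects table and a last_contact column (adapt the schema and SQL dialect to your actual CRM):

```python
import sqlite3  # stand-in for your actual CRM/marketing database

# Hypothetical schema: a `prospects` table with a `last_contact` date.
# Any row older than the 3-year retention period should have been purged.
conn = sqlite3.connect("crm.db")
stale = conn.execute(
    """
    SELECT COUNT(*) FROM prospects
    WHERE last_contact < date('now', '-3 years')
    """
).fetchone()[0]

# A non-zero count means automated purging is not actually running.
print(f"prospects past the 3-year retention period: {stale}")
```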
12. Check encryption in transit and at rest
The audit verifies encryption in transit (TLS 1.3) and at rest (AES-256) across the entire chain: website, APIs, databases, and backups.
Action: check the SSL certificate of each endpoint via SSL Labs. Confirm at-rest encryption on the database.
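The in-transit half of the check can be scripted with the standard library. A sketch (the hostnames are placeholders) that reports the negotiated TLS version and certificate expiry for each endpoint:

```python
import socket
import ssl
from datetime import datetime

# Hypothetical endpoints; list every host that handles personal data.
ENDPOINTS = ["www.example.edu", "api.example.edu"]

context = ssl.create_default_context()
for host in ENDPOINTS:
    with socket.create_connection((host, 443), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
            expires = datetime.strptime(cert["notAfter"], "%b %d %H:%M:%S %Y %Z")
            # Flag anything below TLS 1.2 and certificates near expiry.
            print(host, tls.version(), "expires", expires.date())
```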
13. Verify data hosting jurisdiction
Under FERPA and state privacy laws, institutions should understand where their data is hosted. Many institutions require US-based hosting. International transfers require appropriate safeguards.
Action: list every service that processes personal data (hosting provider, CRM, email platform, analytics, chatbot). For each, verify the server location and the existence of appropriate data processing agreements.
14. Audit access controls and logging
Who has access to which data, and since when? The audit verifies that access follows the principle of least privilege and that access is logged.
Action: extract the list of users with access to the CRM, student information system, and email platform. Verify that accounts of former staff are deactivated. Confirm that access logs are retained and usable.
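Cross-referencing the access list against the HR roster is easy to automate. A sketch, assuming two CSV exports with an email column (the file and column names are hypothetical):

```python
import csv

# Hypothetical exports: active accounts from the CRM and the HR roster
# of current staff. File and column names are assumptions.
def emails(path: str, column: str) -> set:
    with open(path, newline="") as f:
        return {row[column].strip().lower() for row in csv.DictReader(f)}

crm_users = emails("crm_users.csv", "email")
active_staff = emails("hr_active_staff.csv", "email")

# Any CRM account without a matching active staff member is a finding:
# likely a former employee whose access was never deactivated.
orphaned = crm_users - active_staff
for email in sorted(orphaned):
    print("REVIEW:", email)
```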
15. Test backups and restoration
Backups must be encrypted, regular, and, crucially, tested. A backup that has never been successfully restored is not a backup.
Action: ask for the date of the last restoration test. If it is more than 6 months ago (or has never occurred), schedule one immediately.
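One simple way to validate a restoration test is to compare table row counts between the restored copy and a snapshot taken at backup time. A sketch assuming SQLite-format files (adapt to your actual database engine):

```python
import sqlite3

# Hypothetical check after a restoration test: the restored copy should
# contain the same tables and row counts as the source at backup time.
def row_counts(db_path: str) -> dict:
    conn = sqlite3.connect(db_path)
    tables = [r[0] for r in conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table'")]
    return {t: conn.execute(f"SELECT COUNT(*) FROM {t}").fetchone()[0]
            for t in tables}

source = row_counts("snapshot_at_backup.db")
restored = row_counts("restored_from_backup.db")

# Any mismatch means the backup or the restoration procedure is broken.
for table in source:
    if restored.get(table) != source[table]:
        print(f"MISMATCH in {table}: {source[table]} vs {restored.get(table)}")
```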
Part 4 – Vendors and transfers (points 16 to 18)
16. Verify DPAs (Data Processing Agreements) with each vendor
Every vendor processing data on your behalf requires a data processing agreement: hosting, CRM, email platform, chatbot, analytics, video conferencing. The DPA specifies: subject matter, duration, data categories, the parties' obligations, and the rules for onward sub-processing.
Action: list all vendors. Verify that a signed, up-to-date DPA exists. Prioritize high-volume processors (CRM, chatbot) and those handling sensitive data.
17. Check international data transfers
If any vendor stores data outside the US, verify the transfer mechanism and applicable protections. For institutions subject to state privacy laws, data localization requirements may apply.
Action: for each vendor from point 16, verify the server location and the transfer mechanism. A vendor that cannot document its transfer safeguards is a compliance risk.
18. Audit your vendors' subcontractors
Due diligence requires knowledge of onward sub-processors. Does your CRM use AWS? Does your chatbot rely on an AI model hosted by a third party? These chains must be documented.
Action: ask each vendor for their list of onward sub-processors. Verify equivalent safeguards.
Part 5 – AI and specific obligations (points 19 to 20)
19. Classify your AI systems under emerging AI governance frameworks
The NIST AI Risk Management Framework and the White House AI Executive Order are shaping AI governance in the US. For a college, the main categories are:
- High risk – application scoring, automated grading, admissions decision support. Obligations: risk management, human oversight, transparency.
- Limited risk – informational chatbot, FAQ assistant. Primary obligation: inform the prospect that they are interacting with an AI.
Several states are introducing AI-specific legislation. Institutions using AI tools for application screening must prepare now.
For detailed obligations by category, see our article on AI governance and higher education.
Action: compile an inventory of all AI systems used in the institution (chatbot, scoring, plagiarism detection, recommendation engine). Classify each by risk level. For high-risk systems, verify the existence of a compliance dossier.
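The inventory can start as a simple structured list that maps each system to its risk level and the resulting obligations. A sketch with hypothetical entries:

```python
# Hypothetical AI-system inventory; names and risk levels are examples.
# High-risk systems need a compliance dossier (risk management, human
# oversight, transparency); limited-risk systems need an AI disclosure.
AI_SYSTEMS = [
    {"name": "Admissions chatbot", "use": "FAQ assistant", "risk": "limited"},
    {"name": "Application scorer", "use": "admissions decision support", "risk": "high"},
    {"name": "Plagiarism detector", "use": "academic integrity", "risk": "high"},
]

for system in AI_SYSTEMS:
    if system["risk"] == "high":
        print(f"{system['name']}: compliance dossier + human oversight required")
    else:
        print(f"{system['name']}: disclose AI interaction to the prospect")
```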
20. Verify algorithmic transparency and human oversight
Emerging AI governance frameworks and FERPA (for automated decisions affecting education records) converge: any automated decision with a significant effect (admission, exclusion, scholarship) requires effective human oversight. The AI recommends; a human decides.
Action: for each high-risk AI system, verify: (a) documented human oversight, (b) prospect/student notification, (c) a functioning objection procedure.
Summary: checklist overview table
| # | Audit point | Domain | Priority | Frequency |
|---|---|---|---|---|
| 1 | Compliance officer designation | Governance | Critical | Annual |
| 2 | Up-to-date record of processing activities | Governance | Critical | Semi-annual |
| 3 | Legal basis per processing activity | Governance | Critical | On each new processing activity |
| 4 | Privacy Impact Assessments (PIAs) | Governance | High | Annual or on modification |
| 5 | Data subject rights procedures | Governance | High | Annual + simulated test |
| 6 | Mapping of data collection points | Collection | High | Semi-annual |
| 7 | Consent form compliance | Collection | Critical | Quarterly |
| 8 | Cookie/tracking consent management | Collection | Critical | Quarterly |
| 9 | Processing of minors' data | Collection | High | Annual |
| 10 | Data minimization | Collection | Medium | Semi-annual |
| 11 | Retention periods and purging | Storage | Critical | Semi-annual |
| 12 | Encryption in transit and at rest | Security | Critical | Annual |
| 13 | Data hosting jurisdiction | Security | High | On each new provider |
| 14 | Access controls and logging | Security | High | Quarterly |
| 15 | Backups and restoration | Security | High | Semi-annual |
| 16 | DPAs with vendors | Vendor management | Critical | Annual |
| 17 | International transfers | Vendor management | High | On each new provider |
| 18 | Vendor sub-processors | Vendor management | Medium | Annual |
| 19 | AI governance classification | AI | High | Annual |
| 20 | Algorithmic transparency | AI | High | Annual |
How to organize the audit in practice
The audit involves at least four stakeholders: the compliance officer, the VP of enrollment management, IT, and the CMO. Schedule: full annual audit (20 points) + quarterly checks on critical points (consent, cookies, access). Each audited point produces a record: result (compliant / non-compliant / partial), evidence, and corrective action. This is the first thing the FTC or your state attorney general will ask for in an investigation.
For the technical measures to protect prospect data, see our dedicated guide.
FAQ
How long does a complete privacy audit take for a college?
Between 3 and 6 weeks depending on the size of the institution and the maturity of its data protection framework. Institutions that already have an up-to-date processing record and an active compliance officer save time. The longest phase is the vendor audit (points 16 to 18), as it depends on provider response times.
Is a specific audit required if the institution uses an AI chatbot?
Yes. An AI chatbot constitutes a distinct processing activity that must appear in the record. If the language model is hosted by a third party, points 13 and 17 are directly affected. Emerging AI governance frameworks add the obligation to inform the prospect they are interacting with an AI. 91% of visitors to an institution's website leave without making first contact (Source: Skolbot funnel analysis, 30 institutions, 2025-2026 cohort); the chatbot is often the only collection point before the application, making its compliance critical.
What are the penalties for privacy non-compliance for a college?
Under FERPA, institutions risk losing federal funding, a catastrophic outcome for any Title IV institution. CCPA violations can result in civil penalties of up to USD 2,500 per unintentional violation and up to USD 7,500 per intentional violation. State attorneys general can pursue additional enforcement. Beyond fines, a public enforcement action damages reputation with prospects and their parents.
Does the privacy audit also cover AI governance obligations?
Not by itself. Points 19 and 20 of this checklist extend the scope to AI classification and algorithmic transparency. Privacy law and AI governance are complementary: one protects the data, the other regulates the systems that process it. An integrated audit avoids duplication. For details, see our AI governance guide.
This 20-point checklist is the foundation of every audit cycle. Institutions that integrate it into their annual governance reduce their exposure to penalties and strengthen prospect trust.
Also read: AI Chatbot Comparison for Higher Education



