What is a PIA & Why Does it Matter?
What is Personal Information at TRU?
Before you use any AI tool, you need to know whether the data you’re working with is considered personal information under BC’s FIPPA — because that determines whether privacy obligations apply.
Under FIPPA, personal information is any recorded information about an identifiable individual — meaning there is a serious possibility someone could be identified from the information, alone or combined with other available data.

The distinction that matters is between personal information, which is protected under FIPPA, and information that is generally not personal information.
Legal Basis — BC FIPPA
Under British Columbia’s Freedom of Information and Protection of Privacy Act (FIPPA), a Privacy Impact Assessment (PIA) is required for university tools and initiatives that handle personal information.
At TRU, PIAs are led by the Information Security Office, with case-by-case participation from the Privacy and Access Office, and focus on proactively mitigating privacy risks.
4-Step PIA Decision Workflow
Before using any AI tool with TRU data, work through these four steps. The classification of your data — not the tool — determines your obligations under FIPPA.
Step 1: What type of data will you use?
- What to consider: Review the TRU Data Classification framework and determine whether your data is Public, Internal, or Confidential.
- Decision: Internal or Confidential → PIA required. Public → continue to Step 2.

Step 2: Is the data Confidential or Internal?
- What to consider: If the data includes personal, student, HR, financial, or operational information, it is likely Internal or Confidential.
- Decision: Yes → PIA required. No (Public only) → continue to Step 3.

Step 3: Is the tool external or cloud-based?
- What to consider: Most AI tools are cloud services under FIPPA. Ask: Is data stored outside Canada? Does the vendor retain or access data?
- Decision: Yes to either → PIA required. No → continue to Step 4.

Step 4: Have you verified the vendor's privacy practices?
- What to consider: Have you reviewed the privacy policy, terms of use, data storage location, and model-training opt-out controls?
- Decision: No or unsure → PIA required. Yes, fully verified → may proceed without a PIA.
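Read end to end, the four steps collapse into a single decision rule: a PIA is required unless the data is Public, the tool keeps data in Canada without vendor retention, and the vendor's practices are fully verified. A minimal sketch in Python (the function and parameter names are illustrative, not part of any TRU system, and the external/cloud question of Step 3 is folded into the storage and retention checks):

```python
def pia_required(classification: str,
                 stored_outside_canada: bool,
                 vendor_retains_data: bool,
                 vendor_verified: bool) -> bool:
    """Return True if a PIA is required before using the AI tool.

    Mirrors the 4-step workflow: data classification first, then the
    cloud/vendor questions, then verification of privacy practices.
    """
    # Steps 1-2: Internal or Confidential data always requires a PIA.
    if classification in ("Internal", "Confidential"):
        return True
    # Step 3: storage outside Canada, or vendor retention/access,
    # requires a PIA even for Public data.
    if stored_outside_canada or vendor_retains_data:
        return True
    # Step 4: unverified vendor privacy practices require a PIA.
    if not vendor_verified:
        return True
    # Public data, Canadian storage, no retention, verified vendor.
    return False
```

Note that the rule fails safe: any "yes" to a risk question, or any unverified vendor, sends the project to a PIA.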
The PIA intake process: step by step
Once you’ve determined a PIA is needed, there is a specific process at TRU. The PIA assessment is led by the Information Security Office with participation from the Privacy and Access Office — but the process starts with you, as the requestor.
TRU uses a Preliminary Privacy Impact Assessment (PPIA) Tool as the first step. If your project involves personal information in any way — including the use of an AI tool that processes student, staff, or operational data — you must complete the PPIA before proceeding. If your project does not involve any personal information, the Privacy Office takes no position and no assessment is required.
AI Model Training & Data Controls
When using any AI tool, you must understand how your data is handled after submission — including whether it may be retained or used to train the underlying AI model.
Opt-in model (safer by default)
Data is NOT used for training unless you explicitly enable it. Preferred for any TRU use involving non-public information.
Opt-out model (verify before use)
Data IS used for training by default unless you actively disable it. You must confirm the setting is off before using the tool with Internal data.
Before using any AI tool for TRU work, confirm:
- Whether submitted data is used for model training, and whether opt-out is available.
- Whether data controls apply at the account or organization level — not just per chat session.
- Where data is stored and processed — must be in Canada or covered by an adequate data agreement.
- Whether the vendor retains or can access submitted data beyond the session.
- Where these settings are documented: the vendor's privacy policy, terms of service, and account data controls.
Do not use these tools with Internal data if:
- The tool does not allow opt-out from model training.
- You cannot determine how data is stored.
- Opt-out controls are unavailable, unclear, or unverifiable.
- Data is processed or stored outside Canada without an adequate agreement.
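Taken together, the checklist and the stop conditions above amount to an all-or-nothing gate: every question must have a known, favourable answer before Internal data touches the tool. A hedged sketch of that gate (the field names are illustrative, and an unknown answer is deliberately treated as a failure):

```python
def safe_for_internal_data(vendor: dict) -> bool:
    """Return True only when every stop condition is cleared.

    Unknown answers (missing keys or None) count as failures:
    if you cannot determine how data is handled, treat the
    tool as unsafe for Internal data.
    """
    checks = [
        vendor.get("training_opt_out_available") is True,
        vendor.get("training_disabled") is True,          # confirmed off, at the account/org level
        vendor.get("storage_location_known") is True,
        vendor.get("in_canada_or_adequate_agreement") is True,
        vendor.get("retention_beyond_session") is False,  # must be an explicit "no"
    ]
    return all(checks)
```

Because each test requires an explicit `True` (or explicit `False` for retention), "unclear or unverifiable" settings fail the gate exactly as the stop conditions demand.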
Responding to a Privacy Breach involving an AI tool
A privacy breach occurs when personal information is collected, used, disclosed, or retained in a way that violates FIPPA. With AI tools, breaches can happen in ways that aren’t always obvious — including accidentally submitting the wrong data or a vendor changing how they handle information.
Common AI-related breach scenarios
- Submitting student records, HR data, or financial information into a tool with no PIA.
- Using a tool after its terms of service changed to enable model training.
- Sharing prompts containing identifiable personal information with an external vendor.
- Vendor suffering a data breach that affects TRU data.
When a breach becomes notifiable
TRU must notify the affected individual(s) and the Office of the Information and Privacy Commissioner for British Columbia (OIPC) without unreasonable delay if the breach could reasonably be expected to cause significant harm — including identity theft, financial loss, or reputational damage to an individual.
If you suspect or discover a privacy breach related to an AI tool, report it to the Privacy and Access Office or the Information Security Office immediately.
Do not wait to report. Delaying a breach report to the Privacy Office or Information Security Office increases the risk of harm to individuals and may complicate TRU’s compliance obligations under FIPPA. When in doubt, report immediately and let the Privacy Office determine the severity — do not make that judgment on your own.
For TRU’s full Privacy Breach procedure, see: Privacy Breaches — TRU Privacy and Access Office