AI Tools PIA Guidance

What is a PIA & Why Does it Matter?

What is Personal Information at TRU?

Before you use any AI tool, you need to know whether the data you’re working with is considered personal information under BC’s FIPPA — because that determines whether privacy obligations apply. 

Under FIPPA, personal information is any recorded information about an identifiable individual — meaning there is a serious possibility someone could be identified from the information, alone or combined with other available data. 

Personal information — protected under FIPPA

  • Student names, email addresses, phone numbers. 
  • Student ID numbers and enrolment records. 
  • Employee performance, salary, or HR records. 
  • Health or medical information.
  • Financial records or banking details. 
  • Home addresses or personal contact details. 
  • Indigenous status or other demographic data. 
  • Research participant data. 

Not personal information (generally)

  • Employee’s name and work email address 
  • Employee’s work phone number or office location 
  • Job title and business contact information
  • Publicly available institutional information 
  • Anonymized or fully de-identified datasets  
  • Aggregate data with no individual link 

Legal Basis — BC FIPPA 


Under British Columbia’s Freedom of Information and Protection of Privacy Act (FIPPA), a Privacy Impact Assessment (PIA) is required for university tools and initiatives that handle personal information.

At TRU, PIAs are led by the Information Security Office, with case-by-case participation from the Privacy and Access Office, and focus on proactively mitigating privacy risks.

4-Step PIA Decision Workflow

Before using any AI tool with TRU data, work through these four steps. The classification of your data — not the tool — determines your obligations under FIPPA. 

1

What type of data will you use?

  • What to consider
    • Review the TRU Data Classification framework and determine whether your data is Public, Internal, or Confidential. 
  • Decision
    • Classification identified → continue to Step 2 

2

Is the data Confidential or Internal?

  • What to consider
    • If the data includes personal, student, HR, financial, or operational information, it is likely Internal or Confidential. 
  • Decision
    • Yes → PIA required 
    • No (Public only) → continue to Step 3 

3

Is the tool external or cloud-based?

  • What to consider
    • Most AI tools are cloud services under FIPPA. Ask: Is data stored outside Canada? Does the vendor retain or access submitted data? 
  • Decision
    • Yes to either → PIA required 
    • No → continue to Step 4 

4

Have you verified the vendor’s privacy practices? 

  • What to consider
    • Have you reviewed the vendor's privacy policy, terms of use, data storage location, and model-training opt-out controls? 
  • Decision
    • No or unsure → PIA required 
    • Yes, fully verified → may proceed without a PIA 

The PIA intake process: step by step

Once you’ve determined a PIA is needed, there is a specific process at TRU. The PIA assessment is led by the Information Security Office with participation from the Privacy and Access Office — but the process starts with you, as the requestor. 

TRU uses a Preliminary Privacy Impact Assessment (PPIA) Tool as the first step. If your project involves personal information in any way — including the use of an AI tool that processes student, staff, or operational data — you must complete the PPIA before proceeding. If your project does not involve any personal information, no privacy assessment is required. 

If you are looking at any online or cloud-based AI tool, start by reviewing TRU’s Cloud Security Standard. This document outlines the processes and responsibilities for procuring a cloud service — which includes virtually all AI tools accessed via the internet. 

→ Contact IT Services or the Information Security Office for the current Cloud Security Standard document 

If your project involves personal information in any manner, you must complete the TRU Preliminary Privacy Impact Assessment Tool. This self-assessment identifies the nature of the data involved, the vendor’s data practices, and whether a full PIA is warranted. 

→ The PPIA Tool is available through the Privacy and Access Office · privacy@tru.ca 

If you would like to implement a new initiative — including a new software tool or AI platform — contact the Director of Information Security. They lead the PIA process and will coordinate with the Privacy and Access Office and your team. 

→ Director of Information Security: jcuzzola@tru.ca 

As the person requesting the tool, you are responsible for understanding how the project uses personal information. You will be expected to provide details about the tool’s purpose, data types involved, vendor terms, and your department’s intended use cases throughout the assessment. 

→ Your responsibility as requestor: understand and document your data use 

Do not use the tool with Internal or Confidential data until the PIA is complete and the tool receives an approved status. If an interim report is issued, review its conditions carefully — some tools may be permitted for limited use under specific constraints while the full assessment is in progress. 

→ Check the tool’s current status on the PIA review list above 

Ready to start a PIA for an AI tool? Contact the Director of Information Security to begin the intake process. 

Start a PIA → jcuzzola@tru.ca 

Remember: It is the client/requestor’s responsibility to understand how their project uses personal information. The Privacy and Access Office and Information Security Office are here to guide you through the process — but they rely on you to bring the tool forward before deployment, not after. 

AI Model Training & Data Controls

When using any AI tool, you must understand how your data is handled after submission — including whether it may be retained or used to train the underlying AI model.

Opt-in training (preferred): data is NOT used for training unless you explicitly enable it. This is the preferred model for any TRU use involving non-public information.

Opt-out training: data IS used for training by default unless you actively disable it. You must confirm the setting is off before using the tool with Internal data.

Before using any AI tool for TRU work, confirm:

  • Whether submitted data is used for model training, and whether opt-out is available. 
  • Whether data controls apply at the account or organization level — not just per chat session.
  • Where data is stored and processed — must be in Canada or covered by an adequate data agreement.
  • Whether the vendor retains or can access submitted data beyond the session.
  • Settings are documented in: vendor privacy policy, terms of service, and account data controls.

Do not use these tools with Internal data if:

  • The tool does not allow opt-out from model training.
  • You cannot determine how data is stored.
  • Opt-out controls are unavailable, unclear, or unverifiable.
  • Data is processed or stored outside Canada without an adequate agreement. 
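The checks above can be treated as a go/no-go checklist: if any condition fails, the tool is not suitable for Internal data. A minimal sketch, using hypothetical field names rather than any real vendor setting:

```python
# Hypothetical pre-use checklist for an AI tool's data controls.
# Field names are illustrative; verify each item against the vendor's
# actual privacy policy, terms of service, and account settings.

from dataclasses import dataclass

@dataclass
class ToolDataControls:
    training_opt_out_available: bool   # can you opt out of model training?
    opt_out_verified: bool             # confirmed off at account/org level?
    storage_location_known: bool       # do you know where data is stored?
    stored_in_canada_or_covered: bool  # in Canada, or adequate agreement?
    vendor_retention_understood: bool  # retention beyond the session known?

def safe_for_internal_data(c: ToolDataControls) -> bool:
    """True only if none of the 'do not use' conditions apply."""
    return (c.training_opt_out_available
            and c.opt_out_verified
            and c.storage_location_known
            and c.stored_in_canada_or_covered
            and c.vendor_retention_understood)
```

The design point is that every field must be affirmatively true: "unclear or unverifiable" is treated the same as "no".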

Responding to a Privacy Breach involving an AI tool 

A privacy breach occurs when personal information is collected, used, disclosed, or retained in a way that violates FIPPA. With AI tools, breaches can happen in ways that aren’t always obvious — including accidentally submitting the wrong data or a vendor changing how they handle information. 

Common AI-related breach scenarios 

  • Submitting student records, HR data, or financial information into a tool with no PIA.
  • Using a tool after its terms of service changed to enable model training.
  • Sharing prompts containing identifiable personal information with an external vendor.
  • Vendor suffering a data breach that affects TRU data.

When a breach becomes notifiable

TRU must notify the affected individual(s) and the BC Privacy Commissioner without unreasonable delay if the breach could reasonably be expected to cause significant harm — including identity theft, financial loss, or reputational damage to an individual. 

If you suspect or discover a privacy breach related to an AI tool, take these steps immediately: 

Cease any further processing of personal information through the affected tool. Do not attempt to delete data from the vendor’s systems without guidance — this can interfere with the formal response process. 

Notify your supervisor and contact the Director of Information Security as soon as possible. Time matters — delays can increase the scope of harm and complicate TRU’s legal obligations. 

→ Director of Information Security: jcuzzola@tru.ca 

They will determine whether the breach could reasonably be expected to result in significant harm to an individual. This assessment drives next steps, including whether formal notification is required. 

Where significant harm is determined possible, the General Counsel (as head of the public body under FIPPA) is responsible for notifying both the affected individual(s) and the BC Privacy Commissioner without unreasonable delay. The Privacy and Access Officer carries out these notifications. 

All breaches should be documented. After the incident is managed, work with the Information Security Office and the Privacy and Access Office to understand how the breach occurred and what changes — including updating or revoking tool access — are needed to prevent recurrence. 

Do not wait to report. Delaying a breach report to the Privacy Office or Information Security Office increases the risk of harm to individuals and may complicate TRU’s compliance obligations under FIPPA. When in doubt, report immediately and let the Privacy Office determine the severity — do not make that judgment on your own. 

For TRU’s full Privacy Breach procedure, see: Privacy Breaches — TRU Privacy and Access Office

TRU Privacy Resources