
AI Policy (Use of Text Generators)

Company: EduLinked Pty Ltd

Contact: Sarah Ailish McLoughlin, Director — founder@edulinked.com.au

Intent and Scope

This policy applies to the use of all forms of AI text generators within EduLinked Pty Ltd, including, but not limited to, ChatGPT and Bard. This includes any software that incorporates an AI text generator.

This policy applies to all EduLinked Pty Ltd employees, contractors, volunteers and vendors, and to anyone else who has any form of access to EduLinked Pty Ltd systems, software and hardware.

External Party Requirements

All contractors, volunteers and vendors must sign a written acknowledgment of this policy and complete mandatory AI policy training before being granted system access. Such acknowledgment must include certification that they will not submit NDIS participant data to external AI systems. EduLinked Pty Ltd retains audit rights to verify compliance with data handling requirements, and external parties must report any suspected breaches within 24 hours. Non-compliance may result in contract termination and regulatory notification, with records of acknowledgments and training completion maintained for a minimum of 7 years.

External parties must maintain their own records demonstrating compliance with this policy, including logs of system access, data handling procedures, and training completion certificates. These records must be made available to EduLinked Pty Ltd upon request within 24 hours and must be retained for the same minimum period as required by EduLinked Pty Ltd. Failure to maintain or produce such records constitutes a material breach of contract and may result in immediate suspension of system access pending investigation.

Quarterly Audits

EduLinked Pty Ltd will conduct quarterly audits of external party compliance through documented record review and spot-check verification of system access logs, data handling procedures, and training certifications. All external party acknowledgments, training completion records, and audit findings must be documented in a centralised repository maintained by EduLinked Pty Ltd for regulatory review purposes, with all records retained for a minimum of 7 years in accordance with NDIS regulatory requirements.

EduLinked Pty Ltd recognises the value of AI text generators for gathering information and efficiently producing copy, as well as for other aspects of knowledge work. EduLinked Pty Ltd seeks to balance these benefits against the risks associated with the technology.

EduLinked Pty Ltd's highest priority remains the protection of company information and customer data, as well as ensuring legislative and regulatory compliance regarding security and privacy. We also seek to ensure that uses of AI text generators do not compromise the quality of work output, and that we appropriately disclose the use of such generators both to our customers and across the company.

Acceptable Use

EduLinked Pty Ltd recognises the potential benefits to the business of incorporating AI text generators into internal workflows. EduLinked Pty Ltd gives users discretion to decide whether the use of a generator for an internal workflow is appropriate, as long as all such use is in line with this policy, any other company policies, and EduLinked Pty Ltd's values.

As the capabilities and use cases of AI text generators are rapidly evolving, it is important for users to decide carefully whether it is appropriate to integrate AI text generators into their work. If in doubt, contact your manager, or Sarah Ailish McLoughlin, Director (founder@edulinked.com.au).

Role-Specific Guidance

Support Coordinators
  • Permitted: administrative tasks including scheduling communications, formatting documentation, and drafting routine internal correspondence
  • Prohibited: developing support plans, service goals, accommodation recommendations, crisis response, or participant-facing advice without independent professional validation

Clinical & Assessment Staff
  • Permitted: administrative documentation formatting and internal workflow management
  • Prohibited: clinical assessments, diagnoses, health recommendations, or professional advice

Administrative Staff
  • Permitted: internal communications and document formatting (provided all data security requirements are met)
  • Prohibited: inputting personal or sensitive participant information without prior authorisation

Prohibited Uses

Users must not use AI text generators for:

  • Clinical assessments, diagnoses, or health recommendations
  • Developing or modifying participant support plans, service goals, or accommodation recommendations
  • Crisis response, incident assessment, or safeguarding decisions
  • Generating communications that constitute professional advice or clinical guidance to participants or external parties
  • Any decision-making that directly affects participant welfare, safety, or service delivery without independent validation by qualified staff

Risk Assessment Requirements

Before implementing AI text generators in any workflow, users must complete a basic risk assessment considering:

  1. Data sensitivity and privacy implications
  2. Accuracy requirements of the output
  3. Potential impact on service quality
  4. Compliance with existing workflows

Document this assessment and store it according to EduLinked Pty Ltd's record-keeping practices.

All risk assessments must be completed using EduLinked Pty Ltd's standardised risk assessment template and submitted to the user's manager or compliance officer for review and approval before AI implementation commences. Risk assessments involving participant data, service delivery decisions, or clinical workflows must be escalated to Sarah Ailish McLoughlin, Director for approval. All completed risk assessments must be stored in EduLinked Pty Ltd's centralised risk assessment repository, with records maintained for a minimum of 7 years.

Quarterly Risk Assessment Review

EduLinked Pty Ltd must establish a quarterly risk assessment review process to identify:

  • Patterns of high-risk AI implementation across teams or workflows
  • Systemic vulnerabilities requiring policy amendments or additional controls
  • Teams or individuals requiring supplementary training or support
  • Emerging risks from new AI technologies or use cases

The quarterly analysis must be documented in a written report submitted to the Director within 14 days of quarter end.

Content Retention

Users must ensure that AI-generated content is retained only for the minimum period necessary for its intended purpose:

  • Content used in participant service delivery must be retained in accordance with NDIS record-keeping requirements
  • Content used for internal administrative purposes should be deleted within 90 days unless required for compliance or legal purposes
  • All AI-generated content containing personal or sensitive information must be securely destroyed using cryptographic erasure or certified destruction methods

EduLinked Pty Ltd must maintain a centralised deletion register documenting all AI-generated content destruction. Deletion certifications must be retained for 7 years.
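The 90-day rule for internal administrative content can be expressed as a simple retention check. This is a minimal sketch only: the function name, record shape, and hold flag are illustrative assumptions, not part of any actual EduLinked system.

```python
from datetime import date, timedelta

# Retention period for internal administrative AI-generated content,
# per the policy's 90-day rule (assumption: measured in calendar days).
ADMIN_RETENTION = timedelta(days=90)

def due_for_deletion(created: date, today: date, compliance_hold: bool = False) -> bool:
    """Internal administrative AI content is deleted after 90 days,
    unless it is held for compliance or legal purposes."""
    return not compliance_hold and (today - created) > ADMIN_RETENTION
```

A workflow using this check would still need to record each deletion in the centralised deletion register described above.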

Where the use of AI text generators may change the nature of how a communication is made or how a service is provided to a client or an external party, contact Sarah Ailish McLoughlin, Director, to discuss whether the change is appropriate.

Security

Text input into AI text generators carries the same risks as sharing information with any other third party.

Additionally, text submitted to an AI text generator is often added to the training data of the generator. Therefore, there is a risk of the data, or aspects of it, being exposed to other users of the generator.

Data input into an AI text generator should be treated as an external disclosure, and as such falls under EduLinked Pty Ltd's policy regarding data confidentiality and security. This policy can be found at https://www.edulinked.com.au/policies/data-security-confidentiality.

Before Submitting Content to AI Text Generators

Prior to submitting any content to AI text generators, employees must:

  1. Remove all personally identifiable information
  2. Redact confidential business data
  3. Sanitise proprietary details
  4. Maintain a log of all AI system interactions

Any potential data breach or unauthorised disclosure must be reported immediately to the IT security team.
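The sanitisation steps above can be partially automated before anything is pasted into a generator. The sketch below is a hedged illustration only: the patterns catch a few obvious formats (emails, rough phone shapes, d/m/y dates), and real PII detection — especially names — requires a vetted tool plus the human review this policy already mandates.

```python
import re

# Illustrative redaction patterns; NOT a complete or reliable PII detector.
REDACTIONS = [
    (re.compile(r"\b\d{2,4}[ -]?\d{3}[ -]?\d{3}\b"), "[PHONE]"),  # rough phone-number shape
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "[EMAIL]"),      # email addresses
    (re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"), "[DATE]"),       # d/m/y dates
]

def sanitise(text: str) -> str:
    """Apply each redaction pattern in turn; a starting point, not a guarantee."""
    for pattern, placeholder in REDACTIONS:
        text = pattern.sub(placeholder, text)
    return text
```

Anything that passes this filter must still be manually checked, and NDIS participant information must never be submitted at all, sanitised or not.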

NDIS Participant Data — Strictly Prohibited

Employees must not submit any NDIS participant information to AI text generators under any circumstances, including but not limited to:

  • Participant names
  • Service plans
  • Health records
  • Support needs assessments
  • Incident reports
  • Any other information that could identify or relate to NDIS participants

All work involving NDIS participant data must be conducted using EduLinked's secure, compliant systems in accordance with the NDIS Act 2013, Privacy Act 1988, and EduLinked Pty Ltd's NDIS Privacy Policy.

Validation

AI text generators should not be used in a manner that lowers the quality of output, whether that output is external (directed to EduLinked Pty Ltd's customers) or internal (part of EduLinked Pty Ltd's operations).

Where this level of quality control cannot be achieved while still producing output more efficiently, AI text generators should not be used. For example, if an employee uses a generator to gather information on a topic for a report, the employee must still validate the accuracy and reliability of that information and produce sources for it. If this requires more work than researching the topic without the aid of an AI text generator, a generator should not be used.

Three-Point Validation Process

All AI-generated content must undergo the following validation process:

  1. Fact-checking against at least 2 authoritative sources
  2. Quality assessment against EduLinked Pty Ltd's quality benchmarks including accuracy, relevance, and completeness
  3. Human review by a qualified team member

Each validation step must be documented with a standardised checklist, and content failing any validation criteria must be either revised or discarded. The time required for this validation process should be factored into the assessment of whether using AI generators improves overall efficiency.
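The standardised checklist can be modelled as a simple record that only passes when all three steps succeed. This is a hypothetical sketch: the class and field names are assumptions, not EduLinked's actual template, though the two-source minimum and human-review requirement mirror the policy text.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ValidationRecord:
    """One checklist entry for a piece of AI-generated content (illustrative)."""
    content_id: str
    sources_checked: list     # authoritative sources used for fact-checking
    quality_ok: bool          # met accuracy, relevance and completeness benchmarks
    reviewer: str             # qualified team member who performed the human review
    reviewed_on: date = field(default_factory=date.today)

    def passes(self) -> bool:
        # Content is usable only if all three validation steps succeed:
        # >= 2 sources, quality benchmarks met, and a named human reviewer.
        return len(self.sources_checked) >= 2 and self.quality_ok and bool(self.reviewer)
```

Content whose record fails `passes()` would be revised or discarded, as the policy requires.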

Validator Qualifications

Validators must possess relevant professional qualifications and expertise appropriate to the content type being reviewed:

  • Support coordination content must be validated by qualified support coordinators
  • Clinical or therapeutic content by appropriately credentialed clinical staff
  • Participant-facing materials by staff with demonstrated expertise in disability-affirming communication and NDIS Practice Standards

All validators must complete training in accessibility requirements, plain language principles, and participant-centred practice before conducting validation activities.

Transparency

The use of AI text generators when collaborating or providing services raises ethical issues: people may legitimately expect that they are interacting with a real person, or receiving text that was produced by a real person without the aid of a generator.

Wherever an AI text generator is used when collaborating with someone outside your team, or when providing a service to a customer, its use should be disclosed to that person.

Working Remotely

This policy also applies when working remotely.

Training and Support

All users are required to undergo training on the responsible and effective use of AI text generators, as provided by EduLinked Pty Ltd.

Training Requirements

  • Training shall be conducted annually, with mandatory completion of assessment modules and certification
  • Records of training completion and assessment results must be maintained for a period of 7 years
  • Role-specific training modules must be completed based on staff responsibilities
  • All staff must achieve a minimum assessment score of 80% and obtain certification before being granted access to AI systems
  • New staff must complete training before commencing duties involving AI tools

Annual Refresher Training

Annual refresher training must be completed by all staff, incorporating updates to:

  • NDIS Code of Conduct requirements
  • Privacy Act obligations
  • AI system capabilities

Training completion rates must be tracked and reported quarterly to management, with staff failing to complete required training within 30 days of the due date subject to suspension of AI system access until training is completed.

Disciplinary Action

If this policy is breached, disciplinary action will be taken according to the following procedure:

  • Incidents will be assessed on a case-by-case basis
  • Breaches that are intentional or repeated, or that cause direct harm to EduLinked Pty Ltd, may result in serious disciplinary action
  • Depending on the gravity of the breach, formal warnings may be issued to the offending employee
  • Employees must report suspected policy violations to their immediate supervisor or HR department within 2 business days of becoming aware of the incident

Data Breach Reporting

Any suspected data breach or unauthorised disclosure involving AI systems or NDIS participant information must be reported:

  • To the Director and IT security team immediately, and no later than 24 hours after discovery
  • Written incident report submitted within 2 business days documenting the nature of the breach, affected data, and remedial actions taken

Breach Involving NDIS Participant Data

Following any breach involving NDIS participant data:

  • Conduct root cause analysis within 5 business days
  • Implement corrective actions to prevent recurrence
  • Document all findings in a breach review report submitted to the Director
  • All breach incident records retained for 7 years
  • Notify the NDIA within 5 business days using the NDIA breach notification template
  • Notify affected participants within 7 days using accessible formats appropriate to their communication needs

Compliance Audits

The organisation must conduct quarterly compliance audits of AI system usage, data handling practices, and disclosure documentation. A designated compliance officer is responsible for:

  • Reviewing access logs, validation records, and participant disclosure compliance
  • Documenting findings and corrective actions
  • Submitting an annual summary report to the Director for governance review

All audit records retained for a minimum of 7 years.

Review

EduLinked Pty Ltd will periodically review this policy and update it as required. Everyone to whom this policy applies must stay up to date with changes to this policy, as this is a rapidly changing area of technology.

Policy Review Schedule

This policy will be formally reviewed at least every 6 months by a governance committee comprising representatives from operations, compliance, and disability services. Reviews will assess alignment with:

  • Current NDIS legislation
  • Disability services standards
  • AI regulatory developments

All review outcomes will be documented, and policy updates will be communicated to staff within 14 days of approval, with mandatory training provided on material changes.

NDIS-Specific Requirements

Training Requirements

Training must include:

  • NDIS Code of Conduct obligations
  • Participant rights under the NDIS Practice Standards
  • Privacy requirements under the Privacy Act 1988 (Cth)
  • Prohibited data categories
  • Validation standards for disability services
  • Documentation requirements

Training must be completed before AI tool access is granted and refreshed annually, with completion records maintained for compliance purposes.

Participant Disclosure Requirements

When AI text generators are used in preparing participant communications, support plans, progress reports, or service documentation for NDIS participants, this must be disclosed to the participant or their representative in writing and documented in participant records. The disclosure must include:

  • The purpose of AI use
  • The type of content generated
  • Confirmation that human review and validation by qualified personnel occurred

For all external communications with NDIS participants or their representatives, AI use must be explicitly stated within the communication itself to ensure compliance with NDIS Practice Standards and participant rights frameworks.

Accessible Disclosure Standards

Disclosure must be provided in plain language appropriate to the participant's communication level and needs, with alternative formats offered including:

  • Visual supports
  • Audio recordings
  • Easy-read documents
  • Support person involvement based on documented communication preferences

Staff must use a standardised disclosure template explaining what AI was used, what content was generated, how it was validated by qualified staff, and how the participant can provide feedback or request changes.

Staff must verify participant understanding through conversation, confirmation processes, or support person involvement, documenting the disclosure method, format used, date, verification method, and outcome in participant records.

Intellectual Property and Liability

EduLinked retains full ownership of all AI-generated content created using company systems, data, and staff input, subject to third-party AI provider terms of service.

EduLinked assumes full liability for AI-generated content used in service delivery to participants, and all such content must be reviewed and approved by qualified staff who assume accountability for accuracy and appropriateness before use in participant communications or service planning.

Third-Party AI Provider Agreements

EduLinked must execute a Data Processing Agreement with all third-party AI providers specifying:

  • Security standards
  • Data residency requirements
  • Encryption protocols
  • Breach notification obligations within 24 hours of any suspected data breach or unauthorised access

All AI providers must:

  • Comply with Privacy Act standards and NDIS security requirements
  • Maintain cyber liability insurance with proof of coverage provided to EduLinked
  • Indemnify EduLinked for breaches caused by provider negligence or security failures

EduLinked retains audit rights to verify provider compliance with security standards, and providers are prohibited from using EduLinked data for AI model training or secondary purposes without explicit written consent. Provider agreements must be reviewed and updated annually and when regulatory requirements change.

Further Information

If you require further information, contact your manager, or:

Sarah Ailish McLoughlin
Director
founder@edulinked.com.au

Related Research

EduLinked is committed to contributing to the broader discourse on AI ethics, authorship, and data sovereignty. We maintain ongoing open source research on artificial intelligence and authorship protocols.

Metadata Sovereignty & AI Research

Our research repository explores frameworks for maintaining authorship integrity, metadata sovereignty, and ethical AI implementation in organisational contexts.

Repository: github.com/EduLinked-Systems/metadata-sovereignty-ai-research

Policy Version Control

EduLinked will maintain comprehensive policy version control, recording the version number, effective date, amendment summary, and approval date for each policy iteration. Superseded policy versions will be archived for a minimum of 7 years to maintain an audit trail in line with NDIS Quality and Safeguards Commission requirements.

Implementation of policy amendments will be tracked through training completion records, and policy governance activities will be reported to the Director quarterly to ensure accountability and continuous improvement.

Current policy versions will be published on the organisation's intranet within 14 days of approval.

Easy Read

AI Policy

This document tells you about our rules for using AI.

AI means Artificial Intelligence. AI is a computer program that can write text. ChatGPT is an example of AI.

[Image: Computer showing artificial intelligence]

What is this about?

This policy is about how we use AI tools at work.

AI tools can help us do some tasks faster.

We must be careful when we use AI.

This policy is for:

  • all staff
  • contractors
  • volunteers
  • anyone who uses our systems.

[Image: Checklist with tick marks]

What you can do with AI

You can use AI to help with:

  • admin tasks
  • formatting documents
  • writing first drafts of internal emails
  • getting ideas for your work.

You must always check the AI's work.

Make sure it is correct before you use it.

[Image: Warning sign]

What you must not do

Do not use AI for:

  • clinical assessments
  • support plans for NDIS participants
  • decisions about participant safety
  • advice to participants without checking.

NDIS participant means a person who gets support from the NDIS.

Never put NDIS participant information into AI.

This includes names and health records.

[Image: Padlock representing security and privacy]

Keep information safe

Before you use AI, you must:

  • remove people's names
  • remove private business information
  • write down what you used AI for.

A data breach is when private information gets shared by mistake.

If something goes wrong, tell your manager straight away.

[Image: Team working together reviewing documents]

Check AI work

AI can make mistakes.

You must check AI work by:

  • checking facts from trusted sources
  • making sure the quality is good
  • asking another person to check it.

Do not use AI work until you have checked it.

[Image: People communicating and discussing]

Tell people about AI use

You must tell people when you use AI.

Tell them if you write to them using AI.

This is being honest and fair.

Write a note on your email or document.

[Image: Training and learning in classroom]

Training

All staff must do AI training.

You must:

  • do training before you use AI
  • do more training each year
  • pass the test with 80 per cent or more.

[Image: Alert warning symbol]

If something goes wrong

A data breach is when private information gets shared by mistake.

If you think there is a data breach:

  • tell your manager or the Director now
  • do this within 24 hours
  • write down what happened.

[Image: Question mark for help and questions]

Questions?

Ask your manager if you are not sure.

You can also contact:

Sarah Ailish McLoughlin

Director

Email: founder@edulinked.com.au

What the symbols mean

Yes

This means you can do this.

No

This means you must not do this.