This policy applies to the use of all forms of AI text generators within EduLinked Pty Ltd, including, but not limited to, ChatGPT and Bard. This includes any software that incorporates an AI text generator.
This policy applies to all EduLinked Pty Ltd employees, contractors, volunteers, vendors and anyone else who has any type of access to EduLinked Pty Ltd systems, software and hardware.
All contractors, volunteers and vendors must sign a written acknowledgment of this policy and complete mandatory AI policy training before being granted system access. Such acknowledgment must include certification that they will not submit NDIS participant data to external AI systems. EduLinked Pty Ltd retains audit rights to verify compliance with data handling requirements, and external parties must report any suspected breaches within 24 hours. Non-compliance may result in contract termination and regulatory notification, with records of acknowledgments and training completion maintained for a minimum of 7 years.
External parties must maintain their own records demonstrating compliance with this policy, including logs of system access, data handling procedures, and training completion certificates. These records must be made available to EduLinked Pty Ltd upon request within 24 hours and must be retained for the same minimum period as required by EduLinked Pty Ltd. Failure to maintain or produce such records constitutes a material breach of contract and may result in immediate suspension of system access pending investigation.
EduLinked Pty Ltd will conduct quarterly audits of external party compliance through documented record review and spot-check verification of system access logs, data handling procedures, and training certifications. All external party acknowledgments, training completion records, and audit findings must be documented in a centralized repository maintained by EduLinked Pty Ltd for regulatory review purposes, with all records retained for a minimum of 7 years in accordance with NDIS regulatory requirements.
EduLinked Pty Ltd recognises the usefulness of AI text generators for gathering information, efficiently producing copy, and other aspects of knowledge work. EduLinked Pty Ltd seeks to balance these positive uses with the risks associated with the technology.
EduLinked Pty Ltd's highest priority remains the protection of company information and customer data, as well as ensuring legislative and regulatory compliance regarding security and privacy. We also seek to ensure that uses of AI text generators do not compromise the quality of work output, and that we appropriately disclose the use of such generators both to our customers and across the company.
EduLinked Pty Ltd recognises the potential benefits to the business of incorporating AI text generators into internal workflows. EduLinked Pty Ltd gives users discretion to decide whether the use of a generator for an internal workflow is appropriate, as long as all such use is in line with this policy, any other company policies, and EduLinked Pty Ltd's values.
As the capabilities and use cases of AI text generators are rapidly evolving, it is important for users to decide carefully whether it is appropriate to integrate AI text generators into their work. If in doubt, contact your manager, or Sarah Ailish McLoughlin, Director (founder@edulinked.com.au).
| Role | Permitted Uses | Prohibited Uses |
|---|---|---|
| Support Coordinators | Administrative tasks including scheduling communications, formatting documentation, and drafting routine internal correspondence | Developing support plans, service goals, accommodation recommendations, crisis response, or participant-facing advice without independent professional validation |
| Clinical & Assessment Staff | Administrative documentation formatting and internal workflow management | Clinical assessments, diagnoses, health recommendations, or professional advice |
| Administrative Staff | Internal communications and document formatting (provided all data security requirements are met) | Inputting personal or sensitive participant information without prior authorisation |
Users must not use AI text generators for:
Before implementing AI text generators in any workflow, users must complete a basic risk assessment considering:
Document this assessment and store it according to EduLinked Pty Ltd's record-keeping practices.
All risk assessments must be completed using EduLinked Pty Ltd's standardised risk assessment template and submitted to the user's manager or compliance officer for review and approval before AI implementation commences. Risk assessments involving participant data, service delivery decisions, or clinical workflows must be escalated to Sarah Ailish McLoughlin, Director, for approval. All completed risk assessments must be stored in EduLinked Pty Ltd's centralised risk assessment repository, with records maintained for a minimum of 7 years.
EduLinked Pty Ltd must establish a quarterly risk assessment review process to identify:
The quarterly analysis must be documented in a written report submitted to the Director within 14 days of quarter end.
Users must ensure that AI-generated content is retained only for the minimum period necessary for its intended purpose:
EduLinked Pty Ltd must maintain a centralised deletion register documenting all AI-generated content destruction. Deletion certifications must be retained for 7 years.
Where the use of AI text generators may change the nature of how a communication is made or a service is provided to a client or an external party, contact Sarah Ailish McLoughlin, Director, to discuss whether the use is appropriate.
Text input into AI text generators carries the same risks as sharing information with any other third party.
Additionally, text submitted to an AI text generator is often added to the training data of the generator. Therefore, there is a risk of the data, or aspects of it, being exposed to other users of the generator.
Data input into an AI text generator should be treated as an external disclosure, and as such falls under EduLinked Pty Ltd's policy regarding data confidentiality and security. This policy can be found at https://www.edulinked.com.au/policies/data-security-confidentiality.
Prior to submitting any content to AI text generators, employees must:
Any potential data breach or unauthorised disclosure must be reported immediately to the IT security team.
Employees must not submit any NDIS participant information to AI text generators under any circumstances, including but not limited to:
All work involving NDIS participant data must be conducted using EduLinked's secure, compliant systems in accordance with the NDIS Act 2013, Privacy Act 1988, and EduLinked Pty Ltd's NDIS Privacy Policy.
AI text generators should not be used in a manner that lowers the quality of output, whether that output is external (to EduLinked Pty Ltd's customers) or internal (part of EduLinked Pty Ltd's operations).
Where this level of quality control cannot be provided while still producing output more efficiently, AI text generators should not be used. For example, if an employee uses a generator to gather information on a topic for a report, the employee must still validate the accuracy and reliability of that information and produce sources for it. If this requires more work than seeking out the information without the aid of an AI text generator, a generator should not be used.
All AI-generated content must undergo the following validation process:
Each validation step must be documented with a standardised checklist, and content failing any validation criteria must be either revised or discarded. The time required for this validation process should be factored into the assessment of whether using AI generators improves overall efficiency.
Validators must possess relevant professional qualifications and expertise appropriate to the content type being reviewed:
All validators must complete training in accessibility requirements, plain language principles, and participant-centred practice before conducting validation activities.
The use of AI text generators when collaborating or providing services raises ethical issues regarding people's legitimate expectation that they are interacting with a real person, or receiving text produced by a real person without the aid of a generator.
Wherever an AI text generator is used when collaborating with a person outside one's team, or when providing a service to a customer, its use should be disclosed to that person.
This policy also applies when working remotely.
All users are required to undergo training on the responsible and effective use of AI text generators, as provided by EduLinked Pty Ltd.
Annual refresher training must be completed by all staff, incorporating updates to:
Training completion rates must be tracked and reported quarterly to management, with staff failing to complete required training within 30 days of the due date subject to suspension of AI system access until training is completed.
If this policy is breached, disciplinary action will be taken according to the following procedure:
Any suspected data breach or unauthorised disclosure involving AI systems or NDIS participant information must be reported:
Following any breach involving NDIS participant data:
The organisation must conduct quarterly compliance audits of AI system usage, data handling practices, and disclosure documentation. A designated compliance officer is responsible for:
All audit records must be retained for a minimum of 7 years.
EduLinked Pty Ltd will periodically review this policy and update it as required. Because this is a rapidly changing area of technology, it is important for everyone to whom this policy applies to stay up to date with changes to it.
This policy will be formally reviewed at least every 6 months by a governance committee comprising representatives from operations, compliance, and disability services. Reviews will assess alignment with:
All review outcomes will be documented, and policy updates will be communicated to staff within 14 days of approval, with mandatory training provided on material changes.
Training must include:
Training must be completed before AI tool access is granted and refreshed annually, with completion records maintained for compliance purposes.
When AI text generators are used in preparing participant communications, support plans, progress reports, or service documentation for NDIS participants, this must be disclosed to the participant or their representative in writing and documented in participant records. The disclosure must include:
For all external communications with NDIS participants or their representatives, AI use must be explicitly stated within the communication itself to ensure compliance with NDIS Practice Standards and participant rights frameworks.
Disclosure must be provided in plain language appropriate to the participant's communication level and needs, with alternative formats offered including:
Staff must use a standardised disclosure template explaining what AI was used, what content was generated, how it was validated by qualified staff, and how the participant can provide feedback or request changes.
Staff must verify participant understanding through conversation, confirmation processes, or support person involvement, documenting the disclosure method, format used, date, verification method, and outcome in participant records.
EduLinked retains full ownership of all AI-generated content created using company systems, data, and staff input, subject to third-party AI provider terms of service.
EduLinked assumes full liability for AI-generated content used in service delivery to participants, and all such content must be reviewed and approved by qualified staff who assume accountability for accuracy and appropriateness before use in participant communications or service planning.
EduLinked must execute a Data Processing Agreement with all third-party AI providers specifying:
All AI providers must:
EduLinked retains audit rights to verify provider compliance with security standards, and providers are prohibited from using EduLinked data for AI model training or secondary purposes without explicit written consent. Provider agreements must be reviewed and updated annually and when regulatory requirements change.
If you require further information, contact your manager, or:
Sarah Ailish McLoughlin
Director
founder@edulinked.com.au
EduLinked is committed to contributing to the broader discourse on AI ethics, authorship, and data sovereignty. We maintain ongoing open source research on artificial intelligence and authorship protocols.
Our research repository explores frameworks for maintaining authorship integrity, metadata sovereignty, and ethical AI implementation in organisational contexts.
Repository: github.com/EduLinked-Systems/metadata-sovereignty-ai-research
EduLinked will maintain comprehensive policy version control including version number, effective date, amendment summary, and approval date for each policy iteration. Superseded policy versions will be archived for a minimum of 7 years to maintain audit trail compliance with NDIS Quality and Safeguards Commission requirements.
Implementation of policy amendments will be tracked through training completion records, and policy governance activities will be reported to the Director quarterly to ensure accountability and continuous improvement.
Current policy versions will be published on the organisation's intranet within 14 days of approval.
This document tells you about our rules for using AI.
AI means Artificial Intelligence. AI is a computer program that can write text. ChatGPT is an example of AI.
This policy is about how we use AI tools at work.
AI tools can help us do some tasks faster.
We must be careful when we use AI.
This policy is for:
You can use AI to help with:
You must always check the AI's work.
Make sure it is correct before you use it.
Do not use AI for:
NDIS participant means a person who gets support from the NDIS.
Never put NDIS participant information into AI.
This includes names and health records.
Before you use AI, you must:
A data breach is when private information gets shared by mistake.
If something goes wrong, tell your manager straight away.
AI can make mistakes.
You must check AI work by:
Do not use AI work until you have checked it.
You must tell people when you use AI.
Tell them if you write to them using AI.
This is being honest and fair.
Write a note on your email or document.
All staff must do AI training.
You must:
A data breach is when private information gets shared by mistake.
If you think there is a data breach:
Ask your manager if you are not sure.
You can also contact:
Sarah Ailish McLoughlin
Director
Email: founder@edulinked.com.au
This means you can do this.
This means you must not do this.