AI Resources Guide for IT Teams
Best Practices
Quick Links
- AI Best Practices
- Technical Checklist for AI
- Sample Policies & Resources
Introduction & Purpose
Artificial Intelligence (AI) offers significant opportunities and challenges for schools. Many agencies have released guidelines, checklists, and policies covering the areas schools should address. This document was created by gathering resources from CITE members, partners, and others, and it focuses on the needs of IT professionals working in California schools. The AI space continues to evolve rapidly, so keep in mind that these resources were gathered in early 2024. CITE will continue to provide resources and information as the landscape changes. CITE staff is working with our legal team to add a new exhibit to the California Data Privacy Agreement to ensure any special circumstances around AI are addressed. This will be available by June 2024.
Guiding Principles
It is recommended that the organization develop guiding principles to help staff approach the AI policy and decision-making process. AI can change the way people work, but it should not replace humans when staff and students are involved.
It is also important to avoid banning the use of AI outright. AI tools can improve workflows, simplify tasks, and assist teachers. Because the technology is still evolving and is already being leveraged, guiding principles matter. Some AI tools are already widely used, such as spelling and grammar checking. These are examples of traditional AI, which is rule based and relies on programmed logic to make decisions. Generative AI, such as ChatGPT, uses the data entered into it to learn how to produce better results. This means that data entered into ChatGPT could show up in answers to later questions, which presents a privacy risk.
Although this is not strictly an IT issue, IT leadership and staff should be part of this discussion. This will ensure that the principles align with IT standards and practices.
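For IT staff, the distinction above between traditional, rule-based AI and generative AI can be illustrated with a short sketch. The word list and the stand-in API call below are hypothetical placeholders, not real tools; the point is that a rule-based check runs locally, while a generative prompt leaves the LEA's environment.

```python
# A minimal, hypothetical sketch: rule-based AI vs. generative AI.
# KNOWN_WORDS and send_to_llm_api() are illustrative stand-ins only.

KNOWN_WORDS = {"student", "teacher", "school", "district"}

def rule_based_spellcheck(word: str) -> bool:
    """Traditional AI: a deterministic lookup. Nothing leaves the device."""
    return word.lower() in KNOWN_WORDS

def send_to_llm_api(prompt: str) -> str:
    """Stand-in for a vendor's generative AI service. In a real deployment,
    the prompt leaves the LEA's environment here and may be retained or used
    to train future models, which is the privacy risk described above."""
    return f"(generated response based on: {prompt})"

print(rule_based_spellcheck("student"))           # True, decided locally
print(send_to_llm_api("Rewrite this IEP note"))   # prompt is shared externally
```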
Policies Review
Every LEA already has policies in place. Reviewing and updating the policies relevant to the use of AI can help create guiding principles for staff and show the public that the LEA is addressing AI.
General-use guidelines for AI can also be developed. Areas to review include, but may not be limited to:
- Administrative Regulations
- Standard Contract Language
- Acceptable Use Policies (AUP)
- See sample policies here
- Student code of conduct
Contract Reviews
When new capabilities are released in applications already in use, the current contracts should be reviewed, including the Terms of Service. Items to consider include, but are not limited to, identifying whether the intended users are students and whether use is limited to students above a certain age.
The contract should clearly state how LEA-provided data will be used, stored, and retained. It should also state whether the data is used for AI training or tuning purposes. This language should be in place regardless of whether the data is used for AI. If student data is entered, the contract should clearly state whether that data is used to train the AI models, since students may be entering sensitive personal information.
The newest version of the California Student Data Privacy Agreement will include an exhibit specifically addressing AI. The new version will be available from CITE this summer. Contact privacyservices@cite.org.
Vetting Current Applications for Use in the Organization
In California, since the enactment of Education Code 49073.6, schools have been tasked with ensuring that any applications used for instruction follow student data privacy laws at both the federal and state levels. While many aspects of AI are already covered under existing law, it is important to apply the same compliance guidelines to existing applications as they are updated to include AI capabilities.
Does the application align with the organization's instructional and overall goals? If guiding principles around AI are in place, how does the application compare? LEA leadership has a responsibility to evaluate any new application, or updates to existing applications, to ensure compliance.
Bias
There are documented concerns about bias and fairness in current AI technologies. AI models carry inherent biases from the data used to train them. Is the company aware of this, and is it taking any action to combat bias? Hallucinations, in which generative AI models produce fictitious or fabricated results, are also a concern.
Privacy
Safeguarding the privacy of stakeholders is not new. It is an essential task any time a new application is introduced or new features are added. The user should maintain control of how AI features are implemented, and when they are implemented, user protection should be a priority.
To ensure the vendor is aware of state and federal student data privacy laws, check whether they have signed a data privacy agreement. California has adopted a statewide data privacy agreement (CA-DPA) that other LEAs can piggyback on.
Has the vendor confirmed that any user-provided data (user prompts, supplied data, generated output, etc.) will remain the property of the LEA and that no LEA data will be retained past the timeline specified in the privacy agreement and/or terms of service (TOS)?
Does the vendor's privacy policy address FERPA, SOPIPA, COPPA, CCPA, and any AI-specific laws?
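IT teams may find it helpful to record the answers to these privacy questions in a structured form so they can be compared across vendors. The sketch below is a hypothetical Python example; the field names and the sample vendor are illustrative, not a required schema.

```python
# Hypothetical sketch of recording vendor privacy answers in one place.
from dataclasses import dataclass, asdict

@dataclass
class VendorPrivacyReview:
    vendor: str
    signed_ca_dpa: bool               # has the vendor signed the statewide CA-DPA?
    lea_owns_data: bool               # prompts, supplied data, and output remain LEA property
    retention_within_agreement: bool  # no data kept past the DPA/TOS timeline
    data_used_for_training: bool      # is LEA data used to train or tune AI models?
    laws_addressed: tuple             # e.g., FERPA, SOPIPA, COPPA, CCPA

review = VendorPrivacyReview(
    vendor="Example AI Tool",         # illustrative only
    signed_ca_dpa=True,
    lea_owns_data=True,
    retention_within_agreement=True,
    data_used_for_training=False,
    laws_addressed=("FERPA", "SOPIPA", "COPPA", "CCPA"),
)
print(asdict(review))
```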
User Training
AI does not work without user interaction, and some generative AI features operate in the background without the user realizing it. As the technology develops, it is important to continually train users on the importance of data safety and privacy. Teachers and students should be trained, as end users, not to input any sensitive or identifying data, which minimizes risk in the event of a leak. Remember to continue training as new staff and students enter the environment.
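One technical complement to this training, assuming the LEA routes prompts through its own tooling, is a simple filter that strips obvious identifiers before text is sent to an AI service. The patterns and the redact_prompt() helper below are a hypothetical sketch; they will not catch every form of sensitive data and do not replace user training.

```python
import re

# Hypothetical patterns for a few obvious identifier formats; not exhaustive.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "phone": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def redact_prompt(text: str) -> str:
    """Replace obvious identifiers with placeholders before text leaves the LEA."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label} removed]", text)
    return text

print(redact_prompt("Email jdoe@school.org or call 555-123-4567 about the student."))
# -> Email [email removed] or call [phone removed] about the student.
```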