Generative AI is a type of artificial intelligence that can learn from and mimic large amounts of data to create content such as text, images, music, videos, code, and more, based on inputs or prompts.
The university supports and encourages the responsible and secure exploration of AI tools. When using any of these tools, especially public, open-source, non-protected AI tools, it is vitally important to keep information security and data privacy, compliance, copyright, and academic integrity in mind.
Washington University ChatGPT Beta
LLM Version: OpenAI’s GPT-3.5 Turbo
Use this tool when you want a secure sandbox where you can use sensitive data and WashU intellectual property.
This isolated instance of ChatGPT is compliant for use with sensitive data and WashU intellectual property, including information protected under HIPAA and FERPA. Due to its isolated nature, the tool does not pull in new information from the web and will be approximately six months behind the latest version of the large language model.
This tool is available in Beta form in order to provide a secure GPT environment to the WashU community for research, operations and education as quickly as possible. As such, it is not yet mobile friendly and users may experience limited capacity or constraints. We welcome feedback on how we can improve the user experience as we continue to optimize the tool.
Microsoft Copilot
LLM Version: OpenAI’s GPT-4
Use this tool with non-sensitive data and when you want to connect to the most current web information.
As of February 2024, all students, in addition to faculty and staff, now have access to Microsoft Copilot (formerly called Bing Chat Enterprise) at copilot.microsoft.com when logged in to an institutional Microsoft account. As with any external tool, it is important to understand that personally identifiable, confidential or sensitive information should not be entered into Microsoft Copilot as it does not meet HIPAA, FERPA or similar compliance requirements.
Microsoft Copilot for Microsoft 365
Microsoft Copilot for Microsoft 365 combines the power of large language models (LLMs) with your organization’s data. It works alongside popular Microsoft 365 apps such as Word, Excel, PowerPoint, Outlook, Teams, and more, providing real-time intelligent assistance, enabling users to enhance their creativity, productivity, and skills.
While Microsoft Copilot for Microsoft 365 is not yet available to WashU faculty, staff and students, a small testing group is evaluating the product’s potential utility for those groups.
No expansion of the testing group is yet planned. However, making cutting-edge tools available to you quickly, and ensuring that we can support their secure, responsible, and effective use, remains a key mission of both WashU IT and Digital Transformation.
WashU ChatGPT FAQs
Artificial Intelligence (AI)
The theory and development of computer systems able to perform tasks that normally require human intelligence, such as visual perception, speech recognition, decision-making, and translation between languages.
– Oxford Languages
Large Language Models (LLMs)
A specialized type of artificial intelligence (AI) that has been trained on vast amounts of text to understand existing content and generate original content.
Machine Learning (ML)
The use and development of computer systems that are able to learn and adapt without following explicit instructions, by using algorithms and statistical models to analyze and draw inferences from patterns in data.
– Oxford Languages
Natural-Language Processing (NLP)
The ability to turn text or audio speech into encoded, structured information based on an appropriate ontology. The structured data may be used simply to classify a document, as in “this report describes a laparoscopic cholecystectomy,” or to identify findings, procedures, medications, allergies, and participants.
It is the user’s responsibility to protect sensitive data and verify content when using generative AI tools.
Be Mindful Not to Share Sensitive Information
Do not enter confidential or protected data or information, including non-public research data, into publicly available or vendor-enabled AI tools.
Information shared with public AI tools:
- Is not considered private.
- May be added to the tool’s knowledge base and provided to other users.
- May be claimed as the property of the vendor under the tool’s terms of service.
These Tools Can Be Inaccurate
Each individual is responsible for any content that is produced or published containing AI-generated material.
- AI tools sometimes “hallucinate,” generating content that can be highly convincing but inaccurate, misleading, or entirely fabricated.
- AI-generated output may also contain copyrighted material.
- All AI-generated content should be reviewed carefully for correctness and cited properly before submission or publication.
Adhere to Current Academic Integrity Policies
Review university, school, and department handbooks and policies.
- Schools will be developing and updating their policies as we learn more about AI tools.
- Faculty members should teach and advise students about policies on the permitted uses of AI in classes and on academic work.
- Students are encouraged to ask their instructors for clarification about these policies.
- Use of AI may contribute to both intentional and unintended forms of plagiarism and falsification of data.
Be Alert for AI-Enabled Phishing
AI has made it easier for malicious actors to create sophisticated scams at a far greater scale. Continue to follow security best practices and report suspicious messages via the Phish Report button in Outlook or to email@example.com.
Contact IT When Procuring Generative AI Tools or Adding AI Functionality to Existing Applications
The university is working to ensure that tools procured on behalf of WashU have the appropriate privacy and security protections.
- If you have procured or are considering procuring AI tools, contact WashU IT at firstname.lastname@example.org and provide the following:
  - The data to be used
  - The product or service to be used
  - Compliance with the published guidelines
  - Contact information
- In line with university procurement policy for IT hardware and software, vendor generative AI tools must be assessed for risk by WashU’s Office of Information Security prior to use. This includes the following:
- Existing tools that add or expand AI capabilities
- New purchases of vendor AI tools
The appropriate citation method depends on your academic field.
- APA – “How to cite ChatGPT”
- “How do I cite generative AI in MLA style?”
- Chicago Manual of Style FAQ
- MLA-CCCC Quick Start Guide to AI and Writing
Researchers should check each journal’s specific guidelines and provide proper attribution and transparency in their manuscripts when they use AI in their research (PMID: 36697395).
Office of the Provost
Digital Intelligence & Innovation Accelerator
The DI2 Accelerator, born from Washington University’s Here & Next strategic plan for Digital Transformation, represents a bold vision for success in the new digital economy.
Course Assessment and Design
The Center for Teaching and Learning
If you have AI questions, please contact email@example.com.