Designing Your Own AI Agent
How to Build a No‑Code Agent from Scratch
Written by Asad Jobanputra
Last updated 3 months ago
Overview
This page guides you through creating a custom AI agent in CampusMind without writing any code. An AI agent uses connected knowledge sources and large‑language‑model (LLM) capabilities to answer questions, automate tasks and streamline campus workflows. The built‑in AI Agent Builder makes it easy for anyone to create an agent tailored to their department’s needs.
Who It’s For
Department leads and administrators: build custom self‑service tools that answer repetitive questions and reduce manual workload.
Faculty and support staff: need agents that align with specific processes (e.g., HR helpdesks, grant‑writing assistants).
IT teams: oversee integrations, data governance and security while empowering departments to self‑serve.
Students: customize their learning experience around their personal strengths and weaknesses.
Why Use It
CampusMind’s no‑code builder empowers departments to design AI agents that mirror their own workflows. Building a custom agent saves time and reduces costs by handling routine inquiries and tasks. Departments retain control over their processes while central IT maintains oversight for security, compliance and FERPA policies.
Requirements & Permissions
Access permissions: You must have the “Create Agents” role within your institution’s CampusMind environment. Contact your CampusMind admin if you cannot see the “Create New Agent” button.
Integrations: Ensure any required integrations (e.g., LMS, Workday, ServiceNow) are connected before building your agent. Some actions (like updating a student record) may require API keys.
Knowledge sources: You’ll need documents, policies or databases to feed the agent’s knowledge base. Confirm that you have permission to use each source, especially when handling sensitive student or HR data.
Step‑by‑Step Instructions
1. Choose Agent Type
From the Agents page, select Create New Agent.
On the Agent Type screen, choose between:
AI Agent: retrieves and summarizes information from connected knowledge sources; ideal for FAQ bots, knowledge base assistants and documentation.
Orchestration Agent: collects and manages structured data through multi‑step forms; useful for onboarding and registration flows.
Select Continue to move to the agent setup.
2. Fill Out the Profile
The profile defines the agent’s identity and behavior.
Agent Name: Enter a clear, descriptive name (e.g., “Admissions FAQ Assistant”).
Description: Write a concise summary of what the agent does and why someone would use it (this appears in the marketplace).
Category: Choose the relevant department (IT Support, Career Services, Student Affairs, Human Resources, Administration, Billing, Referral Management or Banking).
Behavior Prompts: Provide explicit instructions for how the agent should behave and communicate. For example:
“Greet users warmly and professionally.”
“Provide clear, step‑by‑step answers.”
“Escalate complex issues to a human if needed.”
Starter Prompts (optional): Add up to four example questions to guide users when starting a conversation (e.g., “How do I submit a leave request?”).
Why: A well‑defined profile helps set user expectations and ensures the agent responds consistently.
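To make the profile fields concrete, here is a hypothetical sketch of the step‑2 settings collected as a plain Python dictionary, with a sanity check mirroring the rules above (e.g., at most four starter prompts). The field names are illustrative, not CampusMind’s actual schema.

```python
# Hypothetical sketch of the step-2 profile fields as a plain dictionary.
# Field names are illustrative, not CampusMind's actual schema.
profile = {
    "name": "Admissions FAQ Assistant",
    "description": "Answers common admissions questions for prospective students.",
    "category": "Student Affairs",
    "behavior_prompts": [
        "Greet users warmly and professionally.",
        "Provide clear, step-by-step answers.",
        "Escalate complex issues to a human if needed.",
    ],
    "starter_prompts": [
        "How do I submit a leave request?",
        "What are this semester's enrollment deadlines?",
    ],
}

def validate_profile(p):
    """Sanity checks mirroring the builder rules described above."""
    assert p["name"].strip(), "Agent name is required"
    assert p["description"].strip(), "Description is required"
    assert len(p.get("starter_prompts", [])) <= 4, "At most four starter prompts"
    return True

print(validate_profile(profile))  # True when the profile is well-formed
```

Drafting the profile this way before opening the builder is a quick way to agree on wording with your team.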
3. Configure the Knowledge Base
Navigate to Knowledge Base.
Add sources that the agent will search. These might include:
Documents and PDFs: policy documents, student handbooks, HR manuals.
Web sources or wikis: FAQs or university web pages.
Databases: structured data such as course catalogs, transcripts or ticketing systems.
For each source, define how the agent should retrieve information (e.g., full‑text search, semantic retrieval).
Test retrieval with sample questions to ensure the agent can find relevant answers.
Why: A rich, up‑to‑date knowledge base ensures the agent’s answers are accurate and context‑aware.
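The "test retrieval with sample questions" step can be sketched in miniature. The toy scorer below (pure Python, not CampusMind’s retriever) matches a sample question against knowledge‑base passages by term overlap, the simplest form of full‑text search; semantic retrieval would additionally match paraphrases and word variants (e.g., "submit" vs. "submitted").

```python
import re

# Toy full-text retrieval check (pure Python, no external services).
# This is NOT CampusMind's retriever; it only shows how a sample question
# can be scored against knowledge-base passages by term overlap.
def tokens(text):
    return set(re.findall(r"\w+", text.lower()))

def score(question, passage):
    q = tokens(question)
    return len(q & tokens(passage)) / max(len(q), 1)  # fraction of question terms found

knowledge_base = [
    "Leave requests are submitted through the HR portal under Time Off.",
    "The student handbook lists enrollment deadlines for each semester.",
    "IT tickets can be opened from the ServiceNow self-service page.",
]

question = "How do I submit a leave request?"
best = max(knowledge_base, key=lambda passage: score(question, passage))
print(best)  # the HR-portal passage scores highest for this question
```

If your sample questions fail a simple check like this, the fix is usually better source coverage, not model tuning.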
4. Set LLM Options
In LLM Options, choose the language model (e.g., OpenAI GPT‑4, Anthropic Claude).
Adjust settings like temperature (controls creativity) and maximum response length.
Enable summarization or citations if required to support answers.
Define any system prompts or persona instructions (e.g., tone, vocabulary).
Why: Tuning model parameters balances creativity and factual accuracy, ensuring that responses match your institution’s voice.
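What the temperature setting actually does can be shown with a toy softmax over raw next‑token scores (the scores below are made up): lower temperature sharpens the distribution so the top choice dominates, higher temperature flattens it so alternatives get real probability.

```python
import math

# Toy illustration of LLM "temperature": a softmax over raw next-token
# scores. Low temperature -> near-deterministic output; high temperature
# -> more varied, creative output. The scores are made-up example values.
def softmax(scores, temperature):
    exps = [math.exp(s / temperature) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

scores = [2.0, 1.0, 0.5]        # the model's raw preferences for 3 tokens
cool = softmax(scores, 0.2)     # top token dominates almost completely
warm = softmax(scores, 2.0)     # alternatives keep real probability
print(round(cool[0], 3), round(warm[0], 3))
```

For fact‑heavy campus agents, lower temperatures are usually the safer default.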
5. Define Actions (Optional)
Select Actions to add integrations that allow the agent to perform tasks.
Choose from available connectors (Workday, ServiceNow, Canvas, email).
For each action, configure input fields and outputs (e.g., “create IT ticket,” “update student record”).
Test each action with sample data to confirm correct behavior.
Why: Actions transform the agent from a passive Q&A bot into a workflow assistant, capable of triggering downstream processes.
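Configuring an action’s inputs and outputs amounts to mapping agent‑collected fields onto a connector’s request body. The sketch below is hypothetical: the field names ("short_description", "requested_for") are illustrative, and real connectors define their own required parameters.

```python
# Hypothetical sketch of wiring an action's input fields to a connector
# payload for a "create IT ticket" action. Field names are illustrative;
# real connectors define their own required parameters.
REQUIRED = {"short_description", "requested_for"}

def build_ticket_payload(**fields):
    """Build a ticket-creation request body, failing fast when a required
    input is missing (cheap to catch while testing with sample data)."""
    missing = REQUIRED - fields.keys()
    if missing:
        raise ValueError(f"Missing required inputs: {sorted(missing)}")
    return {"urgency": "low", **fields}  # defaults first, so inputs override

payload = build_ticket_payload(
    short_description="Projector not working in Room 204",
    requested_for="jdoe@university.edu",
)
print(payload["urgency"])  # "low", because no urgency was supplied
```

Failing fast on missing inputs during testing is much cheaper than debugging a rejected API call after publishing.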
6. Prepare for Publishing
In Publish, set the agent’s visibility (private to your department or public within the institution).
Upload an icon and adjust the marketplace listing details (tags, categories).
Review compliance and data‑access restrictions. Highlight any sensitive data sources that require additional safeguards.
7. Manage User Access
Use User Access to define who can use the agent (specific user groups or roles).
Add or remove users; set permissions (viewer vs. editor).
If SSO or LMS integration is required, ensure the agent’s access aligns with those roles.
Security Note: Limit access to only those who need it, especially if the agent taps into HR or academic records.
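The access model above boils down to an allow‑list of groups mapped to permission levels. The sketch below is an assumption for illustration only; the group names and the viewer/editor ranking are not CampusMind’s actual roles.

```python
# Allow-list sketch of the "who can use this agent" check. Group names
# and the viewer/editor ranking are illustrative assumptions.
AGENT_ACCESS = {"hr-staff": "editor", "all-employees": "viewer"}

def permission_for(user_groups):
    """Return the strongest permission a user's groups grant, or None."""
    ranks = {"viewer": 1, "editor": 2}
    granted = [AGENT_ACCESS[g] for g in user_groups if g in AGENT_ACCESS]
    return max(granted, key=ranks.get) if granted else None

print(permission_for(["all-employees", "hr-staff"]))  # editor
print(permission_for(["students"]))                   # None: no access
```

Starting from an empty allow‑list and adding groups as needed matches the security note’s least‑privilege advice.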
8. Provide a User Guide
In User Guide, write simple instructions for end users: how to ask questions, what data sources are available, and any known limitations.
Use plain language and include examples (“Ask me about enrollment deadlines or HR policies”).
Include notes on data privacy or escalation (“Sensitive requests will be forwarded to a human representative”).
Best Practices / Notes
Keep it short and clear: Use concise sentences and Grade‑8 reading level language.
Explain the why: Instead of only listing steps, clarify the benefit—e.g., adding behavior prompts ensures consistent tone.
Use scenarios: Provide examples such as setting up an agent for Math instructors to manage calculus FAQs.
Use friendly language: Replace internal terms (“tenant”) with user‑friendly phrases (“your institution’s environment”).
Highlight warnings: Note when permissions may expose sensitive data and require IT approval.
Test thoroughly: Use sample queries and actions to verify the agent performs as expected before publishing.
Iterate and refine: Regularly update the knowledge base and prompts based on user feedback.
Troubleshooting
Agent returns irrelevant answers: Check that the correct knowledge sources are connected and that the behavior prompts are specific.
Users can’t access the agent: Verify roles and permissions in the User Access section; ensure SSO or LMS integration is configured.
Action fails: Ensure API keys and integrations are valid and the action parameters are correct. Test using dummy data.
Response quality issues: Adjust LLM temperature and prompt instructions; consider adding more context to the knowledge base.
Related Pages / Next Steps
Managing User Groups: Learn how to create and manage user groups to control agent access.
Connecting Knowledge Sources: Step‑by‑step guide for adding documents, databases and web sources.
AI Marketplace Overview: Explore pre‑built agents and templates.
Admin Policies and Compliance: Understand FERPA and data‑governance considerations when using AI agents.
With these guidelines, any authorized staff member can design a powerful, department‑specific AI agent using CampusMind’s no‑code builder. Taking the time to define the agent’s purpose, connect the right knowledge sources and set appropriate permissions will ensure a smooth rollout and a reliable user experience.