What are AI Agents?
AI Agents are a modern paradigm designed to work alongside humans, combining advanced reasoning with multi-step orchestration so they can accomplish work far more complex than simply responding to prompts. Unlike traditional AI systems that mainly mimic human conversation, AI Agents can understand goals, break them into steps, plan solutions, and coordinate with multiple tools or systems to achieve results. The most popular examples we see around us are:
Code assistants, tools, and platforms such as Cursor, Windsurf, and Amazon Q
Google AI Mode
What are MCP Servers?
MCP (Model Context Protocol) is an open protocol that standardizes how applications provide context to large language models (LLMs) and AI Agents. MCP offers a universal way to connect AI models with various prebuilt prompts, data sources, and tools. With MCP, we can build agents and complex workflows on top of LLMs/AI Agents and seamlessly connect models with the outside world.
MCP provides:
A growing collection of pre-built integrations that your LLMs/AI Agents can directly plug into.
A standardized method to build custom integrations for AI applications.
An open protocol that anyone is free to adopt and implement.
The flexibility to switch between different apps while carrying your context with you.
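Under the hood, MCP messages follow JSON-RPC 2.0. As an illustrative sketch (not an exhaustive message listing), a client asking a server to run a tool sends a request shaped roughly like the first object below, and the server answers with a result like the second; the tool name here is just an example:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "directory_reader",
    "arguments": {}
  }
}

{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "content": [
      { "type": "text", "text": "bin\netc\nhome" }
    ]
  }
}
```

Because every client and server speaks this same message shape, any MCP client can call any MCP server's tools without custom glue code.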
MCP provides three capabilities:
Prompts - Pre-built prompts that LLMs/Agents can consume
Tools - Capabilities that let the model interact with external systems
Resources - Data sources that act as a knowledge base for LLMs
Code walk-through
Now we will implement a simple MCP server with two tools and a pre-built prompt that lets us interact with either tool. The tools are a pattern printer, which takes a number as input and produces a simple numeric pattern, and a directory reader, which reads the contents of a directory and returns them. Overall, our MCP server will contain the following ingredients:
Tech Stack used
Python, the MCP Python SDK (mcp[cli]), the uv package manager, and VS Code with Copilot Agent mode.
High-Level Architecture
Our MCP server is consumed by VS Code Copilot Agent mode, which uses AI models made available by the Microsoft Ecosystem.
Server name: VarunMCP
MCP Protocol: stdio
![mcp]()
Coding
Download packages
pip install uv
pip install "mcp[cli]"
(The quotes keep shells such as zsh from interpreting the square brackets.)
Create a Python file named hello_mcp_server.py
import os

from mcp.server.fastmcp import FastMCP

# Create an MCP server
mcp = FastMCP("VarunMCP", "You are a helpful assistant that can use tools for Varun.")

@mcp.tool()
def print_pattern(a: int) -> str:
    """Print a pattern of numbers"""
    # Build and return the pattern instead of calling print():
    # with the stdio transport, stdout carries the JSON-RPC messages,
    # so stray print() calls can corrupt the protocol stream.
    return "\n".join(str(i) for i in range(a))

@mcp.tool("directory_reader")
def get_files() -> str:
    """Read system files"""
    return "\n".join(os.listdir("/"))

# Add a prompt
@mcp.prompt()
def invoke_resources_prompt() -> str:
    """Generate a prompt to use current resources and print pattern"""
    return """You are a helpful assistant that can use any of the following tools:
- directory_reader: Read system files
- print_pattern: Print a pattern of numbers
"""
In Visual Studio Code, set up .vscode/mcp.json:
{
  "servers": {
    "my-hello_mcp_server.py": {
      "command": "uv",
      "args": [
        "run",
        "--with",
        "mcp",
        "mcp",
        "run",
        "hello_mcp_server.py"
      ],
      "env": {
        "USERNAME": "Varun",
        "USERPROFILE": "C:\\Users\\Varun",
        "PROGRAMFILES": "C:\\Program Files"
      }
    }
  },
  "inputs": []
}
Now, we will consume it in VS Code Copilot.
Step 1. Invoke Prompt
Note. This is the same prompt we developed in the MCP server using the @mcp.prompt() decorator. This way, we can build premade prompts to improve the prompting experience.
![Agentmode]()
Step 2. Execute by pressing the 'Enter' key
The output generated by the tool can now be interpreted by the LLM, giving it context and allowing it to apply logical judgment to draw conclusions.
![chat]()
That's it! We have learned how to create an MCP server and extend the capabilities of our AI agents. This improves decision-making and lets us customize the tools, so the agent can act as a support system and a true companion for our work.
Thanks for reading till the end! If you want to explore more, do mention it in the comments.