June 6, 2025
5 min read

Understanding Model Context Protocol (MCP)

Chengjie Qin

The Model Context Protocol (MCP) is an open standard that aims to streamline how AI models, particularly Large Language Models (LLMs), connect with external data sources and tools. Think of it as a universal language that allows AI to access and utilize information from various systems in a standardized way.

Key Concepts of MCP

  • Standardized Connections: MCP provides a consistent way for AI to interact with different servers, replacing the need for custom integrations for each data source.
  • Contextual Awareness: MCP enables AI to access the specific data it needs to understand a situation or answer a query, rather than relying solely on its internal knowledge.
  • Modular Architecture: MCP separates the AI application (host) from the data providers (servers), allowing for flexibility and extensibility.

MCP Server

In the context of MCP, a server is a component that exposes a specific data source or tool to AI applications. An MCP server:

  • Provides access to data (e.g., a database, a file system).
  • Offers tools or functionalities (e.g., search, data manipulation).
  • Communicates with AI applications using the MCP standard.

How TigerGraph Plays a Role in the MCP Server Space

TigerGraph can power an MCP server, providing AI applications with access to rich, interconnected data and analytical capabilities.

Here’s how:

  • Exposing Graph Data via MCP: TigerGraph can expose its graph data and query functionalities through an MCP server interface. This allows AI models to:
      • Retrieve information about entities and their relationships: For example, an AI agent could use TigerGraph to find all customers connected to a specific transaction or identify relationships between different accounts in a financial network.
      • Execute graph queries: AI can leverage TigerGraph’s GSQL query language to perform complex graph traversals and analytics, enabling it to answer questions that require understanding relationships within the data (see the query sketched after this list).
  • Providing Context for AI Reasoning: By acting as an MCP server, TigerGraph can equip AI models with the contextual information they need to make more informed and accurate decisions. For instance, in a customer service application, an AI agent can use TigerGraph to access a customer’s interaction history, social connections, and purchase patterns to provide more personalized and helpful support.
  • Enhancing AI Explainability: The graph-based structure of TigerGraph makes it easier to understand how AI arrived at a particular conclusion. By tracing the paths and relationships used by an AI agent, TigerGraph can improve the transparency and explainability of AI decision-making.
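
For illustration, here is a minimal sketch of the kind of GSQL query an AI agent might invoke through an MCP tool call. The graph name, vertex types, and edge type (Financial, Transaction, Customer, PERFORMED_BY) are hypothetical placeholders; substitute your own schema.

GSQL

CREATE QUERY customers_of_transaction(VERTEX<Transaction> txn) FOR GRAPH Financial {
  // Seed the traversal with the transaction passed in by the caller
  Start = {txn};

  // Hop across PERFORMED_BY edges to collect every connected customer
  Custs = SELECT c
          FROM Start:t -(PERFORMED_BY:e)- Customer:c;

  PRINT Custs;
}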

Use Cases

Here are some examples of how TigerGraph as an MCP server can be used:

  • AI-Powered Customer Service: An AI assistant uses TigerGraph to access customer data and relationship information to provide personalized and context-aware support.
  • Dynamic Fraud Detection: An AI agent leverages TigerGraph to analyze transaction networks and identify complex fraud patterns in real-time.
  • Knowledge-Driven Applications: An AI system uses TigerGraph to query a knowledge graph and provide users with accurate and comprehensive answers.

By acting as an MCP server, TigerGraph can empower AI applications with the ability to understand and reason over complex relationships, leading to more intelligent and effective solutions.

Get Started 

Prerequisites

To use TigerGraph-MCP, ensure you have the following prerequisites:

1. Python: version 3.10, 3.11, or 3.12.

2. TigerGraph: version 4.1 or later. You can set it up using one of these methods:

  • Local Installation: Install and configure TigerGraph on your machine.
  • TigerGraph Savanna: Use a managed instance of TigerGraph.
  • Docker: Run TigerGraph in a containerized environment (a sample command is sketched after this list).
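
If you go the Docker route, a typical invocation looks roughly like the sketch below. The image tag, port, and ulimit values are assumptions based on TigerGraph's public Docker images; consult TigerGraph's Docker documentation for the exact settings your version requires.

Shell

# Run a TigerGraph container, exposing the GraphStudio/REST port (assumed to be 14240)
docker run -d --name tigergraph \
  --ulimit nofile=1000000:1000000 \
  -p 14240:14240 \
  tigergraph/tigergraph:latest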

Installation Steps

Option 1: Install from PyPI

The simplest way to install TigerGraph-MCP is via PyPI. It is recommended to create a virtual environment first:

Shell

pip install tigergraph-mcp
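
As recommended above, you can isolate the install in a virtual environment first. A minimal sketch using Python's built-in venv module (any virtual environment tool works):

Shell

# Create and activate a virtual environment, then install the package into it
python3 -m venv .venv
source .venv/bin/activate
pip install tigergraph-mcp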

Option 2: Build from Source

If you wish to explore or modify the code:

1. Install Poetry for dependency management.

2. Clone the repository:

Shell

git clone https://github.com/TigerGraph-DevLabs/tigergraph-mcp

cd tigergraph-mcp

3. Set up the Python environment with Poetry:

Shell

poetry env use python3.12
poetry install --with dev
eval $(poetry env activate)

Using TigerGraph-MCP Tools

To utilize TigerGraph-MCP tools effectively, especially with GitHub Copilot Chat in VS Code, follow these steps:

1. Set Up GitHub Copilot Chat: Follow the official documentation to configure it.

2. Create a .env File: Include your OpenAI API key and TigerGraph connection details (a sketch follows these steps).

3. Configure VS Code: Create a .vscode/mcp.json file to set up the TigerGraph-MCP server (see the example configuration below).

4. Interact with the MCP Tool: Use GitHub Copilot to send commands and create schemas in TigerGraph.
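
As a rough sketch, a .env file might look like the following. The variable names here are illustrative assumptions; check the tigergraph-mcp README for the names the package actually reads.

Text

OPENAI_API_KEY=<your-openai-api-key>
TG_HOST=http://localhost
TG_USERNAME=tigergraph
TG_PASSWORD=<your-password>

Similarly, a .vscode/mcp.json that registers the server could look roughly like this. The server name and command are assumptions; the exact entry for TigerGraph-MCP is given in the repository's README, and VS Code's MCP configuration format is documented in the GitHub Copilot docs.

JSON

{
  "servers": {
    "TigerGraph-MCP": {
      "type": "stdio",
      "command": "tigergraph-mcp",
      "args": [],
      "envFile": "${workspaceFolder}/.env"
    }
  }
}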

Advanced Usage with CrewAI

For more complex interactions or custom workflows, consider using CrewAI or LangGraph. Examples are provided in the repository to help you get started with creating AI agents and managing workflows.

The TigerGraph MCP server is open source at https://github.com/TigerGraph-DevLabs/tigergraph-mcp/tree/main

Current Status

The TigerGraph MCP server is actively being developed, and we encourage you to contribute! Here are some current features and enhancements:

  • Basic MCP Functionality: The server currently supports basic data retrieval and query execution through the MCP interface. The list of currently supported features is available in the GitHub repository.
  • Ongoing Improvements: We are continuously working on enhancing the server’s capabilities. Details on the development roadmap are also available in the repository.
  • Community Contributions: We welcome community feedback and contributions. If you have ideas for new features or improvements, please open an issue or submit a pull request on GitHub.

Follow the demo video to give it a try.

 

About the Author

Chengjie Qin
