About
The recent spark of interest in machine learning, coming from The history of machine learning, Time Series Prediction, and others, is currently accompanied by people at the Späti talking about 2025 being the year of LLM agents.
Introduction
While LLMs have been combined with external tools and resources before, Anthropic recently published a standardized protocol for doing so, called MCP.
MCP, the Model Context Protocol, is an open protocol that enables seamless integration between LLM applications and external data sources and tools. It is sometimes described as an “OpenAPI for LLMs” or a “USB-C port for AI”, providing a uniform way to connect LLMs to resources they can use.
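Concretely, MCP messages are exchanged as JSON-RPC 2.0, and invoking a tool is a "tools/call" request. The sketch below shows the rough shape of such a message; the tool name fetch_url and its argument are illustrative, not prescribed by the protocol.

```python
import json

# A tool invocation in MCP is a JSON-RPC 2.0 request with method
# "tools/call", naming the tool and passing its arguments. The tool
# name "fetch_url" and the URL here are invented for illustration.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "fetch_url",
        "arguments": {"url": "https://example.org/"},
    },
}

# Serialized, this is roughly what travels between client and server,
# over stdio or HTTP depending on the chosen transport.
wire = json.dumps(request)
print(wire)
```

The uniform envelope is what makes MCP tools interchangeable: any client that speaks this wire format can call any server's tools.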
A basic “tool” could, for example, allow the LLM to fetch an external web page, enhancing its context with information on specific topics.
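Such a fetch tool could be sketched as follows. This is a minimal, assumption-laden sketch: the helper names html_to_text and fetch_url are invented here, and the registration with the official Python SDK's FastMCP helper is only shown in comments so the snippet stays self-contained.

```python
from html.parser import HTMLParser

class _TextExtractor(HTMLParser):
    """Collect the text content of an HTML page, ignoring the markup."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        if data.strip():
            self.chunks.append(data.strip())

def html_to_text(html: str, limit: int = 2000) -> str:
    """Reduce an HTML page to plain text so it fits into the LLM context.
    The truncation limit is an illustrative choice."""
    parser = _TextExtractor()
    parser.feed(html)
    return " ".join(parser.chunks)[:limit]

# Registration as an MCP tool might look roughly like this, assuming the
# FastMCP helper from the official Python SDK (commented out to avoid a
# network dependency in this sketch):
#
# from urllib.request import urlopen
# from mcp.server.fastmcp import FastMCP
#
# mcp = FastMCP("fetcher")
#
# @mcp.tool()
# def fetch_url(url: str) -> str:
#     """Fetch a web page and return its text content."""
#     with urlopen(url) as response:
#         return html_to_text(response.read().decode("utf-8", "replace"))
```

Stripping markup and truncating before handing the page to the model keeps the tool's output within a manageable context budget.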
Example
We gained a few insights into the topic while helping to conceive the CrateDB MCP Server, which provides natural-language Text-to-SQL and documentation retrieval specialized for CrateDB database clusters.
The CrateDB MCP server effectively includes two subsystems: Text-to-SQL and documentation inquiry.
Text-to-SQL works pretty well, based on the two MCP tools get_table_metadata and query_sql. The agent is provided with table schema information, so generated SQL statements actually match up with reality.
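The value of schema grounding can be sketched like this. The tool name get_table_metadata matches the CrateDB MCP server, but the schema data and the prompt layout below are invented stand-ins, with no cluster involved.

```python
# Stand-in for schema information that get_table_metadata would read
# from a live CrateDB cluster; this table definition is invented.
TABLES = {
    "sensor_readings": {"ts": "TIMESTAMP", "device_id": "TEXT", "value": "DOUBLE"},
}

def get_table_metadata() -> dict:
    """Stand-in for the MCP tool that reports table schemas to the agent."""
    return TABLES

def build_prompt(question: str) -> str:
    """Embed the actual schema into the prompt, so the model generates SQL
    against real table and column names instead of hallucinated ones."""
    lines = []
    for table, columns in get_table_metadata().items():
        cols = ", ".join(f"{name} {type_}" for name, type_ in columns.items())
        lines.append(f"CREATE TABLE {table} ({cols})")
    return "Schema:\n" + "\n".join(lines) + f"\n\nQuestion: {question}\nSQL:"
```

The generated statement is then executed through query_sql, closing the loop between schema discovery and query execution.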
Inquiring the CrateDB documentation, provided in Markdown format, also works pretty well. It is based on two other MCP tools, get_cratedb_documentation_index and fetch_cratedb_docs, which in turn use the cratedb-about package, indexing a few essentials from the whole body of the CrateDB Documentation. The agent can select relevant sections of the documentation to enhance its capabilities for solving tasks that are very specific to CrateDB.
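The index/fetch pair can be sketched as follows. The two tool names come from the CrateDB MCP server, but the index entries and section bodies below are invented; the real index comes from the cratedb-about package.

```python
# Invented stand-ins for the curated documentation index and its sections.
DOC_INDEX = [
    {"title": "Sharding guide", "link": "https://example.org/sharding"},
    {"title": "Scalar functions", "link": "https://example.org/scalars"},
]
DOC_BODIES = {
    "https://example.org/sharding": "How to choose the number of shards ...",
    "https://example.org/scalars": "Scalar functions available in CrateDB ...",
}

def get_cratedb_documentation_index() -> list:
    """First call: lets the agent see which documentation sections exist."""
    return DOC_INDEX

def fetch_cratedb_docs(link: str) -> str:
    """Second call: fetches only the section the agent selected."""
    return DOC_BODIES[link]

# A typical agent turn: scan the index, pick a matching section, fetch it.
entry = next(e for e in get_cratedb_documentation_index()
             if "shard" in e["title"].lower())
section = fetch_cratedb_docs(entry["link"])
```

Splitting retrieval into an index step and a fetch step keeps the context small: the agent only pulls in the sections it actually needs for the task at hand.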
Outlook
We have not gained insights into composing systems of multiple MCP servers yet, but will report on it when we do. Any kind of contribution is very much welcome.