Use the Finch MCP Server to enable natural language access to the Finch API via large language models (LLMs).
The Finch MCP Server makes it simple to integrate Large Language Models (LLMs) with the Finch API using natural language. Rather than writing custom API calls, you can describe your intent, and the LLM will handle the rest by issuing calls to the Finch API to get the information you need or take action on your behalf.
The Finch MCP Server exposes structured tools to LLMs, making it possible to use Finch's API without understanding its full surface area. This unlocks powerful, intuitive workflows for developers, finance teams, support agents, and more.
MCP stands for Model Context Protocol, a mechanism that defines how LLMs can safely interact with external APIs and tools.
Install the MCP Server via npm:
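For example, a global install might look like the following (the package name `@tryfinch/finch-mcp` is an assumption; substitute the actual name of the published Finch MCP package):

```shell
# Install the Finch MCP Server globally so your MCP client can launch it
npm install -g @tryfinch/finch-mcp
```

Alternatively, most MCP clients can launch the server on demand via `npx`, which avoids a global install.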
To connect the Finch MCP Server to an MCP Client, you will need to add the appropriate configuration to your client. Below is an example of what that may look like.
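As a sketch, an MCP client such as Claude Desktop reads server definitions from a JSON configuration file under an `mcpServers` key. The package name `@tryfinch/finch-mcp` and the `FINCH_ACCESS_TOKEN` environment variable below are assumptions; consult the package's documentation for the exact command and variable names:

```json
{
  "mcpServers": {
    "finch": {
      "command": "npx",
      "args": ["-y", "@tryfinch/finch-mcp"],
      "env": {
        "FINCH_ACCESS_TOKEN": "<a specific connection's access token>"
      }
    }
  }
}
```

With this entry in place, restarting the client makes the Finch tools available to the LLM in its tool list.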
💡 Replace `<a specific connection's access token>` with a valid Finch API access token for the connection you want the LLM to access.
🔒 Security Note: Use a token scoped to only the data your use case requires. Read-only tokens are recommended for most applications. Because employment data is sensitive, take extra care when deciding which LLMs you share your data with, and review the usage policies of the LLM provider. Finch's general recommendation is to use a self-hosted LLM with the Finch MCP Server.
Sample Prompt:
Find all employees in the engineering department who started after January 1st, 2023.
Sample Prompt:
What are the current titles and base salaries of everyone on the sales team?
Sample Prompt:
How many employees are full-time vs part-time across all locations?
If you run into issues or have questions, reach out to us at support@tryfinch.com.