Using LLMs with dLocal 🆕
Boost your dLocal integration workflow with LLMs.
Agentic integrated development environments (agentic IDEs) transform software development by directly embedding AI-powered assistance into coding workflows. You can use them to streamline your dLocal integration in various ways.
We offer tools and best practices to help you make the most of this new development approach.
LLM features
These tools improve how AI assistants interact with our documentation, helping you quickly access relevant content and streamline routine tasks.

Use cases
These tools support different parts of your workflow, whether building tools, writing scripts, or looking for quick answers.
| Feature | Description | Benefit / Use |
|---|---|---|
| Plain text (llms.txt) | A plain text file at the site's root that instructs LLMs on how to read and process the documentation. | Gives AI models clear instructions on how to parse and prioritize documentation content. |
| Plain docs (.md) | Documentation pages are available in raw Markdown by appending .md to the URL. | Copy and reuse documentation content, or process pages in scripts and tools without HTML clutter. |
| Ask AI | A dropdown menu on each page links to AI assistants (such as ChatGPT and Claude) and offers utilities for easy content reuse. | Send content to your preferred AI tool quickly and speed up your workflow with fewer clicks. |
Setup and resources
Find everything you need to make LLMs work better with your dLocal integration.
LLMs text file location
The /llms.txt file follows an emerging standard that helps websites guide LLMs on how to handle their content. You can find it at the root of the site:
https://docs.dlocal.com/llms.txt
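For example, a script can download llms.txt and pull out the documentation links it lists. The sketch below uses only the Python standard library; the link-extraction regex assumes the conventional `[title](url)` Markdown-link format, and the function names are ours:

```python
import re
from urllib.request import urlopen

LLMS_TXT_URL = "https://docs.dlocal.com/llms.txt"

def fetch_llms_txt(url: str = LLMS_TXT_URL) -> str:
    """Download the plain-text LLM index from the docs site."""
    with urlopen(url, timeout=10) as resp:
        return resp.read().decode("utf-8")

def extract_links(text: str) -> list[str]:
    """Pull the URLs out of Markdown links like [title](url)."""
    return re.findall(r"\]\((https?://[^)]+)\)", text)
```

The extracted links can then be fetched individually and fed to a model as plain Markdown.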
Plain text docs
Plain text documents keep the content simple, easy to process, and free of complex web formatting. You can access any guide page as a plain Markdown file by appending .md to its URL:
https://docs.dlocal.com/docs/using-llms.md
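In a script, that URL rewrite is a one-line transformation. A minimal sketch (the helper name is ours):

```python
def to_markdown_url(page_url: str) -> str:
    """Return the raw-Markdown version of a docs page URL by appending .md."""
    base = page_url.rstrip("/")
    return base if base.endswith(".md") else base + ".md"
```

For instance, `to_markdown_url("https://docs.dlocal.com/docs/using-llms")` yields the URL shown above.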
Ask AI
The Ask AI dropdown appears as a menu icon in the guides.
It links to AI assistants (ChatGPT and Claude) that can explain concepts, answer questions, or generate code examples based on the page content.
It also includes handy tools for working with AI-generated content, such as Copy to Clipboard and View as Markdown.
dLocal Model Context Protocol (MCP) Server
The Model Context Protocol (MCP) is a standard for supplying language models with the right context so they can act reliably in specific situations. It lets AI-powered tools (such as custom assistants or IDEs) connect to our APIs and services.
The dLocal MCP server acts as a bridge between your AI agent and dLocal's systems. It provides a standardized way for agents to search internal knowledge bases, such as docs, support articles, and FAQs.
Remote server
We provide a remote MCP server, available at:
https://mcp.dlocal.com/integration
Step by step
To connect to the dLocal MCP server, follow the steps below in your preferred MCP client:
- Edit your MCP client configuration. Open the configuration file for your MCP client in a text editor and replace its contents with the following:
```json
{
  "servers": {
    "dlocal-integration-mcp": {
      "url": "https://mcp.dlocal.com/integration"
    }
  }
}
```
- Save and restart the client. After updating the configuration, save the file and restart your MCP client.
- Test the connection. To verify the integration is working correctly, ask the MCP client to perform a supported action.
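Before restarting the client, you can sanity-check the edited file. A minimal sketch assuming the configuration shape shown above (the helper name is ours):

```python
import json

EXPECTED_URL = "https://mcp.dlocal.com/integration"

def has_dlocal_server(raw_config: str) -> bool:
    """Check that the config declares the dLocal MCP server under 'servers'."""
    cfg = json.loads(raw_config)
    server = cfg.get("servers", {}).get("dlocal-integration-mcp", {})
    return server.get("url") == EXPECTED_URL
```

If this returns False, re-check the server key name and URL before restarting the client.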