Overview

Use LLMs to guide you through developing with SumUp SDKs.

You can use large language models (LLMs) to assist in building SumUp integrations. We provide a set of tools and best practices for working with LLMs during development.

You can access all of our documentation as plaintext Markdown files by appending index.md to the end of any documentation URL. For example, you can find the plaintext version of this page at https://developer.sumup.com/tools/llms/index.md.

This helps AI tools and agents consume our content and allows you to copy and paste the entire contents of a doc into an LLM. This format is preferable to scraping or copying from our HTML and JavaScript-rendered pages because:

  • Plain text contains fewer formatting tokens.
  • Content that isn’t rendered in a page’s default view (for example, content hidden in a tab) is included in the plaintext version.
  • LLMs can parse and understand Markdown hierarchy.
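
For example, you can fetch the Markdown version of a page with any HTTP client; curl is used here for illustration, and the same pattern works for every documentation URL:

curl https://developer.sumup.com/tools/llms/index.md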

We also host an /llms.txt file, which tells AI tools and agents how to retrieve the plaintext versions of our pages. llms.txt is an emerging standard for making websites and their content more accessible to LLMs.
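
You can also retrieve the file yourself. The command below assumes it is served from the root of developer.sumup.com, as the leading slash and the llms.txt convention suggest:

curl https://developer.sumup.com/llms.txt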

If you use AI coding assistants, SumUp also provides Agent Skills to give your assistant targeted guidance for building payment integrations. You can install the sumup/sumup-skills repository in tools like Claude Code, Cursor, and OpenAI Codex, then invoke the sumup skill for checkout implementation and troubleshooting tasks.
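
As an illustration, installation in Claude Code could look like the sketch below. The GitHub location of the repository and the skills directory are assumptions, so follow the repository's README and your assistant's documentation for the exact steps:

git clone https://github.com/sumup/sumup-skills.git
# Copy or register the sumup skill wherever your assistant loads skills from,
# for example ~/.claude/skills/ for Claude Code.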

Use the SumUp MCP server to connect MCP-compatible clients and AI assistants to SumUp APIs and documentation tools.

Hosted MCP URL: https://mcp.sumup.com/mcp
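
As one example, Claude Code can register the hosted server through its claude mcp add command; other clients use their own configuration, so treat this as a sketch and see MCP Server for the exact steps:

claude mcp add --transport http sumup https://mcp.sumup.com/mcp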

You can also run the MCP server locally:

SUMUP_API_KEY='sup_sk_...' npx -y @sumup/mcp

For full setup instructions and client examples, see MCP Server.

If you’re building agentic software, we provide an SDK for adding SumUp capabilities to your agent. Learn more in our agents documentation.