
Longcat 128K

meituan/longcat-128k

2 credits / request
Added 2026

Longcat 128K is Meituan's ultra-long context model, purpose-built for tasks that require processing massive amounts of text in a single request. With a 128K token context window, it can handle entire books, large codebases, and extensive document collections without losing coherence.

Longcat excels at tasks where understanding the full scope of a large document is essential — contract analysis, research synthesis, and codebase understanding.
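As a rough sanity check before sending a large document, a character-count heuristic can estimate whether it fits the 128K window. A minimal sketch — the ~4 characters/token ratio is a common English-text approximation, not an official Longcat figure, so use the provider's tokenizer for exact counts:

```python
# Rough check that a document fits in a 128K-token context window.
# Assumption: ~4 characters per token (typical for English prose).

CONTEXT_WINDOW = 128_000   # tokens, per the model spec
MAX_OUTPUT = 16_000        # tokens reserved for the model's reply

def estimate_tokens(text: str, chars_per_token: float = 4.0) -> int:
    """Very rough token estimate from character count."""
    return int(len(text) / chars_per_token)

def fits_in_context(text: str) -> bool:
    """True if the text likely fits, leaving room for the reply."""
    return estimate_tokens(text) <= CONTEXT_WINDOW - MAX_OUTPUT
```

On this heuristic, a ~400,000-character document (~100K tokens) still fits with the full 16K-token output budget reserved.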

Key Features

128K token context for massive document processing

Strong long-range coherence and recall

Excellent at cross-document synthesis

Good bilingual support (Chinese + English)

Optimized for document-heavy workflows

Ideal Use Cases

1. Full-book analysis and summarization

2. Large codebase understanding and documentation

3. Cross-document research synthesis

4. Contract and legal document review
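For use cases like cross-document research synthesis, the simplest pattern with a 128K window is to pack every source into a single prompt. A minimal sketch — the separator format and instruction wording are illustrative choices, not a Longcat requirement:

```python
# Pack several named documents into one prompt for cross-document
# synthesis. Separator format here is an illustrative convention.

def build_synthesis_prompt(documents: dict, question: str) -> str:
    """Concatenate named documents with clear separators, then the question."""
    parts = []
    for name, text in documents.items():
        parts.append(f"=== DOCUMENT: {name} ===\n{text}")
    parts.append(f"=== QUESTION ===\n{question}")
    return "\n\n".join(parts)

prompt = build_synthesis_prompt(
    {"paper_a.txt": "First source text.", "paper_b.txt": "Second source text."},
    "Summarize where these sources agree and disagree.",
)
```

Clear, consistent separators help the model attribute claims back to the right source when synthesizing across documents.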

Technical Specifications

Context Window: 128K tokens
Modality: Text → Text
Provider: Meituan
Category: Text Generation
Max Output: 16K tokens
Best For: Ultra-long document processing

API Usage

curl -X POST https://api.vincony.com/v1/chat/completions \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "meituan/longcat-128k",
    "messages": [
      { "role": "user", "content": "Hello, Longcat 128K!" }
    ]
  }'

Replace YOUR_API_KEY with your Vincony API key. The endpoint is OpenAI-compatible, so it works with any OpenAI SDK.
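The same request can be issued from Python using only the standard library. A minimal sketch mirroring the curl example above — `build_request` is an illustrative helper, not part of any SDK, and sending it requires a valid Vincony API key:

```python
# Build (and optionally send) the chat completion request from the
# curl example, using only the Python standard library.
import json
import urllib.request

API_URL = "https://api.vincony.com/v1/chat/completions"

def build_request(api_key: str, model: str, messages: list) -> urllib.request.Request:
    """Construct the OpenAI-style POST request without sending it."""
    body = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_request(
    "YOUR_API_KEY",
    "meituan/longcat-128k",
    [{"role": "user", "content": "Hello, Longcat 128K!"}],
)
# reply = json.load(urllib.request.urlopen(req))  # sends the request
```

Because the endpoint is OpenAI-compatible, the official OpenAI SDKs also work by pointing their `base_url` at `https://api.vincony.com/v1`.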



Try Longcat 128K now

Start using Longcat 128K instantly — 100 free credits, no credit card required. Access 343+ AI models through one platform.



Vincony — Access the World's Best AI Models