Use Formatter’s newest transform to avoid token limits when using large data sets with AI (Beta)

This information was accurate at the time of publication. Please check out the latest product update notes for any updates or changes.
Hey Zapier users!
We're excited to introduce a new feature to help you work with AI large language models (LLMs) and large data sets more efficiently: Formatter's Split Text into Chunks for AI Prompts transform.
Use it in your Zaps when you're dealing with large data sets like documents or website pages. It intelligently breaks your input into chunks based on your input data, the model type, your prompt, and the expected response size.
Then, you can run each chunk separately in your Zap to avoid hitting token limits.
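To give a feel for the idea, here's a minimal sketch of token-aware chunking. This is not Zapier's actual implementation (that logic isn't public); it assumes a rough characters-per-token ratio and reserves part of the model's token limit for the prompt and the response, splitting the remaining budget at sentence or word boundaries where possible.

```python
# Illustrative sketch only: not Zapier's actual chunking logic.
# Splits text into pieces that fit within the token budget left over
# after reserving room for the prompt and the model's response.

def split_into_chunks(text, model_limit=4096, prompt_tokens=200,
                      response_tokens=500, chars_per_token=4):
    # Tokens available for input text, converted to an approximate
    # character budget (assumes ~4 characters per token by default).
    budget_tokens = model_limit - prompt_tokens - response_tokens
    max_chars = budget_tokens * chars_per_token

    chunks = []
    while text:
        if len(text) <= max_chars:
            chunks.append(text)
            break
        # Prefer breaking at a sentence boundary, then a word boundary,
        # falling back to a hard cut at the character budget.
        cut = text.rfind(". ", 0, max_chars)
        if cut == -1:
            cut = text.rfind(" ", 0, max_chars)
        if cut == -1:
            cut = max_chars - 1
        chunks.append(text[:cut + 1].strip())
        text = text[cut + 1:].lstrip()
    return chunks
```

Each chunk can then be fed to a separate AI step, mirroring how the transform lets you process chunks one at a time in a Zap.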
This feature makes it easier than ever to handle large data sets in your Zaps, giving you more control and flexibility. Give it a try and see how it streamlines your workflows!
Learn more about using Split Text into Chunks for AI Prompts.