Amazon offers two main managed AI services in AWS: SageMaker, for building and training custom models, and Bedrock, for easy access to foundation models via an API and a serverless model. This doc focuses only on AWS Bedrock, and in particular on its Prompt Routing feature; other related service features are covered in other docs.
Prompt Routing (sometimes marketed as Intelligent Prompt Routing) is a new feature of the AWS Bedrock service and is still in its early stages, having become generally available in April 2025. The capabilities of both Bedrock and its Prompt Routing feature are likely to evolve, and this note only describes them as of the time of writing (June 2025). You can use the Prompt Routing feature with a built-in set of foundation models from AWS, Anthropic, and Meta. The basic capability is that the service automatically routes each user-provided LLM prompt to one of multiple LLMs being served in a serverless manner, with the aim of either improving response quality or reducing inference cost.
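As a concrete sketch of how this looks in practice: with the Bedrock runtime's Converse API, a prompt router is addressed by passing its ARN where a single model ID would normally go. The snippet below builds such a request; the router ARN and account ID are placeholders, and the actual `boto3` invocation is shown commented out since it requires AWS credentials and region configuration.

```python
import json

# Placeholder ARN for illustration only; real router ARNs can be found in the
# Bedrock console (they identify a default or user-configured prompt router).
ROUTER_ARN = (
    "arn:aws:bedrock:us-east-1:123456789012:"
    "default-prompt-router/example-router:1"
)

def build_converse_request(router_arn: str, prompt: str) -> dict:
    """Build kwargs for the bedrock-runtime converse() call, using a prompt
    router ARN in place of a single model ID. The message shape follows the
    Converse API: a list of messages, each with a role and content blocks."""
    return {
        "modelId": router_arn,
        "messages": [
            {"role": "user", "content": [{"text": prompt}]},
        ],
    }

request = build_converse_request(
    ROUTER_ARN, "Summarize the plot of Hamlet in two sentences."
)
print(json.dumps(request, indent=2))

# To actually invoke the router (requires AWS credentials and a region):
# import boto3
# client = boto3.client("bedrock-runtime", region_name="us-east-1")
# response = client.converse(**request)
# print(response["output"]["message"]["content"][0]["text"])
```

The key point is that the calling code is unchanged from a single-model invocation; only the `modelId` value differs, and the routing decision happens entirely on the service side.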