LLM Inputs#

vllm.inputs.PromptInputs#

The set of possible schemas for an input prompt to the LLM: a plain string to be tokenized, a TextPrompt, a TokensPrompt, or an ExplicitEncoderDecoderPrompt.

alias of Union[str, TextPrompt, TokensPrompt, ExplicitEncoderDecoderPrompt]
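
As the alias above indicates, a prompt can be given as a plain string or as one of the TypedDict schemas documented below. A minimal sketch, assuming a vLLM version whose LLM.generate() accepts these schemas directly; the model name and token IDs are illustrative:

```python
from vllm import LLM
from vllm.inputs import TextPrompt, TokensPrompt

# Model name is an illustrative assumption; any supported model works.
llm = LLM(model="facebook/opt-125m")

# A plain string: the text is tokenized internally.
outputs = llm.generate("Hello, my name is")

# The same request expressed explicitly as a TextPrompt.
outputs = llm.generate(TextPrompt(prompt="Hello, my name is"))

# A pre-tokenized request expressed as a TokensPrompt.
# The token IDs are made up for illustration; in practice they come
# from the model's tokenizer.
outputs = llm.generate(TokensPrompt(prompt_token_ids=[1, 15043, 29892, 590, 1024, 338]))
```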

class vllm.inputs.TextPrompt[source]#

Bases: TypedDict

Schema for a text prompt.

prompt: str#

The input text to be tokenized before passing to the model.

multi_modal_data: typing_extensions.NotRequired[MultiModalDataDict]#

Optional multi-modal data to pass to the model, if the model supports it.
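
A minimal sketch of a text prompt that also carries an image, assuming a vision-capable model is being served; the image path, the `"image"` key, and the `<image>` placeholder in the prompt are illustrative rather than prescriptive:

```python
from PIL import Image
from vllm.inputs import TextPrompt

# The file path and prompt template are assumptions for illustration;
# the exact image placeholder depends on the model's chat format.
image = Image.open("example.jpg")

prompt = TextPrompt(
    prompt="USER: <image>\nWhat is shown in this picture?\nASSISTANT:",
    multi_modal_data={"image": image},
)
```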

class vllm.inputs.TokensPrompt[source]#

Bases: TypedDict

Schema for a tokenized prompt.

prompt_token_ids: List[int]#

A list of token IDs to pass to the model.

multi_modal_data: typing_extensions.NotRequired[MultiModalDataDict]#

Optional multi-modal data to pass to the model, if the model supports it.
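
A minimal sketch of building a TokensPrompt by tokenizing the text yourself, which skips vLLM's internal tokenization step; the tokenizer and model name are illustrative assumptions:

```python
from transformers import AutoTokenizer
from vllm.inputs import TokensPrompt

# Tokenize the text externally and pass the resulting IDs directly.
tokenizer = AutoTokenizer.from_pretrained("facebook/opt-125m")
token_ids = tokenizer.encode("Hello, my name is")

prompt = TokensPrompt(prompt_token_ids=token_ids)
```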