- Spec: pypi-openai (describes pkg:pypi/openai@1.106.x)
- Description: Official Python library for the OpenAI API providing chat completions, embeddings, audio, images, and more
- Author: tessl

# Assistants API

Build AI assistants with persistent conversations, file access, function calling, and code interpretation capabilities using the beta assistants framework.

## Capabilities

### Assistant Management

Create, configure, and manage AI assistants with specific instructions and capabilities.

```python { .api }
def create(
    self,
    *,
    model: str,
    description: str | NotGiven = NOT_GIVEN,
    instructions: str | NotGiven = NOT_GIVEN,
    name: str | NotGiven = NOT_GIVEN,
    tools: List[AssistantToolUnionParam] | NotGiven = NOT_GIVEN,
    tool_resources: ToolResourcesParam | NotGiven = NOT_GIVEN,
    metadata: Optional[object] | NotGiven = NOT_GIVEN,
    temperature: float | NotGiven = NOT_GIVEN,
    top_p: float | NotGiven = NOT_GIVEN,
    response_format: AssistantResponseFormatParam | NotGiven = NOT_GIVEN
) -> Assistant: ...

def list(
    self,
    *,
    after: str | NotGiven = NOT_GIVEN,
    before: str | NotGiven = NOT_GIVEN,
    limit: int | NotGiven = NOT_GIVEN,
    order: Literal["asc", "desc"] | NotGiven = NOT_GIVEN
) -> SyncCursorPage[Assistant]: ...

def retrieve(
    self,
    assistant_id: str
) -> Assistant: ...

def update(
    self,
    assistant_id: str,
    *,
    model: str | NotGiven = NOT_GIVEN,
    name: str | NotGiven = NOT_GIVEN,
    description: str | NotGiven = NOT_GIVEN,
    instructions: str | NotGiven = NOT_GIVEN,
    tools: List[AssistantToolUnionParam] | NotGiven = NOT_GIVEN,
    tool_resources: ToolResourcesParam | NotGiven = NOT_GIVEN,
    metadata: Optional[object] | NotGiven = NOT_GIVEN,
    temperature: float | NotGiven = NOT_GIVEN,
    top_p: float | NotGiven = NOT_GIVEN,
    response_format: AssistantResponseFormatParam | NotGiven = NOT_GIVEN
) -> Assistant: ...

def delete(
    self,
    assistant_id: str
) -> AssistantDeleted: ...
```

Usage examples:

```python
from openai import OpenAI

client = OpenAI()

# Create a basic assistant
assistant = client.beta.assistants.create(
    name="Math Tutor",
    instructions="You are a personal math tutor. Write and run code to answer math questions.",
    tools=[{"type": "code_interpreter"}],
    model="gpt-4-turbo"
)

print(f"Created assistant: {assistant.id}")
print(f"Name: {assistant.name}")

# Create assistant with file search
assistant = client.beta.assistants.create(
    name="Research Assistant",
    instructions="You are a helpful research assistant. Use the provided documents to answer questions.",
    tools=[{"type": "file_search"}],
    model="gpt-4-turbo"
)

# Create assistant with function calling
assistant = client.beta.assistants.create(
    name="Weather Assistant",
    instructions="You help users get weather information. Use the get_weather function when needed.",
    tools=[
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Get current weather for a location",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "location": {
                            "type": "string",
                            "description": "City name"
                        }
                    },
                    "required": ["location"]
                }
            }
        }
    ],
    model="gpt-4-turbo"
)

# List assistants
assistants = client.beta.assistants.list(limit=10)

print("Your assistants:")
for assistant in assistants:
    print(f"  {assistant.id}: {assistant.name}")

# Update assistant
updated_assistant = client.beta.assistants.update(
    assistant.id,
    instructions="You are an advanced math tutor specializing in calculus and linear algebra.",
    tools=[{"type": "code_interpreter"}]
)

# Delete assistant
deletion_result = client.beta.assistants.delete(assistant.id)
print(f"Assistant deleted: {deletion_result.deleted}")
```
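The synchronous examples above translate directly to the async client. A minimal sketch, assuming `AsyncOpenAI` mirrors the synchronous `beta.assistants` surface of the 1.x SDK:

```python
import asyncio

from openai import AsyncOpenAI


async def main() -> None:
    # Reads OPENAI_API_KEY from the environment, just like OpenAI()
    client = AsyncOpenAI()

    # Same parameters as the synchronous create(); the call is awaited
    assistant = await client.beta.assistants.create(
        name="Math Tutor",
        instructions="You are a personal math tutor. Write and run code to answer math questions.",
        tools=[{"type": "code_interpreter"}],
        model="gpt-4-turbo",
    )
    print(f"Created assistant: {assistant.id}")

    # Clean up the example assistant
    await client.beta.assistants.delete(assistant.id)


asyncio.run(main())
```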
### Thread Management

Manage conversation threads for persistent multi-turn conversations with assistants.

```python { .api }
def create(
    self,
    *,
    messages: List[ThreadMessageParam] | NotGiven = NOT_GIVEN,
    tool_resources: ToolResourcesParam | NotGiven = NOT_GIVEN,
    metadata: Optional[object] | NotGiven = NOT_GIVEN
) -> Thread: ...

def retrieve(
    self,
    thread_id: str
) -> Thread: ...

def update(
    self,
    thread_id: str,
    *,
    tool_resources: ToolResourcesParam | NotGiven = NOT_GIVEN,
    metadata: Optional[object] | NotGiven = NOT_GIVEN
) -> Thread: ...

def delete(
    self,
    thread_id: str
) -> ThreadDeleted: ...
```

Usage examples:

```python
# Create empty thread
thread = client.beta.threads.create()
print(f"Created thread: {thread.id}")

# Create thread with initial messages
thread = client.beta.threads.create(
    messages=[
        {
            "role": "user",
            "content": "I need help with calculus. Can you explain derivatives?"
        }
    ]
)

print(f"Thread with initial message: {thread.id}")

# Create thread with file attachments
thread = client.beta.threads.create(
    messages=[
        {
            "role": "user",
            "content": "Please analyze this data file",
            "attachments": [
                {
                    "file_id": "file-abc123",
                    "tools": [{"type": "code_interpreter"}]
                }
            ]
        }
    ]
)

# Update thread metadata
updated_thread = client.beta.threads.update(
    thread.id,
    metadata={"user_id": "user123", "session": "2024-01"}
)

# Retrieve thread
thread_info = client.beta.threads.retrieve(thread.id)
print(f"Thread metadata: {thread_info.metadata}")

# Delete thread
deletion_result = client.beta.threads.delete(thread.id)
print(f"Thread deleted: {deletion_result.deleted}")
```
### Message Management

Add, retrieve, and manage messages within conversation threads.

```python { .api }
def create(
    self,
    thread_id: str,
    *,
    role: Literal["user", "assistant"],
    content: Union[str, List[MessageContentPartParam]],
    attachments: Optional[List[AttachmentParam]] | NotGiven = NOT_GIVEN,
    metadata: Optional[object] | NotGiven = NOT_GIVEN
) -> ThreadMessage: ...

def list(
    self,
    thread_id: str,
    *,
    after: str | NotGiven = NOT_GIVEN,
    before: str | NotGiven = NOT_GIVEN,
    limit: int | NotGiven = NOT_GIVEN,
    order: Literal["asc", "desc"] | NotGiven = NOT_GIVEN,
    run_id: str | NotGiven = NOT_GIVEN
) -> SyncCursorPage[ThreadMessage]: ...

def retrieve(
    self,
    thread_id: str,
    message_id: str
) -> ThreadMessage: ...

def update(
    self,
    thread_id: str,
    message_id: str,
    *,
    metadata: Optional[object] | NotGiven = NOT_GIVEN
) -> ThreadMessage: ...
```

Usage examples:

```python
# Add user message to thread
message = client.beta.threads.messages.create(
    thread_id=thread.id,
    role="user",
    content="What is the derivative of x^2?"
)

print(f"Added message: {message.id}")

# Add message with file attachment
message = client.beta.threads.messages.create(
    thread_id=thread.id,
    role="user",
    content="Please analyze this dataset",
    attachments=[
        {
            "file_id": "file-abc123",
            "tools": [{"type": "code_interpreter"}]
        }
    ]
)

# Add message with image
message = client.beta.threads.messages.create(
    thread_id=thread.id,
    role="user",
    content=[
        {
            "type": "text",
            "text": "What do you see in this image?"
        },
        {
            "type": "image_file",
            "image_file": {"file_id": "file-image123"}
        }
    ]
)

# List messages in thread
messages = client.beta.threads.messages.list(
    thread_id=thread.id,
    order="desc",
    limit=20
)

print("Thread conversation:")
for message in reversed(list(messages)):
    role = message.role
    content = message.content[0].text.value if message.content else ""
    print(f"{role}: {content}")

# Get specific message
message = client.beta.threads.messages.retrieve(
    thread_id=thread.id,
    message_id=message.id
)

print(f"Message details: {message.content}")
```
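The `update` method declared above is not covered by the examples; a minimal sketch, reusing the `thread` and `message` objects from the snippets above (the metadata keys are illustrative, not part of the API):

```python
# Attach application-side metadata to an existing message
updated_message = client.beta.threads.messages.update(
    thread_id=thread.id,
    message_id=message.id,
    metadata={"reviewed": "true", "source": "web_app"}
)

print(f"Message metadata: {updated_message.metadata}")
```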
### Run Management

Execute assistants on threads and manage conversation runs with streaming support.

```python { .api }
def create(
    self,
    thread_id: str,
    *,
    assistant_id: str,
    model: str | NotGiven = NOT_GIVEN,
    instructions: str | NotGiven = NOT_GIVEN,
    additional_instructions: str | NotGiven = NOT_GIVEN,
    additional_messages: List[ThreadMessageParam] | NotGiven = NOT_GIVEN,
    tools: List[AssistantToolUnionParam] | NotGiven = NOT_GIVEN,
    metadata: Optional[object] | NotGiven = NOT_GIVEN,
    temperature: float | NotGiven = NOT_GIVEN,
    top_p: float | NotGiven = NOT_GIVEN,
    stream: Optional[bool] | NotGiven = NOT_GIVEN,
    max_prompt_tokens: int | NotGiven = NOT_GIVEN,
    max_completion_tokens: int | NotGiven = NOT_GIVEN,
    truncation_strategy: TruncationStrategyParam | NotGiven = NOT_GIVEN,
    tool_choice: AssistantToolChoiceParam | NotGiven = NOT_GIVEN,
    parallel_tool_calls: bool | NotGiven = NOT_GIVEN,
    response_format: AssistantResponseFormatParam | NotGiven = NOT_GIVEN
) -> Run | Stream[AssistantStreamEvent]: ...

def retrieve(
    self,
    thread_id: str,
    run_id: str
) -> Run: ...

def update(
    self,
    thread_id: str,
    run_id: str,
    *,
    metadata: Optional[object] | NotGiven = NOT_GIVEN
) -> Run: ...

def list(
    self,
    thread_id: str,
    *,
    after: str | NotGiven = NOT_GIVEN,
    before: str | NotGiven = NOT_GIVEN,
    limit: int | NotGiven = NOT_GIVEN,
    order: Literal["asc", "desc"] | NotGiven = NOT_GIVEN
) -> SyncCursorPage[Run]: ...

def cancel(
    self,
    thread_id: str,
    run_id: str
) -> Run: ...

def submit_tool_outputs(
    self,
    thread_id: str,
    run_id: str,
    *,
    tool_outputs: List[ToolOutputParam],
    stream: Optional[bool] | NotGiven = NOT_GIVEN
) -> Run | Stream[AssistantStreamEvent]: ...
```

Usage examples:

```python
import time

# Create and run assistant
run = client.beta.threads.runs.create(
    thread_id=thread.id,
    assistant_id=assistant.id
)

print(f"Started run: {run.id}")
print(f"Status: {run.status}")

# Poll run until completion
def wait_for_run_completion(thread_id: str, run_id: str):
    """Wait for run to complete"""

    while True:
        run = client.beta.threads.runs.retrieve(
            thread_id=thread_id,
            run_id=run_id
        )

        print(f"Run status: {run.status}")

        if run.status in ["completed", "failed", "cancelled", "expired"]:
            return run
        elif run.status == "requires_action":
            print("Run requires action (function calls)")
            return run

        time.sleep(1)

# Wait for completion
completed_run = wait_for_run_completion(thread.id, run.id)

if completed_run.status == "completed":
    # Get assistant's response
    messages = client.beta.threads.messages.list(thread_id=thread.id)

    latest_message = messages.data[0]
    if latest_message.role == "assistant":
        print(f"Assistant: {latest_message.content[0].text.value}")

# Handle function calling
def handle_function_calls(thread_id: str, run_id: str):
    """Handle required function calls"""

    run = client.beta.threads.runs.retrieve(
        thread_id=thread_id,
        run_id=run_id
    )

    if run.status == "requires_action":
        tool_calls = run.required_action.submit_tool_outputs.tool_calls
        tool_outputs = []

        for tool_call in tool_calls:
            function_name = tool_call.function.name
            function_args = tool_call.function.arguments

            # Call your function
            if function_name == "get_weather":
                import json
                args = json.loads(function_args)
                # Your weather function implementation
                result = f"Weather in {args['location']}: Sunny, 75°F"
            else:
                result = "Function not implemented"

            tool_outputs.append({
                "tool_call_id": tool_call.id,
                "output": result
            })

        # Submit function outputs
        run = client.beta.threads.runs.submit_tool_outputs(
            thread_id=thread_id,
            run_id=run_id,
            tool_outputs=tool_outputs
        )

    return run

# Example with function calling
weather_assistant = client.beta.assistants.create(
    name="Weather Helper",
    instructions="Get weather information when requested",
    tools=[
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Get weather for a location",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "location": {"type": "string"}
                    },
                    "required": ["location"]
                }
            }
        }
    ],
    model="gpt-4-turbo"
)

# Add weather question
client.beta.threads.messages.create(
    thread_id=thread.id,
    role="user",
    content="What's the weather like in San Francisco?"
)

# Run with function calling
run = client.beta.threads.runs.create(
    thread_id=thread.id,
    assistant_id=weather_assistant.id
)

# Handle function calls
completed_run = wait_for_run_completion(thread.id, run.id)

if completed_run.status == "requires_action":
    handle_function_calls(thread.id, run.id)
    final_run = wait_for_run_completion(thread.id, run.id)
```
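Newer 1.x releases of the SDK also ship polling helpers so you do not have to write the loop yourself. A sketch, assuming your installed version provides `runs.create_and_poll` (check the SDK changelog before relying on it):

```python
# Create the run and block until it reaches a terminal state
run = client.beta.threads.runs.create_and_poll(
    thread_id=thread.id,
    assistant_id=assistant.id
)

if run.status == "completed":
    messages = client.beta.threads.messages.list(thread_id=thread.id)
    print(messages.data[0].content[0].text.value)
else:
    print(f"Run ended with status: {run.status}")
```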
### Streaming Runs

Handle real-time streaming responses from assistant runs for better user experience.

```python { .api }
def create(
    self,
    thread_id: str,
    *,
    assistant_id: str,
    stream: Literal[True],
    # ... other parameters
) -> Stream[AssistantStreamEvent]: ...
```

Usage examples:

```python
# Stream assistant response
stream = client.beta.threads.runs.create(
    thread_id=thread.id,
    assistant_id=assistant.id,
    stream=True
)

print("Streaming assistant response:")

for event in stream:
    if event.event == "thread.message.delta":
        if hasattr(event.data.delta.content[0], 'text'):
            print(event.data.delta.content[0].text.value, end="", flush=True)
    elif event.event == "thread.run.completed":
        print("\nRun completed")
        break
    elif event.event == "thread.run.failed":
        print(f"\nRun failed: {event.data.last_error}")
        break

# Advanced streaming handler
class AssistantStreamHandler:
    def __init__(self):
        self.current_message = ""
        self.tool_calls = []

    def handle_event(self, event):
        if event.event == "thread.message.delta":
            # Handle message content updates
            delta = event.data.delta
            if delta.content:
                for content in delta.content:
                    if hasattr(content, 'text') and content.text:
                        if content.text.value:
                            self.current_message += content.text.value
                            print(content.text.value, end="", flush=True)

        elif event.event == "thread.run.requires_action":
            # Handle function calls
            self.tool_calls = event.data.required_action.submit_tool_outputs.tool_calls
            print(f"\nFunction call required: {len(self.tool_calls)} calls")

        elif event.event == "thread.run.completed":
            print("\n✓ Run completed")

        elif event.event == "thread.run.failed":
            print(f"\n✗ Run failed: {event.data.last_error}")

# Use streaming handler
handler = AssistantStreamHandler()

stream = client.beta.threads.runs.create(
    thread_id=thread.id,
    assistant_id=assistant.id,
    stream=True
)

for event in stream:
    handler.handle_event(event)

# Handle any required tool calls
if handler.tool_calls:
    # Process tool calls and submit outputs
    # (implementation depends on your specific functions)
    pass
```
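In addition to iterating raw events, recent 1.x releases expose a higher-level streaming helper built around an event-handler class. A sketch, assuming your installed version provides `runs.stream()` and `AssistantEventHandler` (verify against your SDK version):

```python
from typing_extensions import override

from openai import AssistantEventHandler


class EventHandler(AssistantEventHandler):
    """Callback-style handler driven by the runs.stream() helper."""

    @override
    def on_text_created(self, text) -> None:
        print("\nassistant > ", end="", flush=True)

    @override
    def on_text_delta(self, delta, snapshot) -> None:
        print(delta.value, end="", flush=True)


# The context manager consumes the event stream and dispatches to the handler
with client.beta.threads.runs.stream(
    thread_id=thread.id,
    assistant_id=assistant.id,
    event_handler=EventHandler(),
) as stream:
    stream.until_done()
```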
### Vector Stores and File Search

Use vector stores for efficient file search and retrieval augmented generation (RAG).

```python { .api }
# Vector store management
def create_vector_store(
    self,
    *,
    file_ids: List[str] | NotGiven = NOT_GIVEN,
    name: str | NotGiven = NOT_GIVEN,
    expires_after: ExpiresAfterParam | NotGiven = NOT_GIVEN,
    chunking_strategy: ChunkingStrategyParam | NotGiven = NOT_GIVEN,
    metadata: Optional[object] | NotGiven = NOT_GIVEN
) -> VectorStore: ...
```

Usage examples:

```python
# Upload documents for file search
with open("knowledge_base.pdf", "rb") as f:
    file1 = client.files.create(file=f, purpose="assistants")

with open("documentation.txt", "rb") as f:
    file2 = client.files.create(file=f, purpose="assistants")

# Create vector store
vector_store = client.beta.vector_stores.create(
    name="Knowledge Base",
    file_ids=[file1.id, file2.id]
)

print(f"Created vector store: {vector_store.id}")

# Create assistant with file search
search_assistant = client.beta.assistants.create(
    name="Document Assistant",
    instructions="Use the uploaded documents to answer questions accurately. Always cite your sources.",
    model="gpt-4-turbo",
    tools=[{"type": "file_search"}],
    tool_resources={
        "file_search": {
            "vector_store_ids": [vector_store.id]
        }
    }
)

# Use file search in conversation
thread = client.beta.threads.create(
    messages=[
        {
            "role": "user",
            "content": "What information do you have about machine learning algorithms?"
        }
    ]
)

run = client.beta.threads.runs.create(
    thread_id=thread.id,
    assistant_id=search_assistant.id
)

# Wait for response with file citations
completed_run = wait_for_run_completion(thread.id, run.id)

if completed_run.status == "completed":
    messages = client.beta.threads.messages.list(thread_id=thread.id)

    assistant_message = messages.data[0]

    # Display content with citations
    for content in assistant_message.content:
        if hasattr(content, 'text'):
            print(content.text.value)

            # Show citations if available
            if content.text.annotations:
                print("\nSources:")
                for annotation in content.text.annotations:
                    if hasattr(annotation, 'file_citation'):
                        citation = annotation.file_citation
                        print(f"- {citation.file_id}: {citation.quote}")
```

## Types

### Core Response Types

```python { .api }
class Assistant(BaseModel):
    id: str
    created_at: int
    description: Optional[str]
    instructions: Optional[str]
    metadata: Optional[Dict[str, str]]
    model: str
    name: Optional[str]
    object: Literal["assistant"]
    tools: List[AssistantTool]
    tool_resources: Optional[ToolResources]
    temperature: Optional[float]
    top_p: Optional[float]
    response_format: Optional[AssistantResponseFormat]

class Thread(BaseModel):
    id: str
    created_at: int
    metadata: Optional[Dict[str, str]]
    object: Literal["thread"]
    tool_resources: Optional[ToolResources]

class ThreadMessage(BaseModel):
    id: str
    assistant_id: Optional[str]
    attachments: Optional[List[Attachment]]
    completed_at: Optional[int]
    content: List[MessageContent]
    created_at: int
    incomplete_at: Optional[int]
    incomplete_details: Optional[MessageIncompleteDetails]
    metadata: Optional[Dict[str, str]]
    object: Literal["thread.message"]
    role: Literal["user", "assistant"]
    run_id: Optional[str]
    status: Literal["in_progress", "incomplete", "completed"]
    thread_id: str

class Run(BaseModel):
    id: str
    assistant_id: str
    cancelled_at: Optional[int]
    completed_at: Optional[int]
    created_at: int
    expires_at: Optional[int]
    failed_at: Optional[int]
    incomplete_details: Optional[RunIncompleteDetails]
    instructions: str
    last_error: Optional[LastError]
    max_completion_tokens: Optional[int]
    max_prompt_tokens: Optional[int]
    metadata: Optional[Dict[str, str]]
    model: str
    object: Literal["thread.run"]
    parallel_tool_calls: bool
    required_action: Optional[RequiredAction]
    response_format: Optional[AssistantResponseFormat]
    started_at: Optional[int]
    status: RunStatus
    temperature: Optional[float]
    thread_id: str
    tool_choice: Optional[AssistantToolChoice]
    tools: List[AssistantTool]
    tool_resources: Optional[ToolResources]
    top_p: Optional[float]
    truncation_strategy: Optional[TruncationStrategy]
    usage: Optional[RunUsage]
```
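These are pydantic models, so fields can be read directly off the returned objects. A small sketch reusing the `thread` and `run` from the examples above, and assuming the usual `RunUsage` and `LastError` fields (`total_tokens`, `code`, `message`):

```python
# Inspect a finished run using the Run fields listed above
run = client.beta.threads.runs.retrieve(thread_id=thread.id, run_id=run.id)

print(f"Status: {run.status}")  # one of the RunStatus literals
if run.usage is not None:
    print(f"Tokens used: {run.usage.total_tokens}")
if run.last_error is not None:
    print(f"Error {run.last_error.code}: {run.last_error.message}")
```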
### Parameter Types

```python { .api }
# Assistant creation parameters
AssistantCreateParams = TypedDict('AssistantCreateParams', {
    'model': Required[str],
    'description': NotRequired[str],
    'instructions': NotRequired[str],
    'name': NotRequired[str],
    'tools': NotRequired[List[AssistantToolUnionParam]],
    'tool_resources': NotRequired[ToolResourcesParam],
    'metadata': NotRequired[Optional[object]],
    'temperature': NotRequired[float],
    'top_p': NotRequired[float],
    'response_format': NotRequired[AssistantResponseFormatParam],
}, total=False)

# Thread message parameters
ThreadMessageParam = TypedDict('ThreadMessageParam', {
    'role': Required[Literal["user", "assistant"]],
    'content': Required[Union[str, List[MessageContentPartParam]]],
    'attachments': NotRequired[Optional[List[AttachmentParam]]],
    'metadata': NotRequired[Optional[object]],
}, total=False)

# Run creation parameters
RunCreateParams = TypedDict('RunCreateParams', {
    'assistant_id': Required[str],
    'model': NotRequired[str],
    'instructions': NotRequired[str],
    'additional_instructions': NotRequired[str],
    'additional_messages': NotRequired[List[ThreadMessageParam]],
    'tools': NotRequired[List[AssistantToolUnionParam]],
    'metadata': NotRequired[Optional[object]],
    'temperature': NotRequired[float],
    'top_p': NotRequired[float],
    'stream': NotRequired[Optional[bool]],
    'max_prompt_tokens': NotRequired[int],
    'max_completion_tokens': NotRequired[int],
    'truncation_strategy': NotRequired[TruncationStrategyParam],
    'tool_choice': NotRequired[AssistantToolChoiceParam],
    'parallel_tool_calls': NotRequired[bool],
    'response_format': NotRequired[AssistantResponseFormatParam],
}, total=False)
```

### Tool Types

```python { .api }
# Assistant tools
AssistantToolUnionParam = Union[
    CodeInterpreterToolParam,
    FileSearchToolParam,
    FunctionToolParam
]

class CodeInterpreterToolParam(TypedDict, total=False):
    type: Required[Literal["code_interpreter"]]

class FileSearchToolParam(TypedDict, total=False):
    type: Required[Literal["file_search"]]
    file_search: FileSearchParam

class FunctionToolParam(TypedDict, total=False):
    type: Required[Literal["function"]]
    function: Required[FunctionDefinition]

# Tool resources
class ToolResourcesParam(TypedDict, total=False):
    code_interpreter: CodeInterpreterResourceParam
    file_search: FileSearchResourceParam

class CodeInterpreterResourceParam(TypedDict, total=False):
    file_ids: List[str]

class FileSearchResourceParam(TypedDict, total=False):
    vector_store_ids: List[str]
    vector_stores: List[VectorStoreParam]
```

### Status and Event Types

```python { .api }
# Run status enumeration
RunStatus = Literal[
    "queued",
    "in_progress",
    "requires_action",
    "cancelling",
    "cancelled",
    "failed",
    "completed",
    "incomplete",
    "expired"
]

# Stream event types
AssistantStreamEvent = Union[
    ThreadRunEvent,
    ThreadMessageEvent,
    ThreadMessageDeltaEvent,
    RunStepEvent,
    RunStepDeltaEvent,
    ErrorEvent
]

# Required action for function calls
class RequiredAction(BaseModel):
    submit_tool_outputs: RequiredActionSubmitToolOutputs
    type: Literal["submit_tool_outputs"]

class RequiredActionSubmitToolOutputs(BaseModel):
    tool_calls: List[RequiredActionFunctionToolCall]

class ToolOutputParam(TypedDict, total=False):
    tool_call_id: Required[str]
    output: Required[str]
```

### Content Types

```python { .api }
# Message content types
MessageContent = Union[
    MessageContentImageFile,
    MessageContentImageUrl,
    MessageContentText,
    MessageContentRefusal
]

class MessageContentText(BaseModel):
    text: MessageContentTextObject
    type: Literal["text"]

class MessageContentTextObject(BaseModel):
    annotations: List[MessageContentTextAnnotation]
    value: str

# Annotations for citations
MessageContentTextAnnotation = Union[
    MessageContentTextAnnotationFileCitation,
    MessageContentTextAnnotationFilePath
]

class MessageContentTextAnnotationFileCitation(BaseModel):
    end_index: int
    file_citation: MessageContentTextAnnotationFileCitationObject
    start_index: int
    text: str
    type: Literal["file_citation"]

# Attachment types
class AttachmentParam(TypedDict, total=False):
    file_id: Required[str]
    tools: Required[List[AssistantToolUnionParam]]
```

## Best Practices

### Assistant Design

- Write clear, specific instructions for consistent behavior
- Use appropriate tools for the assistant's intended purpose
- Set reasonable temperature values (0.1-0.7 for most tasks)
- Include examples in instructions for complex tasks
- Test assistants thoroughly before deployment

### Thread Management

- Create separate threads for different conversation topics
- Use metadata to track thread context and user information
- Clean up old threads periodically to manage costs
- Consider thread limits and conversation length

### File and Vector Store Usage

- Organize files logically in vector stores
- Use descriptive names for files and vector stores
- Monitor file storage usage and costs
- Update vector stores when source documents change
- Implement proper file access controls

### Function Calling

- Design functions with clear, specific purposes
- Provide detailed function descriptions and parameter schemas
- Handle function errors gracefully (see the sketch below)
- Validate function inputs and outputs
- Test function calling workflows thoroughly
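A minimal sketch of the error-handling and validation points above, reusing the `submit_tool_outputs` flow from Run Management; the `get_weather` function and its validation rules are illustrative, not part of the SDK:

```python
import json


def execute_tool_call(tool_call) -> str:
    """Run one tool call defensively and always return a string output."""
    try:
        args = json.loads(tool_call.function.arguments or "{}")
    except json.JSONDecodeError:
        return "Error: arguments were not valid JSON"

    if tool_call.function.name == "get_weather":
        location = args.get("location")
        if not isinstance(location, str) or not location.strip():
            return "Error: 'location' must be a non-empty string"
        try:
            return get_weather(location)  # your own implementation
        except Exception as exc:
            # Surface the failure to the model instead of crashing the run
            return f"Error: weather lookup failed ({exc})"

    return f"Error: unknown function '{tool_call.function.name}'"


# Feed outputs (including error strings) back so the run can continue
tool_calls = run.required_action.submit_tool_outputs.tool_calls
client.beta.threads.runs.submit_tool_outputs(
    thread_id=thread.id,
    run_id=run.id,
    tool_outputs=[
        {"tool_call_id": call.id, "output": execute_tool_call(call)}
        for call in tool_calls
    ],
)
```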
### Production Considerations

- Implement proper error handling for all assistant operations (see the sketch below)
- Monitor usage and costs for assistants and threads
- Use streaming for better user experience with long responses
- Implement rate limiting and abuse prevention
- Keep assistant instructions and tools up to date
- Plan for assistant versioning and updates
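A sketch of the error-handling point, wrapping run creation in the SDK's exception types (`RateLimitError`, `APIConnectionError`, and `APIStatusError` ship with the `openai` package); the retry counts and backoff are illustrative:

```python
import time

import openai


def run_with_retries(thread_id: str, assistant_id: str, max_attempts: int = 3):
    """Create a run, retrying transient failures with simple exponential backoff."""
    for attempt in range(1, max_attempts + 1):
        try:
            return client.beta.threads.runs.create(
                thread_id=thread_id,
                assistant_id=assistant_id
            )
        except openai.RateLimitError:
            # Back off and retry when rate limited
            time.sleep(2 ** attempt)
        except openai.APIConnectionError:
            # Network problems are usually transient
            time.sleep(2 ** attempt)
        except openai.APIStatusError as exc:
            # Other API errors are not retried; log and surface to the caller
            print(f"API error {exc.status_code}: {exc}")
            raise
    raise RuntimeError("Run creation failed after retries")
```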