- Add ChatStream method to all providers (Anthropic, OpenAI, Gemini, Ollama)
for real-time streaming of AI responses with tool call support
- Add StreamingProvider interface with StreamEvent types for content,
thinking, tool_start, tool_end, done, and error events (see the sketch
after this list)
- Add notable models feature that fetches model metadata from models.dev
to identify recent/recommended models (released within the last 3 months)
- Add Notable field to ModelInfo struct to flag "latest and greatest" models
- Add SupportsThinking method to check for extended reasoning capability
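
A rough sketch of the new streaming and model-metadata surface, for reviewers.
The event kinds and method names come from this change; the field names,
channel-based delivery, and the Message/ModelInfo shapes are illustrative
assumptions, not the exact signatures in the codebase:

```go
package provider

import "context"

// StreamEventType identifies what a StreamEvent carries.
type StreamEventType string

const (
	EventContent   StreamEventType = "content"    // a chunk of response text
	EventThinking  StreamEventType = "thinking"   // extended-reasoning output
	EventToolStart StreamEventType = "tool_start" // a tool call has begun
	EventToolEnd   StreamEventType = "tool_end"   // a tool call has finished
	EventDone      StreamEventType = "done"       // the response is complete
	EventError     StreamEventType = "error"      // streaming failed
)

// StreamEvent is a single unit emitted while a response streams.
type StreamEvent struct {
	Type     StreamEventType
	Text     string // content or thinking text, if any
	ToolName string // set for tool_start / tool_end events
	Err      error  // set for error events
}

// Message is a minimal chat message, included so the sketch is self-contained.
type Message struct {
	Role    string
	Content string
}

// StreamingProvider is implemented by providers that can stream responses
// (Anthropic, OpenAI, Gemini, Ollama).
type StreamingProvider interface {
	// ChatStream starts a streaming chat completion and returns a channel
	// of events that is closed after a done or error event.
	ChatStream(ctx context.Context, messages []Message) (<-chan StreamEvent, error)
}

// ModelInfo describes a model. Notable flags recent/recommended models,
// based on models.dev metadata from the last three months.
type ModelInfo struct {
	ID        string
	Name      string
	Notable   bool
	Reasoning bool // assumed field backing SupportsThinking
}

// SupportsThinking reports whether the model offers extended reasoning.
func (m ModelInfo) SupportsThinking() bool {
	return m.Reasoning
}
```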
The streaming support lets chat responses render incrementally as they
arrive instead of only after the full response is complete (see the usage
sketch below). The notable models feature helps users identify which
models are current and recommended.
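
A hypothetical caller built on the types sketched above (same illustrative
package) might consume the stream like this:

```go
import (
	"context"
	"fmt"
)

// chat streams a single prompt and prints the response as it arrives.
func chat(ctx context.Context, p StreamingProvider, prompt string) error {
	events, err := p.ChatStream(ctx, []Message{{Role: "user", Content: prompt}})
	if err != nil {
		return err
	}
	for ev := range events {
		switch ev.Type {
		case EventContent:
			fmt.Print(ev.Text) // render each chunk immediately
		case EventToolStart:
			fmt.Printf("\n[calling tool %s]\n", ev.ToolName)
		case EventError:
			return ev.Err
		case EventDone:
			fmt.Println()
		}
	}
	return nil
}
```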