Changelog

The following log records FlowDown's version history and the details of each change.

1.20 (200/201/202)

Chat Template System 📋

  • Brand new reusable chat templates for workflow efficiency
  • AI-generated templates from existing conversations
  • Intuitive template editor with creation/deletion
  • Quick preset selection when starting new chats

General Improvements

  • Fixed missing entitlements required to run efficiently on iOS/iPadOS
  • Fixed on-device visual language model support
  • Added Gemma3 model support
  • Added HTML preview for generated code

1.19 (190)

  • Added new menu items
  • Added welcome log for first-time app launch
  • Improved OpenRouter integration
  • Fixed issue where input box couldn't auto-focus when creating new chat on Mac
  • Improved app's background processing capabilities
  • Fixed issue where reasoning content wasn't updating in some cases
  • Fixed logical errors related to model tools

Thanks to the following contributors:

  • @at-wr
  • @Zach677

1.13 (130)

  • Fixed issue where keyboard shortcuts for copying text in code view were ineffective
  • Fixed incorrect cursor display when selecting text
  • Fixed unresponsive drag gestures on first and last lines of text
  • Fixed code syntax highlighting not being applied
  • Fixed reasoning requests being added to auxiliary tasks
  • Added option to use developer role when needed
  • App can now be opened via the flowdown:// URL scheme (see the sketch after this list)
  • App can now open model files in place
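
The flowdown:// item above refers to a custom URL scheme. As a minimal sketch, this is how another iOS app could launch FlowDown through that scheme; only the bare flowdown:// URL comes from this changelog, and any deeper paths or query parameters would be assumptions, so none are used.

```swift
import UIKit

// Minimal sketch: opening FlowDown from another iOS app via its URL scheme.
// Only the bare "flowdown://" scheme is taken from the changelog; any deeper
// paths or query parameters would be assumptions and are omitted here.
func openFlowDown() {
    guard let url = URL(string: "flowdown://") else { return }
    UIApplication.shared.open(url, options: [:]) { success in
        if !success {
            print("FlowDown does not appear to be installed, or the scheme is unavailable.")
        }
    }
}
```

If the calling app needs to check availability first with canOpenURL(_:), the flowdown scheme would also have to be listed under LSApplicationQueriesSchemes in that app's Info.plist.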

1.11

  • Fixed incorrect model identifier when copying models
  • Fixed mathematical content not rendering correctly
  • Fixed menu disappearing when right-clicking conversation list
  • Fixed documents not being indexed after web search

1.10 (101/102)

  • Improved visual task scheduler with added option to skip auxiliary tasks for vision-native models
  • Fixed editor lag when processing large images
  • Fixed the editor's inability to restore editable content
  • Improved auxiliary task structure and enhanced execution speed
  • Improved data flow for web search when tool calling is enabled
  • Fixed missing reasoning content in image recognition tasks
  • Added support for OpenRouter model inference tokens

1.9 (90)

  • Added preliminary support for mathematical content
  • Added support for displaying mathematical formulas written in LaTeX format
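
As a rough illustration of the kind of input this targets (which delimiters FlowDown actually accepts, e.g. inline $...$ versus display \[...\], is an assumption here):

```latex
% Example of LaTeX-style math a renderer like this targets; the exact
% delimiters FlowDown accepts are an assumption, not documented behavior.
\[
  e^{i\pi} + 1 = 0
\]
```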

1.7 (70)

  • Fixed Qwen3 local model support issues
  • Updated MLX model manifest structure to fix compatibility issues
  • Local vision functionality temporarily unavailable, awaiting upstream fixes
  • FlowDown is now fully open source; users can verify its privacy protection commitments themselves

1.6 (60)

  • Improved local model support
  • Fixed multiple known issues

1.5 (50)

  • Fixed misbehavior when pasting content
  • Fixed misbehavior in the web search phase indicator
  • Fixed an incorrect return key title in the input box
  • Fixed a bug that occurred when multiple tool calls were received at the same time
  • Added support for connecting to locally hosted LLMs via Ollama and LM Studio (see the sketch below)
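
For context on the Ollama / LM Studio item: both tools expose an OpenAI-compatible HTTP API on localhost, which is what a client like FlowDown is pointed at. The sketch below issues a raw chat completion request against such an endpoint; the base URL reflects Ollama's usual default (LM Studio typically uses http://localhost:1234/v1), and the model name is a placeholder, so treat both as assumptions rather than FlowDown configuration values.

```swift
import Foundation

// Sketch of a chat completion request against a locally hosted, OpenAI-compatible
// endpoint such as Ollama or LM Studio. The base URL and model name are
// placeholders/assumptions, not FlowDown configuration values.
let endpoint = URL(string: "http://localhost:11434/v1/chat/completions")!

var request = URLRequest(url: endpoint)
request.httpMethod = "POST"
request.setValue("application/json", forHTTPHeaderField: "Content-Type")
request.httpBody = try? JSONSerialization.data(withJSONObject: [
    "model": "llama3",  // placeholder: whichever model the local server has loaded
    "messages": [["role": "user", "content": "Hello from a local model"]]
] as [String: Any])

URLSession.shared.dataTask(with: request) { data, _, error in
    if let error = error {
        print("Request failed: \(error)")
    } else if let data = data, let body = String(data: data, encoding: .utf8) {
        print(body)  // raw JSON completion returned by the local server
    }
}.resume()
```

No code is needed to use this from FlowDown itself; the sketch only shows the kind of local endpoint the app connects to.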

1.4 (49)

  • Fixed a crash when using the location tool while location services are disabled

1.4 (47/48)

  • Fixed issue where table border colors weren't updating correctly
  • Fixed a crash when quickly deleting messages
  • Allow sending empty messages when attachments are present
  • Added ability to directly select text

1.3 (44)

  • Fixed crash issues with certain reasoning models
  • Fixed the issue where content was unexpectedly shown as reasoning
  • Fixed calculation errors in chat list element positioning
  • Fixed the issue where model endpoints that don't support usage statistics became unavailable
  • Fixed the issue where crashes caused content not to be saved
  • Fixed the issue where code blocks couldn't open details page in Hugging Face model download screen
  • Fixed the issue where code blocks and tables could show incorrect information in special cases
  • Fixed logic errors that could occur when interrupting inference
  • Fixed an issue where error messages were deleted during the retry process
  • Disabled local model functionality for devices that don't support MLX
  • Adjusted the order and logic of several menus
  • Adjusted the logic for deleting conversation content
  • Adjusted the logic for editing conversation content
  • Added the option to send messages with Command + Enter
  • Added an option to create a new line with a long press
  • Added the option to save reply content as images
  • Removed the temporary chat option

1.2 (36/37/42)

  • Added image export feature that allows exporting chat content as images
  • Added ability to terminate ongoing conversations at any time
  • Improved web search with deduplication algorithm for search results
  • Fixed inconsistent text size rendering in supplement views
  • Refactored code and resolved several known issues and bugs

1.1 (35)

  • Fixed a bug where extracted image information might not be stored correctly
  • Fixed a bug where the system prompt was not updated correctly
  • Fixed a bug where a tool call could result in a server error

1.0 (32/33)

First release to the App Store.

  • Changed the software icon
  • Fixed the issue where the DeepSeek model did not support out-of-order system prompts
  • Fixed the issue where retries did not use the correct model
  • Fixed a tool call failure by removing the minLength parameter that some models do not support
  • Fixed the issue where the model did not use the latest time
  • Adjusted the prompts used by the tool
  • Added an option for 64k context length

1.0 (28)

  • Supported cloud models can now use tools
  • Added a right-click/long-press shortcut menu to the model selector
  • Improved the status display of tool calls; some tools now support viewing call details
  • Added an option for 64K context
  • Resolved the issue of application crashes caused by undo operations
  • Optimized exception handling when the model is unavailable
  • Fixed permission-related issues and improved error messages
  • Optimized data export logic and logging system
  • Improved the pre-conversation check mechanism to enhance stability

1.0 (20)

  • Fixed menu event response issues
  • Fixed code view height mismatch issues
  • Fixed the issue where the model name was not updated as expected
  • Optimized the long-press interaction experience of the conversation list
  • Optimized the efficiency of the inference engine
  • Improved the URL opening mechanism and added security warning prompts
  • Significantly reduced network crawler resource consumption
  • Replaced widget icons