Ollama now supports "think" and "no think" settings for reasoning models like DeepSeek R1: https://ollama.com/blog/thinking
It would be great if Bolt could support this! R1:7b is one of the best local models for mainstream laptops, but it overthinks terribly when thinking mode is on.
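
For reference, here is a minimal sketch of what a "no think" request might look like against Ollama's `/api/chat` endpoint, assuming the `think` flag works as described in the linked blog post (the model name and field behaviour are taken from that post, not verified against Bolt's integration):

```ts
// Minimal sketch: ask Ollama to skip the reasoning phase for DeepSeek R1.
// Assumes Ollama is running locally on the default port (11434) and that
// the `think` field behaves as described in the linked blog post.
async function chatWithoutThinking(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "deepseek-r1:7b",
      think: false,  // disable the model's thinking/reasoning phase
      stream: false, // return one JSON object instead of a stream
      messages: [{ role: "user", content: prompt }],
    }),
  });
  const data = await res.json();
  // With think disabled the reply should contain only the final answer,
  // so there is no separate reasoning output to strip out.
  return data.message.content;
}

chatWithoutThinking("Write a haiku about laptops").then(console.log);
```

If Bolt exposed this as a per-model toggle, users on modest hardware could keep R1:7b responsive without the long reasoning preamble.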