The forthcoming version of PyScripter includes a number of improvements to LLM-assisted coding:
- Support for DeepSeek
When it comes to coding, DeepSeek is simply amazing. It is now supported in both the Chat and the Assistant.
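For context, DeepSeek exposes an OpenAI-compatible chat-completions API, so you can also try the same model outside the IDE. Below is a minimal sketch using the standard openai Python package; the API key and prompt are placeholders, and this is independent of how PyScripter itself connects to the provider.

```python
# Minimal sketch: calling DeepSeek's OpenAI-compatible chat API directly.
# The base URL and model name follow DeepSeek's public documentation;
# the API key and prompt below are placeholders.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_DEEPSEEK_API_KEY",       # placeholder
    base_url="https://api.deepseek.com",   # DeepSeek's OpenAI-compatible endpoint
)

response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[
        {"role": "system", "content": "You are a helpful Python coding assistant."},
        {"role": "user", "content": "Write a function that reverses a string."},
    ],
)
print(response.choices[0].message.content)
```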
- The rendering of LLM responses is much improved and is now on a par with the web interfaces of the LLM providers.
Code is presented in code boxes, and using the icons in the header you can copy the code to the clipboard or paste it directly into a new editor.
Markdown is now rendered perfectly.
- Support for DeepSeek and OpenAI LLM reasoning models
You can now follow the reasoning behind the answers when using DeepSeek's deepseek-reasoner model or OpenAI's "o" models, such as o1-mini. The reasoning is presented separately, without cluttering the output.
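For reference, DeepSeek's deepseek-reasoner returns its chain of thought in a separate reasoning_content field alongside the final answer, which is what makes it possible to show the reasoning without mixing it into the reply. A minimal sketch, reusing the client from the earlier example:

```python
# Minimal sketch: deepseek-reasoner returns the reasoning separately from
# the answer, so a client can display (or hide) it independently.
response = client.chat.completions.create(
    model="deepseek-reasoner",
    messages=[{"role": "user", "content": "Is 9.11 greater than 9.9?"}],
)
message = response.choices[0].message
print("Reasoning:", message.reasoning_content)  # the model's thinking
print("Answer:", message.content)               # the final reply
```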
- Other improvements
- The temperature LLM model parameter is exposed in the Chat UI. It is a decimal number between 0 and 2 that controls the randomness of the results (higher values lead to more random answers); see the sketch after this list.
- You can now print the chat topic content.
- Syntax highlighting for about 300 programming languages is available, so you can now ask questions not just about Python but also about other languages.
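As a quick illustration of the temperature setting mentioned above, here is a minimal sketch, again reusing the client from the first example; the prompt is illustrative only.

```python
# Minimal sketch: the same request at three temperatures.
# 0 gives nearly deterministic answers; values towards 2 give increasingly
# random ones.
for temperature in (0.0, 0.7, 1.5):
    response = client.chat.completions.create(
        model="deepseek-chat",
        messages=[{"role": "user", "content": "Suggest a name for a Python logging helper."}],
        temperature=temperature,
    )
    print(f"temperature={temperature}: {response.choices[0].message.content}")
```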
And don't forget the Assistant: while coding, Assistant completion is available by pressing Ctrl+Alt+Space, as well as from the editor context menu.