# v0.9.50, build 225 (2025-07-xx xx:xx UTC)

- Added an option for chat templates to predefine a user input.
- Added the ability to create chat templates from existing chats.
- Added an enterprise IT configuration option to prevent manual addition of LLM providers in managed environments.
- Added support for self-hosted LLMs using [vLLM](https://blog.vllm.ai/2023/06/20/vllm.html) (see the sketch after this list).
- Improved the display of enterprise configurations on the about page; configuration details are only shown when needed.
- Improved hot reloading on Unix-like systems when entire plugins were added or removed.
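
The self-hosted vLLM support builds on vLLM's OpenAI-compatible HTTP server. As a rough illustration of the kind of endpoint the app connects to, the minimal sketch below sends a chat request to a locally hosted vLLM instance; the base URL and model name are placeholders, and the exact provider configuration inside the app is not shown here.

```python
# Minimal sketch: querying a self-hosted vLLM server through its
# OpenAI-compatible API. The base URL and model id below are assumptions;
# adjust them to match your own vLLM deployment.
import requests

VLLM_BASE_URL = "http://localhost:8000/v1"       # default vLLM server address (assumption)
MODEL = "meta-llama/Llama-3.1-8B-Instruct"       # example model id (assumption)

response = requests.post(
    f"{VLLM_BASE_URL}/chat/completions",
    json={
        "model": MODEL,
        "messages": [{"role": "user", "content": "Hello from a self-hosted LLM!"}],
        "max_tokens": 64,
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```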