feat: add LMStudio, Mise support, fix chat input multiline cursor navigation issue #5810
sohaieb wants to merge 4 commits into FlowiseAI:main
Conversation
Summary of Changes
Hello @sohaieb, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed. This pull request significantly enhances the platform's capabilities by integrating LMStudio for local large language model operations, giving users more flexibility in managing their LLM workflows. Additionally, it incorporates Mise for streamlined development environment setup, ensuring consistent project dependencies across different environments.
Code Review
This pull request adds support for Mise and LMStudio, which is a great addition. The implementation is solid, introducing new credentials and nodes for chat, LLM, and embeddings. I've made a few suggestions to improve the robustness and consistency of the new LMStudio nodes. Specifically, I've pointed out a couple of places where unsafe parsing of numeric inputs could lead to runtime issues and suggested a more robust way to handle optional parameters. I also recommended making the credential for the LMStudio LLM node optional to align with the other LMStudio nodes and common usage patterns. Overall, great work on expanding Flowise's capabilities.
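To illustrate the reviewer's point about unsafe parsing of numeric inputs, here is a minimal sketch of how an optional numeric node parameter could be parsed defensively. The helper name `parseOptionalNumber` and the `temperature` example are hypothetical, not actual Flowise code:

```typescript
// Hypothetical helper: parse an optional numeric input without letting
// NaN leak into the model configuration.
function parseOptionalNumber(value: unknown): number | undefined {
    // Treat missing or empty inputs as "not provided" rather than NaN.
    if (value === undefined || value === null || value === '') return undefined
    const parsed = typeof value === 'number' ? value : parseFloat(String(value))
    // Reject NaN/Infinity so a bad string cannot produce a broken config.
    return Number.isFinite(parsed) ? parsed : undefined
}

// Only include the property when a valid number was actually supplied.
const temperature = parseOptionalNumber('0.7')
const config = {
    model: 'local-model',
    ...(temperature !== undefined && { temperature })
}
console.log(config) // { model: 'local-model', temperature: 0.7 }
```

Spreading the property conditionally keeps the config free of `undefined` keys, which aligns with the suggestion to handle optional parameters more robustly.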
What is Mise? Why is it needed here?
Hi @HenryHengZJ,
Mise (Official Docs) helps manage per-project tool versions such as Node and Python. It installs and configures them locally and prevents version conflicts. Please check the docs and let me know whether we should include it in Flowise, so I can keep it in or remove it from the PR.
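As a sketch of what this looks like in practice, a project-local Mise config pins the toolchain in a `.mise.toml` at the repository root (the versions below are illustrative, not the ones this PR would pin):

```toml
# .mise.toml — hypothetical example; versions are placeholders
[tools]
node = "20"
pnpm = "9"
```

Running `mise install` in the repo then installs exactly these versions for everyone, which is the "prevents version conflicts" benefit described above.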
Changes
Important note
All LMStudio nodes work well; however, the LMStudio embeddings are unstable: the vector column values are always stored as "0".
After some research and testing, I found that this issue is caused by the response returned by LMStudio:
I'm not sure, but it seems Flowise relies on these values?
I tried to simulate the same embedding process with Ollama, and it returns correct values.
Can someone confirm this, please?
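To make this failure mode easier to catch while debugging, a small guard can flag degenerate all-zero vectors before they reach the vector store. The helper below is a hypothetical sketch, not part of Flowise or LMStudio:

```typescript
// Hypothetical guard: detect the all-zero embedding vectors described
// above before they are persisted to the vector store.
function isDegenerateEmbedding(vector: number[]): boolean {
    return vector.length === 0 || vector.every((v) => v === 0)
}

const fromLMStudio = [0, 0, 0, 0]     // the broken response observed here
const fromOllama = [0.12, -0.4, 0.9]  // a healthy embedding for comparison

console.log(isDegenerateEmbedding(fromLMStudio)) // true  -> reject / log
console.log(isDegenerateEmbedding(fromOllama))   // false -> safe to store
```

Logging or throwing when this returns `true` would confirm whether the zeros originate in the LMStudio response itself rather than in Flowise's storage layer.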
BTW, I opened an issue with LMStudio based on this topic; please check it out HERE.