Local use of LLMs via LM Studio?

I mostly use LM Studio now because it is much better optimised for local computer use than Ollama; the speed difference is 50%+. There are also some very good vision LLMs coming out now, like Qwen3-VL.

I was wondering if maybe now is the time to integrate this option? I have used Ui.Vision for a long time, but now I want to automate things via web browsers that are just too variable, so I need an LLM to think between the steps. And I want it local, both because of API token costs and for security (sensitive data) — see the sketch below for the kind of thing I mean. Thx in advance, keep up the great work! Cheers ^^
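
PS: For anyone curious what I mean by "an LLM thinking between the steps": LM Studio already exposes an OpenAI-compatible server on localhost (by default at http://localhost:1234/v1), so the bridge could be quite small. A minimal sketch, assuming the `openai` Python package is installed and a model is loaded in LM Studio; the model id below is just a placeholder:

```python
from openai import OpenAI

# LM Studio's local server speaks the OpenAI API; it ignores the key,
# but the client library requires one to be set.
client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

def decide_next_step(page_text: str) -> str:
    """Ask the local model what the automation should do next."""
    response = client.chat.completions.create(
        model="qwen3-vl",  # placeholder: use whatever model is loaded in LM Studio
        messages=[
            {"role": "system",
             "content": "You are a web-automation assistant. "
                        "Reply with the single next action to take."},
            {"role": "user", "content": f"Current page text:\n{page_text}"},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(decide_next_step("Login page with fields: username, password"))
```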


Yes, now is a good time to integrate local LLMs :slightly_smiling_face: We have some interesting projects on our todo list for 2026.

Also, just as a reminder for everyone: Ui.Vision is fully open-source RPA software. So if someone wants to give it a try (integrating local LLMs), that would be great. We have some other tasks to complete first, so we will probably not be able to work on new LLM features until early 2026.
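
In the meantime, one possible DIY route (not an official integration, just a sketch): since LM Studio serves an OpenAI-compatible API on localhost, a macro could call a small external script with XRunAndWait and read the model's answer back from a file, e.g. with csvRead. The file name and model id below are placeholders:

```python
import sys
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

# The prompt arrives as a command-line argument from the macro (via XRunAndWait).
prompt = sys.argv[1] if len(sys.argv) > 1 else "What should the macro do next?"

reply = client.chat.completions.create(
    model="qwen3-vl",  # placeholder: whichever model is loaded in LM Studio
    messages=[{"role": "user", "content": prompt}],
).choices[0].message.content

# Write a one-line file the macro can read back afterwards (e.g. with csvRead).
with open("llm_reply.csv", "w", encoding="utf-8") as f:
    f.write(reply.replace("\n", " "))
```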