TARILIO Pro/eSearch Pro 1.1.9439 AI Mode LLM Local Server capabilities #78
electronart started this conversation in Show and tell.
eSearch Pro/TARILIO Pro 1.1.9439 (4 Nov 2025) can be installed on a small home or office network as an LLM server (usually on a machine with a GPU) and on other machines as clients, so that CPU-only machines can access high-quality LLMs offline. Client users can add notes to any section of a conversation and export the file to a shared folder; other users on the network can then import the file and continue a multipart conversation if required.
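The shared-folder round trip described above can be sketched as follows. This is a hypothetical illustration, not TARILIO's actual file format: the JSON layout, the field names, and the `export_conversation`/`import_conversation` helpers are all assumptions.

```python
import json
import tempfile
from pathlib import Path

def export_conversation(path: Path, conversation: dict) -> None:
    """Write a conversation (with per-section notes) to a shared folder as JSON."""
    path.write_text(json.dumps(conversation, indent=2), encoding="utf-8")

def import_conversation(path: Path) -> dict:
    """Read a previously exported conversation so another user can continue it."""
    return json.loads(path.read_text(encoding="utf-8"))

# Demo: one user exports to the shared folder, another imports and appends.
with tempfile.TemporaryDirectory() as shared:  # stand-in for a network share
    out = Path(shared) / "conversation-001.json"
    convo = {"sections": [{"role": "user", "text": "Q?", "notes": ["check units"]}]}
    export_conversation(out, convo)
    resumed = import_conversation(out)
    resumed["sections"].append({"role": "assistant", "text": "A.", "notes": []})
    assert len(resumed["sections"]) == 2  # continuation picked up where export left off
```

In a real deployment the temporary directory would be the mounted shared folder, and each client would simply read and rewrite the same JSON file.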

Note: the Hugging Face page recommends the Q1 quantization for a single RTX 4070 Super; in practice, however, as shown below, the card runs the Q4_K_M quantization at a moderate pace.
Inference client: eSearch Pro/TARILIO running on a Lenovo ThinkStation with no graphics card.

LLM server: eSearch Pro/TARILIO Local Server running on a Lenovo Legion with a GeForce RTX 4070 Super (12 GB).
Viewer Settings: Segoe UI 14pt.