This Arch package provides a preconfigured setup of Logseq with its GPT-3 OpenAI plugin, tailored for use with a local Ollama instance.
Logseq is more than just a note-taking app; it serves as a comprehensive platform for organizing thoughts, managing tasks, and building knowledge databases. In Ditana, Logseq is specially configured to leverage the local AI powered by Ollama. This integration enables AI-assisted note-taking, content generation, and knowledge management, all while maintaining privacy and security.
This package is ideal for anyone looking to enhance their intellectual workflow with the power of AI, without compromising on data privacy.
- Preconfigured Logseq setup optimized for Ditana GNU/Linux
- Integration with a local Ollama instance (using the `phi4-mini` model) for AI-powered features
- Privacy-focused configuration, keeping your data local
- Custom shortcuts for quick access to AI features
- Tailored chat prompt for concise and focused AI responses
The upstream plugin was designed primarily for the OpenAI API and requires two patches to work correctly with Ollama's OpenAI-compatible endpoint:
- TypeScript strict null fix: The DALL-E response handling uses a non-null assertion (`response.data!`) to resolve a TypeScript TS18048 error on `response.data`.
- Force chat completions API: The plugin's model dispatch logic only uses the `/chat/completions` endpoint for model names starting with `gpt-3.5` or `gpt-4`, falling back to the legacy `/completions` endpoint for all other models. Since Ollama does not support the legacy completions endpoint, the build forces the chat completions path for all models.
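Both patches can be illustrated with minimal TypeScript sketches. The function and type names below are hypothetical and the upstream code differs in detail, but the two behaviours are the ones described above:

```typescript
// Sketch of the model dispatch rule (hypothetical names).
// Upstream behaviour: only OpenAI chat models use /chat/completions.
function usesChatEndpointUpstream(model: string): boolean {
  return model.startsWith("gpt-3.5") || model.startsWith("gpt-4");
}

// Patched behaviour: Ollama has no legacy /completions endpoint,
// so every model is routed to /chat/completions.
function usesChatEndpointPatched(_model: string): boolean {
  return true;
}

// Sketch of the strict-null issue (hypothetical response type).
// `data` is optional, so accessing it raises TS18048 under strict
// null checks; the patch silences this with a non-null assertion.
type ImagesResponse = { data?: { url?: string }[] };

function firstImageUrl(response: ImagesResponse): string | undefined {
  return response.data![0]?.url; // non-null assertion, as in the patch
}
```

With the upstream dispatch rule, a model such as `phi4-mini` would fall through to the legacy endpoint and fail against Ollama; the patched rule sends it to `/chat/completions` instead.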
- Ditana GNU/Linux with a desktop environment
- A running Ollama server (provided by the `ollama` package, which is installed and configured automatically when selecting the "Ollama Local AI" option in the Ditana installer)
This package is available in the Ditana GNU/Linux Arch repository. It is installed by default with desktop installations.
The package sets up Logseq with the following key configurations:
- GPT-3 OpenAI plugin enabled and configured to use the local Ollama instance via its OpenAI-compatible API
- Custom chat prompt for focused AI responses
You can find the configuration files in the following locations:
- Plugin configuration: `~/.logseq/config/plugins.edn`
- GPT-3 OpenAI plugin settings: `~/.logseq/settings/logseq-plugin-gpt3-openai.json`
- Ensure the Ollama service is running (typically started automatically via systemd; check with `systemctl status ollama`)
- Launch Logseq
- Use the configured shortcuts to access AI features:
  - `mod+j`: AI block completion
  - `mod+g`: AI popup for queries
You can customize the Logseq configuration and GPT-3 OpenAI plugin settings by editing the respective configuration files mentioned in the Configuration section.
To use a different language model, change the `openAICompletionEngine` value in `~/.logseq/settings/logseq-plugin-gpt3-openai.json` to any model available in your Ollama installation. You can list available models with `ollama list` and pull new ones with `ollama pull <model>`.
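The model switch can also be scripted. The helper below is a hypothetical sketch (it is not shipped with this package); it simply rewrites the `openAICompletionEngine` key in the plugin's settings JSON:

```typescript
import { readFileSync, writeFileSync } from "node:fs";

// Hypothetical helper (not part of this package): point the plugin at a
// different Ollama model by rewriting openAICompletionEngine in the
// settings file, e.g. ~/.logseq/settings/logseq-plugin-gpt3-openai.json.
function setCompletionEngine(settingsPath: string, model: string): void {
  const settings = JSON.parse(readFileSync(settingsPath, "utf8"));
  settings.openAICompletionEngine = model; // key name used by the plugin
  writeFileSync(settingsPath, JSON.stringify(settings, null, 2) + "\n");
}
```

For example, after `ollama pull llama3.2`, calling `setCompletionEngine` with the settings path and `"llama3.2"` switches the plugin to that model; restart Logseq for the change to take effect.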
- logseq-desktop-bin
- A running Ollama instance (provided by the `ollama` package in Ditana)
For issues, feature requests, or contributions related to the Ditana configuration of Logseq, please use the GitHub issue tracker or submit a pull request.
For Logseq-specific questions, refer to the official Logseq documentation.
This project uses the logseq-plugin-gpt3-openai plugin by Brian Sunter. We thank Brian for developing this plugin, which has been valuable in integrating AI capabilities into Logseq for the Ditana project.
Ditana Config Logseq is part of the Ditana GNU/Linux project, aiming to provide sophisticated and privacy-focused tools for knowledge management and productivity.