Local LLM (Ollama) integration issue (Chrome only) #333

@nbriz

Description

Students can now optionally connect netnet to an LLM provider. This allows them to ask netnet questions, which it passes along to an LLM with a system prompt aimed at steering the LLM toward behaving like a tutor (teaching students how to solve the problem) rather than its default behavior of solving the problem for them. Additionally, when the LLM returns code snippets, rather than copy+pasting them, students have to "trace over" the code (re-type it themselves) before they can copy it into their sketch/project.

The integration is set up using the LLM-API Conduit widget, which teaches students not only how to integrate the LLM into netnet but also how these sorts of integrations work in the first place, and grants them control over decisions like: which model to use, what format the model should reply in, what temperature to set, and, notably (for this issue), which LLM provider to go with.

If they choose a commercial provider (like OpenAI or Anthropic), the integration works across browsers. But if they instead connect netnet to an LLM they're running locally on their laptop using Ollama, there's an issue with Chrome (as mentioned here: https://dev.netnet.studio/docs/misc/ai-integration.html).

the issue

The Ollama integration does not work in Chrome. This is due to Chrome's Private Network Access (PNA) policy, which blocks requests from public HTTPS pages to loopback/local addresses regardless of CORS settings.
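To make the "loopback/local addresses" part concrete, here's a minimal sketch of the kind of target classification PNA applies; the helper name and the exact address patterns checked are illustrative, not spec or netnet code:

```javascript
// Hypothetical helper: does this hostname point at the loopback interface?
// Chrome's PNA policy gates requests from public pages to targets like
// these behind a special preflight (a local Ollama server is a typical case).
function isLoopback (hostname) {
  return hostname === 'localhost' ||
    hostname === '::1' ||
    /^127\./.test(hostname) // the whole 127.0.0.0/8 block is loopback
}
```

So a request from https://netnet.studio to http://localhost:11434 (Ollama's default port) is exactly the public-page-to-loopback case PNA is designed to gate.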

Chrome's PNA spec requires servers to respond to preflight requests with:

Access-Control-Allow-Private-Network: true

Ollama does not currently send this header. Firefox does not enforce PNA, so the integration works there without any additional setup beyond what's mentioned in the docs.
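For reference, Chrome adds `Access-Control-Request-Private-Network: true` to the preflight, and only proceeds if the response opts in with the header above. A small sketch of checking a preflight response for that opt-in (hypothetical helper, not the widget's actual code):

```javascript
// Given a preflight (OPTIONS) response's headers, check whether the server
// opted in to Private Network Access. Header names are case-insensitive,
// which the Headers API handles for us.
function allowsPrivateNetwork (headers) {
  const value = headers.get('Access-Control-Allow-Private-Network') || ''
  return value.toLowerCase() === 'true'
}
```

Against a current Ollama server this would return `false`, which is precisely why Chrome never lets the real request through.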

This may be resolved in a future Ollama release — if they add support for the Access-Control-Allow-Private-Network header, Chrome compatibility would be restored without any changes on our end. Worth watching ollama/ollama for related issues.

Current handling: the widget already detects this failure and tells Chrome users to switch to Firefox. Other browsers get a more generic connectivity error.
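A minimal sketch of that kind of browser-specific messaging, assuming a simple user-agent check (function names, message wording, and the UA heuristic are all illustrative; the widget's real detection may differ):

```javascript
// Hypothetical check: Chromium-based browsers (Chrome, Edge, Brave, …)
// enforce PNA; Firefox does not. Firefox's UA never contains "Chrome".
function isChromiumBrowser (userAgent) {
  return /Chrome|Chromium/.test(userAgent) && !/Firefox/.test(userAgent)
}

// Pick the error message shown when the local Ollama connection fails.
function ollamaErrorMessage (userAgent) {
  return isChromiumBrowser(userAgent)
    ? 'Chrome blocks requests to locally running servers (Private Network Access). Try Firefox instead.'
    : 'Could not reach your local Ollama server. Is it running?'
}
```

If Ollama ships PNA support later, the Chromium branch simply stops being reached, so no widget changes would be needed.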

Labels: enhancement (New feature or request), low (priority level)
