
Add philips-hue-control community ability#152

Closed
jamalnajmi wants to merge 12 commits into openhome-dev:dev from jamalnajmi:add-philips-hue-control

Conversation

@jamalnajmi

What does this Ability do?

Philips Hue Control is a community OpenHome Ability that controls Philips Hue lights via the local Hue Bridge API. It supports voice commands for on/off, brightness, color, temperature, scene activation, status checks, and all-lights control in a multi-turn flow.

Suggested Trigger Words

  • hue lights
  • turn on the lights
  • turn off the lights
  • control the lights
  • light control

Type

  • [x] New community Ability
  • Improvement to existing Ability
  • Bug fix
  • Documentation update

External APIs

  • No external APIs
  • Uses external API(s): Philips Hue Bridge Local API (https://<bridge_ip>/clip/v2/) with runtime bridge-generated app key (no hardcoded key)
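As a rough illustration of the bullet above, a local CLIP v2 call with a runtime-generated key might look like the sketch below. The function names, `bridge_ip`, and `app_key` are illustrative, not taken from the PR's code; the `hue-application-key` header and the `/clip/v2/resource/light` path follow the published Hue CLIP v2 API.

```python
import json
import ssl
import urllib.request


def build_light_request(bridge_ip: str, app_key: str) -> urllib.request.Request:
    """Build a GET request for all lights on the bridge's local CLIP v2 API."""
    url = f"https://{bridge_ip}/clip/v2/resource/light"
    # CLIP v2 authenticates with the runtime-generated application key,
    # sent in the hue-application-key header (no hardcoded secrets).
    return urllib.request.Request(url, headers={"hue-application-key": app_key})


def list_lights(bridge_ip: str, app_key: str) -> list:
    """Fetch the bridge's light resources; returns [] on an empty payload."""
    # Hue bridges serve a self-signed certificate, so verification is
    # relaxed in this sketch; production code should pin the bridge cert.
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    req = build_light_request(bridge_ip, app_key)
    with urllib.request.urlopen(req, context=ctx, timeout=5) as resp:
        return json.load(resp).get("data", [])
```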

Testing

  • Tested in OpenHome Live Editor
  • All exit paths tested (said "stop", "exit", etc.)
  • [x] Error scenarios tested (bridge discovery failure, invalid spoken IP input, unreachable bridge IP)

Checklist

  • Files are in community/philips-hue-control/
  • main.py follows SDK pattern (extends MatchingCapability, has register_capability + call)
  • README.md included with description, suggested triggers, and setup
  • resume_normal_flow() called on every exit path
  • No print() — using editor_logging_handler
  • No hardcoded API keys — using placeholders
  • No blocked imports (redis, connection_manager, user_config)
  • No asyncio.sleep() or asyncio.create_task() — using session_tasks
  • Error handling on all external calls
  • Tested in OpenHome Live Editor

Anything else?

Commands verified in demo:

  • Turn on the living room
  • Set bedroom to 50 percent
  • Make kitchen blue
  • Activate movie night
  • Are the kitchen lights on?
  • Turn off all the lights
  • Stop / That’s all

@jamalnajmi jamalnajmi requested a review from a team as a code owner February 26, 2026 08:58
@github-actions
Contributor

github-actions bot commented Feb 26, 2026

✅ Community PR Path Check — Passed

All changed files are inside the community/ folder. Looks good!

@github-actions github-actions bot added first-contribution First-time contributor community-ability Community-contributed ability and removed first-contribution First-time contributor labels Feb 26, 2026
@github-actions
Contributor

github-actions bot commented Feb 26, 2026

🔀 Branch Merge Check

PR direction: add-philips-hue-control → dev

Passed: add-philips-hue-control → dev is a valid merge direction

@github-actions
Contributor

github-actions bot commented Feb 26, 2026

🔍 Lint Results

__init__.py — Empty as expected

Files linted: community/philips-hue-control/main.py

✅ Flake8 — Passed

✅ All checks passed!

@github-actions
Contributor

github-actions bot commented Feb 26, 2026

✅ Ability Validation Passed

📋 Validating: community/philips-hue-control
  ✅ All checks passed!

@github-actions github-actions bot added the first-contribution First-time contributor label Feb 26, 2026
@abubakar4360
Contributor

Can you please share the demo with us?

@abubakar4360
Contributor

abubakar4360 commented Mar 24, 2026

Hey @jamalnajmi, great work overall. The LLM-routed intent classification is the right call, and the name resolution fallback with _resolve_name_with_llm is a solid touch. Found a few things that need to be fixed before this can be approved, though:

  1. The pre-classifier fast-paths in _classify_intent use hardcoded substring checks against the lowercased input before the LLM ever sees it — a user saying "lights out", "kill all the lights", "everything off", or "light it all up" will silently bypass the classifier or hit the wrong branch. Remove both the all...off and all...on fast-paths and let the LLM handle ALL_OFF and ALL_ON — the classifier is already configured for both intents and will do this correctly.

  2. The if "help" in lower fast-path has the same problem. A user saying "what can you do", "what do I say", "how does this work", or "what should I ask" will never reach the help response. Remove it and add help as a routable intent in CLASSIFY_PROMPT instead.

  3. EXIT_WORDS is missing too many natural spoken exits to be reliable. Add: "forget it", "never mind", "nevermind" (one word — common STT output), "leave it", "I'm good", "that's it", "all done", "no thanks", "actually". Long-term fix: replace the any(word in lower for word in EXIT_WORDS) check with a single LLM classifier call — "Is the user trying to stop or exit? Return YES or NO."

  4. The bridge setup confirmation check if "ready" not in readiness only listens for the literal word "ready" after the user is asked to press the button. A user saying "ok", "go ahead", "yep", "done", "sure", "good to go", or "alright" will fall through to "I will try pairing now" as if they said nothing useful. Expand the check to a small set of affirmatives or swap it for a short LLM classifier.

  5. Three speak() strings use uncontracted forms that sound robotic on TTS: "I will try pairing now." → "I'll try pairing now."; "I did not catch a valid IP..." → "Didn't catch that IP. Try something like 192 dot 168 dot 1 dot 45."; "I did not detect the button press." → "Didn't catch the button press. Try once more and say ready."

  6. "That light doesn't support color. I can set brightness and white temperature." — two sentences where one will do. Change to "That light doesn't do color — try brightness or white temperature instead."

  7. CLASSIFY_PROMPT has no instruction for how its spoken output should sound. The response from this prompt goes directly into intent routing, but if the LLM appends any explanation or formatted text it will surface in unexpected ways. Add to the prompt: "Return only valid JSON, no markdown, no preamble, no explanation."

  8. "Press the round button on your Hue Bridge, then say ready." — slightly stiff. Change to "Go ahead and press the button on your Hue Bridge, then say ready."
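Points 1, 2, and 7 above could be combined into an LLM-only router along these lines. This is a minimal sketch under assumptions: `call_llm` stands in for whatever completion helper the ability actually uses, and `parse_intent` and the exact intent set are invented for illustration; only the intent names ALL_OFF, ALL_ON, and HELP come from the review itself.

```python
import json

# Routable intents; ALL_OFF, ALL_ON, and HELP mirror the review's suggestions,
# the rest are placeholders for the ability's other commands.
VALID_INTENTS = {"ALL_OFF", "ALL_ON", "HELP", "LIGHT_CONTROL", "STATUS", "UNKNOWN"}


def parse_intent(raw: str) -> str:
    """Parse the classifier's JSON reply, falling back to UNKNOWN."""
    try:
        intent = json.loads(raw).get("intent", "UNKNOWN").upper()
    except (json.JSONDecodeError, AttributeError):
        # Non-JSON or non-object replies (markdown, preamble, bare strings)
        # are treated as unclassified rather than crashing the flow.
        return "UNKNOWN"
    return intent if intent in VALID_INTENTS else "UNKNOWN"


def classify(utterance: str, call_llm) -> str:
    """Route every utterance through the LLM; no substring fast-paths."""
    prompt = (
        "Classify the user's lighting request into one of these intents: "
        + ", ".join(sorted(VALID_INTENTS))
        + '. Return only valid JSON like {"intent": "ALL_OFF"}'
        + " with no markdown, no preamble, and no explanation.\nUser: "
        + utterance
    )
    return parse_intent(call_llm(prompt))
```

The strict "JSON only" instruction plus a defensive parser covers point 7: even if the model ignores the format rule, the fallback keeps routing deterministic.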

@jamalnajmi
Author

Hi @abubakar4360, thank you for the review and the feedback. Let me take a look and make the suggested changes.

@jamalnajmi
Author

@abubakar4360

Implemented all requested updates in main.py based on your feedback.
What I changed

  1. Removed hardcoded _classify_intent fast-paths for help, all_off, and all_on; these now route through the LLM classifier.
  2. Added a stricter classifier prompt output rule: return JSON only (no markdown, preamble, or explanation).
  3. Expanded exit handling:
     • Added missing natural phrases (forget it, never mind, nevermind, leave it, I'm good, that's it, all done, no thanks, actually, etc.).
     • Added an LLM YES/NO exit-intent check with keyword fallback.
  4. Improved bridge pairing confirmation by accepting common affirmatives (ok, yep, sure, done, good to go, alright, etc.), not only “ready”.
  5. Updated TTS phrasing for more natural speech:
     • “I’ll try pairing now.”
     • “Didn’t catch that IP. Try something like 192 dot 168 dot 1 dot 45.”
     • “Didn’t catch the button press. Try once more and say ready.”
     • “Go ahead and press the button on your Hue Bridge, then say ready.”
     • “That light doesn’t do color - try brightness or white temperature instead.”
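The exit-intent and pairing-confirmation changes described above might look roughly like this. The names `EXIT_WORDS`, `AFFIRMATIVES`, `wants_exit`, and `confirmed_button_press`, and the assumption of a synchronous `call_llm` helper, are illustrative rather than the PR's actual implementation.

```python
# Illustrative phrase sets drawn from the review thread; the real lists
# in main.py may differ.
EXIT_WORDS = {
    "stop", "exit", "forget it", "never mind", "nevermind", "leave it",
    "i'm good", "that's it", "all done", "no thanks",
}

AFFIRMATIVES = {
    "ready", "ok", "okay", "yes", "yep", "sure", "done",
    "go ahead", "good to go", "alright", "all right",
}


def wants_exit(utterance: str, call_llm=None) -> bool:
    """Prefer a YES/NO LLM check; fall back to keyword matching."""
    if call_llm is not None:
        try:
            reply = call_llm(
                "Is the user trying to stop or exit? Return YES or NO.\n"
                f'User: "{utterance}"'
            )
            return reply.strip().upper().startswith("YES")
        except Exception:
            pass  # LLM unavailable: fall through to the keyword check
    lower = utterance.lower()
    return any(word in lower for word in EXIT_WORDS)


def confirmed_button_press(utterance: str) -> bool:
    """Accept common affirmatives after the pairing-button prompt.

    Plain substring matching keeps the sketch simple; it can also match
    affirmatives embedded inside longer words or negations, which a real
    implementation (or the LLM variant) should guard against.
    """
    lower = utterance.lower()
    return any(word in lower for word in AFFIRMATIVES)
```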

@abubakar4360
Contributor

Thanks for submitting the changes. Can you please send the working demo as well (if you have one)?

@jamalnajmi
Author

> Thanks for submitting the changes. Can you please send the working demo as well (if you have one)?

Hi @abubakar4360. This is something I have brought up with Chris Gonzalez. He mentioned he would assist in testing, as I do not have the setup required for testing.

abubakar4360 and others added 2 commits March 27, 2026 10:12
@uzair401
Copy link
Copy Markdown
Contributor

Hello @jamalnajmi,

Thank you for your submission and the effort you’ve put into this ability. Since this involves hardware interactions, we will need to test it with real hardware to fully verify its functionality. To keep things organized, we’ll temporarily close the PR and revisit it after hardware testing. Once the hardware testing and verification are complete, we will reopen the PR and proceed with merging it.

We truly appreciate your contributions and encourage you to continue submitting new abilities, as your work helps strengthen the ecosystem.

@uzair401 uzair401 closed this Mar 27, 2026

Labels

community-ability Community-contributed ability first-contribution First-time contributor
