
feat: support whisper with openvino #417

Open
ziyuanguo1998 wants to merge 2 commits into main from ziyuan/whisper-ov

Conversation

@ziyuanguo1998
Contributor

No description provided.

Copilot AI review requested due to automatic review settings May 13, 2026 02:43
@ziyuanguo1998 ziyuanguo1998 requested review from a team as code owners May 13, 2026 02:43
Diff excerpts shown on the page:

```python
# Get arguments
output_dir: str = oliveJson["output_dir"]
cache_dir: str = oliveJson["cache_dir"]
config_pass = oliveJson["passes"]["aitkpython"]

# The cwd is model project folder
history_folder = os.path.dirname(args.config)
```

```python
# Licensed under the MIT License.
# --------------------------------------------------------------------------
import os
import sys
```
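Taken together, these excerpts suggest the AitkPython entrypoint reads its parameters from a generated Olive workflow JSON. A minimal, hypothetical sketch of that pattern (the function names `parse_args`/`load_workflow_settings` are illustrative; the JSON keys come from the excerpts above):

```python
import argparse
import json
import os


def parse_args():
    # Mirrors the excerpt: --config points at the generated Olive workflow JSON
    parser = argparse.ArgumentParser()
    parser.add_argument("--config", required=True, help="path to input config file")
    return parser.parse_args()


def load_workflow_settings(config_path):
    # Read the Olive workflow JSON and pull out the fields the entrypoint
    # needs; the key names ("output_dir", "cache_dir", "passes.aitkpython")
    # are taken from the excerpts above.
    with open(config_path, encoding="utf-8") as f:
        olive_json = json.load(f)
    output_dir = olive_json["output_dir"]
    cache_dir = olive_json["cache_dir"]
    config_pass = olive_json["passes"]["aitkpython"]
    # The excerpt derives the project folder from the config's location
    history_folder = os.path.dirname(config_path)
    return output_dir, cache_dir, config_pass, history_folder
```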
Contributor

Copilot AI left a comment


Pull request overview

This PR adds an Intel OpenVINO workflow for the openai/whisper-large-v3-turbo recipe under the AITK/Foundry Toolkit structure, alongside updates to the central model list/docs to advertise Intel CPU/GPU/NPU support.

Changes:

  • Add an OpenVINO AitkPython workflow (ov_workflow.json + ov_workflow.py) plus an inference notebook sample.
  • Add OpenVINO conversion/encapsulation Olive configs and supporting conversion scripts/configs in the recipe folder.
  • Register Intel runtimes for this model in .aitk/configs/model_list.json and .aitk/docs/guide/ModelList.md, and bump the recipe version.

Reviewed changes

Copilot reviewed 14 out of 14 changed files in this pull request and generated 8 comments.

| File | Description |
| --- | --- |
| `openai-whisper-large-v3-turbo/aitk/whisper_large_v3_turbo_encapsulate.json` | Adds OpenVINO encapsulation Olive config (currently has inconsistent system/EP vs `target_device`). |
| `openai-whisper-large-v3-turbo/aitk/whisper_large_v3_turbo_default_ov_npu.json` | Adds OpenVINO optimum conversion Olive config (currently has inconsistent system/EP vs the intended OpenVINO NPU conversion). |
| `openai-whisper-large-v3-turbo/aitk/qnn_evaluate.py` | Removes an unused import. |
| `openai-whisper-large-v3-turbo/aitk/ov_workflow.py` | Adds the AitkPython entrypoint for OpenVINO conversion/eval (currently ignores `runtime`/`output_dir`). |
| `openai-whisper-large-v3-turbo/aitk/ov_workflow.json.config` | Adds the ModelParameter config for the OpenVINO workflow (currently includes a non-Olive JSON in `oliveFile`, and a mismatched runtime path). |
| `openai-whisper-large-v3-turbo/aitk/ov_workflow.json` | Adds the AitkPython Olive workflow wrapper (currently has `output_dir` inconsistent with the produced artifacts). |
| `openai-whisper-large-v3-turbo/aitk/ov_workflow_inference_sample.ipynb` | Adds an OpenVINO inference example notebook. |
| `openai-whisper-large-v3-turbo/aitk/model_project.config` | Registers the new `ov_workflow.json` template and bumps the model version. |
| `openai-whisper-large-v3-turbo/aitk/info.yml` | Registers the OpenVINO recipe entry and bumps the version (currently has invalid YAML indentation for `devices`). |
| `openai-whisper-large-v3-turbo/aitk/convert_whisper_to_ovir.py` | Adds a conversion + encapsulation helper script for Whisper → OpenVINO IR + encapsulated ONNX. |
| `openai-whisper-large-v3-turbo/aitk/audio_processor_config_default.json` | Adds the default audio feature extraction config used by the OpenVINO workflow. |
| `.aitk/requirements/requirements-IntelNPU-WP.txt` | Adds a new runtime requirements set for IntelNPU with the WP feature. |
| `.aitk/docs/guide/ModelList.md` | Updates docs to list Intel CPU/GPU/NPU support for Whisper. |
| `.aitk/configs/model_list.json` | Updates the model registry to include IntelCPU/IntelGPU/IntelNPU runtimes and bumps the version. |
Comments suppressed due to low confidence (1)

openai-whisper-large-v3-turbo/aitk/ov_workflow.json.config:30

  • The runtime parameter path points to `systems.local_system...`, but `ov_workflow.json` uses `systems.target_system`. This will be rewritten by sanitize and can break runtime selection until then; update it to `systems.target_system.accelerators.0.device`.

```json
        ],
        "path": "systems.local_system.accelerators.0.device",
        "values": [
            "cpu",
            "npu",
```


"$schema": "https://github.com/microsoft/olive-recipes/raw/refs/heads/main/.aitk/configs/config_schema.json",
"name": "Convert to Intel CPU/NPU/GPU",
"oliveFile": {
"audio_processor_config_default.json": "OpenVINO/audio_processor_config_default.json",
"evaluate_input_model": false,
"target": "target_system",
"clean_cache": false,
"output_dir": "model/whisper_ov",
Comment on lines +10 to +12

```yaml
- npu
- cpu
- gpu
```
Comment on lines +11 to +13

```json
"device": "cpu",
"execution_providers": [
    "CPUExecutionProvider"
```

Comment on lines +11 to +13

```json
"device": "cpu",
"execution_providers": [
    "CPUExecutionProvider"
```
Comment on lines +25 to +29

```python
# Get arguments
output_dir: str = oliveJson["output_dir"]
cache_dir: str = oliveJson["cache_dir"]
config_pass = oliveJson["passes"]["aitkpython"]
```
Comment on lines +12 to +16

```python
parser = argparse.ArgumentParser()
parser.add_argument("--config", required=True, help="path to input config file")
parser.add_argument("--model_config", help="path to input model config file")
parser.add_argument("--runtime", required=True, help="runtime")
return parser.parse_args()
```
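For reference, a parser like the one in this excerpt can be exercised without touching `sys.argv` by passing an explicit argument list to `parse_args` (a minimal sketch; the flag names come from the excerpt above, the wrapper signature is an assumption):

```python
import argparse


def parse_args(argv=None):
    # Same flags as the excerpt; argv=None falls back to sys.argv
    parser = argparse.ArgumentParser()
    parser.add_argument("--config", required=True, help="path to input config file")
    parser.add_argument("--model_config", help="path to input model config file")
    parser.add_argument("--runtime", required=True, help="runtime")
    return parser.parse_args(argv)


args = parse_args(["--config", "ov_workflow.json", "--runtime", "IntelNPU"])
print(args.config, args.runtime)  # → ov_workflow.json IntelNPU
```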
Comment on lines +50 to +51

```python
subprocess.run([sys.executable, "convert_whisper_to_ovir.py",
                "--enable_npu_ws", "True"],
```
Requirements excerpt:

```text
urllib3==2.6.3
zipp==3.23.0
wasdk-Microsoft.Windows.AI.MachineLearning==1.8.251106002
wasdk-Microsoft.Windows.ApplicationModel.DynamicDependency.Bootstrap==1.8.251106002
```
Contributor

@xieofxie commented on May 13, 2026


please clean up



4 participants