Port visualizations from HoloViews/Panel to Plotly (~100x faster save) #830

Open
Conversation
…er report.save()

The primary bottleneck was Panel's `embed_state()` pre-computing JSON diffs for every HoloMap slider position during `report.save()`. With 7 result vars and `over_time=True`, this took 16+ seconds. By replacing HoloViews plots with native Plotly figures, interactivity is handled client-side via JavaScript dropdowns, eliminating server-side pre-rendering entirely.

Key changes:
- All result classes (scatter, line, curve, bar, heatmap, band, histogram, violin, box, surface, volume) now produce `go.Figure` instead of hv elements
- `over_time` slider replaced with a Plotly dropdown menu (visibility toggling)
- `BenchReport.save()` uses `plotly.io.to_html()` instead of Panel `embed=True`
- `bench_result_base.py`: `DatasetWrapper` replaces `hv.Dataset` in the pipeline
- `_wrap_plotly_figure()` converts `go.Figure` to `pn.pane.Plotly` for Panel layout

Performance: `report.save()` drops from ~16s to ~0.13s for complex benchmarks.

https://claude.ai/code/session_0141XH5bouxpUHWaLVschsWP
Contributor
Sorry @blooop, your pull request is larger than the review limit of 150000 diff characters
- Replace undefined `hv.Dataset` references with `DatasetWrapper` in bench_result_base.py and video_result.py (F821 lint errors)
- Update over_time tests to check for Plotly dropdown menus instead of Panel DiscreteSlider widgets that no longer exist after the port
- Remove unused imports flagged by ruff (holoviews, panel, plotly.io, etc.)
- Apply ruff formatting to files reformatted by the port
Performance Report for

| Metric | Value |
|---|---|
| Total tests | 919 |
| Total time | 127.47s |
| Mean | 0.1387s |
| Median | 0.0020s |

Top 10 slowest tests

| Test | Time (s) |
|---|---|
| test.test_bench_examples.TestBenchExamples::test_example_meta | 47.595 |
| test.test_over_time_repeats.TestMaxSliderPoints::test_default_subsampling_caps_at_max | 8.013 |
| test.test_hash_persistent.TestCrossProcessDeterminism::test_hash_stable_across_two_processes[ResultBool] | 7.445 |
| test.test_generated_examples::test_generated_example[3_float/over_time/sweep_3_float_2_cat_over_time.py] | 2.061 |
| test.test_generated_examples::test_generated_example[1_float/over_time_repeats/sweep_1_float_3_cat_over_time_repeats.py] | 2.017 |
| test.test_over_time_save_perf::test_save_faster_without_aggregated_tab | 1.933 |
| test.test_generated_examples::test_generated_example[1_float/over_time/sweep_1_float_3_cat_over_time.py] | 1.789 |
| test.test_optuna_result.TestOptunaResult::test_collect_optuna_plots_with_repeats | 1.367 |
| test.test_generated_examples::test_generated_example[0_float/over_time/sweep_0_float_3_cat_over_time.py] | 1.299 |
| test.test_generated_examples::test_generated_example[1_float/over_time/sweep_1_float_2_cat_over_time.py] | 1.154 |

Updated by Performance Tracking workflow
- Prefix unused backward-compat args with _ or add pylint disable comments
- Narrow bare Exception catches to specific types (ValueError, TypeError, OSError)
- Suppress protected-access warning in test calling _build_curve_overlay
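The exception-narrowing change above follows a general Python pattern; a minimal before/after sketch (the function names here are invented for illustration, not from the PR):

```python
# Before: a bare Exception catch swallows unrelated bugs
# (attribute typos, programming errors) along with real parse failures.
def parse_value_before(raw):
    try:
        return float(raw)
    except Exception:
        return None

# After: catch only the failures this call can actually raise.
# float() raises ValueError for bad strings and TypeError for None.
def parse_value_after(raw):
    try:
        return float(raw)
    except (ValueError, TypeError):
        return None
```

The narrowed version behaves identically for the intended failure modes but lets genuine bugs surface instead of being silently converted to `None`.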
Performance Report for

| Metric | Value |
|---|---|
| Total tests | 919 |
| Total time | 124.54s |
| Mean | 0.1355s |
| Median | 0.0020s |

Top 10 slowest tests

| Test | Time (s) |
|---|---|
| test.test_bench_examples.TestBenchExamples::test_example_meta | 46.892 |
| test.test_over_time_repeats.TestMaxSliderPoints::test_default_subsampling_caps_at_max | 7.936 |
| test.test_hash_persistent.TestCrossProcessDeterminism::test_hash_stable_across_two_processes[ResultBool] | 6.625 |
| test.test_generated_examples::test_generated_example[3_float/over_time/sweep_3_float_2_cat_over_time.py] | 2.157 |
| test.test_generated_examples::test_generated_example[1_float/over_time_repeats/sweep_1_float_3_cat_over_time_repeats.py] | 1.995 |
| test.test_over_time_save_perf::test_save_faster_without_aggregated_tab | 1.931 |
| test.test_generated_examples::test_generated_example[1_float/over_time/sweep_1_float_3_cat_over_time.py] | 1.756 |
| test.test_optuna_result.TestOptunaResult::test_collect_optuna_plots_with_repeats | 1.356 |
| test.test_generated_examples::test_generated_example[0_float/over_time/sweep_0_float_3_cat_over_time.py] | 1.275 |
| test.test_generated_examples::test_generated_example[3_float/over_time/sweep_3_float_1_cat_over_time.py] | 1.156 |

Updated by Performance Tracking workflow
- Fix publish_gh_pages/publish to include _tabs/ directory (multi-tab reports were broken on GitHub Pages because only index.html was committed)
- Fix double Plotly JS loading in _save_tab_plotly (header included CDN script AND first figure used include_plotlyjs="cdn")
- Make _extract_plotly_figures/_extract_markdown defensive against str/dict matching __iter__, and check .objects before __iter__ for Panel containers
- Delete dead plotly_result.py (183 lines, never imported anywhere)
- Remove 15 dead backward-compat stubs from HoloviewResult (to_hv_type, to_hv_container, to_error_bar, to_points, to_nd_layout, to_holomap, etc.)
- Remove dead _AGG_TITLE, use_tap, set_default_opts(), _apply_opts, _over_time_kdims from holoview_result.py
- Remove dead to_heatmap_single/to_heatmap_tap stubs from heatmap_result.py
- Remove redundant PlotFilter pre-check in scatter_result.py
- Fix ReduceType re-export in __init__.py (was importing from holoview_result)
- Update stale HoloViews references in comments/docstrings
- Fix flaky timing test (save is now ~0.04s for both cases)
Performance Report for

| Metric | Value |
|---|---|
| Total tests | 908 |
| Total time | 127.72s |
| Mean | 0.1407s |
| Median | 0.0020s |

Top 10 slowest tests

| Test | Time (s) |
|---|---|
| test.test_bench_examples.TestBenchExamples::test_example_meta | 49.256 |
| test.test_over_time_repeats.TestMaxSliderPoints::test_default_subsampling_caps_at_max | 7.863 |
| test.test_hash_persistent.TestCrossProcessDeterminism::test_hash_stable_across_two_processes[ResultBool] | 7.037 |
| test.test_generated_examples::test_generated_example[3_float/over_time/sweep_3_float_2_cat_over_time.py] | 2.108 |
| test.test_generated_examples::test_generated_example[1_float/over_time_repeats/sweep_1_float_3_cat_over_time_repeats.py] | 2.012 |
| test.test_over_time_save_perf::test_save_faster_without_aggregated_tab | 1.936 |
| test.test_generated_examples::test_generated_example[1_float/over_time/sweep_1_float_3_cat_over_time.py] | 1.724 |
| test.test_optuna_result.TestOptunaResult::test_collect_optuna_plots_with_repeats | 1.371 |
| test.test_generated_examples::test_generated_example[0_float/over_time/sweep_0_float_3_cat_over_time.py] | 1.281 |
| test.test_generated_examples::test_generated_example[3_float/over_time/sweep_3_float_1_cat_over_time.py] | 1.137 |

Updated by Performance Tracking workflow
…bar groupby

- Use `markdown` library for proper Markdown→HTML conversion in saved reports (headings, lists, bold, tables, fenced code blocks now render correctly)
- Detect Bokeh panes (image/video over_time sliders) in save pipeline and delegate to Panel's save() which properly embeds Bokeh widgets with JS interactivity, instead of silently dropping them
- Restore multi-level groupby for bar charts: `by` parameter now accepts a list of dimension names (was truncated to first candidate only)
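The Markdown→HTML conversion mentioned above maps onto the Python-Markdown package's API; a minimal sketch (the sample text is invented, and the exact extension set used by the PR is an assumption, but `tables` and `fenced_code` are the standard extensions for the two features called out):

```python
import markdown

text = """# Results

- **speedup**: ~120x

| before | after |
|---|---|
| 16s | 0.13s |
"""

# Pipe tables and fenced code blocks are not part of core Markdown;
# without the "tables" and "fenced_code" extensions they pass through
# as plain text instead of rendering as <table> / <pre><code>.
html = markdown.markdown(text, extensions=["tables", "fenced_code"])
```

This replaces whatever ad-hoc conversion previously left headings, bold text, and tables unrendered in saved reports.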
Performance Report for

| Metric | Value |
|---|---|
| Total tests | 908 |
| Total time | 127.17s |
| Mean | 0.1401s |
| Median | 0.0020s |

Top 10 slowest tests

| Test | Time (s) |
|---|---|
| test.test_bench_examples.TestBenchExamples::test_example_meta | 48.690 |
| test.test_over_time_repeats.TestMaxSliderPoints::test_default_subsampling_caps_at_max | 8.235 |
| test.test_hash_persistent.TestCrossProcessDeterminism::test_hash_stable_across_two_processes[ResultBool] | 6.810 |
| test.test_generated_examples::test_generated_example[3_float/over_time/sweep_3_float_2_cat_over_time.py] | 2.170 |
| test.test_generated_examples::test_generated_example[1_float/over_time_repeats/sweep_1_float_3_cat_over_time_repeats.py] | 2.036 |
| test.test_over_time_save_perf::test_save_faster_without_aggregated_tab | 1.944 |
| test.test_generated_examples::test_generated_example[1_float/over_time/sweep_1_float_3_cat_over_time.py] | 1.758 |
| test.test_optuna_result.TestOptunaResult::test_collect_optuna_plots_with_repeats | 1.310 |
| test.test_generated_examples::test_generated_example[0_float/over_time/sweep_0_float_3_cat_over_time.py] | 1.250 |
| test.test_generated_examples::test_generated_example[3_float/over_time/sweep_3_float_1_cat_over_time.py] | 1.135 |

Updated by Performance Tracking workflow
Performance Report for

| Metric | Value |
|---|---|
| Total tests | 908 |
| Total time | 123.41s |
| Mean | 0.1359s |
| Median | 0.0020s |

Top 10 slowest tests

| Test | Time (s) |
|---|---|
| test.test_bench_examples.TestBenchExamples::test_example_meta | 46.974 |
| test.test_over_time_repeats.TestMaxSliderPoints::test_default_subsampling_caps_at_max | 8.000 |
| test.test_hash_persistent.TestCrossProcessDeterminism::test_hash_stable_across_two_processes[ResultBool] | 6.734 |
| test.test_generated_examples::test_generated_example[3_float/over_time/sweep_3_float_2_cat_over_time.py] | 2.042 |
| test.test_generated_examples::test_generated_example[1_float/over_time_repeats/sweep_1_float_3_cat_over_time_repeats.py] | 1.931 |
| test.test_over_time_save_perf::test_save_faster_without_aggregated_tab | 1.920 |
| test.test_generated_examples::test_generated_example[1_float/over_time/sweep_1_float_3_cat_over_time.py] | 1.686 |
| test.test_optuna_result.TestOptunaResult::test_collect_optuna_plots_with_repeats | 1.281 |
| test.test_generated_examples::test_generated_example[0_float/over_time/sweep_0_float_3_cat_over_time.py] | 1.208 |
| test.test_generated_examples::test_generated_example[3_float/over_time/sweep_3_float_1_cat_over_time.py] | 1.194 |

Updated by Performance Tracking workflow
Resolve conflicts keeping Plotly-based rendering from the PR while incorporating main's structural improvements (pane_layout/tabs support, numpy-based band reshape, RerunResult import, early-return guard in line_result, pytest fixtures).
Owner (Author)

Merged main into this branch. Resolved conflicts in 12 files, keeping the Plotly-based rendering from this PR while incorporating main's recent improvements.
Summary

- Eliminates the Panel `embed_state()` bottleneck that made `report.save()` take 16+ seconds for complex benchmarks
- `report.save()` now uses `plotly.io.to_html()` instead of Panel's `embed=True`, dropping save time from ~16s to ~0.13s (~120x speedup)

What changed

All result classes (scatter, line, curve, bar, heatmap, band, histogram, violin, box, surface, volume, table) now produce `go.Figure` instead of HoloViews elements. Key infrastructure changes:

- `bench_result_base.py`: `DatasetWrapper` replaces `hv.Dataset` in the layout pipeline; `_wrap_plotly_figure()` converts `go.Figure` to `pn.pane.Plotly` for Panel containers
- `bench_report.py`: `_save_tab_plotly()` extracts Plotly figures from the pane tree and writes them directly via `plotly.io.to_html()`
- `holoview_result.py`: `_build_time_dropdown_fig()` builds a single Plotly figure with dropdown buttons for time point selection (replaces `hv.HoloMap` + Panel `DiscreteSlider`)

Performance

Test plan

- Full test suite (`pixi run ci`) on full environment

https://claude.ai/code/session_0141XH5bouxpUHWaLVschsWP