Commits
55 commits
70ee529
Command placeholder
ilyakuz-db Jan 14, 2026
b9d9afc
First iteration of YAML generation
ilyakuz-db Jan 14, 2026
ef55090
File writer
ilyakuz-db Jan 14, 2026
1bf39c9
Target overrides
ilyakuz-db Jan 14, 2026
afc2e87
Cleanup
ilyakuz-db Jan 14, 2026
beba54d
Fix invalid Dyn panic
ilyakuz-db Jan 15, 2026
d8fac50
Fix test to use structs
ilyakuz-db Jan 15, 2026
e0bb6b1
Cleanup
ilyakuz-db Jan 15, 2026
02be4c1
Fix missing tags issue
ilyakuz-db Jan 15, 2026
ee2564d
Fix sequences
ilyakuz-db Jan 15, 2026
0436a74
Cleanup
ilyakuz-db Jan 15, 2026
8ac2ba8
Rename
ilyakuz-db Jan 15, 2026
53066eb
Use fewer dyn.Value conversions
ilyakuz-db Jan 16, 2026
d4c5744
More asserts
ilyakuz-db Jan 16, 2026
bde1e10
Cleanup
ilyakuz-db Jan 16, 2026
b93536f
Fix add fields
ilyakuz-db Jan 18, 2026
072b408
Encapsulate dyn values in resolve selectors function
ilyakuz-db Jan 19, 2026
f7be4e3
Cleanup dyn logic
ilyakuz-db Jan 19, 2026
fc8baf8
Simplify patching behavior
ilyakuz-db Jan 19, 2026
57842b6
Hardcoded server side defaults
ilyakuz-db Jan 19, 2026
9c4f04b
Normalise values using SDK marshaler
ilyakuz-db Jan 19, 2026
a51189f
Cleanup
ilyakuz-db Jan 19, 2026
2e4076f
Move command to the bundle namespace
ilyakuz-db Jan 19, 2026
cf867e1
More strict errors
ilyakuz-db Jan 19, 2026
dd4d5eb
Better errors
ilyakuz-db Jan 20, 2026
2c70e2c
Fix tests
ilyakuz-db Jan 20, 2026
a8e96e6
Go mod update
ilyakuz-db Jan 20, 2026
52c71ae
Merge branch 'main' into config-remote-sync-command
ilyakuz-db Jan 20, 2026
2df4366
Remove unused argument
ilyakuz-db Jan 21, 2026
25603ef
Add terraform defaults
ilyakuz-db Jan 21, 2026
3830ba1
Update acceptance test
ilyakuz-db Jan 21, 2026
e1fe23e
Merge branch 'main' of github.com:databricks/cli into config-remote-s…
ilyakuz-db Jan 21, 2026
2335079
Remove webhook_notifications and email_notifications defaults
ilyakuz-db Jan 21, 2026
687ebee
Remove unused functions
ilyakuz-db Jan 21, 2026
72ce457
Add acceptance tests
ilyakuz-db Jan 22, 2026
ec82ac7
Stable sort for changes
ilyakuz-db Jan 22, 2026
09afeb5
Update tests
ilyakuz-db Jan 22, 2026
2f1f185
Update storage field
ilyakuz-db Jan 22, 2026
e31d1c9
Update tests
ilyakuz-db Jan 22, 2026
3695217
Update tests
ilyakuz-db Jan 22, 2026
2fc8dc3
Move tests to bundle/
ilyakuz-db Jan 22, 2026
0fff83a
Test update
ilyakuz-db Jan 22, 2026
022bb5d
Skip windows
ilyakuz-db Jan 22, 2026
6ef718d
Merge branch 'main' of github.com:databricks/cli into config-remote-s…
ilyakuz-db Jan 22, 2026
acefab4
Update test tomls
ilyakuz-db Jan 22, 2026
cf60183
Merge branch 'main' of github.com:databricks/cli into config-remote-s…
ilyakuz-db Jan 22, 2026
e471f65
Use diff for test output
ilyakuz-db Jan 22, 2026
838885b
Fix globs drift
ilyakuz-db Jan 22, 2026
65bbac9
Remove backup
ilyakuz-db Jan 22, 2026
7c6cc1e
Merge tests, delete task in one test
ilyakuz-db Jan 22, 2026
c476464
Cleanup
ilyakuz-db Jan 22, 2026
7092091
New test suite with config edits
ilyakuz-db Jan 22, 2026
bf2c771
Some cleanup
ilyakuz-db Jan 22, 2026
08805b6
Add tests for permissions
ilyakuz-db Jan 22, 2026
ab7b1f7
Enable cloud for resource-specific tests, disable permissions
ilyakuz-db Jan 22, 2026
8 changes: 8 additions & 0 deletions acceptance/bin/edit_resource.py
@@ -32,6 +32,14 @@ def set(self, job_id, value):
return run([CLI, "jobs", "reset", job_id, "--json", json.dumps(payload)])


class pipelines:
def get(self, pipeline_id):
return run_json([CLI, "pipelines", "get", pipeline_id])["spec"]

def set(self, pipeline_id, value):
return run([CLI, "pipelines", "update", pipeline_id, "--json", json.dumps(value)])


def main():
parser = argparse.ArgumentParser()
parser.add_argument("type")
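For context on how this helper is driven: the acceptance scripts pipe Python statements into `edit_resource.py`, which mutate the fetched resource dict bound to the name `r` before it is written back. A minimal, hypothetical sketch of that apply step (`apply_edit` is an illustrative name, not the helper's actual API):

```python
import json
import sys


def apply_edit(resource: dict, edit_src: str) -> dict:
    # The piped script mutates `r` in place, e.g.
    #   r["max_concurrent_runs"] = 5
    #   r["tags"].pop("version", None)
    exec(edit_src, {}, {"r": resource})
    return resource


if __name__ == "__main__":
    r = {"tags": {"env": "dev", "version": "v1"}}
    apply_edit(r, 'r["tags"].pop("version", None)')
    print(json.dumps(r))  # {"tags": {"env": "dev"}}
```

In the real helper the mutated dict is then posted back via `jobs reset` or `pipelines update`, as the diff above shows.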
@@ -0,0 +1,42 @@
bundle:
name: test-bundle

resources:
jobs:
my_job:
tasks:
- task_key: main
notebook_task:
notebook_path: /Users/{{workspace_user_name}}/notebook
new_cluster:
spark_version: 13.3.x-scala2.12
node_type_id: i3.xlarge
num_workers: 1

targets:
default:
resources:
jobs:
my_job:
email_notifications:
on_success:
- success@example.com
parameters:
- name: catalog
default: main
- name: env
default: dev
trigger:
periodic:
interval: 1
unit: DAYS
tags:
env: dev
version: v1
max_concurrent_runs: 1
environments:
- environment_key: default
spec:
environment_version: "3"
dependencies:
- ./*.whl

57 changes: 57 additions & 0 deletions acceptance/bundle/config-remote-sync/config_edits/output.txt
@@ -0,0 +1,57 @@
Uploading dummy.whl...
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/test-bundle/default/files...
Deploying resources...
Updating deployment state...
Deployment complete!

=== Edit config locally

=== Edit job remotely

=== Detect and save changes
Detected changes in 1 resource(s):

Resource: resources.jobs.my_job
email_notifications.on_failure[0]: update
max_concurrent_runs: update
tags['env']: update


=== Configuration changes
--- databricks.yml.backup [TIMESTAMP]
+++ databricks.yml [TIMESTAMP]
@@ -1,6 +1,5 @@
bundle:
name: test-bundle
-
resources:
jobs:
my_job:
@@ -12,7 +11,6 @@
spark_version: 13.3.x-scala2.12
node_type_id: [NODE_TYPE_ID]
num_workers: 1
-
targets:
default:
resources:
@@ -22,7 +20,7 @@
on_success:
- success@example.com
on_failure:
- - config-failure@example.com
+ - remote-failure@example.com
parameters:
- name: catalog
default: main
@@ -33,8 +31,8 @@
interval: 1
unit: DAYS
tags:
- env: config-production
- max_concurrent_runs: 3
+ env: remote-staging
+ max_concurrent_runs: 5
timeout_seconds: 3600
environments:
- environment_key: default
96 changes: 96 additions & 0 deletions acceptance/bundle/config-remote-sync/config_edits/script
@@ -0,0 +1,96 @@
#!/bin/bash

touch dummy.whl
$CLI bundle deploy
job_id="$(read_id.py my_job)"

title "Edit config locally"
echo
# Case 1: Add field in config (on_failure) - will also be added remotely with different value
# Case 2: Add field in config (timeout_seconds) - will be removed remotely
# Case 3: Remove field from config (version tag) - will also be removed remotely
# Case 4: Update field in config (max_concurrent_runs) - will also be updated remotely with different value
# Case 5: Update field in config (tags.env) - will also be updated remotely with different value
cat > databricks.yml <<'EOF'
bundle:
name: test-bundle

resources:
jobs:
my_job:
tasks:
- task_key: main
notebook_task:
notebook_path: /Users/{{workspace_user_name}}/notebook
new_cluster:
spark_version: 13.3.x-scala2.12
node_type_id: i3.xlarge
num_workers: 1

targets:
default:
resources:
jobs:
my_job:
email_notifications:
on_success:
- success@example.com
on_failure:
- config-failure@example.com
parameters:
- name: catalog
default: main
- name: env
default: dev
trigger:
periodic:
interval: 1
unit: DAYS
tags:
env: config-production
max_concurrent_runs: 3
timeout_seconds: 3600
environments:
- environment_key: default
spec:
environment_version: "3"
dependencies:
- ./*.whl
EOF

title "Edit job remotely"
echo
# Case 1: Add field remotely (on_failure) - also added in config with different value
edit_resource.py jobs $job_id <<EOF
r["email_notifications"]["on_failure"] = ["remote-failure@example.com"]
EOF

# Case 2: Remove field remotely (timeout_seconds) - added in config
edit_resource.py jobs $job_id <<EOF
r.pop("timeout_seconds", None)
EOF

# Case 3: Remove field remotely (version tag) - also removed in config
edit_resource.py jobs $job_id <<EOF
r["tags"].pop("version", None)
EOF

# Case 4: Update field remotely (max_concurrent_runs) - also updated in config with different value
edit_resource.py jobs $job_id <<EOF
r["max_concurrent_runs"] = 5
EOF

# Case 5: Update field remotely (tags.env) - also updated in config with different value
edit_resource.py jobs $job_id <<EOF
r["tags"]["env"] = "remote-staging"
EOF

title "Detect and save changes"
echo
cp databricks.yml databricks.yml.backup
$CLI bundle config-remote-sync --save

title "Configuration changes"
echo
diff -u databricks.yml.backup databricks.yml || true
rm databricks.yml.backup
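The `--save` flow above reports one stably sorted change path per updated field (`max_concurrent_runs: update`, `tags['env']: update`). A hypothetical sketch of how such paths could be computed by recursive comparison with sorted keys (`diff_paths` is an illustrative name; the CLI's actual implementation operates on `dyn.Value` trees in Go):

```python
def diff_paths(local: dict, remote: dict, prefix: str = "") -> list:
    """Return (path, kind) pairs for fields that differ, in stable key order."""
    changes = []
    for key in sorted(set(local) | set(remote)):
        path = f"{prefix}.{key}" if prefix else key
        a, b = local.get(key), remote.get(key)
        if isinstance(a, dict) and isinstance(b, dict):
            # Recurse into nested mappings, extending the dotted path.
            changes.extend(diff_paths(a, b, path))
        elif a != b:
            changes.append((path, "update"))
    return changes


local = {"max_concurrent_runs": 3, "tags": {"env": "config-production"}}
remote = {"max_concurrent_runs": 5, "tags": {"env": "remote-staging"}}
print(diff_paths(local, remote))
# [('max_concurrent_runs', 'update'), ('tags.env', 'update')]
```

Sorting the union of keys at each level is what makes the reported order deterministic across runs, matching the "Stable sort for changes" commit in this PR.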
10 changes: 10 additions & 0 deletions acceptance/bundle/config-remote-sync/config_edits/test.toml
@@ -0,0 +1,10 @@
RecordRequests = false
Cloud = true

Ignore = [".databricks", "dummy.whl"]

[Env]
DATABRICKS_BUNDLE_ENABLE_EXPERIMENTAL_YAML_SYNC = "true"

[EnvMatrix]
DATABRICKS_BUNDLE_ENGINE = ["direct"]
@@ -0,0 +1,25 @@
# Top-level comment about the bundle
bundle:
name: test-bundle

# Resources section with extra spacing
resources:
jobs:
my_job:
# Comment about max concurrent runs
max_concurrent_runs: 1

# Task configuration
tasks:
- task_key: main
notebook_task:
notebook_path: /Users/{{workspace_user_name}}/notebook
new_cluster:
spark_version: 13.3.x-scala2.12
node_type_id: i3.xlarge
num_workers: 1 # inline comment about workers

# Tags for categorization
tags:
env: dev # environment tag
team: data-eng

@@ -0,0 +1,40 @@
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/test-bundle/default/files...
Deploying resources...
Updating deployment state...
Deployment complete!

=== Modify max_concurrent_runs from 1 to 5
=== Detect and save changes
Detected changes in 1 resource(s):

Resource: resources.jobs.my_job
max_concurrent_runs: update


=== Configuration changes
--- databricks.yml.backup [TIMESTAMP]
+++ databricks.yml [TIMESTAMP]
@@ -1,14 +1,12 @@
# Top-level comment about the bundle
bundle:
name: test-bundle
-
# Resources section with extra spacing
resources:
jobs:
my_job:
# Comment about max concurrent runs
- max_concurrent_runs: 1
-
+ max_concurrent_runs: 5
# Task configuration
tasks:
- task_key: main
@@ -18,7 +16,6 @@
spark_version: 13.3.x-scala2.12
node_type_id: [NODE_TYPE_ID]
num_workers: 1 # inline comment about workers
-
# Tags for categorization
tags:
env: dev # environment tag
@@ -0,0 +1,20 @@
#!/bin/bash

touch dummy.whl
$CLI bundle deploy
job_id="$(read_id.py my_job)"

title "Modify max_concurrent_runs from 1 to 5"
edit_resource.py jobs $job_id <<EOF
r["max_concurrent_runs"] = 5
EOF

title "Detect and save changes"
echo
cp databricks.yml databricks.yml.backup
$CLI bundle config-remote-sync --save

title "Configuration changes"
echo
diff -u databricks.yml.backup databricks.yml || true
rm databricks.yml.backup
@@ -0,0 +1,8 @@
RecordRequests = false
Ignore = [".databricks", "dummy.whl"]

[Env]
DATABRICKS_BUNDLE_ENABLE_EXPERIMENTAL_YAML_SYNC = "true"

[EnvMatrix]
DATABRICKS_BUNDLE_ENGINE = ["direct"]
37 changes: 37 additions & 0 deletions acceptance/bundle/config-remote-sync/job_fields/databricks.yml
@@ -0,0 +1,37 @@
bundle:
name: test-bundle

resources:
jobs:
my_job:
email_notifications:
on_success:
- success@example.com
parameters:
- name: catalog
default: main
- name: env
default: dev
trigger:
periodic:
interval: 1
unit: DAYS
tags:
env: dev
permissions:
- level: CAN_VIEW
user_name: viewer@example.com
environments:
- environment_key: default
spec:
environment_version: "3"
dependencies:
- ./*.whl
tasks:
- task_key: main
notebook_task:
notebook_path: /Users/{{workspace_user_name}}/notebook
new_cluster:
spark_version: 13.3.x-scala2.12
node_type_id: i3.xlarge
num_workers: 1
