clabverter: failed to add configmap volume when using startup-config inside kinds #217

@italovalcy

Description

When using startup-config inside kinds, clabverter fails to add the corresponding ConfigMap volume to the generated Deployment:

name: my_lab
topology:
  defaults:
    kind: linux
    image: ghcr.io/srl-labs/alpine
    binds:
      - configs/client.sh:/config.sh
    exec:
      - "ash -c '/config.sh ${NODE_ID}'"
  kinds:
    srl:
      kind: nokia_srlinux
      type: ixrd3
      image: ghcr.io/nokia/srlinux:latest
      startup-config: configs/srl.cfg
  nodes:
    node1:
      kind: srl
    node2:
      kind: nokia_srlinux
      image: ghcr.io/nokia/srlinux:24.10.1
      startup-config: configs/srl.cfg
    host1:
      env:
        NODE_ID: "1"
    host2:
      env:
        NODE_ID: "2"

  links:
    # links between client1 and srl1
    - endpoints: [host1:eth1, node1:e1-1]

    # inter-switch link
    - endpoints: [node1:e1-10, node2:e1-10]

    # links between client2 and srl2
    - endpoints: [node2:e1-1, host2:eth1]
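For comparison, when startup-config is set directly on a node (as on node2 above), clabverter wires a ConfigMap volume into the Deployment so the file exists at the path containerlab expects. A sketch of that wiring is below; the volume and ConfigMap names are hypothetical, not copied from real clabverter output. For node1, which only inherits startup-config from the srl kind, no such volume is rendered, which is the bug:

```yaml
# Illustrative sketch only: names like "startup-config-vol" and the
# ConfigMap name are hypothetical, not actual clabverter output.
spec:
  template:
    spec:
      containers:
        - name: node2
          volumeMounts:
            - name: startup-config-vol
              # matches the path containerlab reports in the error below
              mountPath: /clabernetes/configs/srl.cfg
              subPath: srl.cfg
      volumes:
        - name: startup-config-vol
          configMap:
            name: node2-startup-config
```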

Then, after deploying the lab, the node1 container stays in a CrashLoopBackOff state:

> kubectl --kubeconfig $KUBECONFIG -n kubernp get pods -l clabernetes/topologyOwner==clab-0a5adb59405a4e
NAME                                         READY   STATUS             RESTARTS      AGE
clab-0a5adb59405a4e-host1-8fb5746-85www      1/1     Running            0             25m
clab-0a5adb59405a4e-host2-dc7c8db5-l4xwr     1/1     Running            0             25m
clab-0a5adb59405a4e-node1-789c6b8f-4qnv5     0/1     CrashLoopBackOff   9 (45s ago)   25m
clab-0a5adb59405a4e-node2-544bf55668-ph4lt   1/1     Running            0             25m

Looking at the pod logs for more detail:

> kubectl --kubeconfig $KUBECONFIG -n kubernp logs clab-0a5adb59405a4e-node1-789c6b8f-4qnv5

...

    INFO |              containerlab | Error: node "node1" startup-config file not found by the path /clabernetes/configs/srl.cfg

CRITICAL |               clabernetes | failed launching containerlab, will try to gather crashed container logs then will exit, err: exit status 1
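Until kind-level inheritance is handled by clabverter, a workaround consistent with the topology above is to repeat startup-config directly on each affected node (node2, which declares it at node level, runs fine), for example:

```yaml
  nodes:
    node1:
      kind: srl
      # workaround: duplicate the kind-level startup-config at node level
      startup-config: configs/srl.cfg
```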
