
Version Issue for BetterTransformer. Please provide exact package dependencies and Python, Torch version you used #15

@deshwalmahesh

Description


I am getting the following error:

ValueError: Transformers now supports natively BetterTransformer optimizations (torch.nn.functional.scaled_dot_product_attention) for the model type llama. Please upgrade to transformers>=4.36 and torch>=2.1.1 to use it. Details: https://huggingface.co/docs/transformers/perf_infer_gpu_one#flashattention-and-memory-efficient-attention-through-pytorchs-scaleddotproductattention
Traceback (most recent call last):

I installed from requirements.txt.

I am using Python 3.10.13, torch==2.2.0 and transformers==4.37.2 on 4xA10 GPUs on SageMaker.
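One thing worth noting: the reported environment already satisfies the minimums named in the ValueError (transformers>=4.36 and torch>=2.1.1), so the "please upgrade" wording is somewhat misleading. A minimal sketch (plain Python, no third-party packages; the version strings are taken from this report and the error message) comparing the two:

```python
def parse_version(v: str) -> tuple[int, ...]:
    """Turn a dotted version string like '4.37.2' into a comparable tuple."""
    return tuple(int(part) for part in v.split(".") if part.isdigit())

def meets_minimum(installed: str, minimum: str) -> bool:
    """True if the installed version is at or above the required minimum."""
    return parse_version(installed) >= parse_version(minimum)

# Versions from this report vs. the minimums in the ValueError text:
checks = {
    "transformers": ("4.37.2", "4.36"),
    "torch": ("2.2.0", "2.1.1"),
}
for name, (installed, minimum) in checks.items():
    print(f"{name}: {installed} >= {minimum} -> {meets_minimum(installed, minimum)}")
```

Since both checks pass, the error appears to fire not because the environment is too old, but because transformers at these versions already applies scaled_dot_product_attention natively for llama, so the explicit BetterTransformer conversion is refused as redundant.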
