
fix: torch_float should return float, not int #44697

Open
LincolnBurrows2017 wants to merge 2 commits into huggingface:main from LincolnBurrows2017:fix/torch_float

Conversation

@LincolnBurrows2017

Description

The torch_float function in src/transformers/utils/generic.py incorrectly returned int(x) in two places where it should return float(x):

  1. When torch is not available (fallback case)
  2. When not in a tracing context with a torch tensor

This is inconsistent with the function's name and purpose (converting a value to float32 or a Python float). The analogous function torch_int correctly returns int(x).

Fix

Changed int(x) to float(x) in both locations.
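
For reference, a minimal sketch of what the corrected helper looks like. The control flow follows the two cases listed in the description; the exact source in src/transformers/utils/generic.py may differ, and is_torch_available is recreated here as a stand-in so the snippet is self-contained:

```python
def is_torch_available():
    # Stand-in for the transformers utility of the same name:
    # report whether torch can be imported.
    try:
        import torch  # noqa: F401
        return True
    except ImportError:
        return False


def torch_float(x):
    """Cast x to a float32 tensor while tracing, otherwise to a Python float."""
    if not is_torch_available():
        return float(x)  # fallback case; previously returned int(x)
    import torch
    if torch.jit.is_tracing() and isinstance(x, torch.Tensor):
        return x.to(torch.float32)
    return float(x)  # non-tracing case; previously returned int(x)
```

With the fix, torch_float(2) yields a Python float outside a tracing context instead of truncating to an int, matching the behavior its name promises.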


Bug originally identified during an AI contribution task.

@github-actions
Contributor

[For maintainers] Suggested jobs to run (before merge)

run-slow: doge

