- Add a GELU RTL implementation in the prototype lib (under the arch path).
- A Pull Request (PR) containing a test written in C for the GELU operation, and a README introducing your design.
- Report the performance results in this issue.
Task Description
GELU (Gaussian Error Linear Unit) is a smooth, non-monotonic activation function that has become increasingly popular in modern deep learning architectures, particularly in transformer models like BERT. Unlike ReLU, which applies a hard threshold, GELU provides a probabilistic approach to neuron activation.
Hardware implementations of the GELU activation function fall primarily into two categories: lookup tables and approximation methods. Lookup-table approaches store pre-computed GELU values in memory and interpolate between table entries to calculate intermediate values, offering predictable latency and good accuracy but requiring significant memory resources. Approximation methods use mathematical functions to approximate the GELU behaviour, relying on arithmetic operations for computation while requiring minimal memory storage.
For this implementation, we will adopt an approximation method for the GELU function.