Commit 3c639577 authored by Phil Wang


fix the memory issue with the 2d relative attention bias (seq and quantizer positions) in the fine transformer
parent ff608683
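The commit message refers to a 2D relative attention bias indexed by both sequence position and quantizer position. The actual diff is not shown here, but one common way such a bias causes memory blow-ups is materializing pairwise index grids over all n·q flattened tokens at once; because the bias decomposes per axis, it can instead be built from a small (n, n) sequence-distance term and a (q, q) quantizer-distance term and broadcast together. The sketch below illustrates that factorized construction under stated assumptions: the function name, the random stand-in tables, and the (seq-major) token flattening are all hypothetical, not the repository's actual implementation.

```python
import numpy as np

def fine_transformer_bias(n, q, heads, seed=0):
    """Hypothetical sketch of a 2D relative attention bias over
    (sequence position, quantizer index).

    Tokens are assumed flattened seq-major: token t -> (t // q, t % q).
    The full (heads, n*q, n*q) bias is assembled from per-axis terms,
    avoiding pairwise index grids over all n*q tokens.
    """
    rng = np.random.default_rng(seed)
    # Learned embedding tables in a real model; random stand-ins here.
    seq_table = rng.standard_normal((heads, 2 * n - 1))  # relative seq distance
    qnt_table = rng.standard_normal((heads, 2 * q - 1))  # relative quantizer distance

    # Per-axis relative index matrices, shifted to be non-negative.
    rel_seq = np.arange(n)[:, None] - np.arange(n)[None, :] + (n - 1)  # (n, n)
    rel_qnt = np.arange(q)[:, None] - np.arange(q)[None, :] + (q - 1)  # (q, q)

    seq_bias = seq_table[:, rel_seq]  # (heads, n, n)
    qnt_bias = qnt_table[:, rel_qnt]  # (heads, q, q)

    # Broadcast-add the two axes, then flatten token pairs.
    bias = seq_bias[:, :, None, :, None] + qnt_bias[:, None, :, None, :]
    return bias.reshape(heads, n * q, n * q)
```

Because the bias is a sum of a sequence-distance term and a quantizer-distance term, it is translation-invariant along the sequence axis: pairs of tokens with the same relative offsets share a bias value.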