audiolm-pytorch-flask
Commits · 1b2a79d1cc964cd7e6cd0fd525a63e529ce140ca
Mar 27, 2023
Merge pull request #144 from tchamb/import-functools · 1b2a79d1 · Phil Wang · authored Mar 26, 2023
Merge pull request #142 from hmartiro/accelerate_arg · 6734f7f8 · Phil Wang · authored Mar 26, 2023
Mar 26, 2023
Option to pass in Accelerator to SoundStreamTrainer · b04d207d · Hayk Martiros · authored Mar 25, 2023 (see usage sketch below)
Correct coarse trainer ckpt filename · 43def8e0 · Taylor · authored Mar 25, 2023
Import einops · 4bcce75a · Taylor · authored Mar 25, 2023
Update encodec.py · f32cdf2d · Taylor · authored Mar 25, 2023
patch · 6d943ebe · Phil Wang · authored Mar 25, 2023 · tag 0.25.3
Merge pull request #140 from tchamb/main · a70b96e3 · Phil Wang · authored Mar 25, 2023
Small error -- functools not imported for reduce · e24cd70d · Taylor · authored Mar 25, 2023
Update hubert_kmeans.py · 8344d2c2 · Taylor · authored Mar 25, 2023
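A minimal sketch of what the b04d207d change above (PR #142, accelerate_arg) enables: constructing your own Hugging Face Accelerator and handing it to SoundStreamTrainer instead of letting the trainer build one internally. The accelerator= keyword name is an assumption from the commit title, and the SoundStream hyperparameters and paths are illustrative only.

    # sketch only: the accelerator= keyword is assumed from the commit title
    from accelerate import Accelerator
    from audiolm_pytorch import SoundStream, SoundStreamTrainer

    accelerator = Accelerator()  # configure mixed precision, logging, etc. here

    soundstream = SoundStream(
        codebook_size = 1024,
        rq_num_quantizers = 8,
    )

    trainer = SoundStreamTrainer(
        soundstream,
        folder = '/path/to/audio',     # placeholder dataset folder
        batch_size = 4,
        grad_accum_every = 8,
        data_max_length = 320 * 32,
        num_train_steps = 10_000,
        accelerator = accelerator,     # the new, assumed keyword
    )

    trainer.train()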
Mar 25, 2023
0.25.2 · 0240f6a6 · Phil Wang · authored Mar 24, 2023 · tag 0.25.2
Merge pull request #137 from LWprogramming/soundstream_error_msgs · a246b185 · Phil Wang · authored Mar 24, 2023
fix error messages · 29a3e1b7 · Leon Wu · authored Mar 24, 2023
readme · d499b819 · Phil Wang · authored Mar 24, 2023
allow for specifying which layer of hubert to use · f6f02a66 · lucidrains · authored Mar 24, 2023 · tag 0.25.1 (see usage sketch below)
update appreciation · 5ff887c5 · lucidrains · authored Mar 24, 2023
release encodec support, thanks to @LWprogramming ! · cd899677 · lucidrains · authored Mar 24, 2023 · tag 0.25.0
Merge pull request #135 from LWprogramming/encodec_support · edf792b8 · Phil Wang · authored Mar 24, 2023
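The f6f02a66 commit above (tag 0.25.1) lets you choose which HuBERT layer feeds the k-means quantizer for semantic tokens. A rough sketch, assuming the option surfaces as an output_layer keyword on HubertWithKmeans; the keyword name is a guess and the checkpoint paths are placeholders.

    # sketch only: output_layer is an assumed name for the new layer-selection option
    from audiolm_pytorch import HubertWithKmeans

    wav2vec = HubertWithKmeans(
        checkpoint_path = './hubert/hubert_base_ls960.pt',
        kmeans_path = './hubert/hubert_base_ls960_L9_km500.bin',
        output_layer = 9,  # which HuBERT layer to take features from (assumed keyword)
    )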
Mar 24, 2023
fix readme and demo jupyter notebook · 310deea3 · Leon Wu · authored Mar 23, 2023
Fix type hint mistake · 2e301432 · Leon Wu · authored Mar 23, 2023
setup.py · 6e0dab4e · Leon Wu · authored Mar 23, 2023
update trainer · df05e1e9 · Leon Wu · authored Mar 23, 2023
add encodec support · 2e64311c · Leon Wu · authored Mar 23, 2023 (see usage sketch below)
fix for ema · 26762007 · Phil Wang · authored Mar 23, 2023 · tag 0.24.1
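The encodec work above (add encodec support · 2e64311c, update trainer · df05e1e9, released as 0.25.0) makes a pretrained Encodec model usable in place of a self-trained SoundStream. A hedged sketch of the intended usage: EncodecWrapper is the class added in encodec.py, but its constructor arguments and the keyword the transformer trainers expect (codec= below) are assumptions for this revision.

    # sketch only: keyword names below may differ in this revision
    from audiolm_pytorch import EncodecWrapper, CoarseTransformer, CoarseTransformerTrainer

    codec = EncodecWrapper()  # wraps a pretrained 24 kHz Encodec model

    coarse_transformer = CoarseTransformer(
        num_semantic_tokens = 500,
        codebook_size = 1024,
        num_coarse_quantizers = 3,
        dim = 512,
        depth = 6,
    )

    trainer = CoarseTransformerTrainer(
        transformer = coarse_transformer,
        codec = codec,             # assumed keyword; earlier versions took soundstream=
        wav2vec = wav2vec,         # a HubertWithKmeans instance, as in the sketch above
        folder = '/path/to/audio', # placeholder dataset folder
        batch_size = 1,
        num_train_steps = 10_000,
    )

    trainer.train()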
Mar 21, 2023
in coarse transformer, make sure that coarse tokens attending to semantic... · 0491eaaf · Phil Wang · authored Mar 20, 2023 · tag 0.24.0
Mar 19, 2023
make readme more realistic · dd4784ab · Phil Wang · authored Mar 18, 2023
Mar 17, 2023
force input to complex conv to have the same dtype as the weights, in the... · 48380d5d · Phil Wang · authored Mar 16, 2023 · tag 0.23.7
Mar 16, 2023
whether to use the absolute value of the complex logits for output of complex... · ce717d99 · Phil Wang · authored Mar 15, 2023 · tag 0.23.6
Mar 10, 2023
fix the memory issue with the 2d relative attention bias (seq and quantizer... · 3c639577 · Phil Wang · authored Mar 09, 2023 · tag 0.23.5
quick fix for not unwrapping model before getting config · ff608683 · Phil Wang · authored Mar 09, 2023 · tag 0.23.3
Mar 08, 2023
use 2d dynamic positional bias for fine transformer, to try to improve... · b29141f8 · Phil Wang · authored Mar 07, 2023 · tag 0.23.2
Mar 06, 2023
gratitude · 9ba1b040 · Phil Wang · authored Mar 05, 2023
0.22.3 · 85070856 · Phil Wang · authored Mar 05, 2023 · tag 0.22.3
Merge pull request #122 from LWprogramming/eos_id · efbced79 · Phil Wang · authored Mar 05, 2023
Use correct eos id when masking out · 3da99089 · Leon Wu · authored Mar 05, 2023
fix issue with turning off discriminators in soundstream · 035c0ce9 · Phil Wang · authored Mar 05, 2023 · tag 0.22.2
Mar 04, 2023
appreciation · 22b5f72e · Phil Wang · authored Mar 03, 2023
credit assignment · f545bd1b · Phil Wang · authored Mar 03, 2023
add an autoregressive squeeze excitation module. go for the knockout · e4b66cf0 · Phil Wang · authored Mar 03, 2023 · tag 0.22.1
create specially engineered relative positional bias for fine transformer, so... · af903342 · Phil Wang · authored Mar 03, 2023 · tag 0.22.0