rezayazdanfar OP t1_jcu24bm wrote on March 19, 2023 at 3:52 PM (2 points)
Reply to comment by DeepLearningStudent in "How To Scale Transformers’ Memory up to 262K Tokens With a Minor Change?" by rezayazdanfar
:) Happy to hear it; I hope you found it practical in your work. I also aim to use it in a future project. :)
rezayazdanfar OP t1_jcu1yfy wrote on March 19, 2023 at 3:51 PM (1 point)
Reply to comment by WallyMetropolis in "How To Scale Transformers’ Memory up to 262K Tokens With a Minor Change?" by rezayazdanfar
:) Yes, I see, thanks. If you like, we can talk more and you can give me more feedback and/or comments so I can improve. :)
rezayazdanfar OP t1_jc5ncpy wrote on March 14, 2023 at 5:32 AM (7 points)
Reply to comment by WallyMetropolis in "How To Scale Transformers’ Memory up to 262K Tokens With a Minor Change?" by rezayazdanfar
True, but I didn't call my own work fabulous; I meant the original work. :)
How To Scale Transformers’ Memory up to 262K Tokens With a Minor Change?
Submitted by rezayazdanfar (t3_11qfl2o) on March 13, 2023 at 5:19 PM in deeplearning · 7 comments · 15 points