[D] What are good ways of incorporating non-sequential context into a transformer model? Submitted by abc220022 on January 2, 2023 at 12:23 AM in MachineLearning (11 comments)
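One common answer to the question in the title (not taken from this thread, just a standard technique) is to attend over the context items with cross-attention and simply omit positional encodings on the context side: without positional information, attention is permutation-invariant, so the context is treated as an unordered set. A minimal NumPy sketch, with illustrative shapes and a single attention head:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(queries, context, d_k):
    # queries: (T, d) states of the sequential stream
    # context: (M, d) unordered context embeddings (no positional encoding)
    scores = queries @ context.T / np.sqrt(d_k)   # (T, M) similarity scores
    weights = softmax(scores, axis=-1)            # attention over context items
    return weights @ context                      # (T, d) context summary per position

rng = np.random.default_rng(0)
q = rng.normal(size=(4, 8))      # 4 sequence positions, model dim 8
ctx = rng.normal(size=(5, 8))    # 5 context items

out = cross_attention(q, ctx, d_k=8)

# Because the context carries no positional information, permuting
# the context rows leaves the output unchanged (set semantics).
perm = rng.permutation(5)
out_perm = cross_attention(q, ctx[perm], d_k=8)
assert np.allclose(out, out_perm)
```

In a full model the queries, keys, and values would each pass through learned projections (as in `nn.MultiheadAttention`), but the permutation-invariance argument is the same: order only enters through positional encodings, so leaving them off the context makes it non-sequential by construction.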
kdqg wrote on January 2, 2023 at 9:53 PM, replying to a comment by ai-lover: Did chatGPT write this