Paper: https://arxiv.org/abs/2307.03942, MICCAI 2023
Code: https://github.com/Junelin2333/LanGuideMedSeg-MICCAI2023 (PyTorch code of the MICCAI 2023 paper)
Theoretical Background
Transformer (introduced in Google's 2017 paper "Attention Is All You Need"). Following the previous post on attention, the Transformer is an architecture built on top of self-attention (a minimal sketch of the attention operation is given after the list below). It mainly addresses two problems of the classic seq2seq setup:
It takes into account both the source sequence and the target sequence…
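To make the self-attention idea concrete, here is a minimal PyTorch sketch of scaled dot-product attention, the operation at the core of every Transformer layer. This is an illustrative toy example, not code from the paper or the LanGuideMedSeg repository; the function name and tensor shapes are assumptions made for the sketch.

```python
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, mask=None):
    """Compute softmax(Q K^T / sqrt(d_k)) V for batched sequences."""
    d_k = q.size(-1)
    # Similarity between every query and every key, scaled by sqrt(d_k)
    scores = torch.matmul(q, k.transpose(-2, -1)) / (d_k ** 0.5)
    if mask is not None:
        # Block attention to masked positions (mask == 0 means "do not attend")
        scores = scores.masked_fill(mask == 0, float("-inf"))
    attn = F.softmax(scores, dim=-1)      # attention weights over the sequence
    return torch.matmul(attn, v), attn    # weighted sum of value vectors

# Self-attention: Q, K and V all come from the same sequence
x = torch.randn(2, 5, 64)                 # (batch, seq_len, d_model), toy shapes
out, weights = scaled_dot_product_attention(x, x, x)
print(out.shape, weights.shape)           # torch.Size([2, 5, 64]) torch.Size([2, 5, 5])
```

Because every position attends to every other position in one matrix multiplication, each output element can draw on the whole sequence, which is the property the Transformer builds on.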