iBOT: Image BERT Pre-Training with Online Tokenizer (ICLR 2022)
https://arxiv.org/abs/2111.07832

Abstract: The success of language Transformers is primarily attributed to the pretext task of masked language modeling (MLM), where texts are first tokenized into semantically meaningful pieces. In this work, we study masked image modeling (MIM) and indicate…
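
The abstract contrasts MLM, where a fixed tokenizer splits text into meaningful pieces, with MIM, where iBOT instead learns the tokenizer online during pre-training. Below is a minimal PyTorch sketch of that idea under a DINO-style self-distillation assumption: an EMA teacher acts as the online tokenizer, producing soft "visual token" targets from the full image, while the student predicts them from masked patches. Everything here (ToyEncoder, the temperatures, the mask ratio, the EMA momentum) is illustrative, not the authors' actual implementation.

```python
# Minimal sketch of masked image modeling (MIM) with an online tokenizer,
# in the spirit of iBOT. All names and hyperparameters are illustrative.
import copy

import torch
import torch.nn as nn
import torch.nn.functional as F


class ToyEncoder(nn.Module):
    """Stand-in for a ViT backbone: maps patch embeddings to token logits."""
    def __init__(self, dim=64, vocab=128):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(dim, dim), nn.GELU())
        self.head = nn.Linear(dim, vocab)  # projects to a "visual token" distribution

    def forward(self, x):
        return self.head(self.body(x))


student = ToyEncoder()
teacher = copy.deepcopy(student)  # online tokenizer: an EMA copy of the student
for p in teacher.parameters():
    p.requires_grad = False

patches = torch.randn(2, 16, 64)    # (batch, num_patches, dim), e.g. from a patch embed
mask = torch.rand(2, 16) < 0.4      # randomly mask ~40% of the patches
masked = patches.clone()
masked[mask] = 0.0                  # stand-in for a learned [MASK] embedding

student_logits = student(masked)    # student sees the corrupted image
with torch.no_grad():
    # Teacher tokenizes the *uncorrupted* image into soft targets.
    teacher_probs = F.softmax(teacher(patches) / 0.04, dim=-1)

# Cross-entropy between teacher targets and student predictions,
# computed only on the masked positions.
log_probs = F.log_softmax(student_logits / 0.1, dim=-1)
loss = -(teacher_probs * log_probs).sum(-1)[mask].mean()
loss.backward()

# EMA update keeps the tokenizer "online": it evolves jointly with the student
# rather than being fixed in advance as in MLM.
with torch.no_grad():
    for ps, pt in zip(student.parameters(), teacher.parameters()):
        pt.mul_(0.996).add_(ps, alpha=0.004)
```

The key design point the abstract hints at: unlike a text tokenizer (or a pre-trained discrete VAE), the target-producing network here is never fixed; it is distilled from the student itself, so the "vocabulary" of visual tokens becomes more semantic as training progresses.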