Abstract: Data-free knowledge distillation is a challenging model compression task for scenarios in which the original dataset is not available. Previous methods require substantial extra computational ...
Multi-modal knowledge graph (MMKG) completion aims to infer missing links in knowledge graphs by leveraging both structural information and multi-modal data, such as images and text. While existing ...