Deep Learning (26)

Graph Attention Networks, ICLR'18
Paper link: https://arxiv.org/abs/1710.10903
1. Introduction: On the other hand, we have non-spectral approaches (Duvenaud et al., 2015; Atwood & Towsley, 2016; Hamilton et al., 2017), which define convolutions directly on the graph, operating on groups of spatially close neighbors. One of the benefits of attention mechanisms is that they allow for dealing with variable-sized inputs, focusing on the mos…
2023. 3. 26.

GaAN: Gated Attention Networks for Learning on Large and Spatiotemporal Graphs, UAI'18
Paper link: http://www.auai.org/uai2018/proceedings/papers/139.pdf
1. Introduction: Treating each attention head equally loses the opportunity to benefit from some attention heads which are inherently more important than others. To this end, we propose the Gated Attention Networks (GaAN) for learning on graphs. GaAN uses a small convolutional subnetwork to compute a soft gate at each attention head to c… (a minimal sketch of this per-head gating idea follows the list below)
2023. 3. 26.

Learning Actor Relation Graphs for Group Activity Recognition, CVPR'19
Paper link: https://openaccess.thecvf.com/content_CVPR_2019/papers/Wu_Learning_Actor_Relation_Graphs_for_Group_Activity_Recognition_CVPR_2019_paper.pdf
1. Introduction: To understand the scene of multiple persons, the model needs to not only describe the individual action of each actor in the context, but also infer their collective activity. However, modeling the relation between actors is challenging…
2023. 3. 26.

Mesh Graphormer, ICCV'21
Paper link: https://openaccess.thecvf.com/content/ICCV2021/papers/Lin_Mesh_Graphormer_ICCV_2021_paper.pdf
1. Introduction: Transformers are good at modeling long-range dependencies on the input tokens, but they are less efficient at capturing fine-grained local information. Convolution layers, on the other hand, are useful for extracting local features, but many layers are required to capture global con…
2023. 3. 26.
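The GaAN preview above describes computing a soft gate for each attention head so that more informative heads contribute more. Below is a minimal sketch of that per-head gating idea, not the paper's exact model: the gate subnetwork here is a plain linear layer over each node's feature and a mean neighborhood summary (a stand-in for the paper's small convolutional subnetwork), and uniform row-normalized aggregation stands in for learned attention. The class and parameter names are hypothetical.

```python
# A minimal sketch of GaAN-style per-head gating (assumptions noted above).
import torch
import torch.nn as nn

class GatedMultiHeadAggregation(nn.Module):
    def __init__(self, in_dim: int, head_dim: int, num_heads: int):
        super().__init__()
        self.num_heads = num_heads
        self.head_dim = head_dim
        # One projection per head, fused into a single linear layer.
        self.proj = nn.Linear(in_dim, num_heads * head_dim)
        # Gate subnetwork (hypothetical stand-in for the paper's small
        # convolutional subnetwork): maps a node's feature concatenated
        # with a neighborhood summary to one scalar gate per head.
        self.gate = nn.Linear(2 * in_dim, num_heads)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (N, in_dim) node features; adj: (N, N) dense adjacency with
        # self-loops, row-normalized (uniform stand-in for attention).
        n = x.size(0)
        # Per-head projected messages: (N, heads, head_dim).
        msgs = self.proj(x).view(n, self.num_heads, self.head_dim)
        # Aggregate over neighbors j for each head and channel.
        agg = torch.einsum('ij,jhd->ihd', adj, msgs)
        # Mean neighborhood summary feeds the gate subnetwork.
        neigh_mean = adj @ x
        gates = torch.sigmoid(self.gate(torch.cat([x, neigh_mean], dim=-1)))
        # Scale each head's output by its scalar gate, then concatenate.
        out = agg * gates.unsqueeze(-1)
        return out.reshape(n, self.num_heads * self.head_dim)

# Toy usage: 5 nodes on a ring graph with self-loops.
if __name__ == "__main__":
    n = 5
    adj = torch.eye(n)
    for i in range(n):
        adj[i, (i + 1) % n] = adj[i, (i - 1) % n] = 1.0
    adj = adj / adj.sum(dim=1, keepdim=True)  # row-normalize
    layer = GatedMultiHeadAggregation(in_dim=8, head_dim=4, num_heads=2)
    print(layer(torch.randn(n, 8), adj).shape)  # torch.Size([5, 8])
```

The design point the excerpt makes is visible in the last few lines: because each head's output is multiplied by its own gate in [0, 1] before concatenation, the network can softly down-weight heads that carry little signal instead of treating all heads equally.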