A Preliminary Summary of GNN Algorithms

The cross-modal papers I've been reading recently involve a lot of text- and image-processing algorithms, so I'll update this bit by bit.

Overall I'm still learning from YouTube videos; last time I also learned the BP algorithm from the same Indian instructor.

Many figures haven't been inserted yet and are still stored in Notability. Image compatibility between markdown and Notability is poor, so for now I may keep updating here first.

---

Geometric Deep Learning

Core idea: feed the information of a graph into a neural network

  1. A graph can be directed or undirected
  2. A graph's edges and nodes can carry labels
    A. Don't confuse labels with features

How to represent a graph for a convolutional neural network (common matrix representations):

  1. Incidence matrix: an n×m matrix, where n is the number of nodes and m the number of edges
    A. Undirected graph: an entry is 1 if the node is an endpoint of that edge, else 0
    B. Directed graph: one common convention is -1 at the edge's tail and +1 at its head
  2. Adjacency matrix: a square n×n matrix; the diagonal is 0 by default (or 1 if self-loops are counted)
    A. Unweighted: entries are 0 or 1
    B. Weighted: entries can be floats
  3. Degree matrix: a diagonal matrix recording how many other nodes each node connects to (0 for an isolated node)
  4. Laplacian matrix (graph Laplacian): L = D - A
    A. Measures how smooth the graph is; more precisely, how smoothly a signal x varies over the graph, since x^T L x equals the sum of (x_i - x_j)^2 over all edges (i, j)
    B. Example
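The four matrices above can be built in a few lines of NumPy. The 4-node graph and the signal `x` below are made up purely for illustration; the final check verifies the smoothness identity x^T L x = Σ_edges (x_i - x_j)^2:

```python
import numpy as np

# A hypothetical 4-node undirected graph with edges (0,1), (1,2), (2,3), (0,2)
n = 4
edges = [(0, 1), (1, 2), (2, 3), (0, 2)]

# Incidence matrix (n x m): 1 where the node is an endpoint of the edge
M = np.zeros((n, len(edges)))
for j, (u, v) in enumerate(edges):
    M[u, j] = 1
    M[v, j] = 1

# Adjacency matrix (n x n), unweighted, diagonal 0
A = np.zeros((n, n))
for u, v in edges:
    A[u, v] = A[v, u] = 1

# Degree matrix: diagonal, counts each node's neighbors
D = np.diag(A.sum(axis=1))

# Graph Laplacian
L = D - A

# Smoothness of a node signal x: x^T L x = sum over edges of (x_u - x_v)^2
x = np.array([1.0, 2.0, 2.0, 3.0])
assert np.isclose(x @ L @ x, sum((x[u] - x[v]) ** 2 for u, v in edges))
```

Note that every row of L sums to zero (each node's degree cancels against its -1 entries), which is why constant signals have zero smoothness penalty.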

Why CNNs tend to fail on graphs (and how exactly they fail)

  1. CNN properties: locality, aggregation, composition (a function of a function)
  2. Detailed analysis: graphs lack the fixed grid structure and consistent node ordering that convolution kernels rely on, so the same graph can present itself as many different inputs
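A toy illustration (not from the video) of the ordering problem: relabeling the nodes of the same graph yields a different adjacency matrix, so any network that reads the matrix as a fixed grid, the way a CNN does, sees a different input for an identical graph:

```python
import numpy as np

# Adjacency matrix of a 3-node path graph 0-1-2
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]])

# Relabel the nodes with a permutation matrix P (still the same graph)
P = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]])
A_perm = P @ A @ P.T

# Same graph, different matrix: reading A directly is not
# invariant to node ordering
assert not np.array_equal(A, A_perm)
```

Both matrices describe the same path graph (same number of edges), yet they differ entry-by-entry, which is exactly the "not invariant to node ordering" drawback discussed below.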

How to train a neural network on graphs:

  1. Approach 1: append the node features as extra columns of the adjacency matrix
    A. Drawbacks
     a. Not invariant to node ordering
     b. Not applicable to graphs of different sizes
    
  2. Approach 2: GNN
    A. Properties:
     a. Locality (neighborhood)
     b. Aggregation
     c. Stacking layers (composition)
    
    B. Assumptions:
     a. A is adjacency matrix of size NxN
     b. X is the node feature matrix of size NxM
    
    C. Node embedding: map nodes to d-dimensional embeddings such that similar nodes in the graph are embedded close together (denoted X_A -> Z_A in the video)
     a. Locality information using a computational graph (the right figure below)
         1. Intuitively, this builds a tree rooted at the target node (the tree depth is set by the number of GNN layers: K layers cover the K-hop neighborhood)
         2. The tree is used to map from graph space to embedding space
     b. Aggregate
         1. Starting from the leaves of the tree, sum embeddings bottom-up until reaching the target node at the root
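The locality + aggregation + stacking idea can be sketched as a minimal message-passing layer. This is a simplified, hypothetical implementation (sum aggregation with a self-loop, random weights), not the exact model from the video:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy inputs: adjacency A (N x N) and node feature matrix X (N x M)
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = rng.normal(size=(4, 3))  # N=4 nodes, M=3 features

def gnn_layer(A, H, W):
    """One message-passing layer: each node sums its neighbors'
    embeddings (plus its own), then applies a linear map + ReLU."""
    agg = (A + np.eye(len(A))) @ H   # sum over the neighborhood incl. self
    return np.maximum(agg @ W, 0)    # ReLU nonlinearity

# Stacking K layers unrolls into a K-level computational tree per node
d = 2  # embedding dimension
W1 = rng.normal(size=(3, 5))
W2 = rng.normal(size=(5, d))
Z = gnn_layer(A, gnn_layer(A, X, W1), W2)  # Z: N x d node embeddings
assert Z.shape == (4, 2)
```

With two layers, each row of `Z` depends on that node's 2-hop neighborhood, which matches the two-level tree in the figure: leaves feed sums upward until the root (target node) is reached.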