A First Summary of GNN Algorithms
Recent cross-modal papers involve a lot of text- and image-processing algorithms, so I'm updating these notes bit by bit.
Overall I'm still learning from YouTube videos; last time I also learned the BP algorithm from this same Indian instructor.
Many figures haven't been inserted yet and are still sitting in Notability. Image compatibility between markdown and Notability is poor, so for now I may keep updating here first.
---
Geometric Deep Learning
How it works: feed the graph's information into a neural network
- A graph can be directed or undirected
- Both the edges and the nodes of a graph can carry labels
  A. Don't confuse labels with features
How to feed a graph into a convolutional neural network:
- Incidence matrix: an n×m matrix, where n is the number of nodes and m is the number of edges
  A. undirected graph:
  B. directed graph
- Adjacency matrix: a square n×n matrix; diagonal entries default to 0 or 1
  A. unweighted: entries are 0 or 1
  B. weighted: entries can be floats
- Degree matrix: a diagonal matrix recording how many other nodes each node connects to; 0 for an isolated node
- Laplacian matrix (graph Laplacian): L = D - A
  A. Measures how smooth the graph is
  B. Example
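The matrix definitions above can be checked on a toy graph. A minimal NumPy sketch (the 4-node cycle and the two signals are arbitrary choices for illustration) builds A, D, and L = D - A, and uses the quadratic form x^T L x, which equals the sum of (x_i - x_j)^2 over all edges, as the smoothness measure:

```python
import numpy as np

# Toy undirected graph: a 4-node cycle 0-1, 1-2, 2-3, 3-0.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
n = 4

# Adjacency matrix A: square n x n, unweighted (entries 0 or 1).
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1

# Degree matrix D: diagonal, counts each node's neighbors.
D = np.diag(A.sum(axis=1))

# Graph Laplacian L = D - A.
L = D - A

# Smoothness of a signal x on the graph: x^T L x is the sum of
# (x_i - x_j)^2 over edges, so it is small when neighbors agree.
x_smooth = np.array([1.0, 1.0, 1.0, 1.0])
x_rough = np.array([1.0, -1.0, 1.0, -1.0])
print(x_smooth @ L @ x_smooth)  # 0.0  (constant signal is maximally smooth)
print(x_rough @ L @ x_rough)    # 16.0 (neighbors disagree on every edge)
```

Note that every row of L sums to 0, which is why a constant signal always scores exactly 0.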
Why CNNs tend to fail on graphs (and how exactly they fail)
- Key CNN properties: locality, aggregation, composition (a function of a function)
- Detailed analysis
How to train on a graph with a neural network:
- Approach 1: append the node features as extra columns of the matrix
  A. Drawbacks: a. not invariant to node ordering; b. not applicable to graphs of different sizes
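The node-ordering drawback can be seen directly. The sketch below is a hypothetical rendering of Approach 1 (the tiny graph, features, and permutation are all made up for illustration): it flattens [A | X] into one input vector, then relabels the nodes, and the same graph produces a different input:

```python
import numpy as np

# Hypothetical sketch of "Approach 1": flatten [A | X] into one input
# vector for a plain feed-forward network.
A = np.array([[0, 1, 0],
              [1, 0, 0],
              [0, 0, 0]])            # 3 nodes, single edge 0-1
X = np.array([[1.0], [2.0], [3.0]])  # one feature per node

def flatten_input(A, X):
    # Concatenate features as extra columns, then flatten row by row.
    return np.hstack([A, X]).ravel()

# Relabel the nodes with a permutation P (swap nodes 0 and 2).
P = np.array([[0, 0, 1],
              [0, 1, 0],
              [1, 0, 0]])
A_perm = P @ A @ P.T
X_perm = P @ X

# Same graph, different node ordering -> different flattened vector,
# so the network would treat them as different inputs.
print(np.array_equal(flatten_input(A, X), flatten_input(A_perm, X_perm)))  # False
```

The input length also grows with n, which is the second drawback: a network trained on 3-node inputs cannot accept a 4-node graph at all.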
- Approach 2: GNN
A. Characteristics:
B. Underlying assumptions: a. locality (neighborhoods); b. aggregation; c. stacking layers (composition)
C. Node embedding: map nodes to d-dimensional embeddings such that similar nodes in the graph are embedded close together (written X_A -> Z_A in the video)
  a. A is the adjacency matrix of size NxN
  b. X is the node feature matrix of size NxM
  a. Capture locality information using a computational graph (the right figure below)
    1. Roughly speaking, this generates a tree structure per node (how is the number of layers chosen?)
    2. The tree takes you from graph space to embedding space
  b. Aggregate
    1. Starting from the bottom of the tree, sum upward until reaching the target node at the root
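The steps above can be sketched as a two-layer sum-aggregation pass. Everything here is an illustrative assumption rather than a specific library's API: the toy graph, the random weights, and the layer function are made up, but the structure matches the notes: each layer sums neighbor vectors (the aggregation step), combines them with the node's own vector, and applies a nonlinearity, and stacking two layers unrolls a depth-2 computational tree per node:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: N = 4 nodes, M = 3 input features per node.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)  # adjacency matrix, N x N
X = rng.normal(size=(4, 3))                # node feature matrix, N x M

def gnn_layer(H, A, W_self, W_neigh):
    # Aggregation: A @ H sums each node's neighbor embeddings; the node's
    # own embedding goes through W_self; ReLU adds the nonlinearity.
    return np.maximum(0, H @ W_self + A @ H @ W_neigh)

d = 2  # embedding dimension (illustrative choice)
W1_self, W1_neigh = rng.normal(size=(3, 4)), rng.normal(size=(3, 4))
W2_self, W2_neigh = rng.normal(size=(4, d)), rng.normal(size=(4, d))

# Stacking layers = composition: layer 1 sees 1-hop neighbors,
# layer 2 sees 2-hop neighbors (the depth of the computational tree).
H1 = gnn_layer(X, A, W1_self, W1_neigh)
Z = gnn_layer(H1, A, W2_self, W2_neigh)
print(Z.shape)  # (4, 2): one d-dimensional embedding Z_A per node
```

In a real GNN the W matrices are learned by backpropagation and the sum is often normalized by node degree; this sketch only shows the data flow.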