Paddle Graph Learning (PGL) is an efficient and flexible graph learning framework based on PaddlePaddle.

The newly released PGL supports heterogeneous graph learning on both the walk-based paradigm and the message-passing-based paradigm by providing MetaPath sampling and a Message Passing mechanism on heterogeneous graphs. Furthermore, the newly released PGL also supports distributed graph storage and some distributed training algorithms, such as distributed DeepWalk and distributed GraphSAGE. Combined with the PaddlePaddle deep learning framework, we are able to support both graph representation learning models and graph neural networks, and thus our framework has a wide range of graph-based applications.

One of the most important benefits of graph neural networks compared to other models is the ability to use node-to-node connectivity information, but coding the communication between nodes is very cumbersome. At PGL we adopt a Message Passing Paradigm similar to DGL's to help build custom graph neural networks easily. Users only need to write send and recv functions to implement a simple GCN. As shown in the following figure, in the first step the send function is defined on the edges of the graph, and the user can customize it to send a message from the source node to the target node. In the second step, the recv function is responsible for aggregating the messages arriving from different sources.

To write a sum aggregator, users only need to write the following code:

```python
import pgl
import paddle
import numpy as np

num_nodes = 5
edges = [(0, 1), (1, 2), (3, 4)]                       # a toy edge list
feature = np.random.randn(num_nodes, 100).astype("float32")  # random node features

g = pgl.Graph(num_nodes=num_nodes,
              edges=edges,
              node_feat={"h": feature})
g.tensor()  # convert the graph data into paddle tensors

def send_func(src_feat, dst_feat, edge_feat):
    # forward each source node's features along its edges
    return src_feat

def recv_func(msg):
    # sum the incoming messages for each target node
    return msg.reduce_sum(msg["h"])

msg = g.send(send_func, src_feat=g.node_feat)
ret = g.recv(recv_func, msg)
```

Highlight: Flexibility - Natively Support Heterogeneous Graph Learning

Graphs can conveniently represent the relations between things in the real world, but both the categories of things and the relations between them vary widely. Therefore, in a heterogeneous graph we need to distinguish the node types and edge types in the graph network. PGL models heterogeneous graphs that contain multiple node types and multiple edge types, and can describe the complex connections between different types.

Support meta path walk sampling on heterogeneous graph

![]()

The left side of the figure above shows a shopping social network. Its nodes fall into two categories, users and goods, with relations between users and users, users and goods, and goods and goods. The right side of the figure shows a simple MetaPath sampling process: given a MetaPath such as UPU (user-product-user), the sampler produces walks whose node types follow that pattern. On this basis, introducing word2vec and related methods allows PGL to support metapath2vec and other heterogeneous graph representation learning algorithms; a toy sketch of such a walk appears below.

Support Message Passing mechanism on heterogeneous graph

Because a heterogeneous graph has different node types, message delivery also differs by type. As shown on the left of the figure, the target node has five neighbors belonging to two different node types. As shown on the right, messages from neighbors of different types need to be aggregated separately during message delivery and then merged into the final message that updates the target node; a minimal sketch of this two-step aggregation also appears below. On this basis, PGL supports message-passing-based heterogeneous graph algorithms such as GATNE.

Large-Scale: Support distributed graph storage and distributed training algorithms

![]()
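To make the MetaPath walk sampling described above concrete, here is a minimal sketch in plain Python. This is not the PGL API; the toy graph, the node types, and the `metapath_walk` helper are assumptions made purely for illustration.

```python
import random

# Hypothetical toy heterogeneous graph: node -> (node type, neighbors).
# "U" = user, "P" = product; all values are made up for illustration.
graph = {
    "u1": ("U", ["u2", "p1", "p2"]),
    "u2": ("U", ["u1", "p2"]),
    "p1": ("P", ["u1", "p2"]),
    "p2": ("P", ["u1", "u2", "p1"]),
}

def metapath_walk(start, metapath, walk_len):
    # For a metapath like "UPU" the required node types repeat as U, P, U, P, ...
    cycle = metapath[:-1] if metapath[0] == metapath[-1] else metapath
    walk, cur = [start], start
    for step in range(1, walk_len):
        want = cycle[step % len(cycle)]
        # Only neighbors whose type matches the next metapath position qualify.
        candidates = [n for n in graph[cur][1] if graph[n][0] == want]
        if not candidates:
            break
        cur = random.choice(candidates)
        walk.append(cur)
    return walk

print(metapath_walk("u1", "UPU", walk_len=5))
# e.g. ['u1', 'p1', 'u1', 'p2', 'u2'] -- node types alternate U, P, U, P, U
```

Walks generated this way can then be fed to a skip-gram (word2vec-style) model, which is essentially how metapath2vec learns heterogeneous node representations.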
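Similarly, the per-type aggregation used in heterogeneous message passing can be sketched with plain NumPy. Again, this is not PGL code; the feature sizes, the type assignments, and the simple mean used to merge the per-type messages are assumptions for illustration (models such as GATNE learn this combination instead).

```python
import numpy as np

# Hypothetical target node with five neighbors of two node types, as in the
# figure: three neighbors of type 0 and two of type 1. All values are made up.
neigh_feat = np.random.randn(5, 8).astype("float32")  # one feature row per neighbor
neigh_type = np.array([0, 0, 0, 1, 1])                # node type of each neighbor

# Step 1: aggregate messages separately within each node type (sum aggregator).
per_type = [neigh_feat[neigh_type == t].sum(axis=0) for t in (0, 1)]

# Step 2: merge the per-type messages into the final update for the target
# node; a plain mean is used here purely as a placeholder.
target_update = np.stack(per_type).mean(axis=0)
print(target_update.shape)  # (8,)
```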