Big thanks to Jongwoo Kim and Woosung Kang for their hard work, and to my advisor, Noseong Park, for his guidance!
On a personal note, all glory to God for the wisdom and faith given throughout this research.
How it works: We propose a model that pre-trains on synthetic graphs generated from random graph models in network science with controlled structural properties. One checkpoint, in-context learning at inference. No re-training or fine-tuning needed for new graphs.
What it fixes: GNNs require separate training for each graph due to varying homophily levels, community structures, and feature distributions. Collecting diverse real-world graph data for pre-training is difficult, and using AI-generated synthetic data can be unstable.
Our paper "Learning Posterior Predictive Distributions for Node Classification from Synthetic Graph Priors" has been accepted to #ICLR2026. See you in Rio! 🇧🇷
🔗 OpenReview: openreview.net/forum?id=Fmx...
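Not from the paper itself, but a minimal sketch of the kind of synthetic-graph generator the announcement describes: a stochastic block model where the within-block vs. across-block edge probabilities control homophily, one of the structural properties the post says is varied during pre-training. The function name and parameters are my own illustration.

```python
import random

def sample_sbm(sizes, p_in, p_out, seed=0):
    """Sample a stochastic block model graph: nodes in the same block
    connect with probability p_in, nodes in different blocks with p_out.
    Raising p_in relative to p_out yields a more homophilous graph."""
    rng = random.Random(seed)
    # Block label for every node, e.g. sizes=[20, 20] -> 40 nodes.
    labels = [b for b, n in enumerate(sizes) for _ in range(n)]
    n_total = sum(sizes)
    edges = []
    for i in range(n_total):
        for j in range(i + 1, n_total):
            p = p_in if labels[i] == labels[j] else p_out
            if rng.random() < p:
                edges.append((i, j))
    return labels, edges

# Homophilous setting: dense within blocks, sparse across them.
labels, edges = sample_sbm([20, 20], p_in=0.5, p_out=0.05)
frac_intra = sum(labels[i] == labels[j] for i, j in edges) / len(edges)
print(round(frac_intra, 2))  # most edges stay within a block
```

Sweeping `p_in` and `p_out` (plus block sizes and feature distributions) would give a family of training graphs with controlled homophily and community structure, in the spirit of the pre-training setup described above.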
I've been recognized as an Outstanding Reviewer (top 10% of reviewers) for both the August and February cycles of #KDD2025 in the research track!
kdd2025.kdd.org/research-tra...
Got the #IJCAI2025 decisions today. Our paper "Learning Advanced Self-Attention for Linear Transformers in the Singular Value Domain" made it in!
The other one got rejected though. Win some, lose some I guess.
The deadlines for #CIKM2025 are coming up in three weeks! There is still time to prepare your papers for submission. Check out all the CFPs and full timeline at cikm2025.org!
Had a presentation on our "Graph Convolutions Enrich the Self-Attention in Transformers!" paper at #NeurIPS2024! Really enjoyed receiving lots of interesting questions about the potential applications of our work.
I totally agree with Sepp Hochreiter's message that "AI does not End with Scaling", and I hope my research will follow a similar direction. Had great conversations with him after his inspiring invited talk. Grateful for these meaningful interactions at #NeurIPS2024.
I'll be presenting our work "Graph Convolutions Enrich the Self-Attention in Transformers!" at #NeurIPS2024 🇨🇦!
📍 Location: East Exhibit Hall A-C #2100
🗓️ Date: Wed 11 Dec
⏰ Time: 🕟 4:30 p.m. - 🕢 7:30 p.m. PST
More details can be found on this #NeurIPS page: lnkd.in/g47GSuZp
I'm making a list of AI for Science researchers on Bluesky, so let me know if I missed you or if you'd like to join!
go.bsky.app/AcP9Lix
I made a #starterpack for computational math 💻🧮 so please
1. share
2. let me know if you want to be on the list!
(I have many new followers I don't know well yet, so I'm sorry if you follow me and aren't on here but want to be. Drop me a note and I'll add you!)
go.bsky.app/DXdZkzV
Yeah, here's my geometry processing starter pack. I hand-picked researchers and practitioners in and around geometry processing, and some people are definitely missing, so let me know if you'd like to be added:
go.bsky.app/Dqw66qn
Here is an initial starter pack list on Machine Learning on Graphs: go.bsky.app/HN2MTzp