- I'm Zhang-Each
- A graduate student at Zhejiang University, majoring in Computer Science
- Studying Knowledge Graphs (KG) and Natural Language Processing (NLP) in the ZJU-KG lab.
- Learning from open courses released by Stanford/MIT/CMU
- Blog: link here
- Notebook: link here
- Personal Page: link here
- Knowledge Graph Completion with Pre-trained Multimodal Transformer and Twins Negative Sampling. (First Author, Accepted by KDD 2022 Undergraduate Consortium, ArXiv)
- Tele-Knowledge Pre-training for Fault Analysis. (Accepted by ICDE 2023 Industry Track, ArXiv)
- Modality-Aware Negative Sampling for Multi-modal Knowledge Graph Embedding. (Accepted by IJCNN 2023, ArXiv)
- CausE: Towards Causal Knowledge Graph Embedding. (Accepted by CCKS 2023, ArXiv)
- MACO: A Modality Adversarial and Contrastive Framework for Modality-missing Multi-modal Knowledge Graph Completion. (Accepted by NLPCC 2023, ArXiv)
- Unleashing the Power of Imbalanced Modality Information for Multi-modal Knowledge Graph Completion. (Accepted by COLING 2024, ArXiv)
- NativE: Multi-modal Knowledge Graph Completion in the Wild. (Accepted by SIGIR 2024, ArXiv)
- Knowledgeable Preference Alignment for LLMs in Domain-specific Question Answering. (Accepted by ACL 2024 Findings, ArXiv)
- Making Large Language Models Perform Better in Knowledge Graph Completion. (ArXiv)
- Knowledge Graphs Meet Multi-Modal Learning: A Comprehensive Survey. (ArXiv)
- MyGO: Discrete Modality Information as Fine-Grained Tokens for Multi-modal Knowledge Graph Completion. (ArXiv)
- Multi-domain Knowledge Graph Collaborative Pre-training and Prompt Tuning for Diverse Downstream Tasks. (ArXiv)
- Mixture of Modality Knowledge Experts for Robust Multi-modal Knowledge Graph Completion. (ArXiv)