Excited to share that our paper “On the Theoretical Advantages of Bilinear Similarities in Dense Retrieval” has been accepted to the International Conference on Similarity Search and Applications (SISAP)!

📄 Paper: https://link.springer.com/chapter/10.1007/978-3-032-06069-3_11
💻 Code: https://github.com/shubham526/bilinear-projection-theory

Why Bilinear Similarities Matter

Most dense retrieval models rely on simple dot-product similarity: multiply query and document embeddings component-wise and sum them…
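To make the contrast concrete, here is a toy sketch of dot-product similarity versus a bilinear form q^T W d. The embedding values and the weight matrix below are hypothetical, chosen only for illustration; they are not the paper's actual setup.

```python
# Toy query/document embeddings (hypothetical values, dimension 4 for illustration).
q = [0.5, -1.0, 2.0, 0.0]
d = [1.0, 0.5, -0.5, 3.0]

def dot_similarity(q, d):
    """Dot product: multiply component-wise and sum."""
    return sum(qi * di for qi, di in zip(q, d))

def bilinear_similarity(q, W, d):
    """Bilinear form q^T W d: a matrix W re-weights and mixes
    embedding dimensions before the comparison."""
    Wd = [sum(wij * dj for wij, dj in zip(row, d)) for row in W]  # W @ d
    return dot_similarity(q, Wd)

# With W = identity, the bilinear score reduces exactly to the dot product.
I = [[1.0 if i == j else 0.0 for j in range(4)] for i in range(4)]
assert bilinear_similarity(q, I, d) == dot_similarity(q, d)

# A non-identity W lets the scoring function learn cross-dimension interactions.
W = [[1.0, 0.2, 0.0, 0.0],
     [0.0, 1.0, 0.0, 0.0],
     [0.0, 0.0, 0.5, 0.0],
     [0.0, 0.0, 0.0, 1.0]]
print(dot_similarity(q, d), bilinear_similarity(q, W, d))
```

The key point of the construction: the dot product is the special case W = I, so a bilinear similarity is strictly more expressive.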
Paper Accepted to SIGIR-AP 2025! 🎉
I’m excited to announce that our paper “REGENT: Relevance-Guided Attention for Entity-Aware Multi-Vector Neural Re-Ranking” has been accepted to SIGIR-AP 2025!

📄 Paper: https://arxiv.org/abs/2510.11592
💻 Code: https://github.com/shubham526/SIGIR-AP-2025REGENT

What’s REGENT?

When you search through long, complex documents, you naturally focus on key entities and concepts (names, places, organizations) that help you understand what’s relevant. But most neural search…
Paper Accepted to SIGIR 2025! 🎯
Thrilled to announce that our paper “QDER: Query-Specific Document and Entity Representations for Multi-Vector Document Re-Ranking” has been accepted to SIGIR 2025, the premier conference in information retrieval!

📄 Paper: https://dl.acm.org/doi/pdf/10.1145/3726302.3730065
💻 Code: https://github.com/shubham526/SIGIR2025-QDER

Bridging Two Worlds in Neural IR

Neural information retrieval has evolved along two parallel paths. What if we could combine the best…
Paper Accepted to EMNLP 2024! 🌐
Excited to share that our paper “DyVo: Dynamic Vocabularies for Learned Sparse Retrieval with Entities” has been accepted to EMNLP 2024!

📄 Paper: https://aclanthology.org/2024.emnlp-main.45/
💻 Code: https://github.com/thongnt99/DyVo

The Problem with Static Vocabularies

Learned Sparse Retrieval (LSR) models have shown impressive results, but they inherit a fundamental limitation from their pre-trained transformers: fixed vocabularies that split…
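As an illustration of how a fixed subword vocabulary fragments entity terms, and how dynamically extending the vocabulary with entity tokens avoids it, here is a toy greedy longest-match-first tokenizer. This is a simplified WordPiece-style sketch for intuition only, not DyVo's actual implementation or vocabulary.

```python
def wordpiece_tokenize(word, vocab):
    """Toy WordPiece-style tokenizer: greedily match the longest
    vocabulary entry, using the '##' prefix for word-internal pieces."""
    tokens, start = [], 0
    while start < len(word):
        end, piece = len(word), None
        while end > start:
            sub = word[start:end]
            cand = sub if start == 0 else "##" + sub
            if cand in vocab:
                piece = cand
                break
            end -= 1
        if piece is None:       # no piece matches: unknown token
            return ["[UNK]"]
        tokens.append(piece)
        start = end
    return tokens

# Toy fixed vocabulary: no dedicated token for the entity term "covid",
# so the entity fragments into subword pieces.
vocab = {"co", "##vid", "sparse", "retrieval"}
print(wordpiece_tokenize("covid", vocab))       # fragments into subwords

# Dynamically adding an entity token keeps the entity whole.
vocab_dyn = vocab | {"covid"}
print(wordpiece_tokenize("covid", vocab_dyn))   # single entity token
```

The fragmented pieces carry no entity-level meaning on their own, which is exactly the limitation a dynamic, entity-aware vocabulary is meant to address.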