home.html: 1 addition & 0 deletions
@@ -35,6 +35,7 @@ <h2>Overview</h2>
<div class="news-container">
<h2>Latest News:</h2>
<p><b>[August 21, 2025]</b> Congratulations to Heming Xia and Kaishuai Xu for their papers accepted to EMNLP 2025.</p>
+ <p><b>[August 17, 2025]</b> Congratulations to Xin Zhang! His work "Phased Training for LLM-powered Text Retrieval Models Beyond Data Scaling" was accepted at COLM 2025.</p>
<p><b>[July 2, 2025]</b> We had five papers accepted to the main conference and five more accepted to Findings at ACL 2025. Congratulations to Chak Tou Leong, Wenjun Hou, Kaishuai Xu, Xin Zhang, Yang Xiao, Jian Wang, Hanlin Wang, Qiancheng Xu, and Xinghao Chen!</p>
<p><b>[April 5, 2025]</b> Congratulations to Yongqi Li for having two papers accepted by SIGIR'25!</p>
<p><b>[February 27, 2025]</b> Congratulations to Xin Zhang and Jun Gao! Their papers have been accepted at CVPR 2025, showcasing their outstanding contributions to the field. Well done!</p>
index.html: 1 addition & 0 deletions
@@ -35,6 +35,7 @@ <h2>Overview</h2>
<div class="news-container">
<h2>Latest News:</h2>
<p><b>[August 21, 2025]</b> Congratulations to Heming Xia and Kaishuai Xu for their papers accepted to EMNLP 2025.</p>
+ <p><b>[August 17, 2025]</b> Congratulations to Xin Zhang! His work "Phased Training for LLM-powered Text Retrieval Models Beyond Data Scaling" was accepted at COLM 2025.</p>
<p><b>[July 2, 2025]</b> We had five papers accepted to the main conference and five more accepted to Findings at ACL 2025. Congratulations to Chak Tou Leong, Wenjun Hou, Kaishuai Xu, Xin Zhang, Yang Xiao, Jian Wang, Hanlin Wang, Qiancheng Xu, and Xinghao Chen!</p>
<p><b>[April 5, 2025]</b> Congratulations to Yongqi Li for having two papers accepted by SIGIR'25!</p>
<p><b>[February 27, 2025]</b> Congratulations to Xin Zhang and Jun Gao! Their papers have been accepted at CVPR 2025, showcasing their outstanding contributions to the field. Well done!</p>
publications.html: 15 additions & 0 deletions
@@ -46,6 +46,14 @@ <h2>2025</h2>
<p>Findings of the 2025 Conference on Empirical Methods in Natural Language Processing (EMNLP’2025).</p>
</div>
+ <div class="publication-item">
+ <div class="publication-bullet">
+ <span class="tag_conference">Conference</span><span><b><a class="text-link" href="https://openreview.net/pdf?id=NC6G1KCxlt" target="_blank">Phased Training for LLM-powered Text Retrieval Models Beyond Data Scaling</a></b></span>
<p>Findings of the 62nd Annual Meeting of the Association for Computational Linguistics (ACL 2024).</p>
</div>
+ <div class="publication-item">
+ <div class="publication-bullet">
+ <span class="tag_conference">Conference</span><span><b><a class="text-link" href="https://aclanthology.org/2024.findings-acl.456/" target="_blank">Unlocking Efficiency in Large Language Model Inference: A Comprehensive Survey of Speculative Decoding</a></b></span>
<p>Findings of the 62nd Annual Meeting of the Association for Computational Linguistics (ACL 2024).</p>
+ </div>
<div class="publication-item">
<div class="publication-bullet">
<span class="tag_conference">Conference</span><span><b><a class="text-link" href="https://arxiv.org/abs/2402.06967" target="_blank">Instruct once, Chat Consistently in Multiple Rounds: An Efficient Tuning Framework for Dialogue</a></b></span>