
Joining SlatorPod this week is Longyue Wang, a Research Scientist at Tencent AI Lab, where he is involved in the research and practical applications of machine translation (MT) and natural language processing (NLP).
Longyue expands on Tencent's approach to language technology, where MT is integrated into Tencent Translate (TranSmart). He highlights how Chinese-to-English MT has made significant advancements, thanks to improvements in technology and data size. However, translating Chinese into non-English languages has been more challenging.
Recent research by Longyue explores large language models' (LLMs) impact on MT, demonstrating their superiority in tasks like document-level translation. He emphasizes that GPT-4 outperformed traditional MT engines in translating literary texts such as web novels.
Longyue discusses various promising research directions for MT using LLMs, including stylized MT, interactive MT, translation memory-based MT, and a new evaluation paradigm. His research suggests LLMs can enhance personalized MT, adapting translations to users' preferences.
Longyue also sheds light on how Chinese researchers are focusing on building Chinese-centric MT engines, directly translating from Chinese to other languages. There's an effort to reduce reliance on English as a pivot language.
Looking ahead, Longyue's research will address challenges related to LLMs, including hallucination and the timeliness of information.