
Joining SlatorPod this week is Longyue Wang, a Research Scientist at Tencent AI Lab, where he is involved in the research and practical applications of machine translation (MT) and natural language processing (NLP).
Longyue expands on Tencent's approach to language technology, where MT is integrated into Tencent's interactive translation product, TranSmart. He highlights how Chinese-to-English MT has made significant advancements, thanks to improvements in technology and data size. However, translating from Chinese into non-English languages has been more challenging.
Recent research by Longyue explores the impact of large language models (LLMs) on MT, demonstrating their superiority in tasks like document-level translation. He emphasizes that GPT-4 outperformed traditional MT engines in translating literary texts such as web novels.
Longyue discusses various promising research directions for MT using LLMs, including stylized MT, interactive MT, translation memory-based MT, and a new evaluation paradigm. His research suggests LLMs can enhance personalized MT, adapting translations to users' preferences.
Longyue also sheds light on how Chinese researchers are focusing on building Chinese-centric MT engines, directly translating from Chinese to other languages. There's an effort to reduce reliance on English as a pivot language.
Looking ahead, Longyue's research will address challenges related to LLMs, including hallucinations and outdated information.