Dear colleagues,

I would like to draw your attention to a talk in the lecture series of the large-scale project SAIL (www.sail.nrw):

When: July 6, 16:00-17:30 CEST
Who: Dr. Hinrich Schütze, LMU (homepage: https://schuetze.cis.lmu.de/)
Where: Zoom: https://uni-bielefeld.zoom.us/j/64775735478?pwd=TFpEUVFPME5EQXFKMHZHY1ZsM2Y4Zz09
Title: "Glot500: Scaling Multilingual Corpora and Language Models to 500 Languages"

Abstract: Large language models (LLMs) are currently the most active area of research in NLP. Most work has focused on what we call "vertical" scaling: making LLMs even better for a relatively small number of high-resource languages. We address "horizontal" scaling instead: extending LLMs to a large subset of the world's languages, focusing on low-resource languages. Our Glot500-m model is trained on 500 languages, many of which are not covered by any other language model. I will talk about the major challenges we faced in creating Glot500: (i) finding, validating, and cleaning training data for that many languages; (ii) evaluating the performance of Glot500-m on languages for which native speakers and labeled datasets were not available to us; and (iii) determining the factors that ultimately make training on a language successful. We find that trying to reduce such factors to the so-called curse of multilinguality is naive and that there is in fact also a "boon of multilinguality". We are in the process of making Glot500-c, our training corpus covering 500 languages, publicly available.

Best wishes,

Barbara Hammer
-- 
Prof. Dr. Barbara Hammer
Machine Learning Group, CITEC
Bielefeld University
D-33594 Bielefeld
Phone: +49 521 / 106 12115