@vox
It could learn them all. But will it? Subscribe and turn on notifications 🔔 so you don't miss any videos: goo.gl/0bsAjO

Large language models are astonishingly good at understanding and producing language. But there's an often-overlooked bias toward languages that are already well represented on the internet, which means some languages might lose out on AI's big technical advances. Some researchers are studying how that bias works, and how the balance might shift from these "high resource" languages to ones that don't yet have a large online footprint. These approaches range from original dataset creation, to studying the outputs of large language models, to training open-source alternatives. Watch the video above to learn more.

Further reading:
ruth-ann.notion.site/ruth-ann/JamPatoisNLI-A-Jamaican-Patois-Natural-Language-Inference-Dataset-91523ec89af24bfdbcb9c1ec7e28cc3c
This is the hub for Ruth-Ann Armstrong's JamPatoisNLI. You can see the dataset and re...
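For context on what a natural language inference (NLI) dataset like the one linked above contains: each entry pairs a premise with a hypothesis and a label saying whether the premise entails, contradicts, or is neutral toward the hypothesis. A minimal sketch in Python follows; the example sentences and the `NLIExample` class are invented for illustration and are not taken from JamPatoisNLI.

```python
# Sketch of the premise/hypothesis/label structure common to NLI datasets.
# The sentences and the NLIExample class below are hypothetical examples,
# not drawn from the JamPatoisNLI dataset itself.
from dataclasses import dataclass

@dataclass
class NLIExample:
    premise: str      # a sentence taken to be true
    hypothesis: str   # a sentence judged against the premise
    label: str        # "entailment", "contradiction", or "neutral"

examples = [
    NLIExample("The dog is sleeping on the porch.",
               "An animal is resting.", "entailment"),
    NLIExample("The dog is sleeping on the porch.",
               "The dog is chasing a cat.", "contradiction"),
    NLIExample("The dog is sleeping on the porch.",
               "The porch was painted last week.", "neutral"),
]

# A quick audit of label distribution, as dataset builders often do.
counts = {}
for ex in examples:
    counts[ex.label] = counts.get(ex.label, 0) + 1
print(counts)
```

Building such pairs in an under-represented language is the labor-intensive part: every premise and hypothesis has to be written or translated by speakers of that language.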
