Pierre-Carl Langlais (also known as Alexander Doria) is the co-founder of Pleias, a French AI startup pioneering ethically trained language models. With a background in digital humanities and a passion for open science, he leads the development of AI systems trained exclusively on open data through the "Common Corpus" initiative—a collection of public domain and openly licensed content.
His work focuses on creating small but powerful "reasoning models" optimized for specific tasks like retrieval-augmented generation (RAG), with native citation capabilities and multilingual support.
The recently released Pleias-RAG model family represents his vision for responsible AI that respects copyright while maintaining high performance. Before founding Pleias, he served as Head of Research at opsci and published extensively on digital humanities topics, including legal aspects of text mining and Wikipedia.
Through his blog "Vintage Data," he regularly shares insights on LLM research, training methodologies, and the shifting paradigms in AI development, advocating for an approach where AI functions as a commons rather than proprietary technology.