Laurent Daudet
November 3, 4pm
Room Turing Conseil, 45 rue des Saints Pères 75006 Paris & Online (Zoom)
Abstract
OpenAI’s GPT-3 language model has triggered a new generation of Machine Learning models. Leveraging Transformer architectures with billions of parameters, trained on massive unlabeled datasets, these language models achieve new capabilities such as text generation, question answering, or even zero-shot learning – performing tasks the model has not been explicitly trained for. However, training these models represents a massive computing task, now done on dedicated supercomputers. Scaling up these models further will require new hardware and optimized training algorithms.
At LightOn – a spinoff of university research – we develop a set of technologies to address these challenges. The Optical Processing Unit (OPU) technology performs certain matrix-vector multiplications in a massively parallel fashion, at record-low power consumption. Now accessible on-premises or through the cloud, the OPU technology has been used by engineers and researchers worldwide in a variety of applications, for Machine Learning and scientific computing. We also efficiently train large language models, such as PAGnol (demo at https://pagnol.lighton.ai ), the largest language model in French, which can be used for a variety of tasks.
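The optical matrix-vector multiplication mentioned above is commonly modeled as a random projection followed by an intensity measurement. The NumPy sketch below illustrates that model only – it is not LightOn's implementation, and the dimensions and scaling are illustrative assumptions:

```python
import numpy as np

# Illustrative model (an assumption, not LightOn's actual hardware pipeline):
# the optical transform is often described as y = |R x|^2, where R is a fixed
# complex Gaussian random matrix realized physically, and |.|^2 is the
# intensity recorded by the camera.

rng = np.random.default_rng(0)

n_features, n_projections = 784, 10_000  # illustrative sizes

# Fixed complex Gaussian "transmission matrix", normalized variance.
R = (rng.standard_normal((n_projections, n_features))
     + 1j * rng.standard_normal((n_projections, n_features))) / np.sqrt(2)

def simulated_opu(x: np.ndarray) -> np.ndarray:
    """Simulate the optical transform: intensity of a complex random projection."""
    return np.abs(R @ x) ** 2

x = rng.standard_normal(n_features)
y = simulated_opu(x)
print(y.shape)  # (10000,)
```

In this model the hardware evaluates the full projection in one optical pass, which is why the power cost stays low even at very large dimensions.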
Laurent Daudet
CTO and co-founder at LightOn, Professor (on leave) of physics (Université Paris Cité)
Laurent Daudet is currently CTO at LightOn, a startup he co-founded in 2016, where he manages cross-disciplinary R&D projects involving machine learning, optics, signal processing, electronics, and software engineering. Laurent is a recognized expert in signal processing and wave physics, and is currently on leave from his position as Professor of Physics at the Université Paris Cité. Before that, and in parallel, he has held various academic positions: fellow of the Institut Universitaire de France, associate professor at Université Pierre et Marie Curie, Visiting Senior Lecturer at Queen Mary University of London, UK, and Visiting Professor at the National Institute for Informatics in Tokyo, Japan. Laurent has authored or co-authored more than 200 scientific publications, has been a consultant to various small and large companies, and is a co-inventor on several patents. He is a graduate in physics from the Ecole Normale Supérieure in Paris, and holds a PhD in Applied Mathematics from Marseille University.