
Episode 7

Building Your Own LLM

Greg Diamos, co-founder of Lamini, shares how his discovery of scaling laws led to the rapid evolution of large language models, and inspired Lamini’s product offering. He also discusses his message for policymakers, including what we should be worried about, and what pitfalls we should work to avoid.

Show Notes

In this episode of The Brave Technologist Podcast, we discuss:

  • Why you should build your own LLM
  • How large language models have impacted ChatGPT
  • Why this is the best time for software engineers to enter the space

Guest List

The amazing cast and crew:

  • Greg Diamos - Co-Founder of Lamini

    Greg Diamos is a co-founder of Lamini, the enterprise LLM platform for building and owning LLMs. He is also a co-founder of MLPerf™, the industry-standard benchmark for deep learning performance. Greg was a founding engineer at Baidu’s Silicon Valley AI Lab (SVAIL), where he co-invented the first deep learning speech and language model (which was deployed in production to billions of users). At Baidu, he discovered scaling laws, motivating LLMs. His team members from SVAIL built the most widely used LLMs today, including OpenAI’s ChatGPT, Llama 2 at Meta, Claude 2 at Anthropic, PaLM at Google, and NVIDIA’s Megatron. Before Baidu, Greg was a CUDA Architect at NVIDIA. Greg holds a PhD from Georgia Tech, with a focus on high-performance computing.


About the Show

Shedding light on the opportunities and challenges of emerging tech, to make it digestible, less scary, and more approachable for all!
Join us as we embark on a mission to demystify artificial intelligence, challenge the status quo, and empower everyday people to embrace the digital revolution. Whether you’re a tech enthusiast, a curious mind, or an industry professional, this podcast invites you to join the conversation and explore the future of AI together.