
82: ChatGPT_Machine Learning and Venture Capital

In this episode of the [i3] Podcast, we look at machine learning and artificial intelligence. Our guests are Kaggle Founder Anthony Goldbloom and Stanford CS Professor Chris Manning, who are both involved in venture capital firm AIX Ventures as investment directors.

We speak about the state of play in machine learning and artificial intelligence, the most interesting applications, including ChatGPT, and opportunities for investors in this space, covering smart sensors, travel agents and personal assistants.

Overview of the podcast:

Anthony Goldbloom:

04:00 The idea for Kaggle came from a conference competition
06:00 Looking for the most accurate algorithms
07:30 Before Kaggle, every academic discipline had its own set of machine learning techniques
08:00 One technique won problem after problem
09:00 Rise of neural networks: 2012 is often called the annus mirabilis for machine learning
10:30 I'm mind-blown by what you can do with ChatGPT
11:30 Using summarization through ChatGPT
12:30 We are in a world right now where the capabilities of these models run far ahead of the applications. People haven't really built companies around these models yet
14:50 The rise of chat-powered travel agents? Adding databases to ChatGPT
17:00 Why was Google interested in Kaggle?
19:00 Tweaking the value estimation algorithm for US real estate website Zillow
21:00 Surpassing physicians in diagnosing lung cancer
22:30 Two Sigma and Optiver also used Kaggle to solve problems
23:00 Hedge funds that crowdsourced investment problems
24:00 Being a one-person band is hard in investing: you need to not only find the alpha signal, but also implement the trade in a way that doesn't move the market
27:00 Does machine learning work in time series? Yes, but it requires more babysitting if your algorithm works in an adversarial setting
32:00 What I bring to AIX Ventures is an understanding of where the gaps are in the tools for machine learning
33:00 Examples of companies we invested in
35:00 Embedding ultra-light machine learning into appliances

Chris Manning:

37:24 I was interested in how people learn languages, while I was always playing around with computers. Then I became interested in Ross Quinlan's ID3 algorithm for natural language processing
40:00 I started to work with large digital language databases slightly before the world wide web really kicked off
43:00 The combination of neural networks and predictive text led to the revolutionary breakthroughs we see now with ChatGPT
45:30 You can use ChatGPT for text analysis, such as sentiment analysis or summarization of specific information
46:00 These models are just wonderful, but of course there are still problems. On occasion these models tend to hallucinate. They are just as confident producing made-up stuff. And at times they lack consistency in thinking. They will say things that contradict what they said previously
47:00 Human learning is still far more efficient at getting signal from data than machine learning
50:00 The majority of business is conducted through human language, whether it is sales or support. These models can help people work faster and better
52:00 The case of Google and zero-shot translation
57:00 Facebook experimented with two systems talking to each other, but found the systems would not stick to English and instead developed a more efficient symbol system
1:00:00 Interesting businesses we've invested in: weather prediction
1:02 A lot of computing that was previously done in the cloud is now done on the device, which is much quicker
1:04 Can NLP read the sentiment of a market by consuming just a lot of text? Well, a lot of mob mentality is expressed in language rather than numbers
1:07 The start of AIX Ventures and the two Australians
1:11 What might be the next big thing in NLP?
1:13 Future applications of language models might potentially look at video and personal assistants

Source: [i3] Institutional Investment Podcast