Building entropically, Gemini, prompt engineering's revenge, and Superpowered


Nabeel and Fraser briefly discuss Google's new AI model, Gemini, and keeping authenticity in startup pitches. They discuss the perils of trying to simplify when technology is trending towards complexity, how to stay authentic to yourself when pitching and fundraising, and the new findings on Claude prompting that point to a continued need for prompt engineering. Finally, they talk about the startup Superpowered, its potential pivot, and how passion for problem-solving can impact a company's direction.
* Google Gemini's excellent product video "demo"
* History of Razorfish
* Claude 2.1 Prompting Technique
* AI Meeting notes from Superpowered.me
- (00:00) - Building when entropy is increasing
- (01:00) - Introduction and Welcome
- (01:15) - Discussing the Frequency of AI Developments
- (01:45) - Google's AI Developments and the Gemini Team
- (02:19) - Explaining Gemini and its Significance
- (05:04) - Analyzing Google Gemini from afar
- (18:02) - You are 5 words away from being done
- (27:00) - The analogy of the early web vs early LLMs
- (30:05) - Do we need the Razorfish of AI
- (33:27) - The Future of AI Tools and Platforms
- (34:11) - The Importance of Implementation Engineers in AI
- (35:54) - Are we in an entropic or de-entropic phase?
- (37:57) - The Importance of Authenticity in Marketing
- (38:53) - Founders just being real
- (47:28) - How would you fundraise differently now?
- (50:15) - Superpowered.me and AI Note Taking Startups
00:00:00,000 --> 00:00:05,029
Nabeel Hyatt: technology tends to go
through ages of entropy and de-entropy.
2
00:00:05,569 --> 00:00:11,229
We all love, especially as engineers,
we love de-entropy, we love simplifying
3
00:00:11,229 --> 00:00:15,778
everything, cleaning it up, getting
the signal from noise, bringing it
4
00:00:15,778 --> 00:00:17,778
all down into something that works.
5
00:00:17,998 --> 00:00:22,328
Things that are trying to make a promise
of de-entropy too quickly, when all
6
00:00:22,328 --> 00:00:26,414
of these LLMs are so new just feel
incongruous to me when the goal is
7
00:00:26,414 --> 00:00:30,751
to solve the problem reliably, and
we're still not at a reliable solution.
8
00:00:31,151 --> 00:00:33,891
Fraser Kelton: Boy, we wrestled with this
one, but that one feels really right.
9
00:00:34,426 --> 00:00:38,426
It's going to get more complicated
in every direction because we are
10
00:00:38,426 --> 00:00:44,186
not at the reliability required for
consistent value in many use cases.
11
00:00:44,676 --> 00:00:49,626
And like, why bother adding abstractions
of simplicity if you say it's
12
00:00:49,626 --> 00:00:50,646
still not going to be good enough?
13
00:00:51,416 --> 00:00:52,186
Nabeel Hyatt: Yeah, exactly.
14
00:00:52,196 --> 00:00:55,106
Fraser Kelton: It's a lot easier for you to get
something that's broken into production.
15
00:00:56,136 --> 00:00:57,746
Nabeel Hyatt: That's
what, that's the headline.
16
00:00:58,036 --> 00:01:00,616
Why are we making it easier to
get broken things into production?
17
00:01:00,906 --> 00:01:02,186
Fraser Kelton: Oh, but what a teaser.
18
00:01:02,531 --> 00:01:03,501
Nabeel Hyatt: I know, I know.
19
00:01:04,221 --> 00:01:05,041
Hello everybody.
20
00:01:05,271 --> 00:01:06,651
Welcome to Hallway Chat.
21
00:01:06,711 --> 00:01:07,621
I'm Nabeel.
22
00:01:08,559 --> 00:01:09,139
Fraser Kelton: Fraser.
23
00:01:09,644 --> 00:01:12,644
Nabeel Hyatt: And we are here to
talk about what we've been talking
24
00:01:12,644 --> 00:01:14,794
about in the world of AI mostly.
25
00:01:15,174 --> 00:01:17,794
I, I didn't really know we were signing
up for this every week when we signed up.
26
00:01:18,284 --> 00:01:19,604
They come fast, Fraser.
27
00:01:20,604 --> 00:01:23,074
I felt like I just talked to you
on the hallway chat last week
28
00:01:23,094 --> 00:01:25,954
about the launch of ChatGPT and
all of your stories around that.
29
00:01:26,319 --> 00:01:29,479
But at the same time, there's like
a million things to also talk about.
30
00:01:29,479 --> 00:01:32,249
So it both feels like these
shows are coming all the time,
31
00:01:32,249 --> 00:01:34,229
but also too much to talk about.
32
00:01:35,369 --> 00:01:38,459
Fraser Kelton: I was thinking after
recording the last one, how nice it is to
33
00:01:38,469 --> 00:01:43,669
be able to talk in depth with you about
these topics and just laugh and explore.
34
00:01:43,699 --> 00:01:45,099
And so I'm good with it all.
35
00:01:45,379 --> 00:01:47,799
The, you know, here, here's
something last week, I think I
36
00:01:47,829 --> 00:01:49,279
said the line, where's Google.
37
00:01:49,984 --> 00:01:51,164
And we had an answer.
38
00:01:51,834 --> 00:01:52,694
Kind of, right?
39
00:01:52,694 --> 00:01:53,944
We kind of had an answer.
40
00:01:54,279 --> 00:01:54,829
Nabeel Hyatt: Yeah.
41
00:01:55,189 --> 00:01:59,879
Oh, I loved, we had our AI dinner this
week, and we had somebody from the
42
00:01:59,889 --> 00:02:04,556
Gemini team sitting at the dinner,
all night long, mouth shut, and I
43
00:02:04,556 --> 00:02:08,089
am like spouting off and spitting
all kinds of stuff about Google.
44
00:02:08,409 --> 00:02:11,139
And I don't know that I picked
up on the smug look on his face.
45
00:02:13,034 --> 00:02:13,424
Fraser Kelton: I
46
00:02:13,492 --> 00:02:16,492
Nabeel Hyatt: Yep, tomorrow morning
you're gonna see that Google's
47
00:02:16,522 --> 00:02:17,752
got a little bit of a comeback.
48
00:02:17,852 --> 00:02:19,942
Although I think it's a little
bit of a comeback, right?
49
00:02:20,012 --> 00:02:20,352
So...
50
00:02:20,352 --> 00:02:21,648
what is Gemini, Fraser?
51
00:02:22,403 --> 00:02:28,663
Fraser Kelton: Gemini gets announced
in summer by Google, where they say,
52
00:02:29,283 --> 00:02:35,763
we're training a large language
model that's going to be amazing.
53
00:02:36,343 --> 00:02:41,133
And has now become a little bit of a meme
because they have talked and talked about
54
00:02:41,133 --> 00:02:44,033
what they're about to do and the rumors.
55
00:02:44,243 --> 00:02:46,113
Nabeel Hyatt: their answer to
OpenAI and Anthropic and the others.
56
00:02:46,213 --> 00:02:46,523
Yep.
57
00:02:47,023 --> 00:02:47,893
Fraser Kelton: Yep, that's right.
58
00:02:48,143 --> 00:02:51,643
And we're at that dinner, as you
say, and it's become a little bit
59
00:02:51,643 --> 00:02:53,123
of a joke as like, where are they?
60
00:02:53,343 --> 00:02:55,383
They've been talking about this
for months and nothing's here.
61
00:02:55,403 --> 00:02:57,693
And then, boom, we wake
up the next morning.
62
00:02:58,083 --> 00:03:02,443
I get a text from my friend
that says, surprise, and
63
00:03:02,443 --> 00:03:04,953
they've shipped some of Gemini.
64
00:03:05,113 --> 00:03:09,913
They haven't shipped the most capable
model, and they shipped a lot of
65

00:03:10,343 --> 00:03:15,053
demo videos, which we'll come back to
and talk about a little bit, but they've
66
00:03:15,143 --> 00:03:19,133
announced something called Gemini Ultra,
which is, you can think of it as the
67
00:03:19,133 --> 00:03:24,093
equivalent of GPT 4, then they've shipped
Gemini, I don't know, the terminology is
68
00:03:24,093 --> 00:03:29,403
crazy, Pro, Gemini Pro, and Gemini Ultra
is not available, Gemini Pro is available
69
00:03:29,433 --> 00:03:34,853
as of the day of the launch, and that
is comparable in performance, at least
70
00:03:34,853 --> 00:03:38,153
on the evals, the evaluations, to GPT 3.5,
71

00:03:38,903 --> 00:03:43,288
and then they have what I think
is probably maybe one of the more
72
00:03:43,288 --> 00:03:46,828
interesting things, they've then
distilled it all down to something
73
00:03:46,828 --> 00:03:53,351
called Nano, which can run on, and is
running on the Pixel, which is a pretty
74
00:03:53,351 --> 00:03:55,501
awesome thing for them to have done.
75
00:03:55,951 --> 00:04:02,021
Worth calling out, Gemini Pro, the
mid-tier model that's equivalent to 3.5,
76

00:04:02,061 --> 00:04:09,151
is now live and integrated into
BARD, their ChatGPT-like product.
77
00:04:09,631 --> 00:04:13,631
And it's, you know, one year after
the fact that that's been rolled out
78
00:04:13,641 --> 00:04:17,421
broadly by OpenAI, and so quite a
79
00:04:17,536 --> 00:04:20,796
Nabeel Hyatt: I have to admit,
Fraser, if this is nothing else,
80
00:04:20,866 --> 00:04:22,986
it reminded me that BARD exists.
81
00:04:23,666 --> 00:04:26,186
It was the first time
we've even mentioned BARD.
82
00:04:26,626 --> 00:04:30,606
Like, that's how useful it is
in at least our workflows, right?
83
00:04:31,876 --> 00:04:33,176
Fraser Kelton: it has
reminded me that it's a thing.
84
00:04:33,176 --> 00:04:34,766
It still didn't encourage me to go use it.
85
00:04:34,766 --> 00:04:38,506
Like, I'm not sure what this is
an answer to in terms of ChatGPT.
86
00:04:38,536 --> 00:04:40,436
So, they now have parity with the model?
87
00:04:40,496 --> 00:04:41,016
Okay.
88
00:04:41,166 --> 00:04:41,606
Right.
89
00:04:41,856 --> 00:04:42,346
Okay.
90
00:04:42,666 --> 00:04:43,046
Sure.
91
00:04:43,371 --> 00:04:45,691
Nabeel Hyatt: We were texting and
I've got some WhatsApp groups that
92
00:04:45,691 --> 00:04:49,321
were fiddling around and talking about
this when it launched in a Discord
93
00:04:49,321 --> 00:04:51,481
group of AI engineers and so on.
94
00:04:51,801 --> 00:04:56,611
And I got to say the evals of course
went from oh my god and then the
95
00:04:56,611 --> 00:05:02,591
demo videos, oh my god, to pretty
quickly, hey, is this a bunch of BS?
96
00:05:02,931 --> 00:05:03,961
Fraser Kelton: Yeah, that's right.
97
00:05:04,526 --> 00:05:08,936
I think the first thing to call out
is I think the general level of
98
00:05:08,936 --> 00:05:14,481
performance across some eval benchmarks
give you a sense of where relative
99
00:05:14,541 --> 00:05:16,781
performance of the model might be, right?
100
00:05:16,781 --> 00:05:22,611
So GPT 4 far outperforms basically
anything up until the release of
101
00:05:22,621 --> 00:05:26,481
Ultra, and you could then have probably
high confidence that it's going to
102
00:05:26,481 --> 00:05:29,691
perform in a very different class
once you get it into production.
103
00:05:29,691 --> 00:05:32,811
I think the thing that we can say is
without having actually played with it,
104
00:05:32,821 --> 00:05:39,501
the evals suggest that Ultra, the large
one, is directionally equivalent to GPT 4.
105
00:05:39,501 --> 00:05:41,741
And that, that's something, right?
106
00:05:41,741 --> 00:05:45,911
I think that the fact that we might
now have two GPT 4 type equivalent
107
00:05:45,911 --> 00:05:50,531
models in, in market proves that
somebody else other than OpenAI can
108
00:05:50,531 --> 00:05:52,191
do something on this magnitude and,
109
00:05:52,336 --> 00:05:53,536
Nabeel Hyatt: I'm not willing to say that yet.
110
00:05:53,966 --> 00:05:56,276
That's I'm not absolutely crap.
111
00:05:56,496 --> 00:06:00,606
Like I'm willing to say that
soon and, and, but, but,
112
00:06:00,941 --> 00:06:01,731
Fraser Kelton: Yeah, that's fair.
113
00:06:01,823 --> 00:06:07,403
Nabeel Hyatt: the MMLU benchmarks are
comparing a five shot reported GPT 4
114
00:06:07,743 --> 00:06:11,643
benchmark to a 32 shot, I think it was.
115
00:06:12,103 --> 00:06:14,153
If I remember correctly, the Ultra report.
116
00:06:14,483 --> 00:06:17,973
It's just not, those are
not comparable at all.
117
00:06:18,123 --> 00:06:22,053
Frankly, the stuff that everybody would
probably normally out of the box use
118
00:06:22,053 --> 00:06:26,203
this product for, the average consumer,
those all, at least the evals we can
119
00:06:26,203 --> 00:06:28,793
see, seem comparable and seem fine.
120
00:06:28,823 --> 00:06:33,863
So let's not overstate some kind
of eval problem where there isn't
121
00:06:33,863 --> 00:06:35,503
one, at least in this specific case.
122
00:06:35,883 --> 00:06:39,383
But, the kind of like general
math cases in particular seemed
123
00:06:39,543 --> 00:06:41,423
a little cooked and unfortunate.
124
00:06:41,453 --> 00:06:45,183
And I think, frankly, when they're
speaking to a highly technical audience,
125
00:06:45,583 --> 00:06:49,193
I'm not sure why they were doing that.
126
00:06:49,213 --> 00:06:50,303
Like, I, I don't, I don't
127
00:06:50,353 --> 00:06:50,883
Fraser Kelton: Yeah, yeah, that,
128
00:06:51,423 --> 00:06:52,133
Nabeel Hyatt: it's not them.
129
00:06:52,143 --> 00:06:56,953
It just felt like hiding the ball when,
clearly, even with code, like, the code
130

00:06:56,963 --> 00:06:58,333
generation in Gemini is quite good.
131
00:06:58,383 --> 00:06:59,743
I just don't understand
why they even did it.
132
00:06:59,873 --> 00:07:02,303
Fraser Kelton: I think if you strip
it away and you look at a comparable
133
00:07:02,313 --> 00:07:07,753
measure of five prompt, five examples
in the prompt, GPT 4 outperforms
134
00:07:07,753 --> 00:07:10,213
it, but it only outperforms it
by a couple of percentage points.
135
00:07:10,433 --> 00:07:12,683
But again, like, I think that
once we get our hands on it, my
136
00:07:12,703 --> 00:07:16,983
guess is that we will see that
this is directionally similar-ish.
137
00:07:17,923 --> 00:07:19,983
Nabeel Hyatt: let's finish on this rant
and then I actually want to talk about
138
00:07:19,983 --> 00:07:21,573
the thing I really loved about Gemini.
139
00:07:21,903 --> 00:07:27,343
The part that I think was unfortunate, and
that I hope no startup takes a lesson from, is that
140
00:07:27,463 --> 00:07:30,133
everybody gets excited because there's
an announcement from Google that there's
141
00:07:30,193 --> 00:07:35,593
finally Gemini out and within a few
hours, it just dawns on everybody that,
142
00:07:35,983 --> 00:07:40,853
okay, Gemini is not really here because
it's just Pro, we don't have an exact
143
00:07:40,853 --> 00:07:46,568
release date still, this, this graph of
evals is cooked in a couple places and
144
00:07:46,568 --> 00:07:51,358
the rest of them are still comparing to
GPT 4 back in March eval tests and only
145
00:07:51,358 --> 00:07:56,788
beat them by three to five percent, and since
then GPT 4 has gotten a lot better and
146
00:07:56,938 --> 00:07:58,438
By the way, evals don't really matter.
147
00:07:58,538 --> 00:08:05,443
So, like, that's the negative side. The
positive side is that so many companies
148
00:08:05,783 --> 00:08:13,473
absolutely fail to show their
product in action in unique and novel
149
00:08:13,473 --> 00:08:14,763
ways that pull the heartstrings.
150
00:08:15,243 --> 00:08:19,933
And I think they, if you haven't, if you
guys are listening to this on the podcast
151
00:08:19,933 --> 00:08:24,163
and you haven't yet watched the Gemini
demo videos, go on YouTube. You should
152
00:08:24,193 --> 00:08:29,813
take a look. There's some wonderful craft
work that is not too pretentious and not
153
00:08:29,813 --> 00:08:34,663
too overblown, but just, in fact, it's
very clean and simple, except for the
154
00:08:34,663 --> 00:08:38,583
YouTuber Mark Rober's video, that's kind
of overblown, but the rest of it is very
155
00:08:38,593 --> 00:08:43,533
simple, good demos of showing various ways
that this product can be put into use that
156
00:08:43,533 --> 00:08:47,133
an average consumer, which might be the
aim of this announcement. It's aimed much more at
157

00:08:47,133 --> 00:08:51,703
the average consumer, you know, or Google
stockholder, than it is aimed at engineers.
158
00:08:51,703 --> 00:08:52,883
I mean, didn't you love those demo videos?
159
00:08:52,883 --> 00:08:53,443
You watch them, right?
160
00:08:55,073 --> 00:08:58,613
Fraser Kelton: Oh, yeah, but
I, I don't know if you're
161
00:08:58,613 --> 00:09:00,293
getting me riled up here, man.
162
00:09:00,493 --> 00:09:06,213
I had shared the one demo video where
they show it, the three cup technique
163
00:09:06,213 --> 00:09:09,783
where there's one ball underneath the
cup and then they shuffle the cups around
164
00:09:10,083 --> 00:09:12,363
and it tells them which cup it's under.
165
00:09:12,703 --> 00:09:16,443
Because this is a multi-modal
model that has been trained from
166

00:09:16,443 --> 00:09:18,423
the start for multi-modality.
167
00:09:18,493 --> 00:09:24,113
So, it's accounting for text and image
and, and video and, and audio right
168
00:09:24,113 --> 00:09:28,093
from the, the start of pre training
steps, rather than, you know, kind
169
00:09:28,093 --> 00:09:29,843
of bridging that in after the fact.
170
00:09:30,353 --> 00:09:33,853
And this, this demo video is one of
the best demo videos I've ever seen.
171
00:09:35,173 --> 00:09:35,433
Nabeel Hyatt: Yep.
172
00:09:35,743 --> 00:09:38,803
Fraser Kelton: And then, and then
it comes out that it's all fake!
173
00:09:39,368 --> 00:09:39,398
Nabeel Hyatt: Right.
174
00:09:39,973 --> 00:09:43,483
Fraser Kelton: So in the
demo video, go watch it.
175
00:09:43,493 --> 00:09:44,613
It's, it's remarkable.
176
00:09:44,613 --> 00:09:48,773
There's a set of hands and some
cups and a ball, and the demo says,
177
00:09:48,773 --> 00:09:51,613
okay, now I'm going to put the
cup under here and move it around.
178
00:09:51,913 --> 00:09:52,553
Where is it?
179
00:09:52,853 --> 00:09:57,953
And in, in real time, the Gemini
voice comes back and says, the cup's,
180
00:09:58,003 --> 00:10:02,573
I don't know, under the left side,
and the man lifts up the cup on the
181
00:10:02,573 --> 00:10:03,883
left and sure enough, there it is.
182
00:10:04,058 --> 00:10:04,338
Nabeel Hyatt: Right.
183
00:10:04,923 --> 00:10:07,858
Fraser Kelton: And then, people
have discovered shortly thereafter
184
00:10:07,858 --> 00:10:12,668
the fact that this is basically the
equivalent of like a simulated scene
185
00:10:12,678 --> 00:10:17,258
where they had to prompt engineer
along the way. It just turns
186

00:10:17,258 --> 00:10:19,168
out that they fixed it in post.
187
00:10:19,938 --> 00:10:21,068
Nabeel Hyatt: Yeah, that's really
188
00:10:21,198 --> 00:10:23,828
Fraser Kelton: Now, I will compare that,
I will, yeah, I will compare that to
189
00:10:23,838 --> 00:10:27,438
Greg's demo of GPT 4, which was all live,
190
00:10:28,293 --> 00:10:31,103
without any editing and in real time.
191
00:10:31,673 --> 00:10:34,973
And that is the, I think that is the
way that you introduce your products.
192
00:10:36,463 --> 00:10:37,133
It's brave.
193
00:10:37,203 --> 00:10:37,863
It's brave, right?
194
00:10:37,863 --> 00:10:38,493
You're doing it live.
195
00:10:38,493 --> 00:10:39,223
It could fail.
196
00:10:39,283 --> 00:10:42,793
And you, you're owning it because you have
so much confidence in what you've built.
197
00:10:44,543 --> 00:10:47,343
Nabeel Hyatt: uh, I don't,
I can't entirely disagree.
198
00:10:47,743 --> 00:10:50,993
You know, it is as much as I love
the product demo, what I would
199
00:10:50,993 --> 00:10:55,693
have loved was a demo like that
and then a how-to behind it.
200
00:10:56,103 --> 00:10:59,723
You know, I think it's okay to make
things that are somewhat polished and
201
00:10:59,723 --> 00:11:04,103
beautiful, but it would be great if
it turned out that they pulled back the covers.
202
00:11:04,103 --> 00:11:06,093
And by the way, that's no implication on
203
00:11:06,183 --> 00:11:08,283
Fraser Kelton: I think, I think,
I think polished should be
204
00:11:08,443 --> 00:11:08,793
Nabeel Hyatt: separate
205
00:11:09,118 --> 00:11:09,678
Fraser Kelton: Yeah, yeah.
206
00:11:10,568 --> 00:11:13,988
Polished and beautiful is good, but I
think it has to be grounded in reality.
207
00:11:13,998 --> 00:11:17,418
This is a case where they edited
across two different dimensions, and
208
00:11:17,418 --> 00:11:20,948
people came away with a dramatically
different perspective of what
209
00:11:20,948 --> 00:11:22,268
it is that's actually happening.
210
00:11:23,058 --> 00:11:27,558
And so I think it's unfair to not
allow anybody to use the product, and
211
00:11:27,558 --> 00:11:32,228
then introduce it with a demo video
that basically obfuscates the truth
212
00:11:32,228 --> 00:11:33,458
from two different perspectives.
213
00:11:33,818 --> 00:11:34,928
That's just weird.
214
00:11:35,166 --> 00:11:41,801
I think that this is an excellent
moment for everybody at Google because
215
00:11:41,831 --> 00:11:46,751
they've shipped, or at least they've
partially shipped and I think that
216
00:11:46,881 --> 00:11:49,471
they've taken the first step.
217
00:11:49,681 --> 00:11:50,921
No, they've taken the first step.
218
00:11:50,921 --> 00:11:51,591
No, no, no, no, no.
219
00:11:51,591 --> 00:11:53,851
Like, listen, this, the
race was set off a year ago.
220
00:11:53,851 --> 00:11:55,361
They, they did this in a year.
221
00:11:56,081 --> 00:12:00,971
For a company of their size, this
is, this is not to be scoffed at.
222
00:12:01,321 --> 00:12:02,801
Think about all of the complexities.
223
00:12:02,841 --> 00:12:05,731
They had to live through smashing
together Brain and DeepMind.
224
00:12:06,121 --> 00:12:10,251
They had to go and find, like, the
path through all the bureaucracy and
225
00:12:10,251 --> 00:12:14,431
politics to get an aggregate amount of
compute required to be able to do this.
226
00:12:14,731 --> 00:12:17,071
They had to solve all of the
different challenges, both
227
00:12:17,081 --> 00:12:20,111
technically and politically,
within the organization to do this.
228
00:12:20,726 --> 00:12:21,466
And it's out.
229
00:12:21,576 --> 00:12:25,866
And I think that, that itself is
something that should be respected.
230
00:12:25,866 --> 00:12:28,736
And, we can squabble over the evals
and stuff, and the proof will be
231
00:12:28,736 --> 00:12:29,776
when we actually get to use it.
232
00:12:30,336 --> 00:12:32,416
But it looks, it looks directionally good.
233
00:12:32,666 --> 00:12:33,506
And that's something.
234
00:12:33,556 --> 00:12:38,386
You know, I, I feel like I also
did a good job playing your role.
235
00:12:38,386 --> 00:12:41,526
You usually are the one who's
clairvoyant in, in many respects.
236
00:12:41,696 --> 00:12:46,961
And at that dinner, my guess was that
Google was going to come roaring back.
237
00:12:47,056 --> 00:12:47,536
Nabeel Hyatt: your quote.
238
00:12:47,551 --> 00:12:47,901
Fraser Kelton: that.
239
00:12:48,641 --> 00:12:48,971
Yep.
240
00:12:49,201 --> 00:12:55,161
Because they are the best at, at the
technical pieces that have to come
241
00:12:55,161 --> 00:12:57,711
together for training a model like this.
242
00:12:58,081 --> 00:13:01,661
And if you look at some of the stats,
I forget what the stat's called, but
243
00:13:01,661 --> 00:13:06,831
for basically the measure of efficiency
when they were training Ultra, I think
244
00:13:06,831 --> 00:13:12,631
they reached some level of like 90%, 97
percent efficiency in the utilization
245
00:13:12,631 --> 00:13:16,561
of their hardware when training Ultra,
which is just a remarkable achievement.
246
00:13:17,041 --> 00:13:20,051
And this is the area where we
should expect them to be great.
247
00:13:20,556 --> 00:13:24,136
And I think they have shown
that they can be great, albeit,
248
00:13:24,136 --> 00:13:25,256
you know, on a year's delay.
249
00:13:26,021 --> 00:13:30,131
And then I think the real challenge for
them is going to be how they bring the
250
00:13:30,131 --> 00:13:35,561
great technical piece into their two
products that are now a two-front war.
251
00:13:35,611 --> 00:13:39,961
BARD, as you laughed earlier, is a
thing, and then the second one is they're
252
00:13:39,961 --> 00:13:44,461
going to have to find the right way to
integrate these technologies into search.
253
00:13:44,971 --> 00:13:47,841
And that's going to be an
excruciatingly hard challenge because
254
00:13:47,841 --> 00:13:53,021
it's orthogonal to the business
model that is search historically.
255
00:13:53,936 --> 00:13:58,966
Nabeel Hyatt: Look, I, I'm not going
to speak for Demis or Eli
256

00:13:58,966 --> 00:14:01,596
or anybody else over on the Google team and
what they're doing, I'm sure they know
257
00:14:01,596 --> 00:14:05,626
a lot more about how to do this than
we do, but, but I do think their role
258
00:14:06,196 --> 00:14:10,966
or their way to fit, if I were trying
to navigate this space and I was Google,
259

00:14:10,996 --> 00:14:15,636
would be to take almost an Apple approach
to this, given their scale and size.
260
00:14:15,666 --> 00:14:19,336
And what I mean by that is I, I always
joke people think of Apple as innovative,
261
00:14:19,416 --> 00:14:23,856
and I think of Apple as a last mover
advantage, not first mover advantage, company.
262
00:14:24,236 --> 00:14:27,956
They have had a few moments in their
life where they have been very early,
263
00:14:28,256 --> 00:14:30,926
but in many ways it's taking the
things that are already out there,
264
00:14:30,926 --> 00:14:32,246
that are already somewhat proven.
265
00:14:32,606 --> 00:14:37,286
And then making them so polished
and so well thought through that
266
00:14:37,316 --> 00:14:40,856
you just, they feel like they
fit into your life immediately.
267
00:14:40,876 --> 00:14:44,846
And, you know, they were not the first
to release little notification widgets
268
00:14:44,846 --> 00:14:46,686
on a smartphone; that was Android.
269
00:14:47,016 --> 00:14:50,146
They're not the first to do
wireless charging; that was Android.
270
00:14:50,566 --> 00:14:54,236
Go way back, they, they, they took a lot
of their early ideas from Xerox PARC.
271
00:14:54,466 --> 00:14:57,966
And so if Google wants to play the game
of being last, because it's really
272
00:14:57,966 --> 00:15:00,926
gonna work and work reliably,
there is a game to be played there.
273
00:15:01,206 --> 00:15:03,246
Because I don't think OpenAI
wants to play that game, to be
274
00:15:03,246 --> 00:15:04,396
honest, and you can't play both.
275
00:15:04,406 --> 00:15:08,631
I think right now, in many ways,
OpenAI is playing closer to the Android
276

00:15:08,651 --> 00:15:13,761
or Samsung, if we're going to use the
smartphone analogy, a model where they are
277
00:15:13,771 --> 00:15:15,651
riding the front edge of development.
278
00:15:15,731 --> 00:15:19,291
It drives them crazy if somebody else gets
something out new ahead of them and they
279
00:15:19,291 --> 00:15:21,531
want to play the front edge of the game.
280
00:15:21,801 --> 00:15:24,891
I think both can be successful
strategies as long as the thing that
281
00:15:24,901 --> 00:15:30,496
Google eventually releases, as you get
to Ultra, is worth the time and energy.
282

00:15:30,526 --> 00:15:33,916
That's the, you know, like,
it's worth the wait. That's the
283
00:15:33,916 --> 00:15:36,466
thing that will be left to find out.
284
00:15:37,736 --> 00:15:38,631
Fraser Kelton: We shall see.
285
00:15:38,631 --> 00:15:40,356
We shall see.
286
00:15:40,516 --> 00:15:41,716
Nabeel Hyatt: Look, it's hard.
287
00:15:42,016 --> 00:15:44,471
Of course, the cup example is tough.
288
00:15:44,951 --> 00:15:47,651
You know, these, these
prompts are hard to shape.
289
00:15:47,861 --> 00:15:50,681
It's hard to get the little alien
inside my computer to understand
290
00:15:50,831 --> 00:15:51,971
that I'm playing a cup game.
291
00:15:52,471 --> 00:15:58,731
Fraser Kelton: The issue with the cup thing is
that they lead the viewer
292
00:15:58,731 --> 00:16:00,481
to believe that there's zero prompting.
293
00:16:00,481 --> 00:16:02,071
It's not that prompting's hard.
294
00:16:02,546 --> 00:16:07,366
The way that the video is presented
suggests that there's zero prompting and
295
00:16:07,366 --> 00:16:14,356
that there's this real time multimodal
model watching you and, and the cups
296
00:16:14,376 --> 00:16:16,826
and inferring with real reasoning
297
00:16:16,896 --> 00:16:22,486
when really there's somewhat complex prompting
happening at each step behind the
298
00:16:22,486 --> 00:16:25,656
scenes, which is what I think has,
has caused everybody to be really
299
00:16:25,656 --> 00:16:29,096
disappointed in, in a decision to do that.
300
00:16:29,431 --> 00:16:33,421
Nabeel Hyatt: The last thing I'd say
on Gemini is that a lot of this
301
00:16:33,441 --> 00:16:36,031
consternation would have been solved.
302
00:16:36,531 --> 00:16:40,771
If they would have released APIs for
developers to build with at the same time.
303
00:16:41,621 --> 00:16:45,311
And I think, I think it's supposedly
gonna come out December 13th.
304
00:16:45,311 --> 00:16:47,111
I don't know if Ultra's
gonna be involved in that.
305
00:16:47,631 --> 00:16:53,901
But in a world of AI movement that's
five, seven days from now, I mean
306
00:16:55,031 --> 00:16:59,861
OpenAI fires a CEO and goes
through a, a coup attempt, then gets
307
00:16:59,861 --> 00:17:01,271
back a CEO in that time period.
308
00:17:01,271 --> 00:17:02,711
Like a lot, a lot happens in five days.
309
00:17:04,101 --> 00:17:05,291
Fraser Kelton: lot happens in five days.
310
00:17:05,971 --> 00:17:10,301
Nabeel Hyatt: and so, like, I'm sure
this was PR-oriented; they wanted
311
00:17:10,301 --> 00:17:14,791
people to watch a Mark Rober video and
so on and so forth before developers
312
00:17:14,791 --> 00:17:16,251
had control of the narrative.
313
00:17:16,501 --> 00:17:21,791
But it's really unfortunate,
because I think it creates a sense
314

00:17:22,101 --> 00:17:24,611
of doubt when there shouldn't be one.
315
00:17:24,651 --> 00:17:28,661
It should just be high fives,
hand clapping, and a playground.
316
00:17:29,051 --> 00:17:32,441
And I think that was a little bit
of a PR mishap, but we'll, we'll
317
00:17:32,441 --> 00:17:34,246
see what happens in, in seven days.
318
00:17:35,701 --> 00:17:36,001
Fraser Kelton: Yep.
319
00:17:36,321 --> 00:17:36,691
Yep.
320
00:17:37,251 --> 00:17:39,251
And anyway, prompt, prompting is hard.
321
00:17:40,041 --> 00:17:46,331
We, we talked last time about efforts in
ChatGPT to simplify the complexity and
322
00:17:46,331 --> 00:17:51,241
ambiguity of prompting specifically with
DALL-E, where they want to take the, the
323
00:17:51,241 --> 00:17:55,931
three words that somebody who's unfamiliar
or lazy with their, their directions wants
324
00:17:55,931 --> 00:18:00,641
to do and how, if you're a power user
such as yourself, it's just suboptimal.
325
00:18:02,036 --> 00:18:02,856
Nabeel Hyatt: Yeah, it's,
326
00:18:02,936 --> 00:18:03,426
Fraser Kelton: What, what
327
00:18:03,746 --> 00:18:07,436
Nabeel Hyatt: I saw a great example
of that this week, because I'm going
328
00:18:07,436 --> 00:18:15,286
to keep banging the drum that I think
Prompt Engineering is a real skill and
329
00:18:15,286 --> 00:18:19,106
will be a career for quite some time
and that actually Prompt Engineering
330
00:18:19,566 --> 00:18:22,566
is going to become more of a language.
331
00:18:23,136 --> 00:18:27,116
Before it eventually gets abstracted out,
but our ability to totally abstract it
332
00:18:27,116 --> 00:18:29,786
out while we're still trying to figure
out what these non-deterministic models
333
00:18:29,786 --> 00:18:34,646
can actually do is very, is very far
away, maybe years and years away before
334
00:18:34,646 --> 00:18:36,246
we can build these systems on top of them.
335
00:18:36,706 --> 00:18:40,276
I got handed a wonderful
example of this today, I sent
336
00:18:40,276 --> 00:18:42,026
it your way which is Anthropic.
337
00:18:42,671 --> 00:18:46,481
Which we are investors in, by the way,
full disclosure to everybody.
338
00:18:46,781 --> 00:18:50,401
Anthropic has a competitive model
to OpenAI and Gemini called Claude.
339
00:18:51,071 --> 00:18:56,141
And there is a well known research
problem and execution problem
340
00:18:56,161 --> 00:18:57,961
in these long context windows.
341
00:18:58,351 --> 00:19:03,151
of AI where I'm asking it to, for
instance, look at an entire PDF or look
342
00:19:03,151 --> 00:19:08,141
at a long chat and find some phrase
or find some word inside of that, did
343
00:19:08,141 --> 00:19:11,671
Sam talk about the beach or not, or
what's the best cooking technique?
344
00:19:11,671 --> 00:19:15,196
And it turns out that if it's mentioned
in the beginning of a doc, or at the
345
00:19:15,196 --> 00:19:20,236
end of a doc every model, all of these
LLMs show that they can find information
346
00:19:20,236 --> 00:19:24,296
at the beginning of a doc and at
the end of the doc faster and more
347
00:19:24,296 --> 00:19:26,226
reliably than in the middle of the doc.
348
00:19:26,226 --> 00:19:29,396
The middle, it's the missing middle,
it just sometimes misses stuff.
349
00:19:29,911 --> 00:19:34,251
Well, this has been a quote unquote
known thing, of which people have been
350
00:19:34,251 --> 00:19:37,661
trying to do all kinds of different
engineering techniques chunking the
351
00:19:37,661 --> 00:19:41,481
data into smaller bits, and then
there's like comparison evals against
352
00:19:41,481 --> 00:19:44,491
different models at different times,
and how they perform on the missing
353
00:19:44,501 --> 00:19:45,471
middle, and so on and so forth.
354
00:19:45,971 --> 00:19:50,381
And then it turns out that Anthropic
releases a paper today called Claude 2.1
355
00:19:50,381 --> 00:19:55,081
Prompting, that says, well,
what did it say, Fraser?
356
00:19:55,601 --> 00:20:00,476
What's the crazy, deep Engineering
technique that, that scientists
357
00:20:00,476 --> 00:20:04,786
have figured out in order to finally
unlock moving from 23 percent
358
00:20:04,786 --> 00:20:08,896
missing middle accuracy up to 97
percent missing middle accuracy.
359
00:20:10,496 --> 00:20:16,626
Fraser Kelton: I mean, they add a
line to the prompt that says here
360
00:20:16,626 --> 00:20:21,046
is the most relevant sentence in the
context, which basically nudges the
361
00:20:21,046 --> 00:20:27,796
model to go and pull out the relevant
sentence for the question at hand.
362
00:20:28,546 --> 00:20:29,696
And that's the bump,
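The trick described here, as Anthropic wrote it up for Claude 2.1, amounts to pre-filling the start of the assistant's reply with one fixed sentence so the model first locates the relevant passage before answering. A minimal sketch of how you might construct that (the message format mirrors common chat APIs; no real API call is made, and the helper names are illustrative):

```python
# The single sentence Anthropic reported adding to Claude's turn.
ANSWER_PREFILL = "Here is the most relevant sentence in the context:"

def build_messages(context: str, question: str) -> list:
    """Build a chat-style message list with the assistant turn pre-filled.

    The model continues generating from the pre-filled assistant text,
    which nudges it to quote the relevant sentence before answering.
    """
    user_turn = f"{context}\n\nQuestion: {question}"
    return [
        {"role": "user", "content": user_turn},
        # Pre-filled assistant turn: generation resumes after this text.
        {"role": "assistant", "content": ANSWER_PREFILL},
    ]

msgs = build_messages("...long document...", "Did Sam talk about the beach?")
```

You would pass `msgs` to whichever chat completion endpoint you use; the only substantive part of the technique is that last pre-filled turn.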
363
00:20:30,676 --> 00:20:30,856
Nabeel Hyatt: Yeah.
364
00:20:30,856 --> 00:20:33,046
I mean, that's insane.
365
00:20:33,096 --> 00:20:36,286
This afternoon, I'm going to do
some work and figure out whether
366
00:20:36,286 --> 00:20:38,036
this works in GPT 4 as well.
367
00:20:38,036 --> 00:20:40,416
I didn't have a chance to yet,
but, but it'd be really interesting.
368
00:20:40,751 --> 00:20:43,841
Both of the results, don't you
think, Fraser, would be interesting?
369
00:20:43,841 --> 00:20:49,421
Like, if that phrasing does work in
GPT 4 as well then it's like, oh, you
370
00:20:49,421 --> 00:20:52,651
just figured out a new incantation,
kind of like we found out that if
371
00:20:52,651 --> 00:20:56,441
you, you tell a model you're going
to tip it to do something, I'll give
372
00:20:56,441 --> 00:20:58,081
you $20 if you answer this question.
373
00:20:58,151 --> 00:21:00,781
They, it tends to perform better on
that question, even though, of course,
374
00:21:00,781 --> 00:21:02,291
you're not giving the model $20.
375
00:21:02,681 --> 00:21:03,951
Another crazy incantation.
376
00:21:04,476 --> 00:21:08,426
And then if, so one, if it worked,
that's interesting, and it tells us
377
00:21:08,466 --> 00:21:12,556
more, a little tip into the language
of how to use these models for large
378
00:21:12,556 --> 00:21:16,326
context windows, which is particularly
valuable for Claude, because it
379
00:21:16,336 --> 00:21:18,176
has such a large context window.
380
00:21:18,176 --> 00:21:20,026
You can just put lots and
lots of text in there.
381
00:21:21,326 --> 00:21:24,596
If it doesn't work in other models,
that's even more interesting, right?
382
00:21:25,476 --> 00:21:29,686
Now, for all these companies that are
trying to say, don't worry, I'm building
383
00:21:30,006 --> 00:21:33,636
middleware dev tools that let you switch
in and out models arbitrarily, with
384
00:21:33,666 --> 00:21:35,466
the, like, like they're all the same.
385
00:21:36,006 --> 00:21:36,686
That they're not.
386
00:21:37,141 --> 00:21:40,841
Fraser Kelton: I would be so
surprised if they're the same today.
387
00:21:40,851 --> 00:21:43,681
And the difference is only
going to grow over time.
388
00:21:44,681 --> 00:21:47,711
There's a whole bunch of
different things going on here.
389
00:21:47,721 --> 00:21:51,801
This is a quote unquote eval
called Needle in the Haystack.
390
00:21:51,966 --> 00:21:56,346
And I think that, yet again, this is
a situation where the eval doesn't
391
00:21:56,356 --> 00:22:00,456
measure anything proximately close
to what happens in, in production
392
00:22:00,456 --> 00:22:05,606
for people who are building products,
right, is if you, if you insert into
393
00:22:05,606 --> 00:22:09,136
the middle of some financial set of
documents, a single sentence that
394
00:22:09,136 --> 00:22:14,126
says Dolores Park is the best place
to have a drink in San Francisco and
395
00:22:14,126 --> 00:22:15,586
then the model can't find it, right.
396
00:22:16,076 --> 00:22:20,306
I'm not sure that that is reflective
of any real world problem that
397
00:22:20,306 --> 00:22:21,456
people are trying to solve with this.
398
00:22:21,496 --> 00:22:27,946
The other thing that is so interesting
here is that the Anthropic model, they
399
00:22:27,946 --> 00:22:31,716
hypothesized performed poorly when
people were running it through that,
400
00:22:31,736 --> 00:22:38,116
that eval because they've trained their
models to cut down on inaccuracies
401
00:22:38,146 --> 00:22:41,356
specifically for these types of use cases.
402
00:22:41,671 --> 00:22:42,001
Right.
403
00:22:42,001 --> 00:22:46,961
And so they basically have trained the
model to say, okay, if something feels
404
00:22:46,961 --> 00:22:51,441
completely orthogonal from the rest of
the documents, it's probably not something
405
00:22:51,441 --> 00:22:53,431
that's, that's important and or accurate.
406
00:22:53,431 --> 00:22:54,561
It's probably not even accurate.
407
00:22:54,561 --> 00:22:56,061
So, so just ignore that.
408
00:22:56,531 --> 00:22:56,831
Right.
409
00:22:56,851 --> 00:23:01,161
And then the eval is basically testing for
the model performance to do exactly that.
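The "needle in a haystack" eval being discussed can be sketched in a few lines: plant one out-of-place sentence at different depths of a synthetic document and check whether it can be retrieved. This toy version stubs out the model call, since a real run would prompt an LLM with each document and grade its answer:

```python
# The out-of-place "needle" from Fraser's example.
NEEDLE = "Dolores Park is the best place to have a drink in San Francisco."

def make_haystack(filler: str, depth: float, total_lines: int = 100) -> str:
    """Repeat a filler sentence and insert the needle at a relative depth
    (0.0 = start of the document, 1.0 = end)."""
    lines = [filler] * total_lines
    lines.insert(int(depth * total_lines), NEEDLE)
    return "\n".join(lines)

def find_needle(document: str) -> bool:
    # Stub standing in for a model call: a real eval would ask an LLM
    # to find the odd sentence and check its answer against NEEDLE.
    return NEEDLE in document

# Hit rate per depth — the discussed "missing middle" would show up as
# failures around depth 0.5 with a real model in place of the stub.
hits = {d: find_needle(make_haystack("Revenue grew 3% quarter over quarter.", d))
        for d in (0.0, 0.5, 1.0)}
```

With the stub, every depth trivially succeeds; the point is only the shape of the harness.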
410
00:23:01,314 --> 00:23:04,014
Nabeel Hyatt: Yeah, but I want to get back
to the point that I wanted to make, which
411
00:23:04,264 --> 00:23:11,574
this is six words that you put into a
prompt if you were trying to do long text
412
00:23:11,574 --> 00:23:18,554
retrieval, retrieving text from a long context
window, that, that does boost performance.
413
00:23:19,609 --> 00:23:24,899
I don't know, it just, it tells
me how naive we are collectively
414
00:23:24,919 --> 00:23:26,399
about how to use these models.
415
00:23:27,809 --> 00:23:33,046
Um, Emma, who's an AI hacker in
residence for us, she did a benchmark
416
00:23:33,056 --> 00:23:38,096
on some internal tools that she was
using on Glaive versus GPT and she
417
00:23:38,096 --> 00:23:42,696
found that without prompt engineering,
Glaive did better than GPT 4 probably
418
00:23:42,696 --> 00:23:46,096
because it's trained only on high
quality synthetic data and so on and
419
00:23:46,096 --> 00:23:50,771
so forth, but that if you added the
sentence, "You're a well-known historian,"
420
00:23:50,961 --> 00:23:58,031
to the prompt for both Glaive and
GPT, then GPT-4 suddenly did better.
421
00:23:59,031 --> 00:24:03,511
And it's just another good testament
to you just need to find the magic five
422
00:24:03,741 --> 00:24:09,291
incantation words to suddenly make your
business be able to move into prod.
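A benchmark like the one described can be approximated with a tiny A/B harness: run the same questions with and without the persona prefix and compare accuracy. Everything here is illustrative; `ask_model` is a stub you would swap for a real API call, and the simulated behavior (the persona prompt "winning") just mirrors the anecdote:

```python
# One-line persona prefix from the anecdote above.
PERSONA = "You're a well-known historian. "

def ask_model(prompt: str) -> str:
    # Stand-in for a real model call; the persona-framed prompt is
    # simulated as answering correctly in this toy example.
    return "correct" if prompt.startswith(PERSONA) else "wrong"

def accuracy(questions: list, prefix: str = "") -> float:
    """Fraction of questions answered 'correct' with a given prefix."""
    answers = [ask_model(prefix + q) for q in questions]
    return sum(a == "correct" for a in answers) / len(questions)

qs = ["Who unified Upper and Lower Egypt?", "When did Rome fall?"]
baseline = accuracy(qs)               # 0.0 with the stub
with_persona = accuracy(qs, PERSONA)  # 1.0 with the stub
```

In practice you would grade real model answers against references instead of string-matching a stub, but the compare-two-prefixes structure is the whole idea.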
423
00:24:09,291 --> 00:24:10,526
That's
424
00:24:10,571 --> 00:24:11,731
Fraser Kelton: that is so crazy.
425
00:24:11,731 --> 00:24:16,261
Like, to even try to internalize
the brittleness of these models.
426
00:24:16,271 --> 00:24:20,861
You're going to have, you are,
you're a well known historian and
427
00:24:20,861 --> 00:24:22,221
then it finally outperforms it.
428
00:24:22,751 --> 00:24:25,641
I don't know how this gets solved,
like, other than at the system level.
429
00:24:25,821 --> 00:24:28,701
Nabeel Hyatt: But, in the very early
days of video games, you worked at the
430
00:24:28,701 --> 00:24:31,041
assembly level to make a video game.
431
00:24:31,041 --> 00:24:34,851
In the early days of computer
graphics, before we got to
432
00:24:34,901 --> 00:24:37,111
engines, we had to work in code.
433
00:24:37,611 --> 00:24:41,621
And we will eventually get
to lots of GUIs and engines.
434
00:24:41,621 --> 00:24:45,081
And we've actually talked before
about how prompt engineering is not
435
00:24:45,091 --> 00:24:47,901
how every average user is going to use these
products and people are bad at English.
436
00:24:48,336 --> 00:24:51,566
But at the same time, if you need
performance, you need to be at bare
437
00:24:51,566 --> 00:24:56,186
metal as close to the model as possible
for probably a little while until
438
00:24:56,376 --> 00:24:59,376
everything really, really works.
439
00:24:59,706 --> 00:25:04,196
And at that point, when it's automatic
when we've made our 50th first person
440
00:25:04,196 --> 00:25:07,176
shooter that's in production and making
hundreds of millions of dollars a year.
441
00:25:07,406 --> 00:25:09,886
Then we can talk about making an engine
for making first person shooters.
442
00:25:09,886 --> 00:25:13,306
And when we get there, in game parlance,
you get Unreal and you get
443
00:25:13,306 --> 00:25:14,546
Unity and so on and so forth.
444
00:25:14,546 --> 00:25:19,076
But it feels like we are still in,
you know, how was Pac Man built?
445
00:25:20,286 --> 00:25:22,406
Fraser Kelton: I don't, I don't want
to open up this can of worms, but
446
00:25:22,446 --> 00:25:25,186
don't you think that is a measure of
447
00:25:25,251 --> 00:25:28,901
The model's capabilities
not being strong enough.
448
00:25:28,951 --> 00:25:29,821
Nabeel Hyatt: We know that.
449
00:25:30,251 --> 00:25:36,771
Just like in, like, early programming
days, you were wrangling with
450
00:25:36,771 --> 00:25:38,901
the amount of memory on the computer.
451
00:25:39,111 --> 00:25:41,911
You've got a thousand twenty four
bytes of memory, and you're just
452
00:25:41,921 --> 00:25:45,091
trying to make a spreadsheet work in
this tiny little bit of memory, and
453
00:25:45,091 --> 00:25:51,071
you need every little squeezing bit of
thing just to make it operate, right?
454
00:25:51,181 --> 00:25:56,471
And it isn't about speed the way it often
was back then, but it's still about
455
00:25:57,071 --> 00:25:58,581
whether the job can be done well or not.
456
00:25:59,321 --> 00:26:03,211
And, and yeah, we'll need to be very
close to bare metal, until all these
457
00:26:03,211 --> 00:26:09,231
things run perfectly all the time, and it can
be about efficiency and cost and abstraction
458
00:26:09,346 --> 00:26:10,226
Fraser Kelton: Yep
459
00:26:10,701 --> 00:26:11,395
Nabeel Hyatt: and the rest of that stuff.
460
00:26:11,525 --> 00:26:17,295
If five words for your specific use
case are going to increase performance,
461
00:26:17,585 --> 00:26:23,132
then I don't know if I'm Fidelity or
Procter & Gamble or Figma, or InstaWork
462
00:26:23,219 --> 00:26:28,749
or another startup , like, I don't know
that I'm willing to take the future of
463
00:26:28,749 --> 00:26:36,219
my business's effectiveness in AI, which
could twist and turn on five words,
464
00:26:37,149 --> 00:26:37,599
Fraser Kelton: right.
465
00:26:37,839 --> 00:26:38,319
Yeah.
466
00:26:38,469 --> 00:26:40,879
Nabeel Hyatt: And who's going
to figure out those five words
467
00:26:40,879 --> 00:26:41,949
for your specific business?
468
00:26:41,959 --> 00:26:46,649
It's certainly not going to be
some random middleware company.
469
00:26:46,669 --> 00:26:49,329
It's going to be you because you
care about your company and you've
470
00:26:49,699 --> 00:26:52,552
hacked away at it or had a prompt
engineer who's hacking away at it.
471
00:26:52,552 --> 00:26:57,712
You've really worked it to try and
figure out how to wrangle this alien
472
00:26:57,892 --> 00:26:59,272
to do the work that you want it to do.
473
00:27:00,280 --> 00:27:04,430
Fraser Kelton: The point here is that
the brittleness of these models today
474
00:27:04,450 --> 00:27:08,020
across different use cases suggests that
you're going to want to have people,
475
00:27:08,040 --> 00:27:10,178
quote unquote, like, working at the metal.
476
00:27:10,178 --> 00:27:10,926
Yep.
477
00:27:11,240 --> 00:27:14,480
Nabeel Hyatt: The analogy I would use is,
in the really early days of the web,
478
00:27:14,712 --> 00:27:20,472
there was almost immediately
a bunch of WYSIWYG web page
479
00:27:20,472 --> 00:27:22,242
developer software companies.
480
00:27:22,282 --> 00:27:27,582
There were 30 startups that were like, you
don't have to learn CSS and HTML just use
481
00:27:27,582 --> 00:27:32,642
our little product, and you can get your
web page out without tweaking it at all.
482
00:27:33,072 --> 00:27:36,647
And, you know, if we fast
forward 10 years, of course,
483
00:27:36,647 --> 00:27:37,777
there are many of those companies.
484
00:27:37,787 --> 00:27:42,002
Today, there's Squarespace and
Webflow, a bunch of these companies
485
00:27:42,002 --> 00:27:45,842
that are helping everybody from a
restaurant up the street all the
486
00:27:45,842 --> 00:27:48,262
way to complex enterprise websites.
487
00:27:48,572 --> 00:27:57,352
But in the early days, as a good example,
prior to CSS, the way that you laid
488
00:27:57,362 --> 00:28:01,412
things out on a webpage, so the way I got
something to show up on the right hand
489
00:28:01,412 --> 00:28:05,912
side of a webpage versus the left hand
side of a webpage, was to use a kludge
490
00:28:06,472 --> 00:28:11,222
which is to build a table, kind of like
a spreadsheet on that webpage in HTML,
491
00:28:11,702 --> 00:28:16,412
and then, in one of the cells on the
right hand side, put my logo so it's on the
492
00:28:16,412 --> 00:28:22,792
right, and then make the cells of that
spreadsheet invisible. And, for me, it feels
493
00:28:22,792 --> 00:28:29,812
like we are way more in that land than
we are in, in WYSIWYG abstraction land.
494
00:28:29,862 --> 00:28:33,632
And so the whole first wave,
the whole first couple of years,
495
00:28:33,648 --> 00:28:39,227
of WYSIWYG website builder companies all
went out of business very, very quickly.
496
00:28:40,967 --> 00:28:41,887
What, what happened there?
497
00:28:41,887 --> 00:28:43,547
If we're going to use that
analogy, what happened there?
498
00:28:44,247 --> 00:28:47,737
What would be the business, if you
wanted to help a million companies
499
00:28:48,737 --> 00:28:50,697
build their first LLM applications?
500
00:28:51,487 --> 00:28:55,667
And the contention is that it's
not the time to build the Squarespace
501
00:28:55,667 --> 00:28:59,417
of the space, uh, which I'm
not, by the way, you know, this
502
00:28:59,417 --> 00:29:01,471
is us just chatting on a podcast
503
00:29:02,481 --> 00:29:03,805
A founder could walk in tomorrow.
504
00:29:04,250 --> 00:29:08,720
And pitch, and pitch the most beautiful,
wonderful idea for Squarespace for AI
505
00:29:08,720 --> 00:29:11,230
and, and just prove you totally wrong.
506
00:29:11,260 --> 00:29:12,420
And that's the joy of this process,
507
00:29:12,675 --> 00:29:14,595
Fraser Kelton: That's,
that's what makes this so fun.
508
00:29:14,880 --> 00:29:16,010
Nabeel Hyatt: Yeah, exactly.
509
00:29:16,230 --> 00:29:19,240
So strong, strong convictions,
really loosely held.
510
00:29:19,540 --> 00:29:22,660
But, actually, do you
believe in my analogy?
511
00:29:22,660 --> 00:29:25,000
Do you think that's an apt analogy
or do you think I'm full of it?
512
00:29:25,550 --> 00:29:26,820
Fraser Kelton: No, I don't
think you're full of it.
513
00:29:26,960 --> 00:29:33,010
So, if I understand what's happened in the
Anthropic case, it is the way that they
514
00:29:33,020 --> 00:29:38,890
have tried to nudge the model to improve
performance has then resulted in some
515
00:29:38,950 --> 00:29:45,450
wonky behavior, which you can then nudge
over that hurdle with five magic words.
516
00:29:45,870 --> 00:29:47,100
And what does that say to me?
517
00:29:47,100 --> 00:29:51,830
That, that says to me that
there's probably a solution that
518
00:29:51,840 --> 00:29:54,310
happens at the system level.
519
00:29:54,350 --> 00:29:57,980
If you think about how this may mature,
why would they want their customers
520
00:29:57,980 --> 00:29:59,030
to ever have to think about that?
521
00:29:59,030 --> 00:30:03,900
They'll, they'll find ways to absorb
the solution or abstract the solution
522
00:30:03,970 --> 00:30:05,765
for use cases where it makes sense.
523
00:30:06,007 --> 00:30:07,319
Nabeel Hyatt: Yeah, but I
don't have time for that.
524
00:30:07,319 --> 00:30:10,045
I'm a founder that wants
first mover advantage.
525
00:30:10,248 --> 00:30:14,509
Or, my boss told me that I need to have an
AI strategy and I need to, I need to
526
00:30:14,509 --> 00:30:17,819
launch next month and it can't, it's
got to get out of demo land cause I've
527
00:30:17,819 --> 00:30:19,399
got an earnings report next quarter.
528
00:30:19,449 --> 00:30:23,579
Fraser Kelton: This, this is
why that, that person is having
529
00:30:23,914 --> 00:30:25,384
random success.
530
00:30:25,394 --> 00:30:27,684
Sometimes they're succeeding,
sometimes they're failing.
531
00:30:27,754 --> 00:30:30,444
And sometimes they come back to
the drawing board with an entirely
532
00:30:30,444 --> 00:30:31,974
new approach one month later.
533
00:30:32,014 --> 00:30:33,174
We've seen that a lot.
534
00:30:33,789 --> 00:30:34,549
Nabeel Hyatt: That's very true.
535
00:30:35,109 --> 00:30:36,459
I do wonder.
536
00:30:37,059 --> 00:30:41,309
If Procter and Gamble and, and Fidelity
and JPMorgan and every other company
537
00:30:41,309 --> 00:30:43,469
is trying to figure out how to use AI.
538
00:30:44,549 --> 00:30:47,439
If I just think about the web, the
web analogy for a second, and you
539
00:30:47,439 --> 00:30:50,959
don't want to overstretch any analogy
of course, but the really effective
540
00:30:50,959 --> 00:30:54,059
companies in that first wave for
helping to bring everybody onto the
541
00:30:54,059 --> 00:30:59,769
web were kind of a mixture of tools
companies slash consulting companies.
542
00:31:00,659 --> 00:31:01,159
Fraser Kelton: Yeah,
543
00:31:01,302 --> 00:31:08,070
Nabeel Hyatt: It was Scient and Viant
and Razorfish that, you go pay them
544
00:31:08,080 --> 00:31:13,480
hundreds of thousands of dollars
and they would build TimeMagazine.
545
00:31:13,710 --> 00:31:17,800
com for the first time, these
kind of mixture of design agency,
546
00:31:17,980 --> 00:31:20,700
software engineering, and then
they ended up with internal tool
547
00:31:20,700 --> 00:31:22,340
stacks that they knew how to use.
548
00:31:22,790 --> 00:31:25,270
I think there's an analogy to
549
00:31:25,351 --> 00:31:25,911
Fraser Kelton: Oh, hell, yeah,
550
00:31:26,901 --> 00:31:27,631
I mean, yeah.
551
00:31:27,751 --> 00:31:31,911
There's a reason why OpenAI has, I
forget, I'm going to get the names
552
00:31:31,911 --> 00:31:36,271
wrong here, but has a keystone
partnership with Bain and Anthropic
553
00:31:36,271 --> 00:31:38,651
has a keystone partnership with BCG.
554
00:31:39,056 --> 00:31:43,566
These are foot-in-the-door things to bring
AI into the enterprise, as we've seen.
555
00:31:43,586 --> 00:31:45,986
Five words make the difference between
something that looks horrible and
556
00:31:45,986 --> 00:31:49,666
something that would be delightful in
production, and there has to be people
557
00:31:49,666 --> 00:31:55,166
who can help you navigate that, uh, as the
world is changing underneath your feet.
558
00:31:55,821 --> 00:31:56,401
Three months.
559
00:31:56,481 --> 00:31:56,921
No.
560
00:31:57,646 --> 00:32:03,136
Nabeel Hyatt: Well, the contention
is right that, Razorfish and Scient
561
00:32:03,156 --> 00:32:07,336
and Viant were net new orgs, yes, they
were consulting organizations that
562
00:32:07,664 --> 00:32:11,724
rhyme with Bain in the way that they
actually operated, but Bain is old school.
563
00:32:12,194 --> 00:32:15,714
Are there really great AI
implementation engineers waiting
564
00:32:15,714 --> 00:32:17,744
at Bain to take you out to market?
565
00:32:17,754 --> 00:32:19,164
Absolutely not, I would guess.
566
00:32:19,724 --> 00:32:27,704
I, I suspect that there's
an opportunity for a net new company to be
567
00:32:27,704 --> 00:32:34,704
filled with people who like to implement,
who will help take these tools, which seem
568
00:32:34,724 --> 00:32:37,354
maybe very easy to stand up very quickly.
569
00:32:37,354 --> 00:32:40,564
I can just go to a prompt and type
things in, but I think are probably
570
00:32:40,634 --> 00:32:44,184
more complicated and people will find
are more complicated than they think.
571
00:32:45,004 --> 00:32:46,764
To actually implement and get live.
572
00:32:46,804 --> 00:32:48,864
And that's why I like the HTML analogy.
573
00:32:48,864 --> 00:32:51,714
It's incredibly simple to
build your first HTML page.
574
00:32:52,014 --> 00:32:54,254
But then, it feels
like anyone can do it.
575
00:32:54,554 --> 00:32:57,294
But actually trying to
run the NewYorkTimes.
576
00:32:57,294 --> 00:33:02,204
com, you know, is another whole
order of magnitude more difficult.
577
00:33:02,224 --> 00:33:05,044
And especially in the early days
where people didn't really know
578
00:33:05,044 --> 00:33:06,714
the web and how to do web development.
579
00:33:07,084 --> 00:33:10,314
You needed a set of people that were
your launch team and stood up the
580
00:33:10,314 --> 00:33:12,224
internet, you know, website by website.
581
00:33:12,604 --> 00:33:16,204
I think there's a little bit of that
that probably goes on and I just
582
00:33:16,204 --> 00:33:20,874
don't think it's going to be McKinsey
or Bain or the folks that have,
583
00:33:21,429 --> 00:33:25,819
really very little of this specific
type of DNA, but I could be wrong.
584
00:33:26,674 --> 00:33:26,974
Fraser Kelton: Yeah.
585
00:33:27,024 --> 00:33:31,824
People who did it back in the day for
transitioning people onto the internet.
586
00:33:32,419 --> 00:33:36,609
Did they do it through just specialized
know how, or did they build tools
587
00:33:36,609 --> 00:33:41,209
and platforms that allowed them to,
to simplify the task for others?
588
00:33:41,504 --> 00:33:43,864
Nabeel Hyatt: Like anything you start
out making a thing and then you're
589
00:33:43,864 --> 00:33:47,064
like, once you've done it two or three
times, engineers can't help themselves.
590
00:33:47,484 --> 00:33:50,984
And so you start to build efficient
591
00:33:51,124 --> 00:33:54,794
Fraser Kelton: But, so, but
are we back to this: there actually
592
00:33:54,794 --> 00:33:59,144
is a middleware company, like a tool
that's going to start from a consultancy
593
00:33:59,144 --> 00:34:00,614
type perspective and then get built out?
594
00:34:02,144 --> 00:34:05,144
And then is your, your issue
with the tool startups?
595
00:34:05,144 --> 00:34:07,874
Just the fact that they're not
going to market appropriately.
596
00:34:08,874 --> 00:34:09,904
Nabeel Hyatt: That's a good pushback.
597
00:34:09,924 --> 00:34:10,884
It might be.
598
00:34:11,214 --> 00:34:14,544
I mean, we'll, none of us know, we'll see
how this all plays out, but yeah, maybe
599
00:34:14,544 --> 00:34:16,414
the right way, it's not what VCs want.
600
00:34:16,434 --> 00:34:18,654
Hey, why don't you hire more
implementation engineers?
601
00:34:18,834 --> 00:34:22,464
It's not what a VC on a panel would say.
602
00:34:22,464 --> 00:34:23,704
They'd be like, no, no humans.
603
00:34:23,724 --> 00:34:25,244
The AI should write itself.
604
00:34:25,734 --> 00:34:32,004
But for where we are on the technology
side it might be that the right answer
605
00:34:32,004 --> 00:34:34,194
for the next 12 to 18 months is:
606
00:34:34,824 --> 00:34:38,444
You have a whole bunch of
implementation engineers that are
607
00:34:38,444 --> 00:34:44,382
script monkeys that know all of the
unique folklore about how to wrangle
608
00:34:44,382 --> 00:34:45,832
these models in the right direction.
609
00:34:45,882 --> 00:34:49,532
So you're still selling your tool
set, but you're selling your tool set
610
00:34:49,582 --> 00:34:53,332
along with a handful of implementation
engineers and a maintenance contract.
611
00:34:53,942 --> 00:34:59,242
And I know that that, that breaks a lot
of the purity of software that we would
612
00:34:59,242 --> 00:35:03,972
all love for engineering to be, but
it might be the right thing for, for
613
00:35:03,972 --> 00:35:05,902
this particular stage that we're in.
614
00:35:06,197 --> 00:35:09,107
Fraser Kelton: Could be. You know,
going back to the start of the API,
615
00:35:09,427 --> 00:35:12,367
there were two people, a guy named
Boris and a guy named Andrew at OpenAI
616
00:35:12,867 --> 00:35:17,617
who were prompt wizards, like they
just knew how to, to construct and
617
00:35:17,617 --> 00:35:18,967
orchestrate these things in a way.
618
00:35:18,967 --> 00:35:20,427
And that's what, that's what they did.
619
00:35:20,487 --> 00:35:24,177
They ran around to the implementations
that seemed most interesting and then
620
00:35:24,177 --> 00:35:29,147
helped them sand off the rough edges
to see if it was a path to production.
621
00:35:29,147 --> 00:35:34,112
And in many cases, they could nudge them
there, whereas few, few people could.
622
00:35:34,642 --> 00:35:36,132
Nabeel Hyatt: Boris is a
great name for a startup.
623
00:35:37,132 --> 00:35:38,472
Fraser Kelton: Yeah, he is remarkable.
624
00:35:38,532 --> 00:35:40,682
He himself could be a startup.
625
00:35:41,072 --> 00:35:44,522
So you don't think that these things
get abstracted the other way, where
626
00:35:44,522 --> 00:35:49,532
they get pulled down into the actual
model level, and that people aren't
627
00:35:49,532 --> 00:35:51,952
interacting with any of this above that?
628
00:35:52,022 --> 00:35:54,782
And, and it kind of ties back
to the Gemini thing, right?
629
00:35:54,902 --> 00:35:56,272
Nabeel Hyatt: Oh, I think
that's a very good point.
630
00:35:56,292 --> 00:35:58,332
Very likely that that happens in parallel.
631
00:35:58,992 --> 00:36:04,922
And, technology tends to go through
ages of entropy and de-entropy.
632
00:36:05,462 --> 00:36:11,122
We all love, especially as engineers,
we love de-entropy, we love simplifying
633
00:36:11,122 --> 00:36:16,592
everything, cleaning it up, getting rid
of the noise from the signal, bringing it
634
00:36:16,592 --> 00:36:18,592
all down into something that works.
635
00:36:18,862 --> 00:36:23,972
But when things are not working fully,
you can't jump three steps ahead, you
636
00:36:24,292 --> 00:36:25,922
have to go through a phase of entropy.
637
00:36:26,102 --> 00:36:29,142
It's why I don't get nervous
about one more model launching
638
00:36:29,142 --> 00:36:30,622
or one more startup launching.
639
00:36:30,872 --> 00:36:35,422
We need as many shots on goal
and bets to move this technology
640
00:36:35,422 --> 00:36:36,882
forward as quickly as possible.
641
00:36:37,382 --> 00:36:42,462
Things that are trying to make a promise
of de-entropy too quickly just feel
642
00:36:42,462 --> 00:36:46,403
incongruous to me when the goal is
solve the problem reliably and, and
643
00:36:46,623 --> 00:36:48,783
we're still not at a reliable solution.
644
00:36:49,183 --> 00:36:52,743
And so my guess is that a reliable solution
is going to get way more complicated
645
00:36:52,743 --> 00:36:53,833
before it's going to get easier.
646
00:36:54,828 --> 00:36:57,568
Fraser Kelton: Boy, we wrestled with this
one, but that one feels really right.
647
00:36:58,103 --> 00:37:02,103
It's going to get more complicated
in every direction because we are
648
00:37:02,103 --> 00:37:07,863
not at the reliability required for
consistent value in many use cases.
649
00:37:08,353 --> 00:37:13,303
And like, why bother adding abstractions
of simplicity if you say it's
650
00:37:13,303 --> 00:37:14,323
still not going to be good enough?
651
00:37:15,093 --> 00:37:15,863
Nabeel Hyatt: Yeah, exactly.
652
00:37:15,873 --> 00:37:18,783
Fraser Kelton: It makes it a lot easier for you to get
something that's broken into production.
653
00:37:19,813 --> 00:37:21,423
Nabeel Hyatt: That's
what, that's the headline.
654
00:37:21,713 --> 00:37:24,117
Why are we making it easier to
get broken things into production?
655
00:37:24,337 --> 00:37:26,523
Or, we could just fix it
with marketing, Fraser.
656
00:37:26,895 --> 00:37:31,189
Fraser Kelton: No, man, this is like,
I'm listening to you talk about Gemini
657
00:37:31,219 --> 00:37:35,562
and then, like, nudge me, and I don't know,
they misrepresented what the product is.
658
00:37:36,562 --> 00:37:38,422
Nabeel Hyatt: you, you're
not reacting to the evals.
659
00:37:38,482 --> 00:37:39,892
You're reacting to the demo video.
660
00:37:39,922 --> 00:37:40,312
Fraser Kelton: That's right.
661
00:37:40,342 --> 00:37:42,632
Actually, I don't even care
that much about the evals.
662
00:37:42,682 --> 00:37:46,382
I think it's more interesting to
consider that all of these models are
663
00:37:46,382 --> 00:37:51,252
going to have different tricks ranging
from those five words that Anthropic
664
00:37:51,262 --> 00:37:56,512
had to do all the way up to like Q
star with test-time compute type stuff.
665
00:37:57,112 --> 00:37:58,992
The thing that bothers me is video.
666
00:37:59,052 --> 00:38:01,532
And I just thought about
what the equivalent is.
667
00:38:01,552 --> 00:38:06,392
Remember like a decade ago Apple
started marketing their new cameras
668
00:38:06,392 --> 00:38:10,482
by showing you the output of the
iPhone camera when they announced it.
669
00:38:10,512 --> 00:38:13,392
And then I don't know
whether it was Samsung or LG.
670
00:38:13,432 --> 00:38:20,492
And when they announced it, they shared
photos from DSLRs and they silently
671
00:38:20,492 --> 00:38:23,672
just wanted people to infer that
that was the image quality that was
672
00:38:23,672 --> 00:38:27,792
coming from the phone, and then people
discovered within an hour that it was a
673
00:38:27,802 --> 00:38:30,552
digital, like, SLR that took the photo.
674
00:38:30,812 --> 00:38:32,702
That feels exactly what happened here.
675
00:38:33,208 --> 00:38:37,678
And I'm sure that the demo with a little
bit of rough edges that they would have
676
00:38:37,678 --> 00:38:42,338
had if they had shown us the prompt steps
in between and the wait for an inference
677
00:38:42,338 --> 00:38:45,698
to occur still would have been a magical
moment and people would have lost their
678
00:38:45,698 --> 00:38:51,998
minds, but because we feel misled, it
erodes our trust and we feel betrayed,
679
00:38:51,998 --> 00:38:53,208
which is a very funny thing to say.
680
00:38:53,418 --> 00:38:57,023
This reminds me of a moment that
has surprised me, and there's a
681
00:38:57,023 --> 00:39:00,493
lesson here broadly for founders,
and it's not just, you know, be
682
00:39:00,493 --> 00:39:02,333
honest in your marketing material.
683
00:39:02,343 --> 00:39:07,253
I knew when I was a founder that the
common wisdom was just be completely
684
00:39:07,253 --> 00:39:11,833
upfront with VCs because they have seen
so many pitches that they can sniff out
685
00:39:11,833 --> 00:39:13,493
when something doesn't sound correct.
686
00:39:13,763 --> 00:39:17,393
I will tell you, in a pitch a couple
of months ago, you may not remember it.
687
00:39:17,893 --> 00:39:24,333
There was one moment where you paused, you
raised an eyebrow, you asked one question.
688
00:39:25,633 --> 00:39:30,783
And it was, it was not an aggressive
question, but it, it pulled the
689
00:39:30,783 --> 00:39:33,683
first thread that got to the truth.
690
00:39:34,773 --> 00:39:39,288
And my sense is, in that case, if
he had just been up front, we'd reach a
691
00:39:39,298 --> 00:39:42,958
slightly different outcome versus having
to pull that thread and discover that
692
00:39:42,958 --> 00:39:47,598
there was a little bit of, of deception
in how he was presenting things.
693
00:39:47,788 --> 00:39:50,268
Nabeel Hyatt: Oh, well, to be clear, he
was trying to put a gloss on everything.
694
00:39:50,298 --> 00:39:54,378
And I do remember exactly that
meeting the company had gone through
695
00:39:54,458 --> 00:39:59,648
a pivot, some founder breakup y
stuff, you know, just a lot of change.
696
00:40:00,078 --> 00:40:01,038
And I think.
697
00:40:01,463 --> 00:40:04,443
And had been around for a little bit,
all things we didn't know when that
698
00:40:04,443 --> 00:40:12,053
founder came in to present, but we
take first meetings all week long,
699
00:40:12,263 --> 00:40:15,413
like, if we're not good at reading
people and figuring out what's really
700
00:40:15,413 --> 00:40:16,963
happened, you can't do this job.
701
00:40:17,593 --> 00:40:22,748
And meanwhile, the most important part
of this job is establishing whether the
702
00:40:22,748 --> 00:40:26,068
other person across the hall is authentic
and you can trust them, because you're
703
00:40:26,068 --> 00:40:27,818
going to be on a long journey together.
704
00:40:28,208 --> 00:40:34,048
And so, before it's a good business
model or it's an amazing product or it's
705
00:40:34,058 --> 00:40:37,688
somebody you want to work with because
you love the intellectual banter and you
706
00:40:37,698 --> 00:40:40,168
think they're going to be a great leader
or whatever else is going to get you
707
00:40:40,168 --> 00:40:43,908
excited about this startup, you can't do
it if you don't think they're all being
708
00:40:43,908 --> 00:40:45,558
authentic and real and honest with you.
709
00:40:45,648 --> 00:40:49,583
And so, yeah, that was a founder who
had clearly gone through some stuff,
710
00:40:49,946 --> 00:40:52,496
Fraser Kelton: we don't care if
they've gone through stuff, right?
711
00:40:52,566 --> 00:40:53,036
Of course.
712
00:40:53,346 --> 00:40:53,856
that's part of the
713
00:40:54,041 --> 00:40:57,851
Nabeel Hyatt: we would love it if you've gone
through stuff, like, you learn some
714
00:40:57,851 --> 00:41:04,291
lessons, just own it, and tell the
story about how you, you started
715
00:41:04,311 --> 00:41:07,106
thinking it was this other thing
and you were just wrong.
716
00:41:07,396 --> 00:41:10,186
You moved to this town and it was
the wrong town because there was
717
00:41:10,186 --> 00:41:11,306
just a bunch of fly-by-nights.
718
00:41:11,306 --> 00:41:15,226
You took this founder on board, but they
were just a ne'er do well, so you had
719
00:41:15,226 --> 00:41:20,506
to get rid of them or just whatever it
happens to be that, that you went through.
720
00:41:20,516 --> 00:41:22,416
You just want your learned insights.
721
00:41:22,486 --> 00:41:26,296
And I think way too often people
want to tell a glossy story about how
722
00:41:26,296 --> 00:41:28,886
everything's up and to the right and
you got to get on board right now.
723
00:41:29,226 --> 00:41:33,156
Because this round is, oh, the other trick
is like this round is closing in two days.
724
00:41:33,206 --> 00:41:35,696
This stuff, like, creates a sense of urgency.
725
00:41:36,096 --> 00:41:39,796
None of that stuff, all
of that stuff just hurts.
726
00:41:39,826 --> 00:41:42,926
If you have a slow fundraising
process, first of all,
727
00:41:42,926 --> 00:41:44,266
people probably already know.
728
00:41:44,276 --> 00:41:46,186
Just say, I think they're all dumb.
729
00:41:46,456 --> 00:41:49,636
This is the reason I think you are
going to be smarter than all of them.
730
00:41:49,636 --> 00:41:51,686
You can try and go to their ego.
731
00:41:51,876 --> 00:41:53,746
So I'm not saying you
don't try to storytell.
732
00:41:54,066 --> 00:41:56,576
I'm just saying you have
to know how to do it while
733
00:41:57,096 --> 00:42:00,046
being yourself and being authentic
to the journey that you've been on.
734
00:42:01,046 --> 00:42:01,416
Fraser Kelton: Yep.
735
00:42:01,466 --> 00:42:03,883
How many times have we seen somebody,
oh, okay, the deal's coming together
736
00:42:03,883 --> 00:42:06,506
in two days, you have to move quickly,
and then we're like, okay, well
737
00:42:06,506 --> 00:42:08,056
then this is not the deal for us.
738
00:42:08,756 --> 00:42:10,996
And then all of a sudden you see
them try to backtrack fairly quickly.
739
00:42:11,346 --> 00:42:14,366
Well listen, we really like you, so maybe
we can give you a couple extra days.
740
00:42:14,386 --> 00:42:15,176
And you're like, alright
741
00:42:15,336 --> 00:42:20,346
Nabeel Hyatt: I had the exact opposite
thing happen to me last week where I had
742
00:42:20,346 --> 00:42:26,976
a founder email in and I passed over email,
but I wrote a good
743
00:42:26,976 --> 00:42:30,466
little paragraph about why, like,
these are the reasons that
744
00:42:30,486 --> 00:42:31,366
I'm not sure you're going to get there.
745
00:42:32,416 --> 00:42:35,466
And I've gotten, mostly you get
crickets, they're going to move on,
746
00:42:35,496 --> 00:42:37,236
which is totally understandable, right?
747
00:42:37,716 --> 00:42:41,506
The second thing you get is defensive,
angry feedback that I'm just dumb.
748
00:42:41,896 --> 00:42:44,976
Which I'm not sure exactly what
that sales tactic is, but whatever.
749
00:42:45,656 --> 00:42:48,236
I got back from that
founder: you might be right.
750
00:42:49,586 --> 00:42:52,176
Here are the things that I
think I've worked through
751
00:42:52,656 --> 00:42:57,176
to try and prove what you're saying
wrong, and then gave a couple of little
752
00:42:57,176 --> 00:43:00,086
notes of the other things they tried
that don't come out, of course, in
753
00:43:00,086 --> 00:43:03,676
the one paragraph pitch of the other
versions of that business over time,
754
00:43:03,856 --> 00:43:05,506
the struggles they've had, and so forth.
755
00:43:05,906 --> 00:43:08,966
I mean, I got on a Zoom with that
person, like, three hours later.
756
00:43:10,646 --> 00:43:11,176
Fraser Kelton: Yep.
757
00:43:11,316 --> 00:43:12,116
Nabeel Hyatt: I was like, Oh, you're
758
00:43:12,206 --> 00:43:12,556
Fraser Kelton: get it.
759
00:43:12,586 --> 00:43:13,196
I get it.
760
00:43:13,506 --> 00:43:14,156
Nabeel Hyatt: this business.
761
00:43:14,176 --> 00:43:16,096
And you're authentically
trying to engage with me on it.
762
00:43:16,096 --> 00:43:17,086
And you're not combative about it.
763
00:43:17,086 --> 00:43:18,376
You're just having a
conversation about it.
764
00:43:18,376 --> 00:43:19,766
Like, awesome.
765
00:43:20,016 --> 00:43:22,796
And I, you know, didn't turn
into an investment that day.
766
00:43:22,836 --> 00:43:25,756
It may in the future we'll
see, but I certainly hold that
767
00:43:25,756 --> 00:43:26,946
founder in really high regard.
768
00:43:27,946 --> 00:43:28,356
Fraser Kelton: I get it.
769
00:43:28,406 --> 00:43:34,576
It was amazing to see, some depth
of experience such that you just,
770
00:43:35,266 --> 00:43:39,956
you knew based on two sentences
that something wasn't right.
771
00:43:39,956 --> 00:43:44,116
And it just reinforced what I had
been told when I was a founder.
772
00:43:44,641 --> 00:43:46,091
Don't bother, right?
773
00:43:46,416 --> 00:43:48,796
Nabeel Hyatt: It's similar when
you're pitching and you don't,
774
00:43:48,886 --> 00:43:51,786
you're trying to gloss over the
particular risks or problem with your
775
00:43:51,786 --> 00:43:54,386
startup. The old real estate trick
776
00:43:54,741 --> 00:43:57,771
that realtors use is when they
show you a house: they list all
777
00:43:57,771 --> 00:44:00,081
the wonderful things, and then when
they're doing the walkthrough,
778
00:44:00,091 --> 00:44:03,921
they talk about the one thing that's
the problem with this house, and what
779
00:44:03,921 --> 00:44:06,641
they're trying to do is focus your time
and energy on the one thing so you
780
00:44:06,641 --> 00:44:07,901
don't think of the 30 other things.
781
00:44:08,291 --> 00:44:10,881
That's very different from
authentically having a conversation
782
00:44:10,881 --> 00:44:12,361
with an investor about your business.
783
00:44:12,671 --> 00:44:15,191
But similarly, these are
early stage startups.
784
00:44:15,421 --> 00:44:17,831
There's no way nothing is
wrong with your business.
785
00:44:18,126 --> 00:44:18,446
Fraser Kelton: Yeah.
786
00:44:18,881 --> 00:44:21,691
Nabeel Hyatt: And so you might as
well talk about the things that you
787
00:44:21,701 --> 00:44:24,831
think are really risky or are broken
or that you haven't figured out yet.
788
00:44:25,386 --> 00:44:27,866
Because the right investor is going to
be the person that's going to be like,
789
00:44:28,156 --> 00:44:31,356
I don't think those are real risks, or
I'm willing to take on that risk, or
790
00:44:32,166 --> 00:44:35,516
like, I think you can solve that risk,
and that's, that's the right way to
791
00:44:35,516 --> 00:44:37,076
have the conversation about the path.
792
00:44:37,366 --> 00:44:40,526
Nobody expects these things to be
totally finished and that's a very,
793
00:44:40,546 --> 00:44:40,936
Fraser Kelton: sure.
794
00:44:41,036 --> 00:44:41,376
Yeah.
795
00:44:41,676 --> 00:44:44,506
Somebody internally here said
that the quickest way to a no
796
00:44:44,546 --> 00:44:47,456
is when there is no risk, right?
797
00:44:47,456 --> 00:44:49,126
Because that's not a venture business.
798
00:44:49,126 --> 00:44:50,416
That's, that's not for us.
799
00:44:50,516 --> 00:44:53,396
Nabeel Hyatt: When a founder feels like
they know the problems they think they've
800
00:44:53,426 --> 00:44:59,136
kind of solved, and the areas where they're
self-reflective and self-aware enough to
801
00:44:59,136 --> 00:45:02,016
realize they've got a lot of work to do,
802
00:45:02,606 --> 00:45:06,826
then you can have an open and honest
conversation about doing that work.
803
00:45:07,146 --> 00:45:07,536
Fraser Kelton: Mm hmm.
804
00:45:08,206 --> 00:45:11,026
I also think the challenge here is
that every firm is different, right?
805
00:45:11,226 --> 00:45:15,026
And so, whenever anybody has
shown us a demo, you see almost
806
00:45:15,026 --> 00:45:16,556
everybody in the room lean forward.
807
00:45:16,566 --> 00:45:21,646
And when people have had stilted
presentation pitch mode, everybody's,
808
00:45:21,676 --> 00:45:23,176
you know, kind of in lean back.
809
00:45:24,051 --> 00:45:26,751
Nabeel Hyatt: At Spark, we like
demos, we like talking product, and
810
00:45:26,911 --> 00:45:29,801
that's, you know, but you're right,
that's not how it is at every shop.
811
00:45:30,151 --> 00:45:32,241
That's not, that's not how
lots of investors operate.
812
00:45:33,491 --> 00:45:36,141
Fraser Kelton: That would
probably be the challenge here is
813
00:45:36,151 --> 00:45:37,741
everybody operates differently.
814
00:45:37,831 --> 00:45:42,881
That's where you have an opportunity in
these moments to find the person that
815
00:45:42,881 --> 00:45:44,491
you want to be with for a long time.
816
00:45:44,581 --> 00:45:44,841
Right?
817
00:45:44,841 --> 00:45:48,431
So there, there will, there are
different founders who appreciate
818
00:45:48,451 --> 00:45:49,671
different types of techniques too.
819
00:45:49,726 --> 00:45:52,546
Nabeel Hyatt: It can feel from the
fundraising side, and I certainly felt
820
00:45:52,546 --> 00:45:55,936
it as a founder, like, I just want
somebody to give me a first term sheet.
821
00:45:55,956 --> 00:45:58,116
I'm just trying to raise
capital, whoever it can be.
822
00:45:58,686 --> 00:46:03,236
But that's a little bit like in today's
age, like applying to college by just
823
00:46:03,236 --> 00:46:06,826
saying, I, I really love your school
because it's a great school for learning.
824
00:46:06,856 --> 00:46:09,286
And it's, that's not a great
way to get into college.
825
00:46:09,586 --> 00:46:13,436
I had an admissions person at NYU tell
me that they, the admissions people
826
00:46:13,436 --> 00:46:17,066
there, they always do a thing where they
cover up, why do you want to go to NYU?
827
00:46:17,336 --> 00:46:23,026
And if you could put
Columbia instead of NYU in the answer, then
828
00:46:23,026 --> 00:46:24,996
that's not the person for NYU, right?
829
00:46:25,436 --> 00:46:25,856
Fraser Kelton: Yep.
830
00:46:25,956 --> 00:46:26,506
Yep.
831
00:46:26,556 --> 00:46:28,726
Nabeel Hyatt: You know, it's, if
it's, I love the opportunity for
832
00:46:28,726 --> 00:46:31,606
internships in the dynamic city
and, you know, stuff like that.
833
00:46:31,606 --> 00:46:32,776
It's like, that's not really about NYU.
834
00:46:32,796 --> 00:46:33,566
That's about New York.
835
00:46:33,916 --> 00:46:37,871
So I think similarly, when fundraising,
the thing I got to in the latter
836
00:46:38,101 --> 00:46:44,041
half of my third, fourth startup in
fundraising was, um, I'm going to pitch
837
00:46:44,101 --> 00:46:47,138
the way I want to pitch, not the way
my founder friends tell me to pitch.
838
00:46:47,138 --> 00:46:50,941
And I'm going to pitch in a way that is
authentically me and the way that I
839
00:46:50,941 --> 00:46:53,821
want to talk about how I want to raise,
run this company, the culture I want
840
00:46:53,821 --> 00:46:55,481
to build, the problems my startup has.
841
00:46:55,791 --> 00:46:58,031
I'm just going to lay it
on the table authentically.
842
00:46:58,301 --> 00:47:00,431
And then the job isn't to find
843
00:47:00,931 --> 00:47:01,851
50 term sheets.
844
00:47:02,301 --> 00:47:05,251
The job is to find one or two term sheets.
845
00:47:05,491 --> 00:47:09,431
If I, if I can pitch the way I want
to pitch my business, I'll get lots
846
00:47:09,431 --> 00:47:11,651
of strong no's, but one strong yes.
847
00:47:12,166 --> 00:47:17,556
And lots of strong no's but one or
two or three strong yes's is ten
848
00:47:17,556 --> 00:47:20,406
times more valuable than a bunch of "meh,
849
00:47:20,886 --> 00:47:25,336
this seemed okay," because those
don't lead to board seats and checks
850
00:47:25,386 --> 00:47:28,856
and people who are going to join
your cause for the next ten years.
851
00:47:29,478 --> 00:47:33,118
Fraser Kelton: So having listened to
that and then having a moment to reflect,
852
00:47:33,168 --> 00:47:37,978
The thing that I would do differently
that I think would have a material impact
853
00:47:38,813 --> 00:47:43,793
is to have a very authentic opening
as to why I was excited to have
854
00:47:43,793 --> 00:47:47,723
this conversation with this specific
person in this specific firm.
855
00:47:48,403 --> 00:47:51,313
You and I had the joy of sitting
with that founder a couple of months
856
00:47:51,313 --> 00:47:54,813
ago now, who said, I'm excited
at the prospect of working with
857
00:47:54,813 --> 00:47:57,983
Spark because you have a history of
supporting founders doing brave things.
858
00:47:58,013 --> 00:48:00,573
And I know that that worked
859
00:48:00,708 --> 00:48:05,198
on you and me because we independently
said it with other people after the fact,
860
00:48:05,369 --> 00:48:05,539
Nabeel Hyatt: good
861
00:48:06,068 --> 00:48:08,708
Fraser Kelton: And it was,
it was great sales.
862
00:48:08,738 --> 00:48:12,248
It, but it was, well, just like
any great sales, it was authentic
863
00:48:12,248 --> 00:48:13,898
and it, and it resonated, right?
864
00:48:13,984 --> 00:48:16,914
Nabeel Hyatt: and you could feel it in
the tone of when they were saying that,
865
00:48:17,224 --> 00:48:18,784
it was something they were really feeling.
866
00:48:19,034 --> 00:48:22,324
What do you do when you're
pitching X, Y, Z fund
867
00:48:22,904 --> 00:48:26,744
That you don't really know why,
why you're talking to them.
868
00:48:26,754 --> 00:48:29,604
You just, you can't figure out
the most amazing, there's no
869
00:48:29,604 --> 00:48:31,804
obvious amazing reason why you're
talking to them in the first place.
870
00:48:32,679 --> 00:48:35,574
Fraser Kelton: Why are you wasting
either party's time?
871
00:48:36,329 --> 00:48:37,349
Is the first one.
872
00:48:37,939 --> 00:48:43,179
If you can't put in 15 minutes of thought
and research and come up with one reason,
873
00:48:44,349 --> 00:48:45,449
then why are you talking to that person?
874
00:48:45,999 --> 00:48:49,039
Nabeel Hyatt: There are people that like,
there are CEOs that like to build very
875
00:48:49,039 --> 00:48:52,819
long spreadsheets of the 40 people that
they're going to go through and talk to.
876
00:48:52,859 --> 00:48:58,919
And look, there are times where
fundraising really is that for CEOs, where it was
877
00:48:58,919 --> 00:49:04,819
the 38th person in the Excel spreadsheet
that you got to that raised the round.
878
00:49:04,829 --> 00:49:08,849
Personally, I have had rounds where
I have had to do that in the past.
879
00:49:09,259 --> 00:49:12,109
But I think that is different
from being casual about it.
880
00:49:12,739 --> 00:49:14,409
And I think that's what this is about.
881
00:49:15,684 --> 00:49:21,414
Long-term relationships, these are
big decisions for the person on the
882
00:49:21,454 --> 00:49:26,834
other side, and they can feel it
when the work hasn't been put in.
883
00:49:26,934 --> 00:49:31,834
And so, I know for a lot of founders
raising money can feel like a quote
884
00:49:31,834 --> 00:49:34,764
unquote distraction and I want to
get back to quote unquote work.
885
00:49:35,134 --> 00:49:40,394
And I've always really hated that
phrasing because You know, getting
886
00:49:40,394 --> 00:49:44,174
rid of a board member is like 10
times harder than getting divorced.
887
00:49:44,174 --> 00:49:48,174
Like, you're, you're recruiting
somebody that is going to be
888
00:49:48,414 --> 00:49:49,954
with you for a really long time.
889
00:49:49,954 --> 00:49:52,494
Like, you should put the time and
effort in the same way that you would
890
00:49:52,494 --> 00:49:54,204
to recruit a CTO or anybody else.
891
00:49:55,178 --> 00:49:56,328
Fraser Kelton: Oh, for sure, right?
892
00:49:56,328 --> 00:50:01,433
It is, it's a byproduct of COVID
maybe where it became speed dating
893
00:50:01,453 --> 00:50:05,843
and the whole community went crazy,
but the idea that you would sign
894
00:50:05,843 --> 00:50:08,083
up for this level of intense
895
00:50:09,033 --> 00:50:13,503
camaraderie without having an
investment seems rather silly.
896
00:50:13,733 --> 00:50:15,143
You know, I got good news for you.
897
00:50:15,233 --> 00:50:16,223
I got good news for you.
898
00:50:16,728 --> 00:50:17,218
Nabeel Hyatt: What's up?
899
00:50:18,218 --> 00:50:19,118
Fraser Kelton: I googled superpowered.
900
00:50:21,113 --> 00:50:21,623
Nabeel Hyatt: Mm hmm.
901
00:50:21,683 --> 00:50:24,023
Fraser Kelton: And I'm, I'm reading
an article and we'll come back
902
00:50:24,023 --> 00:50:27,233
to the product, but I wanna, I
want to, I wanna read it with you.
903
00:50:27,643 --> 00:50:31,543
The company says they are not
shutting down the initial product,
904
00:50:32,353 --> 00:50:32,913
Nabeel Hyatt: Yes!
905
00:50:34,980 --> 00:50:35,220
Fraser Kelton: All right.
906
00:50:35,220 --> 00:50:36,210
Let's take a step back.
907
00:50:36,380 --> 00:50:39,140
Tell us about superpowered
and why you're over the moon.
908
00:50:39,145 --> 00:50:39,275
Mm,
909
00:50:40,250 --> 00:50:45,790
Nabeel Hyatt: So, this is a great
segue into product of the week. I have
910
00:50:45,790 --> 00:50:49,310
been trying to record most of my life
on a daily basis, more and more of my
911
00:50:49,310 --> 00:50:54,030
life, and try and summarize it and make
it searchable and so forth. Superpowered
912
00:50:54,030 --> 00:51:01,360
started out as kind of a
meeting bot helper company, actually
913
00:51:01,370 --> 00:51:07,360
prior to GPT, and then post ChatGPT,
they turned it into an AI note taker
914
00:51:07,580 --> 00:51:11,430
for your Zoom meetings or your Google
Meet meetings and so on and so forth.
915
00:51:11,720 --> 00:51:17,080
Now if that sounds like 30 other startups,
that is because there are like 30 other
916
00:51:17,080 --> 00:51:24,070
startups that are also AI note-taking
startups, folks like Fireflies, and I
917
00:51:24,080 --> 00:51:26,600
think Gong does this for salespeople.
918
00:51:26,880 --> 00:51:31,520
And you could just go open the Zoom
app store and take a look through.
919
00:51:31,520 --> 00:51:36,260
And by the way, Zoom itself has
natively launched summarization
920
00:51:36,260 --> 00:51:38,780
as well to take notes while
you're inside of your meetings.
921
00:51:39,380 --> 00:51:42,620
And so the question is, why am
I excited about superpowered not
922
00:51:42,620 --> 00:51:45,350
dying when all these things exist?
923
00:51:45,360 --> 00:51:49,250
Everyone's going to launch a
version of a product, and they're
924
00:51:49,250 --> 00:51:52,640
all going to be noisy, but the real
question is, who's done it right?
925
00:51:52,680 --> 00:51:55,700
And at least in my personal view,
I've tried all of these products.
926
00:51:56,145 --> 00:52:00,365
And none of them are good enough that
I would ever use them week after week
927
00:52:00,365 --> 00:52:02,535
after week, except for SuperPowered.
928
00:52:03,435 --> 00:52:05,085
Fraser Kelton: What,
what is super powered?
929
00:52:05,865 --> 00:52:07,155
Nabeel Hyatt: You mean,
what does the product do?
930
00:52:07,625 --> 00:52:11,255
Fraser Kelton: You just said that it's
all of the small things that they've
931
00:52:11,255 --> 00:52:15,515
done right that make it stand out from a
different AI, like, transcription service.
932
00:52:16,115 --> 00:52:19,805
Like, isn't it, don't you just want
it to do reliable transcription?
933
00:52:20,385 --> 00:52:20,815
Nabeel Hyatt: No.
934
00:52:21,175 --> 00:52:24,725
First of all, nobody wants to look
at the transcription of any meeting.
935
00:52:25,415 --> 00:52:29,395
There is no way that I want to
actually look through all of the
936
00:52:29,395 --> 00:52:34,215
ridiculous things that I talk about
every single day, word by word.
937
00:52:34,535 --> 00:52:39,645
What you really want is summarization,
and what you really want is action items.
938
00:52:40,225 --> 00:52:43,175
And the execution on that
summarization and the execution on
939
00:52:43,175 --> 00:52:45,185
those action items is what matters.
940
00:52:45,585 --> 00:52:47,875
And it just turns out that
there's actually wildly
941
00:52:47,875 --> 00:52:50,925
variant execution on that job.
942
00:52:51,295 --> 00:52:55,195
The two problems in particular that I
have with most of the other products
943
00:52:55,195 --> 00:53:01,085
that do summarization are, first,
they run inside of Zoom as an app.
944
00:53:01,585 --> 00:53:03,855
And I don't want Zoom
having control over it.
945
00:53:03,955 --> 00:53:06,105
I want my desktop to have control over it.
946
00:53:06,595 --> 00:53:09,915
And so, SuperPowered is a
desktop app, not a Zoom app.
947
00:53:10,475 --> 00:53:12,595
That's the first thing, and it matters a lot.
948
00:53:12,730 --> 00:53:13,720
Fraser Kelton: that's a big difference.
949
00:53:13,895 --> 00:53:16,525
Nabeel Hyatt: and it allows
them, in particular, to add
950
00:53:16,535 --> 00:53:18,235
new interfaces, new chrome.
951
00:53:18,245 --> 00:53:20,825
They can iterate on it,
like, 50 times faster
952
00:53:21,205 --> 00:53:25,385
than trying to be one button on the
toolbar on the bottom of, of Zoom.
953
00:53:25,825 --> 00:53:28,798
It also means a user doesn't have
to ask corporate overlords whether
954
00:53:28,798 --> 00:53:33,828
they will approve this, this
app to run on their infrastructure.
955
00:53:34,058 --> 00:53:36,928
Which I think is a real thing
we've got to think about in AI.
956
00:53:37,028 --> 00:53:41,238
A quick aside, I was talking to my
friend who works at Amazon and he said,
957
00:53:41,548 --> 00:53:44,818
you know, we talk in these podcasts
of all these wonderful products and he
958
00:53:44,828 --> 00:53:48,088
goes, he just, he's like, let me just remind
you what's going on in real life.
959
00:53:48,288 --> 00:53:54,098
Every time he even goes to use ChatGPT,
Amazon internally puts up this big
960
00:53:54,158 --> 00:53:58,188
prompt that yells at him and says,
listen, just so you know, if you put
961
00:53:58,218 --> 00:54:00,878
any confidential information into this
product, we will come and kill you.
962
00:54:01,168 --> 00:54:04,838
He can't have things scraping his
email to summarize them properly.
963
00:54:05,043 --> 00:54:09,533
You can't have products that are
helping arrange meetings through AI,
964
00:54:09,543 --> 00:54:10,793
Amazon's not letting that stuff happen.
965
00:54:10,843 --> 00:54:11,763
They're on lockdown.
966
00:54:12,143 --> 00:54:12,153
Oh.
967
00:54:12,473 --> 00:54:16,133
And anyway, so, but Superpowered
runs on your desktop, that's the first
968
00:54:16,133 --> 00:54:19,703
thing, and so I have control over
its use, not my corporate overlords.
969
00:54:20,063 --> 00:54:23,103
And then the second thing and again, it
kind of maybe goes back to this previous
970
00:54:23,103 --> 00:54:28,023
conversation about just understanding
where we are on the entropy curve, is
971
00:54:28,023 --> 00:54:29,533
that they let you edit the prompts.
972
00:54:30,613 --> 00:54:33,203
So they have, they have meeting types.
973
00:54:33,453 --> 00:54:36,443
So for instance, if I'm meeting
a new company, I have a meeting
974
00:54:36,443 --> 00:54:38,033
type called New Company.
975
00:54:38,053 --> 00:54:40,873
And then when I meet with the founders we
work with, that's called Founders, right?
976
00:54:40,913 --> 00:54:44,583
And, and the notes I want to take away,
the takeaway from each of these types
977
00:54:44,583 --> 00:54:46,483
of meetings is remarkably different.
978
00:54:47,033 --> 00:54:51,663
And of course they have prompts that they put
in there that are starter prompts for the
979
00:54:51,663 --> 00:54:53,153
noob who doesn't know what they're doing.
980
00:54:53,353 --> 00:54:57,358
But inevitably, probably for
90 percent of people today, like,
981
00:54:57,358 --> 00:54:58,458
something's wrong about that.
982
00:54:58,488 --> 00:55:01,998
It says something in there that I
want that's not right for me, and it
983
00:55:01,998 --> 00:55:06,278
lets you open up the prompt, edit the
prompt, and get what you want out of
984
00:55:06,278 --> 00:55:10,538
it, which is the difference between
something that is kind of like, meh and
985
00:55:10,538 --> 00:55:15,848
okay, and it like gave me a couple of
interesting summarization topics and
986
00:55:15,848 --> 00:55:19,878
titles, versus feeling like an active
participant in making this thing work.
987
00:55:20,388 --> 00:55:24,528
Fraser Kelton: Don't you think that
this is also maybe why they're pivoting
988
00:55:24,528 --> 00:55:28,793
away from it, in the sense that this
is a really hard problem, right?
989
00:55:29,253 --> 00:55:33,243
I was just thinking that the diversity
of meetings that people have and then
990
00:55:33,253 --> 00:55:37,983
the preferences of workflows across
those different types of meetings means
991
00:55:37,983 --> 00:55:42,713
that there's like an explosion in, quote
unquote, getting this to work well.
992
00:55:42,783 --> 00:55:46,213
Nabeel Hyatt: I think there's two
points there worth touching on.
993
00:55:47,083 --> 00:55:50,573
The first of which is that,
look, of course it's a problem.
994
00:55:51,123 --> 00:55:52,303
It's a problem.
995
00:55:52,803 --> 00:55:56,893
That there are lots of different use
cases in meetings, and it's a problem that
996
00:55:56,893 --> 00:56:01,183
this is a really busy market with lots of
competition, so it's hard to stick out.
997
00:56:02,243 --> 00:56:04,743
If a startup doesn't want to solve
problems, then what are they doing?
998
00:56:05,333 --> 00:56:12,083
Like, I, I do worry sometimes that we,
we try and avoid all of the risk in
999
00:56:12,083 --> 00:56:16,083
our startups when problems existing
out in the world is why startups have
1000
00:56:16,083 --> 00:56:17,823
a chance to exist in the first place.
1001
00:56:18,738 --> 00:56:21,558
So you have to pick your
proper problems, but,
1002
00:56:21,653 --> 00:56:22,023
Fraser Kelton: Huh.
1003
00:56:22,103 --> 00:56:25,393
Nabeel Hyatt: but yeah, let's, but let's
spend a little time figuring out if I'm
1004
00:56:25,393 --> 00:56:29,233
a startup, how to solve this problem,
because if we can solve it, then it's
1005
00:56:29,303 --> 00:56:34,503
perfectly obvious for the 18 or 20
other AI meeting note companies that
1006
00:56:34,503 --> 00:56:36,053
they have not solved this problem yet.
1007
00:56:36,243 --> 00:56:37,043
So if I have a
1008
00:56:37,733 --> 00:56:37,913
Fraser Kelton: Mm
1009
00:56:38,143 --> 00:56:40,333
Nabeel Hyatt: then suddenly I can
explain that breakthrough very
1010
00:56:40,333 --> 00:56:43,033
clearly to my customers, and I now
have an advantage in the market.
1011
00:56:43,913 --> 00:56:43,923
Fraser Kelton: Hmm.
1012
00:56:44,148 --> 00:56:44,358
Nabeel Hyatt: So,
1013
00:56:45,023 --> 00:56:45,563
Fraser Kelton: Good, good point.
1014
00:56:45,613 --> 00:56:46,123
That's fair.
1015
00:56:46,228 --> 00:56:51,428
Nabeel Hyatt: and then my second point is
somewhat related, but I think it's also
1016
00:56:51,868 --> 00:56:53,878
back to the entropy, de-entropy thing.
1017
00:56:53,908 --> 00:56:57,638
If you, if you honestly think that
we are still at the point where we're
1018
00:56:57,638 --> 00:57:01,288
trying to make all these AI products
work, then just accept the idea that
1019
00:57:01,288 --> 00:57:06,408
AI products are early adopter
products for right now, and they will
1020
00:57:06,408 --> 00:57:10,658
not be early adopter products forever. Only
idiots, crazy people, only crazy people
1021
00:57:10,658 --> 00:57:12,908
like you or me might try and play with
1022
00:57:13,548 --> 00:57:17,568
An early adopter across every
single vertical and horizontal,
1023
00:57:17,568 --> 00:57:21,818
every single week, to see how all
of these tools are developing.
1024
00:57:22,238 --> 00:57:27,808
But early adopter doesn't just
mean nerd. It means that for
1025
00:57:27,828 --> 00:57:30,928
somebody, this problem is so acute
1026
00:57:31,893 --> 00:57:32,423
Fraser Kelton: Mm hmm.
1027
00:57:32,918 --> 00:57:36,118
Nabeel Hyatt: they will be an early adopter
to try and figure out the solution.
1028
00:57:36,463 --> 00:57:42,183
And that early adopter customer will
help you find the solution if
1029
00:57:42,183 --> 00:57:44,183
you give them the tools to work on it.
1030
00:57:44,683 --> 00:57:50,643
And as an, as an example, you know,
Adept, which we're an investor in,
1031
00:57:50,643 --> 00:57:54,603
they create an action
transformer model, and they've released
1032
00:57:54,913 --> 00:58:00,303
a workflow tool for building your own
little webpage navigator to take actions
1033
00:58:00,303 --> 00:58:03,173
on a webpage and do little workflows.
1034
00:58:03,643 --> 00:58:07,743
Now, the model itself is not the large
model they'll be launching relatively
1035
00:58:07,743 --> 00:58:13,513
soon, so it's an earlier model, and
the workflow tool itself is, let's
1036
00:58:13,513 --> 00:58:20,663
be honest, like, kind of hard to use,
clearly an R&D product and definitely
1037
00:58:20,663 --> 00:58:23,753
not a late adopter product that I
would give my mother or father, right?
1038
00:58:24,313 --> 00:58:24,873
But
1039
00:58:25,003 --> 00:58:25,563
Fraser Kelton: mm hmm,
1040
00:58:25,593 --> 00:58:29,118
Nabeel Hyatt: for the people for whom
those workflows are really,
1041
00:58:29,128 --> 00:58:31,838
really acute problems in their lives.
1042
00:58:32,198 --> 00:58:34,618
They are going to trudge through it.
1043
00:58:35,058 --> 00:58:41,568
And then you will learn with your customer
versus in some R&D lab somewhere where
1044
00:58:41,568 --> 00:58:45,448
your assumptions about your customer
are wrong, which is the right way to
1045
00:58:45,448 --> 00:58:46,698
build when you're early in a market.
1046
00:58:48,023 --> 00:58:54,163
Fraser Kelton: Mm hmm. The challenges that
exist today are, you know, normal in terms
1047
00:58:54,163 --> 00:58:58,883
of trying to figure out how to solve a
large, meaningful problem and that it's
1048
00:58:58,883 --> 00:59:04,203
a shame, uh, if that's the reason why
a group who had a little bit of an edge
1049
00:59:04,203 --> 00:59:08,223
on it is, is likely to not be investing
too actively into trying to solve it.
1050
00:59:08,271 --> 00:59:11,071
Nabeel Hyatt: And look, we don't
know the SuperPowered AI founders.
1051
00:59:11,131 --> 00:59:13,891
I don't know where they are in
funding or their traction or their
1052
00:59:13,891 --> 00:59:16,971
progress or what excites them
and gets them up in the morning.
1053
00:59:17,131 --> 00:59:21,601
I'm just a consumer of the product. But
all I know is that everybody here should
1054
00:59:21,601 --> 00:59:26,031
go to SuperPowered AI and give them
money so that they stay in business,
1055
00:59:26,041 --> 00:59:27,831
so that I can keep using the product.
1056
00:59:29,426 --> 00:59:31,076
Fraser Kelton: You know, I
just skimmed the article.
1057
00:59:31,166 --> 00:59:34,956
It says that it's hard to
differentiate in this type of a
1058
00:59:34,956 --> 00:59:36,766
market to have sustained growth.
1059
00:59:37,456 --> 00:59:40,066
They are profitable, and so
they hope to find somebody who
1060
00:59:40,066 --> 00:59:41,406
will just continue to run it.
1061
00:59:41,706 --> 00:59:47,106
But they're pivoting to become an API
provider for anybody to create a natural
1062
00:59:47,106 --> 00:59:49,506
sounding voice based AI assistant.
1063
00:59:50,113 --> 00:59:55,883
Nabeel Hyatt: That is also a busy space,
of course, but, you know, it's also
1064
00:59:55,883 --> 00:59:58,333
very possible that's just a problem that
they are more excited about solving.
1065
00:59:59,183 --> 01:00:02,833
And they will work through the
hard difficulties of that particular
1066
01:00:02,833 --> 01:00:09,033
problem with more verve and with
more passion than they have for
1067
01:00:09,113 --> 01:00:10,363
meeting notes, which is fine.
1068
01:00:11,193 --> 01:00:14,233
Fraser Kelton: You know, yeah, people
don't often talk about that, right?
1069
01:00:14,243 --> 01:00:19,463
You go through a pivot and you're
doing it for a lot of, you know,
1070
01:00:19,463 --> 01:00:25,873
logical reasons, but you might either
pivot into or away from an idea
1071
01:00:25,873 --> 01:00:27,483
that you actually care deeply about.
1072
01:00:27,843 --> 01:00:29,183
Nabeel Hyatt: Yeah, it's not a short road.
1073
01:00:30,223 --> 01:00:32,443
Yeah, let's be done for today.
1074
01:00:32,443 --> 01:00:33,493
I
1075
01:00:33,588 --> 01:00:33,858
Fraser Kelton: Do it.
1076
01:00:34,273 --> 01:00:35,613
Nabeel Hyatt: think we went
through some good stuff.
1077
01:00:36,453 --> 01:00:37,833
Go download SuperPowered.
1078
01:00:38,643 --> 01:00:39,463
Give it a try.
1079
01:00:39,893 --> 01:00:41,698
We'd love to hear from
you on that product.
1080
01:00:41,728 --> 01:00:44,231
I'll also add there's a couple
other products that have launched
1081
01:00:44,231 --> 01:00:46,691
that are allowing you to do live
AI drawing if you wanna try one.
1082
01:00:46,741 --> 01:00:51,571
Leonardo AI launched a pretty good
live canvas feature where you can draw
1083
01:00:51,571 --> 01:00:54,601
on one side through a prompt and it
redraws it on the right hand side.
1084
01:00:55,636 --> 01:00:58,741
If you want to give it a shot
and then we will see you all.
1085
01:01:00,221 --> 01:01:01,601
Are we doing one next week, Fraser?
1086
01:01:02,028 --> 01:01:02,388
Fraser Kelton: I don't know.
1087
01:01:02,438 --> 01:01:02,828
Let's see.
1088
01:01:02,868 --> 01:01:03,728
We'll see in the future.
1089
01:01:03,808 --> 01:01:04,688
We, we're not sure.
1090
01:01:05,558 --> 01:01:06,558
Nabeel Hyatt: We'll see
you all in the future.
1091
01:01:08,138 --> 01:01:08,298
Bye.
1092
01:01:08,608 --> 01:01:09,088
Fraser Kelton: See ya.