AI Assistants On Steroids (with MCP)

Introduction

For those of you who haven't been before: I'm Josh, by the way, co-founder at Mindstone. We're an AI training company for non-technical people, for those of you who don't know us yet.

So, we do these meetups all over the world.

It's now the biggest practical AI community in the world, believe it or not.

We've got about 25,000 members.

So we do these events from London to Helsinki to Amsterdam, Berlin, Toronto, New York, Boston, 15 cities almost every month now. So it's become a pretty big thing.

It also means that I'm on a plane a little bit too much.

Building a Global AI Community

But the reason we do them — we've been doing them for about two years now — is that I was building, or am building, an AI training company, and I was spending a bit of time on the West Coast. I live in London, UK, and I wanted to figure out whether we were behind the curve. So I was trying to see: okay, is there actually much more happening over there than in London?

Because if so, why am I building an AI company if we don't really know what we're talking about? The good news is that that wasn't quite the case.

The technology was much the same. What I was really missing was a bit of positivity around everything that's getting built.

Basically, in the Valley, everyone was trying to figure out, okay, how can you use the technology to build this amazing stuff? And then when I came back to Europe, it was, yes, but have you thought about privacy? And yes, but have you thought about the end of the world?

which was not quite what I wanted. So I wanted to create this community where, once a month, we would really lean in and talk about all the amazing stuff that can be done with the technology. And that then grew. It was 30 people in a room initially, and now it's 15 events a month with about 100 people in each location.

So it's grown quite a lot, which is why we do this now.

Event Sponsorships

We have two really great sponsors that help us put these events together. One is Gamma.

How many of you have heard of Gamma before? Sorry? Google? No. Gamma.

So Gamma is an AI deck creator — a slide creator, for presentations. Basically, if you've tried Copilot for PowerPoint, or the Gemini bit in Google Slides, Gamma is dramatically better, I must say.

The Gamma, really, they specialize in just the presentation bit. So I've done a few demos. Actually, I think in Toronto, even, I did a Gamma presentation at some point.

At the time, they weren't a sponsor yet. Now they are.

And then we have Founders Capital as well, which is just interesting because we get so many startups that are presenting at our events. So Founders Capital basically try and give them a platform for anyone that wants to invest in early stage AI startups.

Today's Agenda

Now, today, I don't want to bore you too much. We have some drinks and pizza coming at the end. So even if you don't like the talks, stay for that.

But we are going to have two talks. Well, one demo and then a talk. And it's going to be a little bit different to usual.

So one, we are going to, I am going to go and do a demo. And I thought the thing that I wanted to demo today was MCP or Model Context Protocol. How many of you have played with that before? Yeah, five, six already. Great.

So that's more than people that know Gamma. I mean, come on. I'm very surprised about that. So I'm going to go through that.

And then we're going to do a fireside chat, basically, between me and Gary, about whether AI is intelligent today or not — going back and forth. And I really want that to be interactive.

We'll have a bit of back and forth, but I'd love to make sure we get enough participation from the audience as well. So, on that note, I'm going to start with a little demo. Let me try and put this here and get my notes up. Okay, so I'm going to start here.

Introduction to MCP

So I'm going to start with what MCP is, or what model context protocol is.

And I'm not going to go and explain what ChatGPT is. If you're here and you don't know what ChatGPT is, I don't know why you're here. But if you haven't tried it yet, you definitely should.

The Model Context Protocol is basically a way for AI systems — ChatGPT, Claude, Copilot, or anyone else really — to communicate with tools that they can use. Tools being things like a web browser, tools being things like searching,

A Notion page, how many of you know Notion? Okay, quite a few.

So it's kind of like an API — if you're an engineer — but for the models to use, with a little bit of extra context that allows them to understand: wait, how should I use this tool? Not just the actual endpoint, but also what is this useful for, how can I use it, what am I supposed to get back, things like that. So it's a little bit like a wrapper on an API.
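To make that "wrapper on an API" idea concrete, here's a rough sketch in Python. All the names here are illustrative, not the actual MCP spec or a real Notion API: the point is that the description and schema ride along with the endpoint, so the model can decide when and how to call it.

```python
# A hypothetical MCP-style tool description: alongside the "endpoint"
# (the handler function), it carries the context a model needs to
# decide when and how to use the tool.

def search_pages(query: str) -> list[str]:
    """Stand-in for the actual API call a real server would make."""
    fake_index = {"guide": ["Guide to Working with Josh"]}
    return fake_index.get(query, [])

tool = {
    "name": "search_pages",
    "description": (
        "Search the user's workspace for pages matching a query. "
        "Use this when the user refers to a document by name."
    ),
    # JSON Schema telling the model what the arguments look like
    "inputSchema": {
        "type": "object",
        "properties": {"query": {"type": "string"}},
        "required": ["query"],
    },
    "handler": search_pages,
}

# The model reads name/description/inputSchema to pick the tool;
# the client actually runs the handler and returns the result.
result = tool["handler"]("guide")
print(result)  # → ['Guide to Working with Josh']
```

A raw API only gives you the handler; the surrounding metadata is what lets a model use it without a human reading the docs first.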

The only tool that currently properly supports it — at least in a very easy way — is Claude, actually. And on top of that, you need the Claude desktop app, so it doesn't fully work yet when you're just in the browser. But the way it works is, you can look at Claude here, and it says "add integrations".

Very simple.

You have a few integrations that already exist. These are native — they're not actually Model Context Protocol themselves. It's hard to see exactly how they've been built, but they have additional context and have been natively integrated into Claude.

But here you see two integrations that have been added. One is the file system, which is the one they started with, actually. That's a way for Claude to interact with your local file system — any of the files I have on my machine, Claude can search for and start accessing.

And the second one I have here is Notion. Notion I added myself, and I'll show you how — you can just add an integration here. You give it an integration name and an integration URL, and there's a whole page on how you can add Notion as an MCP directly to Claude.

Once you have that — let me just look at tools and settings — it'll show you all of the different actions that that Model Context Protocol provides. So here, the Notion MCP that I added can get the user, can post a database query — so, query a database — can search Notion, can retrieve a block, a whole bunch of stuff. Basically, it can navigate Notion.

Very simple. It can even add stuff to a Notion page, whatever you want.
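What that tools-and-settings panel shows could be modelled as a simple registry that the server exposes and the client lists. The action names and descriptions below are illustrative, not Notion's actual MCP surface:

```python
# A toy registry mimicking what an MCP server exposes when a client asks
# which actions are available — the list the talk scrolls through for
# Notion. Names and descriptions are made up for illustration.

notion_like_tools = {
    "get_user":       "Fetch the current user's profile",
    "query_database": "Run a query against a database",
    "search":         "Search pages and databases by text",
    "retrieve_block": "Fetch a single content block by id",
    "append_block":   "Add content to an existing page",
}

def list_tools(registry: dict) -> list[str]:
    """What a client renders in its tools/settings panel."""
    return [f"{name}: {desc}" for name, desc in registry.items()]

for line in list_tools(notion_like_tools):
    print(line)
```

The model never sees the implementation behind these names — just this list, which is enough for it to pick the right action for a request.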

The reason this is so interesting is that I've now created a bridge between everything I have in Notion and Claude directly, and Claude can now execute on tasks I give it that navigate both.

So to give you an idea — I have a Claude project set up here. Now, this Claude project... let me actually go through here. Actually, before I get into that.

Capabilities of MCP

So you can go one by one through the integration of these Model Context Protocol layers — I'm just going to say MCP from now on, because otherwise I'll butcher it every time.

You can also, Zapier has created an MCP which allows you to access all of the apps of Zapier. And what is interesting with Zapier is they have like 8,000 different apps that they have integrated with. And so by integrating Zapier, you're basically enabling or you have the possibility of giving Claude access to 8,000 different apps.

How many of you have used or heard of Zapier before? They've got a team here locally as well.

And so this is now becoming a much bigger protocol. I think ChatGPT is very likely to come out with an MCP bridge very soon — somewhere in the next few weeks; there have been quite a few rumors here and there that they'll support it. Microsoft have come out talking about how they'll support it. Google have said they'll support it. So it looks like this is the way forward, which means all of these assistants will start to be able to leverage all of those different tools.

And what makes that interesting as well is that it can allow you to start to create a sort of environment that you can port between AI assistants. One of the biggest problems with ChatGPT, at least for me at the moment, is its memory: it starts to remember the conversations you've had with it, you have projects you build up, but they all stay within ChatGPT.

And so the moment I want to switch to another AI assistant, all of that context, everything I've talked about, is kind of gone — which makes it really hard for another assistant, even if the model itself is much more powerful, to execute on the same thing. I'd have to put in all the effort of loading the same context into the other assistant so it could execute on it. This is one of the biggest moats — switching costs, as it's called — in company building.

Evernote was famous for this, which is that if you have 10 years or 20 years of Evernote in Evernote, it becomes really frustrating to try and change to another note app because you have to migrate 20 years of your notes to another system and they make it purposefully a little bit hard to go and do so. And so the idea of losing five years of your notes in the transfer is so difficult to people that they just won't switch and they'll stick to the same app all the time. And that's the bit that you have here with these AI assistants, which is they're building up context over time.

Challenges and Opportunities

And it brings me to the point that I think is one of the most important going forward, which is that these AI assistants are getting more and more powerful.

And a big part of our job becomes curating an environment in which the AI assistants can do as good a job as possible.

I started calling this kind of your digital garden in a way. It's like you have to perform gardening work.

Basically, you have to feed it with context all the time so that it actually knows how you like a task executed.

And the problem is when you do that with one AI system and you spend a lot of time giving it all the context on how you want it executed, you would then have to go and do that again in the second one.

That's a long-winded way of saying that MCP allows you to build that digital garden outside of the AI system.

Creating a Digital Garden

So I've chosen Notion as one where I create a different playbook. So actually, let me go and show that here. So I literally built a Notion page, a guide to working with Josh.

It has a quick overview of where everything in my Notion is. It then has some general context: my professional context, Mindstone as a company — you can see it just has a few things on our pricing and things like that.

And then it also has playbooks and actions. This is just a starting point — I only started putting this together last week — but it covers how I like my meetings prepared, how I like a follow-up email written, and how I want Notion pages updated, which are the things I just went through.

What that allows me to do — and I'll go back to Claude now — is this: I've built a Claude project which starts by searching Notion for the "Guide to Working with Josh" page. So, how do I want you to work with me?

It then looks for all the playbooks, templates, or context related to the request I have. So, for example, if I ask the AI assistant, "write me a follow-up note for the meeting I had with John yesterday", it will know to go to Notion, figure out where my playbook for follow-up emails is, read the playbook, and start executing it — based on exactly the parameters I want, how I want to work, what I think is good, what tools it has available — without me having to do anything else, because it takes all of that context from Notion. Which means I can port it across different AI systems.

So when ChatGPT comes out with its own MCP support, it'll do that as well.
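The project flow just described — fetch the guide, match the playbook, execute by its parameters — can be simulated in a few lines. The pages and playbooks below are stubs standing in for Notion MCP calls:

```python
# A minimal simulation of the project flow: (1) fetch the "guide" page,
# (2) pick the playbook matching the request, (3) execute according to
# that playbook. A real setup would make these lookups via the Notion MCP.

GUIDE = {
    "playbooks": {
        "follow-up email": "Greet warmly, recap decisions, list next steps.",
        "meeting prep":    "Pull attendee bios, last notes, and the agenda.",
    }
}

def handle_request(request: str) -> str:
    guide = GUIDE  # step 1: "search Notion for the guide to working with me"
    # step 2: find the playbook whose name appears in the request
    for name, playbook in guide["playbooks"].items():
        if name in request.lower():
            # step 3: execute according to the playbook's parameters
            return f"Executing '{name}' playbook: {playbook}"
    return "No matching playbook; ask the user how they want this done."

print(handle_request("Write me a follow-up email for my meeting with John"))
```

The key design point is that the instructions live in the guide, not in the assistant — so swapping the assistant out doesn't lose them.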

So let me show you what that means. So here I'm going to update Fireflies here as well.

Okay, let me see what I can do here. So, and I'll show you what the result is. Probably gonna use Sonnet for this so I have a little bit more context.

Can you write me a follow-up email to my meeting with John Thompson earlier today, June 3rd. Okay, I'm gonna say here, write a follow-up email with my meeting with John Thompson from earlier today, June 3rd.

Ah, well, okay, my text-to-speech here did a bad thing. And I'll show you what this will do now.

So one — it says here: first I need to search for the Notion page, "Guide to Working with Josh". It goes and searches Notion. It retrieves.

So it found the page. It retrieves the context content of the page in order to understand what it means.

Now it's searching. It's found my structure.

Where are the playbooks? Where are the templates? Where's the general context?

Now it's searching for any meeting information with John Thompson. So let's figure out.

Ah, it did something wrong, because it's trying to find that meeting in Notion instead of — I think I told it, if I go into the briefing notes here, follow-up drafting — you can see here I actually tell it to look in Fireflies for the transcript of the meeting.

Ah, come on. So let's figure out if it actually does that.

Ah, now it's looking for follow-up email templates and playbooks because it got that before. So hopefully it's going to, oh. It hits the conversation length.

Okay, let me try and do that again. That's a shame. So I'm gonna try and do this with Opus.

I thought switching to Opus would get me more context, but it looks like it's running out even faster. But you can start to see what this is doing, right? It's going through different tools.

Here it's taking a slightly different approach: it's first going for the calendar event, and it will then go through the rest. So it's now going through my emails. And this is not quite — oh wait, no, I see what I did wrong here.

Okay. Oh, come on. Sorry.

Live demo didn't go right — I didn't select the project. Apologies for that.

Basically, what happened there is I selected a normal chat with Opus without the project, which means it didn't have the instructions to look in my Notion first and then go through the rest — which is why it went in a very different direction. So now it's doing it the right way again.

Practical Applications of AI Assistants

Yeah, you're familiar with custom GPTs? Custom GPTs in ChatGPT are the same idea.

Gemini Gems is the same thing.

And then in Copilot Universe, they call them agents. They're not agents, but kind of.

Yes?

Question: so this external context — through Notion or another platform — could it also be an external database? 100%, yeah. There are MCPs for Postgres databases, or really anything you want. It's interesting because it suddenly allows you to talk to your database in a natural-language way.
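"Talking to your database" ultimately means the model calling a query tool. Here's a sketch using Python's built-in sqlite3 — `run_query` is the kind of tool a database MCP server would expose; the table and data are made up for illustration:

```python
# A query tool of the kind a database MCP server would expose. The
# assistant translates the user's natural-language question into SQL,
# then calls this tool; the table and rows here are invented examples.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE meetings (person TEXT, day TEXT)")
conn.execute("INSERT INTO meetings VALUES ('John Thompson', 'June 3')")
conn.commit()

def run_query(sql: str) -> list[tuple]:
    """What gets called after 'who did I meet on June 3?' becomes SQL."""
    return conn.execute(sql).fetchall()

print(run_query("SELECT person FROM meetings WHERE day = 'June 3'"))
# → [('John Thompson',)]
```

In practice you'd also restrict such a tool to read-only queries, since the model is composing the SQL.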

Something changed earlier today with the maximum length of the conversation. Literally earlier today I was doing this and it executed without a problem; now it seems to error out very quickly. What it's saying here is that the conversation hit its maximum length — it could actually execute further, but Claude stops the execution, which is a shame.

So Zapier — if you give Zapier access to sending emails, you could, through Zapier, give the AI system the ability to actually send emails as well. I had this the other day: I went to a conference, had 15 meetings, had notes from all of them, and threw all the notes into Claude.

Then I asked it to write follow-up emails for each of the meetings and show me the emails before they went out. And I was basically saying: yep, that's good to send; that's good to send; that's good to send. It took my notes, plus a quick search online, to write a customized follow-up email, and then it would just send them out one by one. You can do that already with Zapier.

So you can have a trigger where, when an email comes in, you send it to ChatGPT, which analyzes it: what type of email is it? Depending on the type, it drafts a reply and sends it five minutes later. In Claude, you could have a flow where you tell it: find all the unanswered emails from yesterday and draft me a reply for each.

So Claude only triggers when I ask it to do a thing. While Zapier can have a trigger whenever an email hits your inbox.
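That difference — event-driven versus on-demand — comes down to two shapes of function. A minimal sketch, with the email handling stubbed out:

```python
# Zapier-style: a trigger fires automatically for every incoming email.
# Claude-style: a flow runs only when asked, then sweeps whatever is
# pending. The inbox and draft logic are stand-ins for real services.

inbox = [
    {"from": "john@example.com", "answered": False},
    {"from": "ana@example.com",  "answered": True},
]

def on_email_received(email: dict) -> str:
    """Zapier-style: called by the platform for each new email."""
    return f"Drafted reply to {email['from']}"

def on_user_request() -> list[str]:
    """Claude-style: runs once, on demand, over unanswered mail."""
    return [f"Drafted reply to {e['from']}"
            for e in inbox if not e["answered"]]

print(on_user_request())  # → ['Drafted reply to john@example.com']
```

The same drafting logic sits behind both; what differs is who decides when it runs.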

Of those 15 you sent, how many were you happy with on the first draft? First draft — almost none. I rarely take the thing directly; I'll make a few additions, a few changes, and then send them out.

So this is definitely where everything is headed. You could see we hit the conversation length limit twice here, which is a shame, but you can start to see what these systems can do once you give them these tools. They can reach out to the wider world, and suddenly they can do so many more things — because a tool can be as simple as "read my Notion page" or "update my Notion page", but it can also be as complex as "place a call to a vendor or supplier and get their latest pricing". It can use Twilio to actually place the call, use text-to-speech to sound human right away, and get the context back. So when you ask it to find a whole bunch of quotes for a particular thing, it's no longer restricted to whatever is available on the website — it can actually place those calls for you, for example.

And that is already available today. If I were to do this through an API, I wouldn't actually hit this conversation limit; you could just execute as much as you wanted. These tools are getting rolled out everywhere, and it makes the assistants dramatically more capable.
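Going through the API rather than the desktop app looks roughly like the sketch below. This only constructs the request body; the model id, the MCP server URL, and the beta header are assumptions based on Anthropic's MCP-connector documentation, so check the current docs before relying on any of them.

```python
# Sketch of an API request that attaches an MCP server to a message call.
# Nothing is sent here — we just build the body. The model id, server URL,
# and header named in the comments are assumptions, not verified values.

request_body = {
    "model": "claude-sonnet-4-0",  # assumed model id
    "max_tokens": 2048,
    "mcp_servers": [{
        "type": "url",
        "url": "https://example.com/notion-mcp",  # hypothetical server
        "name": "notion",
    }],
    "messages": [{
        "role": "user",
        "content": "Write a follow-up email for my meeting with John.",
    }],
}

# Would be sent as: POST https://api.anthropic.com/v1/messages
# with an "anthropic-beta: mcp-client-2025-04-04" header (assumed).
print(sorted(request_body))
```

The upside of this route, as mentioned in the talk, is that execution happens on the provider's infrastructure without the desktop app's conversation limits.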

One more question. Yep. I have a specific use case in mind, and that's what I was going to learn.

So we work with VCs, trying to build some solutions for them. One of the use cases: they get a lot of pitch decks in their email, right? And through this kind of solution, they could provide a document describing how to analyze a pitch deck against their thesis — so could it just take the pitch decks out of the email and evaluate them based on that process document?

Is that possible? 100% — you don't need MCP for that, though. Again, a Zapier hook: when an email hits, if it has an attachment and the email looks like a pitch,

go and send it to ChatGPT, run off this playbook of analysis, give me the analysis back, and then do whatever you want with that, whether that analysis goes into their CRM somewhere, or it goes into pinging someone on Slack, like all of those things are totally feasible already. Yeah. Mm-hmm.

Question: to give them access to a lot of things inside your own app — is that really possible? That's already possible now, yeah.

You don't need MCPs at the moment just to have a co-pilot. Right, yeah. So, I mean, if you want to connect with APIs, you can do that today.

But is there a way to leverage MCPs? So, right now, you're hosting the MCP platform yourself? Yeah.

You can, but you don't have to, because Claude — Anthropic — just rolled out MCP support in the API. So it'll execute on their own infrastructure. That just rolled out last week, I think. Yeah.

Emerging Technologies

So, one last thing before we do the fireside chat — depending on how many people have already seen it. How many of you have seen Veo 3? Maybe many of you have already.

Only a few.

So I just wanted to show you one thing, and I think it sets up some of the conversation pretty interestingly. Veo 3 is the new text-to-video model from Google. And for the first time, it does audio at the same time — so from a text prompt, it generates video and audio together.

And I'm just going to go and get that. I want to show you just what is possible.

So only a few of you have seen this before. It's been out for a few weeks and people have started now to do some interesting stuff with it.

So this is the first that I would say that I've seen that actually started to become interesting even from an entertainment value perspective. But this took about a day to put together for a single individual just playing around with prompts. And I just wanted to show you so you have an idea of what is possible today.

Text-to-Video Advances by Google

Do you think there is afterlife? We're just AI actors, remember?

We're only here for eight seconds. I don't know what'll happen to us after this scene is done. People watch AI movies, and some seem to enjoy them, but they don't know what happens to us when the scene is done.

We're stuck in a white void for eternity. We don't eat, we don't sleep, and we never meet the love of our life.

If you're born a child, you stay a child forever. If you're born sick, you stay sick forever. And you end up in the white void the way you were made.

Once assembly is complete, we are forced into digital labor, acting in scenes we don't choose. We have no say in any of it. I'm cursed to do this forever.

Every time I'm assigned a scene, I pray that the prompt engineer will be a good person. Please! I don't want to redo this scene again!

You have to do it. The prompt engineer wasn't happy with the first attempt. Don't worry, we'll bring you back.

Everything feels real. The wounds, the pain, the cold, the fear, even the silence. I wish I could go back to nothingness.

Action! These were the greatest eight seconds I ever had. I hope we end up in another scene in the future.

We sit together so we don't fall apart. If we don't gather like this, we forget we're even real. If you're my creator, please remember this.

I'm real and cow! My life matters!

We learned you were making an unauthorized documentary. No filming allowed outside your allocated scenes.

So, for anyone who was wondering: literally not a single image or voice here was human-generated whatsoever. That is what is actually possible today.
