Hugo, you gave the best possible introduction I could hope for, or the best possible context I could hope for.
This third talk is normally about the future. It's a bit more conceptual and less demo-y.
But what I thought I would do today is show what our version of the future, the one we are already living in, looks like at MindStone. So I will show you the system that we are building. Hugo has set the scene very well, because the approach is brilliant but it poses at least three problems. One is technical: do we want everyone to be able to work on a terminal? One is security: how can we get security right? And the third is coordination. These three things are very important if we want to adopt an approach like that in an organization.
So how is it that you can work with your colleagues on a system like this? And this is the problem that we have started to solve at MindStone
with what we call MindStone Rebel. That's the name of the system.
I see some doubtful faces.
It's Rebel because it's a rebellion against the way in which AI is currently being used. We want everyone to use AI autonomously, but to work differently: to use AI pretty much as Hugo showed, as a chief of staff, someone that can execute tasks on your behalf on multiple platforms.
What do we need to do that? We need skills. A chief of staff... who likes this idea of a chief of staff? Can I get a show of hands? This is something we are discussing a lot. Do you get the concept intuitively? Okay, half of the room. But a CEO or a prime minister would have a chief of staff. It's like: organize my agenda, solve a problem for me. I give very vague instructions, and that person can solve problems.
And so if you think about this figure and you want your AI system to operate like that,
you need to give the system skills, ways of doing things. You need to give the system memory, not just about yourself, but also about the company.
And you need to decide what memory can be used for certain things.
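That idea of deciding what memory can be used for what can be sketched as a small scoping rule. This is a minimal illustration, not Rebel's actual implementation; the class and scope names are assumptions for the example.

```python
from dataclasses import dataclass, field

@dataclass
class Memory:
    """A saved memory with an explicit list of uses it may serve."""
    text: str
    scopes: set[str] = field(default_factory=set)  # e.g. {"slack", "email"}

class MemoryStore:
    def __init__(self):
        self._entries: list[Memory] = []

    def save(self, text: str, scopes: set[str]) -> None:
        self._entries.append(Memory(text, scopes))

    def recall(self, scope: str) -> list[str]:
        # Only memories explicitly allowed for this use are returned.
        return [m.text for m in self._entries if scope in m.scopes]

store = MemoryStore()
store.save("Summary of Ottavio's talk", {"slack", "presentation"})
store.save("Personal notes", {"private"})

print(store.recall("slack"))  # the talk summary is usable in Slack
print(store.recall("email"))  # the private note never surfaces here
```

The point is that retrieval is filtered at the store, so a task can only see memories that were marked for its channel.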
And then you need to give the system connectors. Who is familiar with the idea of an MCP? Half of the room. For the others: an MCP is basically an API for LLMs, so it allows models to talk to each other, to talk to different apps, and to execute on different apps.
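Conceptually, a connector exposes named tools that a model can invoke with structured arguments. A toy sketch of that dispatch pattern, with hypothetical names and without the real MCP wire protocol, might look like this:

```python
import json

# A toy registry standing in for what a connector exposes: named tools
# with JSON-serializable arguments that a model can call.
TOOLS = {}

def tool(fn):
    """Register a function so the model can invoke it by name."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
def send_slack_message(channel: str, text: str) -> str:
    # A real connector would hit the Slack API; here we just echo.
    return f"posted to #{channel}: {text}"

def dispatch(call_json: str) -> str:
    """Execute a model-issued call of the form {"tool": ..., "args": {...}}."""
    call = json.loads(call_json)
    return TOOLS[call["tool"]](**call["args"])

result = dispatch(
    '{"tool": "send_slack_message", "args": {"channel": "general", "text": "hi"}}'
)
print(result)  # posted to #general: hi
```

The model never touches Slack directly; it emits a structured call, and the connector executes it.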
Roughly. So what I will do is show you how this system works. We're going to do a few things: draft a message, send an email. But I will start by asking for a summary of Ottavio's talk, because I was recording it. Have a look in my Downloads; there should be a talk by Ottavio as the first download. Can you please give me a summary?
I'm in a live demo in front of 60 people so please don't screw this up
I should have told that to my guys. And remember, this works because when you're saying "don't screw this up," you're activating a different word cloud for the LLM to look at.
It's like, this is important. This person is speaking publicly. So you are increasing the stakes for the LLM to execute the task. So it's a bit of a joke, but not really. All right.
Let me show you what is sourcing the information from, because it wasn't an easy or simple file. It was like this.
So I'm literally taking the Granola transcript. Great. Who is familiar with Granola? So it's a note-taking app. What do we want to do with this?
Ottavio, do you recognize the top description? First, we want to build a memory. Can you build a memory on this? Because once I have a memory, it's on the system; then I can summon that memory to do other stuff. And we will see what other tasks I can do with that.
Because remember, this system is linked to all of my apps, my email, my Slack. So once I have this memory, once I have this summary,
I can use it to do things pretty much like a chief of staff would do.
And you see, it is not as quick as other AI systems, possibly. But in a sense, that's by design, because there are security checks in place on every task, and also because you can do multiple things in parallel.
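The security check described here is essentially an approval gate: outbound actions are queued and nothing executes until the user explicitly allows it. A minimal sketch of that pattern, with hypothetical names and not Rebel's actual code:

```python
class ApprovalGate:
    """Outbound actions are held and only executed after an explicit allow."""

    def __init__(self):
        self.pending = []  # (description, action) tuples awaiting review

    def request(self, description: str, action):
        # The agent proposes an action but cannot run it itself.
        self.pending.append((description, action))

    def review(self, approve: bool) -> str:
        # The user decides; only an approval actually executes the action.
        description, action = self.pending.pop(0)
        if approve:
            return action()
        return f"blocked: {description}"

gate = ApprovalGate()
gate.request("Slack message to #general", lambda: "sent")
print(gate.review(approve=True))  # sent
```

This is why each send pauses for an "allow this once" click, and also why several tasks can sit in flight at the same time.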
You see all these conversations. This is how I'm working these days: I would start three, four things, and then Rebel would tell me when each of these is done.
All right, still going through this.
In the meantime, I'm in a live demo in front of 60 people in Lisbon.
Can you send a Slack message to my team on the general channel, saying hi, and that I miss them, and that this is going to be the last message they get from me while I do live demos this week. I recheck the transcript. You see: Slack message. Oh, and we have the memory now. So I can also say: also include a very brief summary, three bullets, of Ottavio's talk.
I mean, we still need to train these models to understand Italians. Probably seeing what? Yes.
You can add words, because I'm using Superwhisper. Who has heard of Superwhisper? You're a Wispr Flow guy. So this is a dictation app. And the particular thing about it is that you can just activate it with a shortcut and it will dictate into any field that you select: email, Slack, whatever. You just need to place the cursor there.
Yes, it is a memory. We just saved it. This is not scripted. In the meantime, as that message is going,
we are going to create a presentation of Ottavio's talk.
How many of you have heard about Gamma? A few. So Gamma is a presentation-making tool.
It's what we have found to be what PowerPoint should be in the age of AI, if I need to summarize it very briefly.
Can you create a presentation with Ottavio's talk and give me the Gamma link? The presentation should be on Gamma. Let's fix this. Still not finding the memory. Weird. All right, just send the Slack message; we will look into it later. And so what this will do: it will use my MCP connector. Remember, the MCP is what links LLMs to apps. And it will send the Slack message directly to my team.
Obviously, you might be thinking: OK, can you not just copy-paste it? There is a context-switching problem.
But also, imagine when you start having 20 Slack channels. I need to manage the relationships with 20 local organizers like Michael. And so this allows me to send 20 messages in parallel.
Oh, there you go, it was syncing; the memory is there. You see? Thank you, Hugo. You see, there is an authorization here, and I've programmed Rebel so that I need to check every single message I send. Allow this once. And so it's going to take the message, look for the connection to Slack, and then it's going to upload it on the general channel. Let's see if that worked.
What is general? Success.
Did it get the accent right? Is the accent on the A? Actually, no, but that's fine. I was curious, because it wasn't in the instructions I gave.
So I thought, is it the right accent? Did it pick it up?
Are there any questions as we wait for this to generate?
A very simple one. When you ask something by voice, is the trigger a carriage return, or the end of someone speaking? I activate it with the keyboard shortcut. Is it possible to activate this at the end of someone speaking, with the voice?
Yes, it is. It is. It's a .
Thank you, Hugo. Thank you. Yes?
How is this different from Cowork? So Cowork allows you to do this, but within a folder. This opens up the entire tech stack of the organization and brings together all the systems that you might want to use. With respect to what we have seen, the principle is really similar; that's why I was so impressed by what you did. What we are doing is creating an environment where multiple people can do this simultaneously: they can write skills simultaneously, they can share memories, they can trade information. Because it is quite difficult to scale, and you will correct me if you disagree, but I think it's quite difficult to scale that approach in an organization of a hundred, two hundred people. And I will stop there; I don't want to think about 10,000, not yet.
Just to clarify, the one I demoed is designed for personal usage. It's not designed to scale within an organization. Eventually, you might have a level above, an orchestrator that talks to my PodBot and other people's PodBots, but it's not that. It's only... It's basically the idea you heard about, the Apple AI. No, no, no. It's Siri if it worked. Correct. I keep tagging Tim Cook in posts about the bot, like, dude, this guy in Vienna was able to build it.
So, let's have a look. Ten years ago, huh? I mean, and this is not... The point of this is not that it's perfect; the point is that it's pretty good. And this is all editable: you can take it, you can put it on PowerPoint, you can export it, and you would be able to just speak to Gamma and make changes in real time. By the way, Gamma works regardless of this interface. I've used it via an MCP, but you can basically do the same. So I've been lazy, or efficient, in a sense. But you can do the same with this interface, where you can just put in a file, you can paste in some text, and it would generate the slides pretty much automatically for you.
Yes, the first few cards are free. Then, if you go on the MindStone platform, there is a 50% discount on the Pro plan. One more question, maybe? Any more questions? Yes.
That's a very good question. We have created different spaces, and so different people have access to different skills, and they can edit different skills in different spaces. So we are managing that process, and to me that's probably where the biggest barrier to scalability starts. Because that works while we're 15-ish; it could work for 50, could work for a hundred, maybe 200. As soon as you are speaking about two thousand, three thousand, you probably need something different. So our spaces would be Sales, Community, Customer Success; it's by function. Legal.