Hi, my name is Glenn Smith.
A quick bit of background on me: I was in finance for 15 years, don't hold that against me, and I've now been in startups for about 15 years down here in Bath and Bristol.
I'm currently working for a startup called Fueled.
I've gone through a number of different companies.
I've been in AI for about 20 years now.
We built AI systems in the financial markets 20 years ago with top-end teams and systems. I've had a robotics company, we've done a knowledge management company with AI, I've consulted for a couple of big companies recently, and we just had a FinTech which we sold. So in the last two years I've had time on my hands, just doing AI coding, and as a side project I've built this company around that. In that process I've done a lot of development with the latest AI tools, because as an early-stage startup I've been able to use these tools at a level you probably can't in many companies, and we've been flat out doing it, so I've been learning lots and lots. I'm going through about a billion tokens a month right now with some of the stuff I'm doing. That's a big number: a token is about three-quarters of a word, strangely, so that gives you an idea how many words I'm feeding into models, and a lot of that is coming from my code bases and so on. So I've done a lot of intense learning, and I'm here to share a little bit of that with you today, okay?
So, what's Fueled all about?
Well, we're actually on a good mission,
and that's to improve health at the population scale.
And it's a pretty big goal, and we're trying to use
the latest AI to do a lot of stuff in that.
So I'll give you a little bit of context to that.
Just, there's a lot of obesity out there,
really bad in the States, really bad here.
Belgium, I was there last week, and it's still bad there.
So generally this is a big problem, not just at the human scale but also a big financial problem for countries and the social systems we have. So we're trying to approach it the other way round: as opposed to treating illness, we try to stop people getting there in the first place. We're doing that by looking at nutrition, integrating that with lifestyle-change metrics through wearable data and so on, and then delivering personalised support to help people change their habits. That's the aim of what we're trying to do. So how are we doing it? We've built it using a lot of AI throughout the process.
So what we have is an app that lets you do something cool. It's a bit more common now, but when we first put it out we were one of the first companies out there with this: you just take a picture of your food, and it gives you a full nutritional breakdown.
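The flow behind that feature can be sketched roughly like this: prompt a multimodal model to answer in a fixed JSON shape, then parse its reply into a typed result. This is a minimal illustration; the field names and JSON schema here are assumptions for the example, not our actual API or schema.

```python
import json
from dataclasses import dataclass

@dataclass
class NutritionEstimate:
    dish: str
    calories_kcal: float
    protein_g: float
    carbs_g: float
    fat_g: float

def parse_nutrition(model_reply: str) -> NutritionEstimate:
    # The model is prompted to reply in this JSON shape (illustrative schema).
    data = json.loads(model_reply)
    return NutritionEstimate(
        dish=data["dish"],
        calories_kcal=float(data["calories_kcal"]),
        protein_g=float(data["protein_g"]),
        carbs_g=float(data["carbs_g"]),
        fat_g=float(data["fat_g"]),
    )

# A reply the model might return for a photo of a plate of food:
reply = '{"dish": "spaghetti bolognese", "calories_kcal": 650, "protein_g": 32, "carbs_g": 70, "fat_g": 24}'
estimate = parse_nutrition(reply)
```

Asking for structured output and parsing it, rather than free text, is what makes the breakdown usable in an app.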
Now, there's a lot going on with that at the moment; other companies can do this too now. So what we've been doing over the last year is making it more professional, and building out how we analyse that data. What we've got is a back-end portal; we're working with nutritional medicine people and personal trainers on how they can analyse that data from their clients and give them reports that are AI-generated as well. So I want to talk a little bit about some of the tools we're using to do that.
We're building a lot with AI tools: I'm using Claude Code, we're using Cursor, and in the last couple of days we've been using Antigravity, this new tool that's just come out from Google. As these new tools come out we can just use them, really push the boundaries, and learn a lot about them. Come talk to us afterwards if you want to chat about any of the very technical, detailed stuff; we're deep in the middle of it. We're using Google's Gemini models, for instance, for the food recognition capability, and that comes down to how these multimodal models work.

So here's a question, in terms of understanding AI. When the AI looks at the picture of food, when you take a picture with your camera and give it to ChatGPT or Gemini and say, okay, what's in this image, give me the nutritional breakdown of this photo, how do you think it's doing that? How do you think the model is working under the surface? Anyone got any ideas what's happening in that process?

Visual pattern recognition, so it's looking at what's on the plate and saying, okay, there's a shape, what is that, what does that look like. Okay. Looking at similar pictures in the big LLMs, comparing those pictures and saying this one is labelled correctly and that one is not. Okay, so you're sort of correct, which is actually the clever bit. I was up in Oxford a couple of weeks ago, and that was the answer I was getting from multiple people as well, because when people have tried to build vision systems for the last 15 or 20 years, that's how they've gone about it. When I was building robotic vision systems 10 or 15 years ago, that's how we were trying to do it. But in reality that didn't work that well; it was quite complex and quite brittle. What these models are actually doing is taking those images and doing exactly what you said: turning them into the same embedding space they use when they're reading books and the internet.

So what does that mean? Well, when I say the word "chicken", think about what happens in your brain. You can imagine a picture of a chicken, you might imagine a KFC, you might imagine the word "chicken" written down, you can see an image of it in a book as well as the animal itself. There are lots of different things in your brain that are all associated with the word "chicken". And in reality that's what the LLM is doing at that point: it's taking the image and putting it into the same embedding space, turning it into an embedding, just a series of numbers and weights, the same as when it reads the word "chicken" in a book. These LLMs are working in a similar way to your brain when it takes in audio and visual data. That's the mechanism, and that's why this works so well.
So when we're trying to figure out the nutrition on a plate of food, it's not just understanding what's in the picture; it's seeing that in the same way as it's read every recipe out there for a bolognese. That's why it estimates what's in a bolognese surprisingly well. How could it do that? Because it knows every bolognese recipe, so it sort of knows the ingredients as well.
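The shared embedding space idea can be shown with a toy example: if image and text embeddings live in the same vector space, the photo of a dish sits closest to the text that describes it, measured by cosine similarity. The vectors below are made-up four-dimensional illustrations; real models use hundreds or thousands of dimensions.

```python
import math

def cosine(a, b):
    # Cosine similarity: 1.0 means same direction, 0.0 means unrelated.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy embeddings (illustrative values, not from any real model).
image_bolognese = [0.9, 0.1, 0.8, 0.0]      # embedding of a photo of bolognese
text_bolognese = [0.85, 0.15, 0.75, 0.05]   # embedding of the text "spaghetti bolognese"
text_salad = [0.1, 0.9, 0.0, 0.8]           # embedding of the text "green salad"

# Because image and text share one space, the photo lands near its description.
assert cosine(image_bolognese, text_bolognese) > cosine(image_bolognese, text_salad)
```

This nearest-neighbour property is what connects the picture on your plate to everything the model has read about the dish.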
And we understand the stochastic nature of what it's doing; we know it's a probabilistic guess. So we give options in our UI and make it easy for people to tweak and change those sorts of things.
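Because the model's answer is a probabilistic guess, a sensible UI pattern is to surface the top few candidates rather than a single answer and let the user confirm or correct. A minimal sketch of that ranking step, with made-up dishes and probabilities:

```python
def top_candidates(scored_dishes, k=3, min_prob=0.05):
    # Keep the k most likely options above a threshold,
    # so the user can tap the right one instead of retyping.
    ranked = sorted(scored_dishes, key=lambda d: d[1], reverse=True)
    return [(dish, p) for dish, p in ranked if p >= min_prob][:k]

options = top_candidates([
    ("spaghetti bolognese", 0.62),
    ("lasagne", 0.21),
    ("chilli con carne", 0.12),
    ("beef stew", 0.03),   # below threshold, dropped
])
```

Presenting alternatives turns an occasional wrong guess from a failure into a one-tap correction.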
But hopefully what I'm trying to do here is not talk about what my product does specifically, but to think about how the product works: how the LLMs think about the world, how they work underneath the surface, and then how we apply that to our use cases and what we're doing.
So, this goes back up again.
There we go.
Okay.
And so, the other thing that becomes interesting is how we're doing some of the reports on
this stuff.
Just click on there, okay. So when we do that, we begin to use AI's ability to do more language-based work as well, to help move people's behaviours: we do AI coaching along the way. So what does that mean? When we set up the account, we take in your preferences and your allergies: are you vegetarian, do you only eat fish, can you not have gluten, whatever it might be. We take those into account and then try to give a bit of coaching to move you in the right direction from your health baseline. We don't just say, eat a salad; that's not great advice. We try to take you from where you are and move you forward incrementally, in terms of a health journey, towards your health goals.
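The coaching step boils down to assembling a prompt from the user's stored profile before calling the model. Here's a rough sketch of that assembly; the profile field names and wording are illustrative, not our real schema or prompts.

```python
def build_coaching_prompt(profile: dict, meal: str) -> str:
    # Build the system prompt from stored preferences (illustrative fields).
    lines = [
        "You are a supportive nutrition coach.",
        f"The user follows a {profile['diet']} diet.",
    ]
    if profile.get("allergies"):
        # Hard constraints like allergies go in explicitly every time.
        lines.append("Never suggest foods containing: " + ", ".join(profile["allergies"]) + ".")
    lines.append(f"Their goal is: {profile['goal']}.")
    lines.append(f"They just logged: {meal}.")
    lines.append("Give one small, incremental suggestion; do not just say 'eat a salad'.")
    return "\n".join(lines)

prompt = build_coaching_prompt(
    {"diet": "pescatarian", "allergies": ["gluten"], "goal": "lose 5 kg"},
    "fish and chips",
)
```

Keeping constraints like allergies in the prompt on every call, rather than trusting the model to remember, is what makes the advice safe to personalise.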
We have different coaching personalities you can choose between, and we can give this in real time. We've had some really great customer feedback on this stuff, people going, wow, this is really good, I've never had this sort of advice before, I love the way it suggests I eat kale. We try things like that.
We had this from a lady in America, which was quite amusing: it was the first time she'd heard this sort of very specific advice, she just didn't know these things, which was quite interesting. And what we do in the background, with this portal, is take all this data and use it for analysis.
What, five minutes to go?
Okay, cool.
So this is one of the things that we do.
We've now turned this into a fully agentic system. We had agents being shown by Johnny earlier on in Claude Code.
We have a number of agents that work to compile these reports and data and do different jobs in the background. We're using this with personal trainers as a one-pager, or as a complement for nutritionists, with a much larger document that goes along with it, maybe a 12-page report on your diet, to help give more customised advice. So we gather the data with the app, in the background we have a portal to analyse it, and then we generate reporting on top of that. So we're building with AI across multiple modalities: image capture, and then agentic AI reporting, to have an impact on health.
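A report pipeline like that can be sketched as a sequence of agents, each doing one job and passing a shared context along. This is a toy skeleton; the agent names, jobs, and data here are invented for illustration, and in practice each step would wrap an LLM call or data query.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Agent:
    name: str
    run: Callable[[dict], dict]  # each agent reads and extends a shared context

def run_pipeline(agents, context):
    # Run agents in order; each adds its section to the report context.
    for agent in agents:
        context = agent.run(context)
    return context

# Illustrative agents; the real jobs would call models or databases.
gather = Agent("gather", lambda ctx: {**ctx, "meals": ["bolognese", "salad"]})
analyse = Agent("analyse", lambda ctx: {**ctx, "avg_kcal": 1900})
write = Agent("write", lambda ctx: {
    **ctx,
    "report": f"Avg {ctx['avg_kcal']} kcal over {len(ctx['meals'])} meals.",
})

result = run_pipeline([gather, analyse, write], {"client": "demo"})
```

Splitting the work into small single-purpose agents is what makes a 12-page report tractable: each section can be generated, checked, and improved independently.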
This is cool: we do recipe suggestion as well. We won't talk too much about this, but it's a neat little thing: take a picture of your fridge, it'll itemise your whole fridge, and then we can use that to create recipes. A cool little thing we played around with; it's not that big a feature and we don't push it much in the app, but it's pretty cool how it works, and it got Google's attention: they actually profiled us on their blog earlier in the year, which is pretty cool.
But one of the things we do with this is not just build the product; it's how we market it as well. Now, this is a pretty techy room; I thought we might have a few more marketers and such in here, but we're using a lot of these tools in our marketing processes as well, so I'm going to talk briefly about that. For the last few websites we've done, we've used Webflow. I think we used Webflow at CloudFind as well, didn't we? Yeah, we moved to Webflow so our marketing teams could run the website without our engineering teams needing to do it.
I'd done that for the last four or five startups, but we've actually moved this one back to Cursor, and my marketing team, which is not technical, now builds our website using the AI coding tools, because I find it easier and faster to develop that way. That's been slightly revolutionary: the website has become a complex product for us in its own right. Not only does all the content look great, we've also moved our onboarding out of the app and onto the website, which lets us do two or three things that are pretty cool.
One thing is it lets us map our marketing better, so we understand our customers, because we can iterate the onboarding flow on our website more often than we can in the app. We can run many more web onboarding flows from our website and match different marketing campaigns to different onboarding flows, which increases the conversion of those campaigns as well.
And because we're going to be looking at more health data as we go forward, we don't want Facebook, and the nastiness of the Facebook Pixel, inside our app. So we keep that out and control the data we send to Facebook ourselves, which gives them the right information on conversions, in the sense of "this customer has now bought the product", but doesn't let them extract data from our website itself. That way we protect privacy, especially for health-type data, in a more systematic way. So that's a pretty cool mechanic we've learned to use, and how we built it.
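The general pattern here is server-side conversion reporting: instead of a tracking pixel reading your pages, your backend sends the ad platform a minimal event with identifiers hashed. This sketch is modelled loosely on how such APIs work; the field names are illustrative, not any specific platform's real schema.

```python
import hashlib
import time

def conversion_event(email: str, event_name: str, value: float, currency: str) -> dict:
    # Build a minimal server-side conversion payload. We hash the email
    # ourselves, so raw personal data never leaves our backend.
    # Field names are illustrative, not a real ad platform's schema.
    hashed = hashlib.sha256(email.strip().lower().encode()).hexdigest()
    return {
        "event_name": event_name,
        "event_time": int(time.time()),
        "user": {"hashed_email": hashed},
        "value": value,
        "currency": currency,
    }

event = conversion_event("Jane@Example.com ", "Purchase", 9.99, "GBP")
```

Because the backend decides exactly what goes in the payload, nothing else about the user's browsing or health data is exposed.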
And we can read that data back: you sign up on our website, and we can read that data back securely in our backend, in the app. So there's a joined-up experience from the website to the app as well, which is pretty cool.
We've also built our own tools; with these AI tools we can just build so much cool stuff. This is our email campaign management, connecting to MailerSend, the mail tool we use. We can select customers and groups of customers in these tools to send campaigns and specific offers to, and then see how those campaigns are doing as the results come back.
And this is an integration with Facebook data: it's how we manage our Facebook campaigns, the different ads and ad sets, and see the conversions, click-through rates, and all those sorts of things. We build all these tools within our own environment and ecosystem, and the operational tooling we have here is better than how a lot of big companies do this. And we've got it as a very small four-person startup, which is sort of cool.
The other little thing we've built is on another side of things, just because we build a lot of stuff: we love building these tools, and we use them for a lot of experiments.
AI Coder Guru basically figures out the analytics, and here is my usage of some of these models over the last month. This tool lets you drag and drop the usage export from Cursor into it, and it shows you the different models I'm using. All these different coloured bars are different models: this greeny-coloured one on the screen is Claude 4.5 at the moment. I've used 627 million tokens in Claude Code since I started on Claude's 4.5 model over the last month or so. I'm using GPT-5.1 and Composer, and there's a bit of Gemini 3.0 Pro in there, just launched two days ago, so we're beginning to bring these things into the mix. This gives us analytics and a bit of a breakdown of all the different things we do there.
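The core of a tool like that is just aggregating token counts per model from a usage export. A minimal sketch, assuming a simple CSV shape; real exports have different column names and many more fields.

```python
import csv
import io
from collections import defaultdict

def tokens_per_model(csv_text: str) -> dict:
    # Sum token usage per model from a usage export.
    # Column names here are assumed for illustration.
    totals = defaultdict(int)
    for row in csv.DictReader(io.StringIO(csv_text)):
        totals[row["model"]] += int(row["tokens"])
    return dict(totals)

# Illustrative export data (not real usage figures).
export = """model,tokens
claude-4.5,627000000
gpt-5.1,150000000
claude-4.5,3000000
gemini-3.0-pro,20000000
"""
usage = tokens_per_model(export)
```

From those per-model totals, the coloured-bar breakdown is just a chart away.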
This is what we're building, and we have a lot of it built now in the background: getting the next level of data through for your health, taking data from your Apple Watch and giving you a health score, to pull together that original vision of nutrition, sleep, exercise, and stress management all in one device, and using that to improve your health and meet your health goals.
If you want to check it all out, you can do it here. We have a discount code, maybe a month free, to check out the app and the technology we have, if you want to try it out. Take a picture of that if you like. Great. Any questions out there on that?