What's Coming in 2025?
Welcome to DevOps and Docker Talk.
And I am Bret.
If you've been listening to this podcast for a while, this is
probably the seventh year or so that I've been doing this podcast.
It hasn't changed a terrible amount in that time.
We've mostly had guests on the show from various cloud native and DevOps related product
companies, talking about tools and solutions, and you'll often hear my co-host Nirmal Mehta
on the show. But every so often I just monologue, and this is one of those.
Specifically, this is about what I'm seeing and what I'm doing right now and
for the rest of the year, and it's gonna happen in three parts.
First, I'm talking about what's about to happen for me over the next three weeks,
going to London for KubeCon. Then, what I'm planning to change in this
podcast, as well as my other content on YouTube, for the rest of the year.
And last, I'm gonna talk about some industry trends that I'm seeing
that will force me, I think, to change the format of this show.
All right, let's get into it.
Today is March 22nd, 2025, and I am days away from leaving for London,
where I will be spending time first with my wife and mother,
and attending three conferences over nine days as a part of that.
So it's a little bit of fun and a lot of work.
First, the day we land, I'm actually gonna go to RedMonk's Monki Gras, which
I've never been to before, but I'm a fan of RedMonk and I know that their events can be
small and intimate and full of powerful people doing cool things in tech.
And no surprise, this year's version in London is a packed agenda of
AI everything, which is a big reason I really want to go,
as I'll hint at later in this podcast. So I start with that.
That's a two day thing.
Then I do some fun with the wife and mother.
My mom has never actually been to England so it's pretty exciting to take her for the first time.
And if you didn't know some behind the scenes, my wife
and I actually work together in this business full time,
and have for many years. It's not our first business, I think it's something
around our sixth, but it's certainly the most successful, longest-running one.
And then we go to work at Cloud Native Rejekts which, if you've not been,
is the conference of rejected conference talks from KubeCon.
I've already started to really like it and I've only been to one.
I like that it's much smaller than KubeCon, so you can actually see everyone in a matter of hours at the conference.
And it tends to be a little bit of the who's who of people in the community, people
you maybe don't get access to at the big conference because you can never find them.
There's 10-plus thousand people at KubeCon, so when there's only hundreds,
or possibly up to a thousand, at Rejekts, it's a much more accessible conference.
You can sort of slow down, have longer conversations, and it's pretty nice.
Unfortunately, Rejekts this year in London sold out in days. I missed the
Bluesky skeet announcing that tickets were open, and when I went back
within days, they were gone already. So I'm hoping I can get in the door,
maybe find someone I know and sneak in the back or something.
Hopefully that'll happen.
And then there's KubeCon Day Zero, which is a set of mini conferences within
the big conference. There's a growing number of them, and platform engineering
has recently been the biggest, definitely a trend there.
I also like to visit ArgoCon that day, because I'm a fan of deployments and
automation, and Argo CD seems to be the clear winner right now in the ecosystem.
And then that same day, I'm gonna go off to Portainer's workshop,
which is gonna take place on a yacht, where we're gonna work hands-on
with Sidero's Talos Linux and their Kubernetes manager, Omni.
I'm excited to learn some of that stuff 'cause I haven't really been able
to put my hands on Talos Linux yet, the self-proclaimed Kubernetes
Linux, or on Omni, the tool from Sidero (I think that's how you pronounce it)
for managing all of that.
And then three days of KubeCon, where for the fourth year in a row we're probably
gonna have keynotes largely about AI, while the average person walking around the hall
isn't even running their own inference clusters and doesn't really do anything with
AI except use OpenAI or LLM models to answer questions and write text in an IDE.
So it's a bit of a weird disconnect, but I'm going to talk about why I
think that's about to change for us infrastructure and DevOps people.
But first, let me talk about what we're gonna be doing on this show this year.
If you've been paying attention, if you're an avid listener of various tech podcasts,
you might have noticed that I haven't been shipping as many episodes so far this year,
and that's intentional.
It doesn't mean we're slowing down.
If anything, I'm hoping we're gonna speed up and do more, but I needed to take
my small little team and spend some serious time focusing on a few things.
We've spent the last two and a half months improving our content workflow
so that we can ship things faster, smaller, and more dynamic in the moment,
as well as improving our sponsorship offerings.
Hey, do you work at a company that wants to sponsor us?
Lemme know.
There's an increasing number of companies that want to sponsor us in certain ways,
and we're trying to figure out how to make that work while also ensuring my journalistic
integrity. I'm not a journalist, but whatever that integrity thing is, right?
Like, I don't wanna be a corporate shill. So we've been trying to figure out that stuff,
spending a lot of time honing in on what we feel comfortable with and what we think is genuine,
but not being a complete sellout to anyone who wants to give us a buck.
And something else fun we've been doing is improving the studio, which is in one of our spare
bedrooms. Over the last few years we've basically taken over the entire room and expanded the
studio a bit so we can have different camera setups and just produce better content.
And recently, specifically this month in March, I've spent some time
leaning into AI, more than just "hey, I'm using ChatGPT now more than
I'm using Google or Stack Overflow," which is what's happening everywhere.
If you didn't know, Stack Overflow is down 60% in traffic year over year.
Google is down in traffic as well, though I'm not sure they're talking publicly
about how much. But we are definitely seeing a shift where AI is answering
questions better and faster than we can through traditional search methods.
That's a little bit scary, because that's kind of what the first wave has been. There are
more waves of AI coming, and I think it's coming for a lot of jobs, but not in a
"oh no, what's gonna happen to our job roles, and will we be able to find a job if the AI takes it?"
kind of way. I'm not one of those people.
I'm more in the middle, where I think there will be some churn and
some people will have to shift their job roles to something adjacent.
But I think we're about to see AI come for infrastructure.
If you haven't been hearing about the term AI agents, it's basically
something that was invented in 2024, as far as I can tell. It's giving
AI tooling, allowing it to control things, not just say or write things.
And that's the moment where I feel like it becomes way more
interesting for DevOps, platform engineering, and cloud ops.
I think we're really, really early.
And if you look at some of the things like Google Trends, you will notice that
even just the term AI Agents basically wasn't being searched for six months ago.
But we're starting to see exponential growth in the interest for AI agents, and I expect
that to continue as the industry starts to realize that allowing the AI to do things on
our behalf, rather than just tell us how to do it, is where the real value's gonna happen.
But you know, we're DevOps, we're platformers.
We need deterministic output, and AI is nowhere near doing that.
So I'm gonna be talking more and making some videos.
You'll probably hear me make a few dedicated podcasts about this, but I believe
that we're possibly on the precipice of AI coming for our infrastructure.
But that's not gonna happen if we can't control it, if we
can't rely on it to be correct a hundred percent of the time.
And I think one of the things that's gonna happen there is we're gonna have
to learn how to give the AI a locked-down playground to figure out what it
needs to do before it actually does it for real. I don't know how
this is gonna look, and I could be totally off on how this actually happens.
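To make that playground idea a little more concrete, here's one very rough way it could look, just a sketch of the concept and not any product's actual implementation: whatever command the agent proposes runs in a throwaway container with no network and no host mounts, so the worst it can wreck is its own sandbox. The image tag, limits, and commands here are placeholders.

```python
# Rough sketch of a "locked down playground": the agent's proposed command runs
# in a disposable container with no network and no host mounts. Image, limits,
# and commands are illustrative placeholders only.
import subprocess

def run_in_sandbox(proposed_cmd: str) -> subprocess.CompletedProcess:
    return subprocess.run(
        [
            "docker", "run", "--rm",       # throw the container away afterwards
            "--network", "none",           # no network access from inside
            "--pids-limit", "256",         # keep runaway loops from fork-bombing the host
            "--memory", "512m",            # cap memory so a bad idea stays cheap
            "alpine:3.19",
            "sh", "-c", proposed_cmd,
        ],
        capture_output=True,
        text=True,
        timeout=120,                       # and don't let it run forever
    )

# Even a destructive command only destroys the sandbox's own filesystem.
result = run_in_sandbox("rm -rf /usr/local && echo 'only the sandbox noticed'")
print(result.returncode, result.stdout.strip(), result.stderr.strip())
```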
But something happened this month.
The short version here is: I have been largely ignoring AI in terms of doing anything in my job
other than helping me write some code or Terraform. Well, it doesn't need
to help me write Dockerfiles, 'cause it's not as good as I am at that yet.
But I've tried, and you know, it even knows Bake files, right?
It knows a lot of things that are even new, because the models are getting better at having
more current knowledge, and they can also now search the web.
Still, it was never terribly interesting for me to talk about.
But I think what's about to happen is it's going to do things, not just talk about 'em.
So what happened is, on March 1st, Solomon Hykes, the founder of Dagger, a programmatic CI/CD company,
who previously founded Docker (you may have heard of him),
reached out to me about something that had happened in their community
that he wanted me to be aware of. It was totally out of the blue.
We had actually talked at KubeCon last year for an hour or so
about what they were doing in the CI space.
I've been watching them for years, kind of wanting to see where they went, and not spending a
terrible amount of my own time learning it, because I was sort of waiting for that inflection
moment where they start to become a popular tool and have it fleshed out enough
that it makes sense to programmatically write your CI in your favorite language rather than
using YAML. What he's telling me about is something they have been rapidly iterating on
day and night for the last month or so, and speaking at meetups all over about, because it's
caught the attention of the AI fan people.
The short version of that for this podcast is this: if AI is coming for us, it's still a crazy
chaos monkey in and of itself. I witnessed this on my own machine just last week, where I was
using Cursor and gave it what's called YOLO mode, which means I can basically give the Cursor AI
a set of instructions and it will iterate over and over and over again,
running command line tooling and command line builds,
until it gets a successful result.
It was trying to build an iOS app for me, which it was
doing very poorly, and it would constantly fail at the build.
It tried this over 20 times in a row: build the thing,
edit some code, build the thing, always failing.
It started to believe that my computer was broken, so it stopped just short of executing a
sudo rm -rf of a specific system directory where Xcode binaries were installed,
before I said: ah, ah, ah,
that is definitely not something I want you to do.
Luckily, in Cursor there is a setting under YOLO mode which prevents it from running any
rm commands. But it was a clear indication of this thing being so confused that it didn't know
when to quit. My friend Nirmal Mehta actually pointed out to me that that's one of
the problems with our models today: they don't know when to quit, and it wasn't quitting.
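Since I'm on the topic of models not knowing when to quit, here's a tiny sketch of the kind of guardrail I mean: cap the number of attempts and refuse obviously destructive commands before they ever run. This is my own hypothetical wrapper for illustration, not how Cursor's setting actually works under the hood.

```python
# Hypothetical guardrail around an agent loop: a hard attempt cap plus a
# denylist for destructive commands. Illustrative only; this is not Cursor's
# actual YOLO-mode implementation.
import re
import subprocess

MAX_ATTEMPTS = 10
DENYLIST = [r"\brm\s+-rf\b", r"\bmkfs\b", r"\bdd\s+if="]

def is_safe(cmd: str) -> bool:
    return not any(re.search(pattern, cmd) for pattern in DENYLIST)

def agent_loop(propose_next_command) -> bool:
    """propose_next_command(last_output) would be the LLM call in real life."""
    last_output = ""
    for attempt in range(1, MAX_ATTEMPTS + 1):
        cmd = propose_next_command(last_output)
        if not is_safe(cmd):
            print(f"refused attempt {attempt}: {cmd!r}")
            return False
        run = subprocess.run(cmd, shell=True, capture_output=True, text=True)
        if run.returncode == 0:
            return True                    # success: stop iterating
        last_output = run.stderr           # feed the failure back and try again
    return False                           # give up instead of looping forever
```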
And this exact thing could easily happen whenever we let AI
control any sort of infrastructure, or run in CI, or do anything like that.
Right? So the fundamental problem today is that we have these things
that are kind of smart, but they don't know when they're lying.
They don't know when to quit, and we need to give
them a safe place to play until they figure it out.
Well, containers happen to be the perfect place to do that, and since
we're all already used to containers, it turns out that Dagger's open
source may have accidentally stumbled onto a method, or a workflow,
that allows you to easily use LLMs inside a pipeline of tasking that you give it, while
also having access to a giant set of tools that Dagger calls the Daggerverse.
It's essentially a Docker Hub-esque place for all the different tools and functions
that people build that you can use in Dagger.
It would be kind of like the Dagger version of GitHub Actions.
And they have lots of them, and they all happen to have a common API that Dagger
has learned how to give the LLMs access to, which means the LLM can instantly
understand the purpose of a tool and be able to use it to solve problems.
And it does all this in containers.
So it is extremely early days for this, but the big minds over at Dagger are thinking
heavily about how they can use this to give AI work, let it iterate in a safe place, and
then return the final result while also having access to all the tools we want to give it.
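Just to give you a feel for what "the work happens in containers" means, here's a rough sketch using Dagger's Python SDK and its Connection style. To be clear, this is not their new LLM integration, whose API I'm not going to guess at here; the proposed_fix string simply stands in for whatever an agent would generate, and the point is that every step runs inside a container the pipeline manages.

```python
# A hedged sketch of agent work running inside a Dagger pipeline: every step
# happens in a container, and the host only sees the returned output. This is
# NOT Dagger's actual LLM feature; proposed_fix stands in for agent output.
import sys
import anyio
import dagger

async def run_agent_step(proposed_fix: str) -> str:
    async with dagger.Connection(dagger.Config(log_output=sys.stderr)) as client:
        return await (
            client.container()
            .from_("golang:1.22")                   # the agent's throwaway workspace
            .with_exec(["sh", "-c", proposed_fix])  # run whatever the agent proposed
            .with_exec(["go", "version"])           # a follow-up check, still containerized
            .stdout()
        )

print(anyio.run(run_agent_step, "echo 'pretend this edits code and runs the tests'"))
```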
And this led to some thinking on my part. I'm probably gonna make a different podcast about
this, because it comes down to some of the research I've been doing on AI agents and how they're
coming at us, from the AI industry to those of us making and deploying the software.
And there was a great talk that, well, I'll worry about the details in another podcast,
but essentially they were talking about the idea that now that engineers are getting
comfortable with AI writing the code, they're able to iterate faster and produce more output.
But operators and DevOps engineers have largely been standing on the sidelines and have
generally had this attitude of "the AI will never touch my infrastructure."
This has meant that one part of the pipeline of the
software lifecycle has sped up, but the others haven't.
And that's gonna recreate the natural tension that we often had before we had DevOps,
where you had devs creating software and then waiting on the operators to get around
to building the servers and deploying the code.
We possibly could be in a moment where that's going to happen again due to mismatched velocity.
I haven't witnessed that myself because, quite honestly, I'm not working with any
AI-prompting teams in production, but I can imagine it's a real thing for some companies
where they're aggressively taking advantage of AI in the dev groups.
So anyway, that whole story has happened over the last three weeks.
I'm very excited to be involved with this, and I'm appreciative to Solomon
for the heads up, because it caused me to spend weeks deep diving into
the state of these agents, the different players in the industry, and
even how AWS is coming hard at this stuff.
So I think it's real and I think it's gonna change a
lot of the content I create this year and going forward.
But never fear, this is still strictly a podcast focused on everything
infrastructure, DevOps, containers, and the like.
It's just gonna have the assistance of robots at this point.
So stay tuned on that front.
And the last thing I'll say is that you've probably noticed we haven't had guests
on the show for a few months. That's due to us coming out of KubeCon last year
and taking a break during the holidays, which led into this whole rebooting of
our content machine and me spending some time focusing on business improvements,
as well as the AI rabbit hole.
So what we're doing now is going through all the CNCF projects, all the cloud native
ambassadors, all the Docker Captains, and we're trying to find two main focuses. One is new,
exciting CNCF projects that are becoming popular or are graduating, where we bring one of the
experts from that project on the show to talk about where they're at and what the project does.
At the same time, we're definitely gonna be leaning into this AI
agents wave and how that's going to affect all of us specifically.
And I personally can't wait.
I have a feeling in my gut that's telling me this could be the next big wave for
infrastructure tech like we saw over a decade ago with containers and distributed computing.
But I'm not smart enough to make any grand declarations
of exactly how this is all gonna come together.
I just want to be there to witness it and to report it.
So thanks for sticking with me on this one.
Expect some more content coming out of KubeCon over the next few weeks as I try to read
the tea leaves at the conference about not just what they're saying at the keynotes, but
also what people are doing on the ground day to day in the world of cloud native DevOps.
See you in the next one.