Does AI threaten the open web? Challenges and Opportunities with Netlify's CEO & Co-Founder, Matt Biilmann

Join us as we dive into the world of AI and its transformative impact on web development with Matt Biilmann, CEO and co-founder of Netlify. Discover how AI is reshaping the digital landscape and learn from Matt's unique insights.

Episode Description

In this episode of the AI Native Dev podcast, host Guy Podjarny welcomes Matt Biilmann, CEO and co-founder of Netlify, to explore the profound impact of AI on web development. Matt shares his journey from a music editor to a leading figure in web infrastructure, offering a unique perspective on AI-driven innovation. The conversation covers a range of topics, including AI-powered website builders, Netlify's strategic approach to AI, and the future of web development in an AI-enhanced world. Matt's insights provide a comprehensive understanding of how AI is set to revolutionize the digital landscape, making this episode a must-listen for developers and tech enthusiasts alike.

Chapters

  1. [00:00:00] Introduction to the Episode
    • Overview of topics and guest introduction.
  2. [00:01:00] Matt Biilmann's Background and Journey
    • From music editor to CEO of Netlify, Matt's unique career path.
  3. [00:04:00] AI in Music and Creative Industries
    • AI's role as a new creative tool and its parallels in web development.
  4. [00:09:00] AI-Powered Website Builders
    • Current state, challenges, and future potential of AI in web design.
  5. [00:14:00] Netlify's Four-Pronged AI Strategy
    • Enhancing operations, product offerings, customer enablement, and AI as a user.
  6. [00:21:00] Enhancing Developer Productivity with AI
    • Tools like Codemod and AI's role in streamlining development processes.
  7. [00:27:00] AI-Driven Product Features at Netlify
    • Innovations like the "Why Did It Fail?" button and AI-assisted publishing.
  8. [00:36:00] Empowering Customers with AI Capabilities
    • Adapting Netlify's platform to support AI-powered applications.
  9. [00:42:00] The Future of Web Development and AI
    • Predictions and insights into AI's transformative role in web development.
  10. [00:50:00] Summary and Closing Remarks
    • Key takeaways and the importance of open web standards in an AI-driven future.

The Evolution of AI in Creative Fields

Matt Biilmann kicks off the conversation by delving into the impact of AI on music generation and creative industries. Drawing historical parallels, he notes, "In all of the creative fields, right? I think it's just going to be a new tool in our tool books to be creative with." Matt sees AI as a tool that opens new creative possibilities, much like the invention of musical notation or the gramophone did in their time. He predicts that while initial AI-generated content might mimic existing styles, it will eventually enable artists to create complex compositions previously unattainable, thereby changing the landscape of music and creativity.

Matt elaborates on this by comparing the progression of AI in music to technological advancements in the past. He references how musical notation and the gramophone revolutionized the perception of music from ephemeral experiences to recorded art forms. Similarly, digital recording transformed music by introducing sampling—a practice initially criticized but now integral to modern music production. Matt anticipates AI will follow a similar trajectory, starting with imitating existing music, but eventually empowering artists to craft innovative, intricate compositions that redefine creative boundaries.

AI-Powered Website Builders: Current State and Future Potential

The discussion shifts to AI-generated website builders, which Matt compares to AI-generated music in their early phase. He explains, "The first step is that we get a lot of these like Gen AI builders that can make websites that look like all the other websites we've built before and make them really fast, but maybe also really fairly bland." However, Matt envisions a future where AI augments developer skills, allowing for the creation of more complex and personalized web experiences. This transformation is likened to how AI in music is expected to evolve beyond replicating existing songs to enabling new forms of creative expression.

Matt emphasizes that while current AI website builders produce generic designs, they serve as a foundation for more sophisticated innovations. By automating routine tasks, AI enables developers to focus on enhancing user experience and introducing unique functionalities. As AI technology matures, it will facilitate the creation of websites that are not only aesthetically pleasing but also dynamically responsive to user interactions, much like AI in music is anticipated to transition from repetitive compositions to groundbreaking musical pieces.

Netlify's Four-Pronged Approach to AI

Matt outlines Netlify's strategic approach to AI, which is structured around four main areas:

  1. Enhancing internal operations.
  2. Improving product offerings.
  3. Enabling customers to build AI-powered applications.
  4. Supporting AI as a user.

Matt elaborates on this approach, stating, "We think of these AIs building for the web as a new kind of user of Netlify, right? That's no longer like a human developer, but an AI developer." This comprehensive strategy allows Netlify to leverage AI across multiple levels, enhancing its capabilities and service offerings.

Netlify's approach reflects a commitment to integrating AI into every facet of its operations. By using AI to optimize internal processes, Netlify ensures efficiency and scalability. The focus on product enhancement through AI-driven features demonstrates a dedication to delivering superior user experiences. Furthermore, empowering customers to build AI-powered applications on Netlify's platform showcases its adaptability to emerging technological trends. Finally, recognizing AI as a distinct user highlights Netlify's forward-thinking vision in anticipating the needs of an AI-enhanced development landscape.

Using AI to Enhance Developer Productivity at Netlify

Netlify employs AI to boost developer productivity through tools like Codemod, which facilitate codebase migrations. Matt shares, "We've done evaluations of a bunch of different tools... especially using it to do codebase migrations." By integrating AI into development processes, Netlify enhances its efficiency while maintaining a strong engineering culture and rigorous code review practices. Matt emphasizes, "Good developers will not suddenly become bad developers; they'll just get more productive developers."

The use of AI tools like Codemod exemplifies how Netlify leverages technology to streamline complex tasks such as code migration. By automating these processes, developers can focus on innovation rather than mundane updates. This approach not only accelerates development cycles but also ensures that code quality and consistency are upheld. Netlify's strong engineering culture, underscored by thorough code reviews, ensures that AI-enhanced productivity translates into tangible improvements without compromising the integrity of the codebase.
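
The episode doesn't show Netlify's actual transforms, but for readers who haven't seen a codemod, the sketch below illustrates the idea in TypeScript using jscodeshift, the open-source AST-transform tool this approach grew out of. The React Router rename it performs is purely illustrative (one small step of a hypothetical v5-to-v6 migration), not Netlify's real migration code.

```ts
// Hypothetical jscodeshift transform: rename the React Router v5 `useHistory`
// hook to the v6 `useNavigate` hook. A real migration involves many more
// transforms (e.g. rewriting `history.push(...)` calls to `navigate(...)`).
import type { FileInfo, API } from "jscodeshift";

export default function transformer(file: FileInfo, api: API): string {
  const j = api.jscodeshift;
  const root = j(file.source);

  // Rename `import { useHistory } from "react-router-dom"` specifiers.
  root
    .find(j.ImportDeclaration, { source: { value: "react-router-dom" } })
    .find(j.ImportSpecifier, { imported: { name: "useHistory" } })
    .forEach((path) => {
      path.node.imported.name = "useNavigate";
      if (path.node.local) path.node.local.name = "useNavigate";
    });

  // Rename call sites: `useHistory()` becomes `useNavigate()`.
  root
    .find(j.CallExpression, { callee: { name: "useHistory" } })
    .forEach((path) => {
      const callee = path.node.callee;
      if (callee.type === "Identifier") {
        callee.name = "useNavigate";
      }
    });

  return root.toSource();
}
```

Run across a codebase (for example with `npx jscodeshift -t rename-use-history.ts src/`, a hypothetical file name), a transform like this produces exactly the kind of large, mechanical pull request Matt describes, which humans then review and test.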

AI-Driven Product Features: Making Netlify More Efficient

Netlify uses AI to streamline its product offerings, evidenced by features such as the "Why Did It Fail?" button. This tool leverages AI to diagnose build failures and propose solutions, significantly reducing developer troubleshooting time. Matt highlights its impact, noting, "We've had more than 150,000 build debugs by that button... 25,000 developer hours saved since then." Such features exemplify how AI can enhance product efficiency and user experience.

Incorporating AI into product functionalities showcases Netlify's commitment to enhancing user satisfaction and operational efficiency. The "Why Did It Fail?" feature acts as a virtual assistant, quickly identifying issues and suggesting remedies, thereby minimizing downtime and frustration. This proactive approach to problem-solving reflects Netlify's dedication to providing a seamless development experience, where AI acts as a partner in overcoming challenges and optimizing workflows.
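
Netlify hasn't published the prompt or model behind the button, but the pattern itself is easy to sketch. The hypothetical TypeScript below assumes an OpenAI-style chat API; the model name, prompt wording, and context fields are illustrative guesses rather than Netlify's implementation.

```ts
// Hedged sketch of a "Why did it fail?" style diagnosis helper: pass the
// build log plus a little site context to an LLM and return an explanation
// with a proposed fix. Model choice and prompt are placeholders.
import OpenAI from "openai";

const client = new OpenAI(); // assumes OPENAI_API_KEY is set in the environment

export async function diagnoseBuildFailure(
  buildLog: string,
  context: { framework: string; nodeVersion: string }
): Promise<string> {
  const completion = await client.chat.completions.create({
    model: "gpt-4o-mini", // placeholder model
    messages: [
      {
        role: "system",
        content:
          "You analyze CI build failures. Explain the most likely root cause " +
          "in two sentences, then propose one concrete fix.",
      },
      {
        role: "user",
        // Only send the tail of the log to stay within the context window.
        content: `Framework: ${context.framework}\nNode: ${context.nodeVersion}\n\nBuild log (tail):\n${buildLog.slice(-8000)}`,
      },
    ],
  });

  return completion.choices[0]?.message.content ?? "No diagnosis produced.";
}
```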

Empowering Customers to Build AI-Powered Applications

As developers increasingly build AI-driven applications, Netlify adapts its platform to meet evolving needs. This includes introducing new primitives for handling asynchronous workflows and streaming responses, which are crucial for integrating AI into web experiences. Matt notes, "We're constantly thinking about... what are the new primitives that we might have to build or how might our primitives need to evolve" in this kind of Jamstack architecture, with a front end and a back end for the front end.

Netlify's adaptability in supporting AI-powered applications underscores its role as a catalyst for innovation. By addressing the technical demands of AI integration, such as asynchronous processes and real-time data handling, Netlify ensures that developers can leverage cutting-edge technologies to create dynamic, responsive web experiences. This commitment to evolving alongside technological advancements positions Netlify as a leader in enabling AI-driven development.
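
To make the streaming point concrete, here is a minimal sketch of a serverless function that proxies a streamed LLM completion back to the browser. It assumes the web-standard Request/Response signature that Netlify's newer functions support and an OpenAI-style streaming client; treat it as an illustration of the primitive, not a canonical recipe.

```ts
// Hedged sketch: stream LLM tokens to the client as they arrive instead of
// buffering the whole answer. Model name and request shape are placeholders.
import OpenAI from "openai";

const client = new OpenAI();

export default async (req: Request): Promise<Response> => {
  const { prompt } = await req.json();

  // Ask for a streamed completion (an async iterable of token chunks).
  const completion = await client.chat.completions.create({
    model: "gpt-4o-mini", // placeholder model
    messages: [{ role: "user", content: prompt }],
    stream: true,
  });

  // Re-encode chunks as a ReadableStream the platform can flush incrementally.
  const body = new ReadableStream<Uint8Array>({
    async start(controller) {
      const encoder = new TextEncoder();
      for await (const chunk of completion) {
        const token = chunk.choices[0]?.delta?.content ?? "";
        controller.enqueue(encoder.encode(token));
      }
      controller.close();
    },
  });

  return new Response(body, {
    headers: { "Content-Type": "text/plain; charset=utf-8" },
  });
};
```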

AI as a New User of Netlify: Opportunities and Challenges

The rise of AI-generated website builders has introduced AI as a unique user of Netlify. To accommodate this, Netlify has adjusted its processes to ensure seamless deployment and security. Matt explains, "When you're building these tools, you want people to just start and play around. You don't want them to first log into a whole bunch of different services." This approach reduces friction and supports AI tools in utilizing Netlify effectively.

Recognizing AI as a distinct user category reflects Netlify's foresight in addressing emerging technological trends. By simplifying onboarding processes and minimizing barriers to entry, Netlify facilitates experimentation and innovation. This inclusive approach not only enhances accessibility for AI-driven initiatives but also fosters a collaborative environment where human and AI developers can coexist and contribute to the evolving landscape of web development.
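
Later in the conversation Matt describes the "deploy now, claim later" flow that tools like bolt.new use. The claim handoff itself is a product flow, but the publishing half can be sketched against Netlify's public HTTP API: create a site under the tool's own account, then push a zip of the build output. Response field names and the exact claim mechanics are assumptions here, not a verified recipe.

```ts
// Hedged sketch of an AI builder publishing a generated site via Netlify's
// HTTP API: create a site, then zip-deploy the build output. The end user can
// later claim the site into their own account (that handoff is not shown).
import { readFile } from "node:fs/promises";

const NETLIFY_API = "https://api.netlify.com/api/v1";
const token = process.env.NETLIFY_AUTH_TOKEN!; // the tool's own account token

export async function publishGeneratedSite(zipPath: string): Promise<string> {
  // 1. Create a new site under the tool's account (Netlify picks a name).
  const siteRes = await fetch(`${NETLIFY_API}/sites`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${token}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({}),
  });
  const site = await siteRes.json();

  // 2. Deploy the zipped build output to that site.
  const zip = await readFile(zipPath);
  const deployRes = await fetch(`${NETLIFY_API}/sites/${site.id}/deploys`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${token}`,
      "Content-Type": "application/zip",
    },
    body: new Uint8Array(zip),
  });
  const deploy = await deployRes.json();

  // Hand this URL back to the end user; field name may differ by API version.
  return deploy.ssl_url ?? deploy.url;
}
```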

The Future of Web Development in an AI-Enhanced World

Looking toward the future, Matt envisions AI transforming user experiences and accelerating innovation. He stresses the importance of maintaining open web standards amidst the rise of proprietary platforms, stating, "There's a lot of pressures on sitting on some monolith... you want systems that can deliver very high throughput and high flexibility." Matt believes AI will redefine human-computer interactions, creating new opportunities and challenges in web development.

As AI continues to evolve, its impact on web development will be profound. The shift towards more interactive and intuitive user interfaces will necessitate a reimagining of traditional development paradigms. By advocating for open standards, Matt champions a future where innovation is not stifled by proprietary constraints. This vision of an AI-enhanced web emphasizes flexibility, creativity, and inclusivity, ensuring that developers can leverage AI's full potential while fostering an open, collaborative digital ecosystem.

Summary

In this enlightening episode, Matt Biilmann provides a comprehensive overview of AI's transformative role in web development. Key insights include AI's potential to enhance creativity, streamline development processes, and introduce new architectural challenges and opportunities. As AI continues to evolve, it's crucial for developers and platforms like Netlify to harness its capabilities while advocating for open standards and innovation. By embracing AI as both a tool and a collaborator, the web development community can navigate the complexities of an AI-driven future and unlock unprecedented possibilities in digital innovation.

Full Script

Matt Biilmann: [00:00:00] So to me, that's always if you learn where you can save a ton of time and where you can waste time and you become better and better at mastering that tool set, I think that will make a really big difference in what we see as a really strong developer in the future. When you're building these tools, you want people to just start and play around.

You don't want them to first like log into a whole bunch of different services and then they can do something right. So anything we can do to reduce that friction and make sure that the AI can have its own account in a way and just work there. You're listening to the AI Native Dev, brought to you by Tessl.

Guy Podjarny: Hello everyone, welcome back to the AI Native Dev podcast. [00:01:00] Today, we're going to talk about AI and the web. And where is it headed and how to deal with that and to help us explore all of that world and more. We have Matt Biilmann, who is the CEO and co-founder of Netlify. Matt, welcome to the show. Thanks for coming on.

Matt Biilmann: Thank you. Thanks for having me.

Guy Podjarny: So Matt, we're going to talk a lot about web and AI, but looking at your background, I realized that your LinkedIn profile starts with two years of being a music editor. And so I'm going to start by asking you a totally different question, which is what's your view on AI and music generation.

You're a fan, you're pro, you're against.

Matt Biilmann: In all of the creative fields, right? I think it's just going to be a new tool in our tool books to be creative with. I think when people look at AI in music, a lot of them are thinking of the low-hanging fruit: oh, AI can generate music that sounds just like the music that we've been listening to for a while now. And that's cool. There's probably like a lot of fun use cases for that, right? But I think what's more interesting with these things is what creative artists are going to do with.

What you can do with [00:02:00] AI that was impossible to compose or come up with or put together before you had this capacity of maybe generating things with a level of complexity or a level of who knows, right? That you just couldn't do before. And again, I probably wouldn't be doing what I'm doing if I wasn't generally a fan of technology and the possibilities it opens up to just so many other changes in music from the invention of the musical notation technology, that really changed what people even thought was music before. It was always just tied to something that existed in a moment. Then the invention of the gramophone again completely changed what people even thought about as music because before that, like the actual sound of music was something ephemeral that only existed in the moment.

So suddenly just recording it really changed like completely what people thought of it, music. And again, very early on, you would just record. Classical Western music, right? I can put that out. But then so much of what people listen to and think of this is music to them today started because you could record music and you could spread it [00:03:00] as that, right?

And then digital recording came around and suddenly you had samples, right? And in beginning at that time, there was a lot of the similar thoughts that around like AI, right? Oh, it's like stealing other people's music. And so what is this going to mean?

Guy Podjarny: Music, drumming, then you can get electric guitar or a drum or whatever it is.

Matt Biilmann: And now I think it's just accepted that of course, like all of these digital things, like sampling and so on, are a part of what you can do with music. And I think we're in the same stage with AI right now. The first sort of gen AI for making music sounds mostly like a fairly bland version of music we've already heard before.

But I'm sure that a creative artist will come up with really interesting things to do with that technology that will probably again, change the music we listen to a lot. And that's going to be really fun to follow.

Guy Podjarny: Love that answer. And I'm totally a fan of that as well. And I think a lot about the dynamic music creation and what happens when AI doesn't just replace the human creator, but rather augments the human creator by being [00:04:00] able to create great things.

So I think anyways, I had to do it. Like I was surprised. I knew a lot about your background. I know the years of founding Netlify, we've known each other for quite a few years. But I didn't know that piece. So I have to start with that question, but let's maybe switch gears. It is, it does have Dev in the title of this podcast.

So we should probably dig a bit more into the dev piece. So there's a lot to talk about when it comes to AI and the web and website building and applications and so much more, maybe I'll start with just what sort of seems to be the most kind of eye candy type of functionality that seems around AI and the web.

And that's these sort of AI website builders: you tell it a description and a website will come up. You see players like Vercel and Wix and Shopify and a bunch of these others giving us versions. There's a gazillion startups that are offering this. How do you think about those? What's your opinion about these website builders that are AI generated versus not?

Matt Biilmann: Actually probably very similar to your question about music, right? Like I think again the first step is that we get a lot of these like [00:05:00] Gen AI builders that can make websites that look like all the other websites we've built before and make them really fast, but maybe also really fairly bland, right?

Guy Podjarny: Like the bad version of some pop song that might've existed.

Matt Biilmann: Precisely. And that's probably like one of the first use cases, and there are a lot of good use cases, both in music and here, right? Like maybe you need some music for your podcast and you can't afford to pay some artist royalties or something, but you can get something, right? And it can help you create, right? Like maybe you need to, like at the lower end of the market, there's always been this very rich landscape of Wix and Weebly and Squarespace and all of these like site builders over time, right?

That are really valuable for you want to open a pop up restaurant and so on, right? And you definitely don't have a team of web developers going and building something custom for you, right? The first use of the site builders will just be really useful in that space.

But then, similar to with music, I think the more interesting use is a more decomposed form of it, where it's not like the AI spitting out a whole site, but where it's more about how do [00:06:00] developers actually augment their own skillset with AI, right? And how does it allow us to build more and to build more complex things and build them faster, right?

On the one hand, when we think about that space for Netlify, right? Like we think of these AIs building for the web as a new kind of user of Netlify, right? That's no longer like a human developer, but an AI developer. A while ago, we launched a Netlify GPT that you can like use from anywhere within this sort of ChatGPT landscape to just like admin to Netlify.

And you can now publish whatever you did to the web. And just from that one, we are seeing more than a thousand websites getting published to Netlify every day.

Guy Podjarny: So these are like a thousand websites a day that get generated by AI.

And I guess those websites they're not your rock ballads that sort of stay for decades. They're a different flavor.

Matt Biilmann: No, they're definitely often more ephemeral or like you start some little prototype and you get it out or something, right? Or someone making a website for their girlfriend or whatever.

Like it's like a lot of that kind of stuff. And similarly, we have some of these more complex [00:07:00] site builders like bolt.new or GPT Engineer or Devin also using Netlify's API to make Netlify like the default way: if you ask it to build an app, it builds an app on Netlify and publishes it there. From those and others, there's a set of thousands of sites created every day now on Netlify, right?

So AIs are becoming a pretty big user base in a way that's really interesting, right? And that's one side of it, but then the other side of it, it's the whole landscape of tools that are more granular and a part of like sometimes connect as collaborators to developers or there's like tools like Cody or Cursor or Copilot and so on.

That all helps developers build faster and so on. And I think the higher up in the landscape of use cases and budgets and projects, the more you shift away from some

Guy Podjarny: magical prompter,

Matt Biilmann: that's like a button or make a prompt and it builds something for you that you have little control over and into like teams of developers [00:08:00] being augmented with all kinds of AI tooling. The potential of what they can do.

Guy Podjarny: Yeah. That makes sense to me. And. I think aligned with what I'm seeing, which is there's the bootstrap exercise and in their defense, like a lot of these tools also represent themselves that way of, this is basically the initial version of what would be used, I guess for that purpose, it's almost like a design and maybe content layout type problem. And it's a good point that you make that these builders also tackle the underlying, okay, but how do you actually make this thing live on the web and all that Netlify being a perfect example for that. So many of them, you see them deploying on the other side and I would bet like a large percentage of them probably resort to that because they look for the known patterns that are well documented, tried and true.

And also somewhat sufficiently opinionated that they're routed through some path that is well trodden, because there are fewer options around them. And so they probably have higher success with Netlify

Matt Biilmann: And you want them to set up something that doesn't require. Imagine they just built you like a [00:09:00] Kubernetes cluster on AWS, running your site.

And then we're like, good luck. You need to figure that out. That would probably also hold them back.

Guy Podjarny: I think that's a really interesting point, which is AI to an extent is the laziest developer. In some ways it is actually like a very invested developer that might, if you give it the instructions, it would not skip writing documentation or creating tests, or it would do them to varying levels of success, but it would do all those steps if it's structured because it's obedient but at the same time, it looks for things that are well trodden because at least if it wants to be useful, it seeks the sort of low-maintenance type of surroundings.

And so it makes the ease of use that devtools aspire to anyway, to be able to reach a lot of developers, actually very applicable to AI as well, because instructing AI or instructing LLMs is actually not that different than instructing developers en masse.

Matt Biilmann: Except that developers might not be as obedient.

They do follow the instructions that you tell them.

I was CTO in a company in Spain [00:10:00] where we built websites for small to medium businesses. We had a bunch of other things going on, but the key sort of money driver was building websites for small to medium businesses at a very large scale.

So at our peak, we were building more than a hundred websites a week, to give you a sense of scale, right? And had built this whole platform that did like the initial brief, assigned a freelance designer to the project, took you through iterations, got it onto a platform with content management and design tooling and everything, and took it live.

And so on. But a big part of that process when working with like SMBs or something that wanted a website would definitely be to not be obedient.

Guy Podjarny: And I guess the LLMs are tricky in their desire to please a little bit. I think you're hard pressed there. So these website builders today are good examples of just generating the repetitive song version of it.

But I guess for that purpose, would you say that they're being effective there, it sounds like they're definitely getting a lot of usage on the Netlify platform. So if we assume Netlify is a decent representation, [00:11:00] probably is all the number of these websites that get created, is that thousand number like quickly growing?

Was it a hundred a year ago and now it's a thousand?

Matt Biilmann: It's a growing number also as the number of tools plugging into us just goes up, right? So we're definitely seeing more and more throughput there, right? Like I still think again, that a lot of it is like very ephemeral and so on, right?

Like it's still not so much building real projects, building something that people might actually take over and work on and keep building on and so on. Like that's been pretty cool to see. And even from ChatGPT, right?

Like we've started to see like real conversion flows where people start there and then actually sign up for Netlify with corporate email and start doing things on the other side. So there's definitely some of the things there that are starting to work. My sense is again, it's a mixture of the very low end of the market.

And then also just stuff where you would never build something before, right? I have an example myself, right? Like with my 18-month-old daughter, when you sit with the laptop, [00:12:00] she'll come over and she'll say laptop, and she wants something to do with it.

And she likes looking at some of the letters and numbers that she recognizes and so on. I just asked Claude to build a little game for her where it just full screen doesn't do anything, but when you press a letter, it will show that letter with a fun animation, right? And now she'll come and like press letters and see them spin around and so on be excited.

And I definitely probably wouldn't have written a web app to do this, right? But it was easy enough to just ask one of these tools, like it built this app, right? People have a tendency to think that there's like constant number of this type of websites that the world needs.

And now AI is going to come and build this set of them. And that's just not how it works. Like you make it easier to do custom development and suddenly you start custom developing things that you would just never have spent that time on before.

Guy Podjarny: That would never have been created or automated.

Yeah I love that example and I hope your daughter enjoyed that but yeah, and I think as they get better, yeah, they don't need to be robust. They don't need to be comprehensive. They just need to be massively easy to [00:13:00] create. And then if they work often enough, they still need to work.

There needs to be, if you're always failing, you're not going to do it again. So maybe let's take a step back a little bit. We already talked a little bit about the fact that you have this new breed of users in Netlify, which is these AI users, but maybe taking it a level up. How do you think about AI in Netlify?

Matt Biilmann: I have these four categories to think about, like the first one is really like, how do we use AI internally at Netlify to be more effective and to build faster and better and two, how do we make Netlify product better by applying AI, right?

What are the opportunities that AI gives us to build a better product? And the third one is really like, how does our users and customers build AI powered applications with Netlify? And what does that change in terms of the core primitives we need to offer is the workflows we need to offer.

And then the fourth one is really this category of like how are AIs going to use Netlify and how can we make sure that Netlify is like the best deployment platform for AIs?

Guy Podjarny: I love that sort of [00:14:00] fourth one and we touched on it, but it's probably worthy of more information, but let's maybe go through the other three first.

Yeah. Yeah. I think the fourth one is it's probably going to be helpful to inform the conversation a little bit around it. So let's go through those. I love that breakdown of the four items. So the first one was, how do you use AI to make Netlify. I guess a better organization on it. We're talking a little bit more about development over here.

So there are examples of ways in which you use AI in development that you try it out and you realise that they do or they don't work

Matt Biilmann: Yeah, yeah, obviously we're using AI in many different parts, right? But we've also really tried to see with these builders and so on, how much code can AI write for us, right?

And I've tried to also internally push our developers like on, make sure you go try to augment yourself with AI because there are really powerful pieces there, right? Like I've spent time trying to build some Netlify features and just sit down and see with these different tools, how much help would they give me, right?

Like I made my own conclusions there, but also just really encouraged the team to [00:15:00] use them. And I think there's the individual developer tools choice, where a lot of them are using Copilot in different ways, right? But it's also like this integrated tooling, like just like ChatGPT and Claude in their own as tools and all of these faces of it, right? That's a little less ordained by us as a company and more like increase, like you've got to figure out what works and share with the team and then there's some of the tools we've evaluated that there are more programmatic things we can do to our code bases where you guys can be like code developers in different ways.

And we've tried different code generating systems. Some of the ones that have tried to tap into a GitHub issue and try to solve it and still find that

Guy Podjarny: automatically or something.

Matt Biilmann: Yeah. And I still find that, that level, like the models right now, still not at a place where that really makes sense, right?

Or where they can participate.

Guy Podjarny: They don't succeed at a reasonable level to be worthy of the noise.

Matt Biilmann: Like they're not a good enough developer that you really want them in your code base, right?

If you have a not that good developer and they write a ton of pull requests, [00:16:00] but you have to like thoroughly review all of them. And some of them might have big mistakes in them and so on. Then it's not necessarily a time saver, right?

Like it might just be a distraction.

Guy Podjarny: I love that phrasing of they're just not good enough developers that you want them in your code base. But for the coding assistants, you don't find, at least not en masse, that it's a problem that people are checking in inferior code compared to what they did before.

Like you feel like your developers are managing to supervise it sufficiently. So you don't see this as a repeating pattern.

Matt Biilmann: Yeah. And I think also because we have a strong code review culture internally, right? So I think the peer accountability there is pretty strong, right? If you suddenly start making a bunch of bad pull requests, like your peers reviewing, why are you doing this?

As long as you have strong engineering culture there, I think good developers will not suddenly become bad developers. Like they'll just get more productive developers, right? So that's very much something I'm encouraging

Guy Podjarny: I love that accountability piece of it because fundamentally, the more guardrails you have, the faster you can run because you [00:17:00] have the confidence that you're not going to slip.

And that also incentivizes the behavior that the developers will actually have. You have two effects here. One is culturally the developers know that their code will be reviewed. And so maybe they themselves are more responsible. But then two is you actually have the code review process to identify it.

So I love how sometimes the use of AI highlights the importance of practices we've already perceived to be good practices. So it's a great one

Matt Biilmann: And I think it's going to be an interesting piece that I think it can make a big difference to a good developer, how good they become at AI.

And that will be a pretty big differentiator, I think in a developer, right? Because I think it's very much something you need to learn how to use and apply for it to not be like autocomplete of GitHub is often, I both see it in myself and I hear from a lot of like more senior developers that they turn that off because it's just I don't know, again, it's like programming with a developer that's several levels below you.

And it's like constantly okay, let's do this differently. This and it's like [00:18:00] noisy and so on. But on the other hand, the sort of intentional prompting for code can really speed up some of the like things that would otherwise take really long. Like when I first was doing this, I was trying to see.

Could I get GPT-4 to do some code changes on our plugin for Apache Traffic Server, which is like a big C++ open source code base, right? And anything more like 'can you implement this' or something, it would just hallucinate all kinds of APIs that didn't exist and so on, right? Like it was completely useless.

But on the other hand, telling it something like: I need a function that takes a header object and then extracts these specific values, and then compares like the Accept-Language string, splits it up into a format that you can work with and then compares it with the actual, like, setting of a redirect rule and then returns a conclusion. When I could give it a real, like, take this input, do this thing and give this output, right? Like it would just completely correctly write that function in 30 seconds. And if anyone has been doing [00:19:00] that in C++, right? Like that level of string comparison and manipulation. And so that would have taken me a long time to learn.

Guy Podjarny: To do this,

Matt Biilmann: Just remember all of those like standard library functions and figure out the right way to do it in a memory safe way. And so I'm right, I can just wrote it in one go. So to me, there's always if you learn where you can save a ton of time and where you can waste time and you become better and better at mastering that tool set.

I think that will make a really big difference in what we see as a really strong developer in the future.

Guy Podjarny: Yeah. Yeah, I fully agree. And it's interesting to think about these things as specifications versus as prompts. Also, I guess these are still prompts and it's about prompt engineering to try and see how do you guide the system. On one hand you expect the LLMs to get better at needing less instruction and describing what you want, but what people forget or underappreciate is that you do need to tell the LLM what it is that you want to do.

It can't read your mind. [00:20:00] If you just say compare the Accept-Language header to one of the Western languages, maybe it defines a Western language in a different way to the one that you want. So sometimes you do need to get a little bit more specific, as a human might, as one of your developers on the team, if you're giving the instructions. And you also mentioned, though, that you make good use of Codemod, which I found interesting.

Do you want to tell us a little bit about that?

Matt Biilmann: Yeah. Like we've done evaluations of a bunch of different tools. And the one we ended up working with is codemod.com; we're especially using it to do codebase migrations. So in our main app, for example, app.netlify.com, a traditional React app started a long time ago, right? Like back in the start of 2016 or so, right? Like one of the components that was pretty out of date was React Router. And React Router has changed a lot from like the early days to now. And those changes go pretty deep through the whole code base.

And at the same time, like being stuck on an old version was holding back [00:21:00] a lot of other upgrades of React dependencies and so on. So that was one of the big projects that we use.

Guy Podjarny: Codemod is, defined by itself, a deterministic manipulation of code.

Yeah.

Matt Biilmann: So Codemod, like

Guy Podjarny: To go from version one to version two.

Matt Biilmann: The idea with the tool is just taking the basic idea that you can write codemods that are not find-and-replace in text, right? But that look at the abstract syntax tree of your code and then can apply transformations to that abstract syntax tree, right?

And a lot of these kinds of migrations can then be expressed as a set of codemods, right? Because again, if you try to do it with search and replace, that's just not going to work, right? But there is a set of like codemods that can express how you go from how the router worked in this place to how it's supposed to work in this place, right?

Now the challenge is that writing like transformations of abstract syntax trees, that's not super [00:22:00] trivial.

Guy Podjarny: Not anything you can just come along and start trying to do, those things.

Matt Biilmann: And so that's where Codemod is really helping us get the AI to define and iterate those Codemods.

So once those are in place, then based on that, their system can essentially apply that throughout the code base and make some pretty gnarly big pull requests that we did have to run a lot of testing on to make sure it works and so on. But that has actually really allowed us to run through that whole series of updates to React Router, and then the React versions that were behind because of that dependency and so on. So that's actually, in terms of lines of code, some pretty massive contributions to our code base that have come in that path through AI, but with a lot of human-in-the-loop style.

Guy Podjarny: Very attentive process of it. Yeah. I like that analogy because I do think there is a pattern that you see in places in which, if you ask AI to generate something that is a more constrained space, it can do a better job. Actually in the world of web, there's [00:23:00] also Wasp, which introduced this thing called Mage, which is a GPT generator.

Actually, you just described Netlify GPT, and so if you're giving the LLMs instructions for something that is a much smaller, well-defined space, it might still be complicated, but there are fewer options around it. There aren't 17,000 ways to do a loop. Things are much more contained, so it generally can do a pretty good job.

And so this pattern of having the LLM generate something in that confined space, because that's really what you want, and then applying that to code, seems to be working. I've seen it in the DevOps space, we've seen it with Wasp. So it's really interesting to see how it applies to Codemod.

Is this something you built in house, or is this Codemod? Codemod itself is an open source project, I think originally created at Facebook, back before it was Meta.

Matt Biilmann: Yeah. Yeah, precisely. Alex was a PM on that product at Facebook, on that project there, right? Like he built codemod.com around it and has both like this registry of open source codemods that [00:24:00] you can apply if you want. Like they're starting to work with teams like the React team or the Remix team, so that when they launch a new version, they have a set of codemods that you can just go apply to your code base and upgrade it.

And that's really interesting, right? Full disclosure, I'm also a small investor now in, the company, but also because I think it's really interesting because those kind of version upgrades over time, it's a challenge of our whole ecosystem, right?

Like that all of these versions keeps changing and either you can waste a lot of developer time, like chasing them. And on the other hand, if you don't do that, suddenly you run into these moments where you're now blocked because you're like pretty far behind and getting really up to date is now like some massive lift, right? But you might not be able to add like a new dependency or something before you've lifted up those old dependencies. So I think if we can build tooling that can really help like systematically helping on that, like just dependency migration problem, that's really valuable, right?

Guy Podjarny: Like that's really a lot of good insights about using AI [00:25:00] to make Netlify's own development more productive. I guess the next point you made was around the notion of using AI to make Netlify's products better. Maybe actually before we go there, for the users who are not as familiar with Netlify, can you give us just a couple of minutes on what Netlify is and what you have, just to contextualize the AI use in it.

Matt Biilmann: Yeah. So Netlify is a web development platform really based on the original insight that if you really decoupled your architecture, so you thought about all of your backend business logic and APIs and systems just as a set of APIs, and then thought of the actual web user experience as like a self-standing application.

Then we could really deliver like a unified platform that takes care of all the automation and operation that you need to make sure that the code your developers write gets in front of actual customers on a domain in production as fast as possible. And with high performance and high rate, right?

So we tightly integrate all the release [00:26:00] management, the CI/CD, like setting up staging environments for every single pull request and giving you collaborative tools on top of that, even giving you tools for non-developers to be part of the process and get feedback into pull requests or do visual editing, even on top of the website and so on, with the actual operation of the infrastructure, right? So we operate a large global network with everything you would expect there, like web application firewall, DDoS mitigation. We have like edge compute and serverless compute, both for sort of the edge rendering you might need for modern frameworks or for personalization, and for all the back end for front end you will need to actually talk to all of these different APIs and build on top of them. Really just with the goal of helping web teams avoid spending their time on all the automation and operation that sits after they write their code and content, and focus on like actually creating the code and content [00:27:00] that makes a difference.

And then we'll take it from there and help them ship faster and get better results.

Guy Podjarny: And I think a very powerful platform and I think one of its distinctions as well compared to maybe other players in the space, which is it is quite, you already used, I think the words composable earlier on, which is it's opinionated.

It has those paths and it offloads them. But you can say, I'm going to use this piece of the platform over here. I'm going to plug into something else. And here there might be like seven different ways in which I publish the content or I write the content, or five CMSs that I integrate with, the different apps, and you create those.

So the composability, the modularity of the system is a core strength of it, right? It's relevant, I think, to some of the AI features that you've been discussing. So thanks for that sort of primer. I guess, indeed, take us into that world. So how do you use AI to make Netlify's products better?

Matt Biilmann: Yeah, so our philosophy is really that we don't go into the space of using AI to help write your code or write your content or anything like that, right? Like in our world, that's what is like outside the wall of Netlify, right?

But we help [00:28:00] you take that stuff live faster. So we're constantly thinking about like, how can AI help on those, reduce the friction from when you write the code or when you write the content to having it on a URL in production. And one of the first sort of like public features we launched there explicitly was the 'Why did it fail?' button, just based on the fact that one of the most common frictions is: you've written some code, now you push it to a pull request on GitHub. We run the build for that code, but instead of going live, the build fails. And now you're like, ah, it failed. I got to jump into Netlify. I got to look at all the logs.

I got to try to figure out like what happened and what should I do about it, right? So there we put a button in the UI that you can click that says, why did it fail? And when you click that, we'll take the context of the build and the build log and so on, pass it through an LLM, and try to answer like, why did this fail?

And also give you a proposed solution: what's the reason, what can you do?

Guy Podjarny: Very [00:29:00] practical, kind of sifting through data, but with a critical eye that now LLM is enabled to understand what happened and saving me a lot of time in the process.

Matt Biilmann: Yeah. And we launched that sometime in May this year.

And since then we've had more than 150,000 build debugs by that button. That's 25,000 developer hours saved since then or something.

Guy Podjarny: And do you know like the accuracy rate of it? Of those 150,000 times, like how often did it get it right versus wrong?

Matt Biilmann: I wish I had better numbers there, right? Like it's a little harder for us to instrument, even to know, again, because it's not like we also have a button to say implement the fix, right? Like it is still just giving you the advice on how to fix it and why it failed. So there's not like a surefire way to measure whether they take that advice or they do something else.

Guy Podjarny: Yeah, but these are developers. So if you get it wrong very often, they will tell you.

Matt Biilmann: They will tell us, and quite the contrary, we've even seen that internally: just when we started it in private beta, our own team started using it a lot.

As with all of these, it definitely sometimes gets it wrong. But there's also just a lot of [00:30:00] times where it just saves you.

Guy Podjarny: Yeah.

Matt Biilmann: Like in some cases it might even be something that's fairly obvious.

But you get like a large build output log with error messages and so on. And you've got to just situate yourself and figure it out. And you just click that button and it just immediately identified it. It's like a very nice feeling. Yeah.

Guy Podjarny: Yeah. Yeah. So that's a lot of time saved.

So that one is like a very concrete, immediate practicality of it. Yeah. And it feels very applicable to other systems, those buildings those systems around.

Matt Biilmann: Precisely. And that's also like our core philosophy of how we think about AI in our tool set, right? Like ideally not writing your code, like you can do that other places. But ideally we should have everything wired up so we can have these do things that don't add complexity to your workflow, but just like speed up how you work. Yeah. And the next feature we just launched here at Compose, that's in private beta now, is what we call AI-assisted publishing.

But ideally we should have everything wired up so we can have these do things that just that doesn't add complexity to your workflow, but just like speeds up how you work. Yeah. And the next feature we just launched here at Compose that's in private beta now, it's what we call AI assisted publishing.

And it ties into, we are obviously like very much a developer-first tool, right? And we're really meant for teams that want to apply developers to build web experiences. [00:31:00] But we've always had this vision that like those developers have a set of stakeholders that are not developers, that I call part of that web team, right?

The marketeers that write the actual like copy and content and messaging that needs to go live. Designers that work with them on the assets and so on. And we only really succeed for our customers if we can help those developers help the whole team, right? And we can make the whole path faster.

And when we look at a lot of the web projects, you will have code and templates, and then typically you'll always have some content infrastructure behind it, right? Like typically like headless CMSs. Maybe sometimes you have a couple: you have a headless CMS, you have an e-commerce platform.

And sometimes you have a legacy CMS and like a modern headless CMS and

Guy Podjarny: All sorts of combinations. It's like technology never dies. You layer on something new, but you never actually remove the old.

Matt Biilmann: But the key is that for developers. To make effective use of content, they need that content to be in a [00:32:00] structured variable format.

And that's like ideally some modern headless CMS with great content, right? Like now in our experience, even internally at Netlify, none of the creatives and marketeers and so on ever do that creative work in structured content. They always seem to work in, at Netlify, they work in Google Docs and Notion, right?

Anything, really. Even developers, right? If one of our developers writes a blog post for DevHop, they don't write it in

Guy Podjarny: yeah, it's hard to write. You don't want to write it in a form. You want to write it in some free form.

Matt Biilmann: Just want to write it free form. So it's always just like, Notion is typically where the developer ones get written and the marketeers typically write theirs in Google Docs.

Like, that's just the split there. Similar to how one of the early things we did at Netlify was like realizing that developers have this set of tools, their code editor, their local development environment, and GitHub, where they do the actual work. And then there's all this tooling after that for getting it out into the world. We early on realized [00:33:00] we could just plug into GitHub and just make sure that every time the developer pushed to GitHub, we run a build and we give them a URL where they can see how it looks.

We started thinking about what would it look like to expand this to the stakeholders that work in Google Docs and Notion and so on, right? This is kind of the early path to that, where we see: if you want to publish a new page, whether it's like a simple blog post or whether it's like a product page that might have a whole set of complex templates and so on, instead of like first writing all of that up in Google Docs and then typically going and asking like a developer or some semi-technical person who's just like doing CMS entry or something:

Go put this into the CMS. We can get the LLM to do that for you. Like just go to Netlify's visual editing layer and just say, start a new page from this document, right? And then we use the LLM to take that unstructured content and get it into the structured format.

We already sit in this space where we can run the full development, since we're the rendering [00:34:00] infrastructure and so on, right? I can run the preview for you and show how it actually looks like. And we can then give you the visual editing on top of that to make tweaks to anything like the LLM might have gotten wrong or you want to change and so on.

Like straight on the page. And again just really aim to like can we take that process? It sometimes even takes a couple of days, not like of intense work, but you've got to find someone that has time to put it into the CMS and then they got to set it up and then they send you back a link and then you have to find the time to go verify that link.

And then there's some iterations and so on. And we shorten that to just can we get that down to just like a 15 minute session and you're live. I think that's what we're working on there.

Guy Podjarny: Yeah, really like that. And I guess I think of it as like an entry point to the development and platform. So unstructured is basically the prelude; to become code, it needs structure.

Can I simplify that? And I think it's aligned with market trends that we're seeing with the whiteboarding solutions, with these others in which you can describe something in loose form and evolve it into something more rigid, more structured. Great [00:35:00] uses internally, great uses in the product.

Thanks for describing all of those. We probably should talk a tiny bit about how do customers build on AI? Although here in the podcast, we're mostly focusing on how do you develop applications with AI versus on top of AI, but maybe use a couple of words on that. And then a couple of words on that AI user that we talked about before, mostly around what is it that you do to improve, acknowledging this as a user, what is it that you do?

Matt Biilmann: Yeah. So in terms of, obviously, like developers building with Netlify: as AI just becomes a more and more key building block of every user experience we build, it will be like more and more integral to all developers that a lot of the content or data or whatever they need might come from AI systems, right? Like for AI workflows. And so of course we're thinking a lot about what that means in this kind of Jamstack architecture with like front end and back end for the front end and so on, what are the new primitives that we might have to build or how might our primitives need to evolve to [00:36:00] keep making sure that building on top of these AIs is a really great experience, right? And there are things we've already done: like, originally all of our Netlify functions were totally synchronous, right? Like you would send a request to the function and then, when it was done, it would send your response, right? Like it became really important to add streaming responses to functions, because if you build on top of GPT-4 or something, you're not going to get like an instant response, you're going to get some stream of tokens that takes long enough to generate that you want to stream it back to the user.

Guy Podjarny: Yeah, the experience is terrible if you don't stream it, in the world of LLMs.

Matt Biilmann: Totally, right. And then you also start seeing these more complex patterns where it becomes more like agents and assistants, where it becomes less of okay, send one request to an LLM and stream it back.

But where it becomes more of these types of workflows, right? Get some stuff from the user. Now you got to trigger an initial prompt with an LLM that might give you like a planning stage or [00:37:00] something like that. And then you might even need to spin up several different like processes to do those different steps in parallel.

And once all of those are done you can update the end result. But meanwhile, as they go, you want to show updates to the user and so on. So we're thinking a lot about how does that change what the code you write for that looks like and how we can make it run.

And at Compose, when we built the AI assisted publishing feature we ran into a lot of that ourselves, right? And since we're building it on Netlify, one of our principal engineers built this whole sort of workflow engine on top of our normal serverless primitives of scheduled functions and blob storage and serverless compute and so on.

So we also packaged that up as an extension for asynchronous workloads to again, make it really simple for our users to write those kinds of workloads when they're dealing with LLMs. And I think that's another way for that we're just constantly thinking about, right? For where we sit in the space what are the tools that look different, that might not [00:38:00] have been that important before, but that's becoming really important once you're building these type of experiences

Guy Podjarny: And it fits into the future is already here, but not evenly distributed.

It's, none of these are like entirely new capabilities. These are all things that existed, but they went from being an edge case to being a common case. And so you have to support them a bit more inherently. And I guess it's interesting to think about the notion of an LLM interaction on your site being maybe as prevalent as a database call.

Matt Biilmann: I'm looking at a lot of things like local first architectures for building web apps again, because those are more of an edge case now for if you're building like multiplayer experiences and so on.

But again, it's suddenly much more normal that like in all of these workflows and so on. You have these agents going off and doing stuff then almost every web experience we build are going to be a multiplayer experience. The other players are just not humans but these agents doing stuff, right?

But architecturally, it means that suddenly a lot of those things that were like an edge case in the past could quickly suddenly become a [00:39:00] really common I need to have this.

Guy Podjarny: Yeah. Very interesting to think about how these use cases become as prevalent as querying data or how do they become common enough in applications that are not deemed complex, the application in general, but they now have these LLMs.

So on three of the four, there's a lot to say: around the use of AI to accelerate development at Netlify, around how you use AI to make your own products better, and around how you build AI into Netlify applications. I'm interested in that fourth bit, which frankly I didn't think about that deeply until we started having the conversation ahead of this episode, which is all these AI users you mentioned that are using Netlify.

So we already described a bunch of these tools, a bunch of these builders that already use Netlify by default. What is it that you can do, as Netlify, to help promote that? Generally, it's a good thing. You want them to be able to successfully use Netlify. You definitely want them to do that by default.

If you're building a tool that you want AI to use, which is not the same as building an AI tool, what can you do? Have [00:40:00] you tried things that you've already seen work better than others?

Matt Biilmann: When I built even the predecessor to Netlify, called BitBalloon, that was the first step into this, how do you build these decoupled front ends. The first tagline way back then, at the end of 2013 or something, was something like a front end for the programmable web.

Great. That got replaced pretty quickly and so on. But there was this idea that, outside of users building with Netlify, Netlify is also in itself kind of an API with a CLI and a tool set, right? With an open API spec and so on. Our app.netlify.com is mainly just a client-side client on top of that, our CLI is another client on top of that, right? That kind of now just turns out to be a really useful thing for AI models wanting to interact with our product, right? There's a clear: this is how you should use this as an AI. The other path has been this flow.

We launched it a long time ago, but it's suddenly [00:41:00] starting to become more relevant. That's the deploy now, claim later flow, and you'll see that flow very clearly if you try bolt.new, right? You go in and ask it to build an app and you say deploy, and it'll just deploy to Netlify; it won't ask you as an end user at first.

Like, "you need to go authenticate with Netlify" and so on. It'll just deploy it there, but it will give you a link that says, if you actually want to do something with this, you can click through and you can then claim the site, and it'll get added to your Netlify account.
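
For a sense of what that programmatic surface looks like from a tool's side, a sketch like this could create and deploy a generated site entirely over Netlify's HTTP API; the zip-deploy endpoints are the publicly documented ones, but the token handling and the claim-link step are simplified assumptions here, not the exact flow tools like bolt.new use.

```typescript
// Sketch: how an AI builder tool might deploy a generated site via Netlify's
// HTTP API. Assumes NETLIFY_TOKEN holds an access token the tool controls and
// siteZip is the generated site bundled as a zip (Uint8Array).
const API = "https://api.netlify.com/api/v1";
const headers = { Authorization: `Bearer ${process.env.NETLIFY_TOKEN}` };

export async function deployGeneratedSite(siteZip: Uint8Array): Promise<string> {
  // 1. Create a site under the tool's own account; the end user isn't involved yet.
  const siteRes = await fetch(`${API}/sites`, { method: "POST", headers });
  const site = await siteRes.json();

  // 2. Deploy the zipped build output to that site.
  await fetch(`${API}/sites/${site.id}/deploys`, {
    method: "POST",
    headers: { ...headers, "Content-Type": "application/zip" },
    body: siteZip,
  });

  // 3. Hand back the live URL. A "claim this site" link, which transfers the
  //    site into the end user's own Netlify account, would be produced
  //    separately; the exact claim mechanics are not shown here.
  return site.ssl_url ?? site.url;
}
```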

Guy Podjarny: So removing the friction of saying, do I have a Netlify account or don't I?

Matt Biilmann: Precisely, right? Removing the friction so these AI tools can just be the user first, and then potentially hand what they build over to a human. When you're building these tools, you want people to just start and play around. You don't want them to first log into a whole bunch of different services before they can do something, right? So anything we can do to reduce that big friction and make sure that the AI can have its own account, in a way, and just work there and then give it [00:42:00] to a user when it's done. We're almost thinking a little bit about some of these in a similar way to how we think about how freelancers and agencies need to use our platform and what we need to build for that, right? These AIs are a new kind of team player, right? And we've got to think through what that means for the flows of how you build with our app, right?

So there's both that programmatic aspect, obviously: if you don't have a clear CLI- or API-driven flow that is easy for these AIs to tap into, then they can't do much, right? But then there's also this human aspect of the flow, right?

Guy Podjarny: What does the registration look like?

They're not just human. So it sounds like great docs, which I guess you minimally alluded to, it's well documented to begin with, very programmable, with good examples. And then subsequently zero friction on registration, which actually feels very important for LLMs, because I think for a user, their ownership of their email to register for different accounts [00:43:00] is still friction, but it's relatively low friction.

But I can definitely see how for an AI system, the fact that it needs to register, it has no email, it has no "on behalf of this user," so the registration is a big deal and removing that is significant. All of those are great things, and I can see how they make AIs successful as users of Netlify. Are there things you've tried to change after you learned that AIs are using your system? Because all of those are things where you, I don't want to say got lucky, you planned well, or you designed yourselves correctly for the programmable web. Are there things you've tried since then?

Matt Biilmann: I think there are more things that we're still trying to work through for these different providers, right? For example, with ChatGPT, one of the flows we use when you publish there is that if you don't claim the site, we treat it as temporary and it's deleted within an hour or something like that, right? That keeps it contained, but then there are a lot of these systems where people still don't want to claim, but want to make it shareable and so on.

So [00:44:00] they don't want it to go away. And once we have that, then we also need to start working around all the bad things that come from that: what happens when someone jumps into one of these LLMs and asks it to make something that looks like chase.com and asks people to put in their credit card, right?

And how do we work on notifying the service that has the AI that their user is doing bad things with us, and so on. So again, there are a lot of these problems that are not straight-up technical limitations. But then I think other pieces of it are more on the composable side, right?

Because then, again, when these tools want to build with more than one thing, the simpler examples now are just pure front-end apps that don't do anything more, right? But when they also want both the front end and a database somewhere else, these small plug-together systems where multiple partners come together, there I think there are some things to think through that are almost more like, how can we start [00:45:00] standardizing on some of these patterns across providers, right? What are other ways we can use the open web to make it clear what we allow an AI to do and how we can be used by AIs as tools, and other ways we can make standards for how AIs can compose those tools together? I think that's the next frontier in a lot of this, and it will probably happen first in proprietary walled gardens, right?

Like, where inside you can name all the different pieces and they can all be tied together. But if this is going to be a really important way of building and starting and so on, it's going to be really important that we find ways centered around the open web.

To solve these problems equally, ideally, as it probably always is, right? In forms that are probably a little more clunky than in one walled garden, but that are also fundamentally really open and allow anyone to publish and participate in this composable [00:46:00] economy of tools that AIs can plug together and build with.

Guy Podjarny: Yeah, and connect. So that's a great lead into maybe using the last ten minutes or so of our conversation to talk about where this is headed. These have been great insights on how significantly AI interweaves into everything that you do.

I think we would both agree that where we're headed, this is just getting started. This is just the beginning. So maybe take out your crystal ball: when you look at your world, this world of the web, maybe specifically the JAMstack, the composable web, in five years' time, in ten years' time, maybe even further, what would you say are the primary changes that you are anticipating?

Matt Biilmann: The reason people start thinking about composable architecture, in opposition to buying some monolithic tool, here's a monolith for building your corporate .com, or here's a monolith for building your e-commerce, and so on, the reason people start thinking about [00:47:00] composable as a term and a category that they want to adopt is, on the one hand, to get flexibility in the experiences they can build, right?

Like, you want to be able to apply custom development to build some custom experience. But you want to be able to do it with fast time to market and without building the parts of the experience that are not unique to you, right? So if you're a high-end luxury brand, you want to build a very tailored web experience for your e-commerce that represents that.

But you don't necessarily want to build the e-commerce itself, right? You don't want to build the content storage for that. So you want to be able to take reusable components, put them together, and then build really fast on top of them. And you want to know that those different components you pick, you can evolve them independently as the world evolves.

And as different alternatives come along, where in the traditional monolith you got stuck, maybe some areas were best in class, but then there were a lot of areas where it sucked and you couldn't do anything about that. So that's what motivates that shift. And I think the big market trends around [00:48:00] AI will accelerate the need for going composable, right? And that's the first reason: a lot of things are hard to predict around AI, but what's really easy to predict is that it will accelerate everything, right? You'll get more content, you'll get more code, you'll get more assets.

Guy Podjarny: A natural reaction of reducing, as we've touched on before, the barrier to entry, the barrier of effort to create. So more will be created.

Matt Biilmann: More of everything, right? And that means faster-evolving tool sets. And again, that just puts more pressure on sitting on some large monolith.

It also puts more pressure on the rollout processes that come when your UI and your backend are tied together and everything might be more risky to roll out, right? So you need systems that can deliver very high throughput and very high flexibility and so on, right?

But then the other, more major thing that I'm thinking through is really what I would call UI 2.0, right? [00:49:00] Where right now, all of that acceleration from AI, we're just using it to build the same user experiences that we built over the last decades, faster, right? So we're not really doing anything massively new, but we're making the code and the assets and the designs and everything faster, right?

And that feels like just the first step, even tying back to that opening question on music, right? It's the same, right? We'll just make the same pop songs, but faster, right?

Guy Podjarny: Yeah. And that itself is a change, because as we just said, it would create more content in places where it couldn't be created before, just like it can create music in more places, maybe more customization and such, a faster pace of change.

But it's still not new creations. It's just a faster-moving beast, which once again is another compounding. You could say DevOps has done that, and cloud has done that, and Java has done that, and JavaScript has done that. All these things keep compounding on the pace of development, and in this case content creation as well.

Matt Biilmann: Totally. So the pace [00:50:00] goes up, but I think if we look broader out into the next five to ten years, then we'll see much more radical change, right? In the sense that gradually we'll start figuring out what kind of user experiences we can build, especially when the core constraint that we've had on all human-computer interfaces since the first one goes away: that computers can't just understand intent.

You need very transactional and predictable interactions. You click a button, the computer does a predictable thing. You type a command, the computer does what the command says, right? That constraint is just seeping away, right?

Intuitively, it can actually understand intent and go do a bunch of things, right? And the only way we've really exploited that yet is in chatbot interfaces, right? But that feels very primitive, right? And I think that's the part that's the most impossible to predict, how it will unfold.

But I think it's easy to predict that it will unfold, right? That we'll start seeing people come up with these [00:51:00] patterns of user experiences that you could never have built without AI, that were just not possible, right? And we'll see that even changing, possibly, the landscape of the devices and the hardware we use to interact with computers, as those possibilities change, right?

And that will be really fundamental. That, I think, will be as big a change to what you're supposed to be building on the web as mobile was, possibly even bigger. And similar to every other platform change, a lot of the actual stuff you will need to build is impossible to predict; it was impossible to predict, when the iPhone came out, what the things were that people would be building as apps five years later, right?

But in every one of those platform changes, there are also some things that are very predictable, right? And one of them is just this tension between closed platforms and open platforms, right? Where obviously large players will have a strong interest in this re-platforming being onto proprietary platforms that they own and control, ending in walled gardens that they can [00:52:00] monetize, and then a lot of us will have to try to fight for an open web alternative to that, one that can still deliver the kind of experiences that users will suddenly start to expect, right?

So that whole cycle that's happened with mobile and with social media and with all of those, I think it's just about to start really playing out again. I think that's going to be one of the key areas where we as Netlify are really thinking about, for developers, how can we keep making the web the right platform to innovate and build these new experiences on?

And how can we keep the kinds of qualities the web has around a creative chaos, where there's no gatekeeper and you can just buy a domain and build whatever you want on that domain and put it in front of people, right? How can we make those strengths really compelling, so we keep this open, shared infrastructure the best platform to build on?

Guy Podjarny: Yeah, no, that's a really astute observation. And I agree that it's a risk that we have to deal with. And I guess we're seeing it on all [00:53:00] layers of the LLM stack, all the way from the bottom, the foundation models themselves versus open weights versus the few that are properly open source, but then also further up, indeed, with, if you were to build an application, even these autonomous engineers and such. They might be very compelling, but what does it mean for the development methodology over here? For most of them, we don't know the answer because they're still behind a wall.

They're not publicly available, but the sentiment is that those are, hey, you're going to commit to this platform. It's going to be more like the Microsoft of old, or, I don't know, a Borland-type environment. It wouldn't necessarily be the open Java, or subsequently JavaScript and the open web.

So it's interesting, because I think there's a good chance that in a closed system you can produce innovations faster and create things that are cohesive faster. That will probably happen, and it will have an advantage, but we just need to not accept that as the destination, but rather as a stepping stone.

Matt, this has been an excellent conversation, longer than most of our conversations, because there's just so much to say. Thanks [00:54:00] again for coming in and sharing these observations, and exciting stuff on the Netlify front especially. I really love how the investments in appealing to a broad developer base have drawn in this new coder, this new creature, this AI, but there was a lot of other guidance as well. So thanks for coming on to the show.

Matt Biilmann: Thanks so much for having me. This was a lot of fun, really great conversation.

Guy Podjarny: And thanks everybody for tuning in and I hope you join us for the next one.

Thanks for tuning in. Join us next time on the AI Native Dev, brought to you by Tessl.

Podcast theme music by Transistor.fm.