Sept. 4, 2024

#383 Building the Operating System for AI: Rob Futrick on Anaconda’s Role in the AI Revolution


In this episode of The CTO Show with Mehmet, we are joined by Rob Futrick, the CTO of Anaconda, for an in-depth discussion on how Anaconda is positioning itself at the forefront of the AI revolution. Rob shares his impressive journey in the tech industry, from his first startup over 25 years ago to his time at Microsoft, where he led product teams for high-performance computing and AI software. He now leads engineering at Anaconda, a company with a strong legacy in Python and data science, and is excited about its future role in AI.

 

Rob provides insights into why Anaconda is now positioning itself as the “Operating System for AI,” emphasizing the company’s commitment to democratizing access to AI technologies, particularly through open-source frameworks. He explains how Anaconda’s focus on empowering developers and data scientists has evolved from simplifying Python environment management to supporting AI and machine learning solutions. Rob highlights the importance of community-driven innovation, particularly in the open-source space, and how Anaconda has been a key player in supporting and sustaining that community.

 

We also dive into the challenges of building AI solutions at scale, particularly for enterprises. Rob discusses how Anaconda helps enterprises integrate AI solutions, providing tools for deploying large language models (LLMs) both locally and in the cloud. He also shares his views on the future of AI, touching on the importance of AI agents, the continued role of Python, and emerging trends that excite him, such as the potential for smaller, more specialized AI models and the increasing demand for running models locally.

 

Throughout the conversation, Rob emphasizes Anaconda’s dedication to innovation, both in terms of its contributions to the open-source community and its ongoing development of tools to improve Python’s performance in AI. He also shares his thoughts on the balance between open-source and closed-source AI models and the importance of fostering a collaborative ecosystem that drives rapid progress.

 

More about Rob:

 

Rob Futrick is the Chief Technology Officer at Anaconda, where he leads the company’s technology strategy, vision, and development and delivery of Anaconda’s cutting-edge AI + data science platform technologies. With over 25 years of software engineering and product development experience, Rob has a proven track record of building and scaling engineering teams and complex software systems for some of the world’s leading companies.

 

Prior to joining Anaconda, Rob was co-founder and CTO of Cycle Computing, where he drove the company’s technology and product vision with a particular focus on enabling highly scalable cloud and hybrid HPC solutions. Cycle was acquired by Microsoft in 2017, where he owned product management for Azure’s HPC + AI Software platform and led teams responsible for building and delivering key services. 

 

Passionate about entrepreneurship, Rob is involved in several initiatives to mentor young tech professionals and encourage diversity in STEM fields, including Cornell University’s eLab program. Rob attended Cornell as a Computer Science major.

 

https://www.anaconda.com/about-us/leadership/rob-futrick

 

 

00:00 Welcome and Guest Introduction

01:09 Rob Futrick's Career Journey

02:22 Anaconda's Evolution and Community Commitment

06:15 The Role of Anaconda in AI and Open Source

10:06 Benefits of Open Source vs. Closed Source AI Models

14:51 Ensuring Quality and Security in Open Source Models

18:43 Local vs. Cloud Deployment of AI Models

25:39 Enterprise Adoption of AI Solutions

26:56 Deploying LLMs Locally with Anaconda

28:13 Measuring ROI in AI Strategies

28:55 The Importance of Clear Objectives and Measurement

31:06 Anaconda's Internal Use of LLMs

33:01 Python's Dominance in AI

39:07 Emerging Trends in AI and Machine Learning

48:49 Final Thoughts and Future Prospects

Transcript

[00:00:00]

 

Mehmet: Hello and welcome back to a new episode of The CTO Show with Mehmet. Today I'm very pleased to have joining me the CTO of Anaconda, Rob Futrick. Rob, thank you very much for being with me here on the show today. The way I love to do it, as I was explaining before we started, is I keep it to my guests [00:01:00] to, you know, tell us a little bit more about yourself and your background, and then we can take it up from there.

 

Rob: Sure thing.

 

Rob: Thank you so much for having me. I really appreciate it; it's an honor to be on your podcast. A little bit about me: I've actually been in tech for quite a long time. My very first startup kicked off 25 years ago, in the late nineties. Most of my career has focused on enterprise software and cloud infrastructure.

 

Rob: I spent quite a long time in the high performance computing space, actually. I previously had a startup called Cycle Computing. We were acquired by Microsoft back in 2017, and I went on to run a bunch of their high performance computing and AI software product teams in Azure before leaving to join Anaconda.

 

Rob: So I've been at Anaconda for a bit now, but I've actually been involved with the company, as most of us in this world have been, for a lot longer, either by using their products and services, or, actually, I even knew the founders before they started Anaconda. Back then [00:02:00] it was Continuum Analytics, for those of us who've been in the space long enough to remember that.

 

Rob: But yes, now I'm here and I run engineering at Anaconda, and I'm particularly excited by the evolution of the company, and the evolution of tech in general, and how it's moved beyond just Python into AI and the wonderful world that all of this progress in LLMs has opened up for us.

 

Mehmet: And I was telling you, Rob, before we started the recording, I'm a user of Anaconda as well, and, you know, I appreciate the nice things that the technology brings. So you mentioned how people initially used Anaconda and how this is now changing with the adoption of AI.

 

Mehmet: And I know you've had a fast and rapid growth recently. So share with me a little bit what makes people love using Anaconda's technologies, and what's the commitment [00:03:00] from Anaconda to the community when it comes to contributing to its success.

 

Rob: Oh, that's an interesting multi-part question. So, why do people love Anaconda? I think in general people like tools that help them, that solve a problem for them, that empower them in some way, et cetera. And a lot of Anaconda's solutions were, I guess the term is dogfooded, in the sense that they were solving problems for themselves.

 

Rob: And so many years ago, when Anaconda started off, they were trying to use Python to solve these problems, and they ran into issues with clean environments and collaborating across researchers and scientists and all kinds of people trying to do numerical computing and whatnot. They had a deep belief in Python as this tool, and as this entire community and ecosystem that really could do amazing things for its users, but it obviously had these challenges.

 

Rob: And so they made it [00:04:00] easy. They made it easy to install Python, and really to install all the tools and packages and whatnot, Python or other, that people needed to get their jobs done. And again, they used it to solve the problems that they themselves had, and then made it more broadly applicable.

 

Rob: And so at its base, what's really powerful about Python is the community. It is the millions and millions of people that are working to generate new libraries, new packages, to solve new problems, and to share that work with other people. It's the open source power that goes way beyond just Python.

 

Rob: And what Anaconda did was connect people to that open source ecosystem, and we supported the ecosystem itself and all those community developers. And so as new people are trying to get into Python and they want to get a new project started, conda and Anaconda make it very easy.

 

Rob: Now, we could go on to the evolution of developers as they go beyond the hello-world kind of apps and grow into more complicated environments, more complicated [00:05:00] systems that they're developing, how they work with others, how they publish that work, how they reproduce that work. You kind of go up the stack of challenges there, and Anaconda has capabilities that help people at each of those levels, and a lot of that is available for free.

 

Rob: You know, it was donated back to the community, like conda itself, and a lot of the other features that Anaconda packages up. Now, how have we supported the ecosystem? Obviously there are the projects that we donate back to; we're deeply involved in many open source projects, whether it's things like BeeWare, Numba, Panel, Bokeh, and on and on, you name it.

 

Rob: We are also active in many others, and we work with a lot of open source developers and even commercial companies, whether it's NVIDIA or Meta or others, on their Python or open source contributions. But we also donate financially; Anaconda has actually donated millions of dollars to various open source efforts.

 

Rob: We do a lot of hosting and operations for open source projects, whether it's conda-forge and others. [00:06:00] And so it's a deep ethos, deep in the ethos of Anaconda. We believe in open source, and we believe that supporting open source and supporting those communities is good for everyone, including Anaconda.

 

Mehmet: Fantastic. You know, I'm always a fan, and I may be biased because I've used it and know how easy it is. But now, recently, Rob, if someone goes on your website today, they will see that Anaconda is saying, we are the operating system for AI, right? And of course AI is top of mind.

 

Mehmet: So why is there a special focus from the company's side on democratizing AI? Because open source, and you mentioned this just a few minutes ago, helped the community in getting a clean Python environment, in getting these nice libraries, [00:07:00] you know, when we talk in the sense of data science and so on. So now, with AI, what is the main motivation for Anaconda, and how are you hoping to leave an impact on the tech community?

 

Rob: Yeah, that's a good question. I also personally like the OS for AI framing, because OS can stand for operating system, but it can also stand for open source, so there's a fun little wordplay there. But Anaconda, and most people around in this space, have been doing this stuff for a long time.

 

Rob: They sometimes, I don't know, have different reactions to the recent focus on AI, because they've been doing AI. This is not a new thing, right? It has been around for a long time. The recent evolution in LLMs, and the fact that those have gotten very, very useful and can do amazing things in the last couple of years, has definitely shifted the focus.

 

Rob: But this has been a long journey of applying the right tools to the right problems. And so for a lot of our users, we've been [00:08:00] helping them with AI and machine learning and whatnot for over a decade. So there's a deep relationship with a lot of our customers already around solving these problems.

 

Rob: And so this is just leaning into the evolution of those technologies, and the evolution of technology across the world. That said, it's the same general pattern of recognizing that there is this amazing technology and this amazing community, and that it will free people to solve problems in a way they couldn't solve them before, empower them, and enable them to do great things.

 

Rob: But, you know, they can use somebody to help connect them to those technologies, to make them accessible, to make them secure, safe, repeatable, to enable collaboration and reproducibility, and all the kinds of benefits that Anaconda brought to data science. Now we can just extend that to people leveraging these AI technologies as well, particularly open source.

 

Rob: And it's the same ethos, the idea that this large community that is pushing forward these [00:09:00] advancements in LLMs and deep learning and all the other AI technologies is going to push the state of the art. Take Mark Zuckerberg's recent Llama announcement, where he extolled the benefits of open source.

 

Rob: Why is Meta so into open source? Why do they think this is the right way forward? You know, that was kind of like, yes, of course, that's exactly how we believe this should be and why it's so important. And so, in some ways, we're really just trying to repeat what we've done before

 

Rob: For Python and data science, in this kind of new realm. But it's in many ways something we've already been doing for our users; we're just being a bit more explicit about it, and a bit more public about it.

 

Mehmet: Absolutely fantastic. It's good that you mentioned Llama and the announcement from Mark Zuckerberg. Now, for someone who might be listening to us or watching us today,

 

Mehmet: they have heard about different LLMs, large language models. So if we [00:10:00] want, in a very basic way, and we can go into the details if needed... I understand that open source lets everyone contribute; it makes people understand the logic, maybe, and what's going on behind it.

 

Mehmet: So if we want to compare the open source versus the closed source AI models, what do you think are the key benefits of each, for developers and for everyone who's planning to utilize AI in their strategy? And to your point, of course, when you say AI, AI has been around for a long time, but I mean the large language models and the capabilities that we can use out of them.

 

Rob: So when you ask about that, are you talking about the benefits of open source LLMs and AI in particular, or are you talking about the benefits of both closed source and open source, maybe in comparison to each other?

 

Mehmet: If we want to compare; you mentioned Llama, you know, and Mark Zuckerberg.

 

Mehmet: So if you want to explain [00:11:00] this in maybe simple terms: why does open source offer more benefits, in your opinion, than the closed source ones?

 

Rob: I do personally believe it has more benefits, but I do think there are benefits to closed source too, and people can make that evaluation for themselves.

 

Rob: I think Mark did a great job of laying out some of the fundamental benefits of open source there. One of them is just fostering innovation. You know, if you have a company that is developing technology, obviously you can have a company full of very, very brilliant people working very well together, et cetera.

 

Rob: But some of it's just a numbers game. If you have a large ecosystem, and you have a large community of people that are all collaborating and all pushing things forward, in some ways, and this may sound a bit weird, it's almost like capitalism. You have all these people starting their various companies.

 

Rob: They're starting various services. They compete with each other; they push each other forward, whether it's commercially, where they're trying to out-business the other person, or whether it's just for recognition and for impact, whatever. [00:12:00] But I think that that community is what drives innovation.

 

Rob: And so you can have that innovation and that community and whatnot within a company, or you can have this broader ecosystem. And, you know, I personally wouldn't want to compete with millions and millions of people all working to push the state of the art forward. I think that is a losing proposition in many ways.

 

Rob: The other thing about open source, though, is that because it's open, there's a level of understanding and clarity and trust there that can affect people in a bunch of different ways. Everything from security, where I can actually look at things and audit them and understand what's going on, to freedom.

 

Rob: I can take this stuff and do with it what I want. I'm not constrained, I'm not restricted. And if the effort goes away, then I at least have the technology; I could fork it and maybe do my own thing with it. And so that freedom gives people a lot of flexibility in how they adopt these technologies, how they apply them, how they drive them forward, how they integrate [00:13:00] them.

 

Rob: Many years ago, at a startup, and it's funny, this was like the 2004, 2005 timeframe, we were using JBoss as our model at the time for how you could commercialize open source. JBoss has since been acquired by Red Hat. And so what we were doing was saying, hey, we can actually contribute enterprise features back to the open source projects, and we can provide support.

 

Rob: We could provide community support, in addition to product support, things like that, and push this forward. And we had customers who were already using these open source technologies, which was great. They finally had somebody to call, somebody to answer the phone at midnight on a Friday when something went down. But they were worried about the extensions that we were

 

Rob: providing on top of the open source. And so they actually made us put our code in escrow. The idea was: we're a startup company, we're a risky bet, and if we went out of business, they didn't want to be stuck with that technology bet. So they made us put our code in escrow so that they had a way forward with the code should we go out of business.

 

Rob: And so with open source, you don't necessarily need that; you have the code itself. [00:14:00] So in a lot of ways, for me, it comes down to the innovation that the open source community drives. Competing with that level of intensity, that level of passion, that level of vision, plus the flexibility of it being open, means that that kind of discovery, that kind of graph traversal of innovation, is in my mind almost inevitably going to be better.

 

Rob: And I think we've seen that in the past couple of decades of open source, and in particular in Anaconda's experience.

 

Mehmet: Absolutely. Now, one topic I want to touch on, Rob. When we talk about providing open source LLMs, I'm interested to know how at Anaconda you're trying to tackle this.

 

Mehmet: When we have all these sources, I'm curious to know if it's the same as what we do, for example, for an [00:15:00] open source operating system like Linux, where there's someone that goes and checks the quality and security of the open data and the models, right? So what are, not in details of course, some of the measures that you take to ensure the quality and the security of these models?

 

Mehmet: And again, how do you protect against maybe misuse of these resources?

 

Rob: Oh, interesting. So just, when you say misuse, what do you mean? What I'm hearing you say is that people who get access to these open source models might use them for nefarious reasons, or use them in a way unintended by their creators.

 

Mehmet: Ah,

 

Rob: So that's interesting. Let me get to the first part, which is: what are we doing to help people access these open source models? And then we can talk about the misuse. So, to access them, we actually have a product in beta right now called AI Navigator. It's an example of the kinds of stuff we're doing, but the idea here is, again, conceptually, you can just ask: what did [00:16:00] Anaconda do for Python packages, and for broader development environment management, through conda and others?

 

Rob: You know, there are hundreds of thousands, or maybe even millions, of open source projects out there, and Anaconda tries to curate them to some extent, to provide secure builds, and to evaluate them and whatnot, so that when people get them from Anaconda, as opposed to maybe directly from the open source community itself, there's some level of validation, verification, et cetera, that we have provided. And I'm not saying that that's better.

 

Rob: You have to understand what's good for you; you may not have those requirements, et cetera. But for people that want to have a secure environment, where they have some certainty over the provenance of the code that they're installing, central IT has tools that give them some kind of governance

 

Rob: over the open source projects that are being used within an organization, so they even know what's being used. They can control things like security violations or whatnot. Anaconda does all of that [00:17:00] for the broader Python ecosystem, and we provide those tools to IT. So, now you're getting to data and models.

 

Rob: It's the same idea. We want people to be able to access these things with some level of validation, of certification, et cetera. And we want to give central IT the tools to make sure they understand who's using what, how they're using it, and whether it's appropriate, so that they can safely and securely connect all of their developers, all of their knowledge workers, all of their application developers.

 

Rob: Side note, and we'll come back to this later: one of our goals is to drastically broaden the number of people that can actually access and use these technologies, so that you don't have to be a developer, so that these tools are available to a broader number of people. Again, that'll drive innovation and make the world a better place for everyone.

 

Rob: But coming back to the point: we want to connect people to these broad ecosystems in a way where it isn't just the wild west, where they have some control. So AI Navigator is an example of how we've done that. We actually curate the models there; we quantize them and make [00:18:00] them available on different devices.

 

Rob: That is another goal of ours: to connect people how they want to be connected. That could be on a mobile device, it could be on a laptop, it could be a desktop, it could be cloud, it could be local, you name it. We want to connect our users wherever they are to the broader Python ecosystem, and to the data and the models that they want, in a way that also enables those models to run

 

Rob: in the locations where they want them to run. And so, yeah, AI Navigator is a tool that now allows people to do that. But that's just an example of the kinds of end-user tools that we're providing, or will be providing, to connect people to those models, as well as the kinds of governance and more corporate management tools that we'll be providing.

 

Mehmet: Fantastic. Just one curious question that came to my mind now: are you seeing more use cases for people running these LLMs locally? I'm seeing crazy things, mainly on X, because it's where [00:19:00] the geeks usually are, and people are running mind-blowing experiments and even building great products just running on a laptop, like a MacBook or whatever.

 

Mehmet: So are you seeing more people going this way rather than traditionally going to the cloud? Just out of curiosity, what have you seen in the community, the adoption so far?

 

Rob: You know, I'd have to look at the data. I don't know if I'd say more. I'll stick maybe to belief or feeling at this point: there is a ton of pent-up demand there, and there's a real need for these things to be available and accessible where people want to run them.

 

Rob: Not that there isn't a reason to run them in the cloud, or in those kinds of traditional ways. It could be access to the right infrastructure: you don't have a fleet of the latest NVIDIA hardware to run massive inferencing at scale, or to train your own models.

 

Rob: And so there are physical concerns. But I think, [00:20:00] yes, we are seeing a huge demand. And actually, one thing that's interesting for us is that we have our own chat assistant, a kind of coding assistant, the Anaconda Assistant, that runs in our cloud notebooks, but we also have a way that you can install that assistant locally.

 

Rob: So people running Jupyter notebooks locally on their laptops or desktops or whatever can install our chat assistant. Now, obviously it does connect back to our cloud services, but they're still able to run it locally. They aren't forced to migrate their development environment to the cloud in order to access it.

 

Rob: And the growth in the local assistant has far surpassed the cloud assistant. That was my thesis originally, a long time ago: there are a lot of people that do want to work locally. You support them in their journey, which almost always starts off locally, not always, but a lot of the time, and then maybe they scale to remote as their needs grow, as their sophistication grows, whatever.

 

Rob: And that was borne out by the assistant we have available in both places, and I think also by the feedback we've gotten on AI Navigator so far. Making this stuff [00:21:00] accessible, it's going to come down to: what is the problem you're trying to solve, what are the conditions under which you're trying to solve it, and who are you trying to solve it for?

 

Rob: And what I mean by that is, your users themselves could be running in situations where connecting to a cloud service isn't actually feasible. Maybe it's a latency issue, maybe it's the interaction pattern they're going for, or the specific technologies that they are trying to leverage.

 

Rob: And so unlocking that is obviously going to be very helpful for our users and for those communities. But yeah, as far as am I seeing more, I don't know if I have the data to back that up, so much as I have other situations in which we've supported local development and local access, and seen

 

Rob: a very excited adoption, a very excited reaction, plus my own intuition around the kinds of problems that our users are trying to solve. We [00:22:00] also have users directly asking us to help them solve some of these problems. And Anaconda has always had a large presence in local development and application deployment.

 

Rob: And so it just fits with our experience that there's going to be pent-up demand there.

 

Mehmet: One other question that also just came to my mind: is this somehow helping some of the startups, let's say, who rely on other big companies' APIs, to have, maybe at the beginning, a kind of

 

Mehmet: backup in case that API goes down for a reason or another? Or, and we know this has happened a lot of times, when the pricing changes and they hit the wall. So is this something that can help people building AI applications?

 

Rob: Yes, absolutely. You've [00:23:00] touched on a couple of other reasons why this is important.

 

Rob: How do I think about this? Okay, I'll put it this way. In my positions, a lot of the time I'm involved in assessing technologies that we want to leverage or that we want to build. And I have to decide: is this something that's a core competency of Anaconda's, or any other place I've been, or is this something where we really want to leverage a tool from a partner?

 

Rob: And then, how much of a risk are we taking on in that relationship? It may not necessarily be a core competency that we want to develop, like we don't necessarily need to develop our own foundational model, but if we are relying on that model to provide a key capability to our users, then that's a risk I want to mitigate.

 

Rob: And so, yes, I imagine any kind of executive at any firm, not necessarily a technology firm, needs to balance those risks. So if you have this new capability, whether it's a support system or some other kind of capability that you're rolling out for your users, and you're reliant on a third-party API, and they lose connectivity, or [00:24:00] there's a contract dispute, or a technology change.

 

Rob: You name it. Yeah, that's a risk that you are taking onto your business and onto your technology, and you have to be prepared to handle it. There is also the business side of that, like you said: they raise their prices and you are dependent on them. I mean, I think you and I are old enough to remember the nineties, when everybody in the world was worried about Microsoft in terms of Windows and how you were beholden to them.

 

Rob: I think modern version of that. You saw this again in Mark Zuckerberg's, uh, announcement was the reliance on Apple and Apple's control over its ecosystem and meta decided that they needed to push things to give them flexibility outside of that. And so, yes, if you are, if you are locked into a specific model or specific provider.

 

Rob: That is a risk you are taking on, and you'll have to decide on the right mitigation, the right resolution. In addition to that, something that also comes up is compliance or regulatory concerns, and that may change the calculus on the model you're using, how you're [00:25:00] using it, how you're deploying it, and how you're hosting it.

 

Rob: You may have requirements to be able to explain what's going on and how you're leveraging it. You may not be willing, or may be unable, to share the data you need in order to fine-tune the model, to generate your own derivative model, or even to enhance your prompts.

 

Rob: And so there are plenty of cases where there are concerns or requirements, even outside of our users' control, that they need to include in their evaluation of which technologies they're going to leverage.

 

Mehmet: Absolutely. You mentioned the enterprise a couple of times. So, moving to the enterprise perspective: how does Anaconda help enterprises implement AI solutions?

 

Mehmet: And of course, enterprises will have larger challenges, I would say. So what do you see as the [00:26:00] most common use cases for enterprises?

 

Rob: Oh man. One of the things that's interesting is that things are changing so fast and evolving so fast.

 

Rob: I'll be honest, I have trouble keeping up; I have a job. I could spend all day every day reading papers, paying attention to the latest releases, and trying to understand them. And honestly, I get intellectually jealous of people who have more time, just out of my own curiosity.

 

Rob: And so some of it is just having expertise: being able to answer questions, to simplify the situation, and to help people separate out what they should pay attention to and what they shouldn't, to help them understand some of these challenges, how the challenges affect them, and maybe how other people are responding to them.

 

Rob: So some of it is just having that expert position and a lot of experience that we can share and leverage. From a technology perspective, we have a workbench product [00:27:00] where we recently added the ability for people to deploy these LLMs locally and integrate them through that workbench product.

 

Rob: And so we've actually worked with enterprise customers to deploy specific LLMs locally, to find the models that are right for their scenarios, and to help them understand how they can do that themselves and deploy different models. We've also helped connect them, just the normal Anaconda stuff.

 

Rob: We've helped connect them to the broader Python ecosystem, whether it's the latest libraries (it could be PyTorch, TensorFlow, LangChain, or any of the rapidly iterating packages that let them investigate or integrate these technologies), or even things like our partnership with NVIDIA, where we recently updated the Anaconda distribution to include the entire CUDA development kit

 

Rob: for people who are looking to leverage those NVIDIA libraries to develop these technologies on their own. So it spans the entire gamut: everything from the very basics of giving you access, all the way up [00:28:00] through our enterprise products that contain actual capabilities to deploy LLMs locally, to the expert advice, guidance, and consultation that Anaconda can provide.

 

Mehmet: Fantastic. I've got to make a comment at the end about the fast pace, but for now, as someone who has worked mainly with enterprises: the number one thing they usually focus on is, okay, we want to see the return on investment, the ROI, for this AI strategy, or strategies, that they decided to pursue.

 

Mehmet: So how have you seen them measuring this ROI, and what are some ways you usually advise them to maximize it?

 

Rob: Oh, interesting. Okay. So for the first part: yes, I like to try to make [00:29:00] things concrete. I'm not unique in that or anything, but that's just me.

 

Rob: And so when we talk about this, it gets back to the comment I made earlier: I like to identify specific problems and the specific capabilities people need, because you get what you measure. And if you have some sense of what you're looking for, then we can talk about how we accelerate it or increase it, whatever dimensions we need to affect.

 

Rob: And so one of the things we'll talk about is the softer side, the security stuff. It's an interesting challenge, right? Because if you haven't been hacked, if you haven't had a supply chain problem or something like that, then sometimes it can be hard to quantify the value there.

 

Rob: And so you just have to look at the industry and how other companies have dealt with it. So we talk to people about the kind of security capabilities we offer. But then, more proactively, we can do things like help them accelerate their development. And so we can talk about the time to value

 

Rob: for your software release cycle, your product release cycle, or maybe some new initiative. Again, you're trying to integrate these technologies and roll out some new capabilities. Well, what [00:30:00] has that resulted in? Has it resulted in greater customer retention? Has it resulted in an increase in customer satisfaction and NPS?

 

Rob: Are you actually seeing greater adoption? Are you seeing greater engagement with your products? Are your developers able to iterate more quickly, pull in your roadmap, and stay ahead of the rapidly increasing competition? And so we'll identify those dimensions and then work with people to actually see measurable improvements in them.

 

Rob: And then they can translate that into the return on investment. But I think that's the key: being clear about what we are trying to solve and how we are measuring progress toward it. If you aren't measuring, you don't know whether you're actually making progress.

 

Rob: It sounds almost too simple, but it's fundamental. So anyway, we try to focus people on accelerating their development, increasing their productivity, and mitigating their risks, whether security or others, and also on scalability: they've gotten traction, they've [00:31:00] gotten some adoption, and now they actually need to scale it out.

 

Rob: And how can Anaconda enable that for them as well? We are also doing this work ourselves: we have evaluation-driven development, which is a recent blog post of ours. We evaluate LLMs ourselves for how we include them in our technologies. So we're leveraging this stuff across our entire technology stack, too.
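As a flavor of what evaluation-driven development can look like in practice, here is a minimal, hypothetical harness; this is an illustration of the general pattern, not Anaconda's actual internal framework. The idea is to score every candidate model (or prompt) against a fixed test set and only promote a change when the score improves.

```python
# Minimal sketch of evaluation-driven development (hypothetical; not
# Anaconda's internal framework). Each "model" is just a callable that
# answers a prompt; a real system would call an LLM instead.

def evaluate(model, test_cases):
    """Fraction of test cases where the model's answer matches the expected one."""
    passed = sum(1 for prompt, expected in test_cases if model(prompt) == expected)
    return passed / len(test_cases)

def promote_if_better(current, candidate, test_cases):
    """Adopt the candidate only if it scores strictly better than the current model."""
    cur_score = evaluate(current, test_cases)
    cand_score = evaluate(candidate, test_cases)
    return candidate if cand_score > cur_score else current

# Toy stand-ins for two versions of a model
baseline = lambda p: {"2+2": "4"}.get(p, "unknown")
improved = lambda p: {"2+2": "4", "3+3": "6"}.get(p, "unknown")

tests = [("2+2", "4"), ("3+3", "6")]
best = promote_if_better(baseline, improved, tests)
print(evaluate(best, tests))  # 1.0
```

The gate in `promote_if_better` is the essential part: because every change is measured against the same test set before it ships, the system can only move forward, which is what makes the "measure or you don't know you're improving" discipline Rob describes enforceable in code.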

 

Rob: I don't remember the details. Oh my gosh, this is probably awful. There was an old commercial, I think in the 90s, maybe even the 80s, in the United States, maybe wider, for Hair Club for Men. It was a product to help you grow hair, or whatever.

 

Rob: And at the end, the guy promoting it would say, "I'm not only the president of the company, I'm also a client." That one always made me smile. And at Anaconda, we're not just helping drive this stuff for other people; we are leveraging it ourselves. We are deeply integrating all of these LLM and AI technologies into what we offer, at every level of the stack.

 

Rob: And so, [00:32:00] a lot of times when we talk about this, it's not just that we've helped a company do it; we can actually talk about what we've done ourselves and the improvements we've seen ourselves. And so I can advise somebody on ROI because I've had to do that ROI calculation myself; I have to understand where I'm going to invest engineering resources.

 

Rob: So, sorry for the digression. But anyway, you get what you measure. And so we are clear with people: this is what we're actually trying to see, and this is the return on the investment we've made. And then, with that experimental culture, you can seek out the greatest return on that path. We have an internal framework we use for evaluating LLMs

 

Rob: and for automatically driving improvements in how we've integrated those technologies. And so it's the same thing: we had to create systems to measure the progress, or otherwise we wouldn't know if we're actually getting where we need to

 

Mehmet: be. Absolutely. Rob, one question; maybe I should have asked you this at the beginning, but it just [00:33:00] came to my mind.

 

Mehmet: Why has Python dominated the AI space, and are we seeing any other language that might come and take its place? I love Python, by the way; it's one of the easiest languages, even when I want to explain coding to someone. But why is it so powerful, and do you think it's going to stay and keep contributing to this AI revolution we are currently living through?

 

Rob: Oh, again, we could talk for a while just about this. So, a couple of things. One, Anaconda is so heavily associated with Python for obvious and good reasons, but Conda itself, and even Anaconda, go beyond Python. Conda already supports other languages. That's actually one of the benefits of how it builds your environments: it can pull in things other than just Python and its dependencies.
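The kind of cross-language environment Rob describes can be sketched with a hypothetical `environment.yml`. The package names below are real conda-forge and NVIDIA-channel packages, but this particular selection is illustrative, not something recommended in the episode:

```yaml
# Illustrative environment.yml: Conda resolves non-Python pieces
# (compiler toolchains, CUDA libraries, even a full R interpreter)
# alongside Python packages in one environment.
name: ml-stack
channels:
  - conda-forge
  - nvidia
dependencies:
  - python=3.11
  - numpy
  - pytorch          # ships compiled C++/CUDA binaries, not just Python code
  - cuda-toolkit     # non-Python: NVIDIA's compilers and runtime libraries
  - cxx-compiler     # non-Python: a platform-appropriate C++ toolchain
  - r-base           # non-Python: the R interpreter in the same environment
```

Creating this with `conda env create -f environment.yml` resolves the compiled C++ and CUDA binaries and the R interpreter together with the Python packages, which is the part of the job a Python-only package manager does not cover.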

 

Rob: Its real power comes when you're trying to pull in non-Python languages, or [00:34:00] Python packages consisting of non-Python code. But yes, there are all kinds of interesting technologies coming out. There are the hot programming languages, whether it's Rust or others.

 

Rob: Go is hot. Stack Overflow, I think, just did its recent survey and announced its latest results. Of course, it depends on who responds to the survey, but you can see the popularity of different technologies, and they've changed over time. And then there are the Python-specific variants, things like Mojo and others, where people are trying to address what they see as the shortcomings in Python. At Anaconda, we're also trying to address what we see as the shortcomings in Python.

 

Rob: That's been the history of the company. And so we constantly ask: if Anaconda were founded today, what are the problems people have that we would want to help them with? That keeps us from over-indexing on the past or on prior framing. And one of those problems is that Python performance now matters, because [00:35:00] you have this bottleneck in hardware.

 

Rob: Even with that hardware, the calculations are difficult; people need the results faster, and they want to get the most out of these massive investments they've made. There's a reason NVIDIA is one of the most valuable companies in the world. And so we are looking at things like: how do we actually improve Python performance?

 

Rob: How do we keep it relevant? One of the things I wake up every day thinking about is how to keep Python the lingua franca of AI, and what challenges need to be solved to do that. Now, I am a big believer in multiple technologies. I don't need there to be one technology; in fact, that would actually be bad.

 

Rob: I want different tools for different jobs. I want people to have that freedom and flexibility. But that doesn't mean I shouldn't spend a ton of effort making sure we address the shortcomings in the tools we have. And so we are addressing Python performance at multiple levels: the Python interpreter itself, where we're working to help make performance improvements; optimizations in Python packages for different architectures, software environments, and [00:36:00] scenarios; and how we get those into people's hands

 

Rob: in a way that is manageable when you have this Cambrian explosion of hardware, accelerators, libraries, and so on. So we're trying to help people manage all of that. On top of that, will Python stay the default? I think it exploded because of where Python came from. Anaconda,

 

Rob: I think, rightfully, even though I wasn't part of Anaconda at the time, can say it was a big part of making Python a huge force in data science and numerical computing. And AI really came out of that, so at least the current LLMs and everything else came out of that.

 

Rob: And so, as a result, a rising tide lifts all boats, and Python came along with it. It's easy to use, it's accessible. There's a reason Microsoft is adding Python to Excel. That was announced last year, and there will be continuous improvements.

 

Rob: It's because they view it as a way to serve their broad base of people who, again, aren't software developers. These are people who use Excel. They're very smart people; they write [00:37:00] incredibly complicated formulas and spreadsheets to do all kinds of wild things with Excel. And Microsoft views Python as a tool that will allow those users to take what they do to the next level and give them new capabilities.

 

Rob: I mean, Excel is an amazing piece of software; the world runs on it. And they view this as a way of adding more features and capabilities for their users and that broader user ecosystem. So I think it's that ease of getting started, ease of understanding, and ease of collaboration that Python has; it's going to be hard for something else to come along and beat that.

 

Rob: And then there's that broad ecosystem: all those people contributing, all those packages, all that innovation. Again, it's not just that you have to build a better programming language or a better tool; it's the whole ecosystem of users, that whole community, that you then have to bring along, empower, and support.

 

Rob: So I guess, personally, I don't see that going away. But we're going to do what we can to keep Python and the [00:38:00] Python ecosystem moving forward, so that no one ever feels like their concerns aren't being heard or addressed. And we'll do what we can to support that.
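To make the performance point concrete, here is a generic example (not Anaconda-specific) of the package-level speedup Rob alludes to: the same sum of squares computed in an interpreted Python loop and then in a single call into NumPy's compiled routines.

```python
# Generic illustration of package-level Python speedups: the same
# computation in an interpreted loop vs. NumPy's compiled code.
import time
import numpy as np

n = 1_000_000
data = list(range(n))

start = time.perf_counter()
py_sum = sum(x * x for x in data)   # interpreted, element by element
py_time = time.perf_counter() - start

arr = np.arange(n, dtype=np.int64)
start = time.perf_counter()
np_sum = int(np.dot(arr, arr))      # one call into compiled C code
np_time = time.perf_counter() - start

assert py_sum == np_sum
print(f"pure Python: {py_time:.4f}s, NumPy: {np_time:.4f}s")
```

On typical hardware the NumPy version runs far faster, and this pattern of pushing hot loops into optimized native libraries is a big part of how Python stays viable for numerical and AI workloads despite its interpreter overhead.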

 

Mehmet: Absolutely. And I think, as someone who is not too technically immersed in machine learning and AI, but who has basic programming knowledge, what I have seen is the amount of libraries and the amount of contribution from the community, because it's open source, to your point, as we discussed at the beginning.

 

Mehmet: That made Python, I would not say irreplaceable, but kind of the de facto language for machine learning, data science, and AI, because it's super easy to learn and you can port it the way you want to different environments. And of course, what Anaconda does helps with this as well.

 

Mehmet: Now, the question that I still have, and this is why I laughed at your point about keeping up with the changes: of course we know, and we've repeated this multiple times here on the show, what happened in late November 2022 with the release of ChatGPT and so on. Of course, AI was there a long way back, but things started to change very fast since then, and there's not a day without hearing about something new coming.

 

Mehmet: But if I want to think beyond the LLMs and ChatGPT and all this, and ask you, Rob, today about emerging trends in AI and machine learning that make you super excited: traditionally we would ask someone, for example, hey, tell me about the next 5 to 10 [00:40:00] years. Now we cannot ask this anymore. So for the near future, I would say: what are you seeing?

 

Mehmet: What makes you excited?

 

Rob: Oh, a lot. I think that's one of the reasons I consider myself so lucky to live when and where I do, because there's just so much exciting stuff going on, and it doesn't stop. That can be exhausting, but it still beats the alternative.

 

Rob: I like to think about this stuff as an expert assistant that I have. And one of the reasons I think small models, combinations of models, and open-source models are so attractive is this idea: if I need an expert on my team, let's say I need help.

 

Rob: Say we want to look into intellectual property, so maybe I need some legal expertise. But I also want to get deep into the life sciences, to really help with drug discovery or genomics. And then I also want to do things for traditional manufacturing, or, you name it. I wouldn't hire [00:41:00] one person who has six different PhDs in six different fields. I would hire a lot of experts and have them work together.

 

Rob: And so by combining models with specific training and specific datasets, I think you're going to see some pretty amazing capabilities. But I think where that's really going to take off is when you get agents. And I know I'm not unique in saying this; a lot of it feels like what you'd get from a quick Google search. But I do believe agents are going to be an enabling technology, if we can get either the scenarios in which they're applied, or how they're applied and integrated, to the point where things like hallucinations aren't a problem, or are at least a manageable problem.

 

Rob: Then I have these intelligent assistants, these automated systems, that can help me with tasks. Maybe those tasks are grunt work, complex calculations I just don't want to do by hand. In the same way that calculators and spreadsheets took those burdens off people's minds, I'll now have an intelligent assistant. Maybe as a data scientist I have an assistant that can help me clean my data, organize my data, track my data, and find new data sources, or systems that can automatically suggest performance improvements or algorithmic changes. I now have this team of experts helping me solve my problem

 

Rob: in a way that just wasn't possible before. And when I have my collaborators, the other people I'm working with, and we all have these agents, that should be a completely multiplicative force on what we're able to do. I think there's one key thing that some people forget.

 

Rob: It's the old joke about two people camping who hear a bear. One of them starts putting on his running shoes, and the other says, "You can't outrun a bear. What are you doing?" And he says, "I don't have to outrun the bear. I just have to outrun you." As morbid as that joke is, it's true when it comes to things like LLMs: they don't have to be perfect.

 

Rob: In the worst case, let's say this is work I would otherwise give to a physical [00:43:00] person; the LLMs just have to be better than the existing system they're augmenting. And so if the success rate of the existing system is, you know, 96 percent, well, great.

 

Rob: As soon as I get the success rate with the LLMs to 96.1 percent, or 97, or whatever you want, I essentially have a solution. You still have to have those checks and balances. You still have to architect with the idea that these things aren't necessarily perfect. You can't get away from that, but you can't get away from that with people either. And so I think the intelligent agents,

 

Rob: and the ability to combine these things and have them self-improve over time, are going to be super exciting. Again, getting back to the evaluation-driven development stuff: we've already talked about how we have these automated systems. We use them to improve the system, and we use the technologies to evaluate the improvements and to suggest further improvements.

 

Rob: And we can get that kind of feedback loop going [00:44:00] on this technology, and the acceleration and improvement is dramatic. And that's just the beginning stages. So where is the world going to be in a year or two? I wouldn't even make a prediction, because all I feel confident in predicting is that this kind of acceleration is going to drive improvements. Although I do want to call back to something we never got back to: you asked about misuse. So there is a bit of a problem, right?

 

Rob: This stuff is powerful. It does change the game. How do we prevent people from doing bad things with it? And I think the answer is: you can't, honestly. With physical technologies you can, and maybe that is part of the answer: you don't give people the ability to have the latest GPUs or the latest accelerators that allow them to do this.

 

Rob: But in general, I think the right answer there is really to empower the good people, and to help people build systems that are resilient, properly architected, and properly monitored, to deal with bad actors, as opposed to any idea that you can prevent bad actors entirely. Bad actors are a human condition.

 

Rob: They're never [00:45:00] going away as long as people are here. And that's both malicious bad actors, people intentionally behaving badly, and people who believe they're doing good but may be doing something somebody else would say is not a good thing. I don't think you can ever get rid of that.

 

Rob: That's a people thing. The best thing we can do is help people build better systems, and provide the good people the tools they need to protect those systems and counteract those forces. It's a path to frustration, disgruntlement, and burnout if you try to solve the unsolvable problem, which is preventing the bad use to begin with.

 

Rob: In my eyes, I need to arm the good people with what they need to do good and to protect themselves against the bad.

 

Mehmet: Absolutely. And one of the main things I'm personally excited about is exactly the technology you just described: having multiple agents, and giving them the power to learn over time, enhance their capabilities, and even communicate.[00:46:00]

 

Mehmet: Maybe if I had said this a couple of years back, people would have thought I was talking science fiction, but we are seeing great progress in this field, and I'm looking forward to what the near future brings. Again, I will not make predictions, like yourself, Rob, because we never know.

 

Mehmet: So people are doing fantastic stuff nowadays. One quick

 

Rob: thing, sorry to interrupt you. I think a really interesting way to do predictions, or maybe not really do predictions at all, is that you don't have to go for fourth- or fifth-order effects. You don't have to get especially creative.

 

Rob: You don't have to imagine some wild future. A lot of times, just take a current trend, extrapolate it out, and assume the trend is going to hold, kind of like Moore's law. I'm pretty sure it was Nathan Myhrvold, I believe the original CTO of Microsoft. [00:47:00] He gave an interview in the late 90s where they asked him: you made all these amazing predictions, how did you see the future?

 

Rob: And his response was: I didn't see the future. I just assumed the technology trends for processors, memory, and storage that had been in place for a while were going to continue. And I said, okay, if they continue, what does the world look like in five years? What does it look like in ten years?

 

Rob: And I made predictions saying, okay, in ten years we're going to have this much compute power and this much memory at this cost. So let's prepare for the world as it might be when those conditions are met. And so, for some of these things, you don't have to imagine wild inventions and everything else.

 

Rob: You just have to look at the rate of change and the impact it's already having, assume that trend is going to continue, model it out several years, and then say, okay, great, these are the conditions under which we're operating. What constraints are now removed? Is this accessible to more people?

 

Rob: Is the cost lowered so I can integrate it into supply chains or products where it was never cost-effective before? You know, the way mobile [00:48:00] phones going worldwide drove down the cost of certain chips, and now you can incorporate those chips into drones and into all kinds of things that weren't economically viable before. So again, I don't think you have to get wild on the invention side. You just have to understand the trends, model them out, and have the confidence to say they're going to continue. And if they continue,

 

Rob: what's the world going to look like, and how do we then behave in a world like that? It's shocking how many things seem wild or like science fiction; if you think about them that way, and just think of how the world has changed in the last 20 years, suddenly they become a lot more approachable and a lot more understandable.

 

Mehmet: A hundred percent. I can't agree more with you, Rob, on this. And this is why I personally don't get surprised, I just get the "wow" effect, as we say: oh wow, they did it, like I expected. I'm not shocked because I thought it wasn't possible; it's when it comes faster [00:49:00] than I expected, or when I see it working, that it makes me say, oh wow. Rob, I think we covered a lot today.

 

Mehmet: But if you want to leave us with maybe some final thoughts, or tell us where people can interact with you. Of course, the Anaconda website is very well known, but still, I would ask you for some final words before we close.

 

Rob: Sure thing. Yeah, thank you.

 

Rob: I found this a great discussion, and obviously we have plenty of topics we could keep talking about for a lot longer. I'm active on all the normal socials, and I'm always happy to interact with people there. I also try to go to different conferences.

 

Rob: I'm always happy to talk about technology and my thoughts, and I love hearing from other people, because there's always a perspective or an insight that I don't have, and that gets me very excited. As far as Anaconda goes, I think we have a lot of amazing stuff coming out [00:50:00] later this year, even in the short term, things we've already announced or that are coming out shortly.

 

Rob: So I would say pay attention to everything from the LLM technologies, to the Python performance work, to the evolution of our data science offerings. We've had fantastic growth. You mentioned this at the very beginning and I didn't really respond, but we've had almost 100 percent growth in our business over the past year.

 

Rob: And adoption of our stuff is even wider, given the nature of our business. So I think we have a lot of exciting things driving that growth. And I hate to tease so much, but there's a bunch of good stuff coming, and we'll have to wait until it's out to talk about it.

 

Rob: And otherwise, I look forward to hearing from people and to continuing to see just what comes out of the current technology evolution.

 

Mehmet: Absolutely. And again, thank you very much, Rob. I know how busy things can get, and you took the time to be with me [00:51:00] here today.

 

Mehmet: So I really appreciate that, and I appreciate your insights about all things AI, whether it's the Python language itself, the large language models, open source, and all the nice things. As I said, I'm a little bit biased, because I'm one of the fans of the technology.

 

Mehmet: So again, thank you very much for your time. And this is how I usually end my episodes. This is for the audience: if you just discovered this podcast, thank you for passing by. I hope you enjoyed it; if you did, please subscribe. We are available on all podcasting platforms, and on YouTube as well. So give us a thumbs up and share the show with your friends and colleagues. And if you are one of the people who keeps listening to us and watching the show, thank you very much for doing that, and keep sending me your comments and suggestions.

 

Mehmet: I really enjoy reading them. Thank you very much for tuning in. We'll meet again very soon. Bye-bye.

 

[00:52:00]