Incongruent
Podcast edited by Stephen King, award-winning academic, researcher and communications professional.
Chester | Dubai | The World.
Correspondence email: steve@kk-stud.io.
S5E15 Trust, Code, And The Future Of Truth: Billy Luedtke
What happens when a handful of companies can quietly steer what we see, buy, and believe? We sit down with Billy Luedtke, founder of Intuition and a veteran of EY and ConsenSys, to map a path where trust is built on cryptographic proof, portable reputation, and your own data — not a platform’s black box. Billy argues that while crypto started by decentralising money, the bigger prize is decentralising information itself. If discovery flows through opaque feeds and proprietary AIs, power concentrates. The antidote is simple in concept and ambitious in practice: verifiable attestations about people, agents, and platforms that travel with you anywhere.
We dig into how cryptographic attribution shows who said what, while reputation adds the nuance that pure math cannot. One trusted voice beats ten thousand gamified reviews, so Intuition focuses on a neutral substrate for signed claims and lets multiple scoring models compete on top. That choice avoids a central arbiter of truth and keeps bias in check. From there we explore the rise of agent swarms — many specialised agents coordinating like a brain — and why open, portable reputations will decide how requests are routed and which tools act on your behalf.
Billy also shares how this vision lands on device with Samsung’s Gaia phones: a second brain for your preferences and trusted sources that you control, usable across any model without lock‑in. We talk healthcare records, bank reputations, and why your ChatGPT context should be yours to carry. The through‑line is clear: don’t let anyone control the truth. Treat it as a prism of perspectives anchored by verifiable facts and accountable actors. If that future excites you — or challenges your assumptions — tune in, share with a friend, and leave a review so more curious listeners can find us.
Stephen King: Welcome back everybody to Incongruent. I'm here with Arjun, and he usually takes the lead and, uh, interrupts me before I can get into full flow. Arjun, hello.
Arjun Radeesh: Hello, hello.
Stephen King: So, Arjun, who were we talking to? It's good to hear your voice again. It's quite late in the evening over there in Dubai. Tell me, uh, who did we speak to today?
Arjun Radeesh: So today we spoke to Billy Luedtke, who is the founder of Intuition. We had quite an interesting chat with him, where he kind of gives us a sense of how AI can be trusted or cannot be trusted.
Stephen King: Well, in a sense. I mean, he's talking about data and information and how blockchain can be used to verify the information that we find on the web. So here we're talking about not just, uh, factoids that you're requesting from ChatGPT, but also your own personal data, so that you can keep control of your, uh, personal information and transfer it from device to device, and not be locked in to one provider that you've been using for so long. Um, it's a really, really wonderful conversation. Um, and if everybody listening online would like to leave a comment, follow and subscribe, and maybe even support us with the new support option and give us a little bit of money, we would be eternally grateful. And, uh, otherwise, are you ready, Arjun? Should we move ahead?
Arjun Radeesh: Of course we shall. All right, then, here we go. Hello and welcome to a brand new episode of Incongruent. Today joining us is Billy Luedtke. Uh, he basically built Ernst & Young's first blockchain consulting practice, built key parts of the Ethereum ecosystem at ConsenSys, and now leads the world's first token-curated knowledge graph with Intuition. Most notably, Intuition has launched, um, natively on Gear Labs' uh Samsung Galaxy S25 Edge, marking an inflection point where trust, personalization, and privacy move from abstract cloud promises to technology actually in your pocket. Billy thrives on practical, real-world shipping: decentralized identity, true data ownership, and the end of repetitive, splintered onboarding. He knows what it means to instill trust in AI and digital agents. Billy, uh, welcome to the show.
Billy Luedtke: Thank you all so much for having me. I'm excited to kick this off.
Arjun Radeesh: So, Billy, I'm curious. You've worked at EY, uh, ConsenSys, and now Intuition. Three organizations that each touch trust in different ways. What drew you personally to the problem of digital trust? And what was the aha moment that convinced you that a decentralized approach could actually fix our fractured information ecosystem?
Billy Luedtke: Yeah, so happy to elaborate there. So what really led me to crypto: one, my first job ever, just very luckily and randomly, was helping to build this, uh, decentralized seismic network at Caltech with the United States Geological Survey. So I got super early exposure to distributed systems, and then I became a financial services technology consultant at EY. Um, and so I saw behind the curtain of the big banks, and, you know, it was just kind of unsalvageable. You look from the outside and you're like, oh, everyone has all their stuff together. You've got the big pillars and the big buildings, and they must know what they're doing. And then you look behind the curtain and it's like, oh, this dude's managing hundreds of billions of dollars in, like, an Excel spreadsheet that they can just arbitrarily edit. That's probably not so good. Or you've got the 1960s COBOL mainframe. So, you know, I think of the decentralized infrastructure of crypto as kind of trust infrastructure, where you no longer have to trust people. You no longer have to trust whatever's behind the curtain. All you've got to trust is the math and the code. And it's just completely tamper-proof; what you see is what you get. No one can play by different rules, everyone has the same rule set. And so I just think that's a very strong weapon in our arsenal of, like, societal construction. And then how it plays into our information ecosystem... um, or sorry, should I pause there real quick? Oh no, go ahead, go ahead. It looks like a hand has been raised.
Stephen King: No, no, it's just for Arjun, so no problem. Sorry. But since we've stopped, let's continue with just a quick one. Uh, before you go heavy into that, perhaps you could just give us an idea of what a distributed architecture is, because this is where the word blockchain, uh, gets bashed around a lot, uh, and crypto. What does it actually look like?
Billy Luedtke: That's a very good question. What it looks like is: instead of one big tech company, whether that is a bank or a government or, uh, OpenAI, running servers and managing and controlling all of the data... I think the acronym IT, information technology, captures it. Everything we are doing in the digital world is just bits. It's just information being moved around. And so your money is information now. The land registries are information now. Everything is just in these servers. And those servers are controlled by big tech companies. And the example I like to give is, you know, let's say tomorrow you wake up and you can't log into your bank. Or you log in and your balance is zero. What kind of recourse do you have? Or you try to log into Google and your Gmail, and it's like, oh sorry, you don't have access to your Gmail anymore, or your Twitter, or your Instagram, or your podcast platform. You don't own your digital identity. The companies that run the servers that you are interacting with, they control your entire digital life. And sometimes they provision access to you to be able to use it, but at any moment they can just totally take that away. And that is very scary. I just think that's a very flawed model. And so what crypto is, is an alternative where you don't have to trust anyone. It's all about kind of self-sovereignty, where you just have access to things and you can never lose it. There's no one that can ever take it away from you. So if you're thinking about Bitcoin, it's like: you own your Bitcoin. No one can take that away. They can't prevent you from transacting. You own it. And you can extrapolate that out to every single thing in tech, like your Twitter account. You can do decentralized social media, you can do decentralized email and text and whatever. And all it means is nobody owns it; everyone owns it. It's not the big tech companies in these extreme positions of power anymore. The power is distributed amongst the people. Um, and so I just think that's an objectively better design decision for humanity than trusting these big institutions, who then get so much power from the fact that they're in the middle, they're the intermediaries for every single thing that you do.
Stephen King: So in my head, I'm seeing you've got this one guy with the Excel spreadsheet, but now with this distributed, uh, infrastructure, that means the spreadsheet is duplicated in different locations around the planet, if I'm correct.
Billy Luedtke: Exactly. It's duplicated, and everyone is required to play by the same set of rules to update it. So everyone has a copy of the spreadsheet, and then they also have, like, the queue of updates. And as an update is made, everyone's spreadsheet just automatically updates, and you can't do something that's not within the rules that everyone's agreed to. So everyone who has a copy of the spreadsheet, we all say: here's the law. And if someone deviates from the law, they're not part of the network anymore. We just don't listen to their spreadsheet. We only listen to the spreadsheets of the people who are abiding by these rules. And if you change the rules, that's what's called a fork in crypto. The network just doesn't listen to what you have to say anymore, unless everyone agrees that your new rules are better. So you have this kind of fork choice rule, where it's like, oh, maybe there is a new rule that we want to introduce. And if the consensus is that this new law is better, then everyone follows that fork instead of the old fork.
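To make the shared-spreadsheet analogy concrete, here is a minimal, illustrative Python sketch of a replicated ledger: every node keeps its own copy, validates each update against the same agreed rule set, and a node that applied an invalid update anyway would have effectively forked away from the network. This is a toy model under those assumptions, not how Ethereum or any production chain is implemented.

```python
# Toy replicated ledger: every node holds a copy and applies only
# updates that pass the shared rule set ("the law").
from dataclasses import dataclass, field

@dataclass
class Update:
    sender: str
    receiver: str
    amount: int

@dataclass
class Node:
    name: str
    balances: dict = field(default_factory=lambda: {"alice": 100, "bob": 50})

    def is_valid(self, u: Update) -> bool:
        # The rule everyone agreed to: positive amounts, no overdrafts.
        return u.amount > 0 and self.balances.get(u.sender, 0) >= u.amount

    def apply(self, u: Update) -> bool:
        if not self.is_valid(u):
            return False  # rejected; applying it anyway would be a fork
        self.balances[u.sender] -= u.amount
        self.balances[u.receiver] = self.balances.get(u.receiver, 0) + u.amount
        return True

# The same queue of updates is broadcast to every node; honest nodes stay identical.
nodes = [Node("n1"), Node("n2"), Node("n3")]
queue = [Update("alice", "bob", 30), Update("bob", "alice", 500)]  # second one breaks the rule
for u in queue:
    for n in nodes:
        n.apply(u)
    # All copies remain in sync because they follow the same rules in the same order.
    assert len({str(n.balances) for n in nodes}) == 1
```

Nothing here is specific to money: the same pattern of shared rules over replicated state is what the rest of the conversation extends to information and reputation.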
Stephen King: Good. So then, going back to where we were: we were trying to apply this to information, because we have a lot of deepfakes, we have a lot of, uh, slop, let's call it. Uh, some other people have called it worse. Uh, many of our students, many of my graduates, were employed in their first jobs fact-checking on Facebook. So rather than having humans fact-checking, that's what Intuition does?
Billy Luedtke: Yeah, I'll elaborate. So, to continue the thread of the whole blockchain conversation: what I just described is a generally applicable tech, but it's largely been applied so far to money and finance. And the reason for that is because money is kind of like a system of extreme power, where if you control the money and you control the finance, you can kind of control the world, because you can control what people purchase. And that's why, you know, crypto people kind of want separation of money and state, because it's like, oh, I don't want you to have the ability to hyperinflate my currency arbitrarily at a whim. I don't want you to be able to just prevent me from transacting. So there's a lot of power in money, and crypto's been applied to that thus far. But a system of even more extreme power that needs to be decentralized is information. So if you think about any transaction or any decision you make, there are two variables at play. Let's say it's a financial transaction. One is: how do I pay for this? What money do I use? What payment rails? The other variable is: what do I buy? Or, extrapolating that, which podcast do I listen to? Which movie do I watch? Which book do I read? What is true, what is false, what is history? Um, the way that we get information online right now is through centralized intermediaries. Just like before, when we were transacting through centralized intermediaries, our information transactions are all also going through these centralized intermediaries right now. So you do a Google search, you're going through PageRank. You go through ChatGPT, you're subject to their algorithm. Twitter, their algorithm. And TikTok, that's the ultimate algorithm.
Stephen King: And this creates these echo chambers, correct?
Billy Luedtke: Yeah, it creates these echo chambers, but more importantly, these companies are in control. You see what they want you to see, and you don't see what they don't want you to see. And extrapolate this out to the AI world, bringing it back to the AI conversation: no one's even doing research anymore. Everyone is just prompting the AI and taking what they get back as the canonical truth. So imagine someone controls that. They control everything, they control history, they control what we think, they control what we do. This cannot be the way it works. The way that we discover stuff, information, or just things generally online, it cannot be through the filter of the black box algorithm of ChatGPT or Grok or any of these things. Whoever controls that literally controls the entire world. So that's what we're trying to solve.
Stephen King: That's a fear here in the UK. Uh, the Department for Education, uh, one of the pieces of guidance it's given to schools is to make sure that the content the students are receiving is British curriculum and not American curriculum or Chinese curriculum. That's very interesting. Uh, Arjun, uh, what's our next question? Let's move on.
Arjun Radeesh: Um, so, like, yeah, my next question for you was: how do you basically balance the human element of reputation with the algorithmic side of cryptographic proof?
Billy Luedtke: Yeah. So basically, what you get from the cryptographic proof side is verifiable attribution of who said what. So it's like, okay, I know for certain that Billy liked this restaurant, or Billy posted this image, or whatever. That's the cryptographic proof side. And then you marry that with the reputation side of things. So, using a very simple example, let's say I want to book a dinner reservation or do a swap or buy a product. Let's do the dinner reservation example. The way the world is gonna look is you're gonna prompt the AI and you're gonna say, hey, book me a dinner reservation for tonight. And I want the AI to know my preferences, my attributes, my social graph, their preferences, their attributes. I want a very personalized response. I don't want this generic response from the AI that's consuming Yelp reviews, which nobody's writing, uh, or TripAdvisor reviews, which nobody's writing; it's gamified to hell. I want to know what the people I trust are saying about things. So, in order for me to get the voices of the people I trust, first I need that cryptographic proof, and then I need reputation. So it's like, okay, I know you like sushi. Here's this guy you trust in the context of food and sushi. He likes this restaurant in your area. Now you can get a very good response back from the AI, as opposed to going through, again, the Yelp slop or the TripAdvisor slop. The slop is what's getting used, and I don't want the 10,000 anonymous reviews. One voice from one person I trust means more to me than all the slop online. So give me that. And that's where it all ties together.
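As a rough sketch of the two ingredients Billy describes here, verifiable attribution plus reputation, the snippet below signs a claim with an Ed25519 key (using the third-party cryptography package) and only surfaces a recommendation if the signature verifies and the author's key is on a personal trust list. The attestation format, the names, and the trust list are invented for illustration; Intuition's actual data model will differ.

```python
# Illustration only: a signed "who said what" claim, filtered by a personal trust list.
import json
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def sign_attestation(key: Ed25519PrivateKey, subject: str, predicate: str, obj: str) -> dict:
    claim = {"subject": subject, "predicate": predicate, "object": obj}
    payload = json.dumps(claim, sort_keys=True).encode()
    return {"claim": claim, "sig": key.sign(payload), "pubkey": key.public_key()}

def verify(att: dict) -> bool:
    payload = json.dumps(att["claim"], sort_keys=True).encode()
    try:
        att["pubkey"].verify(att["sig"], payload)  # raises if the claim was tampered with
        return True
    except Exception:
        return False

def raw(pubkey) -> bytes:
    return pubkey.public_bytes(serialization.Encoding.Raw, serialization.PublicFormat.Raw)

# One trusted voice outweighs a pile of anonymous reviews.
billy_key = Ed25519PrivateKey.generate()
my_trusted_keys = {raw(billy_key.public_key())}  # keys I trust in the context of "food"

att = sign_attestation(billy_key, "sushi_place_42", "rated", "excellent")
if verify(att) and raw(att["pubkey"]) in my_trusted_keys:
    print("Recommend sushi_place_42: vouched for by someone I trust")
```

The point of the design is that the math handles attribution, while the choice of whose keys to trust stays with the user.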
Stephen King: So I've got here, uh, we're looking at... you mentioned we're gonna ask ChatGPT, but then there's gotta be agents, and then there's gotta be agentic AI. Uh, and let's just keep it from a consumer point of view versus, uh, a business one, though there are obviously different use cases in both areas. I'm using agents a lot now, uh, to help me make these decisions. Uh, has this got anything to do with that? Is that making your job more complex or more urgent?
Billy Luedtke: Yeah, I think we actually fulfill that, so I'll expand on that use case into the agent world. So basically, the first step of the flow is I do the prompt and I say dinner reservation, and I want a good, personalized response. The next step of the flow, what's emerging right now, is this coordinating agentic swarm. I think the way that we get AGI is not through one general agent, it's through lots of specialized agents who are able to coordinate, kind of like neurons in a brain. And that's kind of what's happening right now. Each agent is good at its own specialized thing, and then what you need to do is route requests to the right agent for the job. So, let's say we're doing the simple use case of booking a dinner reservation. I want my request to get routed to the agent that's good at dinner reservations, right? Um, in order for that to happen, now you need agent reputation. So just like in the first step you needed people reputation, now you need agent reputation and context and metadata. Where does that live? Does that live on OpenAI's servers? On Google's servers? Do these big tech companies control where the requests get routed in the agentic world? Then they control everything again. So of course you need a neutral substrate like Intuition to host the reputation of the agents, uh, just like you needed it to host the reputation of the people. And then the final step of this flow is the agent needs to perform an action. In this case, it's booking a dinner reservation. Which platform does it use? Does it do a Google search and choose the top result and get phished and lose your credit card info or your crypto? Hopefully not. So now you need platform reputation. Where does the platform reputation live? Google? Again, you need a neutral place to store this information about stuff. Otherwise, just to do the simple flow of book the dinner reservation, or which podcast do I listen to, or which keyboard do I buy, it gets owned and controlled by someone, and that's not so good. It should be controlled by everyone.
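A hypothetical sketch of the routing step Billy outlines: pick the agent with the best recorded reputation for a given task, where the reputation entries are read from a shared record rather than one provider's private ranking. The agent names and scores below are made up for illustration.

```python
# Route a task to the best-reputed capable agent, using a shared reputation record.
from dataclasses import dataclass

@dataclass
class Agent:
    name: str
    skills: set

# Attestations about agents, accumulated on a neutral, shared substrate (toy data).
reputation = {
    ("resy_bot", "dinner_reservation"): 0.94,
    ("generalist_bot", "dinner_reservation"): 0.61,
    ("resy_bot", "flight_booking"): 0.20,
}

def route(task: str, agents: list) -> Agent:
    capable = [a for a in agents if task in a.skills]
    if not capable:
        raise ValueError(f"no agent can handle {task!r}")
    # Highest reputation for this specific task wins; unknown pairs default to 0.
    return max(capable, key=lambda a: reputation.get((a.name, task), 0.0))

agents = [Agent("resy_bot", {"dinner_reservation"}),
          Agent("generalist_bot", {"dinner_reservation", "flight_booking"})]
print(route("dinner_reservation", agents).name)  # -> resy_bot
```

The same lookup generalizes to the final step he mentions, platform reputation, by keying the record on platforms instead of agents.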
Stephen King: True. And if my knowledge of this is correct, in certain chatbots, if you don't ask it to do a deep research, it will give you access to the older, cheaper models which have been trained on less data. And you have to really, uh, click the right model to be on the up-to-date information. Um, so I can see how that is most useful. Um, we're going to be looking at, uh, putting trust in your pocket. That's what you've been talking about, I believe, uh, especially with the Samsung, uh, deal. Would you like to tell us a little bit about what you're doing with Samsung?
Billy Luedtke: Yeah, so it's kind of the flow that I just mentioned. What I want is just a better AI experience. I don't want to have to write as long prompts, I don't want to have to prompt as much, like you just said. I don't want to have to choose the right model for the job. I just want to do super simple things and then have the best result on the other side. And what you need for that... right now, sure, with ChatGPT and OpenAI, they're starting to personalize your context window, but they're never gonna let you take that to Anthropic, they're never gonna let you take that to Midjourney, because that's their moat. And so what we think people need is portable, self-sovereign context that they can bring to any agent, that they can bring to any model. Um, and so on the Gaia Samsung phone, basically that's what we're providing: your intuition. You can think of it like your second brain, where you just have your little personal context that you can manage. You can see it, maybe have, like, folders, and it's like, okay, you know, here's my preference, here's what I like, here's what I don't like, here are my lists of stuff. So, you know, you come across a new book that you might want to read later, just put it in there. It saves it for you, but it also saves it for the AI. So you just have your own little container where you can store stuff, it's owned by no one, and then you can use that to personalize your AI experience on device.
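As a toy illustration of the "second brain" idea, here is a small, user-owned context store kept in a local file, with an export that could be prepended to a prompt for any model. The file name, folder names, and export format are invented; this is not the Gaia phone's or Intuition's actual implementation, just a sketch of portable context under those assumptions.

```python
# A user-owned context store: the data lives in a file you control and
# can be handed to any model as a plain-text preamble, avoiding provider lock-in.
import json
from pathlib import Path

class PersonalContext:
    def __init__(self, path: str = "my_context.json"):
        self.path = Path(path)
        self.data = json.loads(self.path.read_text()) if self.path.exists() else {}

    def add(self, folder: str, item: str) -> None:
        # Save for yourself and, implicitly, for whichever AI you choose to share it with.
        self.data.setdefault(folder, []).append(item)
        self.path.write_text(json.dumps(self.data, indent=2))

    def as_prompt_preamble(self) -> str:
        # The same preamble can be sent to any model or agent.
        lines = [f"{folder}: {', '.join(items)}" for folder, items in self.data.items()]
        return "User context:\n" + "\n".join(lines)

ctx = PersonalContext()
ctx.add("preferences", "vegetarian, no shellfish")
ctx.add("reading_list", "a new book to read later")
print(ctx.as_prompt_preamble())
```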
Stephen King: I hadn't realized it, but I am now locked in to ChatGPT, right? Because, uh, that is something for the, um, what do you have in the US? That lovely little team which breaks up companies. That's a monopoly, uh, potentially. That's monopolistic power. And if they do not allow me access to my own data, to be able to transfer the data to a different account, that's the same as having a SIM card and not being able to move my number. It's the same argument. Uh, and building on that, we have healthcare information, where we've gone through decades and decades of trying to convince the healthcare providers to allow us to move our health information from provider to provider so we can get proper healthcare. So I'm assuming this is very similar.
Billy Luedtke: It's exactly the same thing. And we're applying this tech to the AI field right now because that's kind of the meta, but you can also do it for... I wrote a white paper on this back in, like, 2015, 2016, for medical records. It's just like that example: with medical records, every app that you sign into right now, you create a new identity for that context. And then you generate data and reputation that only lives within that context. So I can't take my five-star Uber reputation and rent an Airbnb. And I can't take my 20-year reputation with Bank of America and open a Wells Fargo account. I can't take my OpenAI context and bring it to Anthropic. This problem is very pervasive on the web. And so these self-sovereign identity principles that I've been working on for over a decade now, this solves that. We have the tech to solve it, and now it's really just about applying it to the right use cases and trying to get adoption here.
Arjun Radeesh: Alright, so I'm, um, gonna just step in and kind of ask this next question. Uh, what happens when AI starts issuing attestations? Can we trust the verifier if the verifier itself is an AI?
Billy Luedtke: Yeah, I think in a lot of instances you'll probably trust it more, but this is kind of where reputation comes into play. With Intuition, anything can say anything about anything, and then you get verifiable attribution of who said what, and then you couple that with reputation, like we explained earlier. So you can just look at the reputation of the agent. And if the agent has a really good track record of attesting to useful truths about things, or, let's say, the agent is just a thought leader on a topic, it's like, damn, I'm probably gonna trust the agent more than I'm gonna trust Joe Schmo down the street. I think the agents are going to be smarter than us. It's not a problem; we want them attesting. And one of the first things that agents are attesting to right now on Intuition is the reputation of each other. Because, like I mentioned earlier, there's this coordinating agentic swarm emerging, and so when an agent interacts with another agent, they leave an attestation and they say, hey, yeah, this thing fulfilled its duties, and it was super effective. And so right now the agents are attesting to things about each other, which is kind of cool. Also very scary.
Stephen King: How do you measure this reputation? How is it calculated? Because, you see, I'm in the public relations environment, uh, journalism, and in public relations we measure our reputation by the sentiment that's seen in the newspapers and by the qualitative feedback that people are giving about us online. Uh, how would an AI, which is just in a numbers environment, uh, what would it measure?
Billy Luedtke: That's a really good question. So those variables that you just mentioned, that's what we do: we collect the data. Intuition doesn't give things a reputation score, because if we did that, we'd again be in a position of power; we'd have the ability to censor stuff, we'd have the ability to manipulate stuff. So all we do is act as this neutral substrate, and then you have companies like Recall Network and others building reputation scores on top of us. And I actually don't know what they use in their calculations, but we're just kind of underpinning them. And you could have a DeFi credit score, you could have an agent reputation score, you could have a person reputation score. Anyone is free to come up with whatever reputation scoring mechanism they want. We can provide some examples, like we'll probably have some default ones you could choose from, but that's not what we want to focus on, because that, to me, is a slippery slope of bias and control.
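A minimal sketch of the split Billy describes: the substrate only stores who-said-what claims, and different scoring functions can be layered on top of the same data. Both scorers below are invented examples, not Recall Network's or anyone else's real formulas.

```python
# Neutral substrate + competing scorers: the claims are shared, the scoring is pluggable.

# Substrate: a flat log of (author, subject, vote) claims. No scores stored here.
claims = [
    ("alice", "agent_x", +1),
    ("bob",   "agent_x", +1),
    ("carol", "agent_x", -1),
    ("alice", "agent_y", -1),
]

def simple_average(subject: str) -> float:
    # One-person-one-vote average over all claims about the subject.
    votes = [v for _, s, v in claims if s == subject]
    return sum(votes) / len(votes) if votes else 0.0

def weighted_by_trust(subject: str, trust: dict) -> float:
    # Weight each author's vote by how much *I* trust them (unknown authors get 0.1).
    scored = [(trust.get(author, 0.1), v) for author, s, v in claims if s == subject]
    total = sum(w for w, _ in scored)
    return sum(w * v for w, v in scored) / total if total else 0.0

# Two different reputations computed from the same neutral data.
print(simple_average("agent_x"))
print(weighted_by_trust("agent_x", {"alice": 1.0, "bob": 0.5}))
```

Keeping the scoring outside the substrate is what lets multiple models of "reputation" compete without any single party deciding the answer.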
Arjun Radeesh: Um, all right, so, um, as AI and blockchain, um, converge, do you think that we are building tools that make truth stronger or just better at simulating truth?
Billy Luedtke: Yeah, that's a good one. Let me think about that a little bit. Yeah, I think the whole point of Intuition is to make truth stronger. So I hope that we pull it off. And just to frame it: it's called Intuition because our whole goal is to get truth out there, to get people's intuition out there for consumption. Because right now, all we have is the slop. There's so much slop because... so, there are two ways that data is getting contributed online right now. One is these gamified things, where it's like, I've never written an Amazon review, I've never written a Yelp review, I've never done a LinkedIn endorsement. So who's creating this data? Who's curating this data? And the answer is it's more and more bots, and it's gamified. So all that data is totally flawed, because if you look at the data, over 99% of people aren't contributing data to these platforms. And then, conversely, that's the data that all the AI and every single person is leveraging: this known flawed data set. The other problem is, I do a bit of expression. I'm on Twitter, I'm on Instagram, but when I'm on those platforms, I'm playing the game of Twitter. Um, that's not truth. It's not exactly who I am. I'm playing to the algorithm and I'm gonna structure things in that way. When I'm on Instagram, I'm playing the game of Instagram. So the picture that the AI has of us right now is entirely flawed. The data ingestion is all just broken, because everything's optimizing for the wrong things. It's optimizing for the clicks, it's optimizing for the attention. And so I think we can do an infinite amount better than the status quo. And so the whole point of Intuition is to get people to express their holistic inner state a little bit more, so that we can actually get truth out there.
Stephen King: In my head, I've got this picture of Bridgerton. Have you seen Bridgerton on Netflix?
Billy Luedtke: No, I haven't.
Stephen King: It's one of these period dramas where it's all about the young girls getting married and being brought out into society and having to portray themselves in the way that society expects, in order that they can find their husbands. And the men have got to do the same. It's quite, uh, a contemporary take on, uh, a fantasy sort of environment of, you know, pre-Victorian England. Uh, but I can see what you're saying. It's like, for the past 10 to 15 years, we have been playing to Mark Zuckerberg's opinions of what we should be doing online, or, uh, Elon Musk's now, where we're operating under his norms, or even on Truth Social, we have to operate under those kinds of norms in order for our content to be seen. So I think that's really interesting.
Billy Luedtke: Um, it's also a race to the bottom, because then we're recursively getting trained by those algorithms. So we play to the algorithms, they optimize to play more to that, and then they train us. Whether or not you want to accept it, we are being trained by them every day. It's influencing us more than we even know. So not only are they optimized for collecting slop, now they're kind of influencing us... it's called brain rot, I think, in, like, Gen Z terms. Everyone's got this brain rot because all we're consuming is this slop across the whole web.
Stephen King: And that just proves that the content I created 15 years ago is far, far, far higher quality than what Arjun's creating today. Perfect.
Arjun Radeesh: Thanks for that, Gen Z, big.
Stephen King: No worries, no problem. All right, we've got the last couple of questions here. Uh, if you could redesign the internet's trust layer from scratch, what's one principle you would never compromise on? Google in the past said, we will not be evil. What would be the equivalent of a trust, uh, paradigm or a trust vision statement?
Billy Luedtke: Wow, that's a really good one. Um, I wish I'd seen this beforehand and had more time to think about it. Let me... yeah. Uh, man, I think the principle that I would never compromise on is: don't let anyone control the truth. There should be no one who can control it. It should be up to all of us to decide. And even something as supposedly objective as Wikipedia, it's not objective. There's, like, a council of Wikipedia editors, and they've got admin privileges, and, you know, they've got a bias that leans one way or another. So if you go to certain pages, people take that as the canonical truth, but it's not. And that's the way that the web is presented to us right now. The whole web presents us information as if there are not two sides to the story, and that's the echo chamber thing. You get trapped in your echo chamber, and then, even on topics of extreme conflict, you're presented one side as if it's the truth. So I'm unwilling to compromise on the fact that truth is a fractal prism of perspectives; it is not what one party tells us it is.
Stephen King: And Arjun, you have the last one.
Arjun Radeesh: And, like, yeah, finally, what question about digital trust do you wish more people were asking?
Billy Luedtke: I wish more people were asking the questions that I'm kind of asking, which is: okay, what are we trusting right now? No one thinks about what they're trusting. They go about their everyday lives and they just operate under all these trust assumptions. It's like, oh yeah, of course the bank is always gonna give me access to my money. Of course the government's not gonna hyperinflate my currency. Of course we're gonna have free speech and not get imprisoned if we send a meme. Of course X, Y, and Z. We're just operating under these trust assumptions. And then when you see them broken, it's like, wait, you guys are in control? Wait, this can't be how it works. So no one's questioning trust right now. No one's questioning where we're all placing our trust every single day. And that means that these people get to operate kind of behind the curtains and control it all. So, as you go about life, just think about the thing you're interacting with and ask: okay, in this scenario, which parties am I trusting? And then you say, oh, imagine if you didn't have to trust any of them. That would be way better. What if you could trust code and math instead of all these flawed humans? So that's, yeah, that's my answer to that question.
Arjun Radeesh: Thank you so much, uh, Billy, for such an insightful chat. During the interview, I felt like I learned a bit more about the trust factor and how AI is still very much a work in progress. So thank you so much for joining us on another episode of the Incongruent Podcast. And, yeah, once again, to all the listeners out there, thank you for listening. Please do like, share, and subscribe to our podcast; it's available across all your favorite podcast platforms. Until then, this is the crew from Incongruent signing off.
Aayushi Ramakrishnan
Co-host
Arjun Radeesh
Co-host
Imnah Varghese
Co-host
Lovell Menezes
Co-host
Radhika Mathur
Co-host
Sukayna Kazmi
Co-host
Stephen King
Editor
Podcasts we love
Check out these other fine podcasts recommended by us, not an algorithm.
The Agentic Insider
Iridius.ai Inc
Brew Bytes
Nadeem Ibrahim