sunnya97.com

MEV Panel with Sunny Aggarwal, Skip Protocol & Mekatek

The discussion centers on MEV (Maximal Extractable Value) in Cosmos, exploring its implications for user experience, validator economics, and the development of competitive block space markets.

Summary

In this MEV panel discussion, we explore the complexities of maximal extractable value (MEV) within the Cosmos ecosystem, focusing on its implications for decentralized exchanges like Osmosis. We dig into the debate around whether MEV is inherently harmful or whether there can be "good" MEV that benefits users and liquidity providers. The conversation highlights the tension between user experience and the need to incentivize liquidity providers, as well as the role of searchers in capturing value. We also compare the approaches of Skip and Mekatek to MEV, emphasizing Skip's focus on building Cosmos-native solutions and the importance of governance in revenue distribution. Additionally, we discuss the potential impact of threshold encryption on transaction visibility and how it could streamline value capture, alongside the emerging concept of an interchain scheduler for block space futures markets. Overall, the dialogue reveals the nuanced landscape of MEV strategies and the ongoing quest for fair and competitive systems in the blockchain space.

Key Takeaways

  • MEV (Maximal Extractable Value) can have both positive and negative implications: 'good' MEV can benefit users, while 'bad' MEV extracts value from them.
  • There is a strong emphasis on user experience in the context of MEV mitigation, as practices like front running and sandwiching can harm traders and liquidity providers.
  • Skip and Mekatek approach MEV from different angles: Skip focuses on an in-protocol solution that captures value transparently, while Mekatek takes a free-market approach to maximize value without waiting for governance.
  • The future of block space markets, including the Atom 2.0 interchain scheduler, raises questions about how different chains will interact and whether there will be a unified auction system for block space or several competing ones.
  • As encryption and threshold decryption technologies advance, they will reshape how MEV is captured and distributed, possibly leading to more equitable revenue sharing among validators and users.

Detailed Analysis

In the MEV panel discussion, the speakers delve into the complex world of maximal extractable value (MEV) and its implications for decentralized finance (DeFi), particularly within the Cosmos ecosystem. A central theme is the dichotomy between "good" and "bad" MEV, with the panelists exploring how different approaches to MEV can affect user experience on decentralized exchanges (DEXs). They emphasize the importance of designing systems that prioritize user experience while acknowledging the economic incentives at play for liquidity providers (LPs) and traders. This nuanced conversation sheds light on the ongoing struggle to balance profitability for validators and LPs with the need to protect users from predatory practices like front-running and sandwich attacks.

The discussion also reflects broader trends in the blockchain space, where MEV has gained significant attention as a double-edged sword. On one hand, it offers opportunities for innovation and economic sustainability within DeFi; on the other, it poses ethical dilemmas and challenges related to fairness and transparency. The speakers highlight how the structure of MEV extraction can either enhance or detract from the overall health of an ecosystem. As DeFi continues to evolve, the dialogue around MEV becomes increasingly critical, exposing the need for robust governance models and community consensus to navigate these challenges effectively.

One of the standout points made by the panelists is the necessity for a competitive market to optimize MEV extraction while ensuring that value is fairly distributed among participants. They argue that a well-structured, transparent approach—whether through on-chain or off-chain mechanisms—can mitigate the risk of centralization and create a more equitable environment for smaller validators. This perspective is significant as it suggests that the future of blockchain governance may hinge on our ability to design systems that not only capture value but also distribute it in a way that aligns with community values and priorities.

However, the conversation does expose some limitations in the current understanding and implementation of MEV strategies. While the panelists advocate for innovation and experimentation, there remains skepticism about the long-term viability of certain mechanisms, such as the proposed interchain scheduler in the Atom 2.0 white paper. The concern is that if not managed wisely, these approaches could inadvertently lead to poor value capture and increased extraction by off-chain searchers, ultimately undermining the ecosystem.

This video is particularly useful for developers, blockchain enthusiasts, and policymakers who are navigating the rapidly changing landscape of DeFi. It provides valuable insights into the complexities of MEV and the critical need for developers to consider user experience alongside economic incentives. By unpacking the multifaceted nature of MEV, the discussion encourages a deeper examination of how we can innovate responsibly while fostering a transparent and equitable ecosystem. The insights shared can guide stakeholders in making informed decisions, ultimately shaping the future of decentralized finance in a way that aligns with community values and long-term sustainability.

Transcript

Speakers: A (Sunny Aggarwal, Osmosis, moderator), B (Mekatek co-founder), C (Barry, Skip co-founder)
**A** (0:02): Hello, everybody. Welcome to the MEV panel. I'm excited to be moderating this. MEV is something we think a lot about at Osmosis. Before Osmosis actually became a DEX, we were working on MEV mitigation strategies, and we realized we needed a product here, and that's how we ended up building a DEX. But maybe we can start with some introductions from you guys. Barry, you want to go first? **B** (0:47): Sure. **C** (0:48): Hi guys. My name is Barry. I'm the co-founder of Skip, which is a Cosmos-native MEV infrastructure team. We're working to build toolkits to help chains capture MEV, design block space markets in the way they want them to be run, and distribute that revenue to their stakeholders in the way they think it should be done. **B** (1:10): Hello, I'm Braps, I'm the co-founder of Mekatek, and we work on interchain block space markets. **A** (1:16): Cool. Maybe we can start by asking you guys, for anyone who doesn't know: what is MEV? I believe we've pretty well memed MEV as a word in Cosmos to be this very scary thing, and our team may have had a role in that. Should MEV be a scary term? Is there good MEV, or is all MEV bad? **B** (1:47): I think the scary word in MEV is extraction. It conveys the idea that there is value in an ecosystem, in a community, which is being taken out and going elsewhere. Sometimes that could be represented as some kind of rent being extracted by a party that has a privileged position. It could also be represented by the lack of competition in a network, or as additional infrastructure or operational costs that burden a validator or another player performing some otherwise essential role in a community. **C** (2:26): Yeah, to back up a little bit and try to answer the "what is MEV?" question:
I think a lot of people have different definitions, and most of them I really don't like. The one that I use, which is probably wrong as well, is: there's a black box that sits between when a user signs a transaction and how that transaction ends up in a block with other transactions, and how they're ordered. MEV is the output of how we design that box. Different designs, whether you just take transactions first come, first served, whether you have some kind of auction system, whether you do something else, something on chain, mean different people end up making money off of that. In some cases that money can be kept in protocol or in a community; in other cases it can leak out. And so for Skip, we think about good versus bad MEV in the context of extraction, and we really do see a strong difference there. I think that's something that makes us different from most other MEV infrastructure providers. I think front running and sandwiching harm user experience on DEXes, and basically every DEX I've talked to is doing things to try to mitigate that. So our view is that user experience should be the guiding principle we follow: we should try to enable kinds of MEV that don't harm user experience, and we should try to disable kinds that do. **B** (3:59): Which user? **C** (4:01): Retail traders, the folks who bring liquidity, liquidity providers, folks who are actually using the DEX, who are holding the token. **B** (4:09): But how do you square that circle? If you sandwich a trade on a DEX, that constitutes a payment from a user to an LP. So the question is, in a community, are LPs represented? Are their incentives aligned? And how do you find that overlapping preference? **C** (4:27): It's a good question, actually.
It constitutes, I think, part of a payment to an LP, but then there is a significant portion, the profit of that sandwich, that often exits the chain and ends up getting paid to the sandwicher. And our view again is that MEV is something where I don't think there's a right answer. **A** (4:48): Right? **C** (4:48): I don't think there's a wrong answer. It's a question of values, and we have to make value-based decisions here about how we weigh LP incentives against traders' incentives, against the incentives of market makers and sophisticated traders. **A** (5:05): So, kind of related to that: in Ethereum we've noticed something very interesting, which is that the whole MEV game has actually brought so much mind share to Ethereum. I know so many builders who started as searchers and then got bored of that, or got competed out of it, and then said, well, I guess we know this stack very well, we're going to start building stuff on top of it instead. **B** (5:36): Exactly. The whole MEV game can just be a builder incentivization program to bring people into the ecosystem and get them paid completely independently. No grants, no job, no employment, no permanent-establishment risk, just on-chain activity that gets you into the ecosystem, creating value. And to come back to what Barry is saying about good and bad MEV: we are very much not in the game of making that declaration at Mekatek. We know from the research that as soon as we look into the effects of even what is considered the most dubious kind of MEV, like a sandwich attack, it does have aggregate benefits in terms of beneficial routing, not just in terms of incentivization. So it is this weird space of overlapping preferences, and one of the questions is who can participate in creating that liquidity and getting that value.
So bringing developers to the ecosystem, is it a tax on users? Yes. Is it worth it? I'm going to go ahead and say yes. The usual price that a user pays is, say, 2% slippage or whatever on Osmosis; it's barely worth talking about. I know on Uniswap or whatever there are huge numbers, but the 2% or whatever to bring devs into the ecosystem is such a clear win. It might actually be the most effective allocation of capital, better than a grants program, which can be quite political. It's a governance-minimized on-ramp for developers. That sounds amazing to me. **C** (7:11): I think viewing MEV as a way to get developers into the ecosystem is a pretty narrow way to look at it. It's one way developers can come into an ecosystem, but over the past few months and years, Cosmos has not had a hard time getting smart developers into the ecosystem, and that's without creating any kind of social Schelling point around the need to sandwich people or front run people. There are other interesting things to build, and when you don't allow those kinds of strategies, there are also other kinds of strategies that are more technically interesting. But putting that aside for a second, I would also say Skip is not really in the business of making this distinction. We're a product-driven company. This has come from dozens of conversations with validators and dozens of conversations with chains, including Sunny, where people have said to us: okay, we like MEV. We want to help our stakers make more money, we want to help token holders make more money. But our users are afraid of this kind of stuff, and they're afraid of having their money taken, even if it improves optimal routing.
And I think trying to fight that trend is something we don't want to do right now, but it's ultimately something we would also be open to leaving up to governance at the end of the day. It's not really our choice, and we don't think it should be. **A** (8:36): Yeah, going back to the previous point, I feel like in Cosmos, instead of these searchers, there's a really good validator-to-developer pipeline. So many projects start by running validators, then they learn the stack, and then they start to build stuff. So maybe we already have a pipeline for that. **B** (8:55): Exactly. And I think the issue is that if you're a bottom-100 validator, say number 100 or 120 on Osmosis, you're making probably about 40 OSMO a month or something like that. At whatever price that is, that's just not enough to even do things like participate in governance. Sometimes we think about validators and their economics purely in terms of running servers, which is ridiculous. Validators have an essential role in the political economy of Cosmos, and if they're making 40 OSMO, how can we expect people to put in the time necessary to participate in that system effectively? **A** (9:37): So Sean, you guys, Mekatek started as a searcher, would that be fair to say? And then you've pivoted towards building infrastructure. **B** (9:45): Mekatek has always worked on all sides of the market: on the supply side, as in selling block space; on the demand side, as in buying that block space, learning about that market, and building empathy from that perspective; and on the market intermediary that mediates these forces, creates competition, and maximizes value for the ecosystem. So Mekatek encircles all those concerns.
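Much of the value discussed in the rest of the panel comes from a simple price discrepancy between two pools trading the same pair. The sketch below is purely illustrative (the pool numbers and function names are invented, and real pools charge fees): it computes the profit-maximizing back-run trade between two fee-less constant-product pools after a trade has pushed their prices apart.

```python
import math

def optimal_backrun(x_a, y_a, x_b, y_b):
    """Optimal amount of token Y to swap through pool A, then pool B.

    Both pools are fee-less constant-product (x * y = k) markets for the
    same X/Y pair. The trade is profitable when X is cheaper in A than
    in B, i.e. when y_b * x_a > x_b * y_a. The closed form comes from
    maximizing f(d) = y_b*x_a*d / (x_b*y_a + (x_a + x_b)*d) - d.
    """
    d = (math.sqrt(y_b * x_a * x_b * y_a) - x_b * y_a) / (x_a + x_b)
    return max(0.0, d)

def swap(dx, reserve_in, reserve_out):
    """Constant-product swap output for an input of dx."""
    return reserve_out * dx / (reserve_in + dx)

# Pool A prices X at 1 Y; pool B prices X at 1.21 Y (say, after a big buy in B).
d = optimal_backrun(100.0, 100.0, 100.0, 121.0)  # amount of Y to put in
x_bought = swap(d, 100.0, 100.0)                 # Y -> X in pool A
y_out = swap(x_bought, 100.0, 121.0)             # X -> Y in pool B
profit = y_out - d                               # risk-free back-run profit
```

With these numbers the optimal input is 5 Y and the profit is 0.5 Y. That residue is the money that either an off-chain searcher or, in Skip's in-protocol design, the chain itself can capture and redistribute.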
**A** (10:07): Can you guys maybe compare and contrast how Skip and Mekatek differ in how they approach MEV, in Cosmos in general and then maybe also on Osmosis specifically? **B** (10:19): Well, Mekatek is very much a builder-oriented organization, so we just went ahead and built it. We have a market that's live, running with validators on Osmosis today. You go to Mekatek, you can read the documentation, you can integrate. We're probably going to have around 10 to 15 validators as soon as this conference nonsense is done and people get back in front of laptops, and I think you're going to see a lot more voting power go through our system. We also operate searchers that essentially generalize back running to incentivize real validators with real revenue. Then we take a cut of the profit, which we use to redelegate to the validators that help us. We make some money and lend it back out to the community, so that all our profits are incentive-aligned with the overall growth of the ecosystem. So that's what Mekatek is doing. I won't speak for Skip. **C** (11:08): Yeah, I think we've taken a very different approach with respect to Osmosis, as well as the block space market you guys have created in general. Your solution is very similar to out-of-protocol PBS in Ethereum. We actually believe that Cosmos needs a Cosmos-native solution. There's a lot you can do with Tendermint, the Cosmos SDK, and ABCI that isn't captured by that solution. And so over the past few months we've been talking to Sunny, talking to validators on Osmosis, and have developed a module that will be launching on chain that will capture back-running MEV revenue for the Osmosis community in a way that is fully audited, fully visible, and can be reviewed by everyone, and allows the community to decide, okay, where do we actually want this money to go?
How much of it should go to Skip for providing the services, and how much should go to LPs, or the folks actually being back run, or stakers and token holders? So I think this captures the main difference between us: we're trying to go out and figure out what people need, and then look at Cosmos and say, okay, what are the solutions we can build here that you couldn't build in Ethereum? **B** (12:24): Yeah. Just to summarize, I think the biggest difference between Skip and Mekatek is that we did not go through governance, we did not wait, we did not ask for permission, we just built it. And we're relying on the free market to maximize that value. So no governance, and we don't know the price: our searchers that do back running participate in an open API, so they could get front run, they could get outbid. Actually, we hope they do get outbid. If someone outbids our searcher, that means more value is going to the validator. So that's a free-market, governance-minimized approach to this problem. **A** (12:54): So would it be fair to say the distinction is that Skip is taking a very on-chain approach while Mekatek is taking a more off-chain approach? **B** (13:03): Well, yeah. I think what Barry said made a lot of sense, in that we do need some Cosmos-native facilities. Right now the actual implementation is that we just forked Tendermint; both founders of Mekatek used to work on Tendermint, so we know the code base very well, and we made a minimal patch to enable PBS. But it's a pain to maintain. So what we're going to do is upstream those changes into a native builder module that will provide an open API through which anyone can compete with us, or anyone else, to choose the most profitable block from an open network, and that will be a native module. So, one, Mekatek won't need to maintain it.
I like that. And two, it'll provide competition all the way at the edge of the validator, and minimize that integration. **C** (13:51): That's a good point. So Skip is also working on upstreaming Tendermint changes. We're working with you guys, with Marco, with the SDK team to figure out the best in-protocol way to do this in the long term. In the short term, for other chains outside of Osmosis, we're working with Evmos, Juno, Terra, and a couple of others that are not yet announced: more generalized chains where maybe in-protocol arbitrage capture doesn't make the most sense. And there we also have a custom version of Tendermint. Ours is quite different: it doesn't just import out-of-protocol PBS and force validators to sign blocks sight unseen. It relies on native P2P gossiping mechanisms that are already in Tendermint, doesn't require validators to make any new security assumptions, doesn't require them to change their key signing, and is easier to integrate as a first step. The downside, or maybe the upside, is that it's less expressive. We don't build a whole block; we just run an auction to build a few transactions at the top of the block, and then we let the validator build the rest of it. That way, even if folks don't trust us, we can't do anything related to censoring or anything malicious, and we can ship bundles that are small enough to show people, hey, we're not sandwiching or front running your users, and build that trust. **A** (15:15): So with Skip you guys are making the explicit requirement that people can't use it for front running or sandwiching. How do you do that? **C** (15:26): It has to do with the kinds of bundles that we allow. In theory you could still probabilistically try to front run someone using the public mempool, and we don't deal with that.
But when you're submitting bundles to our infrastructure, we basically ensure that the person who's submitting the bundle has signed all of the transactions in that bundle after the first one they didn't sign. So you can only put your transactions after other folks' transactions in our bundles. **A** (15:55): So Sean, you mentioned that you guys have about 10 to 15 validators on Osmosis running this. Is the list of validators public right now? And is it important for validators to be transparent about it? **B** (16:10): I think they should absolutely be transparent, and they should be governed by their delegators. The system is completely opt-in, and the idea is that we are not going to be the only builder. Our philosophy is that the thing that will aggregate the most value is a competitive market. So we cannot be the only builder. There must be other builders that operate with different theses, in the same way that bloXroute has a builder in Ethereum, an "ethical" builder that sometimes wins blocks. And this of course ensures that no one builder can censor the entire network. As long as there's choice, we don't really have to worry about censorship at that level. And the goal is not only to maximize revenue for large validators; it's specifically to equalize revenue for small validators. If you have a big validator who vertically integrates, like if we move MEV to the consensus layer, then big validators are going to have a disproportionate advantage. Small validators are not going to do this. And the thing about most Cosmos networks, including Osmosis, is that big validators have a tendency to vote less. They vote more conservatively.
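The bundle constraint Barry describes above ("you can only put your transactions after other folks' transactions") amounts to a simple ordering check. The sketch below is an illustration of the stated rule, not Skip's actual implementation; the function name and the list-of-signers representation are invented for clarity.

```python
def is_backrun_only(bundle_signers, searcher):
    """Accept a bundle only if all of the searcher's own transactions
    come after everyone else's, i.e. the bundle matches the pattern
    [others' txs...][searcher's txs...].

    bundle_signers: signer IDs, one per transaction, in bundle order.
    """
    seen_own = False
    for signer in bundle_signers:
        if signer == searcher:
            seen_own = True
        elif seen_own:
            # A non-searcher tx appears after a searcher tx, meaning the
            # searcher would sit in front of it, so reject the bundle.
            return False
    return True

# Back-running a victim trade is allowed...
assert is_backrun_only(["alice", "searcher", "searcher"], "searcher")
# ...but front-run and sandwich positions are rejected.
assert not is_backrun_only(["searcher", "alice"], "searcher")
assert not is_backrun_only(["searcher", "alice", "searcher"], "searcher")
```

In a real relay the check would run over the actual signatures on the submitted transactions rather than trusted labels; the point is only that the bundle shape itself rules out front-running and sandwiching through this path.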
They represent a larger delegation of people and can be less, let's say, engaged in taking the risk necessary to move the ecosystem forward. So we're trying to fix this by making sure that every validator can opt into some proportionate amount of revenue without taking this censorship risk. And I think we've done this. **C** (17:49): I think Skip has completely the same point of view, that you need a competitive market for these kinds of things. On the flip side, in the case of Osmosis, I think you end up with a more expressive kind of protocol value capture when you put it in chain. My personal point of view, not really speaking for Skip, is that if something can be in protocol, if it's technically feasible to put it in protocol, it probably should be, so that it's treated the same as anything else and is not some sidecar that's attached and poses a potential cartelization risk. Which is why I think upstreaming a builder module in partnership with you guys will be exciting. But it's also why we're experimenting with this on-chain approach first with Osmosis, because Sunny and the team have been so open to that experimentation, and we'll see how it goes. We're going to see how governance actually wants this money to be distributed. I suspect it won't be that large validators end up with more of it. And actually, because it's in protocol, you could have a system where folks decide that the smaller validators, who are getting less revenue regardless, should take a larger portion of this revenue so that we can equalize some of that, and that can be voted on in chain. **B** (19:03): But the thing about doing anything in protocol is that the question is not what's in protocol; the question is what is out of protocol.
What we're talking about is designing a mechanism, whether it's a Skip mechanism for the block space or a Mekatek mechanism for the block space. The question is not how much is extracted; the question is how much is aggregated, how much of the total value is actually captured by the mechanism. So if we say, let's put a searching algorithm in every validator, and it can only do two swaps or two levels or whatever, because you just have a limited amount of space or low specialization, that means there's going to be money outside of protocol that anyone could capture. So if we value the solution space by how much of it is captured by validators, it's the superset of what's in protocol and out of protocol that matters. **A** (19:54): So related to this, how do both of your solutions change once threshold decryption is live and most user transactions are encrypted in the mempool? What happens to that value? Does it get given back to users? How do your systems... **B** (20:16): I think either way, no matter what the mechanism is, the core of it is how competitive the market is: how much sovereignty, how much agency does the validator have to make a choice over what it proposes, and how many things does it have to choose from? Insofar as it is a competitive market, you're naturally going to see the margins of searchers get squeezed by that competition. And that's the kind of outcome we're looking for, no matter what encryption or whatever is on the other side of that. **C** (20:45): Yeah. So I think for us it's a lot simpler once we have threshold encryption.
Actually, what Skip is doing now, which is back running every transaction on the chain on behalf of the Osmosis protocol and giving that money back to Osmosis users, is going to be easier, because we won't have to worry about searchers capturing back-running opportunities from unconfirmed transactions in the mempool. We can just run a convex optimizer in chain at the start of every block and find the global clearing prices for assets on Osmosis, which I think is pretty cool, and then give all that money back to stakeholders. And in terms of competition, I don't think this conflicts with competition in any way, because you can pair this system with an in-protocol auction system. And then, as much as it can be in protocol, you can also control where the revenue of that auction goes, which is another cool thing. So you can have in-protocol competition as long as you can have it sealed-bid, and if we have threshold encryption, I think we can really get there. So I'd push back on the notion that putting things in protocol is anti-competitive, and suggest it's more about having control and visibility into where the proceeds from the competition go. **B** (22:06): But here's exactly how it's anti-competitive. If I'm a searcher and I run a GPU, I'm going to find more opportunities faster than you. If I do it off chain, I just need to run it on one node specialized for exactly that purpose. If we move that in protocol, it means that every validator essentially becomes Solana. **C** (22:23): You can still bid in protocol. You can just submit your transaction in protocol and try it. And if we decide with Osmosis and the community that we want to put that above our convex optimizer, great, let's do that. **B** (22:38): But you back run everything, so who are you open to being front run by, is my question. **C** (22:45): So yes, if that's what the Osmosis community wants.
Again, we don't care so much about the details and more about the alignment. **A** (22:54): The way I see it, the back-run arbs use information that is completely on the Osmosis chain, so that's something the chain can do. It's not going to be the perfect arb; we don't want to spend too many compute cycles on that. But if we can capture 50 to 80% of it, we can let off-chain searchers capture the rest. So I think these things work in conjunction. The real place where the auction comes in is when you have things related to off-chain information, like if you're arbing prices against a centralized exchange. There's no way the Osmosis chain can know the right arbs to do, and that's where it comes in. **B** (23:42): But I guess I don't understand why a searcher who could front run the system would leave anything left. If you're on the tail end, so you get the bottom of the block and something else gets the top of the block, what's the incentive for leaving anything at all? **C** (23:57): So maybe it's a question of whether you view it as there being a fixed amount of opportunity or not. Looking at how traditional finance has evolved over time, since that's my background, for better or worse, what you see is people exploiting information-asymmetry advantages at very high frequencies and at lower frequencies. And to Sunny's point, part of what searchers are going to be doing is looking at opportunities that the protocol fundamentally can't capture. **A** (24:30): Right. **C** (24:30): CEX-DEX arbitrage is something we couldn't do in protocol, or if we did, it would be extremely dangerous and we wouldn't want to do that.
Or things related to cross-domain MEV, where perhaps there's an opportunity that arises on Kujira and Osmosis and you need block space on both. Again, we couldn't do that. So I do think these ideas work in conjunction, and it's very good that we have different approaches going about it so far. In the end you're going to have some combination, right? **B** (25:04): I think they can be good. I'm just worried that putting more burden on validators to do stuff that doesn't generate value creates negative utility for the ecosystem. **A** (25:14): I want to switch to another topic. The Atom 2.0 white paper included this idea of an interchain scheduler, they called it. It's effectively a block space futures market. What are your thoughts? One, do you think chains are actually going to use it? And two, how does this futures-market approach differ from the more just-in-time block space markets that both of you are building? **C** (25:44): So we're super excited about this. We've been going around banging a drum saying MEV is the best way to have financial sustainability and token value appreciation, and now to see Ethan and the folks at the Hub get up on this stage and say, we think MEV is the best way for the Hub to actually get ATOM's value to go up, that's great for us. I think we agree at a high level that this is a good economic direction for the Hub to go in. I have questions around how it will work: how do you resolve multidimensional auctions for different block space? How do you express those kinds of preferences? Do you build the whole block, or are we auctioning off parts of blocks, and how does that all work? Also, how much DeFi activity will there be on the Hub versus not on the Hub? But we're excited to work to answer those questions with Zaki and the team actually building this, and to try to contribute there.
And then on high- versus low-latency block space auctions, I think you need both. The low-latency auctions are useful for the block ahead, when you know an opportunity exists, and fundamentally part of that is going to have to be on chain, or off chain if you want a generalizable solution. But that doesn't mean you can't also sell block space a long time in advance at some average price and have these systems interact. So I see them as complementary. **B** (27:13): Yeah, I kind of agree with what Barry is saying. Mekatek takes a very value-maximizing approach, so we look at everything incredibly critically, not just this mechanism, but this one in particular. The way you maximize value for a validator is by lowering risk: as a searcher, you're only going to bid your risk-adjusted return, because that's the value of getting something executed. And a lot of things introduce risk. Time introduces risk, adding randomness introduces risk, having to spam a chain to get executed introduces risk. All those things may seem like they prevent extraction, but they actually create more extraction, because it's more wasted effort for everyone. So with this idea of selling block space in the future: what am I going to bid on that block space? I'm going to bid at the point of minimal information, some pretty much irrelevant, bottom-of-the-basement price. And then the questions are: how long do I hold that future? Is it fungible? Can I sell it later? Because if it's locked in by the system, and the ecosystem converges on a mechanism in which block space is sold once a day, the clearing price for that block space is going to be terrible.
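B's clearing-price argument can be illustrated with a toy Monte Carlo model (every number here, the `risk_aversion` shading rule, and the fixed signal noise are assumptions for illustration, not anything either team has published): risk-averse searchers shade their bids by a premium that grows with how far in advance the block space is sold, so a day-ahead futures sale clears well below a block-ahead auction.

```python
import random

def clearing_price(n_bidders, value_sigma, risk_aversion, trials=20000, seed=7):
    """Expected highest bid in a toy block space auction. Each searcher
    values the block at 100 on average; value_sigma is how uncertain that
    value is at bid time (larger when selling further in advance), and
    risk-averse bidders shade their bids by risk_aversion * value_sigma.
    Private signal noise is held fixed at 1 for simplicity."""
    random.seed(seed)
    total = 0.0
    for _ in range(trials):
        bids = [100.0 - risk_aversion * value_sigma + random.gauss(0, 1)
                for _ in range(n_bidders)]
        total += max(bids)
    return total / trials

# Block-ahead sale: the opportunity's value is nearly known, little shading.
jit = clearing_price(n_bidders=5, value_sigma=1.0, risk_aversion=0.5)
# Day-ahead futures sale: large uncertainty, heavy shading.
futures = clearing_price(n_bidders=5, value_sigma=30.0, risk_aversion=0.5)
```

Under this toy model the block-ahead auction clears near the searchers' expected value while the day-ahead sale clears far below it, which is the "bottom of the basement" outcome B describes.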
And that means, again, that more searchers are going to go off chain and circumvent this mechanism, and then what value is going to accrue to stakers at that point? So I think the mechanism has a little while to go before it's incentive-compatible with where we see the ecosystem going. **C** (28:58): Yeah, that's the area I think we'll play a part in: figuring out a fair clearing price, potentially competing there, and then aggregating demand from searchers, where we'll sell block space back to them in real time, cut up in a way that's off chain but still drives value on chain. So I think you're completely right that it's incentive-incompatible right now, and I hope that what Skip and Mekatek are about is making it incentive-compatible. **B** (29:28): I mean, I'll buy it for cheap. Yeah, exactly, I'll buy it for cheap. But then we're a centralized party. We've been in the market for nine months now, and it seems counterintuitive, but what we want is more people; we want the capital requirement for participating in the network to be as low as possible. That's what's going to maximize value. So even though it benefits us, I don't think it benefits the kind of world we want to see. **A** (29:53): Yeah, I guess one last question: will chains opt into something like the Atom 2.0 scheduler? Osmosis is building this into the Osmosis protocol itself; wouldn't all chains want to build it into their own protocols? **B** (30:12): But what is this thing about "all chains"? Until Flashbots, we really believed that no one would ever run custom software, that every validator would just pull from GitHub and participate. That argument was stronger in Ethereum, where miners didn't really play the central role that validators do in Cosmos. Cosmos validators are sort of the bastion of voting; they play an important role, so they have a lot of agency and also a lot of responsibility. So this idea that it's going to be one-size-fits-all, that the protocol is going to act as a unified force, especially when we start introducing ideas like mesh security, where there are multiple participants in multiple networks: validators are going to do what validators are going to do to perform their role, which is vote on governance. **C** (31:04): I think the question is: does it make sense for Osmosis, or another chain with a lot of DeFi activity, to sell its block space on a market where other chains are also selling theirs? **A** (31:22): Basically, is there going to be one market for block space across multiple Cosmos chains at the same time? Or is it going to be many parallel auctions running at the same time, where, if you want to win, you have to bid correctly in both parallel auctions at once? **C** (31:42): So there are two pieces to this. One, the intersection of block space is probably more valuable than the union of it: for overlapping block space on two chains, for example, you're probably willing to pay more than two times as much. And then there's also the question of, well, if I'm a chain, that's probably more revenue for me, and that's good, but what token does that auction settle in? If you want your auctions priced in OSMO and Juno wants theirs priced in JUNO, how do we make a good user experience for participating in this auction?
And then, if every chain runs its own auction, how do we make a good user experience for figuring out what to bid and participating in all of those auctions simultaneously? I don't have the answer to that yet, so I think you're going to see a lot of folks play with different models there. **A** (32:38): And also just how revenue gets split, right? Block space on Osmosis is probably way more valuable than block space on a non-DeFi chain. **B** (32:45): How dare you. How dare you. Bias. No, Mekatek will have serialized execution across zones next year. We'll be live with single-chain block space on Juno and Evmos and so on by the end of the year, and then we'll figure out the combinatorial auction bit and do that next year. It's tricky, but it's not impossible. It'll happen sooner than you think. And I think it needs to happen, to prevent validator centralization. **A** (33:11): Cool. I think we are out of time and I have so many questions left, but thank you guys for joining us. **B** (33:20): Thank you.
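The combinatorial auction piece the panel closes on can be sketched as a winner-determination problem: given bids on bundles of block space (the chain names and amounts below are invented for illustration), pick the non-overlapping set of bids that maximizes revenue. A bundle bid spanning two chains can then beat the sum of the single-chain bids, which captures the point that the intersection of block space is worth more than the union.

```python
from itertools import combinations

# Hypothetical bids: (bidder, bundle of block space items, amount).
bids = [
    ("solo1",  frozenset({"osmosis-block"}),                10.0),
    ("solo2",  frozenset({"juno-block"}),                    4.0),
    ("bundle", frozenset({"osmosis-block", "juno-block"}),  17.0),
]

def clear(bids):
    """Brute-force winner determination: choose the revenue-maximizing
    set of bids whose bundles don't overlap. Exponential in the number
    of bids, which is fine for a sketch."""
    best, best_rev = [], 0.0
    for r in range(len(bids) + 1):
        for subset in combinations(bids, r):
            items = [item for _, bundle, _ in subset for item in bundle]
            if len(items) != len(set(items)):  # bundles overlap; infeasible
                continue
            revenue = sum(amount for _, _, amount in subset)
            if revenue > best_rev:
                best, best_rev = list(subset), revenue
    return best, best_rev

# The 17.0 bundle bid beats the 10.0 + 4.0 solo bids combined.
winners, revenue = clear(bids)
```

Winner determination for combinatorial auctions is NP-hard in general, so brute force only works for a handful of bids; a real cross-chain design would need approximation or restricted bundle structures, which is presumably the "tricky" part B alludes to.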