Hello and welcome to episode number 19 of the Awesome Algo podcast. We have a very special and anonymous guest today. His name is D13 and he is from the D13 collective, which was originally, I assume, formed as a collective of developers and enthusiasts building projects on the Algorand blockchain and generally organizing and performing a lot of different useful activities in the ecosystem.
And we have a rather packed agenda today with a variety of interesting topics that we hopefully can cover in time. This will include D13's thoughts on the consensus incentives; the shift towards peer-to-peer networking that is currently happening, which should eventually remove the reliance on relay nodes, for a long time one of the main centralization concerns around the Algorand blockchain; some interesting projects that D13 is working on himself; and some interesting advanced use cases for box storage. And in the closing thoughts, we will try to speculate a bit, hopefully in a positive manner in this rather bearish end of the year, about the future of Web3.
If you had any questions for D13, you had the opportunity on the askawesomealgo.com website, where you can ask a question on chain for free, and I think we have a few that we will try to cover at the very end of the episode. And with that, once again, D13, thank you very much for coming to this podcast. I've been following your work.
I think you are making amazing contributions to the ecosystem, and you're always there whenever there is a major event or an interesting new feature of the Algorand blockchain to cover. You're one of the engineers, and generally one of the people in the ecosystem, who I think inspires others to come and build on this particular network.
And with that, I think the stage is yours. As we usually do on this podcast when we get guests from engineering backgrounds, I'd love to start by covering a little bit of your academic and professional background. And I understand that, given the anonymity aspect, you can skip certain parts of it, but we could maybe start with... We'll paint in broad strokes. So first of all, it's my genuine pleasure and honor to be here.
Your podcast is one of my favorites. I'm massively into podcasts anyway, but whenever there's a new Awesome Algorand episode, it goes to the top of the to-listen list. So, my pleasure, and what an intro. Honestly, I got goosebumps at some point during that. So thank you very much. Yeah, a little bit about young D13. I got into technology very, very young, before it was fashionable even.
And I was a little bit of the stereotypical anti-social kid who was just fascinated by this box that ran Windows 95 and it had infinite depth and allowed you to delete files from System 32 and then build it up from scratch again. And yeah, I think my first foray into programming was building a webpage. I actually didn't have a hobby aside from my computer and so I borrowed a hobby from a friend. He was big into tractors for some reason. And so we made a website about tractors, just HTML, right?
Which is not a programming language, right? It's a markup language, but okay. So that was my first experience. And then I think the first actual programming language was PHP, and I kind of got into it from that. And academically, I also studied in the field. My mom wanted me to be a computer scientist, but I ended up as a web developer, so a slight disappointment there. Right. So yeah, I'm a software developer in real life, mostly around the web stack.
And yeah, the premise of blockchain and Web3 in general, especially from the second generation of blockchains onwards, where you can actually do stuff aside from sending coins to one another, there's something really, really exciting about it. I'm not sure if the cypherpunk dream is completely dead yet. I think that some people are still fighting for it. Yeah, but that's one of the things that excites me about this space.
Maybe we can expand a little bit on that excitement about blockchain and Web3 in general. The learning curve for transitioning from an engineering background in web development into Web3 is usually easier for some people, considering the dominance of things like JavaScript and TypeScript in the broader tooling.
If you look at, for example, Ethereum or other big ecosystems, that makes sense; I assume it comes from the adoption angle. People are trying to attract engineers, and they often pick the languages or technologies that have the most people working with them, and there are a lot of people in web development in general. But what initially got you fascinated with distributed systems, fault-tolerant systems, and blockchain technology in general?
That would have been before Bitcoin was even a thing, during my studies. Let's say that I'm mid to late 30s. The challenges of distributed systems always looked really interesting to me, with many elegant and some non-elegant solutions. The whole Byzantine generals, fault-tolerance situation: how to have a consensus that can carry on even in the presence of an almost-majority of malicious actors. I mean, that's kind of fascinating, right?
Generally with blockchain specifically, I got in fairly early. So I was early in Bitcoin and it was just fascinating to think that a community can make something that's valuable, that's recognized as money without being given permission, so to speak. The financial system is still massively gated and permissioned. There are easy ways for you to get started with Stripe or PayPal or something, but then your access to that can be yanked in a second.
And then you're stuck speaking to customer support agents who are following an anti-spam-and-scam script. And here you are with your business, trying to figure out why you can't process payments. So that's one of the things that drew me to the early Bitcoin days. I didn't really think it would be particularly successful, but still, I was down to try it. I got a friend excited about this as well, and we built some custom rigs with GPUs to actually mine a little bit of Bitcoin.
The first priority, as rational economic actors, was to pay off the rig. Am I regretting that now? But I did successfully pay off the rig and then had a few satoshis left over. From a development point of view, though, I'm not sure how much there was to create back then, aside from maybe a mining pool or a payment processor to just send or receive Bitcoin. There wasn't very much more to do, and neither of those was something that drew me to actually code with it.
I was just a user, and a miner for a little while, while GPU mining was viable. And then I've been on and off with the blockchain space since then. But at the tail end of the last bull run, around November or December of 2021, I was looking around for a great blockchain platform that does smart contracts as well.
Because once smart contracts come into the equation, then you could just code what is effectively an autonomous agent that will be hosted for you forever on the chain. And if you don't mess up the logic, it might just operate basically forever or as long as the blockchain is alive. So I looked at the most popular and some of the up and coming L1s at the tail end of 2021 and Algorand stood out by about a mile. Like there wasn't a question of going elsewhere.
This is of course coming from a hobbyist perspective, right? The calculus may have been different if I was, you know, kind of quit my day job and start a business and try to feed my family and my employees' families out of this. But my involvement in this space is intentionally not monetized. And this is basically my hobby and a creative outlet.
But honestly, comparing Algorand to previous-generation blockchains like Ethereum, or even to contemporary competitors, it's just like the brief for creating Algorand was: all right, reconsider absolutely everything about what a blockchain is and how it works, down to the damn addresses, and think: if we could do this from scratch in the optimal way, what would it look like?
The fact that our addresses are base32 instead of base16, and that they also have a checksum at the end, means that you can't accidentally flub a single digit or letter and send your money into a black hole. That is brilliant, right? Or the fact that we have standard assets: when you buy my token or any token, there are basically two fields to look at to see if it can be frozen or clawed back, and that's it. You can trust that it will behave as every other ASA will.
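Both safeguards can be sketched in a few lines of Python. This is an illustrative sketch, not the official SDK implementation: the address scheme is base32 over the 32-byte public key plus the last 4 bytes of its SHA-512/256 hash, and the ASA check just inspects the two relevant fields of an asset's parameters (the example params dict below is entirely made up).

```python
import base64
import hashlib

def algo_address(pk: bytes) -> str:
    # Address = base32(public key + last 4 bytes of SHA-512/256(public key)),
    # with the base32 padding stripped, giving the familiar 58-char form.
    checksum = hashlib.new("sha512_256", pk).digest()[-4:]
    return base64.b32encode(pk + checksum).decode().rstrip("=")

def is_valid_address(addr: str) -> bool:
    # Re-derive the checksum; a single mistyped character fails this check
    # instead of silently sending funds into a black hole.
    if len(addr) != 58:
        return False
    raw = base64.b32decode(addr + "======")  # restore base32 padding
    pk, checksum = raw[:32], raw[32:]
    return hashlib.new("sha512_256", pk).digest()[-4:] == checksum

def is_plain_asa(params: dict) -> bool:
    # The two-field trust check: when an ASA cannot be frozen or clawed back,
    # these fields are absent (or empty) in the asset's on-chain params.
    return not params.get("freeze") and not params.get("clawback")

addr = algo_address(bytes(32))           # dummy all-zero public key
print(is_valid_address(addr))            # True
print(is_valid_address("B" + addr[1:]))  # False: the checksum catches the typo
print(is_plain_asa({"total": 1000, "decimals": 0}))  # True: neither field set
```

The point is that a wallet can reject a mistyped address locally, before any transaction is built, and that an asset's trust surface is two fields rather than arbitrary contract code.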
On Ethereum, there are websites that analyze ERC-20 code to see if it is vulnerable in some known way, but you can always make it vulnerable in a brand-new way. There was, semi-recently, a token by an influencer that had a bug in its transferFrom function that he later claimed was intentional.
It's a lot more complicated and dangerous than it needs to be, and having at least the base asset standardized, so that you can trust that its mechanics will work exactly as advertised, that's pretty big. If I may ask a side question on this: when you were assessing different chains, as you said, at the tail end of the bull run in 2021, what were the main criteria for that assessment?
Was it mostly developer experience, were you approaching this on a more balanced scale, or did you also look at the consensus itself? Just curious how you assessed the individual L1 chains that were rising in that particular year and came to the conclusion with Algorand. It was a little bit of everything. There's the user experience that the chain enables. There's the developer experience, which I guess has at least two aspects.
One is how easy it is to get started, but two is how powerful the offered primitives are, because that limits or enables how complicated the things you can build are. The consensus mechanism, I think, was a big selling point, and not only from the aspect of elegance but also from the things it enables. Having instant finality and, I think it was, a 3.7-second block time, or it might have been a little more.
Let's say four seconds, so four seconds to finality from the time you submitted your transaction. That's basically unheard of, still. Maybe Solana can compete with that, but later on, as part of the agenda, we'll see how Algorand is scheduled to evolve with regards to that as well. Both from a user experience and from a developer experience point of view, being able to forget completely about probabilistic finality, knowing that if it's in a block, it's canonical, full stop.
That's pretty great, right? Otherwise, you usually have to wait a few rounds and see if it stays in the chain long enough, so to speak, and then there's some arbitrary number of blocks after which you consider things final. There's a lot to having almost instant finality as soon as a block is minted. The smart contract layer as well: you can do almost everything you might want.
I think part of the design has been for performance and throughput, and so the AVM smart contract language, TEAL, or ASC1 if you want to call it that, is intentionally restricted in a lot of ways. Maybe some of that could be lifted a little, but yeah, I think it was the overall package. And it went through quite a journey, to be frank, right?
If you look at the evolution of the AVM and the things it was capable of just three, four years ago versus now, especially with the introduction of inner transactions and box storage, and of course there are always additional layers that could potentially be improved, it is quite a significant chunk of things that the folks at Algorand Inc. managed to squeeze in over the past, say, three years in comparison to the original version.
But if I recall correctly, yes, it's indeed something that didn't start with an extremely ambitious long-term feature vision relative to what we have now. Though of course there are long-term considerations like post-quantum security, right?
But I think, if I recall correctly from the chat I had in episode two with Zev from Algorand Inc., they had a very particular set of requirements, and it was optimizing for performance. They started with that and then evolved through very careful design considerations. So that is the whole premise and, I guess, the beauty of it, allowing it to have such fast finality.
But going further, a lot of people got into Algorand development, maybe building decentralized apps, around the tail end or the beginning of the bull run in 2020, 2021. And if you look at the current state of the Algorand ecosystem, I think the big shift that has happened concerns the original intent from Silvio, which was, by design, not to have incentives.
Not to have incentives for running participation nodes, that is. This shifted after, I guess, many years of seeing in practice that we live in a very materialistic world, despite the great, maybe altruistic, vision behind it. It's a step towards further decentralization, but it also tackles the question of how we actually give people an extra incentive, a reason, to run a participation node.
Because it's no longer this simple piece of software that you can just run on a Raspberry Pi and put somewhere in the corner of your room. The amount of things being added to the AVM, and generally to the infrastructure tooling, increases the complexity, which increases the hardware requirements. So it no longer has the advantage of that simpler premise, that it takes less compute power to run.
So at this point, I feel like this really comes in as a natural part of evolution, which also ties in with what Silvio was originally planning: I think every consensus for a good decentralized system should have some notion of adaptability, right? You can't just have a set of rules that sustains for thousands of years, because software degrades over time.
No matter how good it is, over a large span of time it's always going to need some maintenance. So that aspect of adaptability in the consensus is what enables this transition as well. But making a transition into some of the main topics of the agenda, I wanted to hear some of your thoughts on the consensus incentives in the ecosystem.
And maybe you could do a very short recap for listeners who may not be aware of the latest state of the discussions in the ecosystem in this regard. But yeah, what do you think are the main advantages and disadvantages of introducing consensus incentives at different levels? One camp says let's have it at the protocol level, versus the other that says let's have something that resembles the existing consensus programs in the ecosystem a bit more closely.
Yeah, for sure. First of all, I think that a big part of wisdom is looking at the things that you're trying, especially the most ambitious ones, and realizing when they are not working. As you said, the original vision for the consensus layer of Algorand was that the hardware requirements were going to be low enough that people would be able, and glad, to run participation nodes without any compensation. I think this largely failed as an experiment.
The stats speak for themselves. And I'm not just saying that from the sidelines: I've been fairly prominent in helping people, either one-to-one or via written guides, set up participation nodes for maybe about a year and a half now.
But people are either not able to do so technically, which has started to be addressed by initiatives like the one-click nodes that the foundation does, and another third-party one-click node interface called Aust One-Click Node, I think, by Aust. So that's one aspect: how hard it is to set things up. Before the one-click initiatives, you had to be fairly well versed in setting things up, usually on Linux.
Now, this is getting easier, and it has been getting easier for some time. The second part is: even if you are able to, would you be willing to, especially considering that you need to park some algo in an account that's basically not getting any yield? If you zoom out a little bit, the governance incentives from the foundation have been transitioning from super passive to a little more involved, a little more engaged.
In the early days, if you recall, there were passive rewards, where rewards would accumulate until a certain round and then, the next time you transacted, your account would get a little bit of yield based on your balance. And then that stopped.
I'm not sure if it overlapped with the first period of governance or not, but during Algorand governance you were basically incentivized to keep your funds in an account outside of the DeFi economy. And so there was a clash of incentives, because people were building DEXes and lending protocols and this and that on chain, but they had to compete with, if I recall, an 8% APR yield for not getting involved in their protocols.
And when one of them is custodial and has smart contract risk and all of that, and the other can be parked on a Ledger where it just appreciates in algo terms at like 8% a year, well, the foundation recognized that this setup was stifling the on-chain economy, and it did adapt quite well. First we got some liquid governance protocols, and then these were heavily incentivized.
So I believe the APR for vanilla governance right now might be something like 4%, and via DeFi last period it was something like 18%, which, I mean, I agree with, but it does clash with consensus participation unless you can combine the two. While we had AlgoFi Vaults, you could actually combine the two: if you were getting rewarded for participating via AlgoFi anyway, and you wanted to, you could also participate in consensus via your AlgoFi Vault.
Folks Finance recently also enabled that. Their V2 model has a personal escrow account that's quite similar to what AlgoFi Vaults used to look like, whereas in V1 everything was actually pooled into a single account, and so participation would not have been possible that way. And I think that, at least in the short term, there are discussions about incentivizing consensus participation via some kind of governance boost: like the DeFi boost, there could also be a consensus participation boost.
And personally, I think that is a good idea to at least investigate in the short term. We also have a data point to look at to see if something like this is even going to make a difference, which is Folks Finance, which ran their own consensus initiative as a surprise about a week or two after last period's governance commitment ended. They just announced: if you participate through your Folks escrow, you get your share of 50,000 algo.
They didn't have a way to measure eligibility and then distribute the rewards accordingly, so they said it would be a flat reward of 50k algo divided by however many participants. They are also doing a similar program for this period; I think they are allocating something like 70k algo right now. And by the way, one of the things we can see this from is a small little site that I've built called consensus.algorand.observer.
If you go there, you're going to see a table that looks very much like the ledger database table on each algod node, which includes the accounts that have declared themselves to be online, as in participating in consensus. And I added a few fields there for looking up which ones are AlgoFi Vaults, rest in peace, or Folks Finance escrow accounts.
And what we can see from that is that there are 131 accounts registered online that are Folks Finance escrow accounts, out of 335 total online accounts in Algorand consensus. Taking away a few that have super low balances and might have been inactive for a few years now (which doesn't really hurt, since they hold almost no stake), we are talking about something like 40% of participating addresses, by absolute number, being run as part of this program.
Now, by absolute stake, the number is a lot lower: it's actually 22.2 million algo online through Folks. So I think that, at least in the short term, getting more people involved in running nodes and securing the network is something that needs to happen. The number of participating nodes during the tail end of last year dropped to maybe under 150 or so. Don't take my word for the exact figure, but the consensus numbers dropped significantly.
And then at some point recently, there were some concerns about an Algorand whale that, I believe, turned out to be a Korean exchange accumulating a lot of algo; at least, I accept the analysis that showed it was one of their wallets. But around that time, the Algorand Foundation put a lot of their own stake online.
Currently, if you went to consensus.algorand.observer and you filtered by the labels and you added up all of the foundation labeled addresses, they would add up to 942 million Algo out of 1.4 billion online. So by stake, the foundation currently has 67% of the online stake.
And so the consensus incentives, and, as we'll talk about later, the shift to peer-to-peer, will actually take us from our present state, which is, in my view at least, a protocol that is decentralizable but not quite decentralized yet. Hopefully, consensus incentives, both in the short term and in the longer term, along with leaving the relay nodes behind, are going to help truly decentralize Algorand. So would you say it's a fair assessment to say that?
And I certainly agree with your point in this regard. Given that there are already protocols and mechanisms in the ecosystem that were built as protocols adjusted to the governance models, there are mechanisms in place that pair nicely, or could pair nicely, with providing incentives without actually touching anything at the protocol level.
So continuing to experiment in this regard should be a good indicator of whether it makes sense at all to attempt any changes at the protocol level, which I assume is already what protocols like Folks are doing by introducing incentives at the level of their own protocol, while also playing well with vanilla governance.
And I'm not sure about xGov completely, though, whether there are mechanisms for consensus participation if you're an xGov member as well. They are completely separate concerns. So yes, you can participate in consensus and also be an xGov. Yeah. So in that regard... Yeah, I agree with experimenting with the governance model. Something else that needs to be solved is how to reward consensus participation. Folks Finance had a question about how they should do it for this period.
And I came up with a way and suggested to them that I believe there should be two different criteria. One is eligibility: whether someone should get rewarded at all. The other is how the actual rewards will be distributed, which is a bit more banal. But the problem is, when you're trying to figure out incentives in a consensus protocol that was not designed with incentives in mind, you need to figure out how you're going to punish, or address, those that are not participating properly.
Without any slashing of stake, which is the traditional way of punishing participation nodes that misbehave, the only thing you can take away is the rewards. And I think doing this in an epoch, so to speak, such as the three months of governance, and looking at your average voting rate over that period, is actually a really good way to filter out those who were participating well enough from those who weren't.
And after you've filtered by that, the best way to distribute rewards is actually by block production, because that is linear in your stake, whereas votes follow more of a logarithmic curve. And do you think this is something that, in an ideal world, is the due diligence of every protocol, or should it eventually be standardized as well, maybe as some sort of ARC that everyone follows? This is the first time that something like this will be tried on Algorand.
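The two-stage scheme described here might look something like this in code. All account names, vote rates, block counts, and the reward pool below are made-up illustrative numbers:

```python
# Hypothetical per-account stats gathered over one ~90-day governance epoch.
accounts = {
    "NODE_A": {"vote_rate": 0.98, "blocks_proposed": 120},
    "NODE_B": {"vote_rate": 0.95, "blocks_proposed": 60},
    "NODE_C": {"vote_rate": 0.40, "blocks_proposed": 45},  # mostly offline
}

def distribute_rewards(accounts, pool_algo, min_vote_rate=0.9):
    # Stage 1, eligibility: drop anyone whose average vote rate over the epoch
    # fell below the threshold (the only "punishment" available without slashing).
    eligible = {k: v for k, v in accounts.items() if v["vote_rate"] >= min_vote_rate}
    # Stage 2, distribution: split the pool pro rata by blocks proposed, since
    # block production is linear in stake, whereas vote counts are not.
    total_blocks = sum(v["blocks_proposed"] for v in eligible.values())
    return {k: pool_algo * v["blocks_proposed"] / total_blocks
            for k, v in eligible.items()}

rewards = distribute_rewards(accounts, pool_algo=50_000)
for name, amount in rewards.items():
    print(name, round(amount))  # NODE_A 33333, NODE_B 16667; NODE_C filtered out
```

The design point is that the vote-rate threshold is a binary gate (reward or no reward), while the actual split uses block production as the stake-proportional signal.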
I think it should be tried before it's evolved and then eventually standardized. There are at least three different potential consensus incentive avenues. One is already in progress, which is a private initiative by Folks Finance. I do applaud that massively, and I'm a big fan of the fact that they are putting essentially their own algo, from their own treasury, into doing this. Another is, as we said, a theoretical consensus boost via Algorand Foundation governance.
And the third, which is already being researched, is a protocol-level incentive. The research that we have right now, in the form of an actual pull request on GitHub, is something that would divert a percentage of the fees of each block to the block proposer. The problem with the protocol-level approach is that you can't really do slashing.
And the benefit of doing it via Volks or via Algorand Foundation is that you can look at someone and say, you were meant to be online for three months, but for like one month out of that, you didn't vote once. So another thing to note about the Algorand consensus protocol is it is the voting part that makes it vulnerable to stalling. So during a single round of Algorand, there's going to be a block proposal and then two different votes.
So if let's say that 50% of the network was not proposing blocks, that could actually cause some delays, but it would not stall the network because that round would time out and then another proposer would be selected and eventually a block would be proposed. If 50% of the network was not voting though, no block would actually pass the committees and be finalized. And so the network would be stalled.
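As a toy illustration of that asymmetry (the 50% threshold and the outcomes here are purely illustrative, not Algorand's real committee parameters):

```python
# Toy model: absent proposers only delay a round, because the round times out
# and another proposer is drawn; absent voters prevent certification entirely.
def round_outcome(proposer_frac_online, voter_frac_online, vote_threshold=0.5):
    if proposer_frac_online == 0:
        return "stalled: nobody proposes"
    # Timeouts occur until an online proposer happens to be selected.
    delayed = proposer_frac_online < 1.0
    # Certification needs the voting committee to clear its threshold;
    # below it, no block ever passes and the round cannot complete.
    if voter_frac_online < vote_threshold:
        return "stalled: vote threshold never met"
    return "finalized after timeouts" if delayed else "finalized"

print(round_outcome(0.5, 1.0))  # finalized after timeouts
print(round_outcome(1.0, 0.4))  # stalled: vote threshold never met
```

This is why eligibility filtering keys off voting behavior rather than proposal behavior: missing voters are the failure mode that actually halts the chain.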
And so even from a conceptual point of view, filtering eligibility by proper voting is kind of the thing to do. However, there's a little bit of a problem there, which is that votes are not recorded on the blockchain. Each node records its own votes as it sees it and only up to the threshold that it needs to consider the block passed. And so a node in North America will have recorded different votes from a node in like South Africa, from a node in Japan and so on.
However, the way I thought to work around this, at least in a, let's say statistically significant way would be to gather the votes from relay nodes. Relay nodes are also regular nodes aside from being the network backbone of Algorand. And part of their job is to also propagate blocks to nodes that are syncing up to the current state.
And so if you sampled votes from, let's say, six relay nodes around the world, a few in the US, a few in Europe, a few in Asia, maybe one in Africa, you would actually end up with a pretty decent view of who was voting in each place at each time. And so that could get around the partial vote view on each node. As for the protocol-level incentives, one of the issues is that you can't do slashing, and so ideally you would like to only reward those who are participating properly.
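The relay-sampling idea boils down to taking the union of partial vote views. A minimal sketch, with made-up relay names and voter addresses:

```python
# Each node only records votes up to its certification threshold, so every
# node sees a different subset of the voters for a given round.
votes_seen_by_relay = {
    "relay_us":   {"ADDR1", "ADDR2", "ADDR3"},
    "relay_eu":   {"ADDR2", "ADDR3", "ADDR4"},
    "relay_asia": {"ADDR1", "ADDR4", "ADDR5"},
}

# The union over geographically spread relays approximates the full voter
# set far better than any single node's partial view.
combined = set().union(*votes_seen_by_relay.values())
print(sorted(combined))  # ['ADDR1', 'ADDR2', 'ADDR3', 'ADDR4', 'ADDR5']
```

No single relay here saw more than three voters, but the combined view recovers all five, which is the statistical coverage argument for sampling across regions.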
If you do it with transaction fees on a per-block basis, then someone can be offline for half a year, actively harming the network, because if enough nodes do that, the network is going to stall, and their only penalty is the fees they miss out on. So I think that's very much a long-term vision, and currently there just are not enough transaction fees to make this a viable approach. So we're talking about a long-term plan, assuming the holy grail of adoption and a lot more traffic on the network happens.
To give an idea of fee generation, I reached out to the Folks Finance team to ask if they needed any help figuring out who did their participation job well, and so I had some data already for a 58-day period between July 21st and September 17th. Since I had that data already, I enriched it with transaction fees and posted it on the pull request about the fee incentives.
And it turns out that the total fees on the protocol during that period were 59,343 algo. Just as a comparison, my own node is a pretty good value-for-money dedicated server that I pay $27 a month for. That's fairly reasonable; you could go cheaper, but not by much. And I was also delegated a whole bunch of algo from various different people, so the total stake was somewhere in the range of 800,000 to 900,000 algo.
And assuming that a block proposer would get 100% of the fees, which is actually not the case, the fees that would correspond to me over that period would have been in the 20 to 24 algo range, while the node cost over the same period was $54. So this is not a right-now thing. This is a plan-for-after-the-short-term-incentives thing, at least the way I see it, because of the way the numbers work out.
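A back-of-envelope check on those numbers. The algo price here is a made-up illustrative figure, and 22 algo is just the midpoint of the quoted 20 to 24 range:

```python
# All figures approximate, taken from the 58-day window discussed above.
total_fees_algo = 59_343   # network-wide transaction fees over the period
my_fees_algo = 22          # midpoint of the quoted 20-24 algo share
node_cost_usd = 54         # a $27/month dedicated server over roughly two months
algo_price_usd = 0.10      # hypothetical spot price, purely for illustration

revenue_usd = my_fees_algo * algo_price_usd
breakeven_price = node_cost_usd / my_fees_algo
print(f"fee revenue ~ ${revenue_usd:.2f} vs node cost ${node_cost_usd}")
print(f"algo price needed just to break even ~ ${breakeven_price:.2f}")
# fee revenue ~ $2.20 vs node cost $54; break-even ~ $2.45 per algo
```

Even granting the proposer 100% of fees, revenue falls short of hosting costs by more than an order of magnitude at an assumed price like this, which is the arithmetic behind calling fee-based rewards a long-term plan.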
If I had my say, governance would continue rewarding more active participation in the network, and one of the most active forms of participation is to run a node and secure the network with your stake. Moving on to the aspects of decentralization, then, because I think you certainly covered the topic of consensus incentives in great detail, and I suppose we will have to see the results of this experimentation in its initial phase.
But once again, I certainly agree with you that having something like that at the consensus protocol level is an extremely challenging task, which is also constrained by features like the lack of slashing. On the other hand, it is great to see companies like Folks Finance standing strong in the ecosystem and running some very good initiatives in this regard, for many different reasons.
But speaking about the topic of decentralization: the Algorand Foundation and Inc. are continuing to evolve the network and the infrastructure, with the whole shift towards peer-to-peer, which eventually should take care of completely removing the dependency on the relay nodes. And I think many people certainly do agree that this is a step in the right direction.
But what do you think are the potential initial challenges here with incentivization? And aside from incentivization, I assume this also touches on the engineering challenges in general. Right. So let me start by saying that, as far as I'm aware, Algorand Inc. itself does not run any relays, and I don't think the foundation does either. It's universities.
Berkeley has one, MIT has one, a university in Italy whose name I don't recall has one, and some partner companies also have relay nodes. Frankly, it's more of a conceptual thing. And given what we just spoke about with regards to the Algorand Foundation's stake in consensus currently, the relay centralization conversation is a little bit out of place.
I mean, if the Algorand Foundation wanted to stall the network, they would not reach out to a hundred different companies and tell them to shut down their nodes. They would simply stop participating in consensus. Okay, that would be enough. So conceptually, though, it's something to work around. Going peer-to-peer will first of all make relay nodes optional. I believe the plan is to still have them there as a fast path to the network.
And it would also add some longevity to the network in case they can no longer be afforded, because relays have really high requirements, especially network-wise: both in raw bandwidth and in the total data transferred over each month.
I'm not sure if by incentives you meant the consensus incentives that we spoke about, but I guess the way those would factor in is if you have decent enough consensus incentives and eventually the relay nodes go away, then that basically solves the problem of who is going to run the network. It's going to be people out of their data centers or homes without needing relay nodes.
Yeah. Are you aware of any interesting, perhaps, examples of other L1s at the moment who rely on peer-to-peer communication for running the networks in this case? Because I believe there's generally a trend, right? Whenever an L1 starts even looking at governance models, it often does start as a more or less centralized entity. And then, as the community and the infrastructure expand, it starts spreading out and becoming more and more decentralized. But I wonder if there are...
Not only on the technical layer of consensus, but in the stake as well, right? In proof of stake, usually everything is minted upfront. And so decentralizing it and getting it into the hands of many different people is part of the challenge, especially since you're usually going to need to have some funding as well. And so some of the people who will fund you upfront will be getting a lot more of it.
But yeah, I assume I was just a bit curious on whether there are any interesting, prominent examples of other L1s successfully implementing peer-to-peer in this case, because I would expect a large sacrifice on finality, maybe, or the speed of transactions. I think Algorand was the exception in having an exclusive network layer of really fast relay nodes, and most L1s actually do have peer-to-peer style functionality.
Before we move on from the protocol level stuff, I wanted to share another thing that has me really, really excited. Just an anecdote from just yesterday, someone messaged me on Telegram asking me if Algorand is dead. And I don't think it's quite dead yet. We're in a really tough bear market and things have certainly slowed down a lot. And one of the things that makes me bullish is how much work the Inc. is putting into keeping up the innovation.
And I believe that both the Inc. and the foundation are largely working on the right kind of things. One of the things that Algorand Inc. has cooking is dynamic round times. So that is going to drive round times in the average case scenario, I believe to about a second and in the best case scenario to under a second. So if we are looking forward to maybe six or nine months from now, we're looking at a network that has peer to peer paths.
It has fast relays as a shortcut, or as an optional way to get to other consensus participants sooner, and the round time is not capped to three seconds, or 3.34, whatever we are at now. And so looking at the two organizations involved that keep working on the right things, with the Algorand Foundation also doing a lot in developer experience with AlgoKit and TealScript and all of that, is one of the things that keeps me bullish.
They don't seem to have thrown in the towel, and I don't see a reason for someone in my shoes, especially, to throw the towel in either. Yeah, as for box storage, I think box storage... If I may just chime in and add a little bit, I think, just to maybe highlight some of the recent things and the things to come in regards to the improvements to AlgoKit, which, for some listeners who may not be aware of what AlgoKit is, it's a completely open source set of developer tooling.
If you come from different backgrounds, like, say, iOS development, you can think of AlgoKit as a fastlane for everything to do with Algorand. There's a lot of things that are being expanded, such as being able to run a single command these days. If you're familiar with Create React App, you can create a full stack application, fully customizable, almost production grade.
You have end-to-end tests, you have unit tests, you have workflows and everything integrated for you to basically start building an application. And shout out, of course, to a lot of people at DevRel. I think it's easy to overlook the hard work that is happening despite the harsh conditions of the bear market. Yeah, like being an observer of the things happening internally. That's exactly right.
Yeah. It's basically a scaffolding tool to get you set up with some structure ready to go, and a big bag of utilities that are really nice to have. Some of the stuff will already have been created in home labs and company labs, but now there's one place that has them all.
It has really cool things like being able to generate JavaScript or TypeScript clients for your smart contract already, so that you don't have to spend the time trying to figure out what the makeApplicationCallTxn and suggested params field names are. And yeah, it's one of the game-changing utilities that the foundation has put out. And TealScript is also not to be overlooked, which is a TypeScript-like language that compiles to TEAL.
And then also PyTeal will evolve to be a lot more Python-native, more Python-like. Exactly. So a lot of people are complaining about the foundation, but honestly, just on the developer relations and developer experience side of things that they are working on, I'd like to hear how they could do better than that, right? Because making developers' lives easier is a key part of getting to adoption, of getting people building on Algorand.
Exactly, and I can't wait to see what people's reactions are going to be closer to the beginning of next year, once a lot of the things in the roadmap are out. And I can assure you that there are going to be a lot more additions to Algorand in general, as well as the transpiler. Having a bit more native support for Python is a huge game changer. And just looking at the way smart contracts have been built for the past three years, it's potentially going to be the easiest way so far.
And TealScript is already a glimpse into that, if you are more on the TypeScript side of things. And yeah, shout out to Joe Polny from DevRel for doing an amazing job on this tech. Yeah, indeed. Box storage then? Box storage. All right, box storage feels like old news by now. But when it came out, it basically lifted the size restrictions for Algorand smart contracts. One of my first dApps was either just before or just after box storage was released.
And I was not using box storage, and I had to have a second contract just to abuse its storage functionality, because I needed just over 64 things, which used to be the limit for global storage. So with boxes, you have a key-value store. The length of the key and the value combined can be up to 32 kilobytes, and using one raises the minimum balance requirement of your contract's address by a little bit. It was something that was needed.
And it's really exciting to be able to store data of almost arbitrary length in smart contracts, using multiple boxes. Without box storage, one of my recent projects, which I did with the EXA NFT marketplace, would not really have been possible. They run a rewards program for using their platform, as they are one of the most recent additions to the NFT marketplace space. I think they launched in November of 2022.
So they run a rewards program where using the platform would earn you loot boxes, which was a multi-mint NFT. And they wanted an on-chain shuffle of prizes based on the NFTs that you held. And so you would exchange one for one of any number of prizes. Any number turned out to be 6,000 or so. I will make sure to include the link to the post, by the way, which is a great breakdown of what you did there, and in regards to VRFs as well. Right. Yeah, thanks. It was fun writing it up.
I'm fairly proud of that effort. So yeah, the long and short of it is that we needed to store 4,000 asset IDs somewhere. That was entirely out of the question without having boxes. And so, for simplicity's sake, I ended up using a single box, and I could also get away with storing only half of each asset ID, by which I mean asset IDs are at maximum 64-bit integers, but right now we are at asset ID 1.2 or 1.3 billion, something like that, which fits very nicely into a uint32.
So with a few optimizations, I could have a single box and store all of the asset IDs. And then, yeah, I basically treated that as a list ordered by rarity, since there were some mechanics where some rare loot boxes were guaranteed, like, a top 25 or top 6.25% prize. And whenever a prize was allocated, I would have to rewrite the box to be the same, but with the drawn prize missing.
One of the limitations in the AVM is that the maximum byte string that you can have is 4 kilobytes long, which I discovered fairly late, as I was testing with a full box of about 28 kilobytes. And when a rare prize was drawn at the beginning of the list, I would try to rewrite the box and it would overflow the 4 kilobyte size. So I had to basically do the same thing, but in 4 kilobyte chunks. So copy 4 kilobytes, all right, copy the next 4 kilobytes, and so on.
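The scheme just described, packing each asset ID into four bytes and rewriting the box with the drawn prize removed, can be sketched off-chain in plain Python. The function names and constant here are illustrative, not the actual contract code; this is just the byte-level bookkeeping the contract had to perform.

```python
import struct

MAX_BYTESTRING = 4096  # the AVM's 4 KB byte string limit mentioned above

def pack_ids(asset_ids):
    """Pack asset IDs into 4-byte big-endian slots (works while IDs < 2**32)."""
    return b"".join(struct.pack(">I", a) for a in asset_ids)

def remove_prize(box_value: bytes, index: int) -> bytes:
    """Return the box contents with the 4-byte entry at `index` dropped.
    On-chain, writing this new image back had to happen in chunks of at
    most MAX_BYTESTRING bytes; off-chain slicing gives the same result."""
    start = index * 4
    return box_value[:start] + box_value[start + 4:]

ids = pack_ids([1001, 1002, 1003, 1004])
after = remove_prize(ids, 1)               # asset 1002 was drawn
assert after == pack_ids([1001, 1003, 1004])
```

With a few thousand prizes at four bytes each, the whole list stays well under the 32 KB box limit, which is what made the single-box approach viable.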
And I think that my subtle complaining in the article on the site about VRF and the challenges of working with box storage contributed to two new opcodes coming to the AVM, box_splice and box_resize, which do what they sound like. box_splice removes a chunk out of a box atomically with a single op, and box_resize resizes an existing box, which we also didn't have. So I'll take a very tiny bit of credit for that.
But my major point here is that it's great to see that Algorand Inc., who does not have the largest reputation for being loud and super involved in the space, is nevertheless listening and responding. Because for them it's fairly low-hanging fruit to add that, right? If it's not junk, if it's not just going to take up space for no reason, and there's a reasonable use case, then it's not going to cost them too much to add it. So let's add it and make your life easier.
I must say I discovered this 4 kilobyte limitation quite late as well, in some of the projects I've been working on, but it's certainly great to see things like splice. I think that's going to be very useful in this case. In terms of maybe things that it also enables, if I may add, I think one other interesting aspect is that it adds a little bit more flexibility in terms of the readability of contracts in general, because you don't really need to touch the global state, right?
There's something that potentially you want to store in global state, and it's certainly not feasible to do so if you want to have an updateable contract, because a change in the global state schema is not considered an update. It's actually a breaking change in this case, right? You can't really... Like, the second you start dealing with the global state in this case, updateability becomes a bit tricky.
But if you can squeeze a lot of information into the boxes, for example, that's what I believe the folks at NFD are doing with the way they handle the updateability of their contracts. Yeah, box storage is particularly useful in this case. But what do you... That said, it hasn't quite replaced global storage yet, because the limitations are fairly significant. One of them is that boxes are private to each contract on-chain. So I can't read your application's box.
With global storage, you can read another application's global storage quite easily. You need to have it declared as a foreign app that you will look at or touch, and then you can just read its global storage. With boxes, the only way to access another application's box is if that application exposes an ABI method to give you the box's data. And then the second thing is that boxes are not quite integrated into indexers, and so they're not part of the block delta as you would get it from an indexer.
If you want to track a box change during a transaction, I believe you can get that out of the ledger delta APIs, but I have not actually confirmed that. So part of my medium-term plans is actually to... I'm writing an explorer that's going to be on algorand.observer. I'm not sure if that will ever be completed, especially the way I have it, but the ledger delta APIs are something that could be utilized in a good blockchain explorer as well.
And to my knowledge, so far it hasn't been. If I were to mention the one potentially tricky part around dealing with boxes, it's the MBR, calculating the minimum balance requirement. I feel like that aspect could potentially get a little bit easier, maybe as something that is baked in at the level of the core SDKs, something that allows you to quickly, dynamically calculate the fee that you need to pay to interact with the contract.
It obviously can be covered by the developer if you're providing an ABI interface for someone to consume. But for the developer himself, building tests and all of that, that's potentially where it just gets a little trickier to dynamically control it, especially if you're dealing with dynamic bytes, or if you're storing different mappings in there.
I hear that, but I think a more challenging scenario is when you're charging users for the box storage that they are using, because then you have another value to keep track of: they've paid up to that much, and if they want to expand, they need to pay that much more. If you're just calculating it off-chain, the formula is, I think, 2,500 plus 400 times the number of bytes, in microAlgos. That's fine. But having to keep track of that on-chain is where it might be a little bit more cumbersome.
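As a rough off-chain sketch of that formula: the function name below is made up, but the constants are the ones just quoted, 2,500 microAlgos per box plus 400 per byte of key and value.

```python
def box_mbr_microalgos(key_len: int, value_len: int) -> int:
    """Minimum balance raise for one box: 2,500 + 400 per byte of key + value."""
    return 2_500 + 400 * (key_len + value_len)

# A 64-byte key with a 1 KiB value:
print(box_mbr_microalgos(64, 1024))  # 437700 microAlgos

# A ~1 MiB box with a small key comes out to roughly 419 Algo,
# matching the figure that comes up later in the conversation:
print(box_mbr_microalgos(8, 1024 * 1024) / 1_000_000)
```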
You did mention that you're working on a set of very interesting projects that also touch on box storage. Unless you want to jump into base32 already, any chance you can also expand a little bit on the tool you've mentioned that does the collaborative doc editing? Does this, if I get it correctly, utilize box storage? If yes, how do you achieve this particular capability? One thing that I started to build, which is 99% ready to ship and I parked, is called base32 fund.
The idea there is to have a crowdfunding application where you can set a minimum and a maximum for the amount that you're raising, but where the title and the description of your fundraiser are also stored on-chain. And so it would be entirely self-contained, and you wouldn't have to go to IPFS to fetch a blob of text or Markdown to show to the user. And then the reason that this is parked is, well, two challenges there.
One is figuring out how to censor spam and scams and things that are basically a danger to the project and the community. And that's a little bit of a hard problem to do. So I wouldn't be able to delete your fundraiser, but if you are raising money for terrorism or whatever, then it's probably not something that I want exposed on the front end. And so, I mean, that's actually the minor point. And the major point is that I want to build a set of tools around the base 32 name or brand.
And one of those is going to be called base32 space. And there are going to be two aspects there. One, I want to create a file-system-like interface for smart contracts to store files on Algorand. Yes, I do know IPFS is a thing, and it's usually a much cheaper thing; Algorand box storage is relatively expensive compared to the competition.
However, if you have a good enough reason to store things on Algorand, or if you want to actually operate on the data from a smart contract, then in the first case, you have it completely self-sustained. So you don't need to also pay an IPFS pinning service to keep your thing online. And in the second, you can't really interact with IPFS data from the Algorand blockchain. And so being able to have a file in a smart contract would enable things like that file being a template.
And then maybe having some VRF output, or maybe some algorithm that derives some content for the file from the user's address, to randomize some colors in an SVG or something. And so I stopped base32 fund because it's eventually going to be based on base32 space, and I need to figure out what the best way for that to look is. And then probably an ARC would be in good measure there as well.
And the middle part between the file system thing and the fundraiser is actually something that base32 fund already implements, which is to store Markdown and then render it on the client. The first concern is that I'm not 100% confident that it would be secure from, let's say, code injection because I haven't actually written the Markdown renderer myself. I'll figure that out when we get to actually launching.
But something that might be interesting is to have the equivalent of Google Docs on chain. So if, let's say, the base32 space file system is already launched, then we have a generic container to store files. By the way, the idea there is to also be able to have them compressed, so you would pay a lot less. So you'd have a field to say that this is gzip or deflate or LZ4 or whatever, and then it would be up to the client to apply the right decompression to get to the actual data.
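A minimal sketch of that client-side contract: a one-byte codec tag in front of the payload tells the reader which decompression to apply. The tag values and helper names here are hypothetical, not part of any published format, and zlib stands in for whichever codec the field would actually name.

```python
import zlib

# tag byte -> (compress, decompress); 0x00 = stored as-is, 0x01 = zlib/deflate
CODECS = {
    b"\x00": (lambda b: b, lambda b: b),
    b"\x01": (zlib.compress, zlib.decompress),
}

def encode_file(data: bytes, tag: bytes = b"\x01") -> bytes:
    compress, _ = CODECS[tag]
    return tag + compress(data)      # what would be written into the box

def decode_file(blob: bytes) -> bytes:
    _, decompress = CODECS[blob[:1]]
    return decompress(blob[1:])      # what the client does after reading

doc = b"# My fundraiser\nPlease send algos.\n" * 50
stored = encode_file(doc)
assert decode_file(stored) == doc
assert len(stored) < len(doc)        # repetitive Markdown compresses well
```

Since the minimum balance requirement is paid per byte, every byte the codec saves directly lowers the box's cost.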
And so for documents, the reason one might want to have them on chain is mostly to be able to interact with them, either as a user or as a smart contract. So as a user, as D13, I might want to publish my manifesto about Coop, and then people might want to comment on it or upvote it or downvote it, introducing social elements into the document itself.
And then a second generation idea on that would be to have the equivalent of pull requests or merge requests where someone proposes a change on chain again via a Delta document that I can then look at and see that, okay, he's just fixing a typo. That's great. And I can incorporate that into my own.
And so, yeah, I kind of built base32 fund backwards, because the ideal way would be to get the file-system-like thing together first, then work on the document storage, and especially get the client-side rendering right to make sure that no baddies can end up in the HTML, and then base32 fund, which actually involves showing a document. I think I'm starting to get it in much more context now, and the more you think about it, the more exciting it starts to sound.
Are you familiar with Solid by Tim Berners-Lee? No, I don't think I've seen that. It's something that touches on linked data space in general, but I think this is potentially a more practical way to solve the problem, because the thing with Solid is that they are basically building a set of new protocols, expanding on the open web standards, to have a way to deal with access control, where you represent the data as semantic data, basically.
So you could say, like, okay, this is a set of pictures or files that you as an app can or can't access. But I feel like with what you're doing here with the file system, this can also potentially be expanded into something that provides more robust access control to files. I assume the hardest challenge you have is getting the file in there, getting it on chain.
The second you have it stored, accessing this particular information or loading it back could potentially be a little bit more trivial, but this could also be expanded to have access control rights around it. You can imagine, like, maybe storing files and having some sort of social network thing that is reading information through smart contracts, and then they can use your file system and know which particular boxes they can or can't access.
But I'm just speculating; once you started describing this in detail, it really reminded me of Solid a lot. Interesting. So anything that goes into a box has to be considered public domain, right? But then whether something would be fetchable from other smart contracts could be part of the access control. Another thing to be considered is whether something is mutable at all; I think there's a use case for something that cannot be updated even by the owner.
And then adding the social elements that we spoke about, the upvotes, downvotes, comments, pull requests and so on, is probably going to be another aspect. But as for getting the data on chain, it's actually not that big of a deal. You have a limitation of two kilobytes in application arguments, but you just do it in two-kilobyte chunks and it's fine. It's not particularly...
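That chunked upload can be sketched off-chain: each (offset, chunk) pair below would become one application call to a hypothetical write method, with the 2 KB figure taken from the limit just mentioned.

```python
CHUNK = 2048  # the per-call argument budget mentioned above

def upload_calls(data: bytes):
    """Yield (offset, chunk) pairs, one per application call."""
    for off in range(0, len(data), CHUNK):
        yield off, data[off:off + CHUNK]

payload = bytes(5000)  # a 5 KB file to store
calls = list(upload_calls(payload))
assert [off for off, _ in calls] == [0, 2048, 4096]
assert b"".join(c for _, c in calls) == payload
```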
And what about the cases where you want to read a large chunk of data on chain, considering the limitation? But I assume, once again, you just deal with chunks in this case. Right. Yes. That would be a little bit trickier. I assume four kilobytes is going to be the upper limit to that, because of the byte string that you can store on the receiver side. I'm actually not sure if, contract to contract, ABI methods have a limit on how large the arguments are.
It could be two kilobytes or four, but I haven't looked into that much. And what do you think is the... What's going to be the best way for the community to provide feedback or test out the platform once you have some initial, maybe beta, version of base32 space first? Right. I think I either need to do a prototype first of how it's going to work, which is actually in progress already.
And then the second part would be to get feedback informally and then publish an ARC, because it would be really cool. It's not just, okay, you put a file on a contract. What's more exciting to me is something that fulfills the file interface but is also a fundraiser, is also a love letter, is also a something else. So it has a standardized way for both clients and other smart contracts to modify it or check out its state or whatever, but it is also, like, a fundraiser or something. So yeah, go ahead.
Would that be something you as a user deploy, or would that be more of a factory contract of sorts that you just interact with and that takes care of minting the contract that will act as storage? Probably a factory contract that spawns a subcontract. This could allow... So a problem with users deploying contracts is how to actually index all of them.
And also, if you need to interact with a smart contract that has box storage or needs to opt into anything, that needs any sort of minimum balance, you normally have to do it in two steps, right? You have to first create it in order to see what the app ID is, which is going to define what the escrow address is, then fund the escrow address, and then you can create the box, and then you can opt it into an ASA. However, base32 fund allows you to create a fundraiser, which does use box storage, in one go.
And the way I'm doing this is with a deployer pattern, where the main contract on chain has already created templates of smart contracts that are just empty, but they are known, and it has a box of, let's say, a hundred of them. And when you want to create a fundraiser, you pick one of them at random, and then you transact as if it's going to be available in the three or five seconds since you last read it, or maybe 30 seconds, since we're going to include signing.
And assuming that no one else sniped it up in the meantime, you can basically just start using it immediately; the main contract is going to reassign it to you. So it can also be permissionless from the platform's point of view after you actually create it. Yeah. And you don't have to wait for the contract to be created, because it will have been pre-created for you.
And also, for this to be sustainable, the contract will replace the application that was just taken with a new one on the spot, so you won't run out after a hundred have been spawned. That's very interesting. What about the general end goal that you have for this project?
Would you aim to have this as a protocol, or would you want to have somewhat of a sustainable business model for it? Not to say that protocols don't have sustainable business models, but usually once you open source something, it becomes more of an ideological sort of goal to fulfill rather than something to monetize and generate revenue from.
So I haven't thought about this in very much depth, but my current intention is that you would be able to create a file, if you wanted, in the same format, but it would probably not be available through the base32 space domain. So part of this is going to be a front-end interface that will serve a page with the description of the file, where you'll be able to interact with it, but you'd also get, like, a raw URL to get just the file itself.
So if you just created something that conforms to the interface, but it was not done through the base32 space main contract, then it would not be indexed. That wouldn't even be a malicious thing or a profit-driven thing.
It would just be the way that it works because the active application IDs will either be stored in a box on the main contract as well, just to be able to get them easily, or they'd be discovered by looking at an indexer and seeing who interacted with the main contract in order to figure out who's created like files or file buckets. I see. I see.
And I know we need to be a little bit more cautious of time, but I usually find the questions around particular implementation challenges, or discussions of individual platforms, the most engaging. But is there anything in particular, as a closing note perhaps for the base32 project, that you can mention in regards to the main challenges you've faced so far? I assume this would be the part you mentioned in the beginning, right? Like building the space before dealing with the fund part.
The aspect being, like, before dealing with the funding part, you want to have more general banning mechanisms, perhaps, or ways to detect malicious actors? Yeah, moderation. It's a conceptual problem. It is a hard problem to solve in a permissionless or decentralized system. And the main problem there would be moderation. So this is not so much about the protocol level.
If you want to upload whatever you want and create it as a smart contract, I'm not going to give myself permission to delete it, just because of this issue. So it can exist on chain, but it would need to not be on the front end. And so figuring out a way to make this a self-sustaining thing, and the fact that we would need a team of moderators, is probably one of the challenges to consider.
The thing with that is, if you try to make a monetized token around reputation, and you have people working on moderation to get that token, then not only do you have to fund the token itself and reward people for moderating, which is fine, but it would have to have a large enough market cap to not be subvertible by someone spending $10 to buy some.
So it might be something that requires a time lock or vesting of your reputation tokens, so that someone couldn't attack it immediately; they would have to attack it over months and some time. Generally, I don't think there are very many good solutions here that do not involve KYC. And even then... You just try to raise the bar for attacking the moderation system. I think that's the best you can do.
Or hire a bunch of moderators and give them ban power, over the front end at least. I'd be very curious to see the implementation details on that, whenever you have them. Or also, if there's anything you might need in terms of testing, feel free to reach out; I'd be happy to contribute. I think conceptualizing boxes as an L1 primitive is one thing, but then conceptualizing this as a file system, you can potentially...
Because the way I see it, based on your description, this is an abstraction on top of boxes that simplifies interactions with on-chain data in general, and I think that's a really powerful primitive if built right, and it also provides a way for layman users to interact through a web interface.
I think there's a lot of opportunity there, whether from the perspective of building a protocol, or if you want to achieve something in between, where you also have some features that you would potentially want to monetize. But yeah, thank you for describing it. I think this is certainly something that will benefit the builders in the community, and just the ecosystem in general. Yeah, let's please do that, by the way.
The chain UI thing you mentioned, while it's not the first instance of box storage, I think it's the first public project that has the concept of storing the website's code in a box. I think they're limiting themselves to a single box there, I'm not 100% sure, but there is a size limitation to how much you can store in a chain UI, so I assume that's derived from the maximum box length. Oh yes, definitely.
The reason why I didn't explicitly mention that, although I did mention them with regards to the web components and things like that is particularly that, right? Because it seems like despite the fact that you can pretty much store anything on the box storage, their approach was a lot more tailored and optimized for HTML files and artifacts that you may deal with when trying to display the web content. While in your case, I think it's a lot more generalized, right?
And you're also talking about compressing data in different ways to save on space as well, because boxes are cheap until they aren't, right? Until you get to a megabyte and they're no longer cheap.
So yeah, I think it's also quite powerful that you are approaching this in slightly more broad terms, and, I guess, trying to aim for potentially having some sort of file system layer that is powered by a quite extensible primitive that's baked right into the L1. A box for one megabyte would be about 419 algo. So one concern that I don't need to concern myself with is video piracy on chain.
Yeah. Oh yeah, I mean, you would have to be an absolute madman to store videos on chain, in box storage especially. But yeah, as we move towards decentralization in this very weird, I would say, era where everything is dominated by centralized solutions, where even the term open source by itself, I think, is mostly an industry term that came in after real open source died off, if we're talking about public domain software.
How do you think Algorand can generally help destigmatize the Web3 domain a little bit? Because I think adoption is one of the main ingredients in increasing the general usefulness of blockchains, because there are indeed a lot of real-life applications where it does make sense, even outside of the financial sector. But you need people building on this. And unfortunately, there's a lot of stigma around just cryptocurrency in general.
In my opinion, I think it mostly comes down to reducing the amount of Web3-specific terminology a little bit, because it is computer science at the end of the day. There are a lot of things... Indeed, the paradigm with smart contracts is very different from what modern software engineers might be used to, as was the deal with cloud development and things like that. But what's, in your opinion, one of the strategies that can potentially attract more people to this particular domain?
So yeah, the blockchain reputation, the Web3 reputation right now, is in tatters. This kind of makes sense. It's ironic and unfortunate that a lot of that has to do with centralized solutions that just happened to be selling cryptocurrencies. If they were selling potatoes or onions, then we wouldn't be calling potatoes and onions a scam. But the permissionless aspect of it and the anonymity or pseudonymity of it also lend themselves to a lot of scams, and maybe well-intentioned bubbles as well.
Personally, I've been interested in this space for more than a decade now. From a developer's point of view, the other thing is how interesting programmable money is, especially now that you have smart contracts and it's not just sending it from one account to another. It is quite fascinating to write what is essentially Python or TypeScript and have it manage split payments or royalties or whatever it is that you would want to do.
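[Editor's note: to illustrate the kind of logic the speaker means, here's a minimal plain-Python sketch of a split-payment calculation. This is only the arithmetic, not actual contract code; a real Algorand contract would express the same idea with on-chain payment transactions, and the function and parameter names here are illustrative.]

```python
# Illustrative only: rounding-aware arithmetic behind a split payment.
# Shares are in basis points (1/100 of a percent; 10_000 = 100%).

def split_payment(amount: int, shares_bps: list[int]) -> list[int]:
    """Split `amount` (e.g. microAlgos) by basis-point shares.

    Integer division truncates, so any leftover rounding dust is
    credited to the first recipient, keeping the payouts summing
    exactly to `amount`.
    """
    assert sum(shares_bps) == 10_000, "shares must total 100%"
    payouts = [amount * bps // 10_000 for bps in shares_bps]
    payouts[0] += amount - sum(payouts)  # hand rounding dust to recipient 0
    return payouts

# 1 Algo (1_000_000 microAlgos) split 50/30/20:
print(split_payment(1_000_000, [5_000, 3_000, 2_000]))
```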
And also the aspect of having a neutral and trusted environment where this is going to run. So Teal is not ideal for readability; ideally the source would be visible on chain and higher level, but you can't have everything. The point is that it's an environment that will execute the code that you published in exactly that way. And this is not really something that you can replicate outside of blockchain.
You're usually trusting a single party to run that code for you, to be the escrow account, and to hold that kind of power; maybe there should be a way to have this kind of trustless execution instead. So it comes down to figuring out what blockchain is good for, and figuring out the pragmatic split between decentralization and lawfulness, let's say.
While I do subscribe to the cypherpunk ideals myself, if we are realistic, then there needs to be some kind of, if not regulation, then at least some kind of law around scamming and stealing and all of that stuff. Also, everyone's an anarchist until their house is broken into and then they're crying for the police, which happened to a lot of us in February, March, April and May of this year.
So yeah, figure out what works best, figure out the best balance of centralized and decentralized, and the reputation will probably improve with the next cycle, right? Which, I guess, some people might say is right around the corner. From my side though, at this point I don't really look at the trends and the cycles in that manner. It's more of a general excitement about what's to come next year in terms of features that Algorand is going to be adding, and we'll see.
On the other hand, there is also... Sorry, go ahead. I was going to say that it does impact the market size and the community size. So one aspect is, okay, my own personal algo stash is down, or my Ethereum stash or whatever; but the other aspect is there were a thousand people here, and now there are 300. So that is a bit of a bummer, but my silver lining here is to see, as mentioned before, Inc. and the foundation largely working on the right things.
There really are super exciting things around the corner, maybe late this year or early next year, with the dynamic lambda and super short round times, and with peer to peer. And even in the small stuff that they are still working on, like the box resize and box splice stuff.
Yeah. As it happens with a lot of the episodes so far, I'd like to ask some closing questions. In this case, I think there is just one question that we got from the community so far, which we can briefly touch on before the final question of the episode. So someone asked yesterday on the askosamalgo.com site: in your opinion, what are the single best and worst events that happened in this ecosystem this year?
Which I think the answers might be obvious for some people, but still worth, I guess, getting your opinion on. Right. The worst would probably have to be the MyAlgo hack situation, by a mile. The community was already thinning out in the bear market, which is to be expected, and that kind of came out of nowhere. It was a professional hit, 100%. There are so many algos from the very first wave of the attack still on chain, I think to the tune of more than 8 million, that still haven't moved.
And yet they moved on to hit thousands of smaller accounts. This entire thing doesn't really make sense as a financially motivated attack. But okay, yeah. So the worst thing would have to be the MyAlgo hack, because aside from the financial losses that people incurred, it also really, really hit the community size and the confidence in Algorand, even though Algorand itself wasn't really hosting myalgo.com.
As for best, I can't really single out one spectacular event, but I've basically already addressed this in the reasons why I'm still bullish on Algorand, which is that I see the people involved working on the right things, at least according to me, and their hearts still seem to be in it. Same question back to you, best and worst.
Yeah, obviously I think the MyAlgo hack was a huge blow. And, this might be news for some listeners, but on a rather personal note, I'm happy to inform you that my escape from big enterprises has been a success so far, and I now have the ability to contribute and work full time on the Algorand ecosystem explicitly.
And this includes the contributions to AlgoKit. But I'd say, having learned a bit more of the intricate details about how things are being managed and organized, and what people care about and worry about within Algorand Inc, the foundation, or the partners who are building the tooling, I think the intent is certainly in place. There are a lot of people who truly care about the tech and are doing their best to tackle the main challenges, which in my case is adoption.
That's one of the challenges I personally am committed to right now, trying to do my best in terms of improving the developer experience and things around it. But I'd say one of the best things this year so far was confirming a lot of the opinions I held before, by actually being able to get to know some people in the community a lot better. Obviously, it's not financial advice.
I can't really promise anything, but one thing for sure is there are a lot of engineers who are just working their asses off every day. And it's just great to still be able to be within that environment and build on something that I think is potentially one of the most scalable decentralized protocols out there. But we really need to polish the dev experience around it.
And yeah, I can't wait to see what the perception around some of the tooling is going to be once a lot of the exciting things on the AlgoKit side, for example, are out. Sorry, I'm going on a tangent at this point a bit. But another thing is, obviously bear markets are a bad thing, but it did show the real intentions behind a lot of people who may have only looked like good actors in the ecosystem; it's the great filter in some sense as well.
It shows you who the fair-weather friends are. It shows you who the actual people in the community are, who are there for the community and care about the tech, expanding it, and attracting more developers, versus who was there just for the money perhaps, or maybe for some sort of weird internet fame. What would you call it? It's a whole other topic that's going to stretch this episode a little bit more.
But one thing I believe, so the second worst thing that can happen to blockchain is that it goes mainstream big time as it is right now. Because yes, the stuff that we said about regulations and law and order and everything is kind of required, but anonymity and privacy matter too; privacy is a basic human right. And right now there are very few blockchains that offer actual privacy.
And so if tomorrow morning we woke up and everyone had a blockchain account, that wouldn't necessarily be a good thing, based on the state of blockchain right now with everything being super public, right? I don't know that you would want your entire purchasing history to be available not only to corporations via other corporations and banks, but to everyone in every seedy bar that you go to and pay with some crypto, right?
So hopefully an evolution that blends privacy into blockchain will happen before everyone has a blockchain wallet. Oh yeah, I absolutely agree on that point. It's a double-edged sword, right? It's important not to forget that blockchain is still a technology, and any technology, depending on which actors are using it, can be used for many different intents.
And to be, I guess, a bit more pragmatic in this case, there are obviously absolutely no indicators that it's always going to be used for good, just as with any technology. So I know there has been a lot of talk in the community that Silvio also cares a lot about privacy in this regard. I assume this is very long-term though, but I would be curious to see what the options are for enhancing or just introducing the notion of privacy into Algorand itself.
But on the bigger scale of things, yeah, I think it's one of the key ingredients for the survivability of a particular chain, whether it has the ability to offer privacy in general. The other aspect is, of course, if the tech is good, someone like BlackRock is going to come in and buy your chain, reverse engineer it, run it privately, and have some, I don't know, central bank digital currency with smart contracts that have clawbacks for every single citizen. So that's an absolutely dystopian scenario.
And once again, same tech, but a completely different approach to using it. So it's going to be interesting to see, because despite this playing out over the span of many decades, it's affecting society in many ways. But we'll see what Byzantine fault tolerance is going to show in this regard. Also, the proof of work stuff shows that you can have anonymous people mining. You just know their hardware, and not even that.
And so in theory, they could spend their Bitcoin or Monero to do whatever. And at least in Bitcoin's case, the concerns about terrorism and all the bad things, as a percentage of total cryptocurrency transactions, are very regularly shown to be super overblown. There was a very recent analysis by Chainalysis addressing concerns that Hamas was being funded to the tune of billions, but in fact it was a shared wallet, and the exposure, both inflows and outflows, was less than 1%. So yeah.
And that was what? Bitcoin? Ethereum, I believe. Ethereum. Interesting. Yeah. But I guess that's generally the sentiment, I think. And generally my mindset at the moment is more about being focused on building things rather than trying to speculate on the potential outcomes, because there are just too many variables in the world right now. Yeah, for sure. Too easy to get carried away.
Yeah. I'm genuinely excited to see what's going to come out over the next six to nine months and where the consensus incentive stuff leads. Likewise.
As a closing note, any particular advice? I usually ask this for people who are curious to venture into blockchain development, but we can potentially take a broader scope in this case. Any particular advice you might give to aspiring software engineers looking to venture into, say, something broader in scope, like systems design or distributed systems in general, because it doesn't always necessarily imply blockchain, right?
I think for blockchain, the advice would be different than for distributed systems in general, because of the downside of losing your own money, or other people's money, which is even worse. So for blockchain, my advice would be to not start with a smart contract. They are brutal. If it holds any significant amount of money, it's going to be an implicit bug bounty to figure out the bug that you wrote. I have written a bug.
It was not exploited, thankfully, and the amounts were very small, but that doesn't really mean anything. So I would not start with a smart contract, at least not one deployed to mainnet. I would start, as boring as it might be as advice, with getting familiar with the protocol in depth. And if you can't be bothered to do that, then work in the web 2.5 space, which I would call anything that isn't actually the backend, isn't actually a smart contract or on chain.
Maybe it would be something like flow.algo.surf, which is a blockchain analytics tool that I've written for Algorand, where you just display web 3 data. Maybe just sending NFTs around; maybe that's web 2.5, maybe that's web 3. But generally I'd say to start simple, and especially in a blockchain that isn't too crowded from a developer point of view, there's always space to just show data from the chain in a way that hasn't been done yet.
For example, we recently had Chaintrail launch on Algorand, within the last five to six months, I believe. That's focused on TPS and identifying who is using the chain at any given block or any given time segment. There's AlgoGator, which mostly creates dashboards for you out of DeFi opportunities, and that's largely reading the state of the blockchain.
There's a lot of stuff to figure out, but it is genuinely interesting to play with programmable money. Just do it responsibly, to yourself first of all, and then to your users. So another really, really good idea, if you are going to venture into smart contracts, is to read post mortems of previous hacks.
Figuring out how other professional developers failed in implementing something on a chain should be required reading before you do anything big with money on that chain, or on chains generally. And on that side, I'm not sure I can share much information yet, but there will be things that AlgoGate will simplify access to, in regards to at least catching the most obvious rookie mistakes you might make in smart contract development. But yeah, I absolutely agree.
It's usually a great source of learning when it comes to exploits in smart contracts. It's just a slightly different paradigm of software development you have to think about, because indeed you're dealing with someone else's money. There's significantly less mutability than you might expect. And yeah, it's generally something you have to approach with a lot of care and diligence before ever deploying something that other people can interact with.
But with that, thank you for the great advice. And it was very, very exciting to interact with you. I'm still in a bit of disbelief that I'm already at episode 19. This whole ordeal started just as a way to chat with great people, engineers, on topics that I personally find a lot of passion in. And yeah, it's amazing to have you on the episode. Thanks for coming. I'm hoping we can have another one, maybe doing a slightly deeper and more focused dive on the platform around boxes with the Base32 fund.
And best of luck with the challenge of the moderation around it; I think solving that one is going to be quite interesting to see. But yeah, other than that. Thank you. Thank you very much for the invite. I enjoyed our chat immensely. As I said, your podcast is one of my favorites. Thanks very much for the opportunity to be here. Stay tuned for episode 20. That's going to be a special episode, and it will potentially be the very first episode where we'll have multiple guests on.
But that might come after quite some time. But once again, thanks for staying with us.