David Lancashire – Saito – The Big-Data Application Blockchain
https://org.saito.tech

Saito Arcade: Development Update
https://org.saito.tech/saito-arcade-development-update/
Tue, 19 Apr 2022 08:51:00 +0000

As aficionados of the Saito Arcade will have noticed, this week marks the graduation of our island-settling game “Saitoa” out of beta and into rotation on the Saito Arcade, where we hope it will join games like Poker, Red Imperium and Solitrio as a community favorite.

Saitoa may not be the largest island out there, but it is built on a foundation of volcanic rock that makes it a safe refuge in times of trouble. As more travelers become aware of its resilience to 51-percent and other economic attacks, we expect an influx of settlers and an upswing in commercial activity.

This week’s release brings the total number of games on the Saito Arcade to nine, with an additional seven in various stages of development behind the scenes. If you’re a gamer, developer, designer or artist and want to help expand the number of games available, please get in touch. All of these games are available in our official GitHub repository and will periodically show up on our staging and test servers as well.

When we released the new Game HUD some TS regulars were ambivalent: “why change it,” they asked? But then Old John, gaunt from a lifetime of DEFCON brinkmanship and rear-guarding in Africa, looked up and gave a slight nod, and we knew it was understood: every little improvement matters when you’re fighting the Soviets. Or the Americans. Whichever, really.

The first thing many people notice about the new games is their improved look-and-feel — things like textured boards and tiles. Beyond these cosmetic improvements, our team and community have been hard at work revamping parts of the underlying game engine and its set of standard components. We recently completed a full rewrite of the Game HUD, the component that lists cards and lets players select from a limited set of options. The draggable box that displays player information in games like Wordblocks and Blackjack has also been upgraded.

Wordblocks Pro-Tip: if you are playing against someone even remotely dyslexic we recommend taking odds. Our team has learned from hard experience to avoid playing “Richard” (real name withheld to protect privacy) unless we can even the odds with regular access to dictionaries and other basic Wordblocks tools.

Less visible improvements include better support for games with hexagonal tiles, simultaneous card selection, and web3 crypto integration (coming soon!). Those who look closely can also find some more experimental features: support for capturing in-game screenshots and the ability for players to save and load multiplayer games. This last feature is particularly useful in games like Red Imperium, where it is helping us debug advanced technologies and late-game weapons.

As always, much of the work that is happening around the Arcade is community driven. If there is a particular game you would like to see developed or you have programming or artistic skills and are looking for a way to help out, please get in touch. If you let us know what you think you can tackle we can direct you to a project that needs help.

Saito Roadmap Update – March 29, 2022
https://org.saito.tech/saito-roadmap-update-march-29-2022/
Tue, 29 Mar 2022 04:00:00 +0000

Team

We are pleased to share news that three developers have joined our team over the last three months. This brings our development team to five staffers, not including Richard and David. It takes a few months for developers to get familiar enough with Saito to be highly productive, but the community should have already noticed an increase in the pace of activity.

Dev

The most significant technical update in the last quarter was our production network switching to the upgraded version of Saito Consensus that allows both Rust and Javascript clients to co-exist on the network. This was a major software release with significant improvements to core scalability.

Our team is back to refactoring and iterating on our Rust architecture as scheduled. Our goals for the Rust code at this point are improving its networking code and re-architecting its overall design to support WASM. The long-term goal with WASM is the ability to compile an optimized version of the Rust binary and embed it directly in user browsers, ultimately allowing us to replace much of our JavaScript stack and permit the development of applications in non-JavaScript programming languages.

Outreach

Our next major “public release” will be the unveiling of a new website that will replace our current splash page. While we don’t want to tease unnecessarily, we’re happy to share that the new site has a more stylish appearance that reflects the elegance of what we are accomplishing in code.

Importantly, the new site includes a number of custom-produced videos and animations that we believe will help newcomers more quickly understand why Saito matters and how it works. As of this afternoon the site is basically finished in terms of structure. Our animators are still working on some of the custom animations for the new introductory video. Our target for launching the new site is April 12.

Games and Applications

Once the new site is live we will be moving forward with the redesign of our interior application suites including the Saito Arcade. In the meantime, we will be releasing an exciting new game with a new level of polish and gameplay onto production servers later this week.

Within a month we expect to publicly launch our pilot web3 crypto-integration. We are likely to be “turning this on” on our testing and staging servers (note: we now have testing and staging servers) before shifting to production. Part of the remaining work here is security review and UI/UX flow design, so that the process of joining Saito and funding the network is simple.

Saito Wiki

Our upcoming website will be accompanied by a new Saito Wiki that will combine educational materials, information on how people can contribute to the network and its various projects, information on community channels and materials, and supporting documentation for developers like an API Guide and application-building tutorials. The new Saito Wiki is expected to eventually replace our current backend site as the text-heavy partner to our new front-end webpage.

Our product lead Karl is the contact point for suggestions around the community and the Arcade. He is still ironing out details of how the edit-and-review process will work, as we will need to review suggested updates for security reasons, but the site is up and feedback is very welcome. Members of the community are already contributing to a community section and we are moving forward with community-development projects.

Moving Forward

All of the above should bring us roughly to the April 2022 vesting, which opens our final quarter of formal vesting. Once that is over, and depending on any laggard issuances, the network should enter the period of token stability and zero inflation needed to move forward with the general roadmap.

As announced in our November Roadmap, in the months following this vesting as the market settles towards a stable token supply, the community can expect continued work on the Rust suite, with most progress visible in the online application suite and gaming network.

We have discussed our plans for this in previous roadmaps, so we will just point interested readers back there.

Thanks for reading, see you all on the Arcade!

Intellectual Property Update – Feb 25, 2022
https://org.saito.tech/intellectual-property-update-feb-25-2022/
Fri, 25 Feb 2022 07:59:43 +0000

Today we are pleased to share news that in the last twenty-four hours our team has received official notice from both the European and Chinese patent offices of their intent to approve our pending applications for patent protection covering key elements of Saito Consensus in their respective jurisdictions.

These developments will extend protection for Saito Consensus outside of the United States in the global markets that matter for network adoption. Coverage starts with the use of cryptographic signatures to track fee circulation, and extends from there to critical techniques for building blockchains and running web3 applications.

Our team remains committed to building an open network. We are delighted at the recognition being shown to the importance of Saito Consensus, and the strong prior art these approvals will create to protect web3 developers building on Saito as well as other distributed PKI networks.

Technocratic Hubris: what Hayek would say about Bitcoin
https://org.saito.tech/what-hayek-would-actually-say-about-bitcoin/
Wed, 29 Dec 2021 03:10:00 +0000

While we talk a lot about collective action problems and market failures at Saito, it isn’t necessary to understand public choice theory to see the scaling problems blockchain is having. Interestingly, it’s also possible to approach the issues by reading Chicago-school economists like Friedrich Hayek and understanding how and why prices function in free markets.

On the off-chance you haven’t heard of Friedrich Hayek, he was one of the mid-20th century economists who witnessed the ascent of the Soviet economic model and fought a rear-guard action in defense of capitalism. Whereas his fellow Austrian emigre Joseph Schumpeter saw free markets as superior because of their ability to drive innovation (“creative destruction”), Hayek argued they were necessary to set prices at levels which are “not given to anyone in [their] totality” and thus cannot be optimized by central planners.

Hayek’s seminal essay The Use of Knowledge in Society outlines this most succinctly, asking how state planners could ever know how to allocate resources. What Hayek saw that many others missed is that prices do more than incentivize people to produce things: they value all goods and services in relation to each other at rates Hayek called their “marginal rates of substitution.” Letting these rates self-adjust in response to the floating preferences of consumers and producers is how free markets allocate resources efficiently, incentivizing more of X and less of Y whenever doing so provides greater social value.

The subtlety of Hayek’s emphasis on relative-prices is usually missed by crypto-libertarians, who see tokens incentivizing arbitrary forms of work and imagine they have a free market at work. Ask exactly how the market is pricing work and most will get confused: isn’t the free market providing hashpower to Bitcoin? Aren’t users paying fees? Yet the problem has never been creating incentive structures or charging fees: state planners do both of those very well.

To see the actual problem, try inverting Bitcoin. Imagine we paid nodes in the P2P network and asked volunteers to hash for free. Would we get an optimal amount of security? What about an optimal number of P2P nodes? If more hashing would encourage more P2P nodes and greater network usage, would volunteers be incentivized to spend more money on hashing?

The intellectual error most people make here is forgetting that when a consensus algorithm pays nothing for something, it is assigning a price (of zero) and manually defining the “marginal rates of substitution” between that and other necessary inputs. The fact that the consensus mechanism is a machine does not stop it from acting as a de facto central planner. Robots can fix prices too!

Were Hayek still alive, he would be the first to point out this abuse of free market economics. He would also never accept the what-me-worry reassurances of technocratic dilettantes that perhaps things will work out. As Hayek would point out, the fact that we have an external market controlling the flow of funds into an internal market means that our two layers are never independent, and can never be subject to a rationalizing outside force.

In this environment, the only way the free market can price inputs efficiently is by putting the cross-market allocation of resources within the firm, so that internal profit-maximization strategies minimize wasteful spending. Like this approach or not, the result will clearly look much less like a permissionless network and much more like Google or Amazon. Replacing volunteers with firms doesn’t suddenly make firms behave like volunteers.

This news is depressing for those who have eyes to see it, in part because so few people seem aware of it, and in part from the irony of watching Silicon Valley’s libertarian class repeat the same price-fixing errors their Marxist counterparts did a century ago. The machines may change, but the technocratic hubris remains the same.

So while we can get to Saito Consensus by reading public choice theory, even the Chicago School can teach us something most libertarian crypto-bros fail to see: the rise of dominant network-layer service providers like Infura and TAAL is far from a swan song of “centralization” before technical maturity delivers a golden age of radical decentralization. It is the direction in which the free market itself is steering POW and POS networks, driven by their need to induce provision of the infrastructure their consensus mechanisms cannot price and pay for, but on which they depend for continued survival.

Network Migration Update – December 3, 2021
https://org.saito.tech/network-migration-update-december-3-2021/
Fri, 03 Dec 2021 09:57:01 +0000

The end of November marked our target for getting Rust Consensus running on our network. While technical folks can follow our dev channels and get a close view of what is happening, we wanted to share an update for those who are non-technical but still curious.

For the last several days, our Rust Consensus network has been running in parallel with our public-facing gaming network. We are testing the ability of clients on the network to maintain consensus with module-generated traffic and gameplay. At the moment all of the clients on the network are operating as full-nodes and consuming and publishing full blocks.

As of this morning, everything is behaving a lot like the public network, although we are obviously continuing to find bugs and release patches. We have seen some edge-case bugs around the distribution of golden tickets, the validation of merkle-roots, and the relaying and distribution of blocks that are transmitted wildly out-of-sequence. The introduction of binary block and transaction formats has led to some minor serialization issues as well. All of these issues are manageable and expected.
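For readers curious what a merkle-root check actually involves, the sketch below is purely illustrative (it uses a toy hash from the standard library rather than a cryptographic one, and is not Saito's actual code): a validator recomputes the root from the block's transactions and compares it against the value claimed in the header.

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Toy stand-in for a cryptographic hash function -- illustration only.
fn h(data: &[u64]) -> u64 {
    let mut s = DefaultHasher::new();
    data.hash(&mut s);
    s.finish()
}

// Pairwise-hash the leaf layer upward until a single root remains.
fn merkle_root(mut layer: Vec<u64>) -> u64 {
    if layer.is_empty() {
        return 0;
    }
    while layer.len() > 1 {
        if layer.len() % 2 == 1 {
            let last = *layer.last().unwrap();
            layer.push(last); // duplicate the odd leaf out, as Bitcoin does
        }
        layer = layer.chunks(2).map(|p| h(&[p[0], p[1]])).collect();
    }
    layer[0]
}

fn main() {
    let txs: Vec<u64> = vec![1u64, 2, 3].iter().map(|t| h(&[*t])).collect();
    let root = merkle_root(txs.clone());
    // Tampering with any transaction changes the recomputed root,
    // so validation fails against the header value.
    let mut tampered = txs.clone();
    tampered[0] = h(&[99]);
    assert_ne!(merkle_root(tampered), root);
}
```

Validation bugs of the kind mentioned above tend to live in the edge cases of exactly this sort of routine: empty blocks, odd leaf counts, and blocks arriving out of sequence.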

We expect to be running this network for another week or so before updating the public network that is supporting active gameplay. In order to do that, we need to see chain re-organizations, automatic transaction rebroadcasting and other routing activities working smoothly in a live environment.

In general, we are pleased to share news that our migration is underway and encourage those who want a deeper view of our software development processes to join our dev channels or track developments on Github. As long as there are not significant problems with our clients maintaining consensus we expect to switch our public-facing infrastructure to run on our new clients sometime next week.

Saito Tech Update – November 2, 2021
https://org.saito.tech/saito-tech-update-november-2-2021/
Tue, 02 Nov 2021 06:14:16 +0000

Prior to our roadmap release we wanted to push out a quick update on what is happening behind the scenes, since it’s the beginning of November and we know people are expecting it. Our roadmap WILL be released soon. The reason for the delay is primarily the Web3 work described below, and also that we want to get decent polish on the document, since we know it will be a go-to resource for a lot of newcomers.

As those who’ve been watching our git commits will have noticed, for the last three weeks we’ve been doing a lot of work getting Milestone #2 for the Web3 Foundation completed and submitted. We originally had this work on hold because of delays with the DOT parachain auctions. We got a ping asking for the submission, however, so pushed ahead with getting it out of the way. The submission is now done and awaiting review by the Web3 Foundation.

As part of the updates involving this work, Arcade users should notice improvements to many of the open source games. The improvements include updates to basic game components (the user controls, card fans, card lists, popup overlays). We’re also pleased to have help on the gaming front from a new developer in Texas, who we hope will continue to work with us to improve our front-end look-and-feel across our other applications moving forward.

Development in Rust is continuing apace. For those curious about the state of development there, we currently have most of the consensus-touching parts of the codebase implemented with test coverage. The outstanding work remains in the networking code. The API that specifies how data is transmitted is mostly complete, although not entirely tested, and the outstanding work is largely in connecting this API up to the rest of the software so the client can send and receive blocks and transactions as we produce them or receive them from other peers. We also still need to get the client signing blocks and transactions.

Given that the Rust version of Saito involves consensus-breaking changes to how blocks are validated, we have started a rewrite of the Saito-Lite (in-browser) client to bring it into consistency. Rather than simply hack up the existing codebase, we are engaged in a more comprehensive rewrite that cleans up the codebase and brings it into stronger alignment with Rust. This is easier work, but it is still a tremendous amount of work. So a lot is happening behind the scenes. We are hoping that what this gets us is not one but two reasonably easy-to-read implementations of Saito Consensus. How quickly we can “flick the switch” and move the current testnet over to our new software clients will depend on how the inter-client testing work goes.

On a final note, we are excited about the upcoming roadmap update and getting you all more details and a refined vision for the project. The view beyond this is what we call “web3” and we’re planning to talk about this during the Saito Townhall. If you want the heads-up in advance of the rest of the world please come, or fire off questions to @SaitoOfficial on Twitter and we’ll try to get them answered.

Saito Tech Update: September 7, 2021
https://org.saito.tech/saito-tech-update-september-7-2021/
Tue, 07 Sep 2021 01:18:21 +0000

Our tech team has been working hard. For those who don’t want a deep dive into our development roadmap, the takeaway is that we expect to release our Rust Client on schedule.

For those seeking more detail on where we are in our roadmap, at present we are about 3-4 months into development of the Saito Rust Client. This is the scalable reference implementation for Saito Consensus that we are building in the Rust language. It does not replace our in-browser client, but it is intended to be the main network client for full-nodes. We need it up and running to figure out the maximum practical blocksize for the next few years, to finalize the burn-fee algorithm that determines the blocktime, and to understand how permissive or restrictive our congestion controls will need to be to discourage spam attacks at low fee throughput.

As of our last tech update about three months ago, we were nearing completion of what we call “Saito Classic”, a throttled-back implementation containing the basic features needed for consensus. The implementation covered a single machine creating transactions that contained routing work and producing blocks. Our major challenge during this first stage of development was ensuring our architecture would not create problems for Rust. The reason is that while Rust offers very fast parallel processing for immutable data, the language strictly limits how different threads can own or mutate shared data at the same time. As our JavaScript code did not have this limitation, a major part of our focus was making sure the components in our architecture *could* communicate, and that the limitations Rust imposes on sharing data structures would not force us to re-architect the entire codebase six months to a year out. In our last update we covered details on this and shared some data showing the performance gains we are seeing with Rust.
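The constraint described above can be illustrated with a minimal sketch (the type names are ours, not from the Saito codebase): mutable state shared across threads in Rust has to be wrapped in synchronization types such as `Arc<Mutex<T>>`, whereas JavaScript's single-threaded model never forces this discipline onto the architecture.

```rust
use std::sync::{Arc, Mutex};
use std::thread;

// Illustrative stand-in for shared consensus state (not actual Saito types).
struct Mempool {
    transactions: Vec<u64>,
}

fn run_threads() -> usize {
    // Rust will not compile a plain `&mut Mempool` shared across threads;
    // shared mutable access must go through Arc (shared ownership) plus
    // Mutex (exclusive access), which makes the locking design explicit.
    let mempool = Arc::new(Mutex::new(Mempool { transactions: vec![] }));

    let handles: Vec<_> = (0..4u64)
        .map(|i| {
            let mempool = Arc::clone(&mempool);
            thread::spawn(move || {
                mempool.lock().unwrap().transactions.push(i);
            })
        })
        .collect();

    for h in handles {
        h.join().unwrap();
    }
    let count = mempool.lock().unwrap().transactions.len();
    count
}

fn main() {
    println!("transactions pushed: {}", run_threads());
}
```

Getting this ownership layout right up front is exactly what avoids a forced re-architecture later: once a component's data is behind the wrong kind of wrapper, every thread that touches it has to change.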

After our last update we needed another two weeks or so to finish implementing Saito Classic, after which we moved on to scaffolding the major components that were not included in that implementation but are needed for any Rust client to join the live network. These include Automatic Transaction Rebroadcasting (the blockchain pruning mechanism), our staking mechanism, and our networking code. These features have been more time-intensive to add because they interact in complicated ways: once tokens are managed by a staking mechanism, for instance, they must be rebroadcast if ATR exists, while nodes winding and unwinding the chain must keep the staking tables in sync as well. In addition to working on these problems we’ve also taken some additional work onto our plate, such as shifting to a binary format for blocks and transactions. This eliminates the need for our core network to handle the JSON data format and is a step up in efficiency, since parsing JSON (reading and writing) is much more complicated than simply reading N bytes into memory.
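The efficiency argument is visible in a small sketch (the field layout here is invented for illustration, not Saito's actual wire format): with a fixed-width binary encoding, the parser reads exactly N known bytes per field, with no delimiter scanning, quoting, or string-to-number conversion.

```rust
// Illustrative transaction with a fixed-width binary layout
// (not Saito's real serialization format).
#[derive(Debug, PartialEq)]
struct Tx {
    amount: u64,
    nonce: u32,
}

fn serialize(tx: &Tx) -> Vec<u8> {
    let mut buf = Vec::with_capacity(12);
    buf.extend_from_slice(&tx.amount.to_be_bytes()); // bytes 0..8
    buf.extend_from_slice(&tx.nonce.to_be_bytes());  // bytes 8..12
    buf
}

fn deserialize(buf: &[u8]) -> Tx {
    // Reading is just copying N known bytes per field -- no tokenizing
    // and no escaping, unlike JSON parsing.
    Tx {
        amount: u64::from_be_bytes(buf[0..8].try_into().unwrap()),
        nonce: u32::from_be_bytes(buf[8..12].try_into().unwrap()),
    }
}

fn main() {
    let tx = Tx { amount: 50_000, nonce: 7 };
    let bytes = serialize(&tx);
    assert_eq!(bytes.len(), 12);
    assert_eq!(deserialize(&bytes), tx);
}
```

Fixed offsets also make the encoded size predictable, which matters when blocks are streamed between peers.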

Where does this put us for the next 2-3 months? My guess is that we have another two weeks of work on staking and networking before we can be confident those components support the functionality they need. The networking code will bring us to the point where Saito-Rust can run as a full-node and connect to other full-nodes (we will continue to use Saito-Lite as the connection point for lite-clients). Our staking implementation will support two-block payouts to start, with the golden ticket being used to pay out the block immediately preceding the block in which the golden ticket is found; we will not recursively iterate beyond that to pay out any unsolved blocks before that point.
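The two-block payout can be sketched roughly as follows (the block and payout structures are invented for illustration and simplified relative to the real mechanism): a golden ticket found in block N triggers a payout for block N−1 only, with no recursion back to earlier unsolved blocks.

```rust
// Illustrative sketch of a non-recursive two-block payout
// (not the actual Saito staking logic).
struct Block {
    id: u64,
    fees: u64,
    has_golden_ticket: bool,
}

/// Returns (block_id, payout) pairs: when block N carries a golden
/// ticket, the fees of block N-1 are paid out. Earlier blocks that
/// were never solved are simply skipped -- no recursive iteration.
fn payouts(chain: &[Block]) -> Vec<(u64, u64)> {
    let mut out = vec![];
    for i in 1..chain.len() {
        if chain[i].has_golden_ticket {
            out.push((chain[i - 1].id, chain[i - 1].fees));
        }
    }
    out
}

fn main() {
    let chain = vec![
        Block { id: 1, fees: 10, has_golden_ticket: false },
        Block { id: 2, fees: 20, has_golden_ticket: false },
        Block { id: 3, fees: 30, has_golden_ticket: true }, // pays block 2 only
    ];
    assert_eq!(payouts(&chain), vec![(2, 20)]);
}
```

In the example above, block 1 is never paid out even though it went unsolved, which is exactly the "no recursion" behavior described for the initial release.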

Once our work on staking and networking is complete, all of the major components that need implementation in Rust should have been implemented. At that point, we will have a number of smaller tasks to complete (e.g. distributing tokens on network start) and will need to develop systematic tests that cover different kinds of blockchain reorganization attempts. Our priorities for the remaining two months will cover:

  • fleshing out our testing suite, not only to help debug the software and identify problems, but also to ensure that monetary policy is working as intended and the amount of tokens in circulation remains constant across the range of network activities.
  • polishing and improving documentation so it is easier for us to onboard new developers, and so our reference implementation is cleaner, with consistent naming conventions and behavior.
  • updating Saito-Lite (our in-browser client) so that it is compatible with Saito-Rust — the changes we have made to consensus have broken compatibility between the two codebases; this will likely be the most time-intensive part of the work that remains.
  • testing, testing, testing…

Beyond Saito-Rust, we have a few other tasks that are getting attention as time permits. These include polishing a new game for addition to the Saito Arcade, creating a general framework for easier crypto-integration so as to more easily support 3rd party cryptos people want to use, and finishing milestone #2 for our Web3 Foundation grant. We do not expect to spend time on public-facing materials like the website until Rust is complete.

Where does this put us in terms of our long-term roadmap? Once Saito-Rust joins the network we will make a substantive announcement about the timeline for development moving forward, including our target dates for supporting token persistence on the live network. The speed at which that will happen will largely depend on which kinds of work we need to prioritize once Rust is live on the network.

Wolves and Sheep
https://org.saito.tech/wolves-and-sheep/
Tue, 27 Jul 2021 06:50:16 +0000

It has become conventional wisdom that there’s no way to tell attackers (wolves) from honest nodes (sheep) in an open blockchain. The strength of this opinion rests partly on its promotion by talented developers like Vitalik Buterin and Zooko Wilcox-O’Hearn.

To understand why these claims are wrong, consider the difference that Bitcoin exploits:

While honest users build on the chain-tip, attackers build from a block at least one confirmation deeper into the chain. 

This difference exists because attackers are by definition attempting to re-write historical blocks. And the network exploits this to tax them: keeping the cost of producing blocks the same for wolves and sheep, but providing a higher return for sheep who produce at the tip of the longest-chain.

But don’t majoritarian attacks eliminate this tax and put wolves back on equal footing with sheep? Some developers may make this claim (“what I meant was…”) but note the shift in logic: we have moved from asserting there are no differences between wolves and sheep to asserting that no differences exist which can be used to tax wolves who produce a majority of blocks. And this second claim is also false….

The easiest way to demonstrate this is — again — with a simple counterexample. Can we think of other differences between wolves and sheep that can be leveraged to tax wolves? Remember that these differences do not need to be perceivable by participants in the network. All we require is that we can leverage them to systematically tilt income away from wolves and towards sheep. Eliminating majoritarian attacks requires this penalty to apply even when wolves produce the vast majority of blocks in our network.

Sound impossible? There is at least one difference with this property:

When an attacker puts a transaction (that will be profitable for them to collect) into a block, they must be one hop deeper into the routing network than at least one honest node or user.

This difference is unquestionable. If an attacker is making money producing blocks, those blocks must by definition contain fees that come from non-attackers. It follows that every wolf that is not burning their own money (paying a tax) to produce blocks has at least one sheep at an earlier point in their transaction-routing path.

As with the difference Bitcoin leverages, this difference can’t tell us which specific nodes are wolves (some sheep are deep-hop nodes too!), but it can be leveraged to suppress the profitability of wolves. If we can make participating in the network more costly for deep-hop nodes, and force them to compete with early-hop nodes to create blocks and earn income, we give sheep a systemic competitive advantage. Eliminating the 51-percent attack is possible if, provided with the same set of resources, any node at routing depth N+1 can produce only half the blocks and earn only half the profits of a node at depth N.
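One way to realize such a depth penalty, sketched here purely as an illustration (the function name and the exact halving rule are ours, simplified from the routing-work idea in the Saito whitepaper): weight each routed fee by 1/2^hop when computing a node's claim on block production and payout.

```rust
// Illustrative depth penalty: a node's claim on a routed fee halves
// with every additional hop, so a node at depth N+1 holds half the
// claim of a node at depth N for the same fee volume. Simplified
// sketch, not the actual Saito consensus code.
fn routing_work(fee: u64, hop: u32) -> u64 {
    fee >> hop // fee / 2^hop
}

fn main() {
    let fee = 1000;
    // First-hop (sheep-like) positions keep the full claim...
    assert_eq!(routing_work(fee, 0), 1000);
    // ...while each deeper hop halves it.
    assert_eq!(routing_work(fee, 1), 500);
    // Deep-hop (wolf-like) positions see their claim decay geometrically.
    assert_eq!(routing_work(fee, 3), 125);
}
```

Because a wolf collecting outside fees is by definition at least one hop deeper than some honest node on the same transaction path, this weighting taxes wolves systematically even when they produce most of the blocks.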

We can accomplish this by putting deep-hop nodes at a competitive disadvantage in producing blocks, or paying them less for the blocks they do produce. And why not both? As the amount of routing work in any block grows our consensus-layer simply needs to increase the cost of producing a block, and decrease the payment for producing it. This solution is counterintuitive to POW and POS developers as it requires the ratio between two variables that are fixed in their networks (cost of block production, and income from producing blocks) to float. It also requires something missing in both POW and POS: cryptographic signatures that track who routes fees inwards towards block producers.

The Saito whitepaper describes a practical implementation of the above. In Saito the cost of block production is directly modulated by consensus, while a cryptographically-biased lottery makes collecting payments costly for wolves but profitable for sheep, even when wolves produce the vast majority of blocks. An attentive reading of the Saito design will show it eliminates other, more subtle problems with POW and POS mechanisms as well, but there is no reason for non-economists to delve into Saito to understand why claims that what it does is impossible are ungrounded.

To conclude: security mechanisms in blockchain do not identify bad actors by intent. They differentiate by behavior. Those who insist it is impossible to tell the difference between “wolves” and “sheep” are forgetting something they used to know: that blockchains work by taxing forms of behavior which are always exhibited by wolves, without caring if this punishes some honest sheep.

Differences do exist, and networks leverage them to punish bad actors. This is how cost-of-attack is generated, and it is worth remembering, because if we want to solve majoritarian attacks we cannot just throw up our hands and declare them impossible. The solution is to tax the hell out of wolflike behavior until the only living things left in our pasture are battle-hardened but honest sheep.

Saito Tech Update: June 28, 2021
https://org.saito.tech/saito-tech-update-june-28-2021/
Sun, 27 Jun 2021 20:30:51 +0000

The tech team has been hard at work for the past two sprints. There is a lot to share, but if you don’t want to wade into a technical discussion, the short version is that we are on track and expect to release our Rust Client as scheduled.

Rather than just provide vague reassurances that things are going well, in this post I’ll go a bit deeper to give more insight into how we are operating, what problems we are dealing with, and how we are solving them. With that in mind, the first thing to mention is that under Clay and Stephen’s leadership the tech team has gone through an organizational change that is professionalizing our dev environment. We have shifted toward test-driven development and a process of peer review on major commits to the codebase. This is pretty standard for larger teams: it has involved changes to how development happens that make it harder for any of us to break things or make arbitrary changes without flagging them with other members of the team first.

In terms of coding focus, Saito Rust has been our #1 priority and we’ve gotten a lot done in the past two sprints. Most of the components in the Saito system have now been implemented in Rust code: block production, storage/retrieval of blocks from disk, mempool management of transactions and blocks, tracking of longest-chain state for blocks in the blockchain, our algorithm for adding blocks to the blockchain, burn fee calculations, signature validation, utxoset management and more. Note that Rust has restrictive policies around memory management and “ownership”, so these pieces of code are not necessarily production-ready yet: we tend to code first and then integrate. The major components we have not tackled in any significant way are network and peer management, automatic transaction rebroadcasting and the staking mechanism.

In general, our development effort has alternated open-ended development cycles with discussions of what works and what doesn’t, followed by focused implementation of the consensus approach into the core software. Doing things this way has allowed us to find out: (1) what practical issues we run into implementing the logic in Rust, and (2) what suggestions contributors have on improving Saito. Being open to making changes to the way Saito works has slowed us down, because it has meant discussions are about more than just implementing an algorithm that already exists in javascript, but it has also led to some pretty clear wins for which Clay and Stephen deserve almost all of the credit:

  • an upgrade to the default hashing algorithm from SHA256 to BLAKE3 that will significantly speed up the overall data-throughput of the network; this is really significant — hashing is the single-biggest bottleneck in the classic Saito implementation.
  • an optional “pre-validation” step in block-processing that avoids inserting data from many blocks into critical blockchain indices until we are sure those blocks are part of a competitive chain; this saves work in all but the critical cases.
  • a change to the way that slips are formatted that eliminates the need for lite-wallets to track the blockchain after they have made a transaction: users will be able to spend their change slips as soon as they have created them; there are follow-on benefits that make it easier to spend ATR transactions here as well.
  • various proposals that effectively shrink the size of Saito transactions considerably when they are saved to disk or sent over the network, including a roughly 40% savings in the size of transaction slips which will allow us to pack more transactions into the same space on the blockchain.
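To make the last two bullets more concrete, here is a toy sketch of the difference between a human-readable slip encoding and a fixed-width binary one. The field names, widths and values below are hypothetical stand-ins for illustration, not Saito’s actual slip layout:

```rust
/// A toy slip: 33-byte public key, 8-byte amount, 1-byte ordinal.
/// (Hypothetical layout for illustration; not Saito's actual format.)
struct Slip {
    publickey: [u8; 33],
    amount: u64,
    slip_ordinal: u8,
}

impl Slip {
    /// Human-readable encoding: a JSON-style string with a hex-encoded key.
    fn to_json(&self) -> String {
        let hex: String = self.publickey.iter().map(|b| format!("{:02x}", b)).collect();
        format!(
            "{{\"publickey\":\"{}\",\"amount\":{},\"slip_ordinal\":{}}}",
            hex, self.amount, self.slip_ordinal
        )
    }

    /// Fixed-width binary encoding: 33 + 8 + 1 = 42 bytes.
    fn to_bytes(&self) -> Vec<u8> {
        let mut out = Vec::with_capacity(42);
        out.extend_from_slice(&self.publickey);
        out.extend_from_slice(&self.amount.to_be_bytes());
        out.push(self.slip_ordinal);
        out
    }
}

fn main() {
    let slip = Slip { publickey: [0u8; 33], amount: 123_456_789, slip_ordinal: 3 };
    println!("json:   {} bytes", slip.to_json().len());
    println!("binary: {} bytes", slip.to_bytes().len());
}
```

Fixed-width fields also make deserialization a matter of copying bytes at known offsets, rather than parsing strings, which is part of why byte-stream formats load so much faster than JSON.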

Looking back at the last two months, I think we’ve spent more time exploring various implementation ideas that will not be implemented than I would have expected, but believe the payoff has been worth it in the sense that we are implementing improvements we did not conceptualize as part of this effort. The overall result is that the codebase is probably a bit behind where I expected we would be now in terms of raw features, but it is also cleaner and simpler and development is happening faster than expected too. The hardest part has been sorting out a design process where people can communicate effectively about proposed changes.

In terms of actual development milestones, we’re expecting to finish the classic version of the Saito protocol (no networking) within 1-2 more weeks, and to then tackle peer connections, the ATR mechanism and the staking mechanism in turn. This should afford time for testing before live-net deployment.

So what are the numbers on raw scalability? A good place to start is the screenshot above, generated by an earlier non-Rust version of Saito that I fired up to get some comparative numbers. It shows a test script creating an arbitrary number of fairly mammoth transactions and throwing them at the engine to check how long they take to process. This is a back-of-the-envelope way to sanity-check our performance: seeing how different kinds of blocks (a few big transactions? masses of small ones?) bring the engine to its knees. In this particular run, a smaller number of data-heavy transactions takes about 2 seconds to process.

There are three critical bottlenecks that affect our non-Rust implementation: (1) saving blocks to disk, (2) loading blocks from disk, and (3) validating blocks (and transactions). We can’t see the delays in blocks getting saved here because that is happening asynchronously and behind-the-scenes. But we can see that hashing the data affixed to transactions and checking those transaction signatures takes a huge amount of processing time. There are some less obvious lessons too — it is interesting to know that our Google Dense Hashmap is not even breaking a sweat in handling UTXO updates, but that our wallet is already starting to struggle, suggesting that core routing nodes shouldn’t even do that work by default.

One of the dirty secrets in the blockchain space is that network speed is the most critical bottleneck for scale. This is one of the reasons the world actually needs Saito, since “routing work” pays for fast network connections and that is ultimately the only way to achieve scale. But the more time we can shave off other work the more time we can afford to waste on network latency, so our focus in terms of core blockchain performance has been on those three critical bottlenecks: (1) saving blocks to disk, (2) loading blocks from disk, and (3) validating blocks (and their transactions).

Without solving any of those issues, what the overall numbers from non-Rust implementations tell us is that any javascript client is going to hit a wall somewhere around 400 MB per minute, and probably sooner if they start getting spammed by malicious nodes. This is useful to know because terabyte-scale requires around 1.5GB per minute so Rust needs to deliver big gains. It is also useful to have raw numbers because while some people think javascript isn’t a performant language this isn’t true — what you see here is *optimized* code that does things like swap-in precompiled C binaries in performance-critical areas like handling the UTXO set.
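As a back-of-the-envelope conversion of those figures into per-day chain growth (a sketch using only the numbers from the paragraph above):

```rust
const MINUTES_PER_DAY: f64 = 60.0 * 24.0;

/// Convert a per-minute throughput in MB into per-day chain growth in TB.
fn mb_per_min_to_tb_per_day(mb_per_min: f64) -> f64 {
    mb_per_min * MINUTES_PER_DAY / 1_000_000.0
}

fn main() {
    // A javascript client topping out around 400 MB per minute...
    println!("js ceiling: ~{:.2} TB/day", mb_per_min_to_tb_per_day(400.0)); // ~0.58
    // ...versus the ~1.5 GB (1500 MB) per minute cited for terabyte-scale.
    println!("target:     ~{:.2} TB/day", mb_per_min_to_tb_per_day(1500.0)); // 2.16
}
```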

So how does Rust look right now? In javascript we use JSON and store a lot of data in formats like “human-readable strings” that are slower to read and write from disk. In Rust we are eliminating those and focusing on pure byte-streams to speed up performance. Most importantly, we are dividing the work of “validating” transactions across the number of CPU cores that a computer has available. Rust is particularly well suited to this kind of parallelization because its ownership model lets threads safely share memory without copying it.

And the Rust numbers are much better. In terms of baseline numbers, we are seeing evidence to suggest 2x speed improvements on writing-to-disk and loading-from-disk. The savings here don’t come from the speed of interacting with the hard drive but rather the conversion between the resulting byte-streams and the in-memory objects that can be processed by the software. Saving to disk is not a really critical bottleneck, but loading from disk (or the network) is and so performance gains here are really critical to being able to rapidly validate and forward blocks. Clay in particular has been doing a lot of heavy lifting in pushing us towards better and faster data serialization and deserialization. It’s hard to share hard numbers because serialization work is still underway. We think that 50% improvement in block loading speed is possible.

What about transaction validation? In testing this aspect of system performance, we’ve mostly been dealing with test blocks in the 10 MB to 1 GB range with anywhere from 1,000 to 100,000 transactions in total (~300k slips). We don’t really know how much data users will associate with transactions, so the differences here are less about approximating reality and more about finding out — given the need to process a metric ton of transactions — how long block validation takes at different transaction sizes. Doing this has helped us learn, for instance, that for most practical transaction sizes a lot more time is spent hashing than verifying the resulting transaction signature. And UTXO validation? That continues to be a cakewalk. Possibly because we are using best-of-breed tools instead of the sort of databases that POS networks drift towards, we won’t need to optimize the speed of updating our UTXO hashmap for a long time.

And the good news here is that Rust is delivering 100% on parallelization. While we can use various techniques to add “parallel processing” to transaction validation in other languages, many require creating multiple “instances” of blocks for separate CPU threads to process individually. Rust avoids this issue, meaning that once our block is in memory we can throw as many CPUs at transaction validation as needed. For all practical purposes, this makes transaction validation a non-issue. The process still takes a considerable amount of time compared to everything else we are doing, but the performance limit here is the number of CPU threads available, and that is scalable to something like 64 cores today on commercial hardware.
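The pattern described above can be sketched with nothing but the standard library. This is illustrative only (the `Transaction` type and `validate` check are toy stand-ins for the real consensus logic), but it shows the key point: scoped threads borrow disjoint chunks of the block’s transactions rather than each thread receiving its own copy of the block:

```rust
use std::thread;

/// Stand-in for a real transaction; validation here is a toy check.
struct Transaction {
    signature_ok: bool,
}

fn validate(tx: &Transaction) -> bool {
    // Real code would hash the transaction and verify its signature.
    tx.signature_ok
}

/// Validate all transactions in parallel, splitting the slice across threads.
/// The data is borrowed, not cloned: each thread sees a disjoint chunk.
fn validate_parallel(txs: &[Transaction], threads: usize) -> bool {
    let threads = threads.max(1);
    let chunk_size = txs.len() / threads + 1; // always >= 1
    thread::scope(|s| {
        let handles: Vec<_> = txs
            .chunks(chunk_size)
            .map(|chunk| s.spawn(move || chunk.iter().all(validate)))
            .collect();
        handles.into_iter().all(|h| h.join().unwrap())
    })
}

fn main() {
    let txs: Vec<Transaction> =
        (0..100_000).map(|_| Transaction { signature_ok: true }).collect();
    println!("block valid: {}", validate_parallel(&txs, 8)); // prints "block valid: true"
}
```

With real signature checks in place of the toy predicate, this structure scales with the number of available cores, which is exactly the property described above.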

We’re excited to get to the point where benchmarking is becoming realistic and useful. We don’t really want to share numbers broadly on everything quite yet, because we haven’t finalized serialization, and ATR and routing work obviously have an effect on performance, but if you are really curious you can always download and run the code and see for yourself. The most important thing for us is that we’ve built a solid foundation for our own team in submitting patches and proposing design changes, and that decisions are being made in terms of actual numbers: does this change make things faster? So we think things are pretty solid.

Our biggest goal for the next 2 weeks is to finish what we consider the classic implementation of Saito. This involves finalizing what versions of algorithms are included by default in the “basic” version and particularly how the mining components interact with block production. After that we will move on with more advanced features (automatic transaction rebroadcasting, staking).

There is lots more to do, but also lots of progress. As always, if you’re interested in checking out what is happening for yourself or joining us and tackling some development, please feel welcome to track or clone our Saito Rust repository on Github. We have a dev channel in Saito Discord where questions and discussions are more than welcome too. Hope to see you there.

]]>
https://org.saito.tech/saito-tech-update-june-28-2021/feed/ 0
Saito Roadmap Update: 24 May 2021 https://org.saito.tech/saito-roadmap-update-24-may-2021/ https://org.saito.tech/saito-roadmap-update-24-may-2021/#respond Tue, 25 May 2021 14:21:11 +0000 http://org.saito.tech/?p=2629 As the crypto markets experience turmoil, our team remains focused on building Saito into the world’s largest and most open public blockchain. Here is an update on what is happening.

TECHNOLOGY

By now our team has laid the foundation for a Rust implementation of our core Saito client. You can follow the development work on our Github repository. We have also pushed forward by completing Milestone #1 of our Web3 Foundation Grant.

For the next four weeks our priorities are continuing to build our Rust client, as well as doing developer outreach to the Polkadot community around the Web3 work we have delivered. As always, of course, we welcome contact from those interested in building applications on Saito or contributing to development: dev@saito.tech.

Our current priorities:

  • Rust speed-benchmarking: we expect to get Saito-Rust to the point in the next three weeks where we will be able to begin benchmarking foundational components like block creation, transaction validation and data-hashing speeds. We plan to release a short test-suite others can use to validate our figures. Once this is done our Rust client should be able to make blocks and process transactions, though it will still lack networking support.
  • We will be spending time to revamp our homepage and user-facing website as well. This is as much comms and marketing work as it is tech, but we expect the technical implementation will take some time and attention.
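As a sketch of the kind of micro-benchmark loop such a test-suite might contain (the `workload` function here is a cheap placeholder, not Saito’s actual hashing or validation code):

```rust
use std::hint::black_box;
use std::time::Instant;

/// Placeholder workload standing in for hashing or signature validation.
fn workload(data: &[u8]) -> u64 {
    data.iter()
        .fold(0u64, |acc, &b| acc.wrapping_mul(31).wrapping_add(b as u64))
}

/// Run `iterations` passes over `data` and return throughput in MB/s.
fn benchmark(data: &[u8], iterations: u32) -> f64 {
    let start = Instant::now();
    for _ in 0..iterations {
        // black_box stops the compiler from optimizing the work away.
        black_box(workload(black_box(data)));
    }
    let elapsed = start.elapsed().as_secs_f64();
    (data.len() as f64 * iterations as f64) / (1_000_000.0 * elapsed)
}

fn main() {
    let data = vec![0xab_u8; 1_000_000]; // 1 MB of synthetic test data
    println!("workload throughput: {:.1} MB/s", benchmark(&data, 100));
}
```

Swapping the placeholder for real block creation, hashing or validation calls turns the same loop into the kind of figure-validation suite described above.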

FINANCE

  • As our community is aware, we have been working on expanding exchange listing support. We are in the late stages of a significant listing and will share details on timelines as soon as we are able.
  • Given recent market movements, we note that Saito is well-insulated against market volatility as most of our funding is secured in USD-denominated assets.
  • With vesting complete, we plan to move the majority of illiquid SAITO tokens on the public ERC20 contract into a dedicated multi-sig vault. Details on any transfer that affects publicly-visible balances will be communicated well in advance.

COMMUNITY

  • We are working on marketing activities related to the items discussed in the finance section of this roadmap. We expect this to continue to take up time and ask for understanding.
  • We are working to setup regularly-scheduled competitions and tournaments around some of the more popular games in our Saito Arcade. We’re not quite sure yet how this will interact with our Web3 support for other cryptocurrencies, but you can expect to see poker tournaments and other gaming leagues get off the ground in the next two weeks. We hope you will all join us for some fun!
  • Judging from online shares and retweets, our most popular piece of content last month was this Ready Player One meme video. But we produced a lot of great stuff including multiple interviews, videos, and several blog posts. We will continue to put effort into producing materials and have plans for community activities to generate even more.

HIRING

  • We have brought on 1-2 developers over the past three weeks and are still looking for a front-end NodeJS developer who can work full-time on improving our website and all of our Saito Applications, including our games and web3 support layer. We are also continuing to hire for marketing.
  • As our marketing efforts expand we are looking for senior marketing staff who can design and execute great campaigns to take Saito to the crypto community and the world.

    View our open positions at saito.io/jobs

]]>
https://org.saito.tech/saito-roadmap-update-24-may-2021/feed/ 0