From the perspective of first principles, how SCP and AO affect the on-chain world
Why can SCP and AO (Actor Oriented) become all-round "hexagonal warriors" combining infinite performance, data trustworthiness, and composability?
Author: Shijiu Jun, PermaDAO
- 1. From Bitcoin to Ethereum, how to find the optimal path to break through the limitations of throughput and scenarios?
- 2. Starting from first principles, how do we see past saturated market memes and identify the most essential, foundational needs of blockchain?
- 3. What magic lies in the disruptive design principle of SCP and AO (Actor Oriented), the separation of storage and computation, that can allow Web3 to truly unleash itself?
- 4. Will the results of running deterministic programs on immutable data be unique and reliable?
- 5. Under such a narrative, why can SCP and AO (Actor Oriented) become "hexagonal warriors" of infinite performance, trustworthy data, and composability?
The Shadows Behind the Prosperity of "Web3" After the U.S. Election
[Data Source: DefiLlama] As the spotlight fades, the TVL of Ethereum, the second-largest cryptocurrency by market capitalization, has remained sluggish since reaching its historical peak in 2021.
Even in the third quarter of 2024, Ethereum's decentralized finance (DeFi) revenue dropped to $261 million, the lowest level since the fourth quarter of 2020.
At first glance there seem to be occasional spikes, but the overall trend indicates that DeFi activity on the Ethereum network has slowed down.
Moreover, the market has seen the emergence of dedicated public chains built for entirely alternative trading scenarios. The recently popular Hyperliquid, a trading chain based on an order-book model, has grown rapidly across the board: it jumped into the top 50 by market cap in just two weeks, and its projected annual revenue is expected to trail only Ethereum, Solana, and Tron among all public chains. This reflects the fatigue of traditional DeFi built on the AMM architecture and on Ethereum.
[Data Source: Compound Trading Volume]
[Data Source: Uniswap Trading Volume] DeFi was once the core highlight of the Ethereum ecosystem, but with falling transaction fees and user activity, its revenue has declined significantly.
Against this backdrop, the author attempts to examine the reasons behind the predicament faced by Ethereum, and indeed the blockchain industry as a whole, and to ask how it can be overcome.
Coincidentally, following the successful fifth test flight of its rocket, SpaceX has confirmed its status as the rising star of commercial spaceflight. Looking back at SpaceX's development path, its success is attributed to a key methodology: first principles. (Tip: the concept of first principles was first proposed by the ancient Greek philosopher Aristotle more than 2,300 years ago. He described a "first principle" as the most fundamental proposition or assumption in the exploration of every system, one that cannot be omitted or violated.)
So, let us also apply the method of first principles, peeling away the fog layer by layer, to explore the most essential "atoms" of the blockchain industry. From a fundamental perspective, we will re-examine the current challenges and opportunities faced by this industry.
Is Web3's "Cloud Service" a Step Backward or the Future?
When the concept of AO (Actor Oriented) was introduced, it attracted widespread attention. In the context of many EVM series public chains becoming homogenized, AO, as a disruptive architectural design, has shown unique appeal.
This is not just a theoretical idea; there are teams putting it into practice.
As noted at the outset, the greatest value of blockchain lies in recording digital value; from this perspective, it is a publicly transparent global ledger. Based on this essence, the first principle of blockchain can be considered a form of "storage."
AO is built on the Storage-based Consensus Paradigm (SCP): as long as the storage layer is immutable, the results of computation carry consensus no matter where the computation is performed. This gives rise to the AO global computer, enabling large-scale parallel machines to interconnect and collaborate.
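To make this concrete, here is a minimal Python sketch of the SCP idea, under the assumption of a deterministic program and an immutable, ordered message log (all names and data below are illustrative): any party that replays the same log reaches the same state.

```python
import hashlib
import json

# Immutable, ordered message log (stands in for data stored on-chain).
LOG = [
    {"op": "transfer", "from": "alice", "to": "bob", "amount": 10},
    {"op": "transfer", "from": "bob", "to": "carol", "amount": 4},
]

def replay(log):
    """Deterministically fold the log into a state; every honest replayer agrees."""
    state = {"alice": 100, "bob": 0, "carol": 0}
    for msg in log:
        if msg["op"] == "transfer" and state[msg["from"]] >= msg["amount"]:
            state[msg["from"]] -= msg["amount"]
            state[msg["to"]] += msg["amount"]
    return state

def state_digest(state):
    # Canonical serialization so every node derives the identical digest.
    return hashlib.sha256(json.dumps(state, sort_keys=True).encode()).hexdigest()

# Two independent "nodes" replay the same immutable log and reach consensus.
assert state_digest(replay(LOG)) == state_digest(replay(LOG))
```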
Looking back at 2024, one of the most notable events in the Web3 space was the explosion of the inscription ecosystem, which can be seen as an early practice of the storage-computation separation model. For example, the etching technique used by the Runes protocol embeds a small amount of data in Bitcoin transactions. Although this data does not affect the transaction's primary function, it serves as additional information that forms a clear, verifiable, non-consumable output.
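As a rough illustration of data riding along with a transaction, consider the toy model below. It is emphatically not the real Runes encoding; it only shows the general pattern of a data-carrying output that indexers can later read back.

```python
# A toy transaction with a data-carrying output. This is NOT the actual Runes
# wire format; it only illustrates "a small amount of data riding along with
# a transaction without affecting its primary function".
tx = {
    "inputs": [{"txid": "ab" * 32, "vout": 0}],
    "outputs": [
        {"value": 50_000, "script": "pay-to-alice"},  # normal payment output
        {"value": 0, "script": "DATA:" + "hello-runes".encode().hex()},  # etched data
    ],
}

def extract_etched_data(tx):
    """Indexers scan outputs for the data marker and decode the payload."""
    for out in tx["outputs"]:
        if out["script"].startswith("DATA:"):
            return bytes.fromhex(out["script"][5:]).decode()
    return None

print(extract_etched_data(tx))  # -> "hello-runes"
```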
Some technical observers were initially skeptical about the security of Bitcoin inscriptions, fearing they could become potential entry points for attacks on the network. Yet for the past two years this data has been stored entirely on-chain, and no fork has occurred to date. This stability further confirms that as long as the stored data is not tampered with, data consistency and security can be ensured regardless of where the computation is performed.
You may find that this sounds almost identical to traditional cloud services. For example:
In terms of computing resource management, each "Actor" in the AO architecture is an independent computing entity, and each computing unit can run its own environment, which is reminiscent of microservices and Docker in traditional cloud computing. Similarly, where traditional cloud services rely on S3 or NFS for storage, AO relies on Arweave.
However, simply reducing AO to "reheated leftovers" is not accurate. Although AO draws on certain design concepts from traditional cloud services, its core lies in combining decentralized storage with distributed computing. Arweave, as a decentralized storage network, fundamentally differs from traditional centralized storage. This decentralized characteristic endows Web3 data with higher security and censorship resistance.
More importantly, the combination of AO and Arweave is not a simple technical stack but creates a new paradigm. This paradigm combines the performance advantages of distributed computing with the trustworthiness of decentralized storage, providing a solid foundation for innovation and development in Web3 applications. Specifically, this combination is mainly reflected in the following two aspects:
- A fully decentralized design of the storage system.
- Performance ensured through a distributed computing architecture.
This combination not only addresses some core challenges in the Web3 space (such as storage security and openness) but also provides a technical foundation for potentially infinite innovation and combinations in the future.
The following text will delve into the concepts and architectural design of AO and analyze how it addresses the challenges faced by existing public chains like Ethereum, ultimately bringing new development opportunities to Web3.
Viewing the Current Web3 Predicament from the "Atomic" Perspective
Since Ethereum burst onto the scene with smart contracts, it has become the undisputed king.
Some may ask, isn't there Bitcoin? However, it is worth noting that Bitcoin was created as a substitute for traditional currency, aiming to be a decentralized and digital cash system. Ethereum is not just a cryptocurrency; it is also a platform for creating and implementing smart contracts and decentralized applications (DApps).
In summary, Bitcoin is a digital substitute for traditional currency; its price is high, but a high price does not in itself mean high value. Ethereum is more like an open-source platform, whose richness gives it desirable value and which better represents the open world of Web3 in contemporary thinking.
Since 2017, many projects have attempted to challenge Ethereum, but very few have stayed the course. Meanwhile, Ethereum's performance has always been criticized, spurring the growth of Layer 2 solutions. While Layer 2 appears to be thriving, it is in fact struggling helplessly in a predicament. As competition intensifies, a series of issues has gradually emerged, becoming serious shackles on the development of Web3:
Performance Limitations and Poor User Experience
[Data Source: DefiLlama] [Data Source: L2BEAT] Recently, an increasing number of people have come to believe that Ethereum's Layer 2 scaling roadmap has failed.
Initially, Layer 2 was positioned as an important continuation of Ethereum culture within the scaling roadmap, and many in the community backed the Layer 2 route, hoping that lower gas fees and higher throughput would drive growth in user numbers and transaction volumes. However, despite the reduction in gas fees, the anticipated growth in user numbers did not materialize.
In fact, is the failure of the scaling roadmap really the fault of Layer 2? It is quite clear that Layer 2 is merely a scapegoat. While it bears some responsibility, the main responsibility lies with Ethereum itself; more deeply, it is the inevitable result of the underlying design problems shared by most chains in today's Web3.
From the "atomic" perspective, Layer 2 itself undertakes the function of computation, while the essential "storage" of the blockchain is borne by Ethereum. To achieve sufficient security, it must be Ethereum that stores and reaches consensus on the data.
Meanwhile, Ethereum's design must guard against potential infinite loops during execution, which could halt the entire platform. Therefore, any given smart contract execution is limited to a finite number of computational steps, enforced through the gas mechanism.
This leads to a contradiction: Layer 2 is designed in the expectation of unlimited performance, yet the ceiling of the main chain shackles it. The weakest-link effect dictates that Layer 2 has a hard ceiling.
For the detailed mechanisms, readers can explore further: 《From Traditional DeFi to AgentFi: Exploring the Future of Decentralized Finance》.
Limited Gameplay, Difficult to Form Effective Attraction
Ethereum's pride lies in its prosperous application-layer ecosystem, which hosts all kinds of DApps.
However, is the prosperity truly a scene of flourishing diversity?
The author believes it clearly is not: behind Ethereum's prosperous application ecosystem lie heavy financialization and a lack of maturity in non-financial applications.
Let us look at the application sectors that have developed relatively well on Ethereum. First, concepts like NFT, DeFi, GameFi, and SocialFi, however exploratory in financial innovation, are not yet suitable for the general public. Web2's rapid development fundamentally stems from functionality closely aligned with people's daily lives.
Compared to financial products and services, ordinary users are more concerned about features like messaging, social networking, video, and e-commerce.
Secondly, from a competitive perspective, credit loans are a very common and widespread product in traditional finance, but in the DeFi space, this type of product is still relatively scarce, mainly due to the current lack of an effective on-chain credit system.
Building a credit system requires allowing users to truly own their online personal profiles and social graphs, enabling them to cross different applications.
Only when these decentralized pieces of information can achieve zero-cost storage and transmission can a powerful personal information graph for Web3 be constructed, along with a set of Web3 applications based on a credit system.
Thus, we can restate a key point: Layer 2's failure to attract enough users is not its fault, as Layer 2 was never the core driving force. The real way to break through the shackles of Web3 is to innovate application scenarios that attract users.
However, the current situation is akin to a highway during a holiday rush: constrained by transaction performance limits, even the most innovative ideas struggle to get off the ground.
The essence of blockchain itself is "storage." When storage and computation are coupled together, the design is insufficiently "atomic," and a design that strays from the essence in this way will inevitably hit a performance ceiling.
Some viewpoints define the essence of blockchain as a trading platform, a currency system, or emphasize transparency and anonymity. However, this perspective overlooks the fundamental characteristics of blockchain as a data structure and its broader application potential. Blockchain is not just for financial transactions; its technical architecture allows it to be applied across multiple industries, such as supply chain management, healthcare records, and even copyright management. Therefore, the essence of blockchain lies in its ability as a storage system, not only because it can securely store data but also because it ensures data integrity and transparency through a distributed consensus mechanism. Once a data block is added to the chain, it is almost impossible to change or delete.
Atomic Infrastructure: AO Makes Infinite Performance Possible
[Data Source: L2 TPS] The basic architecture of blockchain faces a clear bottleneck: the limitation of block space. Just like a fixed-size ledger, every transaction and piece of data must be recorded in a block. Ethereum and other blockchains are constrained by block size limits, causing transactions to compete for space. This raises a critical question: Can we break through this limitation? Must block space be limited? Is there a way to achieve true infinite scalability for the system?
Although Ethereum's Layer 2 route has improved throughput by several orders of magnitude, it can only be said to have half succeeded: during peak transaction periods it may hold up for individual projects, but for the many Layer 2 solutions that inherit storage and consensus security from the main chain, this level of scalability is far from sufficient.
It is worth noting that Layer 2 TPS cannot be increased indefinitely; it is limited mainly by the following factors: data availability, settlement speed, verification costs, network bandwidth, and contract complexity. While Rollups optimize L1's storage and computation load through compression and verification, they must still submit and verify data on L1 and are therefore constrained by L1's bandwidth and block time. Meanwhile, the computational overhead of generating zero-knowledge proofs, node performance bottlenecks, and the execution demands of complex contracts further cap Layer 2's scalability.
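A back-of-the-envelope calculation shows why data availability alone imposes a ceiling. The figures below are illustrative assumptions, not live network parameters:

```python
# Bound on rollup TPS from L1 data bandwidth alone.
# All numbers are illustrative assumptions, not current network figures.
l1_data_bytes_per_block = 384 * 1024   # assume ~384 KB of data capacity per block
l1_block_time_s = 12                   # Ethereum-style block time
bytes_per_compressed_tx = 100          # assumed size of one compressed rollup tx

max_tps = l1_data_bytes_per_block / bytes_per_compressed_tx / l1_block_time_s
print(f"Data-availability-bounded TPS ~ {max_tps:.0f}")
# ~328 TPS under these assumptions: large, but finite, and shared by every
# rollup posting to the same L1, hence the ceiling described above.
```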
[Data Source: suiscan TPS] Currently, the real challenge for Web3 lies in insufficient throughput and applications, making it difficult to attract new users. Web3 may face the risk of losing influence.
In short, improving throughput is key to whether Web3 has a bright future. Achieving a network that can scale infinitely and has high throughput is the vision for Web3. For example, Sui adopts a deterministic parallel processing approach, pre-arranging transactions to avoid conflicts, thereby enhancing the system's predictability and scalability. This enables Sui to handle over 10,000 transactions per second (TPS). At the same time, Sui's architecture allows for increased network throughput by adding more validating nodes, theoretically achieving infinite scalability. It also employs the Narwhal and Tusk protocols to reduce latency, enabling the system to efficiently process transactions in parallel, thus overcoming the scalability bottlenecks of traditional Layer 2 solutions.
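To illustrate the idea of deterministic parallel processing (a conceptual sketch only, not Sui's actual scheduler), the snippet below groups transactions into conflict-free batches by the objects they touch and runs each batch in parallel:

```python
from concurrent.futures import ThreadPoolExecutor

# Transactions declaring which objects they touch; disjoint sets cannot
# conflict, so they can safely execute concurrently.
txs = [
    {"id": 1, "touches": {"objA"}},
    {"id": 2, "touches": {"objB"}},   # disjoint from tx 1 -> same batch
    {"id": 3, "touches": {"objA"}},   # overlaps tx 1 -> next batch
]

def schedule(txs):
    """Greedily pack transactions into batches with no overlapping objects."""
    batches, batch, used = [], [], set()
    for tx in txs:
        if tx["touches"] & used:      # conflict with the current batch
            batches.append(batch)
            batch, used = [], set()
        batch.append(tx)
        used |= tx["touches"]
    if batch:
        batches.append(batch)
    return batches

def execute(tx):
    return f"tx {tx['id']} done"

for batch in schedule(txs):
    with ThreadPoolExecutor() as pool:   # each conflict-free batch runs in parallel
        print(list(pool.map(execute, batch)))
```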
AO, the subject of this article, is based on a similar idea. Although the focus is different, both are building a scalable system centered on storage.
Web3 needs a new infrastructure based on first principles and centered on storage. Just as Elon Musk rethought rocket launches and the electric-vehicle industry from first principles and fundamentally redesigned those complex technologies, disrupting their industries, AO's design is similar: by decoupling computation from storage, it constructs a future-oriented Web3 storage foundation and pushes Web3 toward the vision of "decentralized cloud services."
Design Paradigm Based on Storage Consensus (SCP)
Before introducing AO, we must first discuss the relatively novel SCP design paradigm.
SCP may be unfamiliar to most people, but everyone is likely familiar with Bitcoin's inscriptions. Strictly speaking, the design idea behind inscriptions is, in some respects, a way of thinking that treats "storage" as the "atomic" unit, albeit with some deviations.
Interestingly, Vitalik has also expressed the hope of serving as the "paper tape" of Web3, and the SCP paradigm is precisely this kind of thinking.
In Ethereum's model, computation is executed by full nodes, then stored globally and made available for queries. This creates a problem: while Ethereum is a "world computer," it is a single-threaded program in which all steps execute one at a time, which is clearly inefficient. It is also fertile ground for MEV: transaction signatures enter Ethereum's mempool and are publicly disseminated, then ordered and mined by miners. Although this process may take only about 12 seconds, in that brief window the transaction's content is exposed to countless "hunters," who can quickly intercept and simulate it, even reverse-engineering the likely trading strategy. For details on MEV, readers can explore: 《The MEV Landscape One Year After Ethereum's Merge》
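A toy mempool makes this exposure obvious. The sketch below is purely illustrative (real MEV extraction is far more sophisticated): a "hunter" copies a visible pending transaction with a higher fee and gets ordered first.

```python
# Toy mempool illustrating MEV exposure: a pending transaction is publicly
# visible before inclusion, so an observer can copy the strategy with a
# higher fee and be ordered ahead of the original sender.
mempool = [{"sender": "alice", "action": "buy TOKEN", "fee": 10}]

def hunter_scan(mempool):
    """The attacker watches public pending transactions and outbids them."""
    victim = mempool[0]
    return {"sender": "hunter", "action": victim["action"], "fee": victim["fee"] + 1}

mempool.append(hunter_scan(mempool))
block = sorted(mempool, key=lambda tx: -tx["fee"])  # miners order by fee
print([tx["sender"] for tx in block])  # ['hunter', 'alice']: the front-run
```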
In contrast, the idea of SCP is to separate computation from storage. You might find this a bit abstract; no worries, let's use a Web2 scenario as an example.
In Web2's chat and online shopping processes, there are often sudden spikes in traffic. However, a single computer struggles to support such a heavy load in terms of hardware resources. To address this, clever engineers proposed the concept of distribution, delegating computation to multiple computers, which then synchronize and store their respective computational states. This allows for elastic scaling to handle varying traffic over time.
Similarly, SCP can be seen as a design that distributes computation across various computing nodes. The difference is that SCP's storage relies not on MySQL or PostgreSQL databases but on the mainnet of the blockchain.
In short, SCP uses blockchain to store the results of states and other data, ensuring the trustworthiness of stored data and achieving a high-performance network layered with the underlying blockchain.
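As a minimal sketch of this layering (illustrative names, with a plain Python list standing in for the base chain): the heavy computation runs off-chain, and only the inputs plus the resulting state digest are written to storage.

```python
import hashlib
import json

chain = []  # append-only stand-in for the base-layer blockchain

def apply_tx(state, tx):
    """Arbitrary deterministic business logic, executed entirely off-chain."""
    state = dict(state)
    state[tx["key"]] = tx["value"]
    return state

def commit(state, tx):
    new_state = apply_tx(state, tx)
    digest = hashlib.sha256(json.dumps(new_state, sort_keys=True).encode()).hexdigest()
    chain.append({"tx": tx, "state_digest": digest})  # on-chain: data, not execution
    return new_state

state = commit({}, {"key": "greeting", "value": "hello"})
# Any auditor can later replay the `tx` records from `chain` and check digests.
print(chain)
```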
More specifically, the blockchain in SCP is used solely for data storage, while off-chain clients/servers are responsible for executing all computations and storing all generated states. This architectural design significantly enhances performance and scalability, but under the architecture of separating computation and storage, can we truly ensure the integrity and security of the data?
In simple terms, the blockchain is primarily used to store data, while the actual computational work is done by off-chain servers. This new system design has an important characteristic: it no longer employs the complex node consensus mechanisms of traditional blockchains but instead conducts all consensus processes off-chain.
What are the benefits of this? Because there is no need for complex consensus processes, each server can focus solely on processing its computational tasks. This allows the system to handle almost unlimited transactions, and the operating costs are lower.
While this design is somewhat similar to currently popular Rollup scaling solutions, its goals are larger: it is not just about solving blockchain scalability issues but also about providing a new path for the transition from Web2 to Web3. After discussing all this, what advantages does SCP have? SCP enhances system flexibility and composability by decoupling computation and storage. This design not only improves the performance limitations of traditional blockchains but also ensures the trustworthiness of data. Such innovation makes SCP an efficient and scalable infrastructure, empowering future decentralized ecosystems.
Composability: SCP places computation off-chain, which prevents pollution of the essence of blockchain, allowing it to maintain its "atomic" properties. At the same time, with computation off-chain, the blockchain only bears the functional attributes of storage, meaning any smart contract can be executed, and migrating applications based on SCP becomes extremely simple, which is very important.
Low Development Barriers: Off-chain computation allows developers to use any programming language for development, whether C++, Python, or Rust, without needing to specifically use EVM with Solidity. The only cost for programmers may be the API costs for interacting with the chain.
No Performance Limitations: Off-chain computation aligns computational power directly with traditional applications, where the performance ceiling depends on the machine performance of the computing servers. The traditional elastic scaling of computing resources is a very mature technology, and without considering the costs of computing machines, computational power is virtually unlimited.
Trustworthy Data: Since the basic function of "storage" is borne by the blockchain, this means all data is immutable and traceable. If any node doubts the state results, it can pull data for recalculation. Thus, the blockchain endows data with trustworthy characteristics.
Bitcoin proposed the PoW solution to address the "Byzantine Generals Problem," which was Satoshi Nakamoto's unconventional approach in that environment, leading to the success of Bitcoin.
Similarly, when facing smart contract computation, we can start from first principles. It may look like an unreasonable solution, but by boldly delegating the computation function off-chain and returning the blockchain to its essence, we find that storage consensus is satisfied while the data remains open and auditable, and performance can be every bit as good as Web2. This is SCP.
The Combination of SCP and AO: Breaking Free from Shackles
After all this, we finally arrive at AO.
Firstly, AO's design adopts the Actor Model, a concurrency pattern popularized by the Erlang programming language.
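The following minimal Python sketch captures the pattern: each actor owns private state and a mailbox, and the only way to affect it is to send a message (a simplified illustration, not AO's actual runtime):

```python
import queue
import threading

class Actor:
    """Owns private state and a mailbox; reachable only via messages."""

    def __init__(self, name):
        self.name = name
        self.mailbox = queue.Queue()
        self.count = 0  # private state: never shared, never locked
        self.thread = threading.Thread(target=self._run)
        self.thread.start()

    def send(self, msg):
        self.mailbox.put(msg)

    def _run(self):
        while True:
            msg = self.mailbox.get()
            if msg == "stop":
                break
            self.count += 1
            print(f"{self.name} handled {msg!r} (total so far: {self.count})")

a = Actor("process-1")
a.send("ping")
a.send("pong")
a.send("stop")
a.thread.join()
```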
At the same time, AO's architecture and technology are based on the paradigm of SCP, separating the computation layer from the storage layer, making the storage layer permanently decentralized while maintaining the traditional model of the computation layer.
AO's computing resources are similar to traditional computing models, but it adds a permanent storage layer, making the computation process traceable and decentralized.
At this point, you may wonder, which main chain is used for the storage layer?
Clearly, the main chain used for the storage layer cannot be Bitcoin or Ethereum, and the reasons have been discussed above. I believe readers can easily understand this. The final data storage and verifiability issues of AO's computations are handled by Arweave.
So why choose Arweave among so many decentralized storage tracks?
The choice of Arweave as the storage layer is based primarily on the following consideration: Arweave is a decentralized network focused on permanent data storage, positioned as a "global hard drive that never loses data," in contrast to Bitcoin's "global ledger" and Ethereum's "global computer."
For more technical details about Arweave, please refer to: 《Understanding Arweave: The Key Infrastructure of Web3》. Next, let us focus on the principles and technologies of AO and see how AO achieves infinite computation. [Data Source: How AO Messenger Works | Manual]
The core of AO is to build an infinitely scalable, environment-independent computation layer. Each node in AO collaborates on the basis of protocols and communication mechanisms, so that each node can provide optimal service while avoiding competitive consumption. First, let us understand the basic architecture of AO: it consists of two basic units, processes and messages, plus scheduling units (SU), computing units (CU), and messenger units (MU); a simplified simulation of this pipeline follows the list below:
Process: the computational unit of a node in the network, responsible for data computation and message processing. For example, each contract can be a process.
Message: processes interact through messages; every message is data conforming to the ANS-104 standard, which all of AO must adhere to.
Scheduling Unit (SU): responsible for numbering a process's messages so they can be ordered, and for uploading messages to Arweave.
Computing Unit (CU): the state node in an AO process, responsible for executing computational tasks and returning the results and signatures to the SU, ensuring that computation results are correct and verifiable.
Messenger Unit (MU): the routing entity within a node, responsible for delivering user messages to the SU and verifying the integrity of signed data.
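Here is the promised simplified simulation of the messenger-to-scheduler-to-compute flow. All names and structures are illustrative stand-ins, not the real AO protocol or its APIs:

```python
import hashlib
import itertools

arweave = []                      # permanent storage stand-in
seq = itertools.count(1)          # SU's monotonically increasing numbering

def messenger_unit(msg):
    # MU: route the user's message onward after an integrity check.
    assert "from" in msg and "body" in msg
    return scheduler_unit(msg)

def scheduler_unit(msg):
    # SU: assign a sequence number for ordering and persist to storage.
    msg["seq"] = next(seq)
    arweave.append(dict(msg))     # holographic record on the storage layer
    return compute_unit(msg)

def compute_unit(msg):
    # CU: execute the state transition and sign the result.
    state = msg["body"].upper()
    sig = hashlib.sha256(state.encode()).hexdigest()[:16]
    return {"seq": msg["seq"], "state": state, "signature": sig}

print(messenger_unit({"from": "alice", "body": "hello ao"}))
print(arweave)
```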
It is worth noting that AO has no shared state, only holographic state. Consensus in AO is generated through game-theoretic incentives: the state produced by each computation is uploaded to Arweave, ensuring data verifiability. When users question a given piece of data, they can request that one or more nodes recompute it from the data on Arweave; if the results do not match, the corresponding dishonest nodes are penalized.
Innovation of the AO Architecture: Storage and Holographic State
The innovation of the AO architecture lies in its data storage and verification mechanism, which replaces the redundant computation and limited block space of traditional blockchains with decentralized storage (Arweave) and holographic states.
Holographic State: In the AO architecture, the "holographic state" generated by each computation is uploaded to the decentralized storage network (Arweave). This "holographic state" is not just a simple record of transaction data; it contains the complete state and related data of each computation. This means that every computation and result will be permanently recorded and can be verified at any time. The holographic state serves as a "data snapshot," providing a distributed and decentralized data storage solution for the entire network.
Storage Verification: In this model, data verification no longer relies on each node recalculating all transactions but confirms the validity of transactions by storing and comparing data uploaded to Arweave. When the computation results produced by a certain node do not match the data stored on Arweave, users or other nodes can initiate a verification request. At this point, the network will recalculate the data and check the storage records in Arweave. If the computation results are inconsistent, the node will be penalized, ensuring the integrity of the network.
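A minimal sketch of this challenge flow, assuming a deterministic stand-in program and treating a Python dict as the record stored on Arweave (illustrative only; the actual penalty logic is a protocol-level concern):

```python
import hashlib
import json

def digest(x):
    return hashlib.sha256(json.dumps(x, sort_keys=True).encode()).hexdigest()

def compute(inputs):
    return sum(inputs)                        # stand-in deterministic program

record = {"inputs": [1, 2, 3], "claimed": 6}      # honest record on "Arweave"
dishonest = {"inputs": [1, 2, 3], "claimed": 7}   # tampered claimed result

def challenge(record):
    """Replay the stored inputs and compare against the claimed result."""
    recomputed = compute(record["inputs"])
    if digest(recomputed) != digest(record["claimed"]):
        return "mismatch: penalize the node that signed this result"
    return "verified"

print(challenge(record))     # verified
print(challenge(dishonest))  # mismatch: penalize the node that signed this result
```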
Breaking Through Block Space Limitations: Traditional blockchain block space is limited by storage, with each block containing only a finite number of transactions. In the AO architecture, data is no longer directly stored in blocks but uploaded to a decentralized storage network (like Arweave). This means that the storage and verification of the blockchain network no longer depend on the size of block space but are shared and expanded through decentralized storage. Therefore, the capacity of the blockchain system is no longer directly limited by block size.
The limitations of block space in blockchain are not insurmountable. The AO architecture changes the traditional data storage and verification methods of blockchain by relying on decentralized storage and holographic states, thus providing the possibility for infinite scalability.
Must Consensus Depend on Redundant Computation?
Not necessarily. Consensus mechanisms do not have to rely on redundant computation; they can be achieved in various ways. Solutions that depend on storage rather than redundant computation are also feasible in certain scenarios, especially when the integrity and consistency of data can be ensured through storage verification.
In the architecture of AO, storage becomes an alternative to redundant computation. By uploading computation results to a decentralized storage network (here, Arweave), the system ensures data immutability, and through the holographic upload of states, any node can verify computation results at any time, ensuring data consistency and correctness. This approach relies on the reliability of data storage rather than on every node repeating the computation. Next, let us compare AO and ETH:

| | AO | Ethereum |
| --- | --- | --- |
| Computation model | Massively parallel processes (Actor Model) | Single-threaded sequential execution |
| Consensus | Storage-based: holographic states on Arweave, verifiable by recomputation | Redundant computation by all full nodes |
| Storage | Decoupled and permanent (Arweave) | Coupled with execution, limited block space |
| Scalability | Computing resources added on demand, theoretically unbounded | Bounded by block space and gas limits |
It is not difficult to see that the core characteristics of AO can be summarized in two points:
Large-scale parallel computation: Supports countless processes running in parallel, significantly enhancing computational capacity.
Minimized trust dependency: there is no need to trust any single node; all computation results can be reproduced and traced without limit.
How AO Breaks the Dilemma of the Public Chains Led by Ethereum
Regarding the two major dilemmas faced by Ethereum, performance shackles and insufficient applications, the author believes this is precisely where AO excels, for the following reasons:
AO is designed on the SCP paradigm, with computation separated from storage, so its performance is no longer constrained the way Ethereum's single-process computation is. AO can flexibly scale computational resources according to demand, and the holographic state of the message log on Arweave allows AO to guarantee consensus by reproducing computation results, with security on par with Ethereum and Bitcoin.
The message-passing parallel computing architecture allows AO's processes to avoid competing for "locks." In Web2 development, it is well-known that a high-performance service will try to avoid lock contention, as this is a significant cost for efficient services. Similarly, AO's processes avoid lock contention through messages, which enables its scalability to reach any size.
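The contrast can be shown in a few lines of Python (an illustrative pattern, not AO code): a shared counter serializes all writers on one lock, while a mailbox-owned counter is mutated by a single owner and writers never block one another.

```python
import queue
import threading

# Shared-state version: all threads serialize on one lock.
lock, shared = threading.Lock(), [0]
def inc_shared():
    for _ in range(1000):
        with lock:            # every writer contends here
            shared[0] += 1

# Message-passing version: writers just enqueue; only the owner mutates.
mailbox = queue.Queue()
def inc_via_message():
    for _ in range(1000):
        mailbox.put(1)

writers = [threading.Thread(target=inc_shared) for _ in range(4)]
senders = [threading.Thread(target=inc_via_message) for _ in range(4)]
for t in writers + senders:
    t.start()
for t in writers + senders:
    t.join()

owned = 0
while not mailbox.empty():    # the single "process" drains its mailbox
    owned += mailbox.get()

print(shared[0], owned)       # 4000 4000: same result, no lock on the message path
```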
AO's modular architecture is reflected in the separation of CU, SU, and MU, allowing AO to adopt any virtual machine, sequencer, and so on. This offers extremely convenient, low-cost migration and development for DApps from different chains. Combined with Arweave's efficient storage capabilities, DApps developed on it can achieve richer gameplay; for example, personal-relationship graphs can be implemented easily on AO.
The support of the modular architecture allows Web3 to adapt to the policy requirements of different countries and regions. Although the core concept of Web3 is decentralization and deregulation, it is inevitable that different policies in various countries have a profound impact on the development and promotion of Web3. Flexible modular combinations can adapt to different regional policies, thus ensuring the robustness and sustainable development of Web3 applications to some extent.
Conclusion
The separation of computation and storage is a great idea and a systematic design based on first principles.
As a narrative direction similar to "decentralized cloud services," it not only provides good landing scenarios but also offers broader imaginative space for combining AI.
In fact, only by truly understanding the foundational needs of Web3 can we break free from the dilemmas and shackles brought about by path dependence.
The combination of SCP and AO provides a new approach: it inherits all the characteristics of SCP, deploying not smart contracts but immutable, traceable data on-chain, thereby achieving data trustworthiness that everyone can verify.
Of course, there is currently no absolutely perfect path, and AO is still in its nascent development stage. How to avoid the over-financialization of Web3 and create enough application scenarios to bring richer possibilities for the future remains a test for AO on its path to success. Whether AO can deliver a satisfactory answer still needs to be tested by the market and time.
The combination of SCP and AO is a potentially powerful development paradigm. Although its ideas have not yet been widely recognized in the market, AO is expected to play an important role in the Web3 space and may even drive the further development of Web3.