Foresight Ventures: Crypto-Native Indexing Protocol and Keeper Protocol
0. Middleware Security Issues
Existing indexing protocols and keeper networks are not truly trustless; they are trusted, or only seemingly trustless. Developers and users must trust that these products will not act maliciously, a "trust, don't verify" posture.
They are last-generation infrastructure: a better solution may not have existed when they were built, so Fisherman mechanisms or DAO governance (social consensus) were used to keep data trustworthy and the protocol running securely.
Now that ZK solutions are in the final stage of performance optimization, previous-generation mechanisms no longer need to be used to tackle current problems. With ZK, Web3 middleware can innovate while satisfying security, decentralization, and performance at the same time, just as Optimistic Rollups are likely to cede Layer 2 dominance to ZK Rollups in the future.
1. Web3 Indexing
a) Web3 Indexing
We first need to understand why the Web3-specific indexing protocol is needed:
- Web3 uses an address (account) model in which smart contract data exists in transactional form and must be indexed into structures that are easier to use; in Web2, data structures are handled by the developers themselves.
- Much of Web3’s data is transaction-related; a large portion of Web2’s indexed data is web page or image data indexed by search engines.
- Web3 needs a common indexing protocol to be composable; Web2 developers build their own indexing services based on their own centralized applications.
Given these points, if you are developing a DApp and forced to do the indexing yourself, it takes a lot of front-end code to extract specific data from a contract; building a service would require countless different one-off functions.
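As a sketch of the problem (the names here are hypothetical; real code would also need an RPC provider, ABI decoding, pagination, and reorg handling), consider deriving token balances on the front-end by replaying decoded Transfer events:

```typescript
// Hypothetical decoded event shape; in practice these come from
// eth_getLogs plus ABI decoding via a client library.
interface TransferEvent {
  from: string;
  to: string;
  value: bigint;
}

// One of the many ad-hoc "indexing" functions a DApp would need:
// replay every Transfer event to derive current balances.
function indexBalances(events: TransferEvent[]): Map<string, bigint> {
  const balances = new Map<string, bigint>();
  const add = (addr: string, delta: bigint) =>
    balances.set(addr, (balances.get(addr) ?? 0n) + delta);
  for (const e of events) {
    if (e.from !== "0x0") add(e.from, -e.value); // mints come from 0x0
    add(e.to, e.value);
  }
  return balances;
}
```

Every new view (top holders, per-block history, transfers per user) needs yet another such function, which is exactly the duplicated work a shared indexing protocol removes.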
So in our earlier DApp architecture diagram, a common indexing protocol is needed as an intermediate layer, so that the front-end can easily access and use smart contract data.
b) GraphQL Indexing
We need an indexing protocol as an intermediate layer, so how do we choose it? (The Graph discussed in 2019 why Web3 uses GraphQL, but the argument was not very clear.) There are four potential choices:
First, exclude SOAP: its adoption rate is very low and its learning curve is steep. Some even say "REST is king, and SOAP is trash."
Second, exclude RPC. RPC is a common specification for client-to-blockchain or Web2 service-to-service calls, with operations (verbs) at its core; its interfaces are somewhat cumbersome to update, which is acceptable for client-to-node network communication. For our smart contract data scenario, however, it is too heavy, and performance suffers from the number of requests required and the dependency on a running application.
Next, rule out REST. REST is a resource-centric (noun-centric) specification, but in a Web3 application any update to a resource must be triggered by a user's (or another party's) authorization, and all requests in an indexing protocol are reads (GET), so full REST semantics are unnecessary.
Finally, GraphQL was chosen:
- The GraphQL protocol itself requires less effort to build on than other standards, needs fewer changes, and is easier to turn into a common protocol.
- The interaction format of GraphQL gives more freedom to the front-end to define the result, which is in line with the back-end-less idea of DApp architecture.
- GraphQL is well suited to blockchain smart contract scenarios: the data is completely open and immutable, and much of it is tree-structured, so GraphQL also delivers better performance here.
- GraphQL is already a mature standard in blockchains with The Graph for indexing individual smart contracts, and there are already GraphQL interfaces for entire chains (ethql, Clear), with a high degree of maturity and a well-developed developer ecosystem.
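To make the "front-end defines the result" point concrete, here is a hypothetical subgraph query (field names are illustrative, in the style of a Uniswap-like subgraph) together with a toy projector showing that the client, not the server, chooses the shape of the returned tree:

```typescript
// A GraphQL query: the client states exactly which fields of the
// entity tree it wants, and receives data in exactly that shape.
const exampleQuery = `{
  pairs(first: 2) {
    id
    token0 { symbol }
    reserveUSD
  }
}`;

// Toy illustration of GraphQL's core idea: project a nested data
// tree down to only the fields the client asked for.
type Selection = { [field: string]: true | Selection };

function project(data: any, sel: Selection): any {
  if (Array.isArray(data)) return data.map((d) => project(d, sel));
  const out: any = {};
  for (const [field, sub] of Object.entries(sel)) {
    out[field] = sub === true ? data[field] : project(data[field], sub);
  }
  return out;
}
```

Against real infrastructure the query string is simply POSTed to a subgraph endpoint; `project` only mimics the selection semantics to show why this model leaves so much freedom to the front-end.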
Beyond that, I don’t think we need to spend a lot of time developing new GraphQL and Query protocols for storage networks (although it makes sense to aggregate these indexes):
- Most of the storage networks come with available indexing protocols, such as Arweave’s GraphQL service, and developing new protocols would be rebuilding the wheel.
- The amount of data on the storage network is very small compared to contract data or Web2 data, and the value it carries is also relatively small.
- Web2 already has more mature protocols and solutions for indexing this data, and developing new protocols is still like reinventing the wheel.
When we talk about indexing protocols, the default is to get the blockchain smart contract data directly from the front-end, because as we explained in the previous article, it makes sense to eliminate the back-end server for the Web3 Crypto-native trusted DApp.
Adding a back end for smart contract chains would add architectural complexity and expose more untrusted surface (projects like zk-sql focus on this problem but cannot fully solve it; Sqlidity is an interesting on-chain SQLite solution). Of course, for DApps built on storage protocols, SQL-style statements remain necessary for developer familiarity and workflow.
The structure to focus on for the indexing protocol should be a GraphQL structure that the front-end can use directly.
c) Existing Web3 GraphQL Indexing Protocol Issues
The leading existing indexing protocols are the decentralized The Graph and Pocket Network and the centralized Alchemy. Both approaches have their own problems:
— The problem of centralized Indexing Protocols:
- Inability to resist censorship
- Inability to guarantee high availability of services
— Problems with existing decentralized Indexing Protocols:
- The trust model and security remain poor (the cost of attacking a subgraph is very low; like Chainlink 2.0, it relies on "more trusted" Fishermen to report)
- Performance does not meet demand
For security, the Fisherman mechanism differs between Optimistic Rollup and an indexing protocol: Optimistic Rollup's data is on-chain and can easily be verified by a large group of people through re-execution, whereas indexing happens off-chain, and anyone who is not an indexer of the subgraph in question finds it difficult to challenge wrong data. This makes the trust model even less robust.
Together these shortcomings have left a huge gap in the market: large DeFi applications rarely use these indexing protocols, for both performance and security reasons.
d) ZK Solving Problems
ZK is actually a very good solution: any problem with an Optimistic mechanism can be solved by switching to ZK, with Rollups being the most prominent example.
The ZK-ized indexing protocol combines all the advantages of both centralized and decentralized protocols, including high availability and censorship resistance (multiple nodes guarantee uptime), excellent performance (centralized high-performance nodes can be chosen because of ZK), and security (ZK’s mathematical cryptography ensures good security).
For an indexing protocol, ZK’s solution:
- EVM compatibility is not required.
- Focus on overall performance; the throughput of verifiable queries must be guaranteed.
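To give a feel for what "verifiable queries" mean for a client, the sketch below uses a toy Merkle inclusion proof: the indexer commits to a root, and the client checks that a returned record really belongs to the committed dataset. This is deliberately not a zk proof (a real design such as Shellproofs would prove the whole indexing computation, not just inclusion), and it assumes the leaf count is a power of two.

```typescript
import { createHash } from "node:crypto";

const h = (s: string) => createHash("sha256").update(s).digest("hex");

// Indexer side: commit to all records with a Merkle root.
function merkleRoot(leaves: string[]): string {
  let level = leaves.map(h);
  while (level.length > 1) {
    const next: string[] = [];
    for (let i = 0; i < level.length; i += 2) next.push(h(level[i] + level[i + 1]));
    level = next;
  }
  return level[0];
}

// Indexer side: proof for the leaf at `index` is its sibling hashes,
// bottom to top.
function merkleProof(leaves: string[], index: number): string[] {
  let level = leaves.map(h);
  const proof: string[] = [];
  let i = index;
  while (level.length > 1) {
    proof.push(level[i ^ 1]); // sibling at this level
    const next: string[] = [];
    for (let j = 0; j < level.length; j += 2) next.push(h(level[j] + level[j + 1]));
    level = next;
    i = Math.floor(i / 2);
  }
  return proof;
}

// Client side: does this query result hash up to the committed root?
function verify(record: string, index: number, proof: string[], root: string): boolean {
  let acc = h(record);
  for (const sib of proof) {
    acc = index % 2 === 0 ? h(acc + sib) : h(sib + acc);
    index = Math.floor(index / 2);
  }
  return acc === root;
}
```

The point of the ZK upgrade is that a single centralized high-performance node can serve results, while every consumer cheaply verifies them instead of trusting the operator.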
The Graph itself is aware of the lack of security in its own mechanism and is working on Shellproof.
But The Graph's research and development feels slow, and it is unclear whether Shellproofs can support all subgraphs. Moreover, The Graph has invested so much in its existing mechanism that replacing it would be harder than building something new.
A truly zk-enabled application of The Graph can build a new application and development paradigm:
- Any DeFi application can trust data from this indexing protocol, greatly simplifying the development process.
- Multi-chain applications can trust data from multiple chains and protocols at the same time, resulting in a huge improvement in user experience (+ Standardized Subgraph).
In this way, we can understand that the zk-ized The Graph is actually a decentralized RPC, which is far more ambitious than The Graph’s narrative, and truly achieves the decentralization that Infura is aiming for.
2. Web3 Keeper Network
In our previous article on Crypto-Native application architecture, we mentioned the Keeper.
It is essentially an off-chain timer that triggers a smart contract function at a certain time, similar to:
- CronJob in Linux
- setTimeout and setInterval (Web APIs, not part of the JavaScript language itself)
It is used for the following purposes:
- On-chain oracle price updates (the previously mentioned Uniswap V2 TWAP)
- Trading, voting, clearing bots
- Automated mining and selling
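In its simplest form, a keeper node is just a loop that fires an on-chain call when a schedule says so. The sketch below uses hypothetical names and replaces the real transaction with a callback; production keepers add gas management, retries, and payment:

```typescript
// Minimal keeper sketch: trigger a contract call when `dueAt` has passed.
interface Task {
  name: string;
  dueAt: number;            // unix ms timestamp of next execution
  exec: () => void;         // in reality: sign and send a tx to the contract
}

// One keeper "tick": run every task that is due and return what fired.
function tick(tasks: Task[], now: number): string[] {
  const fired: string[] = [];
  for (const t of tasks) {
    if (now >= t.dueAt) {
      t.exec();
      fired.push(t.name);
      t.dueAt = now + 60_000; // reschedule, like a one-minute cron interval
    }
  }
  return fired;
}
```

A production keeper wraps `tick` in `setInterval` (its off-chain analogue of CronJob); the security question the rest of this section raises is how anyone verifies that `exec` was called honestly and on time.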
However, similar to The Graph discussed above, its security mechanism is very outdated: not even on-chain governance like The Graph's, but off-chain manual reporting of misbehaving nodes through DAO and social consensus. For example, in the diagram below, Gelato's architecture is clear in its overall functionality, but no component shows any security guarantee.
Take two typical Keeper Networks as an example, their security mechanisms are:
- Gelato: the current keeper execution nodes are not permissionless; only whitelisted nodes can participate. Gelato expects to secure the network after future decentralization through a stake-and-slash mechanism and a DAO. However, taking a week of DAO process to punish a misbehaving node is completely unacceptable for a service that needs to run at high frequency.
- Keep3r Network: Similar to Gelato, the Watcher monitors and reports illegal behavior to the DAO, requiring communication and lengthy steps.
Like the indexing protocol just discussed, the Keeper can be completely zk-ified to solve its security problem. Gelato's off-chain resolver is even a subgraph defined in GraphQL, carrying the same missing security guarantees as The Graph, so the two problems can be solved together.
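For intuition on what a resolver is, Gelato-style resolvers return a pair of "should we execute?" and "with what payload?". The sketch below is a simplified TypeScript analogue (real resolvers are on-chain view functions or subgraph queries, and the payload would be ABI-encoded calldata; the rebalance condition here is purely illustrative):

```typescript
// Simplified analogue of a Gelato-style resolver: a pure check that
// returns whether to execute and with what payload.
interface ResolverResult {
  canExec: boolean;
  execPayload: string;        // in reality: ABI-encoded calldata
}

// Example condition: rebalance a position once price drifts outside a band.
function rebalanceResolver(
  price: number,
  target: number,
  bandPct: number
): ResolverResult {
  const drift = Math.abs(price - target) / target;
  if (drift > bandPct) {
    return { canExec: true, execPayload: `rebalance(${target})` };
  }
  return { canExec: false, execPayload: "" };
}
```

A zk-ified keeper would prove both that this check was evaluated faithfully and that the data it read was correctly indexed, which is exactly where the indexing and keeper problems merge.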
This way a zk-ified Keeper with a trusted off-chain resolver can unlock a myriad of new application scenarios:
- Transaction bots with complex strategies
- Cross-Cluster/Cross-Chain/Cross-Block/Cross-DEX arbitrage/market-making bots
- Programmable Liquidity (Adjust interval, JIT, reinvestment, Rebalance)
3. Crypto-Native ZK Infra
As the underlying zkEVM and general-purpose zkVM infrastructure matures, we can already build on it to create infrastructure that developers can use directly, including the zk-ified middleware envisioned here.
ZK is an important innovation driver, much like AMM. ZK and AMM unlock more automated and trusted applications than Optimistic mechanisms and order books, respectively, making security fully transparent and publicly verifiable on-chain; they have also respectively opened up the adjacent tracks of proof outsourcing and swap aggregation, unlocking countless new applications.
Beyond scaling, cross-chain light clients, privacy, and machine learning, ZK is a cryptographic solution perfectly suited to blockchain scenarios (network-wide verification, extreme automation, security even stronger than network consensus), with great potential in the middleware tracks of indexing protocols and keeper networks. We will continue to watch ZK's application in more areas.
About Foresight Ventures
Foresight Ventures is dedicated to backing the disruptive innovation of blockchain for the next few decades. We manage multiple funds: a VC fund, an actively-managed secondary fund, a multi-strategy FOF, and a private market secondary fund, with AUM exceeding $400 million. Foresight Ventures adheres to the belief of “Unique, Independent, Aggressive, Long-Term mindset” and provides extensive support for portfolio companies within a growing ecosystem. Our team is composed of veterans from top financial and technology companies like Sequoia Capital, CICC, Google, Bitmain and many others.