ETHTaipei 2024 Interview with Vitalik: AI, DeSocial, Quantum Resistance
Colin Wu · 2024-03-21

Author: WuBlockchain

During the recent ETHTaipei 2024 conference, Vitalik was interviewed by several media outlets. The conversations covered daily life and food, the Dencun upgrade and the Ethereum roadmap (upgrade expectations, centralization risks, modularity, restaking, etc.), security and privacy (quantum resistance, ZK privacy and security), decentralized social media, AI, advice for the industry, and other topics. WuBlockchain compiled the interviews as follows:

Q: As a nomad, what’s the quick dish you’ve been making yourself a lot lately?

A: Dark chocolate will be one of the main components.

Q: When you travel and live around, what experimental community event inspired you the most?

A: Yeah, I guess it depends on what you mean by experimental community events, right? Zuzalu last year was probably the biggest thing in that category. The one in Vietnam, for example, was the first where the entire venue was outdoors. It was reminiscent of the West Lake in Hangzhou, in terms of ambiance. It was also the most COVID-safe conference I’ve attended, which was cool.

Q: What are you most excited to work on these days? Doesn’t have to be Ethereum/crypto related.

A: Zuzalu is continuing this year in a more decentralized form: it’s Ethereum-connected, but driven by the community. Various independent groups are forming their own villages this year, and I’m following along, offering help where I can. Excited to visit some of them soon. Lately, I’ve been experimenting with recent AI tools, running models locally, and using them for various tasks. It’s crucial to deepen understanding in this space and explore new possibilities. Learning is key, and I believe there’s value in exploring a diverse range of topics.

Q: How does the Dencun upgrade contribute to the Ethereum ecosystem?

A: The purpose of the Dencun upgrade is to significantly enhance scalability and reduce transaction fees, particularly for L2 and rollups. It achieves this by creating an independent section of the database within each block, known as blobs. This data is inaccessible to the EVM, which is crucial because it means clients verifying an Ethereum block don’t need immediate access to this data. However, the Ethereum blockchain guarantees its availability. This feature is especially beneficial for rollups and any layer 2 project reliant on secure data availability, ensuring nodes can sync if current nodes disappear, or enabling challenges against incorrect answers.
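
As a toy illustration of the separation described above (not the actual consensus code; all names here are hypothetical, and a plain hash stands in for a real KZG commitment): the block itself carries only small commitments to its blobs, so an ordinary client can validate the block without the blob bytes, while a rollup node separately checks that the committed data was actually published.

```python
import hashlib

def commit(blob: bytes) -> bytes:
    # Stand-in for a KZG commitment: a short value binding the block to the blob.
    return hashlib.sha256(blob).digest()

class Block:
    def __init__(self, txs, blob_commitments):
        self.txs = txs                            # visible to the EVM
        self.blob_commitments = blob_commitments  # the blobs themselves stay outside

def verify_block(block: Block) -> bool:
    # A client can validate the block from the commitments alone; it never needs
    # the blob bytes, precisely because the EVM cannot read them.
    return all(len(c) == 32 for c in block.blob_commitments)

def check_availability(block: Block, blobs) -> bool:
    # A rollup node (or a sampling client) separately checks that the promised
    # data was actually published and matches the commitments.
    return [commit(b) for b in blobs] == block.blob_commitments

blobs = [b"rollup batch #1", b"rollup batch #2"]
block = Block(txs=["tx1", "tx2"], blob_commitments=[commit(b) for b in blobs])
assert verify_block(block)
assert check_availability(block, blobs)
```

The point of the split is visible in the code: `verify_block` and `check_availability` are independent checks performed by different actors.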

Over the past week, we’ve already observed a substantial decrease in layer 2 fees, in some cases by up to 50%. However, it’s important to note that fees may increase again as more users adopt blobs. Despite this, it represents a significant scalability improvement, and we anticipate the number of blobs supported by the Ethereum chain to continue increasing in the coming years.

Q: Is Dencun’s upgraded performance better than you expected?

A: I think it depends on what you mean by “better.” From a technological perspective, the upgrade went flawlessly. The network participation rate only dropped from 99% to 95%, which is better than any other fork we’ve had.

Interestingly, the level of usage is currently quite low. The target is about 3 blobs per block, but the average amount used is only one blob per block. This means blobs are currently incredibly cheap. If you want to publish a blob, you basically only have to pay Ethereum transaction fees. High Ethereum transaction fees might be one reason for this, but if the price of blobs approaches zero, they could be used for various purposes, like backing up an encrypted copy of your hard drive. There’s an infinite number of uses for a guaranteed database. Eventually, I believe usage will increase, but for now, it’s beneficial for present-day rollups that it’s very cheap. I’m looking forward to seeing usage go up over the next few months.
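
The pricing dynamic described here comes from the EIP-4844 blob fee market: blobs have an independent base fee that rises and falls exponentially with usage relative to the roughly-3-blob target. A minimal Python sketch using the `fake_exponential` helper and constants from the EIP (simplified; the per-block state transition is omitted):

```python
MIN_BLOB_BASE_FEE = 1                            # wei
BLOB_BASE_FEE_UPDATE_FRACTION = 3338477
GAS_PER_BLOB = 131072                            # 128 KiB per blob
TARGET_BLOB_GAS_PER_BLOCK = 3 * GAS_PER_BLOB     # the ~3-blob target mentioned above

def fake_exponential(factor: int, numerator: int, denominator: int) -> int:
    # Integer approximation of factor * e^(numerator / denominator),
    # as specified in EIP-4844.
    i = 1
    output = 0
    numerator_accum = factor * denominator
    while numerator_accum > 0:
        output += numerator_accum
        numerator_accum = (numerator_accum * numerator) // (denominator * i)
        i += 1
    return output // denominator

def blob_base_fee(excess_blob_gas: int) -> int:
    return fake_exponential(MIN_BLOB_BASE_FEE, excess_blob_gas,
                            BLOB_BASE_FEE_UPDATE_FRACTION)

# While usage stays below the target, excess blob gas trends to zero and the
# blob fee sits at its 1-wei floor -- which is why blobs are so cheap today.
assert blob_base_fee(0) == MIN_BLOB_BASE_FEE
```

Because the fee grows exponentially in the excess, sustained demand above the target would push blob prices up quickly; below the target, they decay back to the floor.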

Q: What do you envision as the single most transformative impact Ethereum will have on society in the next five years?

A: Yeah, I believe the next five years will be pivotal for Ethereum because many applications that were once theoretical or small-scale are now ready for real-world use. The impact of blockchain technology has already begun to permeate the broader world, often unnoticed. For example, Reddit’s forthcoming IPO includes opportunities for active community members to participate alongside institutional investors, thanks to ideas from the crypto space.

In terms of actual use, stablecoins have been one of the biggest impacts, facilitating savings, trading, and transactions. Over the next five years, Ethereum’s user experience and fee structures are set to improve, with developments like layer twos and projects like Base paving the way. As Ethereum becomes more user-friendly, it’s poised to lead in making stablecoins accessible and decentralized.

Beyond finance, non-financial applications are gaining traction, particularly in social media alternatives like Farcaster and Lens. Decentralization offers unique benefits, such as the ability for anyone to develop new clients or to read and write to the same content without starting from scratch. This vibrant ecosystem is expected to expand further.

Ethereum-based identity solutions are also growing rapidly, with advancements in proof of personhood protocols and social graph-based systems. These solutions aim to address the challenge of verifying human users in online platforms, mitigating the risk of centralized solutions that could exclude certain groups. Ethereum has the potential to lead in developing decentralized alternatives to this problem.

Overall, these developments highlight the importance of Ethereum’s evolution and its potential to continue growing in various sectors.

Q: How do you see the current challenges in proof of stake, and how could single-slot finality and other upgrades address them?

A: Yeah, I think the major challenges in proof of stake right now mainly revolve around various centralization risks. One significant concern is on the MEV (Maximal Extractable Value) side, as well as risks related to the function of staking and being a validator in and of itself. On the MEV side, we’re witnessing growing centralization and censorship risks, with relays emerging as another form of centralized actor. Techniques like execution tickets and inclusion lists aim to mitigate these risks, striving to maintain decentralization and fairness in block creation, while assigning specific centralized functions to builders.

Another challenge lies in staking itself. According to recent polls, the primary reasons for not staking are having less than 32 ETH and the perceived difficulty of running a node. However, there’s a technical roadmap in place to address these concerns. Verkle trees, for instance, have made significant progress, aiming to reduce the storage requirements for running a node from multiple terabytes to a more manageable level, potentially allowing nodes to operate in RAM. Additionally, with technologies like ZK-SNARKs, the computational requirements for running a node will decrease further, making node operation more accessible to a broader range of users.

The issue of having less than 32 ETH for staking is more complex. Originally, 32 ETH was chosen as a compromise between requiring a significant amount of ETH to stake and avoiding too many validators, which could lead to processing difficulties. New approaches to proof of stake involve relaxing the requirement for every staker to participate in every round of consensus, enabling benefits such as single-slot finality and the ability to stake with less than 32 ETH. Rainbow staking and BLS signatures are among the proposals being explored in this active research area.

Q: We have seen many projects proposing “modular blockchain” solutions, and we’re also seeing ideas that the Ethereum L1 could be responsible for shared sequencing. How do you see the broader question: what functions should an L1 handle centrally, and what should be left to individual L2s?

A: Modularization implies a future where individual chains handle fewer functions, with different components managed by various parts and shared sequencing becoming prominent, as advocated by Justin Drake, one of our researchers.

Currently, L1 Ethereum is responsible for shared security and settlement, ensuring that every L2 can interact without depending on centralized actors or validator sets. Ethereum provides data availability for rollups but not for validiums, and sequencing of transactions is determined on a rollup-by-rollup basis.

Regarding shared sequencing, opinions vary widely. While some advocate for its benefits, others argue it’s overrated. They contend that shared liquidity benefits are limited beyond a certain scale, and cross-L2 MEV isn’t as significant as perceived since it can be decomposed into MEV between different L2s.

Expanding Ethereum’s capacity to support more data directly is crucial. While the ideal scenario involves everything being a rollup, practical limitations necessitate exploring optimistic data constructions for off-chain data, with high-security functions remaining on-chain through rollups.

Account abstraction poses another challenge. Determining where account state resides and managing updates efficiently across multiple locations requires innovative solutions. One approach is a minimal keystore rollup, where state resides in a neutral rollup atop Ethereum, accessible to other L2s.
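
The keystore idea in the previous paragraph can be illustrated with a toy Python model (hypothetical names, and a simple boolean standing in for the real proof check): key state lives in one neutral place, and every L2 reads from it instead of duplicating it.

```python
class KeystoreRollup:
    """Toy model of a minimal keystore rollup: the single source of truth
    for each account's verification key."""
    def __init__(self):
        self.keys = {}

    def set_key(self, account: str, new_key: str, proof_ok: bool):
        # In a real design the update is authorized by a proof checked against
        # the current key; here a boolean argument stands in for that check.
        if not proof_ok:
            raise PermissionError("update not authorized by current key")
        self.keys[account] = new_key

class L2:
    def __init__(self, keystore: KeystoreRollup):
        self.keystore = keystore   # read-only reference, no duplicated state

    def verify(self, account: str, presented_key: str) -> bool:
        return self.keystore.keys.get(account) == presented_key

keystore = KeystoreRollup()
keystore.set_key("alice", "key-v1", proof_ok=True)
l2a, l2b = L2(keystore), L2(keystore)
keystore.set_key("alice", "key-v2", proof_ok=True)   # one update...
assert l2a.verify("alice", "key-v2")                 # ...seen by every L2
assert l2b.verify("alice", "key-v2")
```

The design choice this sketch highlights is that a key rotation happens once, in one place, rather than being replayed on every chain where the account holds assets.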

These ideas are still in early stages and subject to ongoing discussion in the Ethereum community. Pragmatically, functions critical enough to warrant L1 handling should remain there, while allowing flexibility for different L2s to handle diverse functions is also valuable.

Q: If you have a system that in theory gives you some ZK level of privacy, then, as a user, how do you know that you actually get that level of privacy in practice?

A: I see this as essentially a continuation of a problem Ethereum already faces: ensuring the security of assets held in smart contracts. If you’re entrusting your assets to a smart contract instead of an individual, how can you be certain there’s no backdoor allowing unauthorized access?

Current solutions include the ability to read contracts on Etherscan, where people can publish source code. While this is useful for sophisticated users, it’s impractical for regular users who can’t review complex code themselves. Wallets have begun addressing this by providing more warnings, such as alerts when interacting with less-known applications.

One improvement I envision is versioned DApp user interfaces hosted on IPFS, with each update requiring a blockchain transaction for authorization. This eliminates the risk of unauthorized updates via hacking into a server. Wallets could then display information about the site’s recent updates and approval status.
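
As a sketch of what such a registry might look like (hypothetical names; a real version would be a contract, with each IPFS content hash recorded by an authorized transaction), here a simple owner check stands in for on-chain authorization:

```python
class DappUIRegistry:
    """Toy on-chain registry: each UI update must be a transaction from the
    DApp's owner, so hacking the web server alone cannot push a new version."""
    def __init__(self):
        self.owners = {}     # dapp name -> owner address
        self.versions = {}   # dapp name -> list of IPFS content hashes

    def register(self, dapp: str, owner: str, ipfs_hash: str):
        self.owners[dapp] = owner
        self.versions[dapp] = [ipfs_hash]

    def publish_update(self, dapp: str, sender: str, ipfs_hash: str):
        if sender != self.owners[dapp]:
            raise PermissionError("UI update must be authorized by the owner on-chain")
        self.versions[dapp].append(ipfs_hash)

    def wallet_view(self, dapp: str) -> dict:
        # What a wallet could display: the current UI hash plus update history,
        # letting users notice a suspiciously recent change.
        history = self.versions[dapp]
        return {"current": history[-1], "total_updates": len(history) - 1}

registry = DappUIRegistry()
registry.register("exampledapp", "0xOwner", "QmVersion1")
registry.publish_update("exampledapp", "0xOwner", "QmVersion2")
assert registry.wallet_view("exampledapp")["current"] == "QmVersion2"
```

Because every version is content-addressed on IPFS and every update is an explicit authorized transaction, a wallet can show "this UI changed N days ago" instead of silently loading whatever a server happens to serve.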

We also need better mechanisms for aggregating the opinions of high-quality researchers and auditors. Wallets will likely play a crucial role in assisting users by aggregating this information. Similar tools used for Ethereum should be applied to ZK technologies, including publishing source code and verifying its authenticity.

For instance, tools like Etherscan could publish ZK source code and verify it on-chain. This approach ensures transparency and security for both on-chain and off-chain ZK technologies, such as Zupass.

These efforts extend the same principles used for Ethereum contracts to the realm of ZK technologies, ensuring a secure and transparent ecosystem for users.

Q: How do you propose Ethereum can address the pressing concern of quantum-enabled threats, and what implications does this have for the broader cryptocurrency ecosystem?

Will there be a standard set of criteria or warning signs, already anticipated by the Foundation, to alert the community that a quantum attack is underway?

A: One important realization, which I believe many people still overlook, is that from a technological standpoint, we have quantum-resistant algorithms for every aspect vulnerable to quantum computers. Quantum computers break existing elliptic curve signatures, but we have various quantum-resistant alternatives based on hash functions, lattices, and isogenies. For example, lattice-based and isogeny-based solutions address quantum attacks on elliptic curve-based encryption and stealth addresses. Moreover, advancements like STARKs, which recently saw a breakthrough in reducing their size, offer quantum resistance due to their reliance on hash functions.
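
To make the hash-function family concrete, here is a classic Lamport one-time signature in Python: its security rests only on the preimage resistance of SHA-256, which is why schemes in this family remain secure against known quantum attacks. This is an illustration of the general idea, not anything Ethereum-specific, and each key pair must sign exactly one message.

```python
import hashlib
import os

def H(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

def keygen():
    # 256 pairs of random secrets, one pair per bit of the message hash.
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def bits(msg: bytes):
    digest = H(msg)
    return [(digest[i // 8] >> (i % 8)) & 1 for i in range(256)]

def sign(sk, msg: bytes):
    # Reveal one secret per bit. Reusing the key for a second message
    # leaks more secrets, so each key signs ONE message only.
    return [sk[i][bit] for i, bit in enumerate(bits(msg))]

def verify(pk, msg: bytes, sig) -> bool:
    return all(H(sig[i]) == pk[i][bit] for i, bit in enumerate(bits(msg)))

sk, pk = keygen()
sig = sign(sk, b"hello")
assert verify(pk, b"hello", sig)
```

Practical hash-based schemes (e.g. the SPHINCS+/Winternitz family) build many-time signatures out of one-time constructions like this; the trade-off is larger signatures than today's BLS or ECDSA, which is exactly the engineering challenge mentioned below for the consensus layer.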

Fully homomorphic encryption, fortunately, is inherently quantum-resistant, as quantum computers do not affect lattice-based cryptography. While theoretically solved, there remain logistical challenges in transitioning from theory to practice. However, emergency recovery mechanisms exist to safeguard most funds, although efforts are underway to ensure complete quantum resistance for both users and protocols.

To achieve this, several steps are essential. Firstly, implementing account abstraction would allow users to choose quantum-resistant signature algorithms. Additionally, the Ethereum consensus layer needs quantum resistance, although engineering challenges arise due to the efficiency of current signature schemes like BLS. This underscores the importance of exploring alternatives like 8,192 signatures per slot to accommodate less efficient quantum-resistant algorithms while researchers work on optimizing and benchmarking post-quantum alternatives.

Q: In your opinion, what are the key benefits of integrating AI with cryptocurrency, and how might this reshape the industry?

In a February tweet, you discussed AI’s role in debugging code, which is crucial for Ethereum. Could you provide further thoughts on how AI could be used in crypto and Ethereum?

A: Yeah, I think many people have been curious about the intersection of AI and crypto for the past decade. It’s logical to inquire because at a thematic level, it seems fitting. AI and crypto are both significant technological trends of our era. There’s a notion that AI tends toward centralization while crypto leans toward decentralization, suggesting a complementary relationship between the two.

The key question has always been whether we can move from this thematic convergence to actual applications that effectively utilize both technologies. In a recent post from about two months ago, I attempted to analyze this question and pinpoint some concrete applications that make sense.

One example I discussed is AI participating in prediction markets or other markets built on Ethereum. This could enable markets to operate on a micro scale and involve AIs as participants. Another application involves integrating AI into wallets to assist users in navigating the online and on-chain environments.

Realistically, the first two application categories in that post appear to be the most immediate and straightforward, while the last two are more speculative. I don’t want to imply that AI crypto applications will be the next big narrative driving the industry forward. However, I do believe these intersections warrant exploration.

Another potential area is AI’s role in debugging code. One of the major challenges in the space is dealing with bugs. AI could potentially make it easier to use formal verification tools to prove that larger sets of code meet certain criteria. This could lead to the development of bug-free EVMs, enhancing the security of the space. AI might play a valuable role in achieving this.

Q: What is your opinion on the ETH restaking fever? (e.g., EigenLayer, which claims to provide the “trust of Ethereum” to actively validated services)

A: Restaking is an intriguing concept because it potentially offers a valuable method to unlock assets being used in staking and make them accessible to other applications. However, it also presents risks if not implemented correctly. One concern is the possibility of systemic risks affecting the Ethereum validator set. There’s a demand for restaking, and if this demand isn’t met in a decentralized manner, centralized actors might take over, which is undesirable.

Different projects are exploring various approaches to address these challenges. For now, I’m observing developments in this space and eagerly anticipating what emerges.

Q: Are you more active on Farcaster than on Twitter?

Do you think DeSocial applications like Farcaster can really compete with Web2 social apps, or even disrupt established social media patterns?

A: One fascinating aspect of the social space is its dual nature of network effects and anti-network effects. Platforms like Twitter attract a large user base, but they also harbor undesirable elements. I’ve noticed that Farcaster, with its growing user base, maintains a higher quality of engagement compared to platforms like Twitter. Decentralized social platforms can address issues of centralized moderation and privilege, promoting trust without relying on centralized entities to dictate content quality.

Farcaster is evolving and encountering challenges like spam, but it remains pragmatic and user-friendly, even for non-crypto users. I anticipate the development of alternate Farcaster clients, offering unique features and community-driven enhancements. This trend toward decentralization in social media holds promise for addressing long-standing issues of moderation and content curation.

Maintaining decentralization in the face of institutional adoption and power concentration requires concerted efforts. We must address public goods funding, promote community-wide standardization, and encourage diverse participation in the Ethereum ecosystem. Implementing real-world applications of crypto’s theoretical benefits and ensuring open standards for decentralized infrastructure are essential steps in preserving Ethereum’s ethos amidst regulatory challenges.

Q: Do you have any specific advice on how non-English writers can participate in the Public Goods ecosystem?


A: Non-English writers can contribute significantly by creating translations and localized versions of content related to Ethereum and other protocols. Active participation in community discussions, voting, and contributing to the aggregation of information are valuable ways to engage with the global community. Showing up, reaching out, and actively participating can help foster collaboration and knowledge-sharing among diverse participants. This advice applies to anyone looking to get involved, as active engagement is often welcomed and encouraged by community members.

Q: Why are you so interested in longevity? What will you do if you are immortal?

A: I just like living, you know, life is fine. What will I do if I’m immortal? Probably the same kind of stuff that I do today, just for longer.

Q: What has surprised you the most in terms of how crypto has evolved and played out? Like if you were to tell yourself 5 years ago about how things are now, what would surprise you the most? How do you think the crypto world/industry will look like another 5 years from now?

A: Yeah, in 2020, NFTs really surprised me, as well as the resurgence of meme coins. On the positive side, the rapid advancements in ZK technology have been particularly surprising, as such progress in software development is rare. Additionally, the increasing speed of layer 1 protocol changes, such as the merge and EIPs, has been impressive.

Q: Could you share one piece of advice you would like to give to Taiwanese developers who are new to blockchain?

A: I believe the most crucial aspect is finding motivation to actively engage and remain involved in the community. This could involve attending in-person events, undertaking a specific project, or even starting as a writer, as I did, which encourages continuous learning and involvement. The key is to find a way to become an integral part of the community rather than drifting away after a short period of time.
