Web3 Has a Memory Problem — And We Finally Have a Fix

Web3 has a memory problem. Not in the “we forgot something” sense, but in the core architectural sense. It doesn’t have a real memory layer.

Blockchains today don’t look completely alien compared to traditional computers, but a core foundational aspect of legacy computing is still missing: a memory layer built for decentralization that can support the next iteration of the internet.

Muriel Médard is a speaker at Consensus 2025, May 14-16.

After World War II, John von Neumann laid out the architecture for modern computers. Every computer needs input and output, a CPU for control and arithmetic, and memory to store the latest version of its data, along with a “bus” to retrieve and update that data in memory. This architecture, built around random-access memory (RAM), has been the foundation of computing for decades.

At its core, Web3 is a decentralized computer — a “world computer.” At the higher layers, it’s fairly recognizable: operating systems (EVM, SVM) running on thousands of decentralized nodes, powering decentralized applications and protocols.

But when you dig deeper, something’s missing. The memory layer essential for storing, accessing and updating short-term and long-term data doesn’t look like the memory bus or memory unit von Neumann envisioned.

Instead, it’s a mashup of different best-effort approaches to achieve this purpose, and the results are overall messy, inefficient and hard to navigate.

Here’s the problem: if we’re going to build a world computer that’s fundamentally different from the von Neumann model, there had better be a really good reason to do so. As of right now, Web3’s memory layer isn’t just different, it’s convoluted and inefficient. Transactions are slow. Storage is sluggish and costly. Scaling for mass adoption with the current approach is nigh impossible. And that’s not what decentralization was supposed to be about.

But there is another way.

A lot of people in this space are trying their best to work around this limitation, and we’re at a point now where the current workarounds just cannot keep up. This is where algebraic coding, which uses equations to represent data for efficiency, resilience and flexibility, comes in.

The core problem is this: how do we implement decentralized coding for Web3?

A new memory infrastructure

This is why I took the leap from academia, where I held the role of MIT NEC Chair and Professor of Software Science and Engineering, to dedicate myself and a team of experts to advancing high-performance memory for Web3.

I saw something bigger: the potential to redefine how we think about computing in a decentralized world.

My team at Optimum is creating decentralized memory that works like a dedicated computer. Our approach is powered by Random Linear Network Coding (RLNC), a technology developed in my MIT lab over nearly two decades. It’s a proven data coding method that maximizes throughput and resilience in high-reliability networks, from industrial systems to the internet.

Data coding is the process of converting information from one format to another for efficient storage, transmission or processing. Data coding has been around for decades and there are many iterations of it in use in networks today. RLNC is the modern approach to data coding built specifically for decentralized computing. This scheme transforms data into packets for transmission across a network of nodes, ensuring high speed and efficiency.
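To make the idea concrete, here is a minimal, illustrative sketch of random linear coding over a small prime field. This is not Optimum’s implementation: a sender emits random linear combinations of the original packets, and any receiver that collects enough independent combinations recovers the originals by Gaussian elimination.

```python
import random

P = 257  # small prime field for illustration; real systems typically use GF(2^8)

def encode(packets):
    """Emit one coded packet: a random linear combination of the originals."""
    coeffs = [random.randrange(P) for _ in packets]
    payload = [sum(c * pkt[j] for c, pkt in zip(coeffs, packets)) % P
               for j in range(len(packets[0]))]
    return coeffs, payload

def decode(coded, k):
    """Recover the k originals by Gaussian elimination over GF(P)."""
    rows = [list(c) + list(v) for c, v in coded]
    for col in range(k):
        pivot = next(r for r in range(col, len(rows)) if rows[r][col])
        rows[col], rows[pivot] = rows[pivot], rows[col]
        inv = pow(rows[col][col], P - 2, P)  # modular inverse via Fermat
        rows[col] = [x * inv % P for x in rows[col]]
        for r in range(len(rows)):
            if r != col and rows[r][col]:
                f = rows[r][col]
                rows[r] = [(a - f * b) % P for a, b in zip(rows[r], rows[col])]
    return [row[k:] for row in rows[:k]]

# Three original packets; roughly any 3 independent coded packets suffice
# to decode, no matter which ones arrive. That is the property gossip exploits.
originals = [[1, 2, 3, 4], [5, 6, 7, 8], [9, 10, 11, 12]]
coded = [encode(originals) for _ in range(5)]  # a little headroom for luck
print(decode(coded, 3) == originals)
```

The point of the sketch: the receiver never has to ask for specific packets, because almost any mix of coded packets carries fresh information.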

With multiple engineering awards from top global institutions, more than 80 patents and numerous real-world deployments, RLNC is no longer just a theory. It has garnered significant recognition, including the 2009 IEEE Communications Society and Information Theory Society Joint Paper Award for the paper “A Random Linear Network Coding Approach to Multicast,” and the 2022 IEEE Koji Kobayashi Computers and Communications Award.

RLNC is now ready for decentralized systems, enabling faster data propagation, efficient storage, and real-time access, making it a key solution for Web3’s scalability and efficiency challenges.

Why this matters

Let’s take a step back. Why does all of this matter? Because we need memory for the world computer that’s not just decentralized but also efficient, scalable and reliable.

Currently, blockchains rely on best-effort, ad hoc solutions that achieve partially what memory in high-performance computing does. What they lack is a unified memory layer that encompasses both the memory bus for data propagation and the RAM for data storage and access.

The bus part of the computer should not become the bottleneck, as it does now. Let me explain.

“Gossip” is the common method for data propagation in blockchain networks. It is a peer-to-peer communication protocol in which nodes exchange information with random peers to spread data across the network. In its current implementation, it struggles at scale.

Imagine you need 10 pieces of information from neighbors who repeat what they’ve heard. As you speak to them, at first you get new information. But as you approach nine out of 10, the chance of hearing something new from a neighbor drops, making the final piece of information the hardest to get. Chances are 90% that the next thing you hear is something you already know.

This is how blockchain gossip works today — efficient early on, but redundant and slow when trying to complete the information sharing. You would have to be extremely lucky to get something new every time.
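The neighbor story is the classic coupon-collector effect, and a quick simulation (illustrative only) makes the redundancy concrete: collecting all 10 pieces via uniform random draws takes about 29 draws on average, not 10.

```python
import random

def rounds_to_collect(n, trials=10_000):
    """Average number of uniform random draws needed to see all n items."""
    total = 0
    for _ in range(trials):
        seen, draws = set(), 0
        while len(seen) < n:
            seen.add(random.randrange(n))
            draws += 1
        total += draws
    return total / trials

# Theory predicts n * H(n) draws; for n = 10 that is about 29.3,
# so roughly two-thirds of what you hear is something you already know.
print(round(rounds_to_collect(10), 1))
```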

With RLNC, we get around the core scalability issue in current gossip. RLNC works as though you managed to get extremely lucky, so every time you hear info, it just happens to be info that is new to you. That means much greater throughput and much lower latency. This RLNC-powered gossip is our first product, which validators can implement through a simple API call to optimize data propagation for their nodes.

Let us now examine the memory part. It helps to think of memory as dynamic storage, like RAM in a computer or, for that matter, our closet. Decentralized RAM should mimic a closet; it should be structured, reliable, and consistent. A piece of data is either there or not, no half-bits, no missing sleeves. That’s atomicity. Items stay in the order they were placed — you might see an older version, but never a wrong one. That’s consistency. And, unless moved, everything stays put; data doesn’t disappear. That’s durability.

Instead of the closet, what do we have? Mempools are not something we keep around in computers, so why do we do that in Web3? The main reason is that there is not a proper memory layer. If we think of data management in blockchains as managing clothes in our closet, a mempool is like having a pile of laundry on the floor, where you are not sure what is in there and you need to rummage.

Current delays in transaction processing can be extremely high for any single chain. On Ethereum, for example, it takes two epochs, or about 12.8 minutes, to finalize a transaction. Without decentralized RAM, Web3 relies on mempools, where transactions sit until they’re processed, resulting in delays, congestion and unpredictability.
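For reference, the 12.8-minute figure follows directly from Ethereum mainnet parameters: 12-second slots, 32 slots per epoch, and finality after roughly two full epochs.

```python
SLOT_SECONDS = 12        # Ethereum mainnet slot time
SLOTS_PER_EPOCH = 32
EPOCHS_TO_FINALITY = 2   # a block finalizes after roughly two full epochs

delay_seconds = SLOT_SECONDS * SLOTS_PER_EPOCH * EPOCHS_TO_FINALITY
print(delay_seconds, "seconds =", delay_seconds / 60, "minutes")
```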

Full nodes store everything, bloating the system and making retrieval complex and costly. In a computer, RAM keeps what is currently needed, while less-used data moves to cold storage, perhaps in the cloud or on disk. Full nodes are like a closet holding every piece of clothing you have ever worn, from infancy until now.

This is not something we do on our computers, but full nodes persist in Web3 because storage and read/write access aren’t optimized. With RLNC, we create decentralized RAM (deRAM) for timely, updateable state in a way that is economical, resilient and scalable.

DeRAM and data propagation powered by RLNC can solve Web3’s biggest bottlenecks by making memory faster, more efficient, and more scalable. It optimizes data propagation, reduces storage bloat, and enables real-time access without compromising decentralization. It’s long been a key missing piece in the world computer, but not for long.


Illinois to Drop Staking Lawsuit Against Coinbase

Illinois will soon drop its staking lawsuit against Coinbase, joining three other U.S. states that have recently backed down from litigation against the exchange.

A spokesperson for Illinois Secretary of State Alexi Giannoulias told CoinDesk on Thursday that the office “intends to drop the Coinbase lawsuit.” The spokesperson did not reply when asked when the case may be dropped.

Illinois was one of 10 U.S. states that brought charges against Coinbase in 2023 for allegedly violating state securities laws through its staking program. The U.S. Securities and Exchange Commission (SEC) also charged Coinbase with violating federal securities laws for its staking product, but dropped that suit in February. Since the SEC’s retreat, state securities regulators in Kentucky, Vermont and South Carolina have also abandoned their own cases against the exchange.

The remaining states with staking-related suits against Coinbase include Alabama, California, Maryland, New Jersey, Washington and Wisconsin. Spokespeople for California, Maryland, and Wisconsin declined to comment on pending litigation.

A representative for the New Jersey Bureau of Securities told CoinDesk the “Coinbase matter remains open,” and Bill Beatty, securities administrator for the Washington Department of Financial Institutions said the state’s “case with Coinbase remains ongoing at this time.”

The Alabama Securities Commission did not return CoinDesk’s request for comment.


Dogecoin Volatility Surge: From Stability to Dramatic Decline

Recent Price Action Shows Signs of Recovery

In the last 100 minutes of trading, DOGE has demonstrated a notable recovery pattern, climbing from a local bottom of $0.156 to stabilize around $0.158.

The price action shows an apparent V-shaped recovery with significant volume spikes (16-21 million) during the bottoming process around 14:50-14:52, indicating strong buyer interest at support levels.

The $0.158-$0.159 zone has emerged as immediate potential resistance, with multiple tests showing decreasing selling pressure. This recovery aligns with the 38.2% Fibonacci retracement level from the recent decline, suggesting potential continuation toward the 50% retracement at $0.160 if current momentum persists.

Dogecoin Technical Indicators

Price Range: DOGE traded between $0.156 and $0.179, representing a 12.7% swing.

Volatility: 48-hour annualized volatility reached 86.3%, significantly above market norms.

Support/Resistance: Breakdown of $0.165 support level with new critical support zone at $0.158–$0.160.

Fibonacci Levels: Potential stabilization at the 61.8% retracement level ($0.162).

Volume Analysis: High-volume selling pressure followed by significant volume spikes (16–21 million) during recovery.

Recovery Pattern: V-shaped recovery from $0.156 to $0.158 with decreasing selling pressure at resistance.

Retracement Levels: Current price action aligns with 38.2% Fibonacci retracement with the potential move toward a 50% level at $0.160.
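For readers unfamiliar with the indicator, Fibonacci retracement levels are fixed fractions of a swing range measured back from the extreme. A minimal sketch using the quoted $0.156–$0.179 range (the article’s exact anchor points may differ, so these computed levels are illustrative, not a restatement of the levels above):

```python
def fib_retracements(high, low):
    """Standard retracement levels, measured up from the swing low."""
    span = high - low
    return {pct: round(low + pct * span, 4)
            for pct in (0.236, 0.382, 0.5, 0.618)}

levels = fib_retracements(0.179, 0.156)
print(levels)  # 38.2% -> 0.1648, 50% -> 0.1675, 61.8% -> 0.1702
```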

Disclaimer: This article was generated with AI tools and reviewed by our editorial team to ensure accuracy and adherence to our standards. For more information, see CoinDesk’s full AI Policy. This article may include information from external sources, which are listed below when applicable.

External References:

Times Tabloid, “Dogecoin (DOGE) Next Significant Rally? 7 Critical Levels to Watch,” accessed Apr. 3, 2025

Bitzo, “Market Weakness Strikes: Are DOGE, SHIB Set to Recover in April?” accessed Apr. 3, 2025

Times Tabloid, “Dogecoin (DOGE) at a Critical Turning Point as Key Levels Dictate Its Next Move,” accessed Apr. 3, 2025

Coinpedia, “Will Dogecoin (DOGE) Crash or Skyrocket?,” accessed Apr. 3, 2025

Finbold, “Anxiety Grips Dogecoin Holders as Major Sentiment Flips Into Bear Territory,” accessed Apr. 3, 2025


Luxor’s Aaron Forster on Bitcoin Mining’s Growing Sophistication

Luxor Technology wants to make bitcoin mining easier. That’s why the firm has rolled out a panoply of products (mining pools, hashrate derivatives, data analytics, ASIC brokerage) to help bitcoin miners, large and small, develop their operations.

Aaron Forster, the company’s director of business development, joined in October 2021, and has seen the team grow from roughly 15 to 85 people in the span of three and a half years.

Forster worked for a decade in the Canadian energy sector before coming to bitcoin mining, which is one of the reasons he’ll be speaking about the future of mining in Canada and the U.S. at the BTC & Mining Summit at Consensus this year, May 14-15.

In the lead-up to the event, Forster shared with CoinDesk his thoughts on bitcoin miners turning to artificial intelligence, the growing sophistication of the mining industry, and how Luxor’s products enable miners to hedge various forms of risk.

This interview has been condensed and edited for clarity.

Mining pools allow miners to combine their computational resources to have higher chances of receiving bitcoin block rewards. Can you explain to us how Luxor’s mining pools work?

Aaron Forster: Mining pools are basically aggregators that reduce the variance of solo mining. When you look at solo mining, it’s very lottery-esque, meaning that you could be plugging your machines in and you might hit block rewards tomorrow — or you might hit it 100 years from now. But you’re still paying for energy during that time. At a small scale, that’s not a big deal, but it becomes one as you scale up and build a business around it.

The most common kind of mining pool is PPLNS, which means Pay-Per-Last-N-Shares. Basically, that means the miner does not get paid unless the mining pool finds a block. That still comes down to luck variance, so it’s not so different from the solo miner’s situation. And that creates revenue volatility for large industrial miners.

So we’re seeing the emergence of what we call Full-Pay-Per-Share, or FPPS, and that’s what Luxor is operating for our bitcoin pool. With FPPS, regardless of whether we find a block or not, we’re still paying our miners their revenue based on the number of shares they’ve submitted to the pool. That gives revenue certainty to miners, assuming hashprice stays the same. We’ve effectively become an insurance provider.
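A toy comparison of the two payout schemes described above (simplified: real pools also deduct pool fees and handle transaction fees, and all figures here are hypothetical):

```python
def fpps_payout(miner_shares, pool_shares, block_reward, blocks_expected):
    """FPPS: every share is paid its expected value, block found or not."""
    return miner_shares / pool_shares * block_reward * blocks_expected

def pplns_payout(miner_shares, pool_shares, block_reward, blocks_found):
    """PPLNS: miners are paid only for blocks the pool actually finds."""
    return miner_shares / pool_shares * block_reward * blocks_found

# A miner with 1% of pool shares; 3.125 BTC block subsidy (post-2024 halving).
# Over a stretch where 10 blocks are expected but only 7 are found:
print(fpps_payout(1, 100, 3.125, blocks_expected=10))  # steady 0.3125 BTC
print(pplns_payout(1, 100, 3.125, blocks_found=7))     # unlucky: 0.21875 BTC
```

The gap between the two numbers is exactly the variance the pool absorbs onto its own balance sheet under FPPS.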

The problem is that you need a very deep and strong balance sheet to support that model, because while we’ve reduced the variance for miners, that risk is now put on us. So we need to plan for that. But it can be calculated over a long enough period of time. We have different partners in that regard, so that we don’t bear the full risk from our balance sheet.

Tell me about your ASIC brokerage business.

We’ve become one of the leading hardware suppliers on the secondary market. Primarily within North America, but we’ve shipped to 35+ countries. We deal with everybody from public companies to private companies, institutions to retail.

We’re primarily a broker, meaning we match buyer and seller, mostly on the secondary market. Sometimes we do interact with ASIC manufacturers, and in certain cases we do take principal positions, meaning we use money from our balance sheet to purchase ASICs and then resell them on the secondary market. But the majority of our volume comes from matching buyers and sellers.

Luxor also launched the first hashrate futures contracts.

We’re trying to push the Bitcoin mining space forward. We’re a hashrate marketplace, depending on how you look at our mining pools, and we wanted to take a big leap and take hashrate to the TradFi world.

We wanted to create a tool that allows investors to take a position on hashprice without effectively owning mining equipment. Hashprice is, you know, the hourly or daily revenue that miners get, and that fluctuates a lot. For some people it’s about hedging, for others it’s speculation. We’re creating a tool for miners to sell their hashrate forward and use it as a basic collateral or a way to finance growth.

We said, ‘Let’s allow miners to basically sell forward hashrate, receive bitcoin upfront, and then they can take that and do whatever they need to do with it, whether it’s purchase ASICs or expand their mining operations.’ It’s basically the collateralization of hashrate. So they’re obligated to send us X amount of hashrate per month for the length of the contract. Before that, they’ll receive a certain amount of bitcoin upfront.
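The mechanics described here reduce to simple arithmetic. All figures below are hypothetical, and Luxor’s actual contract terms are not stated in this interview:

```python
def upfront_btc(hashrate_ph, months, hashprice_btc_per_ph_month, discount):
    """BTC advanced to a miner who sells hashrate forward at a discount."""
    notional = hashrate_ph * months * hashprice_btc_per_ph_month
    return notional * (1 - discount)

# Hypothetical: 10 PH/s sold forward for 6 months at 0.05 BTC per PH/s-month,
# with the buyer taking a 15% discount (the buyer's implied yield).
print(round(upfront_btc(10, 6, 0.05, 0.15), 3))  # 2.55 BTC upfront
```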

There’s a market imbalance between buyers and sellers. We have a lot of buyers, meaning people and institutions wanting to earn yield on their bitcoin. What you’re lending your bitcoin at is effectively your interest rate. However, you could also look at it like you’re purchasing that hashrate at a discount. That’s important for institutions or folks that don’t want physical exposure to bitcoin mining, but want exposure to hash price or hashrate. They can do that synthetically through purchasing bitcoin and putting it into our market, effectively lending that out, earning a yield, and purchasing that hashrate at a discount.

What do you find most exciting about bitcoin mining at the moment?

The acceptance and natural progression of our industry into other markets. We can’t ignore the AI HPC transition. Instead of building these mega mines that are just massive buildings with power-dense bitcoin mining operations, you’re starting to see large miners turning into power infrastructure providers for artificial intelligence.

Using bitcoin mining as a stepping stone to a larger, more capital-intensive industry like AI is exciting to me, because it gives us a bit more acceptance, since we’re coming at it from a completely different angle. I think the biggest example is the Core Scientific-CoreWeave deal structure and how they’ve merged those two businesses together. They’re complementary to each other. And that’s really exciting.

When you look at our own product roadmap, we have no choice but to follow a similar path to bitcoin miners. A lot of the products we built for the mining industry are analogous to what is needed, at a different level, for AI. Mind you, it’s a lot simpler in our industry than in AI. This is our first step into the HPC space, and it’s still very early days there.
