On Bitcoin, the Mandela Effect, and Simulation Theory

I don't expect you to believe this. But I do expect you to notice the timeline: Bitcoin mining and the Mandela Effect didn't just emerge around the same time—they scaled together. Perfectly. The correlation demands an explanation, even if that explanation sounds insane.

Ok folks, this one is admittedly out there. I'm asking you to take off your skeptic hat for a few minutes as I make the case. What do the Mandela effect, Bitcoin, and simulation theory have to do with each other? Read on to find out...

Let's start with the Mandela effect. We all know what it is...people vividly remember something that never happened or happened differently than recorded history suggests.

My personal favorite is "The Berenstain Bears". I must have read that book out loud thirty times with the kids. I can't comprehend how I would have mistaken "Berenstein" for "Berenstain" when the name is repeated in the book countless times.

Image source: Wikipedia

And there are countless other examples.

People say that the Mandela effect is an example of reality glitching or proof of time manipulation. There's a well-known conspiracy theory that links the operation of the CERN Large Hadron Collider (LHC) to the origin of the phenomenon.

CMS Detector for LHC (Image source: Wikipedia)

But I'd like to present an alternative theory as the origin story:

What if I told you that the moment humanity began mining Bitcoin at scale was the same moment that we started to misremember reality?

I'm going to walk you through what I believe is an interesting correlation and one that might suggest that we've found a way to stress-test our reality.


The term "Mandela effect" was coined by paranormal researcher Fiona Broome in 2009, after she discovered that she wasn't alone in her false memory of Nelson Mandela dying in prison during the 1980s.

The phenomenon went viral when related memes caused a surge of interest in 2016. You can see this in the Google Trends chart.

The Wikipedia page for the Mandela effect redirects to "False Memory" as the mainstream scientific explanation for the phenomenon. And it's true...memory is fallible. However, if you're like me, there might be specific examples tied to memories with personal significance. Those are way too difficult to dismiss as simple misremembering.

Now let's talk about Bitcoin...

The Bitcoin network launched on January 3, 2009—the exact same year the Mandela Effect entered the public zeitgeist. In the beginning, mining was a hobbyist activity: a few techies running code on their personal computers, generating a trivial amount of computational work.

But things changed quickly...

In 2010, Bitcoin was featured on Slashdot, a website popular among tech enthusiasts. Within days, the number of miners exploded. Mining difficulty quadrupled on July 16, 2010. By October 2010, GPU mining code was released to the public, allowing video cards (a much better fit than CPUs) to join the mining operation. The computational load increased exponentially.

Then in 2017, there was a surge of interest that was driven by several factors, including increased media coverage, the entry of institutional investors, and the launch of Bitcoin futures markets, which legitimized the cryptocurrency in the eyes of many.

Check out the Google Trends chart for Bitcoin. Look familiar?

This is where things get interesting. The two phenomena didn't just emerge at the same time; they scaled together. As Bitcoin mining difficulty ramped up, so did reports of collective false memories. The correlation is interesting enough to warrant at least some consideration.

Here's the core hypothesis: Could Bitcoin mining be creating observable stress on the fabric of our simulated reality, as evidenced by the rise of the Mandela effect?

In my previous article on simulation theory, I argued that if we exist within a simulation, we wouldn't be able to "break out" of it in any real sense. Our entire existence, including our ability to perceive and think, would just be a construct within the simulation's framework.

I suggested that perhaps the best we could do instead is collectively send a "signal" to our simulation overlords that we were on to them: something anomalous, a detectable pattern that would disrupt the simulation's normal operating parameters.

But what if we're already creating detectable anomalies simply by conducting massive, distributed computational work?

Could Bitcoin mining be creating observable stress on the fabric of our simulated reality?

Think about what Bitcoin mining actually does. It's essentially brute-force guessing. Miners perform trillions of SHA-256 hash calculations per second, searching for a hash value that falls below a network-defined target (like a lottery you enter by burning compute). It's deliberately computationally intensive, and it serves no "productive" purpose from a physics standpoint. Bitcoin mining doesn't advance scientific understanding the way modeling weather systems or simulating biological processes does. It's just computational overhead.
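To make that concrete, here's a minimal sketch of the guessing loop in Python. The header bytes, the 8-byte nonce encoding, and the very easy difficulty target are all toy stand-ins for the real block format; actual miners do this on specialized ASIC hardware, not in a scripting language.

```python
import hashlib

def mine(header: bytes, difficulty_bits: int, max_nonce: int = 10_000_000):
    """Brute-force search for a nonce whose double-SHA-256 hash falls below a target."""
    target = 2 ** (256 - difficulty_bits)  # smaller target = harder puzzle
    for nonce in range(max_nonce):
        candidate = header + nonce.to_bytes(8, "little")
        digest = hashlib.sha256(hashlib.sha256(candidate).digest()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce, digest.hex()  # found a "winning ticket"
    return None  # no valid nonce in this range; try a new header

# Toy example: a made-up header and a target that's hit roughly once per million guesses
print(mine(b"example block header", difficulty_bits=20))
```

At the real network's difficulty, finding a valid nonce takes on the order of 10²³-10²⁴ guesses network-wide per block (using the hash-rate figures below and the roughly ten-minute block interval), which is why the loop above only terminates quickly with a toy target.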

From the simulation's perspective, this could be considered digital make-work: massive amounts of processing power spent on operations that don't contribute to the "plot" of the simulation (if there is one).

Consider this: at the time of this writing, Bitcoin miners collectively produce hashes at a rate of around 1,020-1,440 exahashes per second. That's over 1,000 quintillion calculations per second. The network consumes approximately 170-175 terawatt-hours of electricity annually—about 0.8% of global power consumption. An estimated 5-6 million mining machines operate across more than 6,000 geographical locations in 139 countries, running 24/7.
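For scale, here's the back-of-the-envelope conversion, assuming the mid-range of the hash-rate figure above (one exahash is 10¹⁸ hashes, i.e. one quintillion):

```python
# Quick scale check using the figures quoted above (assumed, not measured here)
hash_rate_eh_per_s = 1_200                          # mid-range of the 1,020-1,440 EH/s estimate
hashes_per_second = hash_rate_eh_per_s * 10**18     # 1 exahash = 10^18 hashes = 1 quintillion

seconds_per_year = 365 * 24 * 3600
hashes_per_year = hashes_per_second * seconds_per_year

print(f"{hashes_per_second:.2e} hashes/second")     # ~1.2e21, i.e. ~1,200 quintillion per second
print(f"{hashes_per_year:.2e} hashes/year")         # ~3.8e28 guesses every year
```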

That's an extraordinary amount of computational work happening continuously within the simulation.*

Bitcoin mining facility (Image source: Wikipedia)

Now the counter argument of course is that if reality were a simulation, the computational power needed to "render" the observable universe would be incomprehensibly vast—orders of magnitude greater than all human-made computing, including Bitcoin mining.

Bitcoin workloads would be infinitesimal compared to the compute demands required for simulating all atomic and quantum interactions in a universe-sized model.

But consider this: it's not about overwhelming the simulation's total capacity. It's about introducing an unexpected type of load the simulation wasn't designed to handle.

Here's an analogy: your computer can run a complex video game without a problem (a large computational workload), but it might freeze up when you open 50 browser tabs simultaneously (a relatively tiny additional load). The game engine is optimized for rendering graphics and physics. Dozens of random browser tabs competing for memory? That's outside the optimization parameters.

Bitcoin mining might be infinitesimal compared to simulating reality, but wouldn't it also be completely pointless from the simulation's perspective? Would the simulation's creators have anticipated millions of machines performing meaningless cryptographic puzzles that don't advance any purpose? It's not expected load. And when systems encounter unexpected loads...they optimize.

Think about how video games handle resource constraints. When your GPU is stressed, games reduce texture quality, simplify physics calculations, use lower-polygon models in the distance, or skip rendering frames entirely. The experience degrades, but the game keeps running.

Glitched

Has Bitcoin mining triggered optimization routines that had unintended side effects that we can detect? What if memory consistency is one of the areas where our simulation optimizes under load?

Consider the pattern of Mandela Effects:

  • Brand names and spellings (Berenstain vs Berenstein, Chick-fil-A)
  • Minor visual details (Pikachu's tail, the Fruit of the Loom cornucopia)
  • Pop culture specifics (Monopoly man's monocle, movie quotes)

These are exactly the kinds of details a stressed simulation might not render perfectly if it's using procedural generation or template-based memory creation routines instead of storing every detail explicitly for every simulated consciousness.

It is a disturbing thought that the simulation might not be storing unique, perfect memories for each conscious being, just to save resources. Memories might instead be generated from templates when they're "recalled", similar to how video games render scenery only when it's needed.

This works fine under normal conditions. (The templates are detailed enough that we don't notice.) But when the system is under heavy computational load (Bitcoin), the memory generation might use simplified or downscaled templates. Templates with errors.

And if millions of simulated consciousnesses are all drawing from the same faulty template, they'll all "remember" the same incorrect details.


That last point touches on one of the most puzzling aspects of the Mandela Effect: how specific the false memories are. Huge numbers of people independently report the exact same incorrect details. That's weird.

Millions of people (like me) remember the Berenstein Bears, not Berenstain. Millions remember "Luke, I am your father," not the actual line. Millions remember the Monopoly man having a monocle.

We'd expect some variation if these were simply individual memory errors. But instead, we see consistency in the errors themselves.

This pattern fits particularly well with the hypothesis that these are rendering glitches in a stressed simulation. When multiple players in an online game experience lag because the servers are overloaded, they often see the same graphical glitches: the same characters frozen in the same poses, the same texture pop-ins, the same physics violations. The glitch is consistent because it originates from a shared computational bottleneck.

Could the Mandela Effect represent the same phenomenon? Millions of simulated consciousnesses all accessing faulty memory templates because the simulation is under computational stress?

Here's another angle: What if the Mandela Effect doesn't just represent glitched memory generation under stress? What if it represents something more like an active cleanup process?

Imagine the simulation as a program running on limited hardware. When a program runs low on resources, it runs garbage collection: a routine that clears out old, unnecessary data to free up memory.

What if the simulation is actively rewriting collective memories to reduce its data or memory footprint?

I call this the "compression" theory.

Every event that happens in the simulation generates data. Every observation, every memory, every recorded fact. If the simulation needs to free up processing power due to those Bitcoin hash calculations, it might decide to "compress" certain historical data. It might do this by simplifying records, merging similar memories, or discarding details that seem unimportant.

But imagine if the process isn't perfect. Some simulated consciousnesses might retain fragments of the "pre-compression" memories. They might remember the old details. Specifically, the details from before the optimization pass rewrote that particular aspect of history.

This would explain why Mandela Effects feel like memories of a "different timeline." They might actually be memories from before the simulation rewrote that particular detail to save processing power.

The compression algorithm changes "Berenstein" to "Berenstain" across most records and most memories, but it misses some copies. Those people retain the original spelling and can't understand why the current timeline shows something different.

Fragmented Memories

Can this hypothesis be tested?

This website is all about testing weird theories and actually trying things out. We should be able to observe specific patterns if there's a genuine correlation between Bitcoin mining and reality glitches.

I believe there are three dimensions that can be examined for patterns: temporal (time-based) clustering, geographic clustering, and computational correlation.

Here are the observations to look for:

Temporal Clustering: Do Mandela Effect reports spike following major increases in mining difficulty? When Bitcoin undergoes halving events (reducing mining rewards and typically intensifying mining activity), do we see corresponding increases in reports of memory inconsistencies?

Geographic Correlation: Do regions with heavy mining operations report more reality anomalies? If computational stress is localized, we might expect higher rates of Mandela Effect experiences near major mining facilities.

Computational Correlation: Do other massive computational events show similar patterns? When CERN fires up the Large Hadron Collider for major experiments, do anomaly reports increase? (Maybe that initial conspiracy theory I mentioned has some merit after all.) What about during intensive AI training runs on massive server farms? This could be another interesting angle given the rise of GenAI (see the footnote at the end of this article.)

These aspects would be the underpinnings of a good study (a rough sketch of such an analysis follows the list):

  • Bitcoin mining difficulty over time
  • Mandela Effect reports (via forums, social media mentions, search trends)
  • Other major computational events
  • Geographic distribution of both mining and reports
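Here's a minimal sketch of the temporal-clustering check in Python. The file names, column names, and data sources are placeholders (mining difficulty exported from a blockchain explorer, Mandela Effect interest exported from Google Trends); it illustrates the approach, not a finished study.

```python
import pandas as pd

# Hypothetical inputs:
# difficulty.csv  (columns: date, difficulty)       - e.g. a blockchain explorer export
# mandela.csv     (columns: date, search_interest)  - e.g. a Google Trends export
difficulty = pd.read_csv("difficulty.csv", parse_dates=["date"])
mandela = pd.read_csv("mandela.csv", parse_dates=["date"])

# Put both series on a monthly grid and compare month-over-month changes
diff_change = (
    difficulty.set_index("date")["difficulty"]
    .resample("MS").mean()
    .pct_change()
    .rename("difficulty_change")
)
interest_change = (
    mandela.set_index("date")["search_interest"]
    .resample("MS").mean()
    .pct_change()
    .rename("interest_change")
)
merged = pd.concat([diff_change, interest_change], axis=1).dropna()

# Lagged correlation: does a jump in mining difficulty precede a spike in Mandela Effect interest?
for lag_months in range(7):
    corr = merged["difficulty_change"].corr(merged["interest_change"].shift(-lag_months))
    print(f"difficulty leads by {lag_months} month(s): correlation = {corr:.2f}")
```

Geographic clustering could be checked the same way by grouping anomaly reports and mining capacity by country or region and comparing the two distributions.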

Imagine if the correlation holds. If reality glitches consistently follow computational stress, we would have compelling evidence that we've found a way to observe the simulation's resource limitations.

Yes, admittedly this theory is out there. But at the same time, the simulation might be showing us its seams. The real question is: what do we do with that knowledge?


*Note: Some readers might be wondering why I didn't include the computational demands associated with GenAI as part of this discussion. The reality is that at the time of this writing (October 2025), Bitcoin still requires vastly more raw computational operations globally than generative AI. The analysis is complex as the technologies perform fundamentally different types of computation that aren't directly comparable. Bitcoin mining performs SHA-256 hashing operations using specialized integer arithmetic, while AI performs floating-point matrix operations (FLOPS). These are completely different computational tasks that use different parts of computer hardware. In terms of raw operation count, Bitcoin's ~1.9 × 10²⁴ specialized hash operations per second far exceeds AI's ~5 × 10²¹ floating-point operations per second—roughly 380x more operations. But in terms of computational versatility, complexity, and general-purpose utility, AI's FLOPS are far more powerful. One AI FLOP accomplishes more useful work than hundreds of Bitcoin hash operations because it's part of solving complex, multifaceted problems rather than checking a single cryptographic lottery ticket billions of times.

Ultimately, I highlighted Bitcoin because its origin and scaling parallel the Mandela effect. As GenAI usage continues to explode, the increased computational demand would support the hypothesis if there were a corresponding increase in the incidence of reported Mandela effects.


If you're interested in exploring simulation theory further, check out my previous article on randonauting as a method of communication with our simulators. And if you notice any reality glitches, especially temporal correlations, reach out! I'd love to hear about them.

Lastly, as you can see from the types of topics we cover, we do not write for a search algorithm and we do NOT display ads of any kind. That's because with AI, the web business model that we've all known and loved in the past is broken.

We believe that being subscription based is a more equitable model - so if you like our content, please consider becoming an Insider to keep this site going! Also, if you enjoyed this article, please consider leaving a tip or sharing on social media! Thank you again for your support and readership.
