You’re Not Just a User — You’re a Relationship: How the Next Web Is Changing the Game

The future of the web wants our intimacy, not just our attention. I’m looking forward to it, but I also need to ask: what kind of intimacy are we creating between platforms and people? Because when every interaction, every recommendation, every moment online is increasingly shaped to fit us — our moods, our needs, our desires — we can be easily manipulated.

For years, the internet has run on the attention economy. We all know the drill: viral trends, endless scrolling, pings, likes, and shares. We’ve lived inside this machine, and we know how it leaves us feeling scattered, exhausted, and disconnected. No thanks.

But now, with the rise of AI and Web3, the web is slowly shifting. It’s no longer just about grabbing attention; it’s about building personalized, intimate relationships with each user.

It’s a promising development. The shift from the attention economy, which prioritizes capturing and monetizing user attention, to the intimacy economy, which values meaningful and emotionally resonant connections, is paving the way for a more human-centric digital future. A platform that seems to truly “know” you can feel comforting, even meaningful. And it’s profitable: research by Epsilon shows that 80% of consumers are more likely to purchase products or services from brands that offer personalized experiences.

“…It’s fascinating, enjoyable, and scary when every interaction, every recommendation, every moment online is increasingly shaped to fit us — our moods, our needs, our desires.”

But here’s my concern: when intimacy is designed by algorithms and driven by market incentives, it can slip into manipulation.

“Greed is not going anywhere… and hyper-personalization can shape our behavior for profit.”

A clear sign that the intimacy economy is sliding into manipulation is when platforms anticipate not just what we want, but what will keep us hooked. For example, when a platform sends us offers or notifications at moments when we’re most vulnerable, or creates fake urgency to make us act fast, that’s not about real connection; that’s about control. When the system is designed more to pull value out of us than to support us, intimacy turns into a tool for exploitation.

I can see myself having a deeply psychological conversation with an AI that, after gently playing on my weaknesses, suggests I purchase medication from a medical company that bought my data. That would be worse than what social media is doing today. Please let me own my data and my privacy through Web3 technology. On a deeper level, I wonder: how can we prevent intimacy-focused platforms from becoming exploitative under the guise of consent-driven engagement?

“Who truly benefits from these personalized experiences?”

We need to recognize that these personalized experiences can also heighten anxiety, reduce real-world connection, and erode our sense of what is deeply meaningful in reality. We risk trading authentic, messy, human relationships for sleek, curated digital ones that keep us in a loop of consumption and reaction.

If we want Web3 to be more than just a decentralized attention machine or a more advanced tool for manipulation, we need to build systems that respect the complexity of human experience and ensure our platforms help us thrive — creating spaces we truly want to live in.

Toward a Genuine Web: Why Platforms Like LinkedIn Must Change

If we truly want the Web to serve humanity, we have to think beyond cryptography and protocols. We have to consider how the technology is impacting our attention. Why? Because where our attention goes, our experiences, relationships, and dreams follow — and if our systems exploit our attention, they exploit us.

Our attention really is going digital. Studies show that some of us spend around seven hours a day on screens, often multitasking between work and social media. We have all experienced that our attention is a limited resource. When it’s overtaxed by pings, alerts, and endless feeds, we lose our focus, empathy, and agency. This is the state in which we become easier to manipulate, easier to exhaust. Yes — easier to sell to as well.

If we think about it, how technology actually impacts our behavior is very much an ethical issue. If the web is fundamentally built to scatter our focus and make us easy targets for market forces, then I would say that it is not built on genuine freedom. Instead, it's just an extension of the data- and attention-grabbing philosophy of the web we are experiencing today.

“Even decentralization is pointless if the web is designed to scatter us — because it doesn’t matter who owns it if we’re still losing ourselves.”

Easy-to-use apps and programs give us a sense of ease. But ease isn’t enough: when we create our web, we need to go deeper and respect our limits and needs.

Building a human-centric web means:

  • Building systems that refresh attention, not deplete it.
  • Designing tools that encourage reflection and intentional action — not just endless engagement.
  • Allowing people to choose when and how they participate, without being trapped in addiction loops.
  • Protecting the human mind as fiercely as we protect private keys.

I am sure you’ve felt it many times. It often feels like we’re trapped in a never-ending loop of fleeting digital trends — a viral dance on TikTok, a meme that everyone reposted, or a Twitter debate that dominated the conversation for a few days. These moments flare up, saturate our feeds, and then disappear. Great. But then they’re replaced by the next dopamine-triggering distraction. As this continues, we need to ask ourselves: what do we actually carry forward from them?

I took a look at the research, and I’m not surprised. Studies over the past five years show that this cycle of short-lived virality, fueled by algorithmic ranking and public validation metrics like “likes,” is taking a real toll on our mental health.

In short: constant exposure to high-stimulus, rapidly shifting content has been shown to fragment attention, elevate anxiety, and foster compulsive scrolling behaviors that leave users feeling more drained than connected.

What’s more, social media’s gamified feedback loops — likes, shares, views — deepen our dependence on external approval, reinforcing a culture of performance over presence.

“I like the idea of being able to turn off ‘like’ counts or de-emphasize ranked feeds — especially on LinkedIn.”

LinkedIn is supposed to be about real professional connection. But the way it's built often pushes people to perform for attention instead of having real conversations. Something as simple as hiding like counts or letting people turn off the ranking of posts could make a big difference. It would help bring LinkedIn back to what it says it’s about — thoughtful sharing, real insights, and growth. People could post with more honesty and less pressure, without always comparing themselves to others.

Changes like these would loosen the attention economy’s grip on our mental bandwidth and open the door to a healthier kind of professional networking. That would also be a step toward the spirit of Web3: giving people more control and letting them use the platform with more purpose.

I get that companies want our attention and our data. But if we don’t defend our attention in the web we’re building next, even Web3 won’t defend our freedom.