My vision of the modern gaming journalism website

AI-generated mock, just so that I have some kind of title image. Other than that, the rest of this article is human-written!

Let’s be honest: traditional gaming journalism is on life support. Whether it’s IGN, GameSpot, or any of the other legacy giants, user trust is non-existent.

The industry has devolved into a cycle of AI-generated slop, "expert" takes from people who are clearly unqualified to even talk about the game, and, here in the West, plenty of political posturing. But the real issue is the conflict of interest: these for-profit sites need to keep healthy ties with publishers. If they don't give a triple-A title a glowing 9/10, they risk losing early access, being blacklisted, and missing out on the "review copies" that drive their SEO traffic. In a world where 60% of consumers trust peer recommendations from Reddit, YouTube, Steam reviews, etc. over "expert" reviews, the legacy model is on its final stretch.

In the modern era, players trust other players more than anyone else, and we need a way to aggregate posts and articles written by these real experts.

The Hardcore Content Gap

There is a specific Venn diagram of people who drive such communities:

  1. Good Storytellers/Writers
  2. Hardcore Gamers (the ones doing frame-data analysis and DPS calculations)
  3. Community-Minded Contributors (those willing to share their findings for free)

Currently, these people are scattered across Discord graveyards, expiring pastebins, and bloated subreddits. There are equivalent communities in Korea, such as Inven or DCinside, but the data stays siloed. These contributors don't care about writing for profit; they care about gaining street credibility and prominence in that specific game's community. They want their 50-page Google Doc on raid mechanics, their one-hour dungeon guides, their best beginner tips, and so on, to be the most-viewed bible for that game.

My vision is a platform that captures those users and that passion, and utilizes AI to turn it into a global, searchable, and verified intelligence network.

Verified Credibility System

The biggest problem with sites like Metacritic or Reddit is review bombing and bad-faith actors. My proposed solution is an optional Gamer Identity link.

Users would be able to link their Steam, PlayStation, or Xbox accounts, not just to show off their achievements, but also to build natural credibility on the platform.

  • If you’re writing a guide on the hardest new raid but the Steam API shows you haven't even unlocked the first boss achievement, your post won't gain traction.
  • This builds an inherent "reputation score." Users can see your hours played and your trophies. It’s a natural deterrent for bad actors who haven't touched the game but want to skew the discourse.
  • Someone with thousands of hours clocked across legacy Monster Hunter titles can offer a perspective and analysis of the next Monster Hunter game that someone coming from the souls-like genre might not be able to provide.
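To make the "reputation score" idea concrete, here is a minimal sketch of how playtime and achievement completion could be blended into a single number. The field names, weights, and the 10,000-hour cap are all my own assumptions for illustration, not any real platform API.

```python
import math

def reputation_score(hours_played: float,
                     achievements_unlocked: int,
                     achievements_total: int) -> float:
    """Blend playtime and achievement completion into a 0-100 score."""
    # Log-scale the hours so returns diminish: 10,000 hours should not be
    # 100x as credible as 100 hours.
    hours_component = math.log10(1 + hours_played) / math.log10(1 + 10_000)
    completion = (achievements_unlocked / achievements_total) if achievements_total else 0.0
    return round(100 * (0.6 * min(hours_component, 1.0) + 0.4 * completion), 1)
```

A real system would compute this per game from the linked account's data, and weighting hours at 60% versus completion at 40% is just one possible tuning.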

Breaking the Language Barrier with AI

Gaming is global, but knowledge is often locked behind language barriers. Some of the best theory-crafting in the world happens in Korean or Chinese forums, but English-speaking players never see it.

By using AI-driven translation fed with public multilingual context to build localized, game-specific glossaries, the system can understand game-specific terminology, ensuring that a highly accurate version of a post originally written in another language can be read and accessed globally.
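One common technique the glossary could feed into is term protection: pin game-specific jargon to a canonical translation by masking it with placeholder tokens before the general-purpose translator runs, then restoring it afterward. The glossary entry below (the Korean gaming term "막타" for "last hit") is just an illustrative example.

```python
def protect_terms(text: str, glossary: dict[str, str]) -> str:
    """Swap known game terms for placeholder tokens before machine translation."""
    for i, term in enumerate(glossary):
        text = text.replace(term, f"[[{i}]]")
    return text

def restore_terms(text: str, glossary: dict[str, str]) -> str:
    """After translation, replace each placeholder with the canonical English term."""
    for i, translation in enumerate(glossary.values()):
        text = text.replace(f"[[{i}]]", translation)
    return text
```

This keeps the translator from mangling community slang that a general model would misread, while the AI-built glossary supplies the canonical term on the other side.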

Tournament Ranking System

Numerical scores are meaningless. A 7.5/10 can mean two completely different things to two different people. One reviewer might weigh graphics or audio heavily, while another scores purely on gameplay. There is no objective way to assign a single number to a game, which is why we all meme about IGN's "The game sucks, so we score it 9/10" reviews.

Instead, my ideal website will use a Head-to-Head Tournament System. Users will be prompted with a simple choice: "Which of these two FPS games would you recommend to a friend?" By forcing relative comparisons between similar titles, we generate a Crowdsourced ELO Rating for games.

This creates a dynamic, relative ranking that tells you the truth: "Game A is currently outperforming Game B in player satisfaction," rather than a static, arbitrary number pulled out of thin air.
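The vote-to-rating step is the standard Elo update. A minimal sketch, assuming a starting rating of 1500 and a K-factor of 16 (both tunable choices; a production system would also weight votes by voter reputation):

```python
def expected(rating_a: float, rating_b: float) -> float:
    """Probability that game A wins the matchup under the Elo model."""
    return 1 / (1 + 10 ** ((rating_b - rating_a) / 400))

def record_vote(rating_a: float, rating_b: float,
                a_won: bool, k: float = 16) -> tuple[float, float]:
    """Apply one user's vote: the preferred game gains rating, the other loses it."""
    delta = k * ((1.0 if a_won else 0.0) - expected(rating_a, rating_b))
    return rating_a + delta, rating_b - delta
```

For example, two games both sitting at 1500 move to 1508 and 1492 after a single vote, while an upset vote against a heavy favorite moves the ratings much further, which is exactly the self-correcting behavior we want.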

Discoverability

Communities like Discord or Reddit are where information goes to die. Because general discussion posts are mixed in with high-value reviews, guides, and content, those posts get buried by memes and salt within 24 hours.

This site would explicitly separate general discussion from informational content, similar to how GameFAQs laid out its forums.

Not only would there be AI-based automatic tagging/bucketizing of posts, but my vision also includes Automated Megathreads: AI will monitor high-performing posts and generate a megathread-like header that aggregates the best-verified guides into one clean index for every game category.

This can also be unique for each game. For instance, back when I played a lot of First Descendant, there was a "Build Guide" megathread with a section for every character, aggregating the highest-rated build guides for each. (Several of my own guides were included in that list.)

Rather than a hardcoded structure for each game, the megathread can organically adapt to the types of posts being made, and we can probably use a combination of hand-crafted per-game prompts and a few general templates to generate these headers.
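Stripped of the AI layer, the aggregation step itself is simple. A sketch, assuming each post already carries an AI-assigned tag and a community rating (the field names and markdown layout are placeholders):

```python
from collections import defaultdict

def build_megathread(posts: list[dict], top_n: int = 3) -> str:
    """Group posts by their AI-assigned tag and list the highest-rated per section."""
    sections = defaultdict(list)
    for post in posts:
        sections[post["tag"]].append(post)
    lines = ["# Community Megathread"]
    for tag in sorted(sections):
        lines.append(f"## {tag}")
        best = sorted(sections[tag], key=lambda p: p["rating"], reverse=True)[:top_n]
        lines.extend(f"- {p['title']} ({p['rating']} points)" for p in best)
    return "\n".join(lines)
```

The per-game prompts and templates would decide which tags become sections and how they are presented; the ranking and grouping underneath stay generic like this.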

Moderation

The biggest point of failure for modern communities is moderators who end up playing god. Whether it’s Reddit or Discord, we’ve all seen it: a small group of people exerts control over the community narrative, banning anyone who doesn't align with their biases. They are ticking time bombs for community health.

I haven't yet settled on the proper solution for this, but I'm currently hypothesizing a multi-tier, anonymized system that treats moderation as a platform-wide utility, not a localized power position.

Tier 1: AI Filter

Before a human ever sees a report, an AI layer performs the heavy lifting, scanning for moderatable content:

  • Duplicate Detection
  • Contextual Irrelevance (filtering out anything not related to that specific game/community)
  • Slop Filtering (identifying the patterns of AI-generated articles, like meaningless fluff and repetitive headers that provide no real data)
  • Spam or offensive posts
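Most of these checks need real classifiers, but duplicate detection is mechanical enough to sketch. One cheap first pass, assuming exact duplicates are the target (near-duplicates would need fuzzier hashing like MinHash):

```python
import hashlib

seen_hashes: set[str] = set()

def is_duplicate(body: str) -> bool:
    """Flag a post whose normalized text has already been submitted."""
    # Normalize whitespace and case so trivial edits don't dodge the check.
    digest = hashlib.sha256(" ".join(body.lower().split()).encode()).hexdigest()
    if digest in seen_hashes:
        return True
    seen_hashes.add(digest)
    return False
```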

Tier 2: Blind, Cross-Platform Crowdsourced Moderation

For content that requires human intervention, such as harassment, inappropriate imagery, or nuanced rule-breaking, I'm thinking of some kind of decentralized moderation system.

Instead of having "Mods for the Dark Souls Forum," we have platform-level High-Trust Contributors who gain access to a "Platform Health" tab.

Blind Moderation Mechanism - A moderator is shown an anonymized comment or post snippet from a random game they might not even play. They are asked a simple, objective question: "Does this contain harmful or inappropriate content according to the Global Guidelines?"

Because they don't know which subforum the post came from or who wrote it, they can't use their position to gatekeep a specific community. Admittedly, the content of a post might hint at which community it originates from, but because no single moderator is ever given multiple posts from the same community, their influence is close to zero, and the focus stays purely on enforcing the rules.
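The one-post-per-community-per-moderator constraint is the core of the mechanism, so here is a toy sketch of the assignment step. Report fields and the fallback behavior are assumptions; a real system would also track rounds over time rather than per batch.

```python
import random

def assign_reports(reports: list[dict], moderators: list[str],
                   seed: int = 0) -> dict[str, list[dict]]:
    """Distribute reports so no moderator reviews two posts from one community."""
    rng = random.Random(seed)
    seen = {m: set() for m in moderators}        # communities each mod has drawn from
    assignments = {m: [] for m in moderators}
    for report in reports:
        # Only moderators who haven't touched this community are eligible.
        eligible = [m for m in moderators if report["community"] not in seen[m]]
        if not eligible:
            continue  # escalate to staff/AI review in a real system
        mod = rng.choice(eligible)
        seen[mod].add(report["community"])
        # Strip identifying fields; the moderator only sees the anonymized snippet.
        assignments[mod].append({"snippet": report["snippet"]})
    return assignments
```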

Tier 3: Anti-Gaming System

To continuously verify that moderators remain qualified, the system uses subtle, disguised tests to keep them in check:

  • Occasionally, the system will serve a "Test Case" - a post that has already been verified as "clean" or "harmful" by the AI and other human staff.
  • If a moderator consistently answers these disguised test questions incorrectly, they are flagged as a bad-faith actor and their moderation privileges are revoked.
  • These tests would be mixed in with genuinely user-reported posts and would be essentially indistinguishable from them. This ensures that moderators continue to operate purely for the sake of enforcing rules, not to influence a specific community.
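The revocation rule can be as simple as an accuracy threshold over the disguised tests. A sketch, where the minimum sample size and 80% threshold are arbitrary assumptions:

```python
def keeps_privileges(answers: list[tuple[bool, bool]],
                     min_tests: int = 10,
                     threshold: float = 0.8) -> bool:
    """answers holds (moderator_verdict, ground_truth) pairs from disguised tests.

    Returns True while the moderator's accuracy on test cases stays acceptable.
    """
    if len(answers) < min_tests:
        return True  # not enough evidence to judge yet
    correct = sum(1 for verdict, truth in answers if verdict == truth)
    return correct / len(answers) >= threshold
```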

By making moderation a blind, platform-wide responsibility, we ensure that no single person can own the culture of a game. The premise is that the community belongs to the players, not specific people.

There are still some gaps in my current idea:

  1. Granting moderation capabilities to "high-contribution users" means there also needs to be a system that can algorithmically determine what "high-contribution" means in the first place. We also need to make sure that this score can't be gamed.
  2. Posts and comments only become eligible for moderation once other users report them, which is also crowdsourced. There would need to be a system that detects users who illegitimately report posts and revokes their reporting permissions as well.
    The idea is a bit like the YouTube DMCA situation, where content creators use DMCA claims to take down videos they simply don't like, even when there was no actual violation. The platform should have mechanisms that prevent that.

Incentive System

The last system the website would need is some kind of incentive system to keep fueling gamers to publish high-quality content, something like Reddit karma or Stack Overflow badges. It's not a high-priority requirement, so I haven't put too much thought into how it would work, but I definitely see the value in implementing it.

How this platform differentiates itself

Firstly, the platform will be completely non-profit, something like Wikipedia. Some monetization will be required to cover server and infrastructure costs, but that's about it. No full-time employees on a payroll, no political interests, no ties with publishers. It’s about building a platform that rewards the love of the game.

Secondly, it is fully decentralized but still moderated. Spam, low-effort posts, and slop will be organically filtered by AI and by the community, so users who come to the site for high-quality information can find it without wading through unrelated posts.

In this platform, there would be none of the profit-driven bias of traditional media; instead, we'd establish a crowdsourced space where the most accurate, data-driven information rises to the top.