The Xbox One has been under fire lately for consistently releasing games at a lower resolution than the PS4. For this week’s round of Nerd Wars! we want to know how much this really matters to you, the player.
Cross-platform titles such as Assassin’s Creed IV, Call of Duty: Ghosts, Trials Fusion, and Tomb Raider were all released at a lower native resolution on Xbox One. Microsoft’s new console has the ability to upscale games to 1080p, but this doesn’t quite match the quality of the native resolution.
How much does resolution actually affect the quality of a game, though? Last generation, the PS3 almost never displayed a game in 1080p and offered no option to upscale the graphics, yet many would argue that games on that console looked better than those on the Xbox 360. In many cases this had to be judged on a game-by-game basis, because ultimately it comes down to the developer understanding the potential of the hardware. Games like Red Dead Redemption are a perfect example. The PS3 was technically the superior machine in terms of graphics processing, but the version of Red Dead Redemption that came to Sony’s console was a 530p game upscaled to 720p. The end result was a rather ugly game with an absurd amount of screen-tearing. This was eventually patched, but the problem came down to how the developers had coded that version of the game.
How do you judge a game’s graphical quality? Is it based on resolution alone? Would you ever skip a game entirely if it didn’t display at a certain resolution?
Miles Dompier is the mad commander of TEAM XBRO. He is a Seattle native who recently moved to the sweltering heat of Los Angeles to pursue his dream of becoming a composer/voice actor. When he’s not up writing until his eyes bleed, he likes to play a Prince-level range of instruments and listen to terrible death metal. Follow him on his personal Facebook page or the official What’s Your Tag? Twitter page - @whatsyourtag