BioWare’s latest title, Dragon Age: Inquisition, came out late last year to what can best be described as a resounding success – not just in sales and not just in reviews, but in awards. While there wasn’t quite the same obvious Game of the Year pick as in 2013 (when The Last of Us swept up pretty much every award it could get), Inquisition has still gone on to garner no small number of accolades. Not only did it receive Game of the Year at The Game Awards in December, but one need only glance at the list on its Wikipedia page to see the truly impressive array of kudos it has received.
And it’s not surprising or undeserved; despite some controversy over their previous two releases (Dragon Age II and Mass Effect 3), BioWare is still one of the top-tier developers in the industry, viewed with no small amount of prestige. So it’s good to see a studio that reliably delivers some of the best RPG gaming in the industry picking up awards for Inquisition.
Of course, Inquisition brought with it some changes, not all of which are really a surprise either. BioWare was looking to improve upon what many fans viewed as a disappointment in Dragon Age II. And beyond that, the developer said more than once that it was looking to games like Skyrim for inspiration. That, too, is very understandable: Bethesda’s Elder Scrolls games grow more popular with each iteration, and years after Skyrim‘s own debut, no doubt there are gamers still lost in what is essentially a perpetual fantasy world – one in which the side quests, the geography, and the simple tasks practically never run out. Indeed, one could very likely keep playing such a title forever.
In this sense, what BioWare has adopted is the very simple concept of open-world gaming, which has become the idea of the moment. And BioWare isn’t the only one following it; Dragon Age isn’t the only franchise suddenly adding “sandbox” to its list of descriptors after previously being somewhat more restricted by relative linearity.
No doubt titles like those from Bethesda – our Skyrims and our Fallout 3s – are having a huge impact. They go on to great critical acclaim and regularly appear on best-of gaming lists, both for their generation and all-time. Minecraft has added even more to this conceit: the idea that gaming doesn’t need to be limited, no matter the genre. And plenty of other genres have been borrowing heavily from RPGs – it only makes sense that RPGs would borrow from others in turn, especially since a large sandbox world to explore is doubtless something that holds great appeal for numerous RPG fans.
But at the same time, is this really the best direction for gaming? Particularly when it comes to quality, there’s a real risk of things becoming too open-ended, too sandbox, too unlimited. Not only can it arguably become too much of a good thing, but it also runs the risk of diminishing the overall product. Why is the quality of a game now measured by its duration, rather than by the strength of its story and gameplay?
There is an argument to be made about the number of hours to be had in a single playthrough of a game, and no doubt that plays a role in the industry taking this direction. After all, if we have to pay $60 (or more) for a title, we want to feel that we got our money’s worth. Oftentimes the easiest way to measure that is in terms of quantity; I feel better about spending $60 on a game if I got 400 hours of playtime out of it. Many developers hear this kind of (very understandable) thinking and offer up exactly that in turn.
But if I spend $60 on a game and only beat it in ten hours, will I feel I wasted my money? Even if it was the best game I ever played? That’s likely debatable, and will probably vary from fan to fan. Sometimes it’s easier for us to measure the quantity of something rather than the quality – particularly when looking at that game’s price tag.
But the loss of quality is exactly what’s happening, and that’s why the shift toward sandbox gaming as a universal idea is somewhat disconcerting. And in no title is that better represented than in Dragon Age: Inquisition.
The first Dragon Age title, Dragon Age: Origins, was structured around five in-depth story campaigns; and beyond each being connected to the game’s main narrative, they also make up the meat of the game. In completing each, the gamer becomes immersed in everything from the machinations of Dwarven politics to the entirety of Ferelden’s Circle Tower of Mages. And through this, the gamer is given an intimate sense of what Dwarven culture is like – its history, its social structure, and its religion – because the player is an active participant in all of it.
There’s absolutely no open-world sense to the title – sure, there’s freedom in the order in which to complete major and minor story quests, but the game never goes for the sandbox feel. And it is better for it.
By comparison, roughly the same amount of time (and likely more) can be spent in Inquisition exploring the numerous open-world areas made available throughout the game, from the Storm Coast, to the Emerald Graves, to the Hinterlands. But there’s a comparatively hollow feel to the approach: one is given the visual feel of these differing areas, but little else. There’s next to nothing in the way of storytelling, and most side quests and tasks feel largely the same.
The problem is that Inquisition‘s open-world structure tends to tell the gamer about the world’s history and culture, whereas Origins‘ more structured approach showed it to us.
Additionally, open-world gaming is not conducive to good storytelling. Because the meat of Inquisition is spent in these open-world environments, whatever central narrative thread exists in the game is lost; story moments become few and far between, and as a result they resonate less across the entirety of the game.
The Last of Us is one of the most critically acclaimed games ever made, and has very little in the way of customization or exploration. Would it be a better game if it had a sandbox feel to it? If we could pause in between each Ellie-Joel cutscene to wander off and explore the desolate, zombie-infested landscape and complete sidequests? Likely not. Indeed, it would probably make the game weaker and less resonant, as that sandboxing would take away from the potency of the main story quest, which is the very thing most gamers loved about the experience.
None of this is to say that open-world gaming is bad – but rather, that it doesn’t belong in every game. That so many are starting to adopt it is somewhat troubling. Inquisition is a weaker game for it; and while the notion works very logically with the primary appeal of Bethesda titles, it takes away key elements of the traditional narrative-driven experiences that BioWare most often strives for. Naughty Dog has similarly described the upcoming Uncharted 4: A Thief’s End as having sandbox elements – at this point, one can only hope they don’t let it overtake what is traditionally a linear game to such a degree that it robs Uncharted of its most powerful elements, like the storytelling and action set-pieces.
One of the most discouraging aspects of sandbox gaming’s place in Inquisition is its general usage: to a degree, it is required. If one wants to progress through the major story quests, the only way to level up sufficiently is to participate in the open-world content – which is to say, they’ve made it mandatory. Whether we like it or not, the conceit has become so commonplace that even those of us who would rather steer away from it are forced to participate. In the long run, I just hope there are developers still willing to stick with quality over quantity, even if less often than they used to.