Saturday!
Okay, there was recently a *HUGE* kerfuffle online about the new Spider-Man video game. The kerfuffle had to do with the age-old issue of the graphics quality shown at E3 versus the graphics quality in the game that actually shipped.
The name of this controversy is “Puddlegate”.
Here’s the short version of the part that’s easiest to explain: Spider-Man had a demo last year. In the demo, there was an awesome scene of a fight between our titular character and a bunch of hoodlums in an area covered by asphalt upon which there were multiple large puddles that showed reflections of the actions above them. Everybody who saw this video said, paraphrased, “Wow! Keen!”
Then the game came out. Wouldn’t you know it? The same area had smaller puddles. This meant that the puddles that weren’t there were not reflecting the actions above. Which, to those who care about puddles and their reflections, meant that the game had promised a certain level of graphical achievement and, when the rubber met the road, the promise had written a check that the game itself could not cash.
Now. There are a bunch of people who said something to the effect of “Have you seen the difference between the Whopper in the Burger King commercial and the actual Whopper you get when you go to Burger King?” and, effectively, asking if consumers were born yesterday and honestly didn’t expect puffery to show up in promises made by the company. Unfortunately, there were a number of gaming journalists among this number.
There were also a bunch of people who said something to the effect of “Please note: Gaming journalists once again taking the side of gaming companies over the side of the people who saw the gameplay footage and decided to actually purchase the game. This is like a newspaper restaurant reviewer mocking readers for being upset that Real Life Whoppers don’t look like in the commercials.”
And then we’re off to the races. To what extent ought consumers downright expect products to look nothing like what was promised? To what extent are we able to say that the difference between the two pictures constitutes “looking nothing like what was promised”?
I know that, for my part, I talked about how I had pre-ordered the game with a co-worker who then told me “you done (messed) up!” because he was very upset about game companies over-promising and under-delivering. As someone who cannot tell the difference between 1080p and 4K, I found myself looking at the two pictures and saying “they’re different?” and then, only after he pointed to this and that in the picture, saying “oh, yeah, I guess they’re different.” He rattled off a number of similar controversies where the company did a demo of any given game with a certain level of graphics… and then shipped a game that looked a lot clunkier, as if it were running on a far lower-quality system.
(And, honestly, there have been a handful of scandals involving such high-profile games as Witcher 3 and Watch Dogs where they promised one thing and then delivered another knowing that they had promised one thing in the first place.)
And as someone whose first *REAL* video game was Zork, I’m not someone who looks to graphics as the reason to buy (or not buy) a game. But I completely understand being (ticked) off that companies say “THIS IS WHAT IT LOOKS LIKE!” in the commercial and, when you go in to buy it, give you the “Samoyed in the middle of getting a bath” version of the game. “They’re both Samoyeds!”, you’re told… except, in real life, Samoyeds dry off.
In any case, I pre-ordered the new Spider-Man game. It reminds me, from the maybe 30 seconds of footage I’d allowed myself to see, of the Batman Arkham games I love so much. So *THAT* is what I’ll be playing.
So… what are you playing?
(Picture is H.G. Wells playing a war game, from the Illustrated London News, 25 January 1913.)
Love the inclusion of mousetrap. In high school we had a science teacher who spent several weeks each year doing “mousetrap” and letting teams of students come up with the craziest Rube Goldberg machines possible in the lab. Highlight of the school year and a very creative way to teach a lot of different principles at once in a really fun way.
Big fan of the Arkham games myself! So is my 7 year old… (I know, I know.)
That’s a pretty big difference between the pictures. But then, I’m in the gameplay > graphics camp… so if the game sucks, I don’t care how cool the graphics are… and vice versa. (Within reason, that is… not joining the pixel nostalgia trend.)
This weekend looks like rain will prevent getting the woods ready for hunting season… so will indulge my Path of Exile “Delve” season itch. Playing a Guardian Summoner Build (the bandwagon is full, so don’t try to get on)… but I prefer to think of it as the Oathbreaker Fulfillment Build… “Summoner” has such a tyrannical feel to it; rather, I’m liberating the poor spirits from their torment by allowing them to kill stuff so I get phat loot explosions. Win/Win.
The argument, as I understand it, is this:
“It may be an awesome game with really good graphics… but if you got promised an awesome game with awesome graphics, the game companies are *LYING* to you and if you buy the game anyway, you’re *REWARDING* them for *LYING* to you. DON’T REWARD PEOPLE FOR LYING TO YOU!”
Which is a great argument!
But it looks like an awesome game with really good graphics…
This is console only? I’m not really familiar with how expectation management from E3 to release works for PS4 games. Don’t pre-order games based on E3 hype?
I mean, back in my day when you bought an SSI game, you just hoped it installed without having to adjust motherboard circuit switches. And we liked it that way. And we liked it when our floppy disk #1 got corrupted so we could never play the game again, ever. {though we really did like being able to flip-up the disk-drive door just in time to prevent it writing our death in CastleWolfenstein… the one with stick figures shooting dashes in your choice of 8 directions… so we didn’t have to restart from the top}.
I’m sorry, what was that about the puddles?
I believe that it is currently console-only.
(I also remember my friend getting a hard disk and him having to explain to me “internal drive” rather than one of the 3.5″ floppies.)
On the “pixel nostalgia trend”…my boys (both teens) play a handful of “pixel” games and they love them all and the older one is constantly trying to get me to play them. I look at the graphics and I think, “But…but…I lived through this era in real time. Why would I want to go back?”
Damn it, I spent enough on this video card to bankrupt a small third world nation, why would I play a pixel game? I want explosions to be awe inspiring and viscera to be disturbing!
I don’t understand the Samoyed getting a bath metaphor, but I’d really like to.
Yeah, I just had a conversation with Maribou about how that was really clunky.
Here’s what I was going for:
Here is a picture of a Samoyed like in the commercials.
Here is a picture of a Samoyed taking a bath (specifically, the second one).
You get promised the former. You get delivered the latter.
As an aside, this conversation happened because someone didn’t notice the clunkiness when proofing it after midnight last night.
I may have finally found the downside to the marital bond. It made perfect sense to me!
You know how Samoyeds always look like they’re smiling at you? In the bath they look like “Don’t act innocent. I know where you’ve been.”
I always figure that it’s a case of trade-offs. What you see at E3 is the best they can do, but something got sacrificed[1] to make the E3 demo possible. Or something wasn’t ready yet that caused the E3 demo to no longer be possible[2] at release. Everyone who bitches about these kinds of things is probably someone who has no clue how actual software development happens (or does, and doesn’t care, because ‘Clicks!’).
[1] An E3 Demo is probably running on a top end card, and likely optimized for that chipset. At release, you can keep that kind of graphics, but only if you restrict the game to running on that class of chipset (e.g. only folks with a high end Nvidia card can play Spider-Man, low end Nvidia and Radeon or Intel owners can suck it).
[2] For instance, fixing that washout in the background killed the frame rate until the number of reflective surfaces was reduced to a manageable level.
An E3 Demo is probably running on a top end card, and likely optimized for that chipset.
That Spider-Man game is just for PS4 (Sadly. It looked a lot like the Arkham series and I think I’d like it, but I’m not buying a PS4.), so it’s designed for a specific platform; it’s not like a PC game where they realize ‘Oh, we need to reduce this, a lot of machines can’t handle it’.
Which means they either faked the footage, or reduced the reflective area for some other reason. Which… there could be plenty of reasons. The entire area is actually textured differently; heck, the demo doesn’t seem to have a skyline visible at all. Maybe they added all that and the reflections then made it lag.
I love how game companies feel they have to deny doing that sort of thing.
That was what I meant by footnote 2. The fact that the background did not look like it was overexposed (and probably had trees and leaves blowing in the breeze, or something else that had to be rendered) in the release is a pretty good reason to cut down on foreground reflections, especially if those reflections also showed the background moving.
And I agree, I don’t get why game companies don’t just admit that they made a trade-off between E3 and release.
What I’m playing currently is…getting ahead of everything I need to do so I can dedicate next weekend to the new Tomb Raider. 😉
This very reason is why I generally don’t buy games when they first come out. You get a lot of info from people critiquing newly published games. Is it worth it? Does it have “loot boxes”, etc.
And I’m also a guy that doesn’t keep up with the latest and greatest. I originally bought Witcher 3 and couldn’t run it on my PC. Then I got 2 and could BARELY run it… so it was time to upgrade.
But the “bait and switch”, in reality or because of overriding business concerns, is an issue. I’d lean towards one or the other depending upon the prior history of the developer and whether or not they’d made outstanding games, all else being equal, before passing judgement.
Controversies like this speak to how much graphical quality has become the selling point of AAA games, with all other characteristics being considered secondary. I think it also explains why AAA publishers are always experimenting with new ways to monetise their games, on top of the retail prices – finding ways to stuff ever-more polygons onto the screen is getting increasingly expensive.
It’s one of the things that has alienated me from AAA gaming. Graphics aren’t the main thing I want out of a game, and the kinds of gameplay I like (generally slower, more deliberative styles of play) just don’t attract AAA interest any more.
One thing I do like though are the new features being shown off for Stellaris 2.2. Paradox just did a developer stream walking through some of the changes being made to economic management in the game, and it all looks really good:
https://www.youtube.com/watch?v=MyJ9nYbvev0
Paradox sale on Steam this weekend!
Not just Stellaris!
Cities Skylines for those of you who miss Sim City!
Shadowrun for those of you who are *STILL* waiting for Cyberpunk 2077!
Tyranny for those of you who wish that Tides of Numenera was better!
Crusader Kings II for those of you who think that you could have handled the fall of the Holy Roman Empire better!
BattleTech for those of you who remember playing BattleTech on mom’s kitchen table!
Controversies like this speak to how much graphical quality has become the selling point of AAA games, with all other characteristics being considered secondary. I think it also explains why AAA publishers are always experimenting with new ways to monetise their games, on top of the retail prices – finding ways to stuff ever-more polygons onto the screen is getting increasingly expensive.
Yeah, I don’t know what the hell is going on there. I was recently at an Atlanta ‘gaming bar’ for a party, which was interesting because I saw a lot of console games I hadn’t even seen before. (Including streaming footage of the new Spider-Man, which made me wish it was coming out on PC.)
For example, there was a football game (Checking Google, I assume it’s the newest Madden.) that honestly was so photorealistic that I had to look several times to make sure someone wasn’t just watching football. Including things like cameras following players walking around between plays, and weird pauses, basically the same exact thing you see during actual televised football games. I almost expected commercial breaks at a few points.
Why? I mean, seriously… why? I understand making the players as realistic as possible, fine, you’re paying good money for likeness rights and team names and stuff. Maybe a zoom-in or two after a touchdown or something. But there was a hell of a lot of camera work and making it look like TV, and… why? Honestly, the changing camera angles and whatnot seem like they would make it much harder to follow what was actually happening and where people on your team were than the old overhead view that I remember from playing a football game (Probably Tecmo?) on my NES. Granted, I wasn’t actually playing, and the guy who was playing seemed to be doing fine, but… I don’t know.
It’s one of the things that has alienated me from AAA gaming. Graphics aren’t the main thing I want out of a game, and the kinds of gameplay I like (generally slower, more deliberative styles of play) just don’t attract AAA interest any more.
That’s basically how I think of myself as a player, too. But there are AAA action games you can play slow-ish and deliberatively. The stealth ones.
I don’t mind big fights, but what I do mind is constantly having to be on edge. I want to wander around, oh, there’s an enemy, I can take him down silently, go back to looking around, oh, there’s a big group of enemies, get a good position, take a few of them out silently, they eventually figure it out and I have to fight three or four at once, that’s fine, keep looking around…and I just walked into a boss battle room, but that’s fine too, they gave me cues and some prep time, and during the battle I can just keep running around and take them out one by one, etc, and it’s basically an endurance test.
That’s basically Tomb Raider. At least how I play it. I assume some people just run around guns blazing the entire time, but I don’t think that’s really how you’re supposed to do it.
So if you like that sort of slow gameplay, you might want to consider tracking down the 2013 Tomb Raider game. It’s $20 on Steam.
You’d love Assassin’s Creed Origins–and it’s pretty.
Okay. I have played enough of the game to say that the people upset about the graphics downgrade are not particularly correct. The graphics are gorge, as the kids say.
To what extent ought consumers downright expect products to look nothing like what was promised?
If I have not mentioned (and I think I have), but the Black Sabbath videos on YT w/ RJ Dio showeth the Elf to be quite drunken, stumbling, forgetting yon lyrics, and meandering with the tune.
It also shows the man could sing his ass off.
But this is not the Dio on the cover of Heaven and Hell.
Or is it?
A true classic here:
Last Crack (the best thing to ever come out of Madison)
As to what I am playing, I just started Far Cry 3: Blood Dragon, which is an 80’s Cyber Action Fever Dream come to life, complete with Michael Biehn voicing the hero, Sgt. Rex ‘Power’ Colt.
So if you loved cheesy 80’s cyber action movies and have a hankering to hear Kyle Reese deliver a constant stream of one liners that are just so bad they’re awesome… Well, it’s a $15 game, so why the hell not.