I'm still of the mind that they could easily have had 60 FPS as they do now, but kept split-screen for at least Campaign by either dropping the resolution or bringing the FPS down to 30. Other games do it, and I KNOW the majority of people who want split-screen would have been perfectly fine with this option. I get the impression that they didn't want to backpedal from 60 FPS after Master Chief Collection, and that part of the reason we found out it was missing so close to launch was that they were still trying to figure out how to get it to work. I think they should have just done the resolution drop if they were so adamant about it. Halo 2 Anniversary doesn't run at 1080p.
While I do think 60 FPS is objectively better for pretty much any game that isn't static turn-based, I'd like to think the majority of people are alright with a 30 FPS lock, at least on consoles. I don't recall Destiny having nearly as big of a stink about its framerate as some of the other shitstorms from the past 2 years, but maybe that was because of how upfront they were about it. It seems to me that the console FPS discussion more recently is tied closer to the console wars aspect, since the PS4 does run the majority of the games that have issues better, coupled with the massive number of remasters that have been piss easy to get running at 1080/60 exposing more console-only players to these higher standards. I've been a console-focused gamer my whole life, and I accepted that we're always going to have lower performance than PC, as that really became the norm during the 6th gen (PS2/GameCube/Xbox1). I figure most gamers know that if they know virtually anything about PC gaming.