Microsoft doesn't say much about their rendering tech, although it's worth noting that Bungie only chose the dual-buffer format because larger formats had various deficiencies on the 360's ancient hardware (no gamma-correct alpha blending, missing alpha-blending support in some formats, etc.). Using a single large buffer would probably be a better match for the XB1 hardware, giving superior dynamic range and requiring fewer ROP operations (since you're writing to one buffer instead of two).
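To make the dual-buffer idea concrete, here's a minimal C sketch of the general technique, not Bungie's actual implementation: the same scene value is written to two low-precision buffers at different exposure scales, and the wide-range value is reconstructed by preferring the fine buffer until it clips. The 8x scale, the plain 8-bit quantization, and the function names are all illustrative assumptions; the real formats and constants on the 360 differed.

```c
#include <math.h>
#include <stdio.h>

/* Illustrative scale between the two exposures; not Bungie's constant. */
#define DUAL_SCALE 8.0f

/* Quantize a [0,1] value to 8 bits, as an LDR render target would.
 * (This just models the precision loss, not the actual 360 format.) */
static float quantize8(float v)
{
    if (v < 0.0f) v = 0.0f;
    if (v > 1.0f) v = 1.0f;
    return roundf(v * 255.0f) / 255.0f;
}

/* Encode one HDR channel into the two buffers. */
static void encode_dual(float hdr, float *buf_lo, float *buf_hi)
{
    *buf_lo = quantize8(hdr);              /* fine below 1.0, clips above */
    *buf_hi = quantize8(hdr / DUAL_SCALE); /* coarse, but reaches 8.0     */
}

/* Reconstruct: trust the fine buffer until it clips, then fall back
 * to the coarse, wide-range buffer. */
static float decode_dual(float buf_lo, float buf_hi)
{
    return (buf_lo >= 1.0f) ? buf_hi * DUAL_SCALE : buf_lo;
}

int main(void)
{
    float lo, hi;
    for (float v = 0.5f; v <= 6.0f; v *= 2.0f) {
        encode_dual(v, &lo, &hi);
        printf("in %.3f -> out %.3f\n", v, decode_dual(lo, hi));
    }
    return 0;
}
```

The output above 1.0 visibly drifts (e.g. 2.0 comes back as ~2.008), which is the precision you give up in the coarse buffer in exchange for the extended range.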
(This is one of those "numbers are hard to compare directly because new technology is more efficient" things.)
Thus changing to a single-buffer format wouldn't necessarily be a bad thing in itself. A switch to a 16-bit-per-channel fixed-point format could probably get slightly better results more cheaply than the original approach.
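For comparison, here's what one channel of that hypothetical single 16-bit fixed-point buffer could look like, assuming the same illustrative [0, 8] HDR range as in the sketch above: one buffer, one uniform quantization step everywhere.

```c
#include <math.h>
#include <stdio.h>

/* Assumed HDR ceiling for the mapping; purely illustrative. */
#define HDR_RANGE 8.0f

/* One channel of a hypothetical single unorm16 buffer spanning
 * [0, HDR_RANGE]: scale down, quantize to 16 bits, scale back up. */
static float roundtrip_unorm16(float hdr)
{
    float v = hdr / HDR_RANGE;
    if (v < 0.0f) v = 0.0f;
    if (v > 1.0f) v = 1.0f;
    return (roundf(v * 65535.0f) / 65535.0f) * HDR_RANGE;
}

int main(void)
{
    /* The step size is HDR_RANGE/65535 ~ 0.000122 everywhere:
     * uniform across the range, unlike a floating-point buffer. */
    printf("4.0 -> %.6f\n", roundtrip_unorm16(4.0f));
    printf("step = %g\n", HDR_RANGE / 65535.0f);
    return 0;
}
```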
I'm pretty sure they changed the buffer format, for one main reason: the color banding behaves differently from the original game. Part of that might just be the new gamma curve, which aggressively compresses the range at the top and bottom. But compression should, if anything, reduce noticeable banding, I would think.
Some areas in the MCC version look fine, but some of the skies have really choppy banding, with large bands of staggered sizes; it's most noticeable in Floodgate. {wild speculation} It makes me think they switched to a single-buffer floating-point format, and the reduced precision in the high parts of the range interacts a bit awkwardly with Halo 3's visual makeup. {/wild speculation}
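One way to sanity-check that speculation: a floating-point format's representable values get sparser as magnitude grows, so its quantization step near the top of the range is much coarser than a fixed-point step. This little C program (my own illustration, nothing to do with 343's actual renderer) rounds values to fp16 precision and prints the local step size:

```c
#include <math.h>
#include <stdio.h>

/* Round a positive float to the nearest fp16-representable value
 * (normal range only): fp16 keeps 11 significant mantissa bits. */
static float quantize_fp16(float v)
{
    int e;
    float m = frexpf(v, &e);           /* v = m * 2^e, m in [0.5, 1) */
    m = roundf(m * 2048.0f) / 2048.0f; /* round mantissa to 11 bits  */
    return ldexpf(m, e);
}

/* Approximate spacing between adjacent fp16 values at magnitude v. */
static float fp16_step(float v)
{
    int e;
    frexpf(v, &e);
    return ldexpf(1.0f, e - 11);       /* 2^(e-11) */
}

int main(void)
{
    printf("pi rounds to %.6f in fp16\n", quantize_fp16(3.14159f));
    for (float v = 0.25f; v <= 4.0f; v *= 4.0f)
        printf("around %.2f: fp16 step = %g\n", v, fp16_step(v));
    /* prints ~0.000244 at 0.25, ~0.000977 at 1.0, ~0.0039 at 4.0 */
    return 0;
}
```

At a brightness of 4.0 the fp16 step is roughly 32x coarser than a unorm16 step over the same [0, 8] range, which is exactly the kind of gap that could show up as wide bands in smoothly graded skies while darker areas still look fine.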