
I'm not 100% sure, but I think the final output blur was there simply to map the framebuffer resolution to the hardware output. If you ran the N64 at 640x480 (with AA enabled) it looked gorgeous. Unfortunately the RDP wasn't fast enough to update more than around a third of the screen per frame before you saw tearing in that mode, so it was really only practical on mostly static screens.
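
For illustration, here's roughly what selecting that mode looked like in libultra code. This is a minimal sketch: osViSetMode(), osViBlack(), and osViModeTable are real libultra, but the exact mode index name (OS_VI_NTSC_HAF1) is quoted from memory and may be off.

    /* Minimal libultra sketch: put the Video Interface into a 640x480
     * antialiased mode. The mode index OS_VI_NTSC_HAF1 (NTSC, High-res,
     * Antialiased, deFlickered interlace, 16-bit) is an assumption --
     * the naming scheme is [L/H res][p/a AA][n/f flicker][1/2 bpp]. */
    #include <ultra64.h>

    void enter_hires_mode(void)
    {
        osViSetMode(&osViModeTable[OS_VI_NTSC_HAF1]);
        osViBlack(FALSE);   /* un-blank the display */
    }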

The standard output mode was 320x240, but developers realized you could shrink the framebuffer and play with the screen borders to render fewer pixels per frame. Dropping resolution was a quick way to get the frame rate up, and when your target TV was an NTSC CRT it didn't seem so bad.
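
The trick works because the VI rescales whatever framebuffer width you give it onto the fixed output raster. A rough sketch of how that's programmed, using the publicly documented VI scale registers (the addresses and 2.10 fixed-point encoding match the usual N64 memory-map docs, but treat the details as assumptions):

    /* Sketch: the VI stretches an arbitrary framebuffer to the output
     * raster via 2.10 fixed-point scale factors (1024 == 1.0), so a game
     * can render e.g. 292x216 and still fill most of the screen. */
    #include <stdint.h>

    #define VI_BASE        0xA4400000u   /* uncached VI register base */
    #define VI_X_SCALE_REG (*(volatile uint32_t *)(VI_BASE + 0x30))
    #define VI_Y_SCALE_REG (*(volatile uint32_t *)(VI_BASE + 0x34))

    static void vi_set_fb_size(uint32_t fb_w, uint32_t fb_h)
    {
        /* scale = source / target, in 1/1024 units */
        VI_X_SCALE_REG = (fb_w * 1024u) / 640u;
        VI_Y_SCALE_REG = (fb_h * 1024u) / 240u;   /* 240 lines per field */
    }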

The antialiasing (which a GameShark can disable) was what stopped the nasty pixel crawl and jaggies that PlayStation games of that era suffered from. It was cutting-edge for the time: it used the 'extra' bit in the 9-bit Rambus RAM (which would have been used for ECC in serious applications) to store coverage bits and blend edge pixels while maintaining crispness on interior edges.
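
A conceptual model of that coverage scheme, and not the real VI filter (which also looks at neighboring pixels' coverage and is considerably more involved): each 16-bit RGBA5551 pixel spans two 9-bit RDRAM bytes, so the two hidden ninth bits plus the format's 1-bit alpha give 3 coverage bits per pixel. Fully covered interior pixels pass through untouched; partially covered edge pixels get blended.

    /* Simplified model of coverage-based AA. coverage is 0..7,
     * meaning (coverage + 1) of 8 subpixel samples were hit. */
    #include <stdint.h>

    typedef struct { uint8_t r, g, b; } rgb_t;

    static rgb_t aa_filter(rgb_t pixel, rgb_t surround, uint8_t coverage)
    {
        if (coverage == 7)          /* fully covered interior: stays crisp */
            return pixel;

        /* edge pixel: blend toward surrounding color by coverage */
        rgb_t out;
        out.r = (pixel.r * (coverage + 1) + surround.r * (7 - coverage)) / 8;
        out.g = (pixel.g * (coverage + 1) + surround.g * (7 - coverage)) / 8;
        out.b = (pixel.b * (coverage + 1) + surround.b * (7 - coverage)) / 8;
        return out;
    }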


