In June of 2010, Apple announced the iPhone 4, and along with it the new gold standard for display resolution. Apple uses the moniker "Retina" to classify these displays, claiming pixel density so high that the human eye is unable to discern individual pixels at a typical viewing distance.
As is usually the case with Apple product announcements that debut advanced new technologies, ultra-high resolution displays have now become the norm for smartphones, tablets, and even some of the more advanced notebook computers on the market today. In fact, having recently upgraded to the 13" MacBook Pro with Retina Display, it occurred to me that my TV is the only screen remaining in my life that doesn't sport a pixel density of at least 250 pixels per inch. In other words, everything but my living room has received the "Retina" treatment.
With that said, our televisions are not far behind. In his Thursday story for Businessweek, Justin Bachman took a look at "4K" resolution televisions - the next step forward in the ever-advancing world of consumer displays - and explored just how far display manufacturers will take the resolution arms race:
Gorgeous image or not, the question arises: At what point do the capabilities of the technology outpace those of our eyes? Farhad Manjoo, writing in the Wall Street Journal, declared that we’re almost there: “Nobody’s eyes are good enough to appreciate resolution above 4K.”
If around 300 pixels per inch is a Retina resolution, as Apple has branded it to be, how much farther can pixel density increase before resolution becomes a moot point, secondary to color, contrast, and image depth?
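The "Retina" threshold isn't a fixed pixel density; it depends on viewing distance. A back-of-the-envelope sketch: a person with 20/20 vision resolves roughly one arcminute of visual angle, so a display looks "Retina" once each pixel subtends less than that angle from where you sit. The function below is illustrative (the name and the 1-arcminute acuity figure are my assumptions, not Apple's published math):

```python
import math

def min_retina_distance_in(ppi, acuity_arcmin=1.0):
    """Return the viewing distance in inches beyond which a display
    of the given pixel density appears 'Retina' -- i.e. one pixel
    subtends less than the eye's resolving angle (~1 arcminute
    is a common figure for 20/20 vision)."""
    pixel_pitch = 1.0 / ppi                        # inches per pixel
    angle_rad = math.radians(acuity_arcmin / 60.0) # arcminutes -> radians
    return pixel_pitch / math.tan(angle_rad)

# The iPhone 4's 326 ppi clears the bar at roughly 10.5 inches,
# which is consistent with typical phone viewing distance.
print(round(min_retina_distance_in(326), 1))
```

By the same arithmetic, a 300 ppi panel only needs to be held about a foot away, while a television viewed from across a living room can get away with a far coarser pixel grid, which is exactly why the value of pushing TVs past 4K is in question.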
Regardless of what is determined to be the reasonable limit for display resolution density, Hans Baumgartner, of digital compression pioneer Rovi, speaks the truth:
“Better is better, you know?”
I couldn't agree more. ◉