Fluvio Labenti

( flowing stream )

Thursday, January 30, 2020

When to upgrade your PC, a golden rule?

After basically thinking aloud in some comments on this YouTube video from BPS Customs, I thought I could elaborate further here on my own blog.

If you have a cell phone, a gaming PC, or if you are using or have used any computer for that matter, you have very likely experienced the harsh realities of tech obsolescence. Science and technology improve on a daily basis, often quite drastically. Give it just enough time, and you are left with a gadget that is only a few years old, while newer gadgets are far more powerful, or support newer standards, protocols, and connectors that yours does not, even though yours was probably not exactly "cheap" when you bought it not that long ago. Sound familiar?

There's nothing we can do about this, except to upgrade to some newer, more powerful gadget at some point in time, whenever we decide to do so. But when is it a good time to do so? Not always an easy decision.

Isn't your gadget/computer still powerful enough? Do you really need that newer one already? Is the newer one really that much different/better/faster/more capable? Can't your current "old" rig serve you well enough for some more time before you drop all that cash on the newer stuff, which will inevitably go through the same aging process anyway?

Tough call. Tough call.

Everyone can approach such upgrade decisions their own way. Each of us has our own interests, priorities, and, most importantly, pockets. If you have money to burn, simply get the latest/best equipment you want whenever it becomes available, and be done with it. No choice paralysis whatsoever :) But plausibly many people do not have deep enough pockets to adopt such an approach. Others may want to use their resources in an efficient, sustainable manner. In any case, when to upgrade? Is there a golden rule?

Here is a rule I have sort of internalized to guide my own decision making, trying to optimize the utilization of my money and the usability of my existing rig, while still keeping a current, well-performing system.

Technology obsolescence aside, my golden rule is the following:

Consider upgrading only when you can get 2x the performance for the same price you paid last time.

Notice that this rule is not at all the same as suggesting you upgrade when you can get similar performance for half the price. Those are two completely different things, and here's a concrete example why: right now an AMD Radeon RX 5700 XT graphics card (currently among the best values in the mid-to-upper-range GPU category) offers about the same performance the "old" Nvidia GTX 1080 Ti did three years ago, at somewhere near half the price (considering European prices). But there is simply no option right now that offers 2x the performance of a 1080 Ti for its original price. GPUs have not evolved that quickly. So even when you can get the same performance for half the price, you don't necessarily get double that performance for the same price.

Updates / corrections early Feb. 2020:

With respect to CPUs, the AMD Ryzen 7 3700X currently does provide about twice (in fact 2.1x) the multi-threaded performance of the now 4+ year old Intel i7-6700K, for about the same price, or even slightly less. Such an upgrade would perfectly exemplify the application of this golden rule, if you mostly cared about multi-threaded performance, that is.

Notice that the rule can be applied even if planning to jump up to a much higher performance class of equipment.

The AMD Ryzen 9 3900X CPU offers about 3x (more exactly 2.9x, according to PassMark's CPU Mark) the multi-threaded performance of the i7-6700K, but at a higher price: more exactly, 1.5x the price in the US market. So as of early February 2020, their relative performance/cost ratio is really 1.9x, close to but not quite 2x yet. (The cost of the 3900X would need to drop to about $450 or below to match that 2x ratio.) Still a clearly beefy upgrade.

For the AMD Ryzen 9 3950X, the relative performance/cost ratio gets worse, at 1.34x-1.5x, because of its much higher cost (2.2x) over the older Intel, while its multi-threaded performance is only slightly higher than the 3900X's (3.2x vs. 2.9x). The higher the performance jump you aim for, the louder the law of diminishing returns will scream at you. That ~10% extra performance offered by the 3950X over the 3900X costs roughly 50% more.
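The ratio math above can be sketched in a few lines of Python. This is just a back-of-the-envelope helper, using the approximate early-2020 performance and price multipliers quoted in this post (not live PassMark data):

```python
# Relative performance/cost ratio vs. an old i7-6700K baseline.
# perf = multi-threaded performance multiplier (PassMark CPU Mark),
# cost = price multiplier vs. what the old CPU cost.
def upgrade_ratio(perf_multiplier, cost_multiplier):
    """How much performance per dollar the upgrade gives vs. the old part."""
    return perf_multiplier / cost_multiplier

# Approximate figures quoted in the text (assumptions, not measurements):
candidates = {
    "Ryzen 7 3700X": (2.1, 1.0),  # ~2x perf, same price: textbook golden-rule upgrade
    "Ryzen 9 3900X": (2.9, 1.5),  # ~3x perf at 1.5x the price
    "Ryzen 9 3950X": (3.2, 2.2),  # diminishing returns kick in hard
}

for name, (perf, cost) in candidates.items():
    ratio = upgrade_ratio(perf, cost)
    verdict = "upgrade" if ratio >= 2.0 else "wait"
    print(f"{name}: {ratio:.2f}x performance per dollar -> {verdict}")
```

Only the 3700X clears the 2x bar here; the 3900X lands at about 1.93x and the 3950X at about 1.45x, matching the numbers discussed above.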

If your system had different old parts, or if you are eyeing different new parts, the situation might be different. Also, if you have some urgency to upgrade (e.g. you need support for some new standard, you want some new feature, or you simply want a better gaming experience), then you could shrink that 2x to your taste: say 1.75x, 1.5x, or even lower. Up to you and your needs. This golden rule at least gives you a good framework to keep in check how much you would be spending vs. how much (or how little) extra performance you would be paying for yet again.

Technology improves quickly enough that I don't think it's worth using inflation-adjusted costs when applying this golden rule. But as they say, your mileage may vary, so don't quote me on that ;)

Tuesday, September 24, 2019

TV Brand ranking update: two+ years later

Finally, here is the post promised two posts ago, which is also an update (2+ years later) to the TV rankings from the last post.

This update considers the following use cases from RTings.com:

1. 4K Gaming
2. PC Monitor
3. Sports
4. HDR Gaming
5. TV Shows
6. Movies

The Best Outdoor TVs category was not included in my ranking calculation because a full listing with a specific single score for that category seems to be missing.

The scoring works the same way as in the last post: for each of those six RTings.com usage categories, I simply find the first position at which a TV brand appears on the corresponding scoring table. There are six categories, so there will be exactly six such numbers for each brand. Those six numbers get averaged per brand, and that average is the brand score. Simplistic interpretation: the lower the score, the better the brand.

As of today I get the following results, clustering the brands in tiers somewhat arbitrarily and manually by proximity:

Tier 1:
#1: LG with 2.17 (S.Korea)
#2: Samsung with 5.00 (S.Korea)
#3: Sony with 6.00 (Japan) 

Tier 2 (updated: this tier has quite spread-out scores, but there is no real need for three tiers):
#4: Vizio with 14.50 (USA)
#5: Hisense with 25.33 (China)
#6: TCL with 39.17 (China) 

(Keep in mind that, for example, Panasonic cannot be included in this ranking because RTings.com does not review Panasonic TVs. Clearly it also does not review every TV on the market.) 

There is quite a bit to say about this update to the rankings. Two years ago, or rather almost three, LG (South Korea) was #1 and Sony (Japan) was #2. Right now South Korea is hogging the top two places all for itself with LG and Samsung. The latter, which has been pushing QLED as allegedly the better technology against OLED for the last few years, not only crawled up from #4 to #2, surpassing Vizio and Sony; it is now allegedly also preparing an OLED offering: a sort of hybrid between OLED and Samsung's own "quantum dots". That ought to shake things up in Tier 1, especially between the two South Korean giants.

As of today, the general consensus still is that OLED TVs offer the best image quality. And regardless of the OLED TV brand, all of them currently have a South Korean, LG-manufactured OLED panel. The technology offering the next best image quality after OLED is QLED, as from Samsung at #2 above. South Korea all over the iron throne.

Japan on the other hand, well... it's complicated.

Even if [1]: Japan was the first to announce plans for 4K as well as 8K broadcast TV;
Even if [2]: first Pioneer (Kuro plasma TVs), and then Panasonic (which eventually got all of Pioneer's plasma patents), both from Japan, brought plasma TV to the highest consumer image quality levels ever seen before OLED;
Even if [3]: Panasonic has won some international TV face-off competitions with its OLED TVs against Sony and LG;
and finally,
Even if [4]: Sony is now showing off an almost 6-million-dollar, modular, super huge display... 

In spite of all that, Sony and Panasonic, any other Japanese brands (Toshiba and Sharp come to mind), and indeed all other brands in the world currently still depend on LG panels for their OLED options. Quite a position of power over the industry for South Korea. Of course, the display panel is not everything in a TV. The image processing circuitry feeding images to that panel is crucial, and Sony as well as Panasonic seem to have some major technological strongholds there behind the panels, in particular with respect to motion control, color/shade gradations, and choice of tone-mapping curves for HDR.

Vizio (USA) is offering great budget/best-value options, catching up as far as image quality goes. And Hisense and TCL (both from China) now appear in the rankings, also seemingly catching up on image quality with very competitive budget options.

While the rumor mentioned above about Samsung considering an OLED offering is circulating, there is now also some growing hype about the upcoming MicroLED technology, which should match OLED's perfect blacks while offering much higher brightness levels, with no permanent burn-in risk whatsoever. So basically, it would combine the best features of the two current best technologies (OLED and QLED) while completely overcoming their respective shortcomings. Sounds like a holy grail, but we'll have to see how it delivers, and most importantly, how much it will cost.

I find the modular approach to building very large displays a particularly interesting development. The super huge screen from Sony is shown at the beginning of this post. Both Sony and Samsung have recently shown off prototypes based on that approach, and it sounds very promising. It would allow consumers to flexibly and progressively "grow" their screen to whatever size they want, whenever they are ready to do so. Let's say you start with a modest 55" screen, but then over time, assuming enough money and space, you make that grid become a 100", 200", or even larger display covering an entire wall, not by replacing your TV, but by adding more "screen tiles" to your existing one. It's important to realize that apparently no one ever complains about getting too large a screen; it's rather the very opposite. Screen size plays quite a major role in the consumer market, so this modular approach might turn out to be a clear win-win for manufacturers as well as consumers.

Hopefully some sort of calibration will take care of proper brightness and color uniformity across all the tiles in those grids at all brightness levels, even if the tiles come from production batches finalized years apart. Those uniformity issues might be one of the main problems for this modular/incremental approach, together with the difficulty of achieving perfectly invisible seams between adjacent tiles. In any case, when such a grid/tile-based display becomes a desirable option even for colorists and filmmakers in need of professional reference monitors for their work, only then will we know that these displays are among the very best that technology can offer for ultimate image quality.

Until then... 

Well, let's at least wait for MicroLED, even if not modular, and let's also wait for that new hybrid OLED-QLED offering from Samsung.

Before all that, this updated ranking based on RTings.com scores might give an approximate idea of how the biggest players currently stand against one another with respect to TV technology.

Friday, May 05, 2017

My TV brand ranking

A diversion from the post I promised last time, although significantly related. This post is the result of an exchange of comments on this YouTube video.

The idea was to elaborate on the current supremacy of OLED over LCD/LED TVs, and right now that basically means LG over all other brands.

But Sony and others also have OLED offerings now. They all use LG OLED panels, mind you, so credit where credit is due. But still, we might want to compare the scores of two OLED TVs from different brands even if they all use LG panels because, obviously, besides the panels, not all other things are equal.

Yesterday I submitted a comment with a special ranking I made for myself about TV brands. My ranking calculation works the following way:

RTings.com has six usage categories for TVs. Here they are, with links to the corresponding pages that include the full scoring tables for several TV models and brands:
1. HDR Gaming
2. Movies
3. PC Monitor
4. Sports
5. TV Shows
6. Video Games

For each of those six RTings.com usage categories, I simply find the first position at which a TV brand appears on the corresponding scoring table. There are six categories, so there will be exactly six such numbers for each brand. I average those six numbers, and that's the brand score. Simple.

Keep in mind that for a given brand, its best-scoring TV model in one usage category may not be the same best-scoring TV it has in another category. And a given brand X may have a TV in the first position, while all other top TVs for that category, from position #2 to #50, might be from brand Y. But we will not worry about any of that, because we are trying to rank the brands themselves, not specific TV models. And we are ranking the brands just by averaging the top positions they achieve in these categories, and nothing else.

Yesterday I submitted a comment to that YouTube video, showing my ranking of the top four TV brands (LG, Sony, Vizio, and Samsung) calculated using this scheme. The scores per brand were the following:

LG: 1.0
Sony: 4.5
Vizio: 11.5
Samsung: 13.0

But that was yesterday. Incidentally, today RTings.com published their review of the newly released Sony OLED A1E. That's why I'm posting this on my blog: I had to recalculate my brand rankings, and I figured the whole thing was a bit too long for a YouTube comment :P

Well here is the update. In each usage category, the top standings per brand right now are the following:

HDR Gaming:
 First LG at #01: EG9600 (Score 8.9, 2015)
 First Sony at #02: A1E (Score 8.9, 2017)
 First Vizio at #10: P Series 2016 (Score 8.4, 2016)
 First Samsung at #12: Q7F (Score 8.4, 2017)

Movies:
 First LG at #01: LG C6 (Score 9.4, 2016)
 First Sony at #02: A1E (Score 9.0, 2017)
 First Vizio at #08: P Series 2016 (Score 8.7, 2016)
 First Samsung at #13: JS9000 (Score 8.0, 2015)

PC Monitor:
 First Sony at #01: A1E (Score 8.7, 2017)
 First LG at #02: C7 (Score 8.7, 2017)
 First Vizio at #07: P Series 2016 (Score 7.9, 2016)
 First Samsung at #17: Q7F (Score 7.3, 2017)

Sports:
 First Sony at #01: A1E (Score 8.4, 2017)
 First LG at #02: C7 (Score 8.4, 2017)
 First Samsung at #14: Q7F (Score 7.8, 2017)
 First Vizio at #18: P Series 2016 (Score 7.6, 2016)

TV Shows:
 First LG at #01: C6 (Score 8.4, 2016)
 First Sony at #04: A1E (Score 8.3, 2017)
 First Samsung at #15: Q7F (Score 7.7, 2017)
 First Vizio at #23: P Series 2016 (Score 7.3, 2016)

Video Games:
 First LG at #01: C7 (Score 9.0, 2017)
 First Vizio at #02: P Series 2016 (Score 8.9, 2016)
 First Sony at #03: X850E (Score 8.9, 2017)
 First Samsung at #09: MU8000 (Score 8.5, 2017)

Averaging all those top positions achieved per brand, the updated scores as of today end up being the following, TA DAAAAAA!!!:

LG: 1.3 = (1 + 1 + 2 + 2 + 1 + 1)/6
Sony: 2.2 = (2 + 2 + 1 + 1 + 4 + 3)/6
Vizio: 11.3 = (10  + 8  + 7 + 18 + 23 + 2)/6
Samsung: 13.3 = (12 + 13 + 17 + 14 + 15 + 9)/6
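The averaging above is trivial to reproduce. Here is a minimal Python sketch of the scheme, with the first positions transcribed from the standings listed in this post (in the order HDR Gaming, Movies, PC Monitor, Sports, TV Shows, Video Games):

```python
# First position at which each brand appears in each of the six
# RTings.com category tables (values transcribed from the standings above).
first_positions = {
    "LG":      [1, 1, 2, 2, 1, 1],
    "Sony":    [2, 2, 1, 1, 4, 3],
    "Vizio":   [10, 8, 7, 18, 23, 2],
    "Samsung": [12, 13, 17, 14, 15, 9],
}

# Brand score = average of its six first positions; lower is better.
brand_scores = {brand: sum(pos) / len(pos) for brand, pos in first_positions.items()}

# Sort ascending by score to get the ranking.
for rank, (brand, score) in enumerate(
        sorted(brand_scores.items(), key=lambda kv: kv[1]), start=1):
    print(f"#{rank}: {brand} with {score:.1f}")
```

Running this reproduces the scores above: LG 1.3, Sony 2.2, Vizio 11.3, Samsung 13.3.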

So even if Sony is using LG panels in its TVs, it does not seem to beat LG's own TV offerings for now, according to this ranking. LG and Sony seem to be somewhat close though, so in a similar league, with LG on top. They could be regarded as a Tier 1. Vizio and Samsung, however, are rather far below them in their scores, so they could represent a Tier 2. Within that second tier, they are close, but Vizio seems slightly above Samsung.

The current TV Brand rankings therefore:

Tier 1:
#1: LG

#2: Sony

Tier 2:
#3: Vizio

#4: Samsung

And that's it for now. Still planning to write what I promised at the end of my previous post.

Tuesday, November 01, 2016

Isn't 8K too much?

It requires a combination of at least three genetic flukes for humans to achieve extreme levels of visual acuity:
1) Perfect ocular shape and optics to begin with
2) Higher than normal cone density on the retina
3) Outstanding transparency inside the eye

Only a very small percentage of people get all the flukes combined, so it's very rare. Yet we should keep in mind that right now we are quite a few billion people on the planet...

Remember that some birds like falcons have "ordinary" visual acuity in the order of 20/2, or about 10x sharper than the "normal" 20/20 vision of humans. But even keeping the pride of our species with respect to eyesight sharpness in check, it is actually not so rare to find people with better than 20/20 vision. Let's see briefly how "not rare" that is.

Roughly, about 35% of the adult population has at least "normal" or 20/20 vision without glasses. But close to 10% of the US population has 20/15 (better-than-normal) vision.

In fact, 1% of the population achieves 20/10 vision. That's twice as good as "normal vision." The human record seems to be even slightly better: around 20/8. That means being able to read at 20 feet what most people (those with 20/20 vision) can only read from 8 feet or less.

On the other hand, approximately 64% of adults wear glasses, at least in some developed countries. Yet we can imagine that the eyesight of glasses wearers, with their glasses on, falls roughly on a normal distribution bell peaking around 20/20. So even if just 1/3 of them (us) can see slightly better than 20/20 with glasses on, that would represent about 20% of the total adult population. Let's assume that is a bit too optimistic, so to be conservative, let's make it just 10%.

Adding that to the 10% who already achieve at least 20/15 vision without glasses, we can estimate that roughly 20% (about one in every five people) exhibit visual acuity clearly better than the "normal" 20/20, regardless of whether they wear glasses or not.
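For what it's worth, the arithmetic behind that estimate looks like this. The percentages are the rough figures used in this post, not census data, so treat this as a sketch of the reasoning rather than a measurement:

```python
# Rough estimate: what share of adults sees better than "normal" 20/20?
unaided_better = 0.10    # ~10% reach 20/15 or better without glasses
glasses_wearers = 0.64   # ~64% of adults wear glasses (some developed countries)

# Optimistic guess: ~1/3 of glasses wearers see slightly better than
# 20/20 with glasses on, which would be ~21% of all adults...
optimistic_corrected = glasses_wearers / 3
# ...but to be conservative, cut that down to 10% of all adults.
conservative_corrected = 0.10

total = unaided_better + conservative_corrected
print(f"~{total:.0%} of adults, about one in five")  # ~20%
```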

This study mentions average FVT visual acuities of 1.82, so closer to 2x the "normal" 20/20. I'm not sure what population samples were used there, though.

So as a sort of disclaimer: in spite of my long previous post explaining when and why 4K might not offer any visible improvement over Full HD or even plain HD, we should not forget that there are cases in which the benefits of a higher resolution can indeed be seen, pertinent, and enjoyable. The obvious examples: you simply sit closer than the ideal viewing distance for your visual acuity, and/or you do in fact have better than normal visual acuity, which as we just saw is not so rare after all.

But keep in mind, that is not really a case in favor of 4K or 8K or even higher resolutions.

And now to honor this post's title: Japan's public broadcaster NHK has recently announced TV broadcasting at 8K.

Well, nice try, Japan. But isn't that too much?

Let's be clear: a given resolution is never "right" or "wrong", too much or too little, in and of itself. Again, it might be too much, satisfactory, or too little depending on a combination of factors, namely pixel size, viewing distance, and visual acuity (check said previous post for all the details if needed).

As if it wasn't already obvious from my posts on resolution, I don't think 8K for broadcast TV is such a great idea, even for Japan (which pioneered the use of higher resolutions for broadcast TV many years before anybody else in the world), and even for those lucky few with the three flukes combined and outstanding 20/8 vision. A very high resolution can be adequate in some use cases, but it can also very well not be, and a big waste. And there's quite a lot more to high resolutions than just being potentially useless, unnecessary, or wasteful in some common cases.

Higher resolutions are very costly in terms of compression and bandwidth requirements (which in turn can deteriorate image quality very quickly, and most horribly and catastrophically, when not properly taken care of). For a given bandwidth, the higher you go in resolution, the more you must sacrifice frame rates, which deteriorates the fluidity of motion. This has major implications for anything with fast-moving images, like sports broadcasts or, in particular, video games. But most importantly, resolution comes only after contrast and color when it comes to ultimate picture quality.

The current trend among cell phone manufacturers of offering flagship models whose cameras have smaller pixel counts than older models (even some older non-flagship ones) yet deliver higher image quality should already be a clear hint: people are starting to care about better pixels instead of more pixels, and are not falling so easily for the earlier, simpler "more pixels = better" marketing bull.

Well, but Japan, or at least NHK, seems to think otherwise. (Japan's Sony, on the other end of the bluff spectrum, recently offered its flagship PlayStation 4 Pro console with a rather weak and disappointing claim in the 4K gaming arena.)

In any case, let me leave it at that for now as far as this post goes. I'll be talking more about Japan soon in an upcoming post, not only about this specific 8K move, but also about the history of TV and the current standing of Sony and Panasonic (Japan) vs. Samsung and LG (South Korea.)

There was a slogan advocated when the CD standard was finalized: "Perfect sound forever." With respect to picture quality, we could say the ultimate aim has been analogous for a long time: perfect image quality forever. Manufacturers and technologies have gone through ups and downs, but overall they have been moving in that same direction. The fact is, display manufacturers have been able to provide outstanding, never-before-seen picture quality in consumer-level displays this very year, in 2016. Pretty much anything from 2015 and before has been clearly left in the dust and will very soon be obsolete. There are very good reasons for excitement about display technologies and picture quality precisely right now and from now on, and that is really great news. (But 8K broadcast TV is not part of that great news, imho.)

As a sneak peek, I'd like to quote DisplayMate's assessment of a 2016 flagship OLED TV. (For the record, I have absolutely no relationship with any of the companies mentioned in these posts.)

"In terms of picture quality the LG OLED TV is Visually Indistinguishable from Perfect. Even in terms of the exacting and precise Lab Measurements it is close to ideal, and it breaks many TV Display Performance Records. (...) far better than the best Plasma TVs in every display performance category, and even better than the $50,000 Sony Professional CRT Reference Studio Monitors that up until recently were the golden standard for picture quality. In fact, based on our detailed lab tests and measurements the LG OLED TV has the highest Absolute Color Accuracy, the highest Absolute Luminance Accuracy, and the highest Contrast Ratio with perfect Black Levels of any TV that we have ever tested, so it even qualifies as a Reference Studio Monitor."

Did you notice something there? These pros talk about picture quality and mention things like color, luminance, contrast, and black levels... But they don't even mention *resolution*. Hmm... Wink wink ;) 

"Perfect", or let's say at least technically flawless, displays are already available and might be bound to become pretty much a commodity rather soon, on cell phones as well as on computer monitors and large panels/TVs. Content makers and distributors have to bring up the image quality of the content they offer accordingly, no doubt about that. But to get there, moving up to wider-contrast and wider-color-space standards is much more important than bringing up the resolution at the likely expense of frame rates.

In any case, more on all of this in the next post.

Saturday, October 01, 2016

When does a screen have "too much" resolution?

(Note: This entry is a translation of the original post in Spanish, written back in Oct/2013. I'm re-posting it in English because I'll shortly be writing some things, also in English, about the current trends in display technologies and the consumer TV market: OLED vs. LCD, HDR, wider color gamuts, and so on. That post will likely refer to things covered here, so I wanted to have this text already in English.)

4K resolution, also called "4K Ultra HD" or "Quad Full HD", is the resolution offered by the latest generation of TVs. This resolution is equivalent to approximately 4x the resolution of Full HD or 1080p (see the image above). 4K is 3840 x 2160 (or even 4096 x 2160) pixels, a little more than eight million pixels in total. Quite a few pixels! But is it really useful to have such a high resolution on our televisions? That depends on several things, and that is the topic of this post. The idea is to inform you so that you won't spend a fortune on something you possibly will not be able to enjoy or take advantage of, in spite of what salesmen or even other consumers would want you to believe.

Consider first a cell phone, such as an iPhone 5. Its screen size is only 4" (diagonal), and its resolution 1136 x 640 pixels. Note that this resolution is relatively "low" in the sense that it's not even the minimum HD (which is 1280 x 720). But this "low" resolution on such a small screen results in a very high pixel density: 326 ppi (pixels per inch), which is about 12.83 pixels per millimeter. In other words, the pixel size on this iPhone, assuming square pixels, is only 0.078 mm per side (each pixel's side is less than 8% of 1 mm).

As a marketing strategy, Apple gave a rather picturesque name to this pixel density of the iPhone: they called it a retina display. The reason was that, in principle, our eyes, or our vision in general, cannot distinguish those pixels if we hold the iPhone at a distance of at least one foot (30 cm) from our eyes. And without resolving the pixels, the image appears completely smooth rather than pixelated. What Apple is telling us here may actually be true or false, and that depends on our visual acuity.

In a previous post we saw that a person with "normal" or 20/20 visual acuity can distinguish between two parallel lines separated by one minute of arc, or arcminute. An arcminute is just 1/60th of a degree (and a degree is just 1/360th of a full circle.) An arcminute is thus a fairly narrow angle. How narrow? If we plot an angle of one arcminute starting from our eyes, the separation of the sides of that angle at a distance of six meters would be just 1.75 mm. (Remember, this is calculated using the circumference formula: [2 * pi * R / 360] / 60 = 1.75, where R is the radius of the circle, which in this case would be 6000 mm = 6m.)

At about 30 cm away, the separation of the sides of a one-arcminute angle would be just 0.087 mm, less than 9% of a millimeter. Ah! But there you go! Above we saw that the side of each pixel of the iPhone 5 is less than 8% of a millimeter long, so in this case the pixels are a little smaller than what "normal" visual acuity can resolve at a distance of 30 cm. That's the key! That's why, in principle, we can't resolve those pixels at that distance. Apple did tell us the truth about the retina display, at least when visual acuity no better than "normal" is assumed.
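The numbers above are easy to verify. Here is a small Python sketch using the same circumference approximation as the text, (2 * pi * R / 360) / 60, for the width subtended by one arcminute:

```python
import math

def arcminute_separation_mm(distance_mm):
    """Width subtended by one arcminute at a given viewing distance,
    using the circumference approximation from the text:
    (2 * pi * R / 360) / 60."""
    return (2 * math.pi * distance_mm / 360) / 60

# iPhone 5 pixel size: 326 pixels per inch, 25.4 mm per inch.
iphone5_pixel_mm = 25.4 / 326

print(f"1 arcmin at 6 m:   {arcminute_separation_mm(6000):.2f} mm")  # ~1.75 mm
print(f"1 arcmin at 30 cm: {arcminute_separation_mm(300):.3f} mm")   # ~0.087 mm
print(f"iPhone 5 pixel:    {iphone5_pixel_mm:.3f} mm")               # ~0.078 mm
# 0.078 < 0.087: at 30 cm, "normal" 20/20 vision cannot resolve the pixels.
```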

If you bring the iPhone close enough to your eyes, then you will distinguish the pixels, even if you have normal vision. (A 30-year-old can focus even at 15 cm, and a child can focus even at less than 7 cm.) And if you have higher than normal visual acuity, then you will be able to resolve the pixels of the iPhone even at 30 cm.

We see that resolving or not resolving the pixels of a screen with a particular resolution depends on several things. Those things are precisely the terms highlighted above, namely:

1) Pixel size (which is derived from the screen size and its resolution)
2) Distance between our eyes and the screen
3) Our visual acuity

The final effect on our eyes will depend on these three factors. We can assume that our visual acuity is already the best we can muster (using glasses if we need to, for example,) so overall we cannot improve factor #3. Then we can only modify factors #1 and #2. Modifying #1 means a different screen size, or a different resolution, or both. Modifying #2 means changing the distance between the screen and our eyes.

Clearly, if we can distinguish the pixels on a given screen, then either the resolution is too low for that distance, or we are too close to the screen given its resolution. The fact is that if we start moving our eyes away from the screen, at some point we will reach a distance at which we can no longer resolve the pixels. Only then, given that screen and distance, and our visual acuity, we could say that that resolution is "satisfactory".

But then again, when do we have too much resolution?
(Remember, this is the key question concerning this post.)

We will have too much resolution "A" when, for the same screen size, there is at least one lower resolution "B" that will also **not** let us resolve its pixels at the same viewing distance.

That is because, if resolution A is greater than B, but neither resolution lets us resolve its pixels at distance X on screens of the same size, then at that distance the images of A and B are completely indistinguishable (in terms of resolution) to our eyes, no matter how much finer resolution A is with respect to B. For that viewing distance, that screen size, and our visual acuity, resolution A would therefore be excessive and technically useless over and above B.

Let's elaborate a bit more.

Imagine we put many iPhone 5 screens together to build a single large 60" screen. That would require a lot of iPhones, in fact 15 x 15 = 225 of them. And do the math: the resolution you would get with that amount of screens (at 1136 x 640 per little screen) would be a whopping total of 17040 x 9600 pixels! That is more than 18 times higher than 4K. But ask yourself: wouldn't that be somewhat excessive and unnecessary? Well, given normal vision, we already saw that we cannot resolve the pixels on any of those iPhones when our eyes are just 30 cm away. How much further from resolving them would we be when this 60" screen has pixels of the exact same size as those on the iPhones, and we are now viewing them from, let's say, three meters, so 10x farther away?

In fact, "normal" vision already **can not** resolve the pixels on a 60" screen with the "much lower" 1920 x 1080 (Full HD) resolution from three meters away. Only getting closer than 2.38 m (7.8 feet) would allow you to begin resolving those pixels (this can be calculated similarly to what was explained above). So at distances beyond 2.38 m, no "normal" vision will reap any "benefits" from this Super Ultra Ridiculous resolution 18+ times higher than 4K on a 60" screen, compared to a modest screen of the same size with a simple 1080p resolution. Our eyes at that viewing distance simply cannot see the difference between these two resolutions.

That is a hyper-exaggerated example, but I hope the idea comes across: a resolution can be absolutely excessive and completely useless to our eyes compared to some other, much lower resolution, depending on our visual acuity and the viewing distance.

Now back to 4K.

A 60" screen with a 4K resolution has quite small pixels. In fact, four of its pixels fit inside one pixel of a 60" screen with 1080p resolution. At what distance can we resolve those 4K-60" pixels? Only at less than 1.19 meters (~3.9 feet; again, with normal vision.) So sit at 1.19 meters or farther from that screen and you won't see any pixelated images; perfect! However, don't go sit beyond 2.38 m (7.8 feet) from that screen, because then you will have paid for that higher 4K resolution for nothing. As we saw above, beyond 2.38 meters you already cannot resolve the much larger pixels of a 1080p screen (four times lower resolution) of the same 60" size. So if you are considering sitting beyond 2.38 meters away from a 60-inch TV, it makes little sense for it to be 4K rather than 1080p, because a 1080p screen will look just as good at that distance (you won't even be able to resolve the pixels on the 1080p screen from there.)

What is more, if you sit beyond 3.57 m away (11.7 feet,) then it doesn't even make much sense to have a 60" 1080p TV, because at that distance you can no longer resolve the pixels of the 720p resolution (HD rather than Full HD) at that screen size. So all other things being equal, at 3.57 meters or more, a 60" 720p screen will look just as good (without pixelation) as a 1080p or a 4K screen of the same size. Again, all this assumes normal vision.

Of course, we would need to calculate things for each screen size, resolution, and every viewing distance possible to see if the combination works and makes sense or can be recommended for our particular needs. But I don't need to do that, because others have done it already (click to visit and enlarge):

 Screen Size vs. Viewing Distance vs. Resolution

One way to use this graph: first choose the viewing distance you are considering. For example, if it's three meters (~10 feet), locate the value 10 feet on the vertical axis to the left, and draw a horizontal line across the entire graph at that height. Then check screen sizes on the horizontal axis below, draw a vertical line from your screen size of interest, and see where it intersects the horizontal line you just drew. Say you are considering 60" at 10 feet: the intersection of the two lines falls near the tip of the red triangular area. Depending on the color where the intersection falls (blue, green, red, or purple), a given combination will make sense or not according to the descriptions associated with that color (text blobs on the right, both for the triangular areas and for the boundary lines between them.)

In our example, the intersection is in the red area, and the description for the red area tells us that the benefits of 1080p would be noticeable. That means a 60" screen viewed at 10 feet is fine with 1080p. But it also tells us this is not a combination that would let us benefit from 4K; we would need a larger screen, a shorter viewing distance, or both, to move the intersection towards the purple area.

This graph then lets us answer fairly easily the question in the title of this post: when does a screen have too much resolution? Answer: when the intersection between viewing distance and screen size falls outside (most likely above) the colored area associated with that screen's resolution. Note, for example, that only when the intersection falls below the red line would we observe any benefit from a 4K resolution.

I'm sure you will be surprised to realize how close to the screens you have to sit (despite their large sizes) in order to truly reap the benefits each resolution offers over the next lower one. For example, at three or more meters away (10 feet or more,) a 50" 1080p screen hardly makes sense over a 720p one, at least not for "normal" vision. A 60" 4K screen starts to make sense only if you plan to watch it from about midway between 5 and 10 feet, so about 7.5 feet, which is exactly the distance we mentioned before: less than 2.38 m. But that is only the distance at which normal vision "begins to notice the benefits" of 4K compared to 1080p. To really enjoy those benefits you would need to sit even closer to this large 60". Such short distances to large screens may be impractical, inconvenient, or simply uncomfortable for you. Or it may be that for your viewing distance of interest, the recommended ideal combination of screen size and resolution ends up beyond budget.

By the way, we have only discussed viewing distance here with respect to resolving pixels. But there is something else that is very important to take into account when choosing the ideal viewing distance for a particular screen, especially for watching movies, and that is the horizontal angle the screen covers in our field of view. SMPTE's recommendation is about 30 degrees, and any value between 28° and 40° complies with the THX recommendation (and certification.) In general, for home theaters, it is advisable to use at least 20°. Many consider this more important than resolving or not resolving pixels, because sufficient visual field coverage increases the effect of "immersion" into the film, whether it's pixelated or not.
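The distances behind those angles follow from simple trigonometry: half the screen width over the tangent of half the angle. Here is a small helper I sketched for this (my own, not from SMPTE or THX):

```python
import math

def fov_distance_m(diagonal_in, fov_deg, aspect=16 / 9):
    """Viewing distance (m) at which a 16:9 screen's width spans
    fov_deg degrees of the horizontal visual field."""
    width_m = diagonal_in * aspect / math.sqrt(1 + aspect ** 2) * 0.0254
    return (width_m / 2) / math.tan(math.radians(fov_deg) / 2)

# For a 60" 16:9 screen:
print(round(fov_distance_m(60, 40), 2))  # ~1.82 m gives a 40-degree view
print(round(fov_distance_m(60, 20), 2))  # ~3.77 m gives a 20-degree view
```

So for a 60" screen, sitting anywhere between roughly 1.82 m and 3.77 m keeps the screen within the 20°-40° band.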

Here's an example that could serve as a reference. For a 50" 1080p (Full HD) display, any viewing distance between 1.98 and 2.97 m (basically two to three meters, or 6.5 - 9.7 feet) satisfies all three key criteria for optimal movie viewing:

1) We are far enough not to resolve the pixels on the screen (with normal vision)

2) We are in the range of distances where we effectively take advantage of our screen's higher resolution (in this example 1080p) over the immediately lower one (720p)

3) We are at a distance that lets the screen width cover between 20° and 40° of our visual field horizontally

For example, with a 60" 4K screen, we comply with #1 beyond 1.19 m, but we need to sit no closer than 1.82 m to comply with #3. And as we saw earlier, we should not sit beyond 2.38 m to comply with #2. So notice how narrow the range of optimal viewing distances gets for higher resolution screens: for a 60" 4K screen, it's between 1.82 and 2.38 m (6 - 7.8 feet). If we sit outside that range, we violate one or more of the three criteria above, and it would be best to change one of the variables at play: the screen size, the resolution, or the simplest, our viewing distance.
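All three criteria can be bundled into one small helper. This is just my own sketch of the reasoning above, assuming a one-arcminute acuity limit and the 20°-40° field-of-view band:

```python
import math

ARCMIN = math.radians(1 / 60)  # "normal" acuity: 1 arcminute per pixel
IN2M = 0.0254

def ideal_range_m(diagonal_in, vertical_px, lower_px, aspect=16 / 9):
    """Viewing-distance range (m) satisfying all three criteria:
    1) farther than the screen's own pixel-resolving distance,
    2) closer than the next-lower resolution's resolving distance,
    3) screen width covers between 20 and 40 degrees of the visual field."""
    height_in = diagonal_in / math.sqrt(1 + aspect ** 2)
    width_m = height_in * aspect * IN2M

    def resolve(px):  # resolving distance for a screen with px vertical pixels
        return (height_in * IN2M / px) / math.tan(ARCMIN)

    near = max(resolve(vertical_px), (width_m / 2) / math.tan(math.radians(40) / 2))
    far = min(resolve(lower_px), (width_m / 2) / math.tan(math.radians(20) / 2))
    return round(near, 2), round(far, 2)

print(ideal_range_m(60, 2160, 1080))   # 60" 4K:    (1.82, 2.38)
print(ideal_range_m(100, 2160, 1080))  # 100" 4K:   (3.04, 3.96)
print(ideal_range_m(50, 1080, 720))    # 50" 1080p: (1.98, 2.97)
```

The three printed ranges match the figures quoted in this post.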

For a giant 100" 4K screen, the ideal range complying with all three criteria would be between only three and four meters (about 10 - 13 feet.) In fact, a little narrower: between 3.04 and 3.96 m. But depending on your visual acuity, at 3.96 m (~13 feet) you are already risking seeing no benefit of 4K over 1080p on a 100" screen. Better to sit at the lower end of that range, just a little over three meters. So yes, believe it or not, if your vision is normal, you would ideally sit just slightly beyond three meters (~10 feet) from a giant 100" 4K screen.

In conclusion, there are cases in which resolution can be too high; effectively, unnecessarily, and uselessly too high, and it makes little sense to pay more for something that doesn't offer perceptible improvements over something cheaper. If you are looking for a TV or monitor, don't let all the fuss from salesmen and even from other consumers fool you about the alleged "remarkable and incredible benefits" any higher resolution is supposed to offer over lower ones. Take into account how far your eyes will be from that screen in your particular room and setup (that's possibly the most important thing,) and try to use the chart above (or the formulas above) and the three criteria mentioned here, to determine the combinations of resolution, screen size, and viewing distance that are truly convenient or even optimal for your particular needs and budget.

Additional Information:

1080p Does Matter - Here's When (Screen Size vs. Viewing Distance vs. Resolution)
Resolving the iPhone resolution
Optimum HDTV viewing distance (Wikipedia)
4K resolution (Wikipedia)

PS. The 4K resolution I mentioned as 4096 x 2160 is the one from Digital Cinema Initiatives (DCI.) 4K UHD (or Quad HD in the first image) is exactly equivalent to 4 times 1920 x 1080 (Full HD), or 3840 x 2160. In any case, they are very similar.

PS2. Refined some calculations in the post, and taking the opportunity here to explain an additional formula that might be useful. If you do not have the pixel density (per inch or per centimeter) from the screen manual, you can get the pixel size by dividing the screen height by the number of vertical pixels on the screen. For HD that number is 720; for Full HD, 1080; and for 4K, 2160. To get the height of a screen, simply measure it, or compute it from the diagonal using Pythagoras: since the screen's aspect ratio is 16:9, the base is always 1.777x the height. With that and the diagonal D, we can easily calculate the screen height H: H = sqrt(D * D / 4.16). To convert that height from inches to millimeters, multiply by 25.4. For example, a 60" screen is 747.2 mm tall, and the pixel sizes for 720p, 1080p and 4K on that screen would be 1.038 mm, 0.692 mm and 0.346 mm per side respectively.
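For the record, the PS2 formula and the example numbers check out in a couple of lines (the 4.16 constant is just 1 + (16/9)² rounded):

```python
import math

def screen_height_mm(diagonal_in):
    """Height of a 16:9 screen from its diagonal: H = sqrt(D*D / 4.16),
    converted from inches to millimeters."""
    return math.sqrt(diagonal_in ** 2 / 4.16) * 25.4  # 4.16 ~ 1 + (16/9)**2

H = screen_height_mm(60)
print(round(H, 1))                       # ~747.2 mm for a 60" screen
for rows in (720, 1080, 2160):
    print(rows, round(H / rows, 3))      # pixel side: 1.038, 0.692, 0.346 mm
```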

Monday, March 28, 2016

My take on "Batman vs. Superman - Dawn of Justice"

Actually saw it twice, first in German, then in English.

Important disclaimer: no spoilers here, so no worries.

The fact that I saw it twice already gives away that my impression is rather positive than negative. I actually enjoyed it quite a bit, and could even watch it again. The movie never felt boring to me, even though it does have a lot of talking with little action for rather long stretches. But the intrigue always keeps it going, so it never felt as if it were losing momentum. I'd give it a 7/10, and feel I could even give it an 8; I'll explain why later.

It's not true that this is a completely humorless movie. First of all, there's nothing wrong with a total lack of humor in movies: "Batman Begins" felt far less humorous, and I think it was an excellent film. "Batman vs. Superman" does however have at least three humorous bits that I noticed: a comment Bruce Wayne makes to Clark Kent about Superman, something Diane Lane says to Batman at one point, and something Superman and Batman say to each other about Wonder Woman. Those are clearly humorous bits. It is simply wrong for some critics to keep repeating that the movie is dead humorless. It's no comedy, but that's no problem either.

There are explicit associations between the god-like superhero figures and actual real-world religious gods, names included. I'm sure this is uncomfortable for quite a few. In fact, I suspect this might be part of why some critics seem so negatively taken by this movie. In my opinion this religious angle is slightly lame in and of itself within the movie, but it's not the biggest negative deal.

The movie does have what I would call huge plot weaknesses, but I won't elaborate because that would take me into spoiler terrain. I'll only say that whenever a major plot twist depends on some rather silly coincidence, it smells like plain bad writing. And what's worse, there are several such weaknesses, not just one. But nothing more about that. Let me quickly list the other things I did not like so much about it as a whole, then what I really *really* liked.

Ben Affleck I did not find so lame as Batman, but I find Christian Bale easily way better. For some reason, every time they show a close-up of Affleck's "angry" face, it seems to me he is about to explode into laughter or one of his smug smirks. Also, in quite a few shots he just wears what looks like his dumbest possible face, which does not sit well with the Batman character, really.

Some critics claim the movie is a display of incoherence. Well, I don't think so, even though... The trick of showing characters' dreams or short delusional trances as part of the movie itself is used a couple of times. Without telling you in advance, the movie lets you realize only shortly thereafter that it was just that: a dream or delusional trance. Well, these dreams/trances *are* rather or even very incoherent at times. I only have an issue with one of them, well, rather two, not because they are incoherent but because they add plot holes.

The movie borrows (I think) quite a few elements from several other movies, which might contribute to the incoherence claim. For example, Bruce Wayne now seems to imitate 007, at least as far as driving an Aston Martin. The movie looks dark and has a mostly cool (blueish) color temperature overall, but in one stretch the entire aesthetic seemed to have been stolen from, believe it or not, Mad Max: Fury Road, everything warm and sun-scorched, in desert-like landscapes, and with similar full-face clothing against the sand. Also, some evil-boss features seemed to have been stolen from The Fifth Element. Some parts of the movie make you think too much of King Kong, and at some point, Batman himself could make you think he is playing Spiderman. I could keep going but will leave it at that.

Another point I would call negative: some of the special effects. As impressive as some of them are at times, at other times they were slightly disappointing.

And that's all for the negatives. Now the positive.

Sound, music, and acting were OK for me. Acting-wise, I particularly liked the ladies: Diane Lane, Gal Gadot, Amy Adams, and also Holly Hunter.

Some critics bash the fact that this movie replays yet again how Bruce Wayne's parents got killed when he was just a child. Gee, I have to disagree strongly. Not only is this possibly the briefest of all such replays in any Batman movie (at least it feels very brief,) I think it's the most devastating, and at the same time, photographically speaking, the most impressive.

I'm no fan at all of 3D, but this movie uses 3D quite well, particularly near the beginning and near the end. At least twice it made me feel as if I were on a floating vehicle, or as if the entire theater were moving. This is something I don't feel easily at all with 3D movies, so I have to give this one credit for it. Just as with the 3D, I liked the photography a lot, also particularly near the beginning and near the end.

Forget about the Superman vs. Batman trivia. I won't say anything about how predictable or unpredictable their face-off turns out; the fact is, it's not really that relevant in this movie. What I found most relevant was clearly Wonder Woman. Not only is her character painstakingly introduced throughout the movie, the way she enters the real action is just amazing, really one of the best entrances I've seen in any hero movie ever. Her badassness is something to behold, and in fact, I left the movie both times feeling there was too little of her on screen. Which I fancy was precisely one of the goals the producers intended to achieve (given the upcoming Wonder Woman movie.) In my opinion, they succeeded magnificently in that respect.

The fact that the actress playing Wonder Woman, Gal Gadot, has a beauty that seems almost unreal is truly beside the point. (To me she looks like an impossibly enhanced mixture of Famke Janssen, Taylor Swift, and Natalie Portman, three unreal beauties on their own.) I won't go into details, but will just say: the action predictably ramps up in the second half of the movie, and how Wonder Woman enters those action parts simply made my jaw drop. This was badassness such as hardly any recent hero movie has shown. Unbelievable. Almost all of the fights and punches of Batman and Superman are completely forgettable, because we've actually seen them all already, arguably even better ones in their previous movie incarnations, so nothing radically new there. But Wonder Woman here is something else. Trust me: if there's anything worth watching in this movie, it's not the duel between the guys, it's really her, this badass character and how excellently this actress plays her.

This aspect of Wonder Woman is why I said earlier I could give it an 8/10. I will also say that I liked how the movie ended, all the way to the very last few milliseconds (if you see it you'll understand why I say so.) In particular, the photography and 3D near the end are just as memorable as near the beginning. But boy, the plot weaknesses hang heavy for me, so for now I'll leave it at a 7/10.

In conclusion: easily recommended. You can even think about the Superman vs. Batman duel as pure context. You simply have to watch this new Wonder Woman in action. It takes the movie quite long to get there, but I find it is well worth it.

Sunday, January 24, 2016

Excellent colors on current IPS LCD monitors!

(Kicking off my English language posts with this one.)

My friends know I'm a home theater and audio enthusiast. You might also say an Audio/Video-quality freak :p If you have an LCD screen, I encourage you to calibrate it as best as you can using Lagom LCD monitor test images.

I've been using that same link for many years to calibrate my computer screens. It is not a pro (colorimeter-based) calibration, but it can massively improve the image quality you are getting from your LCD display.

LCD screens used to suffer greatly from poor contrast ratios, poor color accuracy, and terrible viewing angles. Modern "fast" LCD panels (those with response times of 2 ms or less, desirable for gaming) still suffer from this, because they are mostly still based on the so-called TN (Twisted Nematic) technology. But there are newer LCD technologies at play, VA and IPS/PLS among them, which, response time aside, offer much better image quality in terms of contrast ratio, color accuracy, and viewing angles (as in the image above.)

Recently I got a new, relatively cheap Samsung PLS monitor. (PLS is Samsung's name for its own enhanced IPS implementation.) In spite of its relatively low price, it is clearly waaaaay ahead of a Samsung I purchased only about 4-5 years ago while still in Venezuela (pre-Dakazo times, and back then not yet planning to emigrate.) After calibration, I watched carefully several testing scenes of my own choosing from Hero, Lord of the Rings, Matrix, Tangled, and Frozen. (I have not yet brought my DVDs or Blu-rays, but I have basically almost-free unlimited access to the huge video library in Bonn's Stadtbibliothek.) Skin tones, color saturation, and color accuracy overall remind me of my older Panasonic plasma, which I used to calibrate with an AVIA calibration DVD, and which was simply astounding. Only the blacks are not as good, but colors overall on the IPS panel are amazing, gorgeous! IPS/PLS panels are really very, very good color-accuracy-wise, particularly once calibrated.

For the record, IPS panels from LG and other brands I think are just as good and similarly priced. Chose the Samsung mostly out of familiarity and reliability experience with the brand, and curious about comparison with my older monitor. Also viewing angles to me seemed a bit better on the Samsung (more on this further down.)

OLEDs are not LCDs; they are based on a different technology, and they have a clear edge (even over plasma) in blacks and contrast, arguably the most important factors in ultimate image quality --even more important than perfect color or ultra-high resolution. OLED is the most promising display technology right now, but it's still too expensive, and it seems not as good color-accuracy-wise: the blue diodes seem to age faster than the others, which creates color issues over time. OLED panels also suffer from image burn-in, which is why there are still no OLED-based computer monitors in general, neither entry-level nor premium/pro-oriented.

My selection was narrowed down to sizes between 22" and 24"; I find that size range optimal for typical viewing distances of 50-70 cm. Also Full HD (1920x1080) resolution, 16:9 aspect ratio, and non-gaming monitors; I didn't care about 1 or 2 ms response times, I cared mostly about image quality (color- and contrast-wise) and comfort, meaning a non-flickering (non-PWM) backlight. I did not want to purchase online because of the fragility of monitors, so I only considered monitors I could see and adjust in person at brick-and-mortar stores. They had to have HDMI input(s) and at least an audio output, but I did not care about built-in speakers. My final selection came down to the LG (22 or 24)MP57VQ-P and the Samsung S(22 or 24)E390H. After side-by-side comparisons in stores, all playing the same video signal over HDMI, without even touching them it seemed to me the Samsung 22" had slightly better viewing angles than the LGs, and surprisingly, even slightly better than the Samsung 24". Keep in mind that viewing angles are not something you can adjust in the settings; they are fixed by the panel's technology and construction. Settings-wise and color/contrast-wise they all seemed rather comparable overall. The LGs are VESA-mount compatible, the Samsungs aren't, but even so I leaned slightly towards the image quality of the Samsungs, in particular the 22" over the 24".

PS. Update 25.03.2016: Recently discovered a very good Computer Monitor rankings list compiled by Chip.de. Not surprisingly, at this time pretty much all their top monitors are IPS panels.