Fluvio Labenti

( flowing stream )

Saturday, May 02, 2020

Some love for your books, please

It really drives me a little crazy every time I see a book covered with self-adhesive plastic. Here's an example borrowed long ago (before the lockdown) from the local library in Bonn. Click any of the photos for a full-size view:

Self-adhesive plastic, like Con-Tact or other brands, will stick to your original book cover permanently. Nothing against Con-Tact et al, just against adhesive plastic used like this on books.

The risky undertaking of removing self-adhesive plastic from a book (e.g. with heat guns or whatnot) will almost certainly cause some damage to the exterior of your original book cover. Not to mention the cover coloration changes caused by the glue, even if perfect removal were possible.

If you have ever covered any of your books with self-adhesive plastic, I am here to tell you that you should not do that ever again. Ever.

Instead, use non-adhesive plastic and the zero-damage technique described here, which:

  • Does not alter the book at all, never letting any glue or sticky thing touch any surface of your precious book whatsoever (<= this is the single most important characteristic.)
  • Because of the above, is easily removable and hence replaceable, without causing any damage to your book.
  • Protects the book perfectly.

Needless to say, I love books. I have actually done and enjoyed some DIY bookbinding, book cover reconstruction, and book restoration myself. But besides that, I have been protecting most of my books this way for as long as I can remember: for sure since my undergrad university days, likely since some time in high school, possibly even primary school.

First of all, the needed materials. This could not be easier:

  • The book you want to protect and cover
  • Non-adhesive plastic of your choice, here using a very nice frosted/textured one
  • Some Cell-o-tape
  • Scissors 
  • Optional: some Tupperware-like container (bear with me.)

And here are the steps:

1) Pre-cut all the pieces of cell-o-tape to be used. We will need exactly 12 pieces, each about 3-4 fingers long. Have them ready to be pulled off and used when needed. I normally stick them all onto the edge of a Tupperware before starting:

2) Cut a rectangle of the plastic to use as the cover, so that when folding it around your book you get at least 3-4 fingers of plastic beyond all the borders of the book. Something like the following:

3) Starting with the back, fold the plastic onto the back cover of the book:

4) Apply pieces of cell-o-tape on both ends of the folded plastic, so that it sticks not to the book, but to the plastic itself. This is the whole essence of what we want to do: never sticking anything to the book directly. As you'll see, absolutely all the cell-o-tape pieces are to land completely on the plastic, never on any part of the book:

5) Do the same folding for the front part, and stick the cell-o-tapes there as well, just as we did for the back:

With that, we are four little cell-o-tape pieces down; eight more to go.

6) Now fold little triangles inwards at each corner of the folds, and then apply tape to each, again landing the tape completely on the plastic. You can leave a couple of mm between the triangle and the border of the book, as shown here:

After all four corners are done, we are now eight little tape pieces down; only four more to go. The work so far will look like this:

7) Near the spine of the book we now need to make cuts in the plastic, for the top and bottom folds that will go onto the inside of the book covers. Two cuts for the front and two for the back, so four cuts in total. This should preferably be done slightly away from the spine edge, to make the next step (#8) easier:

8) Now, starting say from the top rear, fold the plastic piece onto the inside of the back cover of the book. A ruler, or something similarly hard and flat, helps a lot in doing this properly. Notice how I sort of pull and pin the plastic down onto the plastic already folded onto the book cover from the side in an earlier step:

9) And here comes the magic trick again: apply another little piece of cell-o-tape, this time diagonally, so as to bind the plastic fold coming from the top with the plastic fold coming from the side of the back cover. Yet again: the little sticky tapes are always placed so that they land completely on the plastic, never letting any part of them touch any part of the book. They simply hold some of the plastic against some other part of the plastic:

10) Repeat the previous step for the remaining corners, and after that, we are done with the 12 little pieces of cell-o-tape. The work now looks as follows:

11) Now we must finish those little wings of plastic left hanging out from the ends of the spine of the book. There are two ways to proceed, depending on whether the book is a soft- or a hard-cover one. Here are both scenarios:

11-A) For soft-cover books: simply trim those little wings of plastic near the spine of the book, and our work is done:

Ta daaaaa!

11-B) For hard-cover books: some hardcover books have no space between the spine and the spine cover (they might be glued together.) Or, even if they are separate, the spine might be too narrow. In those cases, simply proceed as if it were a soft-cover book: just cut the plastic as in step 11-A above.

If there is space between the spine of the book and its cover, and the spine is wide enough, then cut the sides of those little plastic wings so that you end up with single flaps that can be folded into that space. It is convenient to trim the plastic so that the remaining pieces end up slightly trapezoidal: wider near the spine's edge, and narrower at the end of the plastic, as shown here for the top flap:

Folding the final little plastic flap into the space between book spine and spine cover:

Push it all the way in, and then, with your fingers, press the plastic softly onto the edge of the spine cover, so that it stays bent and stays put. The same goes for the bottom flap, of course. The end result for a hard-cover book should look like this:

Here are a couple of additional photos of this hardcover book showcasing the non-adhesive, zero-damage plastic cover technique:

And here are a few more books already covered and protected this way. The yellow one on top is covered not with the frosted plastic but with a completely clear one, so it's hard to see in the photo:

This kind of protective cover is not only super durable but also perfectly replaceable. If after many years and/or lots of use the plastic cover looks worn or bad, simply remove it carefully by cutting the tapes at the corners, unfolding the plastic all the way, and taking it off. Then go through the process described here once more, covering the book again with brand-new plastic and the same technique: no tiny sticky bit of cell-o-tape ever touching any surface of the book or its cover anywhere. No alteration, no damage whatsoever. Just protection.

Why some libraries keep covering books with self-adhesive plastic really beats me. In any case, I hope this little guide will help you better protect your beloved books.

PS. Step 12) Clean your Tupperware! ;)

Saturday, February 22, 2020

The ultimate solution to GPU sag

If you are looking for the ultimate solution to GPU sag, you came to the right place.

Here are four other working solutions to GPU sag, as suggested by well-known tech YouTubers Paul's Hardware and JayzTwoCents. Basically:

#1 (Paul's): PCIe power cables routed upwards and pulling a bit.
#2 (Paul's): buy and install a GPU support bracket
#3 (Paul's): put a toy or some other supporting object right under the saggy corner of the GPU.
#4 (Jay's): install a little M3 screw through the back of the case, right above the GPU tab that is diagonally opposite the saggy corner of the GPU.

With respect to suggestion #4, I tried it on my own PC and it caused my GPU to overheat massively (+20 ºC). My explanation is that the torsion imparted by the little screw may have warped the GPU board enough to compromise the contact between the GPU die and the GPU cooler. Fortunately, once I removed the little M3 screw the GPU returned to its normal thermal behavior, but I would not recommend option #4. I wrote a comment about that on Jay's video.

In any case, none of those four solutions is satisfactory when you don't just want to avoid GPU sag, but really need to immobilize the GPU. For example, when you want to ship the PC internationally with a (relatively wobbly) GPU installed. Which is exactly what I needed to do.

Recently I built a mini ITX system for my sister using the following components:
Case: Cougar QBX
PSU: Cooler Master MWE White 450 W
Motherboard: Gigabyte B450 I Aorus Pro Wifi
CPU: AMD Ryzen 5 3600
CPU Cooler: Noctua NH-L9x65
RAM: Corsair Vengeance LPX 2x8GB 3000 MHz
SSD: WD Blue SSD M.2 Sata 500 GB + Crucial 1 TB Sata
GPU: Asus Strix GTX 1070

The system had to be shipped from one country in Europe to another, and my sister wants absolutely nothing to do with electrical stuff. So she would not welcome the idea of receiving the PC plus something separate (the GPU), and having to open the PC and install said separate thing somewhere inside it somehow. No siree, nope, not a chance. She would gladly rather wait for my next visit, for me to do that installation myself.

So either I shipped them separately and installed the GPU on my next visit, or I found a way to really immobilize that sag-prone, wobbly corner of the GPU and shipped the full PC ready for her to power up. I wanted the latter, and that's what I did.

Here's the process in photos.

First, let's look at the sag-prone corner of the GTX 1070, the GPU in this build. (Ignore the little red electrical tape on the power connector; I installed that just to dim the blinding white LED of the card.) Notice the top little corner of the card's backplate, highlighted in the red circle. That is a structural spacer between the actual GPU board and the plastic ROG backplate on top. One appropriate way to immobilize this specific GPU would be to somehow clamp this point of this corner safely:

The same corner seen from below shows that there is a screw head holding the spacer. Notice also that there are tiny delicate electronics very close by:

The screw head is taller than the tiny components, but just in case I decided to protect all of that with at least five layers of electrical tape, actually not just there but also on the back plate side, so both sides of what would be eventually clamped:

And the following photo shows my little GPU stabilizer solution, with the materials I used. The aluminum flat bar is 2 cm wide, and 2 mm thick. The assembled little screw rod shows the nuts and washers I used. The little black piece at the bottom represents what would be the bottom of the PC case, while the two white cloth pieces represent the aluminum pads that would clamp the now electrical-tape protected, sag-prone corner of the GPU on the previous photos, here represented by a little USB stick drive placed between the white pads:

Here is the rod already installed at the bottom of the Cougar QBX case. Please be aware that drilling into a PC case is best done either with no components in it, or while very carefully isolating/protecting every remaining component in the case, so that no metal dust/debris falls on them. Such debris can easily cause short circuits and damage your components, so watch out!

Here are some photos showing the construction of the clamping pads. These required a Dremel and some metal files. The final size and shape are a matter of taste. The important thing for me was to have some placement flexibility; that's why I made that inner slot instead of just a hole for the screw rod. I also applied several layers of electrical tape to the pads' tips, for additional cushioning and isolation between the clamp pads and the GPU corner:

And finally, here is the GPU stabilizer fully installed, holding the formerly hanging corner of the GPU steady.

This little stabilizer actually does quite a bit more than just completely removing GPU sag: now there is no GPU movement whatsoever.

Some additional notes/tips about the final installation process:

First I laid the PC horizontally on its back side, so that the whole weight of the GPU rested on the PCIe slot and the GPU therefore had no sag whatsoever. That is the exact position of the GPU to preserve and immobilize with the clamp.

All the bottom-pad elements of the stabilizer should be screwed on loosely, further down the rod: the bottom clamping pad, and the nuts and washers under it. By the way, as you can see in the photos, I used a total of three washers under the bottom pad: a large one right under the pad, then a smaller split lock washer, then a normal washer between that one and the bottom nut.

The top clamping pad can be fixed first, independently of the bottom one. Notice that it prevents the GPU from moving further upwards. That pad can be tightened fairly strongly, since it's just a stop, and it can be tightened while merely in contact with the electrical-tape-covered corner of the GPU's backplate: not pushing it down at all, just keeping it exactly where it already is while preventing it from going anywhere further up. The internal nut and washer under the top clamping pad not only allow this independent fixing of the top pad, they also help replicate, near the rod, the thickness between the clamping tips touching the board.

After the top clamping pad is properly placed and fixed, the bottom pad should be brought into contact with the underside of the board's corner. Then the nuts and washers under the bottom pad can be slowly and gently tightened until the whole arrangement becomes rigid, clamping and completely immobilizing the GPU.

So there you have it: the ultimate solution to GPU sag. Not just a GPU sag killer: it immobilizes the GPU for good. The system can now be shipped safely and worry-free, even with the GPU installed.

Another similar solution would have been to use not clamping pads but a simple horizontal stick somehow attached to the vertical rod, and then immobilize the cables of the GPU power connector by attaching them to that stick, maybe with tie-wraps. But I eventually favored the clamping approach, since I thought it would provide a lot more precision, stability, and rigidity.

Note of caution: your mileage may vary depending on your specific GPU, how heavy it is, and how its sag-prone corner is actually constructed.

PS. I replaced the main grill with a DIY one with larger holes, for better GPU breathability. Made of zinc-plated steel, it was rather too shiny, so I painted it with black thermal spray paint, which, once dry, required curing in the oven at 180 ºC for one hour to harden:

Thursday, January 30, 2020

When to upgrade your PC, a golden rule?

After basically thinking aloud in some comments on this YouTube video from BPS Customs, I thought I would elaborate further here on my own blog.

If you have a cell phone, a gaming PC, or if you have ever used any computer for that matter, you have very likely experienced the harsh realities of tech obsolescence. Science and technology improve on a daily basis, often quite drastically. Give it just enough time, and you are left with a gadget that is only a few years old while newer gadgets are way more powerful, or support newer standards, protocols, and connectors that yours does not, even though yours was probably not exactly "cheap" when you got it not that long ago. Sound familiar?

There's nothing we can do about this, except to upgrade to some newer, more powerful gadget at some point in time, whenever we decide to do so. But when is it a good time to do so? Not always an easy decision.

Isn't your gadget/computer still powerful enough? Do you really need the newer one already? Is the newer one really that much better/faster/more capable? Can your current "old" rig not serve you well enough for some more time, before you drop all that cash on the newer stuff, which will inevitably go through the same aging process anyway?

Tough call. Tough call.

Everyone can approach such upgrade decisions their own way. Each of us has our own interests, priorities, and most importantly, pockets. If you have money to burn, simply get the latest and best equipment you want whenever it becomes available, or whenever you want, and be done with it. No choice paralysis whatsoever :) But plausibly many people do not have deep enough pockets for such an approach. Others may want to use their resources in an efficient, sustainable manner. In any case, when to upgrade? Is there a golden rule?

Here is a rule I have sort of internalized to guide my own decision making, trying to optimize the use of my money and the usability of my existing rig, while still keeping a current, well-performing system.

Technology obsolescence aside, my golden rule is the following:

Consider upgrading only when you can get 2x the performance for the same price you paid last time.

Note that this rule is not at all the same as suggesting you upgrade when you can get similar performance for half the price. Those are two completely different things, and here's a concrete example why: right now an AMD Radeon RX 5700 XT graphics card (currently a best value in the mid-to-upper-range GPU category) offers about the same performance the "old" Nvidia GTX 1080 Ti did three years ago, at somewhere near half the price (considering European prices.) But there is simply no option right now that offers 2x the performance of a 1080 Ti for its original price. GPUs have not evolved that quickly. So even when you can get the same performance for half the price, you don't necessarily get double that performance for the same price.

Updates / corrections early Feb. 2020:

With respect to CPUs, the AMD Ryzen 7 3700X currently does provide about twice (in fact 2.1x) the multi-threaded performance of the now 4+ year-old Intel i7-6700K, for about the same price or even slightly less. Such an upgrade would perfectly exemplify the application of this golden rule, if you mostly care about multi-threaded performance, that is.

Notice that the rule can be applied even if planning to jump up to a much higher performance class of equipment.

The AMD Ryzen 9 3900X CPU offers about 3x (more exactly 2.9x, according to PassMark's CPU Mark) the multi-threaded performance of the i7-6700K, but at a higher price, more exactly, 1.5x the price in the US market. So as of early February 2020, their relative performance/cost ratio is really 1.9x, close to but not 2x quite yet. (Cost of the 3900X would need to get to about or below $450 to match that 2x ratio.) Still a clearly beefy upgrade.

For the AMD Ryzen 9 3950X, the relative performance/cost ratio gets worse, at 1.34x-1.5x, because of its much higher cost (2.2x) over the older Intel, while its multi-threaded performance (3.2x) is only slightly higher than the 3900X's (2.9x). The higher the performance jump you aim for, the louder the law of diminishing returns will scream at you: the ~10% extra performance the 3950X offers over the 3900X costs about 47% more (2.2x vs. 1.5x the i7's price).
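The ratio arithmetic above can be sketched in a couple of lines of Python; the function name is mine, and the performance/price multiples are the rough PassMark-based figures quoted above:

```python
def upgrade_ratio(perf_multiple, price_multiple):
    """Relative performance per dollar of the candidate part vs. the old one.

    A value of 2.0 means twice the performance for the same money,
    i.e. the golden-rule threshold.
    """
    return perf_multiple / price_multiple

# Ryzen 9 3900X vs. i7-6700K: ~2.9x the multi-threaded performance at ~1.5x the price
print(round(upgrade_ratio(2.9, 1.5), 2))  # 1.93 -> close to, but not quite, 2x
# Ryzen 9 3950X vs. i7-6700K: ~3.2x the performance at ~2.2x the price
print(round(upgrade_ratio(3.2, 2.2), 2))  # 1.45 -> diminishing returns
```

Same framework, different threshold: if you decide 1.5x is enough for your needs, just compare the ratio against 1.5 instead of 2.0.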

If your system has some different old parts, or if you are eyeing different new parts, the situation might be different. Also, if you have some urgency to upgrade (e.g. you need support for some new standard, you want some new feature, or you just want a better gaming experience,) then you could shrink that 2x to your taste: say 1.75x, 1.5x, or even lower. Up to you and your needs. This golden rule at least gives you a good framework for keeping in check how much you would be spending versus how much, or how little, extra performance you would be paying for yet again.

Technology improves quickly enough, so I think it's not worth it to use inflation-adjusted costs when applying this golden rule. But as they say, your mileage may vary, so don't quote me on that ;)

Tuesday, September 24, 2019

TV Brand ranking update: two+ years later

Finally, here is the post promised two posts ago, and also an update (2+ years later) to the TV rankings from the last post.

This update considers the following use cases from RTings.com:

1. 4K Gaming
2. PC Monitor
3. Sports
4. HDR Gaming
5. TV Shows
6. Movies

The Best Outdoor TVs category was not included in my ranking calculation because a full listing with a specific single score for that category seems to be missing.

The scoring works the same way as in the last post: for each of those six RTings.com usage categories, I simply find the first position at which a TV brand appears on the corresponding scoring table. There are six categories, so there will be exactly six such numbers for each brand. Those six numbers get averaged per brand, and that average is the brand score. Simplistic interpretation: the lower the score, the better the brand.

As of today I get the following results, clustering the brands in tiers somewhat arbitrarily and manually by proximity:

Tier 1:
#1: LG with 2.17 (S.Korea)
#2: Samsung with 5.00 (S.Korea)
#3: Sony with 6.00 (Japan) 

Tier 2 (updated: this tier has spread-out scores, but no need for three tiers really):
#4: Vizio with 14.50 (USA)
#5: Hisense with 25.33 (China)
#6: TCL with 39.17 (China)

(Keep in mind that, for example, Panasonic cannot be included in this ranking because RTings.com does not review Panasonic TVs. Clearly it also does not review every TV on the market.)

There is quite a bit to say about this update to the rankings. Two years ago, or rather almost three, LG (South Korea) was #1 and Sony (Japan) was #2. Right now South Korea is hogging the top two places all for itself with LG and Samsung. The latter, which has been pushing QLED as the allegedly better technology against OLED for the last few years, not only crawled up from #4 to #2, surpassing Vizio and Sony, but is now allegedly also preparing an OLED offering: a sort of hybrid between OLED and Samsung's own "quantum dots". That ought to shake things up in Tier 1, especially between the two South Korean giants.

As of today, the general consensus still is that OLED TVs offer the best image quality. And regardless of the OLED TV brand, all of them currently have a South Korean, LG-manufactured OLED panel. The technology offering the next best image quality after OLED is QLED, as from Samsung at #2 above. South Korea all over the iron throne.

Japan on the other hand, well... it's complicated.

Even if [1]: Japan was the first to announce plans for 4K as well as 8K broadcast TV;
Even if [2]: first Pioneer (Kuro plasma TVs), and then Panasonic (which eventually acquired all of Pioneer's plasma patents), both from Japan, brought plasma TV to the highest consumer image quality levels ever seen before OLED;
Even if [3]: Panasonic has won some international TV face-off competitions with its OLED TVs against Sony and LG;
and finally,
Even if [4]: Sony is now showing off an almost 6-million-dollar, super huge modular display... 

In spite of all that, again: Sony as well as Panasonic, any other Japanese brand (Toshiba and Sharp come to mind), and indeed every other brand in the world currently still depend on LG panels for their OLED options. Quite a position of power over the industry for South Korea. Of course, the display panel is not everything in a TV. The image processing circuitry feeding images to that panel is crucial, and Sony as well as Panasonic seem to hold some major tech strongholds there behind the panels, in particular with respect to motion control, color/shade gradations, and the choice of tone-mapping curves for HDR.

Vizio (USA) is offering great budget/best-value options, catching up as far as image quality goes. And Hisense and TCL (both from China) now appear in the rankings, also seemingly catching up in image quality with very competitive budget options.

While the rumor mentioned above about Samsung considering an OLED offering circulates, there is now also growing hype around the upcoming MicroLED technology, which should match OLED's perfect blacks while offering much higher brightness levels, with no permanent burn-in risk whatsoever. So basically, it would combine the best features of the two current best technologies (OLED and QLED) while completely overcoming their respective shortcomings. Sounds like a holy grail, but we'll have to see how it delivers, and most importantly, how much it will cost.

I find the modular approach to building very large displays a particularly interesting development. The super huge screen from Sony is shown at the beginning of this post. Both Sony and Samsung have recently shown off prototypes based on that approach, and it really sounds very promising. It would allow consumers to flexibly and progressively build up and "grow" their desired screen size whenever they are ready to do so, to whatever larger size they want. Say you start with a modest 55" screen, but then over time, assuming enough money and space, you make that grid become a 100", 200", or even larger display covering an entire wall, not by replacing your TV, but by adding more "screen tiles" to your existing one. Notably, apparently no one ever complains about a screen being too large; quite the opposite. So screen size plays a major role in the consumer market. This modular approach might turn out to be a clear win-win for manufacturers as well as consumers.

Some sort of calibration will hopefully take care of proper brightness and color uniformity across all the tiles in those grids at all brightness levels, even if the tiles come from production batches finalized years apart. Those uniformity issues might be one of the main problems for this modular/incremental approach, together with the difficulty of achieving perfectly invisible (or "seamless") joints between adjacent tiles. In any case, when such a grid/tile-based display becomes a desirable option even for colorists and filmmakers in need of professional reference monitors for their work, only then will we know that these displays are among the very best that technology can offer for ultimate image quality.

Until then... 

Well, let's at least wait for MicroLEDs, even if not modular, and let's also wait for that new hybrid OLED-QLED offering from Samsung.

Before all that, this updated ranking based on RTings.com scores might give an approximate idea of how the biggest players currently stand against one another with respect to TV technology.

Friday, May 05, 2017

My TV brand ranking

A diversion from the post I promised last time, although significantly related. This post is the result of an exchange of comments in this YouTube video.

The idea was to elaborate on the current supremacy of OLED over LCD/LED TVs, which right now basically means LG over all other brands.

But Sony and others now have OLED offerings too. They all use LG OLED panels, mind you, so credit where credit is due. Still, we might want to compare the scores of two OLED TVs from different brands even if they all use LG panels, because obviously, beyond the panels, not all other things are equal.

Yesterday I submitted a comment with a special ranking I made for myself about TV brands. My ranking calculation works the following way:

Rtings.com has six usage categories for TVs. Here they are with links to the corresponding pages that include the full scoring tables for several TV models and brands:
1. HDR Gaming
2. Movies
3. PC Monitor
4. Sports
5. TV Shows
6. Video Games

For each of those six Rtings.com usage categories, I simply find the first position where a TV brand appears on the corresponding scoring table. There are six categories, so that means there will be exactly six such numbers for each brand. I average those six numbers, and that's the brand score. Simple.

Keep in mind that for a given brand, its best-scoring TV model in one usage category may not be its best-scoring TV in another. And a given brand X may have a TV in the first position while all the other top TVs for that category, from position #2 through #50, are from brand Y. But we will not worry about any of that, because we are trying to rank the brands themselves, not specific TV models. And we rank the brands just by averaging the top positions they achieve in these categories, nothing else.

Yesterday I submitted a comment to that YouTube video, showing my ranking of the top four TV brands (LG, Sony, Vizio, and Samsung) calculated using this scheme. The scores per brand were the following:

LG: 1.0
Sony: 4.5
Vizio: 11.5
Samsung: 13.0

But that was yesterday. Incidentally, today RTings.com published their review of the newly released Sony A1E OLED. That's why I'm posting this on my blog: I had to recalculate my brand rankings, and I figured the whole thing was a bit too long for a YouTube comment :P

Well here is the update. In each usage category, the top standings per brand right now are the following:

HDR Gaming:
 First LG at #01: EG9600 (Score 8.9, 2015)
 First Sony at #02: A1E (Score 8.9, 2017)
 First Vizio at #10: P Series 2016 (Score 8.4, 2016)
 First Samsung at #12: Q7F (Score 8.4, 2017)

Movies:
 First LG at #01: LG C6 (Score 9.4, 2016)
 First Sony at #02: A1E (Score 9.0, 2017)
 First Vizio at #08: P Series 2016 (Score 8.7, 2016)
 First Samsung at #13: JS9000 (Score 8.0, 2015)

PC Monitor:
 First Sony at #01: A1E (Score 8.7, 2017)
 First LG at #02: C7 (Score 8.7, 2017)
 First Vizio at #07: P Series 2016 (Score 7.9, 2016)
 First Samsung at #17: Q7F (Score 7.3, 2017)

Sports:
 First Sony at #01: A1E (Score 8.4, 2017)
 First LG at #02: C7 (Score 8.4, 2017)
 First Samsung at #14: Q7F (Score 7.8, 2017)
 First Vizio at #18: P Series 2016 (Score 7.6, 2016)

TV Shows:
 First LG at #01: C6 (Score 8.4, 2016)
 First Sony at #04: A1E (Score 8.3, 2017)
 First Samsung at #15: Q7F (Score 7.7, 2017)
 First Vizio at #23: P Series 2016 (Score 7.3, 2016)

Video Games:
 First LG at #01: C7 (Score 9.0, 2017)
 First Vizio at #02: P Series 2016 (Score 8.9, 2016)
 First Sony at #03: X850E (Score 8.9, 2017)
 First Samsung at #09: MU8000 (Score 8.5, 2017)

Averaging all those top positions achieved per brand, the updated scores as of today end up being the following, TA DAAAAAA!!!:

LG: 1.3 = (1 + 1 + 2 + 2 + 1 + 1)/6
Sony: 2.2 = (2 + 2 + 1 + 1 + 4 + 3)/6
Vizio: 11.3 = (10  + 8  + 7 + 18 + 23 + 2)/6
Samsung: 13.3 = (12 + 13 + 17 + 14 + 15 + 9)/6
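For the record, the averaging can be reproduced in a few lines of Python, using the first-appearance positions from the standings listed above (in the category order HDR Gaming, Movies, PC Monitor, Sports, TV Shows, Video Games):

```python
# First position at which each brand appears, per RTings.com usage category,
# taken from the standings listed above.
positions = {
    "LG":      [1, 1, 2, 2, 1, 1],
    "Sony":    [2, 2, 1, 1, 4, 3],
    "Vizio":   [10, 8, 7, 18, 23, 2],
    "Samsung": [12, 13, 17, 14, 15, 9],
}

# Brand score = average of those six positions; lower is better.
scores = {brand: round(sum(p) / len(p), 1) for brand, p in positions.items()}

for brand in sorted(scores, key=scores.get):
    print(f"{brand}: {scores[brand]}")
# LG: 1.3, Sony: 2.2, Vizio: 11.3, Samsung: 13.3
```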

So even though Sony is using LG panels in its TVs, it does not seem to beat LG's own TV offerings for now, according to this ranking. LG and Sony seem fairly close though, so in a similar league, with LG on top; they could be regarded as Tier 1. Vizio and Samsung, however, score far below them, so they could represent Tier 2. Within that second tier the two are close, but Vizio seems slightly ahead of Samsung.

The current TV Brand rankings therefore:

Tier 1:
#1: LG

#2: Sony

Tier 2:
#3: Vizio

#4: Samsung

And that's it for now. Still planning to write what I promised at the end of my previous post.

Tuesday, November 01, 2016

Isn't 8K too much?

It requires a combination of at least three genetic flukes for humans to achieve extreme levels of visual acuity:
1) Perfect ocular shape and optics to begin with
2) Higher than normal cone density on the retina
3) Outstanding transparency inside the eye

Only a very small percentage of people get all three flukes combined, so it's very rare. Yet we should keep in mind that there are now quite a few billion of us on the planet...

Remember that some birds like falcons have "ordinary" visual acuity in the order of 20/2, or about 10x sharper than the "normal" 20/20 vision of humans. But even keeping the pride of our species with respect to eyesight sharpness in check, it is actually not so rare to find people with better than 20/20 vision. Let's see briefly how "not rare" that is.

Roughly, about 35% of the adult population has at least "normal" or 20/20 vision without glasses. But close to 10% of the US population has 20/15 (better-than-normal) vision. (Later addition: a nice distribution of visual acuity is shown in Fig. 4 of this other paper.)

In fact, 1% of the population achieves 20/10 vision. That's twice as good as "normal vision." The human record seems to be even slightly better: around 20/8. That means, being able to read at 20 meters what most (those with 20/20 vision) can only read at 8 m or less.

On the other hand, approximately 64% of adults wear glasses, at least in some developed countries. We can imagine that the eyesight of glasses wearers, with their glasses on, falls roughly on a normal distribution peaking around 20/20. So even if just 1/3 of them (us) can see slightly better than 20/20 with glasses on, that would represent about 20% of the total adult population. That may be a bit too optimistic, so to be conservative, let's make it just 10%.

Adding that to the 10% who already achieve at least 20/15 vision without glasses, we can estimate that roughly 20% (about one in every five people) exhibit a visual acuity that is clearly better than the "normal" 20/20. Notice, that's regardless of whether they wear glasses or not.

This study mentions average FVT visual acuities of 1.82, so closer to 2x the "normal" 20/20. Not sure what population samples they used there though.

So as a sort of disclaimer, in spite of my long previous post explaining when and why 4K might not offer any visible improvement over Full HD or even plain HD, we should not forget that there are cases in which the benefits of a higher resolution can indeed be seen, and be pertinent and enjoyable. The obvious examples: you simply sit closer than the ideal viewing distance for your visual acuity, and/or you do in fact have better than normal visual acuity, which as we just saw, is not so rare after all.

But keep in mind, that is not really a case in favor of 4K or 8K or even higher resolutions.

And now to honor this post's title: Japan's public broadcaster NHK has recently announced TV broadcasting at 8K.

Well, nice try, Japan. But isn't that too much?

Let's be clear: an absolute given resolution is never "better" or "wrong" or too much or too little in and of itself. Again, it might be too much, or satisfactory, or too little, depending on a combination of factors, namely pixel size, viewing distance, and visual acuity (check said previous post for all the details if needed.)

As if it wasn't already obvious from my posts on resolution, I don't think 8K for broadcast TV is such a great idea, even for Japan (which pioneered the use of higher resolutions for broadcast TV, many years before anybody else in the world,) and even for those lucky few with the three flukes combined and outstanding 20/8 vision. A very high resolution can be adequate in some use cases, but it can just as easily not be, and be a big waste. And there's quite a lot more to high resolutions than just being potentially useless, unnecessary, or wasteful in some common cases.

Higher resolutions are very costly in terms of compression and bandwidth requirements (which in turn can deteriorate image quality very quickly, and also most horribly and catastrophically, when not properly taken care of.) For a given bandwidth, the higher you go in resolution, the more you must sacrifice frame-rates, which deteriorates the fluidity of motion. This has major implications for anything with fast-moving images, like sports broadcasts, or in particular, video games. But most importantly, resolution is only secondary to contrast and color when it comes to ultimate picture quality.

The current trend among cell phone manufacturers, offering flagship models with cameras that have smaller pixel counts than older models (even some older non-flagship ones,) yet deliver higher image quality, should already give a clear hint: people are starting to care about better pixels instead of more pixels, and are not falling so easily for the earlier and simpler "more pixels = better" marketing bull.

Well, but Japan, or at least NHK, seems to think otherwise. (Japan's Sony, on the other end of the bluff spectrum, recently offered its flagship Playstation 4 Pro console with a rather weak and disappointing claim in the 4K gaming arena.)

In any case, let me leave it at that for now as far as this post goes. I'll be talking more about Japan soon in an upcoming post, not only about this specific 8K move, but also about the history of TV and the current standing of Sony and Panasonic (Japan) vs. Samsung and LG (South Korea.)

There was a slogan advocated when the CD standard was finalized: "Perfect audio forever." With respect to picture quality, we could say the ultimate aim has been analogous for a long time: Perfect Image Quality Forever. Manufacturers and technologies have gone through ups and downs, but they have been moving overall in that same direction. The fact is, display manufacturers and technologies have been able to provide outstanding, never-before-seen picture quality in consumer-level displays this very year, in 2016. Pretty much anything from 2015 and before has clearly been left in the dust and will very soon be obsolete. There are very good reasons for excitement about display technologies and picture quality precisely right now and from now on, and that is really great news. (But 8K broadcast TV is not part of that great news, imho.)

As a sneak peek, I'd like to quote DisplayMate's assessment of a 2016 flagship OLED TV. (For the record, I have absolutely no relationship with any of the companies mentioned in these posts.)

"In terms of picture quality the LG OLED TV is Visually Indistinguishable from Perfect. Even in terms of the exacting and precise Lab Measurements it is close to ideal, and it breaks many TV Display Performance Records. (...) far better than the best Plasma TVs in every display performance category, and even better than the $50,000 Sony Professional CRT Reference Studio Monitors that up until recently were the golden standard for picture quality. In fact, based on our detailed lab tests and measurements the LG OLED TV has the highest Absolute Color Accuracy, the highest Absolute Luminance Accuracy, and the highest Contrast Ratio with perfect Black Levels of any TV that we have ever tested, so it even qualifies as a Reference Studio Monitor."

Did you notice the bold text there? These pros talk about picture quality, and mention things like color, luminance, contrast, and black levels... But they don't even mention *resolution* there. Hmm... Wink wink ;) 

"Perfect", or let's say at least technically flawless displays are already available and might be bound to become pretty much a commodity rather soon, on cellphones as well as on computer monitors and large panels/TVs. Content makers and distributors have to bring up the image quality of the content they offer accordingly, no doubt about that. But to get there, moving up to wider contrast and wider color space standards is much more important than bringing up the resolution at the likely expense of frame-rates.

In any case, more on all of this in the next post.

Saturday, October 01, 2016

When does a screen have "too much" resolution?

(Note: This entry is a translation from the original post in Spanish, written back in Oct/2013. Re-posting it in English because I'll be writing some things shortly also in English, about the current trends in display technologies and the consumer TV market, 
OLED vs. LCD, HDR, wider Color Gamuts, and so on. That post will likely refer to things covered here, so wanted to have this text already in English.)

4K resolution, also called "4K Ultra HDTV" or "Quad HD," is the resolution offered by the latest-generation TVs. This resolution is equivalent to approximately 4x the resolution of Full HD or 1080p (see the image above.) 4K is 3840 x 2160 (or even 4096 x 2160 pixels), a little more than eight million pixels in total. Quite a few pixels! But is it really useful to have such a high resolution on our televisions? That depends on several things, and that is the topic of this post. The idea is to inform you so that you won't spend a fortune on something that possibly you will not be able to enjoy or take advantage of, in spite of what salesmen or even other consumers would want you to believe.

Consider first a cell phone, such as an iPhone 5. Its screen size is only 4" (diagonal), and its resolution is 1136 x 640 pixels. Note that this resolution is relatively "low" in the sense that it's not even the minimum HD (which is 1280 x 720.) But this "low" resolution on such a small screen results in a very high pixel density: 326 ppi (pixels per inch,) or about 12.83 pixels per millimeter. In other words, the pixel size on this iPhone, assuming square pixels, is only 0.078 mm per side (less than 8% of a millimeter.)

As a marketing strategy, Apple gave a rather picturesque name to this pixel density of the iPhone: they called it retina display. The reason was that, in principle, our eyes, or our vision in general, cannot distinguish those pixels if we place the iPhone at a distance of at least one foot (30 cm) from our eyes. And without resolving the pixels, the image would then appear completely smooth rather than pixelated. What Apple is telling us here may actually be true or false, and that depends on our visual acuity.

In a previous post we saw that a person with "normal" or 20/20 visual acuity can distinguish between two parallel lines separated by one minute of arc, or arcminute. An arcminute is just 1/60th of a degree (and a degree is just 1/360th of a full circle.) An arcminute is thus a fairly narrow angle. How narrow? If we plot an angle of one arcminute starting from our eyes, the separation of the sides of that angle at a distance of six meters would be just 1.75 mm. (Remember, this is calculated using the circumference formula: [2 * pi * R / 360] / 60 = 1.75, where R is the radius of the circle, which in this case would be 6000 mm = 6m.)

About 30 cm away, the separation of the sides of an angle of one arcminute would be just 0.087 mm. Less than 9% of a millimeter. Ah! But there you go! Above we saw that the side of each pixel of the iPhone 5 has a length less than 8% of a millimeter, so in this case, pixels are a little smaller than what a "normal" visual acuity can resolve at a distance of 30 cm. That's the key! That's why in principle we can't resolve those pixels at that distance. Apple then did tell us the truth about the retina display, at least when a visual acuity no better than "normal" is assumed.
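The two numbers above (1.75 mm at 6 m, 0.087 mm at 30 cm) and the comparison against the iPhone 5 pixel size can be verified with a small Python sketch. The function name is mine, but the formula is exactly the circumference formula from the post:

```python
import math

def arcminute_separation_mm(distance_mm):
    """Width subtended by one arcminute (1/60 of a degree) at a given
    distance, via the circumference formula: (2 * pi * R / 360) / 60."""
    return 2 * math.pi * distance_mm / 360 / 60

print(arcminute_separation_mm(6000))  # ~1.75 mm at 6 m
print(arcminute_separation_mm(300))   # ~0.087 mm at 30 cm

# iPhone 5 pixel size: 326 ppi converted to mm per pixel
pixel_mm = 25.4 / 326
print(pixel_mm)  # ~0.078 mm, smaller than 0.087 mm: "retina" at 30 cm holds
```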

If you bring the iPhone close enough to your eyes, then you will distinguish the pixels, even if you have normal vision. (A 30-year-old can focus even at 15 cm, and a child even at less than 7 cm.) And if you have a visual acuity higher than normal, then you will be able to resolve the pixels of the iPhone even at 30 cm.

We see that resolving or not the pixels of a screen with a particular resolution will depend on several things. Those things are precisely the terms highlighted above in bold, namely:

1) Pixel size (which is derived from the screen size and its resolution)
2) Distance between our eyes and the screen
3) Our visual acuity

The final effect on our eyes will depend on these three factors. We can assume that our visual acuity is already the best we can muster (using glasses if we need to, for example,) so overall we cannot improve factor #3. Then we can only modify factors #1 and #2. Modifying #1 means a different screen size, or a different resolution, or both. Modifying #2 means changing the distance between the screen and our eyes.

Clearly, if we can distinguish the pixels on a given screen, then either the resolution is too low for that distance, or we are too close to the screen given its resolution. The fact is that if we start moving our eyes away from the screen, at some point we will reach a distance at which we can no longer resolve the pixels. Only then, given that screen and distance, and our visual acuity, could we say that that resolution is "satisfactory".

But then again, when do we have too much resolution?
(Remember, this is the key question concerning this post.)

We will have too much resolution "A" when, for the same screen size, there is at least one lower resolution "B" that will also **not** let us resolve its pixels at the same viewing distance.

That is because, if resolution A is greater than B, but neither resolution at distance X on screens of the same size allows us to resolve their respective pixels, then at that distance the images of A and B are completely indistinguishable (in terms of resolution) to our eyes, no matter how much finer resolution A is with respect to B. For that viewing distance, for that screen size, and for our visual acuity, resolution A would therefore be excessive and technically useless over and above B.

Let's elaborate a bit more.

Imagine we put many iPhone 5 screens together to build a single large 60" screen. That would require a lot of iPhones, in fact 15 x 15 = 225 iPhones. And do the math: the resolution you would get with that amount of screens (at 1136 x 640 per little screen) would be a whopping total of 17040 x 9600 pixels! That is more than 18 times higher than 4K. But ask yourself: wouldn't that be perhaps somewhat excessive and unnecessary? Well, given normal vision, we already saw that we cannot resolve the pixels on any of those iPhones when our eyes are just 30 cm away. How much further from resolving them would we be, when this 60" screen has pixels of the exact same size as those on the iPhones, and we are now to see them from, let's say, three meters, so 10x farther away?

In fact, a "normal" vision already **can not** resolve the pixels on a 60" screen with the "so much lower" 1920 x 1080 resolution (Full HD) from three meters away. Only getting closer than 2.38 m (7.8 feet) would allow you to begin resolving those pixels (this can be calculated similarly to what was explained above.) So at distances beyond 2.38 m, no "normal" vision will reap any "benefits" from this Super Ultra Ridiculous resolution 18+ times higher than 4K on a 60" screen, compared to a modest screen of the same size with a simple 1080p resolution. Our eyes at that viewing distance simply cannot see the difference between these two resolutions.

That is a hyper-exaggerated example, but I hope the idea comes across. A resolution can be absolutely excessive and completely useless to our eyes compared to some other much lower resolution, depending on our visual acuity, and the viewing distance.

Now back to 4K.

A 60" screen with a 4K resolution has quite small pixels. In fact, four of its pixels can fit inside one pixel of a 60" screen with 1080p resolution. At what distance can we resolve those 4K-60" pixels? Only at less than 1.19 meters (or ~3.9 feet; again, with normal vision.) So sit at 1.19 meters or farther away from that screen and you won't see any pixelated images; perfect! However, don't go sit beyond 2.38 m (7.8 feet) away from that screen, because then you will have paid for that higher 4K resolution for nothing. As we saw above, beyond 2.38 meters you already wouldn't be able to resolve the much larger pixels on a 4-times-lower 1080p resolution screen of the same 60" size. So if you are considering sitting beyond 2.38 meters away from a 60-inch TV, then it makes little sense to have it be 4K instead of 1080p, because a 1080p screen will look just as good at that distance (you won't even be able to resolve the pixels on the 1080p screen from there.)

What is more, if you sit beyond 3.57 m away (11.7 feet,) then it doesn't even make much sense to have a 60" 1080p TV, because at that distance you can no longer resolve the pixels in the 720p resolution (HD rather than Full HD) for that screen size. So all other things being equal, at 3.57 meters or more, a 60" 720p screen will look just as good (without pixelation) as a 1080p, and as a 4K the same size. Again, all this is assuming normal vision.
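All three distances mentioned so far (3.57 m for 720p, 2.38 m for 1080p, 1.19 m for 4K, all on a 60" screen with normal vision) can be computed with one little Python function. It's a sketch under the post's own assumptions: square pixels, a 16:9 screen, and "resolving" meaning one pixel subtends one arcminute:

```python
import math

def max_resolving_distance_m(diagonal_in, vertical_pixels):
    """Farthest distance (in meters) at which 20/20 vision can still resolve
    individual pixels, i.e. where one pixel subtends one arcminute."""
    # Screen height from the diagonal, assuming 16:9 (Pythagoras)
    height_mm = (diagonal_in / math.sqrt(1 + (16 / 9) ** 2)) * 25.4
    pixel_mm = height_mm / vertical_pixels
    arcmin_rad = math.radians(1 / 60)  # one arcminute in radians
    return pixel_mm / arcmin_rad / 1000

for v in (720, 1080, 2160):
    print(f'{v} vertical pixels on 60": {max_resolving_distance_m(60, v):.2f} m')
```

Running it reproduces the 3.57 m, 2.38 m, and 1.19 m figures from the text.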

Of course, we would need to calculate things for each screen size, resolution, and every viewing distance possible to see if the combination works and makes sense or can be recommended for our particular needs. But I don't need to do that, because others have done it already (click to visit and enlarge):

 Screen Size vs. Viewing Distance vs. Resolution

One way to use this graph: first choose the viewing distance you are considering. For example, if it's three meters (~10 feet), then locate the value 10 feet on the vertical axis to the left, and draw a horizontal line across the entire graph at that height. Then check screen sizes on the horizontal axis below, draw a vertical line from your screen size of interest, and see where it intersects that horizontal line you drew right before. Let's say, if you are considering 60" at 10 feet, the intersection between the two lines would fall near the tip of the red triangular area. Depending on the color where the intersection falls (blue, green, red, or purple), a given combination will make sense or not according to the descriptions associated with that color (text blobs on the right, both for triangular areas and for the boundary lines between them.)

In our example, the intersection is on the red area, and the description for the red area tells us that the benefits of 1080p would be noticeable. That means, from 60" viewed at 10 feet we are ok with 1080p. But it also tells us, it would not be a combination that would let us benefit from 4K; we would need a larger screen, or a shorter viewing distance, or both, to move the intersection towards the purple area in order to do so.

This graph allows us then to respond fairly easily to the question on the title of this post: when does a screen have too much resolution? Answer: when the intersection between the viewing distance and the screen size falls outside (most likely above) the color associated with that screen's resolution. Note, for example, that only when the intersection falls below the red line, only then we would observe benefits from a 4K resolution.

I'm sure you will be surprised to realize how close you have to sit to the screens (despite their large sizes) in order to truly reap the benefits offered by each resolution over the previous lower one. For example, at three or more meters away (10 feet or more,) a 50" 1080p screen hardly makes sense against a 720p, at least not for a "normal" vision. A 4K 60" screen would start to make sense only if you plan to watch it sitting at about midway between 5 and 10 feet, so about 7.5 feet, which is exactly the same distance we mentioned before: less than 2.38 m. But that would only be the distance at which normal vision "begins to notice the benefits" of 4K compared to 1080p. To really enjoy those benefits you would need to sit even closer to this large 60". Such close distances to large screens may be impractical, inconvenient, or simply uncomfortable for you. Or it may be the case that for your viewing distance of interest, the recommended ideal combination of screen size and resolution ends up beyond budget.

By the way, we have only discussed here viewing distance with respect to resolving pixels. But there is something else which is very important to take into account when choosing the ideal viewing distance for a particular screen, especially when it comes to watching movies, and that is the horizontal angle that the screen covers on our field of view. SMPTE's recommendation is about 30 degrees, and any value between 28° and 40° complies with the THX recommendation (and certification.) In general, for home theaters, it is advisable to use at least 20°. Many consider this more important than resolving or not resolving pixels, because sufficient visual field coverage increases the effect of "immersion" into the film, whether it's pixelated or not.

Here's an example that could be used as some sort of reference. For a 50" 1080p (Full HD) display, any viewing distance between 1.98 and 2.97 m (basically two to three meters, or 6.5 - 9.7 feet) matches what would be the three key criteria for optimal viewing of movies:

1) We are far enough to not resolve the pixels on the screen (with normal vision)

2) We are in the range of distances where we are effectively taking advantage of our screen's higher resolution (in this example 1080p) over the immediately lower resolution (720p)

3) We are at a distance that allows the screen width to horizontally cover between 20° and 40° of our visual field

For example, with a 4K - 60" screen, we would comply with #1 beyond 1.19 m, but we would need to sit no closer than 1.82 m to comply with #3. And as we saw earlier, we should not sit beyond 2.38 m to comply with #2. So it's important to realize how narrow the ideal range of optimal viewing distances gets for higher resolution screens. For a 60" 4K screen, it's between 1.82 and 2.38 m (between 6 - 7.8 feet). If we sit outside that range, we violate one or more of those three criteria above, and it would be best to change some of the variables at play: either the screen size, the resolution, or the simplest, our viewing distance.
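The three criteria can be combined into one Python sketch that returns the ideal viewing range for a given screen. Function and variable names are mine; the assumptions are the post's: 16:9 screens, normal (20/20) vision, and one arcminute as the resolving threshold:

```python
import math

def ideal_range_m(diagonal_in, vertical_px, lower_vertical_px):
    """Ideal viewing range (meters) combining the three criteria:
    1) far enough not to resolve this screen's pixels,
    2) close enough to still benefit over the next-lower resolution,
    3) screen width covering between 20 and 40 degrees of the visual field."""
    height_mm = (diagonal_in / math.sqrt(1 + (16 / 9) ** 2)) * 25.4  # 16:9
    width_mm = height_mm * 16 / 9
    arcmin = math.radians(1 / 60)
    d_resolve_own = height_mm / vertical_px / arcmin          # criterion 1 (min)
    d_resolve_lower = height_mm / lower_vertical_px / arcmin  # criterion 2 (max)
    d_40deg = width_mm / (2 * math.tan(math.radians(20)))     # criterion 3 (min)
    d_20deg = width_mm / (2 * math.tan(math.radians(10)))     # criterion 3 (max)
    return (max(d_resolve_own, d_40deg) / 1000,
            min(d_resolve_lower, d_20deg) / 1000)

lo, hi = ideal_range_m(60, 2160, 1080)
print(f'60" 4K: {lo:.2f} - {hi:.2f} m')  # ~1.82 - 2.38 m, as in the text
```

For the giant 100" 4K screen mentioned below, the same function yields roughly 3.04 - 3.96 m.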

For a giant 100" 4K screen, the ideal range complying with all three criteria would be between only three and four meters (about 10 - 13 feet.) In fact, a little narrower: between 3.04 and 3.96 m. But depending on your visual acuity, at 3.96 m (~13 feet) you are already risking not seeing any benefit from 4K over 1080p from a 100" screen. Better sit at the lower end of that range, just a little over three meters (10-12 feet). So yes, believe it or not, if your vision is normal, ideally you would sit just slightly beyond three meters away (~10 feet) from a giant 100" 4K screen.

In conclusion, there are cases in which resolution can be too high; effectively, unnecessarily, and uselessly too high, and it makes little sense to pay more for something that doesn't offer perceptible improvements over something cheaper. If you are looking for a TV or monitor, don't let all the fuss from salesmen and even from other consumers fool you about the alleged "remarkable and incredible benefits" any higher resolution is supposed to offer above lower resolutions. Take into account how far your eyes will be from that screen in your particular room and setup (that's possibly the most important thing,) and try to use the chart above (or the above formulas) and the three criteria mentioned here, to determine the combinations of resolution, screen size, and viewing distance that are truly convenient or even optimal for your particular needs and budget.

Additional Information:

1080p Does Matter - Here's When (Screen Size vs. Viewing Distance vs. Resolution)
Resolving the iPhone resolution
Optimum HDTV viewing distance (Wikipedia)
4K resolution (Wikipedia)

PS. The 4K resolution I mentioned as 4096 x 2160 is the one from Digital Cinema Initiatives (DCI.) 4K UHD (or Quad HD in the first image) is exactly equivalent to 4 times 1920 x 1080 (Full HD), or 3840 x 2160. In any case, they are very similar.

PS2. Refined some calculations on the post, and here taking the opportunity to explain an additional formula that might be useful. If you do not have the density of pixels per inch or per centimeter from the screen manual, you can get the pixel size by dividing the screen height by the number of vertical pixels on the screen. For HD that number is 720, for Full HD, 1080, and for 4K, 2160. To get the height of a screen, simply measure it, or from the diagonal use Pythagoras, knowing that the ratio of the screen is 16:9, that means the base of the screen is always equal to 1.777x the height. With this data and the diagonal D, we can easily calculate H the screen height: H = sqrt (D * D / 4.16). To convert that height from inches to millimeters, multiply by 25.4. For example, a 60" screen has a height of 747.2 mm, and pixels for resolutions of 720, 1080 and 4K on that screen would be 1.038 mm, 0.692 mm and 0.346 mm per side respectively.
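The PS2 formula translates directly into a few lines of Python; it reproduces the 60" numbers given above (747.2 mm height; 1.038 mm, 0.692 mm, and 0.346 mm pixels):

```python
import math

def pixel_size_mm(diagonal_in, vertical_px):
    """Pixel size (mm per side) from the diagonal and the vertical pixel count,
    for a 16:9 screen: D^2 = H^2 + (1.777*H)^2 = 4.16*H^2, so H = sqrt(D^2/4.16)."""
    height_in = math.sqrt(diagonal_in ** 2 / 4.16)
    height_mm = height_in * 25.4  # inches to millimeters
    return height_mm / vertical_px

for v in (720, 1080, 2160):
    print(f'60", {v} vertical pixels: {pixel_size_mm(60, v):.3f} mm')
```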