Whenever I go to an electronics store and see the displays of the new UHD 4K televisions, I just can't wait to get one, especially the new curved models, such as the Samsung JS9000 or the premium JS9500. The 4K demo shown in the store is dazzling, and the slightly curved design gives a pleasing sense of depth over the flat-screen alternatives.
I just have the urge to have one of those 65” UHD televisions set up in the comfort of my home, where I can revel in all its glory with my video games and Blu-rays.
But if I buy one now, I can't actually utilize all that the television has to offer. I'd pay upwards of $5,500 for something I couldn't really use until later next year. The newest generation of consoles only goes up to 1080p, and the 4K (2160p) Blu-ray library won't start to accumulate on store shelves until the end of 2015 at the earliest.
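To put that resolution gap in numbers, here's a quick back-of-the-envelope calculation (a minimal sketch using the standard panel resolutions, not the spec sheet of any particular set):

```python
# Standard panel resolutions: Full HD (1080p) vs. UHD "4K" (2160p)
full_hd = 1920 * 1080   # 2,073,600 pixels
uhd_4k = 3840 * 2160    # 8,294,400 pixels

# A UHD panel has exactly four times the pixels of a 1080p panel,
# so 1080p content can only ever drive a quarter of them natively.
ratio = uhd_4k / full_hd
print(ratio)  # → 4.0
```

In other words, until native 4K sources exist, three-quarters of what you paid for is just upscaling 1080p material.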
So, already owning a high-quality 1080p television with plenty of technical perks that is only a few years old, why spend more than $5,000 on something that, for now, offers nothing beyond what I already have?
That’s the problem with early adoption. You have a fantastic piece of hardware, but have none of the software to utilize it.
However, relatively speaking, the current jump to 4K isn't nearly as disruptive as the transition back in the really awkward days of the early 2000s.
In that era, not only was there a higher resolution being offered, but a big change in the aspect ratio – from 4:3 (1.33:1) to 16:9 (1.78:1) – when television broadcasts were still largely 4:3 and “full screen” 4:3 DVDs were being sold (but not without the scorn of many directors) alongside the full picture letterbox versions of films.
So, when people back in 2001-2003 or so bought these brand new, expensive high definition (720p and later up to the inferior 1080i) televisions, they didn’t have a lot of content to utilize the new television sets nor did they know how to use them properly. I worked at a large electronics store in 2004, and many customers would be willing to spend a lot of money but not take the time to learn what certain terms meant nor how to actually use the television properly.
They would buy DVD players but still hook them up with composite video cables instead of component (or, at least, S-Video). They'd connect their DVD player and cable television service to the set and have 4:3 video stretched out to fill the 16:9 screen. The result was a low-quality, distorted picture that they would claim was high definition because they didn't know any better.
In fact, with standard 4:3 television broadcasts, people still set their televisions to stretch and distort the picture because they don't like the pillarboxes on the sides of the screen. I cringe whenever I see this. I'm being a little hyperbolic, but whenever I see a television with those settings, I think it should be confiscated on grounds of consumer ignorance. The same goes for people who shoot vertical video with their phones or, worse, shoot video at all with their tablets.
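The distortion from that stretch mode is easy to quantify. A small sketch (assuming the simple uniform horizontal stretch that a basic "stretch"/"full" picture mode applies):

```python
# Aspect ratios expressed as width / height
ar_43 = 4 / 3     # ≈ 1.333, standard-definition broadcasts
ar_169 = 16 / 9   # ≈ 1.778, widescreen HD panels

# Filling a 16:9 screen with 4:3 video by stretching it horizontally
# widens the image by the ratio of the two aspect ratios.
stretch = ar_169 / ar_43
print(round(stretch, 3))  # → 1.333
```

Everything on screen ends up roughly a third wider than it should be, which is exactly the squashed-faces look those viewers were insisting was "high definition."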
The early 2000s was a time when it was probably a lot better to have the best of the previous technology than to jump on the early versions of the new tech. For gaming, I got a 36” 4:3 Sony Trinitron CRT. The vast majority of the sixth-generation consoles' games were 4:3, and the television was big enough to display letterboxed DVDs without the unused areas at the top and bottom of the screen being intrusive. I could also get very crisp video quality for the time on my video games and DVDs using component and S-Video cables. I like experimenting, and there was definitely a lot more clarity with component cables than with composite, which produced a softer, washed-out picture.
It wasn't until 2007 that I felt it was time to splurge on a top-of-the-line Full HD (1080p) 52” Sharp LCD. Back then, Sharp made some of the best, if not the best, LCD displays. I had an Xbox 360 and bought a PlayStation 3 after the television. All of the software was on the shelves to utilize the hardware. I was amazed at the brilliance and clarity of the picture. It was new to me, and it was worth the wait.
That eight-year-old television is still good, but is dated in some of its features. When I moved a few years back, I bought another television that has everything that I need to experience the best of what is offered.
If I buy that Samsung UHD television this year, as much as I would like to, I would inevitably be disappointed down the line. I would have hardware that can't be fully utilized for the next couple of years. By the time 4K content is commonplace, I would regret having bought a high-end television in 2015 when there are better, and probably slightly cheaper, options in 2017.
On the monetary side, the same early-adoption principle applies to game consoles. When a new console comes out, waiting a year or a year and a half opens you up to a wider variety of games at cheaper prices, along with additional savings from hardware price drops and bundles.
It can be hard, but resist the urge to get the newest stuff as soon as you can, unless you have a lot of money to throw around.