HDR is a real feature, and it has nothing to do with 4K; they are separate things. As for 4K, you need to check the size/distance charts for 20/20 vision to see if it makes a difference. Think of all the people buying less-than-70" 4K TVs and/or sitting further than 10 feet away: you get the point.
I would jump on 4K when 4K projectors become somewhat affordable. It makes a huge difference at that sort of size and distance. Since there is no appreciable video/film content yet, it would mostly shine for gaming.
As far as HDR goes, I saw an OLED with HDR and it matters there a lot. Given that LCD tech can't really produce blacks, I suspect HDR on LCD is just another marketing term. It's nice that the TV can accept HDR content as input, but given how LCDs work, does it really matter? Yes, I know about FALD and all the other kludges they use to try to get LCD to perform. On an OLED it will matter.
But all that aside, you know most people bought really crappy 4K TVs; they didn't look at FALD or HDR or Triluminos or actual visual features. They are the market for this "4K" PS4. Anyone in the market for a new TV is getting 4K because it's been pushed down to all the price points. Those other features were the domain of top-end TVs regardless of resolution. It just so happens that the introduction of HDR coincides with 4K panels becoming the standard.
4K chart:
http://referencehometheater.com/2013/commentary/4k-calculator/
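To put rough numbers on what that chart shows (a minimal sketch, assuming 20/20 vision resolves about one arcminute of detail; the function name and the 70" example are mine, not from the chart):

```python
import math

# 20/20 vision resolves roughly one arcminute per pixel.
ARCMIN = math.radians(1 / 60)

def max_useful_distance_ft(diagonal_in, vertical_px, aspect=16 / 9):
    """Farthest viewing distance (feet) at which the panel's pixel
    pitch still subtends one arcminute. Sit farther than this and
    extra resolution is invisible to a 20/20 viewer."""
    height_in = diagonal_in / math.sqrt(1 + aspect ** 2)  # screen height
    pixel_in = height_in / vertical_px                    # pixel pitch
    return pixel_in / math.tan(ARCMIN) / 12               # inches -> feet

# A 70" 1080p panel already out-resolves the eye beyond ~9 ft,
# so past that distance a 70" 4K panel adds nothing visible.
print(round(max_useful_distance_ft(70, 1080), 1))  # ~9.1
print(round(max_useful_distance_ft(70, 2160), 1))  # ~4.6
```

That lines up with the 70"/10-foot rule of thumb above: 4K only pays off on very large screens viewed fairly close.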
Long story short, it's the unfinished standards like Rec. 2020, along with a new HDMI standard to carry it, plus OLED and possibly quantum dot technology, that will revolutionize TV as we know it in the next year or two. Still, I see quantum dot as another kludge for LCD, because it only sort of halfway solves one of LCD's problems. OLED is the future. Right now what's being pushed is 4K as a motivator to upgrade, but it's pretty much useless in and of itself. I would wait for the standards to settle down over the next year or two for real improvements in TVs. We are on the cusp of a true revolution in displays. 4K will come along for the ride, but only because that aspect of panel PPI is inexpensive to manufacture.
EDIT: Just did the research on HDR as it applies to LCDs. The standard was kludged for LCD to be all about hitting 1000 nits, which converts to about 291 ft-lamberts. Basically it means your LCD must be a light cannon. That kind of luminance in a dark room will hurt your eyes, so really it's meant for rooms with strong ambient light. But that standard exists only for LCD, again because LCDs can't produce blacks. For OLED the standard is defined at the low end: it still needs to produce 540 nits, but the extra dynamic range is at the bottom, with a 0.0005-nit black level. Obviously one would need a pitch-black room to get the benefit of that. In any case, HDR in photography is more about capturing shadow detail while keeping exposure low enough to retain detail in the highlights; it isn't about making the highlights brighter. Having looked into the tech, as far as LCDs go HDR is shaping up to be the 2016 buzzword marketing term. It's funny how we get a new one every year now. They must really want to sell TVs.
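The conversion above is just a constant factor (1 nit = 1 cd/m², and 1 ft-lambert ≈ 3.426 cd/m²); a quick sanity check, with the function name being my own:

```python
# 1 ft-lambert = pi candelas per square foot ~= 3.426 cd/m^2 (nits)
FTL_PER_NIT = 1 / 3.4262591

def nits_to_ftl(nits):
    """Convert luminance in nits (cd/m^2) to foot-lamberts."""
    return nits * FTL_PER_NIT

# The 1000-nit LCD HDR peak works out to roughly 292 ft-L.
print(round(nits_to_ftl(1000), 1))
```

For reference, SMPTE's target for a dark cinema screen is around 16 ft-L, which is why a 1000-nit peak in a dark room is so punishing.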
Still, I wouldn't write off HDR; it's a more noticeable improvement than 4K in most if not all viewing situations. But for LCD, as usual, all the benefits depend on a bright room, in which the 0.05-nit low end of the LCD tier is moot anyway. It at least shows the display is capable of high contrast in those high-ambient-light situations. The major benefits again come with the second, "true" HDR tier, which only OLEDs can attain and which can only be realized in practice in a dark room.