The phrasing is just weird if so. 4K is just a resolution; all serious movie cameras can capture images natively at that size, and it's becoming the norm for post-producing films and projecting them in theaters. For all intents and purposes, "4K" is cinema as of now, the same way 35mm was.
Now, the point that some TVs have poor upscaling and weird default settings going on that denature a movie's image is spot-on.
What was the cause of the "soap opera" effect from The Hobbit's particular framerate? Was that shooting at 48fps or something? That's unrelated to 4K, yes?
48fps instead of the classic 24fps, indeed. By definition more images per second means each frame is exposed for less time, hence less motion blur per frame, which may strike you as "unnatural" if you grew up with classic chemical film. And yes, it's unrelated to 4K.
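A back-of-the-envelope sketch of why, assuming the standard 180° shutter convention (the helper name is just illustrative):

```python
# Rough sketch of the 180-degree shutter rule: per-frame exposure time
# is (shutter_angle / 360) / fps. Doubling the frame rate at the same
# shutter angle halves the exposure per frame, hence less motion blur.

def exposure_time(fps: float, shutter_angle: float = 180.0) -> float:
    """Seconds each frame is exposed, for a rotary-shutter camera."""
    return (shutter_angle / 360.0) / fps

for fps in (24, 48):
    t = exposure_time(fps)
    print(f"{fps} fps @ 180 deg shutter -> 1/{round(1 / t)} s per frame")

# 24 fps -> 1/48 s of blur per frame (the classic film look)
# 48 fps -> 1/96 s, half the blur, which reads as the "soap opera" look
```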
Digital cameras on the whole produce a sharper, more even image, but that partially can't be helped: classic film texture is a result of photochemical grain, which is naturally noisier and often way more specific in how it captures colors. A big trend currently is to mount old-ass Soviet or interwar 35mm optics on cameras to "break" the digital image.
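You can fake a bit of that "breaking" in software too; here's a minimal sketch with numpy (the Gaussian noise model is a crude assumption on my part, real grain is structured by exposure and dye layers):

```python
import numpy as np

def add_fake_grain(img: np.ndarray, strength: float = 0.04, seed: int = 0) -> np.ndarray:
    """Overlay crude luminance noise on a float RGB image in [0, 1].

    Real photochemical grain varies with exposure and emulsion layer;
    this Gaussian approximation just shows the idea of re-noising a
    too-clean digital frame.
    """
    rng = np.random.default_rng(seed)
    # One shared noise field across channels so it reads as grain, not chroma noise.
    noise = rng.normal(0.0, strength, img.shape[:2])
    return np.clip(img + noise[..., None], 0.0, 1.0)

frame = np.full((4, 4, 3), 0.5)   # stand-in for a clean digital frame
grainy = add_fake_grain(frame)
```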
EDIT: On top of that, digital grading allows for pretty insane image modifications nowadays (regardless of whether you shot on film or digital): you can target any specific color or luminance range and work it independently, when back then all you could do was add more red, green and/or blue to the whole frame and change the global luminosity. Lots of previously unthinkable image textures started popping up in the early 00's.
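A toy numpy contrast between the two approaches (real grading tools qualify hue/saturation/luma far more carefully; the "red-dominant" mask here is just an illustration I made up):

```python
import numpy as np

def grade_global(img: np.ndarray, red_lift: float = 0.1) -> np.ndarray:
    """The old way, more or less: push one channel over the whole frame."""
    out = img.copy()
    out[..., 0] += red_lift
    return np.clip(out, 0.0, 1.0)

def grade_targeted(img: np.ndarray, sat_boost: float = 1.5) -> np.ndarray:
    """Digital secondary grade: mask the 'reddish' pixels and boost
    saturation only there, leaving the rest of the frame untouched."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    luma = 0.2126 * r + 0.7152 * g + 0.0722 * b   # Rec. 709 luma weights
    mask = (r > g) & (r > b)                      # crude "red-dominant" qualifier
    out = img.copy()
    # Push masked pixels away from their luma, i.e. boost their saturation.
    out[mask] = luma[mask][:, None] + (img[mask] - luma[mask][:, None]) * sat_boost
    return np.clip(out, 0.0, 1.0)

frame = np.random.default_rng(1).random((2, 2, 3))  # stand-in frame
print(grade_global(frame).shape, grade_targeted(frame).shape)
```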