It's not really the 4K resolution that gives you such an improvement over 1080p; it's the HDR that usually comes bundled with 4K releases, whether Dolby Vision, HDR10, or HDR10+. Without one of those, 4K isn't as impressive.
4K is capable of a huge improvement over 1080p, but not all sources are up to it. On some titles, film grain is already visible at 1080p. When scanning 35mm film, you can get anywhere from 2K to 4K of real detail, depending on film speed (graininess) and lens quality. Of course, higher resolutions do not fix lens flaws, focus errors, or motion blur. Streaming services are not necessarily providing optimal quality either: Netflix's attempts to reduce bandwidth make some 4K streams very poor quality, more like a lower resolution upscaled.
HDR is a significant improvement in dynamic range and color depth. It attempts to recreate true-to-life lighting levels and contrast range, but is limited more by the device displaying it than by the standard encoding it. Many monitors and sets are limited to 300-600 nits maximum brightness, but this is still 3-6 times the 100-nit maximum defined for a BT.709 display used for SDR video. Some HDR titles have not been transferred well, with the scanned film being stretched too far: an unnatural, exaggerated dynamic range rather than just the range that was naturally there in the film negative. The potential lies more in the realm of digital photography. It also gets used as a bit of a gimmick at times, with lots of scenes of bright glaring lights in dim rooms and backlit scenes facing into the sun. This does not transfer well back to SDR displays, and the SDR version often looks too dark.
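For context on those nit figures: HDR10 and Dolby Vision encode absolute luminance with the SMPTE ST 2084 "PQ" transfer function, which defines signal values all the way up to 10,000 nits even though real displays clip far below that. A minimal sketch of the PQ EOTF (signal to light), using the constants from the ST 2084 specification:

```python
# SMPTE ST 2084 (PQ) EOTF: normalized signal [0, 1] -> absolute luminance in nits.
# Constants as defined in the ST 2084 specification.
M1 = 2610 / 16384        # ~0.1593
M2 = 2523 / 4096 * 128   # ~78.84
C1 = 3424 / 4096         # ~0.8359
C2 = 2413 / 4096 * 32    # ~18.85
C3 = 2392 / 4096 * 32    # ~18.69

def pq_eotf(signal: float) -> float:
    """Convert a normalized PQ code value to luminance in cd/m^2 (nits)."""
    e = signal ** (1 / M2)
    y = (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)
    return 10000.0 * y

print(round(pq_eotf(1.0)))  # 10000 -- full signal hits the 10,000-nit ceiling
print(round(pq_eotf(0.0)))  # 0
# Mid-range code values land in the few-hundred-nit region, which is why a
# 300-600 nit panel can still show most of a typical HDR grade.
```

A display that tops out at 600 nits simply clips or tone-maps everything the signal places above that, which is why the same HDR stream can look quite different from one set to another.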
The color gamut for HDR is much wider (BT.2020), theoretically covering 76% of all possible colors, compared to about 35% for the BT.709 gamut of SDR video. BT.2020 is about the maximum color range that can be implemented in real life using only 3 primary colors. In practice, there are almost no displays that can do BT.2020 and/or 1000 nits or more (an ideal HDR target), due to technical problems creating a pure enough green primary, and so on, so we usually see 300-600 nits in the DCI color space (about 45% of all possible colors and 87% of all naturally occurring reflective colors). As film has a wider gamut than a BT.709 display, this is much more faithful to the source (DCI gives good coverage of the colors that can be reproduced with movie film). In practice, transfers of scanned film usually fit within 400 nits, so this is not as much of a problem as it could be. It is also possible to have SDR video that uses the BT.2020 or DCI color gamut, by the way.
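The relative sizes of these gamuts can be sketched by computing the area of each standard's primary triangle on the CIE 1931 xy chromaticity diagram with the shoelace formula. (The coverage percentages quoted above are measured against the full visible gamut, which a triangle-area ratio in xy space only approximates, so the numbers won't match exactly.) The primary coordinates below are the standard ones from BT.709, DCI-P3, and BT.2020:

```python
# Area of each colour-gamut triangle in CIE 1931 xy chromaticity space,
# via the shoelace formula. Primaries (x, y) come from the respective standards.
GAMUTS = {
    "BT.709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":  [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "BT.2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def triangle_area(pts):
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

ref = triangle_area(GAMUTS["BT.2020"])
for name, pts in GAMUTS.items():
    area = triangle_area(pts)
    print(f"{name}: area {area:.4f}, about {100 * area / ref:.0f}% of BT.2020")
```

This gives BT.709 at roughly half of BT.2020's area and DCI-P3 around 70%, which lines up with DCI being the practical middle ground today. Note that the CIE 1931 xy plane is not perceptually uniform (it exaggerates the green region), which is another reason these area ratios differ from the visible-gamut percentages in the text.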
In theory, you could have idealized HDR such that a candle in a dark scene is as bright as a real candle, and an outdoor scene on a sunny day makes you want to put on sunglasses. For watching movies, that amount of range is probably not practical: it would be too difficult for the eyes to cope with adjusting at every scene change.