Motion blur doesn't work on modern TVs (e.g. OLED). They display a static, blurry frame for 1/24 of a second, then snap to the next one practically instantly, with a clearly visible jump. What I get in the end is a juddering mess that is also blurry, so I can see practically nothing.
Motion blur doesn't really work that way on any device. What you're missing is that the frame rate the content was captured at has an important relationship to the display's refresh rate: you usually can't change one without the other without making everything look worse, at least to most people, and that's exactly what a lot of modern TVs do.
Naive interpolation of inter-frame content looks awful; it's what a lot of TVs do to implement their motion-smoothing feature, and it's why everyone complains about it.
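To see why the naive approach fails, here's a minimal sketch (not any TV's actual algorithm, which estimates per-pixel motion vectors): the simplest possible "interpolated" frame is just a 50/50 blend of the two neighboring frames, and blending makes moving objects ghost instead of move.

```python
def interpolate_midframe(frame_a, frame_b):
    """Degenerate 'interpolation': a straight 50/50 blend of two frames.

    Frames are flat lists of pixel brightness values (0-255).
    A moving object ends up as two half-bright ghosts at its old and
    new positions, rather than one object at the halfway point.
    """
    return [(a + b) // 2 for a, b in zip(frame_a, frame_b)]

# Toy 1-D example: a bright object at pixel 2 in frame A, pixel 6 in frame B.
frame_a = [0] * 9
frame_a[2] = 255
frame_b = [0] * 9
frame_b[6] = 255

mid = interpolate_midframe(frame_a, frame_b)
# mid has half-bright ghosts at pixels 2 and 6, and nothing at pixel 4,
# where a true motion-compensated interpolator would put the object.
```

Real motion smoothing replaces the blend with motion-vector estimation, which is why it works most of the time and falls apart (the "soap opera effect", haloing around edges) when the motion estimate is wrong.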
The reason a lot of people hated The Hobbit may be partly this problem: it was shot at 48 fps with a 270-degree shutter, precisely to get a bit more motion blur back into each frame, and to a lot of people it still felt strange.
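The shutter-angle arithmetic makes the trade-off concrete. A rotary shutter is open for shutter_angle/360 of each frame interval, so per-frame exposure (and thus motion blur) is (angle / 360) / fps. A quick sketch:

```python
def exposure_time(shutter_angle_deg, fps):
    """Per-frame exposure time for a rotary shutter.

    The shutter is open for shutter_angle/360 of each frame interval,
    so exposure = (angle / 360) / fps. Longer exposure = more motion blur.
    """
    return (shutter_angle_deg / 360.0) / fps

classic = exposure_time(180, 24)  # conventional film look: 1/48 s per frame
hfr = exposure_time(270, 48)      # 48 fps with a wide 270-degree shutter: 1/64 s
```

Even with the shutter widened from the conventional 180 degrees to 270, each 48 fps frame still captures about 25% less motion blur than a standard 24 fps frame, so the image reads as both sharper and smoother than audiences are used to.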
There shouldn't be any judder, since judder is an artifact of a varying frame rate rather than a constant one, and you shouldn't be "literally seeing nothing" either. You might be more visually sensitive than the average person, who doesn't seem to care about these things. It reminds me of my spouse, who is really good at detecting motion at night where I see nothing, almost like a superpower.