@Dashrender said in 10" HDMI 1080P displays:
Honestly I have no idea in the TV space.
When you hook up a 1366 x 768 monitor to a PC, the video card knows the resolution of the monitor and attempts to use only that.
I don't know if TVs work the same way or not. I want to make sure that my new TV will be displaying at its best possible resolution, even if a lower-res one is attached to the second output.
TVs and monitors store their supported resolutions in a small IC, and your PC queries it over the display cable (the keyword is EDID).
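If you're curious what your PC actually sees, on Linux you can read that EDID blob straight out of /sys and pull the preferred (native) mode from it. A minimal sketch, assuming a Linux box and a connector named card0-HDMI-A-1 (yours will differ, list /sys/class/drm/ to find it):

```python
#!/usr/bin/env python3
"""Read a display's EDID and report its preferred (native) resolution."""
from pathlib import Path

# Connector name is an example; adjust to your machine.
EDID_PATH = Path("/sys/class/drm/card0-HDMI-A-1/edid")

def preferred_mode(edid: bytes) -> tuple[int, int]:
    """Parse the first detailed timing descriptor (bytes 54-71 of the base block)."""
    if len(edid) < 128 or edid[:8] != bytes.fromhex("00ffffffffffff00"):
        raise ValueError("not a valid EDID block")
    desc = edid[54:72]
    if int.from_bytes(desc[0:2], "little") == 0:
        raise ValueError("first descriptor holds no timing data")
    # Active pixel counts are split: low 8 bits in one byte, high 4 bits
    # in the upper nibble of a shared byte.
    h_active = desc[2] | ((desc[4] & 0xF0) << 4)
    v_active = desc[5] | ((desc[7] & 0xF0) << 4)
    return h_active, v_active

if __name__ == "__main__":
    w, h = preferred_mode(EDID_PATH.read_bytes())
    print(f"Preferred mode: {w}x{h}")
```

If you'd rather not parse it yourself, `xrandr --verbose` dumps the same EDID bytes and `edid-decode` pretty-prints everything the panel advertises.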
Anyway, most modern TVs and monitors support some form of upscaling, i.e. they compute the missing pixels themselves. If your screen doesn't do that, you can put an external scaler/converter box between the source and the display.
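For what it's worth, "upscaling" is nothing magical: the scaler just computes the pixels it was never sent. A toy nearest-neighbour version in Python, purely illustrative (real scalers use bilinear/bicubic or fancier edge-aware filters):

```python
def upscale_nearest(pixels, new_w, new_h):
    """Nearest-neighbour upscale: each output pixel is copied from the
    closest source pixel. 'Calculating the rest of the picture' boils
    down to this, just with a much smarter filter."""
    old_h, old_w = len(pixels), len(pixels[0])
    return [
        [pixels[y * old_h // new_h][x * old_w // new_w] for x in range(new_w)]
        for y in range(new_h)
    ]

# A 2x2 "image" stretched to 4x4: every source pixel becomes a 2x2 block.
small = [[1, 2],
         [3, 4]]
for row in upscale_nearest(small, 4, 4):
    print(row)
```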
You could even DIY, but that requires some knowledge. What is the exact use case / purpose? Digital (HDMI) or analog source?