A display that supports 1920×1080 natively will of course support both 1080i and 1080p. Both formats carry the same number of pixels per frame; it's only the source signal that determines the i or the p. The issue is not with the receiver (the display) but with the source (transmission) signal.
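To make the "same pixels, different delivery" point concrete, here's a rough sketch (plain Python, just arithmetic on the standard frame geometry) of how 1080i and 1080p carry the same 1920×1080 grid:

```python
# Compare how 1080i and 1080p deliver the same 1920x1080 pixel grid.
WIDTH, HEIGHT = 1920, 1080
REFRESH_HZ = 60  # nominal North American refresh rate

pixels_per_frame = WIDTH * HEIGHT  # identical for both formats

# 1080p: every refresh carries a complete frame.
progressive_pixels_per_second = pixels_per_frame * REFRESH_HZ

# 1080i: every refresh carries one field (every other line),
# so two fields are needed to build one complete frame.
lines_per_field = HEIGHT // 2              # 540 lines
pixels_per_field = WIDTH * lines_per_field
interlaced_pixels_per_second = pixels_per_field * REFRESH_HZ

print(f"Pixels per full frame (both): {pixels_per_frame:,}")         # 2,073,600
print(f"1080p60 pixel rate: {progressive_pixels_per_second:,}/s")    # ~124.4 million
print(f"1080i60 pixel rate: {interlaced_pixels_per_second:,}/s")     # ~62.2 million
```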
Remember that the legacy projection displays, and even the CRT monitors, that were part of the early HD systems (usually the sets that contained both analog and digital HDTV tuners, aka "HDTV ready") were able to decode 720p and 1080i.
720p and 1080i are the HDTV broadcast standards (aka "source" signals). Some networks broadcast only in 720p. The issue was, and is, that before true 1920×1080 displays (plasma and LCD) came out, those old tuner-equipped sets were essentially SD TVs that down-res'd either the 720p or the 1080i signal to the display's native resolution (480p in most cases).
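As a rough illustration of what that down-res cost (a sketch only; the actual scaler behavior varied from set to set, and I'm assuming a 16:9 480-line panel here):

```python
# Rough illustration of the resolution loss when an early "HDTV-ready" set
# scaled HD source signals down to a 480-line native display.
sources = {"720p": (1280, 720), "1080i": (1920, 1080)}
native_lines = 480

for name, (w, h) in sources.items():
    scale = native_lines / h
    print(f"{name}: {w}x{h} -> ~{round(w * scale)}x{native_lines} "
          f"({scale:.2f}x vertical scale, roughly "
          f"{(1 - scale) * 100:.0f}% of the vertical resolution lost)")
```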
1080i is legacy tech; it was the most compatible with CRT displays, which are inherently interlaced. True 1920×1080 displays can accept, and always have been able to accept, either interlaced or progressive signals. The limited-resolution 1366×768 panels were the "middle" tech that took 1080i and down-res'd it to their native 1366×768.
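For the curious, here's a minimal sketch of what a fixed-pixel panel has to do with a 1080i signal: weave the two fields back into a progressive frame, then resample to the panel's native grid (1366×768 on those "middle" sets). This is hypothetical illustration code using numpy; real TVs use far more sophisticated motion-adaptive deinterlacers and scalers.

```python
import numpy as np

def weave_deinterlace(top_field, bottom_field):
    """Interleave two 540-line fields into one 1080-line progressive frame."""
    frame = np.empty((1080, 1920), dtype=top_field.dtype)
    frame[0::2] = top_field      # top field -> even rows
    frame[1::2] = bottom_field   # bottom field -> odd rows
    return frame

def nearest_downscale(frame, out_h, out_w):
    """Crude nearest-neighbor resample to the panel's native resolution."""
    in_h, in_w = frame.shape
    rows = np.arange(out_h) * in_h // out_h
    cols = np.arange(out_w) * in_w // out_w
    return frame[rows][:, cols]

# Fake luma-only fields, just to exercise the pipeline.
top = np.random.randint(0, 256, (540, 1920), dtype=np.uint8)
bottom = np.random.randint(0, 256, (540, 1920), dtype=np.uint8)

progressive = weave_deinterlace(top, bottom)          # 1080 x 1920
panel_image = nearest_downscale(progressive, 768, 1366)
print(progressive.shape, "->", panel_image.shape)     # (1080, 1920) -> (768, 1366)
```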