To answer the thread title directly: It is based on a number of business factors. Let me quote an article:
https://news.ihsmarkit.com/prviewer/rel ... markit-say
Factors other than direct material costs, such as production yield, utilization rate, depreciation expenses and substrate size, do actually matter [comparing OLED to LCD], IHS Markit said.
Let's say you own a modern factory for panels sized like your desired large 240p LCD. All other things being equal, it would be a better and safer investment to manufacture panels destined for high-demand products like monitors and televisions. A 240p production run would be small, leaving part of your capacity idle, so you would have to price it to recover roughly the same fixed costs that a normal high-volume run spreads across many times more panels, panels that far more manufacturers and users actually want.

You would also need to spend money on an entirely new panel design to realize 240p on current production lines, and that assumes such a design would even be compatible with production processes geared towards very small transistors and other structures. You probably can't dig old mid-'90s LCD production equipment out of the garbage, and you wouldn't likely get very good results even if you could. Furthermore, the signal processing hardware and power circuitry of modern designs are also built around high resolutions.

On the plus side, you could probably incorporate a per-pixel LED backlight a bit more easily, because the industry is getting close to exceeding 240p density in the backlight array alone. But even this would be a custom design if you wanted precisely 240p worth of independent lighting zones.
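To put rough numbers on that utilization point, here is a minimal back-of-the-envelope sketch. Every figure in it (line cost, volumes, material cost) is a made-up assumption, not something from the article; it only shows how the fixed cost of a line dominates the per-panel price of a small niche run.

# Hypothetical numbers purely for illustration of fixed-cost amortization.
line_fixed_cost_per_month = 10_000_000.0  # assumed: depreciation + staffing for one fab line
panels_normal_run = 200_000               # assumed monthly output of a mainstream panel
panels_240p_run = 2_000                   # assumed niche 240p run on the same line
material_cost_per_panel = 40.0            # assumed similar for both designs

def cost_per_panel(volume):
    # The line's fixed cost is spread over however many panels the run produces.
    return material_cost_per_panel + line_fixed_cost_per_month / volume

print(cost_per_panel(panels_normal_run))  # ~90 per panel
print(cost_per_panel(panels_240p_run))    # ~5040 per panel

With those made-up inputs the niche run costs over fifty times as much per panel, which is the whole problem in miniature.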
Maybe somebody could DIY a way to create large, low-resolution panels at a very competitive price, but digital signage pricing suggests this is easier said than done.
This is a fun thought exercise, of course. It is reminiscent of the old "pixel size vs. quality" debate from digital camera forums, waged because some people misunderstand where image quality comes from. A large camera sensor pixel captures more light and can give better results than a small pixel...but that is not the real-world comparison, because an image is built from more than one pixel. Within limits, a whole brigade of smaller pixels captures more refined data, which computing can then use to improve image quality.

Your question, as posed here, falls into the same trap. Not only is there no single 240p pixel size, no standard phosphor decay rate, and not even a standard aspect ratio to play games in, but throwing away resolution might have unforeseen consequences for things like overdrive processing, which in modern displays is tuned for high-DPI panels and may not work so well at very low resolutions. I would love for somebody to prove me wrong. There is still apparently some market for Game Boy / camcorder-size screens capable of roughly 240p for backup cameras, although even that is likely to go the way of the dodo as resolution demands increase. But, again, that's not necessarily a bad thing if you've ever paid close attention to what some classic handhelds actually do to aspect ratio and cropping.
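As a toy illustration of the camera analogy (the flat test scene and noise figures are my own assumptions, not measurements), averaging blocks of small noisy pixels into bigger ones shows how extra pixels plus a bit of computation can buy back quality:

# Toy sketch: "binning" 2x2 groups of small, noisy pixels trades resolution
# for lower noise, one way extra pixels + computing improve the final image.
import numpy as np

rng = np.random.default_rng(0)
scene = np.full((480, 640), 100.0)                      # assumed flat gray scene
small_pixels = scene + rng.normal(0, 10, scene.shape)   # each tiny pixel is noisy

# Average each 2x2 block of small pixels into one larger "pixel".
binned = small_pixels.reshape(240, 2, 320, 2).mean(axis=(1, 3))

print(small_pixels.std())  # ~10: noise of an individual small pixel
print(binned.std())        # ~5: averaging 4 pixels roughly halves the noise

The binned image has a quarter of the pixels but about half the noise, which is the "more pixels plus computing" trade-off in its simplest form.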
Osirus wrote:Something like this would probably take upwards of 8-figures to bring to the market, and you'd likely sell hundreds of them.
That may be a conservative estimate, because you would be competing directly with modern panel demand. There may even be some retooling required, or at least a learning curve to reach acceptable yields on an exotic low-resolution design.