Playing with the PS2 (Lets Get Crazy)
-
The_Guffman
- Posts: 3
- Joined: Sun May 03, 2020 7:34 am
Playing with the PS2 (Lets Get Crazy)
Hello everyone. I've seen a lot of posts here on using the PS2 and 480i in particular. I'm not an expert by any means on any of this stuff, but thanks to you guys I've been able to learn the gist of what 480i is, why it was used, and how to work with it. I'm using an OSSC myself and am very happy with my results from both 240p and 480p content, but I'm not completely sold yet on bob deinterlacing. To clarify, I'm using the PS2 on a modern 4K display, though in the future I would also like a PVM in my setup. That all being said, what are some things, if any, I could add before or after the OSSC in my video chain to, potentially, change or tweak the PS2's output? I know GSM exists, but its compatibility seems pretty hit or miss with some pretty prominent games.
To start with, are there better deinterlacing methods to use than line doubling (bob deinterlacing)? I understand that bob is fast, and the point is to minimize input lag, but could we maybe use a device solely dedicated to ONLY deinterlacing, maybe one that can pack quite a punch, and throw that in there? I've been looking at FPGA deinterlacers from Xilinx and think I might be onto something, and I'm enough of a fool to accept some frames of input lag for something like this, even just as an experiment. Maybe I'll throw some settings around in PCSX2 to see what I can come up with...
I've seen another thread on here from a few years back about downscaling 480i to 240p with something like an Extron Emotia or a Genius II, maybe even a Corio2, but I now understand you would be losing visual information by doing this. 480 lines is 480 lines, after all. BUT, I've also read about people using downscalers to force a "vintage" 240p scanline look on games from something like the Xbox 360 and even more modern consoles like the PS4 and Xbox One. Almost always, this is in reference to being displayed on a CRT, which I would like to try out myself someday, but how might this look, other than "crappy", on an LCD? I know the OSSC is only line-multiplying, but 5x mode looks damn good on my display with other 240p content. I know you might say "Of course it looks good with native 240p content!" or "240p is only optimal for games that were meant to be in 240p on PS2," but does anyone here actually have experience doing something similar and found they maybe didn't hate the downscaled look? Or maybe didn't mind it compared to the deinterlaced look?
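For the curious, here's a minimal Python sketch of what the crudest form of this downscale does (a toy illustration only, not how an Emotia actually filters; the function names are made up):

```python
# Naive 480i -> 240p downscale: keep only one field per frame and discard
# the other. A real scaler may also blend or filter; this sketch just
# decimates, which is why fine single-line detail in the dropped field
# is lost entirely.

def fields_from_interlaced(frame):
    """Split an interlaced frame (list of scanlines) into its two fields."""
    even = frame[0::2]   # lines 0, 2, 4, ...
    odd = frame[1::2]    # lines 1, 3, 5, ...
    return even, odd

def downscale_to_240p(frame, keep="even"):
    """Drop one field outright: 480 lines in, 240 lines out."""
    even, odd = fields_from_interlaced(frame)
    return even if keep == "even" else odd

# Toy 8-line "frame" where each scanline is filled with its own index.
frame = [[y] * 4 for y in range(8)]
out = downscale_to_240p(frame)
print([line[0] for line in out])  # -> [0, 2, 4, 6]: half the vertical detail is gone
```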
Let's get even crazier and say, instead of cutting back on resolution, what if we cut back on field rate? What if, hypothetically, we had a device that could drop every other field from an interlaced source and simply double the frames it had left? Instead of smashing lines together or flickering them off and on, you would only have the one set of 240 lines, doubled, updating every other field, or possibly even interpolated into a Frankenstein's monster of 60 "fields" per second again? Is something like this even possible to do in real time while gaming, or only with video editing software? Would it look horrible, or like an alternate timeline's video standard? "Is this madman seriously suggesting we sacrifice frames?!" I'm so curious about this, even if the end result is a worse picture than what we started with.
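The hypothetical device above can be sketched in a few lines of toy Python (purely illustrative; the function names and the "keep the even fields" choice are my own assumptions):

```python
# Hypothetical "field-rate cut": keep only the even fields of a 480i60
# stream, line-double each one, and show it for two output frames.
# Result: 480-line progressive output at 60 Hz, but the image only
# updates 30 times per second, like a game running at half rate.

def line_double(field):
    """Repeat every scanline once: 240 lines -> 480 lines (bob's spatial step)."""
    out = []
    for line in field:
        out.append(line)
        out.append(line)
    return out

def halve_fieldrate(fields):
    """fields: alternating even/odd 240-line fields at 60 Hz.
    Returns progressive frames built only from the even-numbered fields,
    each repeated once so the output stays 60 Hz."""
    frames = []
    for i in range(0, len(fields) - 1, 2):
        doubled = line_double(fields[i])  # fields[i + 1] is dropped entirely
        frames.append(doubled)
        frames.append(doubled)            # shown twice to keep 60 Hz output
    return frames

# 4 toy fields, tagged "E"/"O" so we can see which ones survive.
fields = [[("E" if i % 2 == 0 else "O", i)] * 240 for i in range(4)]
frames = halve_fieldrate(fields)
print(len(frames), frames[0][0][0], frames[2][0][0])  # -> 4 E E
```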
I also know that a lot of people suggest just accepting and getting used to the 480i look. I accept this as an option, but the whole point of this is to speculate on what might be possible and what options exist, regardless of whether they add a few frames of lag or look better/worse. The fact that there's even a debate on sharp vs soft pixels, HD vs RF, or scanlines vs none suggests to me that so much of this stuff is subjective, and I love that aspect of this community. I've been a budget gamer for most of my life, so I'm used to experimenting with different PC game settings on all kinds of machines performance-wise, and I've almost always compromised on TV sets in some capacity, but I do love my PS2 to death and I might just start buying stuff for the sake of tinkering. I've seen posts about the Extron 580XI and some other devices that I might try to find just for the sake of trying. Thank you for reading and for your time! Keep up the cool, awesome work.
Re: Playing with the PS2 (Lets Get Crazy)
Use bob for games that need low input lag, and use passthrough to let your TV handle the de-interlace for games where 3-4 frames of lag isn't a big deal, like RPGs. No need to blow a ton of money on devices that will give you like 1% better results than just letting your TV handle it.
Any other de-interlace solution is going to give you noticeable input lag, there's nothing you can do about that. That's just the reality of 480i.
Getting used to bob de-interlacing is basically a prerequisite for PS2: most games don't look right when forced to run in 480p, and there's only a small list of games where 240p works well. It's really not that bad.
A lot of multiplats are 480p on xbox or GCN but those have their own output quirks and that's a whole other console you have to deal with.
I honestly think 480i is such an overblown issue on here. Obviously with games/collections that are meant to run in 240p it's bad and I'm glad there's ways around the small amount of PS2 games that are incorrectly displayed in 480i, but for native 480i games bob is fine, especially from a normal viewing distance.
Re: Playing with the PS2 (Lets Get Crazy)
Frames of lag sound unacceptable. The games that most need deinterlacing are the arcade games, mainly fighting games that we once played in the arcades in 240p.
Maybe it's just me, but I am fond of 480i for games that we have only ever known in that format, such as console RPGs etc. So not sure I'd find it of much use there. lol
-
bobrocks95
- Posts: 3471
- Joined: Mon Apr 30, 2012 2:27 am
- Location: Kentucky
Re: Playing with the PS2 (Lets Get Crazy)
Aren't you effectively losing half the vertical resolution when bob deinterlacing? Or not really since you'd only see one field at a time in 480i?
OSSC Pro will likely be out early 2021 if that's something you'd want to wait on. Or the Framemeister is still praised for its 480i deinterlacing at the expense of some lag and 4:2:2 (4:2:0?) color processing.
PS1 Disc-Based Game ID BIOS patch for MemCard Pro and SD2PSX automatic VMC switching.
Re: Playing with the PS2 (Lets Get Crazy)
The_Guffman wrote: "could we maybe use a device solely dedicated to ONLY deinterlacing, maybe one that can pack quite a punch, and throw that in there?"

Deinterlacers have been around for almost 25 years now. Most of them are aimed at home theater use or are made for presentation purposes, where lag down to the last frame does not matter.
"Fast" deinterlacing was used on a number of processors and is always based on doubling the lines available within a single field. This can be done rather sharply, without any interpolation (OSSC), or the opposite way, with interpolation or smoothing added (DVDOs with game mode, or the RetroTink).

"Good" deinterlacing always requires the comparison of at least two adjacent fields of video. In a best-case scenario this means one additional frame of lag (to buffer the reference frame against which the current one gets compared). On many processors this is paired with film mode detection, where you need at least two frames of buffer to be able to detect a 3:2 NTSC cadence, or even with anime frame detection, where you might need even more frames buffered to detect stuff like 3:3 or 4:4 frame cadences on top of a 3:2 NTSC cadence. In other words: it's complicated.
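The two families can be sketched in toy Python (greyscale scanlines as lists of ints; an illustration of the concepts, not any real processor's implementation):

```python
# "Fast" bob needs only the current field; "good" deinterlacing is built on
# weave, which needs the previous field buffered as well -- that buffer is
# where the extra latency comes from.

def bob(field):
    """'Fast': double each line of a single field. No field buffering."""
    return [line for line in field for _ in (0, 1)]

def weave(curr_field, prev_field, curr_is_even=True):
    """'Good' (static-image case): interleave two adjacent fields."""
    frame = []
    for a, b in zip(curr_field, prev_field):
        frame.extend([a, b] if curr_is_even else [b, a])
    return frame

even = [[10], [30]]   # toy 2-line fields, one pixel per line
odd = [[20], [40]]
print(bob(even))         # -> [[10], [10], [30], [30]]
print(weave(even, odd))  # -> [[10], [20], [30], [40]]: full vertical detail
```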
A processor made only for video gaming, with the goal of providing proper deinterlacing, needs to introduce 1+ frames of lag. The Framemeister for example (which has exceptionally good deinterlacing) has 1.5 frames of lag. A more optimized processor (like the OSSC Pro) should be able to get closer to 1 frame.
The_Guffman wrote: "I've been looking at FPGA deinterlacers from Xilinx and think I might be onto something"

To this date no proper deinterlacing algorithms have been implemented in "semi-professional" projects. The OSSC Pro or some upcoming PS2 HDMI mod board will be the first, and it remains to be seen how well the deinterlacing on those will perform compared to the FPGA and ASIC solutions available in multi-thousand-dollar machines.
The_Guffman wrote: "what if we cut back on fieldrate?"

You can at least simulate it using a 480i recording and applying some filters. No biggie. If you're interested in trying it live on real hardware, I can suggest a processing chain for that as well.
The_Guffman wrote: "I also know that a lot of people suggest just accepting and getting used to the 480i look. I accept this as an option"

You don't have to. Deinterlacing is an art. Of all the video processing techniques it's the most complicated and most sophisticated. It can be a joy to watch and can be cruel on certain content just as well. It's about balancing the content-based decisions of a motion-adaptive deinterlacing algorithm (which decides where on the screen to apply bobbing and where to apply weaving) along with decisions on interpolation techniques, filtering and sharpening.
In my personal opinion deinterlacing is always worth the effort, especially with TVs having become a lot faster over recent years, meaning that the "base lag" has been reduced to a level where an additional frame or two really don't matter. I can understand arguments in favor of bobbing based on its ability to minimize lag, but people who criticize any other type of deinterlacing oftentimes simply don't understand, don't know, or just can't appreciate the fine art that deinterlacing really is.
Re: Playing with the PS2 (Lets Get Crazy)
bobrocks95 wrote: "Aren't you effectively losing half the vertical resolution when bob deinterlacing? Or not really since you'd only see one field at a time in 480i?"

It's just the same as on an interlaced CRT. You get 240 lines of active resolution per field. The rest is trickery (persistence plus the line offset between the fields).
Re: Playing with the PS2 (Lets Get Crazy)
I really like the OSSC's x4 mode for 480i; it seems the more you multiply, the less the bob effect is noticeable, especially when you throw on H or H+V scanlines. I actually feel I've been harsh on 480i content; it looks pretty good on a CRT, even though you can somewhat see or feel the flicker.
Btw, if you want to try something cheap, I highly recommend the GBS Control mod for the GBS82XX scaler boards. It's whetted my appetite for what an OSSC Pro could do with the talents of someone like rama contributing to it. 1 frame of lag when using its adaptive de-interlace mode. Faster than the Framemeister by half a frame, apparently.
-
- Posts: 2180
- Joined: Mon Aug 14, 2017 8:34 pm
Re: Playing with the PS2 (Lets Get Crazy)
I found the OSSC's bob-deinterlacing combined with its scanline filter turned all the way up to 100% to look very similar to how 480i looks on a PVM (the PVM seen from very close).
I don't like any deinterlacing solution that attempts to give a progressive-like picture (Fudoh will hard disagree on this). For me, 480i looks best when it has that appearance of alternating lines that defined it on the CRTs interlacing was meant for. Combing artifacts? I don't mind them.
-
FinalBaton
- Posts: 4461
- Joined: Sun Mar 08, 2015 10:38 pm
- Location: Québec City
Re: Playing with the PS2 (Lets Get Crazy)
Fudoh, your artiste side really comes out when you talk about deinterlacing
-FM Synth & Black Metal-
Re: Playing with the PS2 (Lets Get Crazy)
I know I really enjoy watching deinterlaced content for the deinterlacing itself instead of the content.
"Btw, if you want to try something cheap, I highly recommend the GBS Control mod for the GBS82XX scaler boards. It's whetted my appetite for what an OSSC Pro could do with the talents of someone like rama contributing to it. 1 frame of lag when using its adaptive de-interlace mode. Faster than the Framemeister by half a frame, apparently."

Somebody really needs to convince me of this. I haven't used a modded GBS82xx, but from my understanding the deinterlacing doesn't change by adding the GBS Control software. The lag associated with it might, but not the algorithms. Is this correct?

I've recently heard so much praise about the modded GBS82xx. On the recent chat video on RetroRGB, Voultar called its deinterlacing results superior to the FM's, which I highly doubt (I really had to pause the video, catch some air, and rewatch that part). Others seem fixated on the fact that it does motion-adaptive deinterlacing at all, which any cheap-ass converter does today and which doesn't mean it's any good. From my own testing (admittedly years ago) I wouldn't call the GBS82xx's deinterlacing any better than what an HDBoxPro did, or what those generic SCART-to-HDMI solutions do (the ones Bob calls the "worst" scalers in existence).

But, please, by all means, I'm ready to be convinced of the opposite.
Re: Playing with the PS2 (Lets Get Crazy)
When dealing with true interlaced content (and not pulldown), I think you can theoretically do some other forms of deinterlacing without any lag by buffering the previous field.
Think of the other basic deinterlacing method next to bob: weave. This is where you display two fields on the screen at a time. You get the full 480 lines of resolution in static scenes, and no flickering, but you get the "chicken teeth" combing artifacts when there's motion.
One of the simplest implementations is to accumulate every even/odd pair of fields and display them at half the fieldrate (so 30 FPS). This reduces your framerate (bad) and adds two fields of lag (bad).
But what if we always buffer the *previous* field, and every new field we get, we send it to the display in real-time line-by-line, alternating between sending the current line received from the source, and the subsequent line from the previous field? Now we're converting the 480i signal to 480p, at 60 FPS, with only one scanline of lag. Basically zero lag as we'd consider it for retro gaming.
This still leaves us with the combing artifacts, but some people may find that less distracting than the bob flicker, because the combing artifacts only appear when there is a large difference between two fields. How bad it looks depends on the content being displayed. Besides, now that we've got the previous field buffered, we can try to do some motion-adaptive deinterlacing as we go, one scanline at a time.
Basically, if we think there has been motion on this scanline pair since the previous field, then produce the other scanline by interpolating with the previous scanline from the same field instead of the corresponding scanline from the past field. You thus get full resolution in static images, but half resolution in motion, which is fine because you can't see detail as well when things are in motion anyhow. You can even get fancy and produce scanlines with a mix of interpolation and the buffer, to preserve resolution on non-moving parts of a scanline. This would add what, a second scanline of lag?
This sort of thing is not possible in the OSSC because it does not have the memory to buffer an entire previous field, but the OSSC Pro has plenty of memory and can easily buffer a 480i or 1080i field.
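Here's a toy Python sketch of the per-scanline idea (my reading of the proposal, with a deliberately crude motion detector; the threshold value is arbitrary, and the "interpolation" is simple line repetition):

```python
# Streaming per-scanline motion-adaptive deinterlace sketch: emit each
# received line immediately, then fill the in-between line either from
# the buffered previous field (weave, where the area looks static) or
# from the current field itself (bob fallback, where it moved).

def motion(curr_line, prev_line, threshold=16):
    """Crude per-line motion detector: max absolute pixel difference."""
    return max(abs(c - p) for c, p in zip(curr_line, prev_line)) > threshold

def adaptive_line_pairs(curr_field, prev_field):
    """Yield (received_line, filled_line) pairs as the field streams in."""
    for i, line in enumerate(curr_field):
        if motion(line, prev_field[i]):
            # Motion: fall back to bob for this pair (half vertical res here).
            filled = line[:]
        else:
            # Static: weave the buffered line back in (full vertical res).
            filled = prev_field[i]
        yield line, filled

prev = [[100, 100], [50, 50]]
curr = [[100, 100], [200, 200]]          # second line changed a lot
pairs = list(adaptive_line_pairs(curr, prev))
print(pairs[0])  # static -> ([100, 100], [100, 100])
print(pairs[1])  # motion -> ([200, 200], [200, 200])
```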
Last edited by Guspaz on Mon Jun 22, 2020 9:38 pm, edited 4 times in total.
Re: Playing with the PS2 (Lets Get Crazy)
"I don't like any deinterlacing solution that attempts to give a progressive-like picture (Fudoh will hard disagree on this). For me, 480i looks best when it has the appearance of alternating lines that defines its appearance on the CRTs that interlacing was meant for. Combing artifacts? I don't mind them"

I disagree, but I accept everyone's opinion on this one.
When video games started transitioning from 240p to 480i in the mid-90s, a general opinion was that the reduced presence of scanlines was a plus, but the problems with motion really put me off, probably as early as VF2's release on the Saturn. Back then I had little concept of deinterlacing, and it took me a few years to get into the whole topic. I think the Dreamcast put the nails in the coffin for me. I got a Japanese unit shortly after launch, and it didn't take long before the VGA box became available; seeing the difference when it comes to combing in a game like Soul Calibur really ended the discussion for me.
Re: Playing with the PS2 (Lets Get Crazy)
Fudoh wrote: "A processor made only for video gaming, with the goal of providing proper deinterlacing, needs to introduce 1+ frames of lag. The Framemeister for example (which has exceptionally good deinterlacing) has 1.5 frames of lag. A more optimized processor (like the OSSC Pro) should be able to get closer to 1 frame."

Do you know if there are any processors that output any data from the current field immediately (resulting in 0 frames of lag at the output) and use some adaptive algorithm to fill in the rest using data from the current and previous field(s)? I suspect that would look better than pure bob, but with the same low latency.
(heh, Guspaz proposed the same thing while I was typing...)
GCVideo releases: https://github.com/ikorb/gcvideo/releases
Re: Playing with the PS2 (Lets Get Crazy)
Guspaz wrote: "One of the simplest implementations is to accumulate every even/odd pair of fields and display them at half the fieldrate (so 30 FPS). This reduces your framerate (bad) and adds two fields of lag (bad)."

This kind of deinterlacing was really only used with media files; I think no standalone deinterlacer ever did it. Even going back to Faroudja's original video algorithms in the 90s, the output was always 60 fps (with every field being used twice: once combined with the previous field, once with the following).
Guspaz wrote: "But what if we always buffer the *previous* field, and every new field we get, we send it to the display in real-time line-by-line, alternating between sending the current line received from the source, and the subsequent line from the previous field? Now we're converting the 480i signal to 480p, at 60 FPS, with only one scanline of lag. Basically zero lag as we'd consider it for retro gaming."

Nah, that's what we consider one frame of lag, since you measure from the oldest content on screen, not from the newest. Not only for reasons intrinsic to the deinterlacing itself (see below), but because you need to compare the reference to how today's OSSC works and what's considered zero lag today.
Guspaz wrote: "Besides, now that we've got the previous field buffered, we can try to do some motion-adaptive deinterlacing as we go"

Yes, that's the idea. Still a frame of lag, though, because of your time code reference. If you think of transitions (like a cut to black), it becomes obvious why the older frame needs to be the reference. Otherwise you risk jumping back and forth between content of your previous and current frames, possibly introducing skipped fields.
Re: Playing with the PS2 (Lets Get Crazy)
"Do you know if there are any processors that output any data from the current field immediately (resulting in 0 frames of lag at the output) and use some adaptive algorithm to fill in the rest using data from the current and previous field(s)? I suspect that would look better than pure bob, but with the same low latency."

I recall a discussion from years ago where we tried to visualize the concept, but I don't think it works. You're changing the definition of lag by doing this, and you introduce the possibility of skipped fields. I'll dive into my archive to see if I can find the files...
Re: Playing with the PS2 (Lets Get Crazy)
I don't think you could argue that it's one frame of lag when the current field is being drawn to the screen as it's received. Of course any real deinterlacing must reference the prior frame; we are missing half the spatial data and must either simply ignore it (as bob does) or use temporal data to try to fill in the missing spatial data.
By your argument, a 480i signal on a CRT has one frame of lag because the combination of phosphor decay and persistence of vision causes the previous field to be blended with the current one.
Another very simple form of deinterlacing similar to weave is to average the current and previous fields together, offset by one pixel. This has the same properties as weave (in terms of latency) but replaces the combing artifacts with one frame worth of ghosting. Which also may be considered less distracting than combing, depending on the content.
Many of these (weave and field interpolation) should be very easy to implement in the OSSC Pro, though proper lag-free motion-adaptive deinterlacing would be much less easy.
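The field-averaging variant is simple enough to sketch in toy Python (illustrative only; a real implementation would handle the one-line spatial offset between fields, which is omitted here):

```python
# Field-averaging deinterlace sketch: build each output frame by averaging
# the current field with the buffered previous field. Combing artifacts are
# traded for one field's worth of ghosting on moving content.

def blend_fields(curr_field, prev_field):
    """Average corresponding scanlines of two adjacent fields."""
    return [
        [(c + p) // 2 for c, p in zip(curr_line, prev_line)]
        for curr_line, prev_line in zip(curr_field, prev_field)
    ]

prev = [[0, 0], [0, 0]]          # previous field: all black
curr = [[200, 200], [100, 100]]  # current field: bright content arrived
print(blend_fields(curr, prev))  # -> [[100, 100], [50, 50]]: a ghost of both
```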
Re: Playing with the PS2 (Lets Get Crazy)
Fudoh wrote: "somebody really needs to convince me of this. I haven't used a modded GBS82xx, but from my understanding the deinterlacing doesn't change by adding the GBS Control software. The lag associated with it might, but not the algorithms. Is this correct?"

Rama should have the specifics on whether he did anything to it other than make it fast. I don't have any basis of comparison for adaptive deinterlacing; this is my first device that does it, so I can't comment on how well it performs other than to compare it to 480p or bob-deinterlaced 480i, and I have to say I like it. There are some slight artefacts at certain edges, but it's subtle, a different sort of trade-off to the bobbing effect, and certainly worth having in the toolbox for particular games, I find. Community-led deinterlacing targeted at gaming is something I'm really excited for (and downscaling, for that matter); I am very much looking forward to the OSSC Pro.
As someone who has a great eye for good deinterlacing and a huge range of experience with devices, it would be good to hear your opinion, Fudoh, on the GBS Control version of this board (if, in fact, it has a tweaked algorithm). I believe Bob might be doing a video on GBS Control soon? Maybe he'll share his views on the deinterlace quality and compare it with something like the Framemeister.
Re: Playing with the PS2 (Lets Get Crazy)
"I don't think you could argue that it's one frame of lag when the current field is being drawn to the screen as it's received."
No, but I would argue that you can't do it without risking skipping entire fields. You would always need to prioritize the current field over the buffered one. I'll try to give you an example of why this doesn't work.
Re: Playing with the PS2 (Lets Get Crazy)
Fudoh wrote: nah, that's what we consider one frame of lag, since you measure from the oldest content on screen, not from the newest one.
That doesn't make any sense to me. If you use that definition, adding fully-black alternating scanlines to the signal has negative lag, because it blanks out the oldest content.
Also, it does not match what the current lag measurement solutions measure.
GCVideo releases: https://github.com/ikorb/gcvideo/releases
Re: Playing with the PS2 (Lets Get Crazy)
"Also, it does not match what the current lag measurement solutions measure."
You mean with blinking white boxes? If you output interlaced and your sink uses weaving, creating a 50% grey frame in between the black and white ones, you end up with the same reading you'd get from a display with an additional frame of G2G transition time, don't you? In other words, the lag would be measured up to the point where the oldest content is completely gone.
Re: Playing with the PS2 (Lets Get Crazy)
Fudoh wrote: If you output interlaced and your sink uses weaving, creating a 50% grey frame in between the black and white ones, you end up with the same reading you'd get from a display with an additional frame of G2G transition time, don't you? In other words, the lag would be measured up to the point where the oldest content is completely gone.
That depends on the size of the measurement spot and the threshold for detecting a change within the spot. I would expect the sensor to trigger unless you manually fine-tune the threshold to avoid that.
(it's not a 50% grey equivalent either due to gamma)
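The gamma point above can be checked with quick arithmetic: averaging black and white lines in *light* (linear space) and re-encoding lands well above 50% grey. A sketch assuming a plain 2.2 power-law gamma (an approximation; real sRGB uses a piecewise curve, so the exact number differs slightly):

```python
# Quick check (approximation: a plain 2.2 power law, not the exact
# piecewise sRGB curve) that alternating black/white scanlines do not
# average to 50% grey perceptually.
GAMMA = 2.2

def to_linear(v):
    """Decode an encoded [0, 1] value to linear light."""
    return v ** GAMMA

def to_encoded(v):
    """Encode a linear-light [0, 1] value back to gamma space."""
    return v ** (1.0 / GAMMA)

black, white = 0.0, 1.0
# The eye (and a lag sensor) integrates light, so average in linear space:
perceived = to_encoded((to_linear(black) + to_linear(white)) / 2.0)
print(f"{perceived:.3f}")  # ~0.730, i.e. closer to 73% grey than 50%
```

So a weaved black/white pair reads as roughly 73% encoded grey, one more reason the "grey intermediate frame" isn't a clean 50% step for a sensor threshold.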
GCVideo releases: https://github.com/ikorb/gcvideo/releases
-
- Posts: 2180
- Joined: Mon Aug 14, 2017 8:34 pm
Re: Playing with the PS2 (Lets Get Crazy)
Fudoh wrote: On the recent chat video on RetroRGB, Voultar called its deinterlacing results superior to the FM, which I highly doubt (I really had to pause the video, catch some air and rewatch that part).
I caught that part too while skimming through that video, and had a similar reaction. I'm very curious to check out a GBS Control unit now, though. If only someone sold ready-made kits for equipment-less and skill-less peasants like me.
Re: Playing with the PS2 (Lets Get Crazy)
There is a solderless solution in the works, but you still need to do some work to the board to clean up power and video.
I have built a few now for gamers who want to use PC CRTs with old consoles; lag tested, and it's ahead of the OSSC by over half a frame.
If anyone in Australia (or worldwide, if you can't find someone close) wants one done, PM me.
It works on the HDMI unit too, but with no audio and only 720p.
Not sure if it's been mentioned, but if you csync or HV-sync mod your PS2, you can natively output 480p deinterlaced for the majority of games with OPL. There are a whole heap of video modes to choose from.
Re: Playing with the PS2 (Lets Get Crazy)
Fudoh wrote: I've recently heard so much praise about the modded GBS82xx. On the recent chat video on RetroRGB, Voultar called its deinterlacing results superior to the FM, which I highly doubt (I really had to pause the video, catch some air and rewatch that part). Others seem to be fixated on the fact that it does motion-adaptive deinterlacing at all, which any cheap-ass converter today does, and which doesn't mean it's any good. From my own testing (admittedly years ago) I wouldn't call the GBS82xx deinterlacing any better than what a HDBoxPro or those generic SCART-to-HDMI solutions (which Bob calls the "worst" scalers in existence) did.
But, please, by all means, I'm ready to get convinced by the opposite.
+1 on this.
We apologise for the inconvenience
-
maxtherabbit
- Posts: 1763
- Joined: Mon Mar 05, 2018 4:03 pm
Re: Playing with the PS2 (Lets Get Crazy)
Fudoh wrote: I've recently heard so much praise about the modded GBS82xx. On the recent chat video on RetroRGB, Voultar called its deinterlacing results superior to the FM, which I highly doubt (I really had to pause the video, catch some air and rewatch that part). Others seem to be fixated on the fact that it does motion-adaptive deinterlacing at all, which any cheap-ass converter today does, and which doesn't mean it's any good. From my own testing (admittedly years ago) I wouldn't call the GBS82xx deinterlacing any better than what a HDBoxPro or those generic SCART-to-HDMI solutions (which Bob calls the "worst" scalers in existence) did.
But, please, by all means, I'm ready to get convinced by the opposite.
orange808 wrote: +1 on this.
why do we have to convince y'all? Try it for yourself or don't; it's up to you.
-
maxtherabbit
- Posts: 1763
- Joined: Mon Mar 05, 2018 4:03 pm
Re: Playing with the PS2 (Lets Get Crazy)
Fudoh wrote: I haven't used a modded GBS82xx, but from my understanding the deinterlacing doesn't change by adding the GBS Control software. The lag associated with it might, but not the algorithms. Is this correct?
I've never used an unmodded GBS, so I wouldn't know.
Just try it.
Re: Playing with the PS2 (Lets Get Crazy)
maxtherabbit wrote: why do we have to convince y'all? Try it for yourself or don't; it's up to you.
Seems quite obvious to me. "It outperforms the Framemeister" is a strong assertion, and that kind of statement invites further investigation.
Simple as that, really...
We apologise for the inconvenience
-
- Posts: 386
- Joined: Sun Jul 04, 2010 11:14 pm
Re: Playing with the PS2 (Lets Get Crazy)
Syntax wrote: I have built a few now for gamers to use PC CRTs on old consoles; lag tested, and it's ahead of the OSSC by over half a frame.
Doesn't the OSSC require a video buffer in order to have "any" lag? I thought it was down to three scanlines or something...
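For a rough sense of scale, here is a back-of-the-envelope sketch (my own arithmetic from standard NTSC timing, not measured from any device) of what buffering a few scanlines costs versus buffering a field or a frame:

```python
# Rough latency cost of buffering N scanlines of a 480i signal.
# NTSC timing: 525 total lines per frame (including blanking) at
# ~29.97 frames per second. Illustration only.
FRAME_RATE = 30000 / 1001      # NTSC frame rate in Hz
LINES_PER_FRAME = 525          # total scanlines, including blanking

line_time_us = 1e6 / (FRAME_RATE * LINES_PER_FRAME)  # ~63.6 us per line

for n, label in [(3, "a few lines"),
                 (262, "one field"),
                 (525, "one frame")]:
    print(f"{label:12s} ~{n * line_time_us / 1000:6.2f} ms")
```

Three scanlines of buffer is on the order of 0.2 ms, which is why line-multiplier latency is usually quoted as effectively zero next to a field or frame of buffering.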
"Don't HD my SD!!"
Re: Playing with the PS2 (Lets Get Crazy)
@Guspaz and unseen:
I searched my files but wasn't able to locate the proper illustrations. I found a number of email exchanges covering the same topic, though. Let me try to paraphrase, and please, by all means, share your thoughts about it.
By reducing the lag to zero (as suggested by both of you), you would always have to prioritize the current field. This means displaying all the original lines from that particular field, plus filling the gaps either by interpolation or by copying data from the previous field only. The problem here is motion. It applies to some degree in all directions, but the examples are easiest to make with vertical movement, especially of thin horizontal lines, and by extension of all horizontal edges. To make an extreme example, think of a horizontal line that's only 1px thick (it can also be the top or bottom line of any larger object). Due to the interlacing it's already only present on every other field, but to reduce the bobbing effect you need to display it on every output frame (i.e. apply a temporal interpolation). Now if that line starts moving up or down, the deinterlacer can't tell whether the object is actually moving or whether it's just not present on the current field due to interlacing. If you use the current field as your reference and only have the previous field, there's no way to tell. By adding one frame of lag and displaying the previous frame/field, interpolating from both the older and the next field AND, most importantly, being able to tell what happens on the next frame/field, these decisions become possible. Without looking a frame ahead, it's impossible to catch changes in direction, and thus it's not possible to tell interlacing apart from movement.
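To make the thin-line example above concrete, here is a toy simulation (entirely made-up data, not modeled on any real deinterlacer) of an instant where the line is invisible in both the current and the previous field, so a zero-lag deinterlacer has nothing to go on, while one field of delay brings the deciding evidence into view:

```python
# Toy illustration of the thin-line problem: a 1px horizontal line is
# sampled by alternating fields, and for a moving line there is a
# moment where NEITHER the current nor the previous field contains it.

HEIGHT = 6

def field(line_row, t):
    """Sample a progressive frame (one lit row) as field number t:
    even t keeps even rows, odd t keeps odd rows."""
    rows = [1 if r == line_row else 0 for r in range(HEIGHT)]
    return {r: rows[r] for r in range(t % 2, HEIGHT, 2)}

# Line position at each field time: static, vs. moving down one row
# per frame (i.e. every two fields).
static_scene = [2, 2, 2, 2]
moving_scene = [2, 2, 3, 3]

static_fields = [field(p, t) for t, p in enumerate(static_scene)]
moving_fields = [field(p, t) for t, p in enumerate(moving_scene)]

# Through t=1 the two field streams are identical...
assert static_fields[:2] == moving_fields[:2]

# ...but at t=2 the moving line sits on row 3 (odd) while field 2
# samples even rows: the line is invisible in BOTH the current and
# the previous field.
assert all(v == 0 for v in moving_fields[2].values())
assert all(v == 0 for v in moving_fields[1].values())

# A zero-lag deinterlacer (current + previous field only) has no
# evidence the line exists at t=2. Delay the output by one field and
# you also see field 3, where the line reappears on row 3:
assert moving_fields[3][3] == 1
print("t=2 is undecidable without looking ahead to field 3")
```

Any deinterlacer restricted to the current and previous fields must render t=2 blank here, which is wrong for the moving scene; the look-ahead field is what disambiguates.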
Re: Playing with the PS2 (Lets Get Crazy)
"Another very simple form of deinterlacing similar to weave is to average the current and previous fields together, offset by one pixel. This has the same properties as weave (in terms of latency) but replaces the combing artifacts with one frame's worth of ghosting, which may also be considered less distracting than combing, depending on the content."
Raw blending isn't great. Again, it's something that was used on media files two decades ago, simply because no better form of video deinterlacing was manageable on the hardware back then; I don't think any processor ever did this.
But of course you're right that not every deinterlacing method can be applied to all content. After all, there's still the problem with 30Hz effects (like flickering shadows). There are single-field processing solutions to that. For example, the XRGBs already shifted the doubled fields towards each other by 1 line, reducing the bobbing effect quite a bit and allowing for a complete restoration of 240p content being output as 480i (the Psikyo titles Dragon Blaze and Sengoku Blade on PS2, for example, look stunning on the XRGB-3: not a hint of bobbing, and impossible to tell apart from their 240p counterparts; the same titles on the OSSC still show considerable bobbing). By building on that and, for example, adding a 1px blur, I can imagine that we could get great zero-delay deinterlacing for A LOT of fast content.
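For what it's worth, the blend approach quoted above can be sketched in a few lines (a toy with made-up pixel values and a crude one-line offset, not anyone's actual implementation):

```python
# Toy field-blend deinterlace: line-double each field into a full frame
# (with the odd field offset by one line), then average the two frames
# pixel by pixel. Combing turns into one field's worth of ghosting.

def bob(field_rows, odd):
    """Line-double one field to full height; `odd` shifts it down a line
    by dropping the first doubled row and repeating the last (crude)."""
    frame = []
    for row in field_rows:
        frame.append(list(row))
        frame.append(list(row))
    if odd:
        frame = frame[1:] + [frame[-1]]
    return frame

def blend(cur_field, prev_field, cur_is_odd):
    """Average the bobbed current and previous fields."""
    a = bob(cur_field, cur_is_odd)
    b = bob(prev_field, not cur_is_odd)
    return [[(x + y) / 2 for x, y in zip(ra, rb)] for ra, rb in zip(a, b)]

# Two 2-line fields of a 4-line frame; a bright feature moved between them.
prev_even = [[0, 0, 0], [9, 9, 9]]   # carries rows 0 and 2
cur_odd   = [[0, 9, 0], [0, 9, 0]]   # carries rows 1 and 3
for row in blend(cur_odd, prev_even, cur_is_odd=True):
    print(row)  # half-intensity ghosts where only one field had detail
```

Latency matches weave (no look-ahead needed), which is exactly why the trade-off is ghosting rather than combing.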
Last edited by Fudoh on Tue Jun 23, 2020 10:48 am, edited 1 time in total.