"Or does that line just mean that if your JSON file only had one profile in it, it would just go to that one profile?"

A JSON file with a single profile will be imported to the currently selected profile.
EDIT: I'm dumb
FBX wrote:
Aspect correction is a hotbed topic. Some worry about it, some don't care. Some even think square pixels look more correct on certain games, like Super Mario World for example. However, one cool new feature of the OSSC as far as 384x240 goes is that Marqs now has it scale vertically at 4x when you use line4x, but scale the horizontal samples at 3x. This ends up looking much better than 4x4 square pixels due to how wide 384x240 is.

When I found optim mode I was excited because I love the ultra-sharp look, but then I realized it was distorting the aspect ratio. If I understand correctly, the aspect ratios of all games are at least somewhat distorted in optim mode. Do I have that right?
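A quick way to see why FBX's 3x-horizontal/4x-vertical mode reads as less distorted than 4x4 square pixels: compare the effective pixel aspect ratios against the one a 4:3 display implies for a 384x240 source. A minimal sketch of that arithmetic (nothing here is OSSC code, just the numbers from the post):

```python
# Effective pixel aspect ratio (PAR) comparison for a 384x240 source on a 4:3 display.
SRC_W, SRC_H = 384, 240

target_par = (4 / 3) / (SRC_W / SRC_H)  # PAR needed to fill 4:3 exactly: ~0.8333
par_4x4 = 4 / 4                         # square pixels (4x horizontal, 4x vertical)
par_3x4 = 3 / 4                         # line4x for 384x240: 3x horizontal, 4x vertical

for name, par in [("4x4", par_4x4), ("3x4", par_3x4)]:
    print(f"{name}: PAR {par:.4f}, error vs 4:3-correct {abs(par - target_par):.4f}")
```

Square pixels miss the 4:3-correct ratio by about 0.17, while the 3x/4x mix misses by only about 0.08, which lines up with FBX's "looks much better" observation even though neither is exact.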
Konsolkongen wrote:
Setting lipsync to auto and 0ms works on my A2080 on all sources, regardless of whether they are connected through HDMI directly or via ARC from the TV.

Just picked up the A2080 myself. That thing is a BEAST in terms of weight compared to my old unit. Haha. Now that I have that purchased, I need to build a new shelving unit to support its weight.
sofakng wrote:
Does the new OSD video overlay also show the current signal information? The videos I've seen show people navigating the OSD menu, but I haven't seen the actual signal (resolution/frequency) displayed on the OSD overlay.

It displays whatever is on the LCD, so it'll show the current signal info when the LCD shows it. In some people's cases, the OSD probably disappears before their HDMI chain can recover from the mode-change blackout.
sofakng wrote:
Also, has there been any discussion about adding OSD (menu) control via serial or other means? Even if I could just inject IR codes directly into the OSSC via serial or the like, it would be more reliable than sticking an IR dongle on the OSSC IR receiver.

I requested serial control about a year and a half ago, but I'm not aware of any work that might have been done towards it.
nmalinoski wrote:
It displays whatever is on the LCD, so it'll show the current signal info when the LCD shows it. In some people's cases, the OSD probably disappears before their HDMI chain can recover from the mode-change blackout.

OK, that is what I thought might happen. Can you press a button on the remote control to re-display the status information?
nmalinoski wrote:
I requested serial control about a year and a half ago, but I'm not aware of any work that might have been done towards it.

I've seen that thread and I also commented on it. I also saw this response from marqs:

marqs wrote:
It'd be easy to map the current user I/O functionality to UART, i.e. treating a received char as a remote control key and printing out the same data that is output on the character display. I'm not sure how useful that'd be, though. A more fancy interface supporting direct access to settings and status would require more logic, which would consume some of the little remaining memory.

It sounds like FPGA resources are almost exhausted, but even a limited interface like the one he describes (treating chars as remote control keys) would be useful. Hopefully he will consider this kind of functionality.
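For anyone curious what marqs's "treat a received char as a remote control key" idea could look like from the host side, here's a minimal sketch using pyserial. Everything in it is hypothetical: the port name, baud rate, and char-to-key mapping are made up for illustration, since no such UART interface currently exists in the OSSC firmware.

```python
# Hypothetical host-side driver for the UART interface marqs describes:
# one character sent = one remote-control key press; the OSSC would echo
# back whatever it prints on its character display.
import serial  # pyserial

KEYS = {  # illustrative mapping only; real key codes would be firmware-defined
    "menu": b"M",
    "up": b"U",
    "down": b"D",
    "ok": b"O",
}

def press(port: serial.Serial, key: str) -> None:
    """Send one 'key' char and print the OSSC's character-display echo."""
    port.write(KEYS[key])
    reply = port.read_until(b"\n")  # assumes a newline-terminated echo
    print(reply.decode(errors="replace").rstrip())

# e.g. navigate into the menu and confirm a setting
with serial.Serial("/dev/ttyUSB0", 115200, timeout=0.5) as ossc:
    for key in ("menu", "down", "down", "ok"):
        press(ossc, key)
```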
NormalFish wrote:
Could anyone test and see? Are these games not actually 512x240, maybe? Or is the old reading from a dated firmware, perhaps?

I was the one who posted the 683 value on the wiki page a while ago, along with a few of the other PS1 modes. The theoretical optimal value is 682.5 (see below), but at the time the OSSC didn't support fractional sample rates, so it was a round approximation that I visually confirmed as "close enough" on some dither patterns. Since fractional sample rates are now a thing, the value should be updated. Spyro was indeed one of the games I tested, along with Crash Bandicoot and a few others.
awe444 wrote:
I was the one who posted the 683 value on the wiki page a while ago, along with a few of the other PS1 modes. The theoretical optimal value is 682.5 (see below), but at the time the OSSC didn't support fractional sample rates, so it was a round approximation that I visually confirmed as "close enough" on some dither patterns. Since fractional sample rates are now a thing, the value should be updated. Spyro was indeed one of the games I tested, along with Crash Bandicoot and a few others.

Yes, I used 683.00 since that's what the wiki stated, but I'd also tried 682.5 and it was just as bad. I don't doubt that your math is right, though.
The theoretical value of 682.5 comes from:
- Pixel clock rate in this mode is (945/88) MHz according to https://pineight.com/mw/index.php?title=Dot_clock_rates
- 263 lines per frame (as reported by the OSSC)
- Assumes the line rate is NTSC-standard (4500000/286) Hz. If true, the frame rate is (4500000/286)/263 = 59.82610545 Hz, which is consistent with the OSSC's reported frame rate to within the reported digits.
Given the above numbers, the sample rate is:
(10^6 * 945/88) / (4500000/286) = 1365/2 = 682.5
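Since the derivation is just two rates divided, it's easy to verify with exact arithmetic. A small sketch using Python's fractions module (the constants are exactly the ones listed above):

```python
# Theoretical OSSC horizontal sample rate = pixel clock / line rate.
from fractions import Fraction

LINE_RATE_HZ = Fraction(4_500_000, 286)  # NTSC-standard line rate

def sample_rate(pixel_clock_hz: Fraction) -> Fraction:
    """Samples per scanline for a given dot clock."""
    return pixel_clock_hz / LINE_RATE_HZ

ps1_512_clock = Fraction(945, 88) * 10**6  # PS1 512-wide dot clock, ~10.7386 MHz
print(sample_rate(ps1_512_clock))          # -> 1365/2, i.e. 682.5
```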
@NormalFish, are you by any chance testing with a PS2? I ask because the PS2's 512x480i sample rate is 686.4, which should be the same sample rate as the PS2's 512x240p (though I've not tested), and which is coincidentally close to your reported best value.
NormalFish wrote:
Yes, I used 683.00 since that's what the wiki stated

I just now updated the wiki page numbers; apologies for the staleness there!
NormalFish wrote:
but I'd also tried 682.5 and it was just as bad. I don't doubt that your math is right, though.

OK, yes, that makes sense that it's a hardware difference. The 686.4 value comes from the PS2 having a pixel clock rate of (54/5) MHz = 10.8 MHz in this mode, compared to the PS1's 945/88 ≈ 10.738636 MHz. Substitute that into the formula from earlier and you get 686.4000 as the theoretical sample rate for PS2 512x240p.
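Plugging the PS2's 10.8 MHz clock into the same formula reproduces the 686.4 figure exactly (self-contained version of the earlier sketch):

```python
from fractions import Fraction

LINE_RATE_HZ = Fraction(4_500_000, 286)     # NTSC-standard line rate, as above

ps2_512_clock = Fraction(54, 5) * 10**6     # PS2 512-wide dot clock, 10.8 MHz
print(float(ps2_512_clock / LINE_RATE_HZ))  # -> 686.4
```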
I am using a PS2. I did notice that my result was much closer to the expected result for 512x480i on a PS2, but I wasn't entirely sure why that might be the case, and 686.4 doesn't look any different from anything in the 686.25-686.75 range (though this may have to do with how the OSSC applies the setting?). I also had a different result from the wiki for 320x240p PS1 games via my PS2, so maybe this is the common factor.
Edit: Yeah, in fact the 320x240 value, 429, is half the standard horizontal total of 858 for 480p. It's gotta be a PS2-vs-PS1 difference, unless someone else can test on a PS1.
awe444 wrote:
OK, yes, that makes sense that it's a hardware difference. The 686.4 value comes from the PS2 having a pixel clock rate of (54/5) MHz = 10.8 MHz in this mode, compared to the PS1's 945/88 ≈ 10.738636 MHz. Substitute that into the formula from earlier and you get 686.4000 as the theoretical sample rate for PS2 512x240p.

Glad this makes sense, then. I was really baffled initially. I'll be slowly putting together profiles until all my PS1 games work on my PS2 with perfect scaling, so I'll probably be able to check the other resolutions too. If folks are able to work them out mathematically, though, that'll save me some time for sure. I haven't tried any games that run in 256x240 or 384x240. Not sure how many obscure resolutions there might be.
Colback wrote:
By setting the output resolution of the DVDO Edge to 1080i@60, it completely gets rid of the wobbling, thus preventing faster burn-in and image retention. It by no means makes the games look 480p-ish, but it works great for me. I can't test the lag of this setup, but it should be minimal considering the OSSC's bob deinterlacing and that the DVDO's game mode is on.

Has somebody really experienced burn-in on an OLED caused by bob deinterlacing? I think a static picture is the sole concern with self-emissive displays, so bob deinterlacing could actually mitigate burn-in.
marqs wrote:
Has somebody really experienced burn-in on an OLED caused by bob deinterlacing? I think a static picture is the sole concern with self-emissive displays, so bob deinterlacing could actually mitigate burn-in.

I'm pretty sure the answer to burn-in is "no"; however, many people experience image retention and panic. Then they go around freaking out, telling everyone about it, which (as happens on the internet) eventually turns into people repeating the wrong info. I get image retention all the time on my OLED regardless of bob deinterlacing or just gaming. I've also heard from trusted sources that they've seen IR on their LCD TVs too with bob deinterlacing...
retrorgb wrote:
If I'm wrong, I'll be happy to write a detailed warning post and do a video about it, but I'm pretty sure it's just misinformation: IR is not burn-in.

You're right. Image retention is not burn-in: burn-in is permanent, while image retention is temporary.
retrorgb wrote:
If I'm wrong, I'll be happy to write a detailed warning post and do a video about it, but I'm pretty sure it's just misinformation: IR is not burn-in.

Burn-in is what I used to see on plasma displays used for signage, back when plasma technology was new in the early 2000s. I designed the signage that was burned into large plasma panels worth many thousands of pounds each, so I'm well aware of the effect.
geiger9 wrote:
I posted on VGP first (https://www.videogameperfection.com/for ... questions/) but I feel this question is less about how the actual device works and more about how other devices interact with it. I figure this forum is the right place. This thread is quite large, so forgive me if I haven't read through all of it.

Regarding lag and the OSSC: I am going to be using original hardware (SNES, NES, MD), so no lag there. The OSSC introduces no lag regardless of the multiplier. The lag will come into effect after the display device receives the signal. The display will introduce lag either from the upscaling required or from the picture processing (it's both of those things, right?).

1. Will it matter if the OSSC feeds into a monitor designed for gaming versus a TV? The three systems I listed above have odd resolutions, and I'm wondering if a "gaming monitor" will have an easier time accepting the weird resolutions. I'm also told that the refresh rate will cause a problem. For example, the NES runs at 60.10 Hz whereas spec is 59.94 Hz. Again, will a monitor designed for gaming be more forgiving and accept it?

2. Has anyone hooked up an AV Famicom with Tim Worthington's RGB mod plus a YPbPr transcoder? Is it affected by jitter? If I am reading his website right, it indicates it will be, but perhaps I'm reading that wrong.

3. Do both monitors and TVs use HDCP? My thinking was that if the OSSC feeds a signal over HDMI, the TV has one more thing to process: the content protection. If both types of displays use it, will converting HDMI to DVI fix this, so there is one less thing to process and contribute to lag?

Gaming monitors typically have less input lag, but for the most part input resolution doesn't have much to do with input lag. For example, at all resolutions my LG C9 accepts, input lag is exactly the same (AFAIK). I suspect it is the same with most monitors. Whether or not it's designed for gaming is not likely to be much of an indicator of whether the display will accept signals from the OSSC. For that, you'll just want to look at the OSSC wiki and the VGP forums for compatibility reports, or buy from somewhere with a good return policy.