What's the proper way of playing old, low resolution games on modern monitors?

Maggot

Arcane
Patron
Joined
Mar 31, 2016
Messages
1,243
Codex 2016 - The Age of Grimoire
Don't see the need to when the windows version already works fine for me. Same with VCMI for Heroes 3.
 

Endemic

Arcane
Joined
Jul 16, 2012
Messages
4,448
Don't see the need to when the windows version already works fine for me. Same with VCMI for Heroes 3.

Sure, it's just nice to be able to see more of the map/UI and customise things a little. For Heroes 3 I just use the HD mod.
 

Jarpie

Arcane
Patron
Joined
Oct 30, 2009
Messages
6,708
Codex 2012 MCA
The problem with CRT monitors, in my experience, is that they tend to wear out with prolonged use, even more so than flatscreen TVs/monitors. I managed to find a BenQ flatscreen TV/monitor which has pretty good SD support, even though C64 games don't look quite as good as on old CRTs. It has HDMI, S-Video, component, composite and RGB inputs, which is nice.
 
In My Safe Space
Joined
Dec 11, 2009
Messages
21,899
Codex 2012
Agris and the others already said everything, but I really wanna reinforce that you should not play 2D games like Fallout or Planescape: Torment in high resolutions, as the art gets too small and you might see more stuff than the designers intended.

Shit like this is just wrong:

[attached screenshot: Fallout running at a high resolution]


It kills any weight the area has, reducing it to a toybox.
Isometric view replicates a perspective of a table with miniatures, so a toybox is a quite appropriate impression for these games. You don't see more than intended in Fallout if the view isn't larger than the scrolling range limit.
 

Ezekiel

Arcane
Joined
May 3, 2017
Messages
6,688
I usually keep the 4:3 aspect ratio because a lot of games get distorted in widescreen (Sometimes it's the cutscenes or menus.) or are too much of a hassle to convert to widescreen. I don't mind. I watch old movies in 4:3 all the time. I use the highest resolution that's available for my monitor.
 

Epsilon

Cipher
Joined
Jul 11, 2009
Messages
428
I see people recommending the DAUM build of DOSBox a lot. It's terribly outdated and it doesn't have the pixel perfect patch. The pixel perfect patch is incredible if you're playing DOS games on a widescreen, because it keeps the scaled picture from getting distorted at higher resolutions, so it stays 'perfect' like it was at 320x200 or 640x480, etc.
The ECE build is by far the way to go for this; it also comes with a built-in Voodoo wrapper as well as a recent build of the MT-32 (Munt) emulator.
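For anyone wondering what "pixel perfect" actually means here, a minimal sketch of the core idea in Python (my own illustration, not the patch's code): pick the largest whole-number factor at which the game's framebuffer still fits the display, so each game pixel becomes an exact block of screen pixels. The real patch does more than this, e.g. handling 320x200's non-square pixels.

Code:
# Minimal sketch of the idea behind "pixel perfect" scaling - my own
# illustration, not the patch's actual code. Find the largest integer
# factor at which the game's framebuffer still fits the display, so each
# game pixel becomes an exact k x k block of screen pixels. (The real
# patch also has to deal with 320x200's non-square pixels, which were
# meant to be shown at 4:3; that part is ignored here.)

def max_integer_scale(game_w, game_h, screen_w, screen_h):
    """Largest k such that k*game_w <= screen_w and k*game_h <= screen_h."""
    return min(screen_w // game_w, screen_h // game_h)

k = max_integer_scale(320, 200, 1920, 1080)       # -> 5
print(f"320x200 -> {k}x -> {320 * k}x{200 * k}")  # -> 1600x1000, with black
                                                  #    borders filling the rest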
 
In My Safe Space
Joined
Dec 11, 2009
Messages
21,899
Codex 2012
Isometric view replicates a perspective of a table with miniatures, so a toybox is a quite appropriate impression for these games.
Except size matters. Your toy box would be a lot less exciting if all toys were reduced to like 1/4 of their size.
To 58% of their size. A typical monitor in 1998 would be 14", not 30", and characters on screen would be about 28mm, like miniatures in many tabletop systems.
 

felipepepe

Codex's Heretic
Patron
Joined
Feb 2, 2007
Messages
17,310
Location
Terra da Garoa
Isometric view replicates a perspective of a table with miniatures, so a toybox is a quite appropriate impression for these games.
Except size matters. Your toy box would be a lot less exciting if all toys were reduced to like 1/4 of their size.
To 58% of their size. A typical monitor in 1998 would be 14", not 30", and characters on screen would be about 28mm, like miniatures in many tabletop systems.
Maybe for you a 30" monitor is the standard, but I'm playing on a 14" notebook. And I shall play it as Tim Cain himself designed! :obviously:
 
Joined
Dec 5, 2010
Messages
1,620
For your 1920x1080 monitor, that means playing games in fullscreen 960x540 perfectly maps 4 of your monitor's pixels to 1 of the game's.
Are you sure modern GPU drivers and/or monitors are doing perfect 4:1 mappings when they detect you're running at half the screen's vertical/horizontal resolution? I was under the impression they would still run the same general upscaling algorithm they use for an arbitrary sub-resolution, which still introduces some upscaling artifacts/blurriness (though a lot less than for a non-multiple resolution). I only know of a few 4K TVs that have special options to do perfect 4:1 and 9:1 mappings of 1080p and 720p resolutions.
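For what it's worth, here's what a "perfect" 4:1 mapping would have to look like, spelled out as a small numpy sketch (numpy assumed installed; random noise stands in for a game frame). It doesn't answer what any particular driver or monitor actually does, it only pins down the property you'd be checking a captured 1080p frame against:

Code:
# What a "perfect" 4:1 mapping would have to look like, spelled out with
# numpy (assumed installed). This doesn't tell you what a given driver or
# monitor actually does - it only pins down the property you'd check a
# captured 1080p frame against. Random noise stands in for a game frame.
import numpy as np

src = np.random.randint(0, 256, size=(540, 960, 3), dtype=np.uint8)

# Integer 2x upscale: every source pixel is duplicated into a 2x2 block.
dst = src.repeat(2, axis=0).repeat(2, axis=1)    # shape (1080, 1920, 3)

assert dst.shape[:2] == (1080, 1920)
# All four interleaved sub-grids equal the source: nothing blended, nothing invented.
assert all(np.array_equal(dst[i::2, j::2], src) for i in (0, 1) for j in (0, 1))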

Maybe one day manufacturers will invent flat panel monitors that don't have a native resolution? Wait to play these games until then.
If a flat panel/matrix screen's native resolution/pixel density ever gets high enough ("retina"?), the end result might be the same to human eyes.
 
Last edited:
Self-Ejected

Drog Black Tooth

Self-Ejected
Joined
Feb 20, 2008
Messages
2,636
Fuckin' ViewSonic E655! 15 inches of radiation poisoning.

That was my first monitor, on my Win 98 machine. Fuckin' hated the thing, my eyes would start hurting after about 2 hours of staring at it.

And now you hipsters are lugging this junk around like it's some monocled holy grail of gaming.

FFS.
 

Raghar

Arcane
Vatnik
Joined
Jul 16, 2009
Messages
24,065
[two attached screenshots]

I play them this way. Of course, in-game movies are at a smaller resolution, which might cause problems.
 
In My Safe Space
Joined
Dec 11, 2009
Messages
21,899
Codex 2012
Isometric view replicates a perspective of a table with miniatures, so a toybox is a quite appropriate impression for these games.
Except size matters. Your toy box would be a lot less exciting if all toys were reduced to like 1/4 of their size.
To 58% of their size. A typical monitor in 1998 would be 14", not 30", and characters on screen would be about 28mm, like miniatures in many tabletop systems.
Maybe for you a 30" monitor is the standard, but I'm playing on a 14" notebook. And I shall play it as Tim Cain himself designed! :obviously:
Wait, there are 14" notebooks that have the same native resolution as my 24" monitor nowadays? Whoa, didn't know stuff got so advanced D: . Saw the screenshot and thought you were playing on a big screen.
Yeah, tiny screen + huge native resolution sounds like a real pita for a retrogamer.
 

octavius

Arcane
Patron
Joined
Aug 4, 2007
Messages
19,685
Location
Bjørgvin
Classic GOG threads:
"Halp, I'm getting black borders, the game is not working!"
And after being told that's how it's supposed to be:
"I don't care about stretching as long as I don't get black borders, lol."
 
Self-Ejected

Drog Black Tooth

Self-Ejected
Joined
Feb 20, 2008
Messages
2,636
"I don't care about stretching as long as I don't get black borders, lol."
That's p much how old/middle-aged people reacted to widescreen TVs when they first became available.

Dunno why anyone would want to see stuff distorted so much, but w/e, stretching 4:3 to 16:9 is a very common thing to do among the common folk. Just look at all the other DOS/emulator gaming videos on YouTube.
 

Deleted member 7219

Guest
Since there are many intelligent, knowledgeable and polite people on the Codex, I'd like to get your opinion about the best way of playing the games of old on modern monitors. When you play a game which has a resolution of 640x480 on a monitor with a native resolution of 1080p, things can get a bit messy.

Do you know any tricks to get the most out of this situation? Maybe there is a way to only scale up to a certain resolution, or there is a preferred way to scale up the image? I dunno, I'm sure you have a trick for this.

Well, obviously you need to keep the aspect ratio.

Other than that, well, I'm still playing TIE Fighter CD all the time (because it is the best game ever made), despite the resolution of the game being limited to 640x320. It still looks ok even scaled up onto my 1920x1080 monitor (preserving the aspect ratio of course). It's easy to force your graphics card to do this.

I'm playing Broken Sword 2 in the same way, and I make sure I just leave the graphics card's settings set to that so it doesn't try to force 4:3 games into widescreen. You can also set these games to play at their normal resolution (well, Broken Sword 2 at any rate), but it just looks too small and it is a struggle to see. People forget that CRT monitors weren't tiny things. They too expanded the games a little bit beyond their native resolution.
 

agris

Arcane
Patron
Joined
Apr 16, 2004
Messages
6,927
For your 1920x1080 monitor, that means playing games in fullscreen 960x540 perfectly maps 4 of your monitor's pixels to 1 of the game's.
Are you sure modern GPU drivers and/or monitors are doing perfect 4:1 mappings when they detect you're running at half the screen's vertical/horizontal resolution? I was under the impression they would still run the same general upscaling algorithm they use for an arbitrary sub-resolution, which still introduces some upscaling artifacts/blurriness (though a lot less than for a non-multiple resolution). I only know of a few 4K TVs that have special options to do perfect 4:1 and 9:1 mappings of 1080p and 720p resolutions.

*Perfect*? No, I don't know what perfect is to you. To me it's bilinear scaling, which when used in 4:1 physical:logical pixel arrangement should result in minimal distortion. FWIW, when I don't play Fallout 1 or 2 in a 4:1 configuration, the text always has strange artifacts. Playing it in an eXtremely high (read: stupid) resolution will probably mask it, but on a 16:10 playing in 1280x800 for example, the text looks like shit.

It's the subpixels, man.

edit: to your point about TVs, TVs use image processor chipsets that usually aren't bypassable when using computer input, even with dedicated PC video-in ports. It inevitably ends up looking like shit, as that use case is not what the TV's image processor was tuned for and your GPU is already doing the image processing. It's also why most people disable all the DSP effects on monitors, because that results in the best image. Monitors are made with that as an assumed operational mode, and generally use higher quality panels than TVs. We have an old Vizio where I believe the VGA input bypasses the set's DSP, and it looks fucking great. Switch to HDMI in, which isn't dedicated for the PC, and the image looks like trash. The new Samsung 1080p smart TV I have also looks like trash when taking video input from the computer, despite all my soft and hard calibration attempts. But the Netflix / YouTube / Amazon video apps on the TV? They look much better.

/rant
 
Last edited:
Joined
Dec 5, 2010
Messages
1,620
*Perfect*? No, I don't know what perfect is to you.
Perfect to me means without introducing any new information (read: flaws/artifacts) into the picture, so that in theory that 960x540 game frame stretched to fit a 1080p LCD would look as good as if you were watching the same 960x540 frame on a 540p LCD of the same size and quality: nothing lost and nothing added, with each point of the game's 960x540 frame mapped to 4 pixels of the 1080p LCD, or 16 pixels of a 4K one.

To me it's bilinear scaling, which when used in 4:1 physical:logical pixel arrangement should result in minimal distortion.
The way I understand it, when stretching a picture vertically/horizontally by a whole-number factor, a bilinear upscaling algorithm shouldn't actually get to the stage of mixing neighbouring pixels, since it shouldn't have any missing/unknown pixels to compute in the first place, and that should go for just about any upscaling algorithm.
But my question is whether that's actually what's happening in practice. I was under the impression that, rather than Nvidia drivers, AMD drivers and monitors all running the same implementation of the same upscaling algorithm with the same result, each was doing a slightly different implementation with different results, or even running different algorithms altogether: say, Nvidia doing bicubic for upscaling, bilinear for downscaling to a custom resolution and Gaussian for DSR downscaling, or AMD having some dedicated hardware for it...

FWIW, when I don't play Fallout 1 or 2 in a 4:1 configuration, the text always has strange artifacts. Playing it in an eXtremely high (read: stupid) resolution will probably mask it, but on a 16:10 playing in 1280x800 for example, the text looks like shit.
Sure, it looks better than non-multiple res scaling, but does it look as good as it possibly could? Short of some Digital Foundry-esque pixel counting, can you be sure it's perfectly mapping every point of a game's 540p frame to 4 of the 1080p LCD's pixels? My personal experience with DSR, for instance, is that the image and text are at their most readable at 4x DSR with 0% smoothing; I can't personally tell any flaws in the text there compared to native, yet I know the downscaling algorithm is almost certainly introducing some.

And what might not make much of a difference with a modern game's millions of pixels might make a difference with an older game's thousands.

edit: to your point about TVs, TVs use image processor chipsets that usually aren't bypassable when using computer input, even with dedicated PC video-in ports. It inevitably ends up looking like shit, as that use case is not what the TV's image processor was tuned for and your GPU is already doing the image processing. It's also why most people disable all the DSP effects on monitors, because that results in the best image. Monitors are made with that as an assumed operational mode, and generally use higher quality panels than TVs. We have an old Vizio where I believe the VGA input bypasses the set's DSP, and it looks fucking great. Switch to HDMI in, which isn't dedicated for the PC, and the image looks like trash. The new Samsung 1080p smart TV I have also looks like trash when taking video input from the computer, despite all my soft and hard calibration attempts. But the Netflix / YouTube / Amazon video apps on the TV? They look much better.
I'm aware of TVs adding a lot of postprocessing to their inputs in general (which can be partially, or sometimes almost completely, turned off), but I was referring to how only a few 4K TVs supposedly allow the user to choose a "perfect" upscaling mode, like some Panasonic 4K TVs which call it a "1080p Pixel by 4pixels" mode, for instance.

Would be interesting to test whether those TVs really are doing 4:1 mappings of a 1080p input (and 9:1 of 720p) and then compare those results with 4K GPU upscales fed to the same TV.

I think it's a huge wasted opportunity, given how the previous 1280x720 and 1920x1080 resolution standards both map perfectly to 4K (isn't that why it was picked to begin with?). Instead you see a lot of people complaining that 720p or 1080p looks blurrier on their brand new 4K TVs than it did on their old ones.
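The software half of that test is easy to run at home. A quick sketch assuming Pillow and numpy are installed: upscale a tiny black/white checkerboard by exactly 2x with nearest-neighbour and with bilinear, then check whether anything other than the original values shows up. It says nothing about what a GPU driver or a TV's scaler does with the same input, but it does show the two algorithms aren't interchangeable even at a clean integer factor:

Code:
# Software-side check only (assumes Pillow and numpy are installed); it says
# nothing about what a GPU driver or a TV scaler does with the same input.
from PIL import Image
import numpy as np

# 4x4 black/white checkerboard - a worst case for any blending filter.
src = np.zeros((4, 4), dtype=np.uint8)
src[::2, 1::2] = 255
src[1::2, ::2] = 255
img = Image.fromarray(src)

nearest = np.asarray(img.resize((8, 8), Image.NEAREST))
bilinear = np.asarray(img.resize((8, 8), Image.BILINEAR))

# True means the scaler produced values that were not in the original image.
print("nearest introduces new values: ", not set(nearest.flatten().tolist()) <= {0, 255})
print("bilinear introduces new values:", not set(bilinear.flatten().tolist()) <= {0, 255})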
 
Last edited:
Joined
Dec 5, 2010
Messages
1,620
https://www.pcgamesn.com/intel/integer-scaling-support-gen-11-xe-graphics

Intel Gen11 and next-gen graphics will support integer scaling following requests by the community. Intel’s Lisa Pearce confirmed that a patch will roll out sometime in August for Gen11 chips, adding support for the highly-requested functionality in the Intel Graphics Command Center, with future Intel Xe graphics expected to follow suit in 2020.

Enthusiasts have been calling out for the functionality for quite some time, even petitioning AMD and Nvidia for driver support. Why, you ask? Essentially integer scaling is an upscaling technique that takes each pixel at, let’s say, 1080p, and times it by four – a whole number. The resulting 4K pixel values are identical to their 1080p original values, however, the user retains clarity and sharpness in the final image.

Current upscaling techniques, such as bicubic or bilinear, interpolate colour values for pixels, which often renders lines, details, and text blurry in games. This is particularly noticeable in pixel art games, whose art style relies on that sharp, blocky image. Other upscaling techniques, such as nearest-neighbour interpolation, carry out a similar task to integer scaling but on a more precise scale, which can similarly cause image quality loss.


But 4K screens are becoming commonplace in today’s rigs – especially in the professional space where integrated graphics rule supreme. Today’s top gaming graphics tech, the RTX 2080 Ti, manages to just about squeeze by dealing with 8,294,400 pixels all at once, but it’s far from a perfect, one-size-fits-all resolution just yet.

As such, it’s often nice to have the ability to drop down the resolution every once in a while, take a load of your GPU, and do so without sacrificing fidelity. That’s where integer scaling comes in.
 

newtmonkey

Arcane
Joined
Aug 22, 2013
Messages
1,384
Location
Goblin Lair
Integer scaling is where it's at for 2D games. There are some nice applications that can scale a 640x480 game running in window mode to a 1280x960 borderless window (obviously with borders all around) for a clean and crisp image that retains the intended look of the game.

This one is free and seems to work well with the games I tried (Baldur's Gate, Fallout 2, Jagged Alliance 2):
http://tanalin.com/en/projects/integer-scaler/
agris' solution for Fallout works wonderfully too if you want to play the game in widescreen with nice clean image quality, without the game looking like it was made for ants.
(I followed his advice for my recent Fallout 1 playthrough, though for Fallout 2 I am just running it at native res with a x2 integer scale).
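For the curious, the arithmetic behind that 640x480 -> 1280x960 case on a 1920x1080 desktop is trivial (plain Python, not tied to any particular scaler application):

Code:
# Worked numbers for a 640x480 game on a 1920x1080 desktop - plain
# arithmetic, not tied to any particular scaler application.
game_w, game_h = 640, 480
desk_w, desk_h = 1920, 1080

scale = min(desk_w // game_w, desk_h // game_h)   # 2, the largest that fits
win_w, win_h = game_w * scale, game_h * scale     # 1280 x 960 window
border_x = (desk_w - win_w) // 2                  # 320 px left and right
border_y = (desk_h - win_h) // 2                  # 60 px top and bottom
print(scale, win_w, win_h, border_x, border_y)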
 

DalekFlay

Arcane
Patron
Joined
Oct 5, 2010
Messages
14,118
Location
New Vegas
I'm sure Nvidia and AMD will eventually follow suit, and yeah, it's a great thing. Always annoying to mod something like Arcanum to run at 720p or thereabouts and have it look blurry af.
 

Valky

Arcane
Manlet
Joined
Aug 22, 2016
Messages
2,418
Location
Trapped in a bioform
I always play homm3 and toee at their glorious 800x600 resolution. There really is no other objectively correct way to play them.
 

Wyatt_Derp

Arcane
Joined
May 19, 2019
Messages
3,082
Location
Okie Land
Fuckin' ViewSonic E655! 15 inches of radiation poisoning.

That was my first monitor, on my Win 98 machine. Fuckin' hated the thing, my eyes would start hurting after about 2 hours of staring at it.

And now you hipsters are lugging this junk around like it's some monocled holy grail of gaming.

FFS.

This should be memorialized on a plaque somewhere. The headaches I sustained from years of CRT gaming alone... no one should be so nostalgic for something they never enjoyed, and wouldn't enjoy, given the pain endured by those who did. The modern age is about 99% shit, but flat-screen LED/LCD monitors are a true miracle of technological advancement. I once had to help a relative get her old box TV out of her room when she moved. It was just a regular old 32 inch box TV. I swear to god that thing weighed 200 pounds.

People wonder why the sea levels are rising. It's not global warming. It's water displacement. We're sinking our land 'cause it's filled with 60 years worth of CRT monitors and television sets. Those things weren't meant for entertainment, they were meant to stop nazi bullets and commie shrapnel.
 
