
How did old MS-DOS games utilize various graphic cards?


Nowadays each graphics card has a driver in the operating system that implements some (typically standard) API such as OpenGL, so programmers use standardized API calls to tell the graphics card how and what to render. (Actually, even that is fairly low-level; most programmers use a game engine that does this for them.)

In the days of old computers, how was this done? Did every programmer of every game implement support for all the various graphics cards of the time? Or did the old game studios of the MS-DOS era have their own "game engines" that provided some abstraction over these graphics cards?

I remember there were many different card vendors, and old games asked me which one I had, so I suppose these games contained code/drivers for all these cards?










    Yep. There was no DirectX or OpenGL, so what we did was implement the algorithms on our own. I still remember doing BSP and z-buffering myself: you go and read the algorithm spec, understand how it works and what it does, and you code it. More hardcore: you don't have a spec and you come up with your own algorithm. And then you create a full map of a frame and write it directly to the card RAM.

    – Alma Do
    Jun 4 at 12:14












    Wow... this brings back memories. Most graphics cards could fall back to the CGA standard. Hercules, Tandy and the PCjr, and EGA were other common choices. For simple games that didn't have configuration files, the game would ask you for your graphics card (and later your sound card) when you started it. Hardware detection was nonstandard or nonexistent. The VESA standards came along to try to standardize the graphics modes, but there still wasn't much support for APIs in DOS days. Even in the early Windows and OS/2 days, the APIs left a lot to be desired in standardization.

    – GuitarPicker
    Jun 4 at 12:58











    Here's a recent take on this old problem - youtu.be/szhv6fwx7GY. It's about the making of "Planet X3", which is a newly released old-school DOS game.

    – Brian H
    Jun 4 at 21:02











    There is a GDC talk on YouTube regarding palette cycling, which was a clever and very efficient way of faking animation. Helpfully, someone implemented a web version which demonstrates some of the mind-blowing results.

    – TemporalWolf
    Jun 4 at 22:25











    Bear in mind, you didn't support individual graphics cards so much as you supported a graphics standard -- like CGA, EGA, VGA, Tandy, etc. -- which was often based on a particular standard-setting card, like the IBM Color Graphics Adapter, or the chipset on a Tandy 1000. The only applications that cared about anything more than that were ones that targeted certain chipsets, like S3 or Tseng or Mach. But except for the OEMs that made their own cards, even those chips were used in many different graphics cards. Compatibility was a hardware problem back then.

    – SirNickity
    Jun 4 at 23:35















graphics programming video gaming






asked Jun 3 at 22:55 by Petr (new contributor)
edited Jun 5 at 16:36 by rchard2scout
6 Answers

































Did every programmer of every game implement all the various APIs that old graphics cards supported?




Yes, but it went even deeper than that. Early graphics cards had virtually no callable code associated with them at all; the concept of "drivers" had not quite become a reality yet. There was the concept of a Video BIOS, which provided extensions to the INT 10h BIOS video services and was effectively limited to initialization and switching video modes.



Instead, graphics cards, at least in DOS land, all had memory-mapped display RAM, and extensive documentation was available about exactly how setting various bits in display RAM would affect the pixels that appeared on the screen. There were no drawing APIs to call: if you wanted something to appear on the screen (whether a pixel, a character, a line, a circle, a sprite, etc.) you would write the code to move the bytes into the right places in display RAM. Entire books were written about how to write efficient code to draw graphics.
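As a concrete flavor of what that documentation spelled out, here is a hedged sketch, in portable C, of the address arithmetic for the classic CGA 320x200 four-color mode: even scanlines start at offset 0 of segment B800h, odd scanlines at offset 2000h, each row is 80 bytes, and each byte packs four 2-bit pixels with the leftmost pixel in the high bits. The arithmetic is real; the actual hardware poke is left as a comment, since it only applies on a real-mode DOS machine.

```c
#include <stdint.h>

/* CGA 320x200 4-color mode: compute where pixel (x, y) lives inside
   the B800h display segment, and which bits inside that byte it uses.
   This is just the arithmetic; on real hardware you would poke the
   byte at B800:offset directly (no OS call involved). */
typedef struct { uint16_t offset; uint8_t shift; } cga_addr;

cga_addr cga_locate(int x, int y) {
    cga_addr a;
    a.offset = (uint16_t)((y & 1) * 0x2000  /* odd scanlines in second bank */
                        + (y >> 1) * 80     /* 80 bytes per scanline        */
                        + (x >> 2));        /* 4 pixels per byte            */
    a.shift  = (uint8_t)(6 - 2 * (x & 3));  /* leftmost pixel = high bits   */
    return a;
}

/* Merge a 2-bit color into an existing display byte. */
uint8_t cga_put(uint8_t byte, uint8_t shift, uint8_t color) {
    return (uint8_t)((byte & ~(3 << shift)) | ((color & 3) << shift));
}
```

For example, pixel (5, 3) lands at offset 2051h with its two bits at shift 4; exactly this kind of "interlaced bank" layout is what those books documented bit by bit.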



There were some systems like the Borland Graphics Interface that abstracted graphics drawing primitives into an API with different drivers that one could call to draw things on different graphics cards. However, these were typically slower than what would be required for building action type games.



An action game would typically be optimized for a particular graphics display mode on a particular card. For example, a popular display mode was VGA 640x480 with 16 colors. This would be listed in the software requirements, and you needed to have the right hardware to support the game. If you bought a VGA game but you only had an EGA card, then the game would not work at all. As you said, some games would ask what you had and you had to know what the right answer was, otherwise the game would not work.






    I wrote a few simple action games with BGI in the early 90s. It was OK for single-screen affairs with a few sprites moving around - like Space Invaders or Pac Man - but was well short of what you'd need for full screen scrolling or 3D ray-casting games.

    – Matthew Barber
    Jun 3 at 23:37











    Strictly speaking there was an API, BIOS service 10h, but you wouldn’t use it for (most) games! But there certainly was callable code associated with graphics cards (or in the case of MDA and CGA, provided by the system BIOS).

    – Stephen Kitt
    Jun 4 at 6:10












    Worth noting that sound cards were the same situation - games had to directly support AdLib/OPL2 cards, Creative C/MS cards, SoundBlaster, MT-32, etc. Each added card needed custom code written for the specific hardware in question. Printers were like this for a while also - word processors needed to write hardware code to support specific printers directly, there were no drivers or APIs, no abstraction, just bare metal.

    – J...
    Jun 4 at 15:24











    @J... I was under the impression that you could just dump bare characters into the parallel port and the printer would then dump them onto paper.

    – John Dvorak
    Jun 4 at 20:12











    @JohnDvorak As long as you were satisfied with only printing text, and only in the printer's built-in font...

    – Wumpus Q. Wumbley
    Jun 4 at 20:33
































Early on, you had to explicitly code your game for each graphics card you wanted to support: Hercules, CGA, Tandy, EGA, VGA. You had to know how to put the card into graphics mode and you had to know the memory layout, palette, and so on. You had to figure out how to avoid flicker and how to prevent tearing. You had to write your own line drawing and fill routines, and if you wanted 3-D, you had to know how to project it onto a 2-D screen, how to remove hidden lines and so on.



Later when graphics cards started gaining accelerated functionality, SGI created the IrisGL API, which later became OpenGL, in order to simplify CAD (computer aided drafting/design) software development for those graphics cards by providing a standard API that video hardware manufacturers could support and developers could design their software against. OpenGL provided access to underlying hardware features, and if a feature did not exist on the hardware, OpenGL provided a software implementation.



The same problem existed in games development. Originally, graphics card manufacturers would work with game studios to create a version (release) of their game that would be accelerated by the specific graphics card. Early 3D games like MechWarrior 2 had different releases for 3dfx Voodoo, S3 Virge, STB Velocity, and so on. It was a bit of a mess. At around the same time, Microsoft created the DirectX library for Windows which was similar to OpenGL. Manufacturers would support OpenGL and/or DirectX, and game studios who were brave enough to abandon DOS for Windows as a platform could then program for one or both libraries instead of creating another release for each individual graphics card they wanted to support. Any relatively minor differences between graphics cards could be handled at runtime in the same release.






    Compatibility with a particular graphics card by third-party vendors was often never a 100% proposition either, particularly if programs decided to go spelunking into a card's I/O registers or use undocumented video modes (see this RC question.) Some of my early games refused to work at all in EGA modes on my blazing-fast 12MHz 286 with an ATI EGA card.

    – ErikF
    Jun 4 at 4:24











    Also, before 3D cards there were so-called "Windows accelerator" cards that made efforts to improve 2D drawing speed, especially when Windows 3.x became mainstream.

    – mnem
    Jun 4 at 6:38












    "SGI created OpenGL to simplify game programming" is somewhat misleading. OpenGL was created as an API for professional 3D graphics, such as CAD. Only later it was adopted for video game development.

    – IMil
    Jun 4 at 14:38











    Also, OpenGL came some years before Direct3D, from an even older predecessor, IRIS GL.

    – BlackJack
    Jun 4 at 15:26











  • @ErikF Similarly, I had an authentic CGA card that supported modes that later EGA and VGA cards with supposed backwards compatibility did not support.

    – snips-n-snails
    Jun 4 at 18:11
































In DOS you had direct access to the hardware, so you grabbed a good source of information about the card you wanted to support and got down to coding your routines.



A book that was often cited as a good source was "Programmer's Guide to the EGA, VGA, and Super VGA Cards" by Richard F. Ferraro; I never had the luck to own or read it, but it was fondly remembered by those who did.



Another invaluable source of information was Ralf Brown's Interrupt List; you can find an HTML conversion of the list here: http://www.delorie.com/djgpp/doc/rbinter/



The original was just made of (long) text files, and, if memory serves me correctly, there were some programs to navigate it more easily, at least in the later versions.



Another nice collection of information was contained in the "PC Game Programmer's Encyclopedia", or PC-GPE; an HTML conversion can be found here: http://qzx.com/pc-gpe/



You had at least three different ways to interact with a given piece of hardware: I/O ports, interrupts, and memory-mapped registers. Graphics cards used all three of them.



The situation with audio cards was very similar.



Another thing to consider is that attached to the video card was an analog CRT monitor. The older/cheaper ones were only able to sync to a given set of vertical and horizontal rates, but the newer/better ones were basically able to sync to any signal in a given range. That meant that with the right parameters written to the video card registers, you could create some custom (or weird) resolutions.



Games aimed for broad compatibility, so they rarely used weird ones, while in the demoscene it was quite common (and custom resolutions were the norm in arcade games too.)



But, for example, Mode X was very popular with games!



It was popularized by Michael Abrash in the pages of Dr. Dobb's Journal;
you got a 320x240 resolution that, viewed on a 4:3 monitor, meant the pixels were square. So, for example, you could naively draw circles and they would look like circles; in 320x200 they were stretched, as the pixel aspect ratio was not 1:1, and you had to calculate and compensate for that while drawing.
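The compensation mentioned above is simple arithmetic. At 320x200 on a 4:3 display, the pixel aspect ratio (width over height) is (4/3) / (320/200) = 5/6, so each pixel is drawn slightly taller than it is wide and a naively drawn circle looks vertically stretched. A minimal sketch of the correction:

```c
/* 320x200 on a 4:3 monitor: pixel aspect ratio (width/height) is
   (4.0/3.0) / (320.0/200.0) = 5/6, i.e. pixels are taller than wide.
   To make a circle look round, shrink its vertical radius in pixels. */
int corrected_y_radius(int rx) {
    return rx * 5 / 6;   /* vertical pixel radius for a round-looking circle */
}
```

At 320x240 the ratio comes out to exactly 1, which is why Mode X circles needed no such correction.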



It was a planar mode, so by setting a register you could decide which planes would receive a write in the memory mapped area. For example, for a quick fill operation you would set all planes, and a single byte write would affect four pixels (one for each plane). That also helped to address all the 256 KB of the VGA memory using only a 64 KB segment.
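The register in question is the VGA Sequencer's Map Mask (index 02h, ports 3C4h/3C5h). Here is a hedged sketch of the Mode X plane and offset arithmetic in portable C; the port and memory writes are shown only as comments, using Borland-style names (outportb, pokeb), which are an assumption about the toolchain:

```c
#include <stdint.h>

/* Mode X (320x240 planar): pixel (x, y) lives in plane (x & 3) at
   byte offset y*80 + x/4 within the A000h segment.  Selecting the
   plane means writing a one-hot mask to Sequencer register 02h. */
typedef struct { uint16_t offset; uint8_t plane_mask; } modex_addr;

modex_addr modex_locate(int x, int y) {
    modex_addr a;
    a.offset     = (uint16_t)(y * 80 + (x >> 2)); /* 80 bytes per row per plane */
    a.plane_mask = (uint8_t)(1 << (x & 3));       /* one bit per plane          */
    return a;
}

/* On real hardware (Borland-style I/O, shown for flavor only):
 *   outportb(0x3C4, 0x02);             // select the Map Mask register
 *   outportb(0x3C5, a.plane_mask);     // enable just that plane
 *   pokeb(0xA000, a.offset, color);    // one byte write = one pixel
 * Setting plane_mask to 0x0F instead makes a single byte write land in
 * all four planes at once -- the quick fill the answer describes.
 */
```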



I am positive there was a little utility that let you explore the VGA registers: you could put in whatever values you fancied and, when you applied your settings, finally see if your monitor supported the resulting output. But my memory is too weak right now to remember the name or the author of this program.



Another common trick was to change part of the color palette during the horizontal retrace; done correctly, you could have more than 256 colours on screen. There was not enough time to change the whole palette on each line, so you had to be creative.



(During vertical retraces, by contrast, there was enough time to change every colour; this was done, for example, for fade-in/fade-out effects.)



(The most popular palette trick was probably changing the background color during tape loading on 8 bit machines (C64 for example).)
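The VGA DAC behind these tricks is programmed through three ports: write the starting color index to 3C8h, then bytes to 3C9h in R, G, B order, six bits per channel. A hedged sketch follows; the 8-bit to 6-bit conversion is real and testable, while the port I/O is left as a comment in Borland-style names (inportb/outportb), which are an assumption:

```c
#include <stdint.h>

/* The VGA DAC takes 6-bit color components (0..63); everyday RGB is 8-bit. */
uint8_t to_dac6(uint8_t component8) {
    return (uint8_t)(component8 >> 2);   /* 0..255 -> 0..63 */
}

/* Programming one palette entry on real hardware would look like:
 *   while ((inportb(0x3DA) & 0x08) == 0) ;  // wait for vertical retrace
 *   outportb(0x3C8, index);                 // start at this color index
 *   outportb(0x3C9, to_dac6(r));            // then R, G, B, 6 bits each
 *   outportb(0x3C9, to_dac6(g));
 *   outportb(0x3C9, to_dac6(b));
 * The index auto-increments after every third data write, so a fade can
 * stream all 256 entries (768 writes) in one loop -- comfortable during
 * vertical retrace, far too many for one horizontal retrace, which is
 * why per-scanline tricks could only touch a few entries per line.
 */
```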



One thing that is often overlooked is that the VGA card was effectively a small three-channel DAC; creative people found ways to use and abuse that as well.



To a similar effect, Tempest for Eliza used the radio waves emitted by the monitor to transmit a radio signal which could be listened to with a common AM radio.



Whoa! This was a nice trip on memory lane! :)






    I think it's the most complete of the answers as of now. Consider expanding it with "scrolling" (I/O, not the slow INT 10h), as I think it's as relevant to OP's question "to tell graphics cards how and what they want to render" as the palettes. And welcome to the site :)

    – kubanczyk
    Jun 5 at 9:33












  • Another well-known document compilation is vgadoc, which can be found by searching for vgadoc4b.

    – ninjalj
    yesterday
































In DOS world in the golden age of VGA (early to mid 90s), by far the easiest and most popular way to do graphics was the famous Mode 13h, a 320x200 pixel linear 256-color paletted video mode. The closest you would get to a standard video API was BIOS interrupt 10h, giving access to a handful of functions including switching the video mode and configuring the palette. A resolution of 320x200 (instead of 320x240 which has a 4:3 aspect ratio) was very convenient because the required video memory (64000 bytes) would fit in a single 64kB memory segment, making addressing individual pixels very straightforward. The drawback was that pixels were slightly rectangular instead of perfectly square.



The video memory was mapped to segment A0000h; in the real-mode environment of DOS you could simply form a pointer to that segment and treat it as a 64000-byte array, each byte corresponding to one pixel. Set a byte to a new value, a pixel changes color on the screen. Beyond that, implementing any higher-level drawing functionality was up to you or a 3rd-party library.
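The byte-per-pixel addressing can be sketched as below; a plain array stands in for the far pointer to segment A0000h, an assumption made purely so the arithmetic can run anywhere:

```c
#include <string.h>

/* Sketch of Mode 13h pixel plotting. In real-mode DOS `screen` would be
   a far pointer to A000:0000; here an ordinary array stands in. */
enum { WIDTH = 320, HEIGHT = 200 };

static unsigned char screen[WIDTH * HEIGHT];  /* 64000 bytes, one per pixel */

static void put_pixel(int x, int y, unsigned char colour) {
    if (x >= 0 && x < WIDTH && y >= 0 && y < HEIGHT)
        screen[y * WIDTH + x] = colour;   /* one byte store = one pixel */
}

static void clear(unsigned char colour) {
    memset(screen, colour, sizeof screen);  /* full-screen fill */
}
```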



For accessing higher-resolution and/or higher color depth modes afforded by SVGA and later video cards, there was (and still is) VBE, or VESA BIOS Extensions, a standard API exposed by the video card's BIOS. But beyond switching the video mode, setting the palette (in paletted modes), and getting a frame buffer into which to plot pixels, you were still pretty much on your own.
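Beyond 64 KB of pixels, those banked SVGA modes required splitting a linear pixel address into a bank number and a window offset. A minimal sketch of that arithmetic, assuming 64 KB window granularity (on real hardware the bank would then be selected through VBE function 4F05h):

```c
/* Split a linear pixel address into a 64 KB bank number and an offset
   inside the A000h window, assuming 64 KB bank granularity. */
static void locate_pixel(unsigned long x, unsigned long y, unsigned long pitch,
                         unsigned *bank, unsigned *offset) {
    unsigned long addr = y * pitch + x;     /* linear address of the pixel */
    *bank   = (unsigned)(addr >> 16);       /* which 64 KB window to map   */
    *offset = (unsigned)(addr & 0xFFFFUL);  /* position inside the window  */
}
```

Crossing a bank boundary mid-scanline is why naive SVGA blitters were slow: every boundary crossing meant another bank-switch call.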






– JohannesD (new contributor)














  • In theory the INT 10H API did include writing pixels. I believe there was even a DOS device driver available which added this API for older graphics cards that didn't support it. It was just about fast enough to animate a few hundred pixels for a graph, but wouldn't have been used by a game.
    – Neil, Jun 5 at 12:56
































Early PC computing was the age of banging hardware directly. Your (typically assembler) code wrote directly to control registers and video memory.



As far as supporting multiple video cards, real easy: we didn't. The supported hardware was directly stated right on the spine of the game box. You needed to have that hardware. This wasn't hard; generally nobody expected games to run on an MDA (Monochrome Display Adapter, no G for graphics since it had none). That left CGA and later, EGA.




It wasn't that we didn't want APIs. I even tried to write APIs. But this was an age when you were counting processor clock cycles. You had to get it done in the cycles available.



  • You couldn't afford the cycles for the subroutine call and return!

  • Let alone all the parameter passing (moving parameters into the standard locations the subroutine wants).

  • Or the de-referencing needed for every iteration of the operation, since you have to use a memory location specified in a call parameter, instead of being able to hard-code it.

And mind you, those limitations applied to my personal API optimized for that game. If a third party entity wrote one all-singing, all-dancing API intended as the one API for all applications, then much more would be abstracted, and the above overhead would be much worse.



That's less of a problem today, because the graphics are so large and complex that API overhead is a smaller fraction. Also, there is CPU to throw at the problem; you can just let the playtesters bump the minimum CPU requirement. We couldn't do that because most of the market was 4.77 MHz 8088s.




















































    Like any hardware, a video card occupies addresses in I/O space and in memory space.
    Cards were physically connected to the bus (the ISA bus, back in the 1980s).
    When the CPU writes to one of the card's memory addresses, the video card responds and accepts the data.
    When the CPU writes to one of its I/O ports, the same thing happens.



    That means software can access the card directly, as long as it knows the card's memory address and I/O address.



    Accessing hardware mapped to a memory address or an I/O port, in assembly:



    MOV [SOME_ADDR], AX ; store the value of AX at memory-mapped address SOME_ADDR
    MOV DX, SOME_PORT   ; the port number goes in DX
    OUT DX, AX          ; write the value of AX to that I/O port


    The same in C (with a DOS compiler's far pointers and outp()):



    volatile unsigned char far *data = (unsigned char far *)SOME_ADDR;
    data[0] = 1;        /* write to the memory-mapped address */
    outp(SOME_PORT, 1); /* write to the I/O port */


    IBM PC compatible computers had several types of cards:



    • MDA (Monochrome Display Adapter)


    • Hercules


    • CGA


    • EGA


    • VGA


    Each card had its own standard, which was documented. Later cards were largely backward compatible (e.g. a VGA can emulate a CGA; Hercules was an exception).
    VGA was the most complicated standard; there were huge books about it!
    The standard specifies which addresses you should use to access the video card and which data you should write to the card to show something on a CRT monitor.



    So, first you need to find out which card you have (you can try to read that information from the memory area filled in by the BIOS, or simply ask the user).
    Then you use the standard to talk to the card.



    VGA, for example, had a lot of internal registers. Developers wrote an index to an I/O port to select a register, then wrote the data for that register.



    Card memory was memory-mapped, so you simply wrote data to some address (in some modes, some cards had several pages of memory that you could switch between).



    But memory was not always flat. There was a character mode, in which each pair of bytes represents a character and its attributes (such as foreground and background colour).
    There was mode 13h, where each byte represents the colour of one pixel.
    And there were modes with several planes, to speed up the card (see
    https://en.wikipedia.org/wiki/Planar_(computer_graphics) )



    Video programming was not easy!
    Some articles to read:



    • https://wiki.osdev.org/VGA_Hardware


    • http://www.brackeen.com/vga/


    There was also a high-level BIOS API, but it was too slow to be used by games.



    You may ask: "But how do I render 3D with all of that?"
    The answer is: you can't; at least, not with any help from the card.



    In the '80s and early '90s you had to render everything on the CPU and then use the video card only to show the finished 2D image.



    I really suggest reading the book about how they did it for Wolfenstein 3D:



    http://fabiensanglard.net/gebbwolf3d/



    One of the first video cards to support a 3D API was the 3dfx Voodoo; its API was called "Glide".






    – Petr (new contributor)


















    • Hercules was a bit special regarding compatibility. It was MDA->CGA->EGA->VGA that was backward compatible. Hercules was directly backwards compatible with MDA and, through emulation, with CGA, but I don't think that any of the later cards (EGA/VGA) was able to emulate Hercules.
      – Artur Biesiadowski, Jun 6 at 14:05












6 Answers
    Did every programmer of every game implemented all possible various API's that old graphic cards supported?




    Yes - but it went even deeper than that. Early graphics cards had virtually no callable code associated with them at all, the concept of "drivers" had not quite become a reality yet. There was the concept of a Video BIOS, which were extensions to the INT 10h BIOS video services, that were effectively limited to initialization and switching video modes.



    Instead, graphics cards, at least in DOS land, all had memory mapped display RAM, and extensive documentation was available about exactly how setting various bits in display RAM would affect the pixels that appeared on the screen. There were no drawing APIs to call, if you wanted something to appear on the screen (whether it be a pixel, a character, a line, a circle, a sprite, etc) you would write the code to move the bytes into the right places in display RAM. Entire books were written about how to write efficient code to draw graphics.
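    As an illustration of how card-specific that byte moving was, here is a sketch of the address arithmetic for one classic case, CGA's 320x200 four-colour mode, where scanlines are interleaved between two 8 KB halves of the B800h segment and each byte packs four 2-bit pixels (a sketch of the documented layout, not code from this answer):

```c
/* CGA 320x200 4-colour mode: even scanlines start at offset 0000h of
   segment B800h, odd scanlines at offset 2000h; each byte holds four
   2-bit pixels, with the leftmost pixel in the most significant bits. */
static unsigned cga_offset(int x, int y) {
    return (unsigned)((y >> 1) * 80 + (y & 1) * 0x2000 + (x >> 2));
}

/* Bit position of pixel x within its byte (shift for a 2-bit value). */
static unsigned char cga_shift(int x) {
    return (unsigned char)(6 - 2 * (x & 3));
}
```

    Plotting one pixel therefore meant a read-modify-write of the containing byte; EGA and VGA planar modes needed entirely different arithmetic, which is why each card got its own code path.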



    There were some systems like the Borland Graphics Interface that abstracted graphics drawing primitives into an API with different drivers that one could call to draw things on different graphics cards. However, these were typically slower than what would be required for building action type games.



    An action game would typically be optimized for a particular graphics display mode on a particular card. For example, a popular display mode was VGA 640x480 with 16 colors. This would be listed in the software requirements, and you needed to have the right hardware to support the game. If you bought a VGA game but you only had an EGA card, then the game would not work at all. As you said, some games would ask what you had and you had to know what the right answer was, otherwise the game would not work.


























    • I wrote a few simple action games with BGI in the early 90s. It was OK for single-screen affairs with a few sprites moving around - like Space Invaders or Pac Man - but was well short of what you'd need for full screen scrolling or 3D ray-casting games.
      – Matthew Barber, Jun 3 at 23:37






    • Strictly speaking there was an API, BIOS service 10h, but you wouldn’t use it for (most) games! But there certainly was callable code associated with graphics cards (or in the case of MDA and CGA, provided by the system BIOS).
      – Stephen Kitt, Jun 4 at 6:10







    • Worth noting that sound cards were the same situation - games had to directly support AdLib/OPL2 cards, Creative C/MS cards, SoundBlaster, MT-32, etc. Each added card needed custom code written for the specific hardware in question. Printers were like this for a while also - word processors needed to write hardware code to support specific printers directly, there were no drivers or APIs, no abstraction, just bare metal.
      – J..., Jun 4 at 15:24






    • @J... I was under the impression that you could just dump bare characters into the parallel port and the printer would then dump them onto paper.
      – John Dvorak, Jun 4 at 20:12






    • @JohnDvorak As long as you were satisfied with only printing text, and only in the printer's built-in font...
      – Wumpus Q. Wumbley, Jun 4 at 20:33















    – Greg Hewgill (answered Jun 3 at 23:16, edited Jun 6 at 21:39)







    Early on, you had to explicitly code your game for each graphics card you wanted to support: Hercules, CGA, Tandy, EGA, VGA. You had to know how to put the card into graphics mode and you had to know the memory layout, palette, and so on. You had to figure out how to avoid flicker and how to prevent tearing. You had to write your own line drawing and fill routines, and if you wanted 3-D, you had to know how to project it onto a 2-D screen, how to remove hidden lines and so on.



    Later when graphics cards started gaining accelerated functionality, SGI created the IrisGL API, which later became OpenGL, in order to simplify CAD (computer aided drafting/design) software development for those graphics cards by providing a standard API that video hardware manufacturers could support and developers could design their software against. OpenGL provided access to underlying hardware features, and if a feature did not exist on the hardware, OpenGL provided a software implementation.



    The same problem existed in games development. Originally, graphics card manufacturers would work with game studios to create a version (release) of their game that would be accelerated by the specific graphics card. Early 3D games like MechWarrior 2 had different releases for 3dfx Voodoo, S3 Virge, STB Velocity, and so on. It was a bit of a mess. At around the same time, Microsoft created the DirectX library for Windows which was similar to OpenGL. Manufacturers would support OpenGL and/or DirectX, and game studios who were brave enough to abandon DOS for Windows as a platform could then program for one or both libraries instead of creating another release for each individual graphics card they wanted to support. Any relatively minor differences between graphics cards could be handled at runtime in the same release.


























    • Compatibility with a particular graphics card by third-party vendors was often never a 100% proposition either, particularly if programs decided to go spelunking into a card's I/O registers or use undocumented video modes (see this RC question.) Some of my early games refused to work at all in EGA modes on my blazing-fast 12MHz 286 with an ATI EGA card.
      – ErikF, Jun 4 at 4:24






    • Also before 3D cards there were so-called "Windows Accelerator" cards that made efforts to improve 2D drawing speed, especially when Windows 3.x became mainstream.
      – mnem, Jun 4 at 6:38







    • "SGI created OpenGL to simplify game programming" is somewhat misleading. OpenGL was created as an API for professional 3D graphics, such as CAD. Only later it was adopted for video game development.
      – IMil, Jun 4 at 14:38






    • Also OpenGL came some years before Direct3D, from an even older predecessor, Iris GL.
      – BlackJack, Jun 4 at 15:26











    • @ErikF Similarly, I had an authentic CGA card that supported modes that later EGA and VGA cards with supposed backwards compatibility did not support.

      – snips-n-snails
      Jun 4 at 18:11















    25














    Early on, you had to explicitly code your game for each graphics card you wanted to support: Hercules, CGA, Tandy, EGA, VGA. You had to know how to put the card into graphics mode and you had to know the memory layout, palette, and so on. You had to figure out how to avoid flicker and how to prevent tearing. You had to write your own line drawing and fill routines, and if you wanted 3-D, you had to know how to project it onto a 2-D screen, how to remove hidden lines and so on.



    Later when graphics cards started gaining accelerated functionality, SGI created the IrisGL API, which later became OpenGL, in order to simplify CAD (computer aided drafting/design) software development for those graphics cards by providing a standard API that video hardware manufacturers could support and developers could design their software against. OpenGL provided access to underlying hardware features, and if a feature did not exist on the hardware, OpenGL provided a software implementation.



    The same problem existed in games development. Originally, graphics card manufacturers would work with game studios to create a version (release) of their game that would be accelerated by the specific graphics card. Early 3D games like MechWarrior 2 had different releases for 3dfx Voodoo, S3 Virge, STB Velocity, and so on. It was a bit of a mess. At around the same time, Microsoft created the DirectX library for Windows which was similar to OpenGL. Manufacturers would support OpenGL and/or DirectX, and game studios who were brave enough to abandon DOS for Windows as a platform could then program for one or both libraries instead of creating another release for each individual graphics card they wanted to support. Any relatively minor differences between graphics cards could be handled at runtime in the same release.






    share|improve this answer




















    • 3





      Compatibility with a particular graphics card by third-party vendors was often never a 100% proposition either, particularly if programs decided to go spelunking into a card's I/O registers or use undocumented video modes (see this RC question.) Some of my early games refused to work at all in EGA modes on my blazing-fast 12MHz 286 with an ATI EGA card.

      – ErikF
      Jun 4 at 4:24






    • 4





      Also before 3D cards there were so-called "Windows Accelerator" cards that made efforts made to improve 2D drawing speed, especially when Windows 3.x become mainstream.

      – mnem
      Jun 4 at 6:38







    • 6





      "SGI created OpenGL to simplify game programming" is somewhat misleading. OpenGL was created as an API for professional 3D graphics, such as CAD. Only later it was adopted for video game development.

      – IMil
      Jun 4 at 14:38






    • 5





      Also OpenGL came some years before before Direct3D, from an even older predecessor Iris GL.

      – BlackJack
      Jun 4 at 15:26











    • @ErikF Similarly, I had an authentic CGA card that supported modes that later EGA and VGA cards with supposed backwards compatibility did not support.

      – snips-n-snails
      Jun 4 at 18:11













    edited Jun 4 at 20:20
    answered Jun 3 at 23:46
    snips-n-snails


















    15














    In DOS you had direct access to the hardware, so you grabbed a good source of information about the card you wanted to support and got down to coding your routines.



    A book often cited as a good source was "Programmer's Guide to the EGA, VGA, and Super VGA Cards" by Richard F. Ferraro; I never had the luck to own or read it, but it was fondly remembered by those who did.



    Another invaluable source of information was Ralf Brown's Interrupt List; you can find an HTML conversion of the list here: http://www.delorie.com/djgpp/doc/rbinter/



    The original was just a set of (long) text files; if memory serves correctly, later versions also came with programs to navigate it more easily.



    Another nice collection of information was contained in the "PC Game Programmer's Encyclopedia", or PC-GPE; an HTML conversion can be found here: http://qzx.com/pc-gpe/



    You had at least three different ways to interact with a given piece of hardware: I/O ports, interrupts, and memory-mapped registers. Graphics cards used all three of them.



    The situation with audio cards was very similar.



    Another thing to consider is that attached to the video card was an analog CRT monitor. The older/cheaper ones were only able to sync to a fixed set of vertical and horizontal rates, but the newer/better ones could sync to basically any signal in a given range. That meant that, with the right parameters written to the video card registers, you could create some custom (or weird) resolutions.



    Games aimed for broad compatibility, so they rarely used weird ones, while in the demoscene it was quite common (and custom resolutions were the norm in arcade games too).



    But, for example, Mode X was very popular with games!



    It was popularized by Michael Abrash in the pages of Dr. Dobb's Journal;
    you got a 320x240 resolution that, viewed on a 4:3 monitor, meant the pixels were square. So, for example, you could naively draw circles and they would look like circles; in 320x200 they were stretched, because the pixel aspect ratio was not 1:1, and you had to calculate and compensate for that while drawing.
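    The aspect-ratio arithmetic can be sketched like this (Python used purely to illustrate the numbers; the function names are made up for the example, and nothing here is actual VGA code):

    ```python
    from fractions import Fraction

    def pixel_aspect(columns, rows, display=Fraction(4, 3)):
        # Width/height of a single pixel on a display with the given
        # (physical) aspect ratio.
        return display * Fraction(rows, columns)

    assert pixel_aspect(320, 240) == 1               # Mode X: square pixels
    assert pixel_aspect(320, 200) == Fraction(5, 6)  # mode 13h: narrow pixels

    def round_circle_height(rx, columns, rows):
        # Vertical pixel radius needed so a circle with horizontal pixel
        # radius rx actually looks round on screen.
        return rx * pixel_aspect(columns, rows)

    assert round_circle_height(60, 320, 240) == 60  # no correction needed
    assert round_circle_height(60, 320, 200) == 50  # squash vertically
    ```

    In other words, in 320x200 a round-looking circle 120 pixels wide had to be only 100 pixels tall; in 320x240 no such correction was needed.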



    It was a planar mode, so by setting a register you could decide which planes would receive a write in the memory-mapped area. For example, for a quick fill operation you would enable all planes, and a single byte write would affect four pixels (one in each plane). That also helped address all 256 KB of VGA memory using only a 64 KB segment.
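    A sketch of the plane arithmetic (illustrative Python; the Map Mask register is the standard VGA one, but the function names here are invented for the example):

    ```python
    def modex_plane_and_offset(x, y, width=320):
        # In planar Mode X, four horizontally consecutive pixels sit at
        # the same byte offset, one in each of the four planes.
        linear = y * width + x
        return linear % 4, linear // 4

    def map_mask(*planes):
        # Value for the VGA Sequencer's Map Mask register (index 2 via
        # ports 3C4h/3C5h on real hardware): one bit per plane that
        # should receive CPU writes.
        mask = 0
        for p in planes:
            mask |= 1 << p
        return mask

    # Pixels (0..3, 0) share offset 0 across planes 0..3, so a single
    # byte write with all planes enabled fills four pixels at once:
    assert [modex_plane_and_offset(x, 0) for x in range(4)] == \
        [(0, 0), (1, 0), (2, 0), (3, 0)]
    assert map_mask(0, 1, 2, 3) == 0b1111

    # 320x240 at one byte per pixel is 76800 bytes, but per plane it is
    # only 19200 bytes -- easily reachable through a 64 KB segment:
    assert 320 * 240 == 76800 and 76800 // 4 == 19200
    ```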



    I am positive there was a little utility that let you explore the VGA registers: you could put in whatever values you fancied and, when you applied your settings, see whether your monitor supported the resulting output. But my memory is too weak right now to recall the name or the author of that program.



    Another common trick was to change a part of the color palette during the horizontal retrace; done correctly, you could have more than 256 colours on screen. There was not enough time to change the whole palette on each line, so you had to be creative.



    (During vertical retraces, by contrast, there was enough time to change every colour; that was used, for example, for fade-in/fade-out effects.)
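    A fade works by rescaling the whole palette a little on each frame. A minimal sketch of that arithmetic (illustrative Python; `fade_step` is a made-up name, and the DAC ports are only mentioned in the comment):

    ```python
    def fade_step(palette, level, steps=64):
        # Scale every 6-bit VGA DAC entry (components 0..63) toward
        # black: level == steps leaves the palette untouched, level == 0
        # is all black. On real hardware each step's result would be
        # written to the DAC (ports 3C8h/3C9h) during a vertical retrace.
        return [tuple(c * level // steps for c in rgb) for rgb in palette]

    palette = [(63, 63, 63), (63, 0, 0), (10, 20, 30)]
    assert fade_step(palette, 64) == palette                          # fully visible
    assert fade_step(palette, 32) == [(31, 31, 31), (31, 0, 0), (5, 10, 15)]
    assert fade_step(palette, 0) == [(0, 0, 0)] * 3                   # black
    ```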



    (The most popular palette trick was probably changing the background color during tape loading on 8-bit machines, the C64 for example.)



    One thing that is often overlooked is that the VGA card was effectively a small three-channel DAC; creative people found ways to use and abuse that as well.



    In a similar vein, Tempest for Eliza used the radio waves emitted by the monitor to transmit a signal that could be picked up with a common AM radio.



    Whoa! This was a nice trip down memory lane! :)
















    New contributor



    RenPic is a new contributor to this site. Take care in asking for clarification, commenting, and answering.
    Check out our Code of Conduct.














    • 1





      I think it's the most complete of the answers as of now. Consider expanding it with "scrolling" (I/O, not the slow INT 10h), as I think it's as relevant to OP's question "to tell graphics cards how and what they want to render" as the palettes. And welcome to the site :)

      – kubanczyk
      Jun 5 at 9:33












    • Another well-known document compilation is vgadoc, which can be found by searching for vgadoc4b.

      – ninjalj
      yesterday























    edited Jun 6 at 9:05
    Stephen Kitt
    answered Jun 4 at 22:42
    RenPic




















    11














    In the DOS world, in the golden age of VGA (early to mid '90s), by far the easiest and most popular way to do graphics was the famous mode 13h, a 320x200-pixel linear 256-color paletted video mode. The closest you would get to a standard video API was BIOS interrupt 10h, giving access to a handful of functions including switching the video mode and configuring the palette. A resolution of 320x200 (instead of 320x240, which has a 4:3 aspect ratio) was very convenient because the required video memory (64000 bytes) fit in a single 64 KB memory segment, making addressing individual pixels very straightforward. The drawback was that pixels were slightly rectangular instead of perfectly square.



    The video memory was mapped at segment A000h (physical address A0000h); in the real-mode environment of DOS you could simply form a pointer to that segment and treat it as a 64000-byte array, each byte corresponding to one pixel. Set a byte to a new value, and a pixel changes color on the screen. Beyond that, implementing any higher-level drawing functionality was up to you or a third-party library.
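    The addressing really is that simple. Here is a sketch of it in Python, with a `bytearray` standing in for the memory at segment A000h (illustrative only; real code would write through a far pointer in C or assembly):

    ```python
    WIDTH, HEIGHT = 320, 200
    framebuffer = bytearray(WIDTH * HEIGHT)  # stands in for segment A000h

    def putpixel(x, y, color):
        # In mode 13h one pixel is one byte at offset y*320 + x;
        # color is an index (0..255) into the current palette.
        framebuffer[y * WIDTH + x] = color

    putpixel(10, 5, 42)
    assert framebuffer[5 * WIDTH + 10] == 42

    # The whole screen is 64000 bytes, so it fits in one 64 KB
    # (65536-byte) real-mode segment:
    assert WIDTH * HEIGHT == 64000
    assert WIDTH * HEIGHT <= 0x10000
    ```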



    For accessing higher-resolution and/or higher color depth modes afforded by SVGA and later video cards, there was (and still is) VBE, or VESA BIOS Extensions, a standard API exposed by the video card's BIOS. But beyond switching the video mode, setting the palette (in paletted modes), and getting a frame buffer into which to plot pixels, you were still pretty much on your own.
















    New contributor



    JohannesD is a new contributor to this site. Take care in asking for clarification, commenting, and answering.
    Check out our Code of Conduct.














    • 4





      In theory the INT 10H API did include writing pixels. I believe there was even a DOS device driver available which added this API for older graphics cards that didn't support it. It was just about fast enough to animate a few hundred pixels for a graph, but wouldn't have been used by a game.

      – Neil
      Jun 5 at 12:56















    11














    In DOS world in the golden age of VGA (early to mid 90s), by far the easiest and most popular way to do graphics was the famous Mode 13h, a 320x200 pixel linear 256-color paletted video mode. The closest you would get to a standard video API was BIOS interrupt 10h, giving access to a handful of functions including switching the video mode and configuring the palette. A resolution of 320x200 (instead of 320x240 which has a 4:3 aspect ratio) was very convenient because the required video memory (64000 bytes) would fit in a single 64kB memory segment, making addressing individual pixels very straightforward. The drawback was that pixels were slightly rectangular instead of perfectly square.



    The video memory was mapped to segment A0000h; in the real-mode environment of DOS you could simply form a pointer to that segment and treat it as a 64000-byte array, each byte corresponding to one pixel. Set a byte to a new value, a pixel changes color on the screen. Beyond that, implementing any higher-level drawing functionality was up to you or a 3rd-party library.



    For accessing higher-resolution and/or higher color depth modes afforded by SVGA and later video cards, there was (and still is) VBE, or VESA BIOS Extensions, a standard API exposed by the video card's BIOS. But beyond switching the video mode, setting the palette (in paletted modes), and getting a frame buffer into which to plot pixels, you were still pretty much on your own.
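In those higher SVGA modes the frame buffer is usually larger than the 64 kB window at A000h, so before linear-frame-buffer support you addressed pixels through bank switching. A minimal sketch of the address arithmetic, assuming a 64 kB bank window and one byte per pixel (e.g. 640x480x256); `svga_addr` and `BankedAddr` are illustrative names, and the actual bank switch would be a VBE function call:

```c
#define BANK_SIZE 65536UL  /* size of the A000h window in bytes */

typedef struct {
    unsigned bank;    /* which bank to select via the VBE bank-switch call */
    unsigned offset;  /* byte offset inside the A000h window */
} BankedAddr;

/* Split a pixel's linear offset into (bank, offset-in-window). */
BankedAddr svga_addr(unsigned x, unsigned y, unsigned pitch_bytes)
{
    unsigned long linear = (unsigned long)y * pitch_bytes + x;
    BankedAddr a;
    a.bank   = (unsigned)(linear / BANK_SIZE);
    a.offset = (unsigned)(linear % BANK_SIZE);
    return a;
}
```

Crossing a bank boundary in the middle of a scanline was one of the things that made fast SVGA blitters fiddly to write.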






    edited Jun 4 at 21:31





















    answered Jun 4 at 20:44









    JohannesD

    2114





    • 4





      In theory the INT 10H API did include writing pixels. I believe there was even a DOS device driver available which added this API for older graphics cards that didn't support it. It was just about fast enough to animate a few hundred pixels for a graph, but wouldn't have been used by a game.

      – Neil
      Jun 5 at 12:56












    2














    Early PC computing was the age of banging hardware directly. Your (typically assembler) code wrote directly to control registers and video memory.



    As far as supporting multiple video cards, real easy: we didn't. The supported hardware was stated right on the spine of the game box. You needed to have that hardware. This wasn't hard; generally nobody expected games to run on an MDA (Monochrome Display Adapter; no G for graphics, since it had none). That left CGA and, later, EGA.




    It wasn't that we didn't want APIs. I even tried to write APIs. But this was an age when you were counting processor clock cycles. You had to get it done in the cycles available.



    • You couldn't afford the cycles for the subroutine call and return!

    • Let alone all the parameter passing (moving parameters into the standard locations the subroutine wants).

    • Or the de-referencing needed for every iteration of the operation, since you have to use a memory location specified in a call parameter, instead of being able to hard-code it.

    And mind you, those limitations applied to my personal API optimized for that game. If a third party entity wrote one all-singing, all-dancing API intended as the one API for all applications, then much more would be abstracted, and the above overhead would be much worse.



    That's less of a problem today, because graphics are so large and complex that API overhead is a smaller fraction of the work. Also, there is CPU to throw at the problem; you can just let the playtesters bump the minimum CPU requirement. We couldn't do that, because most of the market was 4.77 MHz 8088s.






        edited Jun 5 at 15:37

























        answered Jun 5 at 15:10









        Harper

        1,18859




            1














            Like any hardware, a video card occupies addresses in I/O space and in memory space.
            The cards were physically connected to the bus (the ISA bus, back in the 1980s).
            When the CPU writes to one of those memory addresses, the video card responds and accepts the data;
            the same happens when the CPU writes to one of its I/O ports.



            That means software can access the card, as long as it knows the card's memory address and I/O address.



            Accessing hardware mapped to a memory address (or an I/O port) looks like this in assembly:



            MOV [SOME_ADDR], AX  ; store the value of AX at address SOME_ADDR
            MOV DX, SOME_PORT    ; OUT takes a 16-bit port number in DX
            OUT DX, AX           ; write the value of AX to that port


            The same in C (with a DOS-era compiler providing outp):



            unsigned char far *data = (unsigned char far *)SOME_ADDR;
            data[0] = 1;        /* write a byte to SOME_ADDR */
            outp(SOME_PORT, 1); /* write to the I/O port */


            IBM PC compatible computers had several types of cards:



            • MDA (Monochrome Display Adapter)


            • Hercules


            • CGA


            • EGA


            • VGA


            Each card had its own documented standard. Most of them were backward compatible (e.g. VGA can emulate CGA).
            VGA was the most complicated standard; there were huge books about it!
            The standard specifies which addresses you use to access the video card, and which data you write to it to show something on the CRT monitor.



            So, first you need to find out which card you have (you can try to read this data from the memory area filled in by the BIOS, or ask the user).
            Then you use the standard to talk to the card.



            VGA, for example, had a lot of internal registers. To access one, developers wrote an index to an I/O port to select the register, then wrote the data for that register to a second port.
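That index/data protocol can be modeled without touching hardware. A minimal sketch: on a real card the two writes would be `outp()` calls to an address port and a data port (for instance 3C4h/3C5h for the sequencer); here a small struct stands in for the card, and the names `IndexedRegs`, `write_address_port`, and `write_data_port` are illustrative.

```c
typedef struct {
    unsigned char index;      /* last index written to the address port */
    unsigned char regs[256];  /* the card's internal registers */
} IndexedRegs;

/* First write: select which internal register you want. */
void write_address_port(IndexedRegs *card, unsigned char idx)
{
    card->index = idx;
}

/* Second write: the value lands in the selected register. */
void write_data_port(IndexedRegs *card, unsigned char val)
{
    card->regs[card->index] = val;
}
```

The scheme exists because the VGA has far more registers than it has I/O ports, so a handful of port pairs multiplex access to all of them.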



            Card memory was mapped, so you simply wrote data to some address (in some modes, some cards had several pages of memory that you could switch between).



            But the memory layout was not always linear. There was a character mode (in which each pair of bytes represents a letter and its attributes, such as foreground and background color).
            There was mode 13h, where each byte represented the color of one pixel.
            And there were modes with several planes, to speed up the card (see
            https://en.wikipedia.org/wiki/Planar_(computer_graphics) )
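The character-mode layout mentioned above is easy to show concretely. In 80x25 color text mode, the buffer (at segment B800h) holds two bytes per cell: the character code, then an attribute byte packing the background color in the high nibble and the foreground color in the low nibble. A sketch of that packing, with `make_attr` and `cell_offset` as illustrative helper names:

```c
/* Build a text-mode attribute byte: background in bits 7..4,
 * foreground in bits 3..0 (ignoring blink for simplicity). */
unsigned char make_attr(unsigned char fg, unsigned char bg)
{
    return (unsigned char)((bg << 4) | (fg & 0x0F));
}

/* Byte offset of a cell in the 80-column text buffer:
 * 2 bytes per cell, 80 cells per row. */
unsigned cell_offset(unsigned row, unsigned col)
{
    return (row * 80 + col) * 2;
}
```

Writing the character byte at `cell_offset(row, col)` and the attribute byte right after it is all it took to put colored text on screen.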



            Video programming was not easy!
            Some articles to read:



            • https://wiki.osdev.org/VGA_Hardware


            • http://www.brackeen.com/vga/


            There was also a high-level BIOS API, but it was too slow to be used by games.



            You may ask: "But how do I render 3D with all of that?"
            The answer is: you can't.



            In the '80s and early '90s you had to render everything on the CPU and then use the video card API to show the 2D image.



            I really suggest you read the book about how they did it for Wolfenstein 3D:



            http://fabiensanglard.net/gebbwolf3d/



            The first consumer video card that supported SOME 3D APIs was the 3dfx Voodoo. It had an API called "Glide".






            answered Jun 5 at 22:27









            user996142

            1311




            • 1





              Hercules was a bit special regarding compatibility. The backward-compatible chain was MDA -> CGA -> EGA -> VGA. Hercules was directly backwards compatible with MDA, and through emulation with CGA, but I don't think that any of the later cards (EGA/VGA) was able to emulate Hercules.

              – Artur Biesiadowski
              Jun 6 at 14:05











