When and what was the first 3D acceleration device ever released?
Plain and simple: we know that consumer-level 3D acceleration devices appeared roughly in the mid-1990s, from brands like Matrox, 3DFX, Nvidia and so on.
However, such technology must surely have been available much earlier to professionals, governments, or even the military.
When and what was the first 3D acceleration device ever released for a computer?
To clarify, after Stephen Kitt's comment:
By 'device' I more or less mean any approach or solution that significantly improved the speed of a 3D simulation compared to, say, a software rasterizer on a desktop computer of the same era.
For example: playing a software-rasterized 3D game at a low frame rate, then installing a 3DFX card and instantly feeling another level of performance and rendering quality.
graphics
What qualifies as a 3D acceleration device? For example, I once saw a VR setup in the early 90s with two supercomputers, one of which handled the 3D rendering; I wouldn’t have thought of that supercomputer as a 3D acceleration device, but some might...
– Stephen Kitt, May 25 at 7:55
Right, I would say that this approach is valid, as it appears to be quite significant. I have edited my question; I hope it's clearer now.
– Aybe, May 25 at 8:31
asked May 25 at 7:39 by Aybe, edited May 25 at 8:43
3 Answers
The first appears to be Evans & Sutherland's LDS-1, introduced in 1969; the first unit was delivered to Bolt, Beranek and Newman Inc. in August 1969. It was a vector display system with depth cueing, and could draw and manipulate complex wireframe models in real time.
The first 3D engine produced as a VLSI microchip was the Geometry Engine, developed at Stanford University and commercialised by SGI from the early 1980s.
SGI's failure to address the Windows/Intel market led various SGI staff to leave and found Nvidia, or to join ATI, in the 1990s, laying the foundations for today's market, where integrated geometry engines are a standard part of display systems.
– John Dallman, answered May 25 at 8:45
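Two of the operations named above, perspective projection and depth cueing (fading a line's brightness with distance), are simple to illustrate in software. A minimal, hypothetical Python sketch of the ideas, not a description of the actual E&S hardware:

```python
def project(point, d=2.0):
    """Perspective-project a 3D point onto the z=0 plane, eye at distance d."""
    x, y, z = point
    s = d / (d + z)              # points farther away shrink toward the centre
    return (x * s, y * s)

def depth_cue(z, z_near=0.0, z_far=5.0):
    """Depth cueing: line intensity is 1.0 at z_near, fading to 0.0 at z_far."""
    t = (z - z_near) / (z_far - z_near)
    return max(0.0, min(1.0, 1.0 - t))

# one edge of a wireframe cube, front vertex to back vertex
front, back = (1.0, 1.0, 0.0), (1.0, 1.0, 2.0)
print(project(front), depth_cue(front[2]))   # full size, full brightness
print(project(back), depth_cue(back[2]))     # shrunk and dimmed
```

Drawing every edge of a model this way, with intensity taken from `depth_cue`, gives the depth-cued wireframe effect the LDS-1 produced in real time.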
Was the LDS actually for sale at that point? The PDS-1 and 3DR were from about the same time period.
– Maury Markowitz, May 27 at 17:30
@MauryMarkowitz: The first one was delivered in August 1969.
– John Dallman, May 27 at 18:43
ATI was founded in 1985 and started out as a vendor of Super EGA and Super VGA boards for PCs. (They competed with the likes of Tseng Labs, Paradise, Video Seven, and a multitude of others.) Their 3D hardware came later, after several generations of 2D accelerators.
– mschaef, 2 days ago
@mschaef: Thanks, corrected.
– John Dallman, 2 days ago
You can't really answer this question without acknowledging the parallel development paths for high-quality 3D graphics, which include not only the dominant polygon/texture-mapping approach but also ray tracing. This parallel development continues today, with recent announcements like Nvidia's next-generation RTX platform, which features real-time ray tracing.
While the LDS-1 is a very early example of hardware purpose-built for 3D graphics, it was incapable of producing anything photorealistic or "natural". At roughly the same time, ray tracing was being used for high-quality static images that, while taking many hours to render, produced results whose quality would still impress 3D artists today.
In 1982, Osaka University created a purpose-built supercomputer designed for 3D rendering via ray tracing. Known as the LINKS-1 Computer Graphics System, it was an early contender for fastest computer in the world. Wikipedia briefly describes it as:
a massively parallel processing computer system with 514 microprocessors (257 Zilog Z8001's and 257 iAPX 86's), used for rendering realistic 3D computer graphics with high-speed ray tracing.
The Japanese supercomputer predates SGI's commercialization of the Geometry Engine by five years, and could produce far better 3D graphics.
Of course, both supercomputers and SGI systems were far out of reach for desktop computer users of the 1980s. However, ray tracing enjoyed a brief popularity at the time. Most 3D artists relied on affordable Amigas with accelerator boards that included FPUs to render 3D graphics and animations using ray tracing. The Macintosh II, with its Motorola 68020/68881 CPU/FPU, was also popular for this purpose. A tremendous amount of early desktop 3D content created on such equipment survives, using both ray tracing and polygon rendering techniques. In a sense, any 1980s machine with an FPU was an "accelerated 3D graphics workstation", if its 3D rendering software made use of it. My own Amiga 1000 had a 68881 FPU attached via the side expansion bus for just this purpose in 1988, at the same time SGI was selling the IRIS 3000 series.
– Brian H, answered May 25 at 16:05, edited May 25 at 20:30
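The primitive a machine like LINKS-1 evaluated in parallel across its hundreds of processors is tracing rays against scene geometry. A hypothetical Python sketch of the classic ray-sphere intersection test, the simplest case of that primitive (function name and scene values are illustrative, not from any LINKS-1 documentation):

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Distance along a unit-length ray to its first sphere hit, or None.

    Solves |o + t*d - c|^2 = r^2, a quadratic in t whose leading
    coefficient is 1 because the direction is normalised."""
    fx = origin[0] - center[0]
    fy = origin[1] - center[1]
    fz = origin[2] - center[2]
    b = 2.0 * (fx * direction[0] + fy * direction[1] + fz * direction[2])
    c = fx * fx + fy * fy + fz * fz - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None                      # the ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0     # nearer of the two roots
    return t if t >= 0.0 else None

# unit sphere 5 units down the +z axis, ray fired straight at it
print(ray_sphere((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), (0.0, 0.0, 5.0), 1.0))  # 4.0
```

A renderer repeats this test for one ray per pixel (plus reflection and shadow rays) against every object, which is why the workload parallelises so naturally across many processors.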
While I was looking for the earliest device ever made, your answer is also very informative, thank you!
– Aybe, May 26 at 20:08
In the mid-1980s I worked for a company that used satellite imagery as one of the tools geologists relied on when doing studies for firms searching for oil. The computer-and-software system we used was called Earthviews, based on an LSI-11/73 CPU with the RSX-11 operating system. (The software we actually used most was a FORTH program called AIMS ("Advanced Image Management System"), which ran separately from RSX-11; in effect, it was its own operating system.) Physically the system was quite large, consisting of two tall 19"-wide racks and a tape-drive system just as large. One of the optional hardware features, housed in one of the racks and about 18" tall, was called a "vector processor". I believe the system was developed in the early '80s, maybe even the late '70s. If I remember correctly, it calculated a stream of a large number of multiply-plus-add operations in parallel.
While this hardware peripheral was not actually dedicated to being a "3D accelerator", it could operate as a major component of such a subsystem. I'm not sure, but I believe Earthviews used it to process what was essentially sonar data from ground scans. I don't remember this very well, because our company didn't use sonar data, but I think that if you had the data, Earthviews would build a 3D rendition of the ground the sonar had passed through.
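A stream of multiply-plus-add operations is exactly what 3D coordinate transforms decompose into, which is why a unit like that could serve as a building block of a 3D accelerator. A hypothetical Python sketch of the idea (the link to the actual Earthviews hardware is my assumption, not documented):

```python
def mat_vec(m, v):
    """3x3 matrix times 3-vector: nine multiplies, each folded into a
    running sum, i.e. a short stream of multiply-plus-add operations."""
    return tuple(sum(m[r][c] * v[c] for c in range(3)) for r in range(3))

def transform_all(m, points):
    # Hardware like the "vector processor" described above could execute a
    # loop like this as one long parallel stream of multiply-adds.
    return [mat_vec(m, p) for p in points]

scale2 = ((2, 0, 0), (0, 2, 0), (0, 0, 2))   # uniform scale by 2
print(transform_all(scale2, [(1, 0, 0), (1, 2, 3)]))
# [(2, 0, 0), (2, 4, 6)]
```

Rotating, scaling, or projecting every vertex of a model is just this loop with different matrices, so any hardware that streams multiply-adds accelerates the geometry stage of 3D rendering.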
1
While I was looking for the earliest device ever been made, your answer is also very informative, thank you!
– Aybe
May 26 at 20:08
add a comment |
1
While I was looking for the earliest device ever been made, your answer is also very informative, thank you!
– Aybe
May 26 at 20:08
1
1
While I was looking for the earliest device ever been made, your answer is also very informative, thank you!
– Aybe
May 26 at 20:08
While I was looking for the earliest device ever been made, your answer is also very informative, thank you!
– Aybe
May 26 at 20:08
add a comment |
In the mid-1980s I worked for a company that was using satellite imagery as one of the tools geologists used to do studies for firms searching for oil. The computer+software system we used was called Earthviews, based on an LSI-11/73 CPU with an RSX-11 operating system. (The actual software we used most was a FORTH program called AIMS ("Advanced Image Management System"), which ran separate from RSX-11 -- i.e. it was its own operating system.) Physically the system was quite large, consisting of two tall 19"-wide racks and a tape-drive system just as large. One of the optional hardware features was in one of the racks, about 18" tall, called a "vector processor". I believe the system was developed in the early '80s, maybe even late '70s. If I remember correctly, what it did was calculate a stream of a large number of multiply-plus-add operations in parallel.
While this hardware peripheral was not actually dedicated to be a "3D accelerator", it could operate as a major component of such a subsystem. I'm not sure, but I believe what Earthviews used it for was to process what was essentially sonar data from appropriate ground scans. I don't remember this very well, because our company didn't use sonar data. I think if you had the data, Earthviews would build a 3D rendition of the ground the sonar passed through.
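For readers unfamiliar with the term: the "stream of multiply-plus-add operations" described above is essentially a vectorized multiply-accumulate, the core primitive of vector processors of that era. A minimal sketch in Python (purely illustrative -- the function name and data are made up, and this is not the Earthviews hardware's actual instruction set):

```python
def vector_mac(a, b, acc):
    # One "vector" step: an elementwise multiply-plus-add applied across
    # the whole operand stream, which the hardware would have performed
    # on many elements in parallel rather than one scalar at a time.
    return [x * y + z for x, y, z in zip(a, b, acc)]

a   = [1.0, 2.0, 3.0, 4.0]     # hypothetical input stream
b   = [10.0, 20.0, 30.0, 40.0] # coefficients
acc = [0.0, 0.0, 0.0, 0.0]     # running accumulator
print(vector_mac(a, b, acc))   # [10.0, 40.0, 90.0, 160.0]
```

Operations like this are the building block of matrix transforms and signal-processing filters, which is why such a board could serve as a component of a 3D subsystem even though it was not designed as one.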
edited May 26 at 5:36
answered May 26 at 5:29
RichF
What qualifies as a 3D acceleration device? For example, I once saw a VR setup in the early 90s with two supercomputers, one of which handled the 3D rendering; I wouldn’t have thought of that supercomputer as a 3D acceleration device, but some might...
– Stephen Kitt
May 25 at 7:55
Right, I would say that approach is valid, as it appears to be quite significant. I have edited my question; hope it's clearer now.
– Aybe
May 25 at 8:31