This HTML5 document contains 6 embedded RDF statements represented using HTML+Microdata notation.

The embedded RDF content will be recognized by any processor of HTML5 Microdata.

Prefix   Namespace IRI
n7       http://dbkwik.webdatacommons.org/ontology/
dcterms  http://purl.org/dc/terms/
n5       http://dbkwik.webdatacommons.org/resource/1GB42jsAhUygiGKcatE5VQ==
rdfs     http://www.w3.org/2000/01/rdf-schema#
n2       http://dbkwik.webdatacommons.org/resource/UcHT13pZin5q1ycYrjH0Vg==
rdf      http://www.w3.org/1999/02/22-rdf-syntax-ns#
n6       http://dbkwik.webdatacommons.org/resource/C2Vp84P8o_bYpAu3L_NUKg==
xsdh     http://www.w3.org/2001/XMLSchema#
n8       http://dbkwik.webdatacommons.org/resource/CmbnjT_sEk614TfwgkYzBQ==
Subject Item
n2:
rdfs:label
Does your computer come with a graphics card as standard
rdfs:comment
Computes the graphics in a computer. Since that is obvious and probably not what you were after, I'll expand on it a bit. A modern graphics card is one of the most complex computer components, and is essentially a small, single-purpose computer in itself. The core of the card is a graphics processor, which works in the same basic way a normal CPU does, except that it is designed specifically for graphics operations (computing lighting, shading, colors, 3D geometry, various effects, and so on). The graphics core has its own dedicated memory banks (fast, high-frequency RAM modules, today usually DDR3 through GDDR5), whose size is the number commonly seen on the card's packaging, such as "1024 MB VGA RAM". Besides these core elements, modern graphics cards usually also…
dcterms:subject
n5: n6: n8:
n7:abstract
Computes the graphics in a computer. Since that is obvious and probably not what you were after, I'll expand on it a bit. A modern graphics card is one of the most complex computer components, and is essentially a small, single-purpose computer in itself. The core of the card is a graphics processor, which works in the same basic way a normal CPU does, except that it is designed specifically for graphics operations (computing lighting, shading, colors, 3D geometry, various effects, and so on). The graphics core has its own dedicated memory banks (fast, high-frequency RAM modules, today usually DDR3 through GDDR5), whose size is the number commonly seen on the card's packaging, such as "1024 MB VGA RAM". Besides these core elements, modern graphics cards usually also have a number of I/O connectors: a PCI-Express plug, HDMI and DVI outputs, SLI bridges, etc. The more powerful ones often carry massive coolers and heatsinks, as well as extra power connectors.

So, how does it work? The card is assigned a batch of raw data, usually by the CPU, by the "engine" (core program) of a game, by a 3D-modelling application, and so on, along with instructions (simplified: "shade this", "add some fog there", ...). The card pumps this data through the various parts of its complex architecture, transforming a bunch of coded essentials into the picture you see on the screen. The picture is, in some manner, contained in the original data, but it is the card that creates what you actually see. Say you have a game like Crysis: if every picture that could be seen in the game (the scenery in all possible states, viewed from every possible point in the virtual world) were stored prerendered in the game's files, the game would be several thousand gigabytes in size. So instead, the game contains only instructions on how to build a scene, and the graphics card builds it: first the wireframes of the objects, then their surfaces and textures, then lighting, shadows and special effects. This happens continuously, at about 30 frames per second. In short, the graphics card creates the scene you see on your screen.
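The per-frame sequence the abstract describes (wireframe geometry, then surfaces and textures, then lighting and effects, repeated every frame) can be sketched as a toy pipeline. This is a minimal illustration only: the stage names, the `Scene` dictionary, and the string tags are invented for this sketch and do not correspond to any real graphics API.

```python
# Toy sketch of the per-frame pipeline described above. Stage names and
# data shapes are illustrative; a real GPU pipeline is far more complex.

def build_wireframe(scene):
    # Geometry stage: turn each object description into a mesh/wireframe.
    return [f"mesh({obj})" for obj in scene["objects"]]

def apply_surfaces(meshes):
    # Texturing stage: wrap each mesh in its surface and textures.
    return [f"textured({m})" for m in meshes]

def apply_lighting(surfaces):
    # Lighting/effects stage: shade each textured surface, add shadows/effects.
    return [f"lit({s})" for s in surfaces]

def render_frame(scene):
    # One frame: wireframes -> surfaces and textures -> lighting and effects.
    # In a running game this is repeated continuously, ~30 times per second.
    return apply_lighting(apply_surfaces(build_wireframe(scene)))

scene = {"objects": ["tree", "rock"]}
frame = render_frame(scene)
# frame == ['lit(textured(mesh(tree)))', 'lit(textured(mesh(rock)))']
```

The point of the sketch is the ordering: each stage consumes the previous stage's output, which is why the game's files only need to ship scene descriptions rather than finished pictures.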