GDI (Graphics Device Interface) is the software renderer under Windows. Basically, any language/runtime platform you use under Windows that is not GPU-accelerated is going to be using GDI under the hood at some level. By this I mean that while Java AWT might use GDI directly via the C code that the Java runtime is written in, something like Flash running in Chrome will, on the other hand, be using GDI if the desktop isn't GPU-accelerated, vs. DirectX if it is.

"I know that there is no benefit to using the CPU for this purpose; I only want to understand how it works."

Not at all. It can be a lot easier to simply plot lines, pixels and images than to learn the ins and outs of GPU-based graphics programming. If you intend to write native code to access GDI directly, maybe check out the GDI+ (gdiplus) library, as outlined in this tutorial. Good luck!
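
To give a sense of what "simply plot lines, pixels and images" looks like, here is a minimal plain-GDI sketch in C (no GDI+). The window title and class name are placeholders I made up; error handling is trimmed. Build with any Win32 toolchain and link against user32 and gdi32.

```c
/* Minimal GDI sketch: all drawing happens in the WM_PAINT handler. */
#include <windows.h>

static LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    switch (msg)
    {
    case WM_PAINT:
    {
        PAINTSTRUCT ps;
        HDC hdc = BeginPaint(hwnd, &ps);       /* device context for this window */

        /* Plot a single red pixel. */
        SetPixel(hdc, 10, 10, RGB(255, 0, 0));

        /* Draw a blue line with a 2-pixel-wide pen. */
        HPEN pen = CreatePen(PS_SOLID, 2, RGB(0, 0, 255));
        HGDIOBJ old = SelectObject(hdc, pen);
        MoveToEx(hdc, 20, 20, NULL);
        LineTo(hdc, 200, 120);
        SelectObject(hdc, old);                /* restore before deleting the pen */
        DeleteObject(pen);

        /* Some text, for good measure. */
        TextOut(hdc, 20, 140, TEXT("Hello, GDI"), 10);

        EndPaint(hwnd, &ps);
        return 0;
    }
    case WM_DESTROY:
        PostQuitMessage(0);
        return 0;
    }
    return DefWindowProc(hwnd, msg, wParam, lParam);
}

int WINAPI WinMain(HINSTANCE hInst, HINSTANCE hPrev, LPSTR cmdLine, int nShow)
{
    WNDCLASS wc = {0};
    wc.lpfnWndProc   = WndProc;
    wc.hInstance     = hInst;
    wc.hCursor       = LoadCursor(NULL, IDC_ARROW);
    wc.hbrBackground = (HBRUSH)(COLOR_WINDOW + 1);
    wc.lpszClassName = TEXT("GdiDemoClass");   /* placeholder class name */
    RegisterClass(&wc);

    CreateWindow(TEXT("GdiDemoClass"), TEXT("GDI demo"),
                 WS_OVERLAPPEDWINDOW | WS_VISIBLE,
                 CW_USEDEFAULT, CW_USEDEFAULT, 320, 240,
                 NULL, NULL, hInst, NULL);

    MSG msg;
    while (GetMessage(&msg, NULL, 0, 0) > 0)
    {
        TranslateMessage(&msg);
        DispatchMessage(&msg);
    }
    return (int)msg.wParam;
}
```

That is the whole mental model: get a device context, call drawing functions on it, done. No buffers, shaders or pipeline state to set up, which is why it is such a gentle way to learn the basics of rasterisation.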

EDIT: For accuracy's sake, it may also be worth noting that there are CPU implementations of OpenGL, Mesa being the common example. So, strictly speaking, OpenGL does not exclude software rendering. But that's worth mentioning only if you really want to be pedantic!
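
If you want to see which OpenGL implementation you actually got, one quick check is to create a context and read GL_RENDERER. The sketch below is an assumption-laden probe, not a definitive test: it uses the common dummy-window trick on Windows, skips error handling, and the exact strings depend on your drivers (Microsoft's software fallback typically reports "GDI Generic"; Mesa's software rasterizer typically reports an "llvmpipe" renderer). Link against opengl32, gdi32 and user32.

```c
/* Probe sketch: create a WGL context on a throwaway window and print the
   vendor/renderer strings to see whether rendering is software or hardware. */
#include <windows.h>
#include <GL/gl.h>
#include <stdio.h>

int main(void)
{
    /* A hidden dummy window is enough to obtain a GL-capable device context. */
    HWND hwnd = CreateWindow(TEXT("STATIC"), TEXT("gl-probe"), WS_OVERLAPPEDWINDOW,
                             0, 0, 64, 64, NULL, NULL, GetModuleHandle(NULL), NULL);
    HDC hdc = GetDC(hwnd);

    PIXELFORMATDESCRIPTOR pfd = {0};
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;
    SetPixelFormat(hdc, ChoosePixelFormat(hdc, &pfd), &pfd);

    HGLRC ctx = wglCreateContext(hdc);
    wglMakeCurrent(hdc, ctx);

    printf("GL_VENDOR:   %s\n", (const char *)glGetString(GL_VENDOR));
    printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));

    wglMakeCurrent(NULL, NULL);
    wglDeleteContext(ctx);
    ReleaseDC(hwnd, hdc);
    DestroyWindow(hwnd);
    return 0;
}
```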
