
I foresee that this technology will become popular and mainstream, but it will take some time to do so. My guess is about 5 to 10 years.

As you correctly noted, one major obstacle to the adoption of the technology is the lack of a common library that runs on most adapters - both ATI and nVidia. Until this is solved to an acceptable degree, the technology will not enter the mainstream and will stay in the niche of custom-made applications that run on specific hardware.

As for integrating it with C# and other high-level managed languages - this will take a bit longer, but XNA already demonstrates that custom shaders and a managed environment can mix - to a certain degree. Of course, the shader code is still not in C#, and there are several major obstacles to getting it there.

One of the main reasons GPU code executes so fast is that it has severe limitations on what the code can and cannot do, and it uses VRAM instead of the usual RAM. This makes it difficult to bring CPU code and GPU code together. While workarounds are possible, they would practically negate the performance gain.

One possible solution I see is a sub-language of C# that has the same limitations, is compiled to GPU code, and has a strictly defined way of communicating with the usual C# code. However, this would not be much different from what we have already - just more comfortable to write, thanks to some syntactic sugar and standard library functions. Still, this too is ages away for now.
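The CPU/GPU split described above can be sketched in CUDA C: the host has to allocate VRAM buffers explicitly and copy data across the bus in both directions, and that round trip is exactly the overhead that a naive mixing of CPU and GPU code would pay on every call. A minimal sketch (kernel and variable names are illustrative, not from any particular library):

```cuda
#include <cuda_runtime.h>
#include <stdio.h>

// GPU code is compiled separately and operates on VRAM only.
__global__ void addOne(float *data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] += 1.0f;
}

int main(void) {
    const int n = 1024;
    float host[1024];                    // ordinary RAM, visible to the CPU only
    for (int i = 0; i < n; ++i) host[i] = (float)i;

    float *dev;                          // VRAM, visible to the GPU only
    cudaMalloc(&dev, n * sizeof(float));

    // Every CPU<->GPU hand-off is an explicit copy across the bus;
    // this transfer cost is what negates the gain if done too often.
    cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);
    addOne<<<(n + 255) / 256, 256>>>(dev, n);
    cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost);

    cudaFree(dev);
    printf("%f\n", host[0]);
    return 0;
}
```

A hypothetical C# sub-language would have to generate something equivalent to the kernel above and hide the two `cudaMemcpy` calls behind its interop boundary - which is why the copies, not the syntax, are the hard part.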
 
