Just got online after returning from Scene Event 2004. It was great seeing my old buddies, whom I've had the pleasure of knowing for so long. I also enjoyed the extremely good entries in the demo competition. Great work by all of you!
It is funny to see the scene change over time, especially since I'm no longer that active. Seeing the "return of the demoscene" in recent years has been a great pleasure. It seems the demoscene is returning to full strength now that the DOS-to-Windows and software-to-hardware transitions have been made. For those of you who haven't followed what's been happening, I'd recommend checking out all the great demos that have arrived.
I've been thinking about returning to 3D programming - at least a bit. I'm very impressed by what's becoming possible with new 3D hardware. I'm really interested in how it could be used for video processing - and whether it could power an application for advanced video editing with an all-3D pipeline.
For now I'm still worrying about colorspace issues. I'd rather have all processing done in YUV 4:4:4, but unfortunately video cards (probably) only support RGB - though with floating-point precision, which is great in itself. It would, however, be rather silly to have to read the processed video back from the graphics card before being able to display it. OTOH, it might be possible to do the colorspace conversions using pixel shaders, which would be a great help.
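To illustrate the kind of per-pixel work such a shader would do, here's a minimal sketch of a YUV-to-RGB conversion in Python. It assumes full-range BT.601 coefficients and floats with Y in [0, 1] and the chroma components already centered around zero - all assumptions on my part, since the actual matrix depends on which video standard the footage uses.

```python
def yuv_to_rgb(y, u, v):
    """Convert one full-range BT.601 YCbCr sample to RGB.

    y is assumed in [0, 1]; u and v are assumed centered in [-0.5, 0.5].
    This mirrors the per-pixel math a pixel shader would run.
    """
    r = y + 1.402 * v
    g = y - 0.344136 * u - 0.714136 * v
    b = y + 1.772 * u
    # Clamp to the displayable range, as a shader would with saturate().
    clamp = lambda x: min(max(x, 0.0), 1.0)
    return clamp(r), clamp(g), clamp(b)
```

A neutral sample (zero chroma) should map straight to a gray of the same luma, e.g. `yuv_to_rgb(0.5, 0.0, 0.0)` gives (0.5, 0.5, 0.5). On a GPU the same three dot products become a single matrix multiply per pixel, which is exactly the sort of thing pixel shaders are built for.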
First problem is getting a DX9-capable gfx-card, though. ;)