Game Development Community

16bit Textures, Possible?

by Ron Kapaun · in Torque 3D Professional · 07/31/2014 (2:53 pm) · 6 replies

Hey all. Figured I would toss this out there. Are we capable of using 16-bit rather than 8-bit textures? If so, what steps do I need to take for it to work? (Initial tests were not what I would call successful.) Thoughts? Comments? Is this overkill?

Ron

#1
07/31/2014 (3:08 pm)
For what? Aren't we already using more than 8 bits for most textures?
#2
07/31/2014 (3:43 pm)
@Lukas, I think Ron means per channel. Most images used in games are still 8 bits per channel.
#3
07/31/2014 (3:54 pm)
I assume you mean 16 bits per channel? So, 64-bit RGBA? There are a number of bumps along the way. For starters, what format is the image in? RAW?
#4
07/31/2014 (4:18 pm)
Well, yeah, I could have been more specific. I have been playing around with this SIGGRAPH paper from back in 2010: advances.realtimerendering.com/s2010/Kaplanyan-CryEngine3%28SIGGRAPH%202010%20Ad...

My thought process was concerning detail level in textures, so the first segments (textures and normal maps) are my focus for current experiments. Can't say I am doing that well, though. I went over the source and I know why; I just needed some input from pure coders as to what it would take to make it work.

Ron
#5
08/01/2014 (12:52 am)
@Ron, did you try Timmy's 3Dc support for normal maps in T3D already? (Not 16-bit, though.)
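For anyone unfamiliar with 3Dc (BC5/ATI2): it only stores the X and Y components of a unit-length normal, and the shader rebuilds Z. A quick C++ sketch of that reconstruction (the real thing happens in HLSL/GLSL, but the math is the same):

```cpp
#include <cmath>

// 3Dc/BC5 stores only x and y of a unit-length normal; z is derived from
// x^2 + y^2 + z^2 = 1, taking the positive root because tangent-space
// normals point out of the surface.
float reconstructNormalZ(float x, float y)
{
    float zz = 1.0f - x * x - y * y;
    return zz > 0.0f ? std::sqrt(zz) : 0.0f; // clamp against compression error
}
```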
#6
08/01/2014 (6:26 am)
Okay, well, the lighting calculations and everything are already done at high color depth; I think it's 32 bits per channel in the lighting buffer. The last step is to tone map the results back down to 24-bit color for output.
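To illustrate that last step, here's a minimal sketch of what a tone-map-then-quantize pass does conceptually. I'm using the basic Reinhard operator as a stand-in; T3D's actual tone mapping operator may well be different:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>

// Map an HDR linear value (0..inf) into 0..1 with the basic Reinhard
// operator, then quantize to one 8-bit channel of a 24-bit output target.
std::uint8_t toneMapTo8Bit(float hdr)
{
    float ldr = hdr / (1.0f + hdr);          // Reinhard: result is always < 1
    ldr = std::clamp(ldr, 0.0f, 1.0f);       // guard against bad input
    return static_cast<std::uint8_t>(std::lround(ldr * 255.0f));
}
```

The 32-bit float lighting buffer never overflows, but everything gets squeezed back into 256 levels per channel right here, which is why the output stays 24-bit.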

If your goal is simply to get 48-bit 3Dc textures into the pipeline to be acted on during lighting calculations, it shouldn't be all that hard. If you want the results to remain 48-bit and be output to the graphics card as such, without tone mapping, I'm not entirely sure what's involved. I tried searching around for more information but couldn't find much.
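On the "keep it 16-bit" side, the gain is purely quantization precision. A quick sketch of the round-trip error at each depth, assuming unsigned normalized storage (which is what sized formats like OpenGL's GL_RGBA16 use):

```cpp
#include <cmath>
#include <cstdint>

// Quantize a 0..1 value to n-bit unsigned normalized storage and back.
// The worst-case error is half a step: ~1/510 at 8 bits vs ~1/131070 at
// 16 bits, i.e. 257x finer gradation per channel.
float roundTripUnorm(float v, unsigned bits)
{
    const std::uint32_t maxVal = (1u << bits) - 1u;
    std::uint32_t q = static_cast<std::uint32_t>(std::lround(v * maxVal));
    return static_cast<float>(q) / maxVal;
}
```

That extra precision mostly matters for smooth gradients (sky, fog, specular falloff) where 8-bit banding is visible; for typical albedo textures it's hard to see the difference.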