Talk:High dynamic range rendering

From Wikipedia, the free encyclopedia


Discussion Cleanup?

Would anyone object to a discussion cleanup? I think it would be far better to mirror the contents of the actual article and follow that method. This is getting pretty messy! ~Xytram~ 19:33, 8 August 2007 (UTC)

HDR + AA

"However, due to the stressing nature of HDRR, it's not recommended that full screen anti-aliasing (FSAA) be enabled while HDRR is enabled at the same time on current hardware, as this will cause a severe performance drop. Many games that support HDRR will only allow FSAA or HDRR. However some games which use a simpler HDR rendering model will allow FSAA at the same time."

I removed that paragraph, because it's either written by an NVIDIA fanboy or simply outdated and no longer true. Look up benchmarks for newer cards and you will see that HDR + AA at 100+ fps is now possible in many games.

As per the above reply, I recently ran The Elder Scrolls IV: Oblivion on my GeForce 7800GT. I forced 8xAA using the nVidia driver software and enabled HDR. I couldn't really see any significant performance difference, although including anything on the page would require valid benchmarks. I find it quite funny that, in Oblivion, if you try to enable AA it will warn you and force 'Bloom'. Maybe we could research this to see which cards can successfully do both? ~Xytram~ 14:48, 20 November 2006 (UTC)
The GeForce 7800GT cannot support MSAA and HDR at the same time; that's why you didn't notice any performance difference. When you forced 8xMSAA, you essentially did nothing in game, because HDR was still active.
The only Nvidia cards that can do both at the same time are the new 8-series cards. I don't remember which exactly for ATI. —The preceding unsigned comment was added by 68.165.190.248 (talk)
Unless I'm misunderstanding you, this is not true; it depends on how the HDR is implemented. Valve's implementation can handle both HDR and AA at once on a 7-series. I know this because I just tried it in Lost Coast on my 7900GS, and it certainly rendered in HDR and it was certainly antialiased. I don't think Bethesda's can. 81.86.133.45 21:45, 5 April 2007 (UTC)
No, this is not a misunderstanding... I own a 7800GT and it is not possible to use HDR & AA at the same time, even forced. ~Xytram~ 19:31, 8 August 2007 (UTC)
Such an awful lot of blah blah about an easy thing like this. MSAA runs the pixel shader several times with a slightly jittered position and averages the results (with optional weighting factors) before writing to the render buffer. No more, no less. GeForce 8-class hardware, as well as the newer Radeon cards, can do this automatically if the user flips a switch in the control panel; older cards do not. However, unless you hit the maximum instruction count already (quite unlikely), this can be implemented trivially in the shader on older cards with the exact same result. The total number of shader instructions, texture fetches, etc. stays the same as if it were "hardware supported". The only real difference is that it doesn't happen automagically. —Preceding unsigned comment added by 91.35.150.102 (talk) 17:14, 18 December 2007 (UTC)
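The averaging step the commenter describes can be sketched in a few lines of Python. This is an illustration only: `shade()` is a made-up stand-in for a real pixel shader, and the jitter offsets are just one plausible sample pattern, not any particular card's behaviour.

```python
def shade(x, y):
    # Stand-in for a real pixel shader: some smooth function of position.
    return (x % 1.0, y % 1.0, 0.5)

def supersample(x, y, offsets, weights=None):
    """Run the 'shader' at several jittered positions and average the
    results (with optional weighting factors) before the buffer write."""
    if weights is None:
        weights = [1.0 / len(offsets)] * len(offsets)
    r = g = b = 0.0
    for (dx, dy), w in zip(offsets, weights):
        sr, sg, sb = shade(x + dx, y + dy)
        r, g, b = r + w * sr, g + w * sg, b + w * sb
    return (r, g, b)

# A 4-sample rotated-grid pattern (one common choice; exact offsets vary).
offsets = [(-0.125, -0.375), (0.375, -0.125), (-0.375, 0.125), (0.125, 0.375)]
color = supersample(10.0, 20.0, offsets)
```

With equal weights this reduces to a plain mean of the jittered shader results, which is the "trivial in the shader" implementation the comment refers to.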

Allow me to clarify things a bit. What pre-G80 Nvidia cards don't support is antialiasing a floating-point render buffer. Games that render to an FP texture for HDR (Far Cry, among others) can't antialias HDR content. Valve's implementation performs the tone-mapping step in the pixel shader, so they use a path based on standard integer render targets coupled with a histogram for overall scene luminosity detection. That way AA works OK. Starfox (talk) 20:42, 27 March 2008 (UTC)
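For readers following along, the integer-target-plus-histogram idea can be sketched roughly as below. All function names and the exposure heuristic are illustrative guesses at the general technique, not Valve's actual shader code.

```python
def luminance(rgb):
    # Standard Rec. 709 luma weights.
    r, g, b = rgb
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def build_histogram(pixels, bins=16, max_lum=16.0):
    # Bucket scene luminances so overall brightness can be estimated.
    hist = [0] * bins
    for p in pixels:
        i = min(int(luminance(p) / max_lum * bins), bins - 1)
        hist[i] += 1
    return hist

def pick_exposure(hist, bins=16, max_lum=16.0, target=0.5):
    # Crude heuristic: map the median-luminance bin to mid-grey.
    total = sum(hist)
    acc = 0
    for i, count in enumerate(hist):
        acc += count
        if acc * 2 >= total:
            median_lum = (i + 0.5) / bins * max_lum
            return target / max(median_lum, 1e-6)
    return 1.0

def tonemap_to_int8(rgb, exposure):
    # Scale, clamp, and quantize -- this is what lands in the ordinary
    # 8-bit integer render target, so MSAA works on it normally.
    return tuple(min(int(c * exposure * 255.0 + 0.5), 255) for c in rgb)

scene = [(0.1, 0.1, 0.1), (2.0, 1.5, 1.0), (8.0, 8.0, 6.0)]
exposure = pick_exposure(build_histogram(scene))
ldr = [tonemap_to_int8(p, exposure) for p in scene]
```

The key point is that the HDR values exist only upstream of the tone map; the buffer being antialiased never holds floating-point data.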

Floating point

"Many monitors ... do not support floating point contrast values."

Of course not. Because they are analog devices. (rolleyes) the preceding unsigned comment is by 195.210.242.74 (talk • contribs) 13:38, 31 October 2005 (UTC)

Except, of course, for the HDMI or DVI ones that have digital inputs. Which support digital integer values, but not digital floating point values. --Nantonos 22:09, 14 October 2006 (UTC)

List of games that support HDR

Shouldn't this section only include games that have already been released? the preceding unsigned comment is by Whursey (talk • contribs) 20:12, 28 December 2005 (UTC)

I don't see why, as long as the games are indicated as unreleased. —Simetrical (talk • contribs) 05:56, 29 December 2005 (UTC)

Discerning HDR from "illusions" of HDR

Some older games that do not specifically use DirectX 9.0c features should not really be counted as "HDR" games. I think a common mistake is that people think advanced blooming techniques (which may make the surface of an object brighter) count as HDR, but blooming is only a part of HDR.

I'll admit, I can't even figure out if a game is using HDR or not.

The simulated HDR effects used in certain console games may share many characteristics with 'true' HDR rendering, beyond mere light-bleed ('bloom') effects. For example, see Spartan: Total Warrior and Shadow of the Colossus on PS2.

For me, as a photographer, I don't think I've ever seen HDR in games. For me, HDR means that when I expose for the inside of a church with light coming through open windows, I can still see the blue sky outside (taking around six pictures and mapping them so that both the inside AND the outside are exposed correctly),
not the over-exposed bright-white windows we normally get without applying HDR.
And this is what I see in games: big bloom around windows which, to me, looks exactly like the normal/low-dynamic-range pictures we get when the camera exposes for the inside.
I don't think it matters if you manipulate the picture in 32 bits if you're then still letting the over-exposure get the best of you... 221.186.144.238 05:37, 12 November 2007 (UTC)
Blooming really has nothing to do with HDR; it is an incorrect but good-enough-looking approximation of the convolution effect of real cameras (and of the eye, which is an aperture device too). Bloom should only be visible on very strongly lit areas, hence the connection with HDR. Unluckily, a lot of game designers are so hyped about bloom that they put in a much too strong effect, causing the scene to look like a cheap 1970s soft porn movie.

What HDR is about (as the previous poster already said) is having a much larger difference between bright and dark image areas than 8 bits can offer. Because hardly any output device can reproduce the range that our eye can distinguish (or even the range that appears in nature), we have to use tone mapping to compress this range, effectively resulting in a LDR image. Black can't be more black and white can't be more white. However, it still matters that we process the information in high dynamic range, since we can dynamically modify the exposure depending on what we look at (and the overall brightness), much like the human eye does, too. This results in a much more natural look and feel: if you look at a bright window in a moderately lit room, the previously dark interior of the room apparently fades to black, and if you turn your head, the previously black room reveals detail which apparently was not there before, as the eye adapts to the situation. —Preceding unsigned comment added by 91.35.150.102 (talk) 17:37, 18 December 2007 (UTC)
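The adaptation behaviour described above can be sketched as an exposure value that drifts toward a target derived from the average scene luminance, followed by a tone-mapping operator. The Reinhard operator and the rate constant below are illustrative choices, not a claim about any particular engine:

```python
import math

def target_exposure(avg_luminance, key=0.18):
    # Map the scene's average luminance to a mid-grey "key" value.
    return key / max(avg_luminance, 1e-6)

def adapt(current_exposure, avg_luminance, dt, rate=1.5):
    # Exponential drift toward the target, mimicking the eye adapting.
    blend = 1.0 - math.exp(-rate * dt)
    return current_exposure + (target_exposure(avg_luminance) - current_exposure) * blend

def reinhard(lum, exposure):
    # Classic global tone-mapping curve: compresses HDR luminance into [0, 1).
    l = lum * exposure
    return l / (1.0 + l)

# Adapted to a dim room, then turning toward a bright window: over the
# next second of frames the exposure falls, darkening the interior.
exposure = target_exposure(0.18)
for _ in range(60):
    exposure = adapt(exposure, 5.0, 1.0 / 60.0)
```

Because the exposure is applied before tone mapping, previously visible shadow detail fades as the exposure drops, which is exactly the "room apparently fades to black" effect described above.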

SM2.0b

It seems unlikely to me that SM2.0b increased precision from 8-bit integers to 16-bit integers, as even as far back as ATi's 9x00 line of cards there was 24-bit floating-point support, according to this link. Does anybody have a better reference? Qutezuce 19:46, 9 January 2006 (UTC)

I think I had Shader Model confused with the Pixel Shader version; I apologize for keeping the two on the same line when they are clearly not the same (it should be Pixel Shader 2.0b, which is still a component of Shader Model 2.0). I'll put up the list of hardware again and make this correction. But it's uncertain whether Pixel Shader 2.0b can be supported by any hardware supporting Shader Model 2.0, unless developers deliberately made SM 2.0 not take full advantage of the video card. XenoL-Type 14:22, 11 January 2006 (UTC)

My comment was in regards to this sentence "Shader Model 2.0b allowed for 16-bit integer based lighting percision" in the History section, not the section about video card support. Qutezuce 22:26, 11 January 2006 (UTC)
The article was referring to video cards in their approach to SM2.0. I will fix it, though, and ignore that whole pixel shader thing because that'll get confusing.

After reading your latest edit I think you have misunderstood the point I'm trying to make. The paragraph (as it stands now) implies that SM2.0 had only 8 bit integer precision, and SM2.0b upgraded that to allow 16 bit integers. However I doubt this claim because this link states that ATi's 9700 hardware already had 24 bit floating point support. So if their hardware had 24 bit floating point support why would they only allow 16 bit integers? So I am asking you to provide a source or a reference for your claim that SM2.0b upgraded from 8 bit integers to 16 bit integers. Qutezuce 23:20, 11 January 2006 (UTC)

Implications and misunderstandings. Before I begin: hardware capabilities are different from software capabilities. For example, having Intel HyperThreading Technology doesn't by itself mean you'll get better performance; the software has to take advantage of it. The Radeon 9000 series may have 24-bit FP support, but that doesn't mean it uses it if the software associated with the hardware prevents it from doing so. It's almost like saying that just because the 9000 series supports 24-bit FP, it could support SM3.0. Anyway, I'm tempted to believe that most games today only use a 16-bit lighting precision model, based on this [1], and that's where "16-bit integer based" came from: 16 bits because only 16-bit lighting precision has been used thus far and the 9000 and Xx00 series were towards the end of their life, and integer based because SM2.0 does not support FP-based lighting precision. But the Radeon 9000 only supports software that is Shader Model 2.0 or lower as far as I'm concerned (hardware that supports up to DX8 can't do DX9 effects), and unless SM2.0 gets a 2.0c with FP support, the Radeon 9000 and Xx00 series are confined to an integer-based lighting precision model.
But the article's changed to disregard that "only 16-bit integer" thing.

XenoL-Type 00:51, 12 January 2006 (UTC)

Actually, that's incorrect from what I've seen. By no means does SM2.0 lack support for FP lighting precision -- SM2.0 actually mandates it. If you check the article Qutezuce mentioned, you'll see that cards must support at least 24-bit floating point. This was actually a big thing a while back, since the 9x00s provided only 24-bit and people were complaining about its lack of precision compared to the FX5x00's (slow as mud) 32-bit floating point. PS2.0b primarily added the ability for longer shader programs, and 32-bit floating point.
Note that since SM2.0 supports the necessary floating-point precision (24-bit is quite sufficient), it does indeed have the capability to support HDRR, and the article should be revised to indicate this. It should also be mentioned that few pieces of software actually make use of it, though (since you are absolutely correct that hardware which goes unused is essentially useless). JigPu 06:08, 14 January 2006 (UTC)
According to MS's site (which again is according to nVidia), in the section on interpolated color format, Shader Model 2.0b has an 8-bit integer minimum, while Shader Model 3.0 has a 32-bit floating-point minimum, and the upgrade description states that "higher range and precision color allows high-dynamic range lighting at the vertex level". The hardware may support floating-point operations, but again, if the software, as the site says, does not do floating-point calculations, then the hardware doesn't do floating-point calculations regardless. Let me put it this way: the 9000 series (which should really consist only of the 9500 and beyond, since the 9250 and below are DX8) can do high-precision coloring, but since it's limited to Shader Model 2.0 effects, it can only do Shader Model 2.0 effects. It can't do 2.0b, which is supported only by the Xx00 series.
XenoL-Type 00:51, 12 January 2006 (UTC)
Looking at the site you mention, it says "8-bit integer minimum". I think the word minimum is key here. I think the reason it says minimum is simply that that is the minimum you need in order to have SM2.0 support, but you can go above and beyond it. DirectX 9.0 supports the use of cards with floating-point support; it only requires 8-bit integer for SM2.0, but that doesn't mean that if the hardware has floating-point support, Direct3D blocks the use of FP simply because the card doesn't support all the features of SM3.0. If that weren't the case, why would they put the word "minimum" in their table? If 8-bit were both the minimum and the maximum, then the word minimum would be misleading.
Now I will fully admit I'm not an expert at this subject matter, and it appears you aren't either. We should get someone who actually knows what they are talking about to clean up the article. Qutezuce 20:30, 18 January 2006 (UTC)
I know what you mean, and I know I'm not an expert, but I do know one thing: the 9000 series was designed specifically for Shader Model 2.0 and below. Since the Shader Model 2.0 API only does calculations in integers, again according to that site, that's all the hardware calculates. If the software is only sending integers, the hardware isn't going to calculate in FP. Or it will, but the result will just be something like 20.000000. XenoL-Type 17:20, 18 January 2006 (UTC)
The minimum requirement for SM2.0 may only require integers, but that doesn't mean a graphics card which supports SM2.0 can't do FP. SM2.0 doesn't limit what you can do to only integers, it is a minimum set of requirements that hardware has to either meet or exceed to be qualified as SM2.0. I'm saying that cards like ATI's X800 exceed the SM2.0 minimum requirements by allowing 24 bit FP, but don't quite reach the requirements of SM3.0 (because it lacks things like dynamic branching). Think of SM2.0 as a set of minimum requirements to run a game, if your computer setup exceeds those requirements then it's not going to ignore the extra 500MHz or 256MB RAM that is available to it. Qutezuce 01:37, 19 January 2006 (UTC)
Actually, we're not concerned about the hardware capabilities of the video card. We're concerned with the software portion of SM2.0 that allows HDRR; all the article says now is that the Radeon 9000 supports HDRR, but in its SM2.0 form. The article has been fixed to clear up any hardware misunderstandings. XenoL-Type 15:40, 19 January 2006 (UTC)
What do you mean when you say "the software portion of SM2.0 that allows HDRR"? Are you referring to Direct3D allowing HDRR? Or are you talking about games that use 24-bit FP? If you mean the former, then that is what my last reply was about: Direct3D allowing the use of 24-bit FP above and beyond the minimum requirements of SM2.0. Qutezuce 23:52, 19 January 2006 (UTC)
When talking about HDRR, we should only be talking about Shader Model 2.0, since after all HDRR is a shader effect, not an effect caused by the hardware. Leave the 24-bit FP discussion for the R300 topic. Also, I'm still waiting on my theory that Shader Model 1.0 is really DX8.0 and SM2.0 is DX9.0. The thing is, nVidia is the forerunner of HDR technology (so to speak), so there could be bias against ATi; but since other publications are saying that SM2.0 in any form is only integer based, that's the way it will be until someone actually tests it.
And most games seem to incorporate 16-bit lighting precision, or rather 64-bit coloring. Using 32-bit right now would eat up a lot of space. XenoL-Type 21:11, 19 January 2006 (UTC)
Gaming support and what real-world games actually use is a valid issue, but it is not the issue at hand here. The issue of FP support is not just an R300 issue, as the GeForce FX series supported 16-bit FP and 32-bit FP. Where are these other publications that say SM2.0 is integer only? A minimum requirement of 8-bit integer, sure, but you have not shown anything that says integer only. Finally, your first sentence ("When talking about HDRR, it should only be talking about Shader Model 2.0 since after all HDRR is a shader effect, not an effect caused by the hardware.") does not make any sense to me. Qutezuce 05:30, 20 January 2006 (UTC)

Asked about this. DX9.0 is left out in the blue, but DX9.0a is 16-bit FP, 9.0b is 24-bit FP, and 9.0c is 32-bit FP.

All HDR games, though, use 16-bit to increase efficiency. XenoL-Type 22:33, 20 January 2006 (UTC)

You asked whom? You seem to be completely missing my point time and time again. I'm saying that although certain minimums may be required to be called SM2.0 or SM3.0 or whatever, that does not limit the use of 16-bit FP, 24-bit FP, or 32-bit FP in Direct3D if the underlying hardware supports it, regardless of the SM version. Qutezuce 06:55, 21 January 2006 (UTC)

Graphics cards which support HDRR

The ATI FireGL v5250?

The v5000 and v5100 cards are included on the table, but the v5250 is not. Is this an oversight, or does the card not support HDRR? I've done a bit of research, but I can't seem to find out. Thatcrazycommie 15:20, 5 June 2007 (UTC)

HDR Support

Radeon X300 isn't the oldest card that supports HDR. I just played Lost Coast with a Radeon 9600 and it showed up (beautifully, I might add). —Ilyanep (Talk) 17:57, 10 January 2006 (UTC)

I'm pretty sure it's not just advanced blooming either. —Ilyanep (Talk) 18:00, 10 January 2006 (UTC)

I'll put up the list of all the cards supporting Shader Model 2.0, but can you quickly confirm that the "Advanced" section of the Video options says "Full HDR" somewhere? Thanks. - XenoL-Type 14:24, 11 January 2006 (UTC)

Wasn't the Matrox Parhelia 512 the first to support HDR?

Cards for HDR

Maybe it should be noted that RD3xx and below does not support half-float blending?

On Paul Debevec's site, you can see HDR was possible even on the NV15. It's not that HDR wasn't possible before; it's just much easier with this machinery ready to go (I don't say it: he writes it on his page). 83.176.29.246 11:13, 10 June 2006 (UTC)

Are you sure it wasn't the "Quadro" version? Then again, I believe professional graphics developers at the time used the NV15 as a "poor man's" professional card. I'll buy that it could render scenes in HDR, though not in real time. And find a source for the note about the ATI card; we don't need more unsourced information. XenoL_Type, 18:34, 15 June 2006 (UTC)

GPU Database

I found this website http://www.techpowerup.com/gpudb which includes ATI, Matrox, nVidia and XGI GPU specs. As stated on this website, "all cards with SM2.0 support HDR". The GPUDB pages show each chipset's DirectX, pixel shader and vertex shader specification. I'm not sure which needs to be supported (e.g. the Matrox Parhelia supports PS1.3 and VS2.0). It might be useful if someone could have a look so we can get the card list updated. ~Xytram~ 14:57, 20 November 2006 (UTC)

(Edit: Oops! My mistake, I've just noticed it's included in the article External Links. Seems I've been beaten to it.) ~Xytram~ 15:15, 20 November 2006 (UTC)

DX10 HDR Cards

The new ATI DX10/SM4 cards are almost released, so I've added them to the list of SM 4.0 cards with HDR. They are: the HD 2900 series (XTX and XT), HD 2600 series (XT and Pro) and HD 2400 series (no info). —The preceding unsigned comment was added by 83.116.90.70 (talk) 17:57, 30 April 2007 (UTC).

Shader Model history

I did some thinking and this crossed my mind, and it probably should've been obvious.

I believe that DirectX 8 and beyond is another name for Shader Model. If OpenGL correlates to the Shader Model we hear about, then that breaks the argument. Anyway, it was probably when DX8.1 came out that we got Shader Model 1.1 -- for example, Deus Ex: Invisible War requires DX8.1 but doesn't need 9.0 (even though it comes with it). DirectX 9.0 saw the release of Shader Model 2.0. I've heard many reports that the Radeon 9700/9800 series (which includes the 9500 and 9600) literally tore through the GeForce 5 series on all Shader Model 2.0 benchmarks in 3DMark03, but I wouldn't know, and I'm willing to bet that Source treating the GeForce 5 series as a DX8.1 card is a behind-the-scenes deal between ATi and Valve.

Anyway, when DirectX 9.0b came out, that was probably when Shader Model 2.0b came out, given the relation between the versions (if DirectX 8 is Shader Model 1.0, DirectX 8.1 is Shader Model 1.1, and DirectX 9 is Shader Model 2.0).

Of course, the pattern breaks with DirectX 9.0c being Shader Model 3.0. But I hear DirectX 10 will only add a geometry shader, not a new Shader Model. XenoL-Type 22:11, 17 January 2006 (UTC)

Hold on here... this article gives the impression that the ATI Radeon Xx00 cards (the ones which support SM 2.0b) do not support HDRR. Why is that, when some of the older ATI models do? I've got an X700 -- will it work?

If I create 256 MB of virtual memory on my hard disk, along with my 256 MB of RAM, will it improve my gaming experience?


Age of Empires 3

Does Age of Empires 3 use the HDRR effect?

Just games?

My god, this whole article reads like it was written by a bunch of teenage gamers showing off their flashy new graphics cards. You do know that HDRI was pioneered in 1985 by Greg Ward? Probably before most of the authors of this page were born. I've tried to clear up some of the explanations in this article, but there's a lot more to be done. There's already an HDRI article, though, so perhaps this article could be renamed to make it clear that it's just about real-time rendering done with fragment shaders on modern GPUs. Imroy 12:02, 31 January 2006 (UTC)

I agree, this article seriously needs to be rewritten. Qutezuce 20:11, 31 January 2006 (UTC)
I don't see the need to rename the article; the use of "rendering" in the title should be sufficient in describing that HDRR doesn't involve photography or other non-electronic image processing. May I also add that HDRR should cover any form of computer graphics application besides those from computer and video games, and little is covered about it here at the moment. In addition, I see way too many game screenshots; some of them have to go. ╫ 25 ◀RingADing▶ 17:29, 2 February 2006 (UTC) ╫

Apparently HDRR, as people mention, refers to the real-time rendering of 3D scenes in a high dynamic range. As far as other real-time renders go, games are the only viable application. Game engines provide the best way to show off something done in real time, because settings in games can be turned on or off on the fly, making it easy to grab comparison screenshots. Even if, say, Pixar had made Finding Nemo or The Incredibles in HDRR, you probably couldn't obtain an LDR version for comparison.

The only other program that's not related to gaming is the HDRR demo in the links.

XenoL-Type 18:42, 1 May 2006 (UTC -9)

Cleanup February 2006

I tagged the graphics cards section for cleanup because I think there's a preference in WP:MOS for not using color codes. --Christopherlin 07:24, 9 February 2006 (UTC)

Separating the article further

The article looks a bit jumbled, with a lot of main points. To organize it better, I thought we could separate it into HDRR in general (history, etc.), technical details (what it does), and applications (so far HDRR is primarily in gaming, but if you can find some other application like CAD or computer-generated movies, that'd be great).

got it wrong?

"This means the brightest white ( a value of 255 ) is only 256 times darker than the darkest black ( a value of 0 )."

I'm tired so I don't dare edit it, but shouldn't it rather be: this means the darkest black (a value of 0) is only 256 times darker than the brightest white (a value of 255)?

Yes, I fixed it. Thanks. Qutezuce 01:44, 26 April 2006 (UTC)
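As an aside on the numbers being discussed here: the 256:1 figure comes straight from the 8-bit encoding, and the contrast with floating-point storage can be shown with a little arithmetic (the half-float constants below are the standard IEEE 754 half-precision values, included purely for illustration):

```python
# 8-bit integer storage: 256 levels, so the brightest value is only
# 255 times the darkest *non-zero* value.
int8_levels = 256
int8_ratio = (int8_levels - 1) / 1.0

# 16-bit (half-precision) float: largest normal value vs. smallest
# positive normal value.
fp16_max = (2 - 2 ** -10) * 2 ** 15       # 65504.0
fp16_min_normal = 2 ** -14                # about 6.1e-5
fp16_ratio = fp16_max / fp16_min_normal   # about a billion to one
```

This is why an FP16 framebuffer can carry both sunlight and deep shadow in one image, while an 8-bit buffer cannot.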

Other programs that use HDR.

This is a placeholder for a comment that I withdrew (redundant information). Remove if required. --CCFreak2K 10:53, 9 July 2006 (UTC)

Contrast ratio confusion

In the section High_dynamic_range_rendering#Limitations the article currently seems to confuse monitor contrast ratio with dynamic range ratio (only the latter being associated with what is discussed in this article). A display/monitor/beamer can have a contrast ratio as high as 10000:1 while still having a poor dynamic range ratio (think of a display that can show a very dark black and a very bright white but only comparatively few grey steps between them -- an example of a display with a good contrast ratio and a bad dynamic range ratio).

I would favor a clear separation of those two terms so there is no confusion. --Abdull 18:01, 11 August 2006 (UTC)

Precursor to HDR?

In January 2002, jitspoe released an "automatic brightness" Quake 2 engine which employs an early method of HDR: the screen's brightness adjusts according to the average brightness of the pixels on the screen. Could this possibly be included? download (on FilePlanet, unfortunately) CheapAlert 21:54, 6 September 2006 (UTC)
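The described technique is simple enough to sketch. This is a guess at the general idea (average the frame, scale the output toward a comfortable level), not jitspoe's actual code:

```python
def average_brightness(frame):
    # frame: list of greyscale pixel values in [0, 1].
    return sum(frame) / len(frame)

def auto_brightness_scale(frame, target=0.5):
    # Pick a gain that moves the average toward the target level.
    avg = average_brightness(frame)
    return target / max(avg, 1e-6)

def apply_scale(frame, scale):
    # Brighten (or darken) and clip to the displayable range.
    return [min(p * scale, 1.0) for p in frame]

dark_frame = [0.05, 0.1, 0.15, 0.1]
scale = auto_brightness_scale(dark_frame)   # > 1: brighten a dark scene
adjusted = apply_scale(dark_frame, scale)
```

It is "HDR-like" only in the sense of frame-dependent exposure; there is no high-precision framebuffer involved, which fits the "precursor" framing of this section.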

Accurate reflection of light

I've noticed that this paragraph keeps being changed. I think it could be better worded overall, since it's quite confusing. The first paragraph starts "Without HDRR..." and partway through changes to HDRR 'enabled'. Then the second paragraph starts "...rendered with HDR".

I'll reword it here, so people can make their own changes/comments before we change the articles version. Anything in bold below I've modified from the original wording.

--- Without HDRR, the sun and most lights are clipped to 100% (1.0 in the framebuffer). When this light is reflected the result must then be less than or equal to 1, since the reflected value is calculated by multiplying the original value by the surface reflectiveness, usually in the range 0 to 1. This gives the impression that the scene is dull or bland.

However, using HDRR, the light produced by the sun and other lights can be represented with appropriately high values, exceeding the 1.0 clamping limit in the frame buffer, with the sun possibly being stored as high as 60000. When the light from them is reflected it will remain relatively high (even for very poor reflectors), which will be clipped to white or properly tonemapped when rendered. Also, the detail on both the monitor and the poster would be preserved, without placing bias on brightening or darkening the scene.

An example of the differences between HDR and LDR rendering can be seen in the above example, specifically the sand and water reflections, in Valve's Half-Life 2: Lost Coast, which uses their latest Source engine. ~Xytram~ 13:13, 24 November 2006 (UTC) ---

Since no one has made any changes or comments, I'm guessing everyone is happy with my rewording. I'll update the article. Comments are still welcome! ~Xytram~ 12:54, 5 December 2006 (UTC)
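The clamping behaviour in the proposed wording can be checked with a toy calculation. The 0.3 reflectiveness is an arbitrary illustration value; the 60000 sun value is the one used in the wording above:

```python
def reflect_ldr(light, reflectiveness):
    # Without HDRR the source is clipped to 1.0 *before* the multiply,
    # so every reflection comes out dim.
    return min(light, 1.0) * reflectiveness

def reflect_hdr(light, reflectiveness):
    # With HDRR the full intensity survives the multiply; clipping (or
    # tone mapping) happens only at display time.
    return light * reflectiveness

def display(value):
    # Final clip to the displayable range.
    return min(value, 1.0)

sun = 60000.0
ldr = display(reflect_ldr(sun, 0.3))   # 0.3: a dull grey highlight
hdr = display(reflect_hdr(sun, 0.3))   # 1.0: still reads as bright white
```

The only difference is where the clamp happens, which is the whole point of the proposed paragraph: clamp-then-multiply makes everything dull, multiply-then-clamp (or tone-map) keeps bright reflections bright.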

HDR game list

I understand why it was removed -- the list was getting too large. But would it not be worthwhile listing the games that use SM2.0 HDR?

I'm sure that list would be rather small. 193.60.167.75 15:29, 23 January 2007 (UTC)

Some of this article sounds like advertising

Specifically, this section:

One of the few monitors that can display in true HDR is the BrightSide Technologies HDR monitor, which has a simultaneous contrast ratio of around 200,000:1 for a brightness of 3000 cd/m2, measured on a checkerboard image. In fact this higher contrast is equivalent to a ANSI9 contrast of 60,000:1, or about 60 times higher that the one of a TFT screen (about 1000:1). The brightness is 10 times higher that the one of the most CRT or TFT.

It looks like it's aimed at advertising the BrightSide Technologies monitor, without mentioning any other manufacturers that make the monitors. Could someone more knowledgeable about this please fix this? Thanks. Mike Peel 23:09, 2 February 2007 (UTC)

Are there any other monitors that exceed TFT dynamic range? For those who can't be bothered to click the link, BrightSide uses an array of backlights to increase contrast.

I am tempted to delete "3.2 Graphics cards which support HDRR" because that's just advertising, too!

... and all mention of games / game engines that support it, too!

Instead, I've added some useful information back in. If you can add to or improve what's there, feel free. Deleting it is not constructive.

--195.137.93.171 22:26, 9 September 2007 (UTC)

Specific types of HDR rendering

Anyone know a good place for a good explanation of the various types of HDR (FP16, FP10, Nao32, etc)? Derekloffin 20:57, 20 February 2007 (UTC)


Please Verify

Quote from the article:

"The development of HDRR into real time rendering mostly came from Microsoft's DirectX API."


I reckon I remember ID Software publications (smaller articles on the net) stating that they required graphics card vendors to include 32-bit colour handling in order to proceed with further development of HDR on a hardware basis. Could this please be verified, in order to get it straight? I also believe that this was well before DirectX actually began supporting HDR.

AFAIK they did it purely in software in those days (Quake 3 engine), without supportive hardware or software APIs (DirectX 9.0?) being available.

Besides that, ID Software, or some of its main representatives, is authoritative, among others, in that they actually drive the development of existing graphics card technologies in both hardware and software.

Besides that, HDR and similar effects all come from appropriate shader programming and thereby post-processing of the currently rendered scene. Therefore, DirectX should be seen as only [the first] API incorporating the possibility to actually do HDR rendering.

DirectX bias

This article seems quite biased towards DirectX. It is obvious that most HDR rendering engines are written for DirectX at the moment, but this does not mean it's not possible in, for example, OpenGL.

Maybe it would be worth adding a new section, Development of HDRR through OpenGL or words to that effect, to detail the history of HDRR along non-DirectX paths? ~Xytram~ 19:22, 8 August 2007 (UTC)

Disambiguation needed on topics and techniques.

This article seems to jumble together different things:

- Rendering using image-based lighting, as shown in Debevec's work and widely used as a lighting technique in 3D rendering. Image-based lighting is most realistically done using HDR images. In this use, the HDR image data is an input file to the 3D rendering process, one of the source texture maps.

- High dynamic range output from a 3D rendering process, which can involve a renderer outputting floating-point data into a file format such as (most popularly) OpenEXR. This allows realistic blurs, blooms, and optical effects to be simulated during compositing, allows more latitude for lighting adjustments during compositing, and makes tone mapping possible when the image is finally converted into a viewable dynamic range. The article focuses on a feature of some game engines that compute or store HDRI illumination data so that camera exposure changes and optical effects can be simulated in games. This should really just be a sub-section of the overall discussion of high dynamic range rendering.

(If these things aren't going to be fixed, then maybe the title of the page should be changed to "High Dynamic Range Rendering (in Video Games)" and people interested in high-end graphics should just add to the main entries on High Dynamic Range Imagery and to the existing stub on Image Based Lighting.)

Jeremybirn 14:42, 13 September 2007 (UTC)

Bad example of tone mapping

In the current example given for tone mapping (from DoD: Source), the affected area is very small, and the effect isn’t very pronounced. —Frungi 04:32, 12 October 2007 (UTC)

I agree, it's a bad example. The Source engine is very poor compared to more impressive engines such as Unreal Engine 3.--TyGuy92 (talk) 21:30, 26 January 2008 (UTC)

Fair use rationale for Image:Dodtmio.jpg

Image:Dodtmio.jpg is being used on this article. I notice the image page specifies that the image is being used under fair use but there is no explanation or rationale as to why its use in this Wikipedia article constitutes fair use. In addition to the boilerplate fair use template, you must also write out on the image description page a specific explanation or rationale for why using this image in each article is consistent with fair use.

Please go to the image description page and edit it to include a fair use rationale. Using one of the templates at Wikipedia:Fair use rationale guideline is an easy way to ensure that your image is in compliance with Wikipedia policy, but remember that you must complete the template. Do not simply insert a blank template on an image page.

If there is other fair use media, consider checking that you have specified the fair use rationale on the other images used on this page. Note that any fair use images uploaded after 4 May, 2006, and lacking such an explanation will be deleted one week after they have been uploaded, as described on criteria for speedy deletion. If you have any questions please ask them at the Media copyright questions page. Thank you.

BetacommandBot 13:47, 26 October 2007 (UTC)